
InvalidArgumentError, Found Inf or NaN gradient (global norm). #353

Open
CodeXiaoLingYun opened this issue Mar 2, 2019 · 2 comments

@CodeXiaoLingYun

No, I am seeing the same error. I also use the same function (tf.clip_by_global_norm), but I found that the learning rate and that function are not the root cause. When I generated the vocab I set its size to 4682, and vocab_size is also 4682 in train.py. Likewise, I do not know whether decreasing the batch size would help.
I read an answer suggesting it might be related to vanishing/exploding gradients, but I have not found a fix.

In train.py:
[screenshot of the relevant train.py code]

The error is:
[screenshot of the error message]

And the error is raised here:
[screenshot of the code where the error occurs]
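For reference, a minimal sketch (not this repository's actual train.py) of how global-norm gradient clipping is typically wired into a TF1-style train op, with tf.check_numerics added so the failing gradient is named explicitly when it goes Inf/NaN. The loss argument, max_grad_norm value, and the Adagrad optimizer are assumptions for illustration only.

```python
import tensorflow as tf


def build_train_op(loss, max_grad_norm=2.0, learning_rate=0.15):
    """Hypothetical TF1-style train op with global-norm gradient clipping."""
    tvars = tf.trainable_variables()
    grads = tf.gradients(loss, tvars)

    checked = []
    for g, v in zip(grads, tvars):
        if isinstance(g, tf.Tensor):
            # Fail with a readable message naming the variable whose gradient
            # is Inf/NaN; clipping alone cannot repair non-finite gradients.
            g = tf.check_numerics(g, "gradient for %s" % v.name)
        checked.append(g)  # None / IndexedSlices entries are passed through

    # clip_by_global_norm ignores None entries; global_norm is the value
    # that the "Found Inf or NaN global norm" check complains about.
    clipped, global_norm = tf.clip_by_global_norm(checked, max_grad_norm)
    tf.summary.scalar("global_norm", global_norm)

    optimizer = tf.train.AdagradOptimizer(learning_rate)
    return optimizer.apply_gradients(zip(clipped, tvars))
```

If check_numerics fires before the clipping step, the problem is an exploding/invalid gradient upstream (loss, inputs, or initialization), not the clipping call itself.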

@CodeXiaoLingYun
Author

One questionable point I found is GPU:0. I think it may be related to my GPU, so I tried adding code like this:
[screenshot of the added code]
I do not know whether it will help; I want to try it.
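Since the screenshot is not readable here, the following is a minimal sketch of two common TF1 ways to rule the GPU in or out, assuming that is roughly what the added code does: pinning the graph to the CPU, or enabling allow_growth on the session config. The tiny placeholder graph is purely illustrative.

```python
import tensorflow as tf

# Option 1: build (part of) the graph on the CPU to check whether the
# Inf/NaN gradients still appear without the GPU involved.
with tf.device("/cpu:0"):
    x = tf.placeholder(tf.float32, [None, 10])
    w = tf.get_variable("w", shape=[10, 1])
    y = tf.matmul(x, w)

# Option 2: keep the GPU but soften op placement and memory allocation.
config = tf.ConfigProto(allow_soft_placement=True)
config.gpu_options.allow_growth = True
sess = tf.Session(config=config)
```

If the NaN disappears on CPU only, the GPU (driver/CUDA/memory) becomes a plausible suspect; if it persists, the cause is more likely in the model or data.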

@laosanli

Have you solved this error?
