When I ran this, after 4 epochs G appeared to be moving farther and farther from convergence (g_loss ≈ 3 at that point), while D did appear to be converging slowly. Can you post a tensorboard screenshot of the convergence behavior?
I trained it yesterday with the default settings for roughly 18 epochs (468 batches of 128 images each) and then plotted d_loss and g_loss:
Training seems to have three phases: batches 0–300, 300–1500, and the rest. A closer look at the first ~600 batches:
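For reference, this kind of plot can be reproduced by appending each batch's d_loss and g_loss to a list inside the training loop and smoothing before plotting (the moving-average helper and the synthetic loss values below are my own illustration, not part of the repo's code):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend so the script runs without a display
import matplotlib.pyplot as plt

def moving_average(values, window=25):
    """Smooth a noisy per-batch loss curve with a simple moving average."""
    kernel = np.ones(window) / window
    return np.convolve(values, kernel, mode="valid")

# In the real training loop these would be filled with
# d_losses.append(d_loss); g_losses.append(g_loss) after each batch.
# Synthetic stand-in data (18 epochs x 468 batches) for illustration:
rng = np.random.default_rng(0)
n = 468 * 18
batches = np.arange(n)
d_losses = 0.7 + 0.3 * np.exp(-batches / 500) + 0.05 * rng.standard_normal(n)
g_losses = 1.0 + batches / 3000 + 0.1 * rng.standard_normal(n)

plt.plot(moving_average(d_losses), label="d_loss")
plt.plot(moving_average(g_losses), label="g_loss")
plt.xlabel("batch")
plt.ylabel("loss")
plt.legend()
plt.savefig("losses.png")
```

Smoothing matters here: the raw per-batch GAN losses oscillate enough that the three phases are hard to see without it.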
One interesting finding is that when generating output with the nice option turned off, it creates mostly 1s and a lot of 0s. With nice turned on, the class distribution is much more even.
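That skewed output is the classic symptom of partial mode collapse. One way to quantify it (a sketch: it assumes you have a separately trained MNIST classifier to label the generated samples, and `class_distribution` is a hypothetical helper, not something in the repo):

```python
import numpy as np

def class_distribution(labels, num_classes=10):
    """Fraction of generated samples assigned to each class.

    A roughly uniform result (~0.1 per digit) suggests the generator
    covers all modes; a spike on one or two digits suggests collapse.
    """
    counts = np.bincount(labels, minlength=num_classes)
    return counts / counts.sum()

# In practice, labels would come from something like:
#   labels = classifier.predict(generator.predict(noise)).argmax(axis=1)
# Dummy labels below imitate the behavior described above
# (mostly 1s, a lot of 0s, few other digits):
labels = np.array([1] * 70 + [0] * 25 + [7] * 5)
print(class_distribution(labels))
```

Running this on batches generated with and without the nice option would make the difference in class balance concrete rather than eyeballed.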