is it possible to store intermediate results? #103
I looked into doing this myself a while back. I couldn't find a way to make it work with the default L-BFGS optimizer; I think it's related to this: https://stackoverflow.com/questions/44685228/how-to-get-loss-function-history-using-tf-contrib-opt-scipyoptimizerinterface

I thought there would be a way, since it prints a notice every print_iterations, but that's built into the optimizer. Then I realized that with the Adam optimizer, the per-iteration printing is coded into a loop right there in neural_style.py. I hadn't bothered with the Adam optimizer before; honestly, I hadn't tried it at all.

Anyway, that's old news: I just tried implementing output during the print_iterations step with the Adam optimizer, and it worked! I'm not as impressed with the results from Adam as from L-BFGS, though Adam was very fast.

The modifications I made to get intermediate results are below. These are just the edits, not the whole files.

neural_style.py
stylize_image.sh
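(The edited snippets themselves did not survive here, but the idea described above is a periodic checkpoint inside the Adam training loop: every `print_iterations` steps, evaluate the current stylized image and write it out. A minimal, library-free sketch of that checkpointing pattern follows; the function names `optimize`, `step_fn`, `get_image`, and `save_fn` are hypothetical illustrations, not the actual code in neural_style.py.)

```python
def optimize(step_fn, get_image, iterations, checkpoint_every, save_fn):
    """Run an optimization loop, saving an intermediate result
    every `checkpoint_every` iterations (the role print_iterations
    plays in the Adam loop described above)."""
    snapshots = []
    for i in range(1, iterations + 1):
        step_fn()  # one optimizer step (e.g. Adam) on the stylized image
        if i % checkpoint_every == 0:
            # in neural_style.py this would evaluate the image tensor
            # and write it to disk; here we just record the value
            snapshots.append((i, save_fn(get_image())))
    return snapshots

# Toy usage: "optimize" x toward 0 with plain gradient steps on f(x) = x^2 / 2.
state = {"x": 10.0}

def step():
    state["x"] -= 0.1 * state["x"]  # gradient step: x <- x - lr * f'(x)

snaps = optimize(step, lambda: state["x"], 300, 100, lambda v: v)
print([i for i, _ in snaps])  # checkpoints at iterations 100, 200, 300
```

In the real script the checkpoint branch would save an image file (e.g. with a numbered filename) instead of appending to a list, which is exactly what the question below is asking for.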
Thanks, I will try this.
I have been trying to find out how and when the code stores the image. I wanted to know whether it's possible to store results, say, every 100 iterations?
If yes, how can I do it?