Inference Time Problem and Model Storage Problem #8

Open
Fyzjym opened this issue Jan 29, 2024 · 0 comments
Fyzjym commented Jan 29, 2024

Thank you for your contribution to HTG and for sharing the code.

Q1:

When calculating FID, do you generate a set of images matching the test set and then compute the distance between the two, i.e. FID(TestSet_IAM, TestSet_WordStylist)?
I ask because WordStylist is slow at generation: sampling 1000 steps for a single image takes about 30 s, and the complete test set contains roughly 25k images, so evaluating FID this way would take about 8 days.
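
For concreteness, the comparison I have in mind is a minimal sketch like the one below, assuming the pytorch-fid package is used (not necessarily your exact protocol); the two directory paths are placeholders for the real IAM test words and the corresponding WordStylist outputs:

```python
# Sketch of the FID comparison in question, assuming the pytorch-fid package.
# The directory paths are hypothetical placeholders for the real IAM test words
# and the images generated by WordStylist for the same word/style pairs.
from pytorch_fid.fid_score import calculate_fid_given_paths

fid_value = calculate_fid_given_paths(
    paths=["data/iam_test_words/", "outputs/wordstylist_test_words/"],
    batch_size=50,
    device="cuda",
    dims=2048,  # pool3 features of InceptionV3, the standard FID setting
)
print(f"FID(TestSet_IAM, TestSet_WordStylist) = {fid_value:.2f}")
```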

Q2:

I trained WordStylist on 3 V100s, but the size of my saved checkpoint (model A) does not match the checkpoint you published (model B). Running inference with model A fails with an error about missing layers, while inference with your public model B works fine.
Is this because I trained with multiple GPUs?
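
My current guess is that wrapping the model in torch.nn.DataParallel (or DistributedDataParallel) for multi-GPU training prepends "module." to every parameter name in the saved state dict, which would explain the missing-layers error when loading model A into an unwrapped model. This is only an assumption on my side; the workaround I would try looks like the sketch below (the checkpoint file name is a placeholder):

```python
# Strip the "module." prefix that DataParallel/DDP adds to parameter names,
# so a multi-GPU checkpoint can be loaded into a plain single-GPU model.
# "model_A.pt" is a placeholder for my own saved checkpoint.
import torch

ckpt = torch.load("model_A.pt", map_location="cpu")
state_dict = ckpt["state_dict"] if "state_dict" in ckpt else ckpt

cleaned = {
    (k[len("module."):] if k.startswith("module.") else k): v
    for k, v in state_dict.items()
}
# `unet` would be the unwrapped WordStylist UNet instance:
# unet.load_state_dict(cleaned)
```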

Thank you again, and I look forward to your reply.
:D
