Llama will not save properly #1947
Comments
For anyone who stumbles across this in the future: just don't use FSDP.
Did the model's total size bloat, or otherwise appear much different from the original's size?
I used a similar configuration file to train the model and was able to run inference without hitting this error. I made sure that my FSDP configuration matches your yml. Here is mine:
Perhaps your issue has to do with your (presumably) customised tokenizer config? Could you provide that so I can dig deeper? Thanks!
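A commonly suggested workaround for FSDP checkpoints missing parameters (an editor's sketch, not taken from this thread) is to have FSDP gather a full, unsharded state dict at save time, so the checkpoint contains every parameter. Assuming an axolotl-style config, the relevant keys would look roughly like the fragment below; the key names are based on PyTorch FSDP's `StateDictType` and may differ between axolotl versions:

```yaml
# Hypothetical config fragment; key names are assumptions based on
# PyTorch FSDP's StateDictType, not copied from this issue.
fsdp:
  - full_shard
  - auto_wrap
fsdp_config:
  # Gather complete parameters on rank 0 when saving, so the checkpoint
  # is a normal (unsharded) state dict that loads cleanly for inference.
  fsdp_state_dict_type: FULL_STATE_DICT
  fsdp_offload_params: false
```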
Please check that this issue hasn't been reported before.
Expected Behavior
When training completes and I try to run inference with the model, it should load without error.
Current behaviour
My model is missing parameters and thus errors out when loading
Steps to reproduce
Train a model with my config and any pre-tokenized dataset, then try to run inference with it.
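The failure mode described above can be detected before inference by diffing the checkpoint's keys against a freshly initialized model. The sketch below (model and file names are hypothetical, and the "bad save" is simulated by dropping a key) shows how `load_state_dict(..., strict=False)` reports exactly which parameters are missing instead of erroring out:

```python
# Minimal sketch of how to spot missing parameters in a saved checkpoint.
# With an FSDP misconfiguration, the real symptom is that sharded or
# flattened parameters never make it into the saved state dict.
import os
import tempfile

import torch
import torch.nn as nn


class TinyModel(nn.Module):
    """Stand-in for the real model; the architecture is arbitrary."""

    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(16, 8)
        self.proj = nn.Linear(8, 16)

    def forward(self, x):
        return self.proj(self.embed(x))


model = TinyModel()
state = model.state_dict()
# Simulate a bad save: one parameter never reaches the checkpoint.
state.pop("proj.weight")

with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, "checkpoint.pt")
    torch.save(state, path)
    fresh = TinyModel()
    # strict=False loads what it can and reports the gaps instead of raising.
    result = fresh.load_state_dict(torch.load(path), strict=False)
    print("missing:", result.missing_keys)  # -> missing: ['proj.weight']
```

If `missing_keys` is non-empty for a real checkpoint, the save step (not inference) is where things went wrong.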
Config yaml
Possible solution
No response
Which Operating Systems are you using?
Python Version
3.10
axolotl branch-commit
main
Acknowledgements