-
I would try to fit as many samples as possible on the device. You can also do that using PEFT, e.g. LoRA.
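If it helps, here is a minimal sketch of that suggestion using the Hugging Face `peft` library. The base model (`"gpt2"`) and the LoRA hyperparameters are illustrative placeholders, not values from this thread:

```python
# Minimal PEFT/LoRA sketch: train a small low-rank adapter instead of the
# full model, which frees device memory for larger batches.
# Base model and hyperparameters below are assumptions, not from this thread.
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

base = AutoModelForCausalLM.from_pretrained("gpt2")  # placeholder base model

config = LoraConfig(
    r=8,              # rank of the low-rank update matrices
    lora_alpha=16,    # scaling factor applied to the update
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)

model = get_peft_model(base, config)
model.print_trainable_parameters()  # only a small fraction of weights train
```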
-
I've read through multiple issues here which mention that fine-tuning requires a large batch size. By virtue of this, I imagine it also requires a fairly large number of fine-tuning examples.
Could someone provide a rule of thumb for the number of fine-tuning examples and the batch size? Is the default batch size of 128 enough, or should we be going higher?
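For context, when the hardware can't fit a batch of 128 directly, the usual way to still train at that effective batch size is gradient accumulation. A self-contained PyTorch sketch; the toy model, random data, and sizes are illustrative only:

```python
# Gradient accumulation sketch: simulate an effective batch size of 128
# using micro-batches of 8. Model and data are toy placeholders.
import torch
import torch.nn as nn

model = nn.Linear(16, 2)                       # stand-in for the real model
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

micro_batch = 8
accum_steps = 128 // micro_batch               # 16 micro-batches per update

optimizer.zero_grad()
for step in range(accum_steps):
    x = torch.randn(micro_batch, 16)           # placeholder inputs
    y = torch.randint(0, 2, (micro_batch,))    # placeholder labels
    loss = loss_fn(model(x), y) / accum_steps  # average over the full batch
    loss.backward()                            # gradients accumulate here
optimizer.step()                               # one update, effective batch 128
optimizer.zero_grad()
```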