
Commit

Try simple torch optimizer
mwaskom committed Feb 7, 2024
1 parent e855a8b commit 791a5af
Showing 3 changed files with 3 additions and 3 deletions.
2 changes: 1 addition & 1 deletion config/codellama.yml

@@ -48,7 +48,7 @@ wandb_run_id:
 gradient_accumulation_steps: 1
 micro_batch_size: 32
 num_epochs: 4
-optimizer: adamw_bnb_8bit
+optimizer: adamw_torch
 lr_scheduler: cosine
 learning_rate: 0.0001
2 changes: 1 addition & 1 deletion config/llama-2.yml

@@ -48,7 +48,7 @@ wandb_run_id:
 gradient_accumulation_steps: 1
 micro_batch_size: 32
 num_epochs: 4
-optimizer: adamw_bnb_8bit
+optimizer: adamw_torch
 lr_scheduler: cosine
 learning_rate: 0.0001
2 changes: 1 addition & 1 deletion config/mistral.yml

@@ -48,7 +48,7 @@ wandb_run_id:
 gradient_accumulation_steps: 1
 micro_batch_size: 32
 num_epochs: 4
-optimizer: adamw_bnb_8bit
+optimizer: adamw_torch
 lr_scheduler: cosine
 learning_rate: 0.0001
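The three files make the same one-line change: they swap adamw_bnb_8bit (the bitsandbytes 8-bit AdamW, which quantizes optimizer states to save memory) for adamw_torch, the stock torch.optim.AdamW. A minimal sketch of what the new settings resolve to in plain PyTorch, assuming a training-config schema where optimizer/lr_scheduler/learning_rate map onto torch objects; the model and step count below are placeholders, not taken from this repository:

```python
import torch

model = torch.nn.Linear(16, 4)  # placeholder stand-in for the fine-tuned model

# optimizer: adamw_torch -> stock PyTorch AdamW with full-precision optimizer
# states (the prior adamw_bnb_8bit kept these states quantized to 8 bits)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)  # learning_rate: 0.0001

# lr_scheduler: cosine -> cosine annealing over the run's optimizer steps
num_steps = 100  # placeholder for the real total step count
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=num_steps)

# one illustrative training step: forward, backward, update, decay LR
loss = model(torch.randn(2, 16)).sum()
loss.backward()
optimizer.step()
scheduler.step()
optimizer.zero_grad()
```

The trade-off in a change like this is memory for simplicity: the 8-bit optimizer roughly quarters the memory held by AdamW's two per-parameter state tensors, while the torch optimizer avoids the bitsandbytes dependency entirely.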

0 comments on commit 791a5af
