Issues: unslothai/unsloth
Unexpected behavior with LLaMA 3.1 8b instruct model after reinstallation of unsloth and libraries
#1588 opened Jan 28, 2025 by andrijafactoryww
Did you test the unsloth/phi-4-bnb-4bit model with text generation inference (TGI)?
#1582 opened Jan 27, 2025 by farzanehnakhaee70
Facing this error while trying to use unsloth's 4bit llama 3.2 vision 11B model for OCR task
currently fixing (Am fixing now!)
#1581 opened Jan 27, 2025 by Pavankurapati03
Add support for GRPO
feature request (Feature request pending on roadmap)
#1579 opened Jan 25, 2025 by gagan3012
Continual Pretraining: Unexpected Trainable Parameters in PEFT Model
currently fixing (Am fixing now!)
#1578 opened Jan 25, 2025 by kailas711
Load an existing model with no Internet connection
currently fixing (Am fixing now!)
#1577 opened Jan 25, 2025 by Taimin
torch._dynamo.exc.Unsupported when finetuning QWEN2-VL
#1574 opened Jan 23, 2025 by definitelynotadoge
TypeError: 'str' object is not callable error with Llama 3.1 8B instruct model (4 bit quantization)
#1570 opened Jan 21, 2025 by BijanProjects
AttributeError: 'NoneType' object has no attribute 'attn_bias'
#1566 opened Jan 20, 2025 by Bhabuk10
Validation of Fine-Tuning and Inference Methods for Multi-Turn Conversations with LLaMA 3.1 8B
#1564 opened Jan 20, 2025 by Kshitiz-Khandel
[Fixing] Better vision model finetuning
currently fixing (Am fixing now!)
#1559 opened Jan 19, 2025 by danielhanchen
5 tasks
[Fixing] Better exporting to llama.cpp and 16bit merging
currently fixing (Am fixing now!)
#1558 opened Jan 19, 2025 by danielhanchen
7 tasks
flash-attn Detection Logic Fails for flash-attn
fixed - pending confirmation (Fixed, waiting for confirmation from poster)
#1555 opened Jan 19, 2025 by Zzhiter