
[Fixing] More finetuning support #1561

Open
5 tasks
danielhanchen opened this issue Jan 19, 2025 · 5 comments

Comments

@danielhanchen
Contributor

danielhanchen commented Jan 19, 2025

  • Support sequence classification
  • Flex Attention for Gemma and others
  • Variable sequence length and auto unpadding / padding
  • Tool Calling
  • Refactor and merge xformers, SDPA, flash-attn, flex-attention
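The third item (variable sequence length with auto unpadding / padding) can be sketched in a few lines: right-pad each batch to its longest sequence, track real tokens with an attention mask, and invert the padding afterward. A minimal NumPy sketch for illustration only — the names `pad_batch` / `unpad_batch` are assumptions, not Unsloth's API:

```python
import numpy as np

def pad_batch(seqs, pad_id=0):
    """Right-pad variable-length token sequences to the batch max length.

    Returns the padded batch and an attention mask (1 = real token, 0 = pad).
    """
    max_len = max(len(s) for s in seqs)
    batch = np.full((len(seqs), max_len), pad_id, dtype=np.int64)
    mask = np.zeros((len(seqs), max_len), dtype=np.int64)
    for i, s in enumerate(seqs):
        batch[i, : len(s)] = s
        mask[i, : len(s)] = 1
    return batch, mask

def unpad_batch(batch, mask):
    """Invert pad_batch: recover the original variable-length sequences."""
    return [row[m.astype(bool)].tolist() for row, m in zip(batch, mask)]
```

The mask is what lets attention kernels (SDPA, flash-attn, flex-attention) skip pad positions; "auto unpadding" means dropping those positions entirely before the attention computation and restoring them after.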
@Zzhiter
Contributor

Zzhiter commented Jan 20, 2025

Do you need help? I can help with item 1 (sequence classification).

@Zzhiter
Contributor

Zzhiter commented Jan 24, 2025

@danielhanchen Hi, I have in fact implemented sequence classification with LoRA support, and my experiments show no loss. May I contribute my code to the community?
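For context, the core LoRA idea for a classification head is to freeze the base weight W and learn only a low-rank update (alpha / r) · B · A on top of it. A minimal NumPy sketch of that idea — purely illustrative, with hypothetical sizes, not the implementation discussed in this thread:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes for a tiny classification head.
d_model, n_labels, r, alpha = 16, 3, 4, 8

W = rng.normal(size=(n_labels, d_model))   # frozen base head weight
A = rng.normal(size=(r, d_model)) * 0.01   # trainable LoRA down-projection
B = np.zeros((n_labels, r))                # trainable up-projection, zero-init

def lora_head(x, B, A):
    # Effective weight = W + (alpha / r) * B @ A. Because B is
    # zero-initialized, the adapted head starts out exactly equal to
    # the frozen base head, so training begins from the base model.
    return x @ (W + (alpha / r) * B @ A).T
```

Only A and B (rank r each) are trained, which is why a LoRA classifier can match the base model at initialization and adapt with very few trainable parameters.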

@shimmyshimmer
Collaborator

> @danielhanchen Hi, in fact, I have implemented sequence classification and supported LoRA. Through my experiments, there is no loss. May I contribute my code to the community?
>
> Need help? I can help with 1

Hi there @Zzhiter, many apologies for the late reply! We missed your comment. Absolutely, we would LOVE to have you collaborate with us. Do you happen to have Discord or Slack, or would you just like to communicate here? :)

@Zzhiter
Contributor

Zzhiter commented Feb 3, 2025

> @danielhanchen Hi, in fact, I have implemented sequence classification and supported LoRA. Through my experiments, there is no loss. May I contribute my code to the community?
>
> Need help? I can help with 1
>
> Hi there @Zzhiter much much apologies for the late reply! We missed your comment. absolutely we would LOVE to have you collab with us. Do you happen to have Discord or Slack or would you just like to communicate here? :)

I've joined the Unsloth community's Discord server. However, I haven't used Discord frequently before. Let's communicate here directly! Shall I briefly introduce my implementation ideas first?

@Dillion

Dillion commented Feb 5, 2025

I'm interested in joining this discussion. @Zzhiter, can you share?
