-
I've got model parameters in safetensors format from fine-tuning a BERT model with a classification head using the Transformers library (i.e. using …). Does Candle have an implementation of the classification head for BERT, or should I implement it myself?
-
I've implemented a classification head for it.
-
Mind sharing the implementation? I'm trying to understand how to use BERT for ranking :)
You can definitely share the example; it'll help others for sure.
As for inclusion within candle-transformers, I'm not sure we have taken a stance on implementing all the potential heads in transformers, so we might not merge it, or at least not soon. Currently BERT is mostly used (to the best of my knowledge) for RAG implementations, which is why it has its current form. Adding the classification head in client code seems the most reasonable approach to me at present.