I was aggregating models for a different project and realized there are a couple of older multilingual baselines we should try: facebook/mcontriever-msmarco (multilingual Contriever) and castorini/mdpr-tied-pft-msmarco (DPR-based with tied encoders). I don't think they're going to be very strong, but they're potentially worth having since they're baselines people may have heard of. Both can be run with sentence-transformers out of the box.

cc @KennethEnevoldsen and @Muennighoff
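As a minimal sketch of the "out of the box" claim (model IDs are from this thread; the example texts, pooling caveat, and dot-product scoring are my own assumptions):

```python
# Sketch: load the two multilingual baselines with sentence-transformers
# and score a toy query/passage pair as a sanity check.
from sentence_transformers import SentenceTransformer

model_names = [
    "nthakur/mcontriever-base-msmarco",   # mContriever checkpoint with ST config
    "castorini/mdpr-tied-pft-msmarco",    # mDPR with tied encoders
]

queries = ["what is dense retrieval?"]
passages = ["Dense retrieval encodes queries and passages into a shared vector space."]

for name in model_names:
    # For checkpoints without a sentence-transformers config, ST falls back to
    # a plain Transformer + mean-pooling head, which may not match the model's
    # intended pooling (e.g. CLS for DPR) -- treat these scores as a rough check.
    model = SentenceTransformer(name)
    q_emb = model.encode(queries, normalize_embeddings=True)
    p_emb = model.encode(passages, normalize_embeddings=True)
    print(name, q_emb @ p_emb.T)
```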
Great, I ran nthakur/mcontriever-base-msmarco (the facebook one did not work with ST for me) and castorini/mdpr-tied-pft-msmarco. Results are here: embeddings-benchmark/results#40
Are we good on bge-m3 & gte-multilingual-base, or are results still missing? I think we should have them as well.
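If those numbers are still missing, a minimal sketch of how they could be produced with mteb (the task name here is only illustrative, and the trust_remote_code flag for gte is my assumption about its custom modeling code):

```python
# Sketch: evaluate the two multilingual models on an MTEB task subset.
import mteb
from sentence_transformers import SentenceTransformer

for name in ["BAAI/bge-m3", "Alibaba-NLP/gte-multilingual-base"]:
    # gte-multilingual-base ships custom modeling code, hence trust_remote_code.
    model = SentenceTransformer(name, trust_remote_code=True)
    # Swap in whichever retrieval tasks are actually missing from the results repo.
    tasks = mteb.get_tasks(tasks=["MIRACLRetrieval"])
    evaluation = mteb.MTEB(tasks=tasks)
    evaluation.run(model, output_folder=f"results/{name.split('/')[-1]}")
```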