SpanMarker with ONNX models #26
Hello! I have done a very quick experiment to try to export SpanMarker to ONNX, but I got some incomprehensible errors. I don't currently have the ONNX experience to quickly create such an exporter.
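For reference, this is roughly the kind of attempt I mean, using the standard `torch.onnx.export` entry point. The dummy inputs and their names are assumptions rather than SpanMarker's actual forward signature, and as described above, the export currently fails:

```python
# A rough sketch of a naive export attempt, not a working recipe: the dummy
# input names and shapes are assumptions (SpanMarker's real forward signature
# may differ), and the export currently errors out.
import torch
from span_marker import SpanMarkerModel

model = SpanMarkerModel.from_pretrained(
    "tomaarsen/span-marker-bert-base-fewnerd-fine-super"
)
model.eval()

# Hypothetical dummy inputs; SpanMarker inserts span marker tokens internally,
# so these plain tensors likely do not match its true expected inputs.
dummy_input_ids = torch.ones(1, 128, dtype=torch.long)
dummy_attention_mask = torch.ones(1, 128, dtype=torch.long)

torch.onnx.export(
    model,
    (dummy_input_ids, dummy_attention_mask),
    "span_marker.onnx",
    input_names=["input_ids", "attention_mask"],
    output_names=["logits"],
    dynamic_axes={
        "input_ids": {0: "batch", 1: "sequence"},
        "attention_mask": {0: "batch", 1: "sequence"},
    },
    opset_version=14,
)
```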
Hi @tomaarsen, I would like to collaborate on this issue.
That would be awesome! I'm open to PRs on the matter.
This would indeed be a nice feature to add. We export all our models to ONNX before deploying, and this is unfortunately not currently possible with SpanMarker. Keep up the good work!
@tomaarsen, can you upload an ONNX format for SpanMarker?
I'm afraid I haven't been able to convert SpanMarker models to ONNX yet.
Hello @tomaarsen.
Hello! Awesome! I'd love to get ONNX support for SpanMarker somehow.
Great! I will push the branch this weekend as soon as I can.
ONNX support would be amazing! One can also quantize the models for further inference speed optimization once the base models are converted to ONNX. It is essentially five lines of code to go from ONNX to quantized ONNX.
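For reference, a minimal sketch of that step using dynamic quantization from onnxruntime; the file paths here are placeholders, assuming an exported ONNX model already exists:

```python
# A minimal sketch of dynamic (post-training) quantization with onnxruntime.
# The input and output paths are placeholders for an already-exported model.
from onnxruntime.quantization import quantize_dynamic, QuantType

quantize_dynamic(
    model_input="span_marker.onnx",        # placeholder: exported ONNX model
    model_output="span_marker-int8.onnx",  # quantized output
    weight_type=QuantType.QInt8,           # quantize weights to 8-bit integers
)
```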
@tomaarsen @polodealvarado, is the ONNX implementation done?
Hi @tomaarsen! Is there an ONNX exporter planned? Have you tried using SpanMarker with ONNX models for inference?
I'd be really curious if you've experimented with that already! :-)