This repository has been archived by the owner on Mar 26, 2019. It is now read-only.

Is it possible to load an onnx model with a non-Python API? #41

Open
newling opened this issue Mar 13, 2018 · 5 comments

Comments

@newling

newling commented Mar 13, 2018

In particular I'd like to load an .onnx file in C++. Thanks.

@rajanksin
Collaborator

@newling Currently we do not have a C++ API to import ONNX models into MXNet.

@lupesko
Contributor

lupesko commented Mar 15, 2018

@newling Can you describe your use case a bit? We may be able to help.

@newling
Author

newling commented Mar 15, 2018

I would like to load ONNX models and run them (on CPU), mostly for testing: I want to compare their outputs against those of my own C++ implementations. Not urgent.

@jiankang1991

Does anyone know how to do that?

@lupesko
Contributor

lupesko commented Nov 22, 2018

In MXNet, at least, loading ONNX models is supported only via the Python API; I am not sure about other frameworks.
However, you can write a simple script that converts the ONNX model into an MXNet model, and then load the MXNet model with the various MXNet language APIs such as Scala or C++ (a Java API is coming soon). You can also use MXNet Model Server, which supports ONNX out of the box (no conversion needed).
