add onnxslim support #199
Conversation
Signed-off-by: inisis <[email protected]>
Reviewer's Guide by Sourcery

This pull request adds support for optimizing the 'inswapper_128_fp16.onnx' model using the 'onnxslim' tool. The changes are implemented by updating the README.md file to include instructions on how to use 'onnxslim' for model optimization.

File-Level Changes
Hey @inisis - I've reviewed your changes - here's some feedback:
Overall Comments:
- Consider adding a brief explanation of what onnxslim does and its benefits (e.g., reduced model size, faster inference).
- Include instructions on how to install onnxslim, or add it to the project's dependencies if it's not already included.
Here's what I looked at during the review
- 🟢 General issues: all looks good
- 🟢 Security: all looks good
- 🟢 Testing: all looks good
- 🟢 Complexity: all looks good
- 🟡 Documentation: 1 issue found
README.md
Then put those 2 files on the "**models**" folder, you can further optimize inswapper_128_fp16.onnx by using
```
onnxslim inswapper_128_fp16.onnx inswapper_128_fp16.onnx
```
suggestion (documentation): Consider adding a brief explanation for the `onnxslim` command.

Adding a brief explanation or a link to the onnxslim documentation would help users who are not familiar with this tool.
Then put those 2 files on the "**models**" folder. You can further optimize inswapper_128_fp16.onnx by using onnxslim:
```
onnxslim inswapper_128_fp16.onnx inswapper_128_fp16.onnx
```
onnxslim is a tool that reduces the size of ONNX models by removing unnecessary data. For more information, visit: https://github.com/onnx/optimizer
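One way to confirm the slimming step actually did something is to compare the model's file size before and after running onnxslim. A minimal sketch using only the Python standard library (the `size_mb` helper and the `.bak` backup path are illustrative assumptions, not part of this PR; the model filename is the one from the README):

```python
import os

def size_mb(path: str) -> float:
    """Return the size of a file in megabytes."""
    return os.path.getsize(path) / (1024 * 1024)

# Hypothetical usage, assuming the original model was copied aside
# (e.g. to inswapper_128_fp16.onnx.bak) before running onnxslim:
# before = size_mb("models/inswapper_128_fp16.onnx.bak")
# after = size_mb("models/inswapper_128_fp16.onnx")
# print(f"model shrank by {before - after:.2f} MB")
```

Since the command in the README overwrites the input file in place, keeping a backup copy is the simplest way to make this comparison possible.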
Signed-off-by: inisis <[email protected]>
#198
Summary by Sourcery
Add support for onnxslim by updating the README.md with instructions on how to optimize the inswapper_128_fp16.onnx model.
Documentation: