
add onnxslim support #215

Status: Open. Wants to merge 1 commit into base: experimental.
Conversation

@inisis commented Aug 12, 2024

Summary by Sourcery

Update README.md to provide additional instructions for optimizing the inswapper_128_fp16.onnx model using onnxslim.

Documentation:

  • Updated README.md to include instructions for optimizing the inswapper_128_fp16.onnx model using onnxslim.
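The summary mentions a command-line example for onnxslim, but the conversation only shows a fragment of the README diff. A sketch of how such an OnnxSlim invocation typically looks, based on the linked repository (the output filename here is an assumption, not taken from the PR):

```shell
# Install OnnxSlim, then slim the downloaded model.
pip install onnxslim
# Writing to a new file (assumed name) keeps the original model intact:
onnxslim inswapper_128_fp16.onnx inswapper_128_fp16_slim.onnx
```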

Signed-off-by: inisis <[email protected]>
sourcery-ai bot (Contributor) commented Aug 12, 2024

Reviewer's Guide by Sourcery

This pull request updates the README.md file to include information about optimizing the inswapper_128_fp16.onnx model using onnxslim. The change provides users with an optional step to improve the performance of the model.

File-Level Changes

| File | Changes |
| --- | --- |
| README.md | Added information about optimizing the inswapper_128_fp16.onnx model using onnxslim |
| README.md | Included a command-line example for using onnxslim to optimize the model |

Tips
  • Trigger a new Sourcery review by commenting @sourcery-ai review on the pull request.
  • Continue your discussion with Sourcery by replying directly to review comments.
  • You can change your review settings at any time by accessing your dashboard:
    • Enable or disable the Sourcery-generated pull request summary or reviewer's guide;
    • Change the review language;
  • You can always contact us if you have any questions or feedback.

sourcery-ai bot (Contributor) left a comment
Hey @inisis - I've reviewed your changes - here's some feedback:

Overall Comments:

  • Consider adding more context about the benefits of using onnxslim for optimization (e.g., reduced file size, improved inference speed).
  • It might be beneficial to move this optimization tip to a separate 'Advanced Usage' or 'Optimization' section in the documentation to keep the basic setup instructions clear and concise.
Here's what I looked at during the review
  • 🟢 General issues: all looks good
  • 🟢 Security: all looks good
  • 🟢 Testing: all looks good
  • 🟢 Complexity: all looks good
  • 🟡 Documentation: 1 issue found


```diff
@@ -26,7 +26,10 @@ Users of this software are expected to use this software responsibly while abidi
 1. [GFPGANv1.4](https://huggingface.co/hacksider/deep-live-cam/resolve/main/GFPGANv1.4.pth)
 2. [inswapper_128_fp16.onnx](https://huggingface.co/hacksider/deep-live-cam/resolve/main/inswapper_128_fp16.onnx)
 
-Then put those 2 files on the "**models**" folder
+Then put those 2 files on the "**models**" folder, you can further optimize inswapper_128_fp16.onnx by using [onnxslim](https://github.com/inisis/OnnxSlim)
```
sourcery-ai bot (Contributor) commented on the diff
issue (documentation): Change 'on the "models" folder' to 'in the "models" folder'.

The correct preposition is 'in' rather than 'on' when referring to placing files within a folder.

inisis mentioned this pull request on Aug 12, 2024.
inisis changed the title from "experimental branch" to "add onnxslim support" on Aug 13, 2024.
@hacksider (Owner) commented

As per my checking, it just doesn't work as described:

```
Exception in Tkinter callback
Traceback (most recent call last):
  File "C:\Users\My Pc\AppData\Local\Programs\Python\Python311\Lib\tkinter\__init__.py", line 1967, in __call__
    return self.func(*args)
           ^^^^^^^^^^^^^^^^
  File "E:\Deep-Live-Cam\Deep-Live-Cam\venv\Lib\site-packages\customtkinter\windows\widgets\ctk_button.py", line 554, in _clicked
    self._command()
  File "E:\Deep-Live-Cam\Deep-Live-Cam\modules\ui.py", line 172, in <lambda>
    live_button = ctk.CTkButton(root, text='Live', cursor='hand2', command=lambda: webcam_preview(camera_variable.get(), virtual_cam_out_value.get()))
                                                                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\Deep-Live-Cam\Deep-Live-Cam\modules\ui.py", line 442, in webcam_preview
    preview_running = webcam_preview_loop(camera, source_image, frame_processors)
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\Deep-Live-Cam\Deep-Live-Cam\modules\ui.py", line 365, in webcam_preview_loop
    temp_frame = frame_processor.process_frame(source_image, temp_frame)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\Deep-Live-Cam\Deep-Live-Cam\modules\processors\frame\face_swapper.py", line 58, in process_frame
    temp_frame = swap_face(source_face, target_face, temp_frame)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\Deep-Live-Cam\Deep-Live-Cam\modules\processors\frame\face_swapper.py", line 47, in swap_face
    return get_face_swapper().get(temp_frame, target_face, source_face, paste_back=True)
           ^^^^^^^^^^^^^^^^^^
  File "E:\Deep-Live-Cam\Deep-Live-Cam\modules\processors\frame\face_swapper.py", line 43, in get_face_swapper
    FACE_SWAPPER = insightface.model_zoo.get_model(model_path, providers=modules.globals.execution_providers)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\Deep-Live-Cam\Deep-Live-Cam\venv\Lib\site-packages\insightface\model_zoo\model_zoo.py", line 96, in get_model
    model = router.get_model(providers=providers, provider_options=provider_options)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\Deep-Live-Cam\Deep-Live-Cam\venv\Lib\site-packages\insightface\model_zoo\model_zoo.py", line 40, in get_model
    session = PickableInferenceSession(self.onnx_file, **kwargs)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\Deep-Live-Cam\Deep-Live-Cam\venv\Lib\site-packages\insightface\model_zoo\model_zoo.py", line 25, in __init__
    super().__init__(model_path, **kwargs)
  File "E:\Deep-Live-Cam\Deep-Live-Cam\venv\Lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 419, in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
  File "E:\Deep-Live-Cam\Deep-Live-Cam\venv\Lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 452, in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_path, True, self._read_config_from_model)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Load model from E:\Deep-Live-Cam\Deep-Live-Cam\modules\..\models\inswapper_128.onnx failed:D:\a\_work\1\s\onnxruntime\core\graph\model.cc:150 onnxruntime::Model::Model Unsupported model IR version: 10, max supported IR version: 9
```
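The root cause is in the last line: the slimmed model was serialized with ONNX IR version 10, while the installed onnxruntime build can load at most IR version 9. A minimal sketch of the compatibility check onnxruntime applies at load time (the helper name is hypothetical; the version numbers come from the error message):

```python
def can_load(model_ir_version: int, max_supported_ir_version: int) -> bool:
    """Mirror onnxruntime's load-time guard: a session is created only when
    the model's IR version does not exceed the runtime's supported maximum."""
    return model_ir_version <= max_supported_ir_version

# Values from the traceback: the slimmed model declares IR 10,
# but this onnxruntime build supports at most IR 9.
print(can_load(10, 9))   # → False, hence the "Unsupported model IR version" failure
```

In general, either upgrading onnxruntime or re-slimming with an onnx release that still writes the older IR version resolves such a mismatch.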

@inisis (Author) commented Aug 30, 2024

@hacksider Hi, there seems to be a version mismatch. You can check the similar issue here or the official doc. Could you attach your onnx and onnxruntime versions?
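One way to collect the versions asked for above (a sketch; it only reads installed package metadata, so it is safe to run in any environment):

```python
# Report the package versions relevant to the IR-version mismatch.
import importlib.metadata as metadata

for pkg in ("onnx", "onnxruntime", "onnxslim"):
    try:
        print(pkg, metadata.version(pkg))
    except metadata.PackageNotFoundError:
        print(pkg, "not installed")
```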

@inisis (Author) commented Aug 30, 2024

Also, if you want to use the slimmed model, it's preferable to do the slimming in the same environment.

@inisis (Author) commented Aug 31, 2024

@hacksider Are there any updates? I have tested it and it really works.

@inisis (Author) commented Oct 7, 2024

Hi @hacksider, are there any updates?

Labels: None yet
Projects: None yet
2 participants