
Can the comfyui_tensorrt node load a TensorRT plan model generated by trtexec? #85

blacklong28 opened this issue Oct 11, 2024 · 0 comments

I used trtexec to convert an SDXL model from ONNX into a TensorRT engine (.plan), then loaded it with ComfyUI and plugged it into the sampler's MODEL input, but the following errors were reported:


```
[10/11/2024-13:32:07] [TRT] [I] Loaded engine size: 5324 MiB
[10/11/2024-13:32:07] [TRT] [I] [MS] Running engine with multi stream info
[10/11/2024-13:32:07] [TRT] [I] [MS] Number of aux streams is 2
[10/11/2024-13:32:07] [TRT] [I] [MS] Number of total worker streams is 3
[10/11/2024-13:32:07] [TRT] [I] [MS] The main stream provided by execute/enqueue calls is the first worker stream
[10/11/2024-13:32:10] [TRT] [I] [MemUsageChange] TensorRT-managed allocation in IExecutionContext creation: CPU +0, GPU +3112, now: CPU 0, GPU 8365 (MiB)
model_type EPS
#10 [TensorRTLoader]: 7.20s
Requested to load SDXL
Loading 1 new model
loaded completely 0.0 0.0 True
  0%|          | 0/20 [00:00<?, ?it/s]
[10/11/2024-13:32:10] [TRT] [E] IExecutionContext::setInputShape: Error Code 3: Internal Error (Given invalid tensor name: x. Get valid tensor names with getIOTensorName())
[10/11/2024-13:32:10] [TRT] [E] IExecutionContext::setInputShape: Error Code 3: Internal Error (Given invalid tensor name: timesteps. Get valid tensor names with getIOTensorName())
[10/11/2024-13:32:10] [TRT] [E] IExecutionContext::setInputShape: Error Code 3: Internal Error (Given invalid tensor name: context. Get valid tensor names with getIOTensorName())
[10/11/2024-13:32:10] [TRT] [E] IExecutionContext::setInputShape: Error Code 3: Internal Error (Given invalid tensor name: y. Get valid tensor names with getIOTensorName())
[10/11/2024-13:32:10] [TRT] [E] ICudaEngine::getTensorDataType: Error Code 3: Internal Error (Given invalid tensor name: x. Get valid tensor names with getIOTensorName())
[10/11/2024-13:32:10] [TRT] [E] ICudaEngine::getTensorDataType: Error Code 3: Internal Error (Given invalid tensor name: timesteps. Get valid tensor names with getIOTensorName())
[10/11/2024-13:32:10] [TRT] [E] ICudaEngine::getTensorDataType: Error Code 3: Internal Error (Given invalid tensor name: context. Get valid tensor names with getIOTensorName())
[10/11/2024-13:32:10] [TRT] [E] ICudaEngine::getTensorDataType: Error Code 3: Internal Error (Given invalid tensor name: y. Get valid tensor names with getIOTensorName())
[10/11/2024-13:32:10] [TRT] [E] IExecutionContext::setTensorAddress: Error Code 3: Internal Error (Given invalid tensor name: x. Get valid tensor names with getIOTensorName())
[10/11/2024-13:32:10] [TRT] [E] IExecutionContext::setTensorAddress: Error Code 3: Internal Error (Given invalid tensor name: timesteps. Get valid tensor names with getIOTensorName())
[10/11/2024-13:32:10] [TRT] [E] IExecutionContext::setTensorAddress: Error Code 3: Internal Error (Given invalid tensor name: context. Get valid tensor names with getIOTensorName())
[10/11/2024-13:32:10] [TRT] [E] IExecutionContext::setTensorAddress: Error Code 3: Internal Error (Given invalid tensor name: y. Get valid tensor names with getIOTensorName())
[10/11/2024-13:32:10] [TRT] [E] IExecutionContext::enqueueV3: Error Code 3: API Usage Error (Parameter check failed, condition: mContext.profileObliviousBindings.at(profileObliviousIndex) != nullptr. Address is not set for input tensor sample. Call setInputTensorAddress or setTensorAddress before enqueue/execute.)
  0%|          | 0/20 [00:00<?, ?it/s]
!!! Exception during processing !!! The size of tensor a (128) must match the size of tensor b (6) at non-singleton dimension 3
...
```
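The `Given invalid tensor name` errors point to a tensor-name mismatch: the ComfyUI_TensorRT node binds engine inputs by the names ComfyUI's UNet uses (`x`, `timesteps`, `context`, `y`), while the engine built from this ONNX export exposes `sample`, `timestep`, `encoder_hidden_states`, `text_embeds`, and `time_ids` (the names visible in the trtexec shapes below and in the final `enqueueV3` error). A rough sketch of the correspondence, as an illustration of why `setInputShape("x")` fails on this engine; the mapping is an assumption based on the two naming schemes, not something the loader actually performs, and SDXL's `y` additionally folds in `time_ids`, so it is not a clean one-to-one rename:

```python
# Hypothetical mapping from ComfyUI's UNet input names to the
# diffusers-style names baked into the engine built by trtexec.
COMFYUI_TO_ONNX = {
    "x": "sample",                       # latent image, e.g. 2x4x128x128
    "timesteps": "timestep",             # denoising timestep
    "context": "encoder_hidden_states",  # text-encoder sequence, e.g. 2x77x2048
    "y": "text_embeds",                  # pooled embeds; ComfyUI's y also carries time_ids
}

def translate_feed(feed: dict) -> dict:
    """Rename a ComfyUI-style input dict to the engine's tensor names."""
    return {COMFYUI_TO_ONNX.get(name, name): value for name, value in feed.items()}

print(translate_feed({"x": "latents", "timesteps": 0}))
```

Because the loader never performs such a translation, every `setInputShape`/`setTensorAddress` call misses, and `enqueueV3` then aborts because the tensor actually named `sample` was never bound.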

The trtexec command I used to convert from ONNX to TensorRT:

```
/usr/src/tensorrt/bin/trtexec --builderOptimizationLevel=4 --stronglyTyped \
  --onnx=./backbone.onnx \
  --minShapes=sample:2x4x128x128,timestep:1,encoder_hidden_states:2x77x2048,text_embeds:2x1280,time_ids:2x6 \
  --optShapes=sample:16x4x128x128,timestep:1,encoder_hidden_states:16x77x2048,text_embeds:16x1280,time_ids:16x6 \
  --maxShapes=sample:16x4x128x128,timestep:1,encoder_hidden_states:16x77x2048,text_embeds:16x1280,time_ids:16x6 \
  --saveEngine=backbone.plan
```

Can the resulting backbone.plan be loaded by the ComfyUI TensorRT Loader node (https://github.com/comfyanonymous/ComfyUI_TensorRT)?
Can someone help me? Thank you!

@blacklong28 blacklong28 changed the title Whether the sampler inference can be performed after comfyui load through the tensorrt plan model generated by the tool Whether the comfyui_tensorrt node can load the tensorrt plan model generated by the tool ? Oct 11, 2024