GPU inference #3
Comments
WebGPU browser compatibility is still rather limited; however, it should most definitely be possible. I'll be working on it ASAP.
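Given the limited browser compatibility mentioned above, a page would typically feature-detect WebGPU before attempting to use it. A minimal sketch (the function name is hypothetical; `navigator.gpu` is the standard WebGPU entry point, which is only defined in browsers that ship WebGPU):

```javascript
// Check whether this environment exposes WebGPU before trying to use it.
// Returns a plain boolean so callers can fall back to CPU/WASM inference.
function webgpuAvailable() {
  return typeof navigator !== "undefined" && "gpu" in navigator;
}

// Example: choose an execution path based on the check.
const backend = webgpuAvailable() ? "webgpu" : "wasm";
console.log("selected backend:", backend);
```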
Thanks. And for CPU inference, does it use multithreaded inference? When I try raw ONNX on WASM, I cannot get multithreaded inference to work: no errors, it just doesn't do anything.
Yes, it can be multithreaded, but you need to enable cross-origin isolation.
This is a great idea and would be very useful. I would like to look into this or help you out. Have there been any updates on this, or pointers for where to start? I wasn't sure how much of this needs to be done in WebGPU via JS versus inside the WebAssembly. I also wasn't sure whether or not this will touch the build process in the piper wasm repo.
I'm also interested in helping with this where possible. From my understanding, the blocker is related to the actual piper library and not vits. ONNX Runtime has GPU support, of course, and piper can be run on GPU locally, but it looks like there's no compiled WASM bundle of piper built with GPU support that would run on the ONNX Runtime. I'm pretty new to WASM, especially GPU WASM, so I could be wrong.
I think that is correct. As I have looked into this further, it also seems, based on ken107/read-aloud#424 (comment), that even in piper itself within a standard desktop runtime there are issues with GPU utilization. So I think this might be blocked on piper before pursuing web/wasm issues further.
Is there a way to use WebGPU or multithreaded CPU inference? Or does that depend on the browser/OS host? Regards
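The "no errors, it just doesn't do anything" symptom described earlier is consistent with WASM threading silently falling back to one thread when the page isn't cross-origin isolated. A hedged sketch of choosing a thread count (e.g. the value one might assign to `ort.env.wasm.numThreads` in onnxruntime-web; the function itself is hypothetical, not from the library):

```javascript
// Pick a WASM thread count. crossOriginIsolated reflects whether the page
// was served with the COOP/COEP headers; hardwareConcurrency is the
// browser-reported logical core count.
function pickThreadCount(hardwareConcurrency, crossOriginIsolated) {
  // Without cross-origin isolation, SharedArrayBuffer is unavailable and
  // threaded WASM silently degrades, so request a single thread explicitly.
  if (!crossOriginIsolated) return 1;
  return Math.max(1, hardwareConcurrency || 1);
}
```

In a real page this would be called as `pickThreadCount(navigator.hardwareConcurrency, window.crossOriginIsolated)` before creating the inference session.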