Running Deep-Live-Cam on Raspberry Pi 5 #453
-
That's amazing
-
Could the author help me take a look? The images my code produces are very blurry.
-
As far as I understand, there would be next to no benefit from the current Hailo NPU. The limitation is not any unusual number of bugs or the hardware's TOPS, but memory bandwidth: the module communicates over a single PCIe lane, and even in its best configuration (PCIe 3.0 x1, roughly 1 GB/s effective) the data transfer from CPU to NPU would likely be a major bottleneck, making it infeasible as a solution. (I haven't explored the specifics enough to be one hundred percent sure, but this is a big consideration.)
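A quick back-of-envelope calculation illustrates the bandwidth concern. All the numbers below are illustrative assumptions, not measurements: a 128x128 FP32 face crop (a typical face-swap model input size), a full 1080p frame, and roughly 0.985 GB/s of effective throughput for one PCIe 3.0 lane:

```python
# Back-of-envelope: is PCIe 3.0 x1 a bottleneck for per-frame NPU offload?
# All numbers here are illustrative assumptions, not measured values.

PCIE3_X1_BPS = 0.985e9  # approx. effective bytes/sec for a single PCIe 3.0 lane


def transfer_time_ms(num_bytes: float, bandwidth: float = PCIE3_X1_BPS) -> float:
    """Time (ms) to move `num_bytes` across the link, ignoring latency/overhead."""
    return num_bytes / bandwidth * 1e3


# A 128x128 RGB FP32 tensor each way (crop in, swapped crop out):
crop_bytes = 128 * 128 * 3 * 4          # ~196 KB
round_trip_ms = transfer_time_ms(crop_bytes) * 2

# A full 1080p RGB frame, if the whole frame were shipped instead:
frame_bytes = 1920 * 1080 * 3           # ~6.2 MB
frame_ms = transfer_time_ms(frame_bytes)

print(f"128x128 crop round trip: {round_trip_ms:.3f} ms")
print(f"Full 1080p frame one way: {frame_ms:.2f} ms")
```

Under these assumptions, shipping only small face crops costs well under a millisecond per frame, but streaming whole frames over the link costs several milliseconds each, which at 30 fps is a meaningful slice of the frame budget and supports the bandwidth concern for anything that moves full frames to the NPU.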
-
After successfully running this cool project on a MacBook, I tried to make it work on my RPi 5 + AI Kit.
It took me the weekend, but I was eventually able to make it run, even if very laggy:
This wonderful project can run on "simple" machines, not just ones heavily reinforced with a GPU.
As far as I understand, it is unfortunately not harnessing any of the Hailo-8L's capabilities.
Are there members here who have experimented with such libraries and applications on the Pi 5 + 8L kit?
What did I do to run it on the Raspberry Pi?
This is the state of my environment that allowed me to run it. I made the minimal changes required to make it work, and I am sure it can be done better or more efficiently.
1. Set up
2. Install Python 3.10
3. Create a venv and install the adjusted requirements, with rpi5_requirements.txt as follows:

Even though it runs slowly, I think it is amazing that such simple hardware can run it, and it is equally impressive that the project is so simple to deploy. I am sure it can be better adapted to the Pi (especially using Hailo's Pi 5 AI Kit).
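As a small sanity check for the steps above, something like the following (a hypothetical helper, not part of the project) can confirm the interpreter matches the version the pinned requirements target, and report which ONNX Runtime execution providers are actually available on the Pi:

```python
import sys

REQUIRED = (3, 10)  # the Python version the adjusted requirements target (assumption)


def python_matches(required=REQUIRED, info=None):
    """Return True if the running (or given) interpreter is the required minor version."""
    if info is None:
        info = (sys.version_info.major, sys.version_info.minor)
    return tuple(info[:2]) == tuple(required)


if __name__ == "__main__":
    status = "OK" if python_matches() else "expected %d.%d" % REQUIRED
    print(f"Python {sys.version.split()[0]}: {status}")
    try:
        import onnxruntime as ort  # installed via the adjusted requirements
        print("ONNX Runtime providers:", ort.get_available_providers())
    except ImportError:
        print("onnxruntime not installed yet")
```

On a bare Pi without a working NPU integration, the provider list will typically show only `CPUExecutionProvider`, which matches my impression that the Hailo 8L is not being used at all.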
If anyone has some experience doing that, please comment below 🤓
obama-trump-src.mp4