
Question: How does the renderer work? #4385

Open
johnwheeler opened this issue Oct 8, 2024 · 2 comments

@johnwheeler

I took a look in the renderer package, and as I started to dig into different threads on here, I read about distributed rendering. Also, the Rust code in there is above my head.

What does Remotion "do"?

Does it automate the use of tabCapture to capture video?

Does it take screenshots with Puppeteer and stitch them together?

Does it somehow convert animations into FFmpeg pan and zoom filters?

Can you give me an idea of the process so I can evaluate what it means for my product?

Thanks!
John

@aadarsh-nagrath

Remotion lets you build videos using React components. No fancy screen captures or Puppeteer hacks. You code your video frame by frame, just like you would a web app.

It doesn't convert your animations into FFmpeg filters either; it generates the frames and stitches them into a video for you.

Basically, you write React code, and Remotion turns it into a slick video.
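As a rough illustration of that model (a minimal sketch, not Remotion's internals; the component and the timing values are invented, but `useCurrentFrame`, `interpolate`, and `AbsoluteFill` come from the `remotion` package):

```tsx
import React from 'react';
import {AbsoluteFill, interpolate, useCurrentFrame} from 'remotion';

// A hypothetical composition: a title that fades in over the first 30 frames.
// The only thing that changes between renders is the frame number.
export const FadeInTitle: React.FC = () => {
  const frame = useCurrentFrame();

  // Map frames 0-30 to opacity 0-1. During rendering, the renderer just asks
  // "what does frame N look like?" and captures the result as an image.
  const opacity = interpolate(frame, [0, 30], [0, 1], {
    extrapolateRight: 'clamp',
  });

  return (
    <AbsoluteFill style={{justifyContent: 'center', alignItems: 'center'}}>
      <h1 style={{opacity}}>Hello, Remotion</h1>
    </AbsoluteFill>
  );
};
```

Because the component is a pure function of the frame number, every frame can be rendered deterministically, even in parallel.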

Doc

@johnwheeler
Author

So @aadarsh-nagrath, I did some digging and I think this is incorrect. It definitely looks like they're using FFmpeg filters (see the sketch after the links below):

https://github.com/remotion-dev/remotion/blob/main/packages/renderer/src/create-ffmpeg-complex-filter.ts
https://github.com/remotion-dev/remotion/blob/main/packages/renderer/src/create-ffmpeg-merge-filter.ts
https://github.com/remotion-dev/remotion/blob/main/packages/renderer/src/ffmpeg-filter-file.ts
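For what it's worth, filters like that seem to be used for assembling streams (for example, mixing several audio tracks into one and muxing them with the rendered video), not for animating visuals. A hedged sketch of that kind of FFmpeg invocation from Node (the file names are made up; this is not Remotion's actual code):

```ts
import {execFile} from 'node:child_process';
import {promisify} from 'node:util';

const execFileAsync = promisify(execFile);

// Hypothetical example: mix two audio tracks with amix and mux the result
// with an already-encoded video stream. The filter_complex syntax is plain
// FFmpeg; only the file names are invented.
async function muxWithMixedAudio(): Promise<void> {
  await execFileAsync('ffmpeg', [
    '-i', 'video-from-frames.mp4',
    '-i', 'voiceover.wav',
    '-i', 'music.wav',
    '-filter_complex', '[1:a][2:a]amix=inputs=2[aout]',
    '-map', '0:v',
    '-map', '[aout]',
    '-c:v', 'copy',
    '-y', 'out.mp4',
  ]);
}

muxWithMixedAudio().catch(console.error);
```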

Also, there's a lot of Puppeteer stuff in there:

https://github.com/remotion-dev/remotion/blob/main/packages/renderer/src/puppeteer-screenshot.ts
https://github.com/remotion-dev/remotion/blob/main/packages/renderer/src/puppeteer-evaluate.ts

It looks like what it's doing is launching a cluster of Puppeteer instances and driving the screenshots using useCurrentFrame. It also has animation primitives that can take a frame number and convert it into spring values (without an animated transition, which would break the screenshots).
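Here's roughly the loop I mean (a sketch only; the page URL and the `setFrame` hook are hypothetical stand-ins for whatever bridge they actually use between the renderer and the bundled React app):

```ts
import {mkdirSync} from 'node:fs';
import puppeteer from 'puppeteer';

// Rough sketch of the screenshot-driving loop. The URL and window.setFrame
// are placeholders, not Remotion's real internals.
async function renderFrames(durationInFrames: number): Promise<void> {
  mkdirSync('frames', {recursive: true});

  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto('http://localhost:3000/my-composition');

  for (let frame = 0; frame < durationInFrames; frame++) {
    // Tell the page which frame to display, then capture it.
    await page.evaluate((f) => {
      (window as any).setFrame(f);
    }, frame);
    const padded = String(frame).padStart(5, '0');
    await page.screenshot({path: `frames/${padded}.png`});
  }

  await browser.close();
}

renderFrames(150).catch(console.error);
```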

For video elements, they extract each frame of the video and 'splice' the JPG or PNG in where the video would be during the render.
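Something along these lines works as a crude approximation of that splicing (my guess, not their code): pre-extract a clip's frames at the composition's fps, then show the Nth still in place of the video element while frame N is being screenshotted.

```ts
import {execFile} from 'node:child_process';
import {mkdirSync} from 'node:fs';
import {promisify} from 'node:util';

const execFileAsync = promisify(execFile);

// Guesswork illustration: dump every frame of a source clip as a PNG so the
// page can render still N wherever the <video> would have been at frame N.
async function extractStills(src: string, fps: number): Promise<void> {
  mkdirSync('stills', {recursive: true});
  await execFileAsync('ffmpeg', [
    '-i', src,
    '-vf', `fps=${fps}`,
    'stills/%05d.png',
  ]);
}

extractStills('clip.mp4', 30).catch(console.error);
```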

I don't know if that's exactly how they're doing things, but I was able to build a proof of concept for my own needs, and I was surprised at how well it works. I even got the audio to sync with the video without much trouble, which blew my mind.
