An experiment in controllable short-form show generation. For detailed information, see the Technical Overview.
There are two available execution modes:

- Show from a news article: ./bin/show-from-article <url of article>
- Show from a rough script: ./bin/show-from-script <path to script>
A script file is just a plain text file that reads like a script. For examples, see show_scripts/
- Blender 4.2 with Autorig Pro installed
- Python 3.10
- GPU with at least 12 GB of VRAM (Tested on GeForce RTX 3060)
- Clone the repo
- Run ./setup.sh
- Download the pretrained models into each of the cloned AniPortrait and MoMask repos
- Fill out the settings in settings.env
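For orientation, a settings.env file might look like the sketch below. Every key and path here is a hypothetical placeholder, since the source does not list the actual options; consult settings.env itself for the real keys.

```
# Hypothetical placeholders only -- check settings.env for the real keys
BLENDER_PATH=/usr/local/bin/blender
ANIPORTRAIT_DIR=./AniPortrait
MOMASK_DIR=./MoMask
OUTPUT_DIR=./output
```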
*Shows an image of a smiling emoji*
(Applause)
Walter: Hey there! (Waving)
(Oooh)
Walter: Signing off (Salutes)
(Awww)
Video: waving.mp4
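The example above shows the script conventions: `*...*` lines describe on-screen visuals, bare parenthesized lines are audience reactions, and `Speaker: line (Stage direction)` carries dialogue. As an illustration, here is a minimal Python sketch of how such lines could be classified; the field names and regexes are assumptions, not the project's actual parser.

```python
import re

def parse_line(line: str) -> dict:
    """Classify one script line as a visual cue, an audience reaction,
    or dialogue with optional stage directions.

    Illustrative sketch only; not the project's real parser.
    """
    line = line.strip()
    # *...* lines describe an on-screen visual
    if line.startswith("*") and line.endswith("*") and len(line) > 1:
        return {"type": "visual", "text": line.strip("*").strip()}
    # A bare (...) line is an audience reaction, e.g. (Applause)
    m = re.fullmatch(r"\(([^)]+)\)", line)
    if m:
        return {"type": "reaction", "text": m.group(1)}
    # Speaker: dialogue, with (Stage directions) embedded
    m = re.match(r"(\w+):\s*(.*)", line)
    if m:
        speaker, rest = m.groups()
        actions = re.findall(r"\(([^)]+)\)", rest)
        dialogue = re.sub(r"\s*\([^)]*\)", "", rest).strip()
        return {"type": "dialogue", "speaker": speaker,
                "dialogue": dialogue, "actions": actions}
    return {"type": "unknown", "text": line}

print(parse_line("Walter: Hey there! (Waving)"))
# → {'type': 'dialogue', 'speaker': 'Walter', 'dialogue': 'Hey there!', 'actions': ['Waving']}
```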
Article link: Quanta - Even a Single Bacterial Cell Can Sense the Seasons Changing
Video: bacteria1.mp4
Apache 2.0
All contributions are welcome! Some potential extensions include:
- Optimize end-to-end generation time using distributed computing
- Deploy as a web service for video generation
- Create a continuous show similar to "Nothing Forever"
- Integrate 3D generation models for new characters/scenes
- Add img2img processing for frame styling
Feel free to fork the repository and contribute your own ideas!