- Download the EgoBody scene mesh. (For more EgoBody scenes, please visit here. You may need to process the scene mesh to Z-axis up and zero floor height yourself; see the sketch after this list.)
- Sign in to BEDLAM. Download the clothing textures (diffuse and normal maps).
- Download HOOD data.
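For the scene-mesh preprocessing mentioned in the first item, here is a minimal sketch, assuming trimesh and a Y-up input mesh (both are our assumptions, not necessarily what EgoGen uses; the file names are hypothetical):

```python
import numpy as np
import trimesh

# Hypothetical input file; replace with your downloaded EgoBody scene mesh.
mesh = trimesh.load("seminar_d78.obj", force="mesh")

# If the scene is Y-up, rotate +90 degrees about X so the Y axis maps to Z.
R = trimesh.transformations.rotation_matrix(np.pi / 2, [1, 0, 0])
mesh.apply_transform(R)

# Zero the floor height, assuming the lowest vertex lies on the floor.
mesh.apply_translation([0.0, 0.0, -mesh.vertices[:, 2].min()])
mesh.export("seminar_d78_zup.obj")
```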
Organize them as follows:
EgoGen
├── motion/
├── experiments/
|   ├── exp_data/
|   |   └── seminar_d78/
|   |
|   ├── HOOD/
|   └── hood_data/
|       ├── bedlam/
|       ├── clothing_textures/   # from BEDLAM
|       ├── ...
Make sure you install the HOOD conda environment:
cd HOOD
conda env create -f hood.yml
We simulate the EgoBody data collection process as two people switching locations in EgoBody scenes.
# Make sure you are still using the egogen environment!
cd experiments/
python gen_egobody_depth.py
Synthetic data will be saved at experiments/tmp/. Note: we only used zero-shape male motion data to train our motion model, but during this evaluation we randomly sample body shape, gender, and the initial motion seed to increase data diversity. As a result, synthesized motions might have decreased motion quality.
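As an illustration of the sampling described above, a minimal sketch; the distributions are our assumption, and the gender encoding matches the saved format below:

```python
import numpy as np

rng = np.random.default_rng()
betas = rng.normal(0.0, 1.0, size=10)  # assumed: 10 SMPL-X shape coefficients, standard normal
gender = rng.integers(0, 2)            # 0 = male, 1 = female, matching the saved format
```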
Saved smplx parameters format:
0:3     transl
3:6     global_orient
6:69    body_pose
69:85   camera pose inverse matrix (used to transform to egocentric camera-frame coordinates)
85:95   beta
95      gender (male = 0, female = 1)
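A minimal sketch of unpacking one saved parameter vector according to this layout; the file name is hypothetical, and reading the 16 camera values as a flattened 4x4 matrix is our assumption:

```python
import numpy as np

p = np.load("experiments/tmp/params_0000.npy")  # hypothetical file; one 96-dim vector

transl        = p[0:3]
global_orient = p[3:6]
body_pose     = p[6:69]                  # 63 values = 21 SMPL-X body joints x 3 (axis-angle)
world2cam     = p[69:85].reshape(4, 4)   # assumed flattened 4x4 inverse camera pose
betas         = p[85:95]
gender        = "male" if p[95] == 0 else "female"

# Map a world-space point into the egocentric camera frame (homogeneous coordinates).
pt_world = np.append(transl, 1.0)
pt_cam = world2cam @ pt_world
```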
We sample body textures and 3D clothing meshes from BEDLAM, and perform automated clothing simulation leveraging HOOD. When we developed EgoGen, HOOD only supported T-pose garment meshes as input, so we modified their code ourselves. You may check their up-to-date repo for their updated implementation. In addition, we also changed the supported body model from SMPL to SMPL-X.
# First, make sure you are still using the egogen environment!
# You also need to replace HOOD_PYTHON with the python path of your hood environment.
python gen_egobody_rgb.py
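The HOOD_PYTHON placeholder mentioned above (presumably defined in the script) should point at the python binary of your hood environment; for example (the path is illustrative only):

```python
# Point HOOD_PYTHON at the python binary of the hood conda env.
# This path is an example only; adjust it to your conda installation.
HOOD_PYTHON = "/home/<user>/miniconda3/envs/hood/bin/python"
```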
Synthetic data will be saved at experiments/tmp/. Saved smplx parameters format:
0:96   same as depth
96     cx (camera intrinsics)
97     cy
98     fx (= fy)
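A short sketch of assembling the pinhole intrinsics from the three appended values and projecting a camera-frame point to pixels; the file name and the example point are made up for illustration:

```python
import numpy as np

p = np.load("experiments/tmp/params_0000.npy")  # hypothetical file; 99-dim for RGB data
cx, cy, f = p[96], p[97], p[98]                 # fx == fy per the format above

K = np.array([[f,   0.0, cx],
              [0.0, f,   cy],
              [0.0, 0.0, 1.0]])

x_cam = np.array([0.1, 0.2, 1.5])   # example camera-frame point, metres
u, v = (K @ x_cam)[:2] / x_cam[2]   # perspective projection to pixel coordinates
```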
Regarding how to use our synthetic data to train an HMR model, please refer to utils_gen_depth_npz.py and utils_gen_rgb_npz.py.
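As a rough illustration of the packing step those scripts perform, a sketch with hypothetical file and key names; check the scripts for the actual npz layout expected by the dataloader:

```python
import numpy as np

# Hypothetical: stack per-frame parameter vectors (N x 96), then split into npz fields.
params = np.load("experiments/tmp/params_all.npy")

np.savez("egogen_depth.npz",          # key names are illustrative only
         transl=params[:, 0:3],
         global_orient=params[:, 3:6],
         body_pose=params[:, 6:69],
         betas=params[:, 85:95],
         gender=params[:, 95])
```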
Due to limitations of HOOD, we could only simulate top garments and pants separately. Besides, it only supports clothing meshes that can be represented as a connected graph. In our implementation, the initial pose of garments is A-pose (same as BEDLAM).
You may add more clothing meshes by referring to our script and their repo. You may need to separate a single garment mesh from BEDLAM into upper and lower meshes yourself; see the sketch below.
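A small sketch of checking HOOD's single-connected-graph requirement and splitting a combined BEDLAM outfit into its components; trimesh and the file names are our assumptions:

```python
import trimesh

garment = trimesh.load("bedlam_outfit.obj", force="mesh", process=False)
parts = garment.split(only_watertight=False)  # connected components as separate meshes

if len(parts) == 1:
    print("Single connected component: usable by HOOD as-is.")
else:
    # A full outfit may split into e.g. a top and pants, to be simulated separately.
    for i, part in enumerate(parts):
        part.export(f"garment_part_{i}.obj")
```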
We have provided a training dataset for human mesh recovery generated by EgoGen. If you wish to generate the data on your own, please refer to README.md.
- Download the SMPL and SMPLX body models.
- Download the CMU Mocap dataset (MOCAP).
- Download the other dataset files from here.
Organize them as follows:
EgoGen
├── experiments/
|   ├── HMR/
|   └── data/
|       ├── datasets/
|       |   ├── cmu_mocap.npz
|       |   ├── egobody_depth_new_new/
|       |   |   ├── ...
|       |   ├── egobody_release/
|       |   |   ├── ...
|       |   ├── egobody_rgb_new/
|       |   |   ├── ...
|       ├── smpl/
|       |   ├── ...
|       ├── smplx_model/
|       |   ├── ...
|       ├── smplx_spin_npz/
|       |   ├── egocapture_test_smplx.npz
|       |   ├── egocapture_train_smplx.npz
|       |   ├── egocapture_val_smplx.npz
|       ├── smpl_mean_params.npz
|       ├── smplx_to_smpl.npz
Make sure you install the prohmr conda environment:
cd HMR
conda env create -f prohmr.yml
You can train your own models for egocentric human mesh recovery with both depth and RGB input. We also provide our pre-trained model here. You can download the pre-trained model and run the evaluation code.
# Pretrain the model on the EgoGen depth dataset.
python train_prohmr_depth_egobody.py --data_source synthetic --train_dataset_root /PATH/TO/egogen_depth --val_dataset_root /PATH/TO/egobody_release
# Finetune the model on the EgoBody depth dataset.
python train_prohmr_depth_egobody.py --load_pretrained True --checkpoint /PATH/TO/MODEL/ --data_source real --train_dataset_root /PATH/TO/egobody_release --val_dataset_root /PATH/TO/egobody_release
# Pretrain the model on the EgoGen RGB dataset.
python train_prohmr_egobody_rgb_smplx.py --data_source synthetic --train_dataset_root /PATH/TO/egogen_rgb --val_dataset_root /PATH/TO/egobody_release --train_dataset_file /PATH/TO/egogen_rgb.npz --val_dataset_file /PATH/TO/egobody_release_val.npz
# Finetune the model on the EgoBody RGB dataset.
python train_prohmr_egobody_rgb_smplx.py --load_pretrained True --checkpoint /PATH/TO/MODEL/ --data_source real --train_dataset_root /PATH/TO/egobody_release --val_dataset_root /PATH/TO/egobody_release --train_dataset_file /PATH/TO/egobody_release_train.npz --val_dataset_file /PATH/TO/egobody_release_val.npz
Run the following commands for evaluation:
python eval_regression_depth_egobody.py --checkpoint /PATH/TO/MODEL/best_model.pt --dataset_root /PATH/TO/egobody_release
python eval_regression_egobody_rgb_smplx.py --checkpoint /PATH/TO/MODEL/best_model.pt --dataset_root /PATH/TO/egobody_release