In this repository, we implement two diffusion (DDPM) models for robotic grasping. The first training script uses the Acronym dataset; the second uses the CONG dataset.
Authors - Shashwat Ghatiwala and Jingchao Xie, Technical University of Munich
Structure of the repository -
- Overfitting a diffusion model on all grasps of one object
- Unconstrained Grasp Diffusion Model
  - Dataset Preparation
  - Training
  - Visualizing the model's outputs
- Constrained Grasp Diffusion Model
  - Dataset Preparation
  - Training
  - Visualizing the model's outputs
- Evaluation using PyBullet
  - Unconstrained Model
  - Constrained Model
Before training on the entire dataset, we overfit a diffusion model on all grasps of a single object. The purpose of this is to check that our diffusion-based pipeline works correctly.
The code for this is in `overfit_1_object.ipynb`.
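The overfitting check boils down to a standard DDPM training loop on the grasp poses of one object. The sketch below is illustrative, not the notebook's actual code: the grasp encoding (a flat 7-D translation + quaternion vector), the MLP noise predictor, and all sizes are assumptions, and random vectors stand in for the real Acronym grasps.

```python
"""Minimal DDPM overfitting sketch. Assumptions: grasps are flat 7-D vectors
(3-D translation + quaternion) and the denoiser is a small MLP conditioned on
the timestep; the real notebook may differ in all of these choices."""
import torch
import torch.nn as nn

T = 100
betas = torch.linspace(1e-4, 0.02, T)           # linear noise schedule
alphas_bar = torch.cumprod(1.0 - betas, dim=0)  # cumulative product (alpha-bar)

# Stand-in for "all grasps of one object": 64 random 7-D poses.
grasps = torch.randn(64, 7)

# Noise-prediction MLP; the timestep is appended as an extra input feature.
model = nn.Sequential(nn.Linear(7 + 1, 128), nn.ReLU(), nn.Linear(128, 7))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(200):
    t = torch.randint(0, T, (grasps.shape[0],))
    noise = torch.randn_like(grasps)
    a = alphas_bar[t].unsqueeze(1)
    # Forward process q(x_t | x_0): interpolate between data and noise.
    noisy = a.sqrt() * grasps + (1 - a).sqrt() * noise
    pred = model(torch.cat([noisy, t.float().unsqueeze(1) / T], dim=1))
    loss = ((pred - noise) ** 2).mean()  # epsilon-prediction MSE objective
    opt.zero_grad()
    loss.backward()
    opt.step()
```

When the pipeline is wired up correctly, this loss should drop steadily on a single object's grasps, which is exactly the sanity signal the overfitting experiment looks for.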
Dataset used for grasps - Acronym Dataset
Dataset used for meshes - ShapeNetSem meshes
The code to preprocess the Acronym dataset is in `utils/prepare_unconstrained_dataset.ipynb`.
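Preprocessing of this kind typically means reading each object's HDF5 grasp file and keeping only the successful grasps. The key names below follow the public Acronym release, but treat them (and the filtering criterion) as assumptions about what the notebook does; a tiny synthetic file is written first so the snippet runs stand-alone.

```python
"""Illustrative Acronym-style preprocessing: filter grasp transforms by their
success label. The HDF5 key names mirror the public Acronym release but are
assumptions here; the file itself is synthetic."""
import os
import tempfile

import h5py
import numpy as np

path = os.path.join(tempfile.mkdtemp(), "Mug_fake_scale.h5")

# Write a synthetic stand-in file: 10 identity grasp poses, 6 marked successful.
with h5py.File(path, "w") as f:
    f["grasps/transforms"] = np.tile(np.eye(4), (10, 1, 1))
    f["grasps/qualities/flex/object_in_gripper"] = np.array([1] * 6 + [0] * 4)

# Read it back the way a preprocessing script would.
with h5py.File(path, "r") as f:
    transforms = f["grasps/transforms"][:]  # (N, 4, 4) gripper poses
    ok = f["grasps/qualities/flex/object_in_gripper"][:].astype(bool)

successful = transforms[ok]  # keep only the positive grasps
print(successful.shape)      # (6, 4, 4)
```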
See `train_unconstrained_model.py`.
See `visualize_unconstrained_model.ipynb`.
Dataset used for constrained grasps - CONG Dataset
The main zipfile is also available on HuggingFace.
The code to preprocess the CONG dataset is in `utils/prepare_constrained_dataset.ipynb`.
You can check that the preprocessing was done correctly by visualizing the mask with `utils/visualize_constrained_data.ipynb`.
See `train_constrained_model.py`.
See `visualize_constrained_model.ipynb`.
We use PyBullet for model evaluation; see the `evaluation` folder for the corresponding files.
`evaluate_grasps_pybullet.py` takes the grasps generated by the model and evaluates their feasibility using the Franka Panda gripper.
Our experimental results are below:
Acknowledgements -
- Noise Scheduler and Positional Embeddings from tiny-diffusion.
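The positional (timestep) embedding borrowed from tiny-diffusion is the standard Transformer-style sinusoidal encoding; the sketch below shows the idea, with the dimension, frequency base, and function name chosen for illustration rather than copied from that repository.

```python
"""Sinusoidal timestep embedding, Transformer-style: each integer timestep
is mapped to a vector of sines and cosines at geometrically spaced
frequencies. Sizes and the 10000 base are illustrative conventions."""
import math

import torch

def timestep_embedding(t, dim):
    """Embed integer timesteps t (shape [N]) into [N, dim] sin/cos features."""
    half = dim // 2
    # Frequencies decay geometrically from 1 down to ~1/10000.
    freqs = torch.exp(-math.log(10000.0) * torch.arange(half) / half)
    args = t.float().unsqueeze(1) * freqs.unsqueeze(0)  # [N, half]
    return torch.cat([torch.sin(args), torch.cos(args)], dim=1)

emb = timestep_embedding(torch.arange(8), 32)
print(emb.shape)  # torch.Size([8, 32])
```

The denoising network receives this vector (usually after a small MLP) so that a single model can be conditioned on every diffusion timestep.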