
# Generative Grasping CNN (GG-CNN)

The GG-CNN is a lightweight, fully-convolutional network which predicts the quality and pose of antipodal grasps at every pixel in an input depth image. The lightweight and single-pass generative nature of GG-CNN allows for fast execution and closed-loop control, enabling accurate grasping in dynamic environments where objects are moved during the grasp attempt.
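Concretely, the network produces per-pixel output maps from which the best grasp can be decoded. The sketch below is illustrative only: the map names and the (cos 2θ, sin 2θ) angle encoding are assumptions based on the paper, not this repository's exact API.

```python
import numpy as np

def decode_grasp(q_map, cos_map, sin_map, width_map):
    """Pick the best grasp from per-pixel GG-CNN-style output maps.

    q_map: grasp quality per pixel; cos_map/sin_map: (cos 2θ, sin 2θ)
    angle encoding; width_map: gripper width per pixel.
    (Map names are illustrative assumptions.)
    """
    # Best grasp centre = pixel with the highest predicted quality
    y, x = np.unravel_index(np.argmax(q_map), q_map.shape)
    # The (cos 2θ, sin 2θ) encoding makes θ unique in (-π/2, π/2]
    angle = 0.5 * np.arctan2(sin_map[y, x], cos_map[y, x])
    width = width_map[y, x]
    return (x, y), angle, width
```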

This repository contains the implementation of the Generative Grasping Convolutional Neural Network (GG-CNN) from the paper:

Closing the Loop for Robotic Grasping: A Real-time, Generative Grasp Synthesis Approach

Douglas Morrison, Peter Corke, Jürgen Leitner

Robotics: Science and Systems (RSS) 2018

arXiv | Video

If you use this work, please cite:

@inproceedings{morrison2018closing,
	title={Closing the Loop for Robotic Grasping: A Real-time, Generative Grasp Synthesis Approach},
	author={Morrison, Douglas and Corke, Peter and Leitner, J{\"u}rgen},
	booktitle={Robotics: Science and Systems (RSS)},
	year={2018}
}

## Contact

For any questions or comments, contact Doug Morrison.

## Installation

This code was developed with Python 2.7 on Ubuntu 16.04. Python requirements can be found in `requirements.txt`.

## Pre-trained Model

The pre-trained Keras model used in the RSS paper can be downloaded by running `download_pretrained_ggcnn.sh` in the `data` folder.
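Once downloaded, the model can be loaded with `keras.models.load_model` and run on a depth image. The input-preparation step below is only a sketch under assumed conventions (a centre crop to 300×300, zero-centred depth, NHWC batch layout); the repository's own preprocessing may differ.

```python
import numpy as np

def prepare_depth(depth, out_size=300):
    """Centre-crop and normalise a depth image for a Keras CNN.

    A sketch only: the crop size, zero-centring, and clamping here are
    assumptions, not necessarily this repo's exact preprocessing.
    """
    h, w = depth.shape
    top, left = (h - out_size) // 2, (w - out_size) // 2
    crop = depth[top:top + out_size, left:left + out_size].astype(np.float32)
    crop = np.clip(crop - crop.mean(), -1.0, 1.0)  # zero-centre, clamp outliers
    return crop.reshape(1, out_size, out_size, 1)  # NHWC batch for Keras
```

The resulting array can then be passed to `model.predict(...)` on the loaded model.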

## Training

To train your own GG-CNN:

  1. Download the Cornell Grasping Dataset by running `download_cornell.sh` in the `data` folder.
  2. Run `generate_dataset.py` to generate the manipulated dataset. Dataset creation settings can be specified in `generate_dataset.py`.
  3. Specify the path to the `INPUT_DATASET` in `train_ggcnn.py`.
  4. Run `train_ggcnn.py`.
  5. You can visualise the detected grasp outputs and evaluate against the ground-truth grasps of the Cornell Grasping Dataset by running `evaluate.py`.

## Running on a Robot

Our ROS implementation for running the grasping system on a Kinova Mico arm can be found in the repository https://github.com/dougsm/ggcnn_kinova_grasping.