The overarching goal of this framework is to enable a robot manipulator to perform automatic ultrasound imaging.
As a first step, a simulation environment for experimenting with soft contacts has been created. Using reinforcement learning, the goal of the ultrasound task is to teach a robot to perform a sweep across the surface of a soft body, while both exerting a desired force and keeping a constant velocity.
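As a rough illustration of the learning objective, the reward can be thought of as penalizing deviations from both the desired contact force and the target sweep velocity. The following is only a minimal sketch; the function name, weights and target values are hypothetical and not taken from the framework's actual reward.

```python
import numpy as np

def sweep_reward(contact_force, probe_velocity,
                 desired_force=5.0, target_speed=0.05,
                 w_force=1.0, w_vel=1.0):
    """Illustrative reward: penalize deviation from the desired contact
    force and from the target sweep speed (all values are hypothetical)."""
    force_error = abs(contact_force - desired_force)
    speed_error = abs(np.linalg.norm(probe_velocity) - target_speed)
    return -(w_force * force_error + w_vel * speed_error)
```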
The framework has been tested with Ubuntu 20.04 and Python 3.8.
Download MuJoCo 2.0 and unzip its contents into ~/.mujoco/mujoco200. A license key can be obtained from here. Copy your MuJoCo license key into ~/.mujoco/mjkey.txt. The finalized folder structure should look like
~/.mujoco
│ mjkey.txt
└───mujoco200
│ │ bin
│ │ doc
│ │ include
│ │ model
│ │ sample
Lastly, add the following line to the bottom of ~/.bashrc
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:<path_to_home>/.mujoco/mujoco200/bin
One of the Python packages, mujoco-py, has additional system dependencies. To install these dependencies, run the following:
sudo apt install libosmesa6-dev libgl1-mesa-glx libglfw3 patchelf
For further information, check out the mujoco-py installation guide.
To avoid package conflicts, it is recommended to create a virtual environment. The environment can be created using either conda or pip.
First, follow the guidelines on how to install Miniconda. The virtual environment can then be created by running
conda env create -f ./assets/docs/conda_env.yaml
To activate the virtual environment, run
conda activate ultrasound_scan
Alternatively, a virtual environment can be set up with pip by running
sudo apt install python-virtualenv
python3 -m venv venv
The virtual environment can be activated with
source venv/bin/activate
The required packages can then be installed with
pip3 install wheel
pip3 install -r ./assets/docs/requirements.txt
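Assuming mujoco-py is among the packages listed in requirements.txt, the installation can be sanity-checked with a short smoke test such as the one below. It loads one of the sample models shipped with MuJoCo 2.0; adjust the path if your layout differs.

```python
import os
from mujoco_py import load_model_from_path, MjSim

# Load one of the sample models shipped with MuJoCo 2.0.
model_path = os.path.expanduser("~/.mujoco/mujoco200/model/humanoid.xml")
model = load_model_from_path(model_path)
sim = MjSim(model)

# Step the simulation a few times to verify that the bindings work.
for _ in range(10):
    sim.step()
print("mujoco-py is working, qpos shape:", sim.data.qpos.shape)
```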
It is possible to train an RL agent to perform the ultrasound task; the framework has been integrated with the PPO algorithm from stable-baselines. Different settings (e.g. controller specifications) can be specified in rl_scan.yaml. Note that the config file is not complete: there are numerous other settings and hyperparameters that are not specified in the file. For these parameters, the default values are used.
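Conceptually, the settings given in rl_scan.yaml are applied on top of the defaults, roughly as sketched below. The keys shown (train, total_timesteps, controller) are illustrative and not necessarily the names used by the framework.

```python
import yaml

# Illustrative defaults; the framework defines its own values for any
# setting or hyperparameter that is not listed in rl_scan.yaml.
defaults = {
    "train": True,
    "total_timesteps": 1_000_000,
    "controller": {"type": "impedance", "kp": 150.0},
}

with open("rl_scan.yaml") as f:
    user_config = yaml.safe_load(f) or {}

# Values from the file override the defaults (shallow merge for brevity).
config = {**defaults, **user_config}
print(config)
```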
Training (or running) an agent is as simple as running
python3 rl_scan.py
Whether to train an agent or evaluate a trained agent is specified in rl_scan.yaml.
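Under the hood, the script follows the usual stable-baselines PPO workflow. The simplified sketch below conveys the idea; the environment id, model name and the hard-coded TRAIN flag are placeholders rather than the framework's actual names (in the framework, the flag and hyperparameters come from rl_scan.yaml).

```python
import gym
from stable_baselines import PPO2
from stable_baselines.common.policies import MlpPolicy

TRAIN = True                   # placeholder; read from rl_scan.yaml in the framework
env = gym.make("Pendulum-v0")  # placeholder; the framework provides its own ultrasound env

if TRAIN:
    model = PPO2(MlpPolicy, env, verbose=1)
    model.learn(total_timesteps=100_000)
    model.save("ppo_scan")     # hypothetical output name
else:
    model = PPO2.load("ppo_scan")
    obs = env.reset()
    for _ in range(1000):
        action, _states = model.predict(obs, deterministic=True)
        obs, reward, done, info = env.step(action)
        if done:
            obs = env.reset()
```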