We use PartSlip as our part segmentation model. As mentioned in the paper, the model itself can be substituted with more advanced work such as Point-SAM.
Important: If you stick with PartSlip, please use our modified version included in this repo instead of the official implementation. The official implementation is incompatible with recent CUDA and PyTorch releases, because the TH/THC namespaces it relies on were deprecated in favor of ATen.
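If you are unsure which PyTorch and CUDA combination your environment is running, you can check before building anything (a generic sanity check we suggest, not part of the original steps):

```sh
# Print the installed PyTorch version and the CUDA version it was compiled against.
python -c "import torch; print(torch.__version__, torch.version.cuda)"
```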
We will install everything in the `fusionsense` environment.
```sh
conda activate fusionsense
conda install boost eigen
pip install yacs nltk inflect einops prettytable ftfy openai
```
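Optionally, verify that the Python dependencies import cleanly (a check of our own; each package's import name matches its pip name here):

```sh
python -c "import yacs, nltk, inflect, einops, prettytable, ftfy, openai"
```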
Next, install PyTorch3D. In our experience, building from the GitHub source is the most reliable method:
pip install "git+https://github.com/facebookresearch/pytorch3d.git"
Then, enter the PartSlip folder to manually compile a few components:

```sh
cd PartSlip
```
Download the pre-trained model we need:

```sh
bash download_ckpts.sh
```
Then we install GLIP, a key dependency:

```sh
cd GLIP
python setup.py build develop --user
```
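GLIP installs itself under the `maskrcnn_benchmark` package name, so (as an optional check of our own) a successful build can be verified with an import:

```sh
python -c "import maskrcnn_benchmark"
```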
Finally, we compile cut-pursuit for computing superpoints. First, set `CONDAENV` to the location of your conda environment:

```sh
CONDAENV=YOUR_CONDA_ENVIRONMENT_LOCATION
```

For example:

```sh
CONDAENV=/home/irving/miniconda3/envs/fusionsense
```
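Since the `fusionsense` environment is already activated at this point, you can also let conda supply the path for you; `$CONDA_PREFIX` always points at the root of the active environment:

```sh
# conda activate sets $CONDA_PREFIX to the active environment's root directory.
CONDAENV=$CONDA_PREFIX
```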
We are ready to compile the package. (If your environment uses a Python version other than 3.8, adjust the `python3.8` paths below accordingly.)

```sh
cd ../partition/cut-pursuit
mkdir build && cd build
cmake .. -DPYTHON_LIBRARY=$CONDAENV/lib/libpython3.8.so -DPYTHON_INCLUDE_DIR=$CONDAENV/include/python3.8 -DBOOST_INCLUDEDIR=$CONDAENV/include -DEIGEN3_INCLUDE_DIR=$CONDAENV/include/eigen3
make
```
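To confirm the build produced the Python extension, you can search the build tree for the generated shared library (where exactly it lands depends on the CMake layout, so we search rather than assume a path):

```sh
# Run from inside the build/ directory; the cut-pursuit module is a .so file.
find . -name "*.so"
```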
After this, we should have everything ready to perform Active Touch Selection.