UniUSNet: A Promptable Framework for Universal Ultrasound Disease Prediction and Tissue Segmentation
UniUSNet is a universal framework for ultrasound image classification and segmentation, featuring:
- A novel promptable module for incorporating detailed information into the model's learning process.
- Versatility across various ultrasound natures, anatomical positions, and input types.
- Proficiency in both segmentation and classification tasks.
- Strong generalization capabilities demonstrated through zero-shot and fine-tuning experiments on new datasets.
For more details, see the accompanying paper and the project page:
UniUSNet: A Promptable Framework for Universal Ultrasound Disease Prediction and Tissue Segmentation
Zehui Lin, Zhuoneng Zhang, Xindi Hu, Zhifan Gao, Xin Yang, Yue Sun, Dong Ni, Tao Tan. arXiv, Jun 3, 2024. https://arxiv.org/abs/2406.01154
- Clone this repository.

```bash
git clone https://github.com/Zehui-Lin/UniUSNet.git
cd UniUSNet
```

- Create a new conda environment.

```bash
conda create -n UniUSNet python=3.10
conda activate UniUSNet
```

- Install the required packages.

```bash
pip install -r requirements.txt
```
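Optionally, run a quick sanity check of the environment. This is a minimal sketch; it assumes `requirements.txt` installs PyTorch, which the training and testing commands below rely on.

```python
# Minimal environment check (assumes requirements.txt installs PyTorch).
import torch

print("PyTorch version:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
print("Visible GPUs:", torch.cuda.device_count())
```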
- BroadUS-9.7K consists of ten publicly available datasets: BUSI, BUSIS, UDIAT, BUS-BRA, Fatty-Liver, kidneyUS, DDTI, Fetal HC, CAMUS and Appendix.
- You can prepare the data by downloading the datasets and organizing them as follows:
```
data
├── classification
│   ├── UDIAT
│   │   ├── 0
│   │   │   ├── 000001.png
│   │   │   └── ...
│   │   ├── 1
│   │   │   ├── 000100.png
│   │   │   └── ...
│   │   ├── config.yaml
│   │   ├── test.txt
│   │   ├── train.txt
│   │   └── val.txt
│   └── ...
└── segmentation
    ├── BUSIS
    │   ├── config.yaml
    │   ├── imgs
    │   │   ├── 000001.png
    │   │   └── ...
    │   ├── masks
    │   │   ├── 000001.png
    │   │   └── ...
    │   ├── test.txt
    │   ├── train.txt
    │   └── val.txt
    └── ...
```
- Please refer to the `data_demo` folder for examples.
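If you need to create the `train.txt` / `val.txt` / `test.txt` split files yourself, the sketch below shows one way to do it for a segmentation dataset. The assumption that each line holds a single image filename is ours, not taken from the repository; treat `data_demo` as the authoritative format.

```python
# Hypothetical helper for writing train/val/test split files for one
# segmentation dataset. The exact line format expected by the loader may
# differ -- check data_demo; here we assume one image filename per line.
import random
from pathlib import Path

def write_splits(dataset_dir, train=0.7, val=0.1, seed=0):
    dataset_dir = Path(dataset_dir)
    images = sorted(p.name for p in (dataset_dir / "imgs").glob("*.png"))
    random.Random(seed).shuffle(images)
    n_train = int(len(images) * train)
    n_val = int(len(images) * val)
    splits = {
        "train.txt": images[:n_train],
        "val.txt": images[n_train:n_train + n_val],
        "test.txt": images[n_train + n_val:],
    }
    for name, items in splits.items():
        (dataset_dir / name).write_text("\n".join(items) + "\n")

write_splits("data/segmentation/BUSIS")
```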
We use `torch.distributed` for multi-GPU training (single-GPU training is also supported). To train the model, run the following command:
```bash
python -m torch.distributed.launch --nproc_per_node=1 --master_port=1234 omni_train.py --output_dir exp_out/trial_1 --prompt
```
To test the model, run the following command:
```bash
python -m torch.distributed.launch --nproc_per_node=1 --master_port=1234 omni_test.py --output_dir exp_out/trial_1 --prompt
```
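Both commands go through `torch.distributed.launch`, which spawns one process per GPU and hands each process its rank through environment variables. The sketch below is not the repository's code, only a generic illustration of what a spawned worker typically does with those variables; the backend choice and the default values are assumptions.

```python
# Generic sketch of the worker-side setup under torch.distributed.launch;
# MASTER_ADDR/MASTER_PORT, RANK, WORLD_SIZE and LOCAL_RANK are set by the
# launcher, so this is meant to run inside a launched process.
import os
import torch
import torch.distributed as dist

def init_distributed():
    rank = int(os.environ.get("RANK", 0))
    world_size = int(os.environ.get("WORLD_SIZE", 1))
    local_rank = int(os.environ.get("LOCAL_RANK", 0))
    backend = "nccl" if torch.cuda.is_available() else "gloo"
    dist.init_process_group(backend=backend, rank=rank, world_size=world_size)
    if torch.cuda.is_available():
        torch.cuda.set_device(local_rank)
    return rank, world_size, local_rank
```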
- You can download the pre-trained checkpoints from BaiduYun.
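Once a checkpoint is downloaded, you can inspect it with plain PyTorch before pointing the test command at it. This is a generic sketch; the filename `checkpoint.pth` and the wrapping-dict handling are assumptions, not the repository's documented format.

```python
# Generic PyTorch checkpoint inspection; "checkpoint.pth" is a placeholder.
import torch

state = torch.load("checkpoint.pth", map_location="cpu")
# Some checkpoints store a raw state_dict, others wrap it in a dict.
state_dict = state.get("state_dict", state) if isinstance(state, dict) else state
for name, value in list(state_dict.items())[:10]:
    shape = tuple(value.shape) if torch.is_tensor(value) else type(value).__name__
    print(name, shape)
```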
If you find this work useful, please consider citing:
```bibtex
@article{lin2024uniusnet,
  title={UniUSNet: A Promptable Framework for Universal Ultrasound Disease Prediction and Tissue Segmentation},
  author={Lin, Zehui and Zhang, Zhuoneng and Hu, Xindi and Gao, Zhifan and Yang, Xin and Sun, Yue and Ni, Dong and Tan, Tao},
  journal={arXiv preprint arXiv:2406.01154},
  year={2024}
}
```
This repository is based on the Swin-Unet repository. We thank the authors for their contributions.