This is the official repository for Action Recognition for Privacy-Preserving Ambient Assisted Living, our paper published at the International Conference on AI in Healthcare (AIiH 2024).
The paper can be found here
- Python >= 3.6
- PyTorch >= 1.1.0
- PyYAML, tqdm

Run `pip install -e torchlight`
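A quick way to confirm the prerequisites above is a small version check (a hypothetical helper, not part of this repository):

```python
import sys

def meets_minimum(version: str, minimum: tuple) -> bool:
    """Return True if a dotted version string satisfies a (major, minor) minimum."""
    parts = tuple(int(x) for x in version.split(".")[:2])
    return parts >= minimum

# The codebase itself needs Python >= 3.6.
assert sys.version_info >= (3, 6), "Python >= 3.6 required"

# Check an installed PyTorch against the 1.1.0 requirement, if present.
try:
    import torch
    print(meets_minimum(torch.__version__, (1, 1)))
except ImportError:
    print("PyTorch not installed")
```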
NTU RGB+D 60 Skeleton
NW-UCLA
- NTU RGB+D 60: Download the Skeleton dataset here
- NW-UCLA: Download the dataset here
Put the downloaded data into the following directory structure:

- data/
  - NW-UCLA/
    - all_sqe
      ...
  - ntu/
    - nturgbd_raw/
      - nturgb+d_skeletons
        ...
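The layout can be sanity-checked with a short script (a hypothetical helper; the path names are taken from the tree above):

```python
from pathlib import Path

# Expected data directories, relative to the repository root (from the tree above).
EXPECTED_DIRS = [
    "data/NW-UCLA/all_sqe",
    "data/ntu/nturgbd_raw/nturgb+d_skeletons",
]

def missing_dirs(root: str = ".") -> list:
    """Return the expected data directories that are absent under root."""
    return [d for d in EXPECTED_DIRS if not (Path(root) / d).is_dir()]

if __name__ == "__main__":
    print(missing_dirs() or "all data directories present")
```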
NW-UCLA dataset
Move the folder `all_sqe` to `./data/NW-UCLA`.
NTU RGB+D 60 dataset
First, extract all skeleton files to ./data/ntu/nturgbd_raw
cd ./data/ntu
# Get the skeleton of each performer
python get_raw_skes_data.py
# Remove the bad skeleton
python get_raw_denoised_data.py
# Transform the skeleton to the centre of the first frame
python seq_transformation.py
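The last step, `seq_transformation.py`, translates each sequence so it is centred on the first frame; the core idea can be sketched as follows (an illustrative sketch, not the repository's exact code):

```python
def centre_to_first_frame(frames, root_joint=0):
    """Shift a skeleton sequence so the first frame's root joint sits at the origin.

    frames: list of frames, each a list of (x, y, z) joint coordinates.
    """
    ox, oy, oz = frames[0][root_joint]  # origin: root joint in frame 0
    return [[(x - ox, y - oy, z - oz) for (x, y, z) in frame] for frame in frames]

# Two frames, two joints: after centring, frame 0's root joint is at the origin.
seq = [[(1.0, 1.0, 1.0), (2.0, 2.0, 2.0)],
       [(3.0, 3.0, 3.0), (4.0, 4.0, 4.0)]]
print(centre_to_first_frame(seq)[0][0])  # (0.0, 0.0, 0.0)
```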
For cross-view, run python main.py --device 0 1 --config ./config/nturgbd-cross-view/default.yaml
For cross-subject, run python main.py --device 0 1 --config ./config/nturgbd-cross-subject/default.yaml
Run python main.py --device 0 1 --config ./config/ucla/nw-ucla.yaml
For cross-view, run python main.py --device 0 1 --config ./config/nturgbd-cross-view/default.yaml --phase test --weights path_to_model_weight
For cross-subject, run python main.py --device 0 1 --config ./config/nturgbd-cross-subject/default.yaml --phase test --weights path_to_model_weight
Run python main.py --device 0 1 --config ./config/ucla/nw-ucla.yaml --phase test --weights path_to_model_weight
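The three evaluation commands above differ only in the config file and checkpoint; a small helper can assemble them (a hypothetical convenience, not part of the repo; the `./checkpoints/...` paths are placeholders for your trained weights):

```python
# Config paths are taken verbatim from the commands above.
CONFIGS = {
    "ntu-cross-view": "./config/nturgbd-cross-view/default.yaml",
    "ntu-cross-subject": "./config/nturgbd-cross-subject/default.yaml",
    "nw-ucla": "./config/ucla/nw-ucla.yaml",
}

def eval_command(name: str, weights: str) -> str:
    """Return the main.py test-phase invocation for one benchmark."""
    return (f"python main.py --device 0 1 --config {CONFIGS[name]} "
            f"--phase test --weights {weights}")

for name in CONFIGS:
    print(eval_command(name, f"./checkpoints/{name}.pt"))
```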
An environment with no occlusions
An environment with occlusions and noise
@InProceedings{10.1007/978-3-031-67285-9_15,
author="Zakka, Vincent Gbouna
and Dai, Zhuangzhuang
and Manso, Luis J.",
editor="Xie, Xianghua
and Styles, Iain
and Powathil, Gibin
and Ceccarelli, Marco",
title="Action Recognition for Privacy-Preserving Ambient Assisted Living",
booktitle="Artificial Intelligence in Healthcare",
year="2024",
publisher="Springer Nature Switzerland",
address="Cham",
pages="203--217",
isbn="978-3-031-67285-9"
}