
MOMA: A Multi-task Attention Learning Algorithm for Multi-omics Data Interpretation and Classification


MOMA is a multi-task attention learning model that provides a general classification framework for multi-omics data.
MOMA can capture important biological processes, yielding high diagnostic performance and interpretability.
The model vectorizes features and modules using a geometric approach, and focuses on important modules in multi-omics data via an attention mechanism.
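To make the idea concrete, here is a minimal sketch of module-level attention in PyTorch. This is an illustrative simplification, not the authors' implementation: features are projected into a fixed number of module vectors, an attention layer scores each module, and the attention-weighted sum feeds a classifier. All class and parameter names here are hypothetical.

```python
import torch
import torch.nn as nn

class ModuleAttentionSketch(nn.Module):
    """Hedged sketch of attention over feature modules (not the official MOMA code)."""

    def __init__(self, n_features, n_modules, module_dim, n_classes):
        super().__init__()
        # Vectorize input features into `n_modules` module vectors.
        self.to_modules = nn.Linear(n_features, n_modules * module_dim)
        self.n_modules, self.module_dim = n_modules, module_dim
        # One attention score per module vector.
        self.attn = nn.Linear(module_dim, 1)
        self.classifier = nn.Linear(module_dim, n_classes)

    def forward(self, x):
        modules = self.to_modules(x).view(-1, self.n_modules, self.module_dim)
        # Softmax over modules -> per-sample module importance weights.
        weights = torch.softmax(self.attn(modules).squeeze(-1), dim=1)
        # Attention-weighted pooling of module vectors.
        pooled = (weights.unsqueeze(-1) * modules).sum(dim=1)
        return self.classifier(pooled), weights
```

The returned `weights` tensor is what makes such a model interpretable: high-weight modules are the ones the classifier attended to for a given sample.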


We found a download error affecting the supplementary material, so we have uploaded the file here.


⭐⭐⭐
The appropriate hyperparameters differ across multi-modal datasets and tasks.
The following is an empirical priority order; at a minimum, include these in your tuning list:

Number of modules = 16, 32, 64, 128, ...
Learning rate = 5e-7, 5e-6, 5e-5, 5e-4, ... (with Adam on PyTorch)
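The recommended grid above can be searched with a simple loop. In this sketch, `train_and_evaluate` is a hypothetical stand-in for your own training routine (e.g. fitting the model with `torch.optim.Adam` at the given learning rate and returning a validation score); only the grid values come from the README.

```python
import itertools

# Tuning grid suggested in this README.
module_counts = [16, 32, 64, 128]
learning_rates = [5e-7, 5e-6, 5e-5, 5e-4]

def search(train_and_evaluate):
    """Return the best (n_modules, lr) pair and its validation score.

    `train_and_evaluate(n_modules=..., lr=...)` is a user-supplied callable.
    """
    best_config, best_score = None, -float("inf")
    for n_modules, lr in itertools.product(module_counts, learning_rates):
        score = train_and_evaluate(n_modules=n_modules, lr=lr)
        if score > best_score:
            best_config, best_score = (n_modules, lr), score
    return best_config, best_score
```

An exhaustive grid over these 16 combinations is cheap relative to training cost; for larger grids, a random or Bayesian search over the same ranges is a common alternative.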


MOMA workflow


Example

Check the dependencies in requirements.txt and install them by running:

pip install -r requirements.txt

Example code that uses MOMA to build classifiers on simulated data is included in the /Example/ folder and in /MOMA/MOMA_toy_example.ipynb.
