TE-dependent analysis (`tedana`) is a Python library for denoising multi-echo functional magnetic resonance imaging (fMRI) data.
`tedana` originally came about as a part of the ME-ICA pipeline, although it has since diverged.
An important distinction is that while the ME-ICA pipeline originally performed both pre-processing and TE-dependent analysis of multi-echo fMRI data, `tedana` now assumes that you are working with data that have already been preprocessed.
More information and documentation can be found at https://tedana.readthedocs.io.
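For a sense of what this looks like in practice, here is a minimal sketch of running the TE-dependent analysis workflow from Python on previously preprocessed multi-echo data. The file names and echo times below are hypothetical placeholders; see the documentation above for the full set of options.

```python
from tedana import workflows

# Hypothetical preprocessed echo-wise images and their echo times (in ms);
# substitute the files and acquisition parameters from your own study.
echo_files = [
    "sub-01_task-rest_echo-1_desc-preproc_bold.nii.gz",
    "sub-01_task-rest_echo-2_desc-preproc_bold.nii.gz",
    "sub-01_task-rest_echo-3_desc-preproc_bold.nii.gz",
]
echo_times = [14.5, 38.5, 62.5]

# Run the TE-dependent analysis workflow; denoised outputs and reports
# are written to out_dir.
workflows.tedana_workflow(
    data=echo_files,
    tes=echo_times,
    out_dir="tedana_outputs",
)
```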
If you use `tedana`, please cite the following papers, as well as our most recent Zenodo release:
- DuPre, E. M., Salo, T., Ahmed, Z., Bandettini, P. A., Bottenhorn, K. L., Caballero-Gaudes, C., Dowdle, L. T., Gonzalez-Castillo, J., Heunis, S., Kundu, P., Laird, A. R., Markello, R., Markiewicz, C. J., Moia, S., Staden, I., Teves, J. B., Uruñuela, E., Vaziri-Pashkam, M., Whitaker, K., & Handwerker, D. A. (2021). TE-dependent analysis of multi-echo fMRI with tedana. Journal of Open Source Software, 6(66), 3669. doi:10.21105/joss.03669.
- Kundu, P., Inati, S. J., Evans, J. W., Luh, W. M., & Bandettini, P. A. (2011). Differentiating BOLD and non-BOLD signals in fMRI time series using multi-echo EPI. NeuroImage, 60, 1759-1770.
- Kundu, P., Brenowitz, N. D., Voon, V., Worbe, Y., Vértes, P. E., Inati, S. J., Saad, Z. S., Bandettini, P. A., & Bullmore, E. T. (2013). Integrated strategy for improving functional connectivity mapping using multiecho fMRI. Proceedings of the National Academy of Sciences, 110, 16187-16192.
You'll need to set up a working development environment to use `tedana`.
To set up a local environment, you will need Python >= 3.8 along with several scientific Python packages (e.g., numpy, scipy, scikit-learn, nilearn, and nibabel), which pip will install automatically as dependencies.
You can then install `tedana` with
pip install tedana
When using `tedana`, you can optionally configure a conda environment.
We recommend using miniconda3.
After installing miniconda3, you can use the following commands to create an environment for `tedana`:
conda create -n ENVIRONMENT_NAME python=3 pip mdp numpy scikit-learn scipy
conda activate ENVIRONMENT_NAME
pip install nilearn nibabel
pip install tedana
`tedana` will then be available in your path.
This will also allow any previously existing `tedana` installations to remain untouched.
To exit this conda environment, use
conda deactivate
NOTE: Users of conda < 4.6 will need to use the soon-to-be-deprecated `source activate`/`source deactivate` commands rather than `conda activate`/`conda deactivate` for the activation and deactivation steps.
You can read more about managing conda environments and this discrepancy here.
You can confirm that `tedana` has installed successfully by launching a Python instance and running:
import tedana
You can check that it is available through the command line interface (CLI) with:
tedana --help
If no error occurs, `tedana` has been installed correctly in your environment!
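Once installation is verified, a typical command-line run passes the preprocessed echo-wise images and their echo times (in milliseconds). The file names, echo times, and output directory in this example are hypothetical placeholders:

tedana -d sub-01_echo-1_bold.nii.gz sub-01_echo-2_bold.nii.gz sub-01_echo-3_bold.nii.gz -e 14.5 38.5 62.5 --out-dir tedana_outputs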
If you aim to contribute to the `tedana` code base and/or documentation, please first read the developer installation instructions in our contributing section. You can then continue to set up your preferred development environment.
We love new contributors! To get started, check out our contributing guidelines and our developer's guide.
Want to learn more about our plans for developing `tedana`?
Have a question, comment, or suggestion?
Open or comment on one of our issues!
If you're not sure where to begin, feel free to pop into Mattermost and introduce yourself! We will be happy to help you find somewhere to get started.
If you don't want to receive lots of notifications, we send out newsletters approximately once per month through our Google Group mailing list. You can view the previous newsletters and/or sign up to receive future ones by joining at https://groups.google.com/g/tedana-newsletter.
We ask that all contributors to `tedana`, across all project-related spaces (including but not limited to GitHub, Mattermost, and project emails), adhere to our code of conduct.
Thanks goes to these wonderful people (emoji key):
This project follows the all-contributors specification. Contributions of any kind welcome! To see what contributors feel they've done in their own words, please see our contribution recognition page.