pyActigraphy
Open-source Python package for actigraphy and light exposure data visualization and analysis.
This package is meant to provide a comprehensive set of tools to:
- read native actigraphy data files in various formats (a reading and masking sketch follows this list):
- Actigraph: wGT3X-BT
- CamNtech: Actiwatch 4, 7, L(-Plus) and MotionWatch 8
- Condor Instrument: ActTrust 2
- Daqtix: Daqtometer
- Respironics: Actiwatch 2 and Actiwatch Spectrum (plus)
- Tempatilumi (CE Brasil)
- NEW read actigraphy data from the MESA dataset, hosted by the National Sleep Research Resource.
- NEW read actigraphy data files produced by the accelerometer package, which can be used to calibrate and convert raw accelerometer data recorded with:
- Axivity: AX3, device used by UK Biobank,
- Activinsights: GENEActiv, used by the Whitehall II study.
- NEW read light exposure data recorded by the aforementioned devices (when available)
- clean the raw data and mask spurious periods of inactivity
- produce activity profile plots
- visualize sleep agendas and compute summary statistics
- calculate typical wake/sleep cycle-related variables:
- Non-parametric rest-activity variables: IS(m), IV(m), RA
- Activity or Rest fragmentation: kRA, kAR
- Sleep regularity index (SRI)
- NEW compute light exposure metrics (TAT, MLiT^{500}, summary statistics, ...)
- automatically detect rest periods using various algorithms (Cole-Kripke, Sadeh, ..., Crespo, Roenneberg), as sketched after this list
- perform complex analyses (a Cosinor sketch follows this list):
- Cosinor analysis
- Detrended Fluctuation Analysis (DFA)
- Functional Linear Modelling (FLM)
- Locomotor Inactivity During Sleep (LIDS)
- Singular Spectrum Analysis (SSA)
- and much more...
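For example, reading a native file and masking spurious inactivity periods takes only a few lines. The following is a minimal sketch: the file path is a placeholder and the mask duration an arbitrary illustrative value; reader functions follow the pattern pyActigraphy.io.read_raw_<format> (e.g. read_raw_rpx, read_raw_atr, read_raw_mtn), so check the documentation for the reader matching your device:
>>> import pyActigraphy
>>> # Read a CamNtech Actiwatch (.AWD) recording (placeholder path):
>>> raw = pyActigraphy.io.read_raw_awd('/path/to/your/file.AWD')
>>> # Mask consecutive periods of zero activity lasting 2h or more
>>> # (illustrative threshold):
>>> raw.create_inactivity_mask(duration='2h00min')
>>> # Apply the mask so that subsequent computations ignore masked epochs:
>>> raw.mask_inactivity = True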
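Rest periods can then be scored with the detection algorithms listed above, which are exposed as methods of the recording object. A sketch reusing the raw object from the previous example (each call returns a binary rest/activity time series; availability may depend on the package version):
>>> ck = raw.CK()                  # Cole-Kripke scoring
>>> sadeh = raw.Sadeh()            # Sadeh scoring
>>> crespo = raw.Crespo()          # Crespo automatic rest detection
>>> roenneberg = raw.Roenneberg()  # Roenneberg sleep detection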
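As an illustration of the analysis sub-module, here is a Cosinor sketch. It assumes the Cosinor class and the lmfit-style parameter names (Mesor, Amplitude, Acrophase, Period) used in the package tutorials; the exact API may differ between versions:
>>> from pyActigraphy.analysis import Cosinor
>>> cosinor = Cosinor()
>>> # Least-squares fit of a cosine model to the actigraphy time series:
>>> results = cosinor.fit(raw, verbose=True)
>>> results.params['Amplitude'].value  # fitted parameters (lmfit Parameters)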
Citation
We are very pleased to announce that v1.0 of the pyActigraphy package has been published. If you find this package useful in your research, please consider citing:
Hammad G, Reyt M, Beliy N, Baillet M, Deantoni M, Lesoinne A, et al. (2021) pyActigraphy: Open-source python package for actigraphy data visualization and analysis. PLoS Comput Biol 17(10): e1009514. https://doi.org/10.1371/journal.pcbi.1009514
pyLight
In the context of the Daylight Academy project "The role of daylight for humans", and thanks to the support of its members Dr. Mirjam Münch and Prof. Manuel Spitschan, a pyActigraphy module for analysing light exposure data, pyLight, has been developed. This module is part of the Human Light Exposure Database and is included in pyActigraphy from version v1.1 onward.
A manuscript describing the pyLight module is available as a preprint.
Code and documentation
The pyActigraphy package is open-source and its source code is accessible online.
Online documentation is also available. It contains notebooks illustrating various functionalities of the package, as well as specific tutorials for the processing and analysis of light exposure data with pyLight.
Installation
In a (bash) shell, simply type:
- For users:
pip3 install pyActigraphy
To update the package:
pip3 install -U pyActigraphy
- For developers:
git clone [email protected]:ghammad/pyActigraphy.git
cd pyActigraphy/
git checkout develop
pip3 install -e .
Quick start
The following example illustrates how to calculate the interdaily stability with the pyActigraphy package:
>>> import pyActigraphy
>>> rawAWD = pyActigraphy.io.read_raw_awd('/path/to/your/favourite/file.AWD')
>>> rawAWD.IS()
0.6900175913031027
>>> rawAWD.IS(freq='30min', binarize=True, threshold=4)
0.6245582891144925
>>> rawAWD.IS(freq='1H', binarize=False)
0.5257020914453097
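The other rest-activity variables listed above are computed through the same interface; a brief sketch with return values omitted (method names mirror the variable names):
>>> rawAWD.IV()   # intradaily variability
>>> rawAWD.RA()   # relative amplitude
>>> rawAWD.kRA()  # rest-to-activity transition probability (rest fragmentation)
>>> rawAWD.kAR()  # activity-to-rest transition probability (activity fragmentation)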
Contributing
There are plenty of ways to contribute to this package, including (but not limited to):
- report bugs (and, ideally, how to reproduce them)
- suggest improvements
- improve the documentation
Authors
- Grégory Hammad @ghammad - Initial and main developer
- Mathilde Reyt @ReytMathilde
See also the list of contributors who participated in this project.
License
This project is licensed under the GNU GPL-3.0 License. See the LICENSE file for details.
Acknowledgments
- Aubin Ardois @aardoi developed the first version of the MTN class during his internship at the CRC, in May-August 2018.
- The CRC colleagues for their support, ideas, etc.