
Feasibility of large-scale deployment of multiple wearable sensors in Parkinson's disease

  • Ana Lígia Silva de Lima ,

    Roles Conceptualization, Data curation, Formal analysis, Investigation, Visualization, Writing – original draft, Writing – review & editing

    ana.silvadelima@radboudumc.nl

    Affiliations Department of Neurology, Donders Institute for Brain, Cognition and Behaviour, Radboud University Medical Center, Nijmegen, The Netherlands, CAPES Foundation, Ministry of Education of Brazil, Brasília/DF, Brazil

  • Tim Hahn,

    Roles Conceptualization, Data curation, Formal analysis, Investigation, Visualization, Writing – original draft, Writing – review & editing

    Affiliation Department of Neurology, Donders Institute for Brain, Cognition and Behaviour, Radboud University Medical Center, Nijmegen, The Netherlands

  • Luc J. W. Evers,

    Roles Conceptualization, Data curation, Formal analysis, Investigation, Writing – original draft, Writing – review & editing

    Affiliation Department of Neurology, Donders Institute for Brain, Cognition and Behaviour, Radboud University Medical Center, Nijmegen, The Netherlands

  • Nienke M. de Vries,

    Roles Data curation, Investigation, Writing – review & editing

    Affiliation Department of Neurology, Donders Institute for Brain, Cognition and Behaviour, Radboud University Medical Center, Nijmegen, The Netherlands

  • Eli Cohen,

    Roles Conceptualization, Data curation, Software, Writing – review & editing

    Affiliation Intel, Advanced Analytics, Tel Aviv, Israel

  • Michal Afek,

    Roles Conceptualization, Data curation, Software, Writing – review & editing

    Affiliation Intel, Advanced Analytics, Tel Aviv, Israel

  • Lauren Bataille,

    Roles Conceptualization, Data curation, Formal analysis, Investigation, Resources, Writing – review & editing

    Affiliation The Michael J Fox Foundation for Parkinson’s Research, New York, United States of America

  • Margaret Daeschler,

    Roles Conceptualization, Data curation, Formal analysis, Investigation, Resources, Writing – review & editing

    Affiliation The Michael J Fox Foundation for Parkinson’s Research, New York, United States of America

  • Kasper Claes,

    Roles Writing – review & editing

    Affiliation UCB Biopharma, Brussels, Belgium

  • Babak Boroojerdi,

    Roles Writing – review & editing

    Affiliation UCB Biopharma, Brussels, Belgium

  • Dolors Terricabras,

    Roles Writing – review & editing

    Affiliation UCB Biopharma, Brussels, Belgium

  • Max A. Little,

    Roles Conceptualization, Writing – review & editing

    Affiliations Aston University, Birmingham, United Kingdom, Media Lab, Massachusetts Institute of Technology, Cambridge, United States of America

  • Heribert Baldus,

    Roles Writing – review & editing

    Affiliation Philips Research, Department Personal Health, Eindhoven, the Netherlands

  • Bastiaan R. Bloem,

    Roles Conceptualization, Funding acquisition, Investigation, Methodology, Project administration, Supervision, Writing – review & editing

    Affiliation Department of Neurology, Donders Institute for Brain, Cognition and Behaviour, Radboud University Medical Center, Nijmegen, The Netherlands

  • Marjan J. Faber

    Roles Conceptualization, Formal analysis, Funding acquisition, Investigation, Methodology, Project administration, Supervision, Writing – review & editing

    Affiliations Department of Neurology, Donders Institute for Brain, Cognition and Behaviour, Radboud University Medical Center, Nijmegen, The Netherlands, Radboud University Medical Center, Radboud Institute for Health Sciences, Scientific Center for Quality of Healthcare, Nijmegen, the Netherlands

Abstract

Wearable devices can capture objective, day-to-day data about Parkinson’s Disease (PD). This study aims to assess the feasibility of implementing wearable technology to collect data from multiple sensors during the daily lives of PD patients. The Parkinson@home study is an observational, two-cohort (North America, NAM; The Netherlands, NL) study. Different recruitment strategies were used at the two sites. Main enrolment criteria were self-reported diagnosis of PD, possession of a smartphone and age ≥18 years. Participants used the Fox Wearable Companion app on a smartwatch and smartphone for a minimum of 6 weeks (NAM) or 13 weeks (NL). Sensor-derived measures estimated information about movement. Additionally, medication intake and symptoms were collected via self-reports in the app. A total of 953 participants were included (NL: 304, NAM: 649). Enrolment rate was 88% in the NL (n = 304) and 51% (n = 649) in NAM. Overall, 84% (n = 805) of participants contributed sensor data. Participants were compliant for 68% of the study period (16.3 hours/participant/day) in the NL and for 62% (14.8 hours/participant/day) in NAM. Daily accelerometer data collection decreased by 23% in the NL after 13 weeks, and by 27% in NAM after 6 weeks. Data contribution was not affected by demographics, clinical characteristics or attitude towards technology, but was affected by the platform usability score in the NL (χ2(2) = 32.014, p<0.001) and by self-reported depression in NAM (χ2(2) = 6.397, p = 0.04). The Parkinson@home study shows that it is feasible to collect objective data using multiple wearable sensors in PD during daily life in a large cohort.

Introduction

Parkinson’s Disease (PD) is a common neurodegenerative disease in which patients experience both motor and non-motor symptoms[1]. Treatment is primarily based on the management of symptoms by increasing dopamine levels through pharmacological therapy or surgery[2, 3]. Additionally, non-pharmacological therapies, such as physiotherapy, occupational therapy or speech therapy, are available to support patients[4].

Although good results in the management of motor symptoms have been achieved, particularly in the early stages of the disease[5, 6], two major problems hamper long-term treatment. First, current pharmacological therapy is successful for a limited period. In the long term, most patients develop unmanageable motor complications that can lead to worsening of quality of life[7]. Second, evaluation of day-to-day variations in PD symptoms is difficult when relying solely upon periodic consultations by clinicians[8]. Therefore, more detailed, objective and reliable measures during daily living could potentially improve the management of PD.

Wearable sensors have been used to assess PD-related symptoms continuously and longitudinally during daily living[9–12]. Wearables may provide greater insight into a patient’s disease status, allowing patients to self-manage their symptoms and monitor medication responses[13–18]. Furthermore, wearable sensor data may improve our scientific understanding of disease progression by showing changes in motor and non-motor symptoms over time, furthering the development of digital biomarkers for disease progression[19].

While the potential value of wearable sensors for disease management and research is becoming increasingly clear, various critical aspects of feasibility remain to be determined. Only a few studies have rigorously investigated the feasibility and acceptability of using a wearable platform comprising a smartphone in combination with a smartwatch. Moreover, prior findings have been limited by small sample sizes (largest sample thus far: 40 PD patients) [9, 13, 17, 18]. Therefore, we aimed to investigate the feasibility of using a wearable platform in a much larger sample of PD patients, with a focus on recruitment success, attrition rates, user compliance and system usability.

Methods

Between August 2015 and November 2016, a total of 953 PD patients from two cohorts (n = 304 in The Netherlands (NL) and n = 649 in North America (United States and Canada; NAM)) participated in the Parkinson@Home feasibility study. To investigate the feasibility of the technology in different contexts, both cohorts used the same wearable platform, but had distinct strategies for recruitment, retention and study period. These topics are described separately (overview in Table 1).

Table 1. Study design and procedure overview at the two cohorts.

https://doi.org/10.1371/journal.pone.0189161.t001

Study design and population

The NL cohort.

The population and study design applied in the NL are described in detail elsewhere.[20] In short, participants were recruited from support groups, internet communities and through physiotherapists specialized in treating PD patients. Enrolment criteria were: (1) ≥30 years of age, (2) possession of a smartphone using an Android OS version ≥ 4.2 and (3) self-reported diagnosis of PD. No exclusion criteria were applied beyond enrolment criteria.

All enrolled participants received a single medical examination, based on the “Parkinson's Progression Markers Initiative” (PPMI)[21]. This included the full MDS-UPDRS[22], the Montreal Cognitive Assessment (MoCA)[23], and the Modified Schwab and England Activities of Daily Living Scale[24]. The medical examination was performed by specially trained physiotherapists who are members of ParkinsonNet[25], a Dutch network of health professionals specialised in PD management. At the end of the 13-week study period, all enrolled participants evaluated the usability of the system using the System Usability Scale (SUS)[26, 27] and were asked about their ability to use a smartphone (S1 File). Finally, participants had the option to continue using the platform or to return the Pebble smartwatch.

The NAM cohort.

Study recruitment for the NAM cohort was entirely virtual, through direct emails to subjects participating in the Fox Insight online study, Facebook advertisements to targeted populations, and advertisements on Fox Trial Finder, a clinical trial matching tool for people with PD[28]. For the NAM cohort, the enrolment criteria differed from the NL as follows: (1) ≥18 years of age and (2) participation in the Fox Insight Online Study[29] (Table 1).

In order to enrol, interested participants first had to register in the Fox Insight study (if they had not done so already). Through Fox Insight, each participant completed online surveys about demographics, medical history, cognition, physical activity, symptoms, and PD-related medications and surgeries. Once enrolled in Fox Insight, participants were eligible to register for the NAM cohort of the Parkinson@Home study on a separate webpage. These users completed an online enrolment form, which was reviewed by the study team to determine eligibility. All study registrants received an email confirming their eligibility or non-eligibility. After finishing the 6-week study period, participants had the option to continue using the platform.

Wearable platform

The Intel® Pharma Analytics Platform used in this study has been described in detail elsewhere[20, 30]. Briefly, it consists of the Fox Wearable Companion app, used on both a smartwatch and smartphone, and a cloud environment. In this study, a Pebble smartwatch was used together with the patients’ Android phones. Accelerometer data were collected continuously at 50 Hz from the smartwatch and streamed to the smartphone.

Sensor analysis algorithms are applied in the app to the aggregated (30-second interval) smartwatch accelerometer data to estimate outcomes (i.e. levels of activity, tremor and movement during sleep). These estimated quantities are transmitted, via Wi-Fi or mobile data, to a cloud environment. They are also presented to the user as graphs and summary reports within the app. Additionally, users are able to set medication reminders, report actual medication intake and rate their symptoms (e.g. tremor, dyskinesia, rigidity, bradykinesia) within the mobile app (Fig 1). Both estimated outcomes and patient-reported outcomes (PROs) are stored in the cloud environment.
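As an illustration of this aggregation step, the sketch below windows 50 Hz tri-axial accelerometer samples into 30-second intervals and derives a crude activity value per window. The windowing parameters come from the description above; the activity metric itself is a simplified stand-in and is not the proprietary Intel algorithm.

```python
import numpy as np

FS = 50        # smartwatch accelerometer sampling rate in Hz (as described above)
WINDOW_S = 30  # aggregation interval in seconds (as described above)

def activity_per_window(acc: np.ndarray) -> np.ndarray:
    """acc: (n_samples, 3) array of x/y/z acceleration in g.
    Returns one crude activity value per complete 30-second window."""
    spw = FS * WINDOW_S                           # samples per window
    n_windows = len(acc) // spw
    windows = acc[: n_windows * spw].reshape(n_windows, spw, 3)
    magnitude = np.linalg.norm(windows, axis=2)   # per-sample acceleration magnitude
    return np.abs(magnitude - 1.0).mean(axis=1)   # mean deviation from 1 g per window

# Example: 5 minutes of simulated near-resting data yields 10 windows.
simulated = np.random.normal(loc=[0.0, 0.0, 1.0], scale=0.02, size=(FS * 300, 3))
print(activity_per_window(simulated).shape)  # (10,)
```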

Fig 1. (a) Fox Wearable Companion app main screen; (b) Fox Wearable Companion app activity graph; (c) Fox Wearable Companion app movement during sleep graph; (d) Fox Wearable Companion app symptom self-reports.

Reprinted from [Intel and Michael J Fox Foundation] under a CC BY license, with permission from [INTEL®], original copyright [2017].

https://doi.org/10.1371/journal.pone.0189161.g001

Study procedures at both cohorts

Participants from both cohorts provided electronic consent and received a research kit containing a Pebble smartwatch, an installation guide and user manuals. Next, participants installed the Fox Wearable Companion app on their devices and were asked to wear the smartwatch and keep their smartphone with them as much as possible, either on a 24/7 basis for the 13-week study period (NL) or for a minimum of 5 hours a day, 7 days a week, for the 6-week study period (NAM). Additionally, participants reported their medication intake (i.e. medication name and dose) and PD symptom severity using the app. A helpline was available during the study period for technical support. Support calls or emails were sent to participants from whom no data were collected for more than seven consecutive days.

Outcome definitions and statistical analysis

Feasibility assessment included recruitment, attrition, compliance and system usability. Recruitment success was analysed by (1) the total number of enrolled participants and (2) the number of eligible registrants who did not complete the informed consent. Compliance, similar to previous studies[31, 32], was calculated as the median percentage of the study period during which accelerometer data were collected. Attrition, based upon Eysenbach et al.[33], was measured by (1) the decrease in the daily percentage of collected accelerometer data during each study period and (2) the decrease in the number of participants contributing accelerometer data. Finally, system usability was measured by the median total score on the System Usability Scale.
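A minimal sketch of this compliance definition, assuming per-participant totals of hours with accelerometer data (the participant IDs and values below are hypothetical):

```python
import pandas as pd

# 13-week study period (NL) expressed in hours; the NAM equivalent would be 6 * 7 * 24.
STUDY_HOURS_NL = 13 * 7 * 24  # 2,184 hours

def compliance_pct(hours_with_data: pd.Series, study_hours: float) -> pd.Series:
    """Percentage of the study period with accelerometer data, per participant."""
    return 100 * hours_with_data / study_hours

# Hypothetical participants and their total hours of collected accelerometer data.
hours = pd.Series({"P001": 1478.0, "P002": 888.0, "P003": 1827.0})
per_participant = compliance_pct(hours, STUDY_HOURS_NL)
print(per_participant.round(1))   # e.g. P001 ≈ 67.7%
print(per_participant.median())   # cohort compliance = median across participants
```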

We investigated the relationship between self-reported demographics, clinical data, ability to use a smartphone (S1 File), System Usability score and the percentage of accelerometer data collected, to identify factors that influence compliance levels. Participant demographic and clinical characteristics were grouped into categories either following previously described literature (presence of depression[34]; presence of cognitive impairment[23]) or by convenience (age; educational level, based on the last completed level of education, where low corresponds to high school or lower, middle to a bachelor’s degree, and high to a master’s degree or higher; Hoehn & Yahr stage; and Modified Schwab and England scale). Because compliance was not normally distributed, the median and quartiles were used to divide participants into three compliance groups (low, middle and high): the first quartile was the cut-off for the low-compliance group and the third quartile for the high-compliance group. Depending on the distribution of the other variables in the analysis, Chi-square, Fisher’s exact or Kruskal-Wallis tests were used to investigate significant differences between compliance groups with respect to demographics, clinical characteristics, ability to use a smartphone and System Usability score.
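The quartile-based grouping and the between-group comparison might look like the following sketch (synthetic data; the column names are assumptions, and SciPy’s Kruskal-Wallis implementation stands in for the tests named above):

```python
import numpy as np
import pandas as pd
from scipy.stats import kruskal

# Synthetic stand-in data: per-participant compliance (%) and SUS total score.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "compliance": rng.uniform(0, 100, 300),
    "sus_score": rng.uniform(30, 95, 300),
})

# Low / middle / high compliance groups, split at the 1st and 3rd quartiles.
q1, q3 = df["compliance"].quantile([0.25, 0.75])
df["group"] = pd.cut(df["compliance"], bins=[-np.inf, q1, q3, np.inf],
                     labels=["low", "middle", "high"])

# Kruskal-Wallis comparison of SUS scores across the three compliance groups.
groups = [g["sus_score"].to_numpy() for _, g in df.groupby("group", observed=True)]
h_stat, p_value = kruskal(*groups)
print(f"Kruskal-Wallis chi2(2) = {h_stat:.3f}, p = {p_value:.4f}")
```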

Ethics standards

This study was conducted in compliance with the Ethical Principles for Medical Research Involving Human Subjects, as defined in the Declaration of Helsinki. The study protocol and communication materials were approved by the local ethics committee (NL: CMO Arnhem-Nijmegen; NL53034.091.15; NAM: New England IRB: 15–046).

Results

Recruitment and sample characteristics

In the NL cohort, 347 eligible PD patients were invited to participate. Among those invited, 43 refused to participate. The main refusal reasons were “Study protocol seems too burdensome” (44%, n = 19), followed by “Personal circumstances” (33%, n = 14). A total of 304 patients (enrolment rate = 88%) were enrolled.

In the NAM cohort, of the 866 Fox Insight participants who received a direct invitation to participate, 306 were enrolled (6% were ineligible). An additional 344 participants were included via the remaining recruitment channels, with varied ineligibility rates. A total of 649 registrants (enrolment rate = 51%) were enrolled.

Across both cohorts, 953 participants were enrolled. Of these, 805 were data contributors (participants who contributed at least one accelerometer data point during the study period). Compared to the NAM cohort, the NL cohort included more men (χ2(1) = 9.5146, p<0.01) and participants who were older (χ2(2) = 16.435, p = 0.001) and more highly educated (χ2(2) = 25.270, p<0.001). The characteristics of all participants are presented in Table 2.

Table 2. Demographic and disease related characteristics of participants.

https://doi.org/10.1371/journal.pone.0189161.t002

Technical support to participants

In both cohorts, the helpdesk consisted of two research assistants, available for 20 hours (NL) and 40 hours (NAM) per week. The actual workload was dependent on: (1) the number of participants simultaneously enrolled in the trial; and (2) the occurrence of bugs in the app or server downtime. The most frequent and time-consuming problems were: (1) Bluetooth disconnection between the smartwatch and the smartphone and (2) questions regarding the medication report, especially in the first weeks of participation.

Compliance

Across both cohorts, 85% (n = 805 of 953 enrolled) of participants were data contributors. In the NL, 291 data contributors collected data for a median of 1,478 hours each over the 13-week period, with quartile ranges (1st and 3rd QR) of 888 to 1,827 hours. In NAM, 514 data contributors collected a median of 621 hours each (1st QR: 286; 3rd QR: 828 hours) during the 6-week period. Compliance rates were 68% (1st and 3rd QR: 41%–83%), equal to 16.3 hours/participant/day, in the NL and 62% (1st and 3rd QR: 28%–82%), equal to 14.8 hours/participant/day, in NAM (Fig 2).
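The hours-per-day figures follow directly from the compliance percentages, assuming a 24-hour target day; for the NL, for example:

\[ 0.68 \times 24\,\text{h} \approx 16.3\ \text{hours/participant/day} \]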

Fig 2. Distribution of compliance among all enrolled participants in the NL (n = 304, black) and NAM (n = 649, white) study cohorts.

https://doi.org/10.1371/journal.pone.0189161.g002

Attrition

In the NL, 13 participants (4% of all NL enrolled participants) did not contribute any data during the study period and were thus non-compliant. In NAM, this number was 135 (21% of all NAM enrolled participants). Additionally, 82 data contributors in the NL (27% of all enrolled) became non-compliant during the study period. The primary known reasons (n = 47) were “Personal circumstances” (38%, n = 18) and “System too complex/system-related issues” (34%, n = 16). In the NAM cohort, 89 data contributors (17% of all enrolled) became non-compliant during the study period, although the reasons were unknown.

The attrition in the median percentage of sensor data collected daily varied between cohorts. In the NL, the attrition rate was 23% after the 13-week study period. In NAM, attrition was 27% after the 6-week study period (Fig 3).

Fig 3. Attrition in compliance per day for NL (n = 291, black) and NAM participants (n = 514, gray) during the follow up period.

https://doi.org/10.1371/journal.pone.0189161.g003

Attrition in participation was tracked during and beyond the compulsory study period for each cohort. The number of participants decreased rapidly after the end of the study period in the NL cohort, whereas a more gradual attrition in participation occurred in the NAM cohort (Fig 4).

Fig 4. Number of participants actively collecting sensor data in the NL (gray) and NAM (black) cohorts during and after the follow-up period (total initial n = 805).

https://doi.org/10.1371/journal.pone.0189161.g004

Ninety-six percent (n = 280) of data contributors in the NL reported their medication through the app, while 78% (n = 404) did so in NAM. On average, data contributors who used medication reports logged 351±217 medication intakes during the 13-week study period in the NL and 127±113 during the 6-week study period in NAM. Both cohorts showed low, non-exponential attrition in medication reporting, similar to the attrition observed for accelerometer data compliance (data not shown).

System usability

In the NL cohort, 256 participants completed the System Usability Scale (response rate = 71.4%). The median score was 62.5 (1st and 3rd quartiles: 47.5–72.5), which places the wearable platform between “OK” and “Good” (Fig 5).
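For reference, SUS total scores such as the one above are computed from ten 1–5 Likert items: odd-numbered items contribute (response − 1), even-numbered items contribute (5 − response), and the sum is multiplied by 2.5 to yield a 0–100 score[27]. A minimal sketch with one hypothetical response set:

```python
def sus_score(responses):
    """responses: list of ten 1-5 Likert answers, in standard SUS item order."""
    assert len(responses) == 10
    total = sum((r - 1) if i % 2 == 0 else (5 - r)  # 0-based index: even index = odd-numbered item
                for i, r in enumerate(responses))
    return total * 2.5

# Hypothetical single respondent; a cohort score would be summarised by the median.
print(sus_score([4, 2, 4, 2, 4, 3, 4, 2, 3, 2]))  # 70.0
```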

Fig 5. SUS scoring of the Fox Wearable Companion platform (smartwatch with smartphone app) as rated by participants.

https://doi.org/10.1371/journal.pone.0189161.g005

Factors related to compliance

After grouping all NL data contributors into compliance groups, the analysis revealed no significant differences in the distribution of demographics, clinical characteristics or ability to use a smartphone between these groups. However, Kruskal-Wallis analysis demonstrated that the reported System Usability score differed significantly between the groups (χ2(2) = 32.014, p<0.001). The mean rank score was 84.8 for the low-compliance group, 130.8 for the middle-compliance group and 160.0 for the high-compliance group, indicating that participants in the high-compliance group gave the system a higher usability score.

For the NAM cohort, the analysis showed that demographics and clinical characteristics were comparable between the three compliance groups, except for a trend regarding self-reported depression (χ2(2) = 6.397, p = 0.04). This result indicates that a slightly higher number of patients with self-reported depression were in the low-compliance group (Table 3).

Table 3. Distribution of data-contributors’ characteristics and influence on compliance for the NL and NAM cohorts.

https://doi.org/10.1371/journal.pone.0189161.t003

Discussion

This study assessed the feasibility of using a wearable platform for long-term data collection in a large sample of PD patients. We focused on recruitment success, attrition rates, compliance and system usability. Enrolment rate was 88% (n = 304) in the NL and 51% (n = 649) in NAM. Nearly 85% of all enrolled participants contributed sensor data during the study period. Median compliance rate was 68% (16.3 hours/participant/day) in the NL, and 62% (14.8 hours/participant/day) in NAM. The percentage of accelerometer data collected each day declined by 23% in the NL after the 13-week study period, and by 27% in NAM after the 6-week study period. The distribution of demographics, clinical characteristics and ability to use a smartphone did not differ across compliance groups in the NL, but the System Usability score did. For NAM, the distribution of demographics and clinical characteristics between the compliance groups was comparable, except for self-reported depression status.

The high compliance in this study shows that it is feasible for people with PD to use this wearable platform in a real-world environment for many months. Although the feasibility of using consumer wearable sensors to monitor PD symptoms has been previously reported[9, 13, 17, 18, 31, 35], this is the first rigorous observational study to investigate the feasibility of a wearable platform comprising a smartwatch combined with a smartphone in such a large patient group (the largest prior study included only 40 patients). Additionally, the small differences in study protocols across cohorts allowed us to observe the impact of varying usage instructions on compliance. Compared to other studies, where either mobile apps were used in large cohorts[36] or e-health technologies were used[37, 38], we achieved high compliance together with a small and non-exponentially decreasing attrition rate, even though an exponential decrease in compliance is the norm in these sorts of studies[33].

This unusually high compliance rate may be attributed to the “passive” data collection, in which little or no interaction with the technology is required in order to collect sensor data. Participants using the Parkinson@Home wearable platform did not need to interact actively with the smartphone or smartwatch, other than reporting their medication intake when reminded by the alarm (which was widely perceived as a service rather than a burden). In another, similar smartphone-based study where “active”, task-based monitoring was used (that is, where participants needed to perform specific tasks at regular intervals prompted by the platform)[36], a more typical high and exponential attrition rate was observed. While it is difficult to draw firm conclusions from this comparison (because the two platforms are somewhat different), we suspect that periodic and lengthy interaction by users may increase attrition, leading to attrition rates similar to those seen with paper-based diaries[39]. The low and non-exponential attrition seen in the medication reports, a quick and less burdensome task, strengthens this conclusion. Thus, passive monitoring, where little to no interaction with the technology is required, may lead to better overall compliance rates.

Despite the potential influence of age, gender and PD-related impairment (i.e. physical or cognitive) on compliance, our results showed that overall disease severity, MDS-UPDRS scores, independence level and cognitive impairment did not influence compliance, which suggests that this platform could be used by most PD patients. The unique design of the Parkinson@home study can partially explain this result. The presence of a personalized support centre, previously described as an effective strategy to improve retention of participants[33], may have increased patients’ confidence in using the system and compensated for any disease-related difficulties.

Moreover, the “pro-active” support model, with scheduled calls to participants who showed signs of low compliance, may have boosted compliance by providing quick resolution of technical interruptions and by addressing any apathy towards participation caused by technology difficulties. This support is all the more important because compliance was compromised in participants who reported low System Usability scores. Therefore, in order to achieve high compliance while using smartphone/smartwatch wearable platforms to measure PD-related symptoms at home, it is beneficial to: (1) improve the platform’s usability, (2) reduce the number of technical issues, and (3) run a personalized support centre that can provide guidance on technology-related issues that participants may encounter.

Limitations

The Parkinson@Home study did have a few limitations. First, although this is one of the first large-scale cohort studies using consumer wearable sensors in PD with a long study period (i.e. up to 13 weeks), the study sample consisted only of PD patients who possessed a smartphone, introducing a possible selection bias, e.g. towards more highly educated subjects. Although smartphone penetration in the NL and NAM is high[40, 41], participants may not reflect the majority of PD patients living in the Netherlands, North America, or elsewhere in the world. Furthermore, when compared to the general PD population[42], participants were mainly young, with mild disease impairment and some degree of cognitive impairment. Even though these variables showed no obvious influence on compliance, a more impaired population may need more personal support in order to maintain compliance. Future studies should aim for a more stratified population to confirm that compliance with wearable sensors among PD patients is unaffected across the full range of disease severity. Second, the present results only apply to the use of two specific consumer-grade devices (i.e. smartphone and smartwatch). Although consumer-grade devices have potential advantages over dedicated medical devices, it is unknown whether our promising feasibility results would generalize to dedicated medical devices, which are often more expensive and less user-friendly.

Conclusion

In conclusion, the Parkinson@home trial showed that it is feasible to deploy a technology platform consisting of consumer-grade wearable and mobile devices for long-term data collection in a large and geographically diverse PD population. Importantly, compliance was comparable for patients with a range of backgrounds, including men and women, different ages, and some variation in disease severity. These findings suggest that wearables may offer a promising approach to overcome the limitations in monitoring disease status and progression of mildly impaired PD patients in a real-life environment. The platform used here is a promising and practical approach to capturing large amounts of sensor data from many participants by passive means, with little need for interaction with the technology. In the future, these properties may position sensor technologies as effective tools for monitoring PD and the “lived experience” of PD patients.

Supporting information

S1 File. Scale created by the researchers to measure ability to use a smartphone.

https://doi.org/10.1371/journal.pone.0189161.s001

(TIF)

Acknowledgments

We thank the many people with Parkinson's disease who participated in these studies and the Stichting Parkinson Nederland. Without them this work would not have been possible.

References

  1. Gelb DJ, Oliver E, Gilman S. Diagnostic criteria for Parkinson disease. Archives of neurology. 1999;56(1):33–9. pmid:9923759
  2. Goetz CG, Poewe W, Rascol O, Sampaio C. Evidence-based medical review update: Pharmacological and surgical treatments of Parkinson's disease: 2001 to 2004. Movement Disorders. 2005;20(5):523–39. pmid:15818599
  3. Mehanna R, Lai EC. Deep brain stimulation in Parkinson’s disease. Translational neurodegeneration. 2013;2(1):22. pmid:24245947
  4. Cutson TM, Laub KC, Schenkman M. Pharmacological and Nonpharmacological Interventions in the Treatment of Parkinson's Disease. Physical Therapy. 1995;75(5):363–73. pmid:7732080
  5. Lewis S, Foltynie T, Blackwell A, Robbins T, Owen A, Barker R. Heterogeneity of Parkinson’s disease in the early clinical stages using a data driven approach. Journal of Neurology, Neurosurgery & Psychiatry. 2005;76(3):343–8.
  6. Giugni JC, Okun MS. Treatment of advanced Parkinson's disease. Current opinion in neurology. 2014;27(4):450–60. Epub 2014/07/01. pmid:24978634.
  7. Keus SHJ, Munneke M, Graziano M, Jaana P. European Physiotherapy Guideline for Parkinson's disease. KNGF/ParkinsonNet, the Netherlands. 2014.
  8. Mera TO, Heldman DA, Espay AJ, Payne M, Giuffrida JP. Feasibility of home-based automated Parkinson's disease motor assessment. Journal of neuroscience methods. 2012;203(1):152–6. pmid:21978487
  9. Arora S, Venkataraman V, Zhan A, Donohue S, Biglan K, Dorsey E, et al. Detecting and monitoring the symptoms of Parkinson's disease using smartphones: a pilot study. Parkinsonism & related disorders. 2015;21(6):650–3.
  10. Patel S, Chen BR, Mancinelli C, Paganoni S, Shih L, Welsh M, et al. Longitudinal monitoring of patients with Parkinson's disease via wearable sensor technology in the home setting. Conf Proc IEEE Eng Med Biol Soc. 2011;2011:1552–5. Epub 2012/01/19. pmid:22254617.
  11. Maetzler W, Domingos J, Srulijes K, Ferreira JJ, Bloem BR. Quantitative wearable sensors for objective assessment of Parkinson's disease. Mov Disord. 2013;28(12):1628–37. pmid:24030855.
  12. Hobert MA, Maetzler W, Aminian K, Chiari L. Technical and clinical view on ambulatory assessment in Parkinson's disease. Acta Neurol Scand. 2014;130(3):139–47. pmid:24689772.
  13. Tsanas A, Little MA, McSharry PE, Ramig L. Using the cellular mobile telephone network to remotely monitor parkinsons disease symptom severity. IEEE Transactions on Biomedical Engineering. 2012.
  14. Patel S, Lorincz K, Hughes R, Huggins N, Growdon J, Standaert D, et al. Monitoring motor fluctuations in patients with Parkinson's disease using wearable sensors. IEEE transactions on information technology in biomedicine: a publication of the IEEE Engineering in Medicine and Biology Society. 2009;13(6):864–73. Epub 2009/10/23.
  15. Griffiths RI, Kotschet K, Arfon S, Xu ZM, Johnson W, Drago J, et al. Automated assessment of bradykinesia and dyskinesia in Parkinson’s disease. Journal of Parkinson’s disease. 2012;2(1):47–55. pmid:23939408
  16. Tzallas AT, Tsipouras MG, Rigas G, Tsalikakis DG, Karvounis EC, Chondrogiorgi M, et al. PERFORM: A System for Monitoring, Assessment and Management of Patients with Parkinson’s Disease. Sensors. 2014;14(11):21329–57. pmid:25393786
  17. Lakshminarayana R, Wang D, Burn D, Chaudhuri KR, Cummins G, Galtrey C, et al. Smartphone- and internet-assisted self-management and adherence tools to manage Parkinson’s disease (SMART-PD): study protocol for a randomised controlled trial (v7; 15 August 2014). Trials. 2014;15(1):374.
  18. Sharma V, Mankodiya K, De La Torre F, Zhang A, Ryan N, Ton TG, et al., editors. SPARK: personalized parkinson disease interventions through synergy between a smartphone and a smartwatch. International Conference of Design, User Experience, and Usability; 2014: Springer.
  19. Espay AJ, Bonato P, Nahab FB, Maetzler W, Dean JM, Klucken J, et al. Technology in Parkinson's disease: Challenges and opportunities. Movement Disorders. 2016;31(9):1272–82. pmid:27125836
  20. de Lima ALS, Hahn T, de Vries NM, Cohen E, Bataille L, Little MA, et al. Large-Scale Wearable Sensor Deployment in Parkinson’s Patients: The Parkinson@Home Study Protocol. JMIR Research Protocols. 2016;5(3).
  21. Marek K, Jennings D, Lasch S, Siderowf A, Tanner C, Simuni T, et al. The Parkinson Progression Marker Initiative (PPMI). Progress in neurobiology. 2011;95(4):629–35. pmid:21930184
  22. Goetz CG, Tilley BC, Shaftman SR, Stebbins GT, Fahn S, Martinez-Martin P, et al. Movement Disorder Society-sponsored revision of the Unified Parkinson's Disease Rating Scale (MDS-UPDRS): Scale presentation and clinimetric testing results. Movement disorders. 2008;23(15):2129–70. pmid:19025984
  23. Hoops S, Nazem S, Siderowf A, Duda J, Xie S, Stern M, et al. Validity of the MoCA and MMSE in the detection of MCI and dementia in Parkinson disease. Neurology. 2009;73(21):1738–45. pmid:19933974
  24. Schwab R, England A. Projection Technique for Evaluating Surgery in Parkinson's Disease: Third Symposium on Parkinson's Disease. Edinburgh: Gillingham and Donaldson. 1969.
  25. Bloem BR, Munneke M. Revolutionising management of chronic disease: the ParkinsonNet approach. BMJ. 2014;348:g1838. pmid:24647365
  26. Bangor A, Kortum P, Miller J. Determining what individual SUS scores mean: Adding an adjective rating scale. Journal of usability studies. 2009;4(3):114–23.
  27. Brooke J. SUS-A quick and dirty usability scale. Usability evaluation in industry. 1996;189(194):4–7.
  28. Meunier CC, Chowdhury S, Sherer T, Brooks DW. Innovative Web-based Matching Service, Fox Trial Finder, as a mechanism to improve clinical trial recruitment. Movement Disorders. 2012;27(4):e2.
  29. Heusinkveld L, Hacker M, Turchan M, Bollig M, Tamargo C, Fisher W, et al. Patient Perspectives on Deep Brain Stimulation Clinical Research in Early Stage Parkinson’s Disease. Journal of Parkinson's disease. 2017;7(1):89–94. pmid:27911344
  30. Cohen S, Bataille LR, Martig AK. Enabling breakthroughs in Parkinson's disease with wearable technologies and big data analytics. mHealth. 2016;2:20. Epub 2017/03/16. pmid:28293596.
  31. Ferreira JJ, Godinho C, Santos AT, Domingos J, Abreu D, Lobo R, et al. Quantitative home-based assessment of Parkinson’s symptoms: The SENSE-PARK feasibility and usability study. BMC neurology. 2015;15(1):89.
  32. Brannon EE, Cushing CC, Crick CJ, Mitchell TB. The promise of wearable sensors and ecological momentary assessment measures for dynamical systems modeling in adolescents: a feasibility and acceptability study. Translational behavioral medicine. 2016;6(4):558–65. pmid:27678501
  33. Eysenbach G. The law of attrition. Journal of medical Internet research. 2005;7(1):e11. pmid:15829473
  34. Yesavage J, Brink T, Rose T. Geriatric depression scale (GDS). Handbook of psychiatric measures. Washington DC: American Psychiatric Association. 2000:544–6.
  35. Zhan A, Little MA, Harris DA, Abiola SO, Dorsey E, Saria S, et al. High Frequency Remote Monitoring of Parkinson's Disease via Smartphone: Platform Overview and Medication Response Detection. arXiv preprint arXiv:1601.00960. 2016.
  36. Bot BM, Suver C, Neto EC, Kellen M, Klein A, Bare C, et al. The mPower study, Parkinson disease mobile data collected using ResearchKit. Scientific data. 2016;3.
  37. Etter J-F. Comparing the efficacy of two Internet-based, computer-tailored smoking cessation programs: a randomized trial. Journal of Medical Internet Research. 2005;7(1):e2. pmid:15829474
  38. Farvolden P, Denisoff E, Selby P, Bagby RM, Rudy L. Usage and longitudinal effectiveness of a Web-based self-help cognitive behavioral therapy program for panic disorder. Journal of medical Internet research. 2005;7(1):e7. pmid:15829479
  39. Hauser RA, Deckers F, Lehert P. Parkinson's disease home diary: further validation and implications for clinical trials. Movement Disorders. 2004;19(12):1409–13. pmid:15390057
  40. Majority of the elderly in the Netherlands has a smartphone [Internet]. Houten, The Netherlands: Telecompaper; 2015; 19 June 2015. https://www.telecompaper.com/pressrelease/majority-of-the-elderly-in-the-netherlands-has-a-smartphone—1088067
  41. Sullivan W. Trends on Tuesday: 186.3 million people own smartphones in the U.S. Digital Gov [Internet]. 2015 26-05-2015. https://www.digitalgov.gov/2015/05/26/trends-on-tuesday-186-3-million-people-own-smartphones-in-the-u-s/.
  42. De Lau L, Giesbergen P, De Rijk M, Hofman A, Koudstaal P, Breteler M. Incidence of parkinsonism and Parkinson disease in a general population: The Rotterdam Study. Neurology. 2004;63(7):1240–4. pmid:15477545