
Validation of a Kinect V2 based rehabilitation game

  • Mengxuan Ma,

    Roles Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Software, Validation, Visualization, Writing – original draft, Writing – review & editing

    mmrnc@mail.missouri.edu

    Affiliation Department of Electrical Engineering and Computer Science, University of Missouri, Columbia, MO, United States of America

  • Rachel Proffitt,

    Roles Conceptualization, Formal analysis, Funding acquisition, Investigation, Methodology, Project administration, Resources, Validation, Visualization, Writing – original draft, Writing – review & editing

    Affiliation Department of Occupational Therapy, University of Missouri, Columbia, MO, United States of America

  • Marjorie Skubic

    Roles Conceptualization, Formal analysis, Resources, Supervision, Validation, Visualization, Writing – review & editing

    Affiliation Department of Electrical Engineering and Computer Science, University of Missouri, Columbia, MO, United States of America

Abstract

Interactive technologies are beneficial to stroke recovery as rehabilitation interventions; however, they lack evidence for use as assessment tools. Mystic Isle is a multi-planar, full-body rehabilitation game developed using the Microsoft Kinect V2. It aims to help stroke patients improve their motor function and daily activity performance and to assess the motions of the players. It is important that the assessment results generated from Mystic Isle are accurate. The Kinect V2 has been validated for tracking lower limbs and calculating gait-specific parameters. However, few studies have validated the accuracy of the Kinect V2 skeleton model in upper-body movements. In this paper, we evaluated the spatial accuracy and measurement validity of a Kinect-based game, Mystic Isle, in comparison to a gold-standard optical motion capture system, the Vicon system. Thirty participants completed six trials in sitting and standing. Game data from the Kinect sensor and the Vicon system were recorded simultaneously, then filtered and sample-rate synchronized. The spatial accuracy was evaluated using Pearson's r correlation coefficient, signal-to-noise ratio (SNR) and 3D distance difference. Each arm-joint signal had an average correlation coefficient above 0.9 and an SNR above 5. The hip joint data had less stability and a large variation in SNR. Also, the mean 3D distance differences of the joints were less than 10 centimeters. For measurement validity, the accuracy was evaluated using the mean and standard error of the difference, percentage error, Pearson's r correlation coefficient and intra-class correlation (ICC). Average errors of maximum hand extent of reach were less than 5%, and the average errors of mean and maximum velocities were about 10% and less than 5%, respectively. We have demonstrated that Mystic Isle provides accurate measurement and assessment of movement relative to the Vicon system.

Introduction

In the past decade and quite rapidly in the past five years, Natural User Interfaces (NUIs) and video games have grown in popularity in both consumer applications and in healthcare [1–3]. Specifically, physical rehabilitation (e.g., physical and occupational therapy) has embraced novel NUI applications in clinics, hospitals, nursing homes, and the community [4–6]. Robotic systems have long included game-based and NUI-based user interfaces, and most robotic devices provide some form of physical assistance to the patient and/or haptic feedback [7, 8]. With the release of the Nintendo Wii in 2006, many NUI applications for healthcare moved away from bulky, expensive robotics and embraced the portable nature of movement and gesture recognition devices and systems. One of the biggest breakthroughs for this field came in 2010 when Microsoft released the Kinect sensor to accompany its Xbox console system. Within days and weeks of the Kinect's release, hackers, universities, and companies began to exploit its markerless movement sensing abilities for educational and healthcare use. Since then, there has been an exponential increase in the number of studies that report the use of the Kinect as the input device for a NUI-based rehabilitation game or feedback application [9, 10].

In 2014, Jintronix was the first company to receive FDA clearance for its rehabilitation game system that uses the Microsoft Kinect. A number of similar companies utilize the Kinect sensor, including SeeMee [11], VirtualRehab [12], Reflexion Health [13], MIRA [14], MotionCare360 [15], and 5Plus Therapy [16]. Many of these systems are marketed for delivering rehabilitation therapy in the home setting. This type of delivery is termed "telerehabilitation" and can involve remote monitoring by the therapist or virtual sessions over teleconferencing software [17, 18]. For telerehabilitation or remote sessions, it is imperative that the data the therapist receives from the system or movement-sensing device (such as the Microsoft Kinect) are accurate and reliable. If the therapist plans to use the data for documentation or for reimbursement from a health insurance company, the data ought to be as accurate as current clinical tools (e.g., goniometers).

Only one of the listed companies has validated the measurement capabilities of their systems and of the Microsoft Kinect. Kurillo and colleagues evaluated their system used in 5Plus Therapy against the Impulse motion-capture system (PhaseSpace Inc., San Leandro, CA) and found that it had good accuracy of joint positions and small to large percentage errors in joint angle measurements [19]. However, this study had a small sample size of only 10 subjects and used the first version of the Kinect sensor in its validation. Additionally, the movements used in the assessment were only within a single plane for each movement and all participants were seated during data collection.

Other researchers have validated the Kinect's measurement and tracking capabilities for both general and specific applications. Hondori and Khademi [20] provide an excellent summary of the work completed prior to 2014. It should be noted that all of these studies evaluated the first version of the Kinect. Following the release of the Kinect V2 sensor, most researchers have focused their validation efforts on gait and posture applications [21–24]. The Kinect V2 has good-to-excellent tracking and measurement capabilities for gait-specific parameters and clinical outcomes. However, many of these studies tracked only the lower limbs. Furthermore, gait is a relatively rhythmic motion that is consistent across participants, even in rehabilitation populations (i.e., one foot in front of the other). Full-body movements in which participants are not limited to specific planes and can choose which hand to use have not been studied in prior comparisons of the Microsoft Kinect and optical marker-based motion capture systems.

We have developed software called Mystic Isle that utilizes the Microsoft Kinect V2 sensor as the input device [25]. Mystic Isle is designed as a rehabilitation game and has shown good results in improving motor function and daily activity performance in persons with chronic stroke [26]. The software initially used the first version (V1) of the Microsoft Kinect as the input device, and we completed a study that compared it to the OptiTrack optical system [10]. Based on a visual analysis, we demonstrated that for the hand and elbow, the Kinect V1 has good accuracy in calculating the trajectory of movement; for the shoulder, the Kinect V1's tracking abilities limit its validity. Although these findings are promising, the types and number of movements used in that study were limited to those in a seated position and mostly in one plane of movement (e.g., sagittal). Furthermore, the Kinect's tracking capabilities have substantially improved since then, and the V2 provides more data points (joints) for comparison.

The current Mystic Isle game involves multi-planar, full-body movements. Designed for individuals with diverse abilities, the game can be played in a sitting or standing position, depending on the therapy treatment plan. In standing, the player is able to move around in 3-dimensional space, akin to real-world rehabilitation. Few studies have evaluated the tracking and measurement capabilities of the Microsoft Kinect V2 for full-body, multi-planar movements in both sitting and standing. The purpose of this study was to determine the spatial accuracy and measurement validity of the Microsoft Kinect V2 sensor in a NUI rehabilitation game in comparison to a gold-standard marker-based motion capture system (Vicon).

Materials and methods

Participants

Participants were recruited via convenience sample at the University of Missouri–Columbia campus. Participants were included if they: 1) were over the age of 18, 2) could understand conversational English, and 3) had no medical conditions that prevented them from playing video games. The study was approved by the Health Sciences Institutional Review Board at the University of Missouri (approval number IRB 2005896 HS). All potential participants were screened, and all subjects provided written informed consent before beginning the study.

Mystic Isle

Mystic Isle is a platform for rehabilitation that allows a user to interact with a virtual environment by using their body (Fig 1). The software was created in Unity 3D and allows the tracked user to interact with virtual environments and objects in a 3-D world. Using Mystic Isle, specific movements, distances, and locations of objects can be tailored to the abilities and requirements of the user. The system uses the Microsoft Kinect V2 camera to track participant movements. The Kinect V2 tracks 25 discrete points/joints on the body of the user. Both gross motor (stepping, jumping, squatting) and fine motor (waving the hand, turning the palm to face up, opening/closing the hand) movements can be tracked. The Kinect V2 tracks the user in 3-dimensional space and streams the data in real time to the associated software, Mystic Isle. The Kinect V2 tracks and records the x, y, and z coordinates (and confidence) of each discrete joint at either 15 or 30 frames per second.
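Conceptually, each Kinect frame is a timestamped set of per-joint records. The sketch below is illustrative only; the names and structure are assumptions for exposition, not the Kinect SDK's actual types.

```python
from dataclasses import dataclass

@dataclass
class JointSample:
    """One tracked joint in a single Kinect frame (illustrative structure)."""
    name: str          # e.g., "HandRight" -- naming is an assumption
    x: float           # meters, Kinect camera space
    y: float
    z: float
    confidence: float  # tracking confidence in [0, 1]

# One frame: a timestamp plus one sample per tracked joint.
frame = {
    "t": 1 / 30,  # seconds; ~33 ms between frames at 30 f/s
    "joints": [JointSample("HandRight", 0.42, 0.10, 1.95, 1.0)],
}
```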

Fig 1. Mystic Isle game environment.

(a) A virtual avatar collecting targets in a Kinect-based rehabilitation game, Mystic Isle. (b) A participant playing the game with Vicon markers on the body. Joint data of game trials were recorded by a Kinect and the Vicon system for validation.

https://doi.org/10.1371/journal.pone.0202338.g001

Vicon

The Vicon system is a marker-based motion capture system that uses infrared cameras to track the 3-dimensional locations of reflective markers placed on the body. It can be used to measure or give real-time feedback on the movements of the whole body. The Vicon system has been used as an assessment tool for posture analysis, and in balance and reaching studies [27]. It is a gold standard tool for biomechanical kinematic assessment [27]. The sample rate of the Vicon system is 100 Hz. For this study, the system included 7 individual cameras placed in a space with a ceiling height of 13 feet.

Mapping of the joints

The Kinect V2 provides a skeleton model [28] (Fig 2(a)) of a game player by recording the x, y, and z coordinates of each discrete joint. The full-body Plug-in Gait model template [29–31] (Fig 2(b) and 2(c)) is commonly used in a Vicon system to build the skeleton model. The joint locations in these two models are not the same. In order to validate the results using joint data from these two skeleton models for this study, we mapped the joints between the two systems (Table 1). For the hand, elbow, shoulder and chest joints, the mapping was direct; for the hip and spine base joints, we took the average of several marker locations in the Plug-in Gait model to optimize the matching.
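As a sketch of the averaging step, a hip joint might be approximated as the mean of nearby pelvis marker positions. The marker choice and coordinates below are hypothetical illustrations, not the exact mapping used in the study (that mapping is given in Table 1).

```python
import numpy as np

# Hypothetical Plug-in Gait pelvis marker positions, in meters.
vicon_markers = {
    "LASI": np.array([0.10, 0.95, 0.12]),   # left anterior superior iliac spine
    "LPSI": np.array([0.10, 0.98, -0.08]),  # left posterior superior iliac spine
}

def approx_hip_left(markers):
    """Approximate the Kinect 'HipLeft' joint as the average of several
    pelvis marker locations (marker choice here is illustrative)."""
    return np.mean([markers["LASI"], markers["LPSI"]], axis=0)

hip_left = approx_hip_left(vicon_markers)
```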

Fig 2. The joint locations of the Kinect V2 skeleton model and a Vicon plug-in gait model.

(a) The joint labels and positions of Kinect V2 skeleton model. (b)(c) the marker placement of a Vicon Plug-in gait model. Reprinted from [29] under a CC BY licence, with permission from Vicon Motion Systems.

https://doi.org/10.1371/journal.pone.0202338.g002

Table 1. The mapping of joints from the Kinect V2 and the joints from Vicon Plug-in gait model.

https://doi.org/10.1371/journal.pone.0202338.t001

Data collection

The sampling rate of the Kinect V2 is either 15 or 30 frames per second (f/s), depending on computer performance. In this study, 15 participants' Kinect V2 data were collected at a rate of 15 f/s on a lower-performance laptop computer. The remaining 15 participants' Kinect V2 data were collected at a rate of 30 f/s on a higher-performance desktop computer. In order to investigate how the sample rate influences the accuracy of the measurement outcomes in Mystic Isle, we analyzed the errors of extent metrics and speed metrics using the data collected under the different frame rates separately. The average difference between the two frame rates in hand extent of reach metrics was 0.70 ± 0.55 centimeters. The average difference between the two frame rates in hand speed metrics was 1.08 ± 1.09 centimeters/second. This variation is small enough to be negligible; therefore, we combined the samples for all analyses. The detailed comparison of the accuracies under the different sample rates is provided in S1 Fig.

The layout of the data collection room and the coordinate systems of the two systems are displayed in Fig 3. The origin of the Vicon camera system was set to be the center of the room. To make the coordinate space of the Kinect V2 overlay with the Vicon coordinate space, the z dimension of the Kinect V2 was aligned with the x dimension of the Vicon system (Fig 3). The distance between the origin points of the two systems was 2 meters. The display screen of the game was placed right behind the Kinect V2 and was not occluded by it. Participants stood 1.8–2.4 meters (6–8 feet) from the Kinect V2 and close to the origin point of the Vicon system.

Fig 3. The layout of the data collection room and the coordinate systems of the two systems.

(a) The settings of the Vicon system and the Kinect V2. The origin of the Vicon system is set in the center of the room. The z dimension of the Kinect V2 coordinate system is aligned with the x dimension of the Vicon's. (b) The transformed coordinates of the Kinect V2.

https://doi.org/10.1371/journal.pone.0202338.g003

Each participant completed six trials, described below. For each trial, the locations of the virtual objects were determined through a calibration step. We did not instruct participants to use a specific hand for the reaches or foot for stepping (as appropriate).

  1. Sitting close: Two rings of eight objects were presented to each participant. The locations of the objects were within arm’s length and no torso movement was required. The subject was seated.
  2. Sitting far: Two rings of eight objects were presented to each participant. The locations of the objects required the participant to lean with their torso to be successful. The subject was seated.
  3. Standing close: Two rings of eight objects were presented to each participant. The locations of the objects were within arm’s length and no torso movement was required. The subject was standing and did not take a step.
  4. Standing far: Two rings of eight objects were presented to each participant. The locations of the objects required the participant to lean with their torso to be successful. The subject was standing and did not take a step.
  5. Standing step: Two rings of eight objects were presented to each participant. The locations of the objects required the participant to take a step in order to reach the virtual object.
  6. Sorting game: Two rings of eight brightly colored objects were presented to each participant. Four color areas appeared in the virtual environment. The participant was then instructed to select an object and “drag” it into the matching colored area. This game used the same calibration for the “standing close” game.

Data analysis

Data pre-processing and statistical analysis were performed using MATLAB R2017a. The Kinect V2 coordinates were transformed, data from both systems were filtered and synchronized, and the Vicon data were down-sampled. These steps are described in detail below.

Coordinate transformation.

As shown in Fig 3, the coordinate systems of the Kinect V2 and the Vicon system are different. In order to visualize the similarities in different dimensions and compute the correlation of the data from the two systems, it was necessary to perform a coordinate transformation. We transformed the Kinect V2 coordinates to match the Vicon's: the x, y, and z dimensions of the Kinect V2 data were mapped to the y, z, and x dimensions, respectively.
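Assuming the joint data are stored as an N x 3 array of (x, y, z) rows, this axis remapping reduces to a column permutation; the translation between the two origins, handled separately, is omitted from this sketch.

```python
import numpy as np

def kinect_to_vicon_axes(points):
    """Relabel Kinect V2 axes to Vicon axes: Kinect x -> Vicon y,
    Kinect y -> Vicon z, Kinect z -> Vicon x. Rows are (x, y, z) samples."""
    points = np.asarray(points)
    # Output columns (Vicon x, y, z) draw from Kinect columns (z, x, y).
    return points[:, [2, 0, 1]]

# A point 2 m in front of the Kinect (z), 0.5 m to its x side, 1 m up (y):
remapped = kinect_to_vicon_axes([[0.5, 1.0, 2.0]])
```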

Filtering.

Noise, such as spike noise, quantization noise and white noise, can be introduced by digital devices when collecting data [32]. In addition, for the Vicon system, marker occlusion is possible, and filling in the resulting gaps introduces noise. To reduce noise, Butterworth filters were applied to both the Kinect V2 and the Vicon data. A sixth-order Butterworth filter with a 4 Hz cut-off frequency was selected for the Vicon data, while a sixth-order Butterworth filter with a 3 Hz cut-off frequency was chosen for filtering the Kinect V2 data. The filter parameters were selected to maximize the average Pearson's r correlation coefficient across joints, an approach also applied in our previous study [33].
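In SciPy, the filtering step might look like the following sketch; whether the original pipeline applied the filter forward-only or zero-phase (as `filtfilt` does) is an assumption.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def butter_lowpass(signal, cutoff_hz, fs_hz, order=6):
    """Zero-phase low-pass Butterworth filter of a 1-D joint coordinate
    signal. The cutoff is normalized by the Nyquist frequency (fs/2)."""
    b, a = butter(order, cutoff_hz / (fs_hz / 2), btype="low")
    return filtfilt(b, a, signal)

# Per the text: 4 Hz cutoff for 100 Hz Vicon data, 3 Hz for 30 f/s Kinect data.
t = np.arange(0, 10, 1 / 30)
clean = np.sin(2 * np.pi * 1.0 * t)               # 1 Hz motion component
noisy = clean + 0.5 * np.sin(2 * np.pi * 10 * t)  # 10 Hz noise component
smoothed = butter_lowpass(noisy, cutoff_hz=3.0, fs_hz=30.0)
```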

Synchronization.

Mystic Isle and the Vicon system started recording data at different times and through different input streams. In order to synchronize the data, the participants clapped three times at the beginning of each trial. The end of the clapping motion was considered to be the start point of a trial, and the time stamp of the last game event of Mystic Isle was the end of the data trial. The data from the two systems were cut based on these start and stop points. Details of synchronizing the data are provided in S2 Fig.
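One way the clap-based start point could be detected from the joint data is sketched below; the distance threshold and search window are assumptions for illustration, not the study's actual detector.

```python
import numpy as np

def clap_end_frame(left_hand, right_hand, fs=30.0, window_s=5.0, thresh_m=0.05):
    """Return the last frame in the opening window where the two hand joints
    come within thresh_m meters of each other -- i.e., the end of the final
    clap, taken as the trial start point."""
    n = int(fs * window_s)
    l = np.asarray(left_hand)[:n]
    r = np.asarray(right_hand)[:n]
    dist = np.linalg.norm(l - r, axis=1)
    close = np.nonzero(dist < thresh_m)[0]
    return int(close[-1]) if close.size else 0

# Synthetic example: hands 0.5 m apart except at three "clap" frames.
left = np.zeros((150, 3))
right = np.tile([0.5, 0.0, 0.0], (150, 1))
right[[10, 20, 30], 0] = 0.01
start = clap_end_frame(left, right)
```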

Down sampling.

The sampling rate of the Vicon system (100 Hz) is different from that of the Kinect V2 (15 Hz or 30 Hz). The velocity metric is affected by different sample rates. Thus, the Vicon data were down-sampled to approximately 15 Hz or 30 Hz to match the Kinect V2 data. The pseudocode is shown in S1 Table.
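A sketch of this resampling step follows; nearest-neighbor selection onto the Kinect frame times is an assumption (linear interpolation at those times would be a reasonable alternative).

```python
import numpy as np

def downsample_to_kinect(vicon_t, vicon_xyz, kinect_t):
    """Resample 100 Hz Vicon joint samples onto the Kinect frame times by
    picking, for each Kinect timestamp, the nearest Vicon sample."""
    vicon_t = np.asarray(vicon_t)
    kinect_t = np.asarray(kinect_t)
    idx = np.clip(np.searchsorted(vicon_t, kinect_t), 0, len(vicon_t) - 1)
    prev = np.clip(idx - 1, 0, None)
    # Prefer the preceding Vicon sample when it is strictly closer in time.
    use_prev = np.abs(vicon_t[prev] - kinect_t) < np.abs(vicon_t[idx] - kinect_t)
    idx[use_prev] = prev[use_prev]
    return np.asarray(vicon_xyz)[idx]

# 1 s of 100 Hz Vicon data resampled at three 30 f/s Kinect timestamps.
vicon_t = np.arange(0, 1, 0.01)
vicon_xyz = np.column_stack([vicon_t, vicon_t, vicon_t])
resampled = downsample_to_kinect(vicon_t, vicon_xyz, [0.0, 0.033, 0.066])
```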

Outcomes

Spatiotemporal accuracy.

The signals representing the locations of joints captured by the Kinect and the Vicon systems are spatiotemporal signals. When analyzing the similarity of the spatiotemporal signals from the two systems, the mean of each signal was subtracted from the signal to minimize bias.

Signal-to-noise ratio (SNR) compares the level of the ground truth signal with the level of noise. We applied SNR to compare the level of the signals from the Vicon system with the level of the signal difference between the two systems. The formula of SNR is

SNR = 10 log10( Σt sV(t)² / Σt (sV(t) − sK(t))² )  (1)

where sV(t) and sK(t) denote the mean-subtracted joint signals from the Vicon system and the Kinect V2 at frame t.

We averaged the SNR results for each joint in different types of games. SNR is typically computed in decibels (dB). An SNR of 0 dB means the signal and the noise have the same level. An SNR below 0 dB indicates that the noise is larger than the desired signal; a 10 dB SNR indicates that the signal power is 10 times that of the noise [34].
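A sketch of this computation under the stated definition (the exact published formula may differ in detail):

```python
import numpy as np

def snr_db(vicon_signal, kinect_signal):
    """SNR in dB: power of the mean-subtracted Vicon signal relative to
    the power of the Vicon-Kinect difference signal."""
    v = np.asarray(vicon_signal, float)
    k = np.asarray(kinect_signal, float)
    v = v - v.mean()
    k = k - k.mean()
    return 10 * np.log10(np.sum(v ** 2) / np.sum((v - k) ** 2))

# If the Kinect signal is a 10%-attenuated copy of the Vicon signal, the
# difference carries 1/100 of the signal power, giving 20 dB.
v = np.sin(np.linspace(0, 10, 1000))
example = snr_db(v, 0.9 * v)
```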

Measurement validity.

Extent of reach was calculated for each trial. Extent of reach was defined as the distance from the hand joint to the shoulder center, where the shoulder center is the midpoint of the left and right shoulder joints. Suppose the hand joint and the shoulder center are represented by jhand = {hx, hy, hz} and jshoulderC = {sx, sy, sz}; then the extent of reach for each frame is calculated by

extent = √((hx − sx)² + (hy − sy)² + (hz − sz)²)  (2)

We also calculated maximum and mean velocities for each trial. Suppose the hand joint of the ith frame is represented by jhand,i = {xi, yi, zi}; the velocity at this frame is calculated by

vi = √((xi − xi−1)² + (yi − yi−1)² + (zi − zi−1)²) / (ti − ti−1)  (3)

where t, time, is measured and stored automatically with each frame by the Kinect V2 SDK.
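Under the definitions in Eqs (2) and (3), these metrics might be computed as follows (the array layout is an assumption):

```python
import numpy as np

def extent_of_reach(hand, shoulder_left, shoulder_right):
    """Eq (2): Euclidean distance from the hand joint to the shoulder
    center (midpoint of the left and right shoulder joints)."""
    center = (np.asarray(shoulder_left) + np.asarray(shoulder_right)) / 2
    return float(np.linalg.norm(np.asarray(hand) - center))

def frame_speeds(hand_xyz, t):
    """Eq (3): per-frame speed from successive hand positions and the
    per-frame timestamps recorded by the Kinect SDK."""
    steps = np.linalg.norm(np.diff(np.asarray(hand_xyz), axis=0), axis=1)
    return steps / np.diff(np.asarray(t))

reach = extent_of_reach([0.6, 0.0, 0.0], [-0.2, 0.0, 0.0], [0.2, 0.0, 0.0])
speeds = frame_speeds([[0, 0, 0], [0.3, 0.0, 0.4]], [0.0, 0.5])
```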

Statistical analysis

For spatiotemporal accuracy, we calculated the mean Euclidean 3D distance, Pearson's r correlation coefficient and SNR of each joint to determine the strength of association. For measurement validity, we calculated the difference and percentage error between the two systems for each participant per game trial. These values were then averaged for the different types of games. We also calculated the standard error of the difference, Pearson's r correlation coefficient and intra-class correlation (ICC) with a 95% confidence interval.
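A sketch of the per-trial validity metrics (ICC is omitted for brevity; the input values are synthetic):

```python
import numpy as np
from scipy.stats import pearsonr

def validity_metrics(kinect_vals, vicon_vals):
    """Per-trial agreement between Kinect- and Vicon-derived measures:
    mean difference, standard error of the difference, mean percentage
    error (relative to Vicon), and Pearson's r."""
    k = np.asarray(kinect_vals, float)
    v = np.asarray(vicon_vals, float)
    diff = k - v
    return {
        "mean_diff": float(diff.mean()),
        "se_diff": float(diff.std(ddof=1) / np.sqrt(len(diff))),
        "pct_error": float(np.mean(np.abs(diff) / np.abs(v)) * 100),
        "pearson_r": float(pearsonr(k, v)[0]),
    }

m = validity_metrics([10.2, 20.1, 29.7, 40.4], [10.0, 20.0, 30.0, 40.0])
```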

Results

Participants

Thirty subjects participated in this study, including 24 females and 6 males, with an average age of 24.2 ± 6.6 years. Only two participants were left-handed.

Spatiotemporal accuracy

Upper body.

The average correlation coefficient of the arm joints was high; most of the correlation values were above 0.9 (Table 2). In addition, the SNR values of the arm joints (Table 3) were above 5, indicating a signal at least 5 times greater than the noise. The hand joints had the greatest correlation between the two systems and very high SNR values. The chest (Spine Mid) "joint" had lower correlation between the two systems along with lower SNR values, ranging from 3 to 10. The mean 3D distance differences of the joints were less than 10 centimeters (Table 4). The distance differences of the chest (Spine Mid) "joint" were smaller than those of the arm joints. In addition, the distance differences were larger in the "standing step" game, where the participants were required to take a step to reach an object.

Table 2. Correlation coefficients of spatiotemporal signals from the Vicon and the Kinect V2 for each of the six trials.

https://doi.org/10.1371/journal.pone.0202338.t002

Table 3. Signal-to-noise ratios of spatiotemporal signals from the Vicon and the Kinect® V2 for each of the six trials.

https://doi.org/10.1371/journal.pone.0202338.t003

Table 4. Spatiotemporal accuracy of joint signals from the Kinect V2 against the Vicon markers for each of the six trials.

The accuracy is evaluated by the mean 3D Euclidean distance in centimeters and the corresponding standard deviation.

https://doi.org/10.1371/journal.pone.0202338.t004

Lower body.

When comparing the two systems, the lower body joints (Tables 2 and 3) demonstrated less stability overall, showing lower correlation values (0.5 to 0.9) than the upper body joints and larger variation in SNR values. However, the lower body joints had smaller 3D distance differences than the upper body joints (Table 4). The differences were larger when the players performed a step motion in the game trial "standing step".

Measurement validity

Extent of reach.

Overall, the average difference values of maximum extent of reach were less than 3 cm across all six trials, and the percentage error was less than five percent (Table 5). More errors were introduced in measurements of the right hand as compared to the left hand. The Pearson's r correlation coefficients of extent of reach in the x, y and z dimensions were high; most were greater than 0.8, and only one trial had a value as low as 0.7. Extent of reach in 3D had lower Pearson's r correlation values compared to extent of reach in each individual dimension, but no value fell below 0.6. The intra-class correlation values of extent of reach around the sagittal and frontal axes were very high (>0.96) and larger than those for movements around the vertical axis for most of the trials. The intra-class correlation values of extent of reach in 3D were relatively low in the standing-type trials.

Table 5. Accuracy of clinical measures from the Kinect V2 against the Vicon for six types of game trials.

Accuracies were validated using mean difference, standard error, mean percentage error, Pearson's r correlation coefficient and intra-class correlation with the corresponding 95% confidence interval.

https://doi.org/10.1371/journal.pone.0202338.t005

Maximum and mean velocity.

Maximum velocity had larger absolute errors than mean velocity over all the trials (Table 5). The largest average error of maximum velocity was about 10 cm/s, from the "game" trial. For mean velocity, the largest average error was less than 4 cm/s. When considering percentage error, the average percentage error of mean velocity was about 10% and the average percentage error of maximum velocity was less than 5%. The errors from the "game" trial were greater than those of the other trials, and mean velocity errors were larger in sitting versus standing trials. The Pearson's r correlation coefficient values of maximum and mean velocities were not less than 0.9, and the intra-class correlation values were not less than 0.97.

Discussion

Mystic Isle, similar to other rehabilitation-focused games and software, has been shown to be feasible as an intervention for people with stroke, with the Microsoft Kinect V2 camera used as an input device. Before using the Kinect V2 and the Mystic Isle software as an assessment tool in a clinical setting, it is necessary to validate the accuracy of the Kinect V2's tracking capabilities. Therefore, the purpose of this study was to determine the spatial accuracy and measurement validity of the Microsoft Kinect V2 sensor in a rehabilitation game in comparison to a gold-standard marker-based motion capture system (Vicon). We have demonstrated that Mystic Isle provides an accurate measurement of movement relative to the Vicon system; however, there are some movements and planes of measurement in which the accuracy is considerably lower. The findings from this study are similar to findings from other comparison studies between the Kinect V1 and the Vicon. This study differs from prior work in that we tracked movements of the upper limbs during unrestrained full-body movements (versus just the lower limbs during walking), and the participants were not limited to specific planes of movement and could choose to use either hand during a reach [32, 34, 35]. The movements in this study more closely mimic real-world performance; this has significant implications for clinical rehabilitation practice. Each of these points is discussed below. We conclude with limitations and next steps for research and clinical practice.

With regard to measurement validity, we found that the errors of hand extent and speed metrics from the right hand were larger than the errors of the left hand, but even the higher error rate is acceptable for relevant clinical assessments. Also, we observed that the percentage errors of mean velocity in sitting trials were larger than the errors from standing trials. In sitting, participants tended to move more slowly than in standing; thus, overall velocity was lower in sitting. However, the absolute errors were similar across all trials. The sixth trial, the sorting game, had the largest percentage errors of all trials. There are two reasons for this. First, the sorting game trial was the longest trial; the longer a person is engaged with the task, the greater the potential for noise to be introduced. Second, the movements required for game success were different from the other trials. Participants "dragged" a virtual target from one side of the screen to another in order to "sort" the virtual objects. Further, some participants had to bend at the knee in order to "place" the virtual object in the correct spot. The bending position likely introduced some noise and limited the tracking capability of the Kinect V2, particularly at the hip joints.

When considering spatial accuracy of the tracked joints, the joints of the arm were highly correlated between both systems and had high SNR values. The joints of the hip had much lower SNR values and fewer correlations over 0.90. Other researchers have reported similar findings with regards to the lower body [34]. Mentiplay et al. found poor agreement between Kinect V2 and a Vicon system in peak hip flexion [36]. These lower correlation values have often been interpreted as a consequence of the optimization of Kinect SDK for gesture-based games [34]. Thus, the Kinect SDK appears to provide higher tracking abilities on upper body joints.

Despite the decreased spatiotemporal accuracy of the Kinect for tracking lower body joints, researchers have shown that the Kinect is able to track walking paths and provide data for calculating gait-related variables with relatively high accuracy (e.g., stride length, walking speed) [24, 37]. Guess et al. showed the Kinect can accurately measure hip and knee flexion angles for a vertical drop jump [35]. One of the first evaluations of the Kinect for upper body tracking demonstrated similar percentage errors [32]. In this study, we explored full-body movements that involved reaching, sitting, stepping, and cross-body movements. These results add richness to the primarily gait-related literature by validating the use of the Kinect for tracking upper body kinematics during full-body movement. Allowing participants more freedom in a reaching movement (e.g., choice of hand, allowing cross-body reaches) mimics daily activity much more closely than other studies [32, 34, 35]. This may limit the internal validity of the study; however, it greatly increases the external validity of the findings. We are the first to validate the Kinect V2 in this scenario.

Additionally, these findings support the use of the Kinect V2 in a clinical rehabilitation setting. We have shown that the Kinect V2 is an accurate tool for tracking movement; the clinical measurements we can obtain (e.g., extent of reach) are repeatable and valid. Reliability of standard clinical assessment tools for range of motion (goniometers) vary across clinical populations and joints measured [38, 39]. The ICCs between raters and between tools in prior studies range from 0.50 to 0.98 [40, 41]. Therefore, the Kinect V2 has the potential to be utilized in clinical practice and home-based rehabilitation to complement existing outcome assessments. Furthermore, these data can be collected by the Kinect V2 in remote settings, such as a patient’s home, and provide clinicians with a look at performance over time. Health insurance companies are demanding more data and metrics to support clinical decision making. With a validated sensor, this system has the potential to provide rehabilitation clinicians and insurers with high quality, performance-based data and outcomes.

This study has a few limitations. First, the sample is relatively homogeneous, young, and majority female, which limits generalization of the findings to the older population common in stroke rehabilitation. Our previous studies with the Mystic Isle game have involved stroke patients [25, 42]; however, we have not yet validated the assessments in this population. Our ongoing research is investigating this further in an older population. Second, there were some error differences in tracking the left and right hands, although these were not statistically significant and the errors are within acceptable rates for clinical use on both the left and right sides [37, 43]. Lastly, people with stroke have different movement patterns and postures as compared to healthy individuals. Flexor synergy patterns and spasticity might make it more difficult for the Kinect V2 to reliably track the more affected extremity; however, we have had much success in our prior work [25, 42].

Our preliminary research has shown that motor function and daily activity performance of stroke patients can improve through the use of Mystic Isle as an in-home intervention [26]. The next step for our research is to use the Kinect V2 to complete assessments of movements in people with stroke. Additionally, we are building an in-home monitoring system that utilizes the Kinect V2 for ambient tracking of movement and as an assessment of upper extremity movement performance. These studies will further test the use of the Kinect V2 as a valid tool for tracking movements in rehabilitation populations.

Supporting information

S1 Fig. Comparison of measurement validity under different sample rates.

The influence of the Kinect sample rate on the accuracy of the clinical measurements was investigated.

https://doi.org/10.1371/journal.pone.0202338.s001

(PDF)

S2 Fig. Start and stop points for synchronization.

The figure illustrates the start and stop points on one subject's data.

https://doi.org/10.1371/journal.pone.0202338.s002

(PDF)

S1 Table. Pseudocode for preprocessing the joint data from the Kinect and Vicon system.

The data were preprocessed in MATLAB R2017a. The Kinect data were transformed, filtered, and synchronized; the Vicon data were synchronized and down-sampled.

https://doi.org/10.1371/journal.pone.0202338.s003

(PDF)
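The filtering and resampling steps described in S1 Table can be sketched as follows. This is an illustrative Python reimplementation, not the study's MATLAB code: the moving-average smoother, the nominal 100 Hz and 30 Hz rates, and linear interpolation are assumptions standing in for the actual filter and sensor timestamps.

```python
import numpy as np

def smooth(signal, window=5):
    """Low-pass smoothing of one joint coordinate over time
    (a simple stand-in for the filter applied to the noisier Kinect data)."""
    kernel = np.ones(window) / window
    return np.convolve(np.asarray(signal, dtype=float), kernel, mode="same")

def downsample(signal, src_hz=100.0, dst_hz=30.0):
    """Resample a Vicon trajectory (~100 Hz) onto the Kinect
    timeline (~30 Hz) by linear interpolation."""
    signal = np.asarray(signal, dtype=float)
    t_src = np.arange(len(signal)) / src_hz
    t_dst = np.arange(0.0, t_src[-1], 1.0 / dst_hz)
    return np.interp(t_dst, t_src, signal)
```

Once both streams share a common start point (S2 Fig) and sample rate, per-joint comparisons such as the 3D distance difference can be computed sample by sample.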

S2 Dataset. Validation results of extent of reach metrics.

https://doi.org/10.1371/journal.pone.0202338.s005

(XLSX)

S3 Dataset. Validation results of max and mean speed metrics.

https://doi.org/10.1371/journal.pone.0202338.s006

(XLSX)

References

  1. Chow Y, Susilo W, Phillips J, Baek J, Vlahu-Gjorgievska E. Video games and virtual reality as persuasive technologies for health care: an overview. Journal of Wireless Mobile Networks, Ubiquitous Computing, and Dependable Applications. 2017;8(3):18–35.
  2. Pourmand A, Davis S, Lee D, Barber S, Sikka N. Emerging utility of virtual reality as a multidisciplinary tool in clinical medicine. Games Health J. 2017;6(5):263–70. pmid:28759254
  3. Alexandrova IV, Rall M, Breidt M, Tullius G, Kloos U, Bulthoff HH, et al. Enhancing medical communication training using motion capture, perspective taking and virtual reality. Stud Health Technol Inform. 2012;173:16–22. pmid:22356950
  4. Anderson KR, Woodbury ML, Phillips K, Gauthier LV. Virtual reality video games to promote movement recovery in stroke rehabilitation: a guide for clinicians. Archives of Physical Medicine and Rehabilitation. 96(5):973–6. pmid:25910856
  5. Broeren J, Bjorkdahl A, Pascher R, Rydmark M. Virtual reality and haptics as an assessment device in the postacute phase after stroke. Cyberpsychol Behav. 2002;5(3):207–11. pmid:12123242
  6. Baheux K, Yoshizawa M, Tanaka A, Seki K, Handa Y. Diagnosis and rehabilitation of hemispatial neglect patients with virtual reality technology. Technol Health Care. 2005;13(4):245–60. pmid:16055973
  7. Acar G, Altun GP, Yurdalan S, Polat MG. Efficacy of neurodevelopmental treatment combined with the Nintendo® Wii in patients with cerebral palsy. J Phys Ther Sci. 2016;28(3):774–80. pmid:27134357
  8. Abdulsatar F, Walker RG, Timmons BW, Choong K. “Wii-Hab” in critically ill children: a pilot trial. J Pediatr Rehabil Med. 2013;6(4):193–204. pmid:24705654
  9. Chanpimol S, Seamon B, Hernandez H, Harris-Love M, Blackman MR. Using Xbox kinect motion capture technology to improve clinical rehabilitation outcomes for balance and cardiovascular health in an individual with chronic TBI. Arch Physiother. 2017;7:6. pmid:28824816
  10. Chang CY, Lange B, Zhang M, Koenig S, Requejo P, Somboon N, et al. Towards pervasive physical rehabilitation using Microsoft Kinect. In: 2012 6th International Conference on Pervasive Computing Technologies for Healthcare (PervasiveHealth) and Workshops; 2012 May 21-24; San Diego, USA. New York: IEEE; 2012. p. 159–62.
  11. Sugarman H, Weisel-Eichler A, Burstin A, Brown R. Use of novel virtual reality system for the assessment and treatment of unilateral spatial neglect: A feasibility study. In: 2011 International Conference on Virtual Rehabilitation; 2011 June 27-29; Zurich, Switzerland. New York: IEEE; 2011. p. 1–2.
  12. VirtualRehab. VirtualRehab; [cited 2017 November 19]. Available from: http://www.virtualrehab.co/
  13. Reflexion Health. Reflexion Health; [cited 2017 November 19]. Available from: http://reflexionhealth.com/
  14. Wilson JD, Khan-Perez J, Marley D, Buttress S, Walton M, Li B, et al. Can shoulder range of movement be measured accurately using the Microsoft Kinect sensor plus Medical Interactive Recovery Assistant (MIRA) software? J Shoulder Elbow Surg. 2017;26(12):e382–e9. pmid:28865963
  15. VirtualRehab. MotionCare360; [cited 2017 November 19]. Available from: http://www.motioncare360.com/
  16. 5Plus Therapy. Making physical therapy more efficient; [cited 2017 November]. Available from: http://www.5plustherapy.com/#solution
  17. Pathirana PN, Li S, Trinh HM, Seneviratne A. Robust real-time bio-kinematic movement tracking using multiple kinects for tele-rehabilitation. IEEE Transactions on Industrial Electronics. 2016;63(3):1822–33.
  18. Brennan DM, Mawson S, Brownsell S. Telerehabilitation: enabling the remote delivery of healthcare, rehabilitation, and self management. Stud Health Technol Inform. 2009;145:231–48. pmid:19592797
  19. Kurillo G, Chen A, Bajcsy R, Han JJ. Evaluation of upper extremity reachable workspace using Kinect camera. Technol Health Care. 2013;21(6):641–56. pmid:24284552
  20. Mousavi Hondori H, Khademi M. A review on technical and clinical impact of microsoft kinect on physical therapy and rehabilitation. Journal of Medical Engineering. 2014;2014:16.
  21. Geerse DJ, Coolen BH, Roerdink M. Kinematic validation of a multi-kinect v2 instrumented 10-meter walkway for quantitative gait assessments. PLoS One. 2015;10(10):e0139913. pmid:26461498
  22. Muller B, Ilg W, Giese MA, Ludolph N. Validation of enhanced kinect sensor based motion capturing for gait assessment. PLoS One. 2017;12(4):e0175813. pmid:28410413
  23. Motiian S, Pergami P, Guffey K, Mancinelli CA, Doretto G. Automated extraction and validation of children’s gait parameters with the Kinect. BioMedical Engineering OnLine. 2015;14:112. pmid:26626555
  24. Springer S, Yogev Seligmann G. Validity of the kinect for gait assessment: a focused review. Sensors (Basel). 2016;16(2):194.
  25. Lange B, Koenig S, Chang CY, McConnell E, Suma E, Bolas M, et al. Designing informed game-based rehabilitation tasks leveraging advances in virtual reality. Disabil Rehabil. 2012;34(22):1863–70. pmid:22494437
  26. Proffitt R, Lange B. Considerations in the efficacy and effectiveness of virtual reality interventions for stroke rehabilitation: moving the field forward. Phys Ther. 2015;95(3):441–8. pmid:25343960
  27. Vicon. Vicon Clinical Science: Vicon; [cited 2017 November 19]. Available from: https://www.vicon.com/motion-capture/life-sciences.
  28. Microsoft. JointType enumeration: Microsoft Developer Network; [cited 2017 November 19]. Available from: https://msdn.microsoft.com/en-us/library/microsoft.kinect.jointtype.aspx
  29. Vicon Motion Systems. Full body modeling with Plug in Gait: Vicon; [cited 2017 November 19]. Available from: https://docs.vicon.com/display/Nexus26/Full+body+modeling+with+Plug-in+Gait.
  30. LifeModeler. Marker Placement Protocols: LifeModeler; [cited 2017 November 19]. Available from: http://www.lifemodeler.com/LM_Manual_2010/A_motion.shtml.
  31. Hartmann M, Kreuzpointner F, Schwirtz A, Haas JP. Improved accuracy with an optimized plug-in-gait protocol. Gait & Posture. 2014;39(1):S109.
  32. Nixon ME, Howard AM, Chen YP. Quantitative evaluation of the Microsoft Kinect for use in an upper extremity virtual rehabilitation environment. In: 2013 International Conference on Virtual Rehabilitation (ICVR); 2013 August 26-29; Philadelphia, USA. New York: IEEE; 2013. p. 222–8.
  33. Ma M, Proffitt R, Skubic M. Quantitative assessment and validation of a stroke rehabilitation game. In: 2017 IEEE/ACM International Conference on Connected Health: Applications, Systems and Engineering Technologies (CHASE); 2017 July 17-19; Philadelphia, USA. New York: IEEE; 2017. p. 255–7.
  34. Otte K, Kayser B, Mansow-Model S, Verrel J, Paul F, Brandt AU, et al. Accuracy and reliability of the Kinect version 2 for clinical measurement of motor function. PLoS One. 2016;11(11):e0166532. pmid:27861541
  35. Guess TM, Razu S, Jahandar A, Skubic M, Huo Z. Comparison of 3D joint angles measured with the kinect 2.0 skeletal tracker versus a marker-based motion capture system. Journal of Applied Biomechanics. 2017;33(2):176–81. pmid:27918704
  36. Mentiplay BF, Perraton LG, Bower KJ, Pua YH, McGaw R, Heywood S, et al. Gait assessment using the Microsoft Xbox One Kinect: Concurrent validity and inter-day reliability of spatiotemporal and kinematic variables. J Biomech. 2015;48(10):2166–70. pmid:26065332
  37. Stone E, Skubic M. Evaluation of an inexpensive depth camera for in-home gait assessment. Journal of Ambient Intelligence and Smart Environments. 2011;3(4):349–61.
  38. Boone DC, Azen SP, Lin CM, Spence C, Baron C, Lee L. Reliability of goniometric measurements. Physical Therapy. 1978;58(11):1355–60. pmid:704684
  39. Horger MM. The reliability of goniometric measurements of active and passive wrist motions. Am J Occup Ther. 1990;44(4):342–8. pmid:2330964
  40. van de Pol R, van Trijffel E, Lucas C. Inter-rater reliability for measurement of passive physiological range of motion of upper extremity joints is better if instruments are used: a systematic review. Journal of Physiotherapy. 2010;56(1):7–17. pmid:20500132
  41. Mitchell K, Gutierrez SB, Sutton S, Morton S, Morgenthaler A. Reliability and validity of goniometric iPhone applications for the assessment of active shoulder external rotation. Physiotherapy Theory and Practice. 2014;30(7):521–5. pmid:24654927
  42. Proffitt R, Lange B. Feasibility of a customized, in-home, game-based stroke exercise program using the Microsoft Kinect® sensor. Int J Telerehabil. 2015;7(2):23–34. pmid:27563384
  43. Hotrabhavananda B, Mishra AK, Skubic M, Hotrabhavananda N, Abbott C. Evaluation of the microsoft kinect skeletal versus depth data analysis for timed-up and go and figure of 8 walk tests. In: 2016 38th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC); 2016 August 16-20; Orlando, USA. New York: IEEE; 2016. p. 2274–77.