We designed a virtual reality (VR) test with a motion capture system to assess instrumental activities of daily living (IADL). We propose a detailed kinematic analysis of the participant's movements, including motion trajectory, moving distance, and time to completion, to evaluate IADL capabilities.
The inability to complete instrumental activities of daily living (IADL) is a precursor to various neuropsychological diseases. Questionnaire-based assessments of IADL are easy to use but prone to subjective bias. Here, we describe a novel virtual reality (VR) test to assess two complex IADL tasks: handling financial transactions and using public transportation. While a participant performs the tasks in a VR setting, a motion capture system traces the position and orientation of the dominant hand and head in a three-dimensional Cartesian coordinate system. Kinematic raw data are collected and converted into 'kinematic performance measures,' i.e., motion trajectory, moving distance, and time to completion. Motion trajectory is the path of a particular body part (e.g., dominant hand or head) in space. Moving distance refers to the total distance traveled along the trajectory, and time to completion is the time taken to complete an IADL task. These kinematic measures could discriminate patients with cognitive impairment from healthy controls. The development of this kinematic measuring protocol allows detection of early IADL-related cognitive impairments.
Instrumental activities of daily living (IADL), such as handling financial transactions, using public transportation, and cooking, serve as clinical markers because they require multiple neuropsychological functions1. Impaired IADL capabilities are thus considered precursors to neurological diseases such as mild cognitive impairment (MCI) and dementia2. Gold's comprehensive review of IADL tasks3 indicated that the more cognitively demanding tasks, such as managing finances and using public transportation, were the earliest predictors of MCI and dementia.
To date, the most commonly used assessments of IADL are self-reported questionnaires, informant-based questionnaires, and performance-based assessments4. Questionnaire-based assessments of IADL are cost-effective and easy to use, but are prone to subjective bias. For instance, when self-reporting, patients tend to over- or under-estimate their IADL capabilities5. Similarly, informants may misjudge IADL capabilities because of the observer's misperceptions or knowledge gaps4. Thus, performance-based assessments that ask patients to carry out specific IADL tasks have been preferred, although many of these tasks are impractical in a general clinical setting6.
Recent virtual reality (VR) studies have shown that this technology could have significant applications in medicine and healthcare, ranging from training and rehabilitation to medical assessment7. All participants can be tested under the same VR conditions, which mimic the real world. For instance, Allain et al.8 developed a virtual coffee-making task and showed that patients with cognitive impairment performed the task poorly. Klinger et al.9 developed another VR environment for mailing and shopping tasks and found a meaningful relationship between task completion time in VR and neuropsychological test results. Previous VR studies of IADL assessment, however, have mostly focused on simple performance measures, such as reaction time or accuracy, obtained with conventional input devices such as a mouse and keyboard8,9. More detailed performance data on IADL are thus needed to efficiently screen for patients with MCI4.
Kinematic analysis of real-time motion capture data is a powerful approach to quantitatively document detailed performance data associated with IADL tasks. For example, White et al.10 developed a virtual kitchen that captures the participant's joint angle data during daily living tasks and used the captured data to quantitatively assess the effectiveness of physical therapy. Dimbwadyo-Terrer et al.11 developed an immersive VR environment to assess upper limb performance during basic daily living tasks and showed that kinematic data recorded in the VR environment correlated highly with functional scales of the upper limb. Such kinematic analyses with motion capture systems could provide a further opportunity to rapidly assess a patient's cognitive impairment12. Indeed, including detailed kinematic data when screening for MCI significantly improved the discrimination of patients from healthy controls13.
Here, we describe a protocol to assess the kinematics of daily living movements with motion capture systems in an immersive VR environment. The protocol comprises two complex IADL tasks: "Task 1: Withdraw money" (handling financial transactions) and "Task 2: Take a bus" (using public transportation). While the tasks are performed, a motion capture system traces the position and orientation of the dominant hand and head. For Task 1, the dominant hand trajectory, moving distance, and time to completion are collected; for Task 2, the head trajectory, moving distance, and time to completion are collected. The Representative Results section details a preliminary comparison of patients with MCI (i.e., impaired IADL capabilities) and healthy controls (i.e., intact IADL capabilities).
All experimental procedures described here were approved by the Institutional Review Board of Hanyang University, in accordance with the Declaration of Helsinki (HYI-15-029-2). Six healthy controls (4 males and 2 females) and six MCI patients (3 males and 3 females) were recruited from a tertiary medical center, Hanyang University Hospital.
1. Recruit Participants
2. Install VR Software and Connect Computers
3. Set Up Motion Capture Systems in a Virtual Environment
4. Prepare a Virtual Environment for Use
5. Familiarize the Participant with the Virtual Environment
6. Perform "Task 1: Withdraw money"
CAUTION: Counterbalance the order of Task 1 and Task 2 across participants to avoid carry-over effects (a minimal order-assignment sketch is shown after this protocol list).
7. Perform "Task 2: Take a bus"
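The following is a minimal sketch, not taken from the authors' materials, of how the task order mentioned in the caution above could be counterbalanced across participants. The participant count of 12 matches the study sample, but the assignment procedure itself is an assumption.

```r
# Minimal sketch (an assumption, not from the authors' materials): assign a
# counterbalanced task order so that half of the participants start with
# Task 1 and half with Task 2.
set.seed(42)                                   # make the assignment reproducible
n_participants <- 12                           # 6 MCI patients + 6 healthy controls
orders <- rep(c("Task1-Task2", "Task2-Task1"), length.out = n_participants)
assignment <- data.frame(
  participant = seq_len(n_participants),
  task_order  = sample(orders)                 # randomly permute the balanced orders
)
print(assignment)
```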
CSV files from "Task 1: Withdraw money" were analyzed using the statistical software R to calculate the dominant hand trajectory, moving distance, and time to completion. The trajectory of the dominant hand movement is visualized (Figure 6). The moving distance of the dominant hand is calculated by summing the total distances between sequential hand positions while performing Task 1. The distance between positions is the Euclidian distance. Time to completion means the time taken to finish the whole task (i.e., from step 1 "insert the card into the ATM" to step 8 "take money from the ATM"). For the R code for statistical analysis, see the attached "Task 1 R Code.docx" file in Supplemental File 3.
CSV files from "Task 2: Take a bus" are analyzed to calculate the head trajectory, moving distance, and time to completion using the R statistical software. The trajectory of the head movement is visualized (Figure 7). The moving distance of the head is calculated by summing the total distances between sequential head positions when performing Task 2. The distance between two positions is the Euclidian distance. The time to completion means the time taken from the start to the end of the whole task with eight target buses. For the R code for statistical analysis, see the attached "Task 2 R Code.docx" file in Supplemental File 4.
Anthropometric characteristics and the kinematic measures from patients with MCI and healthy controls are shown in Table 1. This VR test with motion capture systems presents new opportunities for measuring the kinematics of complex IADL tasks. By following the protocol presented here, researchers can obtain kinematic performance data for "Task 1: Withdraw money" (handling financial transactions) and "Task 2: Take a bus" (using public transportation).
Indeed, a case-control study using this protocol, with several statistical analyses (multivariate analysis of variance, Pearson correlation analysis, and forward stepwise linear discriminant analysis), is reported in our empirical study13.
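The following is a minimal sketch of how such analyses could be run in R on illustrative synthetic data. The simulated group means and standard deviations are seeded from Table 1, but the simulated values, column names, and the hypothetical cognitive score "moca" are assumptions, not the authors' dataset or code (a plain LDA is shown rather than the forward stepwise variant reported in the study).

```r
# Minimal sketch of the reported analyses on illustrative synthetic data.
library(MASS)   # provides lda()

set.seed(1)
df <- data.frame(
  group    = factor(rep(c("MCI", "Control"), each = 6)),
  distance = c(rnorm(6, 34.7, 9.1), rnorm(6, 52.5, 10.5)),  # Task 1 moving distance (m)
  time     = c(rnorm(6, 1.8, 0.3),  rnorm(6, 1.3, 0.2)),    # Task 1 time to completion (min)
  moca     = c(rnorm(6, 24, 2),     rnorm(6, 28, 1))        # hypothetical cognitive score
)

# Multivariate analysis of variance on the kinematic measures
summary(manova(cbind(distance, time) ~ group, data = df))

# Pearson correlation between a kinematic measure and the cognitive score
cor.test(df$distance, df$moca, method = "pearson")

# Linear discriminant analysis classifying MCI patients vs. healthy controls
fit_lda <- lda(group ~ distance + time + moca, data = df)
table(predicted = predict(fit_lda)$class, actual = df$group)
```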
Figure 1: A room-sized immersive virtual environment.
Figure 2: Preparation before the assessment. (A) The subject wears stereoscopic glasses. (B) Reflective markers are attached to the dominant hand and head.
Figure 3: Virtual hand representation in the virtual environment. (A) A white sphere represents the position of the index finger; the participant clicks the virtual number "2" button. (B) The participant clicks the virtual number "4" button.
Figure 4: Task 1: Withdraw money from the ATM. (A) The participant enters a PIN code into the ATM. (B) The participant withdraws money from the ATM.
Figure 5: Task 2: Take a bus. (A) The participant waits at the bus stop. (B) The participant walks out of the bus stop and into the target bus.
Figure 6: Task 1: Hand movement trajectory in 3D Cartesian space. (A) Healthy controls. (B) MCI patients.
Figure 7: Task 2: Head movement trajectory in 3D Cartesian space. (A) Healthy controls. (B) MCI patients.
 | MCI patients | Healthy controls
Number (male) | 6 (3) | 6 (4)
Age (years) | 72.4 ± 1.9 | 72.6 ± 1.7
Task 1: Withdraw money | |
Moving distance (m) | 34.7 ± 9.1 | 52.5 ± 10.5
Time to completion (min) | 1.8 ± 0.3 | 1.3 ± 0.2
Task 2: Take a bus | |
Moving distance (m) | 100.3 ± 11.4 | 128.5 ± 14.2
Time to completion (min) | 13.5 ± 0.2 | 13.5 ± 0.2
Table 1: Anthropometric characteristics and kinematic measures. Values are means ± SD.
Supplemental File 1: Task 1 Withdraw Money.zip.
Supplemental File 2: Task 2 Take a Bus.zip.
Supplemental File 3: Task 1 R Code.docx.
Supplemental File 4: Task 2 R Code.docx.
We have detailed a protocol for kinematic measurement of daily living movements with motion capture systems in an immersive VR environment. First, the experimental setup described how to set up the immersive VR environment, prepare it for use, and familiarize participants with it. Second, we developed two standardized IADL tasks in VR. Third, Step 3 and Step 5 of the Protocol are the most critical steps for minimizing VR sickness. When setting up the motion capture systems in the virtual environment (Step 3), it is important to mount the tracking cameras high enough to fully cover the capture volume, fix the cameras securely to prevent movement during capture, ensure that at least two cameras can capture an object simultaneously, and remove any extraneous reflections or unnecessary markers from the capture environment. While familiarizing the participants with VR (Step 5), it is crucial to provide enough training for them to become accustomed to the virtual experience. If a participant experiences any VR sickness symptoms (e.g., discomfort, headache, nausea, vomiting, pallor, sweating, fatigue, drowsiness, disorientation, or apathy), the experiment should be stopped. Finally, the raw kinematic data were processed with the R statistical software to derive the kinematic performance measures.
A limitation and challenge of our protocol is that the virtual IADL tasks should be validated against real IADL tasks. Although previous studies demonstrated that virtual and real tasks are highly correlated in terms of reaction time and accuracy8 as well as clinical and functional measures11, the current kinematic measuring protocol should still be validated against conventional neuropsychological assessments. Building on such validation, the protocol should also be scaled up to cover additional IADL tasks. Another limitation is that this protocol analyzes only typical kinematic measures; more sophisticated kinematic performance measures in a virtual environment, such as acceleration, movement accuracy, and movement efficiency, should be included.
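As a hedged illustration of this point, and not part of the published protocol, speed and acceleration could be derived from the same position data by finite differences; the file and column names below are the same hypothetical ones used in the Representative Results sketches.

```r
# Hedged illustration (not part of the published protocol): derive speed and
# acceleration from the position data via finite differences, assuming the
# hypothetical "time", "x", "y", "z" columns used in the sketches above.
hand <- read.csv("task1_dominant_hand.csv")

dt    <- diff(hand$time)                                              # seconds between samples
speed <- sqrt(diff(hand$x)^2 + diff(hand$y)^2 + diff(hand$z)^2) / dt  # m/s
accel <- diff(speed) / dt[-1]                                         # m/s^2

summary(speed)
summary(accel)
```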
The significance of the current kinematic measurement protocol is that it is fast, safe, easy to perform, and non-invasive for detecting early IADL deficits. A previous study using this protocol confirmed that kinematic measures, in conjunction with neuropsychological test results, best discriminated MCI patients from healthy controls13. Quantification of specific functional deficits could provide a basis for locating the source and extent of neurological damage and therefore aid clinical decision-making for individualizing therapies18. In this context, the protocol proposed in this article could be used for evidence-based clinical decision-making.
Considering future applications, this protocol could be used for other neuropsychological conditions, such as traumatic brain injury19. It might also be interesting to analyze specific subtasks within the current protocol to identify which types of subtask are more challenging for patients. Moreover, recent VR studies training stroke patients showed improvements in memory and attention following a VR-based game intervention20. It would therefore be of great interest to apply this protocol in additional neuropsychological rehabilitation contexts.
The authors have nothing to disclose.
K.S. and A.L. contributed equally. This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Science, ICT & Future Planning (NRF-2016R1D1A1B03931389).
Name | Company | Model/Version | Comments
Computer | N/A | N/A | Computer requirements: – Single socket H3 (LGA 1150) supports – Intel® Xeon® E3-1200 v3, 4th gen. Core i7/i5/i3 processors – Intel® C226 Express PCH – Up to 32GB DDR3 ECC/non-ECC 1600MHz UDIMM in 4 sockets – Dual Gigabit Ethernet LAN ports – 8x SATA3 (6Gbps) – 2x PCI-E 3.0 x16, 3x PCI-E 2.0 x1, and 2x PCI 5V 32-bit slots – 6x USB 3.0 (2 rear + 4 via headers) – 10x USB 2.0 (4 rear + 6 via headers) – HD Audio 7.1 channel connector by Realtek ALC1150 – 1x DOM power connector and 1x SPDIF Out Header – 800W High Efficiency Power Supply – Intel Xeon E3-1230v3 – DDR3 PC12800 8GB ECC – WD 1TB BLUE WD 10EZEX 3.5" – NVIDIA QUADRO K5000 & SYNC
Stereoscopic 3D Projector | Barco | F35 AS3D WUXGA | Resolution: – WQXGA (2,560 x 1,600) – Panorama (2,560 x 1,080) – WUXGA (1,920 x 1,200), 1080p (1,920 x 1,080) |
Stereoscopic Glasses | Volfoni | Edge 1.2 | For further information, visit http://volfoni.com/en/edge-1-2/ |
Motion Capture Systems | NaturalPoint OptiTrack | 17W | For further information, visit http://optitrack.com/products/prime-17w/ |
OptiTrack (Motion capture software) | NaturalPoint OptiTrack | Motive 2.0 | For further information, visit https://optitrack.com/downloads/motive.html |
MiddleVR (Middleware software) | MiddleVR | MiddleVR For Unity | For further information, visit http://www.middlevr.com/middlevr-for-unity/ |
VRDaemon (Middleware software) | MiddleVR | MiddleVR For Unity | For further information, visit http://www.middlevr.com/middlevr-for-unity/ |
Unity3D (Game engine) | Unity Technologies | Personal | For further information, visit https://unity3d.com/unity |