A Human-machine-interface Integrating Low-cost Sensors with a Neuromuscular Electrical Stimulation System for Post-stroke Balance Rehabilitation

Published: April 12, 2016
doi: 10.3791/52394

Summary

A novel low-cost human-machine interface for an interactive post-stroke balance rehabilitation system is presented in this article. The system integrates off-the-shelf low-cost sensors into a volitionally driven electrotherapy paradigm. The proof-of-concept software interface is demonstrated on healthy volunteers.

Abstract

A stroke occurs when an artery carrying blood from the heart to an area of the brain bursts or a clot obstructs blood flow to the brain, thereby preventing the delivery of oxygen and nutrients. About half of all stroke survivors are left with some degree of disability. Innovative methodologies for restorative neurorehabilitation are urgently required to reduce long-term disability. The ability of the nervous system to reorganize its structure, function, and connections as a response to intrinsic or extrinsic stimuli is called neuroplasticity. Neuroplasticity is involved in post-stroke functional disturbances, but also in rehabilitation. Beneficial neuroplastic changes may be facilitated with non-invasive electrotherapy, such as neuromuscular electrical stimulation (NMES) and sensory electrical stimulation (SES). NMES involves coordinated electrical stimulation of motor nerves and muscles to activate them with continuous short pulses of electrical current, while SES involves stimulation of sensory nerves with electrical current, resulting in sensations that vary from barely perceivable to highly unpleasant. Here, active cortical participation in rehabilitation procedures may be facilitated by driving the non-invasive electrotherapy with biosignals (electromyogram (EMG), electroencephalogram (EEG), electrooculogram (EOG)) that represent simultaneous active perception and volitional effort. To achieve this in a resource-poor setting, e.g., in low- and middle-income countries, we present a low-cost human-machine-interface (HMI) by leveraging recent advances in off-the-shelf video game sensor technology. In this paper, we discuss the open-source software interface that integrates low-cost off-the-shelf sensors for visual-auditory biofeedback with non-invasive electrotherapy to assist postural control during balance rehabilitation. We demonstrate the proof-of-concept on healthy volunteers.

Introduction

An episode of neurological dysfunction caused by focal cerebral, spinal, or retinal infarction is called a stroke1. Stroke is a global health problem and the fourth leading cause of disability worldwide1. In countries like India and China, the two most populous nations of the world, neurologic disability due to stroke has been labeled a hidden epidemic2. One of the most common medical complications after a stroke is falls, with a reported incidence of up to 73% in the first year post-stroke3. Post-stroke falls are multifactorial and include both spinal and supraspinal factors such as balance and visuospatial neglect4. A review by Geurts and colleagues5 identified 1) multi-directionally impaired maximal weight shifting during bipedal standing, 2) slow speed, 3) directional imprecision, and 4) small amplitudes of single and cyclic sub-maximal frontal-plane weight shifts as the balance factors for fall risk. The consequent impact on activities of daily living can be significant, since prior work has shown that balance is associated with ambulatory ability and independence in gross motor function5,6. Moreover, Geurts and colleagues5 suggested that supraspinal multisensory integration (and muscle coordination7), in addition to muscle strength, is critical for balance recovery, which is lacking in current protocols. Towards multisensory integration, our hypothesis8 on volitionally driven non-invasive electrotherapy (NMES/SES) is that this adaptive behavior can be shaped and facilitated by modulating active perception of sensory inputs during NMES/SES-assisted movement of the affected limb such that the brain can incorporate this feedback into subsequent movement output by recruiting alternate motor pathways9, if needed.

To achieve volitionally driven NMES/SES-assisted balance training in a resource-poor setting, a low-cost human-machine-interface (HMI) was developed by leveraging available open-source software and recent advances in off-the-shelf video game sensor technology for visual-auditory biofeedback. NMES involves coordinated electrical stimulation of nerves and muscles, which has been shown to improve muscle strength and reduce spasticity10. Also, SES involves stimulation of sensory nerves with electrical current to evoke sensations, where preliminary published work11 showed that subsensory stimulation applied over the tibialis anterior muscles alone is effective in attenuating postural sway. Here, the HMI will make possible sensory-motor integration during interactive post-stroke balance therapy, where volitionally driven NMES/SES of the ankle muscles will act as a muscle amplifier (with NMES) as well as enhance afferent feedback (with SES) to assist healthy ankle strategies12,13,14 to maintain upright stance during postural sways. This is based on the hypothesis presented in Dutta et al.8 that an increased corticospinal excitability of relevant ankle muscles effected through non-invasive electrotherapy may lead to improved supraspinal modulation of ankle stiffness. Indeed, prior work has shown that NMES/SES elicits lasting changes in corticospinal excitability, possibly as a result of co-activating motor and sensory fibers15,16. Moreover, Khaslavskaia and Sinkjaer17 showed in humans that concurrent motor cortical drive present at the time of NMES/SES enhanced motor cortical excitability. Therefore, volitionally driven NMES/SES may induce short-term neuroplasticity in spinal reflexes (e.g., reciprocal Ia inhibition17), where corticospinal neurons that project via descending pathways to a given motoneuron pool can inhibit the antagonistic motoneuron pool via Ia-inhibitory interneurons in humans18, as shown in Figure 1, towards an operant conditioning paradigm (see Dutta et al.8).

Figure 1
Figure 1: The concept (details in Dutta et al.21) underlying the interactive human machine interface (HMI) to drive the center of pressure (CoP) cursor to the cued target to improve ankle muscle coordination under volitionally driven neuromuscular electrical stimulation (NMES)-assisted visuomotor balance therapy. EEG: electroencephalography, MN: α-motoneuron, IN: Ia-inhibitory interneuron, EMG: electromyogram, DRG: dorsal root ganglion. Reproduced from references 8 and 37.

The antero-posterior (A-P) displacements in center of mass (CoM) are performed by the ankle plantarflexors (such as the medial gastrocnemius and soleus muscles) and dorsiflexors (such as the anterior tibial muscle), while medio-lateral (M-L) displacements are performed by the ankle invertors (such as the anterior tibial muscle) and evertors (such as the peroneus longus and brevis muscles). Consequently, stroke-related ankle impairments, including weakness of the ankle dorsiflexor muscles and increased spasticity of the ankle plantarflexor muscles, lead to impaired postural control. Here, agility training programs6 that challenge dynamic balance can be leveraged in a virtual reality (VR) based gaming platform where tasks are progressively increased in difficulty, which may be more effective than a static stretching/weight-shifting exercise program in preventing falls6. For example, subjects can perform volitionally driven NMES/SES-assisted A-P and M-L displacements during a dynamic visuomotor balance task where the difficulty can be progressively increased to ameliorate post-stroke ankle-specific control problems in weight shifting during bipedal standing. Towards volitionally driven NMES/SES-assisted balance therapy in a resource-poor setting, we present a low-cost HMI for Mobile Brain/Body Imaging (MoBI)19 with visual-auditory biofeedback, which can also be used for data collection from low-cost sensors for offline data exploration in MoBILAB (see Ojeda et al.20).

Protocol

Note: The HMI software pipeline was developed based on freely available open-source software and off-the-shelf low-cost video game sensors (details available at: https://team.inria.fr/nphys4nrehab/software/ and https://github.com/NeuroPhys4NeuroRehab/JoVE). The HMI software pipeline is provided for data collection during a modified functional reach task (mFRT)21 in a VR based gaming platform for visuomotor balance therapy (VBT)8.

Figure 2a shows the diagnostic eye tracker setup where the gaze features are extracted offline for the quantification of post-stroke residual function so that the visual feedback in VR can be customized accordingly.

Figure 2b shows the experimental setup for VBT.

Figure 2
Figure 2: (a) Schematic of the human-machine-interface for the evaluation of post-stroke pursuit eye movements. (b) Schematic of the human-machine-interface where the software interface integrates biosignal sensors and motion capture to record mobile brain/body imaging data with neuromuscular electrical stimulation (NMES) and sensory electrical stimulation (SES) for post-stroke NMES/SES-assisted visuomotor balance therapy. NMES: Neuromuscular Electrical Stimulation, SES: Sensory Electrical Stimulation, EMG: Electromyogram, EEG: Electroencephalogram, EOG: Electrooculogram, CoP: Center of Pressure, PC: Personal Computer. Reproduced from references 8 and 37.

1. Software Installation for Mobile Brain/Body Imaging During VBT

  1. Install drivers for the Motion Capture (installation procedures provided at https://code.google.com/p/labstreaminglayer/wiki/KinectMocap)
    1. Download and install Kinect Runtime from http://go.microsoft.com/fwlink/?LinkId=253187 (Motion Capture sensor should not be plugged into any of the USB ports on the computer).
    2. Plug in the powered Motion Capture Sensor into a USB port via the interface cable. The drivers will load automatically.
  2. Install drivers for the Eye Tracker Sensor (installation procedures provided at http://github.com/esdalmaijer/EyeTribe-Toolbox-for-Matlab)
    1. Download the software from http://theeyetribe.com and launch the application to install the software (Eye Tracker sensor should not be plugged into any of the USB ports on the computer).
    2. Plug in the powered Eye Tracker Sensor and the drivers will load automatically.
  3. Install drivers for the Balance Board (installation procedures provided at http://www.colorado.edu/intphys/neuromechanics/cu_wii.html)
    1. Download and extract CU_WiiBB.zip from http://www.colorado.edu/intphys/neuromechanics/CU_WiiBB.zip
    2. Copy the WiiLab folder to the Microsoft Windows operating system's standard Program Files directory.
    3. Open the WiiLab folder in the Program Files directory and run the InstallWiiLab.bat file as an administrator to install the Balance Board.
  4. Install drivers for EEG/EOG (installation procedures provided at http://openvibe.inria.fr/how-to-connect-emotiv-epoc-with-openvibe/ )
    1. Download and install Emotiv SDK from http://www.emotiv.com/apps/sdk/209/
    2. Download and install OpenViBE Acquisition Server with labstreaminglayer (LSL) from https://code.google.com/p/labstreaminglayer/downloads/detail?name=OVAS-withLSL-0.14.3-3350-svn.zip for distributed multi-sensor signal transport, time synchronization and data collection system (installation procedures provided at https://code.google.com/p/labstreaminglayer/).
  5. Install the drivers for the commercial NMES stimulator (details at http://www.vivaltis.com/gammes/phenix/phenix-usb-neo-50-554-1.html#content).
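Once the drivers and the lab streaming layer (LSL) from this section are installed, the sensor streams can also be accessed programmatically. The following minimal MATLAB sketch assumes that the liblsl-Matlab library is on the MATLAB path and that an 'EEG' stream is being published by the OpenViBE acquisition server configured in step 1.4; it resolves the stream and pulls time-stamped samples. It is a conceptual sketch, not part of the provided HMI pipeline.

```matlab
% Minimal sketch: pull time-stamped EEG samples over the lab streaming layer (LSL).
% Assumes liblsl-Matlab is on the path and an 'EEG' stream is being published
% by the OpenViBE acquisition server (step 1.4).
lib = lsl_loadlib();                                     % load the LSL library
streams = lsl_resolve_byprop(lib, 'type', 'EEG', 1, 5);  % wait up to 5 s for one stream
if isempty(streams)
    error('No EEG stream found on the network.');
end
inlet = lsl_inlet(streams{1});                           % open an inlet on the first match
for k = 1:100                                            % pull a short demo segment
    [sample, ts] = inlet.pull_sample();                  % blocking pull: data + LSL timestamp
    fprintf('%0.4f s : %s\n', ts, mat2str(sample, 3));
end
inlet.close_stream();
```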

2. Low-cost Sensor Placement for Mobile Brain/Body Imaging (MoBI)

Note: The open-source HMI software pipeline provides mobile brain/body imaging (MoBI)19 with low-cost off-the-shelf sensors (Figure 2b) and can be adapted for other agility training programs.

  1. Visual Feedback for MoBI:
    1. Begin by obtaining a projection screen to display the visual biofeedback at one end of the room (recommended distance from the subject: 0.6 m).
    2. Adjust the height so that the center of the screen is at the subject's eye level.
  2. Motion Capture for MoBI:
    1. Place the motion capture sensor in front of the projection screen, and aim it at the volume of motion capture.
    2. Confirm that the volume of motion capture is 1.5 m to 2.5 m in front of the motion capture sensor.
  3. Balance Board Placement for MoBI:
    1. Place the Balance Board on the floor, about 2.0 m away from the motion capture sensor.
    2. Leave enough room around the Balance Board to ensure full-body movement (i.e., during modified functional reach task21).
  4. EEG/EMG/EOG Sensor Placement for MoBI
    1. Ask the subject to sit on a chair facing the Motion Capture and with their feet on the Balance Board.
    2. Place the combined recording (EMG) and stimulation (NMES/SES) electrodes bilaterally on the Medial Gastrocnemius (MG) and Tibialis Anterior (TA) muscles of the subject. Then, connect them to the wireless electrical stimulator (NMES/SES) system.
    3. Place the electroencephalogram (EEG) cap on the subject's head following the international 10-20 system. Then, place the EEG electrodes with conductive paste at Fz, C3, Cz, C4, P3, Pz, P4, PO7, Oz, and PO8 before connecting them to the wireless EEG headset.
    4. Place two EEG electrodes with conductive paste above and below one of the eyes for vertical EOG, and two electrodes with conductive paste at the outer canthus of each eye for horizontal EOG. (Note: If the Eye Tracker sensor is not used with the post-stroke subject, bilateral EOG is preferable.)
    5. Place two EEG electrodes on the earlobes as reference electrodes.

3. Eye Tracker Based Evaluation of Post-stroke Pursuit Eye Movements

  1. Ask the subject to sit with the chin resting comfortably on the height-adjustable Chin-Rest. Then, raise the computer monitor to a convenient height such that the eyes are roughly facing the center of the computer monitor (Figure 2a).
  2. Place the Eye Tracker roughly 50 cm from the Chin-Rest and ask the subject to look straight at the computer monitor for visual cues.
  3. Run EyeTribeWinUI.exe in the 'SmartEye' folder to calibrate the Eye Tracker sensor. The subject will be asked to look at various targets on the PC monitor for roughly 2 sec each. A typical user calibration process takes approximately 20 sec to complete. The (x, y) coordinates of the subject's gaze point are recorded for different cued targets for calibration.
  4. Run 'Visual_Stimulus.exe' in the SmartEye folder to execute the virtual reality based interface. Subsequently, run the 'SmartEye.exe' program in the 'SmartEye' folder to acquire the subject's eye gaze data synchronized with the virtual reality based task. This data will be used for the evaluation of post-stroke pursuit eye movements.
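The vendor utility performs this point-based calibration internally. Purely to illustrate the idea of mapping raw gaze estimates to cued target coordinates, the MATLAB sketch below fits an affine calibration by least squares on synthetic data; all variable names and values are hypothetical and this is not the routine used by the eye tracker.

```matlab
% Illustrative sketch (not the vendor's internal calibration): fit an affine
% map from raw gaze estimates to the cued on-screen calibration targets.
targets = [160 120; 800 120; 480 400; 160 680; 800 680];  % cued target pixels (x, y)
rawGaze = 0.95*targets + 15 + 2*randn(size(targets));     % synthetic raw gaze samples
A = [rawGaze, ones(size(rawGaze, 1), 1)];                 % augment with a bias column
M = A \ targets;                                          % 3x2 affine map (least squares)
calibrated = [rawGaze, ones(size(rawGaze, 1), 1)] * M;    % apply the calibration
rmsErr = sqrt(mean(sum((calibrated - targets).^2, 2)));   % residual error in pixels
fprintf('Calibration RMS error: %.2f px\n', rmsErr);
```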

Figure 3
Figure 3: (a) Cursor representing the center of pressure (CoP), which needs to be volitionally driven to the cued target during visuomotor balance therapy. (b) Visuomotor balance therapy protocol where the subject steers the computer cursor to a peripheral target driven by volitionally generated CoP excursions. The Reset can be assisted with Neuromuscular Electrical Stimulation (NMES) and sensory electrical stimulation (SES). (c) Experimental setup for visually-cued visuomotor balance therapy. Reproduced from references 8 and 37.

4. NMES/SES-Assisted Visuomotor Balance Therapy (VBT) under MoBI

  1. Connect the eye-tracker and balance board sensors to the visual feedback computer (Figure 2).
    1. Make sure that the Eye Tracker sensor is powered on, connected to the computer, and fully booted. Start the 'EyeTribe server.exe' and 'EyeTribeWinUI.exe' available in the 'VBT' folder (see step 1.2).
    2. Make sure that the Balance Board sensor is powered on. Then, press the button on the Balance Board sensor to make the remote discoverable. Then, click on the show or hide icon in the system's taskbar and click on the Bluetooth device icon. Then, click on the 'Add a Device' option and pair the Balance Board sensor as a Bluetooth device (without using a pairing code) with the visual feedback computer. Once the Balance Board sensor is connected to the visual feedback computer, open the 'VBT' folder and run the WiiBBinterface.m file to establish the Matlab-Balance Board sensor interface (see step 1.3).
    3. Make sure that the Motion Capture sensor is powered on, connected to the computer, and fully booted (there is a green LED on the front). Open the LSL folder and start the 'Mocap' software to begin streaming of the motion capture sensor data (see step 1.1).
    4. Make sure that the EEG/EOG data acquisition systems are powered on. Then, double-click on the openvibe-acquisition-server-withlsl.cmd available in the LSL folder (see step 1.4). From the menu, select the respective sensor hardware (i.e., 'Emotiv EPOC') and configure the module, if necessary, by clicking on 'Driver Properties'. Then, click on 'Connect', and then click on 'Play' to start the acquisition server.
  2. Calibrate the Sensors for VBT
    1. Ask the post-stroke subject to stand on the Balance Board with safety harness (and partial body weight support, if necessary).
    2. Set a minimum baseline NMES level (pulse-width and current level) necessary for upright standing according to clinical observation (i.e., zero body weight support)22. For setting the minimum baseline NMES level, set the stimulation frequency at 20 Hz and then increase the pulse-width and/or current level until upright standing is achieved. Here, NMES of the knee extensors is required to generate enough torque to prevent knee buckling.
    3. Ask the subject to perform various reach movements that affect CoM and CoP location.
    4. Run the 'CalibSensors.m' program available in the 'DataCollect' folder in order to collect multi-sensor calibration data while the subject performs various self-initiated maximal reach movements in different directions that affect center of mass (CoM) and center of pressure (CoP) location on the visual feedback.
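For reference, the CoP streamed during this calibration can be computed from the four Balance Board load cells with the standard corner-weighting formula. The MATLAB sketch below is a minimal version: the inter-sensor spacing of roughly 43.3 x 23.8 cm and the access to the load-cell readings through the interface established in step 4.1.2 are assumptions, not the exact routine inside 'CalibSensors.m'.

```matlab
function [copX, copY] = copFromBalanceBoard(TR, TL, BR, BL)
% Compute center of pressure (CoP, in cm) from the four Balance Board
% load-cell readings (kg): top-right, top-left, bottom-right, bottom-left.
% The half-distances between sensors (~43.3 x 23.8 cm spacing) are assumed.
Lx = 43.3/2;   % half-distance between left and right load cells (cm)
Ly = 23.8/2;   % half-distance between front and back load cells (cm)
total = TR + TL + BR + BL;                      % total vertical load
copX = Lx * ((TR + BR) - (TL + BL)) / total;    % medio-lateral CoP
copY = Ly * ((TR + TL) - (BR + BL)) / total;    % antero-posterior CoP
end
```

During step 4.2.4, such a function would be evaluated on each new sample streamed by the board interface so that the CoP trajectory can be mapped onto the visual feedback.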

5. Multi-sensor Data Collection from Low-cost Sensors During VBT (Figure 2b)

  1. Run the 'CollectBaseline.m' program in the 'DataCollect' folder to collect baseline resting-state, eyes-open, multi-sensor data by asking the subject to stand still for 2 min while looking straight at the CoP target on the PC monitor (Figure 3a).
  2. Connect the visual feedback computer's video output to the projection screen and run the SmartEyeVRTasks.exe file in the 'VBT' folder in the visual feedback computer to launch the SmartEyeVRTasks GUI. Also, run 'CollectVBT.m' program in the 'DataCollect' folder to collect sensor data during VBT.
    1. From upright standing, called the 'Central hold' phase, ask the subject to steer the cursor, driven by the CoP, as fast as possible towards a randomly presented peripheral target as cued by visual feedback (Figure 3b).
    2. Following this 'Move' phase, ask the subject to hold the cursor at the target location for 1 sec during the 'Peripheral hold' phase.
    3. Following the 'Peripheral hold' phase, the cursor will 'Reset' back to the center, and the subject needs to return to upright standing – the 'Central hold' position. NMES/SES is triggered for a muscle when its EMG level goes above a set threshold, to assist the volitional effort required to return the CoP to the 'Central hold' position.
      Note: The difficulty of the mFRT can be increased by decreasing the gain, G, or increasing the noise variance, σ², within a subject-specific feasible range:
      x_c[k] = G · CoP[k] + w[k],  w[k] ~ N(0, σ²)
      where the CoP excursions, CoP[k], drive the computer cursor, x_c[k], at discretized time instants, k, with time-step, Δt.
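As a minimal sketch of the mapping described in the note above (the functional form, parameter values, and EMG threshold are illustrative assumptions, not the validated settings), the following MATLAB snippet simulates how lowering the gain G or raising the noise variance σ² makes the CoP-driven cursor harder to control, and how an EMG threshold could gate the NMES/SES assistance during the 'Reset' phase.

```matlab
% Minimal sketch of the visuomotor mapping used to set mFRT difficulty:
% at each discrete time step the cursor follows the gain-scaled CoP
% excursion plus zero-mean Gaussian noise. All values are illustrative.
G      = 0.8;                 % cursor gain (lower -> harder task)
sigma2 = 4;                   % noise variance, pixels^2 (higher -> harder task)
dt     = 1/50;                % time step (s); balance-board sampling period assumed
t      = (0:dt:10)';          % 10 s of simulated samples
copX   = 80*sin(2*pi*0.3*t);                       % synthetic CoP excursion (pixels)
cursorX = G*copX + sqrt(sigma2)*randn(size(t));    % discretized cursor trajectory

% EMG-gated assistance during the 'Reset' phase (threshold is an assumption):
emgRMS       = abs(randn(size(t)));  % synthetic rectified/smoothed EMG envelope
emgThreshold = 1.5;                  % volitional-effort threshold (arbitrary units)
assistOn     = emgRMS > emgThreshold;              % samples where NMES/SES would fire
fprintf('NMES/SES would be triggered on %d of %d samples.\n', nnz(assistOn), numel(t));
plot(t, copX, t, cursorX); xlabel('time (s)'); ylabel('position (pixels)');
legend('CoP excursion', 'cursor');
```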

Representative Results

Figure 4 shows the eye gaze features that were extracted offline for the quantification of able-bodied performance during a smooth pursuit task. The following features were extracted, as shown in Table 1:

Feature 1 = percentage deviation between the target stimulus position and the centroid of the participant's fixation points when the stimulus is changing position in the horizontal direction.

Feature 2 = percentage deviation between the target stimulus position and the centroid of the participant's fixation points when the stimulus is changing position in the vertical direction.

Feature 3 = blinks per minute.

Feature 4 = percentage of time the participant is looking at the stimulus (eye detected by the eye tracker).

Feature 5 = percentage of time the participant is not looking at the stimulus. (Note: Feature 5 = 100 − Feature 4)

Feature 6 = percentage Smooth Pursuit Length (SPL) overshoot made by the participant, i.e.,

Feature 6 = ((SPL − SML) / SML) × 100

where SPL = Smooth Pursuit Length, the length (in pixels) covered by the participant's gaze while tracking the moving stimulus, and SML = Stimulus Movement Length (in pixels), i.e., the actual length of the path along which the stimulus moves.
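To make these definitions concrete, the following MATLAB sketch computes Features 1-6 on synthetic gaze and stimulus trajectories. The variable layout, the eye-detection flags, and the blink proxy are assumptions about how the tracker log is organized, not the exact offline routine used here.

```matlab
% Illustrative computation of the six gaze features from synthetic,
% time-stamped gaze and stimulus trajectories (real data come from the
% eye tracker log and the VR task).
N = 3000; fs = 30;                                       % 100 s at 30 Hz (assumed)
stimXY = [linspace(100, 900, N)', 400*ones(N, 1)];       % synthetic horizontal sweep (px)
gazeXY = stimXY + 5*randn(N, 2);                         % synthetic gaze samples (px)
valid  = rand(N, 1) > 0.05;                              % eye-detected flags (assumed)
blinkOnsets = find(diff(double(valid)) == -1);           % crude blink proxy

horiz = [abs(diff(stimXY(:,1))) > 0; false];             % stimulus moving horizontally
vert  = [abs(diff(stimXY(:,2))) > 0; false];             % stimulus moving vertically

% Features 1-2: % deviation of the fixation centroid from the stimulus centroid.
f1 = 100 * norm(mean(gazeXY(horiz & valid, :), 1) - mean(stimXY(horiz, :), 1)) ...
         / norm(mean(stimXY(horiz, :), 1));
f2 = NaN;                                                % no vertical sweep in this example
if any(vert)
    f2 = 100 * norm(mean(gazeXY(vert & valid, :), 1) - mean(stimXY(vert, :), 1)) ...
             / norm(mean(stimXY(vert, :), 1));
end

f3 = numel(blinkOnsets) / (N/fs/60);                     % Feature 3: blinks per minute
f4 = 100 * mean(valid);                                  % Feature 4: % time gaze detected on stimulus
f5 = 100 - f4;                                           % Feature 5
SPL = sum(sqrt(sum(diff(gazeXY(valid, :)).^2, 2)));      % smooth-pursuit path length (px)
SML = sum(sqrt(sum(diff(stimXY).^2, 2)));                % stimulus path length (px)
f6  = 100 * (SPL - SML) / SML;                           % Feature 6: % SPL overshoot
fprintf('F1-F6: %.2f %.2f %.2f %.2f %.2f %.2f\n', f1, f2, f3, f4, f5, f6);
```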

Figure 4
Figure 4: Top panel shows an illustrative figure of the smooth pursuit during horizontal movement. Bottom panel shows an illustrative figure of the smooth pursuit during vertical movement. Reproduced from references 8 and 37.

Eye | Feature 1 (%) | Feature 2 (%) | Feature 3 (per minute) | Feature 4 (%) | Feature 5 (%) | Feature 6 (%)
Left Eye | 1.00 | 3.66 | 6.83 | 95.52 | 4.49 | 46.78
Right Eye | 0.67 | 6.00 | 6.34 | 94.40 | 5.60 | 24.99

Table 1: Eye gaze features.

A proof-of-concept VBT study (without NMES/SES) was conducted on 10 able-bodied subjects (5 right-leg dominant males and 5 right-leg dominant females, aged 22 to 46 years) under a modified functional reach task (mFRT) paradigm (Figure 3c). The mFRT is proposed to quantify the subjects' ability to volitionally shift their CoP position as quickly as possible without losing balance while cued with CoP visual biofeedback. During mFRT, multi-sensor data were collected for mobile brain/body imaging (MoBI)19. The MoBI data were processed offline to determine the overall postural sway from the CoP (from the Balance Board) and CoM (from the Motion Capture Sensor) trajectories. Also, features were extracted from the biosignals that were recorded simultaneously along with the gaze behavior (e.g., blink rate, saccadic direction from the electrooculogram). The results from this proof-of-concept study were presented in Dutta et al.8, where alpha event-related desynchronization (aERD%) was found primarily at the parietal and occipital EEG electrodes. Moreover, the mean squared error (MSE) normalized by the baseline value trended towards a decrease, the blink rate trended towards an increase, and the saccadic direction relative to the cursor acceleration trended towards zero during successive trials of the visuomotor task. Based on the data from Dutta et al.8, the EOG data showed that the ratio of the fixation duration on the target to the fixation duration on the cursor before the initiation of the motor response (i.e., EMG onset) – FDratio – increased (Figure 5a), while the baseline normalized mean squared error (MSEnorm) decreased (Figure 5b) during VBT trials.
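For completeness, a minimal sketch of how these two trial-level indices could be computed offline is given below; the exact definitions follow Dutta et al.8, and the variable layout and synthetic numbers are assumptions. MSEnorm is the trial's cursor-target mean squared error divided by its baseline value, and FDratio is the ratio of the fixation duration on the target to that on the cursor before EMG onset.

```matlab
% Illustrative offline computation of the two trial-level indices
% (variable layout is assumed; exact definitions follow Dutta et al.8).
cursorXY = cumsum(randn(500, 2));                    % synthetic cursor trajectory (one trial)
targetXY = repmat([120, 80], 500, 1);                % cued target position for this trial
mseTrial = mean(sum((cursorXY - targetXY).^2, 2));   % cursor-target mean squared error
mseBaseline = 2.5e4;                                 % baseline MSE value (assumed)
MSEnorm = mseTrial / mseBaseline;                    % baseline-normalized MSE

fixTargetDur = 0.9;        % s fixating the target before EMG onset (from EOG, example)
fixCursorDur = 0.4;        % s fixating the cursor before EMG onset (from EOG, example)
FDratio = fixTargetDur / fixCursorDur;               % > 1 indicates more target-oriented gaze
fprintf('MSEnorm = %.2f, FDratio = %.2f\n', MSEnorm, FDratio);
```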

Figure 5
Figure 5: (a) Changes in the ratio of the fixation duration on the target to the fixation duration on the cursor – FDratio – extracted from the electrooculogram during visuomotor balance therapy (VBT) trials. (b) Changes in the baseline normalized mean squared error (MSEnorm) during VBT trials. Reproduced from references 8 and 37.

Discussion

A simple-to-use, clinically valid, low-cost tool for movement and balance therapy would be a paradigm shift for neurorehabilitation in a low-resource setting. It is likely to have a very high societal impact, since neurological disorders like stroke will dramatically increase in the future due to the aging world population2. There is, therefore, a pressing need to leverage cyber-physical systems, where the ability to customize, monitor, and support neuro-rehabilitation at remote sites has recently become possible with the integration of computation, networking, and physical processes via telecommunications. Towards that overarching goal, the low-cost Eye Tracker based evaluation of post-stroke pursuit eye movements can provide not only home-based diagnosis but also therapy, since smooth pursuit eye movement training has been shown to promote recovery from auditory and visual neglect25. Here, the latency of smooth pursuit in healthy subjects has been found to be very consistent for targets moving at 5 degrees/sec or faster, with a mean latency of 100 ± 5 msec26.

Moreover, the proposed human-machine-interface (HMI) integrates biosignal sensors and motion capture with volitionally driven neuromuscular electrical stimulation (NMES) and sensory electrical stimulation (SES) for post-stroke balance therapy, and has potential27,28 as a home-based intervention to improve post-stroke standing balance. The novel part of the HMI is the software interface that integrates multiple off-the-shelf low-cost sensors to record mobile brain/body imaging data and to provide visual-auditory biofeedback during NMES/SES-assisted visuomotor balance therapy (VBT). Based on the healthy subject results from the proof-of-concept study (without NMES/SES), we propose that the multi-sensor information can be fused to estimate the state of motor learning during post-stroke VBT so that the mFRT difficulty can be adapted online. For example, smooth pursuit eye movement training25 can be integrated with the myoelectrically driven NMES/SES-assisted visuomotor task, as presented in Dutta et al.8, where alpha event-related desynchronization at the parietal and occipital EEG electrodes may predict the normalized mean squared error (MSE) in reaching the peripheral targets. Therefore, based on the evaluation of post-stroke pursuit eye movements as well as the gaze behavior during the VBT task, we can objectively analyze and monitor eye-related problems contributing to balance disability, thereby leveraging the residual function during rehabilitation29. Moreover, gaze behavior (e.g., blink rate, saccades) can be used to monitor user engagement during motor learning30.

Motor learning during VBT can be analyzed using a reduced dimension reaction mass pendulum (RMP) biped model, as presented in Dutta et al.24. The reduced dimension RMP model24 can be constructed offline from skeleton tracking data (the joint data streamed out of the Motion Capture sensor in the skeleton stream, Figure 6). The significance of the RMP model over the traditional point-mass pendulum model was evident during the occasional arm swinging used by healthy subjects to regain balance at the limits of stability during mFRT, where the RMP model augmented the traditional point-mass pendulum model by capturing the shape, size, and orientation of the aggregate rotational centroidal inertia. In our prior work21, the CoM-CoP lean-line was found to be a suitable visual feedback of upright posture. Also, we have shown the relevance of whole-body normalized centroidal angular momentum (CAM) during the stand-to-walk transition in post-stroke gait24. Indeed, angular momentum is tightly regulated with segment-to-segment cancellations of angular momentum during human walking31, and possibly in all coordinated human movement including mFRT, to prevent falls. Based on these prior works, it can be postulated that stroke survivors with muscle weakness and coordination deficits will take longer to regulate CAM when compared to age-matched able-bodied subjects. This is currently under investigation using the reduced dimension RMP model24.
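As a rough illustration of this offline analysis (not the full RMP construction), the MATLAB sketch below approximates the whole-body CoM and the centroidal angular momentum from streamed joint positions, treating each tracked joint as a lumped point mass. The joint count, equal mass fractions, and synthetic data are assumptions; the actual RMP model additionally aggregates the segment inertias.

```matlab
% Rough sketch: whole-body CoM and centroidal angular momentum (CAM) from
% skeleton joint positions, treating each joint as a lumped point mass.
nJoints = 20; nFrames = 300; fs = 30; bodyMass = 70;    % synthetic setup (assumed)
joints = 0.02*randn(nJoints, 3, nFrames) + 1;           % synthetic joint positions (m)
massFrac = ones(nJoints, 1)/nJoints;                    % crude equal-mass assumption
m = bodyMass * massFrac;                                % per-joint point masses (kg)

vel = diff(joints, 1, 3) * fs;                          % joint velocities (m/s)
CoM = zeros(nFrames, 3);
for f = 1:nFrames
    CoM(f, :) = sum(m .* joints(:, :, f), 1) / bodyMass;   % mass-weighted CoM
end
vCoM = diff(CoM, 1, 1) * fs;                            % CoM velocity (m/s)

CAM = zeros(nFrames-1, 3);
for f = 1:nFrames-1
    r = joints(:, :, f) - CoM(f, :);                    % positions relative to CoM
    v = vel(:, :, f)    - vCoM(f, :);                   % velocities relative to CoM
    CAM(f, :) = sum(cross(r, m .* v, 2), 1);            % sum_i m_i (r_i x v_i)
end
plot((1:nFrames-1)/fs, CAM); xlabel('time (s)'); ylabel('CAM (kg m^2/s)');
```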

Figure 6
Figure 6: Left panel shows the joint labels for the skeleton model data from the Motion Capture Sensor, which can be analyzed offline using a reduced dimension biped model (right panel) for capturing the posture (see Banerjee et al.24). RMP: Reaction Mass Pendulum, CoP: Center of Pressure, CoM: Center of Mass, GRF: Ground reaction force vector.

The grand challenge is to develop and clinically validate advanced cyber-physical systems for teleneurorehabilitation based on the manipulation of environmental, behavioral, and pharmacologic contexts. Future applications of the HMI include a teleneurorehabilitation paradigm in a home-based setup, where identification and monitoring of visuomotor deficits/learning from gaze behavior may lead to an operant conditioning paradigm that enforces volitional use of relevant residual function. For example, the HMI can be augmented with two Wii BB (one for the paretic and one for the non-paretic limb), which can be positioned side by side without touching (i.e., <1 mm apart). Following the experimental protocol of Mansfield and colleagues7, the subjects could stand with one foot on each Wii BB in a standard position (feet oriented at 14° with 7° rotation of each foot and an inter-malleoli distance equal to 8% of the height), with each foot equidistant from the midline between both Wii BBs. During mFRT, both the paretic and non-paretic limbs will contribute to the CoP position, where operant conditioning can be implemented by providing positive reinforcement to the residual function of the paretic limb and negative reinforcement to the compensatory mechanisms of the non-paretic limb (based on the principle of constraint-induced movement therapy32) by making the cursor easier to control with the CoP excursions of the paretic side. Moreover, visual field defects, both homonymous defects and those related to optic nerve lesions, may be improved, at least to some extent, in patients33 towards better visuomotor integration34 contributing to improved balance. The clinical stroke study is being conducted under the hypothesis that our low-cost HMI for volitionally driven NMES/SES-assisted dynamic visuomotor balance therapy can ameliorate post-stroke ankle-specific control problems in visually cued weight shifting during bipedal standing. It is expected to reduce fall incidence rates in chronic stroke survivors, which can be as high as 2.2 to 4.9 falls per person-year35. Indeed, for showing the efficacy of this HMI for post-stroke balance therapy towards restorative neurorehabilitation, the critical step is adequate subject selection using gaze-based visuomotor performance evaluation, i.e., stroke survivors who have sufficient residual sensorimotor function necessary for recovery36.
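The differential-reinforcement scheme described above could, for instance, be realized with an asymmetric gain on the two boards' CoP contributions to the cursor. The minimal MATLAB sketch below is only a conceptual illustration; the gain values and load weighting are assumptions, not a validated protocol.

```matlab
% Minimal sketch of asymmetric weighting of the two balance boards so that
% the paretic limb's CoP excursion controls the cursor more easily
% (all values are illustrative assumptions, not a validated protocol).
gainParetic    = 1.2;      % positive reinforcement of paretic-limb effort
gainNonParetic = 0.6;      % discourage compensation by the non-paretic limb
copParetic     = 12;       % CoP excursion under the paretic foot (mm, example)
copNonParetic  = 30;       % CoP excursion under the non-paretic foot (mm, example)
loadParetic    = 0.35;     % fraction of body weight on the paretic board (example)
loadNonParetic = 1 - loadParetic;
cursorX = gainParetic*loadParetic*copParetic + ...
          gainNonParetic*loadNonParetic*copNonParetic;   % weighted cursor command
fprintf('Cursor command: %.1f mm\n', cursorX);
```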

Disclosures

The authors have nothing to disclose.

Acknowledgements

Research conducted within the context of the Joint targeted Program in Information and Communication Science and Technology – ICST, supported by CNRS, Inria, and DST, under CEFIPRA's umbrella. The authors would like to acknowledge the support of students, specifically Rahima Sidiboulenouar, Rishabh Sehgal, and Gorish Aggarwal, towards development of the experimental setup.

Materials

Name | Company | Catalog Number | Comments
NMES stimulator | Vivaltis, France | PhenixUSBNeo | NMES stimulator cum EMG sensor (Figure 2b)
Balance Board | Nintendo, USA | Wii Balance Board | Balance Board (Figure 2b)
Motion Capture | Microsoft, USA | XBOX-360 Kinect | Motion Capture (Figure 2b)
Eye Tracker | Eye Tribe | The Eye Tribe | SmartEye Tracker (Figure 2a)
EEG Data Acquisition System | Emotiv, Australia | Emotiv Neuroheadset | Wireless EEG headset (Figure 2b)
EEG passive electrode | Olimex | EEG-PE | EEG passive electrodes for EOG and references (6 in number) (Figure 2b)
EEG active electrode | Olimex | EEG-AE | EEG active electrodes (10 in number) (Figure 2b)
Computer with PC monitor | Dell | | Data processing and visual feedback (Figure 2)
Software, EMG electrodes, NMES electrodes, and cables | | |

References

  1. Sacco, R. L., Kasner, S. E. An updated definition of stroke for the 21st century: a statement for healthcare professionals from the American Heart Association/American Stroke Association. Stroke; a journal of cerebral circulation. 44 (7), 2064-2089 (2013).
  2. Das, A., Botticello, A. L., Wylie, G. R., Radhakrishnan, K. Neurologic Disability: A Hidden Epidemic for India. Neurology. 79 (21), 2146-2147 (2012).
  3. Verheyden, G. S. A. F., Weerdesteyn, V. Interventions for preventing falls in people after stroke. The Cochrane database of systematic reviews. 5, 008728 (2013).
  4. Campbell, G. B., Matthews, J. T. An integrative review of factors associated with falls during post-stroke rehabilitation. Journal of Nursing Scholarship: An Official Publication of Sigma Theta Tau International Honor Society of Nursing / Sigma Theta Tau. 42, 395-404 (2010).
  5. Geurts, A. C. H., de Haart, M., van Nes, I. J. W., Duysens, J. A review of standing balance recovery from stroke. Gait & posture. 22, 267-281 (2005).
  6. Marigold, D. S., Eng, J. J., Dawson, A. S., Inglis, J. T., Harris, J. E., Gylfadóttir, S. Exercise leads to faster postural reflexes, improved balance and mobility, and fewer falls in older persons with chronic stroke. Journal of the American Geriatrics Society. 53, 416-423 (2005).
  7. Mansfield, A., Mochizuki, G., Inness, E. L., McIlroy, W. E. Clinical correlates of between-limb synchronization of standing balance control and falls during inpatient stroke rehabilitation. Neurorehabilitation and neural repair. 26, 627-635 (2012).
  8. Dutta, A., Lahiri, U., Das, A., Nitsche, M. A., Guiraud, D. Post-stroke balance rehabilitation under multi-level electrotherapy: a conceptual review. Neuroprosthetics. 8, 403 (2014).
  9. Agnes Roby-Brami, S. F. Reaching and Grasping Strategies in Hemiparetic Patients. Human Kinetics Journals. , (2010).
  10. Sabut, S. K., Sikdar, C., Kumar, R., Mahadevappa, M. Functional electrical stimulation of dorsiflexor muscle: effects on dorsiflexor strength, plantarflexor spasticity, and motor recovery in stroke patients. NeuroRehabilitation. 29, 393-400 (2011).
  11. Magalhães, F. H., Kohn, A. F. Effectiveness of electrical noise in reducing postural sway: a comparison between imperceptible stimulation applied to the anterior and to the posterior leg muscles. European Journal of Applied Physiology. 114, 1129-1141 (2014).
  12. Hwang, S., Tae, K., Sohn, R., Kim, J., Son, J., Kim, Y. The balance recovery mechanisms against unexpected forward perturbation. Annals of biomedical engineering. 37, 1629-1637 (2009).
  13. Gatev, P., Thomas, S., Kepple, T., Hallett, M. Feedforward ankle strategy of balance during quiet stance in adults. The Journal of physiology. 514, 915-928 (1999).
  14. Cofre Lizama, E. L., Pijnappels, M., Reeves, N. P., Verschueren, S. M. P., van Dieën, J. H. Can explicit visual feedback of postural sway efface the effects of sensory manipulations on mediolateral balance performance. Journal of Neurophysiology. , (2015).
  15. Knash, M. E., Kido, A., Gorassini, M., Chan, K. M., Stein, R. B. Electrical stimulation of the human common peroneal nerve elicits lasting facilitation of cortical motor-evoked potentials. Experimental brain research. 153, 366-377 (2003).
  16. Dinse, H. R., Tegenthoff, M. Evoking plasticity through sensory stimulation: Implications for learning and rehabilitation. The Journal of neuroscience: the official journal of the Society for Neuroscience. 6, 11-20 (2015).
  17. Khaslavskaia, S., Sinkjaer, T. Motor cortex excitability following repetitive electrical stimulation of the common peroneal nerve depends on the voluntary drive. Experimental brain research. 162, 497-502 (2005).
  18. Perez, M. A., Field-Fote, E. C., Floeter, M. K. Patterned sensory stimulation induces plasticity in reciprocal ia inhibition in humans. The Journal of neuroscience: the official journal of the Society for Neuroscience. 23, 2014-2018 (2003).
  19. Makeig, S. Mind Monitoring via Mobile Brain-Body Imaging. Foundations of Augmented Cognition. Neuroergonomics and Operational. , 749-758 (2009).
  20. Ojeda, A., Bigdely-Shamlo, N., Makeig, S. MoBILAB: an open source toolbox for analysis and visualization of mobile brain/body imaging data. Frontiers in Human Neuroscience. 8, 121 (2014).
  21. Dutta, A., Chugh, S., Banerjee, A., Dutta, A. Point-of-care-testing of standing posture with Wii balance board and microsoft kinect during transcranial direct current stimulation: A feasibility study. NeuroRehabilitation. 34, 789-798 (2014).
  22. Nataraj, R. Feedback Control of Standing Balance Using Functional Neuromuscular Stimulation Following Spinal Cord Injury. (2011).
  23. Dutta, A., Paulus, W., Nitsche, M. A. Translational Methods for Non-Invasive Electrical Stimulation to Facilitate Gait Rehabilitation Following Stroke – The Future Directions. Neuroscience and Biomedical Engineering. 1, 22-33 (2013).
  24. Banerjee, A., Khattar, B., Dutta, A. A Low-Cost Biofeedback System for Electromyogram-Triggered Functional Electrical Stimulation Therapy: An Indo-German Feasibility Study. ISRN Stroke. 2014, e827453 (2014).
  25. Kerkhoff, G., Reinhart, S., Ziegler, W., Artinger, F., Marquardt, C., Keller, I. Smooth pursuit eye movement training promotes recovery from auditory and visual neglect: a randomized controlled study. Neurorehabilitation and Neural Repair. 27, 789-798 (2013).
  26. Carl, J. R., Gellman, R. S. Human smooth pursuit: stimulus-dependent responses. Journal of Neurophysiology. 57, 1446-1463 (1987).
  27. Clark, R. A., Bryant, A. L., Pua, Y., McCrory, P., Bennell, K., Hunt, M. Validity and reliability of the Nintendo Wii Balance Board for assessment of standing balance. Gait & posture. 31, 307-310 (2010).
  28. Clark, R. A., Pua, Y.-H. Validity of the Microsoft Kinect for assessment of postural control. Gait & posture. 36, 372-377 (2012).
  29. Khattar, B., Banerjee, A., Reddi, R., Dutta, A. Feasibility of Functional Electrical Stimulation-Assisted Neurorehabilitation following Stroke in India: A Case Series. Case Reports in Neurological Medicine. 2012, e830873 (2012).
  30. Sailer, U., Flanagan, J. R., Johansson, R. S. Eye-hand coordination during learning of a novel visuomotor task. The Journal of neuroscience: the official journal of the Society for Neuroscience. 25, 8833-8842 (2005).
  31. Herr, H., Popovic, M. Angular momentum in human walking. The Journal of Experimental Biology. 211, 467-481 (2008).
  32. Taub, E., Morris, D. M. Constraint-induced movement therapy to enhance recovery after stroke. Current atherosclerosis reports. 3, 279-286 (2001).
  33. Kasten, E., Wuest, S., Sabel, B. A. Residual vision in transition zones in patients with cerebral blindness. Journal of Clinical and Experimental Neuropsychology. 20, 581-598 (1998).
  34. Marshall, S. P. Identifying Cognitive State from Eye Metrics. Aviation, Space, and Environmental Medicine. 78, 165-175 (2007).
  35. Weerdesteyn, V., de Niet, M., van Duijnhoven, H. J. R., Geurts, A. C. H. Falls in individuals with stroke. Journal of Rehabilitation Research and Development. 45, 1195-1213 (2008).
  36. Stinear, C. M., Barber, P. A., Petoe, M., Anwar, S., Byblow, W. D. The PREP algorithm predicts potential for upper limb recovery after stroke. Brain: A Journal of Neurology. 135 (Pt 8), 2527-2535 (2012).
  37. Dutta, A., Lahiri, D., Kumar, U., Das, A., Padma, M. V. Post-stroke engagement-sensitive balance rehabilitation under an adaptive multi-level electrotherapy: clinical hypothesis and computational framework. Neuroscience and Biomedical Engineering. 2 (2), 68-80 (2015).

Cite this Article
Kumar, D., Das, A., Lahiri, U., Dutta, A. A Human-machine-interface Integrating Low-cost Sensors with a Neuromuscular Electrical Stimulation System for Post-stroke Balance Rehabilitation. J. Vis. Exp. (110), e52394, doi:10.3791/52394 (2016).
