Solaiman Shokur
EPFL STI IBI-STI TNE
B3 4 225.134 (Campus Biotech bâtiment B3)
Ch. des Mines 9
1202 Genève
+41 21 693 80 44
+41 21 693 50 22
Office: B3 4 225.134
Web site: https://tne.epfl.ch
EPFL STI INX-STI TNE
B3 4 243.134 (Campus Biotech bâtiment B3)
Ch. des Mines 9
1202 Genève
+41 21 693 80 44
+41 21 693 50 22
Office: B3 4 225.134
Web site: https://sv.epfl.ch/education
Fields of expertise
His work involves the development and validation of new technologies to restore sensory-motor functions in patients with complete paraplegia and upper-limb amputation.
His areas of expertise include virtual reality for rehabilitation, invasive and non-invasive brain-machine interfaces, and haptic devices.
Biography
2019 - now: Senior Scientist, Translational Neuroengineering Laboratory, EPFL (Geneva). Team leader of the CHRONOS project, a multi-center European project that aims at developing the first chronically implanted prosthetic hand with bidirectional communication capabilities for transradial amputee patients. NCCR Robotics: bidirectional control of supernumerary limbs.
2014-2019: Research Coordinator, AASDAP (São Paulo, Brazil)
Responsible for both the scientific production and the clinical protocol at the AASDAP neurorehabilitation laboratory.
2013-2014: Postdoctoral associate, Walk Again Project, Instituto Santos Dumont (Natal, RN, Brazil)
Head of the engineering team in charge of the system integration.
2010-2012: Visiting scientist, Nicolelis Lab (Duke University). Development and validation of a virtual-reality based brain-machine interface for rhesus monkeys.
2007 – 2010: Teaching assistant, Laboratory of Robotics Systems (EPFL)
Publications
2024
A sensory-motor hand prosthesis with integrated thermal feedback
Background: Recently, we reported the presence of phantom thermal sensations in amputees: thermal stimulation of specific spots on the residual arm elicited thermal sensations in their missing hands. Here, we exploit phantom thermal sensations via a standalone system integrated into a robotic prosthetic hand to provide real-time and natural temperature feedback. Methods: The subject (a male adult with unilateral transradial amputation) used the sensorized prosthesis to manipulate objects and distinguish their thermal properties. We tested his ability to discriminate between (1) hot, cold, and ambient temperature objects, (2) different materials (copper, glass, and plastic), and (3) artificial versus human hands. We also introduced the thermal box and block test (thermal BBT), a test to evaluate real-time temperature discrimination during standardized pick-and-place tasks. Findings: The subject performed all three discrimination tasks above chance level with similar accuracies as with his intact hand. Additionally, in all 15 sessions of the thermal BBT, he correctly placed more than half of the samples. Finally, the phantom thermal sensation was stable during the 13 recording sessions spread over 400 days. Conclusion: Our study paves the way for more natural hand prostheses that restore the full palette of sensations.
Med. 2024-02-09. DOI: 10.1016/j.medj.2023.12.006.
Remapping Wetness Perception in Upper Limb Amputees
Recent research has made remarkable strides in restoring sensory feedback for prosthetic users, including tactile, proprioceptive, and thermal feedback. Herein, a sensory modality that has been largely neglected is explored: the ability to perceive wetness. Providing moisture-related information to prosthesis users can increase their overall sensory palette toward a more natural sensory experience. A rapid decrease in skin temperature is found to trigger the illusion of contact with something wet. Two body parts were tested, the upper arm and the lateral abdomen, in a group of non-amputated participants, and it was found that a wetness sensation can be elicited and maintained for at least 10 s in 86% and 93% of participants, respectively. It is then demonstrated how to mediate the wetness sensation in real-time using a thermal wearable device that mimics the thermal properties of the skin. Finally, two upper limb amputee individuals used their prosthetic arm, sensorized with the device, to discriminate between three levels of moisture; their detection accuracy was similar to the one they had with their intact hands. The current study is a stepping stone for future prostheses aimed at restoring the richness of sensory experience in upper limb amputees.
A new generation of prostheses aims to restore the rich sensory feedback of amputated people, but one modality is often neglected: wetness perception. Ploumitsakou and colleagues present an approach to detect and mediate moisture information: a cold, dry skin stimulation created the vivid sensation of touching something wet. Blindfolded amputees could scan objects and discriminate three levels of moisture.
Advanced Intelligent Systems. 2024-01-18. DOI: 10.1002/aisy.202300512.
Thermal sensing device and sensory feedback system and method using such thermal sensing device
The present invention concerns a thermal sensing device (1) and a sensory feedback system and method using such thermal sensing device, comprising at least one film (19) of electrically insulating polymer defining a global surface of the thermal sensing device (1) and including at least one sensing track (10) of a conducting material exhibiting a change in resistivity with temperature, said sensing track (10) being arranged with a connection (12) to at least one control module (2) receiving the signal of the thermal sensing device (1) as a measure of temperature, said thermal sensing device (1) being characterized in that said film (19) further comprises at least one heating track (11), powered by said control module (2) to dissipate electrical power into heat within the thermal sensing device (1) and maintain the sensing track (10) at a determined baseline temperature, preferably close to the baseline temperature of the human skin
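As a rough illustration of the sensing/heating principle described in this abstract, the sketch below treats the sensing track as a resistance thermometer and drives the heating track with a simple proportional rule to hold the film near a skin-like baseline. The constants (R0, ALPHA, BASELINE, KP) are generic assumptions for illustration, not values from the patent.

```python
# Illustrative model of the sensing/heating principle; the constants are
# generic assumptions, not values taken from the patent text.
R0 = 100.0        # sensing-track resistance at 0 degC (ohm), assumed
ALPHA = 3.9e-3    # temperature coefficient of resistance (1/degC), typical metal
BASELINE = 32.0   # skin-like baseline temperature to maintain (degC), assumed
KP = 0.2          # proportional gain for the heating track (W per degC)

def track_temperature(resistance):
    """Invert the linear RTD relation R = R0 * (1 + ALPHA * T)."""
    return (resistance / R0 - 1.0) / ALPHA

def heating_power(measured_temp):
    """Proportional control: heat only when the film is below baseline."""
    return max(0.0, KP * (BASELINE - measured_temp))

# One control step: the film cooled to ~28 degC after touching a cold object
resistance = R0 * (1 + ALPHA * 28.0)
temp = track_temperature(resistance)
print(round(temp, 1), "degC ->", round(heating_power(temp), 3), "W to heating track")
```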
WO2024100612; EP4368150. 2024.
Investigating neural resource allocation in the sensorimotor control of extra limbs
The rise of robotic body augmentation brings forth new developments that will transform robotics, human-machine interaction, and wearable electronics. Extra robotic limbs, although building upon restorative technologies, bring their own set of challenges in achieving effective bidirectional human-machine collaboration. The questions are whether people can adjust and learn to use a new robotic limb and whether this is achievable without limiting their other physical capabilities. In realizing successful robotic body augmentation, it's crucial to make sure that introducing an extra (artificial) limb doesn't compromise the functions of a natural (biological) limb. This thesis presents research on robotic body augmentation via extra robotic limbs, merging the definition of theoretical foundations with empirical investigations on the adaptability of the human body and brain to advanced technological integrations. Central to this work is the concept of the 'Neural Resource Allocation Problem', defined and discussed in the introduction of this thesis. It addresses the challenges of integrating augmentative devices with the human body without compromising natural functionalities. Such conceptualization is crucial to ensure that augmentation technologies effectively expand user's capacities rather than simply rerouting resources and replacing an existing function with a different, new one. Based on this theoretical groundwork, I then proposed operational guidelines and detailed the development and characterization of an ad-hoc human-machine interface based on gaze and diaphragmatic respiration for extra robotic arms. The validation carried out on a virtual extra arm thanks to the neuro-robotic platform engineered for this work and the subsequent testing with an extra robotic arm proved the proposed human-machine interface to be effective and non-intrusive, substantiating the proposed methodology. The in-depth analysis of how users adapt to a toe-controlled robotic thumb that concludes the empirical work reported in this thesis is once again rooted in the conceptual framework detailed at the beginning of the thesis. It offered a window into necessary trade-offs, long term effects and the neural adaptations involved with significant and generalisable augmented-hand motor learning. This thesis contributes to the improvement of targeted human machine interfaces design for extra robotic limbs. The non-intrusive biosignals identified have the potential to be further explored and be applied for the control of degrees of freedom of more sophisticated robotic arms to enable more advanced augmentation. This thesis also contributes to a deeper understanding of the consequences of semi-intensive use of robotic body augmentation at behavioural and neural level.
Lausanne, EPFL, 2024. DOI: 10.5075/epfl-thesis-10295.
2023
Human motor augmentation with an extra robotic arm without functional interference
Extra robotic arms (XRAs) are gaining interest in neuroscience and robotics, offering potential tools for daily activities. However, this compelling opportunity poses new challenges for sensorimotor control strategies and human-machine interfaces (HMIs). A key unsolved challenge is allowing users to proficiently control XRAs without hindering their existing functions. To address this, we propose a pipeline to identify suitable HMIs given a defined task to accomplish with the XRA. Following such a scheme, we assessed a multimodal motor HMI based on gaze detection and diaphragmatic respiration in a purposely designed modular neurorobotic platform integrating virtual reality and a bilateral upper limb exoskeleton. Our results show that the proposed HMI does not interfere with speaking or visual exploration and that it can be used to control an extra virtual arm independently from the biological ones or in coordination with them. Participants showed significant improvements in performance with daily training and retention of learning, with no further improvements when artificial haptic feedback was provided. As a final proof of concept, naive and experienced participants used a simplified version of the HMI to control a wearable XRA. Our analysis indicates how the presented HMI can be effectively used to control XRAs. The observation that experienced users achieved a success rate 22.2% higher than that of naive users, combined with the result that naive users showed average success rates of 74% when they first engaged with the system, endorses the viability of both the virtual reality-based testing and training and the proposed pipeline.
Science Robotics. 2023-12-13. DOI: 10.1126/scirobotics.adh1438.
Editorial: Hybrid brain-robot interfaces for enhancing mobility
Frontiers In Neurorobotics. 2023-08-15. DOI: 10.3389/fnbot.2023.1264045.
Human-machine interface for two-dimensional steering control with the auricular muscles
Human-machine interfaces (HMIs) can be used to decode a user's motor intention to control an external device. People that suffer from motor disabilities, such as spinal cord injury, can benefit from the uses of these interfaces. While many solutions can be found in this direction, there is still room for improvement both from a decoding, hardware, and subject-motor learning perspective. Here we show, in a series of experiments with non-disabled participants, a novel decoding and training paradigm allowing naive participants to use their auricular muscles (AM) to control two degrees of freedom with a virtual cursor. AMs are particularly interesting because they are vestigial muscles and are often preserved after neurological diseases. Our method relies on the use of surface electromyographic records and the use of contraction levels of both AMs to modulate the velocity and direction of a cursor in a two-dimensional paradigm. We used a locking mechanism to fix the current position of each axis separately to enable the user to stop the cursor at a certain location. A five-session training procedure (20-30 min per session) with a 2D center-out task was performed by five volunteers. All participants increased their success rate (Initial: 52.78 +/- 5.56%; Final: 72.22 +/- 6.67%; median +/- median absolute deviation) and their trajectory performances throughout the training. We implemented a dual task with visual distractors to assess the mental challenge of controlling while executing another task; our results suggest that the participants could perform the task in cognitively demanding conditions (success rate of 66.67 +/- 5.56%). Finally, using the Nasa Task Load Index questionnaire, we found that participants reported lower mental demand and effort in the last two sessions. To summarize, all subjects could learn to control the movement of a cursor with two degrees of freedom using their AM, with a low impact on the cognitive load. Our study is a first step in developing AM-based decoders for HMIs for people with motor disabilities, such as spinal cord injury.
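The control scheme described above (contraction level of each auricular muscle modulating cursor motion, with a per-axis lock) can be sketched as follows. The gains, dead-band, update rate, and the simplification of one muscle per axis are illustrative assumptions, not the decoder used in the study.

```python
import numpy as np

# Illustrative sketch only: gains, thresholds, and the locking rule are
# assumptions, not the parameters used in the published study.
GAIN = 2.0          # cursor speed per unit of normalized EMG envelope
DEADBAND = 0.1      # contractions below this level do not move the cursor
DT = 0.05           # update period in seconds (20 Hz)

def emg_envelope(raw_window):
    """Rectify and average a short window of raw surface EMG samples."""
    return float(np.mean(np.abs(raw_window)))

def update_cursor(pos, env_left, env_right, locked):
    """Map left/right auricular-muscle envelopes to x/y cursor velocity.

    pos    -- np.array([x, y]) current cursor position
    locked -- (lock_x, lock_y) booleans fixing each axis separately
    """
    vel = np.array([
        0.0 if locked[0] or env_left < DEADBAND else GAIN * (env_left - DEADBAND),
        0.0 if locked[1] or env_right < DEADBAND else GAIN * (env_right - DEADBAND),
    ])
    return np.clip(pos + vel * DT, -1.0, 1.0)  # keep the cursor in the workspace

# Example update step with fabricated envelope values
pos = np.array([0.0, 0.0])
pos = update_cursor(pos, env_left=0.4, env_right=0.05, locked=(False, False))
print(pos)  # only the x axis moves in this example
```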
Frontiers In Neurorobotics. 2023-06-05. DOI: 10.3389/fnbot.2023.1154427.
Restoration of natural thermal sensation in upper-limb amputees
The use of hands for gathering rich sensory information is essential for proper interaction with the environment; therefore, the restoration of sensation is critical for reestablishing the sense of embodiment in hand amputees. Here, we show that a noninvasive wearable device can be used to provide thermal sensations on amputees' phantom hands. The device delivers thermal stimuli to specific regions of skin on their residual limb. These sensations were phenomenologically similar to those on the intact limbs and were stable over time. Using the device, the subjects could successfully exploit the thermal phantom hand maps to detect and discriminate different thermal stimuli. The use of a wearable device that provides thermal sensation can increase the sense of embodiment and improve life quality in hand amputees.
Science. 2023-05-19. DOI: 10.1126/science.adf6121.
Immediate effect of ankle exoskeleton on spatiotemporal parameters and center of pressure trajectory after stroke
Gait impairment is a common condition in post-stroke subjects. We recently presented a wearable ankle exoskeleton called G-Exos and showed that the device assisted the ankle's dorsiflexion and inversion/eversion movements. The aim of the current pilot study was to explore spatiotemporal gait parameters and center of pressure (COP) trajectories associated with the use of the G-Exos in stroke participants. Three post-stroke subjects (52-63 years, 2 female/1 male) walked 160 meters using the G-Exos on the affected limb, in a protocol divided into 4 blocks of 40 meters: (I) without the exoskeleton, (II) with the hybrid system, (III) with the active system only, and (IV) with the passive system only. The results showed that the use of the exoskeleton improved swing and stance phases on both limbs, reduced stride width on the paretic limb, increased stance COP distances, and made single-support COP distances more similar between the paretic and non-paretic limb. This suggests that all G-Exos systems contributed to improving body weight bearing on the paretic limb and symmetry in the gait cycle.
2023-01-01. International Conference on Rehabilitation Robotics (ICORR), Singapore, SINGAPORE, SEP 24-28, 2023. DOI: 10.1109/ICORR58425.2023.10304816.
Effect of freezing of gait and dopaminergic medication in the biomechanics of lower limbs in the gait of patients with Parkinson's disease compared to neurologically healthy
Introduction: This study aims to evaluate the effects of medication, and the freezing of gait (FoG) on the kinematic and kinetic parameters of gait in people with Parkinson's disease (pwPD) compared to neurologically healthy. Methods: Twenty-two people with a clinical diagnosis of idiopathic PD in ON and OFF medication (11 FoG), and 18 healthy participants (control) were selected from two open data sets. All participants walked on the floor on a 10-meter-long walkway. The joint kinematic and ground reaction forces (GRF) variables of gait and the clinical characteristics were compared: (1) PD with FoG (pwFoG) and PD without FoG (pwoFoG) in the ON condition and control; (2) PD with FoG and PD without FoG in the OFF condition and control; (3) Group (PD with FoG and PD without FoG) and Medication. Results: (1) FoG mainly affects distal joints, such as the ankle and knee; (2) PD ON showed changes in the range of motion of both distal and proximal joints, which may explain the increase in step length and gait speed expected with the use of L-Dopa; and (3) the medication showed improvements in the kinematic and kinetic parameters of the gait of people with pwFoG and pwoFoG equally; (4) pwPD showed a smaller second peak of the vertical component of the GRF than the control. Conclusion: The presence of FoG mainly affects distal joints, such as the ankle and knee. PD presents a lower application of GRF during the impulse period than healthy people, causing lower gait performances.
Neuroscience Letters. 2023-04-15. DOI: 10.1016/j.neulet.2023.137250.
2022
Kinematics predictors of spatiotemporal parameters during gait differ by age in healthy individuals
Joint biomechanics and spatiotemporal gait parameters change with age or disease and are used in treatment decision-making. Research question: To investigate whether kinematic predictors of spatiotemporal parameters during gait differ by age in healthy individuals. Methods: We used an open dataset with the gait data of 114 young adults (M = 28.0 years, SD = 7.5) and 128 older adults (M = 67.5 years, SD = 3.8) walking at a comfortable self-selected speed. Linear regression models were developed to predict spatiotemporal parameters separately for each group using joint kinematics as independent variables. Results: In young adults, knee flexion loading response and hip flexion/extension were the common predictors of gait speed; hip flexion and hip extension contributed to explaining the stride length; hip flexion contributed to explaining the cadence and stride time. In older adults, ankle plantarflexion, knee flexion loading response, and pelvic rotation were the common predictors of the gait speed; ankle plantarflexion and knee flexion loading response contributed to explaining the stride length; ankle plantarflexion loading response and ankle plantarflexion contributed to explain the cadence, stride width and stride time. Significance: Our results suggest that the ability of joint kinematic variables to estimate spatiotemporal parameters during gait differs by age in healthy individuals. Particularly in older adults, ankle plantarflexion was the common predictor of the spatiotemporal parameters, suggesting the importance of the ankle for gait parameters in this age group. This provides insight for clinicians into the most effective evaluation and has been used by physical professionals in prescribing the most appropriate exercises to attenuate the effects produced by age-related neuromuscular changes.
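As a rough sketch of the group-wise regression analysis described above, the snippet below fits one linear model per age group, mapping joint-kinematic features to a spatiotemporal parameter such as gait speed. The feature names and synthetic data are placeholders, not the open dataset used in the study.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

def fit_group_model(kinematics, gait_speed):
    """Fit one linear model per group: joint kinematics -> gait speed."""
    model = LinearRegression().fit(kinematics, gait_speed)
    return model, model.score(kinematics, gait_speed)  # R^2 of the fit

# Synthetic placeholders: columns could stand for e.g. knee flexion at loading
# response, hip flexion/extension range, ankle plantarflexion, pelvic rotation.
X_young = rng.normal(size=(114, 4))
y_young = X_young @ np.array([0.30, 0.20, 0.05, 0.00]) + rng.normal(0, 0.1, 114)
X_old = rng.normal(size=(128, 4))
y_old = X_old @ np.array([0.05, 0.10, 0.35, 0.20]) + rng.normal(0, 0.1, 128)

for name, X, y in [("young", X_young, y_young), ("older", X_old, y_old)]:
    model, r2 = fit_group_model(X, y)
    print(name, np.round(model.coef_, 2), round(r2, 2))  # predictors differ by group
```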
Gait & Posture. 2022-07-01. DOI: 10.1016/j.gaitpost.2022.05.034.
A Public Data Set of Videos, Inertial Measurement Unit, and Clinical Scales of Freezing of Gait in Individuals With Parkinson's Disease During a Turning-In-Place Task
Frontiers In Neuroscience. 2022-02-23. DOI: 10.3389/fnins.2022.832463.
Training with noninvasive brain-machine interface, tactile feedback, and locomotion to enhance neurological recovery in individuals with complete paraplegia: a randomized pilot study
In recent years, our group and others have reported multiple cases of consistent neurological recovery in people with spinal cord injury (SCI) following a protocol that integrates locomotion training with brain-machine interfaces (BMI). The primary objective of this pilot study was to compare the neurological outcomes (motor, tactile, nociception, proprioception, and vibration) in both an intensive assisted locomotion training (LOC) and a neurorehabilitation protocol integrating assisted locomotion with a noninvasive brain-machine interface (L + BMI), virtual reality, and tactile feedback. We also investigated whether individuals with chronic-complete SCI could learn to perform leg motor imagery. We ran a parallel two-arm randomized pilot study; the experiments took place in Sao Paulo, Brazil. Eight sensorimotor-complete (AIS A) adult males with chronic (> 6 months) traumatic SCI participated in the protocol, which was organized in two blocks of 14 weeks of training and an 8-week follow-up. The participants were allocated to either the LOC group (n = 4) or L + BMI group (n = 4) using block randomization (blinded outcome assessment). We show three important results: (i) locomotion training alone can induce some level of neurological recovery in sensorimotor-complete SCI; (ii) the recovery rate is enhanced when such locomotion training is associated with BMI and tactile feedback (Δ Mean Lower Extremity Motor score improvement for LOC = +2.5, L + B = +3.5; Δ Pinprick score: LOC = +3.75, L + B = +4.75; and Δ Tactile score: LOC = +4.75, L + B = +9.5); (iii) furthermore, we report that the BMI classifier accuracy was significantly above the chance level for all participants in the L + B group. Our study shows potential for sensory and motor improvement in individuals with chronic complete SCI following a protocol with BMIs and locomotion therapy. We report no dropouts or adverse events in either subgroup participating in the study, opening the possibility for a more definitive clinical trial with a larger cohort of people with SCI.
Scientific Reports. 2022-11-29. DOI: 10.1038/s41598-022-24864-5.
Editorial: Getting Neuroprosthetics Out of the Lab: Improving the Human-Machine Interactions to Restore Sensory-Motor Functions
Frontiers In Robotics And Ai. 2022-05-25. DOI: 10.3389/frobt.2022.928383.
Biomechanical aspects that precede freezing episode during gait in individuals with Parkinson's disease: A systematic review
Background: The freezing episode (FE) management during gait in Parkinson's disease is inefficient with current medications, neurosurgery, and physical interventions. Knowing the biomechanical changes patients undergo preceding FE would be the ultimate goal to measure, predict, and prevent these events. Objective: We performed a systematic review to summarize the kinematic, kinetic, electromyographic, and spatiotemporal characteristics of the events that precede the FE during gait in Parkinson's disease. Literature Survey: Databases searched included PubMed, Embase, and Cochrane, covering the period from 2001 to August 2021. Methodology: The present study was a systematic review registered in the PROSPERO database (CRD42021255082). Three reviewers searched and selected studies with methodologies involving biomechanical changes and kinetic, kinematic, electromyographic, and spatiotemporal changes before FE in patients with Parkinson's disease. The relevant articles that show the events preceding FE in patients with PD were identified. We excluded studies that describe or compare methods or algorithms to detect FE. Studies could include participants of any PD severity, disease duration, and age. Synthesis: We selected ten articles for final evaluation. The most consistent results indicate a dramatic reduction of movement excursions with (1) decrease in stride length; (2) decreased gait speed; (3) postural instability with an increased double support phase; (4) incoordination of the tibialis anterior and gastrocnemius; (5) larger amplitude in the EMG of the biceps femoris; (6) decreased range of motion in the sagittal plane at the ankle and hip joints; and (7) anterior pelvic tilt. Conclusion: FE is characterized by more complex motor patterns than normal gait and by mismatched gains in the perception and execution of the ongoing movement.
Gait & Posture. 2022-01-01. DOI: 10.1016/j.gaitpost.2021.10.021.
2021
The neural resource allocation problem when enhancing human bodies with extra robotic limbs
The emergence of robotic body augmentation provides exciting innovations that will revolutionize the fields of robotics, human-machine interaction and wearable electronics. Although augmentative devices such as extra robotic arms and fingers are informed by restorative technologies in many ways, they also introduce unique challenges for bidirectional human-machine collaboration. Can humans adapt and learn to operate a new robotic limb collaboratively with their biological limbs, without restricting other physical abilities? To successfully achieve robotic body augmentation, we need to ensure that, by giving a user an additional (artificial) limb, we are not trading off the functionalities of an existing (biological) one. Here, we introduce the 'neural resource allocation problem' and discuss how to allow the effective voluntary control of augmentative devices without compromising control of the biological body. In reviewing the relevant literature on extra robotic fingers and arms, we critically assess the range of potential solutions available for this neural resource allocation problem. For this purpose, we combine multiple perspectives from engineering and neuroscience with considerations including human-machine interaction, sensory-motor integration, ethics and law. In summary, we aim to define common foundations and operating principles for the successful implementation of robotic body augmentation.
Nature Machine Intelligence. 2021-10-01. DOI: 10.1038/s42256-021-00398-9.
A modular strategy for next-generation upper-limb sensory-motor neuroprostheses
Neuroprosthetics is a discipline that aims at restoring lost functions to people affected by a variety of neurological disorders or neuro-traumatic lesions. It combines the expertise of computer science and electrical, mechanical, and micro/nanotechnology with cellular, molecular, and systems neuroscience. Rapid breakthroughs in the field during the past decade have brought the hope that neuroprostheses can soon become a clinical reality, in particular-as we will detail in this review-for the restoration of hand functions. We argue that any neuroprosthesis relies on a set of hardware and algorithmic building elements that we call the neurotechnological modules (NTs) used for motor decoding, movement restoration, or sensory feedback. We will show how the modular approach is already present in current neuroprosthetic solutions and how we can further exploit it to imagine the next generation of neuroprosthetics for sensory-motor restoration.
Med. 2021-08-13. DOI: 10.1016/j.medj.2021.05.002.
Detecting Freezing of Gait in Parkinson's Disease Patient via Deep Residual Network
Freezing of Gait (FoG) is a common condition in patients with Parkinson's disease (PD). It often leads to falls, and it severely affects the patient's quality of life. Although the neural mechanism of FoG is not well-known, wearable sensor-based assistive systems have been shown to effectively monitor FoG and help patients resume walking through rhythmic auditory cues when FoG is detected in real-time. With the development of technologies based on wearable sensors, accurate detection of FoG events is important for resuming walking, clinical diagnosis, and treatment. Here, we propose a deep residual network to detect FoG. Offline analysis performed on a publicly available dataset with 10 patients shows the superiority of the proposed approach compared to a traditional method (Moore's algorithm) and several deep learning techniques. With the chosen window size, the proposed method can achieve 85.7% sensitivity and 94.0% specificity. The geometric mean of the proposed method is 37.4% higher than that of Moore's algorithm. Our approach can help improve the quality of life of patients with PD and help evaluate symptoms of FoG.
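A minimal sketch of the kind of 1-D residual network such a detector could apply to windows of wearable-sensor data; the channel counts, kernel sizes, and two-class head are assumptions for illustration, not the architecture reported in the paper.

```python
import torch
import torch.nn as nn

class ResidualBlock1D(nn.Module):
    """Two 1-D convolutions with a skip connection (illustrative sizes)."""
    def __init__(self, channels, kernel_size=5):
        super().__init__()
        pad = kernel_size // 2
        self.body = nn.Sequential(
            nn.Conv1d(channels, channels, kernel_size, padding=pad),
            nn.BatchNorm1d(channels),
            nn.ReLU(),
            nn.Conv1d(channels, channels, kernel_size, padding=pad),
            nn.BatchNorm1d(channels),
        )
        self.act = nn.ReLU()

    def forward(self, x):
        return self.act(self.body(x) + x)  # residual (skip) connection

class FoGDetector(nn.Module):
    """Stack of residual blocks followed by a FoG / no-FoG classifier."""
    def __init__(self, in_channels=3, channels=32, n_blocks=3):
        super().__init__()
        self.stem = nn.Conv1d(in_channels, channels, kernel_size=7, padding=3)
        self.blocks = nn.Sequential(*[ResidualBlock1D(channels) for _ in range(n_blocks)])
        self.head = nn.Linear(channels, 2)

    def forward(self, x):                   # x: (batch, sensor axes, samples)
        h = self.blocks(self.stem(x))
        return self.head(h.mean(dim=-1))    # global average pooling over time

# One batch of accelerometer windows: 8 windows, 3 axes, 192 samples each
logits = FoGDetector()(torch.randn(8, 3, 192))
print(logits.shape)  # torch.Size([8, 2])
```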
2021-01-01. 20th IEEE International Conference on Machine Learning and Applications (ICMLA), ELECTR NETWORK, Dec 13-16, 2021. p. 320-325. DOI: 10.1109/ICMLA52953.2021.00056.
Electrical spinal cord stimulation protocols to regulate motor and autonomic functions in people with neurological disorders
Neurological disorders such as spinal cord injury (SCI) massively reduce independence and quality of life. Most often, the majority of the nervous system is still fully or partially spared, but dysfunctional due to aberrant or absent descending input from the brain. Neuromodulation techniques restore function by artificially engaging these neuronal circuits. Given its central position and multi-functional role, the spinal cord provides an excellent gateway for such therapies. Epidural Electrical Stimulation (EES) of the spinal cord has been put forward as a potential therapy to restore motor and autonomic function after SCI. Throughout the last decade, our laboratory developed the concept of biomimetic EES. Spatiotemporal stimulation parameters are optimized to re-establish the natural dynamics of the underlying spinal circuitry. Strong evidence in rodent and non-human primate models suggests that biomimetic EES could promote locomotion and haemodynamic stability after severe SCI, leading to a push for rapid clinical translation of this potentially game-changing therapy. This clinical translation is the pivot of my thesis. The first part focuses on the recovery of leg motor control after SCI. The results of a first-in-human clinical trial in 9 participants with chronic SCI are presented. We developed a set of targeted neurotechnologies that include a new epidurally implanted electrode array, real-time communication and tailormade software systems that enable the precise delivery of biomimetic EES. We demonstrate that biomimetic EES can be optimized to immediately restore a vast number of leg motor functions such as walking and cycling, as well as trunk posture. Furthermore, after 5 months of intensive EES-enabled volitional training, participants did not only all improve their motor performance while using EES, they moreover regained volitional control over previously paralyzed muscles. This neurological recovery correlated with a reduced metabolic activity in the spinal cord, suggesting spinal remodeling. A systematic pre-clinical approach identified a specific neuronal population mediating these effects. The second part of my thesis considers another critically important body function: haemodynamic management. Orthostatic hypotension is a major health issue after severe SCI and stems from a disconnection between the vasomotor regulatory centres in the brainstem and the sympathetic circuitry that adjusts peripheral vascular resistance. Preclinical work in rodents showed that EES applied to the low-thoracic spinal cord could restore haemodynamic stability after SCI. On the path towards clinical translation, we implemented closed-loop control of EES in a non-human primate model of SCI to maintain haemodynamic stability during orthostatic challenge. I moreover present a case-study that demonstrates the transferability of thoracic EES to improve haemodynamic management in a patient with Multiple System Atrophy (MSA). A marked reduction of orthostatic hypotension led to a clinically relevant decrease in fainting episodes and considerably improved quality of life. My thesis demonstrates the prosthetic and therapeutic effects of biomimetic EES and highlights its ability to immediately restore body functions and boost neurological recovery after SCI and in MSA. These results strengthen the belief that EES holds the potential to evolve into a relevant and efficient therapeutic intervention in a plenitude of neurological disorders.
Lausanne, EPFL, 2021. DOI: 10.5075/epfl-thesis-9205.
Material surface detection on various body parts: a preliminary study for temperature substitution for upper arm amputees
Experiments have shown that healthy subjects can detect materials solely relying on the material's thermal properties at their fingertip. We are interested in developing a non-invasive temperature display for sensory substitution for patients with arm amputation, that could reproduce the signature temperature drop of different materials. In a group of healthy subjects, we investigate the best placement of such a device and the ability to discriminate objects' materials on the upper arm and the mid-abdomen in comparison to the fingertip. Our experiments show that the discrimination rates for all three locations are above chance level, with scores in the abdomen even higher than for the fingertip. We discuss how these findings can help to develop a sensory substitution device for temperature feedback for amputees.
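The material-dependent temperature drop exploited above is often approximated with thermal effusivities: when skin at temperature $T_s$ (effusivity $e_s$) touches a material at temperature $T_m$ (effusivity $e_m$), the contact interface settles near a weighted mean. This standard semi-infinite-body approximation is shown here for illustration only; it is not taken from the paper.

```latex
T_{\text{contact}} \approx \frac{e_s T_s + e_m T_m}{e_s + e_m},
\qquad e = \sqrt{k \, \rho \, c_p}
```

Materials with higher effusivity (e.g., copper) pull the contact temperature further away from skin temperature than low-effusivity materials (e.g., plastic), and this difference in the temperature drop is the cue a temperature display would need to reproduce.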
2021-01-01. 10th International IEEE-EMBS Conference on Neural Engineering (NER), Prague, ELECTR NETWORK, May 04-06, 2021. p. 195-198. DOI: 10.1109/NER49283.2021.9441262.
Current Solutions and Future Trends for Robotic Prosthetic Hands
The desire for functional replacement of a missing hand is an ancient one. Historically, humans have replaced a missing limb with a prosthesis for cosmetic, vocational, or personal autonomy reasons. The hand is a powerful tool, and its loss causes severe physical and often mental debilitation. Technological advancements have allowed the development of increasingly effective artificial hands, which can improve the quality of life of people who suffered a hand amputation. Here, we review the state of the art of robotic prosthetic hands (RPHs), with particular attention to the potential and current limits of their main building blocks: the hand itself, approaches to decoding voluntary commands and controlling the hand, and systems and methods for providing sensory feedback to the user. We also briefly describe existing approaches to characterizing the performance of subjects using RPHs for grasping tasks and provide perspectives on the future of different components and the overall field of RPH development.
Annual Review Of Control, Robotics, And Autonomous Systems. 2021-01-01. DOI: 10.1146/annurev-control-071020-104336.
Preventing Breaks in Embodiment in Immersive Virtual Reality
Virtual reality (VR) is immersive not only because of visual integration, but because we can act and perform in a virtual environment (VE). Beyond the mere fact that a VR system provides images computed directly from the users' viewpoint in stereoscopy, it can also provide a natural way to interact within the VE using the entire body. Full-body motion capture allows the user's movement to be mapped to a virtual body. Such a mapping may let users feel in control of (sense of agency), own (sense of body ownership) and locate themselves inside this virtual body (sense of self-location), thus leading them to feel embodied in this virtual body whenever these conditions are met (sense of embodiment (SoE)). Hence the integration of a full body avatar in VR is essential, as it brings even more; experiencing being someone else, train to recover mobility or having superhuman abilities. Most importantly, the avatar mediates the virtual nature of the VE to make it seamless for the user. Thus, this thesis focuses on helping the users execute complex movements in VR by applying a distortion to their movements. However, such a distortion is efficient only if users, when assisted, do not experience a break in embodiment (BiE), i.e., do not abruptly lose their sense of embodiment. Moreover, the way the virtual body is perceived is not consistent between users and some users might accept or reject a virtual body more easily than others. Therefore, this thesis explores how to prevent a BiE while distorting users' movement and adapting the VR experience to each individual. To this end, we designed a system combining brain-computer interface and machine learning to detect when users experience a BiE, compute users' preferences, and adapt the VR application's parameters. We first identified the user`s threshold for perceiving a movement distortion in some critical contexts, such as articular limits. We designed a new distortion to help users execute a complex movement. We introduced a method based on machine learning to find users' preferences when using our new distortion. We also established a link between the brain's error monitoring mechanism and the cognitive process of embodiment. This link exposed a new implicit EEG marker indicating when users experience a BiE. Finally, we designed a system to calibrate the distortion implicitly based on users' preferences with all these elements. This system first consists of detecting a BiE during a continuous movement thanks to state-of-the-art brain-computer interface algorithms. Our machine learning method then computes users' preferences, and the distortion is adapted dynamically. This original approach demonstrates the use of an implicit feedback loop to help users while preventing any BiE. Our framework opens an exciting perspective for personalized and self-adapting embodiment systems.
Lausanne, EPFL, 2021. DOI: 10.5075/epfl-thesis-8053.
2019
Non-invasive, Brain-controlled Functional Electrical Stimulation for Locomotion Rehabilitation in Individuals with Paraplegia
Spinal cord injury (SCI) impairs the flow of sensory and motor signals between the brain and the areas of the body located below the lesion level. Here, we describe a neurorehabilitation setup combining several approaches that were shown to have a positive effect in patients with SCI: gait training by means of non-invasive, surface functional electrical stimulation (sFES) of the lower-limbs, proprioceptive and tactile feedback, balance control through overground walking and cue-based decoding of cortical motor commands using a brain-machine interface (BMI). The central component of this new approach was the development of a novel muscle stimulation paradigm for step generation using 16 sFES channels taking all sub-phases of physiological gait into account. We also developed a new BMI protocol to identify left and right leg motor imagery that was used to trigger an sFES-generated step movement. Our system was tested and validated with two patients with chronic paraplegia. These patients were able to walk safely with 65-70% body weight support, accumulating a total of 4,580 steps with this setup. We observed cardiovascular improvements and less dependency on walking assistance, but also partial neurological recovery in both patients, with substantial rates of motor improvement for one of them.
Scientific Reports. 2019-05-01. DOI: 10.1038/s41598-019-43041-9.
Creating a neuroprosthesis for active tactile exploration of textures
Intracortical microstimulation (ICMS) of the primary somatosensory cortex (S1) can produce percepts that mimic somatic sensation and, thus, has potential as an approach to sensorize prosthetic limbs. However, it is not known whether ICMS could recreate active texture exploration-the ability to infer information about object texture by using one's fingertips to scan a surface. Here, we show that ICMS of S1 can convey information about the spatial frequencies of invisible virtual gratings through a process of active tactile exploration. Two rhesus monkeys scanned pairs of visually identical screen objects with the fingertip of a hand avatar-controlled first via a joystick and later via a brain-machine interface-to find the object with denser virtual gratings. The gratings consisted of evenly spaced ridges that were signaled through individual ICMS pulses generated whenever the avatar's fingertip crossed a ridge. The monkeys learned to interpret these ICMS patterns, evoked by the interplay of their voluntary movements and the virtual textures of each object, to perform a sensory discrimination task. Discrimination accuracy followed Weber's law of just-noticeable differences (JND) across a range of grating densities; a finding that matches normal cutaneous sensation. Moreover, 1 monkey developed an active scanning strategy where avatar velocity was integrated with the ICMS pulses to interpret the texture information. We propose that this approach could equip upper-limb neuroprostheses with direct access to texture features acquired during active exploration of natural objects.
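The Weber's-law relation mentioned above can be stated explicitly: the just-noticeable difference $\Delta I$ in grating density scales with the baseline density $I$, with the Weber fraction $k$ estimated from the behavioral data (the notation here is generic, not taken from the paper).

```latex
\Delta I = k \, I
\quad\Longleftrightarrow\quad
\frac{\Delta I}{I} = k \ \text{(approximately constant across grating densities)}
```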
Proceedings of the National Academy of Sciences of the United States Of America. 2019-10-22. DOI: 10.1073/pnas.1908008116.
Exoskeletons as Mechatronic Design Example
Exoskeletons are a perfect example of a mechatronics product. They illustrate the close integration and interdependence of mechanical design, drive train, sensors, control strategy, and user interface. Recent developments from our lab will be discussed in detail. Application examples include paraplegics, amputees, and muscular dystrophy patients. The motivations of exoskeleton users are as diverse as sporting challenge, quality-of-life improvement for daily living, rehabilitation, and social integration. Links to Cognitive Neurosciences will also be briefly discussed.
2019-01-01. 6th International Workshop on Medical and Service Robots (MESROB), Cassino, ITALY, 2018. p. 109-117. DOI: 10.1007/978-3-030-00329-6_13.
2018
Closed-Loop Functional Electrical Stimulation for Gait Training for Patients with Paraplegia
This paper presents a novel functional electrical stimulation paradigm to generate locomotion in paraplegic patients. We propose a closed-loop surface functional electrical stimulation (sFES) of 16 lower-limb muscles to produce all sub-phases of the gait. The reproduction of cyclic and coordinated joint movements is a challenge, especially when synchronous and synergistic lower-limb muscular contractions are considered. This paper presents the implementation and validation of an sFES closed-loop position controller and pilot clinical tests with a chronic, motor-complete (ASIA B) SCI patient. We demonstrate that the patient can safely use our setup to produce a physiologically correct gait with minimal external help. The implemented sFES gait training led to an improvement in gait kinematics. The integration of the proposed setup with a neurorehabilitation protocol has the potential to become a valuable locomotion therapy for SCI patients.
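A minimal sketch of a closed-loop position controller of the kind described above, where the stimulation intensity of one muscle channel is adjusted from the error between the desired and measured joint angle. The gains, saturation limits, single-channel simplification, and toy plant model are assumptions, not the published controller.

```python
class StimulationPID:
    """PI(D) controller mapping joint-angle error to stimulation intensity."""
    def __init__(self, kp=1.5, ki=0.4, kd=0.05, i_max=20.0, out_max=100.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.i_max, self.out_max = i_max, out_max
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, target_angle, measured_angle, dt):
        error = target_angle - measured_angle
        self.integral = max(-self.i_max, min(self.i_max, self.integral + error * dt))
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        out = self.kp * error + self.ki * self.integral + self.kd * derivative
        return max(0.0, min(self.out_max, out))   # stimulation cannot be negative

# One knee-extension channel tracking a 40-degree target at 50 Hz
pid = StimulationPID()
angle = 10.0
for _ in range(5):
    intensity = pid.step(target_angle=40.0, measured_angle=angle, dt=0.02)
    angle += 0.05 * intensity           # toy plant: angle rises with stimulation
print(round(angle, 1), round(intensity, 1))
```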
2018-01-01. IEEE International Conference on Robotics and Biomimetics (ROBIO), Kuala Lumpur, MALAYSIA, Dec 12-15, 2018. p. 1489-1495. DOI: 10.1109/ROBIO.2018.8665270.
2016
Long-Term Training with a Brain-Machine Interface-Based Gait Protocol Induces Partial Neurological Recovery in Paraplegic Patients
Brain-machine interfaces (BMIs) provide a new assistive strategy aimed at restoring mobility in severely paralyzed patients. Yet, no study in animals or in human subjects has indicated that long-term BMI training could induce any type of clinical recovery. Eight chronic (3-13 years) spinal cord injury (SCI) paraplegics were subjected to long-term training (12 months) with a multi-stage BMI-based gait neurorehabilitation paradigm aimed at restoring locomotion. This paradigm combined intense immersive virtual reality training, enriched visual-tactile feedback, and walking with two EEG-controlled robotic actuators, including a custom-designed lower limb exoskeleton capable of delivering tactile feedback to subjects. Following 12 months of training with this paradigm, all eight patients experienced neurological improvements in somatic sensation (pain localization, fine/crude touch, and proprioceptive sensing) in multiple dermatomes. Patients also regained voluntary motor control in key muscles below the SCI level, as measured by EMGs, resulting in marked improvement in their walking index. As a result, 50% of these patients were upgraded to an incomplete paraplegia classification. Neurological recovery was paralleled by the reemergence of lower limb motor imagery at cortical level. We hypothesize that this unprecedented neurological recovery results from both cortical and spinal cord plasticity triggered by long-term BMI usage.
Scientific Reports. 2016-08-11. DOI: 10.1038/srep30383.
Assimilation of virtual legs and perception of floor texture by complete paraplegic patients receiving artificial tactile feedback
Spinal cord injuries disrupt bidirectional communication between the patient's brain and body. Here, we demonstrate a new approach for reproducing lower limb somatosensory feedback in paraplegics by remapping missing leg/foot tactile sensations onto the skin of patients' forearms. A portable haptic display was tested in eight patients in a setup where the lower limbs were simulated using immersive virtual reality (VR). For six out of eight patients, the haptic display induced the realistic illusion of walking on three different types of floor surfaces: beach sand, a paved street or grass. Additionally, patients experienced the movements of the virtual legs during the swing phase or the sensation of the foot rolling on the floor while walking. Relying solely on this tactile feedback, patients reported the position of the avatar leg during virtual walking. Crossmodal interference between vision of the virtual legs and tactile feedback revealed that patients assimilated the virtual lower limbs as if they were their own legs. We propose that the addition of tactile feedback to neuroprosthetic devices is essential to restore a full lower limb perceptual experience in spinal cord injury (SCI) patients, and will ultimately, lead to a higher rate of prosthetic acceptance/use and a better level of motor proficiency.
Scientific Reports. 2016-09-01. DOI: 10.1038/srep32293.
2013
A Brain-Machine Interface Enables Bimanual Arm Movements in Monkeys
Brain-machine interfaces (BMIs) are artificial systems that aim to restore sensation and movement to paralyzed patients. So far, BMIs have enabled only one arm to be moved at a time. Control of bimanual arm movements remains a major challenge. We have developed and tested a bimanual BMI that enables rhesus monkeys to control two avatar arms simultaneously. The bimanual BMI was based on the extracellular activity of 374 to 497 neurons recorded from several frontal and parietal cortical areas of both cerebral hemispheres. Cortical activity was transformed into movements of the two arms with a decoding algorithm called a fifth-order unscented Kalman filter (UKF). The UKF was trained either during a manual task performed with two joysticks or by having the monkeys passively observe the movements of avatar arms. Most cortical neurons changed their modulation patterns when both arms were engaged simultaneously. Representing the two arms jointly in a single UKF decoder resulted in improved decoding performance compared with using separate decoders for each arm. As the animals' performance in bimanual BMI control improved over time, we observed widespread plasticity in frontal and parietal cortical areas. Neuronal representation of the avatar and reach targets was enhanced with learning, whereas pairwise correlations between neurons initially increased and then decreased. These results suggest that cortical networks may assimilate the two avatar arms through BMI control. These findings should help in the design of more sophisticated BMIs capable of enabling bimanual motor control in human patients.
Science Translational Medicine. 2013. DOI: 10.1126/scitranslmed.3006159.
Expanding the primate body schema in sensorimotor cortex by virtual touches of an avatar
The brain representation of the body, called the body schema, is susceptible to plasticity. For instance, subjects experiencing a rubber hand illusion develop a sense of ownership of a mannequin hand when they view it being touched while tactile stimuli are simultaneously applied to their own hand. Here, the cortical basis of such an embodiment was investigated through concurrent recordings from primary somatosensory (i.e., S1) and motor (i.e., M1) cortical neuronal ensembles while two monkeys observed an avatar arm being touched by a virtual ball. Following a period when virtual touches occurred synchronously with physical brushes of the monkeys' arms, neurons in S1 and M1 started to respond to virtual touches applied alone. Responses to virtual touch occurred 50 to 70 ms later than to physical touch, consistent with the involvement of polysynaptic pathways linking the visual cortex to S1 and M1. We propose that S1 and M1 contribute to the rubber hand illusion and that, by taking advantage of plasticity in these areas, patients may assimilate neuroprosthetic limbs as parts of their body schema.
Proceedings Of The National Academy Of Sciences Of The United States Of America. 2013. DOI: 10.1073/pnas.1308459110.
Virtual reality based Brain-Machine-Interface for Sensori-Motor and Social experiments with Primates
As a result of improved understanding of brain mechanisms as well as unprecedented technical advancement in neural recording methods and computer technology, it is now possible to translate large-scale brain signals into movement intentions in real time. Such decoding of both actual and imagined movements of a subject allows for new paradigms of treatment for severely impaired patients, such as neural control of a prosthesis. The field of Brain Machine Interfaces (BMI) explores the tremendous potential of hybrid systems linking neural tissue to artificial devices. BMI operations involve a bidirectional learning process: the BMI system learns to decode brain signals by uncovering their relationship to voluntary movements, while the brain itself plastically adapts to the task. Proper BMI training is critical for its successful adoption by the patient. We believe that training a subject in a realistic virtual environment prior to the use of the physical prosthetic device is an efficient and safe method that can significantly facilitate design of practical neural prostheses for patients in need. In this dissertation we describe the control of a 3D virtual monkey (the avatar) as visual feedback for BMI with rhesus monkeys and address a number of key questions: • Monkeys’ interaction with the avatar. • Modulation of neurons in primary somatosensory (S1) and motor (M1) cortical areas during passive observation of the avatar being touched. • Modulation of neural responses by the observation of the avatar’s movements. • Changes in neural responses during long-term brain control of the avatar. We describe the plasticity of the body representation by the brain resulting from visual stimuli delivered via the avatar and tactile stimuli applied to the subject’s physical arm. We show how the avatar can be used for training rhesus monkeys to perform complex tasks. Behavioral evidence that rhesus monkeys respond to the avatar shape and motions and can even relate it with a representation of another monkey is presented. Finally two instances of novel brain controlled avatar are shown: a complete closed loop brain-machine-brain-interface with sensory feedback through direct cortical stimulation and the first successful attempt of a multi-limb BMI. We also study a simplified learning process of the BMI through the passive observation of the movements of the avatar arms.
Lausanne, EPFL, 2013. DOI: 10.5075/epfl-thesis-5671.
2012
Bimanual brain-machine interface
Brain-machine interfaces (BMIs) - devices that connect brain areas to external actuators - strive to restore limb mobility and sensation to patients suffering from paralysis or limb loss. Here we report a novel BMI that controls two virtual arms simultaneously. The development of BMIs for bimanual control is important because even the most basic daily movements such as opening a jar or buttoning a shirt require two arms. We for the first time have designed and implemented a bimanual BMI where activity of multiple cortical areas is translated in real-time into center-out reaching movements performed by two virtual arms. Eight multielectrode arrays, a total of 768 electrode channels, were implanted in the primary motor (M1), sensory (S1), supplementary motor (SMA), dorsal premotor (PMd), and posterior parietal (PP) cortices of both hemispheres of a rhesus monkey. Movement kinematics of each arm were extracted from the same ensemble of 400 neurons using a Wiener filter and an unscented Kalman filter (UKF). Typically, a single neuron contributed to the movements of both left and right arms. Movements were enacted by arms of a virtual rhesus monkey avatar on a computer screen presented in first-person to the monkey. On each trial, the virtual arms moved their central locations to peripheral targets presented simultaneously on the right and left sides of the computer screen. Peri-event time histograms and linear discriminant analysis revealed a highly distributed encoding scheme, with movement directions of both limbs represented by both ipsilateral and contralateral areas. Furthermore, movements were represented by multiple cortical regions, including both primary and non-primary motor areas which have been previously identified areas important for bimanual coordination. Over the course of several weeks of real-time BMI control, the monkey’s performance clearly improved both when the monkey continued to move the joystick and when the joystick was removed. These results support the feasibility of cortically-driven clinical neural prosthetics for bimanual operations.
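A minimal numpy sketch of the Wiener-filter-style decoding stage mentioned above: firing-rate bins over a few time lags are linearly regressed onto arm kinematics. The lag count, ridge term, and synthetic data are illustrative assumptions, and the unscented Kalman filter stage is omitted.

```python
import numpy as np

def build_lagged_design(rates, n_lags=10):
    """Stack firing-rate bins from the current and previous n_lags-1 bins."""
    T, n_units = rates.shape
    X = np.zeros((T, n_units * n_lags))
    for lag in range(n_lags):
        X[lag:, lag * n_units:(lag + 1) * n_units] = rates[:T - lag]
    return X

def fit_wiener(rates, kinematics, n_lags=10, ridge=1e-3):
    """Ridge-regularized least-squares map from lagged rates to kinematics."""
    X = build_lagged_design(rates, n_lags)
    X = np.hstack([X, np.ones((X.shape[0], 1))])           # bias term
    W = np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ kinematics)
    return W

def predict(rates, W, n_lags=10):
    X = build_lagged_design(rates, n_lags)
    X = np.hstack([X, np.ones((X.shape[0], 1))])
    return X @ W

# Synthetic example: 50 units, 4 kinematic outputs (x/y velocity of each arm)
rng = np.random.default_rng(1)
rates = rng.poisson(3.0, size=(2000, 50)).astype(float)
kin = rates[:, :4] * 0.1 + rng.normal(0, 0.5, size=(2000, 4))
W = fit_wiener(rates, kin)
print(predict(rates, W).shape)  # (2000, 4)
```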
Society for Neuroscience, New Orleans, LA., October 13-17, 2012.
Beyond the homunculus: Visual responses of primary somatosensory cortex (S1) neurons to virtual touch of a virtual
Following a brief period of brushing a monkey arm with a real brush, synchronized with the vision of an arm avatar being brushed virtually, neurons in the primary somatosensory and motor cortices began to fire in response to the virtual brushing alone, suggesting that cortical representation of the body can be reshaped, in a matter of minutes, to incorporate even virtual limbs.
Society for Neuroscience 42nd Annual Meeting, New Orleans, LA., October 12-17, 2012.
2011
Social interaction probed by reaching to face images: Rhesus monkeys consider a textured monkey avatar as a conspecific.
Realistic body images (avatars) have long been utilized in virtual reality applications, and they are becoming increasingly used in the Neuroscience and Neuroprosthetics fields. To elucidate monkeys' perception of avatars, we measured the reactions of two naive rhesus monkeys when confronted with realistic 3D monkey avatars with different facial expressions and different levels of realism. We compared these with their reactions to images of real monkeys with similar facial expressions. Monkeys were initially overtrained in a reaching task in which they manipulated a joystick to reach toward circular targets with a computer cursor. We then replaced every 15th target with a randomly selected image of either a real monkey face, an avatar face, or a sphere (i.e., control image), and we measured the average speed to reach each of these images. We also tested two different facial expressions: an aggressive bared-teeth face and a friendly face. Showing the face images significantly altered the kinematics of reaching movements. These results indicated that monkeys interacted with the realistic avatar as if it were a conspecific. This effect was absent if the avatar was not textured. The two-way ANOVA showed that the interaction of texture and facial expression was a significant factor for the monkeys' speed of reaching (p<0.01). At the same time, we did not find any significant effect of 3D rendering versus 2D flat rendering (p>0.8), suggesting that texture is more important than spatial realism for monkeys to consider an avatar as a conspecific. These effects should be taken into account when using avatars in primate neuroprosthetic research.
Society for Neuroscience, Washington DC, USA, November 12-16, 2011.
Active tactile exploration using a brain-machine-brain interface
Brain–machine interfaces use neuronal activity recorded from the brain to establish direct communication with external actuators, such as prosthetic arms. It is hoped that brain–machine interfaces can be used to restore the normal sensorimotor functions of the limbs, but so far they have lacked tactile sensation. Here we report the operation of a brain–machine–brain interface (BMBI) that both controls the exploratory reaching movements of an actuator and allows signalling of artificial tactile feedback through intracortical microstimulation (ICMS) of the primary somatosensory cortex. Monkeys performed an active exploration task in which an actuator (a computer cursor or a virtual-reality arm) was moved using a BMBI that derived motor commands from neuronal ensemble activity recorded in the primary motor cortex. ICMS feedback occurred whenever the actuator touched virtual objects. Temporal patterns of ICMS encoded the artificial tactile properties of each object. Neuronal recordings and ICMS epochs were temporally multiplexed to avoid interference. Two monkeys operated this BMBI to search for and distinguish one of three visually identical objects, using the virtual-reality arm to identify the unique artificial texture associated with each. These results suggest that clinical motor neuroprostheses might benefit from the addition of ICMS feedback to generate artificial somatic perceptions associated with mechanical, robotic or even virtual prostheses.
Nature. 2011. DOI: 10.1038/nature10489.
2010
Integration of a virtual reality based arm in primary somatosensory cortex
Recent advances in brain-machine interfaces (BMIs) have demonstrated the possibility of motor neuroprosthetics directly controlled by brain activity. Ideally neuroprosthetic limbs should be integrated in the body schema of the subject. To explore the ways to enhance such incorporation, we recorded modulations of neuronal ensemble activity in the primary somatosensory (S1) cortex during tactile stimulation simulated in virtual reality (VR) under conditions known to evoke a rubber-hand illusion. A realistic 3D mesh represented monkey body in VR. The monkey’s arms were hidden by an opaque plate and virtual arms projected on the plate. A robotic brush, also hidden from the monkey, touched various locations on forearms of the monkey and was synchronized with a virtual brush touching the projected VR arms. Additionally, we implemented tactile stimulation with air puffs. We have tested various combination of tactile (physical touch), visual (VR arm being touched) and sound (robotic brush touching the arm) inputs: synchronous tactile and visual (T-VR), tactile without visual (T), and visual only (VR). Neuronal ensemble activity was recorded from S1 and primary motor cortex (M1). We found differences in both S1 and M1 activities across the stimulation types. In particular S1 responses to T-VR were stronger than for T. Moreover, S1 neurons were modulated during visual stimulation without touch (VR), suggesting S1 activation as a neuronal mechanism of the rubber-hand illusion. Further, we decoded stimulation parameters from the activity of large neuronal populations. These results suggest a flexible and distributed representation of somatosensory information in the cortex, which can be modified by visual feedback from the body and/or artificial actuators.
Neuroscience 2010, SfN's 40th annual meeting, San Diego, California, USA, 13-17 November, 2010.
Virtual Environment to Evaluate Multimodal Feedback Strategies for Augmented Navigation of the Visually Impaired
This paper proposes a novel experimental environment to evaluate multimodal feedback strategies for augmented navigation of the visually impaired. The environment consists of virtual obstacles and walls, an optical tracking system, and a simple device with audio and vibrotactile feedback that interacts with the virtual environment, and presents many advantages in terms of safety, flexibility, control over experimental parameters, and cost. The subject can freely move in an empty room, while the positions of the head and arm are tracked in real time. A virtual environment (walls, obstacles) is randomly generated, and audio and vibrotactile feedback are given according to the distance from the subject's arm to the virtual walls/objects. We investigate the applicability of our environment using a simple, commercially available feedback device. Experiments with unimpaired subjects show that it is possible to use the setup to "blindly" navigate in an unpredictable virtual environment. This validates the environment as a test platform to investigate navigation and exploration strategies of the visually impaired, and to evaluate novel technologies for augmented navigation.
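The distance-to-feedback mapping described above can be sketched as follows; the maximum range, vibration scaling, and beep-rate rule are assumptions for illustration, not the parameters of the published system.

```python
import numpy as np

# Illustrative parameters (assumptions, not the values used in the paper)
MAX_RANGE = 2.0        # metres beyond which no feedback is given
MIN_PULSE_GAP = 0.1    # seconds between audio beeps when very close

def distance_to_wall(arm_pos, wall_point, wall_normal):
    """Signed distance from the tracked arm position to a virtual wall plane."""
    return float(np.dot(arm_pos - wall_point, wall_normal))

def feedback(distance):
    """Map distance to vibrotactile amplitude (0..1) and beep interval (s)."""
    if distance >= MAX_RANGE:
        return 0.0, None                       # silent outside the range
    closeness = 1.0 - max(distance, 0.0) / MAX_RANGE
    amplitude = closeness                      # stronger vibration when closer
    beep_interval = MIN_PULSE_GAP + (1.0 - closeness)  # faster beeps when closer
    return amplitude, beep_interval

arm = np.array([0.3, 1.2, 0.4])
wall_origin = np.array([1.0, 0.0, 0.0])
wall_normal = np.array([-1.0, 0.0, 0.0])       # wall facing the user
print(feedback(distance_to_wall(arm, wall_origin, wall_normal)))
```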
2010. 32nd Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC'10), Buenos Aires, Argentina, September 1-4, 2010. p. 975-978. DOI: 10.1109/IEMBS.2010.5627611.
2005
Toward a person-follower robot
In this article we described the attempt to build a robot able to locate and follow a human target moving in a domestic environment. After a brief review of the state of the art in relative location technologies, we described our approach, which aims to develop robots provided with simple and robust relative location technologies that do not require structuring the environment, and with simple semi-reactive strategies that do not require the use of internal maps or the ability to self-localize. More specifically, the approach is based on a control system able to display and integrate exploration, obstacle avoidance, and target-following behaviors, and on a relative location device based on a signal emitter (placed on the target person) and a directional sensor (placed on the mobile robot).
2005. RoboCare Workshop. p. 65-68.