Solaiman Shokur

solaiman.shokur@epfl.ch +41 21 693 80 44
EPFL STI IBI-STI TNE
B3 4 225.134 (Campus Biotech, Building B3)
Ch. des Mines 9
CH-1202 Genève
EPFL > STI > IBI-STI > TNE
Web site: https://tne.epfl.ch
EPFL > SV > SV-SSV > SSV-ENS
Web site: https://sv.epfl.ch/education
Fields of expertise
His work involves the development and validation of new technologies to restore sensorimotor function in patients with complete paraplegia or upper-limb amputation.
His areas of expertise include virtual reality for rehabilitation, invasive and non-invasive brain-machine interfaces, and haptic devices.
Biography
NCCR Robotics: Bi-directional control of supernumerary limbs.
2014-2019: Research Coordinator, AASDAP (São Paulo, Brazil)
Responsible for the scientific output and the clinical protocols at the AASDAP neurorehabilitation laboratory.
Head of the engineering team in charge of system integration.
2010-2012: Visiting Scientist, Nicolelis Lab (Duke University).
Publications
2019
Creating a neuroprosthesis for active tactile exploration of textures
Intracortical microstimulation (ICMS) of the primary somatosensory cortex (S1) can produce percepts that mimic somatic sensation and, thus, has potential as an approach to sensorize prosthetic limbs. However, it is not known whether ICMS could recreate active texture exploration: the ability to infer information about object texture by using one's fingertips to scan a surface. Here, we show that ICMS of S1 can convey information about the spatial frequencies of invisible virtual gratings through a process of active tactile exploration. Two rhesus monkeys scanned pairs of visually identical screen objects with the fingertip of a hand avatar (controlled first via a joystick and later via a brain-machine interface) to find the object with denser virtual gratings. The gratings consisted of evenly spaced ridges that were signaled through individual ICMS pulses generated whenever the avatar's fingertip crossed a ridge. The monkeys learned to interpret these ICMS patterns, evoked by the interplay of their voluntary movements and the virtual textures of each object, to perform a sensory discrimination task. Discrimination accuracy followed Weber's law of just-noticeable differences (JND) across a range of grating densities, a finding that matches normal cutaneous sensation. Moreover, one monkey developed an active scanning strategy whereby avatar velocity was integrated with the ICMS pulses to interpret the texture information. We propose that this approach could equip upper-limb neuroprostheses with direct access to texture features acquired during active exploration of natural objects.
Proceedings of the National Academy of Sciences of the United States of America. 2019-10-22. DOI: 10.1073/pnas.1908008116.
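The Weber's-law result above has a compact quantitative form: the just-noticeable difference in grating density grows in proportion to the baseline density. A minimal sketch in Python, with a made-up Weber fraction (the paper's fitted value is not quoted here):

```python
# Weber's law: the just-noticeable difference (JND) grows in proportion to
# the baseline stimulus magnitude, delta_I = k * I. The Weber fraction
# k = 0.2 below is a made-up value for illustration, not the paper's fit.
def jnd(baseline_density, weber_fraction=0.2):
    """Smallest detectable change in grating density at a given baseline."""
    return weber_fraction * baseline_density

for density in [2.0, 4.0, 8.0]:  # ridges per unit of scanned distance
    print(f"baseline {density:.1f} -> JND {jnd(density):.2f}")
```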
Non-invasive, Brain-controlled Functional Electrical Stimulation for Locomotion Rehabilitation in Individuals with Paraplegia
Spinal cord injury (SCI) impairs the flow of sensory and motor signals between the brain and the areas of the body located below the lesion level. Here, we describe a neurorehabilitation setup combining several approaches that were shown to have a positive effect in patients with SCI: gait training by means of non-invasive, surface functional electrical stimulation (sFES) of the lower limbs, proprioceptive and tactile feedback, balance control through overground walking, and cue-based decoding of cortical motor commands using a brain-machine interface (BMI). The central component of this new approach was the development of a novel muscle stimulation paradigm for step generation using 16 sFES channels, taking all sub-phases of physiological gait into account. We also developed a new BMI protocol to identify left and right leg motor imagery that was used to trigger an sFES-generated step movement. Our system was tested and validated with two patients with chronic paraplegia. These patients were able to walk safely with 65-70% body weight support, accumulating a total of 4,580 steps with this setup. We observed cardiovascular improvements and less dependency on walking assistance, but also partial neurological recovery in both patients, with substantial rates of motor improvement for one of them.
Scientific Reports. 2019-05-01. DOI: 10.1038/s41598-019-43041-9.
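As a rough illustration of the paradigm above, the sketch below shows how a decoded left/right motor-imagery class could trigger a pre-programmed sequence of surface-FES channel activations covering the sub-phases of one step. The channel indices, muscle names, timings, and the stimulator interface are all hypothetical stand-ins, not the paper's actual 16-channel map:

```python
import time

# Channel indices, muscle names, and phase timings below are illustrative
# stand-ins, not the paper's actual 16-channel stimulation map.
STEP_SEQUENCE = {
    "left": [(0, "left quadriceps", 0.3), (1, "left tibialis anterior", 0.2),
             (2, "left hamstrings", 0.3)],
    "right": [(8, "right quadriceps", 0.3), (9, "right tibialis anterior", 0.2),
              (10, "right hamstrings", 0.3)],
}

class PrintStimulator:
    """Stand-in for a real sFES device: just logs channel activity."""
    def activate(self, channel): print(f"channel {channel} on")
    def deactivate(self, channel): print(f"channel {channel} off")

def trigger_step(decoded_side, stimulator):
    """Play back one step's stimulation sub-phases for the decoded side."""
    for channel, muscle, duration_s in STEP_SEQUENCE[decoded_side]:
        stimulator.activate(channel)   # begin the contraction for this muscle
        time.sleep(duration_s)         # hold for the sub-phase duration
        stimulator.deactivate(channel)

trigger_step("left", PrintStimulator())  # as if BMI decoded left-leg imagery
```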
Exoskeletons as Mechatronic Design Example
Exoskeletons are a perfect example of a mechatronic product. They illustrate the close integration and interdependence of mechanical design, drive train, sensors, control strategy, and user interface. Recent developments from our lab are discussed in detail. Application examples include paraplegic, amputee, and muscular dystrophy patients. The motivations of exoskeleton users are as diverse as sporting challenges, quality-of-life improvements in daily living, rehabilitation, and social integration. Links to cognitive neuroscience are also briefly discussed.
2019-01-01. 6th International Workshop on Medical and Service Robots (MESROB), Cassino, Italy, 2018. p. 109-117. DOI: 10.1007/978-3-030-00329-6_13.
2018
Closed-Loop Functional Electrical Stimulation for Gait Training for Patients with Paraplegia
This paper presents a novel functional electrical stimulation paradigm to generate locomotion in paraplegic patients. We propose closed-loop surface functional electrical stimulation (sFES) of 16 lower-limb muscles to produce all sub-phases of gait. Reproducing cyclic and coordinated joint movements is a challenge, especially when synchronous and synergistic lower-limb muscle contractions are required. This paper presents the implementation and validation of an sFES closed-loop position control and pilot clinical tests with a chronic, motor-complete (ASIA B) SCI patient. We demonstrate that the patient can safely use our setup to produce a physiologically correct gait with minimal external help. The implemented sFES gait training led to an improvement in gait kinematics. The integration of the proposed setup with a neurorehabilitation protocol has the potential to become a valuable locomotion therapy for SCI patients.
2018-01-01. IEEE International Conference on Robotics and Biomimetics (ROBIO), Kuala Lumpur, Malaysia, December 12-15, 2018. p. 1489-1495.
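The closed-loop position control named above can be pictured as a feedback loop that adjusts stimulation intensity until the measured joint angle tracks a reference gait trajectory. A minimal PID sketch, with assumed gains and no claim to match the controller actually used in the paper:

```python
# Gains, time step, and angles are illustrative assumptions; the paper's
# actual controller design is not reproduced here.
class PIDController:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, reference_angle, measured_angle):
        """Return a stimulation-intensity correction from the angle error."""
        error = reference_angle - measured_angle
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        # A real system would clip this to the stimulator's safe range.
        return self.kp * error + self.ki * self.integral + self.kd * derivative

pid = PIDController(kp=2.0, ki=0.5, kd=0.1, dt=0.02)
print(pid.update(reference_angle=30.0, measured_angle=25.0))  # knee angle, deg
```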
2016
Assimilation of virtual legs and perception of floor texture by complete paraplegic patients receiving artificial tactile feedback
Spinal cord injuries disrupt bidirectional communication between the patient's brain and body. Here, we demonstrate a new approach for reproducing lower limb somatosensory feedback in paraplegics by remapping missing leg/foot tactile sensations onto the skin of patients' forearms. A portable haptic display was tested in eight patients in a setup where the lower limbs were simulated using immersive virtual reality (VR). For six out of eight patients, the haptic display induced the realistic illusion of walking on three different types of floor surfaces: beach sand, a paved street, or grass. Additionally, patients experienced the movements of the virtual legs during the swing phase or the sensation of the foot rolling on the floor while walking. Relying solely on this tactile feedback, patients reported the position of the avatar leg during virtual walking. Crossmodal interference between vision of the virtual legs and tactile feedback revealed that patients assimilated the virtual lower limbs as if they were their own legs. We propose that the addition of tactile feedback to neuroprosthetic devices is essential to restore a full lower limb perceptual experience in spinal cord injury (SCI) patients and will ultimately lead to a higher rate of prosthetic acceptance/use and a better level of motor proficiency.
Scientific Reports. 2016-09-01. DOI: 10.1038/srep32293.
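One way to picture the remapping described above: tactile events computed for the virtual feet are re-expressed as drive levels on a small array of forearm vibrators. The roughness values, vibrator count, and noise term below are illustrative assumptions only, not the paper's actual encoding:

```python
import numpy as np

# Roughness values, vibrator count, and the jitter term are assumptions
# for illustration; the paper's encoding scheme is not reproduced here.
TEXTURE_ROUGHNESS = {"sand": 0.8, "grass": 0.6, "pavement": 0.4}

def forearm_pattern(texture, foot_pressure, n_vibrators=4):
    """Map one virtual foot contact to per-vibrator drive levels (0..1)."""
    base = TEXTURE_ROUGHNESS[texture] * foot_pressure
    jitter = 0.1 * np.random.rand(n_vibrators)  # texture-like variability
    return np.clip(base + jitter, 0.0, 1.0)

print(forearm_pattern("sand", foot_pressure=0.9))
```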
Long-Term Training with a Brain-Machine Interface-Based Gait Protocol Induces Partial Neurological Recovery in Paraplegic Patients
Brain-machine interfaces (BMIs) provide a new assistive strategy aimed at restoring mobility in severely paralyzed patients. Yet, no study in animals or in human subjects has indicated that long-term BMI training could induce any type of clinical recovery. Eight chronic (3-13 years) spinal cord injury (SCI) paraplegics were subjected to long-term training (12 months) with a multi-stage BMI-based gait neurorehabilitation paradigm aimed at restoring locomotion. This paradigm combined intense immersive virtual reality training, enriched visual-tactile feedback, and walking with two EEG-controlled robotic actuators, including a custom-designed lower limb exoskeleton capable of delivering tactile feedback to subjects. Following 12 months of training with this paradigm, all eight patients experienced neurological improvements in somatic sensation (pain localization, fine/crude touch, and proprioceptive sensing) in multiple dermatomes. Patients also regained voluntary motor control in key muscles below the SCI level, as measured by EMGs, resulting in marked improvement in their walking index. As a result, 50% of these patients were upgraded to an incomplete paraplegia classification. Neurological recovery was paralleled by the reemergence of lower limb motor imagery at the cortical level. We hypothesize that this unprecedented neurological recovery results from both cortical and spinal cord plasticity triggered by long-term BMI usage.
Scientific Reports. 2016-08-11. DOI: 10.1038/srep30383.
2013
A Brain-Machine Interface Enables Bimanual Arm Movements in Monkeys
Brain-machine interfaces (BMIs) are artificial systems that aim to restore sensation and movement to paralyzed patients. So far, BMIs have enabled only one arm to be moved at a time. Control of bimanual arm movements remains a major challenge. We have developed and tested a bimanual BMI that enables rhesus monkeys to control two avatar arms simultaneously. The bimanual BMI was based on the extracellular activity of 374 to 497 neurons recorded from several frontal and parietal cortical areas of both cerebral hemispheres. Cortical activity was transformed into movements of the two arms with a decoding algorithm called a fifth-order unscented Kalman filter (UKF). The UKF was trained either during a manual task performed with two joysticks or by having the monkeys passively observe the movements of avatar arms. Most cortical neurons changed their modulation patterns when both arms were engaged simultaneously. Representing the two arms jointly in a single UKF decoder resulted in improved decoding performance compared with using separate decoders for each arm. As the animals' performance in bimanual BMI control improved over time, we observed widespread plasticity in frontal and parietal cortical areas. Neuronal representation of the avatar and reach targets was enhanced with learning, whereas pairwise correlations between neurons initially increased and then decreased. These results suggest that cortical networks may assimilate the two avatar arms through BMI control. These findings should help in the design of more sophisticated BMIs capable of enabling bimanual motor control in human patients.
Science Translational Medicine. 2013. DOI: 10.1126/scitranslmed.3006159.
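The decoder named above is a fifth-order unscented Kalman filter; the sketch below illustrates only the basic UKF predict/update structure on a first-order state (hand position and velocity) with a linear tuning model, using the filterpy library and synthetic data. The tuning matrix, noise settings, and dimensions are assumptions, not the paper's parameters:

```python
import numpy as np
from filterpy.kalman import UnscentedKalmanFilter, MerweScaledSigmaPoints

# Simplified sketch: hand position/velocity are the hidden state, neuronal
# firing rates the observations. The paper's decoder is a fifth-order UKF;
# this first-order, linear-tuning version only shows the filter structure.
rng = np.random.default_rng(0)
n_neurons, dt = 50, 0.1
H = rng.normal(size=(n_neurons, 4))          # assumed linear tuning model

def fx(x, dt):
    # Constant-velocity movement model over the state [px, py, vx, vy].
    px, py, vx, vy = x
    return np.array([px + vx * dt, py + vy * dt, vx, vy])

def hx(x):
    # Observation model: firing rates as a linear function of the state.
    return H @ x

points = MerweScaledSigmaPoints(n=4, alpha=0.1, beta=2.0, kappa=0.0)
ukf = UnscentedKalmanFilter(dim_x=4, dim_z=n_neurons, dt=dt,
                            hx=hx, fx=fx, points=points)

for _ in range(10):                           # one decoded step per time bin
    rates = rng.poisson(5.0, size=n_neurons)  # stand-in for binned spikes
    ukf.predict()
    ukf.update(rates.astype(float))
print("decoded state:", ukf.x)
```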
Expanding the primate body schema in sensorimotor cortex by virtual touches of an avatar
The brain representation of the body, called the body schema, is susceptible to plasticity. For instance, subjects experiencing a rubber hand illusion develop a sense of ownership of a mannequin hand when they view it being touched while tactile stimuli are simultaneously applied to their own hand. Here, the cortical basis of such an embodiment was investigated through concurrent recordings from primary somatosensory (i.e., S1) and motor (i.e., M1) cortical neuronal ensembles while two monkeys observed an avatar arm being touched by a virtual ball. Following a period when virtual touches occurred synchronously with physical brushes of the monkeys' arms, neurons in S1 and M1 started to respond to virtual touches applied alone. Responses to virtual touch occurred 50 to 70 ms later than to physical touch, consistent with the involvement of polysynaptic pathways linking the visual cortex to S1 and M1. We propose that S1 and M1 contribute to the rubber hand illusion and that, by taking advantage of plasticity in these areas, patients may assimilate neuroprosthetic limbs as parts of their body schema.
Proceedings of the National Academy of Sciences of the United States of America. 2013. DOI: 10.1073/pnas.1308459110.
Virtual reality based Brain-Machine-Interface for Sensori-Motor and Social experiments with Primates
As a result of improved understanding of brain mechanisms as well as unprecedented technical advancement in neural recording methods and computer technology, it is now possible to translate large-scale brain signals into movement intentions in real time. Such decoding of both actual and imagined movements of a subject allows for new paradigms of treatment for severely impaired patients, such as neural control of a prosthesis. The field of Brain Machine Interfaces (BMI) explores the tremendous potential of hybrid systems linking neural tissue to artificial devices. BMI operations involve a bidirectional learning process: the BMI system learns to decode brain signals by uncovering their relationship to voluntary movements, while the brain itself plastically adapts to the task. Proper BMI training is critical for its successful adoption by the patient. We believe that training a subject in a realistic virtual environment prior to the use of the physical prosthetic device is an efficient and safe method that can significantly facilitate the design of practical neural prostheses for patients in need. In this dissertation we describe the control of a 3D virtual monkey (the avatar) as visual feedback for BMI with rhesus monkeys and address a number of key questions:
• Monkeys' interaction with the avatar.
• Modulation of neurons in primary somatosensory (S1) and motor (M1) cortical areas during passive observation of the avatar being touched.
• Modulation of neural responses by the observation of the avatar's movements.
• Changes in neural responses during long-term brain control of the avatar.
We describe the plasticity of the body representation by the brain resulting from visual stimuli delivered via the avatar and tactile stimuli applied to the subject's physical arm. We show how the avatar can be used for training rhesus monkeys to perform complex tasks. Behavioral evidence that rhesus monkeys respond to the avatar's shape and motions, and can even relate it to a representation of another monkey, is presented. Finally, two instances of novel brain-controlled avatars are shown: a complete closed-loop brain-machine-brain interface with sensory feedback through direct cortical stimulation, and the first successful attempt at a multi-limb BMI. We also study a simplified learning process for the BMI through the passive observation of the movements of the avatar arms.
Lausanne, EPFL, 2013. DOI: 10.5075/epfl-thesis-5671.
2012
Bimanual brain-machine interface
Brain-machine interfaces (BMIs), devices that connect brain areas to external actuators, strive to restore limb mobility and sensation to patients suffering from paralysis or limb loss. Here we report a novel BMI that controls two virtual arms simultaneously. The development of BMIs for bimanual control is important because even the most basic daily movements, such as opening a jar or buttoning a shirt, require two arms. We have designed and implemented, for the first time, a bimanual BMI in which the activity of multiple cortical areas is translated in real time into center-out reaching movements performed by two virtual arms. Eight multielectrode arrays, with a total of 768 electrode channels, were implanted in the primary motor (M1), sensory (S1), supplementary motor (SMA), dorsal premotor (PMd), and posterior parietal (PP) cortices of both hemispheres of a rhesus monkey. Movement kinematics of each arm were extracted from the same ensemble of 400 neurons using a Wiener filter and an unscented Kalman filter (UKF). Typically, a single neuron contributed to the movements of both the left and right arms. Movements were enacted by the arms of a virtual rhesus monkey avatar on a computer screen presented in first person to the monkey. On each trial, the virtual arms moved from their central locations to peripheral targets presented simultaneously on the right and left sides of the computer screen. Peri-event time histograms and linear discriminant analysis revealed a highly distributed encoding scheme, with movement directions of both limbs represented by both ipsilateral and contralateral areas. Furthermore, movements were represented by multiple cortical regions, including both primary and non-primary motor areas that have previously been identified as important for bimanual coordination. Over the course of several weeks of real-time BMI control, the monkey's performance clearly improved, both when the monkey continued to move the joystick and when the joystick was removed. These results support the feasibility of cortically driven clinical neural prosthetics for bimanual operations.
Society for Neuroscience, New Orleans, LA, October 13-17, 2012.
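The Wiener filter mentioned in the abstract above is, in essence, a lagged linear regression from binned firing rates to kinematics. A self-contained sketch on synthetic data; the lag count and dimensions are arbitrary choices, not the study's settings:

```python
import numpy as np

# Wiener-filter decoding sketch: kinematics predicted as a linear
# combination of neuronal rates over several preceding time bins (lags),
# with weights fit by least squares. All data here are synthetic.
rng = np.random.default_rng(1)
n_bins, n_neurons, n_lags = 1000, 40, 10
rates = rng.poisson(4.0, size=(n_bins, n_neurons)).astype(float)
kinematics = rng.normal(size=(n_bins, 2))     # e.g. cursor x, y

# Build the lagged design matrix: each row stacks the preceding n_lags bins.
X = np.hstack([rates[lag:n_bins - n_lags + lag] for lag in range(n_lags)])
X = np.hstack([X, np.ones((X.shape[0], 1))])  # bias term
Y = kinematics[n_lags:]

weights, *_ = np.linalg.lstsq(X, Y, rcond=None)
prediction = X @ weights                      # decoded kinematics
print(prediction.shape)
```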
Beyond the homunculus: Visual responses of primary somatosensory cortex (S1) neurons to virtual touch of a virtual
Following a brief period of brushing a monkey's arm with a real brush, synchronized with the vision of an arm avatar being brushed virtually, neurons in the primary somatosensory and motor cortices began to fire in response to the virtual brushing alone, suggesting that the cortical representation of the body can be reshaped, in a matter of minutes, to incorporate even virtual limbs.
Society for Neuroscience 42nd Annual Meeting, New Orleans, LA, October 12-17, 2012.
2011
Social interaction probed by reaching to face images: Rhesus monkeys consider a textured monkey avatar as a conspecific.
Realistic body images (avatars) have long been utilized in virtual reality applications, and they are becoming increasingly used in the neuroscience and neuroprosthetics fields. To elucidate monkeys' perception of avatars, we measured the reactions of two naive rhesus monkeys when confronted with realistic 3D monkey avatars with different facial expressions and different levels of realism, and compared them with their reactions to images of real monkeys with similar facial expressions. Monkeys were initially overtrained in a reaching task in which they manipulated a joystick to reach toward circular targets with a computer cursor. We then replaced every 15th target with a randomly selected image of either a real monkey face, an avatar face, or a sphere (i.e., a control image), and we measured the average speed to reach each of these images. We also tested two different facial expressions: an aggressive bared-teeth face and a friendly face. Showing the face images significantly altered the kinematics of the reaching movements. These results indicated that monkeys interacted with the realistic avatar as if it were a conspecific. This effect was absent if the avatar was not textured. A two-way ANOVA showed that the interaction of texture and facial expression was a significant factor for the monkeys' speed of reaching (p<0.01). At the same time, we did not find any significant effect of 3D rendering versus 2D flat rendering (p>0.8), suggesting that texture is more important than spatial realism for monkeys to consider an avatar a conspecific. These effects should be taken into account when using avatars in primate neuroprosthetic research.
Society for Neuroscience, Washington, DC, USA, November 12-16, 2011.
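The two-way ANOVA reported above (texture × facial expression on reaching speed) can be reproduced in outline with statsmodels. The data frame here is synthetic and the column names are assumptions; only the analysis structure matches the abstract:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Synthetic stand-in for the reaching-speed data; column names are assumed.
rng = np.random.default_rng(2)
df = pd.DataFrame({
    "speed": rng.normal(1.0, 0.1, size=80),
    "texture": np.repeat(["textured", "untextured"], 40),
    "expression": np.tile(np.repeat(["aggressive", "friendly"], 20), 2),
})

# Two-way ANOVA with interaction, as in the abstract's texture x expression test.
model = ols("speed ~ C(texture) * C(expression)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))  # F and p for each factor
```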
Active tactile exploration using a brain-machine-brain interface
Brain-machine interfaces use neuronal activity recorded from the brain to establish direct communication with external actuators, such as prosthetic arms. It is hoped that brain-machine interfaces can be used to restore the normal sensorimotor functions of the limbs, but so far they have lacked tactile sensation. Here we report the operation of a brain-machine-brain interface (BMBI) that both controls the exploratory reaching movements of an actuator and allows signalling of artificial tactile feedback through intracortical microstimulation (ICMS) of the primary somatosensory cortex. Monkeys performed an active exploration task in which an actuator (a computer cursor or a virtual-reality arm) was moved using a BMBI that derived motor commands from neuronal ensemble activity recorded in the primary motor cortex. ICMS feedback occurred whenever the actuator touched virtual objects. Temporal patterns of ICMS encoded the artificial tactile properties of each object. Neuronal recordings and ICMS epochs were temporally multiplexed to avoid interference. Two monkeys operated this BMBI to search for and distinguish one of three visually identical objects, using the virtual-reality arm to identify the unique artificial texture associated with each. These results suggest that clinical motor neuroprostheses might benefit from the addition of ICMS feedback to generate artificial somatic perceptions associated with mechanical, robotic or even virtual prostheses.
Nature. 2011. DOI: 10.1038/nature10489.
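The temporal multiplexing mentioned above alternates recording and stimulation windows so ICMS artifacts never overlap the spikes used for decoding. A toy scheduling sketch; the 50 ms half-cycles are an assumption for illustration, not the paper's timing:

```python
# The 50 ms half-cycles below are an assumption for illustration only.
RECORD_MS, STIM_MS = 50, 50
CYCLE_MS = RECORD_MS + STIM_MS

def window_at(t_ms):
    """Which operation owns the electrodes at time t_ms."""
    return "record" if (t_ms % CYCLE_MS) < RECORD_MS else "stimulate"

def service_time(t_ms):
    """Earliest time an ICMS request made at t_ms can be delivered."""
    cycle_start = (t_ms // CYCLE_MS) * CYCLE_MS
    stim_start = cycle_start + RECORD_MS
    return stim_start if t_ms < stim_start else t_ms  # wait out the recording

print(window_at(30), window_at(70))  # record stimulate
print(service_time(30))              # 50: held until the stim window opens
```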
2010
Integration of a virtual reality based arm in primary somatosensory cortex
Recent advances in brain-machine interfaces (BMIs) have demonstrated the possibility of motor neuroprosthetics directly controlled by brain activity. Ideally, neuroprosthetic limbs should be integrated into the body schema of the subject. To explore ways to enhance such incorporation, we recorded modulations of neuronal ensemble activity in the primary somatosensory (S1) cortex during tactile stimulation simulated in virtual reality (VR) under conditions known to evoke a rubber-hand illusion. A realistic 3D mesh represented the monkey's body in VR. The monkey's arms were hidden by an opaque plate and virtual arms were projected on the plate. A robotic brush, also hidden from the monkey, touched various locations on the monkey's forearms and was synchronized with a virtual brush touching the projected VR arms. Additionally, we implemented tactile stimulation with air puffs. We tested various combinations of tactile (physical touch), visual (VR arm being touched), and sound (robotic brush touching the arm) inputs: synchronous tactile and visual (T-VR), tactile without visual (T), and visual only (VR). Neuronal ensemble activity was recorded from S1 and primary motor cortex (M1). We found differences in both S1 and M1 activities across the stimulation types. In particular, S1 responses to T-VR were stronger than to T. Moreover, S1 neurons were modulated during visual stimulation without touch (VR), suggesting S1 activation as a neuronal mechanism of the rubber-hand illusion. Further, we decoded stimulation parameters from the activity of large neuronal populations. These results suggest a flexible and distributed representation of somatosensory information in the cortex, which can be modified by visual feedback from the body and/or artificial actuators.
Neuroscience 2010, SfN's 40th Annual Meeting, San Diego, California, USA, November 13-17, 2010.
Virtual Environment to Evaluate Multimodal Feedback Strategies for Augmented Navigation of the Visually Impaired
This paper proposes a novel experimental environment to evaluate multimodal feedback strategies for augmented navigation of the visually impaired. The environment consists of virtual obstacles and walls, an optical tracking system, and a simple device with audio and vibrotactile feedback that interacts with the virtual environment, and presents many advantages in terms of safety, flexibility, control over experimental parameters, and cost. The subject can freely move in an empty room, while the positions of the head and arm are tracked in real time. A virtual environment (walls, obstacles) is randomly generated, and audio and vibrotactile feedback are given according to the distance from the subject's arm to the virtual walls/objects. We investigate the applicability of our environment using a simple, commercially available feedback device. Experiments with unimpaired subjects show that it is possible to use the setup to "blindly" navigate in an unpredictable virtual environment. This validates the environment as a test platform to investigate navigation and exploration strategies of the visually impaired, and to evaluate novel technologies for augmented navigation.
2010. 32nd Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC'10), Buenos Aires, Argentina, September 1-4, 2010. p. 975-978.
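The distance-based feedback described above can be sketched as a simple mapping from the arm's distance to the nearest virtual obstacle onto vibration amplitude and audio click rate. The ranges and the linear law below are illustrative assumptions, not the paper's calibration:

```python
import numpy as np

# Feedback range and the linear proximity law are assumptions for
# illustration; the paper's device calibration is not reproduced here.
MAX_RANGE_M = 2.0

def feedback_levels(arm_pos, obstacle_points):
    """Return (vibration 0..1, audio click rate in Hz) for the nearest obstacle."""
    dists = np.linalg.norm(np.asarray(obstacle_points) - arm_pos, axis=1)
    d = float(dists.min())
    if d >= MAX_RANGE_M:
        return 0.0, 0.0                        # out of range: silence
    proximity = 1.0 - d / MAX_RANGE_M          # 0 far .. 1 touching
    return proximity, 2.0 + 18.0 * proximity   # e.g. 2-20 Hz click rate

print(feedback_levels(np.array([0.0, 0.0, 1.2]),
                      [[0.5, 0.0, 1.2], [2.0, 2.0, 1.2]]))
```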
2005
Toward a person-follower robot
In this article we describe our attempt to build a robot able to locate and follow a human target moving in a domestic environment. After a brief review of the state of the art in relative-location technologies, we describe our approach, which aims to develop robots relying on simple and robust relative-location technologies that do not require structuring the environment, and on simple semi-reactive strategies that require neither internal maps nor the ability to self-localize. More specifically, the approach is based on a control system able to display and integrate exploration, obstacle-avoidance, and target-following behaviors, and on a relative-location device based on a signal emitter (placed on the target person) and a directional sensor (placed on the mobile robot).
2005. RoboCare Workshop. p. 65-68.
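The integrated exploration, obstacle-avoidance, and target-following behaviors described above suggest a simple priority-based arbitration. A toy sketch with assumed thresholds and sensor values, not the paper's actual controller:

```python
# Threshold and sensor values are assumptions; the paper's control system
# is only approximated by this priority ordering.
SAFE_DISTANCE_M = 0.4

def choose_behavior(min_obstacle_m, target_bearing_deg):
    """Return the behavior for this control cycle, highest priority first."""
    if min_obstacle_m < SAFE_DISTANCE_M:
        return "avoid_obstacle"          # safety pre-empts everything
    if target_bearing_deg is not None:   # directional sensor sees the emitter
        return "follow_target"
    return "explore"                     # wander until the emitter is found

print(choose_behavior(0.3, 15.0))   # -> avoid_obstacle
print(choose_behavior(1.0, 15.0))   # -> follow_target
print(choose_behavior(1.0, None))   # -> explore
```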