Running head: ROBOTIC PROSTHESIS 1

Robotic Prosthesis and the Integration of BMIs and BCIs

Finn Haney

Mrs. Bollinger

Honors Human Anatomy and Physiology I

Exeter Township Senior High School


Abstract

Robotic prosthetics are pushing the boundaries of medical science. Recent advances in the technology have allowed signals in the brain to be decoded and interpreted by a BCI, or brain-computer interface. The BCI sends this information to a robot or robotic prosthetic, which then carries out the action. Combined with the benefits of motor imagery and vision-guided robotic assistance, this makes for extremely beneficial technology.


Robotic Prosthesis and the Integration of BMIs and BCIs

Robotic prosthetics are on the cutting edge of biomedical engineering. Today, BMIs (brain-machine interfaces) and BCIs (brain-computer interfaces) are being designed to decode and interpret the signals in the brain that control movement. These interfaces then send processed signals to a robotic prosthetic, which can perform intricate movements and adapt to different signals just as a real limb would. If perfected, this allows for maximum usability in the real world. Picking up and interpreting the signals the brain sends to a missing limb, and having a robotic prosthetic mimic the intended movement, is the ideal. Robotic prosthetics would greatly improve the quality of life for the physically disabled who may otherwise have no way to interact with their surroundings.

Recent progress in neural interfaces, or brain-computer interfaces, has shown significant results that could potentially replace or fully rehabilitate sensorimotor function in parts of the body (Athanasiou et al., 2017). Conditions that affect sensorimotor function harm not only an individual's physical well-being but their psychological well-being as well. Spinal injuries sever the neural pathways that carry sensorimotor signals while leaving the brain itself unchanged. Brain-computer interfaces bridge these disconnects by interpreting signals entering and leaving the brain, enabling the individual to control a robotic prosthesis or wheelchair (Athanasiou et al., 2017).

Noninvasive BCIs offer far greater potential in safety and cost effectiveness than their invasive counterparts, but lag in how well the computer can communicate with the brain (Athanasiou et al., 2017). This had been the case for many years, but with recent developments, noninvasive BCIs can now perform many of the same tasks as invasive ones. This is fundamental to the success of robotic prosthesis. In the past, to pick up any signals from the brain, a hole had to be drilled into the skull. Receptors were then placed along the surface of the brain, and the hole was filled with the BCI, a long cylindrical device half submerged in the brain with the other half protruding from the skull. New developments, however, allow a noninvasive BCI to detect motor imagery without costly surgery (Xu et al., 2014).

A team led by Alkinoos Athanasiou developed an arm platform, the Mercury robotic arm system, that can be manipulated at eight points and is controlled by a BMI worn on the arm. The BMI contains an exoskeleton position-sensing harness, which allows for a realistic reproduction of movement (Moustakas, Athanasiou, Kartsidis, Bamidis, & Astaras, 2013).

Two experiments were conducted. In the first, participants sat between two robotic arms controlled by a BCI, with a large screen in front of them displaying videos of arms moving; this trained participants to use visual and kinesthetic cues to control simple motor tasks with the arms (Arfaras, Athanasiou, Pandria, et al.). In the second experiment, after the BCI system had trained on the participants' brain waves, participants were asked to follow movement prompts that included every possible movement combination. Each prompt lasted thirty seconds, and the patient's overall success was recorded (Athanasiou et al., 2017). This experiment demonstrates that real-world application is possible. Subjects, selected at random, were fitted with a noninvasive BCI and BMI that trained themselves to the wearer's brain waves with no outside programming. The wearers were able to gain complete control over the arm in less than an hour.
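The idea of a BCI "training itself to the wearer's brain waves" can be illustrated with a minimal sketch. The numbers and the single band-power feature below are synthetic stand-ins, not data from the study; the sketch only shows the general principle that calibration can amount to learning a per-user decision threshold (motor imagery suppresses the mu rhythm over motor cortex, lowering its band power).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical calibration trials: mu-band (8-12 Hz) EEG power over motor
# cortex. Imagined movement suppresses mu power, so "move" trials sit lower
# than "rest" trials. These distributions are illustrative, not recorded data.
rest = rng.normal(loc=10.0, scale=1.0, size=40)  # 40 resting trials
move = rng.normal(loc=6.0, scale=1.0, size=40)   # 40 motor-imagery trials

# Per-user "training": learn the midpoint between the two class means.
threshold = (rest.mean() + move.mean()) / 2.0

def classify(band_power):
    """Label a trial 'move' if mu power is suppressed below the threshold."""
    return "move" if band_power < threshold else "rest"

# Evaluate on fresh synthetic trials drawn from the same distributions.
test_rest = rng.normal(10.0, 1.0, 20)
test_move = rng.normal(6.0, 1.0, 20)
correct = sum(classify(x) == "rest" for x in test_rest) + \
          sum(classify(x) == "move" for x in test_move)
accuracy = correct / 40
```

Real systems use many channels and richer classifiers, but the self-calibration loop follows this same shape: record labeled trials, fit a decision rule to that individual's signals, then decode new trials online.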


In a separate study, two patients with tetraplegia, paralysis of all four limbs, controlled a robotic arm that uses a shared control system. Shared control combines automation and computer assistance with manual input from the person in control (Downey et al., 2016). The system can track objects with vision-guided robotic assistance and predetermine the best way of grasping an object, but control still remains in the patient's hands through the BMI (Downey et al., 2016).

With constant advancements in artificial intelligence, BCIs are also integrating semi-autonomous decision making, or "shared control," to help guide the wearer (Nageotte, Zanne, Doignon, & De Mathelin, 2009). The integration of shared control and vision-guided robotic assistance has made movements more precise, more accurate, and less difficult than ever before, all while the patient remains in near complete control (Downey et al., 2016). Before the BMI and the vision-guided assistance program are certain of the patient's intentions, the automation holds nearly no control; as the system gains certainty about what the patient is trying to accomplish, its automation kicks in, allowing for extremely detailed movements (Downey et al., 2016).

The purpose of shared control is to make tasks easier and to widen the range of completable tasks. The system also alleviates user frustration by correcting human error through the vision-guided robotic assistance (Downey et al., 2016). When the vision algorithm is sufficiently certain that the patient is reaching for a particular object, the system takes control of fine adjustments such as the trajectory of the reach and the strength of the grip required. Although the shared control system can take control, it can be overridden by the patient at any time based on signals picked up by the BMI (Downey et al., 2016).
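The certainty-based handoff described above can be sketched as a weighting between two command streams. The simple linear blend below is an illustrative choice, not the arbitration rule actually used by Downey et al.; the command vectors and confidence values are hypothetical.

```python
import numpy as np

def blend_command(user_cmd, auto_cmd, confidence):
    """Blend the user's decoded BMI command with the vision system's
    autonomous command. While confidence in the predicted goal is low,
    the user dominates; as confidence rises, the automation takes over
    the fine adjustments. Confidence is clipped to the range 0..1."""
    w = float(np.clip(confidence, 0.0, 1.0))
    return (1.0 - w) * np.asarray(user_cmd) + w * np.asarray(auto_cmd)

user = [0.9, 0.1, 0.0]  # decoded hand-velocity intent (arbitrary units)
auto = [0.5, 0.4, 0.1]  # vision-guided trajectory toward the likely object

low = blend_command(user, auto, 0.1)   # low certainty: mostly the user
high = blend_command(user, auto, 0.9)  # high certainty: mostly the automation
```

Because the user's signal is never weighted to zero here, a strong BMI command can still pull the arm away from the automated trajectory, matching the override behavior described above.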

The human hand is extremely intricate, and the movements required to complete even simple tasks are quite complex. The combination of these small movements into one overall movement is called a synergy (Santello et al., 2016). But the hand does more than complete complex movement tasks; it is connected to the central nervous system, allowing multiple senses to be relayed through the hand. The final hurdle in creating a robotic prosthetic hand is mimicking the synergies of a real human hand (Santello et al., 2016).

The human hand has two very different ways of grasping objects: power grips, for lifting heavy objects or gripping a baseball bat, and precision grips, for holding a pencil or turning a key. Power grips use the palm of the hand, while precision grips use the fingers (Santello et al., 2016). Recreating these grips for particular objects is easy, but for robotic hand synergy to succeed in the real world, the hand needs to be able to grasp any object. That means it must pre-position itself for the optimal grip and be able to tell how much squeezing pressure to apply (Santello et al., 2016).
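The synergy idea, that most grasps are weighted combinations of a few basic coordination patterns, can be sketched in a few lines. The two synergy vectors and the five-value posture representation below are illustrative guesses for demonstration, not parameters from Santello et al.

```python
import numpy as np

# Hypothetical hand posture: one flexion value per finger
# (thumb, index, middle, ring, little); 0 = fully open, 1 = fully flexed.
synergy_close = np.array([1.0, 1.0, 1.0, 1.0, 1.0])  # whole-hand close
synergy_pinch = np.array([1.0, 1.0, 0.0, 0.0, 0.0])  # thumb-index opposition

def posture(w_close, w_pinch):
    """Pre-shape the hand as a weighted sum of synergies, clipped to the
    joints' 0..1 flexion range. Two weights stand in for the many degrees
    of freedom a robotic hand would otherwise need to control directly."""
    return np.clip(w_close * synergy_close + w_pinch * synergy_pinch, 0.0, 1.0)

power_grip = posture(0.9, 0.0)      # palm-dominated, e.g. a baseball bat
precision_grip = posture(0.1, 0.8)  # finger-dominated, e.g. holding a pencil
```

The appeal for prosthetics is dimensionality reduction: instead of decoding a separate brain signal for every joint, the BMI only needs to supply a few synergy weights.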

As an infant, the body develops the palmar grasp reflex: when the palm is stimulated, all the fingers move together, grasping the object. In later years, more control over individual fingers develops. This suggests that hand synergy evolves from one basic movement in infancy to the thousands of complex movements adults can complete (Santello et al., 2016). This evolution is key to understanding human hand synergy. Once human hand synergy is completely understood, robotic reproduction should follow shortly after.

Robotic prosthesis is an extremely broad topic, but the structure most often involved is the hand. The hand embodies a structure-function relationship: the whole hand is designed around function. The intricate synergies that allow the hand to create complex movement, and the fingers whose multiple articulating joints exist solely to serve function, both illustrate this relationship.

Robotic prosthetics are on the cutting edge of biomedical engineering. BMIs and BCIs are rapidly advancing to perform more intricate actions with less and less invasive procedures. They adapt to different signals just as a real limb would, making them extremely easy to use. Robotic prosthetics would greatly improve the quality of life for the physically disabled who may otherwise have no way to interact with their surroundings.

References

Athanasiou, A., Xygonakis, I., Pandria, N., Kartsidis, P., Arfaras, G., Kavazidi, K. R., Foroglou, N., Astaras, A., … Bamidis, P. D. (2017). Towards rehabilitation robotics: Off-the-shelf BCI control of anthropomorphic robotic arms. BioMed Research International, 2017, 5708937. https://doi.org/10.1155/2017/5708937

Downey, J. E., Weiss, J. M., Muelling, K., Venkatraman, A., Valois, J. S., Hebert, M., Bagnell, J. A., Schwartz, A. B., … Collinger, J. L. (2016). Blending of brain-machine interface and vision-guided autonomous robotics improves neuroprosthetic arm performance during grasping. Journal of NeuroEngineering and Rehabilitation, 13, 28. https://doi.org/10.1186/s12984-016-0134-9

Santello, M., Bianchi, M., Gabiccini, M., Ricciardi, E., Salvietti, G., Prattichizzo, D., Ernst, M., Moscatelli, A., Jörntell, H., Kappers, A. M., Kyriakopoulos, K., Albu-Schäffer, A., Castellini, C., … Bicchi, A. (2016). Hand synergies: Integration of robotics and neuroscience for understanding the control of biological and artificial hands. Physics of Life Reviews, 17, 1-23. https://doi.org/10.1016/j.plrev.2016.02.001
