Force Myography to Control Robotic Upper Extremity Prostheses: A Feasibility Study
Advancements in assistive technology have led to the commercial availability of multi-dexterous robotic prostheses for the upper extremity. The relatively low performance of the techniques currently used to detect the user's intention to control such advanced robotic prostheses, however, limits their use. This article explores the use of force myography (FMG) as a potential alternative to the well-established surface electromyography. Specifically, the use of FMG to control different grips of a commercially available robotic hand, the Bebionic3, is investigated. Four male transradially amputated subjects participated in the study, and a protocol was developed to assess the prediction accuracy of 11 grips. Different combinations of grips were examined, ranging from 6 up to 11 grips. The results indicate that it is possible to classify six primary grips important in activities of daily living using FMG in the residual limb with an accuracy above 70%. Additional strategies to increase classification accuracy, such as using the available modes on the Bebionic3, improved results to up to 88.83% and 89.00% for the opposed thumb and non-opposed thumb modes, respectively.
The loss of a limb, regardless of the cause, has a significant negative impact on the individual. Prostheses are devices designed to mitigate this loss and have existed since the ancient Egyptian era. Today, the technology of prostheses has evolved considerably; in the case of upper extremity devices, robotic multi-dexterous hands, such as Otto Bock’s Michelangelo hand, Touch Bionics’ i-Limb, and Steeper Group’s Bebionic3 (Connolly, 2008; Medynski and Rattray, 2011; Belter et al., 2013), have been commercially available in the last decade.
However, despite recent technological advances, the overall rate of prosthesis use among upper extremity amputees remains low (Biddiss and Chau, 2007), as the state of the art is still far from effectively emulating the human hand and arm (Peerdeman et al., 2011). One of the problems is that the increased complexity has introduced the new challenge of effectively controlling these devices (Østlie et al., 2012; Yang et al., 2014).
A large component of the difficulty of controlling these devices is their unreliability; misclassification of the user's intentions frequently leads to unplanned movements (Biddiss et al., 2007). Although the conventional myoelectric control strategy involving two sEMG electrodes is sufficient for traditional myoelectric grippers with only two configurations, open and closed, the control of a more advanced terminal device requires a series of muscle co-contraction signals, similar to Morse code (Yang et al., 2014), making the user experience unintuitive and leading to human errors. One way the community has attempted to address this unintuitive control strategy is by including multiple sEMG electrodes to detect more subtle muscle activation profiles for various grips (Daley et al., 2010; Yang et al., 2014; Naik et al., 2015). Another approach has been to modify the placement configuration of the sEMG electrodes (Fang and Liu, 2014). Meanwhile, others have focused on the pattern recognition algorithm and on self-correcting systems that address sources of classification error, such as inertia and force variation (Al-Timemy et al., 2013; Amsuss et al., 2014). However, one known limitation to classification accuracy and robustness stems from the sensors themselves. sEMG is prone to signal inconsistency due to interference from ambient noise, such as transmission from fluorescent lighting and televisions, changes in electrochemical signals due to sweat or humidity, electrode shifts as a result of limb movement, and cross-talk between adjacent muscles, which may make it unsuitable for prolonged use (Cram and Kasman, 1998; Oskoei et al., 2007; Castellini et al., 2014). Other recognized challenges include the adverse effects of limb position, weight, inertia, and force variation differences during training on pattern recognition performance (Scheme et al., 2010; Cipriani et al., 2011).
Other approaches, such as targeted muscle reinnervation (TMR) (Kuiken et al., 2009), electroneurography (ENG) (Cloutier and Yang, 2013), intracortical neural interfaces (Fang et al., 2015), and electrocorticography (ECoG) (Pistohl et al., 2012), all of which allow a more direct transmission of neural signals via surgical implants, have been explored. However, due to their invasive nature and cost, alternative non-invasive approaches have been sought. Examples of less invasive methods that have been explored include sonomyography, mechanomyography, electroencephalography, near-infrared spectroscopy, magnetoencephalography, and functional magnetic resonance imaging (Silva et al., 2005; Fang et al., 2015). Although all techniques have their own benefits and limitations, the main focus of this study was to investigate the use of one particular method, termed force myography (FMG) (Wininger et al., 2008).
Force myography, also referred to as residual kinetic imaging (RKI) (Phillips and Craelius, 2005) or muscle pressure mapping (MPM) (Radmand et al., 2014), is a technique that uses force sensitive resistors (FSRs) on the surface of the limb to detect volumetric changes in the underlying musculotendinous complex. In a recent study (Ravindra and Castellini, 2014), researchers investigated the pros and cons of three types of non-invasive sensors: sEMG, ultrasound, and FMG. Within the scope of their work, Ravindra and Castellini concluded that FMG is the most promising of the three, as it has the potential to provide the highest prediction accuracy, stability over time, wearability, and affordability. In a different study, Li et al. (2012) investigated the use of FMG for classification and concluded that using FMG to decipher the user's control intention was feasible. However, to the best of our knowledge, only a small number of studies have been conducted on end-user subjects with transradial amputations.
This study compares the classification accuracy in the sound and residual limbs of four transradially amputated subjects and investigates whether the use of FMG is feasible. In addition, we demonstrated the control of a stand-alone, commercially available prosthesis, the Bebionic3, in real time using the FMG technique (for a demonstration video, see the Supplementary Material).
Materials and Methods
An experiment involving transradially amputated subjects was conducted to determine the feasibility of using FMG to classify grip patterns. Forearm muscular deformation profiles were collected from both the residual and sound limbs to compare classification accuracies for the various grips.
Hardware
To extract FMG signals, an FSR strap prototype (Xiao and Menon, 2014) developed by the MENRVA Research Group at Simon Fraser University was used; it is shown in Figure 1. The strap is 28.0 cm long and consists of eight embedded FSRs (FSR 402 from Interlink Electronics), which are evenly spaced on the strap's inner surface. The strap itself is made of flexible chloroprene elastomeric (FloTex) foam with an adjustable Velcro closure to allow a customized fit for various forearm circumferences.
FSR strap prototype developed by the MENRVA Research Group. The strap is 28.0 cm long and consists of eight embedded FSRs (FSR 402 from Interlink Electronics), which are evenly spaced on the strap's inner surface. The strap itself is made of flexible chloroprene elastomeric (FloTex) foam with an adjustable Velcro closure to allow a customized fit for various forearm circumferences.
The signals from the FSRs were then extracted via a simple voltage divider circuit. Each FSR has two terminals; one terminal is connected to a common analog input pin of an Arduino ProMini micro-controller with an internal pull-up resistor of 37.5 kΩ, and the other to a digital control pin, as schematically represented in Figure 2. The eight FSR signals were digitized sequentially using the micro-controller and transmitted via a Bluetooth module to a personal computer for data collection. The data collection software was developed in LabVIEW from National Instruments with a sampling rate of 10 Hz, as proposed by Oliver et al. (2006). The sampling rate was selected to abide by the Nyquist criterion, whereby the sampling frequency must be at least twice the highest frequency of the movement in order to avoid distortion of the measured signals. Data in our study were collected under isometric conditions, and even in the case of motion, the frequency of human hand motion is typically below 5 Hz (Oliver et al., 2006), so 10 Hz is a sufficient sampling rate for the purposes of the study.
Schematic for FMG signal extraction and transmission. There are two terminals for each FSR. One terminal is connected to a common analog input pin of an Arduino ProMini micro-controller with an internal pull-up resistor of 37.5 kΩ, and the other to a digital control pin. The eight FSR signals are digitized sequentially using the micro-controller and transmitted via a Bluetooth module to a personal computer for data collection.
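To make the read-out above concrete, the following Python sketch converts a raw ADC count from such a pull-up voltage-divider arrangement into an estimated FSR resistance. Only the 37.5 kΩ pull-up value is taken from the text; the 10-bit ADC range and 5 V supply are illustrative assumptions, not specifications reported in the study.

```python
# Minimal sketch (not the authors' code): estimating FSR resistance from one
# raw ADC sample of the pull-up voltage-divider read-out shown in Figure 2.
# Assumed: 10-bit ADC (0-1023) and a 5 V supply; the 37.5 kOhm internal
# pull-up value is the one stated in the text.

PULL_UP_OHMS = 37_500.0   # internal pull-up resistor (from the article)
ADC_MAX = 1023            # assumed 10-bit ADC resolution
V_SUPPLY = 5.0            # assumed micro-controller supply voltage

def fsr_resistance(adc_count: int) -> float:
    """Estimate FSR resistance (ohms) from a single ADC reading.

    With the pull-up tied to the supply and the FSR pulled to ground through
    the active digital control pin, the analog pin sees
        V = V_SUPPLY * R_fsr / (R_fsr + PULL_UP_OHMS),
    so R_fsr = PULL_UP_OHMS * V / (V_SUPPLY - V). Pressing on the sensor
    lowers R_fsr and therefore lowers the measured voltage.
    """
    v = adc_count / ADC_MAX * V_SUPPLY
    if v >= V_SUPPLY:            # open circuit: no measurable contact force
        return float("inf")
    return PULL_UP_OHMS * v / (V_SUPPLY - v)

# Example: a mid-scale reading corresponds to roughly the pull-up value.
print(round(fsr_resistance(512)))   # about 37.6 kOhm
```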
Protocol
In order to extract FMG data, an FSR strap was aligned to the bulk of the forearm and donned first on the sound forearm and then on the residual forearm, as seen in Figure 3. Four grips, namely the power grip, tripod grip, finger point (non-opposed), and key grip, have been identified as the most functional grips for activities of daily living by Peerdeman et al. (2011) and the most useful by Yang et al. (2014). In this study, these grips, in addition to the relaxed hand position and open palm, are considered the primary grips. Furthermore, five more grips available on the Bebionic3 were tested. In total, 11 grips were examined; they are shown in Figure 4.
FSR strap donned on the subjects’ (A) sound forearm (B) residual forearm.
The 11 grips tested for the study.
The subjects and the systems were trained by mirroring each grip in the residual limb with the sound limb, as has been done in previous experiments (Nielsen et al., 2011). The elbow was flexed at 90°, and each grip was held isometrically for a duration of 3 s. The complete set was repeated five times, with 5 min of rest between sets, following the previous work of Ravindra and Castellini (2014). The grip sequence was kept the same for every set throughout the protocol to minimize confusion for the subjects.
Subjects
Four transradially amputated male subjects were recruited for this study through Barber Prosthetics Clinic located in Vancouver, Canada. The clinical characteristics of each subject are described in Table 1. Although the sample size may appear small, it is not unusual given the difficulty of recruiting transradially amputated individuals in this field (Atzori et al., 2014a,b). The average age of the subjects was 45 years (SD ±17.2 years). All subjects provided written consent to testing after being informed of the testing procedure. The test procedure was approved by the Simon Fraser University Office of Research Ethics.
Data Collection and Analysis
The signal processing steps are described in Figure 5. Eleven grip gestures were recorded in a single trial; each grip gesture lasted 3 s (30 samples at 10 Hz), and a total of five trials were performed by each participant. The recorded FSR data were classified using the linear discriminant analysis (LDA) provided by the MATLAB software from MathWorks. LDA was chosen for this study because of its ease of application in real time and its ability to achieve similar or better classification results than other, more complex methods (Englehart and Hudgins, 2003; Scheme and Englehart, 2011; Zhang et al., 2013; Amsuss et al., 2014).
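For readers who want a concrete picture of this classification step, the sketch below shows an equivalent LDA classifier written in Python with scikit-learn; the study itself used MATLAB's LDA, and the arrays here are random placeholders shaped to match the protocol (8 FSR channels, 30 samples per 3-s grip, 11 grips).

```python
# Illustrative sketch only (the study used MATLAB's LDA): training a linear
# discriminant classifier on FMG samples with scikit-learn.

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Placeholder data with the study's dimensions:
# 11 grips x 30 samples per grip = 330 rows per trial, 8 FSR channels each.
rng = np.random.default_rng(0)
n_grips, samples_per_grip, n_channels = 11, 30, 8
X_train = rng.random((n_grips * samples_per_grip, n_channels))
y_train = np.repeat(np.arange(n_grips), samples_per_grip)

clf = LinearDiscriminantAnalysis()
clf.fit(X_train, y_train)

# Predict the grip label for a new 8-channel FSR sample (e.g., streamed at 10 Hz).
new_sample = rng.random((1, n_channels))
print(clf.predict(new_sample))
```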
First, we investigated the six primary grips. Second, we analyzed the 11 grip patterns, which were subsequently categorized into the available opposed thumb and non-opposed thumb modes, based on the control strategy of the Bebionic3. Inter-trial cross-validation was performed on the five trials, in which four trials were used for training and one trial was used for testing. This resulted in five individual accuracies, one for each trial used for testing. These accuracies were then averaged to obtain the overall accuracy across all five trials.
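A minimal sketch of this inter-trial (leave-one-trial-out) cross-validation, again with placeholder arrays rather than the study's recordings, could look as follows.

```python
# Hypothetical sketch of the inter-trial cross-validation described above:
# each of the five trials is held out once for testing while the remaining
# four trials are used for training.

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(0)
n_trials, n_grips, samples_per_grip, n_channels = 5, 11, 30, 8
rows_per_trial = n_grips * samples_per_grip

X = rng.random((n_trials * rows_per_trial, n_channels))                   # FSR samples
y = np.tile(np.repeat(np.arange(n_grips), samples_per_grip), n_trials)    # grip labels
groups = np.repeat(np.arange(n_trials), rows_per_trial)                   # trial index

scores = cross_val_score(LinearDiscriminantAnalysis(), X, y,
                         groups=groups, cv=LeaveOneGroupOut())
print(scores)          # five per-trial accuracies
print(scores.mean())   # averaged to give the subject's overall accuracy
```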
Results
An overview of the classification accuracies for all the tested conditions is presented in Table 2. This analysis was performed for the primary grips identified as necessary for activities of daily living. With the six primary grips, an accuracy of up to 73.89% was achieved, as shown in the confusion matrices in Figure 6, obtained by taking the average of the diagonal. In all subjects, the classification accuracy of the sound limb was consistently higher than that of the residual limb. In addition, when all 11 grips were included, the classification accuracy decreased regardless of the individual. Among the subjects, subjects 1 and 4 had the best results, whereas subject 3 had the worst results throughout. When the modes available on the Bebionic3 were taken into account, it was possible to increase the best classification accuracy relative to the six primary grips by approximately 18% in the opposed thumb mode for subject 2 and by nearly 25% in the non-opposed thumb mode for subject 1. The confusion matrices for the 11 grips, the opposed thumb mode, and the non-opposed thumb mode of the residual limb can be found in the Appendix.
Confusion matrix – primary grips for residual limb for subjects 1 to 4. The diagonal entries represent the classification accuracy for the different grips. The off-diagonal entries represent inaccurate classifications. (1) For example, for subject 1, open palm grip in the second column is misclassified 10.67% of the time as relaxed hand and 10% of the time as tripod grip. (2) Confusion matrix – primary grips for residual limb for subject 2. (3) Confusion matrix – primary grips for residual limb for subject 3. (4) Confusion matrix – primary grips for residual limb for subject 4.
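For clarity, each accuracy quoted above is simply the mean of the diagonal entries of the corresponding confusion matrix (expressed in percent). The short sketch below illustrates the computation with hypothetical values, not the study's results.

```python
# Averaging the diagonal of a (hypothetical) 6-grip confusion matrix given in
# percent per row; this mirrors how the per-subject accuracies were obtained.

import numpy as np

conf = np.array([
    [80.0, 10.0,  5.0,  5.0,  0.0,  0.0],
    [10.0, 70.0, 10.0,  5.0,  5.0,  0.0],
    [ 5.0, 10.0, 75.0,  5.0,  5.0,  0.0],
    [ 5.0,  5.0,  5.0, 70.0, 10.0,  5.0],
    [ 0.0,  5.0,  5.0, 10.0, 72.0,  8.0],
    [ 0.0,  0.0,  0.0,  5.0, 10.0, 85.0],
])

accuracy = np.mean(np.diag(conf))   # average of the diagonal entries
print(f"{accuracy:.2f}%")           # 75.33%
```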