Recently, we presented this project at Maker Faire in San Mateo, CA. I'm part of a team at Galileo University in Guatemala City that is developing a low-cost prosthetic hand, and I'm in charge of the myoelectric controller for the bionic version. The main idea is to take myoelectric signals directly from the arm using surface-mounted electrodes, extract features from them with digital signal processing algorithms, feed those features into a machine-learning neural network that recognizes the user's intention, and finally have the bionic hand perform that movement. At this time the system is trained only on open and close actions.

 

 

Goal

 

We are building a low-cost, open-source/open-hardware prosthetic hand in both a body-powered mechanical version and a bionic version. The mechanical version should cost under $100 and the bionic version under $500. At the moment, a Guatemalan ex-soldier is using the mechanical version of the prosthetic, getting used to it and giving us great feedback; after about 2~3 months we plan to upgrade his mechanical version to the bionic version. If all goes well, we plan to release it as a DIY kit to the general public.

 

Abel2.JPG

 

System Architecture and Components

System.JPG

  • Olimex EKG-EMG Passive Surface Electrodes (we're searching for higher-quality electrodes to get better results)
  • Olimex EKG-EMG Shield (we're working on instrumentation amplifiers with multiple channels to acquire multiple signals)
  • Texas Instruments Tiva C LaunchPad (ARM Cortex-M4)
  • Graphic LCD 84x48 - Nokia 5110 (for user feedback)
  • InMoov Hand (to test the prototype)
  • Texas Instruments Tiva C Peripheral Library
  • ARM CMSIS DSP Library
  • Keil μVision 4

system_arm.jpg comp_arm2.jpg

Signal Collection

 

After the EMG signal is amplified by the instrumentation amplifiers, the ADC samples it at a rate of about 500 Hz. However, the onset of a muscle contraction is difficult to detect directly. Hence, the average IEMG over a 24 ms raw window (12 samples at 500 Hz) is calculated to determine whether the muscle is contracting, independent of unstable noise. When the average IEMG exceeds a pre-defined threshold, 512 samples of EMG data are collected for feature extraction. To compute the IEMG, I used the ARM CMSIS-DSP Library in the Keil μVision IDE.

IEMG.JPG
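For context, here is a minimal sketch of one plausible TivaWare configuration of ADC0 sample sequencer 3 (the sequencer used in the code below); the processor trigger and the AIN0/PE3 input pin are assumptions, not necessarily our exact setup:

     /* Assumed setup: ADC0, sample sequencer 3 (one step), processor-   */
     /* triggered; the AIN0/PE3 pin choice is an assumption.             */
     SysCtlPeripheralEnable(SYSCTL_PERIPH_ADC0);
     SysCtlPeripheralEnable(SYSCTL_PERIPH_GPIOE);
     GPIOPinTypeADC(GPIO_PORTE_BASE, GPIO_PIN_3);          /* AIN0 */

     ADCSequenceConfigure(ADC0_BASE, 3, ADC_TRIGGER_PROCESSOR, 0);
     ADCSequenceStepConfigure(ADC0_BASE, 3, 0,
                              ADC_CTL_CH0 | ADC_CTL_IE | ADC_CTL_END);
     ADCSequenceEnable(ADC0_BASE, 3);
     ADCIntClear(ADC0_BASE, 3);

     /* On each 2 ms tick (500 Hz): trigger and wait for one conversion. */
     ADCProcessorTrigger(ADC0_BASE, 3);
     while (!ADCIntStatus(ADC0_BASE, 3, false)) { }
     ADCIntClear(ADC0_BASE, 3);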

The Code:

     /* Read the latest conversion from ADC0, sample sequencer 3.        */
     /* ADCSequenceDataGet() fills a buffer, so ulADC0Value is an array  */
     /* and the single conversion is read from its first element.        */
     ADCSequenceDataGet(ADC0_BASE, 3, ulADC0Value);

     MuscleSensor = (float32_t)ulADC0Value[0] - dc_offset;
     MuscleSensorVoltage = (MuscleSensor * 3.3f) / 4095.0f;   /* 12-bit ADC, 3.3 V */

     /* Shift the 12-sample sliding window (24 ms at 500 Hz) and insert  */
     /* the newest sample at the front.                                  */
     for (int j = 11; j > 0; j--) samples24[j] = samples24[j-1];
     samples24[0] = MuscleSensorVoltage;

     /* IEMG estimate: mean of the rectified window. */
     output24F32 = &samples24[0];
     arm_abs_f32(output24F32, abs_output24F32, 12);
     arm_mean_f32(abs_output24F32, 12, &mean);
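The threshold test itself isn't shown above; a minimal sketch of the onset check follows, where ONSET_THRESHOLD and collect_samples() are hypothetical names (the actual threshold is tuned per user and electrode placement):

     /* Hypothetical onset detector: start buffering 512 samples for    */
     /* feature extraction once the 24 ms IEMG average crosses the      */
     /* pre-defined threshold. ONSET_THRESHOLD is an assumed value.     */
     #define ONSET_THRESHOLD 0.05f

     if (mean > ONSET_THRESHOLD) {
          collect_samples(samples, 512);   /* hypothetical helper: fill  */
                                           /* the buffer with the next   */
                                           /* 512 ADC samples            */
     }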


Signal Processing

 

After collecting the 512 points of EMG data, the first step is to filter the signal (this was not necessary for the Olimex EKG-EMG Shield, because the shield has analog filters). The bandwidth of the EMG signal selected for the prototype is about 20-250 Hz, so I designed an FIR lowpass filter with a cutoff frequency of about 250 Hz in MATLAB by MathWorks and implemented it with the ARM CMSIS-DSP Library.

Filter.JPG

CMSIS-DSP: FIR Lowpass Filter Example
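For reference, here is a minimal sketch of how the CMSIS-DSP FIR functions are typically wired up. The tap count, block size, and zeroed coefficient array below are placeholders, not our actual MATLAB-generated design:

     #include "arm_math.h"

     #define NUM_TAPS   29                  /* placeholder filter order   */
     #define BLOCK_SIZE 32                  /* samples processed per call */

     /* Replace the zeros with the MATLAB-generated lowpass taps. */
     static const float32_t firCoeffs32[NUM_TAPS] = { 0 };
     static float32_t firStateF32[BLOCK_SIZE + NUM_TAPS - 1];
     static float32_t filtered[TEST_LENGTH_SAMPLES];   /* 512 samples */

     arm_fir_instance_f32 S;
     arm_fir_init_f32(&S, NUM_TAPS, (float32_t *)&firCoeffs32[0], &firStateF32[0], BLOCK_SIZE);

     /* Filter the 512-sample window block by block. */
     for (uint32_t i = 0; i < TEST_LENGTH_SAMPLES / BLOCK_SIZE; i++) {
          arm_fir_f32(&S, &samples[i * BLOCK_SIZE], &filtered[i * BLOCK_SIZE], BLOCK_SIZE);
     }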

 

Feature Extraction

 

The features are extracted from the filtered EMG signal. Six features are used to represent the myoelectric signal patterns. They are given below:

  • Integral of the EMG (IEMG): This is an estimate of the summation of the absolute value of the EMG signal.

     IEMG = Σ |x_k|,  k = 1 … N   (x_k is the k-th sample, N = 512)

  • Waveform Length (WL): This is the cumulative variation of the EMG signal, which indicates its degree of variation.

     WL = Σ |x_{k+1} − x_k|,  k = 1 … N−1

  • Variance: This is a measure of the power density of the EMG signal.

     VAR = (1 / (N − 1)) Σ (x_k − x̄)²,  k = 1 … N

  • Zero Crossings (ZC): This parameter counts the number of times that the signal crosses zero.  A threshold needs to be introduced to reduce the noise induced at zero crossings.

     ZC increases by one when x_k · x_{k+1} < 0 and |x_k − x_{k+1}| ≥ threshold

  • Slope Sign Changes (SSC): This parameter counts the number of times the slope of the signal changes sign. Similarly, a threshold is needed to reduce noise-induced slope sign changes. Given three consecutive samples, the slope sign change count increases if

     (x_k − x_{k−1}) · (x_k − x_{k+1}) ≥ threshold

  • Willison Amplitude (WAMP): This is the number of times the change in EMG signal amplitude between consecutive samples exceeds a pre-defined threshold. It can indicate the muscle contraction level.

     WAMP = Σ f(|x_k − x_{k+1}|),  where f(x) = 1 if x > threshold, 0 otherwise


The Code:

     /* Shifted copies of the 512-sample window (TEST_LENGTH_SAMPLES = 512). */
     outputF32 = &samples[0];

     /* samples_minus_one[k] = x[k+1] (shifted left, zero-padded at the end). */
     for (int k = 0; k < TEST_LENGTH_SAMPLES-1; k++) samples_minus_one[k] = samples[k+1];
     samples_minus_one[511] = 0.0f;

     /* samples_plus_one[k] = x[k-1] (shifted right, zero-padded at the front). */
     for (int j = TEST_LENGTH_SAMPLES-1; j > 0; j--) samples_plus_one[j] = samples[j-1];
     samples_plus_one[0] = 0.0f;

     /* IEMG: sum of absolute values (dot product of |x| with a ones vector). */
     arm_abs_f32(outputF32, abs_outputF32, TEST_LENGTH_SAMPLES);
     arm_dot_prod_f32(abs_outputF32, ones, TEST_LENGTH_SAMPLES, &iemg);

     /* Waveform length: sum of |x[k] - x[k+1]|. */
     arm_sub_f32(samples, samples_minus_one, difm_outputF32, TEST_LENGTH_SAMPLES);
     arm_abs_f32(difm_outputF32, abs_difm_outputF32, TEST_LENGTH_SAMPLES);
     arm_dot_prod_f32(abs_difm_outputF32, ones, TEST_LENGTH_SAMPLES, &wl);

     /* Variance of the window. */
     arm_var_f32(outputF32, TEST_LENGTH_SAMPLES, &var);

     /* WAMP: count adjacent-sample differences that reach the 0.2 threshold. */
     arm_sub_f32(samples, samples_plus_one, difp_outputF32, TEST_LENGTH_SAMPLES);
     arm_abs_f32(difp_outputF32, abs_difp_outputF32, TEST_LENGTH_SAMPLES);
     for (int m = 0; m < TEST_LENGTH_SAMPLES; m++) if (abs_difp_outputF32[m] >= 0.2f) wamp++;

     /* ZC: count sign changes whose amplitude step exceeds the 0.15 threshold. */
     for (int n = 0; n < TEST_LENGTH_SAMPLES-1; n++) {
          sgn = ((-samples[n]*samples[n+1]) > 0) ? 1 : 0;
          if ((sgn > 0) && (abs_difp_outputF32[n] > 0.15f)) zc++;
     }

     /* SSC: count slope sign changes above the 0.01 threshold; the loop  */
     /* skips the zero-padded endpoints to avoid spurious counts.         */
     for (int n = 1; n < TEST_LENGTH_SAMPLES-1; n++)
          if (((samples[n]-samples_minus_one[n]) * (samples[n]-samples_plus_one[n])) > 0.01f) ssc++;

 

 

Machine Learning – Artificial Neural Network Classifier

 

The Artificial Intelligence (AI) resides inside the microcontroller (an ARM Cortex-M4 on the Texas Instruments Tiva C LaunchPad) and uses the extracted features to classify the user's intent in real time and execute the action. For these first versions we wanted a simple approach to a simple problem, so we started by classifying between open and close actions using a three-layer feed-forward neural network with six inputs (the features), six nodes in the hidden layer, and one node in the output layer (open = 1, close = 0). The network is trained offline on a different system using supervised learning and backpropagation; once the weights are calculated, we load the weight matrix into the neural network on the microcontroller so it can classify in real time.

 

Currently we are working on recognizing more gestures with a larger ANN, adding online learning, and running everything inside the microcontroller so that no external systems are needed.

 

The Code:

 

     /* Logistic activation (expf() comes from <math.h>). */
     float32_t sigmoid(float32_t num) {
          return 1.0f / (1.0f + expf(-num));
     }

     /* Artificial neural network: feed-forward pass with 6 inputs,      */
     /* 6 hidden nodes and 1 output, all with sigmoid activations.       */
     /* thetha[0][j] is the bias weight of node j; thetha[k][j] weights  */
     /* input (or hidden activation) k. The last three inputs are the    */
     /* integer-valued count features, promoted to float.                */
     float32_t ann(float32_t i1, float32_t i2, float32_t i3, int32_t i4, int32_t i5, int32_t i6) {
          float32_t in[7] = { 1.0f, i1, i2, i3, (float32_t)i4, (float32_t)i5, (float32_t)i6 };
          float32_t a[7];
          float32_t net, net7;

          a[0] = 1.0f;                                   /* bias for the output layer */
          for (int j = 0; j < 6; j++) {                  /* hidden layer */
               net = 0.0f;
               for (int k = 0; k < 7; k++) net += in[k] * thetha[k][j];
               a[j+1] = sigmoid(net);
          }

          net7 = 0.0f;                                   /* output layer */
          for (int k = 0; k < 7; k++) net7 += a[k] * thetha[k][6];
          return sigmoid(net7);                          /* ~1 = open, ~0 = close */
     }
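
A sketch of how the classifier might be invoked after feature extraction; the feature ordering, the 0.5 decision threshold, and the open_hand()/close_hand() helpers are assumptions for illustration:

     /* Classify the current feature vector and drive the hand.          */
     /* open_hand()/close_hand() are hypothetical actuation helpers, and */
     /* 0.5f is an assumed decision threshold.                           */
     float32_t intent = ann(iemg, wl, var, zc, ssc, wamp);

     if (intent > 0.5f)
          open_hand();      /* output near 1 = open  */
     else
          close_hand();     /* output near 0 = close */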


Future Work


The work described in this blog is still an early prototype. We are currently improving the sensors, the EMG signal amplification, and user feedback/interaction via a touchscreen, and extending the machine learning to recognize more gestures and perform online learning entirely inside the microcontroller.

 

Thanks!

maker_faire.jpg maker.jpg

more info: turing.galileo.edu


Contact Info

julioefajardo@gmail.com

alilemus@gmail.com