Research

Development of the Next-Generation Myoelectric Controlled Prosthetic Arms

There are over 32 million amputees worldwide whose lives are severely impacted by limb loss. With advances in electromyography (EMG) signal processing, myoelectric control, which deciphers movement intentions from EMG signals using pattern recognition algorithms, has shown great potential for intuitive prosthesis control. However, the clinical and commercial impact of existing myoelectric controlled prosthetic arms is still limited: natural limb movements are continuous and dynamic, whereas existing myoelectric control interfaces based on standard single-channel EMG recordings can only provide discrete decisions about static motions because they capture too little meaningful neuromuscular information. Our research aims to address this challenge and develop the next-generation myoelectric controlled arms by integrating grid sensing, machine learning, and neuromorphic computing technologies.
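
As a rough illustration of the pattern recognition step, the sketch below computes a few common time-domain features over one EMG analysis window and assigns the window to a motion class. The feature set and the nearest-centroid classifier are illustrative placeholders, not the algorithms used in this project.

// Minimal illustration of window-based EMG pattern recognition.
// The features and classifier are placeholders for illustration only.
#include <cmath>
#include <cstddef>
#include <limits>
#include <vector>

// Common time-domain features computed over one analysis window of one EMG channel.
std::vector<double> extractFeatures(const std::vector<double>& window) {
    double mav = 0.0, wl = 0.0;
    int zc = 0;
    for (std::size_t i = 0; i < window.size(); ++i) {
        mav += std::fabs(window[i]);                    // mean absolute value
        if (i > 0) {
            wl += std::fabs(window[i] - window[i - 1]); // waveform length
            if (window[i] * window[i - 1] < 0.0) ++zc;  // zero crossings
        }
    }
    mav /= static_cast<double>(window.size());
    return {mav, wl, static_cast<double>(zc)};
}

// Nearest-centroid classification: each motion class is represented by the mean
// feature vector of its training windows; a new window is assigned to the class
// whose centroid is closest in Euclidean distance.
int classify(const std::vector<double>& features,
             const std::vector<std::vector<double>>& centroids) {
    int best = -1;
    double bestDist = std::numeric_limits<double>::max();
    for (std::size_t c = 0; c < centroids.size(); ++c) {
        double d = 0.0;
        for (std::size_t k = 0; k < features.size(); ++k) {
            double diff = features[k] - centroids[c][k];
            d += diff * diff;
        }
        if (d < bestDist) { bestDist = d; best = static_cast<int>(c); }
    }
    return best;  // index of the predicted motion class
}

In a real-time system this window-level decision is repeated continuously as new EMG samples stream in, which is why the discrete, static nature of such decisions becomes a limitation for continuous, dynamic limb movements.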

Project supported by the 2016 Ken Fong Translational Research Fund, an SFSU Center for Computing for Life Sciences (CCLS) Mini Grant, and a CSUPERB Faculty-Student Collaborative Research Grant

Collaborators: Dr. Kazunori Okada, Computer Science, SFSU; Dr. Hao Jiang, Electrical Engineering, SFSU

Project featured in SF State News: Professors team up to design a better prosthetic arm


 

MyoHMI: A Low-Cost and Flexible Platform for Developing Real-Time Human Machine Interface for Myoelectric Controlled Applications

Existing research platforms for developing EMG pattern recognition algorithms are typically based on MATLAB, and EMG signals are often collected with expensive, non-portable data acquisition systems. These resource requirements usually confine such platforms to laboratory environments and hinder their adoption in other fields and applications. To address this limitation, this project aims to develop a low-cost, easy-to-use, and flexible platform called MyoHMI for developing real-time human machine interfaces for myoelectric controlled applications. MyoHMI interfaces with the Myo, a commercial EMG armband that costs less than $200 and can be easily worn by the user without special preparation. MyoHMI also provides a highly modular and customizable C/C++ based software engine that seamlessly integrates a variety of interfacing and signal processing modules, from data acquisition through signal processing and pattern recognition to real-time evaluation and control.
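
The sketch below illustrates the kind of modular pipeline structure described above, in which interchangeable stages are chained from acquired EMG data to a control decision. The class and function names are hypothetical placeholders and not MyoHMI's actual API.

// Hypothetical illustration of a modular real-time processing pipeline.
// Names and interfaces are placeholders, not MyoHMI's actual API.
#include <memory>
#include <vector>

using Frame = std::vector<double>;   // one window of multi-channel EMG samples

// Each processing stage consumes the output of the previous one.
struct Stage {
    virtual ~Stage() = default;
    virtual Frame process(const Frame& input) = 0;
};

struct Filter : Stage {              // e.g. band-pass / notch filtering
    Frame process(const Frame& in) override { return in; /* filtering omitted */ }
};

struct FeatureExtractor : Stage {    // e.g. time-domain features per channel
    Frame process(const Frame& in) override { return in; /* features omitted */ }
};

struct Classifier : Stage {          // maps a feature vector to a motion class
    Frame process(const Frame& in) override { return {0.0}; /* class label */ }
};

// The engine chains the stages so modules can be swapped or reordered freely.
class Pipeline {
public:
    void add(std::unique_ptr<Stage> s) { stages_.push_back(std::move(s)); }
    Frame run(Frame data) const {
        for (const auto& s : stages_) data = s->process(data);
        return data;
    }
private:
    std::vector<std::unique_ptr<Stage>> stages_;
};

int main() {
    Pipeline p;
    p.add(std::make_unique<Filter>());
    p.add(std::make_unique<FeatureExtractor>());
    p.add(std::make_unique<Classifier>());
    Frame decision = p.run(Frame(64, 0.0));  // feed one acquired EMG window
    return decision.empty() ? 1 : 0;
}

A chain of small, single-purpose stages like this is what makes such a platform customizable: a new classifier or feature set can be evaluated in real time by swapping one module without touching the rest of the pipeline.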

Related publication:

Ian Donovan, Kevin Valenzuela, Alejandro Ortiz, Sergey Dusheyko, Hao Jiang, Kazunori Okada, and Xiaorong Zhang, “MyoHMI: A Low-Cost and Flexible Platform for Developing Real-Time Human Machine Interface for Myoelectric Controlled Applications”, accepted by the 2016 IEEE International Conference on Systems, Man, and Cybernetics (SMC 2016), to be held in Budapest, Hungary, October 2016.

A video of our real-time experiment can be viewed here.


 

Toward Anti-Stuttering: Understanding the Relation between Stuttering and Anxiety Using Wearable Monitoring, Information Fusion, and Pattern Recognition Methods

Individuals who stutter often suffer from negative emotional reactions, which can make their stuttering more severe. One way to investigate the emotional aspects of stuttering is to measure physiological changes. Since emotional reactions are strongly influenced by the environment, physiological signals collected in a controlled laboratory setting cannot fully reflect realistic conditions. This project aims to use emerging wearable devices to monitor subjects' activities and emotions in their daily life, and thus to provide realistic data for further analysis. In addition, advanced information fusion and pattern recognition methods will be used to find patterns in these data and clarify the relation between stuttering and anxiety.
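
As a minimal sketch of feature-level fusion, the example below summarizes each wearable sensor stream over a time window and concatenates the summaries into a single feature vector for pattern recognition. The sensors and features shown are assumptions made for illustration, not the methods chosen in this project.

// Illustrative feature-level fusion: features from several wearable sensor
// streams are concatenated into one vector for downstream pattern recognition.
// The sensor choices and features are assumptions, not this project's methods.
#include <vector>

std::vector<double> summarize(const std::vector<double>& samples) {
    // Toy per-sensor features: mean and range over one (non-empty) time window.
    double sum = 0.0, lo = samples.front(), hi = samples.front();
    for (double s : samples) {
        sum += s;
        if (s < lo) lo = s;
        if (s > hi) hi = s;
    }
    return {sum / samples.size(), hi - lo};
}

std::vector<double> fuse(const std::vector<std::vector<double>>& sensorWindows) {
    std::vector<double> fused;
    for (const auto& w : sensorWindows) {
        auto f = summarize(w);   // e.g. heart rate, skin conductance, motion windows
        fused.insert(fused.end(), f.begin(), f.end());
    }
    return fused;                // input to a classifier or correlation analysis
}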

Poster presentation at 2016 SFSU COSE Project Showcase

 

Using Smart Wearable Devices for Seismic Measurements and Post-Earthquake Rescue 

With the advancement of wearable technologies and the Internet of Things (IoT), this project aims to use smart wearable devices equipped with various sensors as a feasible and reliable means of capturing ground motions. Such data can better serve earthquake research, both for future earthquake prediction and for developing building codes that guide the design of safer building structures. Furthermore, a smart wearable device can act as a cyber lifeline in post-earthquake rescue by providing timely information about the wearer (e.g. location and health status), and can help relieve anxiety by relaying the wearer's real-time health status to authorized receivers.

Collaborator: Dr. Zhaoshuo Jiang, Civil Engineering, SFSU

Poster presentation at 2015 SFSU COSE Project Showcase