A new tool for the automatic detection of muscular voluntary contractions in the analysis of electromyographic signals
Journal article, Peer reviewed
Original version: Interacting with Computers. 2015, 27, 492-499. doi:10.1093/iwc/iwv008
Electromyographic (EMG) signals play a key role in many clinical and biomedical applications. They can be used for identifying patients with muscular disabilities, assessing lower-back pain, and studying kinesiology and motor control. There are three common applications of the EMG signal: (1) to determine the activation timing of the muscle; (2) to estimate the force produced by the muscle and (3) to analyze muscular fatigue through analysis of the frequency spectrum of the signal. We have developed an EMG tool that was incorporated in an existing web-based biosignal acquisition and processing framework. This tool can be used in a post-processing environment and provides not only frequency and time parameters, but also an automatic detection of starting and ending times for muscular voluntary contractions, using a threshold-based algorithm with the inclusion of the Teager–Kaiser energy operator. The muscular voluntary contraction detection algorithm can also be applied after a real-time acquisition, in order to discard possible outliers and simultaneously compare activation times in different muscles. This tool covers all known applications and allows a careful and detailed analysis of the EMG signal for both clinicians and researchers. The detection algorithm works without user interference and is also user-independent. It detects muscular activations in an interactive process: the user simply has to select the signal's time interval as input, and the outcomes are provided afterwards.
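The abstract describes threshold-based detection of contraction onset and offset times using the Teager–Kaiser energy operator (TKEO), psi[n] = x[n]^2 - x[n-1]*x[n+1]. The paper itself does not give implementation details, so the following is only a minimal sketch of that general approach: the TKEO output is smoothed into an envelope, a threshold is derived from an assumed resting baseline at the start of the recording, and contiguous supra-threshold segments become candidate contractions. All function names, the baseline convention, and the parameter values (`k`, window lengths) are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def tkeo(x):
    """Teager-Kaiser energy operator: psi[n] = x[n]**2 - x[n-1]*x[n+1]."""
    psi = np.zeros_like(x, dtype=float)
    psi[1:-1] = x[1:-1] ** 2 - x[:-2] * x[2:]
    return psi

def detect_contractions(emg, fs, baseline_s=0.5, k=8.0, min_dur_s=0.05):
    """Sketch of TKEO + threshold onset/offset detection (assumed scheme).

    Assumes the first `baseline_s` seconds of the recording are muscle rest;
    the threshold is mean + k * std of the TKEO envelope over that baseline.
    Returns a list of (onset_time, offset_time) pairs in seconds.
    """
    psi = np.abs(tkeo(np.asarray(emg, dtype=float)))
    # Smooth with a short (50 ms) moving average to form an envelope.
    win = max(1, int(0.05 * fs))
    env = np.convolve(psi, np.ones(win) / win, mode="same")
    base = env[: int(baseline_s * fs)]
    thr = base.mean() + k * base.std()
    active = env > thr
    # Rising/falling edges of the boolean activity mask.
    edges = np.diff(active.astype(int))
    onsets = np.flatnonzero(edges == 1) + 1
    offsets = np.flatnonzero(edges == -1) + 1
    if active[0]:
        onsets = np.r_[0, onsets]
    if active[-1]:
        offsets = np.r_[offsets, active.size]
    # Discard bursts shorter than min_dur_s (likely noise spikes).
    return [(on / fs, off / fs)
            for on, off in zip(onsets, offsets)
            if (off - on) / fs >= min_dur_s]
```

On synthetic data (low-amplitude baseline noise with a higher-amplitude burst in the middle), such a detector recovers the burst boundaries to within the smoothing window; the rest-baseline assumption is the main practical caveat.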
This is the final text version of the article, and it may contain minor differences from the journal's pdf version. The original publication is available at www.oxfordjournals.org