Turkish Journal of Electrical Engineering and Computer Sciences
DOI
10.3906/elk-1606-296
Abstract
The aim of this paper is to develop a brain-computer interface (BCI) system that can control a robotic arm using EEG signals generated by facial expressions. The EEG signals are acquired using a neurosignal acquisition headset. The robotic arm consists of a 3-D printed prosthetic hand attached to a forearm and elbow made of craft wood. The arm is designed to make four moves, each controlled by one facial expression; hence, four different EEG signals are used in this work. The performance of the BCI robotic arm is evaluated by testing it on 10 subjects. Initially, 14 electrodes were used to collect the EEG signals, and the accuracy of the system was around 95%. We further analyzed the minimum number of electrodes required for the system to function properly. Seven electrodes (instead of 14), located in the parietal, temporal, and frontal regions, are sufficient, and the accuracy of the system with 7 electrodes remains around 95%.
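The abstract describes mapping four classified facial expressions to four arm moves. A minimal sketch of such a dispatch stage is shown below; the expression labels, move names, and function names are illustrative assumptions, not taken from the paper.

```python
# Hypothetical sketch of the final dispatch stage of a BCI pipeline:
# a classifier label (one of four facial expressions) is mapped to an
# arm command. All names here are assumed for illustration only.

EXPRESSION_TO_MOVE = {
    "smile": "grip_close",
    "clench": "grip_open",
    "raise_brow": "elbow_up",
    "blink": "elbow_down",
}

def dispatch_move(expression: str) -> str:
    """Map a classified facial expression to a robotic-arm command.

    Unknown labels return "hold" so the arm stays still rather than
    executing an unintended move.
    """
    return EXPRESSION_TO_MOVE.get(expression, "hold")
```

In a real system this lookup would sit downstream of EEG feature extraction and classification; keeping it as a simple table makes the expression-to-move assignment easy to reconfigure per user.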
Keywords
Brain-computer interface, electroencephalography signal, facial expressions, robotic arm
First Page
707
Last Page
720
Recommended Citation
NISAR, HUMAIRA; KHOW, HONG-WAY; and YEAP, KIM HO (2018) "Brain-computer interface: controlling a robotic arm using facial expressions," Turkish Journal of Electrical Engineering and Computer Sciences: Vol. 26: No. 2, Article 7.
https://doi.org/10.3906/elk-1606-296
Available at: https://journals.tubitak.gov.tr/elektrik/vol26/iss2/7