Decoding Complex Imagery Hand Gestures (1703.02929v1)
Abstract: Brain computer interfaces (BCIs) offer individuals suffering from major disabilities an alternative method to interact with their environment. Sensorimotor rhythm (SMR) based BCIs can successfully perform control tasks; however, traditional SMR paradigms decouple the control task from the real task, making them non-ideal for complex control scenarios. In this study, we design a new, intuitively connected motor imagery (MI) paradigm using hierarchical common spatial patterns (HCSP) and context information to effectively predict intended hand grasps from electroencephalogram (EEG) data. Experiments with 5 participants yielded an aggregate classification accuracy (intended grasp prediction probability) of 64.5\% for 8 different hand gestures, more than 5 times the chance level.
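The abstract names hierarchical common spatial patterns (HCSP) as the feature-extraction approach. Below is a minimal sketch of a single binary CSP node of the kind such a hierarchy could compose; the filter counts, function names, and synthetic data are illustrative assumptions, not the authors' pipeline.

```python
# Sketch of one CSP binary node (assumed structure, not the paper's code).
import numpy as np
from scipy.linalg import eigh

def csp_filters(X_a, X_b, n_pairs=3):
    """Compute CSP spatial filters from two classes of EEG trials.

    X_a, X_b: arrays of shape (n_trials, n_channels, n_samples).
    Returns a (2*n_pairs, n_channels) filter matrix.
    """
    def avg_cov(X):
        return np.mean([np.cov(trial) for trial in X], axis=0)

    C_a, C_b = avg_cov(X_a), avg_cov(X_b)
    # Generalized eigendecomposition: variance of class a relative to a+b.
    eigvals, eigvecs = eigh(C_a, C_a + C_b)
    order = np.argsort(eigvals)  # ascending
    # Keep filters from both ends of the spectrum (most discriminative).
    idx = np.concatenate([order[:n_pairs], order[-n_pairs:]])
    return eigvecs[:, idx].T

def csp_features(X, W):
    """Normalized log-variance features of spatially filtered trials."""
    Z = np.einsum('fc,tcs->tfs', W, X)  # (n_trials, n_filters, n_samples)
    var = Z.var(axis=2)
    return np.log(var / var.sum(axis=1, keepdims=True))

# Illustrative use with synthetic data (16 channels, 1 s at 256 Hz).
rng = np.random.default_rng(0)
X_a = rng.standard_normal((40, 16, 256))
X_b = rng.standard_normal((40, 16, 256)) * 1.2  # different variance profile
W = csp_filters(X_a, X_b)
feats = csp_features(np.concatenate([X_a, X_b]), W)
print(feats.shape)  # (80, 6) features, passed to a classifier at this node
```

In a hierarchical scheme, each such node would separate one subset of gestures from another (e.g. grasp type before wrist orientation), with context information used to weight or prune branches; the exact tree structure is described in the paper, not in this sketch.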