Institutional Repository
Technical University of Crete




Brain computer interface driving movement in a virtual reality game based on EEG signals

Ramiotis Georgios


Year 2024
Type of Item Diploma Work
Bibliographic Citation Georgios Ramiotis, "Brain computer interface driving movement in a virtual reality game based on EEG signals", Diploma Work, School of Electrical and Computer Engineering, Technical University of Crete, Chania, Greece, 2024


Brain Computer Interfaces (BCIs) are systems that aim to connect the human brain with a computer in order to control an external application. BCIs rely on external devices that record a user’s bio-signals, such as EEG signals. The Motor Imagery paradigm is a mental process in which the user simulates a given motor action, generating unique EEG signal patterns. These patterns can be analyzed and translated into commands for the external application. This emerging technology has seen expanding use in medical rehabilitation, neurofeedback, exoskeleton control and, quite recently, neurogaming.

This thesis focuses on the use of a Brain Computer Interface based on EEG signals with the Motor Imagery paradigm in a virtual reality environment for neurogaming. Existing software tools such as OpenVibe offer a straightforward methodology for developing EEG-based BCIs. However, they rely on traditional machine learning algorithms to classify the EEG signal and thus often fall short in performance, which is crucial for brain-controlled games that depend on accuracy. For this reason, we have developed a BCI that controls movement in a virtual reality maze game. The BCI relies on the OpenVibe platform to process the EEG signal and generate unique features for classification. To improve the accuracy of the BCI, we developed a Convolutional Neural Network (CNN) to replace OpenVibe’s classification system. To further enhance the performance of our CNN, we designed a Wasserstein Generative Adversarial Network (WGAN) to generate artificial EEG features for training. The classified EEG features are then translated into commands for our VR maze game. We have also developed a BCI-based system that enables brain-controlled interaction with in-game props and User Interface menus. We put our BCI to the test by comparing the performance and accuracy of our neural networks to those of OpenVibe’s classification algorithms. We also measure the performance of the system with varying numbers of available motor imagery actions. The results show that a deep-learning-based classification system improves the accuracy of the BCI compared to OpenVibe.
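The abstract describes a pipeline in which OpenVibe-produced EEG features are fed to a CNN whose softmax output selects a motor-imagery class (e.g. left vs. right hand), which is then mapped to a movement command. The following is a minimal illustrative sketch of that classification step, not the thesis's actual network: the layer shapes, the number of feature channels, and the two-class left/right mapping are all assumptions, and the weights here are random rather than trained.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d(x, kernels, bias):
    """Valid 1-D convolution: x (channels, length) -> (n_kernels, out_length)."""
    n_k, _, k_len = kernels.shape
    out_len = x.shape[1] - k_len + 1
    out = np.zeros((n_k, out_len))
    for k in range(n_k):
        for t in range(out_len):
            out[k, t] = np.sum(kernels[k] * x[:, t:t + k_len]) + bias[k]
    return out

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def classify(features, params):
    """CNN forward pass: conv -> ReLU -> global average pool -> linear -> softmax."""
    h = np.maximum(conv1d(features, params["k"], params["b"]), 0.0)
    pooled = h.mean(axis=1)                      # global average pooling
    logits = params["W"] @ pooled + params["c"]
    return softmax(logits)                       # class probabilities

# Hypothetical dimensions: 8 feature channels, 32 time steps, 2 MI classes.
params = {
    "k": rng.standard_normal((4, 8, 5)) * 0.1,   # 4 kernels of width 5
    "b": np.zeros(4),
    "W": rng.standard_normal((2, 4)) * 0.1,
    "c": np.zeros(2),
}
probs = classify(rng.standard_normal((8, 32)), params)
command = ["TURN_LEFT", "TURN_RIGHT"][int(np.argmax(probs))]  # assumed command names
```

In the actual system the probabilities would come from a trained network and the chosen class would be forwarded to the VR maze game as a movement command; the `TURN_LEFT`/`TURN_RIGHT` labels above are placeholders for whatever command set the game defines.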
