Institutional Repository
Technical University of Crete
Study of recording and analysis of multi-source data for the estimation of emotion indicators

Chatzianagnostou Christina

URI: http://purl.tuc.gr/dl/dias/65983581-415E-46A1-BB8C-364FE76404A9
Year: 2025
Type of Item: Diploma Work
Bibliographic Citation: Christina Chatzianagnostou, "Study of recording and analysis of multi-source data for the estimation of emotion indicators", Diploma Work, School of Electrical and Computer Engineering, Technical University of Crete, Chania, Greece, 2025. https://doi.org/10.26233/heallink.tuc.104100

Summary

Human emotions profoundly influence cognitive performance, physiological responses, and overall well-being, yet objectively measuring emotional states in real-world environments remains a significant challenge. This diploma thesis develops a multi-device system capable of simultaneously recording and analyzing physiological signals from multiple sources to detect emotional responses to different stimuli within an immersive projection environment. The experiment was conducted as a collaboration between the Display Lab (School of Electrical and Computer Engineering) and the Transformable Intelligent Environments Lab (TIE Lab, School of Architecture) at the Technical University of Crete.

During the experiment, multimodal physiological data (EEG and ECG) were collected from 33 participants as they watched two videos within an immersive audiovisual setup. The analysis focused primarily on the first video, which contained six segments alternating between stress-inducing and calming content, while the second featured four continuously peaceful blue-space scenes. EEG data were recorded with the Unicorn Hybrid Black system, and ECG measurements were captured by two wearable devices: the Movesense chest strap and the Traqbeat wristband.

The signal-processing toolchain included MATLAB for EEG bandpass filtering and preprocessing, Python with dyconnmap for dynamic brain connectivity analysis, Neural Gas clustering for quantifying signal complexity, and SciPy for ECG R-peak detection and heart rate variability (HRV) analysis. Key methodological contributions included chronnectomics analysis of brain state transitions using the weighted Phase Lag Index (wPLI), adaptive R-peak detection with Butterworth filtering, and multimodal feature fusion.
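To illustrate the wPLI connectivity measure mentioned above, the following is a minimal standalone sketch computed from Hilbert analytic signals on synthetic data. It is an illustration of the metric itself, not the thesis pipeline (which uses the dyconnmap package); the sampling rate, frequencies, and noise level are assumptions for the demo.

```python
# Minimal sketch of the weighted Phase Lag Index (wPLI) between two
# signals, estimated sample-wise from Hilbert analytic signals.
# Synthetic data only; the 250 Hz rate and 10 Hz alpha-band tone are
# illustrative assumptions, not the thesis recordings.
import numpy as np
from scipy.signal import hilbert

def wpli(x, y):
    """wPLI = |mean(Im(Sxy))| / mean(|Im(Sxy)|), where Sxy is the
    sample-wise cross-spectrum of the analytic signals."""
    sxy = hilbert(x) * np.conj(hilbert(y))
    im = np.imag(sxy)
    denom = np.mean(np.abs(im))
    return 0.0 if denom == 0 else float(np.abs(np.mean(im)) / denom)

rng = np.random.default_rng(0)
t = np.arange(0, 10, 1 / 250)                     # 10 s at 250 Hz
x = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)
y = np.sin(2 * np.pi * 10 * t + np.pi / 4) + 0.5 * rng.standard_normal(t.size)

print(wpli(x, y))   # consistently phase-lagged pair -> high wPLI
print(wpli(x, x))   # zero-lag (volume-conduction-like) case -> 0
```

The metric's appeal for chronnectomics is visible in the second call: zero-lag coupling, the hallmark of volume conduction, contributes nothing, so only genuinely lagged interactions raise the index.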
The results demonstrate reliable inference of emotional states from brain state flexibility, cognitive patterns, and cardiovascular measures: stress conditions showed significantly reduced neural adaptability and increased cardiac irregularity compared to calm states. The integrated approach enables objective classification of emotional states, establishing biomarkers that distinguish stress from relaxation in real-time monitoring systems. This research demonstrates how effectively multimodal biosignal analysis can distinguish emotional responses to different audiovisual stimuli, opening new possibilities both in clinical settings and in applications requiring deeper psychological assessment.
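The cardiac side of the pipeline (Butterworth filtering, R-peak detection, HRV) can be sketched in SciPy as below. The "ECG" is an idealized synthetic spike train, not Movesense or Traqbeat data, and the 250 Hz rate, 5-15 Hz QRS band, and threshold/refractory settings are illustrative assumptions rather than the thesis's exact parameters.

```python
# Sketch of Butterworth-filtered R-peak detection and a basic HRV
# measure (RMSSD) with SciPy, on a synthetic spike-train "ECG".
# All numeric settings are illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

fs = 250                                        # assumed sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
rr = 0.8                                        # ~75 bpm
ecg = np.zeros_like(t)
beat_idx = (np.arange(rr, 10, rr) * fs).astype(int)
ecg[beat_idx] = 1.0                             # idealized R spikes
ecg += 0.02 * np.random.default_rng(1).standard_normal(t.size)

# Zero-phase band-pass around the QRS energy band (5-15 Hz is common).
b, a = butter(2, [5, 15], btype="band", fs=fs)
filtered = filtfilt(b, a, ecg)

# Threshold adapts to the filtered amplitude; a 0.4 s refractory
# distance prevents double-counting a beat.
peaks, _ = find_peaks(filtered, height=0.3 * filtered.max(),
                      distance=int(0.4 * fs))
rr_s = np.diff(peaks) / fs                      # RR intervals (s)
rmssd = float(np.sqrt(np.mean(np.diff(rr_s * 1000.0) ** 2)))  # RMSSD (ms)

print(len(peaks), float(np.mean(rr_s)), rmssd)
```

RMSSD is a standard short-term HRV feature; on real recordings, lowered HRV under the stress segments is the kind of "increased cardiac irregularity versus calm" contrast the summary reports, with the adaptive threshold standing in for the thesis's adaptive detection step.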
