Emotion Detection and Recognition based on Brain and Peripheral Physiological Signals
Emotions shape social relationships and interactions, memory, and creativity, and influence the mechanisms of rational thinking and decision making. The influence of emotion on decision making has attracted attention in computer science: by detecting and recognizing emotions automatically, machines can ease the interaction between users and multimedia content.
Automatic emotion detection and recognition can be carried out through the analysis of users' behavioural and physiological signals, such as the electroencephalogram (EEG), electrocardiogram (ECG), electrodermal activity (EDA), and respiration, among others.
These modalities have been extensively studied individually. However, since different individuals may experience the same emotion yet express it differently, the modalities are considered complementary, and the fusion of physiological and behavioural responses is expected to improve the quality of emotion recognition systems.
Nevertheless, although representative features and their multimodal integration have been studied in affective computing research for various applications, the patterns that arise from the dynamic interrelation among modalities during emotional processes have received less attention. By summarizing each physiological signal in only a small number of features, one may lose information present in the underlying dynamical co-evolution of the signals.
Considering all these issues, this thesis aims at detecting and recognizing emotion through brain and peripheral signals, targeting three complementary topics that have not been thoroughly explored: first, emotion assessment from music video clips; second, emotion assessment from odors; and third, Quality of Experience (QoE) assessment from two-dimensional (2D) and three-dimensional (3D) video content. In all cases, brain and peripheral physiological signals are used.
Regarding emotion assessment from music video clips, subject-dependent and subject-independent analyses are carried out in this thesis, and the results reveal that, although there are differences among the subjects' brain activation patterns, common patterns still exist across them. Moreover, the dynamical co-evolution between EEG and EDA is explored during emotional processes, and the results reveal that the coupling between EDA and the EEG of the temporal lobe increases when strong emotions occur, compared with neutral ones. Finally, possible clustering patterns across subject categories are investigated, and the results reveal common emotion-related characteristics across subject categories.
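The coupling between two co-evolving signals such as EEG and EDA can be illustrated with a simple spectral measure. The sketch below uses magnitude-squared coherence on synthetic data; the sampling rate, frequency band, and the coherence measure itself are illustrative assumptions, not necessarily the coupling measure used in the thesis.

```python
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(0)
fs = 128                      # Hz; a common EEG sampling rate (assumed here)
t = np.arange(0, 60, 1 / fs)  # one minute of signal

# Synthetic stand-ins for a temporal-lobe EEG channel and an EDA trace;
# the EDA trace is partially coupled to the EEG by construction.
eeg = rng.standard_normal(t.size)
eda = 0.5 * eeg + rng.standard_normal(t.size)

# Magnitude-squared coherence: values near 1 indicate strong linear coupling
# at a given frequency, values near 0 indicate independence.
f, cxy = coherence(eeg, eda, fs=fs, nperseg=fs * 4)
band = (f >= 4) & (f <= 8)          # e.g. the theta band (illustrative choice)
theta_coupling = cxy[band].mean()   # one scalar summarizing band coupling
```

Comparing such a coupling score between strong-emotion and neutral segments is one way to quantify the kind of co-evolution the analysis above refers to.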
Regarding emotion assessment from odors, since the primary response to odors is pleasantness perception, which has not yet been thoroughly investigated, this thesis explores how perceived odor pleasantness influences the brain and the periphery. In particular, two independent classifiers are trained and tested, one using EEG features and the other using ECG features. The results reveal that it is possible to assess odor pleasantness perception from EEG features and, less accurately, from ECG features, in a subject-independent framework. Also, decision fusion of the EEG and ECG classifiers is shown to discriminate odor pleasantness perception. Moreover, in order to explore the dynamical co-evolution between brain and peripheral signals, the coupling between heart rate and EEG is investigated. The results reveal a significant increase in the coupling between ECG and temporal-lobe EEG when pleasant or unpleasant odors are experienced, compared with neutral ones.
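Decision-level fusion of the two modality-specific classifiers can be sketched as follows. The feature matrices, labels, and the choice of logistic regression are purely illustrative stand-ins; the abstract does not specify the classifier or fusion rule, so probability averaging is used here as one common option.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 200
y = rng.integers(0, 2, n)  # pleasant (1) vs. unpleasant (0); labels hypothetical

# Synthetic feature matrices standing in for EEG and ECG descriptors;
# the EEG features carry a stronger class signal, mimicking the reported result.
X_eeg = rng.standard_normal((n, 8)) + y[:, None] * 0.8
X_ecg = rng.standard_normal((n, 4)) + y[:, None] * 0.3

# One independent classifier per modality
clf_eeg = LogisticRegression().fit(X_eeg[:150], y[:150])
clf_ecg = LogisticRegression().fit(X_ecg[:150], y[:150])

# Decision-level fusion: average the class probabilities of both classifiers
p = (clf_eeg.predict_proba(X_eeg[150:]) + clf_ecg.predict_proba(X_ecg[150:])) / 2
y_pred = p.argmax(axis=1)
acc = (y_pred == y[150:]).mean()
```

The key point is that each modality is modelled separately and only the classifiers' outputs are combined, in contrast to concatenating features before training a single model.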
Enhanced QoE from multimedia content aims to increase users' sensation of reality, in order to induce stronger emotions and make the user more involved in the experience.
In this thesis, QoE is investigated in terms of four aspects, namely perceived depth, perceived overall quality, content preference, and sensation of reality. In particular, it is revealed that it is possible to recognize perceived depth, content preference, and sensation of reality from EEG signals, but not from the peripheral ones. Also, fusion of peripheral and EEG features is found to improve performance in some cases. Finally, the left frontal cortex appears to be activated when the sensation of reality is high, indicating that a high sensation of reality is related to approach-related emotional processes.
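The feature-level fusion mentioned above, as opposed to the decision fusion used for odors, can be illustrated by concatenating the two feature sets before training a single model. Again, the data, dimensions, and classifier below are hypothetical placeholders for illustration only.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 200
y = rng.integers(0, 2, n)  # e.g. high vs. low sensation of reality (hypothetical)

# Synthetic EEG and peripheral feature matrices
X_eeg = rng.standard_normal((n, 8)) + y[:, None] * 0.6
X_per = rng.standard_normal((n, 4)) + y[:, None] * 0.2

# Feature-level fusion: concatenate modalities into one feature vector per trial
X = np.hstack([X_eeg, X_per])
clf = LogisticRegression().fit(X[:150], y[:150])
acc_fused = clf.score(X[150:], y[150:])

# Baseline for comparison: EEG features alone
acc_eeg = LogisticRegression().fit(X_eeg[:150], y[:150]).score(X_eeg[150:], y[150:])
```

Whether fusion helps depends on how much complementary information the weaker modality adds, which is consistent with the finding that fusion improves performance only in some cases.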
Although the three topics are explored independently in this thesis, their integration could enable immersive multimedia systems that adapt their properties to users' emotions, thus enhancing the user experience.