During my Master's thesis I studied human emotions in general and, specifically, how audiovisual content can influence the human affective state. With this knowledge we could create media content that understands emotions in real time and changes itself to modulate our watching and listening experiences.
Concretely, I worked on a biofeedback system that uses rated media clips from the IAPS database to create a "pseudonarrative": the content influences the affective state of the user, and that state in turn shapes the narrative being created.
My biggest motivation for this project was the potential future benefit that this kind of system could bring to different fields. In particular, I was interested in:
The system was developed using g.tec hardware and custom-made software: Matlab for the signal-processing part and Java for the front-end media-presentation application.
The visual stimuli used in the experiments were selected automatically by the software from the International Affective Picture System (IAPS) based on their ratings. To better understand the database, I created a custom visualization tool that displays how the media clips are rated, which makes it clear which clips get chosen during the "pseudonarrative" creation.
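The rating-based selection step can be sketched roughly as follows. This is a minimal illustration, not the actual thesis code: the clip IDs, rating values, and class names are all hypothetical, and it assumes each IAPS item carries a valence and an arousal rating (each on the 1–9 IAPS scale), with the next clip chosen as the one closest to a target affective state.

```java
import java.util.List;

// Hypothetical sketch: pick the IAPS-style clip whose (valence, arousal)
// rating lies closest to a target affective state.
public class ClipSelector {
    // One rated item: an ID plus its mean valence and arousal ratings (1..9).
    record Clip(String id, double valence, double arousal) {}

    // Return the clip minimizing squared Euclidean distance in affect space.
    static Clip select(List<Clip> clips, double targetValence, double targetArousal) {
        Clip best = null;
        double bestDist = Double.POSITIVE_INFINITY;
        for (Clip c : clips) {
            double dv = c.valence() - targetValence;
            double da = c.arousal() - targetArousal;
            double d = dv * dv + da * da;
            if (d < bestDist) { bestDist = d; best = c; }
        }
        return best;
    }

    public static void main(String[] args) {
        // Illustrative ratings only, not real IAPS values.
        List<Clip> clips = List.of(
            new Clip("2071", 7.9, 4.5),
            new Clip("6230", 2.4, 7.3),
            new Clip("7010", 4.9, 1.8));
        // Ask for a calm, pleasant next step in the "pseudonarrative".
        System.out.println(select(clips, 7.0, 3.0).id());  // prints "2071"
    }
}
```

In the real system the target state would come from the biofeedback signal processing rather than being hard-coded, and the selection would also avoid repeating clips within one session.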
MSc Thesis. CSIM Master UPF