The integration of the different senses is one of the most important foundations of intelligent behavior. The better different modalities are integrated, the more successful linguistic and academic development becomes and the more differentiated self-regulation can be. Most situations, for example understanding the emotions of others, require a combination of all our senses. Conversely, when emotion recognition is impaired, the impairment affects all sensory modalities involved. Emotion processing is therefore clearly anchored in cross-modal processing. Fast visual associations and slower auditory and linguistic associations can produce either a consistent percept or divergent percepts, for example irony when a reproach is uttered with a friendly facial expression. It remains unclear, however, how consistent and divergent cross-modal associations are formed and what role the degree of coupling plays for an individual. Yet in order to support learning and self-regulation in a meaningful way, we need to understand how these connections work.

The project therefore investigates the role of multisensory integration in the perception and understanding of emotions. In a pilot study, electroencephalography (EEG) will be used to record event-related potentials (ERPs) and oscillatory brain activity during the integration of auditory and visual stimuli in emotion recognition. The focus is on frequency analysis, which should indicate which amplitude and duration of alpha synchronization, together with which reset rhythms, best support the integration of visual and auditory stimuli.
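To illustrate what quantifying the amplitude and duration of alpha synchronization can look like, the following is a minimal sketch, not the project's actual analysis pipeline: it simulates a single-channel EEG epoch around stimulus onset, extracts the alpha-band (8-12 Hz) envelope with a complex Morlet wavelet, and reads off peak relative power and the time spent above a baseline-referenced threshold. The sampling rate, burst timing, and threshold are illustrative assumptions.

```python
import numpy as np

fs = 500                            # sampling rate in Hz (assumed)
t = np.arange(-0.5, 1.5, 1 / fs)    # epoch from -0.5 s to +1.5 s around stimulus onset

# Simulated epoch: background noise plus a 10 Hz alpha burst 0.2-0.7 s post-stimulus
rng = np.random.default_rng(0)
eeg = 5 * rng.standard_normal(t.size)
burst = (t > 0.2) & (t < 0.7)
eeg[burst] += 10 * np.sin(2 * np.pi * 10 * t[burst])

# Complex Morlet wavelet centred on 10 Hz to extract the alpha-band envelope
f0, n_cycles = 10.0, 7
sigma = n_cycles / (2 * np.pi * f0)
wt = np.arange(-4 * sigma, 4 * sigma, 1 / fs)
wavelet = np.exp(2j * np.pi * f0 * wt) * np.exp(-wt**2 / (2 * sigma**2))
wavelet /= np.sqrt(np.sum(np.abs(wavelet) ** 2))  # unit-energy normalisation

# Alpha power over time, normalised to the pre-stimulus baseline
analytic = np.convolve(eeg, wavelet, mode="same")
alpha_power = np.abs(analytic) ** 2
rel_power = alpha_power / alpha_power[t < 0].mean()

# Amplitude = peak relative power after onset; duration = time above a 2x-baseline threshold (assumed)
post = t >= 0
peak_amplitude = rel_power[post].max()
duration_s = np.sum(rel_power[post] > 2.0) / fs
print(f"peak alpha power (relative to baseline): {peak_amplitude:.1f}")
print(f"alpha synchronisation duration: {duration_s:.2f} s")
```

In the planned ERP and time-frequency analyses, the same quantities would of course be derived from recorded multichannel data and averaged across trials rather than from a simulated signal.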