Several regions in human temporal and frontal cortex are known to integrate visual and auditory object features. The processing of audio–visual (AV) associations in these regions has been found to be modulated by object familiarity. The aim of the present study was to explore training-induced plasticity in human cortical AV integration. We used functional magnetic resonance imaging to analyze the neural correlates of AV integration for unfamiliar artificial object sounds and images in naïve subjects (PRE-training) and after a behavioral training session in which subjects acquired associations between some of these sounds and images (POST-training). In the PRE-training session, unfamiliar artificial object sounds and images were mainly integrated in right inferior frontal cortex (IFC). The POST-training results showed extended integration-related IFC activations bilaterally and recruitment of additional regions in bilateral superior temporal gyrus/sulcus and intraparietal sulcus. Furthermore, training-induced differential response patterns to mismatching compared with matching (i.e., associated) artificial AV stimuli were most pronounced in left IFC. These effects were accompanied by complementary training-induced congruency effects in right posterior middle temporal gyrus and fusiform gyrus. Together, these findings demonstrate that short-term cross-modal association learning was sufficient to induce plastic changes in both AV integration of object stimuli and mechanisms of AV congruency processing.
Motives motivate human behavior. Most behaviors are driven by more than one motive, yet it is unclear how different motives interact and how such motive combinations affect the neural computation of the behaviors they drive. To answer this question, we induced two prosocial motives simultaneously (multi-motive condition) and separately (single-motive conditions). After the different motive inductions, participants performed the same choice task, in which they allocated points either in favor of another person (prosocial choice) or in favor of themselves (egoistic choice). We used fMRI to assess prosocial choice-related brain responses and drift-diffusion modeling to specify how motive combinations affect individual components of the choice process. Our results showed that combining the two motives in the multi-motive condition increased participants’ choice bias toward the prosocial option prior to the decision process itself. On the neural level, these changes in initial prosocial bias were associated with responses in the bilateral dorsal striatum. In contrast, the efficiency of the prosocial decision process was comparable between the multi-motive and single-motive conditions. These findings provide insights into the computation of prosocial choices in complex motivational states, the motivational setting that drives most human behaviors.
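The drift-diffusion model referred to above decomposes a binary choice into a starting point (an a-priori bias toward one response boundary) and a drift rate (the efficiency with which evidence is accumulated). The following is a minimal simulation sketch of such a two-boundary model, included only to illustrate that distinction; the function name simulate_ddm and all parameter values (v, z, a, sigma, dt) are hypothetical choices for this sketch and are not taken from the study.

```python
# Minimal two-boundary drift-diffusion simulation (illustrative only).
# All parameter values below are hypothetical, not fitted to any data.
import numpy as np

def simulate_ddm(v, z, a=1.0, sigma=1.0, dt=0.001, n_trials=2000, seed=0):
    """Simulate n_trials of a two-boundary drift-diffusion process.

    v : drift rate (evidence-accumulation efficiency)
    z : starting point as a fraction of boundary separation a
        (z = 0.5 is unbiased; z > 0.5 is biased toward the upper boundary)
    a : boundary separation (lower boundary = 0, upper boundary = a)
    Returns (proportion of upper-boundary choices, mean decision time in s).
    """
    rng = np.random.default_rng(seed)
    choices = np.empty(n_trials, dtype=bool)
    rts = np.empty(n_trials)
    for i in range(n_trials):
        x, t = z * a, 0.0
        while 0.0 < x < a:  # accumulate noisy evidence until a boundary is hit
            x += v * dt + sigma * np.sqrt(dt) * rng.standard_normal()
            t += dt
        choices[i], rts[i] = x >= a, t
    return choices.mean(), rts.mean()

# Unbiased start vs. a shifted starting point with the same drift rate:
print("unbiased start:", simulate_ddm(v=0.5, z=0.50))
print("biased start  :", simulate_ddm(v=0.5, z=0.60))
```

In this parameterization, shifting the starting point z raises the proportion of upper-boundary choices while leaving the drift rate v untouched, which is the kind of dissociation the abstract describes: a changed initial bias alongside comparable decision efficiency across conditions.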
Why is it hard to divide attention between dissimilar activities, such as reading and listening to a conversation? We used functional magnetic resonance imaging (fMRI) to study interference between simple auditory and visual decisions, independently of motor competition. Overlapping activity for auditory and visual tasks performed in isolation was found in lateral prefrontal regions, middle temporal cortex and parietal cortex. When the visual stimulus occurred during processing of the tone, the activation it evoked in prefrontal and middle temporal cortex was suppressed. Additionally, reduced activity was seen in modality-specific visual cortex. These results paralleled impaired awareness of the visual event. Thus, even without competing motor responses, a simple auditory decision interferes with visual processing at multiple neural levels, including prefrontal cortex, middle temporal cortex and visual regions.