Oscillatory activity in the human electro- or magnetoencephalogram has been related to cortical stimulus representations and their modulation by cognitive processes. Whereas previous work has focused on gamma-band activity (GBA) during attention or maintenance of representations, there is little evidence for GBA reflecting individual stimulus representations. The present study aimed at identifying stimulus-specific GBA components during auditory spatial short-term memory. A total of 28 adults were assigned to one of two groups, presented with only right- or left-lateralized sounds, respectively. In each group, two sample stimuli were used that differed in their lateralization angles (15° or 45°) with respect to the midsagittal plane. Statistical probability mapping served to identify spectral amplitude differences between the 15° and 45° stimuli. Distinct GBA components were found for each sample stimulus in different sensors over parieto-occipital cortex contralateral to the side of stimulation, peaking during the middle 200–300 ms of the delay phase. The differentiation between "preferred" and "nonpreferred" stimuli during the final 100 ms of the delay phase correlated with task performance. These findings suggest that the observed GBA components reflect the activity of distinct networks tuned to spatial sound features, which contribute to the maintenance of task-relevant information in short-term memory.
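The statistical probability mapping step (identifying sensor-frequency points where spectral amplitudes differ between the 15° and 45° conditions) can be sketched roughly as follows. The simulated data, the array shapes, and the simple per-point t-test with an alpha threshold are illustrative assumptions, not the study's exact procedure:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated spectral amplitudes: trials x sensors x frequency bins.
# Hypothetical data standing in for the 15-degree and 45-degree conditions.
amp_15 = rng.normal(1.0, 0.2, size=(40, 10, 30))
amp_45 = rng.normal(1.0, 0.2, size=(40, 10, 30))
amp_45[:, 3, 20] += 0.5  # inject an amplitude difference at one sensor/frequency

# Independent-samples t-test at every sensor-frequency point
t, p = stats.ttest_ind(amp_15, amp_45, axis=0)

# Keep only points whose p-value survives the significance criterion
alpha = 0.01
sig = p < alpha
print(np.argwhere(sig))  # sensor/frequency indices with significant differences
```

In practice such mass-univariate maps are corrected for multiple comparisons (e.g., via permutation statistics); the plain alpha threshold above is kept only for brevity.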
Background
Brain-computer interface methodology based on self-regulation of slow cortical potentials (SCPs) of the electroencephalogram (EEG) was used to assess conditional associative learning in one severely paralyzed, late-stage ALS patient. After having been taught arbitrary stimulus relations, he was evaluated for formation of equivalence classes among the trained stimuli.

Methods
A monitor presented visual information in two targets. The method of teaching was matching to sample. Three types of stimuli were presented: signs (A), colored disks (B), and geometrical shapes (C). The sample was of one type, and the choice was between two stimuli of another type. The patient used his SCP to steer a cursor to one of the targets. A smiley was presented as a reward when he hit the correct target. The patient was taught A-B and B-C (sample-comparison) matching with three stimuli of each type. Tests for stimulus equivalence involved the untaught B-A, C-B, A-C, and C-A relations. An additional test was discrimination between all three stimuli of one equivalence class presented together versus three unrelated stimuli. The patient also had sessions with identity matching using the same stimuli.

Results
The patient showed high accuracy, close to 100%, on identity matching and could therefore discriminate the stimuli and control the cursor correctly. Acquisition of A-B matching took 11 sessions (of 70 trials each), and the task had to be broken into simpler units before he could learn it. Acquisition of B-C matching took two sessions. The patient passed all equivalence class tests at 90% accuracy or higher.

Conclusion
The patient may have had a deficit in acquisition of the first conditional association, between signs and colored disks. In contrast, he showed clear evidence that the A-B and B-C training had resulted in formation of equivalence classes. Brain-computer interface technology combined with the matching-to-sample method is a useful way to assess cognitive abilities of severely paralyzed patients who lack reliable motor control.
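The equivalence-class logic behind the untaught test relations can be illustrated with a small sketch. The stimulus labels (A1, B1, C1, and so on) and the grouping code are hypothetical stand-ins for the trained A-B and B-C relations; once the taught pairs are merged into classes, the untaught B-A, C-B, A-C, and C-A relations follow from shared class membership:

```python
# Taught relations as pairs of equivalent stimuli (hypothetical labels):
# three A-B pairs and three B-C pairs, as in the training described above.
trained_pairs = [("A1", "B1"), ("A2", "B2"), ("A3", "B3"),
                 ("B1", "C1"), ("B2", "C2"), ("B3", "C3")]

# Merge pairs that share a member into equivalence classes.
classes = []
for x, y in trained_pairs:
    for c in classes:
        if x in c or y in c:
            c.update((x, y))
            break
    else:
        classes.append({x, y})

def same_class(a, b):
    """Predict a correct match on an untaught test trial (B-A, C-B, A-C, C-A)."""
    return any(a in c and b in c for c in classes)

print(classes)                      # three classes of {A, B, C} stimuli
print(same_class("C1", "A1"))      # derived C-A relation
print(same_class("A1", "C2"))      # stimuli from different classes
```

This greedy merge suffices because each taught pair here shares a member with at most one existing class; a general implementation would use a union-find structure.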
Several regions in human temporal and frontal cortex are known to integrate visual and auditory object features. The processing of audio–visual (AV) associations in these regions has been found to be modulated by object familiarity. The aim of the present study was to explore training-induced plasticity in human cortical AV integration. We used functional magnetic resonance imaging to analyze the neural correlates of AV integration for unfamiliar artificial object sounds and images in naïve subjects (PRE-training) and after a behavioral training session in which subjects acquired associations between some of these sounds and images (POST-training). In the PRE-training session, unfamiliar artificial object sounds and images were mainly integrated in right inferior frontal cortex (IFC). The POST-training results showed extended integration-related IFC activations bilaterally, and a recruitment of additional regions in bilateral superior temporal gyrus/sulcus and intraparietal sulcus. Furthermore, training-induced differential response patterns to mismatching compared with matching (i.e., associated) artificial AV stimuli were most pronounced in left IFC. These effects were accompanied by complementary training-induced congruency effects in right posterior middle temporal gyrus and fusiform gyrus. Together, these findings demonstrate that short-term cross-modal association learning was sufficient to induce plastic changes of both AV integration of object stimuli and mechanisms of AV congruency processing.
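As a loose illustration of how "integration-related activation" can be operationalized, the sketch below applies the max criterion (the AV response must exceed the stronger of the two unisensory responses), one commonly used test in fMRI studies of multisensory integration. The beta values are invented, and the abstract does not state which criterion the authors applied:

```python
# Hypothetical beta estimates (percent signal change) for one region,
# from separate auditory-only, visual-only, and audio-visual conditions.
beta_a, beta_v, beta_av = 0.4, 0.6, 0.9

def shows_integration(a, v, av):
    """Max criterion: multisensory response exceeds the best unisensory one."""
    return av > max(a, v)

print(shows_integration(beta_a, beta_v, beta_av))
```

Stricter variants (e.g., a superadditivity criterion, AV > A + V) trade sensitivity for specificity; which test is appropriate depends on the design and is debated in the multisensory fMRI literature.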