Context information supports serial dependence of multiple visual objects across memory episodes
(2020)
Serial dependence is thought to promote perceptual stability by compensating for small changes of an object’s appearance across memory episodes. So far, it has been studied in situations that comprised only a single object. The question of how we selectively create temporal stability of several objects remains unsolved. In a memory task, objects can be differentiated by their to-be-memorized feature (content) as well as accompanying discriminative features (context). We test whether congruent context features, in addition to content similarity, support serial dependence. In four experiments, we observe a stronger serial dependence between objects that share the same context features across trials. Apparently, the binding of content and context features is not erased but rather carried over to the subsequent memory episode. As this reflects temporal dependencies in natural settings, our findings reveal a mechanism that integrates corresponding content and context features to support stable representations of individualized objects over time.
Objective: Research on visual working memory has shown that individual stimulus features are processed in both specialized sensory regions and higher cortical areas. Much less evidence exists for auditory working memory. Here, a main distinction has been proposed between the processing of spatial and non-spatial sound features. Our aim was to examine feature-specific activation patterns in auditory working memory.
Methods: We collected fMRI data while 28 healthy adults performed an auditory delayed match-to-sample task. Stimuli were abstract sounds characterized by both spatial and non-spatial information, i.e., interaural time delay and central frequency, respectively. In separate recording blocks, subjects had to memorize either the spatial or non-spatial feature, which had to be compared with a probe sound presented after a short delay. We performed both univariate and multivariate comparisons between spatial and non-spatial task blocks.
Results: Processing of spatial sound features elicited greater activity in a small cluster in the superior parietal lobe than did sound-pattern processing, whereas there was no significant activation difference for the opposite contrast. The multivariate analysis used a whole-brain searchlight approach to identify feature-selective processing. The task-relevant auditory feature could be decoded from multiple brain regions, including the auditory cortex, posterior temporal cortex, middle occipital gyrus, and extended parietal and frontal regions.
Conclusion: In summary, the lack of large univariate activation differences between spatial and non-spatial processing could be attributable to the identical stimulation in both tasks. In contrast, the whole-brain multivariate analysis identified feature-specific activation patterns in widespread cortical regions. This suggests that areas beyond the auditory dorsal and ventral streams contribute to working memory processing of auditory stimulus features.
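To make the whole-brain searchlight approach mentioned above concrete, here is a minimal, simulated sketch of the general technique (not the authors' actual pipeline): for each voxel, a classifier is trained and tested on the multivoxel pattern inside a small surrounding sphere, producing a map of decoding accuracy. The sphere radius, the nearest-class-mean classifier, and all data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: 40 trials of a 6x6x6 voxel volume under two task
# conditions (e.g., "spatial" vs. "non-spatial" blocks).
n_trials, shape = 40, (6, 6, 6)
labels = np.tile([0, 1], n_trials // 2)
data = rng.normal(size=(n_trials,) + shape)
data[labels == 1, 2:4, 2:4, 2:4] += 1.0  # one locally informative region

def sphere_indices(center, shape, radius=1.5):
    """Voxel coordinates within `radius` (in voxel units) of `center`."""
    grid = np.indices(shape).reshape(3, -1).T
    dist = np.linalg.norm(grid - np.asarray(center), axis=1)
    return grid[dist <= radius]

def loo_accuracy(patterns, labels):
    """Leave-one-out accuracy of a nearest-class-mean classifier."""
    correct = 0
    for i in range(len(labels)):
        train = np.delete(np.arange(len(labels)), i)
        means = [patterns[train][labels[train] == c].mean(axis=0) for c in (0, 1)]
        pred = np.argmin([np.linalg.norm(patterns[i] - m) for m in means])
        correct += pred == labels[i]
    return correct / len(labels)

# Accuracy map: one cross-validated score per searchlight center.
acc = np.zeros(shape)
for center in np.ndindex(shape):
    idx = sphere_indices(center, shape)
    patterns = data[:, idx[:, 0], idx[:, 1], idx[:, 2]]
    acc[center] = loo_accuracy(patterns, labels)

print(acc[3, 3, 3], acc[0, 0, 0])  # informative center vs. chance-level corner
```

In a real analysis the same logic runs on preprocessed fMRI beta or trial patterns within a brain mask, and the resulting accuracy maps are tested against chance at the group level; libraries such as nilearn provide an optimized implementation.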
Context information supports serial dependence of multiple visual objects across memory episodes
(2019)
Visual perception operates in an object-based manner, integrating associated features via attention. Working memory allows flexible access to a limited number of currently relevant objects, even when they are occluded or physically no longer present. Recently, it has been shown that we compensate for small changes in an object’s features across memory episodes, which can support its perceptual stability. This phenomenon was termed ‘serial dependence’ and has mostly been studied in situations that comprised only a single relevant object. However, since we are typically confronted with situations where several objects have to be perceived and held in working memory, the central question of how we selectively create temporal stability of several objects has remained unsolved. As different objects can be distinguished by their accompanying context features, such as their color or temporal position, we tested whether serial dependence is supported by the congruence of context features across memory episodes. Specifically, we asked participants to remember the motion directions of two sequentially presented colored dot fields per trial. At the end of a trial, one motion direction was cued for continuous report either by its color (Experiment 1) or serial position (Experiment 2). We observed serial dependence, i.e., an attractive bias of currently memorized objects toward previously memorized ones, between current and past motion directions that was clearly enhanced when items shared the same color or serial position across trials. This bias was particularly pronounced for the context feature that was used for cueing and for the target of the previous trial. Together, these findings demonstrate that the coding of current object representations depends on previous representations, especially when they share similar content and context features. Apparently, the binding of content and context features is not completely erased after a memory episode but is carried over to subsequent episodes.
As this reflects temporal dependencies in natural settings, the present findings reveal a mechanism that integrates corresponding bundles of content and context features to support stable representations of individualized objects over time.
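The attractive bias described above is commonly quantified by relating the signed report error on the current trial to the previous trial's feature value relative to the current one: with attraction, errors lean toward the previous item. The sketch below illustrates this on simulated data; the bias amplitude, noise level, and tuning width are assumptions, not the study's estimates.

```python
import numpy as np

rng = np.random.default_rng(1)

def wrap(deg):
    """Wrap angular differences into (-180, 180]."""
    return (deg + 180.0) % 360.0 - 180.0

n = 2000
directions = rng.uniform(0.0, 360.0, size=n)  # memorized motion directions
prev_minus_curr = wrap(np.r_[0.0, np.diff(directions)])

# Simulated reports: unbiased noise plus a small attraction toward the
# previous direction that decays with feature distance (all values assumed).
attraction = 5.0 * np.sign(prev_minus_curr) * np.exp(-np.abs(prev_minus_curr) / 45.0)
errors = rng.normal(0.0, 8.0, size=n) + attraction

# Folded analysis: mean error signed in the direction of the previous item;
# a positive value indicates attraction, a negative value repulsion.
folded = errors * np.sign(prev_minus_curr)
bias = folded[np.abs(prev_minus_curr) < 90].mean()
print(f"mean attraction toward previous item: {bias:.1f} deg")
```

The study's context manipulation corresponds to computing this bias separately for trial pairs with congruent versus incongruent context features (same color or serial position) and comparing the two estimates.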