A variety of joint action studies show that people tend to fall into synchronous behavior with others participating in the same task, and that such synchronization is beneficial, leading to greater rapport, satisfaction, and performance. It has been noted that many of these task environments require simple interactions that involve little planning of action coordination toward a shared goal. The present study used a complex joint construction task in which dyads were instructed to build model cars while their hand movements and heart rates were measured. Participants built these models under varying conditions that constrained how freely they could divide labor during a build session. While hand movement synchrony was sensitive to the different tasks and outcomes, the heart rate measure showed no effects of interpersonal synchrony. Results for hand movements show that the more participants were constrained to a particular building strategy, the greater their behavioral synchrony. Within the different conditions, the degree of synchrony was predictive of subjective satisfaction and objective product outcomes. However, in contrast to many previous findings, synchrony was negatively associated with superior products and, depending on the constraints on the interaction, was positively or negatively correlated with subjective satisfaction. These results show that the task context critically shapes the role of synchronization during joint action, and that in more complex tasks it may be complementary, rather than synchronized, behavior that is associated with superior task outcomes.
Natural sounds contain information on multiple timescales, so the auditory system must analyze and integrate acoustic information across those different scales to extract behaviorally relevant information. However, this multi-scale process in the auditory system has not been widely investigated, and existing models of temporal integration are built mainly on detection or recognition tasks at a single timescale. Here we use a paradigm requiring processing on relatively ‘local’ and ‘global’ scales and provide evidence suggesting that the auditory system extracts fine-detail acoustic information using short temporal windows and abstracts global acoustic patterns using long temporal windows. Performance on behavioral tasks that require processing fine-detail information does not improve with longer stimulus length, contrary to the predictions of previous temporal integration models such as the multiple-looks and the spectro-temporal excitation pattern models. Moreover, the perceptual construction of putatively ‘unitary’ auditory events requires several hundred milliseconds. These findings support the hypothesis of dual-scale processing, likely implemented in the auditory cortex.