Analysing speech and co-speech gesture in constraint-based grammars
This paper addresses the form-meaning relation of multimodal communicative actions by means of a grammar that combines verbal input with hand gestures. Unlike speech, gesture signals are interpretable only through their semantic relation to the synchronous speech content; this relation serves to resolve the incomplete meaning revealed by gestural form alone. We demonstrate that, using standard linguistic methods, speech and gesture can be integrated in a constrained way into a single derivation tree which maps to a uniform meaning representation.
| Field | Value |
|---|---|
| Author | Katya Alahverdzhieva, Alex Lascarides |
| URN | urn:nbn:de:hebis:30:3-713217 |
| DOI | https://doi.org/10.21248/hpsg.2010.1 |
| ISSN | 1535-1793 |
| Parent Title (English) | Proceedings of the ... International Conference on Head-Driven Phrase Structure Grammar (HPSG) |
| Publisher | CSLI Publications |
| Place of Publication | Stanford, CA |
| Document Type | Conference Proceeding |
| Language | English |
| Date of Publication (online) | 2010/10/13 |
| Year of First Publication | 2010 |
| Publishing Institution | Universitätsbibliothek Johann Christian Senckenberg |
| Contributing Corporation | International Conference on Head-Driven Phrase Structure Grammar (17 : 2010 : Paris) |
| Release Date | 2024/09/05 |
| GND Keywords | Gestik; Multimodalität |
| Volume | 17.2010 |
| Page Count | 21 |
| First Page | 6 |
| Last Page | 26 |
| Dewey Decimal Classification | 4 Language / 40 Language / 400 Language |
| Collections | Linguistics |
| Linguistics Classification | Non-verbal communication |
| Licence | Creative Commons CC BY (Attribution) 4.0 International |