The coordination of talk and action in the collaborative construction of a multimodal text
This paper explores how speech and action are coordinated in a web-based task undertaken by two high school students working collaboratively at a computer. The paper focuses on the coordination involved in the interactions between the two students and the computer screen, keyboard, and mouse, and examines the temporal synchrony and 'matching' points between speaking and typing, and between speaking and mouse movements, both within and between participants. Examples include the coordination of speaking words aloud whilst typing, the coordination of reading aloud from the screen with mouse movements, and coordination between participants, as when one individual is typing and the other talking. The discussion draws on the literature on the coordination of language and action, kinesic behaviour, and nonverbal communication, including gesture, all of which have the potential to mediate conversation. Results indicate that most coordination of talk and action occurs at the beginning of the action. Sometimes work is done to ensure coordination, either by slowing down the talk, pausing, or stretching sounds mid-utterance. Talk that is coordinated temporally with some action on the screen is precise; in other words, even when action and talk are mismatched (e.g., a participant is not talking about what she is doing), talk and action can still start and finish together.
Journal of Pragmatics
Copyright 2010 Elsevier B.V. This is the author-manuscript version of this paper, reproduced in accordance with the copyright policy of the publisher. Please refer to the journal's website for access to the definitive, published version.
Organisational, Interpersonal and Intercultural Communication