RI Seminar

Eric Vatikiotis-Bateson, Cognitive Systems and Linguistics
University of British Columbia
Friday, February 13
3:30 pm to 12:00 am
Extracting and identifying communicative events from multimodal behavior

Event Location: 1305 NSH
Bio: Eric Vatikiotis-Bateson received a Bachelor’s degree in philosophy and physics from St. John’s College, Maryland, in 1974, a certificate in ethnographic film making in 1976, and an M.A. in Linguistics from Indiana University in 1978. From 1982 to 1987 he was an NIH pre-doctoral fellow at Haskins Laboratories (Connecticut) investigating “the organization and control of speech production”. After receiving a PhD in Linguistics from Indiana University in 1987, he was appointed Staff Scientist at Haskins Labs. From 1990 to 2003 he was at ATR International in Japan, where he and his collaborators examined the production, perception, and associated brain functions of multimodal communication in complex environments, especially spoken language processing, generating more than 150 technical papers and journal articles and numerous patents on multimodal signal coding and decoding. From 2000 to 2003 he headed the Communication Dynamics Project in the ATR Human Information Science Lab. Since 2003, Vatikiotis-Bateson has held a Canada Research Chair (NSERC, Tier 1) in Linguistics and Cognitive Science and is Director of the Cognitive Systems Program at the University of British Columbia in Vancouver, Canada.

Abstract: Communication, in whatever form, occurs in space and time and requires production and perception of patterned events.

Although it stands to reason that analysis of the spatiotemporal record itself, and/or of the effects of time-varying signals on the perceiver, should reveal much about the structure of communicative events, for most of the past half century observable behaviors have more commonly been treated as imperfect realizations of deeper forms, where the imperfections are noise that must be accommodated in order to recover the intended signal. Only recently have hidden Markov and other probabilistic models of complex signals succeeded at imposing structure on observed behaviors. Unfortunately, both of these approaches fail to address the problem of identifying meaningful events in the behavior. The inversion of hierarchical process models (from observable measures to underlying structures) has proved either equivocal, yielding many possible model approximations, or computationally ill-posed, yielding no solutions. Probabilistic models, on the other hand, generate events that cannot be easily interpreted.

In this talk, a research program is described whose goals include developing techniques for non-invasive measurement of multimodal behavior in natural conditions (uncontrolled studies outside the laboratory) and for assessing the time-varying coordination among signals and between signaling entities. These include studies of the production and perception of various types of audiovisual speech and musical performance in which simple optical flow measurement techniques replace more cumbersome and invasive marker-based measurement. A new algorithm for computing time-varying correspondences between such signals is described and used to examine the coordination between musicians and their audience, between visible gestures and audible speech, and between speech production and postural control. Finally, the need to reconcile the dualism inherent in computationally derived event structures and their likely symbolic identification is discussed.
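The abstract does not specify the correspondence algorithm itself. As a minimal illustration of the general idea of assessing time-varying coordination between two measured signals, one common baseline is a sliding-window Pearson correlation; the function names and parameters below are hypothetical and are not the speaker's method:

```python
from statistics import mean

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) *
           sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def windowed_correlation(x, y, win, hop):
    """Correlation of x and y over sliding windows.

    Returns a list of (window_start, r) pairs, giving a time-varying
    picture of how strongly the two signals co-vary.
    """
    return [(s, pearson(x[s:s + win], y[s:s + win]))
            for s in range(0, len(x) - win + 1, hop)]
```

Applied, say, to an optical-flow magnitude series from video of a musician and an acoustic amplitude envelope, a trace of such windowed correlations would show when the two streams move together and when they decouple.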