Abstract
Development of multimodal applications is an iterative, complex, and often rather heuristic process. This is because in multimodal systems the number of interplaying components can be far greater than in a unimodal Spoken Dialogue System. From the developer's perspective, a multimodal system presents challenges and technical difficulties on many levels; in this paper we describe our approach to one specific component, the Multimodal Integrator. From the designer's perspective, on the other hand, all components must be fine-tuned to a level where their combined performance delivers the desired experience to end users. In both cases, evaluation and analysis of the current implementation are paramount. Hence, looking into the details while maintaining a good understanding of the overall performance of a multimodal system is the other key topic.
Original language | English
---|---
Title of host publication | Turn-Taking and Coordination in Human-Machine Interaction - Papers from the AAAI Spring Symposium, Technical Report
Publisher | AI Access Foundation
Pages | 79-82
Number of pages | 4
Volume | SS-15-07
ISBN (Electronic) | 9781577357117
Publication status | Published - 2015
MoE publication type | B3 Non-refereed conference publication
Event | AAAI Spring Symposium, Palo Alto, United States. Duration: 23 Mar 2015 → 25 Mar 2015
Conference
Conference | AAAI Spring Symposium
---|---
Country/Territory | United States
City | Palo Alto
Period | 23/03/2015 → 25/03/2015