Stanford Seminar - Multimodal Interfaces for Stocks

Multimodal HCI Systems

Nigay and Coutaz (1993) classified multimodal interfaces in a 2 × 2 table according to the fusion method (combined or independent) and the use of modalities (sequential or parallel); see Table 2. In an exclusive multimodal system, the modalities are used sequentially and are available separately, but they are not integrated by the system. In an alternate multimodal system, the modalities are also used sequentially, but the system combines them (see the sketch at the end of this section).

Multimodal human-computer interaction (HCI) combines modalities at an abstract specification level in order to obtain information from the user (input multimodality) and to return information to the user (output multimodality). These multimodal interfaces use two mechanisms: first, the fusion of the information the user transmits over different modalities during input interaction, and second, the fission of the information to be presented to the user across the available output modalities.

This document presents a systematic review of multimodal human-computer interaction. It shows how different types of interaction technologies (virtual reality (VR) and augmented reality (AR), force and vibration feedback devices (haptics), and tracking) are used in different domains (concepts, medicine, physics, human factors/user experience design, transportation, cultural heritage, and industry). The review covers work in graphics, HCI, haptics, and VR/AR up to July 2021 related to multimodal human-computer interaction that uses haptic displays, VR/AR, and devices that allow specific one-directional and bidirectional interaction.

Since MMHCI is a very dynamic and broad research area, we do not intend to present a complete survey. The main contribution of this paper, therefore, is to consolidate some of the main issues and approaches.
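As a rough illustration of the ideas above, the sketch below encodes the two axes of the Nigay and Coutaz design space (use of modalities and fusion method) and shows a very small example of input fusion in the classic "put that there" style, where a spoken deictic command is merged with a pointing event that arrives close in time. All names (ModalityEvent, classify, fuse_deictic) and the 500 ms window are illustrative assumptions for this sketch, not APIs or values taken from the cited papers.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional


class Use(Enum):
    SEQUENTIAL = "sequential"   # modalities used one at a time
    PARALLEL = "parallel"       # modalities used at the same time


class Fusion(Enum):
    INDEPENDENT = "independent"  # modalities interpreted separately
    COMBINED = "combined"        # modalities fused into one interpretation


def classify(use: Use, fusion: Fusion) -> str:
    """Map the two axes of the 2 x 2 design space to the class names
    used in the Nigay and Coutaz (1993) taxonomy."""
    table = {
        (Use.SEQUENTIAL, Fusion.INDEPENDENT): "exclusive",
        (Use.SEQUENTIAL, Fusion.COMBINED): "alternate",
        (Use.PARALLEL, Fusion.INDEPENDENT): "concurrent",
        (Use.PARALLEL, Fusion.COMBINED): "synergistic",
    }
    return table[(use, fusion)]


@dataclass
class ModalityEvent:
    modality: str      # e.g. "speech" or "pointing"
    payload: str       # recognized content ("move that", a screen target, ...)
    timestamp_ms: int  # when the event was produced


def fuse_deictic(speech: ModalityEvent, pointing: ModalityEvent,
                 window_ms: int = 500) -> Optional[str]:
    """Minimal input fusion: attach a pointing target to a spoken deictic
    command if the two events are close enough in time.  The 500 ms window
    is an arbitrary assumption made for this sketch."""
    if abs(speech.timestamp_ms - pointing.timestamp_ms) <= window_ms:
        return f"{speech.payload} -> {pointing.payload}"
    return None  # events too far apart; interpret them independently


if __name__ == "__main__":
    print(classify(Use.PARALLEL, Fusion.COMBINED))  # synergistic
    command = fuse_deictic(
        ModalityEvent("speech", "move that", 1_000),
        ModalityEvent("pointing", "object #3", 1_200),
    )
    print(command)  # move that -> object #3
```

In this framing, an interface that lets the user speak while pointing and merges both streams into a single command falls in the synergistic cell, whereas a system that offers speech or typing only as separate, unfused alternatives is exclusive.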