Search results for "INTERFACE"
Showing 10 of 2139 documents
Enabling Multimodal Interaction in XPL – the eXtensible Presentation Language
2007
This paper introduces the multimodal extension of the eXtensible Presentation architecture and Language (XPL), a framework aimed at streamlining the multi-channel interface design process and enabling full component reuse. XPL incorporates a presentation language based on the design-pattern paradigm, which enforces a clear separation between the presentation layer and the corresponding programming logic, promoting content aggregation and a variety of event handlers described without relying on a (procedural) scripting language. In this paper, the design pattern concept is extended to voice-based interaction, and two verbal design patterns (VeDPs) are introduced alongside their visual counterparts. T…
Phase Coherence in Conceptual Spaces for Conversational Agents
2010
This chapter attempts to enhance traditional chatbots with associative/intuitive capabilities. To this end, it proposes a conversational agent model that incorporates, alongside the traditional rule-based dialogue mechanism, a form of intuitive reasoning. The aim is to overcome rigid pattern-matching rules by proposing a "phase coherence" paradigm within a semantic space. By this locution the chapter means that the vectors representing the elements of the dialogue are coherent with the context. The chapter argues that this intuitive-associative capability can be obtained using the LSA methodology. The repres…
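The "phase coherence" idea the abstract describes — dialogue elements represented as vectors that must agree with the context vector in an LSA space — can be illustrated with a toy sketch. Everything here is hypothetical: the term-document matrix, the dimensionality, and the coherence measure (plain cosine similarity) are illustrative stand-ins, not the chapter's actual model.

```python
import numpy as np

# Toy term-document matrix (terms x documents); a hypothetical corpus.
A = np.array([
    [2, 0, 1, 0],
    [1, 1, 0, 0],
    [0, 2, 0, 1],
    [0, 0, 1, 2],
], dtype=float)

# LSA: a truncated SVD projects terms into a low-rank semantic space.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
term_vecs = U[:, :k] * s[:k]  # term representations in a k-dimensional space

def coherence(v, context):
    """Cosine similarity between an element's vector and the context vector."""
    return float(v @ context / (np.linalg.norm(v) * np.linalg.norm(context) + 1e-12))

# Context = centroid of the terms mentioned so far (terms 0 and 1, say).
context = term_vecs[[0, 1]].mean(axis=0)
scores = [coherence(term_vecs[i], context) for i in range(4)]
# A candidate dialogue element scoring higher is more "coherent" with the context.
```

In this reading, the agent would prefer replies whose vectors score high against the running context centroid, which is one plausible way to realize the associative capability the abstract alludes to.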
The design of interfaces for multi-robot path planning and control
2014
The field of human-robot interaction has evolved beyond issues concerning the design and development of one person controlling one robot to exploring HRI for groups of robots and teams. Our design research explores biologically-inspired motion that is initiated by a human operator, applied to a single or a small group of robots, and used to affect the motion and path planning of another subset of robots. This exploratory design study first created a taxonomy to categorize individual robot motions, looking at how they could be categorized and used as building blocks. We then combined individual motions with time and velocity as design variables to guide our interaction design. This work led …
CubeHarmonic: A new musical instrument based on Rubik's cube with embedded motion sensor
2019
A contemporary challenge involves scientific education and the connection between new technologies and the heritage of the past. CubeHarmonic (CH) joins novelty and tradition, creativity and education, science and art. It takes shape as a novel musical instrument where magnetic 3D motion tracking technology meets musical performance and composition. CH is a Rubik’s cube with a note on each facet, and a chord or chord sequence on each face. The position of each facet is detected through magnetic 3D motion tracking. While scrambling the cube, the performer gets new chords and new chord sequences. CH can be used to compose, improvise, and teach music and mathematics (group theory, permu…
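The note-per-facet design the abstract describes has a natural group-theoretic reading: a cube move is a permutation of facets, so it is also a permutation of notes. The sketch below is a hypothetical simplification to a single 3x3 face; the note assignment and the "chord = top row" convention are invented for illustration, not taken from the instrument.

```python
# Hypothetical sketch: one face of the cube as nine facets, each carrying a note.
# A quarter turn permutes the facets, yielding a new chord (group-theory view).
FACE_NOTES = ["C", "D", "E", "F", "G", "A", "B", "C", "D"]  # 3x3 face, row-major

# Clockwise quarter turn of a 3x3 face: new position i takes the note
# from old position ROTATE[i].
ROTATE = [6, 3, 0, 7, 4, 1, 8, 5, 2]

def turn(notes):
    """Apply one clockwise quarter turn to the face's note layout."""
    return [notes[ROTATE[i]] for i in range(9)]

def chord(notes):
    """Read a chord off the top row of the face (illustrative convention)."""
    return notes[:3]

face = FACE_NOTES
print(chord(face))   # chord before scrambling
face = turn(face)
print(chord(face))   # a new chord after one turn
```

Because the quarter turn is a permutation of order four, four applications of `turn` restore the original layout, mirroring the group structure the abstract mentions as a teaching subject.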
Investigating Avatar Influence on Perceived Cognitive Load and Bimanual Interactions with Touchless Interfaces
2017
In recent years, touchless-enabling technologies have been increasingly adopted to provide public displays with gestural interactivity. This has led to the need for novel visual interfaces aimed at solving issues such as communicating interactivity to users, as well as supporting immediate usability and "natural" interactions. In this paper, we focus our investigation on a visual interface based only on the use of in-air direct manipulations. Our study aims at evaluating whether and how the presence of an Avatar that replays the user’s movements may decrease the perceived cognitive workload during interactions. Moreover, we conducted a brief evaluation of the relationship between the presen…
Investigating Proactive Search Support in Conversations
2018
Conversations among people involve solving disputes, building common ground, and reinforcing mutual beliefs and assumptions. Conversations often require external information that can support these human activities. In this paper, we study how a spoken conversation can be supported by a proactive search agent that listens to the conversation, detects entities mentioned in it, and proactively retrieves and presents related information. A total of 24 participants (12 pairs) were involved in informal conversations, using either the proactive search agent or a control condition that did not support conversational analysis or proactive information retrieval. Data c…
A Multimodal Interaction Guide for Pervasive Services Access
2007
A pervasive, multimodal virtual guide for a cultural heritage site tour is illustrated. The guide is based on the integration of different technologies such as conversational agents, commonsense reasoning knowledge bases, multimodal interfaces and self-location detection systems. The aim of the work is to offer a more natural, context-sensitive access to information with respect to traditional audio/visual pre-recorded guides. A prototype has been developed and implemented on a Qtek 9090 with Windows Mobile 2003 in order to deal with the "Museo Archeologico Regionale di Agrigento" domain.
A User-Friendly Interface for Fingerprint Recognition Systems Based on Natural Language Processing
2009
Biometric recognition systems represent a valid solution to the security problems of internet access, even if they do not always provide an environment easily comprehensible to users and operators of intermediate competence. This gap can be partially filled if, instead of using the conventional access routines of the authentication system, the user could simply write to the system through the interface and, using high-level sentences and requests, employ their own natural language to reach the intended goal. On the other hand, biometric features are widely used for recognition and identification all over the world, generating large databases. In this paper a user-friendly inter…
Designing for Exploratory Search on Touch Devices
2015
Exploratory search confronts users with challenges in expressing search intents, as current search interfaces require investigating result listings to identify search directions, typing iteratively, and reformulating queries. We present the design of Exploration Wall, a touch-based search user interface that allows incremental exploration and sense-making of large information spaces by combining entity search, flexible use of result entities as query parameters, and spatial configuration of search streams that are visualized for interaction. Entities can be flexibly reused to modify and create new search streams, and manipulated to inspect their relationships with other entities. Data compr…
Embedded Knowledge-based Speech Detectors for Real-Time Recognition Tasks
2006
Speech recognition has become common in many application domains, from dictation systems for professional practices to vocal user interfaces for people with disabilities or hands-free system control. However, so far the performance of automatic speech recognition (ASR) systems is comparable to human speech recognition (HSR) only under very strict working conditions, and in general much lower. Incorporating acoustic-phonetic knowledge into ASR design has been proven a viable approach to raising ASR accuracy. Manner-of-articulation attributes such as vowel, stop, fricative, approximant, nasal, and silence are examples of such knowledge. Neural networks have already been used successfully as de…
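The detector arrangement the abstract points at — one score per manner-of-articulation attribute, produced by a small neural network over acoustic features — can be sketched as below. The network shape, the random (untrained) weights, and the three-dimensional feature vector are all hypothetical placeholders; a real detector would be trained on labeled speech frames.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class MannerDetector:
    """One binary detector per attribute (vowel, stop, fricative, ...).
    Illustrative one-hidden-layer network with untrained random weights."""
    def __init__(self, n_features, n_hidden=4):
        self.W1 = rng.normal(scale=0.5, size=(n_hidden, n_features))
        self.b1 = np.zeros(n_hidden)
        self.w2 = rng.normal(scale=0.5, size=n_hidden)
        self.b2 = 0.0

    def score(self, features):
        """Posterior-like score in (0, 1) that a frame carries the attribute."""
        h = np.tanh(self.W1 @ features + self.b1)
        return float(sigmoid(self.w2 @ h + self.b2))

# One detector per attribute; an ASR front end would combine their scores.
detectors = {name: MannerDetector(n_features=3)
             for name in ["vowel", "stop", "fricative", "nasal", "silence"]}
frame = np.array([0.8, 0.1, 0.3])  # toy acoustic feature vector for one frame
scores = {name: d.score(frame) for name, d in detectors.items()}
```

Running each detector independently per frame is what makes this layout attractive for embedded real-time use: the networks are tiny, and their scores can feed a downstream recognizer as extra acoustic-phonetic evidence.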