HCI Bibliography : Search Results
Database updated: 2016-05-10 Searches since 2006-12-01: 32,258,629
director@hcibib.org
Hosted by ACM SIGCHI
The HCI Bibliography was moved to a new server on 2015-05-12 and again on 2016-01-05, substantially degrading the environment for making updates.
There are no plans to add to the database.
Please send questions or comments to director@hcibib.org.
Query: evreinov_g* Results: 26 Sorted by: Date
Records: 1 to 25 of 26
[1] User experience and expectations of haptic feedback in in-car interaction / Väänänen-Vainio-Mattila, Kaisa / Heikkinen, Jani / Farooq, Ahmed / Evreinov, Grigori / Mäkinen, Erno / Raisamo, Roope Proceedings of the 2014 International Conference on Mobile and Ubiquitous Multimedia 2014-11-25 p.248-251
ACM Digital Library Link
Summary: Haptic feedback based on the sense of touch and movement is a promising area of human-computer interaction in the car context. Most user studies on haptic feedback in the car have been controlled experiments of specific types of haptic stimuli. For the study presented in this paper, twelve participants tried novel haptic feedback prototypes and evaluated communication scenarios in the physical car context. Our aim was to understand user experiences and usage potential of haptic feedback in the car. The qualitative results show that haptic feedback may offer support for safety and social communication, but can be hard to interpret. We propose design considerations for in-car haptics such as simplicity, subtleness and directionality.

[2] Effects of directional haptic and non-speech audio cues in a cognitively demanding navigation task / Nukarinen, Tomi / Raisamo, Roope / Farooq, Ahmed / Evreinov, Grigori / Surakka, Veikko Proceedings of the 8th Nordic Conference on Human-Computer Interaction 2014-10-26 p.61-64
ACM Digital Library Link
Summary: Existing car navigation systems require visual or auditory attention. Providing the driver with directional cues could potentially increase safety. We conducted an experiment comparing directional haptic and non-speech audio cues to visual cueing in a navigation task. Participants (N=16) drove the Lane Change Test simulator with different navigational cues. The participants were to recognize the directional cue (left or right) by responding as fast as possible using a tablet. Reaction times and errors were measured. The participants were also interviewed about the different cues and filled in the NASA-TLX questionnaire. The results showed that all the other cues elicited significantly faster reactions than visual cues. Haptic-only cueing resulted in the most errors, but it was evaluated as the most pleasant and the least physically demanding. The results suggest that non-visual cueing could improve safety.

[3] An evaluation of the virtual curvature with the StickGrip haptic device: a case study / Evreinova, Tatiana V. / Evreinov, Grigori Universal Access in the Information Society 2013-06 v.12 n.2 p.161-173
Keywords: Curved surface; Kinesthetic feedback; Pen-based interaction; StickGrip haptic device
Link to Digital Content at Springer
Summary: Dynamic simulation of distance to the physical surface could promote the development of new inexpensive tools for blind and visually impaired users. The StickGrip is a haptic device comprised of the Wacom pen input device added with a motorized penholder. The goal of the research presented in this paper was to assess the accuracy and usefulness of the new pen-based interaction technique when the position and displacement of the penholder in relation to the pen tip provided haptic feedback to the user about the distance to the physical or virtual surface of interaction. The aim was to examine how accurately people are able (1) to align the randomly deformed virtual surfaces to the flat surface and (2) to adjust the number of surface samples having a randomly assigned curvature to the template having the given curvature and kept fixed. These questions were approached by measuring both the values of the adjusted parameters and the parameters of the human performance, such as a ratio between inspection time and control time spent by the participants to complete the matching task with the use of the StickGrip device. The test of the pen-based interaction technique was conducted in the absence of visual feedback when the subject could rely on the proprioception and kinesthetic sense. The results are expected to be useful for alternative visualization and interaction with complex topographic and mathematical surfaces, artwork, and modeling.

[4] Integrating discrete events and continuous head movements for video-based interaction techniques / Evreinova, Tatiana V. / Evreinov, Grigori / Raisamo, Roope Behaviour and Information Technology 2011-11-01 v.30 n.6 p.739-746
Link to Article at Taylor & Francis
Summary: Human head gestures can potentially trigger different commands from the list of available options in graphical user interfaces or in virtual and smart environments. However, continuous tracking techniques are limited in generating discrete events which could be used to execute a predefined set of commands. In this article, we discuss the possibility of encoding a set of discrete events by integrating continuous head movements with a crossing-based interaction paradigm. A set of commands can be encoded through specific sequences of crossing points when a head-mouse cursor such as a scaled pointer interacts with a graphical object. The goal of the present experiment was to test the perceptual-motor performance of novices in target acquisition tasks using a subset of round head gestures and symbolic icons designating eight types of directional head movements. We have demonstrated that the novices can equally well execute round head gestures in clockwise and counter-clockwise directions by making two crossings for about 2s or three crossings for about 3s. None of the participants reported neck strain or other problems after 360 trials performed during a 40-min test in each of 5 days.
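The crossing-based encoding described above can be sketched as counting boundary crossings of the cursor path over a graphical object. This is an illustrative reconstruction, not the authors' implementation; the radius, path format, and command table are assumptions.

```python
import math

RADIUS = 40  # hypothetical object radius in pixels

def crossings(path, center=(0.0, 0.0), radius=RADIUS):
    """Count boundary crossings of a cursor path over a circular object.

    A crossing is registered each time two consecutive samples fall on
    opposite sides of the circle boundary.
    """
    cx, cy = center
    inside_prev = None
    count = 0
    for x, y in path:
        inside = math.hypot(x - cx, y - cy) < radius
        if inside_prev is not None and inside != inside_prev:
            count += 1
        inside_prev = inside
    return count

# Map crossing counts to commands (illustrative only).
COMMANDS = {2: "select", 3: "open"}

path = [(-50, 0), (-10, 0), (-50, 0)]  # enter and leave once -> 2 crossings
assert COMMANDS[crossings(path)] == "select"
```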

[5] Camera Based Target Acquisition Augmented with Phosphene Sensations (Session: Blind and Partially Sighted People: Mobility and Interaction without Sight) / Evreinova, Tatiana G. / Evreinov, Grigori / Raisamo, Roope ICCHP'10: International Conference on Computers Helping People with Special Needs 2010-07-14 v.2 p.282-289
Link to Digital Content at Springer
Summary: This paper presents the results of an evaluation of user performance in a target acquisition task using a camera-mouse real-time face-tracking technique augmented with phosphene-based guiding signals. The underlying assumption was that during non-visual inspection of the virtual workspace (screen area), transcutaneous electrical stimulation of the optic nerve can be considered as alternative feedback when visual ability is low or absent. The performance of eight blindfolded subjects was evaluated. The experimental findings show that camera-based target acquisition augmented with phosphene sensations is an efficient input technique when visual information is not available.

[6] EDITED BOOK The Universal Access Handbook 2009 n.61 p.1034 CRC Press
ISBN: 978-1-4200-6499-5
www.crcpress.com/product/isbn/9780805862805
== Introduction to Universal Access ==
Universal Access and Design for All in the Evolving Information Society
	+ Stephanidis, C.
Perspectives on Accessibility: From Assistive Technologies to Universal Access and Design for All
	+ Emiliani, P. L.
Accessible and Usable Design of Information and Communication Technologies
	+ Vanderheiden, G. C.
== Diversity in the User Population ==
Dimensions of User Diversity
	+ Ashok, M.
	+ Jacko, J. A.
Motor Impairments and Universal Access
	+ Keates, S.
Sensory Impairments
	+ Kinzel, E.
	+ Jacko, J. A.
Cognitive Disabilities
	+ Lewis, C.
Age-Related Differences in the Interface Design Process
	+ Kurniawan, S.
International and Intercultural User Interfaces
	+ Marcus, A.
	+ Rau, P.-L. P.
== Technologies for Diverse Contexts of Use ==
Accessing the Web
	+ Hanson, V. L.
	+ Richards, J. T.
	+ Harper, S.
	+ Trewin, S.
Handheld Devices and Mobile Phones
	+ Kaikkonen, A.
	+ Kaasinen, E.
	+ Ketola, P.
Virtual Reality
	+ Hughes, D.
	+ Smith, E.
	+ Shumaker, R.
	+ Hughes, C.
Biometrics and Universal Access
	+ Fairhurst, M. C.
Interface Agents: Potential Benefits and Challenges for Universal Access
	+ André, E.
	+ Rehm, M.
== Development Lifecycle of User Interfaces ==
User Requirements Elicitation for Universal Access
	+ Antona, M.
	+ Ntoa, S.
	+ Adami, I.
	+ Stephanidis, C.
Unified Design for User Interface Adaptation
	+ Savidis, A.
	+ Stephanidis, C.
Designing Universally Accessible Games
	+ Grammenos, D.
	+ Savidis, A.
	+ Stephanidis, C.
Software Requirements for Inclusive User Interfaces
	+ Savidis, A.
	+ Stephanidis, C.
Tools for Inclusive Design
	+ Waller, S.
	+ Clarkson, P. J.
The Evaluation of Accessibility, Usability, and User Experience
	+ Petrie, H.
	+ Bevan, N.
== User Interface Development: Architectures, Components, and Tools ==
A Unified Software Architecture for User Interface Adaptation
	+ Savidis, A.
	+ Stephanidis, C.
A Decision-Making Specification Language for User Interface Adaptation
	+ Savidis, A.
	+ Stephanidis, C.
Methods and Tools for the Development of Unified Web-Based User Interfaces
	+ Doulgeraki, C.
	+ Partarakis, N.
	+ Mourouzis, A.
	+ Stephanidis, C.
User Modeling: A Universal Access Perspective
	+ Adams, R.
Model-Based Tools: A User-Centered Design for All Approach
	+ Stary, C.
Markup Languages in Human-Computer Interaction
	+ Paternò, F.
	+ Santoro, C.
Abstract Interaction Objects in User Interface Programming Languages
	+ Savidis, A.
== Interaction Techniques and Devices ==
Screen Readers
	+ Asakawa, C.
	+ Leporini, B.
Virtual Mouse and Keyboards for Text Entry
	+ Evreinov, G.
Speech Input to Support Universal Access
	+ Feng, J.
	+ Sears, A.
Natural Language and Dialogue Interfaces
	+ Jokinen, K.
Auditory Interfaces and Sonification
	+ Nees, M. A.
	+ Walker, B. N.
Haptic Interaction
	+ Jansson, G.
	+ Raisamo, R.
Vision-Based Hand Gesture Recognition for Human-Computer Interaction
	+ Zabulis, X.
	+ Baltzakis, H.
	+ Argyros, A.
Automatic Hierarchical Scanning for Windows Applications
	+ Ntoa, S.
	+ Savidis, A.
	+ Stephanidis, C.
Eye Tracking
	+ Majaranta, P.
	+ Bates, R.
	+ Donegan, M.
Brain-Body Interfaces
	+ Gnanayutham, P.
	+ George, J.
Sign Language in the Interface: Access for Deaf Signers
	+ Huenerfauth, M.
	+ Hanson, V. L.
Visible Language for Global Mobile Communication: A Case Study of a Design Project in Progress
	+ Marcus, A.
Contributions of "Ambient" Multimodality to Universal Access
	+ Carbonell, N.
== Application Domains ==
Vocal Interfaces in Supporting and Enhancing Accessibility in Digital Libraries
	+ Catarci, T.
	+ Kimani, S.
	+ Dubinsky, Y.
	+ Gabrielli, S.
Theories and Methods for Studying Online Communities for People with Disabilities and Older People
	+ Pfeil, U.
	+ Zaphiris, P.
Computer-Supported Cooperative Work
	+ Gross, T.
	+ Fetter, M.
Developing Inclusive e-Training
	+ Savidis, A.
	+ Stephanidis, C.
Training through Entertainment for Learning Difficulties
	+ Savidis, A.
	+ Grammenos, D.
	+ Stephanidis, C.
Universal Access to Multimedia Documents
	+ Petrie, H.
	+ Weber, G.
	+ Völkel, T.
Interpersonal Communication
	+ Waller, A.
Universal Access in Public Terminals: Information Kiosks and ATMs
	+ Kouroupetroglou, G.
Intelligent Mobility and Transportation for All
	+ Bekiaris, E.
	+ Panou, M.
	+ Gaitanidou, E.
	+ Mourouzis, A.
	+ Ringbauer, B.
Electronic Educational Books for Blind Students
	+ Grammenos, D.
	+ Savidis, A.
	+ Georgalis, Y.
	+ Bourdenas, T.
	+ Stephanidis, C.
Mathematics and Accessibility: A Survey
	+ Pontelli, E.
	+ Karshmer, A. I.
	+ Gupta, G.
Cybertherapy, Cyberpsychology, and the Use of Virtual Reality in Mental Health
	+ Renaud, P.
	+ Bouchard, S.
	+ Chartier, S.
	+ Bonin, M-P
== Nontechnological Issues ==
Policy and Legislation as a Framework of Accessibility
	+ Kemppainen, E.
	+ Kemp, J. D.
	+ Yamada, H.
Standards and Guidelines
	+ Vanderheiden, G. C.
eAccessibility Standardization
	+ Engelen, J.
Management of Design for All
	+ Bühler, C.
Security and Privacy for Universal Access
	+ Maybury, M. T.
Best Practice in Design for All
	+ Miesenberger, K.
== Looking to the Future ==
Implicit Interaction
	+ Ferscha, A.
Ambient Intelligence
	+ Streitz, N. A.
	+ Privat, G.
Emerging Challenges
	+ Stephanidis, C.

[7] Non-visual Gameplay: Making Board Games Easy and Fun (Session: Entertainment Software Accessibility) / Evreinova, Tatiana V. / Evreinov, Grigori / Raisamo, Roope ICCHP'08: International Conference on Computers Helping People with Special Needs 2008-07-09 p.561-568
Keywords: board game; tabular data; non-visual game; overview cues; audio-haptic mapping; camera mouse
Link to Digital Content at Springer
Summary: In this paper we report the results of an evaluation of a game and techniques that allow playing board games in the total absence of visual feedback. We have demonstrated that a camera mouse can be used for blind navigation within a game field. Snapping the position of the virtual pointer to the regions of interest, as well as audio-haptic complementary mapping, significantly reduces the cognitive load and facilitates mental matching and integration of overview sound sequences.

[8] Skills vs. Abilities: Alternative Input and Communication Systems / Evreinov, Grigori ICCHP'08: International Conference on Computers Helping People with Special Needs 2008-07-09 p.1153-1156
Link to Digital Content at Springer
Summary: The spectrum of human abilities which people use to communicate and socially interact with others is quite narrow (Table 1). Moreover, even basic human abilities (sensory-motor and/or cognitive) can be lost due to an accident or an illness. Nevertheless, the key issue is not how many different tools are needed to solve a specific problem but whether a person desires to be socially included [1, 2, 4, 7, 12, 14]. Social inclusion aims to reduce inequality between the least advantaged groups and communities and the rest of society. Nevertheless, inclusion cannot be achieved when a target group or an individual lacks the skills to meet social challenges and opportunities.

[9] Emotional and behavioral responses to haptic stimulation (Session: Tactile and Haptic User Interfaces) / Salminen, Katri / Surakka, Veikko / Lylykangas, Jani / Raisamo, Jukka / Saarinen, Rami / Raisamo, Roope / Rantala, Jussi / Evreinov, Grigori Proceedings of ACM CHI 2008 Conference on Human Factors in Computing Systems 2008-04-05 v.1 p.1555-1562
ACM Digital Library Link
Summary: A prototype of friction-based horizontally rotating fingertip stimulator was used to investigate emotional experiences and behavioral responses to haptic stimulation. The rotation style of 12 different stimuli was varied by burst length (i.e., 20, 50, 100 ms), continuity (i.e., continuous and discontinuous), and direction (e.g., forward and backward). Using these stimuli 528 stimulus pairs were presented to 12 subjects who were to distinguish if stimuli in each pair were the same or different. Then they rated the stimuli using four scales measuring the pleasantness, arousal, approachability, and dominance qualities of the 12 stimuli. The results showed that continuous forward-backward rotating stimuli were rated as significantly more unpleasant, arousing, avoidable, and dominating than other types of stimulations (e.g., discontinuous forward rotation). The reaction times to these stimuli were significantly faster than reaction times to discontinuous forward and backward rotating stimuli. The results clearly suggest that even simple haptic stimulation can carry emotional information. The results can be utilized when making use of haptics in human-technology interaction.

[10] Non-visual game design and training in gameplay skill acquisition -- A puzzle game case study / Evreinova, Tatiana V. / Evreinov, Grigori / Raisamo, Roope Interacting with Computers 2008 v.20 n.3 p.386-405
Keywords: Non-visual puzzle game; Sonification; Overview sound cues; Sticky labels; Sequential learning; Skills training
Link to Article at ScienceDirect
1. Introduction
2. Games and sounds
3. Gameplay scenario for easy learning
4. Puzzle gameplay simulation
5. Experimental setup
6. Participants and procedure
7. Results and discussion
7.1. Number of player moves
7.2. Player-to-PC moves similarity
7.3. Game completion time
7.4. Length of the tracks
7.5. Number of labels inspected
7.6. Number of rows inspected
7.7. Number of overview verbal cues activations
7.8. Number of false moves
7.9. Resetting the virtual pointer at the centre
8. Concluding remarks
Acknowledgements
References
Summary: This paper reports the results of a study on the design and evaluation of a game and techniques which allow puzzles to be played in the absence of visual feedback. We have demonstrated that a camera-mouse can be used successfully for blind navigation and target location acquisition within a game field. To teach the players gradually, the sequential learning method was applied. Blind exploration of the gamespace was augmented with sticky labels and overview sound cues, verbal and non-verbal, which can significantly reduce the cognitive load and facilitate mental matching and integration. The full-sticky labels technique does not require fine motor skills and allows a user to gain control over the game with a minimum level of skill. With the vertical sticky labels technique, training was focused on developing accurate head movements on the horizontal plane only. With practice, the players can use the non-sticky labels technique. After 240 trials (3-4 h), the cumulative experience of the blindfolded players increased 22.5-27 times compared to the initial 10 trials.

[11] Non-visual interaction with graphs assisted with directional-predictive sounds and vibrations: a comparative study / Evreinova, Tatiana / Evreinov, Grigori / Raisamo, Roope / Vesterinen, Leena Universal Access in the Information Society 2008 v.7 n.1/2 p.93-102
Link to Digital Content at Springer
Summary: Blind and visually impaired students need special educational and developmental tools to allow them to interact with graphic entities on PDA and desktop platforms. In previous research, stylus movements over the hidden graph were sonified with three directional-predictive sound (DPS) signals, taking into account exploration behavior and the concept of the capture radius. The results indicated that the scanpaths were 24-40% shorter in length and task completion times decreased by 20-25%. The goal of the study presented in this paper was to measure the subjective performance recorded with directional-predictive vibrations (DPV) and compare it against the subjective performance achieved when the hidden graphic images were explored with DPS. The study also aimed to find out which kind of feedback cues would require less cognitive effort to interpret. A prototype of a vibro-tactile pen with an embedded vibration motor was used to produce DPV instead of sounds. The performance of eight blindfolded subjects was investigated in terms of the number of feedback cues used and the time spent to complete non-visual inspection of the hidden graphs. There was a statistically significant difference between the average number of DPS and vibration cues, and in the task completion time taken by the players to discover the features of hidden graphs explored with different capture radii. The experimental findings confirmed the beneficial use of DPS signals, compared with DPV patterns, in tasks where cross-modal coordination should benefit the user in the absence of visual information.

[12] A camera-joystick for sound-augmented non-visual navigation and target acquisition: a case study / Evreinova, Tatiana / Evreinov, Grigori / Raisamo, Roope Universal Access in the Information Society 2008 v.7 n.3 p.129-144
Link to Digital Content at Springer
Summary: This paper presents the results of a comparative study of user input with a camera-joystick and a manual joystick used in a target acquisition task when neither targets nor pointer could be perceived visually. The camera-joystick is an input technique in which each on-screen item is accessible from the center with a predefined vector of head motion. Absolute pointing was implemented with an acceleration factor of 1.7 and a moving average over 5 detected head positions. The underlying assumption was that, in order to provide robust input for blind users, the interaction technique has to be based on perceptually well-discriminated human movements, which compose a basic framework of an accessible virtual workspace demanding minimum external auxiliary cues. The target spots, having a diameter of 35 mm and a distance between the centers of adjacent spots of 60 mm, were arranged in a rectangular grid of 5 rows by 5 columns. The targets were captured from a distance of 600 mm. The results have shown that camera input is a promising technique for non-visual human-computer interaction. The subjects demonstrated more than twice better performance in the target acquisition task with the camera-joystick versus the manual joystick. All the participants reported that the camera-joystick was a robust and preferable input technique when visual information was not available. Blind interaction techniques could be significantly further improved by allowing a user-dependent activation of the navigational cues to better coordinate feedback with exploratory behavior.
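The pointer mapping reported above (acceleration factor of 1.7 and a moving average over 5 detected head positions) can be sketched as follows; the origin handling and coordinate units are assumptions, as the summary specifies only the gain and window size.

```python
from collections import deque

GAIN = 1.7    # acceleration factor reported in the paper
WINDOW = 5    # moving average over 5 detected head positions

class CameraJoystick:
    """Minimal sketch of absolute pointing from tracked head positions."""

    def __init__(self, origin=(0.0, 0.0)):
        self.origin = origin
        self.samples = deque(maxlen=WINDOW)  # keeps the last 5 positions

    def update(self, head_xy):
        """Smooth the head position, then scale its displacement."""
        self.samples.append(head_xy)
        n = len(self.samples)
        avg_x = sum(p[0] for p in self.samples) / n
        avg_y = sum(p[1] for p in self.samples) / n
        ox, oy = self.origin
        # Absolute pointing: amplified displacement from the resting origin.
        return (ox + GAIN * (avg_x - ox), oy + GAIN * (avg_y - oy))

cj = CameraJoystick()
for _ in range(5):
    pos = cj.update((10.0, 0.0))
assert pos == (17.0, 0.0)  # steady head offset of 10 maps to 10 * 1.7
```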

[13] User performance with trackball-mice / Isokoski, Poika / Raisamo, Roope / Martin, Benoît / Evreinov, Grigori Interacting with Computers 2007 v.19 n.3 p.407-427
Keywords: Trackmouse; Optical mouse; Trackball; Fitts' Law; Pointing device; Two-cursor; Two-handed interaction; Dual-stream input
Link to Article at ScienceDirect
Summary: Trackball-mice are devices that include both a trackball and a mouse. In this paper we discuss our experiences in building and testing trackball-mouse prototypes. We report four experiments on user performance with the prototypes used as trackball-mice, conventional mice, and in two-handed configuration with a separate trackball for the non-dominant hand. The results show that user performance with the two-handed configuration was better than in one-handed operation of a trackball-mouse and in one-handed operation of a mouse. Trackball-mouse use and conventional mouse use were more evenly matched. However, trackball-mouse operation involves a skill that most users do not have, whereas mouse operation is familiar to most. Therefore, widespread introduction of trackball-mice does not appear to be justified on performance grounds alone. However, trackball-mice can be used as regular mice by ignoring the ball. This makes them compatible with traditional graphical user interfaces while offering two extra degrees of freedom in tasks where they are beneficial.

[14] Evaluating the Length of Virtual Horizontal Bar Chart Columns Augmented with Wrench and Sound Feedback (Session: People with Disabilities: Materials for Teaching Accessibility and Design for All) / Evreinova, Tatiana G. / Evreinov, Grigori / Raisamo, Roope ICCHP'06: International Conference on Computers Helping People with Special Needs 2006-07-11 p.353-360
Link to Digital Content at Springer
Summary: Augmented visualization of mathematical and scientific data is an essential aid in training blind students' pre-calculus skills. Compared to existing multidimensional wrench-reflection interfaces, a one-dimensional stylus-based interaction concept could support blind users with reasonable feedback in different tasks. We designed a mock-up of a cable-suspended haptic interface and a match game-like piece of software to investigate the perception of the length of virtual horizontal bar chart columns augmented with wrench and sound feedback. The performance of eight blindfolded subjects was evaluated in terms of the number of repeated inspections needed to detect twin chart columns of similar length, and the task completion time required to perform the chart inspection. The experience acquired within simulated gameplay conditions with the implemented cable-suspended interface can be applied in developing novel didactic tools for training blind students in estimating linear dimensions of simulated objects.

[15] Blind and Visually Impaired People: Human Computer Interface / Evreinov, Grigori ICCHP'06: International Conference on Computers Helping People with Special Needs 2006-07-11 p.1029-1030
Link to Digital Content at Springer
Summary: For over ten years, the human-computer interface, blind interaction, and the integration of visually impaired users with sighted users have been the key issues of equal access to information and services. Vast research on alternative visualization, augmented communication, user-centered design, and usability has been done, and many more projects and solutions are under development. However, several generations of graphical interfaces (Xerox, Apple, Microsoft) have brought little or no benefit to blind users. Some elderly people still recall the times of DOS and the command line, when both the system and application software levels were almost equally accessible. Nowadays, multi-processor operating systems are extremely complex and perform hundreds of routine tasks which need not be supervised or adapted for user control at all.

[16] The Amodal Communication System Through an Extended Directional Input (Session: Blind and Visually Impaired People: Human Computer Interface) / Yfantidis, Georgios / Evreinov, Grigori ICCHP'06: International Conference on Computers Helping People with Special Needs 2006-07-11 p.1079-1086
Link to Digital Content at Springer
Summary: Multi-modal interfaces have been overflowing HCI research, incorporating the different senses, to provide adequate feedback or input for human-device interaction. The plethora of sensory combinations that this "creeping multimodalism" implies seems to be creating an oxymoron when it is used as a solution to help people with sensory problems and/or limitations dealing with interfaces. A better solution for those people would be to use systems where the traditional senses are obsolete as driving factors of the interaction, and they are only used as peripheral aids. The quest for such an amodal user experience is the object of our current research.

[17] An alternative approach to strengthening tactile memory for sensory disabled people (Long Paper) / Evreinova, Tatiana G. / Evreinov, Grigori / Raisamo, Roope Universal Access in the Information Society 2006 v.5 n.2 p.189-198
Keywords: Sensory disabled; Hearing-impaired; Game training methodology; Vibro-tactile feedback; Tactile memory; Tactons
Link to Digital Content at Springer
Summary: Deaf and hearing-impaired people need special educational and developmental tools to support their social inclusion. Research in vibro-tactile pattern perception has shown that tactile memory could be a crucial aspect in coding and imaging semantic information for users with sensory limitations. This paper describes a simple matching game designed to facilitate the learning process of 27 vibro-tactile composite patterns (tactons) which can be produced with the Logitech tactile feedback mouse. The underlying assumption was that a particular framework and game intrigue would induce a player to mobilize the perceptive skills and deploy individual playing tactics to recall the tactons when progressing through the game. The performance of ten subjects using soundproof headphones was investigated in terms of the number of repetitions required to memorize and learn the mono-frequency, bi-frequency and three-frequency tactons, and in terms of the selection time needed to match the tactons in the game script. The analysis of the data collected indicated that the novice-to-expert transition was significantly above chance when the results obtained in the first and the last test sessions were statistically analyzed and compared. There was also a significant difference between mean selection times needed to match the composite patterns depending on their complexity in the first and the last test sessions. Upon learning and training within the game, the tactons may be employed to assign alphabet characters or symbols to communicate textual or symbolic information.

[18] Rapid Evaluation of the Handwriting Performance for Gesture Based Text Input (Session: Gesture Interaction in Multimodal Systems) / Evreinov, Grigori E. / Raisamo, Roope GW 2005: Gesture Workshop 2005-05-18 p.339-342
Link to Digital Content at Springer
Summary: A rapid method for evaluating pen-based text input techniques is necessary both for designers and consumers. We present a method based on an immediate performance comparison of gesture making, using graphic templates of typefaces and pen-based behavioral patterns. The results showed that, besides the cognitive difficulty of symbolic gestures, metaphors, and mnemonics, it is first and foremost graphic feasibility that determines the handwriting performance of gesture-based input techniques.

[19] Mobile Games for Training Tactile Perception / Evreinov, Grigori / Evreinova, Tatiana / Raisamo, Roope Proceedings of the 2004 International Conference on Entertainment Computing 2004-09-01 p.468-475
Link to Digital Content at Springer
Summary: Tactile interactive multimedia offers a wide spectrum of developmental games for both visually impaired children and adults. While some simulators can produce strong vibro-tactile sensations, the discrimination of several tactile patterns remains quite poor. Skin sensitivity alone is not enough for remembering and recognizing vibration patterns (tactons) and their combinations. Short-term tactile memory is a crucial factor in educational and vocational environments for deaf and blind people. We designed a vibro-tactile pen and software to create tactons and semantic sequences of vibro-tactile patterns on mobile devices (iPAQ Pocket PC). We propose special games to facilitate learning and manipulation of tactons. The techniques are based on gesture recognition and spatial-temporal mapping for imaging vibro-tactile signals. The proposed approach and the tools implemented allow creating a new kind of mobile communication environment for deaf and blind people.

[20] Manipulating Vibro-Tactile Sequences on Mobile PC (Session: Ubiquitous Computing) / Evreinov, Grigori / Evreinova, Tatiana / Raisamo, Roope 2004 Engineering for Human-Computer Interaction 2004-07-11 p.245-252
Link to Digital Content at Springer
Summary: Tactile memory is a crucial factor in coding and transferring semantic information through a single vibrator. While some simulators can produce strong vibro-tactile sensations, discrimination of several tactile patterns can remain quite poor. Currently used actuators, such as shaking motors, also have technological and methodological restrictions. We designed a vibro-tactile pen and software to create tactons and semantic sequences of vibro-tactile patterns on mobile devices (iPAQ Pocket PC). We proposed special games and techniques to simplify learning and manipulating vibro-tactile patterns. The technique for manipulating vibro-tactile sequences is based on gesture recognition and spatial-temporal mapping for imaging vibro-tactile signals. After training, the tactons could be used as awareness cues or as a system of non-verbal communication signals.

[21] Java-Powered Braille Slate Talker (Session: Blind People: Braille Interfaces) / Arato, A. / Juhasz, Z. / Blenkhorn, P. / Evans, G. / Evreinov, G. ICCHP'04: International Conference on Computers Helping People with Special Needs 2004-07-07 p.506-513
Link to Digital Content at Springer
Summary: A new device, the Braille Slate Talker, is introduced. An ordinary handheld device (a PDA) is used with a fixed-layout plastic guide placed over the touch screen to allow Braille input. Contracted Braille is converted to text by a table-driven state machine. Programs are written in the Java language to provide full hardware and software platform independence. Future network applications will use Sun's Jini technology.
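A table-driven state machine for converting contracted Braille to text, as the summary describes, can be sketched like this; the states, cells, and contractions below are a tiny hypothetical fragment for illustration, not the device's actual table (which would cover a full contracted-Braille code).

```python
# Tiny illustrative transition table for a contracted-Braille decoder.
# Each key maps (state, cell) -> (next_state, emitted_text).
TABLE = {
    ("start", "⠮"): ("start", "the"),   # single-cell contraction
    ("start", "⠰"): ("prefix", ""),     # first cell of a two-cell contraction
    ("prefix", "⠙"): ("start", "day"),  # completes the two-cell contraction
}

def decode(cells):
    """Run the state machine over a sequence of Braille cells."""
    state, out = "start", []
    for cell in cells:
        # Unknown (state, cell) pairs fall back to emitting the cell as-is.
        state, text = TABLE.get((state, cell), ("start", cell))
        out.append(text)
    return "".join(out)

assert decode(["⠮"]) == "the"
assert decode(["⠰", "⠙"]) == "day"
```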

[22] Optimizing Menu Selection Process for Single-Switch Manipulation (Session: Mobility Impaired People: Rehabilitation and Health Care) / Evreinov, Grigori / Raisamo, Roope ICCHP'04: International Conference on Computers Helping People with Special Needs 2004-07-07 p.836-844
Link to Digital Content at Springer
Summary: Single-switch manipulation is considered as a model for optimizing a menu selection task for physically challenged users. We have applied a short-cyclic hierarchical structure with three levels and three alternatives as a basic layout for symbol input and imaging. A user can make use of the triple-stroke or the long-stroke technique, in which the button is held down for an extended period. This allows the user to jump over one of the menu levels or to cut the cycle short. We designed an algorithm for an adaptive scan interval and applied it to text entry. The long-stroke technique significantly reduces the number of strokes and increases typing speed. Preliminary tests with able-bodied participants showed an average typing speed of more than 20 signs per minute after one hour of training. An adaptive scan interval could be useful for applications that require periodic time correction depending on user performance. The algorithm for the adaptive scan interval and related issues are considered in detail.
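An adaptive scan interval of the kind described can be sketched as follows; the paper's exact update rule is not given in the summary, so the multiplicative steps and bounds here are assumptions for illustration only.

```python
# Sketch of an adaptive scan interval for single-switch scanning:
# speed up after timely selections, slow down after missed cycles.
MIN_MS, MAX_MS = 300, 1500  # assumed clamping bounds, in milliseconds

def adapt(interval_ms, selected_in_time):
    """Return the next scan interval given the outcome of the last cycle."""
    if selected_in_time:
        interval_ms = int(interval_ms * 0.9)   # shorten for a skilled user
    else:
        interval_ms = int(interval_ms * 1.2)   # lengthen after a miss
    return max(MIN_MS, min(MAX_MS, interval_ms))

interval = 1000
interval = adapt(interval, True)    # timely selection -> 900 ms
interval = adapt(interval, False)   # missed cycle -> 1080 ms
assert interval == 1080
```

The multiplicative update keeps the correction proportional to the current pace, and the clamp prevents the scanner from becoming unusably fast or slow.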

[23] Symbol Creator: An Alternative Eye-based Text Entry Technique with Low Demand for Screen Space (Session 2: Eye Tracking) / Miniotas, Darius / Spakov, Oleg / Evreinov, Grigori Proceedings of IFIP INTERACT'03: Human-Computer Interaction 2003-09-01 p.137
[24] Cyclic Input of Characters through a Single Button Manipulation (Session: Typing -- Alternative and Augmentative Communication) / Evreinov, Grigori / Raisamo, Roope ICCHP'02: International Conference on Computers Helping People with Special Needs 2002-07-15 p.259-266
Link to Digital Content at Springer
Summary: An alternative text-input method is considered as a model for a menu selection task through manipulation of a single button. A traditional seven-segment display element was used as a layout for symbol input and imaging. Each of the segments was lighted in a temporal sequence, and the writer could choose a segment by pressing the button. Any switch or similar signal may be used instead of the button, and visual imaging may be substituted by sounds. When all segments have been cycled, the result is interpreted as a character according to a set of rules and the character set used. A physically impaired person could use the method to control a computer or other electronic device. The rationale for the design and the results of a preliminary evaluation are presented.
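The cyclic seven-segment input can be sketched as follows; the segment order follows the conventional a-g seven-segment naming, and the pattern-to-character rules are an illustrative fragment, not the paper's actual rule set.

```python
# Segments lit one by one; the user presses the button while wanted
# segments are lit, and the final set of chosen segments is looked up.
SEGMENTS = "abcdefg"

# frozenset of chosen segments -> character (hypothetical fragment)
PATTERNS = {
    frozenset("abcdef"): "0",
    frozenset("bc"): "1",
    frozenset("abdeg"): "2",
}

def decode_cycle(pressed):
    """pressed[i] is True if the button was down while segment i was lit."""
    chosen = frozenset(s for s, p in zip(SEGMENTS, pressed) if p)
    return PATTERNS.get(chosen, "?")

# Selecting only segments b and c during the cycle yields '1'.
assert decode_cycle([False, True, True, False, False, False, False]) == "1"
```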

[25] Isomorphic sonification of spatial relations / Edwards, A. D. N. / Evreinov, G. E. / Agranovski, A. V. Proceedings of the Eighth International Conference on Human-Computer Interaction 1999-08-22 v.1 p.526-530