Towards Automatically Generated Tactile Detail Maps by 3D Printers for Blind Persons | | BIBAK | Full-Text | 1-7 | |
Timo Götzelmann; Aleksander Pavkovic | |||
This paper introduces an approach for the (semi)automatic generation of
worldwide available, detailed tactile maps including buildings and
blind-specific features based on recognized illustrators' guidelines and
standards. These guidelines for tactile maps are investigated in order to
define a formal rule set and to automatically filter map data accordingly.
Using the rule set, our approach automatically abstracts map data in order to
generate a 2.1D tactile model providing multiple height levels (layers) which
can be printed by common consumer 3D printers. Based on the popular
OpenStreetMap data, our automated approach allows arbitrary detail maps of
locations that blind persons are individually interested in to be generated
without the need for manual adaptation of the tactile map. Thus, this approach
contributes to the goal of increasing the autonomy of blind persons. Keywords: Tactile Maps; Layering; 2.1D; Worldwide; Blind; Orientation; Accessibility;
Haptic; Braille; 3D Printer; OpenStreetMap |
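The abstract describes deriving a formal rule set from tactile-map guidelines and using it to filter OpenStreetMap features onto a small number of height layers (2.1D). A minimal sketch of how such a rule set might be applied is shown below; the tag names are real OSM keys, but the layer heights, the rule ordering and the `assign_layer` helper are illustrative assumptions, not the authors' actual rules.

```python
# Illustrative sketch: map OpenStreetMap features to tactile height layers (2.1D).
# The layer values and rules below are assumptions for demonstration only.

LAYER_RULES = [
    # (predicate over OSM tags, height layer in mm above the base plate)
    (lambda tags: tags.get("building") is not None,           2.4),  # buildings: highest
    (lambda tags: tags.get("highway") in {"footway", "path"}, 1.6),  # pedestrian ways
    (lambda tags: tags.get("highway") is not None,             0.8), # other roads
    (lambda tags: tags.get("tactile_paving") == "yes",          1.2), # blind-specific feature
]

def assign_layer(tags: dict) -> float | None:
    """Return the height layer for a feature, or None to filter it out."""
    for rule, height in LAYER_RULES:
        if rule(tags):
            return height
    return None  # feature is not rendered on the tactile map

# Example: a footway with tactile paving is kept on the 1.6 mm layer.
print(assign_layer({"highway": "footway", "tactile_paving": "yes"}))  # 1.6
```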
Opportunities and Limitations of Haptic Technologies for Non-visual Access to 2D and 3D Graphics | | BIBAK | Full-Text | 8-11 | |
Helen Sullivan; Shrirang Sahasrabudhe; Jukka Liimatainen; Markku Hakkinen | |||
Existing and emerging haptic technologies offer methods for non-visually
rendering and interacting with 2D and 3D graphical information. These
technologies include force feedback devices, touch surfaces with vibrotactile
feedback, wearable vibrotactiles, and touch surfaces with electrostatic
feedback. In this paper we will focus on approaches to non-visual access to 3D
shapes. The interactive models focus on two approaches: simulation of 3D shape
and perspective on a 2D touch surface; and interactive exploration of 3D shapes
using physical motion in a virtual 3D space with either a force feedback
controller or wearable haptics. The technologies will be reviewed along with
their suitability for use by students with visual impairments. Methodology and
results from an ongoing series of exploratory usability studies will be
discussed. Benefits and limitations of the technologies and recommendations for
further research will be presented. Keywords: Tactiles; Haptics; non-Visual Displays; Tablets; STEM |
Do Blind Subjects Differ from Sighted Subjects When Exploring Virtual Tactile Maps? | | BIBAK | Full-Text | 12-17 | |
Mariacarla Memeo; Claudio Campus; Luca Brayda | |||
The access to graphical information is difficult for individuals who are
blind or visually impaired. Taking advantage of the residual sensory abilities
such as touch is one way to solve this issue. However, it is not yet clear if
blind subjects perceive new tacto-spatial information in the same way that
sighted people do. In this work we code the discovery of unknown tactile
virtual objects in terms of subjective and behavioral variables, which turn out
to be independent of visual deprivation and dependent only on task difficulty.
Our methodology can be employed in educational, orientation and mobility
protocols. Keywords: Blind; Visually Impaired; Haptic; Tactile; Cognition; Workload |
Development of Synchronized CUI and GUI for Universal Design Tactile Graphics Production System BPLOT3 | | BIBAK | Full-Text | 18-25 | |
Mamoru Fujiyoshi; Akio Fujiyoshi; Akiko Osawa; Yusuke Kuroda; Yuta Sasaki | |||
Synchronized CUI and GUI are developed for the universal design tactile
graphics production system BPLOT3. BPLOT is the first tactile graphics
production system for the blind that enables the blind to produce tactile
graphics by themselves. With the new synchronized CUI and GUI of BPLOT3, the
blind and the sighted can collaboratively produce tactile graphics.
Proofreading of tactile graphics by a blind person is necessary in order to
produce elaborate tactile graphics which can be used in textbooks or questions
of entrance examinations. Because a blind person can modify tactile graphics
independently with BPLOT3, it will be a powerful tool. Keywords: Blind; Tactile Graphics; Universal Design; User Interface |
Production of Accessible Tactile Graphics | | BIBAK | Full-Text | 26-33 | |
Denise Prescher; Jens Bornschein; Gerhard Weber | |||
To allow blind and visually impaired users to participate in learning
visualized concepts and ideas, it is important to provide them not only with
text but also with graphics. As manually transcribing graphics is
time-consuming and requires expertise, we need a better understanding of the
decision-making process leading to the support of alternative descriptions and
materials for tactile exploration. We performed two surveys, the first one on
current practices used for the production of accessible graphics in Germany,
the second one on user experiences in exploring and constructing tactile
graphics. As a result, we have defined some requirements for enhancing the
production of accessible tactile graphics by a software tool that not only
supports the creation of image masters and descriptions, but also includes
blind users in the editing process. Keywords: Tactile Graphics; Image Descriptions; Textbook Production; Accessible
Distribution; Transcription Process; Survey; Visually Impaired Users |
Edutactile -- A Tool for Rapid Generation of Accurate Guideline-Compliant Tactile Graphics for Science and Mathematics | | BIBAK | Full-Text | 34-41 | |
Mrinal Mech; Kunal Kwatra; Supriya Das; Piyush Chanana; Rohan Paul; M. Balakrishnan | |||
In this paper the authors have presented the design and implementation of
Edutactile, a cross-platform software which automates the process of creation
of tactile diagrams. Edutactile provides for automated application of
guidelines or presets as well as Braille translation and thus abstracts away
the production-related issues. This relieves special educators for the visually
challenged from having to learn the workings of the graphics editing software
(Photoshop, CorelDraw) currently used to produce tactile graphics, and lets
them instead focus on the content of the diagram. Keywords: Visually Challenged Students; Tactile Graphics; Mathematical and Scientific
Diagrams; Special Educators |
Tactile Map Automated Creation System Using OpenStreetMap | | BIBAK | Full-Text | 42-49 | |
Tetsuya Watanabe; Toshimitsu Yamaguchi; Satoko Koda; Kazunori Minatani | |||
We have developed a Web-based tactile map automated creation system tmacs.
Users simply type in an address or the name of a building and the system
instantly creates an image of a tactile map, which is then printed on capsule
paper and raised up by a heater. We have now modified this system to work with
OpenStreetMap (OSM). The advantage of using OSM data is that tmacs becomes able
to create tactile maps of any location in the world and to include information
useful for blind people, such as tactile paving. Another feature of the new
system is that sighted users can change the center point and scale of a tactile
map in the same way as with a regular Google Map. We are exploring the
possibility of increasing the number of countries whose tactile maps can be
created with tmacs. Keywords: Blind People; Tactile Map; Tactile Perception; OpenStreetMap; Automated
Creation |
Narrative Map Augmentation with Automated Landmark Extraction and Path Inference | | BIBA | Full-Text | 50-53 | |
Vladimir Kulyukin; Thimma Reddy | |||
Various technologies, including GPS, Wi-Fi localization, and infrared beacons, have been proposed to increase travel independence for visually impaired (VI) and blind travelers. Such systems take readings from sensors, localize those readings on a map, and instruct VI travelers where to move next. Unfortunately, sensor readings can be noisy or absent, which decreases the traveler's situational awareness. However, localization technologies can be augmented with solutions that put the traveler's cognition to use. One such solution is narrative maps, i.e., verbal descriptions of environments produced by O&M professionals for blind travelers. The production of narrative maps is costly, because O&M professionals must travel to designated environments and describe large numbers of routes. Complete narrative coverage may not be feasible due to the sheer size of many environments. But, the quality of produced narrative maps can be improved by automated landmark extraction and path inference. In this paper, an algorithm is proposed that uses scalable natural language processing (NLP) techniques to extract landmarks and their connectivity from verbal route descriptions. Extracted landmarks can be subsequently annotated with sensor readings, used to find new routes, or track the traveler's progress on different routes. |
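The abstract proposes extracting landmarks and their connectivity from verbal route descriptions with scalable NLP. The sketch below illustrates the general idea with a deliberately naive heuristic (prepositional phrases as landmarks, narrative order as connectivity); the pattern, the example route text and the graph structure are assumptions for illustration, not the paper's algorithm.

```python
# Minimal sketch of landmark extraction and path inference from a verbal route
# description. The pattern list and graph structure are illustrative assumptions;
# the paper uses scalable NLP techniques, not this exact heuristic.
import re
from collections import defaultdict

ROUTE = ("Exit the elevator and walk past the vending machines to the library. "
         "From the library, turn left at the information desk toward the cafeteria.")

# Naive heuristic: landmarks are noun phrases following spatial prepositions.
PATTERN = re.compile(
    r"\b(?:past|to|at|toward|from)\s+the\s+([a-z ]+?)(?=[,.]|\s+(?:to|toward|at)\b)")

landmarks = [m.strip() for m in PATTERN.findall(ROUTE.lower())]

# Path inference: consecutive landmarks in the narrative are assumed connected.
graph = defaultdict(set)
for a, b in zip(landmarks, landmarks[1:]):
    if a != b:
        graph[a].add(b)

print(landmarks)
print(dict(graph))
```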
The Mobile Travel Assistance System NAMO with Way-Finding Support in Public Transport Environments | | BIBA | Full-Text | 54-57 | |
Christian Bühler; Helmut Heck; Annika Nietzio; Frank Reins | |||
Many older people rely on public transport to maintain their personal mobility and thus quality of life. However, problems may arise in unfamiliar environments or during unexpected events. Especially when changing trains in complex stations, many people experience orientation problems or feel insecure and overwhelmed. The namo travel assistant combines technical and human support during the journey. The users can choose the presentation of the information which suits them most: The application offers photos with directional arrows, station plans with marked paths, and contact to a service hotline to get direct support. In this way, namo helps maintain personal mobility in old age while offering an increased sense of security. |
A Mobile Guidance Platform for Public Transportation | | BIBAK | Full-Text | 58-64 | |
Reinhard Koutny; Peter Heumader; Klaus Miesenberger | |||
This paper presents an approach which allows people with disabilities to use
public transportation more effectively by supporting them throughout the whole
journey. Besides the common feature set, like offering timetable information
and planning trips consisting of multiple rides, it additionally includes
information on when to get on or off a vehicle and performs route re-planning in
the case of unexpected events like delays. Moreover, it provides information
particularly important for people with disabilities, like wheelchair users or
blind persons. Depending on the user profile, information regarding the
accessibility of vehicles and also routing information for footpaths are
delivered in real-time, which is especially important at major transfer points
like railway stations where routes tailored to the user's capabilities are
provided. As it cannot be guaranteed that every footpath and every obstacle is
charted and up-to-date, users can improve routing information on their own in a
crowdsourcing-based approach. Keywords: Public Transport; Assistive Technology; Navigation; Blind Person; Wheelchair
User |
FB-Finger: Development of a Novel Electric Travel Aid with a Unique Haptic Interface | | BIBAK | Full-Text | 65-72 | |
Kiyohide Ito; Yoshiharu Fujimoto; Ryoko Otsuki; Yuka Niiyama; Akihiro Masatani; Takanori Komatsu; Junichi Akita; Tetsuo Ono; Makoto Okamoto | |||
We developed a unique haptic interface, the "FB-Finger," which enables users
to detect the distance to an object. When a user holds the FB-Finger and places
his/her forefinger on a link, the finger bends or extends depending on the
link's angular motion (which corresponds to the metric distance between the
user and the object). We expected the FB-Finger to provide more accurate
distance estimation than similar commercial electric travel aids. To test this
hypothesis, we conducted psychological experiments with blindfolded sighted
participants who were asked to make distance estimations in conditions using
three different devices. Results revealed that the FB-Finger allowed
participants to make more accurate judgments compared to the other devices.
These findings suggest that using the FB-Finger provides significant potential
for ETA application among visually impaired individuals. Keywords: Haptic Interface; Electric Travel Aid; Perception |
Open Accessibility Data Interlinking | | BIBAK | Full-Text | 73-80 | |
Chaohai Ding; Mike Wald; Gary Wills | |||
This paper presents research on using Linked Open Data to enhance
accessibility data for accessible travelling. Open accessibility data is data
on accessibility issues associated with geographical data, which could benefit
people with disabilities and address their special needs. With the
aim of addressing the gap between users' special needs and data, this paper
presents the results of a survey of open accessibility data retrieved from four
different sources in the UK. An ontology based data integration approach is
proposed to interlink these datasets together to generate a linked open
accessibility repository, which also links to other resources on the Linked
Data Cloud. As a result, this research would not only enrich the open
accessibility data, but also contribute to a novel framework to address
accessibility information barriers by establishing a linked data repository for
publishing, linking and consuming the open accessibility data. Keywords: Linked Data; Open Accessibility Data; Information Retrieval; Data
Interlinking |
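The abstract describes an ontology-based integration that interlinks accessibility records from several sources into a linked open repository. Below is a hedged sketch of what such a record and an interlink could look like as RDF, assuming the rdflib library; the `ex:` vocabulary, the place, and the `owl:sameAs` target are invented for illustration and do not reproduce the paper's ontology.

```python
# Sketch of interlinking accessibility records as RDF, assuming rdflib is
# installed (pip install rdflib). The vocabulary (ex:) and the sameAs link are
# illustrative; the paper's actual ontology is not reproduced here.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import OWL, RDF, RDFS

EX = Namespace("http://example.org/accessibility/")
g = Graph()
g.bind("ex", EX)

station = URIRef(EX["station/central"])
g.add((station, RDF.type, EX.AccessiblePlace))
g.add((station, RDFS.label, Literal("Central Station")))
g.add((station, EX.hasStepFreeAccess, Literal(True)))

# Interlink with the same place described in another open dataset.
g.add((station, OWL.sameAs, URIRef("http://example.org/otherdataset/places/123")))

print(g.serialize(format="turtle"))
```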
Pre-journey Visualization of Travel Routes for the Blind on Refreshable Interactive Tactile Displays | | BIBAK | Full-Text | 81-88 | |
Mihail Ivanchev; Francis Zinke; Ulrike Lucke | |||
In this paper we report on our continuing research on an audio-tactile
system for visualizing travel routes on modern interactive refreshable tactile
displays for blind users. The system is especially well suited for pre-journey
route learning. Similar to systems for sighted users, e.g. online map services
like Google Maps, we utilize an audio-tactile interactive map based on a
concept from third-party research work and freely available geographic data.
The system was implemented as a prototype for a touch-sensitive tactile
display. Our main research interest is to explore audio-tactile concepts for
displaying routes on a slippy map. We therefore developed a catalogue of ideas
currently featuring tactile textures and indications for the route's course,
waypoint symbols, audio indications etc. We summarize the results of an initial
user test which indicates that the route visualization with our set of
strategies is feasible and justifies further research. Keywords: GIS; Accessible Geographic Routes; Visually Impaired; Blind |
Road Information Collection and Sharing System Based on Social Framework | | BIBAK | Full-Text | 89-91 | |
Takatoshi Suenaga | |||
Walking is an important factor in good health, and people derive many
benefits from travelling on foot. However, walking entails risks such as
traffic accidents and falls. If people recognize specific risks before walking,
then they may avoid such accidents. This paper proposes a road information
collection and sharing tool for the public. The proposed system stores passive
risks from the properties of the landscape and active risks identified by
people. Moreover, it provides an easy way to access such risk information. When
people know and avoid these risks, they will be able to walk safely. Keywords: Road Information; Landscape Gradient; Word of Mouth; Social Framework;
Walking Support |
Waypoint Validation Strategies in Assisted Navigation for Visually Impaired Pedestrian | | BIBAK | Full-Text | 92-99 | |
Slim Kammoun; Marc J-M. Macé; Christophe Jouffrais | |||
In Electronic Orientation Aids, the guidance process consists of two steps:
first, identify the location of a visually impaired user along the expected
trajectory, and second, provide her/him with appropriate instructions on
directions to follow, and pertinent information about the surroundings. In
urban environments, positioning accuracy is not always optimal and tracking the
user's progress along the expected itinerary is often challenging. We present
three new waypoint-based validation strategies to track the user's location
despite low positioning accuracy. These strategies are evaluated within
SIMU4NAV, a multimodal virtual environment subserving the design of Electronic
Orientation Aids for visually impaired people. Results show that the proposed
strategies are more robust to positioning inaccuracies, and hence more
efficient at guiding users. Keywords: Assisted Navigation; Guidance; Virtual Environment; Assistive Technology;
Wayfinding |
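The abstract evaluates waypoint-based validation strategies that tolerate low positioning accuracy. The sketch below shows one plausible strategy of this kind (validating a waypoint when the position fix enters a radius widened by the reported GPS accuracy); it is an assumption for illustration, not one of the paper's three strategies.

```python
# Minimal sketch of one plausible waypoint validation strategy: a waypoint is
# considered reached when the user enters a radius widened by the current GPS
# accuracy. This is an illustrative assumption, not the paper's method.
from dataclasses import dataclass
import math

@dataclass
class Waypoint:
    x: float  # metres, local planar coordinates
    y: float

def reached(position: tuple[float, float], wp: Waypoint,
            base_radius_m: float = 5.0, gps_accuracy_m: float = 0.0) -> bool:
    """Validate the waypoint if the position is within an accuracy-adjusted radius."""
    dist = math.hypot(position[0] - wp.x, position[1] - wp.y)
    return dist <= base_radius_m + gps_accuracy_m

route = [Waypoint(0, 0), Waypoint(30, 0), Waypoint(30, 40)]
current = 0
for fix, acc in [((2, 1), 3), ((28, 2), 8), ((31, 39), 4)]:
    if current < len(route) and reached(fix, route[current], gps_accuracy_m=acc):
        current += 1
print(f"validated {current} of {len(route)} waypoints")
```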
ARGUS Autonomous Navigation System for People with Visual Impairments | | BIBAK | Full-Text | 100-107 | |
Eduardo Carrasco; Estíbaliz Loyo; Oihana Otaegui; Claudia Fösleitner; Markus Dubielzig; Rafael Olmedo; Wolfgang Wasserburger; John Spiller | |||
This work addresses the challenge of designing an effective, reliable and
affordable autonomous navigation system for blind and visually impaired people
which also covers journey planning and post journey activities (such as
recommendations and experience sharing). The main contribution focuses on the
integration of accurate real-time user positioning data with binaural 3D-audio
guidance techniques on mobile devices and a web-service delivery platform. The
aim is to produce an autonomous navigation system that can be
used to guide targeted users along pre-defined tracks and that can be used also
before and after the journey to carry out several related tasks such as journey
planning, training and sharing of experiences. A preliminary prototype of this
concept has been built and tested with 4 end users in both rural and urban
environments, obtaining encouraging results. Keywords: Blind Navigation; Binaural Audio Guidance; Global Navigation Satellite
Systems (GNSS); Inertial Navigation Systems (INS); Assistive Technology |
A University Indoors Audio-Tactile Mobility Aid for Individuals with Blindness | | BIBAK | Full-Text | 108-115 | |
Konstantinos Papadopoulos; Marialena Barouti; Konstantinos Charitakis | |||
This article presents the development of an Audio-Tactile aid in order to
facilitate and enhance the spatial knowledge as well as the independent and
safe movement of individuals with blindness in the University of Macedonia
indoors. Moreover, the developed aid provides information that helps blind
individuals to obtain a cognitive image of the university indoors, plan routes
they wish to follow and easily identify specific locations and services. The
implementation procedure of the Audio-Tactile navigation system included the
following steps: 1) development of digital maps that include specific spatial
information for people with blindness, 2) production of tactile maps, 3)
research on the readability of the tactile maps by blind individuals and
development of revised tactile maps, 4) development of Audio-Tactile maps and
their connection with touchpad devices, and 5) a study to derive the most
appropriate locations where 10 touchpads will be installed in the university
indoors. Keywords: Mobility; Visual Impairments; Audio-Tactile System |
An OpenStreetMap Editing Interface for Visually Impaired Users Based on Geo-semantic Information | | BIBAK | Full-Text | 116-119 | |
Ahmed El-Safty; Bernhard Schmitz; Thomas Ertl | |||
We present a system for editing OpenStreetMap data, which is based on the
idea that common-sense preconceptions about the world can be encoded
semantically and thus used in conjunction with preexisting data about an area
to predict probable changes. The system can thus reduce the number of
OpenStreetMap tags from which the user can choose. Keywords: OpenStreetMap; Semantic Web; User Interface |
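The abstract describes encoding common-sense preconceptions about the world to shrink the list of OpenStreetMap tags offered to the user. Below is a hedged sketch of that idea as a simple rule lookup over the editing context; the context attributes and rules are invented for illustration and are not the system's actual geo-semantic knowledge base.

```python
# Illustrative sketch: shrink the list of OpenStreetMap tags offered to the
# user by applying common-sense rules about the surrounding area. The rules
# themselves are assumptions, not the system's geo-semantic reasoning.

ALL_TAGS = ["amenity=bench", "highway=bus_stop", "natural=tree",
            "amenity=parking", "entrance=yes", "highway=crossing"]

RULES = {
    # context attribute -> tags that remain plausible in that context
    "inside_building": {"entrance=yes"},
    "near_road":       {"highway=bus_stop", "highway=crossing", "amenity=parking"},
    "in_park":         {"amenity=bench", "natural=tree"},
}

def candidate_tags(context: set[str]) -> list[str]:
    """Keep only tags that at least one active context rule considers plausible."""
    plausible = set().union(*(RULES[c] for c in context if c in RULES))
    return [t for t in ALL_TAGS if t in plausible] or ALL_TAGS  # fall back to all

print(candidate_tags({"near_road"}))
# ['highway=bus_stop', 'amenity=parking', 'highway=crossing']
```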
Individualized Route Planning and Guidance Based on Map Content Transformations | | BIBAK | Full-Text | 120-127 | |
Bernhard Schmitz; Thomas Ertl | |||
We have created a system of rule-based map content transformations that
allows the creation of maps that are a better fit for specific purposes and
user groups than the base material. In this paper we demonstrate the application of the map
content transformations in route planning and route guidance of a navigation
system for specific user groups. We show that it is possible to create maps
that are better suited to these tasks than the material on which they are
based. Keywords: Map Content Transformations; OpenStreetMap |
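The abstract describes rule-based transformations that turn base map material into maps better suited to a specific user group. The sketch below illustrates the general shape of such a transformation for one hypothetical user group (wheelchair routing); the rule table and the output classification are assumptions, not the authors' transformation rules.

```python
# Hedged sketch of a rule-based map content transformation: OSM-style way tags
# are rewritten into a simplified classification for one user group. The rule
# table is an assumption for illustration only.

WHEELCHAIR_RULES = [
    (lambda t: t.get("wheelchair") == "no",                    "avoid"),
    (lambda t: t.get("highway") == "steps",                    "avoid"),
    (lambda t: t.get("surface") in {"cobblestone", "gravel"},  "difficult"),
    (lambda t: True,                                           "usable"),
]

def transform(way_tags: dict) -> dict:
    """Replace detailed tags with a single routing class for the target group."""
    for predicate, routing_class in WHEELCHAIR_RULES:
        if predicate(way_tags):
            return {"routing_class": routing_class}
    return way_tags

print(transform({"highway": "footway", "surface": "gravel"}))  # {'routing_class': 'difficult'}
print(transform({"highway": "steps"}))                          # {'routing_class': 'avoid'}
```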
Cognitive Evaluation of Haptic and Audio Feedback in Short Range Navigation Tasks | | BIBAK | Full-Text | 128-135 | |
Manuel Martinez; Angela Constantinescu; Boris Schauerte; Daniel Koester; Rainer Stiefelhagen | |||
Assistive navigation systems for the blind commonly use speech to convey
directions to their users. However, this is problematic for short range
navigation systems that need to provide fine but diligent guidance in order to
avoid obstacles. For this task, we have compared haptic and audio feedback
systems under the NASA-TLX protocol to analyze the additional cognitive load
that they place on users. Both systems are able to guide the users through a
test obstacle course. However, for white cane users, auditory feedback results
in a 22 times higher cognitive load than haptic feedback. This discrepancy in
cognitive load was not found in blindfolded users; thus, we argue against
evaluating navigation systems solely with blindfolded users. Keywords: Sonification; Haptics; Navigation; Assistive System; Blind |
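The comparison is based on the NASA-TLX protocol. As a reference for how that protocol condenses raw ratings into a single workload figure, a minimal sketch of the standard weighted NASA-TLX score follows; the ratings and pairwise weights are made-up illustrative values, not data from the study.

```python
# Sketch of the standard NASA-TLX weighted workload score: six subscale ratings
# (0-100) are weighted by how often each dimension was chosen in the 15
# pairwise comparisons. The sample values below are made up for illustration.
RATINGS = {  # 0 (low) .. 100 (high), one per TLX dimension
    "mental": 55, "physical": 20, "temporal": 40,
    "performance": 35, "effort": 60, "frustration": 45,
}
WEIGHTS = {  # times each dimension won a pairwise comparison; must sum to 15
    "mental": 4, "physical": 1, "temporal": 2,
    "performance": 3, "effort": 3, "frustration": 2,
}
assert sum(WEIGHTS.values()) == 15

overall = sum(RATINGS[d] * WEIGHTS[d] for d in RATINGS) / 15
print(f"overall weighted workload: {overall:.1f}")
```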
Unlocking Physical World Accessibility through ICT: A SWOT Analysis | | BIBA | Full-Text | 136-143 | |
Christophe Ponsard; Vincent Snoeck | |||
Despite progress in awareness and increasing electronic availability of accessibility information, getting a clear picture of physical accessibility of an infrastructure or journey remains an uncertain task. Over the past few years, a number of emerging technologies have gained maturity and adoption. Some examples are smartphones, open data, social networks, and routing engines. They are also triggering societal shifts about the way people interact together through technology. The purpose of this paper is to analyse how these technologies can positively or negatively impact the evolution of physical accessibility by using a SWOT (Strengths-Weaknesses-Opportunities-Threats) approach. |
Personalized Smart Environments to Increase Inclusion of People with Down's Syndrome | | BIBAK | Full-Text | 144-147 | |
Eva Schulze; Anna Zirk | |||
POSEIDON aims at developing a tablet app for people with Down's Syndrome
(DS) to become more independent and integrated. It follows a user-centered
approach by involving primary (people with DS) and secondary users (parents,
carers etc.). In order to assess the needs and requirements as well as the
usage of technology of people with DS, an online survey was conducted. Results
indicate that a majority of them use tablets in their daily life. Most of the
carers agree that technical assistants can help to overcome daily challenges
and that there is a need for support in the fields of communication,
socializing and school/work/learning. Important features and design aspects
were mentioned. Keywords: Down's Syndrome; Smart environment; Inclusion; Requirements |
ELDERS-UP! | | BIBAK | Full-Text | 148-151 | |
Salvador Rivas Gil; Víctor Sánchez Martín | |||
Elderly people are sometimes set apart in certain situations because they
are considered less efficient and productive, for example, in the work
environment. For many elderly people, their jobs represent a way of feeling
useful to themselves and to society, and of having goals which keep them
motivated. Although our current society is led by productivity and efficiency
both in professional and personal scenarios, today, information is the key for
efficiency; those who are able to manage the information are the ones that
survive the daily rush without sinking in it. Keywords: Elders; Adaptive User Interface; Engagement; Motivation; Cognitive
Conditions |
An Interactive Robotic System for Human Assistance in Domestic Environments | | BIBAK | Full-Text | 152-155 | |
Manuel Vinagre; Joan Aranda; Alicia Casals | |||
This work introduces an interactive robotic system for assistance, conceived
to tackle some of the challenges that domestic environments impose. The system
is organized into a network of heterogeneous components that share both
physical and logical functions to perform complex tasks. It consists of several
robots for object manipulation, an advanced vision system that supplies
information about objects in the scene and human activity, and a spatial
augmented reality interface that constitutes a comfortable means for
interacting with the system. A first analysis based on users' experiences
confirms the importance of having a friendly user interface. The inclusion of
context awareness from visual perception enriches this interface allowing the
robotic system to become a flexible and proactive assistant. Keywords: Robot Assistance; Human-Robot Interaction; Accessibility; Ambient
Intelligence; Activity Recognition |
RGB-D Video Monitoring System to Assess the Dementia Disease State Based on Recurrent Neural Networks with Parametric Bias Action Recognition and DAFS Index Evaluation | | BIBA | Full-Text | 156-163 | |
Sabrina Iarlori; Francesco Ferracuti; Andrea Giantomassi; Sauro Longhi | |||
By 2050, demographic changes due to the significant increase of the elderly population will represent one of the most important aspects for social assistance and healthcare institutions, particularly in the European Union. Great attention is given to dementia, with over 35 million people worldwide living with this condition, affected by cognitive impairment, frailty and social exclusion, with considerable negative consequences for their independence. Preference will be given to interventions with a high impact on the quality of life of the individual, a condition associated with a socio-economic burden also for the people who care for them. The main challenge comes from the social objective of assisting elderly people and keeping them in their familiar home surroundings, enabling them to "age in place". |
Experiences and Challenges in Designing Non-traditional Interfaces to Enhance the Everyday Life of Children with Intellectual Disabilities | | BIBAK | Full-Text | 164-171 | |
Janio Jadán-Guerrero; Luis A. Guerrero | |||
Experiences with children with disabilities in Ecuador, Costa Rica and Spain
have highlighted the importance of reading for enhancing their daily life
activities, independence and
social integration. This article describes a qualitative study to understand
general issues related to the design of non-traditional technologies for
children with intellectual disabilities. A methodological approach is described
and explained through the results of exploratory surveys and interviews.
According to the information obtained from experts and the method of literacy
acquisition proposed by Troncoso and Del Cerro, the design of a smart kit using
non-traditional user interfaces is presented. A preliminary evaluation of the
first prototype is described. The paper concludes by reflecting upon the
importance of literacy acquisition and the challenges to design non-traditional
interfaces to support learning of children with intellectual disabilities. The
development of phase two of the prototype and its empirical evaluation are part
of future work. Keywords: Non-Traditional User Interfaces; Human-Computer Interaction; Literacy
Acquisition; Children with Intellectual Disabilities; Daily Life Activities |
Implementation of Applications in an Ambient Intelligence Environment: A Structured Approach | | BIBAK | Full-Text | 172-179 | |
Laura Burzagli; Pier Luigi Emiliani | |||
Based on the work in the FOOD project, an approach for the design of an
intelligent environment and the development of applications to favour
independent living is presented. Starting from the definition of activities to
be carried out in the different living environments, the approach is based on
the formalization of information relevant to describe functionalities,
technology and users and the presence of "intelligence" for adapting the
functionalities and their interfaces to individual users. Keywords: Ambient Intelligence; ICT Applications; Artificial Intelligence; Natural
Interfaces |
Applying Small-Keyboard Computer Control to the Real World | | BIBAK | Full-Text | 180-187 | |
Torsten Felzer; I. Scott MacKenzie; Stephan Rinderknecht | |||
This paper presents a usability study for text entry with a new version of
the assistive keyboard replacement OnScreenDualScribe. Over five sessions
(approximately 1 hr/session), three able-bodied novice participants achieved an
entry rate of 13.9 wpm. In a case study, one disabled expert achieved an entry
rate of 6.6 wpm. The main aspects of the software are described and differences
from its predecessor DualScribe are highlighted. Finally, the potential impact of
the system for persons with neuromuscular diseases -- a user group it
particularly accommodates -- is elaborated. Keywords: Human-computer Interaction; Assistive Technology; Word Prediction; Ambiguous
Keyboards; Neuromuscular Diseases; Keyboard Replacement; Mouse Alternative;
Combined Input Device |
Design and Evaluation of Multi-function Scanning System: A Case Study | | BIBAK | Full-Text | 188-194 | |
Frédéric Vella; Damien Sauzin; Frédéric Philippe Truillet; Nadine Vigouroux | |||
In this paper we present an assistive technology for communication and
control for quadriplegic people. To develop this assistive technology, a
user-centered design approach involving the patient, his occupational
therapists and his family was followed. Various iterative versions of the
prototype were defined by means of the SOKEYTO platform to meet the needs and
the abilities of the quadriplegic person. The options implemented and the
successive design choices are reported, as well as the implementation
difficulties. The assistive technology was
used by one quadriplegic person. A qualitative evaluation is also reported. Keywords: Quadriplegic People; Scanning; User Centered Design; Communication;
Environment Control |
Semantic Keyboard: Fast Movements between Keys of a Soft Keyboard | | BIBAK | Full-Text | 195-202 | |
Mathieu Raynal; I. Scott MacKenzie; Bruno Merlin | |||
In this paper we describe Semantic Keyboard: a soft keyboard augmented by
semantic pointing. The cursor crosses faster over keys containing
low-probability letters (considering the prefix already entered). This
optimization reduces the movement of the pointer by 60%, and increases the text
entry speed by 13.5% after the first character in a word. Accuracy is
equivalent to a regular soft keyboard. Keywords: Soft Keyboard; Text Entry; Character Prediction |
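The abstract explains that the cursor crosses faster over keys whose letters are unlikely given the prefix already entered. The sketch below illustrates that idea by shrinking a key's motor-space width with the probability of its letter; the tiny word list and the scaling formula are assumptions, not the paper's implementation.

```python
# Minimal sketch of the semantic-pointing idea: given the prefix already typed,
# keys for unlikely next letters get a smaller motor-space width, so the cursor
# crosses them faster. Word list and scaling formula are illustrative assumptions.
from collections import Counter

WORDS = ["the", "they", "them", "then", "there", "that", "this", "those"]

def next_letter_probs(prefix: str) -> dict[str, float]:
    counts = Counter(w[len(prefix)] for w in WORDS
                     if w.startswith(prefix) and len(w) > len(prefix))
    total = sum(counts.values())
    return {ch: n / total for ch, n in counts.items()} if total else {}

def key_width(letter: str, prefix: str, base_width: float = 1.0) -> float:
    """Scale the key's motor-space width with the letter's probability."""
    p = next_letter_probs(prefix).get(letter, 0.0)
    return base_width * (0.3 + 0.7 * p)  # unlikely keys shrink, likely keys stay large

print(round(key_width("e", "th"), 2))  # frequent continuation -> wider in motor space
print(round(key_width("x", "th"), 2))  # impossible continuation -> minimum width
```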
The Application of Computerized Chinese Handwriting Assessment Tool to Children with Cerebral Palsy | | BIBAK | Full-Text | 203-209 | |
Hui-Shan Lo; Chia-Ling Chen; Hsieh-Ching Chen; I-hsuan Shen; Cecilia W. P. Li-Tzang | |||
The purpose of this research is to assess Chinese handwriting skills of
children with cerebral palsy (CP) with a computerized tool. This tool can
provide immediate information about children's handwriting process and
products. The process parameters record the spatial and temporal
characteristics of handwriting, including the total writing time, on-paper
time, in-air time, their ratio, and writing speed. The production parameter is
the accuracy of handwriting. Fourteen children with CP and 13 typically developing children
participated in this study. The results indicated that children with CP had
significantly lower accuracy rate in Chinese handwriting. In addition, children
with CP also demonstrated longer on-paper time and in-air time in writing
Chinese. Further studies will focus on identifying clinical factors which
result in the handwriting difficulties of children with CP. Keywords: Cerebral palsy; Handwriting |
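The abstract lists the process parameters recorded by the tool (total writing time, on-paper time, in-air time and their ratio). The sketch below shows how such values can be computed from timestamped pen samples; the sample format is an assumption, not the tool's data format.

```python
# Hedged sketch of the handwriting process parameters named in the abstract,
# computed from timestamped pen samples. The sample format is an assumption.

# Each sample: (timestamp in seconds, pen_down flag)
samples = [(0.00, True), (0.40, True), (0.75, False), (1.10, False),
           (1.30, True), (2.05, True), (2.40, False)]

on_paper = in_air = 0.0
for (t0, down), (t1, _) in zip(samples, samples[1:]):
    if down:
        on_paper += t1 - t0
    else:
        in_air += t1 - t0

total = samples[-1][0] - samples[0][0]
print(f"total {total:.2f}s, on-paper {on_paper:.2f}s, "
      f"in-air {in_air:.2f}s, in-air/on-paper ratio {in_air / on_paper:.2f}")
```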
EyeSchool: An Educational Assistive Technology for People with Disabilities -- Passing from Single Actors to Multiple-Actor Environment | | BIBAK | Full-Text | 210-217 | |
Cristina Popescu; Nadine Vigouroux; Mathieu Muratet; Julie Guillot; Petra Vlad; Frédéric Vella; Jawad Hajjam; Sylvie Ervé; Nathalie Louis; Julie Brin; Joseph Colineau; Thierry Hobé; Loïc Brimant | |||
Since 2005, public policy in France has strongly encouraged the inclusion of
young people with disabilities within the regular school system. This has
found a direct application through technical innovation intended to help
students become more independent in their learning activities. In this
context, the purpose of this paper is to underline the manner in which using
assistive information and communication technologies may improve the inclusive
education of people with disabilities. The case study we present underlines
the complexity of the social world into which the use of a particular assistive
tool takes its place. Keywords: Educational Assistive Technology; Notes-taking Tool; Inclusion;
Multiple-actor Environment |
Accessible 4D-Joystick for Remote Controlled Models | | BIBAK | Full-Text | 218-225 | |
David Thaller; Gerhard Nussbaum; Stefan Parker | |||
Presently there are hardly any toys available which can be used by children,
adolescents and adults with severe physical disabilities. A very interesting
group of non-trivial toys are remote controlled (RC) models because the remotes
can be easily substituted with custom ones. Since RC models need very accurate
commands with very low latency in several channels concurrently, a remote
intended for people with severe physical disabilities must fulfil several
requirements. This paper describes and discusses the prototype of a mouth
operated joystick accessible for people with severe physical disabilities to
accurately control RC model helicopters, multicopters, airplanes, boats or
cars. Keywords: Joystick; Assistive Technology; RC models; Non-Trivial Toys |
Development of a Personal Mobility Vehicle to Improve the Quality of Life | | BIBAK | Full-Text | 226-233 | |
Yoshiyuki Takahashi; Masamichi Miura | |||
In today's aging society, the importance of assistance for people with
limited mobility is acknowledged. Therefore, a personal mobility vehicle
(PMV) for people with limited mobility is proposed in this paper. The proposed
PMV is propelled by a kicking motion with power-assisted wheels. It aims to
assist short-distance transportation in urban areas, e.g. moving from home to a
train station. By using a folding mechanism, it will be possible to carry the
vehicle on public transportation, and this will help to extend the area of the
user's activities. In this paper, an overview of the developed PMV and the
results of preliminary experiments are presented. Keywords: Personal Mobility; Limited Mobility; Transportation |
Automated Configuration of Applications for People with Specific Needs | | BIBA | Full-Text | 234-237 | |
Peter Heumader; Reinhard Koutny; Klaus Miesenberger; Karl Kaser | |||
This paper presents an approach to store user settings and abilities in a user profile that can be used to automatically adjust the settings of applications on mobile or desktop devices for people with special needs. The user profile and the settings are determined automatically with a wizard-like application or manually with a carer, and are dispatched to other devices using cloud services. In this way, users with special needs will be able to operate new applications without needing a carer to set up the application for them. |
Visualizing Motion History for Investigating the Voluntary Movement and Cognition of People with Severe and Multiple Disabilities | | BIBAK | Full-Text | 238-243 | |
Mamoru Iwabuchi; Guang Yang; Kimihiko Taniguchi; Syoudai Sano; Takamitsu Aoki; Kenryu Nakamura | |||
Two case studies were conducted with two children with severe physical and
cognitive disabilities in this research, and a computer-vision based technique
called Motion History was applied to visualize their movement. By changing the
conditions of the intervention with the children, Motion History successfully
helped to find their voluntary movements and effective stimuli that attracted
their attention. It was concluded that detecting changes in movement is very
important for extracting voluntary movement and that Motion History is suitable for
that purpose. This gives us a greater possibility of evidence-based interaction
with people with severe and multiple disabilities. Keywords: Motion History; Voluntary Movement; Cognition; Severe and Multiple
Disabilities; OAK |
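The abstract applies a computer-vision technique called Motion History to visualize recent movement. The numpy-only sketch below illustrates the generic motion history image idea (pixels that moved recently are bright, older motion fades); random frames stand in for video input, and the threshold and window values are illustrative assumptions, not the settings used in the case studies.

```python
# Minimal numpy-only sketch of a motion history image (MHI): pixels that moved
# recently are bright, older motion fades out. Random frames stand in for the
# camera input; threshold and window are illustrative values.
import numpy as np

H, W = 120, 160
DURATION = 1.0          # seconds of history kept in the image
THRESHOLD = 30          # minimum per-pixel change counted as motion
FPS = 15.0

rng = np.random.default_rng(0)
mhi = np.zeros((H, W), dtype=np.float32)
prev = rng.integers(0, 256, (H, W)).astype(np.int16)

for frame_idx in range(1, 45):
    t = frame_idx / FPS
    frame = rng.integers(0, 256, (H, W)).astype(np.int16)
    motion_mask = np.abs(frame - prev) > THRESHOLD
    mhi[motion_mask] = t                      # stamp pixels that just moved
    mhi[mhi < t - DURATION] = 0.0             # forget motion older than the window
    prev = frame

# Normalise to 0..255 for display; brighter pixels moved more recently.
display = np.clip((mhi - (t - DURATION)) / DURATION, 0, 1) * 255
print(display.astype(np.uint8).max(), display.astype(np.uint8).min())
```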
A Virtual Reality Training System for Helping Disabled Children to Acquire Skills in Activities of Daily Living | | BIBAK | Full-Text | 244-251 | |
Kup-Sze Choi; King-Hung Lo | |||
Deficiency of hand function presents difficulty to disabled people in
various activities of daily living. While rehabilitation training in
occupational therapy is helpful for them to cope with their deficiency, the
paper presents a virtual-reality-based system in an attempt to provide an
alternative approach to complement the conventional methods. The system
simulates tasks of daily living in virtual environments and produces real-time
interactive graphics and forces to enable trainees to practise the skills in
cyberspace. Currently, three tasks are simulated, namely, door opening, water
pouring and meat cutting. Visual, audio and haptic cues are produced as
guidance in response to user's actions. The performance of the users is
recorded automatically on the fly with quantifiable metrics to enable objective
analysis of the learning progress. Findings from initial trials with disabled
children show that they found it very interesting to use the system and could
adapt to the virtual training environment for practicing the tasks. Further
study will be conducted to improve system usability and to evaluate the
training effectiveness. Keywords: Virtual reality; activities of daily living; haptic device; force feedback;
occupational therapy |
The Possibilities of Kinect as an Access Device for People with Cerebral Palsy | | BIBAK | Full-Text | 252-255 | |
Isabel María Gómez; Alberto Jesús Molina; Rafael Cabrera; David Valenzuela; Marcelo Garrido | |||
Cerebral palsy (CP) is a general term for a group of permanent,
non-progressive movement disorders that cause physical disability in
development, mainly in the areas of body movement, but they might also affect
intellectual capabilities. Among all this diversity of profiles, we find that,
for some of them, access to a computer application is almost impossible in
spite of the great variety of commercial devices based on different
technologies. Kinect might be a viable possibility in order to facilitate
access to games and computer applications that help users improve their skills
or communication. Keywords: Access Device; Kinect; Cerebral Palsy; Middleware Software |
Development of Sit-to-Stand Support System Using Ground Reaction Force | | BIBAK | Full-Text | 256-259 | |
Hidetaka Ikeuchi; Masuji Nagatoshi; Atuyoshi Miura | |||
This paper presents our sit-to-stand support system, which uses the ground
reaction force to operate the device. The computer system controls the
sit-to-stand support mechanism automatically based on information from the
user's ground reaction force (GRF). The user of this device does not need to
operate a switch or button. The paper first presents the first prototype
device and describes the device control rule, the experimental results and the
control problems that were found. Secondly, to solve these problems, an
experimental device for the second prototype design is presented and the
experimental method is described. Finally, preliminary results of these
experiments are shown. Keywords: Sit-to-stand Support; Ground Reaction Force; Human Motion Analysis |
A Critical Review of Eight Years of Research on Technologies for Disabled and Older People | | BIBAK | Full-Text | 260-266 | |
Helen Petrie; Blaíthín Gallagher; Jenny S. Darzentas | |||
This paper presents the initial results of a critical review of recent
research on new and emerging technologies designed for older people and people
with disabilities. The review covers research published between 2005 and 2012
in a range of international peer-reviewed journals and conferences, in the
areas of technology, human-computer interaction, disability, assistive
technologies and gerontology. On the basis of this review of research, we are
exploring what issues for disabled and older people are being addressed by
researchers and developers, whether the research is motivated by user needs,
the methodologies used, and outcomes achieved. Keywords: Older Users; Disabled Users; Assistive Technology; New Technology; Methods
for Working with Disabled and Older Users |
User Evaluation of Technology Enhanced Interaction Framework | | BIBAK | Full-Text | 267-274 | |
Kewalin Angkananon; Mike Wald; Lester Gilbert | |||
This paper focuses on user evaluation of the Technology Enhanced Interaction
Framework (TEIF). Questionnaire results from participants using or reviewing
the TEIF method to evaluate requirements and design technology solutions for
problems involving interactions with hearing impaired people showed that they
thought it helped them more than the other methods and that it would also help
them to gather requirements and to design technology solutions for all disabled
people if information about other disabilities than hearing impairment was
provided. The objective results from the experimental tasks will be analysed to
investigate how the participants performed on the requirements evaluation and
solutions evaluation tasks with the TEIF method and the other preferred method.
These results will be compared with the participants' questionnaire answers
which reflected what they thought about the TEIF method. Future work includes
extending the Method and Technology Suggestions Table to include information
about other disabilities than just hearing impairment. Keywords: Framework; Interaction; Evaluation; Accessibility; Hearing Impairments |
A Unified Semantic Framework for Detailed Description of Assistive Technologies Based on the EASTIN Taxonomy | | BIBAK | Full-Text | 275-282 | |
Nikolaos Kaklanis; Konstantinos Votis; Konstantinos Giannoutakis; Dimitrios Tzovaras; Valerio Gower; Renzo Andrich | |||
This paper presents a unified semantic framework that can be used by
vendors/service providers who would like to semantically describe their
assistive technologies according to the categorization proposed by the ISO 9999
standard as well as the EASTIN taxonomy. The framework is based on an approach
towards a unified semantic description of assistive technologies by combining
information coming from different sources. The wealth of information of the
EASTIN network, the biggest and most comprehensive information service on
assistive technology serving older and disabled people, is currently exploited
by the proposed framework in a unified way. The proposed framework offers an
easy mechanism for including a new assistive technology in the whole
Cloud4all/GPII infrastructure. Keywords: Semantic Alignment; Ontology; Assistive Technologies; Application
Classification |
Results from Using Automatic Speech Recognition in Cleft Speech Therapy with Children | | BIBAK | Full-Text | 283-286 | |
Zachary Rubin; Sri Kurniawan; Travis Tollefson | |||
Most children with cleft are required to undertake speech therapy after
undergoing surgery to repair their craniofacial defect. However, the untrained
ear of a parent can lead to incorrect practice resulting in the development of
compensatory structures. Even worse, the boring nature of the cleft speech
therapy often causes children to abandon home exercises and therapy altogether.
We have developed a simple recognition system capable of detecting impairments
on the phoneme level with high accuracy. We embed this into a game environment
and provide it to a cleft palate specialist team for pilot testing with
children 2 to 5 years of age being evaluated for speech therapy. The system
consistently detected cleft speech in high-pressure consonants in 3 out of our
5 sentences. Doctors agreed that this would improve the quality of therapy
outside of the office. Children enjoyed the game overall, but grew bored due to
the delays of phrase-based speech recognition. Keywords: Therapeutic Games; Child Speech Therapy |
Do-It-Yourself (DIY) Assistive Technology: A Communication Board Case Study | | BIBAK | Full-Text | 287-294 | |
Foad Hamidi; Melanie Baljko; Toni Kunic; Ray Feraday | |||
Do-It-Yourself (DIY) and open design approaches allow for the development of
customized, affordable assistive technologies. Freely shared designs and
software components open doors for new ways to create and to share technology,
representing an approach that has the potential to be more efficient,
affordable, and effective than commercial approaches to Assistive Technology
development and deployment. In this paper, we present a case study of how these
methods have been used to develop a DIY, open-source Speech-Generating Device. Keywords: Do-It-Yourself (DIY); Open-Source Hardware; Open Design; Communication
Boards; SGDs; Assistive Technology |
A Decision-Tree Approach for the Applicability of the Accessibility Standard EN 301 549 | | BIBAK | Full-Text | 295-302 | |
Loïc Martínez-Normand; Michael Pluke | |||
Public procurement is one way for public administrations to promote
accessibility. By procuring accessible products and services, they raise the
awareness about accessibility and they have an impact on industry. In Europe,
the European Commission's Mandate M 376 has resulted in a European Standard, EN
301 549, containing accessibility requirements for ICT products and services
that are suitable for use in public procurement. EN 301 549 has been drafted
using a feature-based approach and can be applied to any ICT product and
service. The users of the standard will need guidance to decide which
requirements of the EN apply to a given product or service. This paper presents
a decision-tree approach for that problem. This approach is being validated
during the design of the user interface of a support tool for the assessment of
the accessibility of ICT products and services. Keywords: ICT Accessibility; Standards; Accessibility Requirements; European Policy |
ADAPTAEMPLEO: Interactive Advisor to Adapt Workplaces for Persons with Disabilities and Promote Employment in the Retail Sector | | BIBAK | Full-Text | 303-306 | |
Alberto Ferreras; Andrés Soler; Rakel Poveda; Alfonso Oltra; Carlos García; Purificación Castelló; Juan Manuel Belda-Lois; José Crespo | |||
An interactive advisor for ergonomic assessment and fitting of workplaces to
persons with disabilities (physical, sensorial, and/or psychological) is
presented. It has been designed to identify areas of mismatch between
workplace demands and workers' functional capabilities, in order to promote
access to employment and labor integration for people with disabilities in the
retail sector. The methodology covers the incorporation process as well as the
adaptation of workplaces through reasonable adjustments. Keywords: Ergonomics; Persons with Disabilities; Work Demands; Functional Capacities;
Workplace Adaptation; Computer Aided System |
The Development of Training Modules on ICT to Support Disabled Lifelong Learners | | BIBAK | Full-Text | 311-314 | |
Simon Ball | |||
A global consortium has come together under the Enable project to create a
suite of freely available, accessible, online, self-paced training modules for
tutors working in adult education, who may be supporting disabled students.
Topics covered include working with disabled people; pedagogical principles of
using ICT to support disabled learners; making online teaching content
accessible; free and built-in ICT to support disabled adult learners; end-user
issues including accessibility and usability; and standardisation. Keywords: Training; Online; Education; Lifelong Learning; Adult Education; Disability;
Accessibility; Inclusion |
Evaluating ICT Based Learning Technologies for Disabled People | | BIBAK | Full-Text | 315-322 | |
Marion Hersh | |||
This paper discusses the need for an evaluation framework specifically for
(ICT-based) learning technologies for disabled learners and demonstrates the
limitations of existing approaches based on the evaluation of assistive
technology or learning technologies for non-disabled learners. It presents
elements of the first full such evaluation framework comprising a set of
evaluation principles and aims and three evaluation methodologies. It has a
wide range of applications including (i) stand-alone and comparative
evaluations of ICT-based learning technologies for disabled people; (ii)
identifying gaps in provision or the need for modifications; (iii) supporting
the design and development of new technologies; (iv) supporting learners in
making informed choices about appropriate learning technologies; and (v)
supporting the policy process and determination of the future research agenda,
including by evaluating the impact of various measures on the effective
implementation and use of ICT learning technologies for disabled learners. Keywords: Evaluation; ICT; learning technologies; aims; principles |
Supporting Senior Citizen Using Tablet Computers | | BIBAK | Full-Text | 323-330 | |
Ingo Dahn; Peter Ferdinand; Pablo Lachmann | |||
It seems widely accepted that senior citizens need special assistance for
using IT and that tablet computers are easier for them to use
than PCs. The project "Tablets for Seniors" challenged these preconceptions.
Over three months, it evaluated the use of Android tablet computers by a group of
19 seniors, aged between 53 and 82. The group of participants was divided into
a subgroup using an interface specifically designed to support seniors and
another group working with the native Android user interface. Support requests
from both groups, in face-to-face meetings or through a dedicated phone
hotline, have been recorded and qualitatively analyzed. As results of this
qualitative study we present in this paper recommendations for the design of
user interface and accompanying support measures. Keywords: Tablet Computer; Seniors; User Interface |
Development of Multimodal Textbooks with Invisible 2-Dimensional Codes for Students with Print Disabilities | | BIBA | Full-Text | 331-337 | |
Akio Fujiyoshi; Mamoru Fujiyoshi; Akiko Ohsawa; Yuko Ota | |||
Utilizing invisible 2-dimensional codes and digital audio players with a 2-dimensional code scanner, we developed a new type of textbooks for students with print disabilities, called "multimodal textbooks." Multimodal textbooks can be read with the combination of the two modes: "reading printed text" and "listening to the speech of the text from a digital audio player with a 2-dimensional code scanner." Since a multimodal textbook looks the same as a regular textbook and the price of a digital audio player is reasonable (about 30 euro), we think multimodal textbooks are suitable for students with print disabilities in ordinary classrooms. |
Towards a Methodology for Curriculum Development within an Accessible Virtual Campus | | BIBAK | Full-Text | 338-341 | |
Hector R. Amado-Salvatierra; Rocael Hernández; Antonio García-Cabot; Eva García-López; Concha Batanero; Salvador Otón | |||
The constant evolution of assistive technologies gives users with
disabilities a myriad of choices to access digital content, and the
application of accessibility standards and their relationship with assistive
technologies enable and potentiate user interaction with web-based systems for
everyday activities. In the context of education through Virtual Learning
Environments, a cornerstone of the web accessibility initiative is the content
prepared and provided by teachers, but they need to be instructed on how to
generate accessible documents and how to provide truly accessible curriculum
developments. In this sense, E-Learning solutions adopted by several
institutions, including Higher Education Institutions, need to be encouraged to
validate and promote accessibility within a Virtual Campus. This article
presents an initiative promoted by ESVI-AL project, looking to improve
accessibility in virtual higher education through the definition of systematic
and replicable methodological processes for the design and implementation of
accessible virtual curriculum developments. Keywords: Accessibility; Training; e-Learning; Curriculum Design |
The Use of Assistive Technologies as Learning Technologies to Facilitate Flexible Learning in Higher Education | | BIBAK | Full-Text | 342-349 | |
Michael Goldrick; Tanja Stevns; Lars Ballieu Christensen | |||
This paper presents the argument that some assistive technologies have in
recent times become more widely used in education to support all students.
Building on research gathered as part of a European funded project, the authors
present findings that indicate that students are becoming more aware of and
sensitive to their own learning preferences and styles. More
importantly however, the paper suggests that through the evolution of
technology, students can now choose how to study, where to study and when to
study. Underpinning this change, the paper explores how some assistive
technologies have evolved into learning technologies by taking into
consideration three factors: European social policy, universal design theory
and learning preference theories. Keywords: Flexible Learning; Assistive Technology; Learning Technology; Higher
Education; RoboBraille; European Social Policy; Universal Design Theory and
Learning Preference Theories |
The Literacy of Integrating Assistive Technology into Classroom Instruction for Special Education Teachers in Taiwan | | BIBAK | Full-Text | 350-357 | |
Ming Chung Chen; Chi Nung Chu; Chien-Chuan Ko | |||
This study surveyed special education teachers' literacy in integrating
assistive technology into instruction in Taiwan. First, a scale for the
literacy of assistive technology integration for special education teachers was
developed using the Delphi technique. A total of 391 special education teachers
completed the web-based questionnaire. The results reveal that the teachers' AT
literacy was inadequate. Though the teachers are aware of the importance of
assistive technology, they lack essential skills and knowledge. The results of
the analysis also indicated that participation in AT training programs and
experience with students who used AT devices benefited their AT literacy. Keywords: Special Education Teacher; Assistive Technology; Literacy |
University Examination System for Students with Visual Impairments | | BIBAK | Full-Text | 358-365 | |
Konstantinos Papadopoulos; Zisis Simaioforidis; Konstantinos Charitakis; Marialena Barouti | |||
This paper presents the development of a web based, platform independent
system for university examination purposes that can be easily accessed and used
by students with visual impairments, with minimum effort required to learn its
use. The developed examination system allows students with visual impairments
to take suitably adapted online written examinations according to their
individual and personalized special characteristics and preferences for reading
digital text. Those special parameters and characteristics can be applied as
predefined user options to the examination platform. The user interface for
individuals with low vision is based on the selection of effective color
contrast and the principle of legible texts that students need in order to read
and write during examinations. Based on the above, it was considered necessary
that special parameters and characteristics had to be tested and determined by
the end users themselves with N Print tests. Keywords: Visual Impairment; Examination System; Computer Based Assessment |
"Planet School": Blended Learning for Inclusive Classrooms | | BIBAK | Full-Text | 366-373 | |
Ingo Karl Bosse | |||
"Planet School" is currently the most important blended learning platform in
Germany. The multimedia content of the popular website is developed especially
for teachers by the public service broadcasters WDR and SWR. However, as it
stands today, "Planet School" is neither accessible by all students, nor does
it meet the needs of the entire student population. This paper presents both
the results of the evaluation of the learning platform in inclusive classrooms
and first recommendations on how to offer variable content for students with
special needs. The revised version of "Planet School" shall address different
types of learners and offer accessible and usable materials, including movies,
television broadcasts, interactive and multimedia content for students with
very different prerequisites for learning. The paper has implications for
application-oriented research in the field of e-inclusion and blended learning,
for the development of multimedia content by broadcasters and others as well as
for the use of multimedia in inclusive classrooms. Keywords: E-inclusion; Blended Learning; Broadcasters; Inclusive Classrooms; Inclusive
Multimedia Learning Materials |
Ensuring Sustainable Development of Simultaneous Online Transcription Services for People with Hearing Impairment in the Czech Republic | | BIBAK | Full-Text | 374-381 | |
Zdenek Bumbalek; Jan Zelenka | |||
Real-time speech transcription is a service with a potentially tremendous
positive impact on the quality of life of hearing-impaired people. Nevertheless,
there is a total lack of government funding for these assistive services in the
Czech Republic. In this article, we present a business model of a socially
oriented service that enables its long-term sustainable development and
provides online transcription services free of charge for the personal use of
the target group. Keywords: eScribe; Online Transcription; Speech to Text Services; Business Model;
Social Entrepreneurship; Hearing Impaired People |
User Interface Design of Sound Tactile | | BIBA | Full-Text | 382-385 | |
Tatsuya Honda; Makoto Okamoto | |||
We have developed a device that allows deaf people to perceive sounds via hair vibrations; the concept is similar to a cat's whiskers, which can detect air currents. The device converts the loudness of a sound into a vibration of corresponding power, and users wear the device in their hair in much the same way as a hair slide. When the device detects a sound, it relays the information to the user by both shaking the hair and activating a light-emitting diode. This also allows other users of the device to gain information about the sound, and facilitates sharing. The results of an assessment experiment showed that deaf people could understand animal-call patterns and car-engine sounds. |
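The abstract describes the loudness-to-vibration mapping only verbally. The sketch below illustrates one possible mapping, assuming short audio frames, an RMS level in dBFS and a normalized vibration intensity in [0, 1]; the frame length, dB range and the `drive_actuator` stub are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def frame_loudness_db(frame):
    """Root-mean-square level of one audio frame, in dBFS."""
    rms = np.sqrt(np.mean(np.square(frame), dtype=np.float64))
    return 20.0 * np.log10(max(rms, 1e-6))

def loudness_to_intensity(level_db, floor_db=-60.0, ceil_db=0.0):
    """Map a dBFS level linearly onto a vibration intensity in [0, 1]."""
    return float(np.clip((level_db - floor_db) / (ceil_db - floor_db), 0.0, 1.0))

def drive_actuator(intensity):
    """Placeholder for the vibration motor and LED drive (hardware-specific)."""
    print(f"vibration intensity: {intensity:.2f}, LED on: {intensity > 0.1}")

# Example: one synthetic 20 ms frame at 16 kHz.
frame = 0.3 * np.sin(2 * np.pi * 440 * np.arange(320) / 16000)
drive_actuator(loudness_to_intensity(frame_loudness_db(frame)))
```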
Enhancing Storytelling Ability with Virtual Environment among Deaf and Hard-of-Hearing Children | | BIBAK | Full-Text | 386-392 | |
Sigal Eden; Sara Ingber | |||
The study conducted a 3-month intervention to improve deaf and
hard-of-hearing children's storytelling ability through training in arranging
episodes of temporal scripts, and telling the stories they created. We examined
65 D/HH children aged four to seven years who were divided into two groups:
virtual reality (VR) technological intervention and pictorial non-technological
intervention. Participants completed pretest and posttest measures and
demonstrated significant improvement in storytelling achievements following
the intervention; in the VR group the improvement was considerably larger. In
addition, an earlier age at the onset of treatment was associated with better
storytelling achievements. Keywords: Deaf; Hard-of-Hearing; Virtual Reality; Storytelling; Sequential; Language |
Teaching Morse Language to a Deaf-Blind Person for Reading and Writing SMS on an Ordinary Vibrating Smartphone | | BIBAK | Full-Text | 393-396 | |
Andras Arato; Norbert Markus; Zoltan Juhasz | |||
Deaf-blind people have a very small window to the world. New technology can
help, but portable Braille displays are expensive. We developed and tested a very
low-cost method for reading and writing SMS messages with a Hungarian
deaf-blind person, using an Android smartphone with a built-in vibration motor.
Words and characters were converted to vibrating Braille dots and Morse words.
Morse was taught both as a code for recognizing characters and as a language for
recognizing words. Keywords: Deaf-blind; Morse language; Language teaching |
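The exact timing scheme for the vibrated Morse is not given in the abstract. A minimal sketch of the idea, assuming standard Morse timing units (dot = 1 unit, dash = 3 units, 1-unit gaps within a character, 3-unit gaps between characters) and the [off, on, off, on, ...] pattern layout used by pattern-based vibrator APIs; the unit length is an assumption.

```python
MORSE = {
    "a": ".-", "b": "-...", "c": "-.-.", "d": "-..", "e": ".",
    "f": "..-.", "g": "--.", "h": "....", "i": "..", "j": ".---",
    "k": "-.-", "l": ".-..", "m": "--", "n": "-.", "o": "---",
    "p": ".--.", "q": "--.-", "r": ".-.", "s": "...", "t": "-",
    "u": "..-", "v": "...-", "w": ".--", "x": "-..-", "y": "-.--",
    "z": "--..",
}

def word_to_vibration_pattern(word, unit_ms=150):
    """Return [off, on, off, on, ...] durations in milliseconds."""
    pattern = [0]  # no initial pause
    for ch in word.lower():
        code = MORSE.get(ch)
        if code is None:
            continue  # skip characters without a Morse code in this table
        for i, symbol in enumerate(code):
            pattern.append(unit_ms if symbol == "." else 3 * unit_ms)  # motor on
            gap = unit_ms if i < len(code) - 1 else 3 * unit_ms        # motor off
            pattern.append(gap)
    return pattern

print(word_to_vibration_pattern("sos"))
```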
Urgent Communication Method for Deaf, Language Dysfunction and Foreigners | | BIBAK | Full-Text | 397-403 | |
Naotsune Hosono; Hiromitsu Inoue; Miwa Nakanishi; Yutaka Tomita | |||
This paper discusses a smartphone-based communication method that allows deaf
people, people with language dysfunction, and foreigners to report sudden
sickness or fire to the nearest fire station in an emergency. The method was
originally proposed by a hearing-impaired person. In daily life these users
look no different from anyone else, but in unexpected situations such as
disasters or accidents they suddenly find themselves in trouble. Previous
research, introduced at ICCHP 2010, proposed a method for creating pictograms
and icons that refer to multiple local sign languages using Multivariate
Analysis (MVA). Those pictograms were printed in a booklet used to hold a
dialogue between deaf and hearing people; this time they are implemented on a
smartphone. Usability is normally measured by effectiveness, efficiency and
satisfaction; here the outcome is measured by efficiency, that is, how quickly
the nearest fire station can be notified. An evaluation with deaf people and a
foreigner found that this method is about three times faster for making the
first report to the station than text messaging on a smartphone. Keywords: Inclusive Media; Context of Use; Computer Human Interface; Human Centred
Design; Sensory Evaluation; Tablet Terminal |
Building an Application for Learning the Finger Alphabet of Swiss German Sign Language through Use of the Kinect | | BIBAK | Full-Text | 404-407 | |
Phuoc Loc Nguyen; Vivienne Falk; Sarah Ebling | |||
We developed an application for learning the finger alphabet of Swiss German
Sign Language. It consists of a user interface and a recognition algorithm
built around the Kinect sensor. The official Kinect Software Development Kit (SDK)
does not recognize fingertips, so we extended it with an existing algorithm. Keywords: Sign language; Swiss German Sign Language; Finger Alphabet; Kinect; Learning
Environment |
TerpTube: A Signed Language Mentoring Management System | | BIBA | Full-Text | 408-414 | |
Deborah I. Fels; Daniel Roush; Paul Church; Martin Gerdzhev; Tara Stevens; Ellen Hibbard | |||
Signed language interpreter training programs are necessary to support the training of professional signed language interpreters who facilitate communication between Deaf and hearing people. However, these programs have few tools that provide asynchronous or non-face-to-face means of giving feedback to, or communicating with, learners in the signed language by peers, instructors or mentors. TerpTube has been designed to support these asynchronous activities through the use of video and signlinking within a computerized mentoring management system. Initial user studies show that mentors and mentees/students found TerpTube easy to use for creating and posting video material and providing commentary on that video in American Sign Language without the use of text. The ability to provide comments on comments was thought to be a good idea but made the user interface confusing. |
Collaborative Gaze Cues and Replay for Deaf and Hard of Hearing Students | | BIBAK | Full-Text | 415-422 | |
Raja Kushalnagar; Poorna Kushalnagar | |||
Deaf and Hard of Hearing students who use visual accommodations face
difficulties in following multimedia lectures due to the delay in visual
translation and dividing attention between simultaneous visuals. As a result,
deaf students miss information. We address these difficulties with two
approaches: visual cues and live replay in recorded lectures. Our analysis
found that when deaf students viewed the lecture videos with cues, they showed less
delay in switching to the active visual information source and reported high
satisfaction with the cues. The students who liked the cues were more likely to
demonstrate a reduction in the delay associated with shifting visual attention.
Similarly, when deaf students used gaze-controlled replay with lecture videos,
they missed less information and reported high satisfaction with live replay. Keywords: Accessible Technology; Educational Technology; DHH Users |
Toward a Reversed Dictionary of French Sign Language (FSL) on the Web | | BIBAK | Full-Text | 423-430 | |
Mohammed Zbakh; Zehira Haddad; Jaime Lopez Krahe | |||
On the web, one can find dictionaries for looking up a sign of French Sign
Language (FSL) from a word. However, finding a word from a sign is much more
complicated. For this purpose, we propose a web application for finding
the meaning of an FSL sign in the French language from the sign's features. In
order to do this, we have developed an intelligent system capable of learning
and self-improving by feeding off the information presented to it during its
use. We have managed to find a middle ground between the reliability of the
results and the ergonomics of Human-Machine Interfaces (HMI). Keywords: Human Machine Interface; Classification Algorithms; French Sign Language;
Learning Algorithm |
A Novel Approach for Translating English Statements to American Sign Language Gloss | | BIBAK | Full-Text | 431-438 | |
Achraf Othman; Mohamed Jemni | |||
In this paper, we present a study on the relationship between American Sign
Language (ASL) statements and written English texts, toward building a
statistical machine translation (SMT) system that uses a 3D avatar for
interpretation. The process includes a novel algorithm which transforms a
part-of-speech-tagged English sentence into ASL-Gloss. The algorithm uses a
rule-based approach for building a large English-to-ASL-Gloss parallel corpus
using dependency rules over the grammatical parts of the sentence. The parallel
corpus is the input to the translation model of the SMT system for ASL. The
results we obtained are highly consistent and reproducible, with fairly high
precision and accuracy. Keywords: Sign Language Processing; Hybrid Machine Translation; Artificial Corpus;
Gloss Annotation System |
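The abstract names a rule-based POS-to-gloss transformation but does not list the rules. The sketch below illustrates the general idea only, assuming a part-of-speech-tagged English sentence and two invented rules (dropping articles and the copula, upper-casing lemmas as gloss tokens); these rules are placeholders, not the authors' rule set.

```python
# Each input token is a (word, POS) pair, e.g. from any POS tagger.
Sentence = list[tuple[str, str]]

DROPPED_POS = {"DT"}                        # articles: "the", "a"
DROPPED_WORDS = {"is", "are", "am", "be"}   # copula, typically absent in gloss

def english_to_gloss(tagged: Sentence) -> list[str]:
    """Toy rule-based rewrite of a POS-tagged English sentence into a
    gloss-like token sequence (illustrative rules only)."""
    gloss = []
    for word, pos in tagged:
        if pos in DROPPED_POS or word.lower() in DROPPED_WORDS:
            continue
        gloss.append(word.upper())
    return gloss

print(english_to_gloss([("The", "DT"), ("book", "NN"), ("is", "VBZ"),
                        ("on", "IN"), ("the", "DT"), ("table", "NN")]))
# ['BOOK', 'ON', 'TABLE']
```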
Hand Location Classification from 3D Signing Virtual Avatars Using Neural Networks | | BIBAK | Full-Text | 439-445 | |
Kabil Jaballah; Mohamed Jemni | |||
3D sign language data is actively being generated and exchanged. Sign
language recognition from 3D data is therefore a promising research axis, aiming
to build new understanding and efficient indexing of this type of content.
Model-based recognition strategies commonly recognize sign
language features separately. These features are the handshape, the hand
position, the orientation, and the movement. In this paper, we propose a novel
approach for classifying the hand's position in space. The approach is based
on a two-layer feed-forward network and generates classifications which are
very close to human perception. Evaluations were made by 10 PhD students
and 2 sign language experts. The evaluation of the results shows the
superiority of our approach compared with classic methods based on the
distance between the hand and the face, as well as the method
of K nearest neighbors. In fact, the average misclassification rate of our method
was the lowest, at 4.58%. Keywords: Virtual Signers; Sign Language Recognition; Hand Position; 3D
Classification; Neural Networks |
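The abstract gives the network topology (a two-layer feed-forward network) but not its features, sizes or labels. A minimal sketch of such a classifier's forward pass, assuming avatar-relative 3D hand coordinates as input and a softmax over a set of spatial zones; the zone names, layer sizes and random weights are placeholders (real weights would come from training).

```python
import numpy as np

ZONES = ["head", "chest", "abdomen", "left_side", "right_side", "neutral_space"]

def forward(x, W1, b1, W2, b2):
    """Two-layer feed-forward pass: hidden tanh layer, softmax output."""
    h = np.tanh(W1 @ x + b1)
    logits = W2 @ h + b2
    e = np.exp(logits - logits.max())
    return e / e.sum()

# Placeholder dimensions: 3 inputs (x, y, z of the hand), 8 hidden units.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(8, 3)), np.zeros(8)
W2, b2 = rng.normal(size=(len(ZONES), 8)), np.zeros(len(ZONES))

hand_position = np.array([0.05, 1.45, 0.20])   # metres, avatar-relative
probs = forward(hand_position, W1, b1, W2, b2)
print(ZONES[int(np.argmax(probs))], probs.round(3))
```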
Towards a Phonological Construction of Classifier Handshapes in 3D Sign Language | | BIBAK | Full-Text | 446-453 | |
Kabil Jaballah; Mohamed Jemni | |||
3D sign language generation has shown real progress over the last several
years. Many systems have been proposed that generate animated sign
language through avatars; however, the technology is still young and many
fundamental parameters of sign language, such as facial expressions and other
iconic features, have been ignored in the proposed systems. In this paper, we
focus on the generation and analysis of descriptive classifiers, also called
Size and Shape Specifiers (SASSes), in 3D sign language data. We propose a new
adaptation of the phonological structure of handshapes given by
Brentari. Our adapted framework is able to encode 3D descriptive classifiers
that can express different amounts or sizes of shapes. We describe the way our
model has been implemented through an XML framework. Our model is a way to link
the phonological level with the 3D physical animation level, since it is
compliant with sign language phonology as described by Brentari as well as
Liddell & Johnson, and compliant with 3D animation standards. Keywords: 3D Sign Language; Classifiers; phonology |
Efficient Tracking Method to Make a Real Time Sign Language Recognition System | | BIBAK | Full-Text | 454-457 | |
Maher Jebali; Patrice Dalle; Mohamed Jemni | |||
In the field of automatic processing of natural languages, the analysis and
exploitation of each statement in sign language (SL) are of great
importance. In fact, the specificities of SL, such as the simultaneity of
many parameters, the significant role of facial expression and the use of
space to structure the statement, as well as technical specificities such
as changing lighting and the presence of occlusions in single-camera
capture, have a deep effect on tracking the different parts of the
body. In this paper, we propose an empirical tracking method adapted to the
specificities of SL, which we use to build a real-time recognition system
based on a prediction approach. Keywords: Sign Language Recognition; Object Tracking; Sign Language Modeling |
A Virtual Signer to Interpret SignWriting | | BIBAK | Full-Text | 458-465 | |
Yosra Bouzid; Mohamed Jemni | |||
In the absence of a standardized writing system to transcribe their native
sign language, deaf signers cannot communicate with each other in their own
language except face-to-face. They cannot leave messages, read books, take class
notes or send email in sign language. Certainly, being able to read and write
their own language would bring these signers the same advantage that writing
systems for spoken languages bring to speakers. The SignWriting system currently
seems a more appropriate way to meet deaf people's needs than other
existing notations, as it was intended as an everyday tool for reading and
writing. However, such a script requires training to learn to interpret the
transcriptions. In this paper, we present an avatar-based system
named tuniSigner, able to automatically display and interpret sign language
transcriptions in the well-known SignWriting system. Showing how the actual
gestures should be performed in virtual reality would be very useful to
signers. Keywords: Deaf; Hearing Impaired; Virtual Avatar; SignWriting; Sign Language |
A Multi-layer Model for Sign Language's Non-Manual Gestures Generation | | BIBAK | Full-Text | 466-473 | |
Oussama El Ghoul; Mohamed Jemni | |||
Contrary to popular belief, the structure of signs exceeds the simple
combination of hand movements and shapes. Furthermore, a sign's significance
resides not in the handshape, the position, the movement, the orientation or the
facial expression alone, but in the combination of all five. In this context, our aim
is to propose a model for non-manual gesture generation for sign language
machine translation. In previous work we developed a gesture generator
that does not support facial animation. Here we propose a multi-layer model to be
used for the development of new software for generating non-manual gestures
(NMG). The system is composed of three layers. The first layer is the interface
between the system and external programs; its role is to perform the linguistic
processing needed to compute linguistic information such as the
grammatical structure of the sentence. The second layer contains two modules:
the manual gesture generator and the non-manual gesture generator. The
non-manual gesture generator uses three-dimensional facial modeling
and animation techniques to produce facial expressions in sign language. Keywords: Multi-layer Model; Non-Manual Gesture; Sign Language; Machine Translation |
SIGN MOTION: An Innovative Creation and Annotation Platform for Sign Language 3D-Content Corpora Building Relying on Low Cost Motion Sensors | | BIBAK | Full-Text | 474-481 | |
Mehrez Boulares; Mohamed Jemni | |||
The manual transcription of sign language is a work-intensive step
which requires considerable effort to create signs. Moreover, the result of
this step often misses the natural aspect of motion needed to conform to natural
human interpretation. In other words, the lack of annotated sign language
corpora is closely related to the difficulty of the sign creation task. In this
paper, we propose a novel tool, Signmotion, for creating an annotated sign
language corpus based on natural human gestures, by overlaying a real signer's
motion onto an articulated 3D skeleton using Microsoft Kinect and Leap Motion
sensors. Signmotion supports natural 3D facial expression and
natural 3D body posture, and makes it possible to annotate and analyze each
sign and motion in the recorded animation. The resulting data and structure are
precise enough to create and store signs to be used for sign language data
analysis or machine translation using a virtual signer. Keywords: Transcription; Sign Language; Kinect; Leap Motion; Virtual Agent; Facial
Expression; Motion Analysis; Machine Translation; Corpus |
Gestures in Sign Language: Animation and Generation in Real-Time | | BIBAK | Full-Text | 482-489 | |
Nour Ben Yahia; Mohamed Jemni | |||
Statistics have repeatedly confirmed that many deaf people are unable to access
written information. As a solution, computer applications designed for deaf
persons have been created. Therefore, to facilitate access to information, new
methods improving the dialogue between human and machine are required. Sign
generation is based on several parameters: the manual configuration, the
orientation of the hands, the location where the sign is made, the movement made
by the hand, and the facial expression accompanying the realization of the sign. We
take all these parameters into account, and the system presented in this paper
is also based on avatars which have many degrees of freedom. The challenge of
this project is to find a tradeoff between computation time and realistic
representation so that sign generation stays close to real time. Keywords: Sign language; Animation; Avatar |
Improving Accessibility of Lectures for Deaf and Hard-of-Hearing Students Using a Speech Recognition System and a Real-Time Collaborative Editor | | BIBAK | Full-Text | 490-497 | |
Benoît Lathière; Dominique Archambault | |||
The purpose of this study is to investigate the usability of a speech
recognition system to help deaf and hard-of-hearing students understand the
lesson inside the classroom by subtitling the professor's speech live.
The proposed solution is to repeat the professor's speech into a microphone
plugged into a notebook running speech-to-text software and to generate the text
inside a collaborative editor displayed in front of the student. The repeater
is a volunteer listening to the professor's speech in the classroom. The
software transforms the voice into text, and the deaf student can read the text on
their own device (a notebook or a mobile device). Keywords: Deaf and Hard-of-Hearing Students; Speech Recognition; Collaborative Text
Editor; Live Transcript |
Examining the Characteristics of Deaf and Hard of Hearing Users of Social Networking Sites | | BIBAK | Full-Text | 498-505 | |
Ines Kozuh; Manfred Hintermair; Matjaz Debevc | |||
In this study we examined whether the level of hearing loss is related to
the frequency of communication in different situations and to the performance of
activities on social networking sites. We also investigated how the
frequency of activities was related to the perceived accessibility of these
sites. Firstly, the findings revealed that users with lower levels of hearing
loss communicated more frequently with hearing persons in written language
than users with higher levels, whereas they communicated less frequently
with deaf users in sign language than those with higher levels of hearing loss.
Secondly, users with lower levels of hearing loss posted videos more frequently
than those with higher levels. Thirdly, the more frequently the deaf and hard
of hearing users updated their profiles, posted photos and videos, and commented
on and liked content, the higher the perceived accessibility of those sites
they reported. Keywords: Deaf; Hard of Hearing; Social Networking Sites; Communication; Evaluation |
A Smart-Phone Based System to Detect Warning Sound for Hearing Impaired People | | BIBAK | Full-Text | 506-511 | |
Koichiro Takeuchi; Tetsuya Matsumoto; Yoshinori Takeuchi; Hiroaki Kudo; Noboru Ohnishi | |||
We propose a simple system for detecting warning sounds. The system processes
a signal captured by a microphone with an IIR band-pass filter whose pass band
covers the warning-sound spectrum, and then applies an IIR comb filter
corresponding to the fundamental frequency of the warning sounds. The system
calculates the ratio of the mean of the absolute values of the input signal to
that of the comb filter's output signal. If the ratio is smaller than a threshold,
the system judges that a warning sound is present. In experiments, the
proposed system detected ambulance sirens with accuracy above 94%
in noisy environments at an SNR of 0 dB, while the over-detection rate was less
than 3%. In an experiment using five real recordings of approaching sirens on the
road, accuracy ranged from 30% to 82%. Keywords: Sound Source Recognition; Warning Sound; IIR Comb Filter; Real Time |
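The abstract outlines the detection pipeline without filter coefficients or a threshold. The sketch below shows one possible reading of it, assuming the comb stage cancels a sound whose fundamental frequency is known, so that a small residual relative to the band-passed input indicates the warning sound; the cut-off frequencies, fundamental, threshold, and the use of a simple FIR notch comb in place of the paper's IIR comb are all assumptions.

```python
import numpy as np
from scipy.signal import butter, lfilter

def detect_warning(x, fs, f0=800.0, band=(600.0, 1000.0), threshold=0.3):
    """Band-pass + comb-filter detection sketch (placeholder parameters)."""
    # Butterworth band-pass keeping the assumed warning-sound band.
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    y = lfilter(b, a, x)

    # Comb y[n] - y[n-k]: cancels a component periodic at f0 and its harmonics.
    k = int(round(fs / f0))
    comb_b = np.zeros(k + 1)
    comb_b[0], comb_b[k] = 1.0, -1.0
    residual = lfilter(comb_b, [1.0], y)

    ratio = np.mean(np.abs(residual)) / (np.mean(np.abs(y)) + 1e-12)
    return ratio < threshold, ratio

# Example: a synthetic 800 Hz "siren" tone buried in white noise.
fs = 16000
t = np.arange(fs) / fs
signal = np.sin(2 * np.pi * 800 * t) + 0.5 * np.random.default_rng(1).normal(size=fs)
print(detect_warning(signal, fs))
```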
A Support System for Teaching Practical Skills to Students with Hearing Impairment | | BIBAK | Full-Text | 512-515 | |
Takuya Suzuki; Makoto Kobayashi | |||
In practical lessons such as painting or modeling for hearing-impaired
students, conventional translation services are not effective enough
because students cannot watch a signer or captioning text and the teacher's
operations simultaneously. To solve this problem, we propose a tabletop
projection system with special software which displays synchronized explanation
texts prepared in advance. With this system, the teacher can project
text information right beside the working area on the table. To control the
timing of changing those texts, the teacher uses foot pedals. Hearing-impaired
students' answers to questionnaires after a practical Manga-drawing lesson
with the proposed system showed that it was useful for such lessons. Keywords: Hearing Impaired Student; Practical Lesson; Tabletop Projection |
Learning Environments -- Not Just Smart for Some! | | BIBAK | Full-Text | 520-527 | |
Andreja Istenic Starcic; Sharon Kerr | |||
This paper discusses Universal Curriculum Design in higher education for
curricula delivered on, and using the facilities of, smart devices. A case
study in Australia (2012-2013) focused on universal design and a pedagogical
approach, involving a literature review, an analysis of the university
context, and the design of a placement orientation module in the spirit of inclusive
practice for delivery via smart devices. Through legislative requirements, the
majority of smart devices are developed with built-in accessibility features.
Developing curricula using Universal Design Principles ensures that students
and faculty have the opportunity to maximize the capability and facilities of
their smart devices. Contemporary working and learning environments depend on
ICT integration, and smart environments such as smart phones facilitate
ubiquitous engagement. University education has to prepare graduates to take
proactive roles in engaging with ICT by providing them with learning environments
that both model and demonstrate best practice. Keywords: Universal Curriculum Design; Learning Environment; Disability; Higher
Education; Smart Devices; Sensory Independent Learning |
Different ICT Competency but Similar Pattern between Students with/without Learning Disabilities? | | BIBAK | Full-Text | 528-531 | |
Ming-Chung Chen; Chen-Ming Chen; Ya-Ping Wu; Chien-Chuan Ko; Yao-Ming Yeh | |||
This paper explored whether ICT skills differ between students
with and without learning disabilities (LD) across grades. It also
explored whether the structural equation model (SEM) differs
between students with and without learning disabilities. 547 students with LD and
2298 students without LD from grade 3 to grade 9 participated in this survey.
The results indicated that although ICT skills differ between students
with and without LD, the structure of the model is similar for the two
groups. Keywords: ICT skills; Structural Equation Modeling; Students with Learning
Disabilities |
The Application of Computer-Based Chinese Handwriting Assessment System to Children with Dysgraphia | | BIBAK | Full-Text | 532-539 | |
Ting-Fang Wu; Guey-Shya Chen; Hui-Shan Lo | |||
The purpose of this study is to develop a computer-based Chinese handwriting
assessment system. This online evaluation system consists of two kinds of input
modules: copying Chinese characters and writing from memory.
The system provides immediate information about students' handwriting
process and products. The process parameters record the spatial and temporal
characteristics of handwriting, including the total writing time, on-paper
time, in-air time, the ratio of in-air time to on-paper time, and the
speed. The product parameter is handwriting accuracy. 25 children aged
between 8 and 10 years with dysgraphia and 50 typically developing children of
similar age participated in this study. The results indicated that children
with dysgraphia had a significantly lower accuracy rate in both the copy and memory
writing tasks. Children with dysgraphia also demonstrated a greater ratio of
in-air to on-paper time when both copying and writing complex characters from memory.
The system proposed in this study is able to record the real-time handwriting
performance of pupils with and without writing difficulties. The kinematic and
kinetic indicators provide more information about how children control their
motion when writing. Further studies can include more writing forms, such as
copying, writing from memory, dictation, and free writing in the assessment
system to comprehensively understand the writing problems of children with
dysgraphia. Keywords: Children with Dysgraphia; Handwriting; Computer-Based Assessment |
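The temporal process parameters are defined only verbally in the abstract. A minimal sketch of how they might be computed from timestamped digitizer samples, assuming each sample carries a timestamp, (x, y) coordinates and a pen-down flag; this data model is an assumption, not the system's actual format.

```python
from dataclasses import dataclass
import math

@dataclass
class Sample:
    t: float        # seconds
    x: float        # mm
    y: float        # mm
    pen_down: bool  # True while the pen touches the tablet

def process_parameters(samples):
    """Total time, on-paper time, in-air time, their ratio, and writing speed."""
    total = samples[-1].t - samples[0].t
    on_paper = in_air = path = 0.0
    for prev, cur in zip(samples, samples[1:]):
        dt = cur.t - prev.t
        if prev.pen_down:
            on_paper += dt
            path += math.hypot(cur.x - prev.x, cur.y - prev.y)
        else:
            in_air += dt
    return {
        "total_time_s": total,
        "on_paper_time_s": on_paper,
        "in_air_time_s": in_air,
        "in_air_to_on_paper_ratio": in_air / on_paper if on_paper else float("inf"),
        "speed_mm_per_s": path / on_paper if on_paper else 0.0,
    }

strokes = [Sample(0.00, 0.0, 0.0, True), Sample(0.10, 5.0, 0.0, True),
           Sample(0.15, 5.0, 0.0, False), Sample(0.30, 8.0, 2.0, True)]
print(process_parameters(strokes))
```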
eBooks, Accessibility and the Catalysts for Culture Change | | BIBAK | Full-Text | 543-550 | |
E. A. Draffan; Alistair McNaught; Abi James | |||
The evolution of any product is usually in response to perceived benefits,
whether for the workflow, for cost-benefit, or for the end users. The development of
accessible digital print resources at the source of publication is uniquely
advantageous in many ways. A system with improved accessibility for humans also
enables content to be machine read [1]. Although the global publishing and
digital distribution industries have not uniformly embraced accessibility, the
United Kingdom (UK) has been able to make significant positive progress. The UK
has not embraced a specific disability ebook format and distribution system;
instead, through a model of cross-industry stakeholder engagement, a cultural
shift has begun to embed accessibility at source within the publishing
industry. The authors maintain that the cultural change witnessed is not a
coincidence and has its roots in a particular set of catalysts initiated
by stakeholders, resulting in a model that could be replicated. Keywords: eBooks; Accessibility; Culture Change; Disability; Print Impairment;
ereading |
Electronic Braille Blocks: A Tangible Interface-Based Application for Teaching Braille Letter Recognition to Very Young Blind Children | | BIBAK | Full-Text | 551-558 | |
Rabia Jafri | |||
A software solution for teaching Braille letter recognition to very young
blind children is presented which allows them to interact with the computer by
manipulating NFC-tag embedded blocks with Braille letters embossed on their
sides. Braille letter recognition is taught and reinforced through various
exercises and games, and auditory feedback is provided via a speech interface.
By embedding interactivity into physical blocks, our system provides the best
of both worlds: the manipulation and exploration of physical objects in
accordance with the sensory dependence and developmental needs of young
children and the exploitation of the power of digital technology to extend and
enhance the learning process taking place through traditional exploratory play.
Furthermore, this is a cost-effective solution and does not require children to
have previous experience with computers. This system can be easily adapted in
the future to teach other concepts such as Braille numbers, shape or texture
recognition. Keywords: Tangible User Interfaces; Braille Literacy; Blind; Visually Impaired;
Educational Software; Children |
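The abstract describes the interaction loop (a scanned block is resolved to its Braille letter and an auditory prompt is produced) without implementation details. A minimal sketch of that loop, assuming hypothetical tag identifiers and a `speak` stub standing in for the speech interface; none of these names come from the paper.

```python
# Hypothetical mapping from NFC tag UIDs to the Braille letters on the blocks.
TAG_TO_LETTER = {"04:a1:5c:12": "a", "04:a1:5c:13": "b", "04:a1:5c:14": "c"}

def speak(text):
    """Stub for the speech interface (a TTS engine in the real system)."""
    print(f"[speech] {text}")

def letter_recognition_exercise(target_letter, scanned_tag_uid):
    """One round of a letter-recognition game: ask for a letter, check the block."""
    letter = TAG_TO_LETTER.get(scanned_tag_uid)
    if letter is None:
        speak("I do not know this block. Try another one.")
    elif letter == target_letter:
        speak(f"Well done! That is the letter {letter.upper()}.")
    else:
        speak(f"That is {letter.upper()}. Try to find {target_letter.upper()}.")

speak("Find the letter A.")
letter_recognition_exercise("a", "04:a1:5c:12")
```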
Fostering Better Deaf/Hearing Communication through a Novel Mobile App for Fingerspelling | | BIBA | Full-Text | 559-564 | |
Jorge Andres Toro; John C. McDonald; Rosalee Wolfe | |||
Fingerspelling is a critical part of sign language communication used not only by
deaf children but also by the parents, teachers and interpreters who support them.
The recognition of fingerspelling is particularly difficult for sign language
learners and support software for practice is particularly limited due to the
fluid and natural way that signers will spell with their hands. Any software
tool that helps people practice reading fingerspelling must be natural enough
to represent the fluidity of this motion while at the same time being flexible
enough to spell any list of words in the target language in any order.
To address these needs, this paper introduces a novel mobile app called "Fingerspelling Tutor" that produces natural full-motion fingerspelling using a realistic 3D computer animated character. The app can fingerspell any word that the user types in and can provide practice and quizzing opportunities for the user that are not limited to a fixed set of word lists. The software also allows users to post on social media sites to share their progress with fellow students. |
Developing a New Framework for Evaluating Arabic Dyslexia Training Tools | | BIBAK | Full-Text | 565-568 | |
Fadwa AlRowais; Mike Wald; Gary Wills | |||
Compared to numerous studies in dyslexia, there is still a paucity of
research exploring dyslexia in Arabic and especially the issues that arise in
deciding the success or failure of Arabic dyslexia training tools. The present
research attempts to address this gap by developing an Arabic Framework for
Dyslexia Training Tools (AFDTT) that can be utilized to support the design and
guide the evaluation of such training tools. This paper demonstrates the
development, confirmation and refinement of the AFDTT. Drawing upon established
theories and prior research findings, the initial version of the proposed
framework has been developed. Confirmation and refinement involving feedback
from content experts were carried out on the components of the proposed
framework. Keywords: Dyslexia; Arabic; Framework; Training Tool |
A Fully Accessible Arabic Learning Platform for Assisting Children with Intellectual Challenges | | BIBAK | Full-Text | 569-576 | |
Moutaz Saleh; Jihad Mohamad Aljaam | |||
Children with intellectual challenges (IC) are growing up with wide exposure
to computer technology. Computer software and assistive devices have the
potential to help these children in their education, career development, and
independent living. In spite of the current spread of the use of computers in
education in the Arab world, complete suites of solutions for children with IC
are very scarce. This paper presents a fully accessible Arabic learning
platform for assisting IC children in the State of Qatar. The platform provides
four main components: learning content management,
multimedia educational tutorials, edutainment games, and ontology-based
learning, with the aim of enhancing those children's understanding,
communication, and memorization skills, while also addressing their obesity
problems. The effectiveness of the proposed platform has been tested on IC
children, and the results show clear advances in the children's learning
capabilities and a large improvement in their performance. Keywords: Intellectual Challenges; Learning; Assistive Technology; Accessibility;
Multimedia Tutorials; Edutainment Games; Ontology |
Learning with the iPad in Early Childhood | | BIBAK | Full-Text | 579-582 | |
Linda Chmiliar | |||
Young children typically learn skills and knowledge through play and the
exploration of their environment. In the last few years, many preschool
children have also had the experience of playing on their parent's smart phone
and/or tablet. Although, there is some research that indicates that exploration
that includes the use of digital technologies can support the development of
preschool children, research looking specifically at learning with the iPad for
preschool children is just beginning to emerge. The focus of this study was to
look at the use of the iPad by preschool children with special needs over a 6
week period of time. Keywords: Mobile Technology; Special Needs; Preschool; iPad |
The Influence of Age and Device Orientation on the Performance of Touch Gestures | | BIBAK | Full-Text | 583-590 | |
Linda Wulf; Markus Garschall; Michael Klein; Manfred Tscheligi | |||
Touch interaction has become a popular and widespread interaction technique.
Recent studies indicate significant potential for touch interaction with regard
to the integration of older adults into the world of ICT. We carried out a
study with the goal of gaining deeper insight into performance differences
between younger and older users as well as the influence of tablet device
orientation on performance. We implemented an application for the iPad that
measures various performance characteristics while users perform six gestural tasks
-- tap, drag, pinch, pinch-pan, rotate left and rotate right -- in both
portrait and landscape orientations. Results showed the importance of device
orientation as an influencing factor on performance and indicated that age is
not the only factor influencing touch interaction performance. Keywords: Gesture; Tablet; Touchscreen; Aging; Age-Related Differences; Device
Orientation; User Evaluation |
A Tablet-Based Approach to Facilitate the Viewing of Classroom Lecture by Low Vision Students | | BIBAK | Full-Text | 591-596 | |
Stephanie Ludi; Michael Timbrook; Piper Chester | |||
In this paper we describe a tablet-based system that is designed to help
students with partial sight access math and science lecture material in and out
of the classroom. The instructor writes material on the whiteboard, which has a
Mimio Capture bar affixed magnetically as well as sleeves for the markers. The
lecture material is sent as written strokes that the iOS app displays for the
student in real time. Students can adjust the size and contrast of the
material, as well as write notes on the lecture itself for later viewing. The
access to the lecture provided by the system gives students the ability to
follow an active lecture and take more ownership of their learning through note
taking. Keywords: Education; Mathematics; Tablet; Visually Impaired |
The iPad as a Mobile Learning Tool for Post-secondary Students with Disabilities | | BIBAK | Full-Text | 597-600 | |
Linda Chmiliar; Carrie Anton | |||
The use of the iPad as a mobile learning tool in post-secondary settings
with students with disabilities is an area that is still relatively unstudied. This
research study investigates how 2 post-secondary students with disabilities
participating in a university course used iPads in their studies. The study
examines: how students used the iPad; if the iPad, the course materials on the
iPad, and the apps helped to support students in their course work; what issues
arose and how they were addressed; and what kinds of supports the students
needed to use this tool effectively in their studies. Keywords: Mobile Technology; Special Needs; Post-Secondary Students; iPad |