Touching the Past: Haptic Augmented Reality for Museum Artefacts | | BIBAK | Full-Text | 3-14 | |
Mariza Dima; Linda Hurcombe; Mark Wright | |||
In this paper we propose a novel interaction technique that creates the
illusion of tactile exploration of museum artefacts which are otherwise
impossible to touch. The technique meets the contextual necessity, often
requested by museum curators, to keep the technology in the background and to direct the focus
of the museum visitor's experience to the artefact itself. Our approach relies
on the combination of haptic interaction and the adaptation of a well-known
illusion that enables museum visitors to make sense of the actual physical
non-touchable artefact in an embodied way, using their sensory and motor
skills. We call this technique Haptic Augmented Reality. Keywords: Museum; haptics; touch; authenticity; haptic augmented reality |
Augmented and Geo-Located Information in an Architectural Education Framework | | BIBAK | Full-Text | 15-26 | |
Ernest Redondo; Janina Puig; David Fonseca; Sergi Villagrasa; Isidro Navarro | |||
This work aims to design an academic experience involving the implementation
of an augmented reality tool in architecture education practices to improve
students' motivation and final marks. We worked with different platforms
for mobile devices to create virtual information channels through a database
associated with 3D virtual models and any other type of media content, which
are geo-located in their real position. Our proposal builds on the
improvement in spatial skills that students can achieve through their innate
affinity with user-friendly digital media such as smartphones or tablets, which
allow them to visualize educational exercises in real geo-located environments
and to share and evaluate their own proposals on site. The
proposed method aims to improve the access to multimedia content on mobile
devices, allowing access to be adapted to all types of users and contents. The
students were divided into control and experimental groups according to the
devices used and the activities to perform. The goal they were given was to
display geo-referenced 3D architectural content using SketchUp and ArMedia for
iOS and a custom platform for the Android environment. Keywords: Augmented reality; e-learning; geo-e-learning; urban planning; educational
research |
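As an illustration of the geo-located content channels described above, the sketch below shows how a mobile client might decide which 3D models to overlay based on the device's GPS position. The channel contents, file names, coordinates, and the 150 m visibility radius are hypothetical assumptions, not values from the paper.

```python
from math import radians, sin, cos, asin, sqrt

# Hypothetical channel of geo-located 3D content: (model name, latitude, longitude)
CHANNEL = [
    ("student_proposal_01.skp", 41.3883, 2.1131),
    ("student_proposal_02.skp", 41.3897, 2.1150),
]

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS-84 coordinates."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(a))

def models_in_range(device_lat, device_lon, radius_m=150):
    """Return the geo-located models close enough to be overlaid on the camera view."""
    return [name for name, lat, lon in CHANNEL
            if haversine_m(device_lat, device_lon, lat, lon) <= radius_m]

print(models_in_range(41.3885, 2.1135))
```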
The Didactical Design of Virtual Reality Based Learning Environments for Maintenance Technicians | | BIBAK | Full-Text | 27-38 | |
Tina Haase; Nathalie Weisenburger; Wilhelm Termath; Ulrike Frosch; Dana Bergmann; Michael Dick | |||
The paper at hand describes the necessity of developing didactically
designed Virtual Reality (VR) based learning environments. Changing industrial
processes triggered by the fourth industrial revolution will influence working
and learning conditions. VR based learning environments have the potential to
improve the understanding of complex machine behavior. The paper describes
possibilities for the investigation and documentation of expert knowledge as a
crucial source for developing VR scenarios. The consideration of learning
objectives and the current state of the learners' know-how are essential for
designing an effective learning environment. The basic theoretical approaches
of didactics and their application to virtual learning environments will be
presented with an example for the maintenance of a high voltage circuit
breaker. Finally, experiences from practical use will be reflected upon, and next
steps on the way to a user-specific learning environment will be discussed. Keywords: Virtual Reality; Maintenance; Expert knowledge; learning theory; learning
objectives |
Bridging the Gap between Students and Laboratory Experiments | | BIBAK | Full-Text | 39-50 | |
Max Hoffmann; Katharina Schuster; Daniel Schilberg; Sabina Jeschke | |||
After finishing their studies, graduates need to apply their knowledge in
new environments. In order to professionally prepare students for new
situations, virtual reality (VR) simulators can be utilized. During our
research, such a simulator is applied in order to enable the visit of remote
laboratories, which are designed through advanced computer graphics in order to
create simulated representations of real world environments. That way, it is
our aim to facilitate the access to practical engineering laboratories.
Our goal is to enable a secure visit to hard-to-reach or dangerous places for students of technical studies. The first step towards the virtualization of engineering environments, e.g. a nuclear power plant, consists of the development of demonstrators. In the present paper, we describe the elaboration of an industry-relevant demonstrator for the advanced teaching of engineering students. Within our approach, we use a virtual reality simulator called the "Virtual Theatre". Keywords: Virtual Reality; Virtual Theatre; Remote Laboratories; Immersion |
Applying Saliency-Based Region of Interest Detection in Developing a Collaborative Active Learning System with Augmented Reality | | BIBAK | Full-Text | 51-62 | |
Trung-Nghia Le; Yen-Thanh Le; Minh-Triet Tran | |||
Learning activities need not take place only in traditional physical
classrooms but can also be set up in virtual environments. Therefore, the authors
propose a novel augmented reality system to organize a class supporting
real-time collaboration and active interaction between educators and learners.
A pre-processing phase is integrated into a visual search engine, the heart of
our system, to recognize printed materials with low computational cost and high
accuracy. The authors also propose a simple yet efficient visual saliency
estimation technique based on regional contrast to quickly filter
out low-informative regions in printed materials. This technique not only
reduces unnecessary computational cost of keypoint descriptors but also
increases robustness and accuracy of visual object recognition. Our
experimental results show that the whole visual object recognition process can
be sped up 19 times and the accuracy can increase by up to 22%. Furthermore, this
pre-processing stage is independent of the choice of features and matching
model in a general process. Therefore, it can be used to boost existing systems
to real-time performance. Keywords: Smart Education; Active Learning; Visual Search; Saliency Image;
Human-Computer Interaction |
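The abstract describes a regional-contrast saliency step that masks out low-information regions before keypoint extraction. The authors' exact algorithm is not given, so the following is only a minimal block-wise contrast sketch (OpenCV/NumPy) illustrating the idea of restricting descriptor computation to salient regions; the block size and threshold are assumptions.

```python
import cv2
import numpy as np

def block_contrast_saliency(gray, block=32):
    """Rough regional-contrast saliency: each block scores by how far its mean
    intensity is from the global mean (a stand-in for the paper's method)."""
    h, w = gray.shape
    sal = np.zeros_like(gray, dtype=np.float32)
    global_mean = gray.mean()
    for y in range(0, h, block):
        for x in range(0, w, block):
            region = gray[y:y + block, x:x + block]
            sal[y:y + block, x:x + block] = abs(region.mean() - global_mean)
    return sal / (sal.max() + 1e-6)

img = np.random.randint(0, 255, (480, 640), dtype=np.uint8)  # stand-in for a printed page
saliency = block_contrast_saliency(img)
mask = (saliency > 0.25).astype(np.uint8) * 255     # keep only informative regions
orb = cv2.ORB_create()
keypoints = orb.detect(img, mask)                    # keypoints only where the mask is set
```

Restricting detection to the masked regions is what saves descriptor computation on uninformative areas, independently of which feature and matching model is used afterwards.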
A 3D Virtual Learning System for STEM Education | | BIBAK | Full-Text | 63-72 | |
Tao Ma; Xinhua Xiao; William Wee; Chia Yung Han; Xuefu Zhou | |||
A recent boom has been seen in 3D virtual worlds for entertainment, and this
in turn has led to a surge of interest in their educational applications.
Despite this booming development, most such systems only reinforce
traditional teaching methods on a new platform without changing the nature
of how teaching and learning take place. Modern computer science technology should be applied
in STEM education for the purpose of raising learning efficiency and interest.
In this paper, we focus on the reasoning, design, and implementation of a 3D
virtual learning system that merges STEM experiments into a virtual laboratory
and brings entertainment to knowledge learning. An advanced hand gesture
interface was introduced to enable flexible manipulation of virtual objects
with two hands. The recognition of single-hand grasping-moving-rotating
activity (SH-GMR) allows a single hand to move and rotate a virtual object at the
same time. We implemented several virtual experiments in the VR environment to
demonstrate to the public that the proposed system is a powerful tool for STEM
education. The benefits of the system are evaluated through two virtual
experiments in the STEM field. Keywords: 3D virtual learning; Human machine interface (HCI); hand gesture
interaction; single hand grasping-moving-rotating (SH-GMR); STEM education |
Visible Breadboard: System for Dynamic, Programmable, and Tangible Circuit Prototyping with Visible Electricity | | BIBAK | Full-Text | 73-84 | |
Yoichi Ochiai | |||
This paper reports a new system for prototyping circuits called the Visible
Breadboard. The Visible Breadboard is a solderless breadboard that allows users
to make or erase physical wirings with tangible input by hand and to see the
voltage level of each hole at all times by a colored LED light.
The Visible Breadboard has 60 solid-state relays set in parallel crosses and controlled by a micro-controller. These relays connect the 36 holes on the system surface. The connected holes work as wirings in the circuit into which users can insert electronic materials. Each hole has an AD converter function working as a voltmeter and a full-color LED. The voltage of each hole can be visualized by these full-colored LEDs. Users can operate this system by touching the surface with their fingertips. Users can also connect the Visible Breadboard to a PC. When the Visible Breadboard is connected to the PC, it functions as a new kind of interface for developing and sharing circuits. Our experimental results showed that this device enables users to build circuits faster and more easily than an ordinary solderless breadboard. Keywords: Rapid Prototyping; Physical Computing; HCI |
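The abstract states that each of the 36 holes has an AD converter acting as a voltmeter and a full-colour LED showing its voltage. The paper does not give the colour scale, so the small sketch below only illustrates one plausible mapping (0-5 V onto a blue-to-red gradient) and a generic ADC-to-volts conversion; the 10-bit resolution and the gradient are assumptions.

```python
def voltage_to_rgb(volts, v_max=5.0):
    """Map a measured voltage to an RGB triple for a full-colour LED.
    The blue-to-red gradient is an assumption; the paper does not specify the scale."""
    t = max(0.0, min(volts / v_max, 1.0))
    return (int(255 * t), 0, int(255 * (1 - t)))   # low voltage = blue, high voltage = red

def adc_to_volts(raw, bits=10, v_ref=5.0):
    """Convert a raw ADC reading (e.g. 10-bit) to volts."""
    return raw / ((1 << bits) - 1) * v_ref

# Example: a hole whose ADC reads 512 out of 1023 (~2.5 V) lights roughly purple.
print(voltage_to_rgb(adc_to_volts(512)))
```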
The Application of Augmented Reality for Reanimating Cultural Heritage | | BIBAK | Full-Text | 85-95 | |
Sasithorn Rattanarungrot; Martin White; Zeeshan Patoli; Tudor Pascu | |||
This paper presents the design of a service-oriented architecture to support
dynamic cultural content acquisition on a mobile augmented reality system for
reanimating cultural heritage. The reanimating cultural heritage system
provides several domain interfaces (Web, Web3D, Mobile and Augmented Reality)
for presenting cultural objects accessed from an aggregated RCH data repository
via web services. This paper largely focuses on the augmented reality system,
but discusses the Web, Web3D and Mobile domains to set the paper in context.
The mobile augmented reality system performs multiple-object tracking to
augment digital media content onto real-world cultural object scenes. The
proposed mobile augmented reality system is composed of a mobile interface
(smartphone, tablet), middleware including the augmented reality SDK and
supporting software modules for the augmented reality application, and a web
service framework. Keywords: service-oriented architecture; multiple object tracking; web service
framework; augmented reality |
Training to Improve Spatial Orientation in Engineering Students Using Virtual Environments | | BIBAK | Full-Text | 96-104 | |
Cristina Roca-Gonzalez; Jorge Martín-Gutiérrez; Cristina Mato Corredeguas; Melchor García-Domínguez | |||
This work presents the results obtained from an experience performed with
freshman students of the Industrial Engineering degree at Las Palmas de Gran
Canaria University, aiming to improve their spatial abilities. The literature
on spatial abilities shows a great lack of uniformity in the adopted
terminology, as a consequence of different approaches, researchers' fields of
study and the scale of the research. But all studies agree on the relationship
between a high level of spatial ability and the likelihood of success in
certain professional careers and university degrees such as engineering, which
is our case here. The pilot study described in this paper
aims to improve the Spatial Orientation component of spatial abilities and for
this we conducted two training experiences based on orienteering sports:
one was performed in a real environment while the other took place in a
virtual environment. The results show that this component can be trained and
improved in both environments, with no significant difference found
between the two types of training. Keywords: Spatial abilities; Spatial orientation; Environmental scale; Orienteering;
Virtual worlds |
Staging Choreographies for Team Training in Multiple Virtual Worlds Based on Ontologies and Alignments | | BIBAK | Full-Text | 105-115 | |
Emanuel Silva; Nuno Silva; Leonel Morgado | |||
In this paper we present an approach that makes possible the staging of
choreographies for education and training purposes in potentially any virtual
world platform. A choreography is seen here as the description of a set of
actions that must or may be executed by a group of participants, including the
goals to be achieved and any restrictions that may exist. We present a
system architecture and the formalization of a set of processes that are able
to transform a choreography from a platform-independent representation into a
specific virtual world platform's representation. We adopt an ontology-based
approach with distinct levels of abstraction for capturing and representing
multi-actor and multi-domain choreographies to be staged in virtual world
platforms with distinct characteristics. Ontologies are characterized according
to two complementary dimensions -- choreography's domain (independent and
dependent) and virtual world platform (independent and dependent) -- giving
rise to four ontologies. Ontology mappings between these ontologies enable the
automatic generation of a choreography for virtually any target virtual world
platform, thus reducing the time and effort of the choreography development. Keywords: virtual worlds; training; choreography; multi-user; model-driven; ontology;
mapping |
"Make Your Own Planet": Workshop for Digital Expression and Physical Creation | | BIBAK | Full-Text | 116-123 | |
Hiroshi Suzuki; Hisashi Sato; Haruo Hayami | |||
We propose the "Make Your Own Planet" workshop, which combines handicraft
and digital representation tools (3DCG effects). In this workshop, a child uses
a USB camera to select textures freely in the process of making an original
3DCG planet. All 3DCG planets are then placed in a simulated universe for
public viewing. By watching this universe, viewers can appreciate the planet of
each child. Further, the texture of each 3DCG planet is translated to a
polyhedron template and printed out as a paper-craft template. In this process,
children employ computers to transform their planets into physical objects that
they can bring home. We first describe the workshop concept and then the method
by which it was implemented. Finally, we evaluate the workshop. Keywords: Digital workshop; 3DCG; Unity; I/O device |
Usability Evaluation of Virtual Museums' Interfaces Visualization Technologies | | BIBAK | Full-Text | 124-133 | |
Stella Sylaiou; Vassilis Killintzis; Ioannis Paliokas; Katerina Mania; Petros Patias | |||
This paper reports on a user-centered formative usability evaluation of
diverse visualization technologies used in Virtual Museums. It initially
presents the selection criteria and the five museum websites involved in the
analysis. Then, it describes the evaluation process, in which a group of
subjects explored the museums' on-line resources and answered two usability
questions concerning their overall reaction to the website and their subjective
satisfaction. After user testing, quantitative and qualitative
data were collected and statistically analysed. However, much remains to be
done in future research in terms of larger samples, different
methodologies and varied contexts. Keywords: history and culture; digital humanities; cultural informatics
Manasek AR: A Location-Based Augmented Reality Application for Hajj and Umrah | | BIBAK | Full-Text | 134-143 | |
Mounira Taileb; Elham Al-Ghamdi; Nusaibah Al-Ghanmi; Abeer Al-Mutari; Khadija Al-Jadani; Mona Al-Ghamdi; Alanood Al-Mutari | |||
In this paper a location-based augmented reality application is presented.
It is a mobile application whose goal is to facilitate the journey of millions
of pilgrims when performing Hajj and Umrah and overcome the difficulties they
face. Using augmented reality, the application displays different types of
information about the pilgrims' surroundings in a mobile camera view. The
usability testing of the proposed application ended successfully with a very
high rate of positive feedback from users. Keywords: Location-based Augmented Reality; GPS; compass; accelerometer |
Support of Temporal Change Observation Using Augmented Reality for Learning | | BIBAK | Full-Text | 144-155 | |
Takafumi Taketomi; Angie Chen; Goshiro Yamamoto; Hirokazu Kato | |||
Augmented reality (AR) technology makes it possible to show additional
information by superimposing virtual objects onto the real world. AR
technology is increasingly used in learning environments for observing
otherwise unseeable objects. Observation is the important process of inspecting a target
object in significant detail; it forms the basis of all scientific knowledge
in education. However, there are only a few AR applications that can visualize
the temporal changes of objects. In addition, the effect of this temporal
change visualization by AR has not been investigated from a scientific perspective. In
this study, in order to clarify the effect of temporal change visualization by
AR, we compared the AR-based temporal change visualization method with
conventional temporal change visualization methods in an experiment.
In particular, we set observation of plant growth as a practical scenario.
Through the experiment, we confirmed that superimposing the past appearance
onto the user's viewpoint is effective for the temporal change observation
scenario. Keywords: Augmented Reality; Temporal Change Visualization; Learning Support
Augmented Reality Workshops for Art Students | | BIBAK | Full-Text | 156-166 | |
Marcin Wichrowski; Ewa Satalecka; Alicja Wieczorkowska | |||
In this paper, we describe the program of our AR workshops dedicated to art
students. Our observations regarding supervising such lab courses and students'
works are presented. We would like to present a methodology for AR training
when the students are not experienced in computer programming. We hope this
will encourage other art and IT teachers to join efforts and incorporate AR
into the curriculum as a very promising concept for merging technology with visual
communication. The potential of AR is very high and, therefore, it is important
to introduce students to AR and the process of creating their own working
projects. Keywords: Augmented Reality; education; AR workshops; art projects; mobile AR |
Serious Games as Positive Technologies | | BIBAK | Full-Text | 169-177 | |
Luca Argenton; Esther Schek; Fabrizia Mantovani | |||
Serious games are emerging as innovative tools to promote opportunities for
human psychological growth and well-being. The aim of the present paper is to
introduce them as Positive Technologies. Positive Technology is an emergent
field based on both theoretical and applied research, whose goal is to
investigate how Information and Communication Technologies (ICTs) can be used
to improve the quality of personal experience at three levels: hedonic
well-being, eudaimonic well-being and social well-being. As Positive
Technologies, serious games can influence both individual and interpersonal
experiences by nurturing positive emotions, promoting engagement, as well as
enhancing social integration and connectedness. An in-depth analysis of each of
these aspects will be presented in the chapter, with the support of concrete
examples. Keywords: Positive psychology; positive technology; serious games; wellbeing |
An Experience-Based Chinese Opera Using Live Video Mapping | | BIBAK | Full-Text | 178-189 | |
Xiang-Dan Huang; Byung-Gook Lee; Hyung-Woo Kim; Joon-Jae Lee | |||
In this work, we choose Chinese Opera as our research material, hoping to
increase people's acceptance of and intimacy with the performance. The theme is
"Havoc in the Dragon Palace", one chapter of the sixteenth-century Chinese novel
"Journey to the West" by Wu Cheng'en. We developed a rendering technique
named "Live Video Mapping". It focuses on both the detection of human movement
and real-time interaction with the background video. The virtual images on the
stage not only provide a good view but also give the audience the
illusion of a space that is expanding and being enhanced. Taking into
account the above factors, this study explores the possibility of interactive
video mapping, as well as ways to increase the affinity for Chinese
Opera and promote its value. Keywords: Journey to the West; Chinese Opera; real-time interactive experience; live
video mapping |
Serious Games: Customizing the Audio-Visual Interface | | BIBAK | Full-Text | 190-199 | |
Bill Kapralos; Robert Shewaga; Gary Ng | |||
Serious games are gaining in popularity within a wide range of educational
and training applications given their ability to engage and motivate learners
in the educational process. Recent hardware and computational advancements are
providing developers the opportunity to develop applications that employ a high
level of fidelity (realism) and novel interaction techniques. However, despite
these great advances in hardware and computational power, real-time high
fidelity rendering of complex virtual environments (found in many serious
games) across all modalities is still not feasible. Perceptual-based rendering
exploits various aspects of the multi-modal perceptual system to reduce
computational requirements without any perceptual effects on the
resulting scene. A series of human-based experiments demonstrated a potentially
strong effect of sound on visual fidelity perception, and task performance.
However, the resulting effects were subjective whereby the influence of sound
was dependent on various individual factors including musical listening
preferences. This suggests the importance of customizing (individualizing) a
serious game's virtual environment with respect to audio-visual fidelity,
background sounds, etc. In this paper, details regarding this series of
audio-visual experiments are provided, followed by a description of current
work that is examining the customization of a serious game's virtual
environment by each user through the use of a game-based calibration method. Keywords: Serious games; virtual simulation; audio-visual interaction; audio-visual
fidelity; calibration |
Designing AR Game Enhancing Interactivity between Virtual Objects and Hand for Overcoming Space Limit | | BIBAK | Full-Text | 200-209 | |
Kyungyeon Moon; Jonghee Sang; Woonteak Woo | |||
We propose a real-time interactive game based on Augmented Reality
(AR). It is composed of an AR marker, a head-mounted display and a depth camera. By
using the marker, the proposed system augments the game space, a fishing place, and the
player can interact with virtual game objects such as bait or fish with bare hands,
based on computer vision. The rapid development of AR technologies has raised
profound interest in the design of AR games, but existing games have not
provided realistically felt game environments because the way games are played
remains the same when the platform changes. In addition, studies in this
field have not fully utilized AR technologies, so the inherent characteristics
of AR games do not shape the user experience or receive explicit attention in
design concepts. Our system gives the player the experience of grasping virtual
objects. It can also be applied to various game content that feels
real. Keywords: entertainment; augmented reality; 3D interaction; HMD; hand-tracking |
THE GROWTH: An Environmental Game Focusing on Overpopulation Issues | | BIBAK | Full-Text | 210-221 | |
Charn Pisithpunth; Panagiotis Petridis; Petros Lameras; Ian Dunwell | |||
THE GROWTH is an environmental game aiming to tackle growing population
issues and their impact on the natural environment. The game also extends to cover
social issues and unsustainable resource consumption caused by rapid
population growth. Unlike many environmental games, THE GROWTH demonstrates
that financial, social and health factors can be improved simply by committing
to sustainable consumption patterns. The game aims to investigate the
possibility of using serious games to promote players' environmental awareness
and ultimately, the possibility of using serious games to modify players'
consumption patterns. The game is designed for a specific target group, males
between 20 and 30 years of age in Bangkok (Thailand), and is focused on
environmental issues arising in residential accommodation. Early experimental
sessions were conducted with 17 participants and this paper presents the
preliminary results of the study. Keywords: Applications: Education; Applications: Entertainment; Applications: Virtual
worlds and social computing; Interaction and navigation in VR and MR:
Immersion; serious games |
Responses during Facial Emotional Expression Recognition Tasks Using Virtual Reality and Static IAPS Pictures for Adults with Schizophrenia | | BIBAK | Full-Text | 225-235 | |
Esubalew Bekele; Dayi Bian; Zhi Zheng; Joel Peterman; Sohee Park; Nilanjan Sarkar | |||
Technology-assisted intervention has the potential to adaptively
individualize and improve outcomes of traditional schizophrenia (SZ)
intervention. Virtual reality (VR) technology, in particular, has the potential
to simulate real world social and communication interactions and hence could be
useful as a therapeutic platform for SZ. Emotional face recognition is
considered among the core building blocks of social communication. Studies have
shown that emotional face processing and understanding are impaired in patients
with SZ. The current study develops a novel VR-based system that presents
avatars that can change their facial emotion dynamically for emotion
recognition tasks. Additionally, this system allows real-time measurement of
physiological signals and eye gaze during the emotion recognition tasks, which
can be used to gain insight about the emotion recognition process in SZ
population. This study further compares VR-based facial emotion recognition
with that of the more traditional emotion recognition from static faces using a
small usability study. Results from the usability study suggest that VR could
be a viable platform for SZ intervention and implicit signals such as
physiological signals and eye gaze can be utilized to better understand the
underlying pattern that is not available from user reports and performance
alone. Keywords: facial expression; emotion recognition; virtual reality; IAPS; adaptive
interaction; eye tracking; physiological processing; schizophrenia intervention |
Attention Training with an Easy-to-Use Brain Computer Interface | | BIBAK | Full-Text | 236-247 | |
Filippo Benedetti; Nicola Catenacci Volpi; Leonardo Parisi; Giuseppe Sartori | |||
This paper presents a cognitive training based on a brain-computer interface
(BCI) that was developed for an adult subject with an attention disorder.
According to the neurofeedback methodology, the user processes in real time his
own electrical brain activity, which is detected through a non-invasive EEG
device. The subject was trained in actively self-modulating his own electrical
patterns within a play therapy by using a reward-based virtual environment.
Moreover, a consumer easy-to-use EEG headset was used, in order to assess its
suitability for a concrete clinical application. At the end of the training,
the patient obtained a significant improvement in attention. Keywords: Play therapy; Attention training; Rehabilitation; Brain-computer interface
(BCI); Neurofeedback |
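The abstract describes a reward-based virtual environment driven by real-time EEG but does not specify the feature being trained. A commonly used attention-training measure is the theta/beta band-power ratio; the sketch below computes it with SciPy and turns it into a reward score, purely as an illustration. The sampling rate, band limits, and reward mapping are assumptions, not the protocol used in the paper.

```python
import numpy as np
from scipy.signal import welch

FS = 256  # sampling rate in Hz (assumed; depends on the consumer EEG headset)

def band_power(signal, fs, lo, hi):
    """Mean power spectral density of the signal in the [lo, hi] Hz band."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    band = (freqs >= lo) & (freqs <= hi)
    return psd[band].mean()

def attention_reward(eeg_window):
    """Example neurofeedback score: a lower theta/beta ratio yields a larger reward.
    Illustrative only; the paper's exact feedback rule is not reproduced here."""
    theta = band_power(eeg_window, FS, 4, 8)
    beta = band_power(eeg_window, FS, 13, 30)
    ratio = theta / (beta + 1e-12)
    return 1.0 / (1.0 + ratio)   # in (0, 1], used to drive the virtual reward

window = np.random.randn(FS * 4)  # 4 s of stand-in EEG data
print(attention_reward(window))
```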
Augmented Reality Treatment for Phantom Limb Pain | | BIBAK | Full-Text | 248-257 | |
Francesco Carrino; Didier Rizzotti; Claudia Gheorghe; Patrick Kabasu Bakajika; Frédérique Francescotti-Paquier; Elena Mugellini | |||
Mirror therapy has been used for many years to treat phantom limb pain in
amputees. However, this approach presents several limitations that could be
overcome using the possibilities of new technologies. In this paper we present
a novel approach based on augmented reality, 3D tracking and 3D modeling to
enhance the capabilities of the classic mirror therapy. The system was
conceived to be integrated into a three-step treatment called "Graded motor
imagery" that includes: limb laterality recognition, motor imagery and,
finally, mirror therapy. Aiming at a future home care therapy, we chose to work
with low-cost technologies studying their advantages and drawbacks.
In this paper, we present the conception and a first qualitative evaluation of the developed system. Keywords: Augmented Reality; 3D tracking; 3D modeling; phantom limb pain treatment;
mirror therapy |
Comparing Data from a Computer Based Intervention Program for Patients with Alzheimer's Disease | | BIBAK | Full-Text | 258-266 | |
Agisilaos Chaldogeridis; Thrasyvoulos Tsiatsos; Moses Gialaouzidis; Magdalini Tsolaki | |||
Nowadays, dealing with Alzheimer's disease (AD) includes a combination of
pharmaceutical and non-pharmaceutical treatment. But, current drugs do not, and
potential future drugs might not, improve quality of life. Evidence suggests
psychosocial interventions, like educational and arts programs, do in fact have
such a benefit. Supportive and enriching information technology may be more
important than biotechnology (Whitehouse, 2013). Thus, non-pharmaceutical
treatment that also includes physical and mental exercise seems to perform
better. There are many forms of mental exercise, from simple crossword
puzzles to sophisticated video games that exercise different cognitive skills.
The main objective of this report is to present the results of a computer-based
intervention program for people with AD that took place in two Day Care Centers
of Greek Association of Alzheimer's Disease and Related Disorders in
Thessaloniki, Greece. There is a significant amount of data on patients who
have taken part in intervention programs since 2009. For the
purpose of this study we included data for a period of one year only. These
patients have been tested before and after each intervention program (pre-test
and post-test). Our work was to compare these data to examine how the program
performs and which cognitive skills seem to improve most. The
results showed that patients' overall scores were preserved over this period of
time and showed a slight improvement, which is a promising result indicating that
this intervention program has positive effects. Keywords: computerized cognitive training; Alzheimer's disease; cognitive
rehabilitation |
Virtual Reality-Based System for Training in Dental Anesthesia | | BIBAK | Full-Text | 267-276 | |
Cléber G. Corrêa; Fátima de Lourdes dos Santos Nunes; Romero Tori | |||
This paper presents the development and preliminary evaluation of a Virtual
Reality-based system for training in dental anesthesia. The development focused
on the simulation of an anesthesia procedure task. The evaluation involved graphic
and haptic issues and was carried out with experts in the dentistry area. The
assessment aimed at attributes that may influence the human-computer
interaction, hindering realism, an important challenge in systems of this type.
The attributes selected were: the update rate, the appearance of the virtual
models and the number of viewpoints of the virtual environment, as well as the
characteristics of the haptic device. Although constraints were found, in the
experts' perception the system may provide realism and help with the
training of certain tasks. Keywords: dental anesthesia; human-computer interaction; Virtual Reality |
Adaptive Architecture to Support Context-Aware Collaborative Networked Virtual Surgical Simulators (CNVSS) | | BIBAK | Full-Text | 277-286 | |
Christian Diaz; Helmuth Trefftz; Lucia Quintero; Diego Acosta; Sakti Srivastava | |||
Stand-alone and networked surgical virtual reality based simulators have
been proposed as a means to train surgical skills with or without a supervisor
near the student or trainee. However, surgical skills teaching in medical
schools and hospitals is changing, requiring the development of new tools that
focus on: (i) the importance of the mentor's role, (ii) teamwork skills and (iii) remote
training support. For these reasons, a surgical simulator should not only allow
the training involving a student and an instructor that are located remotely,
but also the collaborative training session involving a group of several
students adopting different medical roles during the training session.
Collaborative Networked Virtual Surgical Simulators (CNVSS) allow collaborative training of surgical procedures where remotely located users with different surgical roles can take part in a training session. Several works have addressed the issues related to the development of CNVSS using various strategies. To the best of our knowledge no one has focused on handling heterogeneity in collaborative surgical virtual environments. Handling heterogeneity in this type of collaborative sessions is important because not all remotely located users have homogeneous Internet connections, nor the same interaction devices and displays, nor the same computational resources, among other factors. Additionally, if heterogeneity is not handled properly, it will have an adverse impact on the performance of each user during the collaborative session. In this paper we describe the development of an adaptive architecture with the purpose of implementing a context-aware model for collaborative virtual surgical simulation in order to handle the heterogeneity involved in the collaboration session. Keywords: Context Aware; Collaborative Networked Surgical Simulators; Remote Medical
Training |
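The abstract argues that heterogeneity (network connections, interaction devices, computational resources) must be handled so it does not degrade the collaborative session. As one possible, simplified realization of such context-aware adaptation (not the architecture proposed in the paper), the sketch below tunes per-client update rate and level of detail from measured context; all thresholds and field names are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class ClientContext:
    bandwidth_kbps: float   # measured network capacity of this participant
    gpu_score: float        # normalized 0..1 capability of the client device
    has_haptics: bool       # whether a haptic interaction device is attached

def adapt_session(ctx: ClientContext) -> dict:
    """Pick per-client simulation settings so slower clients do not stall the session.
    Thresholds are illustrative assumptions, not values from the paper."""
    return {
        "state_update_hz": 60 if ctx.bandwidth_kbps > 2000 else 20,
        "mesh_level_of_detail": "high" if ctx.gpu_score > 0.7 else "low",
        "haptic_feedback": ctx.has_haptics,
    }

print(adapt_session(ClientContext(bandwidth_kbps=800, gpu_score=0.5, has_haptics=False)))
```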
Three-Dimensional Fitt's Law Model Used to Predict Movement Time in Serious Games for Rehabilitation | | BIBAK | Full-Text | 287-297 | |
Sergio García-Vergara; Ayanna M. Howard | |||
Virtual reality serious game platforms have been developed to enhance the
effectiveness of rehabilitation protocols for those with motor skill disorders.
Such systems increase the user's motivation to perform the recommended in-home
therapy exercises, but typically do not incorporate an objective method for
assessing the user's outcome metrics. We expand on the commonly used human
modeling method, Fitt's law, used to predict the amount of time needed to
complete a task, and apply it as an assessment method for virtual environments.
During game-play, we compare the user's movement time to the predicted value as
a means for assessing the individual's kinematic performance. Taking into
consideration the structure of virtual gaming environments, we expand the
nominal Fitt's model to one that makes accurate time predictions for
three-dimensional movements. Results show that the three-dimensional refinement
made to the Fitt's model makes better predictions when interacting with virtual
gaming platforms than its two-dimensional counterpart. Keywords: Fitt's law; virtual reality games; physical therapy and rehabilitation;
linear modeling |
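For readers unfamiliar with the model the abstract builds on, the nominal (Shannon) form of Fitts's law predicts movement time as MT = a + b * log2(D/W + 1), where D is the distance to the target, W its width, and a, b are regression coefficients fitted to a user's recorded movements. The 3D variant below, which simply uses the Euclidean distance between start and target points, is only an illustrative assumption and not the refinement proposed in the paper.

```python
import math

def fitts_mt_2d(a, b, distance, width):
    """Nominal Fitts's law: MT = a + b * log2(D / W + 1) (Shannon formulation)."""
    return a + b * math.log2(distance / width + 1)

def fitts_mt_3d(a, b, start, target, target_size):
    """Illustrative 3D variant using the Euclidean distance between start and target.
    This is an assumption for the sketch, not the model proposed in the paper."""
    d = math.dist(start, target)
    return a + b * math.log2(d / target_size + 1)

# a and b are regression coefficients fitted from a user's recorded movements (illustrative values).
a, b = 0.2, 0.15  # seconds
print(fitts_mt_2d(a, b, distance=0.40, width=0.05))
print(fitts_mt_3d(a, b, start=(0, 0, 0), target=(0.3, 0.2, 0.1), target_size=0.05))
```

During game-play, comparing a measured movement time against the predicted MT gives the kind of objective kinematic indicator the abstract describes.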
Multi-users Real-Time Interaction with Bacterial Biofilm Images Using Augmented Reality | | BIBAK | Full-Text | 298-308 | |
Mohammadreza Hosseini; Tomasz Bednarz; Arcot Sowmya | |||
Augmented Reality (AR) applications may be used to enhance understanding of
physical objects by addition of digital information to captured video streams.
We propose a new bio-secure system for interaction with bacterial biofilm images
using AR technology to improve safety in the experimental lab. In the proposed
application we used state-of-the-art real-time feature detection and matching
methods. Various feature detection and matching methods were also compared
with each other for real-time performance and accuracy. The implementation of
the app on a tablet device (Apple iPad) makes it usable by multiple users in
parallel. Keywords: Multi-user; Real-time; biofilm; Augmented reality |
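The abstract compares several real-time feature detection and matching methods without naming the one finally adopted. As one representative real-time pipeline (an assumption, not necessarily the authors' choice), the sketch below uses OpenCV's ORB detector with a brute-force Hamming matcher to find correspondences between a reference biofilm image and a camera frame.

```python
import cv2
import numpy as np

# Stand-in images; in the application these would be a reference biofilm image
# and a frame captured by the tablet camera.
reference = np.random.randint(0, 255, (480, 640), dtype=np.uint8)
frame = np.random.randint(0, 255, (480, 640), dtype=np.uint8)

orb = cv2.ORB_create(nfeatures=500)
kp_ref, des_ref = orb.detectAndCompute(reference, None)
kp_frm, des_frm = orb.detectAndCompute(frame, None)

matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = [] if des_ref is None or des_frm is None else matcher.match(des_ref, des_frm)
matches = sorted(matches, key=lambda m: m.distance)

# With enough good matches, the digital overlay can be registered onto the image.
print(f"{len(matches)} candidate correspondences")
```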
Attention Control and Eyesight Focus for Senior Citizens | | BIBAK | Full-Text | 309-315 | |
Miikka Lääkkö; Aryan Firouzian; Jari Tervonen; Goshiro Yamamoto; Petri Pulli | |||
The population is aging fast and with aging come cognitive impairments that
often require costly facility care. This paper proposes Smart Glasses that can
help alleviate these impairments at their early stages and thus allow senior
citizens to stay out of facility care longer. The Smart Glasses produce
exogenous cues to attract user attention. Four usability experiments are
described to evaluate the utility of the cues and other usability factors of
the proposed system. We expect the results will give us valuable information on
how to improve the design of the system based on senior citizens' needs. Keywords: smart glasses; aging in-place; assistive technology; attention control;
cognitive impairment |
Sense of Presence and Metacognition Enhancement in Virtual Reality Exposure Therapy in the Treatment of Social Phobias and the Fear of Flying | | BIBAK | Full-Text | 316-328 | |
Ioannis Paliokas; Athanasios Tsakiris; Athanasios Vidalis; Dimitrios Tzovaras | |||
The aim of this research effort is to identify feeling-of-presence and
metacognitive amplifiers over existing well-established VRET treatment methods.
Patient real time projection in virtual environments during stimuli exposure
and electroencephalography (EEG) report sharing are among the techniques, which
have been used to achieve the desired result. Starting from theoretical
inferences, the work moves towards a proof-of-concept prototype, which has been
developed as a realization of the proposed method. The evaluation of the
prototype was made possible by an expert team of 28 therapists testing the fear
of public speaking and fear of flying case studies. Keywords: Virtual Reality Exposure Therapy; Anxiety Disorders; Sense of Presence;
Metacognition; Fear of Public Speech; Fear of Flying |
Severe Neglect and Computer-Based Home Training | | BIBAK | Full-Text | 329-339 | |
Inge Linda Wilms | |||
Cognitive rehabilitation from a functional perspective often requires
intensive training over a longer period of time. In the case of rehabilitation
of unilateral neglect, the frequency and intensity needed is expensive and
difficult to implement both for the therapists and the patients. For this
reason, this case study tests the possibility of using computer-based training
in the rehabilitation efforts for a patient with severe neglect who had no
previous skills in computer usage. The article describes the results of the
training both in terms of neuropsychological tests and the reading ability of
the patient. Keywords: optokinetic training; home training; computer-based training; unilateral
neglect; prism adaptation training; bottom-up |
Spatial Augmented Reality in Collaborative Design Training: Articulation between I-Space, We-Space and Space-Between | | BIBAK | Full-Text | 343-353 | |
Samia Ben Rajeb; Pierre Leclercq | |||
This paper analyses the use of augmented reality in advanced project-based
training in design. Our study considers how augmented environments can
contribute to this type of group training: what types of interaction spaces
constitute these new learning environments and how are these spaces constructed
so as to promote collective reflection? Keywords: Project based learning; collaborative design; augmented reality |
Passenger Ship Evacuation -- Design and Verification | | BIBAK | Full-Text | 354-365 | |
Luis Guarin; Yasmine Hifi; Dracos Vassalos | |||
This paper introduces the concept of escape and evacuation from passenger
ships from a perspective of ship design and risk management. As part of that
process, the use of computer simulation tools for analysing the evacuation
performance of ships carrying large numbers of persons on board is becoming
more relevant and useful. The objective of this paper is to present the
pedestrian dynamics simulation tool EVI, developed to undertake advanced escape
and evacuation analysis in the design verification of cruise vessels, passenger
ferries and large offshore construction vessels, among others. Keywords: Evacuation analysis; passenger ships; offshore vessels |
Evaluation of User Experience Goal Fulfillment: Case Remote Operator Station | | BIBAK | Full-Text | 366-377 | |
Hannu Karvonen; Hanna Koskinen; Helena Tokkonen; Jaakko Hakulinen | |||
In this paper, the results of a user experience (UX) goal evaluation study
are reported. The study was carried out as a part of a research and development
project of a novel remote operator station (ROS) for container gantry crane
operation in port yards. The objectives of the study were both to compare the
UXs of two different user interface concepts and to give feedback on how well
the UX goals of experience of safe operation, sense of control, and feeling of
presence are fulfilled with the developed ROS prototype. According to the
results, the experience of safe operation and feeling of presence were not
supported with the current version of the system. However, there was much
better support for the fulfilment of the sense of control UX goal in the
results. Methodologically, further work is needed in adapting the utilized
Usability Case method to suit UX goal evaluation better. Keywords: remote operation; user experience; user experience goal; evaluation |
Increasing the Transparency of Unmanned Systems: Applications of Ecological Interface Design | | BIBAK | Full-Text | 378-389 | |
Ryan Kilgore; Martin Voshell | |||
This paper describes ongoing efforts to address the challenges of
supervising teams of heterogeneous unmanned vehicles through the use of
demonstrated Ecological Interface Design (EID) principles. We first review the
EID framework and discuss how we have applied it to the unmanned systems
domain. Then, drawing from specific interface examples, we present several
generalizable design strategies for improved supervisory control displays. We
discuss how ecological display techniques can be used to increase the
transparency and observability of highly automated unmanned systems by enabling
operators to efficiently perceive and reason about automated support outcomes
and purposefully direct system behavior. Keywords: Ecological Interface Design (EID); automation transparency; unmanned
systems; supervisory control; displays |
Collaborative Visualization of a Warfare Simulation Using a Commercial Game Engine | | BIBAK | Full-Text | 390-401 | |
Hyungki Kim; Yuna Kang; Suchul Shin; Imkyu Kim; Soonhung Han | |||
The requirement for reusable 3D visualization tools has been raised continuously
in various industries. Especially in the defense modeling and simulation field,
there is abundant research on reusable and interoperable visualization
systems, since they play a critical role in efficient decision making by
offering diverse validation and analysis processes. Also, to facilitate this
effectiveness, many currently operating systems are applying VR (Virtual Reality)
and AR (Augmented Reality) technologies aggressively. Against this background, we
conducted research on the design of a collaborative visualization
environment for warfare simulation using a commercial game engine. We
define the requirements by analyzing the advantages and disadvantages of existing
tools or engines like SIMDIS or Vega, and propose methods to utilize
the functionalities of a commercial game engine to satisfy the requirements. The
implemented prototype offers a collaborative visualization environment inside a
CAVE, a facility for immersive virtual environments, in cooperation with
handheld devices. Keywords: 3D Visualization; Game Engine; Warfare Simulation; Collaborative
Visualization Environment |
VELOS: Crowd Modeling for Enhanced Ship Evacuation Analysis | | BIBA | Full-Text | 402-413 | |
Konstantinos V. Kostas; Alexandros-Alvertos Ginnis; Constantinos G. Politis; Panagiotis D. Kaklis | |||
Virtual Environment for Life On Ships (VELOS) is a multi-user Virtual Reality (VR) system that supports designers in assessing (early in the design process) passenger and crew activities on a ship for both normal and hectic conditions of operation and in improving the ship design accordingly [1]. Realistic simulation of behavioral aspects of crowds in emergency conditions requires modeling of panic aspects and social conventions of inter-relations. The present paper provides a description of the enhanced crowd modeling approach employed in VELOS for the performance of ship evacuation assessment and analysis based on the guidelines provided by IMO's Circular MSC 1238/2007 [2]. |
Applying Augmented Reality to the Concept Development Stage of the Total Design Methodology | | BIBAK | Full-Text | 414-425 | |
Gordon M. Mair; Andrew Robinson; John Storr | |||
This paper suggests an approach to assist the identification of suitable
areas of application of AR within the product design process. The approach
utilizes an established methodology for product design development that allows
each stage in the design process to be identified and considered in a logical
and structured manner. By doing this we can consider the suitability for AR at
each stage as opposed to the use of hand drawings, basic computer aided design,
virtual reality, or rapid prototyping techniques and suchlike to produce
physical models. As an example of this we consider the concept design stage of
the product design process and conduct some preliminary experiments in the use
of AR to facilitate the activity. Keywords: Augmented reality; product design; total design; concept design; industrial
design |
Authoring of Automatic Data Preparation and Scene Enrichment for Maritime Virtual Reality Applications | | BIBA | Full-Text | 426-434 | |
Benjamin Mesing; Uwe von Lukas | |||
When realizing virtual reality scenarios for the maritime sector a key
challenge is dealing with the huge amount of data.
Manually adding interactive behaviour to provide a rich interactive experience requires a lot of time and effort. Additionally, even though shipyards today often use PDM or PLM systems to manage and aggregate the data, the export to a visualisation format is not without problems and often requires some post-processing. We present a framework that combines the capabilities of processing large amounts of data for preparing virtual reality scenarios and of enriching them with dynamic aspects like interactive door-opening capabilities. An authoring interface allows non-expert users to orchestrate the data preparation chain and realise individual scenarios easily. |
AR-Based Vehicular Safety Information System for Forward Collision Warning | | BIBAK | Full-Text | 435-442 | |
Hye Sun Park; Kyong-Ho Kim | |||
This paper proposes an AR (augmented reality) based vehicular safety
information system that provides warning information allowing drivers to easily
avoid obstacles without being visually distracted. The proposed system consists
of four stages: fusion data based object tracking, collision threat assessment,
AR-registration, and a warning display strategy. It is shown experimentally
that the proposed system can predict the threat of a collision from a tracked
forward obstacle even during the nighttime and under bad weather conditions.
The system can provide safety information for avoiding collisions by projecting
information directly into the driver's field of view. The proposed system is
expected to help drivers by conveniently providing safety information and
allowing them to safely avoid forward obstacles. Keywords: AR (augmented reality); vehicular safety information; forward collision;
warning system; data fusion; object tracking; threat assessment; warning
strategy |
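The abstract names a collision threat assessment stage fed by fused tracking data. The paper's actual assessment logic is not given; a standard time-to-collision (TTC) check, sketched below, illustrates what such a stage can compute before the AR registration and warning display steps. The thresholds and speed values are illustrative assumptions.

```python
def time_to_collision(gap_m, ego_speed_mps, obstacle_speed_mps):
    """Time to collision with a tracked forward obstacle, in seconds.
    Returns None when the gap is not closing."""
    closing_speed = ego_speed_mps - obstacle_speed_mps
    if closing_speed <= 0:
        return None
    return gap_m / closing_speed

def warning_level(ttc_s, warn_at=3.0, brake_at=1.5):
    """Map TTC to a display strategy; thresholds are illustrative assumptions."""
    if ttc_s is None:
        return "none"
    if ttc_s < brake_at:
        return "urgent"    # e.g. red AR marker registered on the obstacle
    if ttc_s < warn_at:
        return "caution"   # e.g. amber highlight in the driver's field of view
    return "none"

ttc = time_to_collision(gap_m=25.0, ego_speed_mps=20.0, obstacle_speed_mps=10.0)
print(ttc, warning_level(ttc))   # 2.5 s -> "caution"
```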
An Augmented Reality Framework for Supporting and Monitoring Operators during Maintenance Tasks | | BIBAK | Full-Text | 443-454 | |
Guido Maria Re; Monica Bordegoni | |||
The paper proposes a framework for supporting maintenance services in
industrial environments through the use of a mobile device and Augmented
Reality (AR) technologies. 3D visual instructions about the task to carry out
are represented in the real world by means of AR and they are visible through
the mobile device. In addition to the solutions proposed so far, the framework
introduces the possibility to monitor the operator's work from a remote
location. The mobile device stores information for each maintenance step that
has been completed and it makes them available on a remote database.
Supervisors can consequently check the maintenance activity from a remote PC at
any time. The paper also presents a prototype system, developed according to
the framework, and an initial case study in the food industry. Keywords: Augmented Reality; Framework; Maintenance tasks; Remote Supervision
Using VR for Complex Product Design | | BIBAK | Full-Text | 455-464 | |
Loukas Rentzos; Charalampos Vourtsis; Dimitris Mavrikios; George Chryssolouris | |||
Virtual reality is a key technology for the designing of products through
complex human-product interactions. This paper deals with the development of a
product design method for complex human-product interactions, using the virtual
reality (VR) technology. This VR method uses graph theory to measure the
complexity of the designed product on the basis of human task
analysis, which serves to record and analyze the
human-product interactions within an immersive simulation session. The proposed
method undergoes tests in a realistic aerospace case. Keywords: Product Design; Product Complexity; Immersive Environment; Virtual
Prototyping |
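The abstract says graph theory is used to measure product complexity from human task analysis. Purely as an illustration (the paper's actual metric is not reproduced here), the sketch below builds a task/part interaction graph with networkx from hypothetical interactions recorded in an immersive session and reports simple size-based indicators.

```python
import networkx as nx

# Hypothetical human-product interactions recorded during an immersive session:
# (task step, product element it touches)
interactions = [
    ("reach_panel", "access_panel"),
    ("remove_fasteners", "access_panel"),
    ("remove_fasteners", "fastener_set"),
    ("disconnect_harness", "wiring_harness"),
    ("extract_unit", "avionics_unit"),
    ("extract_unit", "wiring_harness"),
]

g = nx.Graph()
g.add_edges_from(interactions)

# Very simple complexity indicators; the paper's actual measure is not given in the abstract.
print("elements and tasks:", g.number_of_nodes())
print("interaction links: ", g.number_of_edges())
print("densest node:      ", max(g.degree, key=lambda nd: nd[1]))
```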
Maritime Applications of Augmented Reality -- Experiences and Challenges | | BIBAK | Full-Text | 465-475 | |
Uwe von Lukas; Matthias Vahl; Benjamin Mesing | |||
The paper summarizes experiences from applied research in visual computing
for the maritime sector. It starts with initial remarks on Augmented Reality in
general and the specific boundary conditions of the maritime industry. The
focus is on a presentation of various concrete AR applications that have been
implemented for use cases in maritime engineering, production, operation and
retrofitting. The paper closes with remarks on future research in this area. Keywords: Augmented Reality; Mixed Reality; Applied Research; Maritime Industry;
Mobile Systems |