
Proceedings of the 2014 International Workshop on Immersive Media Experiences

Fullname: Proceedings of the 2nd ACM International Workshop on Immersive Media Experiences
Editors: Teresa Chambel; Paula Viana; V. Michael Bove; Sharon Strover; Graham Thomas
Location: Orlando, Florida
Standard No: ISBN 978-1-4503-3122-7; ACM DL: Table of Contents; hcibib: ImmersiveMe14
Links: Workshop Website | Conference Website
  1. Perceptual Immersion
  2. Enabling Technologies and Applications
  3. Keynote Talk
  4. Immersion Affect and Effect

Perceptual Immersion

Development of a Simple and Low-Cost Olfactory Display for Immersive Media Experiences BIBAFull-Text 1-6
  Nicolas S. Herrera; Ryan P. McMahan
Olfaction is an important perceptual function that is often neglected in immersive media (IM) and virtual reality (VR) applications. Because the effects of olfaction have been researched far less than those of the visual, auditory, or haptic senses, the effects of olfactory stimuli on IM experiences remain largely unexplored and debated. A major factor limiting olfaction research is the lack of olfactory display options: commercial solutions are often inadequate and expensive, and while prior research on olfactory displays is helpful, pertinent details are frequently missing and the devices are often too complex to replicate. To address this issue, we have developed a simple, low-cost olfactory display that uses inexpensive components and leverages airflow for vaporization and scent delivery. In this paper, we detail the development of our display and describe an informal study evaluating its effectiveness.
The Sensation of Taste in the Future of Immersive Media BIBAFull-Text 7-12
  Nimesha Ranasinghe; Kuan-Yi Lee; Gajan Suthokumar; Ellen Yi-Luen Do
To create a truly immersive virtual experience, perceiving information through multiple human senses is important. Therefore, new forms of media are required that deeply involve various human senses -- not only sight, sound, and touch, but also nontraditional senses like taste and smell -- to create a perception of presence in a non-physical environment. At present, however, the sensation of taste is considered one of the final frontiers of immersive media. This paper discusses key aspects and opportunities of including the sensation of taste in future immersive media technologies. As a solution, we then present 'Taste+', utensils that digitally enhance the taste sensations of food and beverages without additional flavoring ingredients. Finally, we envision several future usage scenarios and challenges for this technology in facilitating future immersive digital experiences.

Enabling Technologies and Applications

ENF Signal Induced by Power Grid: A New Modality for Video Synchronization BIBAFull-Text 13-18
  Hui Su; Adi Hajj-Ahmad; Chau-Wai Wong; Ravi Garg; Min Wu
Multiple videos capturing the same scene from possibly different viewing angles may be synthesized into a novel immersive experience. Synchronization is an important task for such applications involving multiple pieces of audio-visual data. In this work, we exploit the electric network frequency (ENF) signal inherently embedded in the soundtrack and/or image sequence of a video to temporally align video recordings. The ENF is the supply frequency of power distribution networks in a power grid. Its value fluctuates slightly around its nominal value of 50 Hz or 60 Hz, and the fluctuation trends stay consistent within the same grid. Audio and video recordings created near electric activity may capture the ENF signal through electromagnetic interference and other physical phenomena. We propose to synchronize video recordings by aligning their embedded ENF signals. Because it imposes no major constraints on viewing angle or camera calibration, as many existing methods do, the proposed approach emerges as a new synchronization modality.
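The abstract does not spell out the alignment procedure, but the core idea of matching ENF fluctuation trends can be illustrated with a cross-correlation over two ENF estimates. The sketch below is an assumption-laden illustration, not the paper's method; the function name and the 1 Hz estimate rate are invented for the example.

```python
import numpy as np

def align_by_enf(enf_a, enf_b, fs=1.0):
    """Estimate the time offset between two recordings by
    cross-correlating their ENF fluctuation estimates.

    enf_a, enf_b: 1-D arrays of instantaneous mains frequency (Hz),
    sampled at fs estimates per second. A positive result means
    recording b starts that many seconds into recording a."""
    # Remove the nominal frequency so only the fluctuations remain.
    a = enf_a - np.mean(enf_a)
    b = enf_b - np.mean(enf_b)
    corr = np.correlate(a, b, mode="full")
    lag = np.argmax(corr) - (len(b) - 1)   # peak position in samples
    return lag / fs                        # offset in seconds
```

With real recordings the ENF estimates would first be extracted from the soundtrack (e.g. by narrow-band filtering around 50/60 Hz), which is the harder part of the pipeline.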
On-Display Spatial Audio for Multiple Applications on Large Displays BIBAFull-Text 19-22
  Sachin Deshpande
We describe a spatial audio system design for multiple applications on large displays. Our system provides spatial audio based on an application window's location on the display screen. Our approach allows conveying application audio height information without special content encoding, and supports spatial audio from multiple concurrent on-display AV windows. The on-display audio location moves and resizes when the AV application window is moved and resized. Our design can handle content in any X.Y channel format and spatialize it with a limited number of discrete loudspeakers. We have implemented this spatial audio system without requiring any special hardware, by reusing existing surround sound cards and developing special software to utilize the discrete individual audio channels they support. Subjective listening tests confirm that the spatial location perceived by listeners matches the AV window location in our system.
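The abstract does not state which panning law the system uses. As a minimal sketch of the general idea, a constant-power stereo pan can map a window's normalized horizontal position to two loudspeaker gains; extending this to more speakers and a height dimension would follow the same pattern. The function name is illustrative, not from the paper.

```python
import math

def pan_gains(x_norm):
    """Constant-power stereo panning: map a window's horizontal
    position (0.0 = left edge of the display, 1.0 = right edge)
    to (left, right) loudspeaker gains whose powers sum to 1."""
    theta = x_norm * math.pi / 2   # sweep 0..90 degrees across the screen
    return math.cos(theta), math.sin(theta)
```

Constant-power panning keeps perceived loudness roughly steady as the window moves, which matters when AV windows are dragged across a large display.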
Integration of a Precise Indoor Position Tracking Algorithm with an HMD-Based Virtual Reality System BIBAFull-Text 23-26
  Jongkyu Shin; Gwangseok An; Kyogu Lee
In this paper, we present a new system for a highly immersive virtual reality experience utilizing head-mounted displays (HMDs) with accurate indoor position tracking. The system is designed to have six degrees of freedom in the virtual world, which allows users to physically move around in the real world while wearing the wireless system with an HMD. Three-dimensional X, Y, and Z coordinates are estimated in real time using ultrasonic sensors, and the pitch, roll, and yaw values are also measured. Unlike previously developed systems that require external input devices to move in the virtual environment, our system provides a natural virtual reality experience by precisely matching physical movements in the real world to those in the virtual environment. Results show that the system estimates a user's position accurately and delivers a highly immersive virtual/mixed reality experience.
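The abstract reports real-time X, Y, Z estimation from ultrasonic sensors but not the solver used. One common approach, sketched below under that assumption, is linearized least-squares trilateration from range measurements to beacons at known positions; the function name and beacon layout are illustrative only.

```python
import numpy as np

def trilaterate(beacons, ranges):
    """Estimate a 3-D position from range measurements to beacons
    at known positions (N >= 4). Each sphere equation
    |x - b_i|^2 = r_i^2 is subtracted from the first one, which
    cancels the quadratic |x|^2 term and leaves a linear system:
        2 (b_i - b_0) . x = (r_0^2 - r_i^2) + (|b_i|^2 - |b_0|^2)
    solved here in the least-squares sense."""
    b = np.asarray(beacons, dtype=float)
    r = np.asarray(ranges, dtype=float)
    A = 2.0 * (b[1:] - b[0])
    rhs = (r[0] ** 2 - r[1:] ** 2) + (np.sum(b[1:] ** 2, axis=1)
                                      - np.sum(b[0] ** 2))
    x, *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return x
```

In practice ultrasonic ranges are noisy, so more than four beacons and the least-squares formulation help stabilize the estimate.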

Keynote Talk

Immersion, Imagination & Innovation BIBAFull-Text 27
  Christopher Stapleton
The application of immersion to awaken human potential through innovative experience design depends upon one's ability to spark the imagination. Our keynote speaker shares his journey of exploring how the diverse application of interactive entertainment techniques can define new innovations in life-transformative simulations. Using the interplay of story, play, and game, his design research showcases examples of how stimulating the imagination can enhance military training for the US Army, informal education for NASA, medical imaging for ER surgeons, teacher training in urban classrooms, and experiential marketing in shopping malls, as well as cognitive rehabilitation in therapy clinics.
   As technological advancements in simulation catch up with the science fiction of our parents, what kind of creative leaps will our children be making into a future where reality, virtuality, and imagination work as one world? The future of immersive media will transfer from theme parks to our living rooms, and all-new design paradigms will emerge to transform our homes into a school, museum, theme park, training facility, and shopping center, as well as a medical and rehabilitation clinic. This talk presents a vision with all-new challenges for the future.

Immersion Affect and Effect

Transforming Lives Through Story Immersion: Innovation of Aphasia Rehabilitation Therapy through Storytelling Learning Landscapes BIBAFull-Text 29-34
  Christopher Stapleton; Janet Whiteside PhD; Jim Davies PhD; Dana Mott; Jennifer Vick
Aphasia is a disorder that renders its victims unable to use language effectively. Evidence supports the efficacy of treatment for aphasia, yet the effectiveness and transferability of learned communicative abilities to everyday conversation continue to be investigated. In this paper we explore an alternative approach to aphasia treatment based on the art and science of storytelling. Inherent in storytelling are the motivation to share an experience, the cognitive abilities to organize a story, and the language system to convey the experience. This approach is based on decades of research in aphasia therapy and in immersive storytelling (in other fields), and has been used to engage a subject's creativity and emotions to produce transformative results in real life. We report early, promising results that could radically transform the rehabilitative practice of aphasia.
Sensory Fiction: A Design Fiction of Emotional Computation BIBAFull-Text 35-40
  Felix Heibeck; Alexis Hope; Julie Legault
This paper is situated in the emergent field of "Design Fiction" and describes how this approach can be applied to explorations in the field of immersive media experiences. We present Sensory Fiction -- an exploration in augmenting the emotions of a reader via a modular, multi-sensory system. The science on the nature of emotions is still inconclusive, and direct ways of controlling them computationally are yet to be discovered. However, this project creates a Design Fiction that highlights the opportunities and challenges that the availability of such technology might bring. We leveraged existing scientific insights to build a functional prototype that aims to induce and evoke emotions by stimulating the physiological system. In combination with conceptual, non-functional modules (i.e., modules that do not function physically but that introduce the idea of a physical actuation), we created an artifact that sparks discussion about the future of immersive emotional experiences and that can also be experienced directly by the audience. Lastly, we show that presenting the project in appropriate contexts and analyzing the audience's reactions is a useful strategy for evaluating Design Fiction projects.
GACE: Gesture and Appearance Cutout Embedding for Gaming Applications BIBAFull-Text 41-44
  Tam V. Nguyen; Yong He Tan; Jose Sepulveda
This paper presents a lightweight game framework that provides real-time integration of human appearance and gesture-guided control within a game. It offers a new immersive experience, allowing game users to see their personal appearance interacting in real time with other computer-graphics characters in the game. To make the system easily realizable, we address challenges across the whole pipeline of video processing, gesture recognition, and communication. To this end, we introduce the game framework, Gesture and Appearance Cutout Embedding (GACE), which runs the human appearance cutout algorithm and connects with game components via memory-mapped files. We also introduce gesture-based support to enhance the immersion. Extensive experiments have shown that the proposed system runs reliably and comfortably in real time on a commodity setup.
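The abstract mentions connecting the cutout algorithm to game components through memory-mapped files but does not give a layout. The sketch below assumes a simple file-backed shared buffer with a small width/height header followed by raw RGBA pixels; all names, the header format, and the frame size are assumptions for illustration, not GACE's actual protocol.

```python
import mmap
import os
import struct

FRAME_W, FRAME_H = 640, 480
HEADER = struct.Struct("<II")                  # little-endian width, height
SIZE = HEADER.size + FRAME_W * FRAME_H * 4     # header + RGBA pixels

def open_shared_frame(path):
    """Create (or open) a file-backed buffer that both the cutout
    process and the game process can mmap by path."""
    fd = os.open(path, os.O_RDWR | os.O_CREAT, 0o600)
    os.ftruncate(fd, SIZE)                     # reserve the full frame
    return mmap.mmap(fd, SIZE)

def write_frame(buf, rgba_bytes):
    """Publish one RGBA cutout frame into the shared buffer."""
    buf.seek(0)
    buf.write(HEADER.pack(FRAME_W, FRAME_H))
    buf.write(rgba_bytes)

def read_frame(buf):
    """Read back the current frame, as the game side would."""
    buf.seek(0)
    w, h = HEADER.unpack(buf.read(HEADER.size))
    return w, h, buf.read(w * h * 4)
```

A real integration would also need a synchronization scheme (e.g. a sequence counter or double buffering) so the game never reads a half-written frame.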
Mindful: A Platform for Large-Scale Affective Field Research BIBAFull-Text 45-48
  Guy Feigenblat; Jonathan Herzig; Michal Shmueli-Scheuer; David Konopnicki
In this work we present Mindful, a platform for defining, configuring, executing, and distributing affective experiments to a large-scale audience. These experiments measure participants' emotional reactions to media content selected by experimenters. Furthermore, the platform manages profiles of registered users who have agreed to participate in experiments, as well as data collection and analysis mechanisms. The analyzed data are then used to enrich users' profiles and to better understand their emotional behavior. Throughout the paper we describe the platform in detail and present a use case showing how the platform is used in practice.