[1]
MARCut: Marker-based Laser Cutting for Personal Fabrication on Existing
Objects
Work-in-Progress
/
Kikuchi, Takashi
/
Hiroi, Yuichi
/
Smith, Ross T.
/
Thomas, Bruce H.
/
Sugimoto, Maki
Proceedings of the 2016 International Conference on Tangible and Embedded
Interaction
2016-02-14
p.468-474
© Copyright 2016 ACM
Summary: Typical personal fabrication using a laser cutter allows objects to be
created from raw material and existing objects to be engraved. However,
precisely aligning an existing object with the laser is currently difficult
because available methods rely on indirect manipulation. In this paper, we
propose a marker-based system as a novel paradigm for direct interactive laser
cutting on existing objects. Our system, MARCut, performs laser cutting based
on tangible markers that are applied directly onto the object to express the
design. Two types of markers are available: hand-constructed Shape Markers
that represent the desired geometry, and Command Markers that indicate
operational parameters such as cut, engrave, or material.
[2]
Controlling stiffness with jamming for wearable haptics
Wearable interfaces
/
Simon, Timothy M.
/
Thomas, Bruce H.
/
Smith, Ross T.
Proceedings of the 2015 International Symposium on Wearable Computers
2015-09-07
p.45-46
© Copyright 2015 ACM
Summary: Layer jamming devices enhance wearable technologies by providing haptic
feedback through stiffness control. In this paper we present our prototype,
which demonstrates improved haptic fidelity in a wearable layer jamming
device, using a computer-controlled solenoid to enable fine-grained control of
the garment's stiffness. We also explore variable stiffness configurations for
virtual UI components. An evaluation was conducted to validate the
methodology, demonstrating dynamic stiffness control with two waveforms.
[3]
Interactive Visualisation for Surface Proximity Monitoring
/
Marshall, D. F.
/
Gardner, H. J.
/
Thomas, B. H.
Proceedings of AUIC 2015 Australasian User Interface Conference
2015-01-27
p.41-50
© Copyright 2015 Australian Computer Society
Summary: We consider tasks that require users to be aware of the proximity of two 3D
surfaces and where one or both of these surfaces is changing over time. We
consider situations where users need to quickly and accurately assess when and
where the two surfaces approach each other and eventually intersect. Because
occlusion in 3D visualisations remains an issue in the perception of such data,
a complete, simultaneous perception of the proximity of two such surfaces could
be helpful. We propose and implement a new, interactive, visualisation
technique, "Proximity Map Projection" (PMP), to provide this assistance to
users and describe a user study to investigate the effectiveness of PMP in a
static scenario. This study found that PMP enabled faster and more accurate
identification of regions of nearest proximity and greatest protrusion. As well
as affirming the potential benefits of PMP, this study motivates several areas
of further investigation of the technique.
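The proximity-map idea can be illustrated with a minimal, hypothetical sketch
(this is not the paper's PMP implementation): if both surfaces are heightfields
sampled on a common grid, the per-cell signed gap between them forms a
proximity map whose minimum marks the region of nearest approach, and a
non-positive gap indicates an intersection.

```python
# Illustrative sketch of a proximity map between two surfaces
# (not the PMP implementation from the paper): both surfaces are
# heightfields z(x, y) sampled on the same grid, so the per-cell
# signed gap (upper - lower) is a simple proximity measure.

def proximity_map(lower, upper):
    """Per-cell gap between two heightfields given as 2D lists."""
    return [[u - l for l, u in zip(lrow, urow)]
            for lrow, urow in zip(lower, upper)]

def nearest_point(gap):
    """Grid cell where the surfaces approach closest (smallest gap)."""
    return min((gap[i][j], i, j)
               for i in range(len(gap)) for j in range(len(gap[0])))
    # returns (gap value, row, col); gap <= 0 means intersection

lower = [[0.0, 0.0, 0.0],
         [0.0, 1.0, 0.0],
         [0.0, 0.0, 0.0]]
upper = [[2.0, 2.0, 2.0],
         [2.0, 1.5, 2.0],
         [2.0, 2.0, 2.0]]

gap = proximity_map(lower, upper)
print(nearest_point(gap))  # the bump at (1, 1) is the closest region
```

In a dynamic scenario the map would simply be recomputed as either surface
changes over time.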
[4]
Performance improvement using data tags for handheld spatial augmented
reality
User study and data analysis
/
Irlitti, Andrew
/
Von Itzstein, Stewart
/
Smith, Ross T.
/
Thomas, Bruce H.
Proceedings of the 2014 ACM Symposium on Virtual Reality Software and
Technology
2014-11-11
p.161-165
© Copyright 2014 ACM
Summary: Mobile devices such as some recent phones are now fitted with projection
capabilities that support Spatial Augmented Reality (SAR) and require
investigation to uncover new interaction possibilities. This paper presents a
study measuring user performance in a search and select task using a tracked
handheld projector and data tags, a 3D physical cue. This physical cue is used
to mark the location of hidden SAR information. The experiment required
participants to search for virtual symbols presented on two 5ft, multi-sided
control panels. Two methods of presenting AR information were employed, SAR
alone and SAR with the inclusion of physical cues to indicate the location of
the information. The results showed that attaching data tags, compared to
virtual content alone, lowered the overall task completion time and reduced
handheld projector movement. Subjectively, participants also preferred the
combination of virtual data with data tags across both task variations.
[5]
Object-based touch manipulation for remote guidance of physical tasks
Spatial pointing and touching
/
Adcock, Matt
/
Ranatunga, Dulitha
/
Smith, Ross
/
Thomas, Bruce H.
Proceedings of the 2014 ACM Symposium Spatial User Interaction
2014-10-04
p.113-122
© Copyright 2014 ACM
Summary: This paper presents a spatial multi-touch system for the remote guidance of
physical tasks that uses semantic information about the physical properties of
the environment. It enables a remote expert to observe a video feed of the
local worker's environment and directly specify object movements via a touch
display. Visual feedback for the gestures is displayed directly in the local
worker's physical environment with Spatial Augmented Reality and observed by
the remote expert through the video feed. A virtual representation of the
physical environment is captured with a Kinect that facilitates the
context-based interactions. We evaluate two methods of remote worker
interaction, object-based and sketch-based, and also investigate the impact of
two camera positions, top and side, for task performance. Our results indicate
that translation and aggregate tasks could be performed more accurately via
the object-based technique when the top-down camera feed was used, while with
the side-on camera view, sketching was faster and rotations were more
accurate. We also found that for object-based interactions the top view was
better on all four of our measured criteria, while for sketching no
significant difference was found between camera views.
[6]
Wearable jamming mitten for virtual environment haptics
Wearable input/output
/
Simon, Timothy M.
/
Smith, Ross T.
/
Thomas, Bruce H.
Proceedings of the 2014 International Symposium on Wearable Computers
2014-09-13
v.1
p.67-70
© Copyright 2014 ACM
Summary: This paper presents a new mitten incorporating vacuum layer jamming
technology to provide haptic feedback to a user. We demonstrate that layer
jamming technology can be successfully applied to a mitten, and discuss the
advantages layer jamming provides as a wearable technology through its
low-profile form factor. Jamming differs from traditional wearable haptic systems
by restricting a user's movement, rather than applying an actuation force on
the user's body. Restricting the user's movement is achieved by varying the
stiffness of wearable items, such as gloves. We performed a pilot study where
the qualitative results showed users found the haptic sensation of the jamming
mitten similar to grasping the physical counterpart.
[7]
CARL: activity-aware automation for energy efficiency
HomeSys 2014
/
Thomas, Brian L.
/
Cook, Diane J.
Adjunct Proceedings of the 2014 International Joint Conference on Pervasive
and Ubiquitous Computing
2014-09-13
v.2
p.939-946
© Copyright 2014 ACM
Summary: Society is becoming increasingly aware of the impact that our lifestyle
choices have on energy usage and the environment. This paper explores the
hypothesis that ubiquitous computing technologies can be used to understand
this impact and to provide activity-aware interventions to reduce energy
consumption. Specifically, we introduce a method to provide energy-efficient
home automation based on the recognition of activities and their associated
devices. We describe CARL (CASAS Activity-based Resource Limitation), a
prototype energy-efficient smart home, and evaluate the performance of our
activity-aware automation when using both historic and real-time sensor data to
drive intelligent home automation.
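The activity-aware limitation idea can be sketched in a few lines; the
activity names and device associations below are invented for illustration and
are not from the paper:

```python
# Hypothetical sketch of activity-based resource limitation in the
# spirit of CARL: devices not associated with the currently recognised
# activity become candidates for being powered down. The mappings here
# are illustrative assumptions, not the paper's learned associations.

ACTIVITY_DEVICES = {  # assumed activity -> required-devices association
    "cooking": {"stove", "kitchen_light"},
    "sleeping": {"bedroom_fan"},
    "watching_tv": {"tv", "living_room_light"},
}

ALL_DEVICES = {"stove", "kitchen_light", "bedroom_fan", "tv",
               "living_room_light"}

def devices_to_limit(current_activity):
    """Devices safe to power down given the recognised activity."""
    needed = ACTIVITY_DEVICES.get(current_activity, set())
    return ALL_DEVICES - needed

print(sorted(devices_to_limit("cooking")))
# → ['bedroom_fan', 'living_room_light', 'tv']
```

In the paper's setting, the recognised activity would come from a
sensor-driven activity recogniser rather than being passed in directly.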
[8]
Adding input controls and sensors to RFID tags to support dynamic tangible
user interfaces
Let's get physical
/
Simon, Timothy M.
/
Thomas, Bruce H.
/
Smith, Ross T.
/
Smith, Mark
Proceedings of the 2014 International Conference on Tangible and Embedded
Interaction
2014-02-16
p.165-172
© Copyright 2014 ACM
Summary: Providing high-resolution, battery-free tangible user interface
components, such as dials and sliders, that support dynamic user interface
arrangement is challenging. Previous work uses RFID to support limited
resolution custom-built components. We demonstrate improved techniques using
commercial off the shelf input controls incorporated into passive RFID tags
using an on-off key subcarrier to encode state information into the RFID
signal. Our method supports high resolution components that do not require
power cables or batteries. We provide exemplars demonstrating how the technique
supports a range of user interface components including buttons, dials,
sliders, flex and light sensors. Compared to previous work, we obtain a higher
resolution, only limited by sample time, for all components and demonstrate 115
discrete dial positions. Our technique allows the TUI components to be freely
placed and rearranged without hardwiring or batteries.
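The state-encoding idea lends itself to a small sketch. The code below
illustrates on-off keying of a discrete component state into a bit pattern; it
is an illustrative abstraction only, since the paper's actual subcarrier
modulation happens in the tag hardware:

```python
# Illustrative sketch of on-off keying a component's state
# (e.g. one of 115 discrete dial positions, which fits in 7 bits)
# into a bit pattern; the real RFID subcarrier modulation in the
# paper is performed by the tag hardware, not in software.

def encode_ook(position, n_bits=7):
    """Serialise a dial position into on/off symbols, MSB first."""
    if not 0 <= position < (1 << n_bits):
        raise ValueError("position out of range")
    return [(position >> i) & 1 for i in reversed(range(n_bits))]

def decode_ook(bits):
    """Recover the dial position from the sampled on/off symbols."""
    value = 0
    for b in bits:
        value = (value << 1) | b
    return value

bits = encode_ook(114)  # highest of 115 positions (0..114)
print(bits, decode_ook(bits))
```

Resolution in this scheme is bounded only by how many symbols can be sampled
per reading, which matches the paper's observation that resolution is limited
by sample time.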
[9]
Ephemeral Interaction Using Everyday Objects
Virtual and Augmented Reality
/
Walsh, J. A.
/
Itzstein, S. V.
/
Thomas, B. H.
Proceedings of AUIC'14, Australasian User Interface Conference
2014-01-22
p.29-38
Keywords: user interfaces, ephemeral, tangible, projected, extensible customizable,
reconfigurable
© Copyright 2014 Australian Computer Society
Summary: The ability for Tangible User Interfaces to enable the intuitive control of
existing systems and adapt to individual users' usage scenarios remains an area
of development. Previous research in customizable tangible interfaces has
focused primarily on the offline creation by the original system developer,
instead of offering extensibility to the end user. This paper presents our
system to support the ad-hoc creation of 'disposable' UIs using both projected
controls and physical objects. To support these controls, a software based
patch panel enables data to be mapped to external systems, and from external
systems back to the system itself. Using a projector, depth camera and 6DOF
tracking system, users can create and map tangible/touch-based ad-hoc user
controls to existing system functionality. This allows users both to quickly
create new inputs for existing functionality and to create new arbitrary
input devices from completely passive components.
[10]
Spatial Augmented Reality User Interface Techniques for Room Size Modelling
Tasks
Virtual and Augmented Reality
/
Marner, M. R.
/
Thomas, B. H.
Proceedings of AUIC'14, Australasian User Interface Conference
2014-01-22
p.39-46
© Copyright 2014 Australian Computer Society
Summary: This paper presents the results of our investigations into using spatial
augmented reality to improve kitchen design and other interior architecture
tasks. We have developed user interface techniques for room sized modelling
tasks, including cabinet layout, viewing and modifying preset designs, and
modifying materials and surface finishes. These techniques are based on
Physical-Virtual Tools, which consist of physical input devices augmented with
projected information. These tools and techniques address key user interface
issues for spatial augmented reality systems, and we discuss how they can be
generalised for other applications. The techniques have been developed in the
context of a demonstration application, BuildMyKitchen. BuildMyKitchen allows
architects to design kitchen cabinets and layouts, and work with clients on the
design, in an interactive spatial augmented reality environment.
[11]
Depth Perception in View-Dependent Near-Field Spatial AR
Posters
/
Broecker, M.
/
Smith, R. T.
/
Thomas, B. H.
Proceedings of AUIC'14, Australasian User Interface Conference
2014-01-22
p.87-88
© Copyright 2014 Australian Computer Society
Summary: View-dependent rendering techniques are an important tool in Spatial
Augmented Reality. These allow the addition of more detail and the depiction of
purely virtual geometry inside the shape of physical props. This paper
investigates the impact of different depth cues on the depth perception of
users.
[12]
Visualization of off-surface 3D viewpoint locations in spatial augmented
reality
Full papers
/
Adcock, Matt
/
Feng, David
/
Thomas, Bruce
Proceedings of the 2013 ACM Symposium Spatial User Interaction
2013-07-20
p.1-8
© Copyright 2013 ACM
Summary: Spatial Augmented Reality (SAR) systems can be used to convey guidance in a
physical task from a remote expert. Sometimes that remote expert is provided
with a single camera view of the workspace but, if they are given a live
captured 3D model and can freely control their point of view, the local worker
needs to know what the remote expert can see. We present three new SAR
techniques, Composite Wedge, Vector Boxes, and Eyelight, for visualizing
off-surface 3D viewpoints and supporting the required workspace awareness. Our
study showed that the Composite Wedge cue was best for providing location
awareness, and the Eyelight cue was best for providing visibility map
awareness.
[13]
Tangible Agile Mapping: Ad-hoc Tangible User Interaction Definition
Papers: Interfaces
/
Walsh, J. A.
/
Itzstein, S. T.
/
Thomas, B. H.
Proceedings of AUIC'13, Australasian User Interface Conference
2013-01-29
p.3-12
Keywords: Tangible user interfaces, programming by demonstration, organic user
interfaces, proxemic interactions, authoring by interaction
© Copyright 2013 Australian Computer Society
Summary: People naturally externalize mental systems through physical objects to
leverage their spatial intelligence. The advent of tangible user interfaces has
allowed human-computer interaction to utilize these skills. However, current
systems must be written from scratch and designed for a specific purpose,
meaning end users cannot extend or repurpose the system. This paper presents
Tangible Agile Mapping, our architecture to address this problem by allowing
tangible systems to be defined ad-hoc. Our architecture addresses the tangible
ad-hoc definition of objects, properties and rules to support tangible
interactions. This paper also describes Spatial Augmented Reality TAM as an
implementation of this architecture that utilizes a projector-camera setup
combined with gesture-based navigation to allow users to create tangible
systems from scratch. Results of a user study show that the architecture and
our implementation are effective in allowing users to develop tangible systems,
even for users with little computing or tangible experience.
[14]
Spatial augmented reality based tangible CAD system
Posters
/
Joo, Hyeon Joon
/
Smith, Ross
/
Thomas, Bruce
/
Park, Jun
Proceedings of the 2012 ACM Symposium on Virtual Reality Software and
Technology
2012-12-10
p.207-208
© Copyright 2012 ACM
Summary: In current Computer Aided Design (CAD) systems, designers are commonly
restricted to a traditional workstation environment with mouse and keyboard.
This environment is removed from the physical object they are designing, and
as such they may lose the one-to-one correspondence between the virtual and
physical magnification of the design. To reduce this, we propose a
Spatial Augmented Reality (SAR) based CAD system which consists of a fixed
camera-projector pair, a Light Emitting Diode (LED) pen with two buttons, a
wireless communication module, and a physical drawing board.
[15]
Merging Tangible Buttons and Spatial Augmented Reality to Support Ubiquitous
Prototype Designs
Papers
/
Simon, Tim M.
/
Smith, Ross T.
/
Thomas, Bruce
/
Von Itzstein, Stewart
/
Smith, Mark
/
Park, Joonsuk
/
Park, Jun
Proceedings of AUIC'12, Australasian User Interface Conference
2012
p.29-38
Copyright © 2012 Australian Computer Society
Summary: The industrial design prototyping process has previously shown promising
enhancements using Spatial Augmented Reality to increase the fidelity of
concept visualizations. This paper explores further improvements to the process
by incorporating tangible buttons to allow dynamically positioned controls to
be employed by the designer. The tangible buttons are equipped with RFID tags
that are read by a wearable glove sensor system to emulate button activation
for simulating prototype design functionality. We present a new environmental
setup to support the low cost development of an active user interface that is
not restricted to the two-dimensional surface of a traditional computer
display. The design of our system has been guided by the requirements of
industrial designers and an expert review of the system was conducted to
identify its usefulness and usability aspects. Additionally, the quantitative
performance evaluation of the RFID tags indicated that concept development
using our system, with its simulated user interface functionality, improves
the design process.
[16]
Supporting Freeform Modelling in Spatial Augmented Reality Environments with
a New Deformable Material
Papers
/
Maas, E. T. A.
/
Marner, M. R.
/
Smith, R. T.
/
Thomas, B. H.
Proceedings of AUIC'12, Australasian User Interface Conference
2012
p.77-86
Copyright © 2012 Australian Computer Society
Summary: This paper describes how a new free-form modelling material, Quimo (Quick
Mock-up), can be used by industrial designers in spatial augmented reality
environments. Quimo is a white malleable material that can be sculpted and
deformed with bare hands into an approximate model. The material is white in
colour, retains its shape once sculpted, and allows for later modification.
Projecting imagery onto the surface of the low-fidelity mock-up allows for
detailed prototype visualisations to be presented. This ability allows the
designer to create design concept visualisations and re-configure the physical
shape and projected appearance rapidly. We detail the construction techniques
used to create the Quimo material and present the modelling techniques employed
during mock-up creation. We then extend the functionality of the material by
integrating low-visibility retro-reflective fiducial markers to capture the
surface geometry. The surface tracking allows the combined physical and virtual
modelling techniques to be integrated. This is advantageous compared to the
traditional prototyping process that requires a new mock-up to be built
whenever a significant change of the shape or visual appearance is desired. We
demonstrate that Quimo, augmented with projected imagery, supports interactive
changes of an existing prototype concept for advanced visualisation.
[17]
Data Mining Office Behavioural Information from Simple Sensors
Posters
/
O'Malley, Samuel J.
/
Smith, Ross T.
/
Thomas, Bruce H.
Proceedings of AUIC'12, Australasian User Interface Conference
2012
p.97-98
Keywords: Digital Foam, Data Mining, Apriori Algorithm, Non-invasive, Ambient Display,
Market Basket Analysis
Copyright © 2012 Australian Computer Society
Summary: This paper discusses the concept of using three simple sensors to monitor
the behavioural patterns of an office occupant. The goal of this study is to
capture behavioural information about the occupant without the use of invasive
sensors, such as cameras, that do not maintain a level of privacy when
installed. Our initial analysis has shown that data mining can be applied to
capture recurring behaviours and provide real-time presence information to
others who occupy the same building.
[18]
DOMER: a wizard of oz interface for using interactive robots to scaffold
social skills for children with autism spectrum disorders
Late-breaking reports/poster session
/
Villano, Michael
/
Crowell, Charles R.
/
Wier, Kristin
/
Tang, Karen
/
Thomas, Brynn
/
Shea, Nicole
/
Schmitt, Lauren M.
/
Diehl, Joshua J.
Proceedings of the 6th International Conference on Human-Robot Interaction
2011-03-06
p.279-280
© Copyright 2011 ACM
Summary: This report describes the development of a prototypical Wizard of Oz
graphical user interface to wirelessly control a small humanoid robot
(Aldebaran Nao) during a therapy session for children with Autism Spectrum
Disorders (ASD). The Dynamically Operated Manually Executed Robot interface
(DOMER) enables an operator to initiate pre-developed behavior sequences for
the robot as well as access the text-to-speech capability of the robot in
real-time interactions between children with ASD and their therapist.
Preliminary results from a pilot study suggest that the interface enables the
operator to control the robot with sufficient fidelity such that the robot can
provide positive feedback, practice social dialogue, and play the game, "Simon
Says" in a convincing and engaging manner.
[19]
Visualising Environmental Corrosion in Outdoor Augmented Reality
/
Walsh, J. A.
/
Thomas, B. H.
Proceedings of AUIC'11, Australasian User Interface Conference
2011
p.39-46
Copyright © 2011 Australian Computer Society
Summary: This paper provides a description of outdoor visualisation of environmental
corrosion data. This system was developed to aid in the visual understanding of
data from wireless sensors used to monitor large structures. Due to the
laborious manual inspections required for large structures (such as bridges),
wireless environmental sensors have been designed to automate this process. Our
system visualises this information in its real-world context using the Tinmith
mobile outdoor augmented reality system. We provide an overview of the
visualisations and outline a user study conducted to determine their
effectiveness in providing the user with context-sensitive information, along
with the preliminary results of this
study. The paper concludes with an overview of future work on the system and
final thoughts.
[20]
Seeing more than the graph: evaluation of multivariate graph visualization
methods
Workshops
/
Cunningham, Andrew
/
Xu, Kai
/
Thomas, Bruce
Proceedings of the 2010 International Conference on Advanced Visual
Interfaces
2010-05-26
p.429
© Copyright 2010 ACM
Summary: Many real-world networks are multivariate, i.e., they have attributes
associated with nodes and/or edges. Examples include social networks whose
nodes represent people and edges represent relationships. There is usually
information about each person (such as name, age, and gender) and the
relationship (such as type, duration, and strength). Besides common graph
analysis tasks (such as identifying the most influential or structurally
important nodes), there are more complex analyses for multivariate networks.
One of these is multivariate graph clustering, i.e., identifying clusters
formed by nodes that have similar attributes and are close to each other in
terms of
graph distance. For instance, in social network analysis, it is interesting to
sociologists whether or not people with similar characteristics (node
attributes) are also connected to each other. Currently there are very few
visualization methods available for such analysis.
Graph and multivariate visualization have been well studied separately in
the literature. Herman et al. summarized the recent work on graph visualization
[3], and Wong and Bergeron covered the development in multivariate
visualization [4]. However, there is relatively less work available on
multivariate network visualization. Two types of approaches are commonly used.
The first one is the mapping approach, which maps attributes to visual elements
of a node or edge. A simple example is to map one attribute to node size and
another to node color [2]. A more advanced mapping approach uses glyphs to
represent node or edge attributes. One such example is to use the length and
width of a rectangle node glyph to represent two node attributes [1]. The
second one is the 2.5D approach: it uses the third dimension to present the
multivariate information, while the graph is shown on a 2D plane. Examples
include the recently proposed "GraphScape" [5], which adopts a landscape
metaphor: each attribute is represented by a two-and-a-half-dimensional
surface, whose height indicates its value.
Each approach has its strengths and weaknesses. The mapping approach is
effective at showing numerical values using visual elements such as size, but
it can be difficult to compare the values of attributes represented by
different elements such as size and color. The problem is alleviated by a
carefully designed glyph, but visual complexity increases quickly as the
number of attributes that a glyph needs to represent grows. The 2.5D approach
is good at showing the distribution of attribute values over the network, but
the attribute surface can introduce occlusion and affect the visibility of the
underlying network.
In this paper, we present a study evaluating the effectiveness of these two
approaches for different analysis tasks. We compare the performance of mapping
and 2.5D approach in a controlled lab environment. We included both simple
tasks (such as identifying nodes with the largest attribute value) and complex
tasks (such as multivariate graph clustering). The performance is measured both
in terms of accuracy and completion time. The results indicate that the
mapping approach performs statistically better for the simple tasks, while the
2.5D approach is favored in the complex task. The outcomes from this study
provide some guidelines for the design of effective multivariate graph
visualization for different analysis tasks.
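As a toy illustration of the mapping approach described above (the attribute
names, ranges, and visual variables are invented for the example), two node
attributes can be mapped to node size and a grayscale color before handing
them to a renderer:

```python
# Illustrative sketch of the "mapping approach": two node attributes
# are mapped to visual variables (size and a 0-255 gray level).
# Library-free; a real system would pass these to a graph renderer.

def normalise(values):
    """Rescale a list of numbers to the range [0, 1]."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # avoid division by zero for constant lists
    return [(v - lo) / span for v in values]

def map_attributes(ages, strengths, min_size=5.0, max_size=20.0):
    """Map one attribute to node size, the other to a gray level."""
    sizes = [min_size + t * (max_size - min_size) for t in normalise(ages)]
    grays = [round(t * 255) for t in normalise(strengths)]
    return list(zip(sizes, grays))

ages = [20, 40, 60]          # hypothetical node attribute 1
strengths = [0.0, 1.0, 0.25]  # hypothetical node attribute 2
print(map_attributes(ages, strengths))
# → [(5.0, 0), (12.5, 255), (20.0, 64)]
```

The study's finding that comparing size against color is hard corresponds to
the fact that these two output channels are perceptually incommensurable.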
[21]
Design and impressions of a multi-user tabletop interaction device
Contributed papers
/
Cunningham, Andrew
/
Close, Ben
/
Thomas, Bruce
/
Hutterer, Peter
Proceedings of AUIC'10, Australasian User Interface Conference
2010-01-20
p.71-79
© Copyright 2010 Australian Computer Society
Summary: TableMouse is a cursor manipulation device designed specifically for
multiple users interacting on a large tabletop surface. TableMouse tracks
position, height, orientation, button state, and unique identification. It is
designed using infrared light-emitting diodes and computer vision to perform
device tracking and identification. This paper explores the functional design
of such a device. Insights into the inherent features enabled by this
functionality -- out-of-arms-reach interaction and collaborative interaction --
are described, along with the architecture, vision analysis process, and
issues to consider. Finally, two example applications utilising the TableMouse
are described.
[22]
EDITED BOOK
Tabletops -- Horizontal Interactive Displays
Human-Computer Interaction Series
/
Müller-Tomfelde, Christian
2010
n.18
p.456
Springer London
DOI: 10.1007/978-1-84996-113-4
== Under Tabletops ==
Building Interactive Multi-touch Surfaces (27-49)
+ Schöning, Johannes
+ Hook, Jonathan
+ Bartindale, Tom
+ Schmidt, Dominik
+ Olivier, Patrick
+ et al
From Table-System to Tabletop: Integrating Technology into Interactive Surfaces (51-69)
+ Kunz, Andreas
+ Fjeld, Morten
High-Resolution Interactive Displays (71-100)
+ Ashdown, Mark
+ Tuddenham, Philip
+ Robinson, Peter
Optical Design of Tabletop Displays and Interactive Applications (101-129)
+ Kakehi, Yasuaki
+ Naemura, Takeshi
Hand and Object Recognition on Liquid Crystal Displays (131-146)
+ Koike, Hideki
+ Sato, Toshiki
+ Nishikawa, Wataru
+ Fukuchi, Kentaro
== On and Above Tabletops ==
Augmenting Interactive Tabletops with Translucent Tangible Controls (149-170)
+ Weiss, Malte
+ Hollan, James D.
+ Borchers, Jan
Active Tangible Interactions (171-187)
+ Inami, Masahiko
+ Sugimoto, Maki
+ Thomas, Bruce H.
+ Richter, Jan
Interaction on the Tabletop: Bringing the Physical to the Digital (189-221)
+ Hilliges, Otmar
+ Butz, Andreas
+ Izadi, Shahram
+ Wilson, Andrew D.
Supporting Atomic User Actions on the Table (223-247)
+ Aliakseyeu, Dzmitry
+ Subramanian, Sriram
+ Alexander, Jason
Imprecision, Inaccuracy, and Frustration: The Tale of Touch Input (249-275)
+ Benko, Hrvoje
+ Wigdor, Daniel
On, Above, and Beyond: Taking Tabletops to the Third Dimension (277-299)
+ Grossman, Tovi
+ Wigdor, Daniel
== Around and Beyond Tabletops ==
Individual and Group Support in Tabletop Interaction Techniques (303-333)
+ Nacenta, Miguel A.
+ Pinelle, David
+ Gutwin, Carl
+ Mandryk, Regan
File System Access for Tabletop Interaction (335-355)
+ Collins, Anthony
+ Kay, Judy
Theory of Tabletop Territoriality (357-385)
+ Scott, Stacey D.
+ Carpendale, Sheelagh
Digital Tables for Collaborative Information Exploration (387-405)
+ Isenberg, Petra
+ Hinrichs, Uta
+ Hancock, Mark
+ Carpendale, Sheelagh
Coordination and Awareness in Remote Tabletop Collaboration (407-434)
+ Tuddenham, Philip
+ Robinson, Peter
Horizontal Interactive Surfaces in Distributed Assemblies (435-456)
+ Müller-Tomfelde, Christian
+ O'Hara, Kenton
[23]
EDITED BOOK
The Engineering of Mixed Reality Systems
Human-Computer Interaction Series
/
Dubois, Emmanuel
/
Gray, Philip
/
Nigay, Laurence
2010
n.21
p.445
Springer London
DOI: 10.1007/978-1-84882-733-2
Introduction (1-6)
+ Dubois, Emmanuel
+ Gray, Phil
+ Nigay, Laurence
== Interaction Design ==
An Integrating Framework for Mixed Systems (9-31)
+ Coutrix, Céline
+ Nigay, Laurence
A Holistic Approach to Design and Evaluation of Mixed Reality Systems (33-55)
+ Nilsson, Susanna
+ Johansson, Björn
+ Jönsson, Arne
Embedded Mixed Reality Environments (57-78)
+ Schnädelbach, Holger
+ Galani, Areti
+ Flintham, Martin
The Semantic Environment: Heuristics for a Cross-Context Human-Information Interaction Model (79-99)
+ Resmini, Andrea
+ Rosati, Luca
Tangible Interaction in Mixed Reality Systems (101-120)
+ Couture, Nadine
+ Rivière, Guillaume
+ Reuter, Patrick
Designing a Mixed Reality Intergenerational Entertainment System (121-141)
+ Khoo, Eng Tat
+ Merritt, Tim
+ Cheok, Adrian David
Auditory-Induced Presence in Mixed Reality Environments and Related Technology (143-163)
+ Larsson, Pontus
+ Väljamäe, Aleksander
+ Västfjäll, Daniel
+ Tajadura-Jiménez, Ana
+ Kleiner, Mendel
An Exploration of Exertion in Mixed Reality Systems via the "Table Tennis for Three" Game (165-182)
+ Mueller, Florian 'Floyd'
+ Gibbs, Martin R.
+ Vetere, Frank
Developing Mixed Interactive Systems: A Model-Based Process for Generating and Managing Design Solutions (183-208)
+ Gauffre, Guillaume
+ Charfi, Syrine
+ Bortolaso, Christophe
+ Bach, Cédric
+ Dubois, Emmanuel
== Software Design and Implementation ==
Designing Outdoor Mixed Reality Hardware Systems (211-231)
+ Avery, Benjamin
+ Smith, Ross T.
+ Piekarski, Wayne
+ Thomas, Bruce H.
Multimodal Excitatory Interfaces with Automatic Content Classification (233-250)
+ Williamson, John
+ Murray-Smith, Roderick
Management of Tracking for Mixed and Augmented Reality Systems (251-273)
+ Keitler, Peter
+ Pustka, Daniel
+ Huber, Manuel
+ Echtler, Florian
+ Klinker, Gudrun
Authoring Immersive Mixed Reality Experiences (275-291)
+ Misker, Jan M. V.
+ van der Ster, Jelle
Fiia: A Model-Based Approach to Engineering Collaborative Augmented Reality (293-312)
+ Wolfe, Christopher
+ Smith, J. David
+ Phillips, W. Greg
+ Graham, T. C. Nicholas
A Software Engineering Method for the Design of Mixed Reality Systems (313-334)
+ Dupuy-Chessa, S.
+ Godet-Bar, G.
+ Pérez-Medina, J.-L.
+ Rieu, D.
+ Juras, D.
== Applications of Mixed Reality ==
Enhancing Health-Care Services with Mixed Reality Systems (337-356)
+ Stantchev, Vladimir
The eXperience Induction Machine: A New Paradigm for Mixed-Reality Interaction Design and Psychological Experimentation (357-379)
+ Bernardet, Ulysses
+ Badia, Sergi Bermúdez i
+ Duff, Armin
+ Inderbitzin, Martin
+ Groux, Sylvain Le
+ Manzolli, Jônatas
+ Mathews, Zenon
+ Mura, Anna
+ Väljamäe, Aleksander
+ Verschure, Paul F. M. J
MyCoach: In Situ User Evaluation of a Virtual and Physical Coach for Running (381-397)
+ Biemans, Margit
+ Haaker, Timber
+ Szwajcer, Ellen
The RoboCup Mixed Reality League -- A Case Study (399-418)
+ Gerndt, Reinhard
+ Bohnen, Matthias
+ Guerra, Rodrigo da Silva
+ Asada, Minoru
Mixed-Reality Prototypes to Support Early Creative Design (419-445)
+ Safin, Stéphane
+ Delfosse, Vincent
+ Leclercq, Pierre
[24]
TableMouse: a novel multiuser tabletop pointing device
Interact
/
Cunningham, Andrew
/
Close, Ben
/
Thomas, Bruce H.
/
Hutterer, Peter
Proceedings of OZCHI'09, the CHISIG Annual Conference on Human-Computer
Interaction
2009-11-23
p.169-176
Keywords: collaboration, collocation, device, interaction
© Copyright 2009 CHISIG and author(s)
Summary: This paper introduces the TableMouse, a new cursor manipulation interaction
technology for tabletop computing, specifically designed to support multiple
users operating on large horizontal displays. The TableMouse is a low-cost
absolute positioning device utilising visually-tracked infrared light emitting
diodes for button state, 3D position, 1D orientation, and unique identification
information. The supporting software infrastructure is designed to support up
to 16 TableMouse devices simultaneously, each with an individual system cursor.
This paper introduces the device and software infrastructure and presents two
applications exposing its functionality. A formal benchmark of its performance
and accuracy was conducted against the traditional mouse.
[25]
FrostWall: a dual-sided situated display for informal collaboration in the
corridor
Locative
/
Kjeldskov, Jesper
/
Paay, Jeni
/
O'Hara, Kenton
/
Smith, Ross
/
Thomas, Bruce
Proceedings of OZCHI'09, the CHISIG Annual Conference on Human-Computer
Interaction
2009-11-23
p.369-372
Keywords: dual-sided interface, informal collaboration, situated display, ubiquitous
computing
© Copyright 2009 CHISIG and author(s)
Summary: FrostWall is designed to support collegial communication and collaboration
within a co-located work environment by facilitating and encouraging informal
information exchange in the corridors of a workplace using large situated
displays. FrostWall displays provide a flexible display area between the inside
of a private office workspace and the public corridor outside it. FrostWall
uses "frosting" of glass windows and partitions between private and public
workspaces in combination with projectors to create a display area that is
effectively dual-sided: readable and operable from both sides. In addition to
facilitating informal digital communication and information exchange between
co-workers, this situated display area also provides a venue for playfulness
and personal expression enhancing social cohesion between colleagues. FrostWall
is also a unique vehicle for future research into interaction design for
dual-sided interfaces.