Immersive Analytics is a new multidisciplinary initiative to explore future interaction technologies for data analytics. Immersive Analytics aims to bring together researchers in Information Visualisation, Visual Analytics, Virtual and Augmented Reality, and Natural User Interfaces.
Call for Papers
Due to recent advances in immersive technologies (VR, AR, large displays, tangible surfaces, etc.), we see new opportunities to use these technologies to analyse and explore data. Visual analytics is concerned with analytical reasoning facilitated by interactive visual interfaces. This definition is agnostic of the actual interface devices employed by visual analysis systems. Nevertheless, the affordances of the display and input devices used for analysing data strongly affect the experience of the users of such systems, and thus their degree of engagement and productivity. However, a systematic approach to developing visual analytics tools that move beyond the desktop is still lacking. In this call, we are looking for innovative research, design, and viewpoints, mature or work-in-progress, that fall into or are related to the following topics: Real-World VA/AR, Collaboration, Hybrid 2D/3D, Affordances for Immersion, Changing Technologies, Application Areas, Platforms and Toolkits. We call this new research thrust "Immersive Analytics", a topic that will explore the applicability and development of emerging user-interface technologies for creating more engaging experiences and seamless workflows for data analysis applications. Submissions will be peer-reviewed, and accepted submissions will be published as part of the ACM Digital Library.
Accepted Papers
We are happy to announce that we were able to accept 13 papers. Congratulations to all the authors, and thanks to everyone who considered submitting to Immersive2016!
Immersive Solutions for Future Air Traffic Control and Management
In this paper we present the activities of Air Traffic Control and Management (ATC/M) and describe scenarios that illustrate current and future challenges of this domain that can be tackled with the use of immersion. We introduce the concepts of an immersive Remote Tower and Collaborative Immersive Trajectory Analysis, which make use of immersive technologies such as Head Mounted Displays (HMDs) or large tiled displays to immerse users in their tasks, in order to better support the management and analysis of the complex data produced in this domain.
-
Visual Immersion in the Context of Wall Displays
Immersion is the subjective impression of being deeply involved in a specific situation, and can be sensory or cognitive. In this position paper, we use a basic model of visual perception to study how ultra-high-resolution wall displays can provide visual immersion. With their large size, and depending on the position of viewers in front of them, wall displays can provide a surrounding and vivid environment. Users close to the wall can have their visual field filled by the wall, and they are able to clearly see a large amount of information at a fine resolution. However, when close to the wall, visual distortion due to extreme viewing angles can affect the viewing of data. Conversely, from far away, distortion is no longer an issue, but the viewers' visual field is not fully covered by the wall, and the details they can see are less fine.
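To make this near/far trade-off concrete, here is a small back-of-the-envelope sketch (illustrative only; the wall dimensions are hypothetical and the calculation is not taken from the paper). It computes how much horizontal visual angle a wall subtends for a centred viewer, and how many display pixels fall within one degree of visual angle at the wall's centre: close up, the wall fills most of the visual field and individual pixels are easy to resolve, while far away the field coverage shrinks and detail drops below the eye's roughly one-arcminute acuity.

```python
import math

def wall_viewing_geometry(wall_width_m, wall_width_px, distance_m):
    """Approximate two quantities for a viewer centred in front of a flat,
    ultra-high-resolution wall display:
      * the horizontal visual angle (deg) the wall subtends, and
      * the pixels per degree of visual angle at the centre of the wall."""
    visual_angle_deg = 2 * math.degrees(math.atan(wall_width_m / (2 * distance_m)))
    pixel_pitch_m = wall_width_m / wall_width_px
    px_per_deg_centre = 1.0 / math.degrees(pixel_pitch_m / distance_m)
    return visual_angle_deg, px_per_deg_centre

# Hypothetical wall: 5.5 m wide, 14400 px across (not the authors' setup).
# Normal visual acuity is roughly one arcminute, i.e. about 60 px/deg:
# below that, individual pixels and fine details are resolvable; above it,
# the display packs more detail per degree than the eye can distinguish.
for d in (0.5, 1.5, 3.0, 6.0):
    angle, ppd = wall_viewing_geometry(5.5, 14400, d)
    print(f"distance {d:4.1f} m: {angle:5.1f} deg of visual field, {ppd:6.1f} px/deg at centre")
```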
-
Redefining a Contribution for Immersive Visualization Research
Immersive computing modalities such as AR, VR, and speech-based input are regaining prominence as research threads in the visualization field due to advances in technology and the availability of cheap consumer hardware. This renewed interest is similar to what we observed a decade ago when multitouch technology was gaining mainstream adoption. In this work, we reflect on lessons learned from designing for multitouch, with the goal of highlighting problems that may also emerge in AR/VR research. Specifically, we emphasize the need for the field to rearticulate what is expected from research efforts in the area of visualization on immersive technologies.
-
On Spatial Perception Issues In Augmented Reality Based Large-Scale Immersive Analytics
Beyond other domains, the field of immersive analytics makes use of Augmented Reality techniques to successfully support users in analyzing data. When displaying ubiquitous data integrated into everyday life, spatial immersion issues such as depth perception, data localization, and object relations become relevant. Although there is a variety of techniques to deal with these issues, they become difficult to apply if the examined data or the reference space becomes oversized and abstract. Within this work, we discuss observed problems in large-scale immersive analytics systems and the applicability of current countermeasures in order to identify needs for action.
-
Gesture-driven Interactions on a Virtual Hologram in Mixed Reality
This paper describes a framework using the Microsoft Kinect 2 and the Microsoft HoloLens that can assist users in analyzing complex datasets. The system allows groups of people to view a topological map as a virtual hologram in order to help them understand complex datasets. In addition, the gestures built into the system were designed with usability in mind. By allowing the user to resize, rotate, and reposition the map, the system opens up a much wider range of ways to understand the data they have received. Custom gestures are also possible depending on the situation, such as raising or lowering the water level in a potential flood hot spot, or viewing graphs and charts associated with a specific data point.
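The gesture set itself is specific to this system, but the general pattern of mapping recognised gestures onto updates of a hologram's transform can be sketched as follows (an illustrative sketch only; the gesture names, parameters, and the water-level handling are assumptions, not the authors' implementation):

```python
from dataclasses import dataclass

@dataclass
class MapHologram:
    """Minimal state for a holographic map: position (m), yaw (deg),
    uniform scale, and a domain-specific simulated water level (m)."""
    x: float = 0.0
    z: float = 2.0
    yaw_deg: float = 0.0
    scale: float = 1.0
    water_level_m: float = 0.0

def apply_gesture(h: MapHologram, gesture: str, amount: float) -> MapHologram:
    """Map a recognised gesture and its magnitude onto a transform update."""
    if gesture == "pinch":            # spread/pinch resizes the map, clamped to a sane range
        h.scale = min(4.0, max(0.25, h.scale * (1.0 + amount)))
    elif gesture == "twist":          # two-hand twist rotates about the vertical axis
        h.yaw_deg = (h.yaw_deg + amount) % 360.0
    elif gesture == "drag":           # drag repositions the map horizontally
        h.x += amount
    elif gesture == "raise_water":    # custom gesture: adjust the simulated flood level
        h.water_level_m = max(0.0, h.water_level_m + amount)
    return h

# Example: shrink slightly, rotate 45 degrees, then raise the water by 0.5 m.
m = MapHologram()
for g, a in (("pinch", -0.2), ("twist", 45.0), ("raise_water", 0.5)):
    m = apply_gesture(m, g, a)
print(m)
```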
-
Improving 3D Visualizations: Exploring Spatial Interaction with Mobile Devices
3D data visualizations, while offering a lot of potential, also have well-known issues regarding occlusion and readability. Immersive technologies might help overcome these issues by addressing the perceptual problems and increasing the tangibility of the data. In this work, we explore the potential of spatial interaction with mobile devices. Building on related work and our own experiences, we report on visualizations that are fixed in space or fixed on the device, as well as on combining them with head-coupled perspective. A number of prototypes we developed helped us gain practical insights into the possibilities and limitations of these techniques.
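To clarify the distinction between the two registration modes mentioned above, here is a minimal sketch (the poses and offsets are invented for illustration; this is not the authors' code). A visualization that is fixed in space keeps a constant pose in room coordinates and is viewed through the tracked device, whereas a visualization fixed on the device is defined relative to the device and therefore moves with it.

```python
import numpy as np

def translation(x, y, z):
    """4x4 homogeneous translation matrix."""
    m = np.eye(4)
    m[:3, 3] = (x, y, z)
    return m

# Tracked pose of the handheld device in room coordinates (hypothetical values).
device_in_world = translation(0.4, 1.2, 0.0)

# "Fixed in space": the content keeps a constant pose in the room, and the
# device acts as a movable window onto it, so what is shown depends on the
# device's tracked pose.
content_in_world = translation(0.0, 1.0, -2.0)
view_spatially_fixed = np.linalg.inv(device_in_world) @ content_in_world

# "Fixed on the device": the content is defined relative to the device itself,
# so it travels with the device and looks the same wherever it is held.
content_on_device = translation(0.0, 0.0, -0.3)
view_device_fixed = content_on_device

print(view_spatially_fixed[:3, 3])  # content position as seen from the device
print(view_device_fixed[:3, 3])     # constant offset in front of the device
```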
-
Immersive Analysis of Health-Related Data with Mixed Reality Interfaces: Potentials and Open Questions
In this paper we propose Mixed Reality (MR) interfaces as tools for the analysis and exploration of health-related data. The reported findings originate from the research project "SMARTACT", in which several intervention studies are conducted to investigate how participants' long-term health behavior can be improved. We conducted a focus group to identify limitations of current data analysis technologies and practices, possible uses of MR interfaces, and associated open questions in order to leverage their potential in the given domain.
-
Gaze-directed Immersive Visualization of Scientific Ensembles
The latest advances in head-mounted displays (HMDs) for augmented reality (AR) and mixed reality (MR) have produced commercialized devices that are gradually being accepted by the public. These HMDs are generally equipped with gaze tracking, which provides excellent opportunities to extend eye-tracking research and develop gaze-based immersive visualization techniques for various AR/MR applications. This paper explores the gaze tracking function on the latest Microsoft HoloLens and presents a gaze-directed visualization approach to study ensembles of 2D oil spill simulations in mixed reality. Our approach allows users to place an ensemble as an image stack in a real environment and explore the ensemble with gaze tracking. The prototype system demonstrates the challenges and promising effects of gaze-based interaction in state-of-the-art mixed reality.
-
Personalized Views for Immersive Analytics
In this paper we present work-in-progress toward a vision of personalized views of visual analytics interfaces in the context of collaborative analytics in immersive spaces. In particular, we are interested in the sense of immersion, responsiveness, and personalization afforded by gaze-based input. By combining large-screen visual analytics tools with eye tracking, a collaborative visual analytics system can become egocentric while not disrupting the collaborative nature of the experience. We present a prototype system and several ideas for real-time personalization of views in visual analytics.
-
Slicing the Aurora: An Immersive Proxemics-Aware Visualization
The Aurora Borealis or Northern Lights is a phenomenon that has fascinated people throughout history. The AuroraMAX outreach initiative provides a collection of time-lapse videos of the night sky captured by a camera at Yellowknife in Canada. We present an interactive visualization of this AuroraMAX image data on a large touch display. Our visualization slices each time-lapse video to represent an entire night as a single image or keogram, provides different views on the keograms, and allows people to explore and compare nights to discover interesting patterns. To entice people to interact, we use proxemic interaction and animate the visualization in response to people’s movements in front of the display. We deployed the visualization in a public space at an art-science festival. Initial findings suggest that the proxemic interaction aspect helps to draw people in and that the visualization generates interest from passersby, providing opportunities for science outreach.
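The general proxemics-aware pattern behind this design, reacting to how far a tracked passer-by stands from the display, can be sketched roughly as follows (the zone boundaries and behaviours here are illustrative assumptions, not the deployed system's values):

```python
def proxemic_response(distance_m: float) -> str:
    """Pick the visualization's behaviour from a tracked viewer's distance
    to the display (zone thresholds are illustrative, not the deployed values)."""
    if distance_m > 4.0:
        return "ambient"    # animated attract mode to catch the eye of passers-by
    if distance_m > 1.5:
        return "overview"   # reveal the full set of nightly keograms
    return "detail"         # enable touch exploration and comparison of nights

# A passer-by walking toward the display moves through the zones.
for d in (6.0, 3.0, 1.0):
    print(f"{d:.1f} m -> {proxemic_response(d)}")
```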
-
Immersive Analytics for Multi-objective Dynamic Integrated Climate-Economy (DICE) Models
We present early work on an immersive analytics tool for exploring the output of a Dynamic Integrated Climate-Economy (DICE) model. DICE models are critical for informing environmental decision making and policy analysis. They often produce complex and multi-layered output that nevertheless needs to be understood by decision makers who are not experts. We discuss our current and targeted feature set. Additionally, we look ahead to the potential for rigorous evaluation of the system to uncover whether or not it is an improvement over current visualization methods.
-
Immersive Analytics for Medicine: Hybrid 2D/3D Sketch-Based Interfaces for Annotating Medical Data and Designing Medical Devices
We explore the role that immersive technologies, specifically virtual reality (VR) and hybrid 2D/3D sketch-based interfaces and visualizations, can play in analytical reasoning for medicine. Two case studies are described: (1) immersive explanations of medical procedures, and (2) immersive design of medical devices. Both tightly integrate 2D imagery and data with 3D interfaces, models, and visualizations. We argue this approach is likely to be particularly useful in medicine, where analytical tasks often involve relating 2D data (e.g., medical imaging) to 3D contexts (e.g., a patient's body). User feedback and observations from our interdisciplinary team indicate the utility of the approach for the current case studies, as well as some shortcomings and areas for future research. This work contributes to a broader discussion of how hybrid 2D/3D interfaces may form an essential ingredient of future immersive analytics systems across a variety of domains.
-
A Taxonomy for Designing Walking-based Locomotion Techniques for Virtual Reality
Designers have yet to find a fully general and effective solution to the problem of walking in large or unlimited virtual environments. A detailed taxonomy of walking-based locomotion techniques would be beneficial to better understand, analyze, and design walking techniques for virtual reality (VR). We present a taxonomy that can help designers and researchers investigate the fundamental components of locomotion techniques. Researchers can create novel locomotion techniques by making choices from the components of this taxonomy, analyze and improve existing techniques, or perform experiments to evaluate locomotion techniques in detail using the organization we present.
Submission Info
The types of contribution can include, but are not limited to, the following:
- Techniques: Interactions, visualizations, displays, devices, setups, ...
- Applications and use cases: When, how, and why to use immersive technologies.
- Evaluations: Quantitative or qualitative evaluations of immersive techniques and setups.
- Position papers: Ideas, thoughts, viewpoints, and critiques on how to shape the field, how to do research, which methodologies to use, etc.
Submissions can be of any length from 2 to 7 pages, in proportion to the contribution: work-in-progress, late-breaking results, or mature research. According to the conference and ACM, the page limit includes references.
Paper submission deadline | October 3rd, 23:59 PDT
Paper format | ACM CHI Extended Abstract
Submission length | 2-7 pages (including references)
Submission website | EasyChair
Review process | 3 peer reviews per submission; single-blind mandatory, double-blind optional
Publication format | Archival in the ACM Digital Library
Presentation | ~8 minutes at the workshop (depending on paper length)
Registration | Workshop participants pay the early-registration rate for ISS
Final reviews | October 13th
Camera-ready and ACM copyright | October 20th
Schedule
9:00am | Introduction to Immersive Analytics and overview of the workshop structure
9:10am | Keynote I: 3D to 3D: New Dimensions of Data Representation
9:35am | Keynote II: Subtle Interaction for Increased Expressivity
10:00am | Talks: Design, Collaboration, and Reflection
1. Slicing the Aurora: An Immersive Proxemics-Aware Visualization
2. Redefining a Contribution for Immersive Visualization Research
3. Personalized Views for Immersive Analytics
10:30am | Coffee Break
11:00am | Talks: Application
4. Immersive Solutions for Future Air Traffic Control and Management
5. Immersive Analysis of Health-Related Data with Mixed Reality Interfaces: Potentials and Open Questions
6. Immersive Analytics for Multi-objective Dynamic Integrated Climate-Economy (DICE) Models
Talks: Perception
7. Visual Immersion in the Context of Wall Displays
8. On Spatial Perception Issues In Augmented Reality Based Large-Scale Immersive Analytics
Talks: Interaction
9. Gesture-driven Interactions on a Virtual Hologram in Mixed Reality
10. Improving 3D Visualizations: Exploring Spatial Interaction with Mobile Devices
11. Gaze-directed Immersive Visualization of Scientific Ensembles
12. Immersive Analytics for Medicine: Hybrid 2D/3D Sketch-Based Interfaces for Annotating Medical Data and Designing Medical Devices
13. A Taxonomy for Designing Walking-based Locomotion Techniques for Virtual Reality
12:30pm | Lunch
1:30pm | Topic Discussion
3:00pm | Coffee Break
3:30pm | Group Discussion
5:00pm | Group Reports
5:20pm | Closing
5:30pm | Interactive Demos
Topics (Detail)
Real-World VA: What questions do technologies like augmented reality raise for visual analytics? Traditional information visualization supports open-ended exploration based on Shneiderman's information-seeking mantra: overview first, zoom and filter, then details on demand. In our view, a different model is required for analytical applications grounded in the physical world. In this case, objects in the physical environment provide the immediate and primary focus, and so the natural model is to provide detailed information about these objects and only provide contextual information on demand.
Collaboration: Much research has been devoted to computer-assisted collaboration, both synchronous and asynchronous, local and remote. These new devices and environments potentially support new models for collaboration. What paradigms are potentially enabled by these new interaction modalities? How do we evaluate them?
Hybrid 2D/3D: Traditionally, 3D visualisation has been used in the physical sciences, engineering, and design, while 2D visualisations have been used to display statistical and abstract data in information visualisation. Increasingly, there is a need to combine both sorts of visualisation in holistic visualisations. For instance, in the life sciences, different aspects of a cell are displayed using 2D and 3D images, 2D network data, and various -omics data. Can these new technologies support more holistic visualisations of such data, incorporating 3D spatial information as well as abstract data?
Affordances for Immersion: What are the interface "tricks" and affordances, such as high-resolution displays, sound, touch, and responsive interaction, that change the user's perception from an allocentric view of the data to a more egocentric and immersive view of the data?
Changing Technologies: What are the lessons that can be learnt from previous research into the use of 3D visualisation for information visualisation? Do the new technologies invalidate the current wisdom that it is better to use 2D visualisation for abstract data, since the designer of the visualisation has complete freedom to map data to an occlusion-free 2D display?
Application Areas: What are the most fertile application areas for immersive analytics? For example, these could be in the life sciences, disaster and emergency management, archaeology, and many more.
Platforms and Toolkits: How do we develop generic platforms that support immersive analytics? Currently, there is a wide range of development platforms and cross-platform tools; however, they do not quite have a broad enough focus for immersive analytics. For example, Unity is designed for gaming applications rather than analytics, while the Visualization Toolkit (VTK) is targeted at scientific visualisation applications.
PC Members
- Mark Billinghurst, University of South Australia, Australia
- Wolfgang Bueschel, Dresden University of Technology, Germany
- Maxime Cordeil, Monash University, Australia
- Steve Drucker, Microsoft Research, USA
- Carla dal Sasso Freitas, Federal University of Rio Grande do Sul, Brazil
- Christophe Hurter, ENAC, France
- Tobias Isenberg, Inria, France
- Karsten Klein, Konstanz University, Germany
- Kim Marriott, Monash University, Australia
- Jon McCormack, Monash University, Australia
- Chris North, Virginia Tech, USA
- Emmanuel Pietriga, Inria, France
- Aaron Quigley, University of St. Andrews, UK
- Gerik Scheuermann, University of Leipzig, Germany
- Falk Schreiber, Konstanz University, Germany
- Wolfgang Stuerzlinger, Simon Fraser University, Canada
- Aurelien Tabard, University Claude Bernard, Lyon, France
- Bruce Thomas, University of South Australia, Australia
Community
Below we list groups and projects interested in Immersive Analytics; feel free to contact them. If you want to be listed below as well, email us your tag:
- Immersive Analytics at Monash University, Australia (Tim Dwyer, Kim Marriott)
- PNNL, WA, USA (Nick Cramer)
- Workshop on Immersive Analytics at IEEE VR 2016 (March 2016)
- Immersive Analytics community site
- Dagstuhl Seminar on Immersive Analytics (June 2016)
- Immersive Analytics, position paper at the BDVA conference, 2015.
- Image and Signal Processing Group, Leipzig University, Germany: We develop visual analysis methods and solutions for CFD, material science, bioinformatics, neuroscience, and digital humanities. Recently, we started using light-weight immersive technology like HTC Vive and Oculus Rift for immersive analytics. http://www.informatik.uni-leipzig.de/bsv/homepage/en
- Alexander Klippel, Department of Geography, Pennsylvania State University
- At the Interactive Media Lab Dresden we are researching Natural User Interfaces using very large displays in combination with mobile handheld surfaces for data exploration and information visualization. Tangible Displays, BodyLenses, Gaze, Multitouch and Embodied Interaction are novel means of exploring complex data. See our IMLD publication list.
- Interacting with Large Data (ILDA) at INRIA: we design and develop advanced interactive visual interfaces for ultra-high-resolution wall displays such as the WILD and WILDER platforms in Paris and ANDES in Santiago de Chile. We also work on software toolkits to develop visualization and interaction techniques for these cluster-driven wall displays.