Applications and a Three-dimensional Desktop Environment for an Immersive Virtual Reality System

We developed an application launcher called Multiverse for scientific visualizations in a CAVE-type virtual reality (VR) system. Multiverse can be regarded as a type of three-dimensional (3D) desktop environment. In Multiverse, a user in a CAVE room can browse multiple visualization applications with 3D icons and explore movies that float in the air. Touching one of the movies causes "teleportation" into the application's VR space. After analyzing the simulation data using the application, the user can jump back into Multiverse's VR desktop environment in the CAVE.


Introduction
CAVE is a room-sized virtual reality (VR) system, which was developed in the early 1990s at the University of Illinois, Chicago [1]. In a CAVE room, the viewer is surrounded by wall screens and a floor screen. Stereo images are projected onto these surfaces. Tracking systems are used to capture the viewer's head position and direction. The wide viewing angle provided by the surrounding screens on the walls and floor generates a high-quality immersive VR experience. The viewer can interact with three-dimensional (3D) virtual objects using a portable controller known as a wand, in which the tracking system is installed. CAVE systems have been used for scientific visualizations from the first system [1] to the latest generation (StarCAVE) [2]. For example, visualization applications in CAVE systems have been developed to analyze general computational fluid dynamics (CFD) [3], turbulence simulations [4], CFD of molten iron [5], CFD of wind turbines [6], seismic simulation [7], meteorological simulation [8], biomedical fluid simulation [9], magnetic resonance imaging [10], geomagnetic fields [11], archaeological studies [12], and geophysical surveys [13].
Recently, a new CAVE system was installed at the Integrated Research Center (IRC) at Kobe University. This CAVE system was named "π-CAVE" after the IRC's location on Port Island (PI). Fig. 1 shows a front view of the π-CAVE, while Fig. 2 shows the configuration of its projectors and mirrors. The original CAVE system had a cubic geometry with a side length of 3 m. A straightforward extension to enlarge the VR space of a CAVE is to use a rectangular parallelepiped shape. More sophisticated configurations have been proposed for advanced CAVE systems, such as StarCAVE [2], but we used the rectangular parallelepiped approach for π-CAVE to maximize the VR volume in the space allowed in the IRC building. The side lengths of π-CAVE are 3 m × 3 m × 7.8 m. As far as we know, this is the largest CAVE system in Japan.
We have developed several VR applications for the scientific visualization of large-scale simulation data. Of these, Virtual LHD [14] was our first VR visualization application. This application was developed for the CompleXcope CAVE system installed at the National Institute for Fusion Science, Japan. Currently, Virtual LHD is used to visualize the magnetohydrodynamic (MHD) equilibrium state of a nuclear fusion experiment. We also developed a general-purpose visualization application, VFIVE [15,16,17], for 3D scalar/vector field data. Recently, we added a new visualization method to VFIVE at π-CAVE for visualizing magnetic field lines frozen into a fluid [18]. The original VFIVE only accepted a structured grid data format as input, but an extension of VFIVE for unstructured grid data was developed at Chuo University [19]. The development of VFIVE and its applications are summarized in our recent papers [20,21].
In addition to these improvements to VFIVE, we developed the following four novel CAVE visualization applications for π-CAVE: (1) IonJetEngine, for VR visualization of plasma particle-in-cell (PIC) simulations of an ion jet engine for space probes; (2) RetinaProtein, for molecular dynamics (MD) simulations of proteins; (3) SeismicWave, for simulations of seismic wave propagation; and (4) CellDivision, for three-dimensional time-sequence microscope images of mouse embryos. All of these new CAVE visualization programs were written using OpenGL and CAVElib. We started developing these visualization applications while the construction of π-CAVE was underway.
Several problems occur when multiple CAVE visualization applications are executed one after another. First, the user types the command to launch the first application on the keyboard beside the CAVE room, then enters the room wearing stereo glasses. After analyzing the data with the first application in the CAVE, the user leaves the room and takes off the glasses. Next, the user types the command to launch the second application and re-enters the room wearing the stereo glasses. These steps must be repeated for every application. This inconvenience arises because the CAVE is used as a single-task system.
To resolve this inconvenience, we developed an application launcher for CAVE. This program, Multiverse, is a CAVE application written in CAVElib and OpenGL. Multiverse can control other VR applications. These sub-applications are depicted in CAVE's VR space using 3D icons or panels. If the user in the CAVE room touches one of the panels using the wand, they are "teleported" to the corresponding VR application.
In this paper, we describe the hardware of the π-CAVE system in section 2 and the design and implementation of Multiverse in section 3. The visualization applications loaded into Multiverse are described in section 4.
π-CAVE system

π-CAVE has a rectangular parallelepiped configuration with side lengths of 3 m × 3 m × 7.8 m (Fig. 3). The large width (7.8 m) is one of the characteristic features of this CAVE system. The large volume of π-CAVE allows several people to stand on the floor at the same time without any mutual occlusion of the screen views. Like many other CAVE systems, π-CAVE has four screens: three wall screens (front, right, and left) and a floor screen. Soft, semi-transparent screens are used on the walls, and the images are rear-projected onto them. The floor is a hard screen onto which the stereo image is projected from the ceiling.

Two projectors generate the front wall image (Fig. 4), and an edge blending technique is applied to the interface between the two images. Another pair of projectors is used for the floor screen, and each side wall screen (right and left) is served by a single projector, so six projectors are used in total. The resolution of each projector (Christie WU12K-M), shown in Fig. 5 with its counterpart mirror, is 1920 × 1200 pixels, and the brightness is 10,500 lumens.

An optical motion tracking system (Vicon) is used for head and wand tracking. Ten cameras with 640 × 480 resolution are installed on top of the wall screens. A commonly used API (Trackd) provides the interface to CAVElib. Two computer systems are used for computation and rendering in π-CAVE. One is a Linux PC (HP Z800) with 192 GB of shared memory; three GPU sets (NVIDIA QuadroPLEX) are used for real-time stereoscopic image generation on the six projectors. The other is a Windows PC cluster system.
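The edge blending mentioned above can be illustrated with a minimal sketch: each of the two overlapping projectors fades its intensity linearly across the overlap region so that the combined brightness stays constant. The function below, including its name and parameters, is our own illustration (not the actual π-CAVE calibration code, which would also apply gamma correction).

```cpp
#include <cassert>

// Hypothetical linear edge-blend ramp. Returns the intensity weight for one
// projector at horizontal position x (in pixels), where its image spans
// [0, width] and the overlap with the neighboring projector occupies the
// last `overlap` pixels. The neighbor applies the mirrored ramp, so the two
// weights sum to 1 everywhere inside the overlap region.
double blendWeight(double x, double width, double overlap) {
    double rampStart = width - overlap;
    if (x <= rampStart) return 1.0;   // outside the overlap: full intensity
    if (x >= width) return 0.0;       // past the image edge: no contribution
    return (width - x) / overlap;     // linear fade across the overlap
}
```

For a 1920-pixel image with a 200-pixel overlap, the weight is 1.0 up to pixel 1720, 0.5 at pixel 1820, and 0.0 at pixel 1920, while the neighboring projector's mirrored ramp supplies the complement.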
We used OpenGL for the graphics API and CAVElib for the VR API. We are also aiming to use VR Juggler [22] for the VR API. Some of our first trials using VR Juggler can be found in our report [23].

Multiverse
We developed an application launcher, Multiverse, for the π-CAVE system. At the start of the Multiverse environment, the viewer in the π-CAVE stands in the virtual IRC building in which the π-CAVE is installed. The 3D CAD model data of the IRC building (Fig. 6) are loaded into Multiverse and rendered in 3D in real time. This start-up environment of Multiverse is known as the World.
In the World mode of Multiverse, the viewer can walk through the building. Fig. 7(a) shows a snapshot in which the user is approaching the IRC building. In Fig. 7(b), the viewer is (literally) walking into the (virtual) IRC building. Some fine structures of the building, including the virtual π-CAVE shown in Fig. 7(c) and (d), are also loaded from CAD data files.

Multiverse offers two methods of showing the list of loaded applications. The first is to use "ribbons" that connect the wand to application icons. In the "ribbons" mode, the user in the World finds one or more curves, or wires, that start from the wand tip. Each wire is a type of guide that leads the user to a Gate.
A Gate is an entrance to the VR world of the corresponding application. When multiple visualization applications are loaded into Multiverse, it automatically generates the corresponding number of Gates, all of which are connected to the user (or the wand) via guide wires (Fig. 8). If the user walks or "flies" to a place in front of a Gate, they will find an explanatory movie near the Gate (see the rectangular panel in the center of the blue, torus-shaped Gate in Fig. 8). This movie explains the application that will be executed if the user selects the Gate. To select the application, the user (literally) walks through the Gate, whereupon the corresponding VR application program is loaded and the user feels as if they have been "teleported" into the visualization space. Each such VR world is known as a Universe in Multiverse.

The other method of showing the list of loaded applications is to use a virtual elevator. When the user enters the elevator in the (virtual) IRC building, they are automatically carried upward into the sky above the IRC building. The spatial scale of the view changes rapidly from the building to the city, the country, and finally the globe. The user then finds that they are "floating" in space surrounded by stars. Several panels appear in front of the viewer, each representing a visualization application (Fig. 9).
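Each guide wire only needs to be a smooth curve from the wand tip to its Gate. One simple way to evaluate such a curve is a cubic Bézier, sketched below; the Bézier form, the `Vec3` type, and all names are our own illustration under that assumption, not the actual Multiverse implementation.

```cpp
#include <cassert>

struct Vec3 { double x, y, z; };

// Linear interpolation between two points.
Vec3 lerp(const Vec3& a, const Vec3& b, double t) {
    return { a.x + (b.x - a.x) * t,
             a.y + (b.y - a.y) * t,
             a.z + (b.z - a.z) * t };
}

// Evaluate a cubic Bezier at parameter t in [0,1] via de Casteljau's
// algorithm. p0 = wand tip, p3 = Gate position; p1 and p2 are control
// points that lift the wire into an arc so it is easy to see in the CAVE.
Vec3 bezier(const Vec3& p0, const Vec3& p1,
            const Vec3& p2, const Vec3& p3, double t) {
    Vec3 a = lerp(p0, p1, t), b = lerp(p1, p2, t), c = lerp(p2, p3, t);
    Vec3 d = lerp(a, b, t), e = lerp(b, c, t);
    return lerp(d, e, t);
}
```

Sampling `t` at small steps and drawing line segments between the samples (e.g., with a GL line strip) would produce the ribbon; the curve passes exactly through the wand tip at t = 0 and the Gate at t = 1.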
When the user touches one of the panels, the corresponding VR application is launched and the user is "teleported" to the selected visualization Universe.
In short, Multiverse is composed of the World and several Universes. The World is a type of 3D desktop environment, and each Universe is a visualization application loaded into Multiverse.
In the program code, each Universe is simply a standard CAVE application with a unified interface to the Multiverse class. A Universe is an instance of a class that is derived from a virtual class known as Vacuum. Vacuum represents an empty space, which only has an interface to the Multiverse class through the member functions initialize(), draw(), update_per_frame(), and compute(). These function names convey their roles to readers who are familiar with CAVElib programming.
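This interface can be sketched as the following C++ skeleton. The four member function names are those given above; everything else (the signatures, the Universe registry, and the `teleport()` call) is our assumed illustration rather than the actual Multiverse source.

```cpp
#include <cassert>
#include <cstddef>
#include <memory>
#include <vector>

// Vacuum represents an empty space: it only defines the interface through
// which Multiverse drives a Universe. The four member function names are
// from the text; the empty default bodies are our assumption.
class Vacuum {
public:
    virtual ~Vacuum() = default;
    virtual void initialize() {}        // one-time setup (load data, build textures)
    virtual void draw() {}              // per-frame OpenGL drawing would go here
    virtual void update_per_frame() {}  // per-frame state update (animation, input)
    virtual void compute() {}           // numerical work (e.g., field-line tracing)
};

// Example Universe derived from Vacuum (a stand-in for GeomagField etc.).
class ExampleUniverse : public Vacuum {
public:
    void initialize() override { initialized = true; }
    void update_per_frame() override { ++frameCount; }
    bool initialized = false;
    long frameCount = 0;
};

// The launcher keeps the loaded Universes and forwards the CAVE callbacks
// to whichever one is active; index 0 stands for the World itself.
class Multiverse {
public:
    std::size_t load(std::unique_ptr<Vacuum> u) {
        u->initialize();
        universes.push_back(std::move(u));
        return universes.size() - 1;
    }
    void teleport(std::size_t index) { active = index; }  // Gate/panel touched
    void frame() {                                        // called once per frame
        universes[active]->update_per_frame();
        universes[active]->draw();
    }
    std::size_t active = 0;
    std::vector<std::unique_ptr<Vacuum>> universes;
};
```

Because every Universe is reached only through this interface, switching between applications reduces to changing which object receives the per-frame callbacks, which is what makes the "teleportation" between Universes possible within a single CAVE process.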

Applications
In this section, we describe five applications, or Universes, which we developed as the first applications for the Multiverse environment.

Universe::GeomagField
We converted VFIVE, which was described in section 1, into a Universe class. VFIVE is a general-purpose visualization tool, so any vector/scalar field can be visualized in the Multiverse framework, provided that the data conform to VFIVE's input data format. Fig. 10 shows a snapshot of an example Universe based on VFIVE, known as GeomagField. The input data used by GeomagField were from a geodynamo simulation performed by one of the authors and his colleagues [24,25,26]. The purpose of this simulation was to understand the mechanism that generates the Earth's magnetic field (the geomagnetic field). Fig. 11 shows another snapshot of GeomagField in which two VFIVE visualization methods were applied. The temperature distribution was visualized by volume rendering (colored orange to yellow). The 3D arrow glyphs show the flow velocity vectors around the wand position; the arrows followed the wand as the viewer moved it, and their directions and lengths changed accordingly.

Universe::IonJetEngine
The second example of a Universe is known as IonJetEngine and a snapshot is shown in Fig. 12.
This Universe visualized a plasma PIC simulation of the ion jet engine of a space probe. The positions of the particles (ions and electrons) were represented by balls (yellow for ions and blue for electrons). The velocity distribution of the jet was visualized as the set of individual particle motions. A 3D model of the virtual space probe from which the plasma jet beams were ejected is also shown in Fig. 12.

Universe::RetinaProtein

Fig. 13 shows a Universe known as RetinaProtein, which visualized a molecular dynamics simulation of rhodopsin [27], a protein in the human retina. At the start of this Universe, the viewer observed a 3D model of a human (see the top panel of Fig. 13). As the viewer approached the model's face, the fine structures of the eyes became visible until the MD simulation visualization appeared. The MD simulation data were provided by Prof. Ten-no of Kobe University and his colleagues.

Universe::SeismicWave
In this Universe, a simulation of seismic wave propagation [28], performed by Prof. Furumura of the University of Tokyo, was visualized by animated volume rendering (see Fig. 14). For this Universe, we implemented rapid volume rendering in CAVEs based on the 3D texture mapping technique. The full details of this implementation will be reported elsewhere.
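Although the implementation details are reported elsewhere, the compositing that slice-based (3D-texture) volume rendering relies on can be sketched on the CPU. The sketch below shows the standard front-to-back "over" operator that the GPU blending hardware effectively computes; the single-channel `Sample` type and the early-termination threshold are illustrative assumptions, not the actual renderer.

```cpp
#include <cassert>
#include <vector>

// One slice's contribution along a viewing ray: a (pre-classified) color
// value and an opacity. Single-channel for brevity; a real renderer uses RGB.
struct Sample { double color; double alpha; };

// Front-to-back alpha compositing over the slices intersected by one ray.
double compositeFrontToBack(const std::vector<Sample>& slices) {
    double color = 0.0;
    double transmittance = 1.0;             // fraction of light still passing through
    for (const Sample& s : slices) {
        color += transmittance * s.alpha * s.color; // this slice's contribution
        transmittance *= (1.0 - s.alpha);           // light remaining behind it
        if (transmittance < 1e-4) break;            // early ray termination
    }
    return color;
}
```

A fully opaque front slice hides everything behind it, while two half-opaque slices of color 1.0 composite to 0.5 + 0.5 × 0.5 = 0.75, which matches the familiar back-to-front blending result.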

Universe::CellDivision
The final Universe described here is CellDivision and a snapshot is shown in Fig. 15. The target data used for this visualization were not simulation data. Instead, they were microscope images of live mouse embryos. The data were provided by Dr. Yamagata of Osaka University. The time sequence of microscope images was visualized as an animated volume rendering using the same tool used for SeismicWave in the previous subsection.

Summary
In many CAVE systems, VR applications are executed as single tasks, so the user has to type in each launch command, one after another, outside the CAVE room. To make the CAVE a more convenient tool for scientific visualization, we developed an application launcher known as Multiverse. Multiverse comprises a World and several Universes. The World corresponds to the desktop of a PC operating system; in it, the user can select visualization applications by touching icons floating in the air. Through this virtual touch-screen interface, the specified application program is launched and the user is "teleported" to another VR space containing the corresponding visualization application, which is known as a Universe. We developed five Universes that can be launched from the Multiverse environment. Multiverse was designed as a general application framework, so it can load and control other Universes as well. The user can jump back to the World and switch to another Universe at any time, from any Universe.
During the implementation of Multiverse, we developed several new fundamental tools and methods for the CAVE environment, such as a fast volume renderer, a 3D model (CAD) data loader/renderer, and a 2D movie file loader/renderer. Details of these fundamental tools and methods will be reported elsewhere.