Using Virtual Reality for Teaching Kinematics

Simulations have been used for decades to teach physics concepts. Virtual Reality (VR) opens new avenues: the benefits of acting out physics (embodiment) can be combined with the affordances of a simulated environment. This paper aims to demonstrate how to create physics-education simulations in VR with comparatively small effort beyond 2D simulations, using the Unity game development environment in connection with consumer-grade VR gear.


Introduction
Simulations have proven their effectiveness for physics teaching and learning, as they provide opportunities for unrestricted, engaged exploration [1,2], without the risk of destroying equipment or, much worse, bodily harm. Otherwise unaffordable laboratory equipment can be made available virtually at scale, and immaterial concepts such as fields can be made tangible [3].
Moving beyond two-dimensional screen-based simulations, which can be developed using a variety of standard tools, the challenge of writing VR simulations can seem daunting. This challenge used to start with the availability of VR environments, once the domain of "caves" [4], and included the complications of interfacing with the VR equipment, but this changed with the advent of affordable gear for gaming purposes. This paper aims to show how to construct physics-education simulations in VR with comparatively small effort beyond 2D simulations, using the Unity game development environment [5,6] in connection with software libraries that allow for unified development across platforms and equipment, as well as consumer-grade VR equipment such as HTC Vive [7] or Oculus [8]. VR gear usually includes a headset and hand controllers; these sets currently cost a few hundred dollars, but admittedly also require an almost equally expensive graphics card in the computer running the simulation.
The left panel of Figure 1 shows a typical example of a setup: the student is wearing VR glasses and uses two hand controllers; in the background is one of the two so-called "lighthouses", which are used for positional triangulation (more modern setups no longer need this feature, and more modern glasses are less bulky and wireless). In this example, HTC Vive (right panel of Figure 1) is used in connection with an NVIDIA GeForce GTX 1070 (8 GB) card in a laptop, but the Unity development environment used here hides and automatically manages these hardware details. HTC Vive has the advantage that it does not need a Facebook account (the company is now called Meta, and more recent devices need to sign on to the "Metaverse" - something that might be problematic for privacy reasons). The small star-shaped device in Figure 1 is a so-called Trackable Object, which can be attached to real-world objects as a means to capture their position and rotation, so a real object can be interactively integrated into the virtual world.

Using VR in Connection with Unity
Unity is a leading game development platform, which has found its way into several other areas of industry and academia for simulation and visualization purposes. The platform is free for noncommercial purposes (however, authors should carefully study the license as it applies to publication beyond classroom use, even free-of-charge). Unity is a professional environment, also used for a large number of commercially available games, and learning to use it for any purpose entails a steep learning curve. At the same time, however, the environment hides many intricacies that would otherwise slow down the development of immersive environments: rendering objects in the virtual world is abstracted away by consistently treating them as software objects (similar to VPython [9]) and by providing a "camera" object that "looks" at what the player sees on the screen. Lighting, obstruction, rendering of textures, as well as basic physics such as gravity, momentum conservation, friction, etc., are all taken care of by the underlying engine, so developers can focus on the objects themselves and their interactions. Figure 2 shows the development environment of Unity.
Unity is best learned using the tutorials provided in the "Learning" section of their website [5]. The tutorials come with downloadable setups and assets; videos and text materials work through the essential steps of writing immersive, interactive content. Readers familiar with VPython [9] have a large advantage, but there are some differences worth pointing out. Unity is even more strongly object-oriented, in that there is usually no single script controlling all objects (this would be possible but bad style); instead, event-driven object-orientation is consistently implemented by attaching small behaviour scripts directly to the objects, which usually get triggered when events like collisions occur (see the right panel in Figure 2). These scripts are most frequently written in C#, but other languages are possible, such as JavaScript. The second large difference is the built-in physics engine, which automatically applies all of Newtonian physics to objects (if desired; this is the "RigidBody" component in the right panel of Figure 2). The third large difference is the graphical scene editor, which allows objects to be put into their initial places without calculating their coordinates (this is the upper left panel in Figure 2). While interfacing with graphics cards and standard input devices such as mice, keyboards, and joysticks is a given, one might still expect that interfacing with the VR equipment is the most complex task when moving to VR; fortunately, the SteamVR package (available for free from the Unity Asset Store) abstracts away the equipment (including differences between vendors). Thus, when writing VR applications, it turns out that learning Unity in the first place is the large step, while the addition of VR can be accomplished in two small steps.
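A minimal sketch of such an event-driven behaviour script may make the pattern concrete (the class name and logged message are illustrative, not from an actual simulation): the script is attached to a single object, and the Unity engine calls its event methods automatically.

```csharp
using UnityEngine;

// Attached to one object; Unity calls the methods below automatically
// when the corresponding events occur for that object.
public class BounceCounter : MonoBehaviour
{
    private int bounces = 0;

    // Called by the physics engine whenever this object's collider
    // touches another collider.
    void OnCollisionEnter(Collision collision)
    {
        bounces++;
        Debug.Log($"Bounced off {collision.gameObject.name} ({bounces} times)");
    }
}
```

Note that no central game loop references this script; the engine discovers the event methods by name and invokes them on each object that carries the component.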
The essential first step is replacing Unity's "Main Camera" (what a player would see on the computer screen) with SteamVR's "Player" (what the player will see in their headset; see the object list in the upper left panel of Figure 2), which in one fell swoop lets the user move around in the scene in a natural and intuitive way - essentially, the player's eyes become the "camera" in the virtual space. If virtual scenes simply need to be observed by looking and walking around, one would be finished now; in most cases, though, the second step would be the literal manipulation of objects: grabbing them, moving them around, or throwing them.
As the second step, all game objects that the hand controllers should manipulate need to be linked to SteamVR's input system; this, too, is fairly simple to accomplish by adding a script component to those objects. The SteamVR library provides several ready-to-use scripts for this purpose that can be attached to objects, for example a "Throwable" behaviour (right panel of Figure 2): the user can intuitively grab the object, move it around, and throw it (with the Unity physics engine again taking over). From then on, Unity can be programmed in the same way as for non-VR games.
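While these components are usually attached in the graphical editor, the same setup can also be sketched in code; the following assumes the SteamVR plugin's interaction-system namespace and is meant only to show which components are involved.

```csharp
using UnityEngine;
using Valve.VR.InteractionSystem; // SteamVR plugin's interaction system

// Illustrative sketch: give an object physics and make it grabbable.
// In practice, these components are typically added in the Unity editor.
public class MakeThrowable : MonoBehaviour
{
    void Start()
    {
        // Physics: gravity, collisions, momentum (the "RigidBody" component).
        gameObject.AddComponent<Rigidbody>();
        // SteamVR: lets the hand controllers grab and throw this object.
        gameObject.AddComponent<Throwable>();
    }
}
```

With the "Throwable" component in place, no further VR-specific code is needed; grabbing and throwing simply work through SteamVR's input system.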

A Kinematics Simulation
Students frequently struggle with the fundamental kinematics concepts of position, velocity, and acceleration, as all three may point in different directions [10]. The concept of motion is readily accessible (arguably, hardly any physics concept is more easily embodied), but describing motion through vectors seems abstract to many learners - these vectors have no physical manifestation. In this example, the idea is to give these vectors manifestations in a simulation, by showing the velocity and acceleration vectors of a "throwable" ball. These travel along with the ball and can be readily observed in the virtual world.
The ball object in this case is a sphere from the standard Unity library of simple geometric objects, covered with a metallic texture from the Unity Asset Store; the "Throwable" behaviour script comes from the SteamVR library and is attached to the sphere, and finally the ball has a physics ("RigidBody") component from the standard Unity library that makes it fall, bounce, and roll with friction. Figure 3 shows screenshots of the simulation on the computer screen, which displays a clipped-out section of what the user sees in the VR glasses.
Figure 3. Screenshots of the simulation on the computer screen, which shows a clipped-out section of the user's headset display [11].
The left panel of Figure 3 shows the "throwable" ball and the hand controllers, which appear as gloves (again, these "hands" come with SteamVR). The user can grab the ball and move it around; the middle panel shows the user holding the ball and swinging it around in a horizontal circle (while holding the ball, the gloves disappear, which may appear unintuitive, but in actuality does not impact immersion). The arrows, in this case, are standard Unity cylinders, but their length is calculated in a self-written C# script based on kinematics. The green arrow shows the instantaneous velocity (during this circular motion, in the tangential direction), and the red arrow shows the centripetal acceleration. The user can also throw the ball; the right panel of Figure 3 shows the ball in such a free-fall trajectory, still on the way up. The simulation includes a cage off which the ball bounces, so it does not get lost out of reach of the user (without the cage, the ball could roll away in the virtual world to places where in the real world there are walls or furniture in the way). During these collisions, the instantaneous acceleration vector can be observed.
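The core of such an arrow script can be sketched as follows; the class name, scale factors, and the use of a backward finite difference for the acceleration are illustrative assumptions, not the paper's actual implementation.

```csharp
using UnityEngine;

// Illustrative sketch: keep a velocity arrow (green) and an acceleration
// arrow (red) attached to the ball, scaled to the current vectors.
public class KinematicsArrows : MonoBehaviour
{
    public Transform velocityArrow;      // green cylinder
    public Transform accelerationArrow;  // red cylinder
    public float velocityScale = 0.1f;   // metres of arrow per m/s (assumed)
    public float accelerationScale = 0.02f;

    private Rigidbody body;
    private Vector3 lastVelocity;

    void Start()
    {
        body = GetComponent<Rigidbody>();
    }

    void FixedUpdate()
    {
        Vector3 v = body.velocity;
        // Acceleration from a backward finite difference of the velocity.
        Vector3 a = (v - lastVelocity) / Time.fixedDeltaTime;
        lastVelocity = v;

        Align(velocityArrow, v * velocityScale);
        Align(accelerationArrow, a * accelerationScale);
    }

    // Place a cylinder at the ball, pointing along 'vec', with length |vec|.
    // Unity's default cylinder is 2 units tall at scale y = 1, so the
    // y-scale is half the desired length and the midpoint sits at vec/2.
    void Align(Transform arrow, Vector3 vec)
    {
        arrow.position = transform.position + vec * 0.5f;
        arrow.up = vec.normalized;
        Vector3 s = arrow.localScale;
        arrow.localScale = new Vector3(s.x, vec.magnitude * 0.5f, s.z);
    }
}
```

During uniform circular motion, `v` comes out tangential and `a` points toward the centre, reproducing the green and red arrows in the middle panel of Figure 3.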
A big challenge of Virtual Reality (besides finding enough open floor space) is that only one student at a time can wear the glasses. However, the clipped-out view of Figure 3 can be projected for all students to observe what is happening in virtual space and to give directions to the user. In another implementation, a Trackable Object could be attached to a real ball, which students could interact with directly; the student could see the real ball in the virtual world, but with the vectors attached to it. A problem would be the integration of real surfaces (walls, floor, ceiling, furniture) into the virtual world.

Beyond Simulations
Embodiment is one possible application of VR systems; another is data collection for movement in three dimensions. In addition to the already tracked hand controllers and the headset, most systems also provide various types of robust Trackable Objects like the one in Figure 1, which can be attached to real objects. By default, the HTC Vive system collects fifty position and rotation datapoints per second for all tracked objects, with an absolute spatial resolution in the centimetre range. Individual datapoints are a bit noisy, so for the simulation presented here, a 10-datapoint running average was implemented to provide a solid base for the smooth rendering of the calculated velocities and accelerations. Finally, while it is instructional to have students interact with simulations, it can be even more instructional to have them write simulations [12,13]; while the author has used the somewhat simpler VPython in introductory physics courses [14], Unity and VR have been used in more advanced seminar courses.
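Such a running average can be implemented in a few lines; the following sketch (class and member names are illustrative) keeps a sliding window of the last ten raw samples and returns their mean.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Illustrative sketch of a 10-datapoint running average over tracked
// positions, fed with raw 50 Hz samples from a tracked object.
public class PositionSmoother
{
    private const int WindowSize = 10;
    private readonly Queue<Vector3> window = new Queue<Vector3>();
    private Vector3 sum = Vector3.zero;

    // Add one raw sample; returns the current running average.
    public Vector3 AddSample(Vector3 raw)
    {
        window.Enqueue(raw);
        sum += raw;
        if (window.Count > WindowSize)
            sum -= window.Dequeue(); // drop the oldest sample
        return sum / window.Count;
    }
}
```

Velocities and accelerations can then be computed by finite differences on the smoothed positions rather than on the noisy raw samples.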

Conclusion
Virtual Reality has become mainstream as a result of high-end gaming. Even though this particular market segment might have outlived its initial hype, VR headsets have become so mainstream that they are available in supermarkets, and this development has made the technology affordable for other applications, including education, where its immersive and interactive nature allows for the embodiment of otherwise abstract concepts. The same gaming applications also gave rise to packages like SteamVR that make the development of VR simulations relatively easy in standard game development platforms like Unity; these packages abstract away the intricacies of VR development, so authors can focus on the physics functionality of their simulations.

Figure 1. Student wearing VR glasses (left panel) and a typical (but by now older) VR system (right panel) including glasses, hand controllers, and a Trackable Object.

Figure 2. Screenshot of the Unity development environment.