From IrisVR's company blog, experiments that show how VR is true to scale and what that means for architects and other AEC professionals:
One of the key features and advantages of Virtual Reality (VR) technology is that the user gets a better sense of scale and depth than when looking at regular monitors and projectors. People who experience VR models in a correctly set up Head-Mounted Display (HMD), such as the Oculus Rift, generally report that the virtual world feels like it is at the right scale. For architects and other design professionals, getting an accurate sense of the scale, depth, and volume of a space is critically important. IrisVR delivers a premium VR experience for architects and AEC professionals in general…
We are not content with users' qualitative feedback alone; we want to take a scientific approach to verify that scale in VR matches real-world scale. So we designed and conducted two experiments that "project" the virtual world into the real world, letting us measure virtual distance and rotation in the physical world and check whether the two match.
Experiment One: Leap Motion Distance Tracking and Measurement
The Leap Motion is a hand-tracking device that tracks the user's arms, palms, and fingers. We set up the Leap Motion to work with the Oculus Rift in Unity so that the user can see his hands in the virtual world. The idea is that if the user sees a box in the virtual world and reaches out to touch its edges with his index fingers, then by measuring the distance between his virtual index fingers we obtain the width of the virtual box (see Figure 1). Meanwhile, if we measure the distance between the user's fingers in the real world, we get the width of the box in the real world. By comparing the two measurements, we can tell whether the box in VR is scaled correctly.
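The core of the measurement is just the straight-line distance between the two tracked fingertip positions. A minimal sketch of that arithmetic (the coordinates below are invented for illustration; the real values would come from the Leap Motion tracking data):

```python
import math

def fingertip_distance(left_tip, right_tip):
    """Euclidean distance between two 3D fingertip positions (same units as input)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(left_tip, right_tip)))

# Hypothetical fingertip coordinates in inches, as a tracker might report them
left = (-2.0, 0.5, 10.0)
right = (2.0, 0.5, 10.0)
print(fingertip_distance(left, right))  # width of the virtual box: 4.0
```

If the virtual world is correctly scaled, this number should match what a tape measure reads between the user's real fingertips.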
Figure 1. Touching the edge of a box in the virtual world to measure the box dimension
The distance measured in the real world should match the distance in the virtual world.
We mounted the Leap Motion device on the center of the Oculus Rift so that it tracks the user's hands from the perspective of the user's eyes. Figure 2 and Figure 3 show the experiment setup. In Figure 2, the user holds up his index finger in front of his face. The Leap Motion tracks his hands and projects them into the virtual world, as shown on the monitor. The distance between the virtual fingers is also displayed in the virtual world, so the user can spread his hands to a particular distance (see Figure 3).
Figure 2. Experiment Setup
Figure 3. Leap Motion tracks virtual hands
We recruited five participants for this experiment. Each participant was asked to hold his index fingers apart at a given virtual distance, and a measurer then measured the distance between the fingertips. To reduce measurement and tracking error, we measured several different distances: 1”, 2”, 4”, 8”, and 12”.
The results of the experiment are listed in the table below:
Table 1. Record of measurement
In the table, “Virtual Distance” is the distance in the virtual world, and “Real Measure” is the measurement in the physical world taken with a tape measure, rounded to the nearest quarter inch. The 0” measurement was taken while the participant held his index fingers together; it benchmarks the tracking system’s accuracy/error. The “Ave. Diff” row gives the average difference between the physical and virtual distance for a participant, excluding the 0” measurement. The “Ave VS zero” row is calculated by subtracting the 0” measurement from Ave. Diff.
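The two derived rows reduce to simple arithmetic. The sketch below walks through it for one participant; the difference values are invented purely to illustrate the calculation, not taken from Table 1:

```python
# Invented example values (not from Table 1): difference between real and
# virtual distance, in inches, for one participant at each target distance.
diffs = {0: 0.05, 1: 0.10, 2: 0.08, 4: 0.12, 8: 0.15, 12: 0.10}

# "Ave. Diff": mean difference, excluding the 0" baseline measurement
nonzero = [d for target, d in diffs.items() if target != 0]
ave_diff = sum(nonzero) / len(nonzero)

# "Ave VS zero": the same mean with the 0" baseline (tracking error) removed
ave_vs_zero = ave_diff - diffs[0]
print(round(ave_diff, 2), round(ave_vs_zero, 2))  # 0.11 0.06
```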
Looking at the “Ave VS zero” row, it is clear that the physical measurements are very close to the virtual distances: most measurements agree to within 0.1 inch. To obtain an objective assessment, we ran a two-tailed paired t-test comparing the 0” measurements and the “Ave VS zero” values. See the table below for details.
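A paired t-test looks at the per-participant differences between the two groups. A stdlib-only sketch of the test statistic, using hypothetical participant values (not the experiment's data):

```python
import math
import statistics

def paired_t_statistic(x, y):
    """t statistic and degrees of freedom for a paired (dependent-samples) t-test."""
    diffs = [a - b for a, b in zip(x, y)]
    n = len(diffs)
    sd = statistics.stdev(diffs)                       # sample standard deviation
    t = statistics.mean(diffs) / (sd / math.sqrt(n))
    return t, n - 1

# Hypothetical per-participant values (not the experiment's data):
zero_inch   = [0.05, 0.03, 0.06, 0.04, 0.05]   # 0" baseline tracking error
ave_vs_zero = [0.06, 0.04, 0.05, 0.05, 0.07]   # baseline-corrected avg. difference

t, df = paired_t_statistic(zero_inch, ave_vs_zero)
# For a two-tailed test, compare |t| with the critical value for df degrees of
# freedom (2.776 at alpha = 0.05 for df = 4); if |t| is smaller, the two groups
# are not significantly different, i.e. the residual error is indistinguishable
# from the tracker's own error.
```

In practice one would use a library routine such as `scipy.stats.ttest_rel`, which also returns the p-value directly.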
Table 2. Two groups for the two-tailed paired t-test