One of the most exciting developments in modern virtual reality (VR) is the ability to lose yourself in another world. Generating this feeling of being somewhere other than where you are physically situated is called ‘presence’ – and it’s critical to creating immersive VR experiences.
Presence is one of the most studied variables in developing immersive VR. The research covers everything from visual realism and field of view to human emotions and agency. How these elements combine to create presence, however, has so far remained elusive.
Understanding the problem
Virtual reality, especially at a commercial level, is still in its infancy. Mass-produced headsets are bound by technical limitations such as battery life, graphics, and screen size. This has raised questions for researchers about how hardware affects presence.
The visual realism of the environment and the field of view the user can see are among the most important elements when creating presence. However, both require higher computer processing power and result in more expensive and bulky technology – neither of which is ideal when creating a commercial product.
So how exactly can developers generate presence, while still giving mass markets an effective VR experience?
Setting the scene
The team gathered a study group of 360 participants and exposed each of them to one of 16 different VR environments. Each environment induced either a happy or a fearful emotion; gave the participant agency or no agency; and varied in graphic quality and field of view.
In the ‘happy’ environment, participants found themselves in a park in the daytime with a friendly dog. Those with agency could interact with the dog using a laser pointer, while those without could only observe.
In contrast, the fear-inducing environment saw participants in the same park, but this time at night, confronted with a wolf-like creature ready to attack. Participants who were given agency had a torch they could flash to distract the creature, while those without agency could take no action.
Contrary to initial predictions, the results of the study highlighted that technical factors alone – such as the quality of graphics or the field of view – were not enough to affect presence. Instead, human factors were the primary driver, with fear being the most impactful emotion.
Additionally, the researchers discovered that technical features are only relevant in specific scenarios. They suggest that the purpose of a VR application should guide decisions about the properties of both the software and the headset itself. For example, realism only contributed to presence in the ‘fear’ scenario, so in applications where fear is the dominant intended emotion, game designers should prioritise visual realism. A wider field of view, meanwhile, was only found to be beneficial when users were also afforded agency – so developing headsets with a larger screen and wider view matters most for interactive games or training applications that give users this agency.
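As a rough illustration only – this is not part of the published model, and the function and feature names are invented for the sketch – the design guidance above could be encoded as a simple lookup a developer consults when scoping a VR application:

```python
# Hypothetical sketch of the study's design guidance: prioritise visual
# realism when fear is the intended emotion, and a wide field of view
# when the user is given agency. Names are illustrative, not taken from
# the TAP-Fear model itself.

def prioritised_features(intended_emotion: str, has_agency: bool) -> list[str]:
    """Return the technical features worth prioritising for a scenario."""
    priorities = []
    if intended_emotion == "fear":
        # Realism only contributed to presence in the fear scenario.
        priorities.append("visual realism")
    if has_agency:
        # A wider field of view helped only when users also had agency.
        priorities.append("wide field of view")
    return priorities

print(prioritised_features("fear", has_agency=True))
# ['visual realism', 'wide field of view']
print(prioritised_features("happy", has_agency=False))
# []
```

The point of the sketch is simply that neither feature is worth its hardware cost unconditionally: each pays off only in the scenario the study identified.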
The team went on to propose a structural equation model, ‘TAP-Fear’, which offers designers guidelines for navigating technical limitations while leveraging human factors to maximise user presence.
This research marks a significant step forward in understanding the relationship between emotion and technology in the effectiveness of VR. As the industry continues to evolve, game developers should adapt their design according to the level of emotion and agency users experience, and models like TAP-Fear can be used to guide the creation of more immersive and impactful experiences.