Immersion (virtual reality)

Immersion into virtual reality (VR) is a perception of being physically present in a non-physical world. The perception is created by surrounding the user of the VR system with images, sound, or other stimuli that provide an engrossing total environment.

A woman using the Manus VR glove development kit in 2016

Etymology

The name is a metaphoric use of the experience of submersion applied to representation, fiction or simulation. Immersion can also be defined as the state of consciousness in which the awareness of a "visitor" (Maurice Benayoun) or "immersant" (Char Davies) of their physical self is transformed by being surrounded by an artificial environment. The term is used to describe partial or complete suspension of disbelief, which enables action or reaction to stimuli encountered in a virtual or artistic environment. The greater the suspension of disbelief, the greater the degree of presence achieved.

Types

According to Ernest W. Adams,[1] immersion can be separated into three main categories:

  • Tactical immersion: Tactical immersion is experienced when performing tactile operations that involve skill. Players feel "in the zone" while perfecting actions that result in success.
  • Strategic immersion: Strategic immersion is more cerebral, and is associated with mental challenge. Chess players experience strategic immersion when choosing a correct solution among a broad array of possibilities.
  • Narrative immersion: Narrative immersion occurs when players become invested in a story, and is similar to what is experienced while reading a book or watching a movie.

Staffan Björk and Jussi Holopainen, in Patterns In Game Design,[2] divide immersion into similar categories, but call them sensory-motoric immersion, cognitive immersion and emotional immersion, respectively. In addition to these, they add a new category: spatial immersion, which occurs when a player feels the simulated world is perceptually convincing. The player feels that he or she is really "there" and that the simulated world looks and feels "real".

Presence

10.000 moving cities, Marc Lee, Telepresence-Based Installation[3]

Presence, a term derived from the shortening of the original "telepresence", is a phenomenon enabling people to interact with and feel connected to the world outside their physical bodies via technology. It is defined as a person's subjective sensation of being there in a scene depicted by a medium, usually virtual in nature.[4] Most designers focus on the technology used to create a high-fidelity virtual environment; however, the human factors involved in achieving a state of presence must be taken into account as well. It is the subjective perception, although generated by and/or filtered through human-made technology, that ultimately determines the successful attainment of presence.[5]

Virtual reality glasses can produce a visceral feeling of being in a simulated world, a form of spatial immersion called presence. According to Oculus VR, the technology requirements to achieve this visceral reaction are low latency and precise tracking of movements.[6][7][8]

Michael Abrash gave a talk on VR at Steam Dev Days in 2014.[9] According to the VR research team at Valve, all of the following are needed to establish presence (an illustrative spec check against these thresholds is sketched after the list).

  • A wide field of view (80 degrees or better)
  • Adequate resolution (1080p or better)
  • Low pixel persistence (3 ms or less)
  • A high enough refresh rate (>60 Hz; 95 Hz is enough, but less may be adequate)
  • A global display where all pixels are illuminated simultaneously (a rolling display may work with eye tracking)
  • Optics (at most two lenses per eye with trade-offs; ideal optics are not practical with current technology)
  • Optical calibration
  • Rock-solid tracking – translation with millimeter accuracy or better, orientation with quarter-degree accuracy or better, and a tracked volume of 1.5 meters or more on a side
  • Low latency (20 ms motion to last photon, 25 ms may be good enough)
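As a rough illustration of how these thresholds might be used (this sketch is not from Valve's talk; the data structure, function name and example headset figures below are hypothetical), a short script can compare a headset specification against the numeric requirements in the list:

    # Python sketch: compare a (hypothetical) headset spec against the
    # numeric presence thresholds listed above. Qualitative items such as
    # global display, optics and calibration are not captured here.
    PRESENCE_THRESHOLDS = {
        "field_of_view_deg":    ("min", 80),    # wide field of view
        "resolution_lines":     ("min", 1080),  # adequate resolution (1080p or better)
        "pixel_persistence_ms": ("max", 3),     # low pixel persistence
        "refresh_rate_hz":      ("min", 95),    # 95 Hz is enough
        "tracking_error_mm":    ("max", 1),     # millimeter translation accuracy
        "tracking_error_deg":   ("max", 0.25),  # quarter-degree orientation accuracy
        "tracked_volume_m":     ("min", 1.5),   # 1.5 m or more on a side
        "motion_to_photon_ms":  ("max", 20),    # motion-to-last-photon latency
    }

    def check_presence(spec):
        """Return (requirement, met) pairs for a headset spec dictionary."""
        results = []
        for key, (kind, limit) in PRESENCE_THRESHOLDS.items():
            value = spec.get(key)
            if value is None:
                met = False                      # unknown counts as not met
            elif kind == "min":
                met = value >= limit
            else:
                met = value <= limit
            results.append((key, met))
        return results

    # Hypothetical headset specification
    example_headset = {
        "field_of_view_deg": 110, "resolution_lines": 1200,
        "pixel_persistence_ms": 2, "refresh_rate_hz": 90,
        "tracking_error_mm": 0.5, "tracking_error_deg": 0.1,
        "tracked_volume_m": 2.0, "motion_to_photon_ms": 18,
    }

    for requirement, met in check_presence(example_headset):
        print(f"{requirement:22s} {'met' if met else 'NOT met'}")

In this example the hypothetical headset falls short only on refresh rate (90 Hz against the 95 Hz figure), which the list above notes may still be adequate.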

Immersive virtual reality

A Cave Automatic Virtual Environment (CAVE) system

Immersive virtual reality is a hypothetical future technology that exists today mostly in the form of virtual reality art projects.[10] It consists of immersion in an artificial environment in which the user feels just as immersed as they usually feel in everyday life.

Direct interaction with the nervous system

The most commonly considered method would be to induce the sensations that make up the virtual reality directly in the nervous system. On a functionalist or conventional biological view, we interact with everyday life through the nervous system, receiving all sensory input as nerve impulses. Such a system would involve the user receiving inputs as artificially stimulated nerve impulses, while the system would read the outputs of the central nervous system (natural nerve impulses), process them, and allow the user to interact with the virtual reality. Natural impulses between the body and the central nervous system would need to be blocked. This could be done with nanorobots that attach themselves to the brain's wiring, intercepting the natural impulses while receiving the digital impulses that describe the virtual world and relaying them into the wiring of the brain. A feedback system between the user and the computer storing this information would also be needed. Considering how much information such a system would have to handle, it would likely rely on hypothetical forms of computer technology.
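As a purely illustrative sketch of the feedback loop this paragraph describes (none of the functions below correspond to real hardware or a real interface; they are stand-ins that only show the ordering of the steps), the flow of information might be organized as follows:

    # Purely illustrative control loop for the hypothetical direct-neural
    # interface described above; every function here is an invented stand-in
    # (random numbers and print statements), not a real API.
    import random

    def read_cns_outputs():
        """Stand-in for reading natural motor nerve impulses."""
        return [random.random() for _ in range(4)]

    def simulate_world(motor_impulses):
        """Stand-in for updating the virtual world from the user's actions."""
        return {"contact_pressure": sum(motor_impulses) / len(motor_impulses)}

    def encode_sensations(world_state):
        """Stand-in for translating world state into artificial nerve impulses."""
        return [world_state["contact_pressure"]] * 4

    def stimulate_nerves(sensory_impulses):
        """Stand-in for delivering artificial impulses while natural input is blocked."""
        print("stimulating with", sensory_impulses)

    for _ in range(3):                  # three iterations of the feedback loop
        motor = read_cns_outputs()      # outputs of the central nervous system
        state = simulate_world(motor)   # the computer updates the virtual world
        stimulate_nerves(encode_sensations(state))

The only point of the sketch is the ordering of the steps: read the natural outputs, update the simulated world, then encode and deliver artificial sensory input while natural input is blocked.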

Requirements

Understanding of the nervous system

A comprehensive understanding of which nerve impulses correspond to which sensations, and which motor impulses correspond to which muscle contractions, would be required. This would allow the correct sensations to be produced in the user and the correct actions to occur in the virtual reality. The Blue Brain Project is currently the most promising research effort in this direction, aiming to understand how the brain works by building very large-scale computer models.

Ability to manipulate the CNS

The central nervous system would need to be manipulated directly. While non-invasive devices using radiation have been postulated, invasive cybernetic implants are likely to become available sooner and to be more accurate. Molecular nanotechnology is likely to provide the required degree of precision, and could allow the implant to be built inside the body rather than inserted by an operation.

Computer hardware/software to process inputs/outputs

A very powerful computer would be necessary to process a virtual reality complex enough to be nearly indistinguishable from everyday life, and to interact with the central nervous system quickly enough.

Immersive digital environments

Cosmopolis (2005), Maurice Benayoun's Giant Virtual Reality Interactive Installation

An immersive digital environment is an artificial, interactive, computer-created scene or "world" within which a user can immerse themselves.[11]

Immersive digital environments could be thought of as synonymous with virtual reality, but without the implication that actual "reality" is being simulated. An immersive digital environment could be a model of reality, but it could also be a complete fantasy user interface or abstraction, as long as the user of the environment is immersed within it. The definition of immersion is wide and variable, but here it is assumed to mean simply that the user feels like they are part of the simulated "universe". The success with which an immersive digital environment can actually immerse the user depends on many factors, such as believable 3D computer graphics, surround sound, and interactive user input, as well as simplicity, functionality, and potential for enjoyment. New technologies are under development which claim to bring realistic environmental effects to the player's environment, such as wind, seat vibration, and ambient lighting.

Perception

To create a sense of full immersion, the five senses (sight, sound, touch, smell, taste) must perceive the digital environment to be physically real. Immersive technology can perceptually fool the senses through:

  • Panoramic 3D displays (visual)
  • Surround sound acoustics (auditory)
  • Haptics and force feedback (tactile)
  • Smell replication (olfactory)
  • Taste replication (gustatory)

Interaction

Once the senses are sufficiently convinced that the digital environment is real (even though the interaction and involvement can never truly be real), the user must be able to interact with the environment in a natural, intuitive manner. Various immersive technologies such as gestural controls, motion tracking, and computer vision respond to the user's actions and movements. Brain-computer interfaces (BCI) respond to the user's brainwave activity.

Examples and applications

Training and rehearsal simulations run the gamut from part-task procedural training (often "buttonology", for example: which button to push to deploy a refueling boom) through situational simulation (such as crisis response or convoy-driver training) to full-motion simulations that train pilots, soldiers, and law enforcement in scenarios too dangerous to practice with actual equipment and live ordnance.

Examples include video games, from simple arcade titles to massively multiplayer online games, and training programs such as flight and driving simulators. Entertainment environments such as motion simulators immerse riders or players in a virtual digital environment enhanced by motion, visual, and aural cues. Reality simulators include one of the Virunga Mountains in Rwanda that takes the viewer on a trip through the jungle to meet a tribe of mountain gorillas.[12] There are also training versions, such as one that simulates a ride through human arteries and the heart to witness the buildup of plaque and thereby learn about cholesterol and health.[13]

In parallel with scientists, artists such as Knowbotic Research, Donna Cox, Rebecca Allen, Robbie Cooper, Maurice Benayoun, Char Davies, and Jeffrey Shaw use the potential of immersive virtual reality to create physiological or symbolic experiences and situations.

Other examples of immersion technology include physical environments or immersive spaces with surrounding digital projections and sound, such as the CAVE, and the use of virtual reality headsets for viewing movies, with head tracking and computer control of the image presented so that the viewer appears to be inside the scene. An example of the next generation is VIRTSIM, which achieves total immersion through motion capture and wireless head-mounted displays for teams of up to thirteen immersants, enabling natural movement through space and interaction in both the virtual and physical spaces simultaneously.

Use in medical care

New fields of study linked to immersive virtual reality continue to emerge. Researchers see great potential in virtual reality tests serving as complementary interview methods in psychiatric care.[14] In studies, immersive virtual reality has also been used as an educational tool in which the visualization of psychotic states is used to build a better understanding of patients with similar symptoms.[15] New treatment methods are available for schizophrenia,[16] and other newly developed research areas in which immersive virtual reality is expected to bring improvement include the teaching of surgical procedures,[17] rehabilitation programs after injuries and surgery,[18] and the reduction of phantom limb pain.[19]

Applications in the built environment

In the domain of architectural design and building science, immersive virtual environments are adopted to help architects and building engineers enhance the design process by engaging their sense of scale, depth, and spatial awareness. Such platforms integrate the use of virtual reality models and mixed reality technologies in various functions of building science research,[20] construction operations,[21] personnel training, end-user surveys, performance simulations[22] and building information modeling visualization.[23][24] Head-mounted displays (both 3-degree-of-freedom and 6-degree-of-freedom systems) and CAVE platforms are used for spatial visualization and building information modeling (BIM) navigation for different design and evaluation purposes.[25] Clients, architects and building owners use applications derived from game engines to navigate 1:1 scale BIM models, allowing a virtual walkthrough experience of future buildings.[24] For such use cases, the navigation performance of virtual reality headsets relative to 2D desktop screens has been investigated in various studies, with some suggesting a significant improvement with headsets[26][27] while others indicate no significant difference.[28][29] Architects and building engineers can also use immersive design tools to model various building elements in virtual reality CAD interfaces,[30][31] and apply property modifications to building information modeling (BIM) files through such environments.[23][32]

In the building construction phase, immersive environments are used to improve site preparation, on-site communication and collaboration of team members, safety[33][34] and logistics.[35] For the training of construction workers, virtual environments have been shown to be highly effective for skill transfer, with studies showing performance results similar to training in real environments.[36] Moreover, virtual platforms are also used in the operation phase of buildings to interact with and visualize data from Internet of Things (IoT) devices in the building, to improve processes, and to manage resources.[37][38]

Occupant and end-user studies are performed through immersive environments.[39][40] Virtual immersive platforms engage future occupants in the building design process by providing a sense of presence to users, integrating pre-construction mock-ups and BIM models for the evaluation of alternative design options in a timely and cost-efficient manner.[41] Studies conducting human experiments have shown that users perform similarly in daily office activities (object identification, reading speed and comprehension) within immersive virtual environments and benchmarked physical environments.[39] In the field of lighting, virtual reality headsets have been used to investigate the influence of façade patterns on the perceptual impressions and satisfaction of a simulated daylit space.[42] Moreover, artificial-lighting studies have used immersive virtual environments to evaluate end-users' lighting preferences in simulated virtual scenes by letting them control the blinds and artificial lights in the virtual environment.[40]

For structural engineering and analysis, immersive environments enable the user to focus on structural investigations without being distracted by operating and navigating the simulation tool.[43] Virtual and augmented reality applications have been designed for the finite element analysis of shell structures. Using a stylus and data gloves as input devices, the user can create and modify the mesh and specify boundary conditions. For simple geometry, real-time color-coded results are obtained by changing the loads on the model.[44] Studies have used artificial neural networks (ANN) or approximation methods to achieve real-time interaction for complex geometry, and to simulate its impact via haptic gloves.[45] Large-scale structures and bridge simulations have also been achieved in immersive virtual environments: the user can move the loads acting on the bridge, and the finite element analysis results are updated immediately by an approximation module.[46]
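As a rough sketch of the surrogate-model idea described above (this is not the method of the cited studies; the beam geometry, training data and network settings are invented for illustration), a small neural network can be trained offline on precomputed finite element results and then queried at interactive rates while the user moves loads in the immersive scene:

    # Python sketch: ANN surrogate for real-time structural response.
    # The training targets stand in for precomputed finite element results;
    # here they are simple analytic functions of load position and magnitude.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)

    # Inputs: load position along a 10 m beam and load magnitude in kN
    X = rng.uniform([0.0, 1.0], [10.0, 100.0], size=(500, 2))
    pos, mag = X[:, 0], X[:, 1]

    # Outputs: displacements at three monitored nodes (stand-in FE results)
    Y = np.column_stack([
        mag * pos * (10 - pos) / 500.0,   # mid-span-like response
        mag * pos / 1000.0,               # node near the far support
        mag * (10 - pos) / 1000.0,        # node near the near support
    ])

    # Offline training of the surrogate, replacing the expensive FE solve
    surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000,
                             random_state=0).fit(X, Y)

    # At run time, as the user drags the load in the immersive environment,
    # the surrogate returns updated displacements essentially instantly.
    new_load = np.array([[4.2, 75.0]])    # position 4.2 m, magnitude 75 kN
    print(surrogate.predict(new_load))

In the cited work the training targets would come from a finite element solver rather than the analytic stand-ins used here; the fast prediction step is what allows the visualized results to update immediately as the user moves the loads.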

Detrimental effects

A panoramic device intended to reduce the effect of seasickness

Simulation sickness, or simulator sickness, is a condition in which a person exhibits symptoms similar to motion sickness caused by playing computer, simulation, or video games (Oculus VR is working to solve simulator sickness in the Rift).[47]

Motion sickness due to virtual reality is very similar to simulation sickness and motion sickness due to films. In virtual reality, however, the effect is more acute because all external reference points are blocked from vision, the simulated images are three-dimensional, and in some cases stereo sound may also give a sense of motion. Studies have shown that exposure to rotational motion in a virtual environment can cause significant increases in nausea and other symptoms of motion sickness.[48]

Other behavioural changes, such as stress, addiction, isolation and mood changes, have also been discussed as possible side effects of immersive virtual reality.[49]

Footnotes

  1. Adams, Ernest (July 9, 2004). "Postmodernism and the Three Types of Immersion". Gamasutra. Archived from the original on October 24, 2007. Retrieved 2007-12-26.
  2. Björk, Staffan; Jussi Holopainen (2004). Patterns In Game Design. Charles River Media. p. 206. ISBN 978-1-58450-354-5.
  3. "10.000 Moving Cities - Same but Different, interactive net-and-telepresence-based installation 2015". Marc Lee. Archived from the original on 2018-08-15. Retrieved 2017-03-12.
  4. Barfield, Woodrow; Zeltzer, David; Sheridan, Thomas; Slater, Mel (1995). "Presence and Performance Within Virtual Environments". In Barfield, Woodrow; Furness, III, Thomas A. (eds.). Virtual Environments and Advanced Interface Design. Oxford University Press. p. 473. ISBN 978-0195075557.
  5. Thornson, Carol; Goldiez, Brian (January 2009). "Predicting presence: Constructing the Tendency toward Presence Inventory". International Journal of Human Computer Studies. 67 (1): 62–78. doi:10.1016/j.ijhcs.2008.08.006.
  6. Seth Rosenblatt (19 March 2014). "Oculus Rift Dev Kit 2 now on sale for $350". CNET. CBS Interactive. Archived from the original on 28 March 2014.
  7. "Oculus Rift DK2 hands-on and first-impressions". SlashGear. 19 March 2014.
  8. "Announcing the Oculus Rift Development Kit 2 (DK2)". oculusvr.com. Archived from the original on 13 September 2014. Retrieved 3 May 2018.
  9. Abrash M. (2014). What VR could, should, and almost certainly will be within two years Archived 2014-03-20 at the Wayback Machine
  10. Joseph Nechvatal, Immersive Ideals / Critical Distances. LAP Lambert Academic Publishing. 2009, pp. 367-368
  11. Joseph Nechvatal, Immersive Ideals / Critical Distances. LAP Lambert Academic Publishing. 2009, pp. 48-60
  12. pulseworks.com Archived 2009-05-05 at the Wayback Machine
  13. "Thank You".
  14. Freeman, D.; Antley, A.; Ehlers, A.; Dunn, G.; Thompson, C.; Vorontsova, N.; Garety, P.; Kuipers, E.; Glucksman, E.; Slater, M. (2014). "The use of immersive virtual reality (VR) to predict the occurrence 6 months later of paranoid thinking and posttraumatic stress symptoms assessed by self-report and interviewer methods: A study of individuals who have been physically assaulted". Psychological Assessment. 26 (3): 841–847. doi:10.1037/a0036240. PMC 4151801. PMID 24708073.
  15. http://www.life-slc.org/docs/Bailenson_etal-immersiveVR.pdf
  16. Freeman, D. (2007). "Studying and Treating Schizophrenia Using Virtual Reality: A New Paradigm". Schizophrenia Bulletin. 34 (4): 605–610. doi:10.1093/schbul/sbn020. PMC 2486455. PMID 18375568.
  17. Virtual Reality in Neuro-Psycho-Physiology, p. 36, at Google Books
  18. De Los Reyes-Guzman, A.; Dimbwadyo-Terrer, I.; Trincado-Alonso, F.; Aznar, M. A.; Alcubilla, C.; Pérez-Nombela, S.; Del Ama-Espinosa, A.; Polonio-López, B. A.; Gil-Agudo, Á. (2014). "A Data-Globe and Immersive Virtual Reality Environment for Upper Limb Rehabilitation after Spinal Cord Injury". XIII Mediterranean Conference on Medical and Biological Engineering and Computing 2013. IFMBE Proceedings. 41. p. 1759. doi:10.1007/978-3-319-00846-2_434. ISBN 978-3-319-00845-5.
  19. Llobera, J.; González-Franco, M.; Perez-Marcos, D.; Valls-Solé, J.; Slater, M.; Sanchez-Vives, M. V. (2012). "Virtual reality for assessment of patients suffering chronic pain: A case study". Experimental Brain Research. 225 (1): 105–117. doi:10.1007/s00221-012-3352-9. PMID 23223781.
  20. Kuliga, S.F.; Thrash, T.; Dalton, R.C.; Hölscher, C. (2015). "Virtual reality as an empirical research tool — Exploring user experience in a real building and a corresponding virtual model". Computers, Environment and Urban Systems. 54: 363–375. doi:10.1016/j.compenvurbsys.2015.09.006.
  21. Kamat Vineet R.; Martinez Julio C. (2001-10-01). "Visualizing Simulated Construction Operations in 3D". Journal of Computing in Civil Engineering. 15 (4): 329–337. doi:10.1061/(asce)0887-3801(2001)15:4(329).
  22. Malkawi, Ali M.; Srinivasan, Ravi S. (2005). "A new paradigm for Human-Building Interaction: the use of CFD and Augmented Reality". Automation in Construction. 14 (1): 71–84. doi:10.1016/j.autcon.2004.08.001.
  23. "Revit Live | Immersive Architectural Visualization | Autodesk". Archived from the original on 2017-11-09. Retrieved 2017-11-09.
  24. "IrisVR - Virtual Reality for Architecture, Engineering, and Construction". irisvr.com. Retrieved 2017-11-09.
  25. Frost, P.; Warren, P. (2000). Virtual reality used in a collaborative architectural design process. 2000 IEEE Conference on Information Visualization. An International Conference on Computer Visualization and Graphics. pp. 568–573. doi:10.1109/iv.2000.859814. ISBN 978-0-7695-0743-9.
  26. Santos, Beatriz Sousa; Dias, Paulo; Pimentel, Angela; Baggerman, Jan-Willem; Ferreira, Carlos; Silva, Samuel; Madeira, Joaquim (2009-01-01). "Head-mounted display versus desktop for 3D navigation in virtual reality: a user study". Multimedia Tools and Applications. 41 (1): 161. CiteSeerX 10.1.1.469.4984. doi:10.1007/s11042-008-0223-2. ISSN 1380-7501.
  27. Ruddle, Roy A.; Payne, Stephen J.; Jones, Dylan M. (1999-04-01). "Navigating Large-Scale Virtual Environments: What Differences Occur Between Helmet-Mounted and Desk-Top Displays?" (PDF). Presence: Teleoperators and Virtual Environments. 8 (2): 157–168. doi:10.1162/105474699566143. ISSN 1054-7460.
  28. Robertson, George; Czerwinski, Mary; van Dantzich, Maarten (1997). Immersion in Desktop Virtual Reality. Proceedings of the 10th Annual ACM Symposium on User Interface Software and Technology. UIST '97. New York, NY, USA: ACM. pp. 11–19. CiteSeerX 10.1.1.125.175. doi:10.1145/263407.263409. ISBN 978-0897918817.
  29. Ruddle, Roy A; Péruch, Patrick (2004-03-01). "Effects of proprioceptive feedback and environmental characteristics on spatial learning in virtual environments". International Journal of Human-Computer Studies. 60 (3): 299–326. CiteSeerX 10.1.1.294.6442. doi:10.1016/j.ijhcs.2003.10.001.
  30. "vSpline". www.vspline.com. Archived from the original on 2017-09-19. Retrieved 2017-11-09.
  31. "VR - Gravity Sketch". Gravity Sketch. Archived from the original on 2017-01-15. Retrieved 2017-11-09.
  32. "VR Productivity for AEC". www.kalloctech.com. Archived from the original on 2017-11-09. Retrieved 2017-11-09.
  33. Colombo, Simone; Manca, Davide; Brambilla, Sara; Totaro, Roberto; Galvagni, Remo (2011-01-01). "Towards the Automatic Measurement of Human Performance in Virtual Environments for Industrial Safety". ASME 2011 World Conference on Innovative Virtual Reality. pp. 67–76. doi:10.1115/winvr2011-5564. ISBN 978-0-7918-4432-8.
  34. "DAQRI - Smart Helmet®". daqri.com. Archived from the original on 2017-11-09. Retrieved 2017-11-09.
  35. Messner, John I. (2006). "Evaluating the Use of Immersive Display Media for Construction Planning". Intelligent Computing in Engineering and Architecture. Lecture Notes in Computer Science. 4200. Springer, Berlin, Heidelberg. pp. 484–491. doi:10.1007/11888598_43. ISBN 9783540462460.
  36. Waller, David; Hunt, Earl; Knapp, David (1998-04-01). "The Transfer of Spatial Knowledge in Virtual Environment Training". Presence: Teleoperators and Virtual Environments. 7 (2): 129–143. CiteSeerX 10.1.1.39.6307. doi:10.1162/105474698565631. ISSN 1054-7460.
  37. Whisker, V.; Baratta, A.; Yerrapathruni, S.; Messner, J.; Shaw, T.; Warren, M.; Rotthoff, E.; Winters, J.; Clelland, J.; Johnson, F. (2003). "Using immersive virtual environments to develop and visualize construction schedules for advanced nuclear power plants". Proceedings of ICAPP. 3: 4–7. CiteSeerX 10.1.1.456.7914.
  38. Colombo, Simone; Nazir, Salman; Manca, Davide (2014-10-01). "Immersive Virtual Reality for Training and Decision Making: Preliminary Results of Experiments Performed With a Plant Simulator". SPE Economics & Management. 6 (4): 165–172. doi:10.2118/164993-pa. ISSN 2150-1173.
  39. Heydarian, Arsalan; Carneiro, Joao P.; Gerber, David; Becerik-Gerber, Burcin; Hayes, Timothy; Wood, Wendy (2015). "Immersive virtual environments versus physical built environments: A benchmarking study for building design and user-built environment explorations". Automation in Construction. 54: 116–126. doi:10.1016/j.autcon.2015.03.020.
  40. Heydarian, Arsalan; Carneiro, Joao P.; Gerber, David; Becerik-Gerber, Burcin (2015). "Immersive virtual environments, understanding the impact of design features and occupant choice upon lighting for building performance". Building and Environment. 89: 217–228. doi:10.1016/j.buildenv.2015.02.038.
  41. Mahdjoub, Morad; Monticolo, Davy; Gomes, Samuel; Sagot, Jean-Claude (2010). "A collaborative Design for Usability approach supported by Virtual Reality and a Multi-Agent System embedded in a PLM environment". Computer-Aided Design. 42 (5): 402–413. doi:10.1016/j.cad.2009.02.009.
  42. Chamilothori, Kynthia; Wienold, Jan; Andersen, Marilyne (2016). "Daylight patterns as a means to influence the spatial ambiance: a preliminary study". Proceedings of the 3rd International Congress on Ambiances.
  43. Huang, J.M.; Ong, S.K.; Nee, A.Y.C. (2017). "Visualization and interaction of finite element analysis in augmented reality". Computer-Aided Design. 84: 1–14. doi:10.1016/j.cad.2016.10.004.
  44. Liverani, A.; Kuester, F.; Hamann, B. (1999). Towards interactive finite element analysis of shell structures in virtual reality. 1999 IEEE International Conference on Information Visualization (Cat. No. PR00210). pp. 340–346. doi:10.1109/iv.1999.781580. ISBN 978-0-7695-0210-6.
  45. Hambli, Ridha; Chamekh, Abdessalam; Salah, Hédi Bel Hadj (2006). "Real-time deformation of structure using finite element and neural networks in virtual reality applications". Finite Elements in Analysis and Design. 42 (11): 985–991. doi:10.1016/j.finel.2006.03.008.
  46. Connell, Mike; Tullberg, Odd (2002). "A framework for immersive FEM visualisation using transparent object communication in a distributed network environment". Advances in Engineering Software. 33 (7–10): 453–459. doi:10.1016/s0965-9978(02)00063-7.
  47. "Oculus Rift is working to solve simulator sickness". Polygon. 19 August 2013. Archived from the original on 2015-09-24. Retrieved 2015-05-05.
  48. So, R.H.Y. and Lo, W.T. (1999) "Cybersickness: An Experimental Study to Isolate the Effects of Rotational Scene Oscillations." Proceedings of IEEE Virtual Reality '99 Conference, March 13–17, 1999, Houston, Texas. Published by IEEE Computer Society, pp. 237–241
  49. "Archived copy" (PDF). Archived (PDF) from the original on 2014-12-18. Retrieved 2014-11-25.CS1 maint: archived copy as title (link)
