Foveated rendering

Foveated rendering is a rendering technique that uses an eye tracker integrated with a virtual reality headset to reduce the rendering workload by greatly reducing the image quality in the viewer's peripheral vision (outside the zone gazed upon by the fovea).[1][2]

A less sophisticated variant called fixed foveated rendering does not utilise eye tracking and instead assumes a fixed focal point.[3][4]
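In practice foveated rendering is implemented on the GPU, for example by shading the periphery at a lower rate or rendering it into lower-resolution buffers. The short Python sketch below illustrates only the underlying idea of a quality falloff around a gaze point; the function name, radii and falloff values are illustrative assumptions rather than parameters of any particular headset or SDK.

    import math

    def quality_for_pixel(px, py, gaze_x, gaze_y, inner_radius=0.15, outer_radius=0.40):
        """Return a shading-quality factor in [0, 1] for one pixel.

        Coordinates are normalised to [0, 1]. Inside inner_radius the image is
        rendered at full quality; beyond outer_radius it falls to the lowest
        tier; in between, quality drops off linearly. All numbers here are
        illustrative assumptions, not values from any headset or SDK.
        """
        d = math.hypot(px - gaze_x, py - gaze_y)
        if d <= inner_radius:
            return 1.0   # foveal region: full resolution / shading rate
        if d >= outer_radius:
            return 0.25  # far periphery: heavily reduced quality
        t = (d - inner_radius) / (outer_radius - inner_radius)
        return 1.0 - 0.75 * t  # transition band between fovea and periphery

    # Eye-tracked foveated rendering updates (gaze_x, gaze_y) every frame from the
    # eye tracker; fixed foveated rendering simply keeps them at the lens centre.
    print(quality_for_pixel(0.50, 0.50, gaze_x=0.5, gaze_y=0.5))  # 1.0  (fovea)
    print(quality_for_pixel(0.95, 0.90, gaze_x=0.5, gaze_y=0.5))  # 0.25 (periphery)

A real renderer would map such a factor to a coarser shading rate, a lower-resolution viewport, or a reduced level of detail for the affected pixels.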

History

At TechCrunch Disrupt SF 2014, Fove unveiled a headset featuring foveated rendering.[5] This was followed by a successful Kickstarter campaign in May 2015.[6]

At CES 2016, SensoMotoric Instruments (SMI) demoed a new 250 Hz eye tracking system and a working foveated rendering solution. The system resulted from a partnership with camera sensor manufacturer OmniVision, which provided the camera hardware.[7][8]

At SIGGRAPH in July 2016, Nvidia demonstrated a new method of foveated rendering that it claimed was invisible to users.[1][9]

In February 2017, Qualcomm announced its Snapdragon 835 Virtual Reality Development Kit (VRDK), which included foveated rendering support called Adreno Foveation.

During CES 2019, on January 7, HTC announced an upcoming virtual reality headset called the Vive Pro Eye, featuring eye tracking and support for foveated rendering.[10][11]

In December 2019, Facebook's Oculus Quest SDK gave developers access to dynamic fixed foveated rendering, allowing the level of foveation to be adjusted on the fly via an API.[12]

Use

According to Michael Abrash, chief scientist at Oculus, utilising foveated rendering in conjunction with sparse rendering and deep learning image reconstruction has the potential to require an order of magnitude fewer pixels to be rendered compared to a full image.[13] These results were later demonstrated and published.[14]
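As a rough, hypothetical illustration of where such a saving could come from (the numbers below are assumed for the example and are not taken from the cited work), shading only a dense foveal window plus a sparse sample of the periphery can cut the number of shaded pixels by roughly an order of magnitude:

    # Back-of-envelope sketch with assumed numbers (not figures from the cited work):
    # shade the foveal window densely and only a sparse subset of peripheral pixels,
    # then reconstruct the periphery (e.g. with a neural network) from those samples.
    width, height = 3840, 2160        # hypothetical per-eye render target
    total_pixels = width * height

    fovea_w, fovea_h = 800, 800       # hypothetical densely shaded foveal window
    peripheral_sample_rate = 0.02     # hypothetical sparse sampling of the rest

    shaded = fovea_w * fovea_h + (total_pixels - fovea_w * fovea_h) * peripheral_sample_rate
    print(f"full image: {total_pixels:,} shaded pixels")
    print(f"foveated:   {shaded:,.0f} shaded samples ({total_pixels / shaded:.1f}x fewer)")

With these assumed values the foveated path shades about 0.8 million samples instead of roughly 8.3 million pixels, around a tenfold reduction; the actual saving depends on the fovea size, the sampling rate, and the quality of the reconstruction.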

References

  1. Parrish, Kevin (2016-07-22). "Nvidia plans to prove that new method improves image quality in virtual reality". Digital Trends. Retrieved 2017-02-02.
  2. "Understanding Foveated Rendering". Sensics. 2016-04-11. Retrieved 2017-02-04.
  3. Carbotte, Kevin (2018-03-30). "What Is Fixed Foveated Rendering, And Why Does It Matter?". Tom's Hardware. Retrieved 2019-09-09.
  4. Orland, Kyle (2016-03-21). "How Valve got passable VR running on a four-year-old graphics card". Ars Technica. Retrieved 2019-09-09.
  5. "FOVE Uses Eye Tracking To Make Virtual Reality More Immersive". TechCrunch. Retrieved 2019-02-06.
  6. "FOVE: The World's First Eye Tracking Virtual Reality Headset". Kickstarter. Retrieved 2019-02-06.
  7. Mason, Will (2016-01-15). "SMI's 250Hz Eye Tracking and Foveated Rendering Are For Real, and the Cost May Surprise You". UploadVR. Retrieved 2017-02-02.
  8. Mason, Will (2016-01-15). "SMI's 250Hz Eye Tracking and Foveated Rendering Are For Real, and the Cost May Surprise You". UploadVR. Retrieved 2020-06-18.
  9. "NVIDIA Partners with SMI on Innovative Rendering Technique That Improves VR". Nvidia. 2016-01-21. Retrieved 2017-02-02.
  10. Statt, Nick (2019-01-07). "HTC announces new Vive Pro Eye virtual reality headset with native eye tracking". The Verge. Retrieved 2019-01-14.
  11. "Kernel Foveated Rendering". 2018-07-01. Retrieved 2018-07-01.
  12. "Oculus Quest gets dynamic fixed foveated rendering". VentureBeat. 2019-12-22. Retrieved 2020-01-21.
  13. Oculus (2018-09-26). "Oculus Connect 5 | Keynote Day 01". Retrieved 2018-09-30.
  14. Kaplanyan, Anton (2020-05-15). "DeepFovea: AR/VR rendering, inspired by human vision". Retrieved 2020-05-15.