Hao Li

Native name: 黎顥
Born: January 17, 1981, Saarbrücken, West Germany
Residence: United States
Citizenship: German
Alma mater: ETH Zurich (PhD, 2010); Karlsruhe Institute of Technology (Diplom, 2006)
Known for: Human digitization, facial performance capture
Awards: TR35 Award
Scientific career
Fields: Computer graphics, computer vision
Institutions: Pinscreen (founder/CEO); University of Southern California (assistant professor); USC Institute for Creative Technologies (director, Vision and Graphics Lab)
Thesis: Animation Reconstruction of Deformable Surfaces (2010)
Doctoral advisor: Mark Pauly
Website: www.hao-li.com

Hao Li (Chinese: 黎顥; pinyin: Lí Hào; born January 17, 1981, in Saarbrücken, West Germany) is a German computer scientist, innovator, and entrepreneur working in the fields of computer graphics and computer vision. He is the founder and CEO of Pinscreen, Inc., an assistant professor of Computer Science[1] at the University of Southern California, and director of the Vision and Graphics Lab at the USC Institute for Creative Technologies.[2] He was previously a visiting professor at Weta Digital and a research lead at Industrial Light & Magic / Lucasfilm.

For his contributions to non-rigid shape registration, human digitization, and real-time facial performance capture, Li received the TR35 Award from the MIT Technology Review in 2013, recognizing him as one of the world's top 35 innovators under the age of 35.[3] He was named Andrew and Erna Viterbi Early Career Chair in 2015 and received the Google Faculty Research Award and the Okawa Foundation Research Grant the same year. He won an Office of Naval Research Young Investigator Award in 2018.[4]

Biography

Li was born in 1981 in Saarbrücken, Germany (then West Germany), to Taiwanese immigrant parents. He attended a French-German high school in Saarbrücken and speaks English, German, French, and Mandarin Chinese fluently. He obtained his Diplom (equivalent to an M.Sc.) in Computer Science from the Karlsruhe Institute of Technology (then the University of Karlsruhe (TH)) in 2006 and his PhD in Computer Science from ETH Zurich in 2010. He was a visiting researcher at ENSIMAG in 2003, the National University of Singapore in 2006, Stanford University in 2008, and EPFL in 2010, and a postdoctoral fellow at Columbia University and Princeton University between 2011 and 2012.

Career

Li joined Industrial Light & Magic / Lucasfilm in 2012 as a research lead to develop next-generation real-time performance capture technologies for virtual production and visual effects. In 2013, he became a tenure-track assistant professor of Computer Science[1] at the University of Southern California. In 2014, he spent a summer as a visiting professor at Weta Digital, advancing the facial tracking and hair digitization technologies used for the visual effects of Furious 7 and The Hobbit: The Battle of the Five Armies. In 2015, he founded Pinscreen, Inc., an augmented reality and social media startup based on his research in 3D human capture. In 2016, he was appointed director of the Vision and Graphics Lab at the USC Institute for Creative Technologies.

Research

He is best known for his work on dynamic geometry processing and data-driven techniques that make 3D human digitization and facial animation accessible to the masses. During his PhD, Li co-created the first real-time, markerless system for performance-driven facial animation based on depth sensors, which won the best paper award at the ACM SIGGRAPH / Eurographics Symposium on Computer Animation in 2009.[5] The team later commercialized a variant of this technology as the facial animation software Faceshift[6] (acquired by Apple Inc. in 2015 and incorporated into the iPhone X in 2017[7][8]). His deformable shape registration technique is used by the company C-Rad AB and is widely deployed in hospitals to track tumors in real time during radiation therapy. In 2013, he introduced a home scanning system that uses a Kinect to turn people into game characters or realistic miniatures.[9] This technology was licensed by Artec and released as the free service Shapify.me. In 2014, he was brought on as a visiting professor at Weta Digital to build the high-fidelity facial performance capture pipeline used to digitally recreate the late actor Paul Walker[10] in the movie Furious 7 (2015).

His recent research focuses on combining deep learning and computer graphics techniques to facilitate the creation of 3D avatars and to enable truly immersive face-to-face communication and telepresence in virtual reality.[11] In collaboration with Oculus / Facebook, he developed the first facial-performance-sensing head-mounted display in 2015,[12] which lets users transfer their facial expressions onto their digital avatars while immersed in a virtual environment. In the same year, he founded Pinscreen, Inc.[13] in Santa Monica, which introduced a technology that generates realistic 3D avatars of a person, including hair, from a single photograph.[14] The company's innovations include deep neural networks that can infer photorealistic faces[15] and expressions.[16]

Awards

  • Office of Naval Research Young Investigator Award (2018).[4]
  • Andrew and Erna Viterbi Early Career Chair (2015).[17]
  • Okawa Foundation Research Grant (2015).[18]
  • Google Faculty Research Award (2015).[19]
  • Named one of the world's top 35 innovators under 35 (TR35) by the MIT Technology Review (2013).[3]
  • Best Paper Award at the ACM SIGGRAPH / Eurographics Symposium on Computer Animation (2009).

Miscellaneous

For his technological contributions in visual effects, Li has been credited in major motion pictures, including Blade Runner 2049 (2017), Valerian and the City of a Thousand Planets (2017), Furious 7 (2015), The Hobbit: The Battle of the Five Armies (2014), and Noah (2014).

References

  1. Faculty roster, USC Computer Science Department. Retrieved 2015-03-03.
  2. "Hao Li to Spearhead Vision and Graphics Lab at the USC Institute for Creative Technologies". viterbi.usc.edu. USC.
  3. TR35 Awards, MIT Technology Review. Retrieved 2015-03-03.
  4. "Hao Li Earns Office of Naval Research Young Investigator Award". USC Viterbi School of Engineering. Retrieved 2018-03-25.
  5. "Face/Off: live facial puppetry". ACM.
  6. "Performance driven facial animation". www.fxguide.com. fxguide.
  7. "All of Apple's Face-Tracking Tech Behind the iPhone X's Animoji". WIRED. Retrieved 2017-10-25.
  8. "Professor's research contributed to iPhone X | Daily Trojan". Daily Trojan. 2017-09-27. Retrieved 2017-10-25.
  9. "Hao Li wants to scan you into your favourite games". wired.co.uk. Wired.
  10. "How I Made It: USC professor brings computer animation to life". www.latimes.com. LA Times.
  11. "Who wants to show up as Gandalf at their next meeting?". news.usc.edu. USC.
  12. "Oculus Rift Hack Transfers Your Facial Expressions onto Your Avatar". technologyreview.com. MIT Technology Review.
  13. "Stealth Face-Tracking Startup Pinscreen Raises $1.8 Million". uploadvr.com. UploadVR.
  14. "Pinscreen launches with high-tech distractions for a nerve-wracking election". techcrunch.com. TechCrunch.
  15. "Photorealistic facial texture from a single still". fxguide.com. fxguide.
  16. Pierson, David. "Fake videos are on the rise. As they become more realistic, seeing shouldn't always be believing". latimes.com. Retrieved 2018-03-25.
  17. "Endowed chairs and professorships at USC Viterbi School of Engineering". USC.
  18. "USC Professors Earn International Award". USC.
  19. "Google Faculty Research Award 2015" (PDF).

