“The first risk is that people are already using the fact deepfakes exist to discredit genuine video evidence. Even though there’s footage of you doing or saying something, you can say it was a deepfake and it’s very hard to prove otherwise.” – Hao Li, USC professor
Source: BBC News
About Hao Li:
Hao Li is a computer scientist, innovator, and entrepreneur from Germany working in the fields of computer graphics and computer vision. He is the founder and CEO of Pinscreen, Inc., and a Distinguished Fellow at the University of California, Berkeley.
He was previously an associate professor of computer science at the University of Southern California and director of the Vision and Graphics Lab at the USC Institute for Creative Technologies. He has also been a visiting professor at Weta Digital and a research lead at Industrial Light & Magic / Lucasfilm.
For his work in non-rigid shape registration, human digitization, and real-time facial performance capture, Li received the TR35 Award from the MIT Technology Review in 2013. He was named Andrew and Erna Viterbi Early Career Chair in 2015, the same year he received the Google Faculty Research Award and the Okawa Foundation Research Grant. Li won an Office of Naval Research Young Investigator Award in 2018 and was named to the DARPA ISAT Study Group in 2019. He is a member of the World Economic Forum’s Global Future Council on Virtual and Augmented Reality.
He has worked on dynamic geometry processing and data-driven techniques for 3D human digitization and facial animation. During his PhD, Li co-created a real-time, markerless system for performance-driven facial animation based on depth sensors, which won the best paper award at the ACM SIGGRAPH / Eurographics Symposium on Computer Animation in 2009. The team later commercialized a variant of this technology as the facial animation software Faceshift (acquired by Apple Inc. in 2015 and incorporated into the iPhone X in 2017). The underlying deformable shape registration technique is also used by the company C-Rad AB and deployed in hospitals to track tumors in real time during radiation therapy. In 2013, he worked on a home scanning system that uses a Kinect to capture people as game characters or realistic miniature figures; this technology was licensed by Artec and released as the free software Shapify.me. In 2014, he was brought on as a visiting professor at Weta Digital to build the high-fidelity facial performance capture pipeline used to reenact the deceased actor Paul Walker in the film Furious 7 (2015).