Prediction #2: Adversaries to Generate Deepfakes to Bypass Facial Recognition

By Steve Povolny, head of McAfee Advanced Threat Research


Computer-based facial recognition, in its earliest forms, has been around since the mid-1960s. While dramatic changes have since taken place, the underlying concept remains: it provides a means for a computer to identify or verify a face. There are many use cases for the technology, most related to authentication and to answering a single question: is this person who they claim to be?
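To make the verification question concrete, the minimal sketch below shows the common approach of comparing face embeddings against a distance threshold. The embedding values, the threshold, and the verify function are illustrative assumptions, not the workings of any particular product.

import numpy as np

# Hypothetical embeddings produced by a face-recognition model for two images.
# In a real system these would come from a trained network; here they are
# stand-in vectors for illustration only.
enrolled_embedding = np.array([0.12, 0.88, 0.35, 0.61])   # stored at enrollment
candidate_embedding = np.array([0.10, 0.90, 0.33, 0.59])  # computed at login

def verify(enrolled: np.ndarray, candidate: np.ndarray, threshold: float = 0.3) -> bool:
    """Accept the claimed identity if the embeddings are close enough."""
    distance = np.linalg.norm(enrolled - candidate)  # Euclidean distance
    return distance < threshold

print(verify(enrolled_embedding, candidate_embedding))  # True -> identity accepted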

Over time, advances in processing power, memory, and storage have expanded what facial recognition technology can do. New products have leveraged facial recognition in innovative ways to simplify everyday life, from unlocking smartphones, to verifying passport IDs in airports, and even serving as a law enforcement aid to identify criminals on the street.

One of the most significant enhancements to facial recognition is the advancement of artificial intelligence (AI). A recent manifestation of this is deepfakes, an AI-driven technique that produces extremely realistic text, images, and videos that humans struggle to distinguish from the real thing. Used primarily to spread misinformation, deepfakes rely on Generative Adversarial Networks (GANs), a recent analytic technology that, on the downside, can create fake but incredibly realistic images, text, and videos. Modern computers can rapidly process numerous biometrics of a face and mathematically build or classify human features, among many other applications. While the technical benefits are impressive, underlying flaws inherent in all types of models represent a rapidly growing threat, which cyber criminals will look to exploit.
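To illustrate the GAN concept behind deepfakes, the sketch below shows the core adversarial training loop: a generator learns to produce synthetic images while a discriminator learns to tell them apart from real ones. It uses PyTorch, and the layer sizes, dimensions, and training details are illustrative assumptions rather than a real deepfake pipeline.

import torch
import torch.nn as nn

# A minimal, illustrative GAN skeleton. Dimensions and architecture are
# assumptions for the sketch, not a production deepfake model.
LATENT_DIM, IMG_DIM = 64, 28 * 28  # e.g. flattened 28x28 grayscale images

# Generator: maps random noise to a synthetic image.
generator = nn.Sequential(
    nn.Linear(LATENT_DIM, 256),
    nn.ReLU(),
    nn.Linear(256, IMG_DIM),
    nn.Tanh(),
)

# Discriminator: estimates the probability that an image is real.
discriminator = nn.Sequential(
    nn.Linear(IMG_DIM, 256),
    nn.LeakyReLU(0.2),
    nn.Linear(256, 1),
    nn.Sigmoid(),
)

loss_fn = nn.BCELoss()
g_opt = torch.optim.Adam(generator.parameters(), lr=2e-4)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=2e-4)

def train_step(real_images: torch.Tensor) -> None:
    batch = real_images.size(0)
    real_labels = torch.ones(batch, 1)
    fake_labels = torch.zeros(batch, 1)

    # 1) Train the discriminator to separate real images from generated ones.
    noise = torch.randn(batch, LATENT_DIM)
    fake_images = generator(noise).detach()
    d_loss = loss_fn(discriminator(real_images), real_labels) + \
             loss_fn(discriminator(fake_images), fake_labels)
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # 2) Train the generator to fool the discriminator.
    noise = torch.randn(batch, LATENT_DIM)
    g_loss = loss_fn(discriminator(generator(noise)), real_labels)
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()

# Example: one step on a batch of random stand-in "real" images.
train_step(torch.rand(16, IMG_DIM) * 2 - 1)  # values scaled to [-1, 1]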

As these technologies are adopted over the coming years, a very viable threat vector will emerge: we predict adversaries will begin to generate deepfakes to bypass facial recognition. It will be critical for businesses to understand the security risks presented by facial recognition and other biometric systems, to invest in educating themselves on those risks, and to harden critical systems.
