A decade after the introduction of the trailblazing iPhone, Apple has once again redefined the smartphone genre with the introduction of the iPhone X, whose Face ID system puts facial recognition technology at the center of the user authentication process. Powering this facial recognition technology is a new Apple-designed AI chip called the A11 Bionic.
Apple’s Face ID system lets a user unlock the iPhone X simply by looking at the phone and swiping up on the display, the company announced at its Sept. 12 product-introduction event. This eliminates the need for a home button and fingerprint sensor, contributing to the high-end iPhone X’s clean and elegant design.
The A11 Bionic chip can accurately map and recognize a face by working with the iPhone X’s advanced TrueDepth camera system. TrueDepth employs a dot projector, an infrared (IR) camera and a flood illuminator to make precise face measurements.
The IR image and dot pattern are processed by neural networks on the A11 to generate a mathematical model of the user’s face. The system can also accommodate changes in appearance that occur over time.
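The matching idea behind such a system can be sketched in a few lines. The code below is purely illustrative and is not Apple’s implementation: it assumes a hypothetical `embed` function standing in for the trained neural network, and compares an enrolled face template to a later scan by cosine similarity, which tolerates small changes in appearance.

```python
import math
import random

def embed(depth_map):
    """Stand-in for a neural network that maps raw depth measurements
    to a fixed-length face "embedding". A real system would use a
    trained model; here we simply normalize the values to unit length
    for illustration."""
    norm = math.sqrt(sum(x * x for x in depth_map)) or 1.0
    return [x / norm for x in depth_map]

def matches(enrolled, probe, threshold=0.95):
    """Accept the unlock attempt if the cosine similarity between the
    enrolled template and the probe embedding exceeds the threshold."""
    return sum(a * b for a, b in zip(enrolled, probe)) >= threshold

random.seed(0)
face = [random.random() for _ in range(16)]       # enrolled depth scan
template = embed(face)

# A later unlock attempt: the same face with slight changes
# (stubble, glasses, lighting) modeled as small noise.
probe = embed([x + random.gauss(0, 0.01) for x in face])
print(matches(template, probe))  # True: small changes still match
```

A real pipeline would of course use a learned embedding and a carefully tuned decision threshold, but the enroll-embed-compare structure is the same.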
Beyond unlocking the iPhone X, Face ID can be used to enable Apple Pay and gain access to secure apps, according to Apple.
The AI chip also enhances other aspects of the user interface besides authentication.
For example, Apple demonstrated a system it calls Animoji, which combines emojis with facial recognition. Using the TrueDepth camera, Animoji can detect more than 50 different facial muscle movements. It then animates those expressions on different Animoji characters, such as a panda or a robot. This allows users to send messages illustrated with Animojis that are smiling, frowning or displaying other expressions.
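Conceptually, driving a character this way means mapping tracked facial-muscle coefficients onto character expressions. The sketch below is a toy illustration under assumed names: the blend-shape keys (`mouthSmile`, `browDown`, `jawOpen`) and the rule table are hypothetical, not Apple’s API.

```python
def pick_expression(blend_shapes):
    """Choose the character expression whose driving facial-muscle
    coefficient ("blend shape", in [0, 1]) is strongest. The names
    and mapping here are illustrative only."""
    rules = {
        "mouthSmile": "smiling",
        "browDown": "frowning",
        "jawOpen": "surprised",
    }
    strongest = max(rules, key=lambda k: blend_shapes.get(k, 0.0))
    return rules[strongest]

# A frame where the camera sees mostly a smile:
print(pick_expression({"mouthSmile": 0.8, "browDown": 0.1}))  # smiling
```

A production system would animate a 3D rig continuously from dozens of such coefficients per frame rather than picking a single discrete expression, but the tracking-to-animation mapping is the core idea.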
Furthermore, the A11 Bionic also performs scene recognition, allowing the iPhone X to support augmented reality.
Apple said the A11 is based on a six-core central processing unit (CPU) design. The A11’s two performance cores are 25 percent faster, and its four efficiency cores 70 percent faster, than those of the previous-generation chip, the A10 Fusion.
The new A11 chip also integrates an Apple-designed graphics processing unit (GPU) with a three-core design that delivers up to 30 percent faster graphics performance than the previous generation. With their highly parallel architectures, GPUs are suited for processing AI tasks. Apple said the A11’s additional horsepower will enable the iPhone X to perform new machine learning feats.
Apple announced that the new mainstream-oriented iPhone 8 will integrate the A11 chip as well, but that phone lacks the additional camera and sensor hardware required to run Face ID.
The introduction of the A11 puts Apple at the vanguard of a growing trend among mobile device makers to place AI at the center of their products. Increasingly, AI plays a major role in the user interface of smartphones and tablets, with dedicated chips handling tasks such as voice recognition locally on the device.
By integrating chips like the A11, mobile devices eliminate the need to send data to the cloud for AI processing. This reduces latency for AI tasks and helps protect the security and privacy of user data, since sensitive raw data never has to leave the device.
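The on-device pattern described above can be made concrete with a minimal sketch. Everything here is a placeholder: the "model" is a trivial energy check standing in for a real local speech recognizer, and the point is only that raw input is processed in-process, with no network call, so only the derived result would ever be shared.

```python
def transcribe_locally(audio_samples):
    """Placeholder for a local speech-recognition model: checks
    whether the signal has enough energy to contain speech. No
    network call is made, so the raw audio never leaves the device."""
    energy = sum(x * x for x in audio_samples)
    return "speech detected" if energy > 0.05 else ""

# Raw samples stay on the device; only the short result is returned.
print(transcribe_locally([0.1, 0.3, -0.2]))  # speech detected
```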
Other companies offer similar chips, including Qualcomm Inc., whose newest Snapdragon smartphone chipset has a module for AI processing. Leading mobile processor designer ARM Ltd. also recently introduced the Cortex-A75 and Cortex-A55, which include features designed to accelerate machine learning. Furthermore, Chinese telecom giant Huawei announced the Kirin 970 for smartphone applications.
Stephan Cunningham is vice president, product management at Veritone. Working in concert with core internal teams including industry-specific general managers and engineering as well as directly with clients and prospects, he leads the disciplines and business processes which govern the Veritone Platform.