iPhone AR Selfie Revolution

Is that ARKit all over your face?

Mike Rundle
9 min read · Sep 5, 2017

On September 12th, Apple will hold an event in their all-new Steve Jobs Theater located on the grounds of their all-new spaceship campus to unveil some all-new iPhones. Bloomberg’s Mark Gurman has written a number of articles detailing what to expect (and the latest seems pretty spot-on) but in general, here’s what most analysts think will be unveiled:

  • Two new iPhones with beefed up processors that mostly look the same as the current iPhone 7 and iPhone 7 Plus. Maybe called the 7s and 7s Plus, or maybe the 8 and 8 Plus. Who knows.
  • A redesigned high-end iPhone with an edge-to-edge, taller OLED screen, no home button, and new front sensors to enable face unlocking to replace Touch ID. Maybe called Premium, Edition, Pro, X, etc.

Much has been said about the huge updates in iOS 11 (huge for iPad, less so for iPhone) but few articles have really dug into what face unlocking would mean, beyond the obvious that you’ll be able to verify your identity and unlock your phone with your face.

I think this will be the flagship feature of the new iPhone, and will let Apple leapfrog competitors with futuristic face-scanning sensors that will have a gigantic impact on the future of augmented reality.

Face Unlock Technology, Apple Style

Some Android phones have had face unlocking since 2011. Essentially, the phone takes a 2D photo of your face using the front-facing camera, and uses software to compare it mathematically to a reference photo the user previously enrolled as their own. Since it launched, Android face unlocking has been both slow and insecure; even Samsung’s flagship S8 can be duped into unlocking by holding a photo up to it, so it’s obviously not a very good piece of technology.
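
To see why, here’s a toy sketch in Swift (not Android’s actual matcher, which is proprietary): if unlocking boils down to “does this 2D feature vector look close enough to the enrolled one,” then a printed photo of the owner produces nearly the same features as the owner’s live face, and it sails right through.

```swift
import Foundation

// Toy illustration, NOT Android's real matcher: reduce each 2D face image
// to a feature vector, then accept if the vectors are similar enough.
// A printed photo of the same face yields nearly identical 2D features,
// which is exactly why this scheme can be spoofed.

func cosineSimilarity(_ a: [Double], _ b: [Double]) -> Double {
    let dot = zip(a, b).map(*).reduce(0, +)
    let magnitudeA = sqrt(a.map { $0 * $0 }.reduce(0, +))
    let magnitudeB = sqrt(b.map { $0 * $0 }.reduce(0, +))
    return dot / (magnitudeA * magnitudeB)
}

func faceUnlock2D(enrolled: [Double], candidate: [Double],
                  threshold: Double = 0.95) -> Bool {
    // No depth, no liveness check: any flat image with matching
    // features passes, whether it's a face or a photo of one.
    return cosineSimilarity(enrolled, candidate) >= threshold
}
```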

And there is absolutely no way that Apple would include face unlock on their new top-end iPhone if it could ever be defeated with a photo of your face instead of the real thing.

For the past few months there’s been a lot of chatter about this technology, but it’s usually buried in a larger article about the iPhone’s other rumored new capabilities.

From a Bloomberg article written in July:

Apple is testing an improved security system that allows users to log in, authenticate payments, and launch secure apps by scanning their face, according to people familiar with the product. This is powered by a new 3-D sensor, added the people, who asked not to be identified discussing technology that’s still in development. The company is also testing eye scanning to augment the system, one of the people said.

A brand new 3D depth sensor that can also track eye movements.

What about the speed?

The sensor’s speed and accuracy are focal points of the feature. It can scan a user’s face and unlock the iPhone within a few hundred milliseconds, the person said. It is designed to work even if the device is laying flat on a table, rather than just close up to the face.

Super fast. As fast as or faster than Touch ID. Again: if it were slower, Apple wouldn’t green-light it.

And from a WSJ article on this new sensor from August:

Depth-sensing technology, generally called “structured light,” sprays thousands of tiny infrared dots across a person’s face or any other target.

By reading distortions in this field of dots, the camera gathers superaccurate depth information. Since the phone’s camera can see infrared but humans can’t, such a system could allow the phone to unlock in complete darkness.

Infrared dots that work in low light (or no light at all), scanning the user’s face to generate a super-accurate 3D depth map.
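
The WSJ names the technique but not the math, so here’s a rough sketch of how structured-light depth recovery classically works: each projected dot lands in the camera image shifted in proportion to the depth of the surface it hit, and simple triangulation converts that shift back into distance. The calibration numbers below are made up for illustration.

```swift
import Foundation

// Sketch of classic structured-light depth recovery. Each infrared dot
// appears displaced (disparity) relative to where it would land on a flat
// reference plane; triangulation maps disparity back to depth. These
// calibration values are illustrative guesses, not Apple's.

let baselineMeters = 0.02      // assumed spacing between IR projector and camera
let focalLengthPixels = 600.0  // assumed camera focal length, in pixels

/// Depth of the surface a dot landed on, given its disparity in pixels.
func depth(forDisparityPixels disparity: Double) -> Double {
    // Closer surfaces displace the dot more, so depth falls as disparity grows.
    return baselineMeters * focalLengthPixels / disparity
}

// A dot shifted by 8 pixels sits at 0.02 * 600 / 8 = 1.5 meters away.
print(depth(forDisparityPixels: 8.0))
```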

Let’s dig deeper into the eye-tracking bit up above. Is there anything to indicate that Apple is doing something really big there? Yup: earlier this year Apple acquired SensoMotoric Instruments, a company that has been building serious eye-tracking technology for over 20 years.

SensoMotoric Instruments, founded in 1991, has developed a range of eye tracking hardware and software for several fields of use, including virtual and augmented reality, in-car systems, clinical research, cognitive training, linguistics, neuroscience, physical training and biomechanics, and psychology.

More about their eye tracking technology:

The company’s Eye Tracking Glasses, for instance, are capable of recording a person’s natural gaze behavior in real-time and in real world situations with a sampling rate up to 120Hz. […] SensoMotoric has also developed eye-tracking technology for virtual reality headsets such as the Oculus Rift, which can analyze the wearer’s gaze and help to reduce motion sickness, a common side effect of VR. The solution can also allow for a person’s gaze to control menus or aim in a game with their gaze.

Real-time 120Hz tracking of eye movements at such a detailed level that users can control software interfaces merely by glancing at visual targets.
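
SMI’s SDK isn’t something iOS developers can touch today, but purely as a sketch of what consuming a 120Hz gaze stream might involve: raw samples jitter far too much to drive an interface directly, so the usual first step is smoothing them, for example with an exponential moving average.

```swift
import CoreGraphics
import Foundation

// Hypothetical sketch: SMI's eye-tracking SDK isn't public on iOS.
// Raw gaze samples arriving at 120Hz are noisy, so before using gaze
// as input you'd typically smooth the stream of fixation points.

struct GazeSmoother {
    private(set) var smoothed: CGPoint?
    let alpha: CGFloat  // 0...1; higher = more responsive, lower = steadier

    init(alpha: CGFloat = 0.15) {
        self.alpha = alpha
    }

    /// Feed each raw 120Hz sample; returns the smoothed gaze point.
    mutating func add(sample: CGPoint) -> CGPoint {
        guard let previous = smoothed else {
            smoothed = sample
            return sample
        }
        let next = CGPoint(x: previous.x + alpha * (sample.x - previous.x),
                           y: previous.y + alpha * (sample.y - previous.y))
        smoothed = next
        return next
    }
}
```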

But wait! At WWDC this summer, Apple introduced a number of APIs that give developers more advanced facial recognition abilities:

Vision Framework allows you to detect face rectangle and face landmarks (face contour, median line, eyes, brows, nose, lips, pupils position)
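
This part isn’t speculation: the Vision framework ships in iOS 11, and asking it for face landmarks takes only a few lines. A minimal example:

```swift
import Vision
import CoreGraphics

// Minimal Vision framework example (iOS 11): detect face rectangles plus
// landmarks (contour, median line, eyes, brows, nose, lips, pupils), all
// returned in normalized image coordinates (0 to 1).

func detectFaceLandmarks(in image: CGImage) {
    let request = VNDetectFaceLandmarksRequest { request, _ in
        guard let faces = request.results as? [VNFaceObservation] else { return }
        for face in faces {
            print("Face rectangle:", face.boundingBox)
            if let landmarks = face.landmarks {
                print("Contour:", landmarks.faceContour?.normalizedPoints ?? [])
                print("Median line:", landmarks.medianLine?.normalizedPoints ?? [])
                print("Left pupil:", landmarks.leftPupil?.normalizedPoints ?? [])
            }
        }
    }
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try? handler.perform([request])
}
```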

Beyond this announced API, it was also uncovered that the new iPhone will know when you’re looking at it and will suppress notification sounds accordingly.

Piecing together all these tidbits, a picture starts to emerge of the technologies behind new face identification capabilities on the high-end iPhone:

  • A dedicated sensor (or sensors) that uses infrared light to generate a highly detailed 3D scan of a user’s face and immediate surroundings
  • An improved front-facing camera (or cameras) that will take higher-fidelity photos and record video at a higher framerate
  • Faster and more secure unlocking and payment authorization than Touch ID, even in extremely low light
  • Advanced new image processing functionality that can track and decipher eye movements and determine alertness, attention, and more

These aren’t just the ingredients for a new way to unlock your phone, these are the foundational elements for some truly futuristic technology that no one else is building.

Way Beyond Unlocking

Users will be able to unlock their iPhones with just their face. They’ll be able to make purchases by using their face as their authorization.

I think these two features will be the least exciting aspects of what the new top-end iPhone can do with its new cameras and sensors.

Over the last two years, the consumer tech world has gone crazy over selfie lenses that superimpose objects and effects on your picture, or distort it directly like a funhouse mirror. Snapchat pioneered the space, then beefed up its selfie-lens tech stack by acquiring Looksery in September 2015 for a reported $150 million:

In a suspicious turn of events, Selfie animation app Looksery disappeared from the App Stores this morning just as Snapchat launched Lenses. Meme artist Ronen V informed me that Lenses looked identical to Looksery’s technology. And after I inquired, Snapchat confirmed to me that it has in fact acquired Looksery, and the Looksery team has joined Snapchat’s.

A few months later, a selfie filters app called MSQRD (masquerade) came out of a small team in Belarus and simply took over the App Store. It had a very simple design, but its filters and effects went far beyond the fidelity and richness of Snapchat’s capabilities at the time. Before being acquired by Facebook just a couple of months after launch, MSQRD racked up more than 25 million downloads through word of mouth as the app spread like wildfire across the phones of teens in the U.S. and other countries.

So what does all this have to do with the new iPhone?

First, all the innovation in selfie lenses so far has been purely software-based, using increasingly advanced image processing techniques to turn a 2D photo of your face into a 3D mesh that can be processed and changed in real time. The resolution of the front-facing camera on the iPhone is lower than the rear-facing camera’s, so even though it can record at 1080p, it’s really not made for high-framerate, high-resolution sampling, at least not compared to the rear-facing camera, which records at 4K and powers ARKit.

When the new top-end iPhone comes out, it’s rumored that both the front and rear cameras will support recording 4K video at 60 FPS, an incredible leap beyond today’s FaceTime HD camera, which records at half the resolution and half the framerate. Here’s what 9to5Mac said:

The jump to 60 FPS, especially with the rear camera, makes sense on several fronts. For one, Apple almost always improves the iPhone camera with each hardware iteration and support for 60 FPS is the next step up. The iPhone 8 is also expected to offer a host of augmented reality capabilities as part of Apple’s ARKit framework, and 60 FPS support will allow for AR improvements such as better tracking.

Even if Apple wasn’t planning to include a 3D depth sensor on the front of the phone, these rumored improvements to the front-facing camera resolution and frame rate would have an enormous impact on the fidelity and realism of selfie lenses. More data to work with = improved face and edge detection to develop a mesh = better processing and tracking.

But! Because this new high-resolution camera will sit right next to an incredible infrared face-scanning sensor, developers won’t have to sift through mountains of image data to figure out where someone’s face is. They’ll simply use the data coming off the 3D sensor to know unequivocally where someone’s most minute facial details are in 3D space, and that will blow the doors off what’s possible with augmented reality today.
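
Nobody outside Apple knows what that developer story will look like yet, but if the front sensor gets an ARKit-style API like the rear camera’s, it might feel something like this sketch. To be clear, the class names here are my guesses, not anything announced:

```swift
import UIKit
import ARKit

// Speculative sketch: Apple hasn't announced a front-sensor ARKit API as of
// this writing, so ARFaceTrackingConfiguration and ARFaceAnchor are guessed
// names. The point is the shape of it: instead of inferring a face mesh from
// 2D pixels, the session hands you one straight off the depth sensor.

class SelfieARViewController: UIViewController, ARSessionDelegate {
    let session = ARSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())  // hypothetical config
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // A ready-made 3D face mesh in world coordinates, no pixel sifting.
            print("Tracking a face mesh with \(faceAnchor.geometry.vertices.count) vertices")
        }
    }
}
```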

Just off the top of my head, here’s some crazy stuff that will not only be possible, but potentially straightforward with the new top-end iPhone:

  • Knowing where a user is looking while they’re using your app, making facets of an interface completely guided by gaze, no touching needed (see the sketch just after this list).
  • Incredible biometric and health information from a user’s face. Back in 2013, scientists showed they could determine a person’s pulse just by analyzing video of them. Surely Apple has looked into all the possibilities around this.
  • Fully immersive selfie AR effects, with fidelity and graphic detail closer to the CGI effects movies currently achieve via motion capture. Zero-lag, high-resolution selfie lenses way beyond what’s possible today.
  • Facial movement and emotion recognition that alters the software you’re currently using. If you look sad, the app could recognize that and change or adapt its functionality. Imagine knowing whether the GIF you just sent a friend actually made them laugh or smile, because their iPhone will know exactly what reaction they experienced (and maybe let you know).
  • A revolution in mobile advertising where apps and advertisers will know if you actually looked at a banner or not. This data would be more valuable than any metric advertisers currently receive, but could have pretty evil consequences.
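
Here’s the sketch promised in the first bullet: gaze-guided UI, assuming some future API hands us smoothed gaze points in screen coordinates (no such API exists today). A dwell timer keeps the interface from activating everything the eye merely passes over.

```swift
import UIKit

// Hypothetical gaze-guided UI. Assumes some future API delivers smoothed
// gaze points in screen coordinates; nothing like this is public today.
// A control activates only after the user's gaze dwells on it for a while.

final class GazeDwellSelector {
    private var dwellStart: Date?
    private var currentTarget: UIView?
    let dwellDuration: TimeInterval = 0.6  // how long a glance must linger

    /// Feed each incoming gaze point; returns a view once dwell completes.
    func update(gazePoint: CGPoint, candidates: [UIView],
                in window: UIWindow) -> UIView? {
        let target = candidates.first {
            $0.convert($0.bounds, to: window).contains(gazePoint)
        }
        if target !== currentTarget {
            // Gaze moved to a new target (or away); restart the dwell timer.
            currentTarget = target
            dwellStart = target == nil ? nil : Date()
            return nil
        }
        if let start = dwellStart, target != nil,
           Date().timeIntervalSince(start) >= dwellDuration {
            dwellStart = nil  // require a fresh dwell before the next activation
            return target
        }
        return nil
    }
}
```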

These are just a few of the things that came to mind, but with these new technologies in place and Apple-provided APIs that give developers access to them, I’m sure countless fascinating experiences (that could never have been built before!) will arrive shortly. Apple may even have been working with a few third-party developers to build killer demos for the upcoming Apple Event, to give the world a taste of what’s possible.

The new iPhone is set to be unveiled on September 12th, so we may not need to wait very long to see the future.
