Bringing Wide Color to Instagram

Mike Krieger
Instagram Engineering
6 min read · Jan 9, 2017


Last September, Apple announced the iPhone 7 and 7 Plus, which include cameras that capture a greater range of colors than previous models, and screens that can display that wider color range. We’ve just finished updating Instagram to support wide color, and since we’re one of the first major apps to do so, I wanted to share the process to help others making the same conversion. In my role as CTO, I’ll often do deep dives on a particular technical area, and wide color was my main focus for November and December 2016.

Why Wide Color Matters

For years, most photos captured and shared have been in the sRGB color space. sRGB has great compatibility with most displays, so it became the standard for images shared on the Web — and more recently, on mobile.

sRGB did a good job of representing the colors most monitors could display. But as display and camera technology improves, we’re increasingly limited by the colors sRGB can represent.

Take, for example, this “photo room” we have at Instagram HQ:

When captured by an iPhone 7 Plus, many of the oranges and other saturated colors in the room fall outside the sRGB color gamut, so detail is lost unless we use a wider color space. The color space that Apple chose for its devices going forward is Display P3. Here, highlighted in blue, are all the portions of the image that are outside of sRGB but present in Display P3; in other words, the parts of the image where information is getting lost:

Next, we’ll walk through what we needed to change at each step of the Instagram image pipeline to bring wide color support to Feed, Stories, and Direct. When we started this project, none of us at IG were deep experts in color. For a good starting point, I recommend Craig Hockenberry’s new book; an early draft was helpful as we started converting Instagram.

A Canary

The most useful tool when working on wide color compatibility is a “canary image” that will only show itself if you’re in wide color. Here’s our sample one.

If that just looks like a red square to you, you’re likely on a monitor that can only display sRGB colors. If you open it on a wide-color display device, you should see the Instagram logo “magically” appear — otherwise, the information is lost.

You can use this canary to identify exactly where in the process your app is losing wide color information — the step where it turns back into just a red square.

Capture

This is the easy part. As of iOS 10, Apple’s APIs will output wide-color images when available from compatible cameras. One tweak we made while we were looking at this was converting to the new AVCaptureDeviceDiscoverySession, which let us take full advantage of the new dual-lens system on the 7 Plus.
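
For reference, here’s a minimal sketch of that setup, using the iOS 10 API names (AVCaptureDeviceTypeBuiltInDuoCamera was later renamed to AVCaptureDeviceTypeBuiltInDualCamera); the session property shown is what opts capture into wide color automatically:

#import <AVFoundation/AVFoundation.h>

// Sketch: prefer the dual-lens camera on the 7 Plus, falling back to the
// standard wide-angle camera.
AVCaptureDeviceDiscoverySession *discovery =
    [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeBuiltInDuoCamera,
                                                                       AVCaptureDeviceTypeBuiltInWideAngleCamera]
                                                           mediaType:AVMediaTypeVideo
                                                            position:AVCaptureDevicePositionBack];
AVCaptureDevice *camera = discovery.devices.firstObject;

AVCaptureSession *session = [[AVCaptureSession alloc] init];
// YES is the iOS 10 default; the session then selects a wide-color format
// when the device and preset support one.
session.automaticallyConfiguresCaptureDeviceForWideColor = YES;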

Core Graphics Operations

After we capture images (or import them from the Camera Roll), we often apply simple operations like crops and resizes. Most of these are done in Core Graphics, so there were a few changes we had to make for wide-color compatibility.

If you’ve ever done image manipulation in Core Graphics, the following pattern will be familiar to you:

UIGraphicsBeginImageContextWithOptions(…)
// your drawing operations here
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

This pattern relies on a legacy API that isn’t wide-color aware. Instead, we use the new UIGraphicsImageRenderer:

UIGraphicsImageRendererFormat *format = [[UIGraphicsImageRendererFormat alloc] init];
format.prefersExtendedRange = YES;
UIGraphicsImageRenderer *renderer = [[UIGraphicsImageRenderer alloc] initWithSize:size format:format];
UIImage *image = [renderer imageWithActions:^(UIGraphicsImageRendererContext *rendererContext) {
    // your drawing operations here
}];

To simplify the transition at IG, we created a wrapper around UIGraphicsImageRenderer that takes a block of image drawing actions accepting a CGContext. It’s implemented as a category on UIImage, so engineers can call [UIImage renderedImageWithSize:(CGSize)size actions:(ImageActionsBlock)actions], where ImageActionsBlock’s single argument is a CGContextRef. On iOS 9 it uses the old UIGraphicsBeginImageContext approach, calling the block once the context is ready; on iOS 10 it uses the new renderer, calling the block inside imageWithActions:.
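
A sketch of that wrapper (the category name here is ours for illustration; the method and block signatures are the ones just described):

typedef void (^ImageActionsBlock)(CGContextRef context);

@implementation UIImage (IGRendering) // hypothetical category name

+ (UIImage *)renderedImageWithSize:(CGSize)size actions:(ImageActionsBlock)actions
{
    if ([UIGraphicsImageRenderer class]) {
        // iOS 10: the wide-color-aware renderer path.
        UIGraphicsImageRendererFormat *format = [[UIGraphicsImageRendererFormat alloc] init];
        format.prefersExtendedRange = YES;
        UIGraphicsImageRenderer *renderer =
            [[UIGraphicsImageRenderer alloc] initWithSize:size format:format];
        return [renderer imageWithActions:^(UIGraphicsImageRendererContext *rendererContext) {
            actions(rendererContext.CGContext);
        }];
    }
    // iOS 9: the legacy, sRGB-only path, so call sites stay uniform.
    UIGraphicsBeginImageContextWithOptions(size, NO, 0.0);
    actions(UIGraphicsGetCurrentContext());
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return image;
}

@end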

ColorSpace Creation

In other places — like when initializing a CGContext for other drawing operations — it’s common to use CGColorSpaceCreateDeviceRGB when creating a CGColorSpaceRef. This will create an sRGB colorspace on most devices, and we’ll lose our wide color information. Most of the initial work for wide color on Instagram was tracking down everywhere that this color space was hard-coded.

Instead, we can see if our screen supports wide colors (using UIScreen.mainScreen.traitCollection.displayGamut), and if so, use CGColorSpaceCreateWithName(kCGColorSpaceDisplayP3). Again, we found that creating a wrapper that returns the appropriate colorspace for that device was helpful.
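
For example (the helper name is ours for illustration):

// Sketch: hand back Display P3 only when the main screen can show it.
// The caller owns the returned colorspace (CGColorSpaceRelease).
static CGColorSpaceRef IGCreatePreferredColorSpace(void)
{
    if (UIScreen.mainScreen.traitCollection.displayGamut == UIDisplayGamutP3) {
        return CGColorSpaceCreateWithName(kCGColorSpaceDisplayP3);
    }
    return CGColorSpaceCreateWithName(kCGColorSpaceSRGB);
}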

When we’re downloading images and aren’t sure what color space to use, we instead use CGImageGetColorSpace, so once we serve Display P3 images to our iOS app, we only create wide-color graphics contexts when needed.
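
In practice that check looks something like this (downloadedImage stands in for whatever UIImage we just decoded):

// Sketch: take the colorspace from the image itself, and only treat the
// image as wide color when its colorspace really is wide gamut.
CGColorSpaceRef imageSpace = CGImageGetColorSpace(downloadedImage.CGImage);
BOOL isWideColor = CGColorSpaceIsWideGamutRGB(imageSpace); // iOS 10+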

Filter Pipeline

Instagram uses OpenGL for most of its image editing and filtering. OpenGL isn’t color managed; it operates on a numeric range (say, 0.0 to 1.0), and it’s up to the output surface to determine what colors that range actually maps to.

The good news is that this meant we had to make very few changes to make our GL pipeline wide-color compatible. The biggest change was to ensure that when we extracted pixel buffers from our GL surface, we were using the appropriate colorspace before converting from a CVPixelBufferRef to a CGImageRef.
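
A sketch of that conversion, assuming a 32-bit BGRA pixel buffer read back from the GL surface:

// Sketch: tag the GL pixels with Display P3 while wrapping the
// CVPixelBufferRef in a CGImageRef.
CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
CGColorSpaceRef p3 = CGColorSpaceCreateWithName(kCGColorSpaceDisplayP3);
CGContextRef bitmapContext =
    CGBitmapContextCreate(CVPixelBufferGetBaseAddress(pixelBuffer),
                          CVPixelBufferGetWidth(pixelBuffer),
                          CVPixelBufferGetHeight(pixelBuffer),
                          8, // bits per component
                          CVPixelBufferGetBytesPerRow(pixelBuffer),
                          p3,
                          kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Little);
CGImageRef cgImage = CGBitmapContextCreateImage(bitmapContext);
CGContextRelease(bitmapContext);
CGColorSpaceRelease(p3);
CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);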

We did have trouble getting EAGLView, the built-in way of displaying GL content in a UIView, to be color space-aware. Our solution was to render to an offscreen buffer, grab a wide color image from the buffer, and place it back on the screen using a UIImageView, which is wide-color compatible by default. This wouldn’t work for high-frame-rate applications like games, but was sufficient for our needs. If you’re developing a high-frame-rate application in wide color and have solved this, please reach out and I’ll add the information to this post.

Image Export

At this point, we’ve captured a wide color image, resized it in Core Graphics, and put it through OpenGL, all while preserving wide color. The last step is taking our UIImage and turning it into a JPEG. This is one of the simplest transitions: replace the legacy UIImageJPEGRepresentation with UIGraphicsImageRenderer and its jpegData method.
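
In Objective-C, that looks roughly like this (finalImage is a placeholder for the edited UIImage, and the 0.9 quality is arbitrary):

// Sketch: re-draw the final image through a wide-color renderer and get
// JPEG bytes straight from it.
UIGraphicsImageRendererFormat *format = [[UIGraphicsImageRendererFormat alloc] init];
format.prefersExtendedRange = YES;
UIGraphicsImageRenderer *renderer =
    [[UIGraphicsImageRenderer alloc] initWithSize:finalImage.size format:format];
NSData *jpegData =
    [renderer JPEGDataWithCompressionQuality:0.9
                                     actions:^(UIGraphicsImageRendererContext *rendererContext) {
        [finalImage drawInRect:CGRectMake(0, 0, finalImage.size.width, finalImage.size.height)];
    }];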

It’s at this point that you can load your exported image into Photoshop (Xcode’s debugger integration for opening UIImages in Preview is handy here) and check the resulting image’s color profile and other color information.

Image Storage / CDN

Once the images are received by our backend, we do some final resizing in Python using Pillow. We then serve images globally through Facebook’s CDN.

Our challenge was that most of our app’s users are currently using devices that aren’t wide-color compatible — and many don’t have good color management built in. Converting images between multiple color profiles on the fly would have added complexity to either our CDN or mobile apps.

To keep things simple, we opted to store both a wide-color and a non-wide version in our backend, and use the Python ImageCms library for conversion between the two at storage time (here’s a handy tutorial). This library works in tandem with Pillow and accepts an Image object when converting:

from PIL import ImageCms

# the ICC_PROFILES are strings representing file paths on disk
converted_image = ImageCms.profileToProfile(image,
                                            DISPLAY_P3_ICC_PROFILE,
                                            SRGB_ICC_PROFILE)

At read time, our apps specify whether their display has a wide color gamut in their User-Agent, and the backend dynamically serves the image with the right profile. In the future, when most images captured are wide color and most displays are color managed, we’ll likely revisit the double-writing approach.
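
A simplified sketch of that read-time selection (the User-Agent token and helper names here are illustrative, not our production ones):

# Hypothetical read-path sketch: pick the stored variant by client gamut.
def variant_for_request(user_agent):
    # Assumed convention: wide-gamut clients advertise it in the UA string.
    return 'p3' if 'gamut-p3' in user_agent else 'srgb'

def image_url(image_id, user_agent):
    # cdn_url is a stand-in for however your CDN paths are built.
    return cdn_url(image_id, variant=variant_for_request(user_agent))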

Bringing it Together

It’s still early days for wide color, and documentation is sparse, which is why I wanted to share the nitty-gritty of how we converted Instagram. If you hit any questions while converting your own app, please drop a note in the comments. And if you’re interested in joining Instagram’s iOS team, take a look at our openings.
