Starting very soon, your Instagram feed is going to look different — a little softer, a little more professional, a little bit deeper. It’s not because of anything Instagram is going to do, though. It’s because of a new feature on the iPhone 7 Plus: “portrait mode,” which allows you to mimic in your phone photos the same depth of field and filmic blur you see in photographs taken by expensive SLR cameras. Portrait mode is available in a public beta now and will make its official debut later this year. I’ve been messing around with it for a few days, and it makes it stupid easy to take photos that might not be good — but that immediately scan as good-looking. And it’s going to be impossible for the Instagram photographer in all of us to stay away.
While portrait mode may still be in beta, it already works easily enough. Once in portrait mode, your screen will give you instructions — get closer to your subject, move farther away, get more light — until it clicks on a bright yellow tag letting you know you’ve got “Depth Effect” going on. You can also see the results live on your screen, allowing you to fine-tune them. It can feel a little silly: Taking a photo in public is no longer pulling your phone out and snapping a pic, but moving back and forth, your phone in front of you like a dowsing rod, until you find just the right spot to get that pretty bokeh effect going on.
Some background: Due to limitations in the size of the lens, most smartphone cameras take photographs with a large depth of field, meaning that nearly everything in the photo is in focus, no matter how far it is from the lens. While this is great for certain subjects, it works less well for others, like portraits, where you might want to focus on a single subject. Maybe more important for performative social-media purposes, our eyes have been trained, by years of looking at images, to regard photos with shallower depths of field, where objects too close or too far away from the lens are rendered in an out-of-focus blur, as better — or, at the very least, as more professional. (Until now, Instagram users have used the app’s “Tilt Shift” tool to create inexact blur effects that mimic shallow depths of field.)
The iPhone 7 Plus can’t create true depth of field — its lenses are still too small. But it can fake it. Without getting deep into the weeds of how it works, portrait mode uses the phone’s dual cameras to quickly render a 3-D map of what your phone sees, thanks to the lenses being slightly offset from each other. It then takes that 3-D map and puts things that are farther away into a blur, things that are slightly closer into less blur, and the object that the phone determines to be the focal point into full focus. It’s smart enough to not focus on the nearest thing — things in the extreme foreground get the same blur as things in the background — and works on anything about one to five feet away that’s sufficiently well lit.
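The idea described above — blur each pixel more the farther its estimated depth sits from the focal plane, in either direction — can be sketched in a few lines of code. This is a toy illustration, not Apple's actual pipeline; the depth map, the box-blur kernel, and the `depth_blur` function are all my own simplifications for the sake of showing the mapping from depth to blur strength.

```python
import numpy as np

def depth_blur(image, depth, focus_depth, max_radius=4):
    """Toy depth-based synthetic blur: pixels whose depth is far from
    the chosen focal plane get a larger box blur. Note the abs() --
    extreme foreground blurs just like the background, matching the
    behavior described in the article."""
    h, w = depth.shape
    out = np.empty_like(image, dtype=float)
    # Map distance-from-focal-plane to a per-pixel blur radius.
    radii = np.clip(np.abs(depth - focus_depth), 0, 1) * max_radius
    for y in range(h):
        for x in range(w):
            r = int(round(radii[y, x]))
            y0, y1 = max(0, y - r), min(h, y + r + 1)
            x0, x1 = max(0, x - r), min(w, x + r + 1)
            out[y, x] = image[y0:y1, x0:x1].mean()
    return out

# Tiny grayscale example: a flat "subject" patch at depth 1.0 sitting
# on a checkerboard "background" at depth 3.0.
img = np.indices((8, 8)).sum(axis=0) % 2 * 255.0
img[3:5, 3:5] = 200.0
depth = np.full((8, 8), 3.0)
depth[3:5, 3:5] = 1.0
result = depth_blur(img, depth, focus_depth=1.0)
```

With the focal plane set to the subject's depth, the subject patch comes back untouched while the checkerboard background is averaged into a smooth blur — the same visual cue, in miniature, that portrait mode produces.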
It’s an impressive bit of software to simulate something anyone who takes an Intro to Photography course learns how to do, but which has largely been unavailable to smartphone cameras. (Some other dual-lens smartphones I’ve used, the LG G5 and the LG V20, can do something in the same vein, but Apple’s solution already runs circles around both.)
I spent a day taking random photos, not really trying to do much more than find things I could foreground while keeping enough going on in the background to look interesting. Any pro photographer could look at the photos (particularly in full resolution) and tell it’s a faked effect, and there are still artifacts slipping through in beta, including the occasional halo around the object in focus. (There’s going to be a whole new genre of portrait-mode fails circulating on Tumblr.) But when you’re uploading a pic to Instagram, it works well enough, and could fool plenty of people into thinking you’ve shelled out for a fancy DSLR and a decent lens.
What this means is that, in the short term, you’re gonna start to see these bokeh photos popping up more and more, a clear signifier that the person behind the camera probably sprang for the iPhone 7 Plus. As you can see in the examples here, none of these photos is particularly great on its own. But the blurred background, thanks to years of movies and professional photography, is a clear cue to most viewers that they’re looking at something of higher quality than a photo where everything is in focus.
Other phone makers would be wise to try to ape this as quickly as possible. It’s also entirely possible you’ll start seeing app solutions to get the same effect for single-lens-smartphone users who want to try to play catch-up — it’s possible for a skilled Photoshopper to fake depth of field. Eventually, as multi-camera smartphones become the standard, nearly everyone will be able to use this effect, and a new differentiator will have to be found. But for right now, this will do.
This is actually a bit of a return to the past for Instagram. When the app launched, in 2010, it was for iOS only. When the iPhone 4S arrived with a significantly better eight-megapixel camera, it became immediately clear who had the newer phone and who was still limping along, especially on the iPhone 3GS’s three-megapixel camera. When the app was opened up to Android after 18 months, there were widespread (and problematic) complaints about how Android users were going to “ruin” the service. In recent years, these distinctions have largely been erased, as nearly every smartphone sold in the past three years has a pretty good camera on it. But there’s soon going to be a clear delineation between those who chose to lay out at least $769 for the iPhone 7 Plus, and those who didn’t.