The iPhone 7 Plus and iPhone 8 Plus do a lot of heavy lifting when you shoot in Portrait mode at night — and if you look closely, you can even see it in the viewfinder.
The iPhone 8 Plus and its iPhone 7 Plus predecessor pack a ton of computational photography power underneath their sleek glass and metal exteriors. Whenever you open the Camera app on an iPhone, it instantly begins analyzing the scene for movement and lighting conditions to give you the best picture — largely thanks to its A-series chipset and image signal processor (ISP).
In daylight, this isn’t visible to the end user — you line up the shot, take the photo, and get your (hopefully) desired result. But when you shoot Portrait mode at night in almost pitch-black conditions, you can see a little bit of that magic in action.
What follows is a fascinating view of the processors at work while shooting in Portrait mode with poor lighting conditions, discovered while testing the iPhone 7 Plus (running iOS 11) and iPhone 8 Plus for an upcoming photography review.
Portrait mode is a special beast
Before iOS 11, Apple’s iPhone 7 Plus couldn’t shoot in low or dark light at all when it came to Portrait mode. Apple didn’t want to provide a poor experience with the blurred “bokeh” depth effect, and because of the telephoto lens’s less-than-stellar low-light capability, users were limited to taking portraits in sunlit or bright conditions (or using artificial lighting in a smart way).
But with the latest iOS update and the release of the iPhone 8 Plus, Portrait mode now supports photography in all sorts of lighting conditions including dim lighting, shooting with Flash, and using HDR to balance out tricky exposures.
These are great upgrades to Portrait mode, and especially impressive for Apple to ship, given that dim lighting isn't easy for any camera — let alone one metering for depth. Portrait mode's "bokeh" effect attempts to blur the background (and, as of iOS 11, the foreground) around a subject; in little to no light, the camera has to work twice as hard to gather all of that depth information so that you avoid an out-of-focus subject or weirdly metered lighting. So Apple's A-series processors and ISP attempt to provide as much focus data as possible to let the sensor properly capture the image — and, in the case of the iPhone 8 Plus, sync with the flash for its Slow Sync feature.
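To make the idea of depth-selective blur concrete, here's a toy sketch of the general technique: keep pixels near the subject's depth plane sharp and blur everything else. This is purely illustrative — the function name, the box blur, and the hard depth threshold are my assumptions, not Apple's actual (far more sophisticated) pipeline.

```python
import numpy as np

def portrait_blur(image, depth, subject_depth, tolerance, blur_radius=2):
    """Toy depth-based 'bokeh': blur pixels whose depth is far from the
    subject plane, leave in-focus pixels untouched. Illustrative only."""
    h, w = image.shape
    # Naive box blur of the whole frame (real pipelines use nicer kernels).
    padded = np.pad(image, blur_radius, mode="edge")
    blurred = np.zeros_like(image, dtype=float)
    for dy in range(-blur_radius, blur_radius + 1):
        for dx in range(-blur_radius, blur_radius + 1):
            blurred += padded[blur_radius + dy : blur_radius + dy + h,
                              blur_radius + dx : blur_radius + dx + w]
    blurred /= (2 * blur_radius + 1) ** 2
    # Pixels near the subject's depth stay sharp; the rest get the blur.
    in_focus = np.abs(depth - subject_depth) <= tolerance
    return np.where(in_focus, image, blurred)

# Tiny synthetic frame: bright subject in the center, dark background.
image = np.zeros((9, 9))
image[3:6, 3:6] = 1.0
depth = np.full((9, 9), 5.0)   # background ~5 m away
depth[3:6, 3:6] = 1.0          # subject ~1 m away
result = portrait_blur(image, depth, subject_depth=1.0, tolerance=0.5)
```

In the dark, the depth map feeding `in_focus` is exactly what the camera struggles to estimate — which is why the preview has to work so much harder before it can lock.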
Bring on the blobs
When you frame a Portrait mode shot at night, most of this intense processing happens in an instant. But when it comes to getting focus points in the dark, if you look closely you may see some… interesting aberrations on the screen. Specifically, something I'm calling "focus blobs."
Several times while shooting low light Portrait mode photos with the iPhone 7 Plus and 8 Plus cameras, I caught glimpses of almost amoeba-like black blobs appearing and undulating around the screen. If you’ve ever been unfortunate enough to witness water leak into an LCD panel, the look is very similar — a strange sort of dark shape on top of the existing picture, with a slightly lighter ring around it.
I should note that these aren’t particularly glaring if you’re not looking for them: I was certain I was imagining things at first, but we were able to capture it in a screen recording of a low-light shoot, as well as in a few screenshots (see above). And replicating this is a pretty simple experiment: While running iOS 11, point your iPhone 7 Plus or 8 Plus camera at a light surface, then point it toward an almost pitch-black subject. Focus blobs will almost immediately appear as the iPhone’s telephoto camera tries to find a subject, light, and meter accordingly.
It’s worth noting this only happens in Portrait mode: In Photo mode, the camera can swap from bright to low lighting conditions almost instantly, in part because the wide-angle lens has a much wider aperture than its telephoto partner (f/1.8 vs. f/2.8), letting in far more light.
A spot of magic
Why can we see these blobs? I suspect it has something to do with Portrait mode's "bokeh" preview, which tries to give you a basic idea of what's in focus and what will be blurred out. On well-lit subjects, this pops into view as a basic blur, but the lock comes a bit slower when you lose the light.
As such, what I'm calling "focus blobs" may very well be Portrait mode attempting to offer a bokeh preview for its depth map. It just doesn't quite render the way it would in better light because, well… there's literally no light.
Even if it doesn’t quite work from a “show the user what they’re going to get” perspective, it’s an incredibly cool peek at how hard your iPhone Plus software and hardware are working to properly map low-light image depth. (Even if it is almost impossible to take a screenshot or video of the blobs without boosting the contrast like crazy.) Apple’s done some great work in the Portrait arena, and it’s pretty nifty to see the processor in action.
What does this mean for the average user? Nothing but a cool science experiment. These blobs are normal, and part of the iPhone’s camera system — they won’t show up in your final image, and you likely won’t notice them unless you’re shooting in almost complete darkness. For me, they’re just another reminder of how much effort has gone into making Portrait mode more functional in iOS 11 — and my photos and I are grateful for it.