For many people, the choice between flagships primarily comes down to their cameras. Apple went largely unchallenged in this segment until a few years ago, but Android competition has more than caught up since. Apple introduced several camera upgrades with its new iPhones, but the marquee feature is the new Smart HDR.
New and fancy HDR modes are how manufacturers work around the limitations of smartphone camera hardware. These modes analyze data gathered by the camera sensor with smart algorithms to generate a more realistic image. Like regular HDR, modes such as HDR+ on Google Pixel phones and Smart HDR on iPhones pull details from multiple shots taken at different exposures to capture as much of the scene as possible. Here’s all you need to know about the new Smart HDR technology.
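The core idea behind all of these modes is exposure fusion: pixels that are well exposed in one frame get more weight when the frames are blended. Here is a minimal, hypothetical sketch of that idea in Python; the weighting function and toy pixel values are illustrative assumptions, not any vendor's actual algorithm.

```python
import math

# Toy exposure-fusion sketch. Pixel values are assumed normalized to
# [0, 1]; values near mid-gray (0.5) are treated as "well exposed"
# and weighted more heavily when the frames are blended.

def well_exposedness(v, sigma=0.2):
    """Gaussian weight that peaks at mid-gray (0.5)."""
    return math.exp(-((v - 0.5) ** 2) / (2 * sigma ** 2))

def fuse_exposures(frames):
    """Blend pixel-aligned frames into one image via weighted averaging.

    frames: list of images, each a flat list of pixel values in [0, 1].
    """
    fused = []
    for pixels in zip(*frames):  # the same pixel across all frames
        weights = [well_exposedness(v) for v in pixels]
        total = sum(weights) or 1e-9  # guard against division by zero
        fused.append(sum(w * v for w, v in zip(weights, pixels)) / total)
    return fused

# Three toy "exposures" of the same four-pixel scene:
short = [0.05, 0.10, 0.40, 0.90]   # underexposed: preserves highlights
mid   = [0.20, 0.45, 0.60, 0.98]
long_ = [0.50, 0.80, 0.95, 1.00]   # overexposed: preserves shadows
result = fuse_exposures([short, mid, long_])
```

In the fused result, dark pixels are lifted toward the long exposure and near-clipped pixels are pulled back toward the short one, which is the essence of what every HDR mode tries to achieve before its smarter per-scene processing kicks in.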
Smart HDR is powered by the new A12 Bionic NPU
One of the biggest attractions of the new iPhones is the 7nm-based A12 Bionic chip. The focus this year is on reducing power consumption and on the new NPU, which Apple says is eight times faster than the one in the iPhone X’s A11 Bionic.
The NPU upgrade makes it clear that the neural engine will play a much bigger role across Apple devices. One crucial area is the camera: on the new iPhones, Apple relies on both improved hardware and the NPU’s added computational power to improve image quality via Smart HDR.
What is Smart HDR? How does it work?
With Smart HDR, Apple improves the quality of its HDR images by analyzing more information across multiple frames before combining them into one. The ISP and the neural engine work together to handle the increased workload of Smart HDR shots.
To show what Smart HDR can do, Apple demonstrated a sample shot taken directly against the sun, and the dynamic range was indeed handled well.
Here is how Smart HDR works. Suppose you are shooting a moving subject. The iPhone camera detects the motion and captures a four-frame buffer so you can get the shot without motion blur. The A12 Bionic also captures secondary interframes at different exposures to preserve highlight detail, and simultaneously shoots a long exposure for better shadow detail. The camera then analyzes all of these frames and combines them into the best possible final shot.
The camera can also detect faces and facial features like eyes for instant enhancements like red-eye reduction.
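The capture flow described above can be sketched in code. Everything here is a hypothetical illustration of the steps Apple described on stage: the function names, frame counts, and exposure values are assumptions, not Apple's actual API or parameters.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    exposure: float  # relative exposure time (illustrative)
    purpose: str     # why this frame was captured

def capture_smart_hdr(motion_detected):
    """Hypothetical sketch of the Smart HDR frame-capture stage."""
    frames = []
    if motion_detected:
        # A buffer of standard frames to freeze a moving subject.
        frames += [Frame(1.0, "motion buffer") for _ in range(4)]
    # Underexposed secondary interframes preserve highlight detail.
    frames += [Frame(0.25, "highlight interframe"),
               Frame(0.5, "highlight interframe")]
    # One long exposure recovers shadow detail.
    frames.append(Frame(4.0, "shadow long exposure"))
    return frames

def merge(frames):
    """Placeholder for the ISP/NPU analysis-and-merge stage."""
    return f"merged {len(frames)} frames"
```

A call like `merge(capture_smart_hdr(motion_detected=True))` mirrors the two-stage flow in the text: capture a varied set of frames first, then let the ISP and neural engine pick the best pieces of each.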
Will Smart HDR work for the front selfie camera?
Yes. The TrueDepth front camera system has also been upgraded, and it will benefit from the same Smart HDR algorithms.
Will Smart HDR be passed on to previous generation iPhones via OTA update?
No. Since Smart HDR leans heavily on the new NPU hardware, don’t expect Apple to bring it to last-generation iPhones.
Is Smart HDR better than HDR+ on Pixel phones?
Well, it’s still too soon to say. We will need to test Apple’s Smart HDR mode before we know whether the technology is as promising as the launch demonstration suggests.
Pixel phones could shoot excellent HDR+ photos even before the Pixel Visual Core was enabled via OTA update, which suggests raw compute power won’t be the deciding factor. It will come down to how smart the processing algorithms are and what tradeoffs Apple and Google choose to make. We will know soon and update this post with more details.