If you asked reviewers about the best phone camera last year, chances are Apple’s phones wouldn’t have featured in the top three. Google’s Pixel 3 and Huawei’s duo of the P30 Pro and Mate 20 Pro ruled the charts with their versatile camera setups and their ability to take stunning pictures in low light. With its new iPhones, Apple’s trying to rectify that.
Let’s take a look at the camera specifications of the new devices first:
iPhone 11:
Rear camera: 12-megapixel wide sensor with f/1.8 aperture + 12-megapixel ultra-wide sensor with f/2.4 aperture, 120° field of view
Front camera: 12-megapixel TrueDepth sensor with f/2.2 aperture
iPhone 11 Pro and iPhone 11 Pro Max:
Rear camera: 12-megapixel wide sensor with f/1.8 aperture + 12-megapixel ultra-wide sensor with f/2.4 aperture, 120° field of view + 12-megapixel telephoto sensor with f/2.0 aperture
Front camera: 12-megapixel TrueDepth sensor with f/2.2 aperture
Photography improvements
With Apple switching to a dual-camera system on the iPhone 11 – and Google’s upcoming Pixel 4 getting a second camera – the era of the single-camera flagship is pretty much over. And I’m so glad Apple chose an ultra-wide-angle sensor instead of a telephoto sensor as the second camera, because it offers a wider field of view and a chance to create more dramatic shots. Sure, optical zoom is nice, but a wide-angle snapper is better.
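If you’re curious how that ultra-wide lens shows up on the developer side, iOS 13 adds a .builtInUltraWideCamera device type to AVFoundation. Here’s a quick Swift sketch – illustrative, not Apple’s sample code – that finds the ultra-wide camera and reads its field of view (don’t expect the number to match Apple’s 120° marketing figure exactly; the API reports the horizontal angle for the active format):

```swift
import AVFoundation

// Illustrative sketch: find the new ultra-wide back camera via iOS 13's
// .builtInUltraWideCamera device type and read its field of view.
let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.builtInUltraWideCamera],
    mediaType: .video,
    position: .back
)

if let ultraWide = discovery.devices.first {
    // videoFieldOfView reports the horizontal field of view (in degrees)
    // of the camera's currently active format.
    print("Ultra-wide FOV: \(ultraWide.activeFormat.videoFieldOfView)°")
}
```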
One major change the company has brought is the new night mode. It automatically kicks in when the camera detects low-light conditions in your frame, and snaps a bright picture as shown in the example below. Sadly, you can’t activate it manually the way you can with Night Sight on the Pixel 3 or the night mode on the P30 Pro.
The company says it has improved its Smart HDR algorithm to capture more “natural-looking” images with better highlight and shadow detail. Apple has also improved its Portrait mode, so it now detects objects and pets just as well as it does humans.
With the new A13 Bionic chip, Apple also brings a new image processing technology called Deep Fusion. Using the processor’s neural engine, it’s designed to perform pixel-by-pixel processing that improves texture and detail while reducing noise. The company says Deep Fusion will arrive later in the fall, through an update for new iPhones.
The Pro and Pro Max variants of the new iPhone will feature a telephoto sensor along with wide and ultra-wide sensors. All three cameras can share information with each other, resulting in better images. For instance, the telephoto sensor can use details from the other two sensors to create a better depth effect. You can even capture photos from all three cameras at the same time.
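On the developer side, AVFoundation models that lens trio as a single virtual device. Here’s a minimal Swift sketch, assuming iOS 13’s .builtInTripleCamera device type – the system hands off between the physical lenses for you as the zoom factor changes:

```swift
import AVFoundation

// Minimal sketch: the .builtInTripleCamera virtual device (iOS 13) wraps
// the wide, ultra-wide, and telephoto lenses behind one camera that the
// system switches automatically as you zoom.
func tripleCameraInput() -> AVCaptureDeviceInput? {
    guard let device = AVCaptureDevice.default(.builtInTripleCamera,
                                               for: .video,
                                               position: .back) else {
        return nil // hardware without three rear cameras
    }
    // The zoom factors at which the virtual device hands off
    // from one physical lens to the next.
    print(device.virtualDeviceSwitchOverVideoZoomFactors)
    return try? AVCaptureDeviceInput(device: device)
}
```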
Video improvements
The iPhone maker has brought a ton of changes to video capture as well. Both the ultra-wide and wide cameras now support shooting 4K footage with extended dynamic range and cinematic stabilization (Apple’s term for electronic stabilization). I have to say, the demo videos Apple shot with the new iPhones look pretty sick.
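As far as I can tell, there’s no dedicated “extended dynamic range” switch in the public SDK, so treat this as an approximation: the closest thing a third-party app can do is pick a 4K capture format that supports video HDR. A rough Swift sketch:

```swift
import AVFoundation

// Rough sketch: pick a 4K capture format that supports video HDR on a
// given camera. Format availability varies from device to device.
func select4KHDRFormat(on device: AVCaptureDevice) throws {
    guard let format = device.formats.first(where: { format in
        let dims = CMVideoFormatDescriptionGetDimensions(format.formatDescription)
        return dims.width == 3840 && dims.height == 2160 && format.isVideoHDRSupported
    }) else { return } // no matching format on this hardware

    try device.lockForConfiguration()
    device.activeFormat = format
    device.unlockForConfiguration()
}
```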
With the new iPhones, Apple’s made it easier to switch between the different cameras to zoom in and out seamlessly as you compose your shot. Plus, you can use audio zoom – just like on the Samsung Galaxy Note 10 – to focus voice recording on a specific area. The company is also allowing third-party video apps to shoot multi-camera videos in real time. That’ll be something cool to look forward to once the new iPhones come out.
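That real-time multi-camera capture comes via AVCaptureMultiCamSession, new in iOS 13. Here’s a hedged sketch of how a third-party app might wire two rear cameras into one session – the “no connections” wiring follows Apple’s multi-cam pattern, but treat the details as illustrative rather than production-ready:

```swift
import AVFoundation

// Hedged sketch: stream the wide and ultra-wide back cameras at the same
// time with AVCaptureMultiCamSession (iOS 13).
func makeMultiCamSession() -> AVCaptureMultiCamSession? {
    // Simultaneous multi-camera capture only works on supported hardware.
    guard AVCaptureMultiCamSession.isMultiCamSupported else { return nil }

    let session = AVCaptureMultiCamSession()
    session.beginConfiguration()
    defer { session.commitConfiguration() }

    let lenses: [AVCaptureDevice.DeviceType] = [.builtInWideAngleCamera,
                                                .builtInUltraWideCamera]
    for lens in lenses {
        guard let device = AVCaptureDevice.default(lens, for: .video, position: .back),
              let input = try? AVCaptureDeviceInput(device: device),
              session.canAddInput(input) else { continue }

        // Use the "no connections" variants so each camera gets its own
        // explicit connection to its own output.
        session.addInputWithNoConnections(input)

        let output = AVCaptureVideoDataOutput()
        guard session.canAddOutput(output) else { continue }
        session.addOutputWithNoConnections(output)

        let ports = input.ports(for: .video,
                                sourceDeviceType: lens,
                                sourceDevicePosition: .back)
        let connection = AVCaptureConnection(inputPorts: ports, output: output)
        if session.canAddConnection(connection) {
            session.addConnection(connection)
        }
    }
    return session
}
```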
Overall, the new iPhones represent a significant step toward the imaging prowess we’ve seen from Android flagships. But of course, we’ll have to wait a few weeks for the first reviews to learn how these features fare in real life.