iPhone 11 Pro

I was excited about the iPhone 11 Pro because it offered a completely new camera system. The main difference compared to my previous device, the iPhone 8, is the two extra lenses: telephoto and ultra-wide. It sounded like an ideal package – I don’t always carry my main camera with me, and I end up shooting a lot of photos with my phone. Does the iPhone 11 Pro deliver on its promises? Is the iPhone 12 Pro meaningfully better?

[Photo: iPhone 11 Pro. By Alexander Grebenyuk, All Rights Reserved]

Ultra-Wide #

1.54mm (13 mm full frame equivalent)
1/3.6” (1.0µm pixels)

Let’s start with the ultra-wide lens because it’s probably the most fun to use. And it is a very wide-angle lens indeed, with a huge field of view: at 120 degrees, it’s close to the human eye’s 135-degree field of view. Some people mistakenly call this lens “fish-eye”. A fish-eye is a special type of ultra-wide-angle lens that produces massive curvilinear (barrel) distortion, which the iPhone camera doesn’t.

This lens allows for great pictures of big, open scenes, or of tight spaces where there is no room to step back and capture the full scene with a regular lens. In short, it lets you take photos that you simply couldn’t snap before. But it’s a bit of a hit or miss.

Ideal Lighting #

It works well in ideal lighting conditions. The photos are relatively sharp, even at the edges, and the colors are captured well.

[Photos: Ultra-Wide, Ideal Lighting. By Alexander Grebenyuk, All Rights Reserved]

Low Light #

It starts to struggle in low light. The shutter speeds are still acceptable for handheld shooting (1/60), but there is almost always a significant amount of blurring, even in the center of the frame. This partially has to do with the lack of optical image stabilization (OIS) on this lens. The photos don’t look great even on the iPhone screen. I wasted a lot of potential shots trying the ultra-wide and getting unsatisfactory results.

[Photos: Ultra-Wide, Low Light. By Alexander Grebenyuk, All Rights Reserved]

Downscale an image to 414 points (iPhone 11 Pro screen width) and it still doesn't look sharp.
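
Here is a minimal sketch of that downscaling step, in case you want to reproduce the check yourself (it assumes UIKit; the exact resampling the Photos app uses may differ):

```swift
import UIKit

// Scales a photo down to a given width in points. UIKit renders at the screen
// scale, so 414 points becomes 414 × scale pixels on device. This is an
// illustration, not necessarily how the Photos app resamples images.
func downscaled(_ image: UIImage, toWidth width: CGFloat = 414) -> UIImage {
    let size = CGSize(width: width,
                      height: width * image.size.height / image.size.width)
    return UIGraphicsImageRenderer(size: size).image { _ in
        image.draw(in: CGRect(origin: .zero, size: size))
    }
}
```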

In the dark, I would say it is barely usable. It’s still better than nothing, but always make sure you take at least a few shots with a regular wide lens.

[Photos: Ultra-Wide, Darkness. By Alexander Grebenyuk, All Rights Reserved]

The rules of thumb for this camera would be:

  • Don’t overuse it
  • Make sure to snap at least a few shots of a scene with a regular lens too
  • Avoid using it in any low-light situations
  • Be extra careful with composition when taking pictures of people

Telephoto #

6mm (51 mm full frame equivalent)
1/3.6” (1.0µm pixels)

A telephoto lens has two main use cases:

  • Taking portrait photos
  • Taking photos of subjects from farther away when you can’t get closer or when you want to compress composition

Portrait Mode #

The telephoto lens on the iPhone 11 Pro is great for taking portraits when the subject is close to the camera. It’s especially good with Portrait Mode. I used this combination to take some great photos, sometimes even in poor lighting. And it works not just for portraits.

But to be honest, I rarely use Portrait Mode because it’s often a hit or miss. The blur at the edges of the subject is always noticeable, and the fake “bokeh” doesn’t look great and often distracts from the subject. For example, look at the mess it turned the Christmas tree into. Having said that, you might not notice these things on an iPhone screen.

[Photos: Telephoto, Portrait Mode. By Alexander Grebenyuk, All Rights Reserved]

Subject Farther Away #

In photography, there is no strict definition of a telephoto lens, but a lens is generally considered telephoto if its focal length is at least 100mm. The “telephoto” lens in the iPhone is 6mm (51mm full frame equivalent). So the name can be a bit misleading: it won’t let you reach as far as a “proper” telephoto lens would. I would’ve preferred an equivalent of at least an 85mm lens, but that was probably impossible without a bigger camera bump (is it really a problem though?).
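
For context, the “full frame equivalent” numbers come from the crop factor, i.e. the ratio of sensor diagonals. A rough sketch of the arithmetic (the diagonals below are approximations, not official Apple specs):

```swift
import Foundation

// Crop factor = full-frame diagonal / sensor diagonal (approximate values).
let fullFrameDiagonal = 43.3  // mm
let sensorDiagonal = 5.0      // mm, roughly a 1/3.6"-type sensor
let cropFactor = fullFrameDiagonal / sensorDiagonal   // ≈ 8.7

let telephotoEquivalent = 6.0 * cropFactor    // ≈ 52mm, in line with the quoted 51mm
let ultraWideEquivalent = 1.54 * cropFactor   // ≈ 13mm
```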

Now, when it comes to taking photos of subjects that are farther away, I was rather disappointed. Almost none of the photos I took were good. There is a loss of contrast and detail, even in ideal lighting. I stopped using this lens after a while. It’s no surprise Apple doesn’t include it on the base iPhones – this camera is just not that good. And you don’t need it for Portrait Mode either; you can use the wide lens instead.

[Photos: Telephoto, Ideal Lighting. By Alexander Grebenyuk, All Rights Reserved]

Wide #

4.25mm (26mm full frame equivalent)
1/2.55” (1.4µm pixels)

The wide lens is the workhorse of the iPhone camera system. It takes great photos in ideal lighting conditions. They often look good even when viewed on a larger display.

[Photos: Wide, Ideal Lighting. By Alexander Grebenyuk, All Rights Reserved]

It performs well in low light.

[Photos: Wide, Low Light. By Alexander Grebenyuk, All Rights Reserved]

It struggles in the dark, but it is usable. The iPhone comfortably goes up to ISO 800. It’s a bit of a hit or miss, but sometimes it produces really great results, like the last photo in this section.

[Photos: Wide, Darkness. By Alexander Grebenyuk, All Rights Reserved]

Technical Details #

There isn’t a massive difference between the iPhone X (or iPhone 8) and iPhone 11 Pro wide cameras. I say “cameras” and not “lenses” for a reason: the lenses don’t share sensors, stabilization, or much of anything else. So it’s really three different cameras in one phone.

                 iPhone X: Wide         iPhone 11 Pro: Wide      iPhone 11 Pro: Telephoto   iPhone 11 Pro: Ultra-Wide

Sensor Size      1/3" (1.22µm pixels)   1/2.55" (1.4µm pixels)   1/3.6" (1.0µm pixels)      1/3.6" (1.0µm pixels)
Focal Length     3.99mm (28mm FF)       4.25mm (26mm FF)         6mm (51mm FF)              1.54mm (13mm FF)
ISO              22 - 2112              32 - 3072                21 - 2016                  21 - 2016
Resolution       4032 x 3024            4032 x 3024              4032 x 3024                4032 x 3024
Aperture         ƒ/1.8                  ƒ/1.8                    ƒ/2.0                      ƒ/2.4
OIS              Yes                    Yes                      Yes                        –

On the iPhone 11 Pro, the wide camera has the best sensor by far: 1.4µm pixels vs 1.0µm pixels on the other cameras (a 1.4µm pixel has roughly (1.4/1.0)² ≈ 2× the light-gathering area of a 1.0µm one). And I think this is the main problem with this camera system. There is a direct correlation between pixel size and the detail and contrast of the images. This partially explains my poor experience with the telephoto and ultra-wide cameras (the telephoto’s longer focal length and the slower apertures also play a role). Having said that, this is clearly a trade-off: you can’t simply put the best sensor in all cameras. It’s still a bit of a shame and feels like a waste compared to interchangeable-lens cameras, which only need one sensor.

It looks like Apple didn’t think these additional lenses were as important and put inferior sensors behind them. How inferior? The iPhone 6S, the first device in which Apple used a 12MP sensor, had a 1.22µm pixel size. I think it’s safe to say that the telephoto and ultra-wide cameras on the iPhone 11 Pro have the worst 12MP sensors Apple has ever put in an iPhone.

Speaking of resolution, when Apple moved from 8MP to 12MP in the iPhone 6S, there was a noticeable reduction in pixel size (around 23%): the iPhone 6 had 1.5µm pixels, the iPhone 6S 1.22µm. The iPhone 11 Pro still hasn’t beaten the iPhone 6 in terms of pixel size. And a resolution of 2688x1520 is generally enough for viewing photos, even on bigger screens.
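
A quick back-of-the-envelope check of that reduction (both phones use 1/3" sensors, so for a fixed sensor area the pixel pitch scales with 1/√(pixel count)):

```swift
import Foundation

// For a fixed sensor area, pixel pitch scales with 1/sqrt(pixel count).
let iPhone6Pitch = 1.5                                         // µm at 8 MP
let iPhone6SPitch = iPhone6Pitch * (8.0 / 12.0).squareRoot()   // ≈ 1.22 µm at 12 MP
let ratio = iPhone6Pitch / iPhone6SPitch                       // ≈ 1.22, i.e. the 8 MP pixels are roughly 23% larger
```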

Tech websites love to compare smartphone cameras: Apple, Samsung, Google, etc. It’s a bit funny because they pretty much all use camera systems made by Sony. It’s clear which phones use better sensors. The iPhone 11 Pro uses a custom Sony sensor, “Sony Exmor IMX356-Inspired”. It is CMOS BSI (I assume it’s also stacked), and it has Phase Detection Auto Focus (PDAF). We also know that there is room to grow: there are phone cameras on the market with significantly larger pixels, e.g. up to 2.4µm with binning (look at this absolute unit of a camera).

iPhone 12 Pro #

Now, if you compare the iPhone 11 Pro with the iPhone 12 Pro, the cameras are virtually identical. Unfortunately, the main camera improvements this year were reserved for the iPhone 12 Pro Max. The improvements below are marginal, but they should help a little, especially with photos in low light. And the sensors in the ultra-wide and telephoto cameras are still the same lackluster sensors from the iPhone 11.

  • ƒ/1.8 -> ƒ/1.6 on wide lens
  • 1.4µm -> 1.7µm pixels (only on Max)
  • Max ISO 3072 -> 5,808 and 7,616 (on Max)
  • LiDAR makes Night mode portraits possible

Source: Apple

At the end of the day, cameras are still bound by optics. With the current sensor technology, there isn’t much room for growth in terms of raw optical power. I will go as far as to claim that there have been no significant improvements in raw optics performance in iPhones since the iPhone 5S (2013).

                iPhone 5S             iPhone 11 Pro: Wide

Sensor Size     1/3" (1.5µm pixels)   1/2.55" (1.4µm pixels)
Technology      CMOS BSI              CMOS BSI
ISO             32 - 3200             32 - 3072             
Resolution      8 MP (3264 × 2448)    12 MP (4032 x 3024)
Aperture        ƒ/2.2                 ƒ/1.8 (+0.66 stops)                
Stabilization   AIS*                  OIS

AIS (Auto Image Stabilization) is a software solution that takes four photos with shorter exposures, selects the sharpest areas of each one, and then combines them into a single, final photo. Sounds a bit like Deep Fusion, doesn’t it?

The change from ƒ/2.2 to ƒ/1.8 gives you only about +0.66 exposure stops, which is insignificant. The only major difference is resolution. To quote Apple: “Any digital camera is only as good as its sensor. While more pixels produce a bigger picture, we prefer bigger pixels because it means a better picture”.
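
The stop math, for reference: light gathered scales with 1/ƒ², so the gain in stops is 2·log₂ of the f-number ratio. The marked f-numbers are rounded values on the standard third-stop scale, which is where the roughly two-thirds-of-a-stop figure comes from.

```swift
import Foundation

/// Exposure difference, in stops, between two apertures.
/// Light gathered scales with 1/N², so the gain is 2 * log2(N1 / N2).
func stops(from n1: Double, to n2: Double) -> Double {
    2 * log2(n1 / n2)
}

let gain5Sto11Pro = stops(from: 2.2, to: 1.8)  // ≈ 0.58 exact, ~2/3 stop on the nominal scale
let gain11to12Pro = stops(from: 1.8, to: 1.6)  // ≈ 0.34: the iPhone 12 Pro wide-lens change
```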

Apple needs to keep the iPhone photography hype going. So their new answer is software.

Software #

Apple’s answer is “computational photography”. But what is it really?

HDR #

Apple originally introduced HDR in iOS 4.1 (2010). The idea was clear and by no means new: the camera takes several photos in rapid succession at different exposures and blends them to bring out more highlight and shadow detail. Originally the feature was opt-in – HDR photos would often have unappealing, unrealistic-looking colors. HDR is only useful in situations where the camera doesn’t have enough dynamic range to capture a scene. Starting with the iPhone 8 Plus, Apple enabled HDR by default – the device turns it on automatically when it sees fit.

There are clear benefits to HDR, and there are also known problems with it. There is a trade-off: to take multiple photos you need shorter exposures, which means higher ISOs, which means more noise. In the iPhone XS, Apple made some wrong trade-offs with HDR: it applied it too aggressively and performed too much noise reduction, leading to a loss of detail. With the iPhone 11 Pro, Apple is so confident in HDR that you can no longer even find HDR photos using Smart Albums in Apple Photos – these photos are no longer marked as “HDR”. HDR on iPhones is great. It does its job, and it does it in a subtle way.
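
To make “bracketing” concrete, here is a minimal sketch using the public AVFoundation API – not Apple’s actual Smart HDR pipeline, just the same basic idea of capturing the scene at several exposure biases so the frames can be merged later:

```swift
import AVFoundation

// Captures the same scene at -2 EV, 0 EV, and +2 EV. Assumes a configured
// capture session with `output` attached and a delegate to receive the frames.
func captureExposureBracket(with output: AVCapturePhotoOutput,
                            delegate: AVCapturePhotoCaptureDelegate) {
    let biases: [Float] = [-2, 0, 2]  // exposure target bias, in stops
    let bracket = biases.map {
        AVCaptureAutoExposureBracketedStillImageSettings.autoExposureSettings(exposureTargetBias: $0)
    }
    let settings = AVCapturePhotoBracketSettings(
        rawPixelFormatType: 0,  // 0 = no RAW, processed output only
        processedFormat: [AVVideoCodecKey: AVVideoCodecType.hevc],
        bracketedSettings: bracket
    )
    output.capturePhoto(with: settings, delegate: delegate)
}
```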

Deep Fusion #

Apple introduced Deep Fusion, or as everyone knows it, “sweater mode”, in 2019. Based on Phil Schiller’s explanation, the iPhone takes nine images: four short frames, four secondary frames, and one long exposure. So… bracketing. Then the Neural Engine combines them to somehow optimize for detail and noise. Sounds like HDR on steroids, doesn’t it? But you can already see where this is going. Even more images mean even faster shutter speeds and higher ISO, which means more noise. I get the idea with HDR, where the different exposures let the camera fill in information that would otherwise fall off the histogram. But how can nine poor images – low signal, high noise – result in one good, detailed image? In reality, it sounds a lot like auto image stabilization (AIS) in the iPhone 5S (2013). It’s hard to tell whether it works at all.

Source: Apple

Portrait Mode #

Portrait Mode, introduced with the iPhone 7 Plus, has become part of the standard iPhone offering. It is an attempt to fake the “pro camera look” with software. Humans don’t like to be tricked, and you can see that these photos don’t look quite right: the fake “bokeh” is noisy and weird, and it often distracts from the subject, achieving the opposite of the intended effect. The edges of the subject are often blurred. I barely see any pictures taken with Portrait Mode shared on social media. This feature is still in the “HDR in iOS 4.1” state: it is opt-in, and you can even turn it off after the fact. That’s useful, because it has an equal chance of ruining your photo as of “improving” it.

Photo Segmentation #

Apple uses machine learning to identify certain features in a photo, such as human hair or teeth, and applies post-processing targeted at those features. It’s not entirely clear what exactly it does, so I can’t comment on it. The idea is great, though, because Apple Photos has practically no way to apply adjustments to only a specific part of the image, which makes it worthless for any sufficiently advanced image processing.
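
Apple doesn’t document the capture-time processing, but it does expose related masks through AVFoundation as “semantic segmentation mattes” (skin, hair, teeth). A minimal sketch of opting in and reading one back, assuming a configured AVCapturePhotoOutput and the usual capture delegate:

```swift
import AVFoundation

// Opt in to whatever segmentation mattes the device supports (.skin, .hair, .teeth).
func makeSettings(for output: AVCapturePhotoOutput) -> AVCapturePhotoSettings {
    output.enabledSemanticSegmentationMatteTypes = output.availableSemanticSegmentationMatteTypes
    let settings = AVCapturePhotoSettings()
    settings.enabledSemanticSegmentationMatteTypes = output.enabledSemanticSegmentationMatteTypes
    return settings
}

// Later, in photoOutput(_:didFinishProcessingPhoto:error:), pull out a matte:
func hairMatte(from photo: AVCapturePhoto) -> CVPixelBuffer? {
    photo.semanticSegmentationMatte(for: .hair)?.mattingImage
}
```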

Night Mode #

Night Mode is an iPhone feature that I think is super cool. A Night Mode photo might take around 1 to 5 seconds to capture, but it’s not the same as simply taking a long exposure with a 5s shutter speed. Firstly, that would’ve been impossible to do handheld. And secondly, the iPhone 11 Pro cameras aren’t even capable of exposures longer than 1s. Instead, the iPhone uses “adaptive bracketing”, a technique that captures the same scene multiple times with different settings. Some frames use short exposures to freeze motion, while others are longer to pull detail out of the shadows. The iPhone then combines these images, and it actually works great. The photo I shared earlier was shot using Night Mode: it took about 2s to capture, yet the EXIF says it used a 1/8s shutter speed.
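
You can check this yourself. Here is a minimal sketch that reads the exposure time and ISO recorded in a photo’s EXIF metadata via ImageIO (pass the URL of whichever image you want to inspect):

```swift
import Foundation
import ImageIO

// Reads the exposure time (in seconds) and ISO from a photo's EXIF metadata.
func exposureInfo(for url: URL) -> (exposureTime: Double, iso: Int)? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let props = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any],
          let exif = props[kCGImagePropertyExifDictionary] as? [CFString: Any],
          let exposure = exif[kCGImagePropertyExifExposureTime] as? Double,
          let iso = (exif[kCGImagePropertyExifISOSpeedRatings] as? [Int])?.first
    else { return nil }
    return (exposure, iso)  // e.g. (0.125, 800) would read as "1/8s at ISO 800"
}
```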

Overview #

Is “computational photography” revolutionizing smartphone cameras? I can’t say it is. Some of the features are cool, such as Night Mode. But Apple isn’t uniquely positioned in this market. In fact, Night Sight – an equivalent of Night Mode – appeared on Pixel 3 way before it did on iPhones.

Is the iPhone the only camera that uses software? The question is absurd. For example, Sony is known for its amazing AI-powered autofocus systems. Focus used to be a major hurdle in pro photography: it needs to be spot on when you shoot with a shallow depth of field. Thanks to AI, Sony cameras can focus perfectly on and keep track of human (or even animal) faces and eyes. They can even keep tracking a subject when it turns away and the eyes are no longer visible.

Source: Sony

Does the iPhone use more software, and in more ways, than dedicated cameras? Maybe, but in most cases it does so to compensate for the lack of evolution in optics and sensor technology. Portrait Mode, HDR, Deep Fusion, Night Mode – all these headline iPhone camera features are not needed on good cameras.

Final Thoughts #

[Photo: iPhone 11 Pro. By Alexander Grebenyuk, All Rights Reserved]

I was initially extremely excited about the new iPhone 11 Pro camera system. It seemed like an ideal package. In the end, it probably didn’t change the way I take photos as much as I expected. To summarize my experience:

  • The ultra-wide lens is only usable in ideal lighting conditions. Of the roughly 500 photos I took with it, I only like one.
  • The telephoto lens is not great even in ideal lighting conditions. It lacks contrast and detail. I prefer shooting with the wide lens.
  • The regular wide lens offers slightly better performance in low light. In good lighting conditions, I didn’t notice any significant difference compared to my iPhone 8 camera.

Should you upgrade from an older model? I take a lot of photos with my phone and welcome any camera improvements. And it’s not just a better camera that you are getting. Having said that, I don’t think you’ll see a massive improvement in the wide camera even if you are upgrading from an older model. You could post a photo taken with an iPhone 5S on Instagram today and nobody would notice the difference.

Should you buy a “Pro” because of the telephoto lens? I would say no, you won’t miss it.

To be honest, I wish Apple would embrace the fact that the iPhone is the most popular camera and make the appropriate design changes. I wouldn’t mind more camera in my phone.