
Leica vs iPhone: is Apple’s phone as good as the real thing?

Leica sure makes some lovely cameras, but they definitely cost a pretty penny. Can an iPhone match what a $9K Leica does?

Your phone can do a lot more than what the word “phone” originally implied. You can surf the web, answer emails, listen to music, use it as your recipe system, make movies, read books, pay for things, open car doors and your home, and just about anything else.

But one of the things we use most of all is the camera. Easily one of the leading reasons people buy new phones, the camera is often where most of the exciting technology goes.

Now more than ever, you can expect several lenses, plenty of megapixels, crisp clarity and beautiful colour, with all of this enough to rival a proper camera with a proper lens.

Recently, we’ve been getting quite familiar with a rather expensive dedicated camera, the Leica Q2 Monochrom, a $9K camera that delivers beautiful, crystal clear images thanks to its properly sharp lens. It’s easily one of the more impressive cameras we’ve seen in recent times, but it has also made us wonder: is it good enough to make us leave our regular workhorse, the iPhone?


Testing a Leica Q2 against an iPhone 12 Pro Max

It’s worth noting that for this test we actually used the iPhone 12 Pro Max, since it’s the phone we had on hand during the Q2 review.

A few weeks on, Apple now has an iPhone 13 series, which includes upgrades to the camera, with the iPhone 13 Pro and iPhone 13 Pro Max gaining a macro mode thanks in part to the ultra-wide camera picking up autofocus. There’s also a bigger camera bump there, so unsurprisingly, the iPhone 13 Pro and Pro Max are bigger and better again.

However, while the iPhone 12 Pro Max is no longer the model to compare against, a Leica vs iPhone comparison is still just as valid: this test was performed with the now “old” iPhone, and if that can hold its own, the newer models with their improved cameras should only fare better.

So how does it perform?

We used our default test model (thank you, four-year-old daughter) to capture some shots and see just how clear and sharp each camera could be. While there are differences, we suspect reviewers and photographers may well be the ones to pick them up, and not necessarily everyone else.

Let’s start with some flowers.

In the shots above, there’s a pretty clear image on each, and a nice depth of field for both, but one of these is from a real camera, and the other from a phone. Can you guess which is which?

Here’s a crop of each to give you a little more to work with, which in turn reveals the answer. With a little more detail, the left is the Leica, coming from a 47 megapixel camera, compared with the 12 megapixel iPhone 12 Pro Max.

One of the obvious indications is in the right shot, with a flower seemingly appearing from nowhere on the iPhone, thanks in part to our reliance on the portrait mode. However, the result is a great start, and may have you questioning what you’re seeing, especially when you start training the camera on people.

Next up we have our lovely daughter looking at a book in the garden, and both cameras are doing a fairly solid job.

With the iPhone 12 Pro Max, we’re capturing a portrait shot with Apple’s monochromatic profile using the wide-angle camera, something available on every iPhone, ranging from the iPhone XR to the iPhone 13 Pro Max. Even the iPhone SE should be able to take this shot.

So which is which?

A tighter crop on the image reveals which is which, thanks to how the depth of field is created.

Real depth of field comes from the lens aperture: open it up wide to let in more light and the depth of field becomes shallow, while closing it down to a smaller hole keeps more of the scene in focus. A real camera with a fast, low-light lens can pull off this trick because its glass elements and wide maximum aperture make it possible, but the iPhone’s tiny lens can’t, so it needs to emulate the effect.
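If you want to put rough numbers on that, the standard hyperfocal-distance approximation shows just how much the aperture changes things. Here’s a quick sketch in Python using the Q2’s 28mm f/1.7 lens as the example; the 0.030mm circle of confusion is a common assumption for full-frame sensors rather than a Leica-published figure, so treat the output as ballpark only.

```python
# Rough depth-of-field calculator using the standard hyperfocal approximation.
# The 28mm and f/1.7 figures match the Q2's lens; the 0.030mm circle of
# confusion is a common full-frame assumption, not a Leica-published number.

def depth_of_field(focal_mm, f_number, subject_m, coc_mm=0.030):
    """Return (near_limit_m, far_limit_m) of acceptable focus."""
    f = focal_mm
    s = subject_m * 1000.0                # work in millimetres
    h = f * f / (f_number * coc_mm) + f   # hyperfocal distance
    near = s * (h - f) / (h + s - 2 * f)
    far = s * (h - f) / (h - s) if s < h else float("inf")
    return near / 1000.0, far / 1000.0

# Wide open at f/1.7 versus stopped down to f/8, with a subject 1.5m away
for aperture in (1.7, 8.0):
    near, far = depth_of_field(28, aperture, 1.5)
    print(f"f/{aperture}: sharp from {near:.2f}m to {far:.2f}m")
```

With those assumptions, a subject 1.5 metres away sits in a zone of sharp focus only around 30cm deep at f/1.7, which is why the background melts away, while stopping down to f/8 stretches that zone to well over a metre.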

This emulation is what smartphone makers typically call a “Portrait Mode”, whereby the background of the shot is blurred and some software smarts do the maths to try and work out where the depth would actually occur.

Not every phone with a portrait mode does this the same way: less expensive phones capture a lower resolution image to blur while algorithms keep the subject in the foreground in focus, and Google’s software smarts in the Pixel phones can do the whole thing from one camera. Depending on the iPhone you have, Apple will do it either with the processing guts of the phone alone or with the help of an extra camera, but the result is typically the same regardless: an attempt at a soft-background portrait shot that works in much the same way as a real camera and real lens.
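To give a very rough sense of what that emulation involves, and this is purely an illustration of the idea rather than Apple’s or Google’s actual pipeline, the core trick is to get a depth estimate for every pixel, blur the whole frame, and then blend the sharp foreground back over the top. A minimal sketch in Python with OpenCV, where the file names, the depth convention and the threshold are all placeholders:

```python
import cv2
import numpy as np

# Placeholder inputs: a photo and a per-pixel depth map of the same size.
# On a real phone the depth comes from a second camera, a LiDAR/ToF sensor,
# or a machine-learning estimate; here it's simply loaded from disk.
image = cv2.imread("portrait.jpg").astype(np.float32)
depth = cv2.imread("depth.png", cv2.IMREAD_GRAYSCALE).astype(np.float32) / 255.0

# Blur the whole frame, then decide which pixels count as "close" to the camera.
blurred = cv2.GaussianBlur(image, (51, 51), 0)
foreground_threshold = 0.4  # arbitrary cut-off: smaller values = closer subject
mask = (depth < foreground_threshold).astype(np.float32)

# Soften the mask edge so hair and shoulders fade out rather than cut off hard;
# this edge is exactly where phone portrait modes tend to struggle.
mask = cv2.GaussianBlur(mask, (21, 21), 0)[..., None]

# Keep the foreground sharp, let everything else take the blurred version.
composite = image * mask + blurred * (1.0 - mask)
cv2.imwrite("fake_portrait.jpg", composite.astype(np.uint8))
```

The hard part is the mask edge: soften it too little and fine detail like hair gets clipped, soften it too much and the subject starts to bleed into the background, which is exactly where the next comparison gets interesting.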

On the left, we see Apple trying to get that depth of field right, only to miss some of the edges of hair rather than have them fade back into a softer focal plane. That’s an iPhone shot, with hair notoriously difficult for portrait emulation to nail. On the right, the Leica doesn’t need to simulate anything, and captures the focal plane the way a regular camera would.

The results, however, aren’t far from each other, and while a trained eye (or a photographer or reviewer) can spot the difference, we suspect few others would, and most would likely be happy with what they saw.


The above example offers more of that similarity in performance, with the left from the Leica and the right from the iPhone, but each delivers depth and clarity that makes it hard to tell which is from a real camera and which is from a phone.

But that is largely the point these days: with a camera as capable as the one found inside many phones, you mightn’t be able to tell the difference at all.
