How come Android phones are better at everything, but cannot beat Apple with their cameras?
Android phones routinely beat Apple with their cameras. Who told you they couldn’t? And routinely lose, of course. Android covers a very, very wide swath of camera hardware. You can buy a brand new Android phone for under $75, and other Android phones for well over $2,000.
The iPhone 11 Pro is the first Apple phone that’s caught up to the leading Android phones. Well, sort of. But let’s look first at the hardware, because a big part of overall image quality does depend on a good hardware system.
The iPhone 11 Pro uses a 1/2.55″ primary sensor at 12 megapixels. Apple actually started using the 1/2.55″ sensor in the iPhone XS in 2018; on Android, sensors that size were pretty standard in 2014. Before the XS, Apple’s sensors were smaller, and thus less light-sensitive, than those of every flagship Android phone, even those from the major Chinese companies.
The “portrait” camera, basically a standard lens in 35mm terms, was first available in the iPhone 7 in 2016. There were lots of 2-camera phones prior to late 2016, but they were generally using the second camera for image-quality enhancements (B&W and colour shot together), depth mapping for bokeh effects, or an ultra-wide view. So give Apple props here.
However, since 2016, other companies have delivered actual telephoto cameras, not just a normal/portrait lens. These range from 3x to 5x relative to the main wide-angle lens. The iPhone 11 has Apple’s first ultra-wide-angle camera. LG introduced an ultra-wide front camera on the V10 in 2015, and an ultra-wide back camera in 2016 on the G5.
Missing from Apple’s phones in 2019 was the late-2018/early-2019 new feature, the time-of-flight sensor. Most companies were using 2-lens parallax to depth-map an image for bokeh. This actually worked pretty well when Huawei and Leica first did it in 2016, using a matched set of cameras (one B&W, one colour), but it’s not all that great between two different lenses. A time-of-flight camera instead measures, per pixel, the time it takes a laser pulse to reflect off the subject, which makes it very good for portrait depth mapping.
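The time-of-flight principle is simple enough to sketch: depth is half the laser’s round-trip time multiplied by the speed of light. Here’s a minimal, hypothetical Python illustration (real ToF sensors work with modulated-phase or calibrated pulsed readouts, not raw nanosecond grids):

```python
# Hypothetical sketch of time-of-flight depth mapping: each pixel records
# the round-trip time of a laser pulse; depth = (time * c) / 2.

C = 299_792_458.0  # speed of light, m/s

def tof_depth_map(round_trip_ns):
    """Convert a 2D grid of round-trip times (in nanoseconds) to metres."""
    return [[(t * 1e-9) * C / 2.0 for t in row] for row in round_trip_ns]
```

A 20 ns round trip, for instance, corresponds to a subject roughly 3 m away.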
Since 2018, many companies have been offering phones with yet-again larger sensors. Most of these incorporate a quad-Bayer colour matrix and 48, 64, or even 108 megapixels, allowing a variety of pixel-binning modes that let the user (or an AI) trade off top sharpness against much improved low-light performance.
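The binning trade-off itself is easy to illustrate. This is a toy Python sketch, ignoring the actual quad-Bayer colour filter layout that real pipelines must demosaic:

```python
# Toy 2x2 pixel binning: quarter the resolution, but each output pixel
# gathers the light of four sensor pixels, improving low-light SNR.

def bin_2x2(pixels):
    """Average each 2x2 block of an H x W grid (H, W even) into one value."""
    h, w = len(pixels), len(pixels[0])
    return [
        [(pixels[y][x] + pixels[y][x + 1]
          + pixels[y + 1][x] + pixels[y + 1][x + 1]) / 4.0
         for x in range(0, w, 2)]
        for y in range(0, h, 2)
    ]
```

So a 48-megapixel quad-Bayer sensor in binned mode delivers a 12-megapixel image with markedly less noise per pixel.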
Android phones sometimes offer speciality cameras. CAT sells a phone with a built-in FLIR heat camera. Moto has an add-on to the Moto Z series that gives you a 360-degree camera, and another that gives you a 10x optical zoom. A number of flagship phones this year are adding a macro camera.
The Software Quality
Apple has been pretty competitive with their software, and that has helped. After all, they were doing decently, often making the top ten in image quality, with a far inferior image sensor. Pretty much all phone sensor chips are a compromise, and it’s the software that really delivers top performance, though the hardware helps, too.
Apple was recommending their 2-shot HDR mode for a while, and made it the default in 2019’s camera app. That app actually takes it a bit further: while it’s running, it’s continuously taking photos, buffering the last eight and alternating between full exposure and underexposure. When you press the shutter icon, the phone takes one more shot in low light, none in bright light. It picks the best of the four exposure pairs in bright light and may use them all in low light. Others do similar things, but Apple’s is very good.
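The core bracketing idea can be sketched in a few lines of Python. This is a toy two-frame blend, not Apple’s actual Smart HDR pipeline (which aligns frames, merges per tile, and tone-maps with learned models):

```python
# Toy two-bracket HDR merge: where the normal exposure has clipped
# highlights, substitute the (gain-compensated) underexposed frame.

def merge_hdr(normal, under, gain=2.0, threshold=0.9):
    """normal, under: 2D grids of linear luminance in [0, 1];
    `gain` is the exposure ratio between the two brackets."""
    return [
        [u * gain if n >= threshold else n
         for n, u in zip(row_n, row_u)]
        for row_n, row_u in zip(normal, under)
    ]
```

The underexposed frame retains detail in regions the normal frame blew out, which is exactly what alternating brackets buys you.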
Some kind of advanced multi-shot night mode is another one. Apple implemented one in the iPhone 11, and it’s good, snapping nine shots and using an AI for colour rendition, etc. Google and Huawei first did similar night modes in 2018. In fact, Google published an article on their AI blog explaining everything they did.
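The statistical heart of these night modes is frame averaging: random sensor noise drops roughly with the square root of the number of frames. A bare-bones Python sketch (real implementations also align frames, reject motion, and tone-map):

```python
# Average N aligned frames; random shot noise falls roughly as 1/sqrt(N).

def average_frames(frames):
    """frames: list of equally-sized 2D pixel grids; returns their mean."""
    n = len(frames)
    h, w = len(frames[0]), len(frames[0][0])
    return [[sum(f[y][x] for f in frames) / n for x in range(w)]
            for y in range(h)]
```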
Google also implemented a computational telephoto in 2018, using a multi-frame technique called “drizzle” (borrowed from astrophotography) to boost resolution from multiple shots. Samsung and others are using this in 2020, but nothing from Apple yet.
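Drizzle’s core trick is that hand shake shifts each frame by a sub-pixel amount, so several low-resolution frames can populate a finer grid. A 1D Python toy with known shifts (real super-resolution pipelines estimate the shifts from the shake and weight samples by overlap):

```python
# 1D drizzle sketch: drop each low-res sample onto a finer grid at its
# shifted position, then normalise by how many samples landed in each cell.

def drizzle_1d(frames, shifts, scale=2):
    """frames: list of 1D sample lists; shifts: per-frame sub-pixel offset
    (in low-res pixels). Returns a grid `scale` times finer."""
    n = len(frames[0]) * scale
    acc, cnt = [0.0] * n, [0] * n
    for frame, shift in zip(frames, shifts):
        for i, v in enumerate(frame):
            j = round((i + shift) * scale)
            if 0 <= j < n:
                acc[j] += v
                cnt[j] += 1
    return [a / c if c else 0.0 for a, c in zip(acc, cnt)]
```

Two frames offset by half a pixel are enough, in this toy, to fill every cell of a grid twice as fine.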
Google also has a thing called a computational raw image. They can create a single file from the multiple images used in any computational-photography mode and store it all, so nothing is lost, versus JPEG or HEIC files, which toss out most of the information captured: information you may not notice missing if you don’t edit, but that you certainly will if you want to edit an image. iOS supports raw files, but not computational raw, and Apple doesn’t support raw in their own camera app, or on the ultra-wide camera.
The Ratings and Reviews
There are always reviews of camera phones, which attempt to apply a set of standard tests to determine the better cameras on the market. None of these is perfect, of course, but when the reviewers include very detailed test results, you can at least decide whether their ratings comport with your personal preferences.
As you can see, Apple’s base iPhone 11 is a match for the Huawei P20 Pro… but Huawei just released the P40 Pro. The iPhone 11 Pro Max and Samsung’s Galaxy S10+ are a match as well, but Samsung’s 2020 flagships are already out, just not yet reviewed by DxOMark. The Xiaomi Mi 10 Pro uses the same basic 108-megapixel main sensor as the Samsung Galaxy S20 Ultra, but Samsung’s using a different colour filter and, of course, different AI and image processing.
If you don’t like DxOMark, feel free to find other review materials you trust more. Apple does a good job, but every major smartphone company is working hard to build a better camera, and most of them are pushing hardware technology much harder than Apple. Thank you.