Six months ago, I decided to replace my well-used iPhone 13 Pro Max with the iPhone 15 Pro Max. The main reason, if not the only one, was simple: I just wanted a better camera.
The ability to shoot at the full 48 megapixels instead of the boring 12 MP promised plenty of benefits – better zoom, at the very least, and generally more detail. Since I take many review photos with my iPhone, the camera has always been one of the most frequently used parts of my smartphone.
After six months, I’ve come to a sad conclusion. The camera system that Apple uses today – and, by all rumors, will keep using in the iPhone 16 – is conceptually flawed and poorly thought out.
It’s not just that the camera modules and photo-processing algorithms in iOS have been falling behind Android flagships for several years. And it’s not just that some photos, especially ones containing text, are better taken without digital zoom at all.
The real issue is how the company has spread out the focal lengths of the three lenses. It’s so illogical, it feels like we’re dealing with a budget Android smartphone, not a $1,500 device.
What’s Wrong with the iPhone 15 Pro Max’s Cameras
The iPhone 15 Pro Max has three camera modules. I’ll list each one and point out the problem immediately:
▪ 0.5x ultra-wide: 12 MP, f/2.2, 13 mm
▪ 1x main: 48 MP, f/1.8, 24 mm
▪ 5x telephoto: 12 MP, f/2.8, 120 mm
These millimeters refer to the focal length (quoted here, as is usual for smartphones, in 35 mm-equivalent terms), which determines the angle of view. The smaller the number, the wider the field of view captured in the photo; the larger the number, the closer distant objects appear. You can digitally zoom in on a shot by cropping it, but you can’t zoom out.
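For concreteness, here’s a quick back-of-the-envelope sketch – my own illustration, assuming the usual full-frame diagonal of roughly 43.3 mm, not anything Apple publishes – of how those equivalent focal lengths map to angle of view:

```python
import math

# Rough sketch with assumed numbers: diagonal angle of view for a
# 35 mm-equivalent focal length, using a full-frame diagonal of ~43.3 mm.
def angle_of_view(focal_mm: float, diagonal_mm: float = 43.3) -> float:
    return math.degrees(2 * math.atan(diagonal_mm / (2 * focal_mm)))

for f in (13, 24, 120):
    print(f"{f} mm ≈ {angle_of_view(f):.0f}° diagonal")
# 13 mm ≈ 118°, 24 mm ≈ 84°, 120 mm ≈ 20°
```

Everything between roughly 84° and 20° – the framing most everyday shots actually need – has no native lens and has to be faked by cropping.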
So, on the iPhone 15 Pro Max, there are three native, non-cropped focal lengths: 13, 24, and 120 mm. In practice, together with each camera’s lens configuration and focusing system, that translates into a minimum distance at which an object comes into focus of roughly 3 mm, 10 cm, and… 3 meters, respectively. You can see the problem, right?
So, here’s the question. How can you take a close-up, high-quality shot of an object, including a person, standing in front of you at the most common distance for such photos – about one to one and a half meters – with the iPhone 15 Pro Max?
The simple answer is: you can’t. Or rather, you can, but the result will be terrible – AI filters, color distortions, and degraded detail, because the main camera’s output has to be heavily cropped, leaving far less resolution and data than the advertised 48 megapixels suggest.
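To put a rough number on that crop – my own back-of-the-envelope math, using the ~77 mm portrait framing of the smaller Pro (mentioned below) as the target, not a measurement of Apple’s actual pipeline:

```python
# Rough sketch with assumed numbers: cropping the main camera to mimic a longer
# equivalent focal length keeps roughly (native_f / target_f)^2 of the pixels.
def cropped_megapixels(native_mp: float, native_f: float, target_f: float) -> float:
    return native_mp * (native_f / target_f) ** 2

# Framing a subject about 1.5 m away the way a ~77 mm portrait lens would,
# but cropping from the 48 MP, 24 mm main camera:
print(f"{cropped_megapixels(48, 24, 77):.1f} MP")  # ~4.7 MP survive the crop
```

Under that assumption, fewer than 5 of the 48 megapixels survive, which is why the cropped result looks nothing like a true 48 MP shot.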
You might think, “Who cares about this? It’s just some made-up problem for photography enthusiasts.” But it’s more widespread than you think.
The Problems Created by the iPhone 15 Pro Max Camera
You constantly have to back away from small objects to a ridiculous distance; otherwise they either look bad in the photo, or the images are filled with artifacts and excessive AI post-processing. iPhone 15 Pro Max owners, does this sound familiar?
Let’s say I want to take a picture of a cake. I have two options. Either the cake will be shot up close with the main camera, and the lens’s geometric distortion will stretch it into an oval, while the lens itself captures a bunch of unnecessary things around the edges of the frame. Then I’ll have to crop the shot, fix the distortion in a third-party editor, and deal with other headaches.
Or I step back three meters, after which the iPhone finally switches to the telephoto lens, because the focal length is now appropriate. Voilà! The image is free of cropping artifacts, there’s no unnecessary clutter, and the result isn’t embarrassing to edit and post somewhere other than social media, where people only ever see it on a 6-inch screen.
I constantly find myself fighting with the iPhone camera, stepping far back from objects just to get the phone to switch to the telephoto, so the shot comes out optically, geometrically, and aesthetically pleasing – at least by amateur standards.
Is this convenient? Of course not. In many situations you can’t physically move three meters away from the subject, and even if you can, it’s hard to keep the framing and composition you wanted. When you have to stand on a chair just to take a decent picture on an expensive smartphone with three cameras, that’s a failure, not an achievement.
Meanwhile, iPhone 15 Pro owners, who still have a telephoto lens with a 77 mm focal length, simply point their phone at the subject from a natural distance of one to one and a half meters, select the 3x mode in the Camera app, and press the shutter. No need to step back (with rare exceptions); everything looks decent and stays in focus, with no artifacts or AI nonsense.
The iPhone 16 Pro Will Also Have 5x Zoom, Get Ready
If you don’t own an iPhone 15 Pro Max, this pain is still unknown to you. But the iPhone 16 Pro will also have a telephoto lens with a 120 mm focal length, just like the iPhone 16 Pro Max. Even more people will have to perform acrobatics just to take a decent photo of objects, say, on a table.
Personally, I won’t be buying the new iPhone until Apple fixes the fact that objects at the most convenient shooting distance – one to one and a half meters – can only be photographed with a cropped main camera and awful AI algorithms. For the past three months, I’ve been taking photos for reviews on iPhones.ru (for example, here) far more often with my Android phone, which costs just 35,000 rubles – and has a much more thoughtfully chosen set of cameras. The results make me much happier than what the iPhone produces.
Taking product shots with an iPhone is probably not a popular use case. I’m sure most people won’t take my complaint seriously, but I still felt it was important to share. Apple’s flagship has suddenly stopped being my go-to workhorse for this task, even though it’s been fine for the last three years. The chance that I’ll buy a more advanced Android camera flagship instead of the new iPhone has never been higher.
But at least Apple Intelligence is coming soon! Now that sounds useful.