Is the camera on the iPhone 16 Pro different from the camera on the iPhone 15 Pro?

Almost as exciting to progressives as a new COVID-19 vaccine… Apple has announced the iPhone 16 (two cameras/lenses) and iPhone 16 Pro (three cameras/lenses).

For us photo nerds, plainly, the 16 Pro is the only device of interest. I can’t figure out what’s different, though. Here’s what Apple says:

With iPhone 16 Pro and iPhone 16 Pro Max, the world’s favorite camera gets even more powerful. Powered by A18 Pro, the upgraded camera system introduces a new 48MP Fusion camera with a faster, more efficient quad-pixel sensor and Apple Camera Interface, unlocking 4K120 fps video recording in Dolby Vision — the highest resolution and frame-rate combination ever available on iPhone, and a smartphone first. The quad-pixel sensor can read data 2x faster, enabling zero shutter lag for 48MP ProRAW or HEIF photos. A new 48MP Ultra Wide camera also features a quad-pixel sensor with autofocus, so users can take higher-resolution 48MP ProRAW and HEIF images when capturing uniquely framed, wider-angle shots or getting close to their subjects with macro photography. The powerful 5x Telephoto camera now comes on both iPhone 16 Pro and iPhone 16 Pro Max, allowing users to catch the action from farther away, no matter which model they choose. iPhone 16 Pro and iPhone 16 Pro Max now take spatial photos in addition to videos to help users relive memories with remarkable depth on Apple Vision Pro.

For still photography, it sounds as though the ultra-wide camera will yield higher-resolution results (but is the lens good enough for that to matter?).

There is some new camera software, which makes the phone work more like a legacy DSLR:

Camera Control — a result of thoughtful hardware and software integration — makes the pro camera system more versatile with an innovative new way to quickly launch the camera, take a photo, and start video recording. It has a tactile switch that powers the click experience, a high-precision force sensor that enables the light press gesture, and a capacitive sensor that allows for touch interactions. A new camera preview helps users frame the shot and adjust other control options — such as zoom, exposure, or depth of field — to compose a stunning photo or video by sliding their finger on the Camera Control. Later this fall, Camera Control will be updated with a two-stage shutter to automatically lock focus and exposure on a subject with a light press, letting users reframe the shot without losing focus. Additionally, developers will be able to bring Camera Control to third-party apps such as Kino, which will offer users the ability to adjust white balance and set focus points, including at various levels of depth in their scene.

But maybe this will also work with older iPhones?

The company claims that it will automatically generate blather suitable for emailing (“Built for Apple Intelligence”), but there is no evidence that it has tackled the “fill out a shopping/shipping form” challenge.

I guess I will buy one to replace my iPhone 14 Pro Max, whose camera recently failed and required a $219 replacement camera module at the Palm Beach Gardens, Florida Apple Store (a model of customer service, I have to admit!), if only to enter the glorious USB-C era that Android users entered 10 years ago and to lord it over Android users (“I have AI and you have nothing”).

What’s a good example of a recent photo that I couldn’t have taken without the camera phone? Here’s one from Costco that can be captioned “Starlink is everywhere”:

And here’s the Big Bang Bar pinball machine, one of about 200 made, at the Delray Beach Silverball Museum:

It’s unlikely I would have carried a serious camera into these situations, so here’s a shout-out to the engineers at Kyocera who pioneered the camera phone in May 1999 (eight years before Apple released the iPhone).

Related:

6 thoughts on “Is the camera on the iPhone 16 Pro different from the camera on the iPhone 15 Pro?”

  1. We need a review of mobile Starlink on a Honda minivan. The lion kingdom would be ecstatic if depth sensing were good enough to walk out of the National Gallery of Art with an STL file of Gloria Victis. Somehow I suspect depth sensing is still potato quality at 4x the price of 4 years ago.

  2. Apple says in their “what’s new in iOS 18” PDF that you can use Siri to fill forms. They specifically give the example of telling Siri to fill in a passport number, which Apple Intelligence [sic] can retrieve from a scanned image of your passport. We shall see.

    • This is an extremely easy task: getting a printed number from an image in a well-defined format, either with plain OCR or OCR plus ML. There are open-source libraries to do that. I worked on this a decade ago, and on cursive handwritten English, not printed text. Without ML it has been mainstream for 30 years. And PDF has a defined form-filling API too.
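    The “well-defined format” point is easy to make concrete. A passport’s machine-readable zone (MRZ, ICAO Doc 9303) puts the document number at a fixed position with its own check digit, so once OCR has read the characters, extraction and validation are pure string handling, no ML required. A minimal sketch (the function names are mine, not from any particular library):

    ```python
    def mrz_check_digit(field: str) -> int:
        """ICAO 9303 check digit: weights 7,3,1 repeating; '<' filler counts as 0,
        letters count as A=10 ... Z=35, digits as themselves; result is sum mod 10."""
        weights = (7, 3, 1)
        total = 0
        for i, ch in enumerate(field):
            if ch.isdigit():
                val = int(ch)
            elif ch.isalpha():
                val = ord(ch.upper()) - ord("A") + 10
            else:  # '<' filler
                val = 0
            total += val * weights[i % 3]
        return total % 10

    def passport_number(mrz_line2: str) -> str:
        """Characters 0-8 of MRZ line 2 are the document number; character 9 is
        its check digit, which catches most OCR misreads."""
        number, check = mrz_line2[:9], int(mrz_line2[9])
        if mrz_check_digit(number) != check:
            raise ValueError("check digit mismatch -- likely OCR error")
        return number.rstrip("<")

    # ICAO 9303 specimen MRZ second line:
    print(passport_number("L898902C36UTO7408122F1204159ZE184226B<<<<<10"))  # → L898902C3
    ```

    The check digit is why fixed-format documents are so much easier than free-form text: the format itself tells you where to look and whether you read it correctly.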

  3. Apple has a “compare iPhone models” page that gives a very detailed comparison. (Keep scrolling down to the second camera section.)

    I’m on the 15 Pro Max right now, and camera-wise the 16 Pro Max has an upgraded ultra-wide, a slightly upgraded main camera, and the same telephoto. I really don’t care about the quality of the ultra-wide. Everything else is a marginal improvement, so I’m going to stay on the 15. All the cool features, like satellite SMS and AI, they are going to release for the 15 Pro as well.

  4. Last year you could only get 5x optical zoom with the biggest (“Max”) 15 Pro, but now that’s available on both 16 Pro models, a win for people who want the best available camera but prefer a smaller form factor.

    I’m on the 15 Pro Max, probably not upgrading.

Comments are closed.