Apple says it every year, but the iPhone 13 cameras do seem much improved


    Cameras continue to be one of the biggest differentiators in smartphones, and Apple’s iPhone lineup is no different. Apple says the iPhone 13 and 13 Mini have the company’s “most advanced dual-camera system ever,” while the 13 Pro and Pro Max have “our three most powerful cameras ever.”

    Which you’d hope for, of course. But this year, Apple really does appear to be making a big push with its cameras, particularly with the Pro models. As ever, the question will be what Apple is able to wring out of its hardware with image processing and software.

    The iPhone 13 lineup represents the first time Apple has increased the primary camera’s sensor size across the board since the iPhone XS and XR in 2018, though last year’s 12 Pro Max had a 47 percent bigger sensor than the 12 and 12 Pro. Sensor size is a key factor in image quality because, together with lens aperture, it determines how much light the camera is able to capture. More light, less noise and blur.

    The iPhone 13 and 13 Mini’s main cameras have bigger sensors, which is part of the reason why the main and ultrawide cameras are now arranged diagonally in the camera bump. Apple also added sensor-shift optical image stabilization, a feature first seen on the 12 Pro Max. It’s not clear exactly how big the 13’s sensor is, but Apple says it’ll capture 47 percent more light than the 12’s.

    The 13 Pro and Pro Max have an even bigger primary sensor and a slightly faster f/1.5 lens that together capture 2.2 times as much light as before, according to Apple. Again, the exact sensor size hasn’t been advertised, but Apple did give the pixel size: it’s 1.9μm, which is bigger than in any modern smartphone I’m aware of. Apple can get away with this because the sensor is a relatively low-resolution 12 megapixels, but it’s still an impressive stat that should translate to better low-light performance. For comparison, the 12 Pro Max had 1.7μm pixels, while every other iPhone since the XS has had 1.4μm pixels.
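    As a rough sanity check on that 2.2x figure, light gathered per pixel scales with pixel area and inversely with the square of the f-number. The sketch below assumes Apple’s baseline is the iPhone 12 Pro with 1.4μm pixels and an f/1.6 lens (the f/1.6 figure and the comparison model are assumptions, since Apple only says “as before”):

    ```python
    # Back-of-envelope check of Apple's "2.2x more light" claim for the
    # 13 Pro's main camera. Assumes the baseline is the iPhone 12 Pro
    # (1.4 micron pixels, f/1.6) -- an assumption, not Apple's stated method.

    def relative_light(pixel_um_new, fnum_new, pixel_um_old, fnum_old):
        """Light gathered per pixel, new camera relative to old."""
        pixel_area_gain = (pixel_um_new / pixel_um_old) ** 2
        aperture_gain = (fnum_old / fnum_new) ** 2  # light ~ 1 / f-number^2
        return pixel_area_gain * aperture_gain

    gain = relative_light(1.9, 1.5, 1.4, 1.6)
    print(f"{gain:.2f}x")  # prints 2.10x
    ```

    That lands at roughly 2.1x, close enough to Apple’s claimed 2.2x to suggest the pixel-size and aperture changes account for most of the improvement.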

    It’s not clear exactly what hardware changes Apple has made to the iPhone 13’s ultrawide camera; the company simply says it has a “faster sensor” that “reveals more detail in the dark areas of your photos.” The Pro does have significant hardware tweaks, though, since Apple has widened the aperture to f/1.8 for a claimed 92-percent improvement in light-gathering capability. The sensor also now has focus pixels on board — things are rarely out of focus in ultrawide shots because the depth of field is so large, but adding autofocus means that the camera can be used for macro photography, with a focusing distance of 2cm.
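    It’s worth noting that the aperture change alone doesn’t account for the whole 92-percent claim. Assuming the previous ultrawide was f/2.4 (the iPhone 12 Pro’s spec), the quick calculation below suggests the wider lens contributes about 78 percent, with the “faster sensor” presumably making up the rest:

    ```python
    # How much of the Pro ultrawide's claimed 92 percent light-gathering
    # improvement comes from the aperture alone? Assumes the previous
    # ultrawide lens was f/2.4 (the iPhone 12 Pro's published spec).
    old_fnum, new_fnum = 2.4, 1.8
    aperture_gain = (old_fnum / new_fnum) ** 2  # light ~ 1 / f-number^2
    print(f"{(aperture_gain - 1) * 100:.0f}% from aperture alone")  # prints 78%
    ```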

    The telephoto camera remains exclusive to the 13 Pro phones, and Apple has increased its equivalent focal length to 77mm, or three times longer than the primary camera’s. Previously, the iPhone 12 Pro’s telephoto offered 2x zoom while the 12 Pro Max went out to 2.5x. There’s a tradeoff here: if you want to frame something at 2x, the 13 Pro will have to crop in from the main camera, reducing image quality. But shots at 3x and beyond will be much sharper than before, and the longer lens should make for a better portrait camera. Apple has also added Night mode to the telephoto for the first time.
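    The cropping tradeoff is easy to quantify: a 2x linear crop keeps only a quarter of the sensor’s pixels. The helper below is a simple illustration of that arithmetic, not anything Apple has published:

    ```python
    # Sketch of the zoom tradeoff: framing tighter than a lens's native
    # zoom means cropping, and pixel count falls with the crop factor squared.
    def cropped_megapixels(sensor_mp, target_zoom, lens_zoom=1.0):
        """Megapixels remaining after cropping lens_zoom footage to target_zoom."""
        linear_crop = target_zoom / lens_zoom
        return sensor_mp / linear_crop ** 2

    # 2x framing from the 12MP 1x main camera: only 3MP survive the crop.
    print(f"{cropped_megapixels(12, 2.0):.2f} MP")              # prints 3.00 MP
    # 4x framing from the 12MP 3x telephoto keeps far more detail.
    print(f"{cropped_megapixels(12, 4.0, lens_zoom=3.0):.2f} MP")  # prints 6.75 MP
    ```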

    When compared to the Android competition, Apple isn’t doing much to outgun them on the hardware front. The large 1.9μm pixels are noteworthy, but most Android phone makers have been prioritizing big, high-resolution sensors rather than pure pixel size. Xiaomi’s Mi 11 Ultra, for example, has a huge 50-megapixel sensor with 1.4μm pixels, meaning it has decent light-gathering ability even when shot at native resolution without binning the pixels together. And while the 3x telephoto lens is going to be useful, it’s now common to see 5x periscope telephotos (or occasionally even 10x) in the Android world.

    So even though Apple has made meaningful hardware improvements across the iPhone 13 lineup, its performance relative to competitors will, as ever, come down to how well its software and image processing pipeline has been optimized. The iPhone 11 was a hugely better camera than the XS the year before it, after all, even though the hardware barely changed. This year Apple is touting Smart HDR 4, which is capable of individually adjusting exposure for multiple people in the frame, but we’ll have to see the phones for ourselves to know what kind of a difference that makes. The same goes for Photographic Styles, a new filter-like feature that Apple says is smarter about adjusting elements like skin tones and skies in each photo.

    As for video, Apple is making a big deal out of its Cinematic mode, which lets you selectively adjust focus and depth of field after shooting, like Portrait mode for photos. That’s something we’ll definitely have to test extensively. The 13 Pro, meanwhile, lets you record and edit video in Apple’s ProRes codec on the phone itself, or you can export the ProRes file to Final Cut Pro on a Mac.

    All the usual caveats about waiting for full reviews most certainly still apply, but this is looking like a pretty good year for the iPhone camera. Apple is never going to have the flashiest hardware, but it’s made some welcome improvements in areas that make sense, and thankfully it hasn’t locked any features to the Max-sized iPhone. We’re looking forward to seeing the results — as well as those of looming competitors like the Pixel 6.

