iPhone 13 and 13 Pro camera upgrades tested

CNN —

As smartphones continue to evolve, a particular area always comes into focus — and that’s the camera. Apple’s new iPhone 13 Mini, 13, 13 Pro and 13 Pro Max didn’t change much with the design or features, but they did push forward in the imaging space.

It wasn’t a radical switch in the camera hardware, nor the addition of a new lens or set of lenses. So where do the improvements stem from?

The big change comes from all-new sensors that capture more light, bringing out detail and improving overall image quality. That’s coupled with updated hardware that sticks with what has been working, plus a number of improvements on the software side. The combination sets the iPhone 13 apart from competitors: it can capture a scene more accurately than other devices on the market. Apple’s hardware lets more light in and captures more detail, while the software intelligently interprets the scene.

You can read our full reviews of the phones — including the iPhone 13, which we’ve named the best smartphone overall — but now it’s time to take a closer look at the camera.

Jacob Krol/CNN

When it came to testing the iPhone 13 and 13 Pro, we took photos of ourselves — selfies with good or bad hair — along with our families, friends, pets, landscapes, plants and random things throughout our day. The goal was to test in the same way we use our phones daily.

The entire iPhone 13 line now features the largest sensors ever in an iPhone. Apple promised improved low-light performance with this camera, and in our testing we saw just that on both the 13 and the 13 Pro. Compared to last year’s iPhone 12, the 13 was able to take a clearer image with less noise introduced into the shot. Normally the culprit for that noise is the camera not capturing enough information, a limitation of the hardware stack. The larger sensor here (paired with a wider aperture on the 13 Pro) allows more light to be captured.

With a nighttime shot at a pumpkin patch with minimal lighting, the iPhone opted to shoot in Night Mode. With this, the device takes a series of shots at varying exposures with AI-driven processing on top. That’s combined with the standard Smart HDR processing to properly pick apart a scene and render accurate colors and proper lighting for its different parts. It’s Apple’s take on computational photography, which essentially uses a whole lot of information to craft a single image. And it gets better at identifying elements in a shot year over year; if you look back to an iPhone 11 or older, you’ll see some large-scale improvements. The result is a really nice image, as you can see in the embed above, with accurate colors for the ground and plants.
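
To make that idea concrete, here’s a minimal sketch of exposure bracketing and merging, the general technique behind night modes: capture several frames at different exposures and blend them, favoring well-exposed pixels. The weighting and tone mapping below are invented for illustration; this is not Apple’s Night Mode or Smart HDR pipeline.

```python
# Illustrative exposure-bracket merge; weighting and tone mapping are assumptions.
import numpy as np

def merge_bracketed(frames, exposure_times):
    """frames: list of HxWx3 float arrays in [0, 1]; exposure_times: seconds."""
    acc = np.zeros_like(frames[0])
    weight_sum = np.zeros(frames[0].shape[:2] + (1,))
    for frame, t in zip(frames, exposure_times):
        # Weight pixels near mid-gray highest; near-black and blown-out
        # pixels carry little usable information.
        luma = frame.mean(axis=2, keepdims=True)
        weight = 1.0 - np.abs(luma - 0.5) * 2.0
        # Normalize each frame to a common brightness scale by its exposure
        # time before blending, so short and long exposures agree.
        acc += weight * (frame / t)
        weight_sum += weight
    radiance = acc / np.maximum(weight_sum, 1e-6)
    # Simple tone mapping back into a displayable 0..1 range.
    return radiance / (1.0 + radiance)

# Example: three synthetic frames of a dim scene shot at 1/30s, 1/8s and 1/2s.
scene = np.random.default_rng(0).random((4, 4, 3)) * 0.2
times = [1 / 30, 1 / 8, 1 / 2]
frames = [np.clip(scene * t * 30, 0.0, 1.0) for t in times]
print(merge_bracketed(frames, times).shape)
```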

Jacob Krol/CNN

“Long before you even hit the shutter, you just bring the camera up, we’re looking at auto exposure, white balance, auto focus to make sure that we’re getting all of the right information, raw information captured,” Jon McCormack, VP of Camera Software Engineering at Apple, told us. Essentially, this is Apple’s version of an auto shooting mode: the cameras work in conjunction with the processor and software to identify the right settings for your shot. For example, on a dark night it might slow the shutter speed to let more light into the shot, or it might adjust focus if there’s a lot of movement in the scene.
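
As a rough illustration of that kind of decision-making, here’s a hedged sketch of an auto mode trading shutter speed against ISO based on scene brightness and motion. The thresholds and settings are invented for the example; Apple hasn’t published its actual logic.

```python
# Hypothetical auto-exposure decision; thresholds and settings are made up.
def choose_exposure(scene_brightness: float, motion_level: float) -> dict:
    """Both inputs are normalized to the 0..1 range."""
    if scene_brightness < 0.2 and motion_level < 0.3:
        # Dark, mostly static scene: slow the shutter to gather more light.
        return {"shutter_s": 1 / 4, "iso": 800}
    if scene_brightness < 0.2:
        # Dark scene with a moving subject: keep the shutter fast, raise ISO.
        return {"shutter_s": 1 / 60, "iso": 3200}
    if motion_level > 0.6:
        # Bright scene with lots of movement: freeze the action.
        return {"shutter_s": 1 / 500, "iso": 100}
    # Typical daylight scene.
    return {"shutter_s": 1 / 125, "iso": 100}

print(choose_exposure(0.1, 0.1))  # dark pumpkin patch, steady hands
print(choose_exposure(0.8, 0.7))  # kids running around in daylight
```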

The iPhone’s advanced software smarts allow it to switch between modes automatically — like Macro on the iPhone 13 Pro kicking in as you get closer to an object, or Night Mode engaging on its own, a behavior that dates back to the iPhone 11.

As one might expect, a lot of our testing focused on the performance of the iPhone 13 and 13 Pro cameras. We quickly centered on real-world use — as we noted above — with selfies, photos of our pets, shots out and about at pandemic-safe events, and more demanding scenes with multiple light sources. Since both phones feature a larger sensor, we wanted to dive deep on how that impacted images and video. Was there a noticeable difference? Did shots offer more detail and improved lighting?

And in these side-by-side examples, the 13 did offer crisper images than the 12 — in some cases noticeably so, even to an untrained eye. Ultimately, it’s a step up that improves image quality without changing how you take a shot. The 13 Pro and Pro Max showed more direct improvements over their predecessors: more detail even when zooming in, dramatically better photos in low-light or downright dark conditions, improvements to core modes and more natural-looking portraits.

“It’s much more than that [low-light performance] because the larger pixels allow the sensor to capture more rich detail and reduce noise,” says Graham Townsend, VP of Camera Hardware Engineering for Apple. On the iPhone 13, the larger sensors are packed into a physically bigger camera module with a wider aperture to let light in, which leaves more receptors to grab that light and translate it into a usable image. The larger sensor was a key part of designing the device, and one that Townsend’s team worked to integrate.

The iPhone 13 gains more than 50% in light-gathering capability and adds stabilization (which we’ll unpack below) compared to the 12, while the 13 Pro gets a 2.2x improvement year over year.

All of this gives the software more information to adjust the image properly, and it explains the more accurate colors and lighting we encountered in our tests.

Previous iPhones weren’t slouches, specifically the 11 and the 12, but properly reading a scene or letting in enough light could be a struggle. Take this shot of a bright sun pushing through the clouds with several rays emerging: a scene like that could skew the colors of the orchard and the objects in it (trees, blades of grass, apples and pumpkins). The iPhone 13 Pro was able to tackle the image without overexposing or blowing out highlights, while still presenting colors accurately.

The other big bonus with the main camera across the 13 Mini, 13, 13 Pro and 13 Pro Max is image stabilization, which proved to be a big addition. Our hands shake and we tend to move when capturing a shot, and those little movements can have big impacts on the photos we take. Townsend’s team opted to bring sensor-shift optical image stabilization, which first premiered on the 12 Pro Max, to the main camera across the lineup. Essentially, the shot can stay still for a longer period of time: the sensor moves in the opposite direction of the phone’s movement to keep the framing steady.

Townsend noted that “every exposure becomes shorter, which reduces generically subject motion blur for both stills and video.” Considering the iPhone 12 didn’t feature this, freehand shots look much better side by side. And those upgrading from an iPhone 8, X or even an 11 will see a big difference here. There’s no blur on particularly rough shots, and paired with the larger sensor this produces a detail-filled image. It’s especially evident against older models and competing phones with smaller sensors and no stabilization.
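
As a toy illustration of the concept, here’s a sketch that reads per-frame shake measurements and shifts the sensor the opposite way so the projected image stays put during the exposure. The gyro samples and gain are made up; this is not Apple’s sensor-shift implementation.

```python
# Toy optical-stabilization compensation; samples and gain are illustrative.
def stabilization_offsets(gyro_samples, gain=1.0):
    """gyro_samples: list of (dx, dy) shake measurements per time step.
    Returns the compensating sensor shift to apply at each step."""
    offsets = []
    for dx, dy in gyro_samples:
        # Move the sensor the opposite way the phone moved.
        offsets.append((-gain * dx, -gain * dy))
    return offsets

hand_shake = [(0.02, -0.01), (0.05, 0.00), (-0.03, 0.04)]  # arbitrary units
print(stabilization_offsets(hand_shake))
```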

Jacob Krol/CNN

The iPhone 13 Pro and 13 Pro Max specifically — which feature the best camera system we’ve tested on a phone — add an ambient light sensor that lets the device dial in the exposure for a scene more quickly and identify the proper capture settings for a photo or video. This ties back to the iPhone calculating the scene and the best shooting conditions from the moment you open the camera app, and it gets that data to the software faster.

The other thing to consider is the hardware included on each iPhone. The 13 features dual lenses, while the 13 Pro gets three lenses and a LiDAR sensor. When a lens isn’t being used for shooting, it can help gather data. For instance, if you’re shooting a portrait with the main lens on the iPhone 13, the ultrawide can work at the same time to calculate depth and create the effect.

Townsend summed it up as “we have the hybrid autofocus system where we switch in different methods of auto focus or estimating depth at different points and, we provide these different sources of information” directly to the software. This way the main lens can work to get the subject in focus, while the ultrawide grabs measurements and depth information to deliver the best shot. On an iPhone 13 Pro, the phone can use the two additional lenses along with the LiDAR sensor to calculate all of this.
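
To illustrate how several depth sources can be blended, here is a rough sketch that fuses a stereo-style estimate with a LiDAR-style reading, weighting each by a confidence value. The numbers, confidence maps and the fuse_depth helper are assumptions for the example; the interview only says the phone combines these signals.

```python
# Hypothetical confidence-weighted fusion of depth estimates.
import numpy as np

def fuse_depth(estimates):
    """estimates: list of (depth_map, confidence_map) pairs with matching shapes."""
    weighted = np.zeros_like(estimates[0][0])
    total_conf = np.zeros_like(estimates[0][0])
    for depth, confidence in estimates:
        weighted += depth * confidence
        total_conf += confidence
    return weighted / np.maximum(total_conf, 1e-6)

stereo_depth = np.array([[2.0, 2.1], [5.0, 5.2]])  # meters, from lens disparity
stereo_conf = np.array([[0.9, 0.8], [0.3, 0.2]])   # weaker on distant, flat areas
lidar_depth = np.array([[2.2, 2.2], [4.8, 4.9]])   # meters, from a LiDAR-style reading
lidar_conf = np.array([[0.5, 0.5], [0.9, 0.9]])

print(fuse_depth([(stereo_depth, stereo_conf), (lidar_depth, lidar_conf)]))
```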

Jacob Krol/CNN

Apple’s two new camera software features are Photographic Styles and Cinematic Mode. The former is a filter of sorts that lets you apply a look to the scene before you hit the shutter button. It’s also the first time Apple is letting users really customize how the iPhone shoots.

McCormack tied it back to film photography, in which you choose a different film stock depending on what you’re shooting. When a typical filter is applied, it affects the entire shot and doesn’t account for the individual objects in it. A classic example is a person in front of a sunset or a landscape: when you raise the saturation, it impacts the entire image, person included. The iPhone is already pulling the image apart to recognize its elements — the sky, ground, clouds and the people in the shot — and applying the filter appropriately to each.

To put it succinctly, Photographic Styles applies the filter selectively, since the iPhone knows the parts of the image. This allows it to avoid warping the background with a weird effect and to not make “the person sort of looks like they just spent way too long in the sun,” noted McCormack.
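
A simplified sketch of what applying a style selectively can look like: boost saturation everywhere except where a segmentation mask marks a person. The hand-made mask and boost value are stand-ins, not Apple’s Photographic Styles processing or its actual segmentation.

```python
# Illustrative selective saturation boost guided by a person mask.
import numpy as np

def apply_style(image, person_mask, saturation_boost=1.4):
    """image: HxWx3 float RGB in [0, 1]; person_mask: HxW bool array."""
    gray = image.mean(axis=2, keepdims=True)
    # Push pixels away from gray to raise saturation, then clamp to range.
    boosted = np.clip(gray + (image - gray) * saturation_boost, 0.0, 1.0)
    mask = person_mask[..., None]  # broadcast the mask over the color channels
    # Keep person pixels untouched; style everything else.
    return np.where(mask, image, boosted)

image = np.random.default_rng(1).random((2, 2, 3))
person_mask = np.array([[True, False], [False, False]])  # top-left pixel is a face
print(apply_style(image, person_mask))
```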

The other big upgrade is a Portrait-like mode for video known as Cinematic Mode, which intelligently applies focus in a video. If you have a clip with two or three people or points of interest, it keeps the main one in focus while blurring out the rest. The trick is that you can switch focus on the fly to whomever you want, either in real time as you shoot or later in an edit after the file is saved.

It was impressive in our testing, as you can see in the embed below. We wanted to learn a little more about how the actual video files come together, and it turns out the iPhone sticks with one file that’s loaded with all the depth information. That file has multiple depth tracks captured from different lenses. McCormack noted that if “you’re shooting on the wide, we’ll grab the wide, and we’ll grab the ultra wide and by using the disparity between those two images will calculate the depth map for that frame.” The iPhone captures a lot of information to allow for seamless switching, either in real time or after the fact.
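
For a sense of what using the disparity between two lenses means, here is a back-of-the-envelope sketch of the standard stereo relationship, depth = focal length x baseline / disparity. The focal length and baseline values below are invented for illustration and are not the iPhone’s real numbers.

```python
# Classic stereo depth-from-disparity; constants are illustrative only.
def depth_from_disparity(disparity_px, focal_length_px=1500.0, baseline_m=0.012):
    """Return depth in meters for a pixel shift between two camera views."""
    if disparity_px <= 0:
        return float("inf")  # no measurable shift: treat the point as very far away
    return focal_length_px * baseline_m / disparity_px

for d in (90.0, 18.0, 4.5):
    print(f"disparity {d:5.1f} px -> depth {depth_from_disparity(d):.2f} m")
# Larger disparity means the point sits closer to the camera.
```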

The iPhone 13 and 13 Pro have some of the best phone cameras we’ve tested. All versions of the new iPhone produce shots that stay sharp when your hands are shaky, with accurate colors, more detail when you zoom in and correct lighting throughout.

If you have an iPhone 11 or older, you’ll certainly see the camera improvements on any iPhone 13 model. Those who have a serious knack for shooting and content capture should make the jump to a 13 Pro or 13 Pro Max.

  • iPhone 13 ($999; amazon.com or apple.com): The latest in imaging tech from Apple in a dual-camera setup that offers a wider range of colors and more detail for most shots. The main lens also features stabilization to minimize blurry shots.
  • iPhone 13 Pro ($1,099; amazon.com or apple.com): A three-camera system that lets you shoot in wide, ultrawide, 3x zoom or Macro. It’s the best camera system we’ve tested, with a range of modes and excellent performance even in low light.