Though some observers saw few obvious changes to the iPhone XS camera system compared with the iPhone X, our iPhone XS review confirmed that Apple's latest phones feature non-trivial improvements in image sharpness, dynamic range, and low-light performance, the products of lens, sensor, and processing changes. Even so, early customers have continued to raise questions about the cameras, and separate reports have now dug into Apple's photo and video tweaks.

On the photo side, third-party developer Sebastiaan de With has tackled the “Beautygate” theory — that Apple deliberately applied an image filter to make selfies look smoother — with the technical equivalent of a hard no. According to de With, whose well-regarded Halide app provides granular control over each iPhone’s cameras, it’s true that the new XS camera system is smoothing out skin tones in a way that can eliminate blotchiness, but there’s no soft filter.

Rather, users are seeing the product of Apple's new Smart HDR feature, which merges multiple exposures to reduce harsh contrasts between light and dark areas. Since the brightest and darkest elements of the image are being normalized, skin looks smoother because the light hitting it isn't rendered as harshly.
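
Apple hasn't detailed Smart HDR's internals, but the underlying idea of merging bracketed exposures is well understood. The Python sketch below illustrates a textbook-style exposure fusion in which well-exposed pixels dominate the blend; the Gaussian "well-exposedness" weighting is a common academic approach (in the style of Mertens et al.), not Apple's actual algorithm.

```python
import numpy as np

def fuse_exposures(frames, sigma=0.2):
    """Toy exposure fusion: blend bracketed frames so that well-exposed
    pixels (those near mid-gray) dominate, taming blown highlights and
    crushed shadows. Frames are float arrays in [0, 1] of the same shape."""
    frames = [np.clip(f, 0.0, 1.0) for f in frames]
    # Gaussian "well-exposedness" weight: pixels near 0.5 (mid-gray)
    # get the highest weight, a common choice in fusion literature.
    weights = [np.exp(-((f - 0.5) ** 2) / (2 * sigma ** 2)) for f in frames]
    total = np.sum(weights, axis=0) + 1e-8  # guard against divide-by-zero
    return sum(w * f for w, f in zip(weights, frames)) / total

# Example: one scene captured at three simulated exposure levels.
scene = np.random.rand(4, 4)
under = np.clip(scene * 0.4, 0, 1)   # shadows crushed
over = np.clip(scene * 1.8, 0, 1)    # highlights blown
merged = fuse_exposures([under, scene, over])
```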

More aggressive noise reduction is also being applied because the iPhone XS prefers faster shutter speeds at higher ISO levels, enabling it to capture more shots in quick succession at the cost of additional noise. The differences are particularly evident on the lower-resolution front-facing camera, de With notes, but Apple could easily tweak the iPhone's software to rebalance the cameras' results. It's also possible that iOS 12 could let users flip a switch to choose between the prior and latest HDR options.
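
The tradeoff de With describes, shorter exposures at higher ISO followed by multi-frame noise reduction, can be approximated with simple frame averaging. The toy simulation below shows why stacking N noisy short exposures helps: for uncorrelated noise, the residual error drops by roughly 1/sqrt(N). It's an illustration of the principle, not Apple's pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)
scene = rng.random((64, 64))  # the "true" scene luminance

def noisy_capture(scene, iso_noise=0.2):
    """Simulate one short, high-ISO exposure: the scene plus sensor noise."""
    return scene + rng.normal(0.0, iso_noise, scene.shape)

# Average a stack of N fast exposures; for uncorrelated noise the residual
# standard deviation falls by roughly 1/sqrt(N).
N = 8
stack = np.mean([noisy_capture(scene) for _ in range(N)], axis=0)

print("single-frame error:", np.std(noisy_capture(scene) - scene))
print(f"{N}-frame stack error:", np.std(stack - scene))
```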

On the video side, filmmaker Richard Lackey took the iPhone XS Max to Barcelona, Spain for a low-light video test, attempting to determine how Apple's computational imaging was producing "voodoo" output that seemed "to be beyond the ability of the optics and sensor alone." Lackey found that the device's raw low-light video output could be better overall than the product of post-production color grading, and he said the footage was both clean and held more color information in dark areas than that of any prior iPhone he'd used.

The key, he believes, is AI-assisted real-time image processing: dynamic analysis of the scene, potentially with separate adjustments localized to individual parts of the image, and dynamic tone mapping for both luminance and color. Additionally, while a noise reduction technique is certainly being used, Lackey says areas that would normally lack detail show unexpected texture, a result he couldn't entirely explain.
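
Localized, dynamic tone mapping of the kind Lackey describes corresponds to a well-known technique: adjusting tone per region rather than with a single global curve. Here's a minimal sketch, assuming a per-pixel gamma driven by local mean luminance; the specific lift-by-local-brightness formula is illustrative, not a claim about Apple's implementation.

```python
import numpy as np
from scipy.ndimage import uniform_filter  # fast local-mean (box) filter

def local_tone_map(img, lift=0.5, window=15):
    """Toy local tone mapping: brighten shadows more than highlights by
    picking a per-pixel gamma from the local mean luminance.
    img is a 2D float array in [0, 1]."""
    img = np.clip(img, 0.0, 1.0)
    local_mean = uniform_filter(img, size=window)
    # Dark neighborhoods get gamma < 1 (brightened); bright ones stay near 1.
    gamma = 1.0 - lift * (1.0 - local_mean)
    return np.power(img, gamma)
```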

Although Lackey says he's unsure precisely how Apple is achieving its results, it's clear from the company's public statements that the basic pipeline involves gathering more detail than before in real time (multiple frames and exposures captured in a split second), then using advanced processing to instantly splice those frames into a cleaner image. The specific technique appears to apply sophisticated weighted averaging to individual areas of detail within images, resulting in some smoothing but also more detail.
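
One plausible reading of "weighted averaging to individual areas" is a tile-based merge of the frame stack that favors whichever frames hold the most detail in each region. The sketch below uses gradient energy as the per-tile detail score; that scoring choice is an assumption for illustration, not Apple's method.

```python
import numpy as np

def detail_score(tile):
    """Local detail estimate: mean gradient magnitude within the tile."""
    gy, gx = np.gradient(tile)
    return float(np.mean(np.hypot(gx, gy)))

def merge_by_tiles(frames, tile=16):
    """Blend a burst tile-by-tile, weighting each frame by how much detail
    it holds in that region. frames: equal-shape 2D float arrays whose
    dimensions are divisible by `tile` (kept simple for illustration)."""
    h, w = frames[0].shape
    out = np.zeros((h, w))
    for y in range(0, h, tile):
        for x in range(0, w, tile):
            tiles = [f[y:y + tile, x:x + tile] for f in frames]
            scores = np.array([detail_score(t) for t in tiles]) + 1e-8
            weights = scores / scores.sum()
            out[y:y + tile, x:x + tile] = sum(
                wt * t for wt, t in zip(weights, tiles))
    return out
```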

Obviously, the iPhone XS’ larger and faster wide-angle sensor has something to do with the improvements, as well. But the impacts of both computational photography and the XS’ new A12 Bionic chip are very significant, demonstrating that smartphones are likely to continue making gains on standalone cameras, even if they can’t compete on raw lens or sensor size.