Pro Photographer Explains iPhone XS Camera’s Inner Workings, Debunks Skin Smoothing Issue

There has been a lot of fuss about the apparent ‘beauty filter’ being applied to photos taken with the iPhone XS and iPhone XS Max front-facing cameras, with images appearing artificially smoothed, particularly where skin is involved.

It’s an effect that most people appear not to like, and while a subset of the population goes out of its way to apply such filters in apps like Snapchat, it may not be something you want when taking photos of your kids. The good news is that the effect is caused by software, and while there’s no ‘beauty filter’ being applied, we now know exactly what is going on.

This is all thanks to Halide developer Sebastiaan de With, who wrote a blog post after taking a deeper look at exactly what the cameras are doing.

According to de With, the iPhone XS is not applying a filter as such; the smoother images are a byproduct of the way the cameras now capture photos. The effect is the same whether images are taken with the front or rear-facing cameras, but it is more pronounced on the front camera because of its smaller sensor.

As de With points out, the new iPhones take multiple photos at once and then use computational photography to merge those shots into a single image. This results in images with “a whole new look” that are a “drastic departure” from photos taken on older iPhones.
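
To make that concrete, here is a minimal sketch of multi-frame merging, assuming the frames are already aligned. The simple averaging step and the noise figures are illustrative stand-ins, not Apple’s actual Smart HDR pipeline.

```python
import numpy as np

def merge_frames(frames):
    """Average a burst of aligned frames; random noise falls by roughly sqrt(N)."""
    stack = np.stack([f.astype(np.float64) for f in frames])
    return stack.mean(axis=0)

# Hypothetical usage: four noisy captures of the same (made-up) scene.
rng = np.random.default_rng(0)
scene = rng.uniform(0, 255, size=(480, 640))            # stand-in for the true scene
frames = [np.clip(scene + rng.normal(0, 20, scene.shape), 0, 255)
          for _ in range(4)]                            # four noisy frames
merged = merge_frames(frames)
print(np.std(frames[0] - scene), np.std(merged - scene))  # noise roughly halves
```

Averaging N frames knocks random noise down by roughly the square root of N, which is why a burst of quick, individually noisy captures can still yield a clean final image.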

Because this process reduces contrast in order to pull detail out of both the light and dark areas, the resulting image can appear almost flat, and it is that lack of contrast which makes images look smooth. This is what we are seeing in the smoothing of skin: the darker tones that give a face its texture and depth are lifted away.
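
To see why recovering shadow and highlight detail flattens an image, consider a toy tone-mapping example. The Reinhard curve below is a well-known stand-in for this kind of range compression; Apple has not published its actual tone curve, so treat this purely as an illustration.

```python
# Toy illustration: a compressive tone curve keeps highlight detail on screen
# but shrinks the contrast between nearby bright values.
def reinhard(x):
    """Map scene radiance in [0, inf) into display range [0, 1)."""
    return x / (1.0 + x)

# Two highlight values one stop (2x) apart in the real scene...
bright, brighter = 4.0, 8.0
print(reinhard(bright), reinhard(brighter))  # 0.8 vs ~0.889

# ...end up nearly identical on screen: the detail survives, but the contrast
# between them collapses, which is what reads as 'smoothness'.
```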

The tradeoff is that selfies, which have traditionally fared worse in mixed or harsh lighting (the majority of lighting!), are no longer blown out, and in most cases the result simply looks better, if a little on the smooth side.

That said, de With also explains why the new iPhones use aggressive noise reduction:

Remember that line-up of frames showing how the iPhone camera works? Unless you have bionic arms, it’s impossible to hold your phone perfectly still for this long. To get a sharp, perfectly aligned burst of images, the iPhone needs to take photos really fast. That requires a shorter shutter speed — and that, in turn, means that there will be more noise in the image.

That noise has to be removed, somehow, and that comes at a cost: noise reduction removes a bit of detail and local contrast.
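
The back-of-the-envelope sketch below puts numbers on that tradeoff, assuming a simplified photon shot-noise model with made-up figures: a shorter exposure captures less light, and the signal-to-noise ratio falls with the square root of the light captured.

```python
import math

def snr(photons):
    """With photon shot noise, noise = sqrt(photons), so SNR = sqrt(photons)."""
    return photons / math.sqrt(photons)

full_exposure = 10_000               # photons at a slower shutter speed (made-up figure)
burst_exposure = full_exposure // 4  # a 4x faster shutter for a sharp, aligned burst
print(snr(full_exposure))   # 100.0
print(snr(burst_exposure))  # 50.0 -- half the SNR, hence heavier noise reduction
```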

The good news is that this is all software-based, so if Apple wants to dial the effect back, it presumably can. Doing so might also cost some of those amazing HDR images we’ve been seeing in reviews of the new iPhones, so the only real fix may be an added option. And we all know how Apple feels about those.

Be sure to check out de With’s full blog post for all the details, especially if you have an interest in photography.
