Google sees fault in camera algorithms, vows to improve skin tone representation

Google’s strength in smartphone cameras has always been its A.I. and computational photography systems. The photo your phone gives you after you press the shutter button isn’t a copy of the exact thing you see — it’s a combination of dozens of frames melded together, processed by advanced algorithms. That creates a problem: Google found that its photo algorithms don’t do a great job of representing all skin tones and physical features properly.
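Google hasn't published the details of that pipeline, but the core idea of a multi-frame merge can be sketched roughly as follows. This is a simplified illustration in Python, not Google's actual algorithm: each frame in a burst is aligned to a reference (here with a single global shift estimated by phase correlation, whereas real pipelines align per tile) and the aligned stack is averaged to reduce noise.

```python
import numpy as np

def estimate_shift(reference, frame):
    """Estimate a global (dy, dx) translation via phase correlation
    on the luminance channel. Real burst pipelines align per tile and
    handle rotation and motion; this is only an illustration."""
    ref_l = reference.mean(axis=2)
    frm_l = frame.mean(axis=2)
    cross_power = np.fft.fft2(ref_l) * np.conj(np.fft.fft2(frm_l))
    cross_power /= np.abs(cross_power) + 1e-12
    correlation = np.fft.ifft2(cross_power).real
    dy, dx = np.unravel_index(np.argmax(correlation), correlation.shape)
    # Map wrap-around peaks to signed shifts.
    h, w = ref_l.shape
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return dy, dx

def merge_burst(frames, reference_index=0):
    """Naive burst merge: align every HxWx3 float frame (values in
    [0, 1]) to a reference frame, then average the stack."""
    reference = frames[reference_index]
    aligned = []
    for frame in frames:
        dy, dx = estimate_shift(reference, frame)
        aligned.append(np.roll(frame, (dy, dx), axis=(0, 1)))
    # A simple mean; real merges also weight frames by sharpness
    # and reject misaligned regions.
    return np.mean(aligned, axis=0)
```

The point of the sketch is simply that the final photo is a statistical combination of many captures, shaped by tuning decisions at every processing step, so biases in those decisions show up directly in how skin tones are rendered.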

In an announcement at Google I/O 2021, the company said it’s working to change that. Its goal is to “make smartphone photography truly for everyone […] to build a more accurate and inclusive camera.”

Google has spun up a new initiative in its photography division to ensure that its photo-processing algorithms accurately and appropriately represent people of color — from skin tones, to body and facial features, to hair types, and beyond. It started by assembling a group of photography experts to understand how computational photography has created gaps in properly representing some groups, then began work on ensuring that the software in its cameras can counteract that.

“The work is for us to do. It’s not for people to change the way they look; it’s for us to change the way the tools work,” Google said.

Some of the changes are a bit simpler than others. By tweaking the way that the auto white balance and auto exposure algorithms process images, the Google camera app can more accurately represent the tone of someone’s skin regardless of the surrounding scene.

And instead of simply assuming that a brighter image is better, as is so often the case, the algorithms can account for why certain parts of a scene have a particular color or tone. Google’s been at the forefront of combining multiple frames into composite photos, and this is another area where that kind of expertise can be put to work in service of a new philosophy.
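Google hasn't said exactly what changed in its metering, but the general idea is to give detected face and skin regions more weight than a global "brighter is better" target, and to estimate white balance without treating skin tone as a color cast to be corrected. The sketch below illustrates that idea; the skin mask, target luminance, and weighting factor are illustrative assumptions, not Google's values.

```python
import numpy as np

def exposure_gain(image, skin_mask, target_luminance=0.45, skin_weight=4.0):
    """Pick an exposure gain that prioritizes rendering skin regions
    faithfully rather than maximizing overall scene brightness.

    `image` is an HxWx3 float array in [0, 1]; `skin_mask` is an HxW
    boolean array from a (hypothetical) face/skin detector. The target
    luminance and weighting are illustrative, not Google's tuning.
    """
    luminance = image @ np.array([0.2126, 0.7152, 0.0722])
    weights = np.where(skin_mask, skin_weight, 1.0)
    metered = np.average(luminance, weights=weights)
    return target_luminance / max(metered, 1e-6)

def white_balance_gains(image, skin_mask):
    """Gray-world white balance estimated from non-skin pixels, so a
    warm or cool skin tone isn't 'corrected' away as if it were a
    color cast in the scene. Per-channel gains are returned."""
    background = image[~skin_mask]
    means = background.mean(axis=0)
    return means.mean() / np.maximum(means, 1e-6)
```

In this toy version, the exposure decision is anchored to the luminance of the people in the frame rather than to the whole scene, which is one plausible way to read "more accurately represent the tone of someone's skin regardless of the surrounding scene."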

This goes beyond color and brightness, of course. Google is also focusing on its algorithms for processing different hair types — for photos, selfies, and portrait mode shots. We’ve all dealt with faux portrait shots that cut off glasses, ears, or chunks of hair, and Google has found that this disproportionately affects curlier hair types. New algorithm changes help represent hair as it naturally looks, rather than artificially smoothing it and removing definition from curls, for example.
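Google hasn't detailed the new hair handling, but a common way to avoid the hard cut-offs described above is to feather the subject segmentation mask around fine strands before compositing the blurred background. The sketch below shows that idea; the segmentation model is assumed to exist, and the feathering radius is an illustrative choice rather than Google's tuning.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def soft_portrait_composite(sharp, blurred, subject_mask, feather_sigma=5.0):
    """Blend a sharp foreground with a blurred background using a
    feathered subject mask, so wisps of hair fade out gradually
    instead of being clipped by a hard segmentation edge.

    `sharp` and `blurred` are HxWx3 float arrays in [0, 1];
    `subject_mask` is an HxW float array in [0, 1] from a
    (hypothetical) person-segmentation model.
    """
    soft_mask = gaussian_filter(subject_mask.astype(np.float32), feather_sigma)
    soft_mask = np.clip(soft_mask, 0.0, 1.0)[..., None]  # HxWx1 for broadcasting
    return soft_mask * sharp + (1.0 - soft_mask) * blurred
```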

Ultimately this all comes down to accuracy. Where various phones land on the spectrum of accurate versus pleasing photos is a constant source of debate, but unfortunately, not enough of that attention has gone to representing subjects faithfully regardless of their skin tone or physical features. While we can and should discuss subjective questions like whether a photo of a park or a night scene has proper white balance or unnatural grain in the lowlights, there should be zero debate that camera algorithms should represent people as accurately as possible, without prejudice.

Google says all of these changes, and more, will be coming to Pixel phones in the fall — presumably coinciding with the Pixel 6 launch. But it doesn’t stop there; Google will be sharing “everything” it learns with the entire Android partner ecosystem, in hopes that other companies will join this initiative.
