New Tool Detects Photoshop Shenanigans in Fashion Photos

By Veronique Greenwood | November 29, 2011 2:28 pm


An image analyzed by the researchers, before retouching, after retouching, with an overlay that shows the strongest retouching in red, and with two facial overlays showing other measures of retouching.

What’s the News: It’s not news that in the age of Photoshop, celebrities and models in magazines have started to look like perfect aliens crash-landed among us ugly Earthlings. But while it’s sometimes obvious when a photo editor has gone too far (witness the Ralph Lauren her-head’s-bigger-than-her-pelvis debacle), the gap between what real people look like and what magazines and other media regularly show has grown distressingly wide without most people consciously noticing, creating a sea of misinformation that may contribute to body-image disorders.

An analytical tool developed by Dartmouth scientists, though, picks up and quantifies those alterations, potentially giving policymakers a useful metric for setting boundaries on how much limb-stretching, torso-trimming, face-smoothing retouching is appropriate.

How the Heck:

  • The tool rates altered images on the basis of geometric changes, like stretching and shrinking, and photometric changes, like airbrushing and heightened colors (a rough sketch of that two-part idea follows this list).
  • To test their system, the researchers had 390 volunteers from Amazon’s Mechanical Turk service compare 468 photos before and after retouching, most of them taken from the sites of retouchers advertising their services. Each photo got a rating from 1 to 5 describing how much reworking the image had undergone.
  • The researchers then had their system rate the images on the basis of statistical measures like facial distortion and local image similarity, which measures the photometric change between different areas of a picture.
  • In most cases, the system’s scores corroborated the ratings of the human volunteers. Interestingly, the few examples where the two didn’t jibe suggest that humans do some complex processing that machines don’t when looking at faces: the volunteers picked up on minute changes to facial structure more strongly than the tool did.

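For readers who want a feel for how such a metric might work, here’s a minimal sketch in Python. It is not the Dartmouth group’s implementation, and the filenames before.png and after.png are hypothetical: it simply uses off-the-shelf scikit-image routines, with dense optical flow standing in for the geometric (stretching and shrinking) component and structural similarity (SSIM) standing in for the photometric (airbrushing and color) component.

```python
# A crude sketch of the idea behind a retouching metric, NOT the authors'
# actual method. Assumes a hypothetical before/after pair of RGB images.

import numpy as np
from skimage import io, color
from skimage.registration import optical_flow_tvl1
from skimage.metrics import structural_similarity


def retouching_scores(before_path, after_path):
    """Return rough (geometric, photometric) retouching scores for an image pair."""
    before = color.rgb2gray(io.imread(before_path))
    after = color.rgb2gray(io.imread(after_path))

    # Geometric component: estimate a dense displacement field between the two
    # images; a larger average displacement suggests more stretching/slimming.
    v, u = optical_flow_tvl1(before, after)
    geometric = float(np.mean(np.hypot(u, v)))  # mean pixel displacement

    # Photometric component: structural similarity between the two images;
    # 1 - SSIM grows with smoothing, airbrushing, and tonal changes.
    ssim = structural_similarity(before, after, data_range=1.0)
    photometric = float(1.0 - ssim)

    return geometric, photometric


if __name__ == "__main__":
    geo, photo = retouching_scores("before.png", "after.png")
    print(f"geometric distortion ~ {geo:.2f} px, photometric change ~ {photo:.3f}")
```

The researchers’ actual metric goes further, summarizing the changes with statistical measures such as facial distortion and local image similarity, as described in the list above, and checking those scores against the human 1-to-5 ratings.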
What’s the Context:

Reference: Kee and Farid, “A perceptual metric for photo retouching,” PNAS, published online before print November 28, 2011. DOI: 10.1073/pnas.1110747108

Image courtesy of PNAS
