Spitting Image

I don’t normally have much sympathy for the likes of Facebook, but I did think they were in a ‘damned if they do, damned if they don’t’ position last week when there was that fuss about them banning The Terror of War.

Their algorithm for picking up on and prohibiting the publishing of photos containing child nudity couldn’t tell the difference between the perverse and this iconic image of children running from a napalm attack in Vietnam because one of them happened to be naked.

The problem with algorithms, of course, is that they have no discernment – the computer clockwork clicks away and a photo either ticks the parameter boxes or it doesn’t. And if some unsavoury image slipped through the net, there would be hell to pay.

Facebook is big enough and ugly enough to defend itself, but I was reminded of the algorithm problem by the photo above. It’s of the subject of a future ABC Wednesday post and I was struggling to find appropriate images, so I turned to ‘Search Google with this image’. It didn’t turn up anything, but its offering of ‘visually similar images’ was an eye-opener as to how random algorithms can be.

Among the many, many people that Google thought it might be were Marilyn Monroe, Helen Keller and Marie Curie, which I suppose might be considered flattering. But there was more.

Also on the page were images of Harvey Milk, Liam Neeson, Che Guevara, David Bowie, Adolf Hitler, Harold Wilson and a young Barack Obama. None of which I would consider ‘visually similar’, but perhaps that’s just me.


3 comments
  • Roger O Green 12th September 2016

    We all need the short cuts, but they don’t always work!

  • Steve 14th September 2016

    What made the Facebook controversy appalling wasn’t any fault in the algorithm, but that the Facebook people defended the deletion of that photo. The story in the Guardian quoted a spokeswoman saying “While we recognise that this photo is iconic, it’s difficult to create a distinction between allowing a photograph of a nude child in one instance and not others.” Which is just a RIDICULOUS statement. The story said the standards team would have made the decision to delete, not an algorithm.

    In other words, as I understand it, people who should have known better — and who ought to be able to tell the difference between a Pulitzer-prize-winning news photo that changed the course of a war and child porn — made the decision to delete that picture.

    (Sorry, as a former journalist, this issue fired me up!)

    • Mr Parrot 14th September 2016

      I heard on the news reports that at least part of their decision-making was based on the fact that FB is a global platform and that an iconic photo of a naked child might be iconic in the liberal west, but not so in less open-minded countries. While I don’t necessarily agree with that, it illustrates how social media can be emasculated if its prime concern is not to offend someone, somewhere.


