Artificial intelligence is developing fast and has many possible applications. However, it still makes mistakes, and that has proven to be a problem for London’s Metropolitan Police. The force uses AI to detect incriminating images on seized electronic devices, but it’s unreliable when it comes to nudity: it still can’t tell the difference between a nude photo and a photo of a desert.
Mark Stokes, the head of digital and electronic forensics, told the Telegraph that their current software detects photos of guns, drugs, and money on seized computers and phones. However, it keeps mistaking photos of deserts for pornography or indecent images:
For some reason, lots of people have screen-savers of deserts and it picks it up thinking it is skin colour.
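The Met hasn’t published how its software actually works, but the “skin colour” remark suggests something like the classic color-range heuristics used in older nudity filters: count the pixels that fall inside a “skin” range and flag the image if there are too many. Here’s a minimal Python sketch of that idea, purely for illustration; the function names, threshold, and heuristic are assumptions on my part, not anything the Met has confirmed:

```python
# Minimal sketch of a naive skin-tone heuristic -- NOT the Met's actual
# software, which isn't public. Older nudity filters flagged an image when
# enough pixels fell inside a "skin" color range, and desert sand sits
# squarely inside that range, hence the false positives.
from PIL import Image


def skin_pixel_ratio(path: str) -> float:
    """Return the fraction of pixels whose RGB values look skin-like."""
    img = Image.open(path).convert("RGB")
    pixels = list(img.getdata())
    skin = 0
    for r, g, b in pixels:
        # A commonly cited RGB rule of thumb for skin detection
        # (Peer et al.): red dominant, green moderate, and the
        # channels not too close together.
        if (r > 95 and g > 40 and b > 20
                and r > g and r > b
                and (max(r, g, b) - min(r, g, b)) > 15
                and abs(r - g) > 15):
            skin += 1
    return skin / len(pixels)


def looks_indecent(path: str, threshold: float = 0.4) -> bool:
    """Flag the image if skin-like pixels exceed the threshold.

    A full-screen desert screensaver easily clears this bar,
    which is exactly the failure mode the article describes.
    """
    return skin_pixel_ratio(path) > threshold
```

A typical sand pixel, say RGB (210, 180, 140), passes every one of those checks, so a desert screensaver can score a higher “skin” ratio than many actual photos of people.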
As the Telegraph writes, the Metropolitan Police scanned 53,000 different devices for incriminating evidence last year. As you can imagine, this can be very disturbing for the specialists who spend their careers searching through sensitive and incriminating content. Using artificial intelligence takes this unsettling task away from humans.
Since the incriminating photos often include images of child abuse, the police force plans to train the AI to detect abusive images. There’s apparently still room for improvement, but with the help of Silicon Valley providers, they believe this will be possible “within two to three years.”
Additionally, the police have an ambitious plan to store the flagged images with cloud providers like Amazon, Google, or Microsoft, instead of in their own local data center. As Gizmodo writes, this puts a lot of pressure on those services, since none of them is completely immune to security breaches.
I don’t find it unusual that the system mistakes deserts for the human body. Artificial intelligence is prone to mistakes, as machines can’t understand human nuances. After all, sometimes even humans need to look twice; it made me think of “bodyscape” photos [nsfw link]. Still, with such fast development, I believe AI will become far more accurate in the years to come.
[via Gizmodo, the Telegraph]