British Cops Want to Use AI to Spot Porn—But It Keeps Mistaking Desert Pics for Nudes
“Sometimes it comes up with a desert and it thinks it's an indecent image or pornography,” Mark Stokes, the department's head of digital and electronics forensics, recently told The Telegraph. “For some reason, lots of people have screen-savers of deserts and it picks it up thinking it is skin colour.”
…
“The London police's algorithm can't tell a desert from a nude.” (…) “When the program was supposed to flag, or ‘flag up,’ naked people, it failed, and badly, revealing a particularly sinful gaze.” (…) “It confused images of the desert and sinuous sand dunes with human skin, with naked bodies.”
https://www.instagram.com/p/Bf8zcw9BLmy/?hl=es&taken-by=rentalmagazine
-Send dunes.