Google is rolling out a safer image search feature over the coming months that will blur any images it deems explicit. It will do this automatically using AI, apparently even if the SafeSearch filter is turned off. Google has timed the rollout to mark Safer Internet Day.
The news was announced yesterday in a blog post and is aimed at keeping families and young children safer online. Alongside the image-blurring tool, the post also outlines new privacy features.
The announcement says that the blur setting “will be the new default for people who don’t already have the SafeSearch filter turned on, with the option to adjust settings at any time.”
SafeSearch already filters out explicit results for users under 18. The new blur setting, however, will give searchers the option to view a blurred image, along with a warning that it may contain explicit content.
Essentially it’s not so different from the explicit-content warnings Meta already applies on Facebook and Instagram. Giving people the choice of whether or not to view such content can only be a positive thing in general.
However, the fact that the feature relies on AI to flag such images does demand some scrutiny of which types of images end up blurred, and arguments about image censorship will surely follow.
Last month Instagram was advised to reconsider its policies on image censorship following an investigation by Meta's Oversight Board. The board found a direct correlation between image censorship and discrimination against women and the LGBTQ community.
[Via Gizmodo]