While it is a controversial topic overall, facial recognition does have some benefits. One of the most prevalent in our lives at the moment is the ability to detect individual people in the photos we shoot, whether it’s on our phone or with services like Google Photos, which organise them so we can see all photos of a specific person. Google Photos’ “facial” recognition has just gotten a little creepier.
It doesn’t even need to see faces now to be able to make a reasonable guess at a subject’s identity. Google didn’t publicly announce this new functionality but has confirmed to Rita El Khoury at Android Authority that it now exists in Google Photos.
Google hasn’t fully revealed the secrets of how it works, but it did drop some hints. Google told El Khoury that its machine learning algorithms can now “group people based on clothing and other visual cues across photos taken within a similar timeframe”. It doesn’t fully tag them yet, though. It essentially adds them as a suggestion with the prompt “face available to add”, and you can choose a different person if Google Photos has got it wrong.
Comparing clothes allows Google Photos to make a reasonable guess at who is who, even if they’re not facing the camera. Taking the timespan between images into account helps to reduce the number of misidentifications. If it sees two people wearing blue jeans, a plain red t-shirt, and a green backpack six months apart, it doesn’t assume they’re the same person. If it sees those people five minutes apart, then it likely would.
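Google hasn’t published its algorithm, but the idea described above can be sketched in a few lines: compare a clothing-feature embedding for two detections, and only suggest a match when the photos were also taken close together in time. Everything here is an assumption for illustration, including the `Detection` class, the embedding vectors, and the thresholds.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from math import sqrt

# Hypothetical sketch only - Google has not revealed its actual method.
# We assume each detected person comes with a clothing/visual-cue
# embedding (from some unspecified model) and a capture timestamp.

@dataclass
class Detection:
    features: list[float]   # assumed clothing embedding
    taken_at: datetime      # photo capture time

def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(x * x for x in b))
    return dot / norm if norm else 0.0

def likely_same_person(d1: Detection, d2: Detection,
                       sim_threshold: float = 0.9,
                       max_gap: timedelta = timedelta(hours=6)) -> bool:
    """Suggest a match only when clothing looks alike AND the photos were
    taken close together - matching outfits months apart are treated as
    coincidence, mirroring the six-months-vs-five-minutes example above."""
    close_in_time = abs(d1.taken_at - d2.taken_at) <= max_gap
    return close_in_time and cosine_similarity(d1.features, d2.features) >= sim_threshold

# Same outfit, five minutes apart: suggested as the same person.
a = Detection([0.9, 0.1, 0.4], datetime(2024, 6, 1, 12, 0))
b = Detection([0.88, 0.12, 0.41], datetime(2024, 6, 1, 12, 5))
# Same outfit, six months apart: not suggested.
c = Detection([0.9, 0.1, 0.4], datetime(2024, 12, 1, 12, 0))

print(likely_same_person(a, b))  # True
print(likely_same_person(a, c))  # False
```

The time window acts as a cheap sanity check on top of the visual comparison, which is presumably why Google limits the grouping to photos “taken within a similar timeframe”.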
El Khoury says it seems to have about an 80-85% success rate. So it’s not foolproof, but that’s a pretty impressive hit rate. It also seems to be able to handle partially obscured faces. She says that many photographs of masked subjects taken during the pandemic have suddenly been recognised without any intervention.
A handy feature that will no doubt improve over time. Hopefully, the Android native photos app will eventually gain this functionality, too.
[via Android Authority]