I believe you’re familiar with Google Lens, Google’s powerful AI search tool. Well, it’s now an integral part of every image search you make. From now on, you can access all of the cool Google Lens features straight from your browser, including searching within images, translating text, finding text, and more.
After filing a lawsuit against Meta, Texas Attorney General Ken Paxton is now after Google. The AG has sued Google over the alleged unlawful use of biometric data scraped from photos and voice recordings. He claims that the practice has affected millions of Texans and that Google failed to obtain their informed consent to collect their data.
Google has revealed its own text-to-video AI program, called Imagen Video. Similar to Meta’s Make-A-Video, it lets users generate a short video clip purely by entering descriptive text. It works much like text-to-image apps such as DALL-E and Midjourney, except this time the end product is moving pictures.
Of course, this isn’t the first iteration of text-to-video, and neither was Meta’s for that matter. A few months ago, DIYP reported that it would be the next big AI visual progression, and in typical AI fashion, that progress has reached us at an insanely rapid rate. But back to Google.
It appears that Google Photos is having a bit of a moment: according to user reports, it’s corrupting some older images stored on the service, dating from 2013 to 2015. The phenomenon shows up as what looks like digital tears (as in rips through the image, not liquid that comes from your eyes) across the image, as you can see in examples posted to the Google support forums.
Quite a few people have reported the issue. As well as posts on the Google support forums, there are a number of Reddit threads with more examples. There is some good news, though. It looks like Google is already on the case, and some users have had their images fixed without having to do anything. But if you’re facing this issue and it doesn’t get fixed automatically, there is a way to do it yourself.
Earlier this year, the state of Illinois filed a class action lawsuit against Google over privacy concerns in Google Photos. The company settled, agreeing to pay a total of $100 million to the app’s Illinois users. So if you lived in Illinois between 2015 and 2022, hurry up: you only have a few days left to apply for your cut of the payout.
Computational photography is a big thing these days, not only in smartphones but also in desktop software for photos shot with a “real camera”. One area where it’s particularly beneficial is noise reduction. It’s built into many smartphone camera apps automatically, and it’s available on the desktop in applications like Noise Ninja and Topaz DeNoise AI, not to mention the denoising features built into apps like Lightroom.
Google’s new Neural Radiance Fields (NeRF) AI, titled RawNeRF (AKA NeRF in the Dark), however, blows them all away. Its denoised images look almost flawless, and it can then go on to generate completely artificial 3D scenes that you can move around in after the fact, letting you adjust exposure, gamma and tone mapping, and even focus – including bokeh balls!
Last month, Illinois filed a class action lawsuit against Google over privacy concerns. As the Google Photos app uses facial recognition, the plaintiffs alleged that the company broke Illinois’ Biometric Information Privacy Act (BIPA). The company has now settled and will pay a total of $100 million to the app’s users.
Yet another lawsuit has arisen against Snap, and this time, Google and Apple have been sued, too. A 16-year-old girl and her mother sued the three companies after a grown man, an active-duty Marine, manipulated the girl into sending him nude photos, starting when she was 12. The lawsuit claims that the companies failed to protect teen users from “egregious harm” and the spread of Child Sexual Abuse Materials (CSAM).
Teaming up with a Harvard professor, Google is adopting and open-sourcing the Monk Skin Tone (MST) Scale. It’s more inclusive than the current tech-industry standard, ensuring that a wider range of skin tones is represented in search results.
Scouting locations for photoshoots used to be a time-consuming and expensive affair. You often had to visit the place in person, or, if you couldn’t, hire someone to act as a guide or location scout. While there’s still no substitute for actually visiting a place in advance, these days we have a lot more options at our fingertips.
Google has just announced the release of Immersive View. It could be incredibly useful for photographers who want to scout locations from the comfort of their home, or, well, from anywhere, to be honest.