A few days ago, Getty and Google announced upcoming changes as a result of a licensing deal. Those changes have now arrived, and the “View Image” button is gone from Google Image Search. Instead, if you want to see a photo, you’ll have to go directly to the website where it’s hosted.
Not long ago, Google introduced Clips, an AI-powered camera trained to capture the best moments of your life. It has no LCD screen, and its only control is a shutter button, which is completely optional. Google Clips uses artificial intelligence to recognize and save your “perfect moments” by itself. But how is that possible? According to Google, it’s because they hired “a documentary filmmaker, a photojournalist, and a fine arts photographer” to help train the camera’s neural network.
Google’s Art & Culture app has an amusing new feature. If you take a selfie within the app, it finds your look-alike in a work of art. Google compares your face to over 70,000 artworks in their Art Project database and then tries to find your doppelgänger. Sometimes the results are stunningly accurate. But at other times they’re just hilarious.
Although artificial intelligence can be impressive, sometimes we’re reminded that this isn’t always the case. You may remember the time the Google Photos app tagged a couple of African Americans as “gorillas.” After an apology and a promise to fix the problem, Google indeed “fixed it”: it simply removed the label “gorilla” from its lexicon, along with some other words.
In a recent blog post, Google introduced its new AI that can judge your photos on both technical and aesthetic quality. According to Google researchers, the new network “sees” photos much as humans would. With time, it could become even more accurate, and it could find applications in image editing, judging competitions, and more.
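Google’s research describes a network that predicts a distribution of ratings (1 to 10) for an image rather than a single score, with the final quality score taken as the mean of that distribution. As a rough illustration only (the distribution values below are made up, not model output), the mean-score step looks like this:

```python
def mean_score(dist):
    """Mean opinion score from a predicted distribution over ratings 1..10.

    `dist` holds one (possibly unnormalized) probability per rating bucket.
    """
    total = sum(dist)
    probs = [p / total for p in dist]  # normalize to a proper distribution
    return sum((i + 1) * p for i, p in enumerate(probs))

# Hypothetical predicted distribution over scores 1..10 for one photo
predicted = [0.01, 0.02, 0.05, 0.10, 0.15, 0.20, 0.20, 0.15, 0.08, 0.04]
score = mean_score(predicted)  # a single quality number out of 10
```

Collapsing the distribution to its mean is what lets one network rank photos on a single scale while still capturing how much raters would disagree about each image.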
Google has now announced the availability of the final Developer Preview of Android 8.1. While the finalised version won’t roll out until December, the new preview features “near-final” system images. The new preview activates the Pixel Visual Core chipset in both the Pixel 2 and Pixel 2 XL.
Essentially, this is an eight-core system on a chip (SoC) which can run three trillion operations per second. Android Central reports that Google’s HDR+ routines will run five times faster while using less than a tenth of the energy compared to the standard image processor. This means better dynamic range and reduced noise through computational imaging.
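The noise-reduction side of that computational imaging comes from burst photography: HDR+ captures several frames and merges them, and averaging N aligned frames knocks random sensor noise down by roughly a factor of √N. Here’s a deliberately simplified sketch of the merging step, assuming the frames are already aligned (real HDR+ does far more, including alignment and tone mapping):

```python
import random

def merge_frames(frames):
    """Average pre-aligned frames pixel-wise.

    With N frames, zero-mean random noise shrinks by roughly 1/sqrt(N),
    which is the basic idea behind burst-mode noise reduction.
    """
    n = len(frames)
    return [sum(pixel_values) / n for pixel_values in zip(*frames)]

# Simulate a burst: a flat scene of brightness 100 plus Gaussian sensor noise
random.seed(0)
scene = [100.0] * 8
burst = [[v + random.gauss(0, 10) for v in scene] for _ in range(9)]
merged = merge_frames(burst)  # each pixel lands much closer to 100
```

The dedicated chip matters because doing this merge for every pixel of every shot is exactly the kind of repetitive arithmetic that runs far faster on specialised silicon than on a general-purpose CPU.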
Google’s Pixel 2 smartphone quickly dethroned the new iPhone 8 Plus once DxOMark got their hands on it. And the reviewers so far seem to be giving it great praise, both as a camera and as a phone. But how is the camera inside the Pixel 2 actually put together?
That’s what Nat of Nat and Friends wanted to find out. Being a Google employee, she has a little more access than most of us. So, in this video, Nat takes us inside Google’s HQ to speak with the engineers and find out more about how the camera was developed and how it works.
Optical image stabilisation is in high demand on new smartphones today. It beats the heck out of electronic image stabilisation. Google’s new Pixel 2 smartphone, however, features both.
As well as receiving DxOMark’s highest score ever for a smartphone, it appears the Pixel 2’s image quality won’t be wasted on jerky footage. When OIS and EIS work in tandem, they produce ridiculously smooth footage, if this sample posted by Google is anything to go by.
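Where OIS physically moves lens elements to counter shake, EIS typically works in software: estimate the camera’s motion for each frame (e.g. from the gyroscope), smooth that trajectory, then crop and shift each frame toward the smoothed path. This is a toy one-dimensional sketch of the smoothing idea, not Google’s actual pipeline:

```python
def smooth_path(path, radius=2):
    """Moving-average smooth of a 1-D camera trajectory (one position per frame)."""
    smoothed = []
    for i in range(len(path)):
        lo, hi = max(0, i - radius), min(len(path), i + radius + 1)
        smoothed.append(sum(path[lo:hi]) / (hi - lo))
    return smoothed

# Jittery horizontal camera positions (in pixels) across frames.
# The per-frame correction is the gap between the raw and smoothed paths,
# applied by shifting within a slightly cropped frame.
raw = [0, 4, 1, 5, 2, 6, 3, 7]
corrections = [s - r for r, s in zip(raw, smooth_path(raw))]
```

The crop is why EIS alone costs a little field of view; pairing it with OIS means the software has far less residual shake to correct, so the footage stays both sharp and smooth.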