Have you updated to version 2017.1 of Adobe Premiere Pro? According to some users on the Adobe forum, the update is making their files disappear, and in some cases deleting them completely. One user reports that after clicking “Clean Unused” in the Media Cache Database settings, the files from one of his previous projects were completely missing when he tried to reopen it. Is this an accident and an isolated case, or a more widespread problem?
Adobe has been experimenting with new features and algorithms lately. They recently tested a solution that applies the style of one photo to another, and this new feature could be groundbreaking for all the selfie lovers out there.
In their latest video, they offered a preview of the future of selfie photography using artificial intelligence and deep learning. It suggests that in the future, we may be able to create pretty decent portraits from not-so-good selfie snapshots.
Editor’s Note: Digital storyteller and friend of DIYP, Ted Chin has been guest posting on the official Photoshop Instagram account. This particular post is a fantastic double exposure tutorial, which Ted and Adobe have allowed us to share with you here on DIYP. A simple technique with very effective results.
Hey guys! It’s Ted (@eye.c) here. Today I’m going to show you how to create a double exposure portrait in just a few simple steps.
There have been many attempts to make post-processing shorter and more efficient. Researchers at Cornell and Adobe have teamed up and come up with a new solution: a method that transfers the style of one image onto another. It’s like a crossover between the Prisma app and copying and pasting settings in Lightroom.
With this solution, you no longer need to edit the image and then copy and paste settings. Instead, you can transfer the style from a finished reference photo onto the one you want to enhance. This includes copying the time of day, lighting and weather from the reference image onto the one you’re editing.
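The researchers haven’t released code for this, but the general idea of transferring a “look” from one image to another can be illustrated with a much simpler classical technique: matching each channel’s mean and standard deviation to the reference. This is just a crude, hypothetical stand-in for the learned transfer described in the research, not Adobe’s actual method:

```python
import statistics

def transfer_channel(source, reference):
    """Shift and scale one channel of the source image so its mean
    (brightness) and standard deviation (contrast) match the
    reference channel. A toy stand-in for learned style transfer,
    not the method from the Cornell/Adobe paper."""
    src_mean = statistics.mean(source)
    ref_mean = statistics.mean(reference)
    src_std = statistics.pstdev(source) or 1.0  # guard against flat channels
    ref_std = statistics.pstdev(reference)
    return [(px - src_mean) * (ref_std / src_std) + ref_mean
            for px in source]

# Toy example: a dark, flat channel takes on the brightness and
# contrast of a brighter, punchier reference channel.
dark = [10, 20, 30]
bright = [100, 150, 200]
result = transfer_channel(dark, bright)
```

After the transfer, the source channel has the same average brightness and contrast as the reference, which is roughly what “copying the time of day and lighting” amounts to at this simplified level.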
An Adobe Research paper titled Deep Image Matting might just put an end to green and blue screen techniques. Adobe collaborated with the Beckman Institute for Advanced Science and Technology to develop a new system based on deep convolutional neural networks. The system accurately and intelligently extracts foreground content from its background without any kind of blue or green screen.
Eliminating the green screen isn’t a completely new idea. Lytro’s cinema cameras can already do this based on depth perception, but this solution is 100% software based. The paper outlines a process that evaluates images and determines what needs to be cut from the background, and how.
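At the heart of any matting system, deep-learning based or not, sits the standard compositing equation: every pixel is modelled as a blend of foreground and background, I = αF + (1 − α)B, and estimating that per-pixel α (the “matte”) is exactly what the network has to do. A tiny sketch of the equation itself, with made-up values:

```python
def composite(alpha, foreground, background):
    """Blend a foreground over a new background using an alpha matte.
    alpha is 1.0 for pure foreground, 0.0 for pure background, and
    fractional along soft edges like hair -- the per-pixel values a
    matting system has to estimate."""
    return [a * f + (1 - a) * b
            for a, f, b in zip(alpha, foreground, background)]

# One scanline crossing a soft edge: the foreground is white (255)
# and we composite it over a black (0) background.
alpha = [1.0, 0.75, 0.5, 0.25, 0.0]
fg = [255] * 5
bg = [0] * 5
line = composite(alpha, fg, bg)  # [255.0, 191.25, 127.5, 63.75, 0.0]
```

A green screen makes α trivial to compute from colour alone; the point of the paper is estimating those same fractional values from an ordinary background.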
Adobe has announced that their Creative Cloud suite of apps is now available for Chromebooks. They modified their existing Android apps to be compatible with selected Chromebooks. Considering these devices are mainly used in schools, this could be a step forward in education, and best of all, these apps will be available free of charge for students and teachers.
Have you ever wondered what it would be like to use an app like Siri, but for photo editing? Judging from Adobe’s latest video, this might become reality. They are exploring what an intelligent digital assistant for photo editing might look like, and they presented their idea in this short video.
Many things have troubled me this past year. Global warming, war, consumerism, my beard that seems to grow ginger past a certain length… but right above those, at the top of the list, is Adobe’s new Select and Mask feature. Why? Because it just doesn’t work! No matter how many times I try, or how many sliders I change, it just doesn’t create the great selections I was used to with Refine Edge.
Now, to be honest, I never really used Refine Edge for the body; I use the pen tool for those selections. But where Refine Edge earned its pay was when I got to the hair. And like it or not, Select and Mask just doesn’t seem up to the task. Frustrated and tired, I did what any angry Photoshopper would do in their moment of rage… I created a meme! But with a great meme comes great responsibility, and other Photoshop users began to share their thoughts on Select and Mask too.
Later in the post, I will show you how to revert to good old Refine Edge while still making selections in CC 2017, but first let’s see what other Photoshop users thought.
Recreating a 17th-century painting in the 21st century using only stock photos would be an interesting project under any circumstances. But doing it with “The Concert” is more than just interesting. First, it is one of the iconic paintings by Johannes Vermeer, the Dutch painter most people know for “Girl with a Pearl Earring”. Second, the story behind this painting is quite mysterious, since it went missing and has never been found. All this makes Erik’s recreation of the painting even more valuable.
One of the most off-putting things for viewers of video is shaky footage. The best way to keep the camera steady is to use a tripod, but sometimes we want to add a little motion. Quality sliders still cost a fair amount of money, and not everybody has a gimbal or other stabiliser, so we often just go regular handheld. But this often leads to bumpy footage. So, what can we do?
Adobe Premiere Pro has a built-in Warp Stabiliser, but it doesn’t always do the best job. When it works, it works extremely well, but it often falls over and gives results we really didn’t expect. In this video from Miesner Media, Theo takes us on a round trip from Premiere to After Effects and back to Premiere again, resulting in perfectly stabilised footage.