These People Don’t Exist: An AI Tool Made These Super-Realistic ‘Fake’ Photos

Love it or hate it. Embrace it or mistrust it. Use it or shun it.
Whatever you do, you can’t ignore it. I’m talking about Artificial Intelligence. And the generative adversarial network (GAN) in particular. What’s that, you ask? Allow me to explain.

A GAN pits two neural networks against each other: a generator that produces images and a discriminator that tries to spot the fakes. As they compete, the generated images keep getting more convincing. A company called Nvidia has been testing and improving its GAN-based AI tool for quite some time, and in a paper released earlier this month we got to see just what magic it’s capable of: AI-generated faces that look like real people.
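
For the curious, here is a minimal sketch of the GAN idea in code, assuming PyTorch. The tiny fully connected networks and the dummy image batch are illustrative stand-ins only, not Nvidia’s actual model.

```python
# A minimal GAN sketch (illustrative only, assuming PyTorch is installed).
import torch
import torch.nn as nn

latent_dim, img_dim = 64, 28 * 28

# Generator: turns random noise into a fake image.
G = nn.Sequential(
    nn.Linear(latent_dim, 256), nn.ReLU(),
    nn.Linear(256, img_dim), nn.Tanh(),
)

# Discriminator: guesses whether an image is real or generated.
D = nn.Sequential(
    nn.Linear(img_dim, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1), nn.Sigmoid(),
)

loss = nn.BCELoss()
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)

def train_step(real_images):
    batch = real_images.size(0)
    real_labels = torch.ones(batch, 1)
    fake_labels = torch.zeros(batch, 1)

    # 1) Train the discriminator to tell real images from generated ones.
    noise = torch.randn(batch, latent_dim)
    fakes = G(noise).detach()
    d_loss = loss(D(real_images), real_labels) + loss(D(fakes), fake_labels)
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # 2) Train the generator to fool the discriminator.
    noise = torch.randn(batch, latent_dim)
    g_loss = loss(D(G(noise)), real_labels)
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
    return d_loss.item(), g_loss.item()

# Example: one step on a dummy batch of "real" images in the Tanh range [-1, 1].
print(train_step(torch.rand(16, img_dim) * 2 - 1))
```

Repeated over millions of real photos, this back-and-forth is what pushes the generator from the blurry 2014-era faces toward the convincing ones below.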

This is what AI was capable of back in 2014. Pretty meh, right?


And this is 2018. Can you tell these are fake?


If this tool goes mainstream, think of all the cool images we could generate. With a click of a button. And not just faces: cats, cars, and bedrooms as well. As the video below explains, we could also mix the styles of two source images. According to researchers from Nvidia, this AI is highly advanced and

“capable of separating inconsequential variation from high-level attributes.”
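
To give a rough feel for what “mixing styles” could mean in code, here is a hypothetical toy generator, again assuming PyTorch, in which every layer is modulated by a style vector. Early, coarse layers can then follow one source image’s style while later, fine layers follow another’s. The class names, layer counts, and dimensions are made up for illustration and are not Nvidia’s actual architecture.

```python
# Hypothetical style-mixing sketch (illustrative only, not Nvidia's real code).
import torch
import torch.nn as nn

class StyleLayer(nn.Module):
    """One synthesis layer whose output is scaled by a per-layer style vector."""
    def __init__(self, channels, style_dim):
        super().__init__()
        self.conv = nn.Conv2d(channels, channels, 3, padding=1)
        self.to_scale = nn.Linear(style_dim, channels)

    def forward(self, x, style):
        scale = self.to_scale(style).unsqueeze(-1).unsqueeze(-1)
        return torch.relu(self.conv(x) * (1 + scale))

class ToyStyleGenerator(nn.Module):
    """Stack of style-modulated layers; different layers may receive different styles."""
    def __init__(self, n_layers=8, channels=32, style_dim=64):
        super().__init__()
        self.start = nn.Parameter(torch.randn(1, channels, 4, 4))
        self.layers = nn.ModuleList(
            StyleLayer(channels, style_dim) for _ in range(n_layers)
        )

    def forward(self, styles):
        # `styles` is a list with one style vector per layer.
        x = self.start.expand(styles[0].size(0), -1, -1, -1)
        for layer, s in zip(self.layers, styles):
            x = layer(x, s)
        return x

gen = ToyStyleGenerator()
style_a = torch.randn(1, 64)  # "source A": drives coarse, high-level attributes
style_b = torch.randn(1, 64)  # "source B": drives fine, low-level variation

# Style mixing: the first four layers follow A, the last four follow B.
mixed = gen([style_a] * 4 + [style_b] * 4)
print(mixed.shape)
```

Swapping which layers take which style is, loosely speaking, how high-level attributes from one face can be combined with the fine detail of another.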

This algorithm will blow your mind. See it in action:

BUT a lot of people think this is scary. Here’s why:

  • Image authentication – Can we still draw a clear line between real and fake images? And if so, how would we tell the difference?
  • Photographic evidence – If real faces can be modified this convincingly, can photographs still serve as proof?
  • Loss of employment – What will happen to models, stock photo services, and photographers if AI can easily generate realistic faces?
  • Misuse – The technology could be abused to create pornography.

Phew, that’s a lot to think about. So, what do you make of this AI tool – boon or bane?
