Fake Overload On The Horizon

Let me make this one short and to the point. If you think we have a huge fake-content crisis today, think again: the fake crisis of the future will make today's misinformation problems look like kindergarten. Today, fakery mostly shows up in journalistic articles, targeted ads on Facebook, and inflated website traffic. People fake how they look with Photoshop and buy fake friends on Instagram. Tomorrow, fake will be on steroids: content-manipulation techniques will become so sophisticated that the content people produce will be replaced by content the system wants you to see.

Let me say this loud and clear: around 2025, bots will manipulate your media without you being aware of it. The videos you produce will be altered by a system that replaces your original content with its own. These videos will show your face speaking about topics you never spoke about when you recorded them. Manipulating your perception of reality will be commonplace.

Isn’t that a dangerous line of thinking? I think it is.

Let me give you an example: if buying social media views persuades people to spend more money running these bogus campaigns, then so be it, even if those campaigns don’t give people a good return on their investment. Remember: the whole fake agenda isn’t about sharing what is real or true but about advancing deception. This isn’t a new concept, of course. Third-party companies have been selling fake Twitter likes and followers since the platform’s inception. Is it ethical? I don’t think it is, yet many people subscribe to such methods of growing their social media accounts.

I have tried some of these tactics before as part of my research in this area. It is an ugly world out there, dude. There is fake everywhere, and that’s how these systems operate: pay me ten dollars a day and I will make sure that accounts follow you. This is the reality today. How fake! Tomorrow, we are going to see artificial intelligence learning to manipulate part of who we are online, potentially distorting the views others have of us through what they see in a video. I know this sounds crazy, but we aren’t far from having a machine that can replace your video’s audio track with content you never said, in a voice that resembles yours with a high degree of accuracy. Wait! What? Dr. A, are you saying that bots will eventually be able to manipulate recorded videos of us and insert content we never produced, in order to trick others into believing whatever the manipulators want, using people as scapegoats? Yes.

I am convinced that our lives will be much more complicated in the future because of technology. Within 5 to 10 years, we won’t be able to easily distinguish between fact and fiction online, or accurately check the credibility of our conversations in cyberspace, because of advancements in AI. The machine will learn how to trick people into believing what it wants them to believe. People will see videos, manipulated by an algorithm, of others saying things they never said, and many will believe them because they won’t be able to tell what is real from what is fiction.

Advertising and propaganda won’t cease to exist. I wonder how companies will maximize the use of such tools for profit.

The “good” news today is that we can still control what we put out there and can track what is being shared about us on social media. In the near future, things will change drastically in this regard. Get ready to deal with intelligent technologies that will manipulate reality in ways you can’t control. These intelligent systems will evolve so quickly that your ability to control their intent will be severely diminished. At least this is what Aviv Ovadya, Knight News Innovation Fellow at the Tow Center for Digital Journalism at Columbia University, has predicted.

Listen carefully: we are only seeing the beginnings of fake content online. Can you imagine what sharing truthful content online will look like in 10 years? I don’t even want to know.