What if all the videos you ever watched were fakes?

Advances in technology have made camera apps sophisticated enough to remove pimples, elongate body parts, and even place animal ears on your face. The same technology now enables the production of fake videos that look real, known as deep fakes.

Deep fakes are digital manipulations of images, sound, and video that make someone appear to do or say something they never did, as realistically as possible. The term "deep fake" combines "deep learning" and "fake". Deep learning is a branch of artificial intelligence in which a system studies data, learns from it, and makes decisions on its own; it is the intelligence used to produce these falsified videos.

So how do deep fakes work?

Now that you understand what deep fakes are, here is how they work. First, a deep learning system studies photos and videos of the target person from many different angles. The system then learns to mimic the person's appearance, mannerisms, and speech patterns.

Much of this is driven by a technique known as a GAN (Generative Adversarial Network). A GAN pits two neural networks against each other: a generator that produces the forgery, and a discriminator that tries to detect its flaws. Every time the discriminator spots an error, the generator improves. This cycle can repeat as many times as needed, and voila! Your deep fake video is complete.
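The generator-versus-discriminator loop can be sketched on toy one-dimensional data. This is a minimal illustration, not how production deep fake tools are built: here the "generator" is just a linear function of noise, the "discriminator" is a logistic regression, and the "real data" is samples drawn from a normal distribution around 4. All names and numbers are assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Generator g(z) = a*z + b; discriminator D(x) = sigmoid(w*x + c).
a, b = 1.0, 0.0   # generator parameters
w, c = 0.0, 0.0   # discriminator parameters
lr = 0.02
batch = 64

for step in range(5000):
    real = rng.normal(4.0, 0.5, batch)   # samples of the "target"
    z = rng.normal(0.0, 1.0, batch)      # generator noise
    fake = a * z + b

    # Discriminator step: learn to score real samples high, fakes low.
    d_real = sigmoid(w * real + c)
    d_fake = sigmoid(w * fake + c)
    grad_w = np.mean(-(1 - d_real) * real + d_fake * fake)
    grad_c = np.mean(-(1 - d_real) + d_fake)
    w -= lr * grad_w
    c -= lr * grad_c

    # Generator step: use the discriminator's feedback to improve the forgery
    # (non-saturating loss -log D(fake), chain rule through g(z) = a*z + b).
    d_fake = sigmoid(w * fake + c)
    grad_g = -(1 - d_fake) * w
    a -= lr * np.mean(grad_g * z)
    b -= lr * np.mean(grad_g)

samples = a * rng.normal(0.0, 1.0, 1000) + b
print(float(samples.mean()))  # generator output drifts toward the real mean, 4
```

Real deep fake systems replace these toy functions with deep neural networks trained on faces and voices, but the dynamic is the same: the discriminator keeps finding flaws, and the generator keeps correcting them until the forgery is hard to tell apart from the real thing.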

What Is the Impending Danger of Using Deep Fakes?

Deep fakes were initially created for pornographic fantasies, that is, to graft the faces of celebrities and people's exes onto pornography. At first, producing them was left to tech gurus. However, with constantly evolving technology, deep fakes are now easily accessible to the masses: all you need is a computer and an internet connection to create deep fake content. This easy accessibility has led to the technology being used for personal attacks on individuals, businesses, the public, and infrastructure, to mention but a few.

To start, deep fakes can be used to compromise a political candidate's reputation by making them appear to say or do something that never occurred. The potential danger of deep fakes to politics was first brought to the limelight when comedian Jordan Peele released a public service announcement that convincingly mimicked former President Barack Obama. This was, and still is, an indication that the technology could be used to influence the public during elections.

For instance, take the tense state of affairs between Iran and the United States of America. If a convincing deep fake video surfaced of one president declaring war on the other, chances are we would be bracing for World War 3.

An increase in child pornography is another danger that has followed the growing popularity of deep fakes. People accused of possessing child pornography have already claimed that the videos were computer generated rather than real, meaning such offenders stand a chance of walking free and placing more children in danger. It also means that, because images can be fabricated, children are at risk of being exploited behind false images and videos.

Deep fakes have also given cyber-bullies a tool to target, harass, or blackmail their victims by threatening to "leak" fabricated personal videos and photos. Corporations and businesses are not left behind either: corporate heads can be blackmailed with damaging fake videos intended to hurt their company's image.

What's next?

It is clear that we are nurturing and fattening an impending danger. The good news is that several experts have engineered AI systems to detect fake content, and corporations such as Facebook have implemented measures to curb the "fake news" that has been a thorn in their side.

Additionally, concerned monitoring parties should cooperate and invest in more research to develop strategies for controlling the production and use of deep fakes.

Bottom line: deep fakes are here to stay, and it is possible for you, your business, or your brand to become a victim of this technology. The best you can do is find the right technology to detect deep fakes, educate your workforce about them, and put together a team that can respond quickly when one is released.

Deep fakes... a menace in waiting?