A deepfake video of actress Rashmika Mandanna went viral on social media. What is a deepfake?

A recent video of actress Rashmika Mandanna went viral on social media, and netizens reacted with a mixture of shock, surprise and horror when they realized that it wasn’t the popular actress but a British Indian influencer named Zara Patel. The deepfake video sparked a much-needed discussion on the internet about the misuse of technology. The fake viral video showed Rashmika entering an elevator dressed in a black body-hugging yoga suit and smiling for the camera.

What Is a Deepfake?

Deepfakes are fake videos created using digital software, machine learning and face swapping. They are computer-generated artificial videos in which images are combined to create new footage depicting events, statements or actions that never actually happened. The results can be quite convincing. Deepfakes differ from other forms of false information in being very difficult to identify as false.

How Does It Work?

The basic concept behind the technology is facial recognition. Users of Snapchat will be familiar with the face-swap or filter functions, which apply transformations to or augment your facial features. Deepfakes are similar but much more realistic. Fake videos can be created using a machine learning technique called a “generative adversarial network”, or GAN. For example, a GAN can look at thousands of photos of Beyoncé and produce a new image that approximates those photos without being an exact copy of any one of them. A GAN can also be used to generate new audio from existing audio, or new text from existing text; it is a multi-use technology. The technology used to create deepfakes is programmed to map faces according to “landmark” points, features like the corners of your eyes and mouth, your nostrils, and the contour of your jawline. A simplified sketch of the adversarial idea is shown below.
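
To make the GAN idea concrete, here is a minimal sketch of the two-network setup: a generator that produces images from random noise, and a discriminator that tries to tell real images from generated ones. It assumes PyTorch and torchvision are installed and uses handwritten digits (MNIST) purely as an illustration; the network sizes, learning rates and dataset are illustrative assumptions, not the setup of any real deepfake tool.

```python
# Minimal GAN sketch (illustrative only): a generator and a discriminator
# trained adversarially on MNIST. Assumes torch and torchvision are installed.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

latent_dim = 64
image_dim = 28 * 28  # flattened MNIST images

# Generator: maps random noise to a fake image.
G = nn.Sequential(
    nn.Linear(latent_dim, 256), nn.ReLU(),
    nn.Linear(256, image_dim), nn.Tanh(),
)

# Discriminator: predicts whether an image is real or generated.
D = nn.Sequential(
    nn.Linear(image_dim, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1), nn.Sigmoid(),
)

opt_G = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_D = torch.optim.Adam(D.parameters(), lr=2e-4)
loss_fn = nn.BCELoss()

data = datasets.MNIST("data", train=True, download=True,
                      transform=transforms.Compose([
                          transforms.ToTensor(),
                          transforms.Normalize((0.5,), (0.5,)),
                      ]))
loader = DataLoader(data, batch_size=128, shuffle=True)

for epoch in range(1):  # a single pass, just to show the loop structure
    for real, _ in loader:
        real = real.view(real.size(0), -1)
        batch = real.size(0)
        ones, zeros = torch.ones(batch, 1), torch.zeros(batch, 1)

        # Train the discriminator: real images labelled 1, fakes labelled 0.
        fake = G(torch.randn(batch, latent_dim))
        d_loss = loss_fn(D(real), ones) + loss_fn(D(fake.detach()), zeros)
        opt_D.zero_grad()
        d_loss.backward()
        opt_D.step()

        # Train the generator: try to make the discriminator call fakes real.
        g_loss = loss_fn(D(fake), ones)
        opt_G.zero_grad()
        g_loss.backward()
        opt_G.step()
```

Real deepfake systems use far larger networks trained on faces rather than digits, but the adversarial loop, with the generator improving against the discriminator, is the same.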

While the technology used to create deepfakes is relatively new, it is advancing quickly, and it is becoming more and more difficult to tell whether a video is real. Developments in these kinds of technologies have obvious social, moral and political implications. There are already issues around news sources and the credibility of stories online; deepfakes have the potential to exacerbate the problem of false information online, and to disrupt and undermine the credibility of, and trust in, news and information in general.

Potential Threat

The real potential danger of false information and deepfake technology is that it creates mistrust or apathy about what we see or hear online. If everything could be fake, does that mean nothing is real anymore? For as long as we have had photographs, video and audio footage, they have helped us learn about our past and shaped how we see and know things. Some people already question the facts around events that unquestionably happened, like the Holocaust, the moon landing and 9/11, despite video proof. If deepfakes make people believe they can’t trust video, the problems of false information and conspiracy theories could get worse.

False news propaganda

One of the most common concerns about deepfakes, and about false information in general, is the impact they can have on democratic processes and elections.

How to Identify a Deepfake

With improvements in technology related to artificial intelligence (AI), deepfakes are becoming common on the internet. These include pictures, audio and videos. Here’s how such deepfakes can be spotted.

Unnatural Eye Movements

Deepfake videos often exhibit unnatural eye movements or gaze patterns. In genuine videos, eye movements are typically smooth and coordinated with the person’s speech and actions, and people blink at a fairly regular rate. A rough sketch of one such check appears below.
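
As one illustration, a video’s blink pattern can be checked automatically. The sketch below is a rough heuristic, assuming OpenCV and MediaPipe are installed; the input file name is hypothetical, and the eye-landmark indices are the commonly quoted ones for MediaPipe’s 468-point face mesh, so verify them before relying on the result.

```python
# Rough blink-rate heuristic using the eye aspect ratio (EAR).
# Assumes opencv-python and mediapipe are installed; "suspect_video.mp4" is a
# hypothetical input file, and the landmark indices are assumed values.
import cv2
import mediapipe as mp
from math import dist

RIGHT_EYE = [33, 160, 158, 133, 153, 144]  # outer corner, upper lid x2, inner corner, lower lid x2

def eye_aspect_ratio(lm, idx):
    p = [(lm[i].x, lm[i].y) for i in idx]
    # Ratio of lid opening to eye width: drops sharply during a blink.
    return (dist(p[1], p[5]) + dist(p[2], p[4])) / (2.0 * dist(p[0], p[3]))

face_mesh = mp.solutions.face_mesh.FaceMesh(max_num_faces=1)
cap = cv2.VideoCapture("suspect_video.mp4")

ratios = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_face_landmarks:
        lm = results.multi_face_landmarks[0].landmark
        ratios.append(eye_aspect_ratio(lm, RIGHT_EYE))
cap.release()

# Count downward crossings of the (assumed) blink threshold.
blinks = sum(1 for a, b in zip(ratios, ratios[1:]) if a >= 0.2 > b)
print(f"Frames analysed: {len(ratios)}, approximate blinks: {blinks}")
# Real people blink every few seconds; a long clip with almost no blinks,
# or with erratic ratios, is a reason to look more closely.
```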

Mismatches in Colour and Lighting

Deepfake creators may have difficulty replicating accurate colour tones and lighting conditions. Pay attention to any inconsistencies in the lighting on the subject’s face and surroundings; a simple brightness comparison like the one sketched below can help flag obvious mismatches.
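
The following sketch, assuming OpenCV and NumPy are installed, compares the average brightness of the detected face region with the rest of each frame. The Haar cascade detector, the hypothetical file name and the idea of using a simple brightness gap are illustrative assumptions; a mismatch is only a reason for closer inspection, not proof of a deepfake.

```python
# Simple lighting-consistency check: compare face-region brightness with the
# rest of the frame. Assumes opencv-python and numpy; the input file name is
# hypothetical and the interpretation of the gap is a rough heuristic.
import cv2
import numpy as np

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def brightness_gap(frame):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, 1.1, 5)
    if len(faces) == 0:
        return None
    x, y, w, h = faces[0]
    face_mean = gray[y:y + h, x:x + w].mean()
    mask = np.ones(gray.shape, dtype=bool)
    mask[y:y + h, x:x + w] = False
    background_mean = gray[mask].mean()
    return abs(face_mean - background_mean)

cap = cv2.VideoCapture("suspect_video.mp4")
gaps = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gap = brightness_gap(frame)
    if gap is not None:
        gaps.append(gap)
cap.release()

if gaps:
    print(f"Average face/background brightness gap: {np.mean(gaps):.1f}")
    # Very large or wildly fluctuating gaps suggest the face was lit under
    # different conditions from the scene around it.
```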

Inconsistent Audio

Deepfake videos often use AI-generated audio that may have subtle imperfections. Compare the audio quality with the visual content.

Strange Body Shape or Movement

Deepfakes can sometimes result in unnatural body shapes or movements. For example, limbs may appear too long or too short, or the body may move in an unusual or distorted manner. Pay attention to these inconsistencies, especially during physical activities.

Artificial Facial Movements

Deepfake software may not always accurately replicate genuine facial expressions. Look for facial movements that seem exaggerated, out of sync with speech, or unrelated to the context of the video.

We hope this information is helpful.
