
Reverse Progress

Current technology is raising serious concerns. While the world celebrates rapid technological progress, it is alarming how often this technology, instead of being put to positive use, is employed destructively, making people’s lives harsh and complicated. One such form is the “deepfake,” which can ruin anyone’s life. The term “deepfake” is now quite well known, but what is this technology and how does it work? Perhaps few people can answer.

The term “deepfake” is a blend of “deep learning” and “fake”: technology that can create fake content so realistic that it cannot be easily identified. Deepfake technology is causing a stir in fields from politics to entertainment and beyond. It is imperative that we understand its potential consequences and take measures to address the challenges it poses to society and to individuals.

In this technique, one person’s face is superimposed onto another person’s body, and through machine learning the expressions of that face can be controlled and altered at will. The technology is already being used to create many fake and explicit videos. On social media, countless clips are circulating in which the face and voice do not match and the body belongs to someone else. Even entirely fantastical faces can be superimposed, yet producing such content is beyond an ordinary person’s means.

The technology behind it is called a “Generative Adversarial Network” (GAN): two neural networks are trained against each other, one generating fake content and the other trying to detect it, until the fakes become hard to distinguish from the real thing. Deepfake images and videos are created with this technique. It is also used for legitimate purposes, from reading the news to training employees in different languages, which is why some argue that the label “fake” is unfair to the technology. The reality, however, is that it is being severely misused. Famous personalities are inserted into adult films, and politicians are made to appear to deliver incendiary and misleading statements. By the time someone exposes the forgery, the damage has already been done. The worrying part is that, so far, no reliable way has been found to break these shackles of artificial intelligence.
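The adversarial idea described above, two systems pitted against each other until the fakes fool the detector, can be made concrete with a toy sketch. What follows is a minimal illustration in Python/NumPy, not an actual deepfake pipeline (real systems use deep convolutional networks trained on images): a one-parameter “generator” learns to imitate a simple target distribution while a “discriminator” learns to tell its output apart from real samples.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    # Clip the logit to avoid overflow warnings in np.exp.
    return 1.0 / (1.0 + np.exp(-np.clip(x, -30.0, 30.0)))

def real_samples(n):
    # The "real" data the generator must imitate: a 1-D Gaussian.
    return rng.normal(4.0, 0.5, size=(n, 1))

# Generator: affine map from noise to a sample (parameters w_g, b_g).
# Discriminator: logistic regression scoring "real vs. fake" (w_d, b_d).
w_g, b_g = rng.normal(size=(1, 1)), np.zeros(1)
w_d, b_d = rng.normal(size=(1, 1)), np.zeros(1)

lr = 0.03
for step in range(400):
    # Discriminator update: push scores for real data toward 1, fakes toward 0.
    fake = rng.normal(size=(32, 1)) @ w_g + b_g
    for x, label in ((real_samples(32), 1.0), (fake, 0.0)):
        p = sigmoid(x @ w_d + b_d)      # predicted probability of "real"
        grad = p - label                # d(cross-entropy)/d(logit)
        w_d -= lr * (x.T @ grad) / len(x)
        b_d -= lr * grad.mean(axis=0)

    # Generator update: push the discriminator's score on fakes toward 1.
    z = rng.normal(size=(32, 1))
    fake = z @ w_g + b_g
    p = sigmoid(fake @ w_d + b_d)
    grad_fake = (p - 1.0) * w_d.T       # chain rule through the discriminator
    w_g -= lr * (z.T @ grad_fake) / len(z)
    b_g -= lr * grad_fake.mean(axis=0)

# Draw samples from the trained generator.
generated = rng.normal(size=(1000, 1)) @ w_g + b_g
```

Real deepfake software applies the same adversarial principle at vastly larger scale, with deep networks generating and judging entire face images rather than single numbers.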

Facebook and Microsoft are collaborating on an AI system that can determine, before a video is uploaded, whether it is authentic, fake, or classifiable as objectionable content. The initiative is a response to a political climate badly disrupted by the spread of fake and explicit material. Hostile elements in the country are using such content to blackmail and manipulate political leaders. As responsible citizens, it is our duty to delete fake and explicit videos instead of sharing them; we may not realize that by helping disseminate such content, we become complicit in a grave sin.

Now let us turn to how it is used. An ordinary person cannot do this at all; only someone versed in artificial intelligence can convincingly graft a human face onto a person in a video. Some time ago, the app ‘FaceApp’ was little more than a gimmick, but today’s technology is far more advanced, because machines have learned how humans look and behave. And every technology invites abuse: famous personalities are presented as actors in explicit films, and videos of politicians are fabricated to incite and mislead. By the time anyone exposes the fake, substantial damage has been done. Such fake videos, audio, and images are produced with specialized software built on deep learning and artificial intelligence. To create a deepfake video, this software needs a great deal of information about the target: how they look, how they speak, their facial expressions, how they behave in different situations, and more.

Once all this information about the target is supplied, the software begins producing the desired videos, audio, and images. The end result so closely resembles reality that detecting the forgery is extremely difficult. Experts in this technology are warning the world: we have reached a point where distinguishing deepfakes from real videos is increasingly challenging, and it is only a matter of time before the subtle differences between them disappear altogether. They believe this will erode trust and credibility between individuals; society could face such anarchy that crime rates skyrocket, marital conflicts are instigated, and the family structure itself is jeopardized. How far-reaching the consequences of this devastation for human society will be, no one can yet say.

According to a recent study, this technology is now being turned against women. DeepTrace, a cybersecurity company based in the Netherlands, revealed in its research that, over one year, 96% of the videos created online with deepfake technology were pornographic, most of them with women’s faces swapped onto explicit content. Professor Danielle Citron of the University of Houston Law Centre told the researchers that deepfake technology is being weaponized against women: “deepfake technology is the most pernicious, embarrassing, harmful, and speech-inhibiting tool there is when it comes to online existence, employment, or maintaining one’s safety, particularly for women.” Since December 2018, more than 16,000 deepfake videos have appeared online, an increase of over 120% on the previous year, with 96% of them pornographic. The study also notes that several sites dedicated to deepfake pornography began operating in February of last year, four of which have recorded over 204 million users between them.

The open sharing of code on platforms such as GitHub has played a significant role in advancing deepfake technology. Two years ago, in June, an application called “DeepNude” surfaced, built on AI-based image processing, which could generate a nude image of any woman within minutes. When the app became public it drew severe criticism across various platforms, along with warnings about its misuse, and it was promptly shut down. The team behind it has since acknowledged that they misjudged public interest in the project and that the risks of misuse are significant; they therefore decided to stop selling the application and to release no further versions. Yet online marketplaces and forums still exist that assist in creating deepfake videos, and even someone with minimal technical expertise can obtain a fake video for a few dollars.

The free version of the app stamped a large watermark on its output, marking the image as fake; in the paid version the watermark was much smaller and easily removed. The app had been available for several months, and its team claimed it was not especially powerful, but experts voiced serious concerns about its use. People have always been able to alter images digitally, but with this app anyone could do it in a few clicks, with no real effort. Such altered images can be used to harm women and render individuals vulnerable, and undoing that harm is anything but straightforward.

The app’s creator, identified as Canem Alberto, admitted to the technology website The Verge that if they had not done this work, someone else soon would have, because the technology is ready and accessible to everyone. The team stated in a tweet: “The world is not yet ready for this kind of technology, and more time is needed.” The creator of this malevolent app, it seems, is deflecting blame onto the world instead of taking responsibility, waiting for the day the world acknowledges the technology’s profound ethical pitfalls, at which point the app will finally be recognized for the technological menace it is.

In Pakistan, the political elite and influential circles use indecent language daily, and it is fast becoming the norm. It is not far-fetched that such indecent videos could be deployed to assassinate the character of political opponents for political advantage. Even when the political arena is not at its most heated, the lust for power respects no ethical boundaries. Today’s politics jeopardizes the future of the country and of the younger generation.

Because of the current political turmoil, the country’s economic situation is extremely perilous. National reserves are depleting rapidly and inflation is rising; the general public struggles to afford basic necessities at soaring prices. Yet luxury goods remain in demand: imported beauty and salon products worth millions of rupees are used for a few hours and then discarded. It is astonishing that, despite skyrocketing petrol and gas prices, precious foreign exchange is still squandered on such imports. More surprising still, the global financial institutions that press for higher taxes on electricity, gas, and other essentials of life have never imposed restrictions on these luxury items; instead, they claim such items help expand business.

It is now imperative for all of Pakistan’s political parties to set aside their differences and reach a unanimous agreement to address the current economic emergency. We must take decisive steps for the safety and security of this country, putting every disagreement aside.

In a country where the political elite spend day and night accusing each other of lies, deception, immorality, and corruption for the sake of power, safeguarding the nation must come first. Otherwise the international financial institutions are waiting eagerly, teeth sharpened, to pounce on us and squeeze us for their own gain.

With the additional conditions imposed by the IMF under the nine-month Stand-By Arrangement, ordinary citizens’ lives risk becoming even more unbearable. We must recognize these circumstances and change course immediately.

I read about the following incident long ago and cannot vouch for its authenticity, but the purpose of recounting it will become clear in my conclusion. They say that after Congo gained independence from France, France appointed an ambassador there. One day the French ambassador went hunting in the jungles of Congo. Walking through the forest, he saw some people in the distance and thought, “Perhaps they have come to welcome me.” Drawing closer, he realized they were a cannibal tribe. The ambassador was captured; the tribe prepared a cauldron, cooked a soup, and celebrated in the jungle. France was deeply shocked and demanded that the Congolese government pay a hefty 100 million dollars in compensation for the ambassador. The Congolese government was in dire straits, with an empty treasury and widespread poverty, but it nevertheless wrote a letter to France. In it, they expressed deep remorse for the incident, explained that their country could not possibly afford such a sum, and, after careful consideration, made a counter-proposal: since our ambassador, who is now with you, weighs twice as much as your ambassador did, you are welcome to treat him exactly as our cannibal tribe treated yours.

In the same spirit, we might write to our own lenders: over the past 76 years our aristocracy, bureaucrats, corrupt politicians, and military dictators among them, have taken loans whose proceeds now sit in your countries’ banks. We propose that, in exchange, their bank balances and assets be seized to offset all of Pakistan’s loans. And if anything remains outstanding, we are willing to hand these individuals over to you, to be detained until they return the embezzled wealth.
