Misinformation - A Hydra-Headed Challenge in an Information-Driven Society
by Namah Bose
7 min read • May 28, 2025

The recent escalation of the conflict between India and Pakistan in the age of social media has laid bare the realities of an information-driven society. As both nations dealt with security issues, social media was used to wage a war on truth. Several accounts posted false and inaccurate content with the aim of creating panic and fear among citizens. The building of narratives and conspiracy theories highlights the need to study and understand the perils of the spread of misinformation.
What is Misinformation?
The terms misinformation, disinformation and fake news are often used interchangeably today, without any distinction. Misinformation refers to the spread of false or inaccurate information, while disinformation refers to the deliberate spread of false information for the purpose of misleading society. The difference between the two is one of intent. Fake news is the term for false news circulated online; several news channels, in order to garner views and TRPs, push stories that are not accurate.
While misinformation is not deliberate, its effects on society are debilitating, spanning health, law and order, and the environment. Misinformation during COVID-19 led to refusal to vaccinate, mask avoidance and the unnecessary use of medications, all of which damaged public health. Rumours about microchips, nanoparticles and sterility created public fear of vaccination.
Misinformation during conflicts, such as the recent India-Pakistan escalation, tends to spread on platforms like Instagram and X (previously Twitter). It had the potential to create fear in the minds of citizens of both countries about alleged attacks on various cities. Old videos of fires and footage of the ruins of Gaza were used to create the perception of attacks causing severe damage. It is thus essential to analyse the role played by social media platforms and their users in creating such a frenzy.
Times of Social Media - New Mechanisms for Propaganda
Historically, traditional media has often served as a tool for the powerful, shaping public perception and controlling narratives to suit particular interests. A stark example of this is seen in Nazi Germany during World War II, where propaganda and censorship were systematically employed to manipulate public opinion and suppress dissent.
In the present day, social media platforms like Instagram, Facebook, WhatsApp, and Telegram have transformed the media landscape. While traditional media has always been controlled by a select few, social media confers on every member of society the ability to create, share, and amplify information. This democratization of media empowers the public to participate directly in shaping narratives.
However, it has given rise to serious challenges. These platforms have become a breeding ground for misinformation, where false content spreads rapidly and often without accountability. Unlike traditional media with editorial checks, digital platforms allow unverified information to go viral before fact-checks can catch up. While some users unknowingly share misinformation out of urgency or ignorance, others deliberately spread disinformation to manipulate public sentiment. The speed and scale of this circulation make it difficult to correct narratives once misinformation takes hold.
Algorithmic Amplification of Emotionally Charged Content
Algorithms determine the content an audience sees on opening any social media platform. They are designed to maximise user engagement, taking into account a user's likes, dislikes and choices inferred from previous interactions.
Algorithmic amplification refers to the phenomenon of algorithms boosting emotionally charged content that captures users' attention, thereby maximising engagement. Algorithms analyse past user behaviour to infer a user's existing beliefs; content that generates high engagement is promoted and surfaces on users' feeds. This creates echo chambers in which a user sees mostly content that aligns with their existing beliefs. A further fear associated with algorithmic amplification is that it may drive audiences towards increasingly polarising narratives. Research comparing Reddit, Gab and YouTube has suggested that YouTube amplifies extreme and fringe content. A simplified sketch of such engagement-driven ranking is shown below.
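To make the mechanism concrete, here is a minimal, purely illustrative sketch in Python of how an engagement-driven feed might score posts. All names, weights and example data are hypothetical; real platform ranking systems are far more complex and are not publicly documented in full.

```python
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    emotional_intensity: float  # 0..1, e.g. output of a sentiment/arousal model (hypothetical)
    topic: str

def rank_feed(posts, user_topic_affinity, emotion_weight=0.6, affinity_weight=0.4):
    """Score posts so that emotionally charged content matching the user's
    past interests floats to the top of the feed (illustrative only)."""
    def score(post):
        # Affinity approximates the user's prior engagement with this topic.
        affinity = user_topic_affinity.get(post.topic, 0.0)
        return emotion_weight * post.emotional_intensity + affinity_weight * affinity
    return sorted(posts, key=score, reverse=True)

# Hypothetical usage: a user who has mostly engaged with conflict-related
# content sees charged posts rise while calmer, verified posts sink.
posts = [
    Post("Verified casualty figures released by officials", 0.2, "conflict"),
    Post("SHOCKING footage of alleged attack on city!", 0.95, "conflict"),
    Post("New public-health advisory issued", 0.1, "health"),
]
user_topic_affinity = {"conflict": 0.9, "health": 0.2}

for post in rank_feed(posts, user_topic_affinity):
    print(post.title)
```

The point of the sketch is the feedback loop: because the score rewards both emotional intensity and prior topic affinity, the same kinds of posts keep resurfacing, which is the echo-chamber and polarisation dynamic described above.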
One solution to this challenge is making algorithms more transparent, which would help users understand the basis on which content is prioritised. Media and digital literacy among users, along with fact-checking initiatives, can also aid the situation. The Press Information Bureau (India) Fact Check unit, with its attempts to tackle fake news, is a noteworthy example that proves especially useful during conflicts.
Suggestions and Recommendations - Regulation v. Free Speech Conundrum
Free speech has long been the hallmark of the digital space. The ability to read, interact with, and create content reflecting diverse or alternative views has been a defining characteristic of the online community. Democratic societies often view any restriction on these freedoms as censorship, raising concerns over the curtailment of individual liberties. However, international and national crises like pandemics and armed conflicts expose the dangers of unmoderated digital spaces. Unchecked content during such times does not merely mislead; it has the potential to heighten tensions, intensify divides, and spread harmful or even violent narratives.
This underscores the need for a refined approach to digital governance, one that does not advocate blanket censorship but instead focuses on a participative model. Citizens, digital platforms, and law enforcement agencies all need to play a significant role in creating a knowledge-driven society. Promotion of digital literacy can empower users to scrutinise the information they consume, verify before sharing, and avoid becoming unintentional vectors of misinformation. Advocating responsible consumption and sharing of content, rather than reactive takedowns, would be a more sustainable approach.
Efforts by X (previously Twitter) to counter misinformation during the COVID-19 pandemic can be appreciated in some respects, though they warrant a critical eye. X has over 330 million monthly active users globally, making it one of the world's most popular social media platforms. In acknowledgment of its influence, the platform rolled out several policies to address the spread of misleading information during the pandemic.
Its attempts included the expansion of its search prompt feature, ensuring that users seeking COVID-19 information were primarily shown content from authoritative sources. The initiative was implemented in partnership with national health agencies and the WHO in around 70 countries. In addition, X (previously Twitter) altered the auto-suggest function to avoid steering users toward misleading content and expanded its “Know the Facts” prompt to promote verified health information. Since Twitter's shift to X, the platform has faced several allegations regarding changes in policy, including the disbanding of trust and safety teams and the revocation of bans on extremist and dangerous accounts.
Although the steps taken by online platforms reflect a proactive stance, they also reveal the struggle to balance effective moderation with due process and transparency. Increased automation, while necessary given the sheer volume of content, has also raised concerns about overreach, false positives, and limited recourse for wrongly flagged users. Moreover, despite good intentions, no platform can vet every piece of disputed information, which leaves critical gaps in content regulation. Thus, while platforms like X (previously Twitter) deserve credit for attempting to mitigate the crisis, as seen in their policies during the COVID-19 pandemic, the broader challenge lies in creating a resilient digital ecosystem: one where the right to free expression coexists with the responsibility to prevent harm, and where each actor, from user to platform, takes ownership of that balance.
Until regulators, law-making bodies and platforms perform their role of effectively containing the misinformation crisis, the task lies in the hands of users. The promotion of media literacy, along with efforts to educate individuals about the challenges posed by unverified information, constitutes a fundamental step toward an informed and responsible society grounded in the ethical consumption and dissemination of information.
About the Author
The article is written by Namah Bose, a fifth (final) year student completing B.A.LL.B (Hons.) at Rajiv Gandhi National University of Law, Patiala, Punjab.