
Episode 5: The WhatsApp Prophet

Updated: Jun 16

When Misinformation Comes in Familiar Fonts


He didn’t have a title. He didn’t wear a uniform. He didn’t even show his face.

Yet, during the early days of the pandemic, his voice message went viral across Tamil Nadu:

“Don’t take paracetamol — it’ll worsen COVID. My brother works in AIIMS. I’m just passing this urgently to help people.”

It had urgency. It had relatability. And more importantly — it had a familiar tone.



🔁 The Forward That Wouldn’t Stop


No one knew who sent it. No one checked. Because someone they knew and trusted forwarded it.


And that made all the difference.


In Salem, one family stopped giving paracetamol to their feverish grandmother — fearing it would “push the virus deeper.” In Coimbatore, a school teacher circulated it to 300 students’ parents via the PTA WhatsApp group. In Tiruvannamalai, a temple priest read it aloud during morning announcements.


There were no credentials. No evidence. Just a voice that seemed to care.


🧠 When Familiarity Feels Like Proof


This is a classic example of familiarity bias — a subvariant of truth bias.

We believe things not because they are true, but because they come from familiar channels. The more something is repeated, shared, and aligned with existing fears, the more real it feels.


The WhatsApp Prophet — like many of his kind — became a vector of digital infection. Not a virus of the body, but a virus of thought.


🕰️ Did familiarity bias exist in olden days?


Yes — absolutely. In fact, familiarity bias is as old as civilization itself, because it's rooted in the evolutionary need to conserve energy and avoid social conflict.


In the past:


  • People trusted their village elders, family priests, or landlords without questioning.

  • Travelers from faraway places were met with suspicion, while those who looked or spoke like “us” were accepted.

  • Oral traditions, myths, and remedies survived not because of proof, but because “someone we knew swore by it.”


Familiarity = “This is how it’s always been.”


🔍 So why does it feel more dangerous now?


Because tech has broken the natural boundaries that used to limit the damage.


Then:

  • Familiarity was local and slow-moving

  • Misinformation stayed within small social clusters

  • Trust was relational (face-to-face) and errors could often be corrected in person


Now:

  • Digital platforms simulate familiarity (WhatsApp, Instagram, LinkedIn)

  • Falsehood travels faster than truth — amplified by shares, likes, and algorithms

  • A single message from an unknown source can feel trustworthy because it comes through someone you know


In other words, familiarity is no longer earned — it's forwarded.


📉 The Consequence:


What used to be village-level hearsay is now nationwide belief within minutes.

“Earlier, gossip stayed in courtyards. Now, it lives on cloud servers.”

✅ So what changed?

  • Speed

  • Scale

  • Simulation of trust

  • Collapse of verification effort


The bias itself hasn’t changed. What’s changed is our infrastructure for spreading it — making it harder to tell the real from the familiar.


🧩 Framework Breakdown:


  • Type of Truth Bias: Familiarity + Digital Peer Trust

  • Trigger: Forwarded voice message with authority cue ("my brother is in AIIMS")

  • Cultural Amplifier: Collective urgency + reverence for doctors/scientists

  • Platform Impact: Closed-group networks (WhatsApp, Telegram, PTA circles)


📉 The Damage


  • Hundreds skipped medication

  • Some refused medical treatment

  • Local doctors were forced to “counter WhatsApp” more than COVID


Even Tamil Nadu’s Health Department issued advisories urging citizens not to believe unauthenticated forwards.


But by then, the Prophet’s voice had travelled.


📌 Why It Worked

It didn’t wear a badge. It didn’t pretend to be the government. It just said: “I’m trying to help you.”


And that’s what made it dangerous.


🧠 Final Note – When Familiar Becomes Dangerous, and Search Becomes a Crutch


We used to believe the voice that forwarded the message. Now, we believe the voice in our heads after a Google search.


The problem isn’t just WhatsApp anymore — it’s the illusion of personal expertise built from snippets, search results, and WhatsApp university degrees. More and more people now question doctors, skip prescriptions, or self-medicate — not out of rebellion, but out of misplaced confidence.


Truth bias is evolving. From “I believe this because my friend sent it,” to “I believe this because I read it somewhere... maybe on page 3 of a Reddit thread.”


And that opens the door to the next wave of bias —

Overconfidence from partial information.

As we scroll, search, and self-diagnose, the question isn’t just “What do I know?” It’s “How do I know it’s true?”



That’s the trap. That’s truth bias in its most viral form.


🗞️ Sources Cited in This Episode


  • BBC News (May 2021) – Fake WhatsApp messages worsen India's Covid crisis

  • The Hindu (May 12, 2021) – Doctors battle WhatsApp misinformation during second wave

  • BoomLive (April 2021) – No, Paracetamol Does Not Make COVID-19 Worse – Viral Audio Message Is Fake

  • Indian Express (May 3, 2021) – Health Ministry, doctors counter social media myths as fear spreads faster than facts

  • Alt News (April 28, 2021) – WhatsApp voice note warning against paracetamol use in COVID is fake



Coming up: Episode 6 – The Ponzi Schoolteacher


When trust wears a saree and teaches third standard maths.





© 2025 Vivek Krishnan. All rights reserved.  
Unauthorized use or duplication of this content without express written permission is strictly prohibited.  
Excerpts and links may be used, provided that clear credit is given to Vivek Krishnan with appropriate and specific direction to the original content.
