The Anatomy of Online Disinformation: Astroturfing, Content Farming, and the Sleeper Effect


Astroturfing, content farming, and the sleeper effect are among the techniques used to spread false narratives, manipulate public opinion, and sow discord. By understanding the strategies of those who seek to deceive and misinform, we can better equip ourselves to navigate the complex and often treacherous landscape of the internet.



Astroturfing is the practice of creating a false impression of user engagement or belief in online spaces in order to sway public opinion. This disinformation tactic is often employed by organized actors who use bots or exploit recommendation algorithms to give extremist views more traction. The goal is to artificially inflate the likes, interactions, and visibility of divisive, fringe opinions so that they attract greater traffic on social media sites.



For example, a Facebook post promoting an extreme belief that few people actually hold may have 10,000 likes and hundreds of comments. In such cases, the engagement on the post may not be the product of human users at all, but of bots or organized propagandists who are incentivized to push divisive beliefs to the forefront of information spaces.



Astroturfing In Politics

Astroturfing is a practice used in politics to create an impression of widespread grassroots support for a candidate, policy, or cause when little such support exists. It is a deliberate attempt to mislead the public into believing that their opinion or position is shared by most people. Astroturfing campaigns can become an obstacle to independent thinking because people tend to adopt the opinions they believe are held by the majority, known as the “herd instinct.” Astroturfing campaigns may be orchestrated by corporations, lobbyists, labor unions, nonprofits, or activist organizations. They may also be undertaken by individuals with personal agendas or by highly organized groups.

Astroturfing can take different forms. One form is through the use of front groups, which are organizations that claim to represent grassroots movements but are actually funded by political groups, corporations, labor associations, or public relations firms. Another form is sockpuppeting, where false online identities are created to manipulate public opinion to support or criticize particular candidates, causes, or organizations. Sockpuppeteers pose as independent third parties but are actually funded by another entity.


The consequences of astroturfing are significant. It can alienate real human users who disagree with a post, swaying them into believing that they hold a minority or losing position on an issue. Astroturfing campaigns may also generate a high volume of divisive or incendiary posts intended to drown out honest debate among actual human users.


Astroturfing In Marketing

In 2018, the New York Attorney General found that many online companies were using astroturfing tactics to manipulate customer reviews on their websites. They were paying for fake reviews to make their products appear more popular than they actually were, thereby deceiving customers and potentially increasing their profits.



Content farming is a deceptive practice often paired with astroturfing to manipulate online information. It involves producing low-quality articles, posts, or blogs with little to no credible content, solely to exploit social media or search engine algorithms and capture internet traffic. The sheer volume of low-quality content flooding the online space can make it difficult for users to locate accurate and reliable information. Moreover, stuffing articles with certain keywords or phrases can push them to the top of search engine results even when the author lacks expertise or knowledge on the topic, which promotes the dissemination of misinformation.
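To make the keyword-exploitation mechanism concrete, here is a minimal, purely illustrative sketch of the kind of keyword-density heuristic that search engines have long used (in far more sophisticated forms) to down-rank keyword-stuffed content-farm pages. The function names and the 5% threshold are invented for illustration, not drawn from any real search engine.

```python
# Toy illustration of a keyword-density check. Real search-ranking
# systems use far more signals; the threshold here is an assumption.
from collections import Counter

def keyword_density(text: str, keyword: str) -> float:
    """Fraction of words in `text` that match `keyword` (case-insensitive)."""
    words = text.lower().split()
    if not words:
        return 0.0
    return Counter(words)[keyword.lower()] / len(words)

def looks_stuffed(text: str, keyword: str, threshold: float = 0.05) -> bool:
    """Flag text whose keyword density exceeds the assumed 5% threshold."""
    return keyword_density(text, keyword) > threshold

farm_text = "best vacuum best vacuum deals best vacuum reviews best vacuum"
print(looks_stuffed(farm_text, "vacuum"))  # the repetitive text is flagged
```

The point of the sketch is simply that mechanical repetition of a target phrase is easy to produce at scale and, in naive ranking systems, was enough to win visibility, which is exactly the loophole content farms exploited.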



The sleeper effect is a psychological phenomenon that also helps propel disinformation through internet spaces. It occurs when we remember a narrative or story but not where the information came from, leaving us unable to assess the reliability of the source. This contributes to online misinformation when flashy stories lodged in our memory come to be treated as facts and shared further.

  • One example of the sleeper effect in action is the widespread belief in the myth that humans only use 10% of their brains. This false claim has been repeated so often in popular culture that many people accept it as fact, despite the lack of evidence to support it.
  • The sleeper effect can also be seen in the way that false information can spread rapidly on social media platforms, even when it has been debunked by reputable sources. Once a false claim has been shared widely enough, it can become entrenched in people’s minds and be difficult to dislodge, even when presented with accurate information.

