Russia: Human Bots Fight Opposition

A famous cartoon by Peter Steiner depicts a dog sitting at a computer and saying, “On the Internet, nobody knows you're a dog.” In Russia, this principle is often exploited to spin political agendas and fabricate public opinion.

Creating the “bots” to do this, and opening hundreds of fake blog accounts, has even become a full-time job for people who do not hesitate to use the blank spaces of an online persona to their advantage.

Graffiti robot. Image by Flickr user jlmaral (CC BY-SA 2.0).

High-profile opposition blogger navalny (read GV's interview with him here) recently noted [ru] that some comments in his LiveJournal, the most popular blogging platform in Russia, were created by users registered on the same day. The comments usually range from simple insults (“the author of this post is an idiot”) to argument-like entries (“this navalny doesn't have any information and is too lazy to check facts”).

The issue of someone with numerous fake identities leaving a comment or two does not pose any problem per se. But it becomes more than an annoyance when a whole army of “bots” is fighting a war with emerging opposition bloggers in the country. The numerous – however silly and out-of-place – comments distract readers' attention from the discussion of important issues and spam blogs with pages of unrelated comments.

Other bloggers (dolboeb, man_with_dogs, and aiden-ko, to name just a few) got to the core of the issue and came up with research-like posts [ru] on their LiveJournals. They talked about a peculiar posting on Free-lance.ru, a Russian website with job adverts for people working in the information technology field. The posting has been deleted, but man_with_dogs has a saved screenshot [ru] of the original:

“I need 5 people,” the ad says. “Each of them will leave 70 comments a day from 50 different accounts (the accounts need to be live). Urgently. The job is 5 days a week. The duration of this project is 3 months. The payment is every 10 days (Webmoney, Yandex Money [methods of payment – GV]). Total: 12,000 rubles [around $400 – GV] a month.”

The author of this ad, someone named Vladimir Alekseev (probably a fake name since it sounds too conventional) also provided the details of the “job.” The human bots need to target the blog of navalny:

The task is to create the most believable wave of comments possible to degrade the rating of the journal's author and to form a negative attitude toward him. You need to comment on each new post correctly and persuasively. It is also important to create a positive image of the “United Russia” party [the ruling party in Russia]. Can you do it?

Interestingly enough, navalny has a long history of accusing United Russia of all kinds of misbehavior. He famously called it “the party of crooks and thieves” and tried to document financial speculations and cases of embezzlement allegedly conducted by the party's members. So it should not surprise anyone that, if the job advert is real, United Russia has attempted to discredit navalny online.

Of course, those notes and links are far from hard facts. In theory, anyone could have fabricated and replicated this issue online, and the advert could well be the product of someone who would like to present navalny as a real fighter against corruption (and there is no real fighter without an opponent). But navalny's “correct and persuasive” spam problem is an illustration that human bots have become a relatively new trend on the Russian Internet (previously, comment bots were mostly programmed).

More often than not, this turns out to be an effective way to spin or create a “hot” topic. The recent sad, disappointing, and embarrassing online “campaign” against Japan is just one example of how an issue can be created out of nothing. Right after the tsunami- and earthquake-hit nuclear power plant in Fukushima became internationally known, several bloggers posted a scanned copy of an old Japanese newspaper [ru] that allegedly talked about the Chernobyl disaster in 1986.

According to those bloggers, the newspaper called the Soviets “savages that cannot be let near nuclear energy” and bragged that “the Chernobyl scenario is impossible in Japan.” Hundreds of bloggers commented on this. Many believed the scanned copy was real. It seemed that the issue had been actively promoted online and the topic became one of the top themes of the Russian Internet.

It took only someone who knows Japanese to translate the real headlines of the newspaper: “New Russian Constitution Adopted at the Congress of the Members of Russian Parliament,” “Uniting with the Project of President,” “Interview with the Chairman of High Committee of Russia Sokolov.” And, of course, no mention of Chernobyl or anything related to it.

Blogger drugoi posted the real translation [ru] of the headlines and soon enough encountered the infamous human bots in his comments.

This illustrates the sophistication with which certain forces approach the issue of controlling the web. Human bots in Russia are more effective than good old automatic spam bots. They have a soul and a brain. They react logically to blog posts, and their strength is in their numbers.

Evgeny Morozov's idea of the “spinternet” applies well here, with the whole practice of promoting certain points of view online becoming more and more prevalent in Russia. Nobody knows if you are a dog on the Internet, but it certainly seems that dogs behave more ethically than some online forces that try to deceive people and discredit the Web as a tool for building a better society.

This post was re-published in Russian by the Ezhednevniy Zhurnal (Daily Journal) as part of a content partnership with Global Voices’ RuNet Echo.
