While many people don’t bother with visiting or posting to the comments sections of websites, these spaces—much like a town meeting or kitchen table—can be home to the kinds of thoughtful and constructive conversations that transcend cultural or political differences, conversations that build community.
But these days, almost every popular blog or website has to deal with trolls, those digital denizens who post disruptive, inflammatory, and sometimes hurtful messages to discussion boards or comment threads.
While the common Internet wisdom is never to feed a troll (that is, don’t engage them in discussion), a venerable science and technology magazine recently decided that the best way to deal with its troll problem was to starve them out entirely.
In a post that began with the memorable line, “Comments can be bad for science,” Popular Science’s online content director Suzanne LaBarre linked the decision to shut off PopSci.com’s commenting function to an overwhelming amount of troll and spambot traffic. LaBarre expressed particular umbrage at commenters who were doing “the cynical work of undermining bedrock scientific doctrine.”
To be fair, some online commenters—perhaps because of rigid belief systems that don’t allow for any deviance from their own “truth”—can be unknowingly disruptive or hurtful. But there are as many, if not more, garden-variety trolls—cyberbullies, cyberstalkers, and downright creeps—out there who feel that the anonymity of the Internet gives them license to sow discord and say awful things.
Preventing these trolls from dominating and thus destroying online discussion largely depends on effective site moderation and a requirement that commenters own their comments.
Trolls who post ad hominem attacks on article authors and fellow commenters generally have an anonymous account, and their expletive-laden comments are easily spotted by site moderators or editors. Keeping them from hijacking comment threads is possible, but it takes time and resources.
But LaBarre is talking about something more insidious than the garden-variety troll. This particular troll wears a disguise and poses as a voice of reason (no expletives here) expressly to muddy the waters around, cast doubt on, or misstate established scientific findings.
Visit the comments section under any online newspaper article or blog citing this or that scientific study—from global issues like climate change and renewable energy to human health topics like childhood vaccinations and e-cigarettes—and you’ll find this type of troll doing the “cynical work” LaBarre references.
Is the commenter named “Scienceguy22” who cites studies (with links) about the expanding polar ice caps really right about global warming? Should you believe “NIMBY,” who says she suffers from wind turbine syndrome? Does “ConcernedMom” have firsthand experience with childhood autism that stems from MMR vaccinations? Did “Ex-smokerBob” really beat throat cancer after he began smoking e-cigarettes?
These commenters may be dissemblers, perhaps working for a special interest to promote this or that dogma, or they may hold these beliefs in earnest. The bottom line is that we just don’t know who they are because most sites still allow people to post comments anonymously.
But we do know that people post these types of comments because they sway readers’ opinions about the contents of an article. And, in the case of complex, interconnected scientific topics, even a little misinformation can go a long way in casting doubt on the validity of a scientific endeavor.
Even garden-variety trolling that is outright venomous and totally unrelated to an article can sway readers’ opinions, according to a recent study by UW–Madison social scientists Dominique Brossard and Dietram Scheufele. “Uncivil comments not only polarized readers, but they often changed a participant’s interpretation of the news story itself,” said Brossard and Scheufele, calling this phenomenon “the nasty effect.”
So, in some ways LaBarre is right: “Comments can be bad for science.”
But the Internet is a relatively new technology, and there is much we can do to cultivate thoughtful and constructive online discussion and weed out trolls in disguise.
More and more websites are using software that can weed out garden-variety trolls and flag specious comments for review by site moderators. Many are also eliminating anonymous posting entirely by asking people to sign in through a verified profile or a profile-based commenting service like Facebook or Disqus.
Of course there are ways to subvert a verified profile, and trolls will still creep into online conversations. But this doesn’t mean that we should simply shut off the discussion and call it a day.
Nor should we have a free-for-all in the comments section. Accountability is the key to fostering trust and building digital community. This is doubly true if we’re going to have honest discussions online around the issues and ideas that are, and should be, important to everyone: scholarly research and how it can help humanity, the health of our local and global environment, and the role of government and the individual within a democracy.
Free speech is a fundamental component of a healthy community, yes, but thoughtful conversations necessitate a little pruning in order to flourish, grow, and welcome others.