Australasian Science: Australia's authority on science since 1938
Twitter tries to tackle abuse as research shows that most of us can be trolls online
By David Glance, Director of UWA Centre for Software Practice, University of Western Australia
It turns out that the majority of people will post “trolling comments” given the right circumstances and that trolling is not just the output of a minority of antisocial individuals.
In the first part of the research, 667 subjects were split into two groups with one group being given an easy test to complete and the other, a test that was very difficult. After the tests, the groups were evaluated for their mood, which included how angry, tired, depressed and tense they were.
All of the test subjects were then asked to read an article and post at least one comment and up-vote or down-vote and reply to other comments. All subjects saw the same article, but some subjects were shown comments with three troll posts at the top of the comment section. Everyone else was shown neutral posts.
What the researchers found was that people who did the more difficult test were more likely to be in a worse mood than those who did the easy one. About 35% of the people who did the easy test and saw neutral comments went on to post troll posts of their own. This number increased to 50% if the subjects had done the hard test or if they were shown other troll comments. A massive 68% of people posted troll posts if they had done the hard test and were shown the troll comments.
What is perhaps surprising about this is that 35% of people who had no external provocation to troll went ahead and trolled anyway. It didn’t take much for the researchers to push this bad behaviour up to a majority of people.
In a separate aspect of the study, the researchers looked at the discussions and posts of 1.2 million users on CNN from 2012. They found that trolling increased late at night and early in the week, times when mood is thought to be generally worse. People were more likely to produce a post flagged for trolling if they had already been flagged or if they had simply taken part in a discussion where others had been flagged.
Previous research has shown that simply being online leads to a disinhibition in behaviour and that this is exacerbated with anonymity. But anonymity is not necessary for people to be trolls and increasingly sites are deciding that allowing comments is simply not worth it.
The latest site to shut down its discussion forum is the movie site IMDb, which stated:
“we have concluded that IMDb’s message boards are no longer providing a positive, useful experience for the vast majority of our more than 250 million monthly users worldwide”
IMDb has joined a growing number of organisations that don’t allow comments on their own sites but instead rely on engagement through social media platforms. This has the benefit that the article itself is presented without being biased by comments appearing below it, while still allowing the broader community to engage in a discussion about the articles and content if they wish.
Given that media sites are focusing on promoting their content through social media, there is less incentive to expend time and effort moderating comments that can end up distracting readers from the content itself. As more content is viewed through social media, the comments on the site itself become largely invisible in any case.
The trouble, however, is that, as the Stanford and Cornell researchers have shown, commentary on social media is always likely to be plagued by trolls and/or disappointing levels of engagement.
Twitter is planning to bring in changes that will make “potentially abusive and low-quality replies” appear much lower in Twitter conversations. Whilst this, and the idea of “shadow banning” or masking abusive tweets, may target the most extreme examples of abuse, it isn’t going to deal with the general background level of abuse that has become the norm, especially in times of increased stress.
Originally published in The Conversation.