Amnesty International brands Twitter as ‘a toxic place for women’

Human rights group Amnesty International has slammed Twitter, calling the service “a toxic place for women” in a report outlining women’s experience of abuse on social media platforms.

The report, which followed a 16-month study carried out by Amnesty, accused the company of failing to meet its responsibilities to protect users, stating “as a company, Twitter is failing to respect women’s rights online” while calling on governments to tighten legislation and policing of online abuse.

During the research, Amnesty directly interviewed 86 women in the US and UK about their experience with abuse on the platform.

The group, which included politicians, journalists, activists, games developers and comedians, reported that many of them felt their experiences were not being taken seriously by the platform.

Amnesty’s report comes a week after the Australian federal government launched its latest program to address online bullying, which urged online platforms to make better use of technologies such as artificial intelligence to take down content and ban users.

Twitter founder Jack Dorsey has previously lamented the service’s inability to police content, with the report quoting him as saying: “We see voices being silenced on Twitter every day. We’ve been working to counteract this for the past 2 years…We prioritized this in 2016. We updated our policies and increased the size of our teams. It wasn’t enough.”

In the Amnesty report, the authors wrote: “The violence and abuse many women experience on Twitter has a detrimental effect on their right to express themselves equally, freely and without fear.

“Instead of strengthening women’s voices, the violence and abuse many women experience on the platform leads women to self-censor what they post, limit their interactions, and even drives women off Twitter completely.”

The report called on Twitter and other social media platforms to do more in dealing with online violence and abuse along with recognising the particular issues and vitriol faced by women from marginalised communities.

Amnesty also recommended Twitter improve its transparency around online abuse reports and the action the service has taken against users, along with improving reporting tools and being more proactive in educating those on the platform about unacceptable behaviour.

The human rights group also called on governments to tighten legislation on online abuse against women and to train law enforcement to better tackle the problem.

Spokespeople from Twitter were unable to comment on the report and instead provided the company’s response to Amnesty:

“Twitter has publicly committed to improving the collective health, openness, and civility of public conversation on our service. Twitter’s health is measured by how we help encourage more healthy debate, conversations, and critical thinking. Conversely, abuse, malicious automation, and manipulation detract from the health of Twitter. We are committed to holding ourselves publicly accountable towards progress in this regard.

“Twitter uses a combination of machine learning and human review to adjudicate abuse reports and whether they violate our rules. Context matters when evaluating abusive behavior and determining appropriate enforcement actions. Factors we may take into consideration include, but are not limited to whether: the behavior is targeted at an individual or group of people; the report has been filed by the target of the abuse or a bystander; and the behavior is newsworthy and in the legitimate public interest. Twitter subsequently provides follow-up notifications to the individual that reports the abuse. We also provide recommendations for additional actions that the individual can take to improve his or her Twitter experience, for example using the block or mute feature.

“With regard to your forthcoming report, I would note that the concept of “problematic” content for the purposes of classifying content is one that warrants further discussion. It is unclear how you have defined or categorised such content, or if you are suggesting it should be removed from Twitter. We work hard to build globally enforceable rules and have begun consulting the public as part of the process – a new approach within the industry.

“As numerous civil society groups have highlighted, it is important for companies to carefully define the scope of their policies for purposes of users being clear what content is and is not permitted. We would welcome further discussion about how you have defined “problematic” as part of this research in accordance with the need to protect free expression and ensure policies are clearly and narrowly drafted.”
