Shadow Home Secretary Diane Abbott has urged Twitter to take action over “highly offensive racist and misogynist” abuse on the platform after a major study found thousands of tweets disproportionately targeting black female politicians and journalists.

The Amnesty International study found that black women were 84% more likely than white women to be mentioned in abusive tweets, with one in ten posts mentioning black women containing “abusive or problematic” language.

A separate Amnesty study published in September 2017 showed that Ms Abbott received almost half (45.14%) of all the abusive tweets sent to female MPs in the run-up to that year’s general election.

In response to the latest findings, Ms Abbott said: “My staff still spend a considerable amount of time removing and blocking abusive or threatening posts from social media.

“Overwhelmingly the abuse is of a highly offensive racist and misogynist character.

“I have always felt that this type of hate speech can lead to violence, and Twitter has a responsibility to shut these accounts down a lot quicker than it currently does.”

She added: “Twitter also does not have the option to delete offensive comments once an account has been blocked. This would make a difference to conversations that are taking place.”

For its latest study, volunteers for Amnesty’s “Troll Patrol” crowd-sourcing project analysed 228,000 tweets sent to 778 female politicians and journalists across the political spectrum in the UK and US.

  • Black women were 84% more likely than white women to be mentioned in abusive or problematic tweets
  • Of all tweets sent to women in the study, 7.1% were abusive or problematic
  • Female politicians and journalists across the political spectrum were subjected to abuse

The report found that 7.1% of tweets to women in the study contained abusive or problematic language.

According to the report: “Abusive content violates Twitter’s own rules and includes tweets that promote violence against or threaten people based on their race, ethnicity, national origin, sexual orientation, gender, gender identity, religious affiliation, age, disability, or serious disease.”

Meanwhile, “problematic content” was defined as that which is “hurtful or hostile, especially if repeated to an individual on multiple or cumulative occasions”, but which does not necessarily meet the threshold of abuse.

Labour MP Jess Phillips previously spoke out about harassment on Twitter after a wave of violent abuse was directed at her in May 2016.

At the time, she tweeted: “To see the attack of a pack on here check out my mentions 600 odd notifications talking about my rape in one night. I think Twitter is dead.”

Only a month later, her parliamentary colleague Jo Cox was murdered by far-right fanatic Thomas Mair while carrying out work in her constituency.

This week fellow Labour MP Luciana Berger tweeted screengrabs of numerous anti-Semitic messages after voicing her position on a vote of no confidence in Theresa May.

Abuse has not only been directed toward politicians on the left: a University of Sheffield study of tweets sent between 2015 and 2017, published last year, found that male Conservative MPs received the most abuse during the period, while female Tory MPs saw the largest increase in abuse.

Conservative MP Nadine Dorries said earlier this year that colleagues had been advised by Parliament’s Health and Wellbeing Service to close down their Twitter accounts due to the angry messages they were receiving from members of the public.

Amnesty said: “Politicians and journalists faced similar levels of online abuse and we observed both liberals and conservatives alike, as well as left and right leaning media organisations, were affected.”

Kate Allen, Amnesty UK’s director, said: “It’s clear that a staggering level of violence and abuse against women exists on Twitter. These results back up what women have long been saying – that Twitter is endemic with racism, misogyny and homophobia.”

She continued: “Twitter is failing to be transparent about the extent of the problem, but if our volunteers can gather meaningful data about online violence and abuse, so can Twitter.

“The company must take concrete steps to properly protect women’s rights on the platform.”

Vijaya Gadde, legal, policy, and trust and safety global lead at Twitter, said: “Twitter has publicly committed to improving the collective health, openness, and civility of public conversation on our service.

“Twitter’s health is measured by how we help encourage more healthy debate, conversations, and critical thinking.

“Conversely, abuse, malicious automation, and manipulation detract from the health of Twitter.

“Twitter uses a combination of machine learning and human review to adjudicate abuse reports and whether they violate our rules.

“Our abusive behaviour policy strictly prohibits behaviour that harasses, intimidates or silences another user’s voice.

“We are also transparently investing in better technology and tools to enable us to more proactively identify abusive, violative material, to limit its spread and reach on the platform and to encourage healthier conversations.”