
New poll finds 7 in 10 adults want social media firms to do more to tackle harmful content

Ipsos study finds over 4 in 5 adults are concerned about harmful content online.

A clear majority of the public want social media companies to do more to protect their users from harmful content, according to new research published today.

Polling by Ipsos shows over four in five (84 per cent) adults in the UK are concerned about seeing harmful content – such as racism, misogyny, homophobia and content that encourages self-harm – with two in five (38 per cent) reporting having seen it in the last month. This comes as the Online Safety Bill moves to Report Stage in Parliament this week.

The government-commissioned study found strong public support for the measures contained in the Bill. For instance, seven in ten adults (68 per cent) believe social media companies should do more to protect people online.

Four in five adults (78 per cent) want social media companies to be clear about what sort of content is and isn't allowed on their platforms. In a stark warning to social media companies, 45 per cent of respondents also said they would leave platforms, or reduce the time they spend on them, if they see no action.

Digital Secretary Nadine Dorries said:

Online abuse has a devastating impact on people’s lives, and these findings definitively show the public back our plans which will force social media companies to step up in keeping their users safe.

It is clear people across the UK are worried about this issue, and as our landmark Online Safety Bill reaches the next crucial stage in Parliament we’re a big step closer to holding tech giants to account and making the internet safer for everyone in our country.

The survey also found that women have high levels of concern about legal but harmful content, with 45 per cent feeling unsafe when talking to people on dating or messaging apps.

Most women (65 per cent) agree there should be limits to the types of content people can post online. Nearly half (47 per cent) of those living in households with at least one child report having seen abusive content in the last month.

The safety of women and girls across the country is a top priority. The measures we’re introducing through the Online Safety Bill will mean tech companies have to tackle illegal content and activity on their services, women will have more control over who can communicate with them and what kind of content they see on major platforms, and they will be better able to report abuse. In addition, we are continuing to implement our Tackling Violence Against Women and Girls (VAWG) strategy to bring about real and lasting change offline as well as online.

The Online Safety Bill was introduced to Parliament in March and is a major milestone in the government’s mission to make the UK the safest place in the world to be online. The new laws will protect children, tackle illegal content and protect free speech, as well as requiring social media platforms to uphold their stated terms and conditions.

If they don't, the regulator Ofcom will work with platforms to ensure they comply, and will have the power to fine companies up to ten per cent of their annual global turnover – which could reach billions of pounds – to force them to fulfil their responsibilities, or, in the most serious cases, to block non-compliant sites.

When the Bill comes into force, firms will be required to identify and implement solutions to protect their users. Firms hosting content that is harmful to children, such as pornography, will have to prevent children from accessing it, for example by using age verification.

Social media platforms will also be required to safeguard people’s free speech, and their access to journalism and content that is democratically important. The poll follows the announcement of a series of amendments to the Bill last week to strengthen protections for freedom of speech, including tougher protections to guard against the arbitrary removal of articles from recognised news outlets shared on social media.

Last week the government published the list of legal but harmful content social media companies will need to address under the Online Safety Bill.

The categories cover types of online abuse and harassment, such as misogyny, homophobia and content that encourages self-harm, which can fall below the threshold of a criminal offence but still cause significant harm to adults online. This threshold is important to ensure that the online safety framework focuses on the content and activity that poses the most significant risk of harm to UK users. Free speech within the law can involve the expression of views that some may find offensive, but a line is crossed when disagreement mutates into abuse or harassment that refuses to tolerate other opinions and seeks to prevent others from exercising their free speech and freedom of association.

