Women ‘no longer believe social media companies will eliminate abuse’
Social media companies must make it easier to report misogyny, says the Ofcom chief, after its study found 60% of women had experienced harmful behavior online, including harassment and trolling, in just one month.
In an interview with The Telegraph, Dame Melanie Dawes, chief executive of the watchdog, said the ‘shocking’ abuse women faced online was getting worse, while at the same time they had lost faith in the ability of social media companies to remove it when they complained.
“People don’t think that when they report something there will be action,” said Dame Melanie, who promised Ofcom would be “going straight into businesses and asking for information” once it is officially appointed as regulator under the Government’s new online safety laws.
She said a priority when Ofcom takes over as regulator would be to ensure there were effective ways for people to report abuse. It is understood that this will include allowing “bystanders” to report misogyny, harassment and trolling, rather than leaving it solely to the victim to have the content taken down.
The Ofcom study of 6,000 people found that although men were slightly more likely to have encountered harmful behavior online in the past four weeks (64%), women were more likely to be affected by it, at 43% compared with 33% of men.
Women were more likely than men to find hateful, offensive or discriminatory content online a particular concern (85% vs. 70%), as well as trolling (60% vs. 25%).
Dame Melanie said social media companies must prevent and crack down on illegal content such as revenge porn and bullying. “We’re going to go straight there and ask for information on what they’re doing about what’s already illegal,” she said.
“We would say then, ‘Talk to women about your services, understand who is on your services and who really has a problem. Find out what they think of the tools for reporting [abuse], and show them that you take action when something is wrong, because right now there is no trust there.’”
Another key target will be the algorithms of social media companies, which she blames for causing the worst harm online, as they were designed to boost revenue, profits and publicity rather than protect users.
“Some of the worst damage is done when things are shared with hundreds of thousands of other people. That’s when the trolling and pile-ons really happen,” she said.
“Algorithms are too often designed around the business and not around the user. They are often built around engagement. It’s the business model. This is what generates ad revenue for them.
“They’re designed to amplify engagement, but we know that also means they often amplify harm, so that’s the third thing we think the platforms need to address.”
She said she wanted social media companies to ensure their products were safe before launch. “Too often today we see new products, whether it’s the metaverse or new services, being tested on the public, often on quite young people and sometimes on children,” she said.
“And then it is much more difficult to retrofit safety features later. So we want to see that thinking much earlier in the decision making.”
The Online Safety Bill, which will impose new regulations and requirements on tech companies and social media platforms to protect their users, is currently going through parliament.
As official regulator, Ofcom will have the power to fine companies up to 10% of their global turnover, block services that fail to comply with the law, and bring criminal charges against executives who fail to comply with its inquiries or requests for information.