Ofcom could soon force social media companies to take action against harmful content

Anne Freier | February 13, 2020


Ofcom, the UK regulator for broadcasters and telecoms providers, could soon force social media companies to take action when harmful content is detected.

According to a BBC report, new government plans could extend Ofcom’s scope to online safety.

This would mark a significant shift in the regulator's powers over social media firms, which until now have been largely self-regulated in the UK.

The new rules would apply to companies hosting content such as forums, comments and video-sharing platforms.

Under the new plans, Ofcom would be responsible for dealing with online threats as they emerge.

“Ofcom’s appointment as the ‘Internet Regulator’ throws up some interesting challenges for Ofcom,” said Scott Morrison, Director at Berkeley Research Group. “Ofcom has an analogous role in regulating broadcast TV and Video-on-Demand services, and so should have transferable skills and people to take on the new role. However, the role of policing the internet is ultimately more challenging than regulating broadcasting services.”

“Online firms operate internationally, and Ofcom may face jurisdictional issues in attempting to regulate these online firms. Furthermore, the sheer volume of online content is vast. For example, it is estimated that over 500 hours of content are uploaded to YouTube every minute. For online firms or Ofcom, policing this content cannot be undertaken by humans alone, and will require some form of AI.”