Britain is preparing to raise the stakes on Facebook and other internet and social media giants as they struggle to contain the fallout from election meddling, the spread of hateful, violent messaging, and the proliferation of online networks for child sexual exploitation.
Prime Minister Theresa May’s government announced a plan Monday to create the world’s first independent regulator for “online harms.” In a more than 100-page white paper outlining the plan, which must first be approved by Parliament, the British government said the new agency will impose a “duty of care” on social media, search, messaging and even file-sharing platforms, requiring them to ensure a range of illegal or abusive content can’t be shared.
In other words, the big tech companies would be legally required to protect their users, and the consequences for not doing so would be severe, according to the white paper. The regulator would be funded by taxes imposed on the internet companies.
The internet giants could face enormous fines if they fail to prevent the spread of harmful material, and senior management could face legal consequences if found to be negligent. The government said in its announcement that it expects this will be the most advanced internet regulatory regime ever developed.
“Many of our international partners are also developing new regulatory approaches to tackle online harms, but none has yet established a regulatory framework that tackles this range of online harms,” wrote the Secretary of State for Digital, Culture, Media & Sport Sajid Javid and the Secretary of State for the Home Department Jeremy Wright. “The UK will be the first to do this, leading international efforts by setting a coherent, proportionate and effective approach that reflects our commitment to a free, open and secure internet.”
The new regulator is expected to meet resistance from some of the world’s most well-known and powerful internet firms. The Internet Association, a DC-based lobbying group founded in 2012 by Google, Amazon, eBay, and Facebook, said it’s committed to working with governments on internet safety, but criticized the announcement as vague in a statement to CBS News.
“The scope of the recommendations is extremely wide, and decisions about how we regulate what is and is not allowed online should be made by parliament,” said Daniel Dyball, the Internet Association’s U.K. executive director.
CBS News contributor Nicholas Thompson, the editor-in-chief of WIRED, said in an interview on CBSN that the regulations may be going too far, too fast.
“The question I actually think is, ‘Are they too much to help the problem?’ Because this is really complicated stuff. Think about false news. Do you really want the government fining social media platforms for publishing misinformation, because the government would define misinformation.”
For years, much of the spotlight surrounding the spread of harmful content online has focused on Facebook, the world’s largest social media company. The site has been home to massive and widespread election-related disinformation campaigns, as well as live-streamed violence, including the recent terrorist attack in Christchurch, New Zealand.
Damian Collins, who leads Parliament’s investigation into disinformation and election meddling, pointed to the Christchurch mass shooting in arguing the regulator should be empowered to proactively pursue investigations.
“The regulator cannot rely on self-reporting by the companies. In a case like that of the Christchurch terrorist attack for example, a regulator should have the power to investigate how content of that atrocity was shared and why more was not done to stop it sooner,” Collins said.
Though Facebook is part of the Internet Association, its separate response to the U.K.’s announcement cast the company as more open to regulation.
“We have responsibilities to keep people safe on our services and we share the government’s commitment to tackling harmful content online,” said Rebecca Stimson, Facebook’s Head of U.K. Public Policy. “New regulations are needed so that we have a standardized approach across platforms and private companies aren’t making so many important decisions alone.”