In its early days, the internet was described as an information superhighway. The metaphor has fallen out of use, but it captures an important truth. Highways have rules – speed limits; no drunk driving – because a free-for-all would be deadly.
The damage caused by reckless or malicious use of an information network is not as visible to passing traffic as a road accident, but there is no doubting the problem. A US Senate committee this week heard the testimony of Frances Haugen, a former employee of Facebook now turned whistleblower, on what she sees as the company’s negligent practices. Users of Facebook’s family of products number around 2.8 billion, so it matters to the whole world if they are hazardous.
Ms Haugen has leaked internal research to support her claim that the company is aware of detrimental effects caused by its services – exacerbating mental health problems in young teenagers, for example – but chooses profit over safety. She believes Facebook products “harm children, stoke division and weaken democracy”.
That effect is spread across a suite of applications. Instagram and WhatsApp are also part of the empire controlled by Mark Zuckerberg, Facebook’s chief executive. The breadth of his realm was illustrated earlier this week when a technical fault shut down large parts of it. For most users that was a minor inconvenience, but there are places where Facebook is synonymous with the internet. Such power cannot be entrusted to a negligent organisation. In Myanmar, Facebook has been a primary conduit for material inciting hatred against the Rohingya minority. The company does not dispute that it was used to instigate violence. But it focuses its moderating efforts on content in the US, since that is where it fears regulation.
The era of utopian romance around tech businesses, when it was claimed that they were connecting people for the betterment of humanity, is long past. The giants of the sector are now viewed as oligopolists, paying too little tax, hoarding personal data and neglecting the social cost of their business model.
Chemical plants are not free to discharge toxic waste into water supplies. Tobacco advertising is restricted. When, in the past, an industry’s product has been shown to cause pollution or ill health, government has stepped in. That process is catching up with Facebook. The task is much harder because the raw material it uses is information, and designing regulation for information soon becomes a debate over censorship. The boundary between tolerable unpleasantness and aggression against civil society is not easy to discern, as the UK government has discovered. Its online safety bill, as drafted, would hand Ofcom an awkward statutory duty to police the way digital platforms protect “democratically important” content without political bias, restricting harmful material without curtailing free expression.
But British regulation of Facebook is downstream of action taken in the US, where there is a federal move to break the company up in order to bolster competition. In theory, a challenger in the market could satisfy demand for a social network that takes its social responsibilities seriously. That is some way off, but Facebook’s dominance will not last for ever. Its power was amassed in an era when the toxicity of its products was hidden from view. Thankfully, that era is drawing to a close.