At its inception, Facebook – like other social media platforms – was envisioned as a free-for-all on which essentially anyone, anywhere, could share whatever they pleased about any subject. Over the years, however, we’ve seen that utopian concept evaporate in the face of inappropriate or offensive comments and “fake news”. It has become clear that Facebook must perform a certain degree of police work on the platform – and that the public needs to be prepared for that.
The issue of Facebook’s responsibility to monitor content was highlighted last week, when the platform – along with Apple and YouTube – took the decision to ban the output of Alex Jones, an American radio host well known for his far-right views and fringe conspiracy theories. The move followed months of public backlash against Jones’ repeated claims that events like the 2012 Sandy Hook massacre of schoolchildren didn’t take place, and that the tragedy was staged to force politicians to impose restrictions on gun ownership in the US.
The case has brought into focus a challenge that for years has vexed Facebook: how to balance its defence of free speech and the need to control hateful and potentially harmful content.
To be clear, Facebook and other social media platforms are not under any obligation to allow people to post whatever they wish. The fact is they remain private companies that are perfectly within their rights to control the activities taking place on their “territory”. A newspaper, for example, is under no obligation to print offensive opinion pieces from the wider public, and a restaurant or mall is fully within its rights to expel customers for behaviour it deems unacceptable.
That said, Facebook is seen as a different proposition because of the power the platform has. Its ubiquity means that its ability to control material its users publish could be seen as an unfair curtailment of free speech – and that’s an accusation that has started to be aired with greater force following the decision to ban Jones. Access to information and ideas, the argument runs, has been placed in the hands of a small band of unelected, San Francisco-based technology moguls.
Yet, the inescapable reality is that Facebook’s power makes content policing more pertinent, not less. Material published on the platform – it could be Jones, it could be Daesh – has the obvious potential to have real-world consequences. In Jones’ case, his views have already resulted in many parents of Sandy Hook victims being harassed and, in some cases, having to flee to escape the wrath of Jones’ audience.
Given the potential damage that can be caused by Facebook, the public should expect – indeed, perhaps even welcome – a certain degree of control over what can be viewed. To give a local example, Arabian Business is often forced to take down offensive comments left beneath our Inside AB videos on YouTube. The Guardian is also very proactive when it comes to cleaning up its comments section.
There’s also consumer power at play here. The ban on Jones’ content came only after people made enough noise to force Facebook to act.
That, perhaps, is the key element. The simple fact is that Facebook, Instagram, Twitter and YouTube only have this degree of power and influence because we, the consumers, continually give it to them. We can withdraw it at any time. That means Facebook’s primary responsibility is to provide a safe environment for its billions of users. It should not be regarded as a custodian of free speech. That’s really not its role.