Following Facebook owner Mark Zuckerberg’s recent call for government regulation of the internet, the UK’s Department for Digital, Culture, Media and Sport (DCMS) has responded by proposing new legislation that would fine and block websites which fail to tackle harmful content, such as child abuse or radicalisation.
The DCMS is considering setting up an independent regulator that would hold website owners to account and fine companies that breach any regulations or codes of conduct it implements.
Working jointly with the Home Office, the DCMS has just issued a white paper which puts forward a number of proposals: giving the watchdog the power to enforce regulations, forcing internet service providers to block non-compliant sites, preventing search engines from listing or linking to them, publicly naming non-compliant companies, and imposing harsh fines on companies and their executives if rules are not followed.
Some within the government are arguing for these fines to be similar to those for GDPR breaches, which can be up to 4% of a company’s annual global turnover.
A wide agenda
There is a wide range of harmful issues which the government wants website owners to take responsibility for preventing. These include radicalisation and terrorism, child sex abuse and exploitation, harassment and hate crime, the selling of illegal items and revenge pornography. It is also looking at other harmful behaviours, such as trolling, cyber-bullying and the dissemination of fake news, as well as seeking to remove content that promotes or encourages suicide, anorexia, self-harm and similar behaviours.
Not just for social media sites
While the prime focus of any new watchdog will be to hold social media sites to account, it is not just on these sites where harmful issues arise. Any website which has a forum, a blog with a comments section or which runs a small-scale social media or messaging system will need to look at how it polices publicly uploaded content. Not only can people make harmful comments, they can also post links to other harmful sites.
Similarly, if a business has an account on a social media platform to which anyone can post or add comments, then there is a need to ensure that all of these are checked for harmful content before being published.
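As a rough illustration of what pre-publication checking could look like in practice, the sketch below holds any comment for human review when it matches a simple blocklist or contains an outbound link. It is a minimal example only: the blocklist terms and the `triage_comment` function are hypothetical, and a real moderation workflow would combine far more sophisticated filtering with human reviewers.

```python
# Minimal pre-moderation sketch: new comments are held for human review
# whenever they match a simple blocklist or contain an outbound link.
# The blocklist terms here are illustrative placeholders only.
import re

BLOCKLIST = {"examplebadword", "anotherbadword"}  # hypothetical terms
LINK_PATTERN = re.compile(r"https?://", re.IGNORECASE)

def triage_comment(text: str) -> str:
    """Return 'hold' if the comment needs human review, else 'publish'."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    if words & BLOCKLIST:
        return "hold"  # matched a blocked term
    if LINK_PATTERN.search(text):
        return "hold"  # outbound links are reviewed before publishing
    return "publish"

print(triage_comment("Great article, thanks!"))             # publish
print(triage_comment("See https://example.com for more"))   # hold
```

The key design point is that nothing goes live by default: content is queued until it passes the checks, which matches the proposal’s expectation that harmful material is caught before publication rather than removed afterwards.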
Other sites that need to be more aware are those built to publish third-party content. These include blogs, as well as photo, video, music and game sharing sites. If a third party uses your website to publish illicit content, such as a fake news article, an abusive meme or a song with hate-crime lyrics, then you may be held accountable for it.
Those that run websites which enable third parties to buy, sell or swap goods also need to ensure that what is for sale is not illegal. This doesn’t just mean items like drugs or weapons; some printing companies, for example, let users create designs for t-shirts and mugs which are then advertised and sold online. If these contain harmful slogans, the printing business will be responsible for making sure they are taken offline.
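A listing check along these lines could be sketched as follows. The denylist terms and the `check_listing` function are illustrative assumptions, not part of any proposed regulation; a real marketplace would pair keyword rules like this with human review before an item goes live.

```python
# Illustrative pre-listing check: scan a product title and description for
# denylisted terms before the item is advertised. The terms below are
# placeholders; real rules would be far more extensive.
DENYLIST = ("weapon", "banned-slogan")  # hypothetical examples

def check_listing(title: str, description: str) -> list:
    """Return the denylisted terms found; empty if the listing looks clean."""
    text = "{} {}".format(title, description).lower()
    return [term for term in DENYLIST if term in text]

matches = check_listing("Custom mug", "A plain white mug")
print("approved" if not matches else "rejected: {}".format(matches))
```

Returning the matched terms, rather than a bare yes/no, means the seller can be told exactly why a listing was rejected and a moderator can double-check borderline cases.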
Finally, if you are a hosting reseller or a web developer that hosts websites for your clients, the new proposals may require you to take down any sites which publish harmful content, whether it is done intentionally or because your clients are not policing their own site in compliance with new regulations.
What happens next?
At the moment, the government is only at the consultation stage of drawing up its plans for internet regulation, and there is a long way to go before anything is likely to come into force. However, after years of being criticised for failing to tackle the matter in-house, the major social media sites are welcoming the proposals for regulation, stating that there is a need for standardisation – and this means the process is likely to speed up.
There are several issues that need to be ironed out, however. Firstly, some of the harms the UK government wants to see removed from the internet, such as fake news, are not currently illegal, and this is leading to a heated debate between those who want a safer internet and those who see freedom of speech being eroded. It is this debate which will determine where the line is drawn between what does and doesn’t get included in the regulations.
A second issue is one which Zuckerberg himself raised: that if regulation is going to work, it really needs to be globally implemented and enforced. In other words, the world has to agree on what is and isn’t harmful behaviour. Without agreements like this, each country may end up with its own regulations which will make it difficult for international websites to comply with them all.
From the perspective of a website owner, you will need to wait and see the outcome of the consultation in the UK. In the meantime, you should review your website policies and look at ways to make sure that harmful material does not get published on your site.