Members of the UK parliament refer to Facebook as a “digital gangster”

In a report published in 2019, British parliamentarians did not hesitate to attack the Californian company head-on, judging it incapable of fighting “fake news” and guilty of abusive exploitation of data. The MPs consider that the era of self-regulation for the major networks is over, and that a code of ethics overseen by an independent authority must now be put in place.

For the digital giant Facebook, which now owns Instagram and WhatsApp and whose owners once believed they controlled the social media landscape, the era of self-regulation seems to be over. After a 2018 marked by the resounding Cambridge Analytica scandal, the social network with 1.5 billion users will now have to face up to its responsibilities, especially towards legislators in many countries, who for several months have been calling for laws to punish abuses.

Amongst the most active in this area, British MPs published a report in 2019[1] on the practices of Facebook and the other digital giants. The findings, set out in roughly a hundred pages and the fruit of eighteen months of parliamentary work, are overwhelming: Facebook and its competitors have behaved like “digital gangsters”, first by allowing inappropriate content and fake news, sources of disinformation and potential manipulation, to spread on their platforms. Facebook is also accused of having “intentionally” violated antitrust laws and the texts governing the protection of personal data. Overall, the British MPs consider that Facebook acts as if it were above the law: as proof, the authors of the report tried to hear from Mark Zuckerberg, who refused three times.

Data that Facebook is using against the EU regulation.
Credits: Markus Spiske, 2018, pexels.com

The UK Digital, Culture, Media and Sport Committee is calling for social networks, like the other digital giants, to be subject to a code of ethics.

To make it effective, the commission recommends setting up an independent authority with the power to take legal action. It recommends funding the authority through a new tax on digital companies.

While misinformation is the main focus of the British MPs’ comments, competitive practices, and in particular the data economy on which the Facebook model is based, are also in the committee’s sights. “Companies like Facebook have considerable market power, which allows them to make money by intimidating small technology companies and developers who rely on this platform to reach their customers,” says committee chair Damian Collins. For example, the report cites the case of Six4Three, an application developer in conflict with Facebook, which had to close down.

One of the main criticisms that can be levelled at the digital giants concerns the ever-growing collection of the personal data that citizens leave behind when using the Internet; consider, for example, how the data Google collects from users under its “privacy rules” evolved between 1999 and 2019[3]. And it is not only the collection that matters: we should also note the use that is made of these data once they have been processed by these companies’ powerful algorithms.


There is also the problem of fake news, which has of course existed for as long as the world itself. But the rise of social media, with its almost instantaneous diffusion and the global reach of its audiences, has given it an unprecedented capacity for harm. Examples are so numerous, in the most diverse fields and in their almost infinite particularities, that it would seem pointless to catalogue them.

Depiction of a Facebook user struggling to tell fake from real information
Credits: Kaboompics.com, 2014, pexels.com

Yet we can never say enough about the extent to which false news is, and for the moment remains, a real “cancer” for our democracies: through the dramas it sometimes causes, and through the intellectual and moral corruption it breeds wherever it spreads, even on points that should normally command universal ethical agreement.

What would seem constructive is to ask whether it is possible to quarantine, or at least significantly reduce, the spread and nuisance of the very large share of incorrect information circulating on the Internet.

It is probably time for Facebook to consider simple, systemic solutions that could effectively limit the scope and impact of misinformation on social media platforms. For example, by systematically correcting fake news: providing every user who has seen incorrect information with independently verified corrections would significantly reduce belief in misinformation. In parallel, the algorithm could be modified to systematically downrank disinformation and its purveyors in users’ news feeds, mechanically and very significantly reducing their ability to spread. In a way, this would mean redefining the Facebook algorithm so that it does almost the opposite of what it does today.
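To make the idea concrete, here is a minimal sketch of the two mechanisms described above: downranking flagged content in a feed, and attaching independently verified corrections for users who have already seen it. All names, weights, and the `misinfo_score` field are hypothetical illustrations, not Facebook’s actual ranking model.

```python
from dataclasses import dataclass

@dataclass
class Post:
    id: str
    engagement: float        # baseline relevance/engagement score
    misinfo_score: float     # 0.0 (verified) .. 1.0 (flagged by fact-checkers)
    correction_url: str = "" # link to an independent fact-check, if any

# Assumed penalty strength for flagged content (illustrative value).
DOWNRANK_PENALTY = 0.9

def rank_feed(posts):
    """Order posts so that flagged misinformation is pushed down the feed."""
    def score(p):
        # Degrade the score in proportion to the misinformation rating,
        # instead of letting high engagement amplify flagged content.
        return p.engagement * (1.0 - DOWNRANK_PENALTY * p.misinfo_score)
    return sorted(posts, key=score, reverse=True)

def feed_with_corrections(posts):
    """Attach a verified correction to any heavily flagged post a user sees."""
    feed = []
    for p in rank_feed(posts):
        entry = {"post": p.id}
        if p.misinfo_score > 0.5 and p.correction_url:
            entry["correction"] = p.correction_url
        feed.append(entry)
    return feed
```

The design choice here is that flagged posts are not deleted: they remain visible but lose the algorithmic amplification that engagement alone would give them, and they travel with their correction, which is roughly the opposite of an engagement-maximising feed.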

Of course, one can dream; but there are, a priori, no intellectual or mathematical barriers to such a change.
