How Facebook’s algorithm escapes the control of its creators

By Alexandre Piquard and Damien Leloup

Posted today at 6:10 am

This is perhaps the main feeling that emerges from reading the “Facebook Files”. Among these thousands of pages of internal Facebook documents, recovered by Frances Haugen, a former employee, and transmitted by an American parliamentary source to several media outlets, including Le Monde, many passages suggest that Facebook no longer understands, or perhaps never understood, what its own algorithms do. And that its social network has become a machine that is difficult to control.

The “Facebook Files”, a dive into the workings of the “likes” machine

The “Facebook Files” are several hundred internal Facebook documents copied by Frances Haugen, a specialist in algorithms, when she was an employee of the social network. They were provided to the US regulator and Congress, then transmitted by a US parliamentary source to several media outlets, with the personal information of Facebook employees redacted. In Europe, these outlets are, besides Le Monde, the German daily Süddeutsche Zeitung, the WDR and NDR television channels, the Tamedia Group, Knack, Berlingske and OCCRP.

They show that Facebook devotes most of its resources to limiting its damaging effects in the West, to the detriment of the rest of the world. They attest that these effects are known internally, but that the warning signals are not always heeded. Finally, they show that Facebook’s algorithms have become so complex that they sometimes seem to escape even their own authors.

This is particularly true of a crucial algorithm, the one responsible for ranking the messages that appear in users’ news feeds: Facebook uses a multitude of signals, from the simplest (the number of people subscribed to a page) to the most complex (the interest that a user’s “friends” have shown in a subject), to assign a “score” to each message. The higher this score, the more likely the message is to appear in the news feed.
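The principle described above can be sketched in a few lines of code. The signal names and weights here are invented for illustration; the documents do not reveal the actual signals or how Facebook combines them, so this is only a minimal weighted-sum sketch of the general idea.

```python
# Hypothetical sketch of signal-based ranking: each message gets a
# score computed from many signals, from simple (subscriber counts)
# to complex (friends' engagement). Names and weights are invented.

def rank_score(signals: dict[str, float], weights: dict[str, float]) -> float:
    """Combine a message's signals into a single ranking score."""
    return sum(weights[name] * value for name, value in signals.items())

post_signals = {
    "page_subscribers": 50_000.0,  # a simple signal
    "friend_interest": 0.8,        # a complex, behaviour-derived signal
}
weights = {"page_subscribers": 0.01, "friend_interest": 600.0}

score = rank_score(post_signals, weights)
# The news feed is then sorted by score: the higher the score,
# the more likely the message is to be shown.
```

As more signals are added over time, each with its own weight, the scores of the most engaging content can grow without bound, which is the problem the documents go on to describe.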

Extract from one of the anonymized documents sent to the US Congress by the former Facebook employee Frances Haugen, in which an employee is surprised that the ranking scores of certain content are very high.

However, over time, as the social network’s engineers added more and more signals, the average “score” of a message exploded. In an undated document, a Facebook analyst ran some calculations and found that for certain content, the score “may exceed 1 billion”. The very direct consequence is that many moderation tools are rendered inoperative. These tools reduce the score of certain problematic messages by 20%, 30% or 50%, in order to limit their distribution. But for the highest-rated content, the score is so high that even halving it does not stop it from being shown. “Some of this content would remain in the lead even if we applied a 90% drop in score,” laments the author of the document.

Complex system without “unified vision”

This problem is not the result of a deliberate policy deeming certain messages immune to automatic moderation tools. It is simply one of the many side effects of the hundreds of changes made to Facebook’s algorithms over the years, whose consequences the social network’s own engineers seem unable to anticipate.

Read also: Facebook Files: outside the United States, the weaknesses of moderation in dozens of languages

“The different parts of Facebook’s applications interact with each other in complex ways,” and each team develops modifications without any “unified systemic vision,” laments employee Mary Beth Hunzaker in a long note written on the occasion of her departure from Facebook, in August 2020. The consequence? “An increased risk of problems facilitated or amplified by unforeseen interactions between the platform’s functions or services.”
