Investigation | Facebook Files: In internal company documents, the company's engineers admit to being baffled by computer code with unforeseen effects, which has turned the social network into a complex machine that is difficult to control.
This is perhaps the main impression that emerges from reading the "Facebook Files". Among these thousands of pages of internal Facebook documents, copied by Frances Haugen, a former employee, and passed by an American congressional source to several media outlets, including Le Monde, many passages suggest that Facebook no longer understands, or simply does not understand, what its own algorithms do, and that its social network has become a machine that is difficult to control.
The “Facebook Files”, a dive into the workings of the “likes” machine
The "Facebook Files" are several hundred internal Facebook documents copied by Frances Haugen, an algorithms specialist, while she was an employee of the social network. They were provided to the US regulator and to Congress, then passed by a US congressional source to several media outlets, with the personal information of Facebook employees redacted. In Europe, these outlets are, besides Le Monde, the German daily Süddeutsche Zeitung, the television channels WDR and NDR, the Tamedia group, Knack, Berlingske and the OCCRP.
They show that Facebook devotes more resources to limiting its harmful effects in the West, to the detriment of the rest of the world. They attest that these effects are known internally, but that the warning signals are not always heeded. Finally, they show that Facebook's algorithms have become so complex that they sometimes seem to escape even their own authors.
This is particularly true of a crucial algorithm responsible for ranking the messages that appear in users' news feeds: Facebook uses a multitude of signals, from the simplest (the number of people subscribed to a page) to the most complex (the interest that a user's "friends" have shown in a topic), to assign a "score" to each message. The higher this score, the more likely the message is to appear in the news feed.
However, over time, as the social network's engineers added new signals, the average "score" of a message has exploded. In an undated document, a Facebook analyst ran some calculations and found that for some content the score "may exceed 1 billion". This has a very direct consequence: it renders many moderation tools inoperative. These tools reduce the score of certain problematic messages by 20%, 30% or 50% in order to limit their distribution. But for the highest-rated content, the score is so high that even halving it does not stop it from being widely shown. "Some of this content would remain at the top even if we applied a 90% reduction to its score," laments the author of the document.
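The failure mode described above can be sketched in a few lines of code. This is an illustrative toy model, not Facebook's actual ranking system: the post names, scores and demotion factor are invented. It shows why a multiplicative demotion loses its effect when one item's score dwarfs all the others by several orders of magnitude.

```python
# Toy sketch (hypothetical data, not Facebook's code): multiplicative
# demotion fails to reorder a feed when scores span orders of magnitude.

def rank_feed(posts):
    """Sort candidate posts by score, highest first."""
    return sorted(posts, key=lambda p: p["score"], reverse=True)

posts = [
    {"id": "viral_post", "score": 1_000_000_000},  # extreme outlier
    {"id": "friend_update", "score": 5_000},
    {"id": "page_post", "score": 1_200},
]

# Moderation applies a 90% score reduction to the problematic post...
posts[0]["score"] *= 0.10

# ...yet it still ranks first: 100,000,000 is far above 5,000.
ranked = rank_feed(posts)
print([p["id"] for p in ranked])
# ['viral_post', 'friend_update', 'page_post']
```

Because ranking is relative, a percentage-based penalty only matters if it brings the score down into the range of competing posts, which a 90% cut on a billion-point score does not.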
Complex system without “unified vision”
This problem is not the result of a deliberate policy holding that certain messages should be immune to automatic moderation tools. It is simply one of many side effects of the hundreds of changes made to Facebook's algorithms over the years, changes whose consequences the social network's own engineers seem unable to anticipate.
"The different parts of Facebook's applications interact with each other in complex ways," and each team develops modifications without any "unified systemic vision," laments employee Mary Beth Hunzaker in a long note written on the occasion of her departure from Facebook in August 2020. The consequence? "An increased risk of problems facilitated or amplified by unforeseen interactions between functions or services of the platform."