Facebook’s parent company Meta: A spokesperson said the “board’s decisions and recommendations have not only affected the content of each individual case, but have also ensured that we update our policies and enforcement systems for managing and detecting content, in order to correct mistakes”. The company emphasizes that it has invested billions in platform security in recent years and that 40,000 people work on this.
Oversight Board: The board states that its aim has never been to judge every case, but to bring about gradual improvements in Meta’s policy. “We do this by taking on cases with precedent-setting potential for how Meta handles content moderation. We provide advice that calls for greater clarity, consistency and accountability in how the company develops, publishes and implements its policies.” The Oversight Board also says that its work has led to more clarity about how moderation decisions are made.