The European Commission’s refusal to give public access to the risk assessment report of a large social media company on its compliance with the provisions of the Digital Services Act | Case | European Ombudsman

The complainant asked the European Commission for public access to the risk assessment report of a large social media company on its compliance with the provisions of the Digital Services Act (DSA) – annual reporting is part of the obligations of ‘very large online platforms’ under the Act. The Commission refused access to the document, arguing that disclosure could be generally presumed to undermine both the commercial interests of the company in question and an ongoing investigation into the company’s compliance with the DSA. The Commission did not individually assess the report for possible disclosure.

The Ombudsman concluded that the Commission’s application of a general presumption of non-disclosure to the risk assessment report constituted maladministration. She recommended that the Commission conduct an individual assessment of the document with a view to granting the widest access possible.

Source: The European Commission’s refusal to give public access to the risk assessment report of a large social media company on its compliance with the provisions of the Digital Services Act | Case | European Ombudsman

EU court adviser rejects Meta’s fight against EU antitrust demands for Facebook data

“In his opinion, Advocate General Athanasios Rantos proposes that the Court of Justice dismiss both appeals and uphold the judgments of the General Court”, the court said in a statement, adding that Rantos said in his non-binding opinion that the General Court “did not err in law in assessing the necessity of the information requested or in examining the safeguards for its provision.”

Source: EU court adviser rejects Meta’s fight against EU antitrust demands for Facebook data

The Practical Limits of Algorithmic Transparency: Lessons from France · Félix Tréguer

In the face of concerns raised by AI systems in recent years, transparency has emerged as a central governance principle, enshrined in regulations from France’s Digital Republic Law to the EU AI Act. This talk examines the practical enforcement of algorithmic transparency provisions through action-research conducted by La Quadrature du Net, a French digital rights organization. Drawing on five years of Freedom of Information Act (FOIA) requests targeting surveillance algorithms deployed by French public authorities, we document systematic enforcement failures that render transparency requirements largely symbolic. Our findings suggest that transparency operates primarily as a legitimizing device for algorithmic governance rather than as an effective accountability mechanism.

Source: The Practical Limits of Algorithmic Transparency: Lessons from France · Félix Tréguer