
Today, Renee DiResta and I submitted commentary on the safety and empowerment of users to the Committee of Ministers for the Council of Europe.
Thank you for the opportunity to comment on the Draft Recommendation of the Committee of Ministers to member States on online safety and empowerment of content creators and users. We write as academic and technology policy specialists with a focus on user empowerment tools.
Our joint policy work builds on complementary vantage points: DiResta’s empirical research on how large platforms’ design choices mediate fundamental rights, and Hogg’s policy work on competition and interoperability. Together we have examined the operational realities of content curation and moderation, the blend of state and private power that shapes today’s discourse, and the practical trade-offs introduced by regulations such as the Digital Services Act. Our co-edited whitepaper “Shaping the Future of Social Media with Middleware” crystallises this collaboration, combining field research with broader expert perspectives, legal analysis, and case studies to propose user-delegated curation and moderation as a rights-preserving alternative to centralized, one-size-fits-all platform control.
1. Statement of Support and Recommendations
We welcome and strongly support the draft Recommendation’s ambition to embed empowerment by design into Europe’s online-safety framework.
To give that principle force and specificity, we urge the Council of Europe (CoE) to place middleware—third-party software intermediaries that act as delegated agents between users and platforms—at the heart of the instrument. Middleware operationalizes the draft’s user-autonomy goals, offers a tangible mechanism for executing the envisioned design-related duties, and aligns with parallel EU initiatives such as the Digital Markets Act (DMA) interoperability rules. Concretely, we recommend:
1. Adding a definition of middleware to the Definitions section and giving users a right of delegation to certified middleware providers. For example, the delegation right might read: “Users have a right to delegate editorial decisions of content moderation and recommender systems for their platform feeds to third-party software intermediaries.”
2. Inserting a new systemic duty (“empowerment duty by design”) obliging “platforms of significant influence” to expose interoperable APIs and granular feed-selection tools that enable middleware.
3. Clarifying that empowerment duties are compulsory only where proportionate, specifying in § 57 that they apply chiefly to “platforms of significant influence”, while smaller or niche services may be exempt.
4. Providing safe-harbour incentives and explicit immunity—modeled on existing intermediary-liability doctrines—for platforms that make middleware-friendly APIs available and third-party providers that supply user-empowerment tools, such that offering such controls does not increase their legal exposure. This would be coupled with baseline obligations to combat illegal content.
These changes will empower creators and users, and rebalance informational power without imposing prescriptive editorial mandates.
2. General Feedback on Draft
2.1 User controls address the concentration problem highlighted in § 1
Paragraph 1 of the draft notes that “a small number of influential platforms” can effectively determine the means for exercising freedom of expression. Middleware mitigates this risk by distributing curation and moderation choices across many providers rather than concentrating them in a single platform.
2.2 Middleware as the practical path to user empowerment
The draft rightly centers user autonomy. Middleware operationalizes that autonomy by letting individuals (or communities) choose the algorithms that curate their feeds and the policies that moderate their spaces, rather than having those choices imposed by a centralized platform or state authority.
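The division of labour described above can be sketched schematically. The sketch below is purely illustrative: the names (`FeedItem`, `Middleware`, `render_feed`, `chronological_no_spam`) are hypothetical and do not correspond to any existing platform API; the point is only that the platform supplies candidate content while a user-selected provider decides ranking and filtering.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class FeedItem:
    item_id: str          # stand-in for a platform content identifier
    author: str
    text: str
    engagement_score: float  # the platform's own ranking signal

# A middleware provider, in this sketch, is any function the user delegates
# to: it receives the platform's candidate items and returns the curated
# feed in the order the user will see it.
Middleware = Callable[[List[FeedItem]], List[FeedItem]]

def chronological_no_spam(items: List[FeedItem]) -> List[FeedItem]:
    """Illustrative provider: drop unwanted items, ignore engagement ranking."""
    kept = [i for i in items if "spam" not in i.text.lower()]
    # Sort by item_id as a stand-in for chronological order.
    return sorted(kept, key=lambda i: i.item_id)

def render_feed(candidates: List[FeedItem], provider: Middleware) -> List[FeedItem]:
    """The platform calls whichever provider the user has selected."""
    return provider(candidates)
```

Under this pattern, swapping providers changes the user's feed without the platform imposing a single editorial policy, which is the delegation the Recommendation could make a right.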
2.3 Alignment with existing European instruments
The DMA already recognises the need for interoperability and portability to spur competition and user choice. Adding middleware to this Recommendation would conceptually align Council of Europe standards with existing EU regulations.
2.4 Including liability protection for platforms
For interoperability and portability requirements to be effective, platforms must be secure in understanding the extent and limits of their legal liability for the actions and practices of third parties. We offer three recommendations. First, safe harbours and liability protections should be established where a platform exposes certified middleware APIs and undergoes independent safety audits. Second, escalating regulatory scrutiny should be applied where a platform blocks or degrades API access. Third, safe harbours should be extended to third-party tool providers themselves, ensuring they are not penalized for enabling users to filter, rank, or block content—an approach compatible with European human-rights principles.
2.5 Addressing potential objections
We acknowledge concerns that personal-choice algorithms could deepen echo chambers. However, middleware’s modular design allows for pro-social defaults and bridging metrics, while still preserving user choice. Baseline illegality rules under the DSA would remain untouched; middleware supplements, rather than replaces, top-level content standards.
By explicitly weaving middleware into the Recommendation’s text, the Council of Europe can convert its high-level commitment to user empowerment into concrete, future-proof standards that expand choice, enhance safety, and reduce systemic dependencies on a handful of dominant and centralized providers of content moderation and recommender systems. We would welcome the opportunity to provide further evidence or participate in forthcoming multi-stakeholder dialogues.
We appreciate the work of the Committee and look forward to continued opportunities to engage on issues of online safety and empowerment of content creators and users.