A needed revolution for social media platforms
Should Canada build public infrastructure for social media platforms in order to counter leviathans like Facebook and X? And should rigorous state regulation of those leviathans be rolled out soon?
Image is from the April 2025 announcement by the Forum on Information and Democracy (FID) of a paper by a dozen scholars on a pathway for Europe to counter the US-based platforms of the tech oligarchs.
On Monday, we cross-posted part 2 (« Dark Patterns and the New Propaganda ») in a series from @ALetterFromAMaritimer on Substack. On Wednesday, we explored the need for civic education and a renewed journalism of both fact-checking and deeper-dive explanatory reporting.
Today, we share some developments around alternatives to the toxic, profit-driven outrage machine of deception and conspiracy theories that undermines democracy, pits people against each other, and increases risks to individual and social health.
If we are to revive the early hopes for the democratic potential of social media, what is needed is a publicly funded network of digital platforms deliberately designed to promote democratic engagement. That will require that such platforms be designed and governed on the principle of “digital sovereignty”, ensuring that people have control over their privacy and over how the data they produce are used. It also requires governance autonomy, at national and local levels, from foreign-actor agendas, whether those actors are corporate leviathans or states bent on interference of various sorts.
Here is a recent (April 2025) piece written by a dozen scholars that is quite wonky but makes the critical point that democracy depends on citizens making informed decisions: “Europe has a unique chance to establish local social media platforms, to counter the new US technopolitics.” A brief overview from the Forum on Information and Democracy (FID) provides a sense of the paper’s thrust:
[T]he paper urges European leaders to play the card they hold in technology governance. It explores the potential impact and feasibility of creating a European social media infrastructure that would comply natively with local laws at the European level. This new infrastructure would have economic, social and political benefits, and place the user experience and rights at the center of the industry:
Local social media platforms respecting the regulations would be better suited to address issues such as disinformation thanks to better traceability of content, and benefit European citizens through more transparency on the use of their personal data.
The new sovereign digital infrastructure would provide interoperability between platforms, allowing users to easily communicate and migrate from one platform to another and avoiding a ‘locked-in’ effect on the users.
The creation of a new infrastructure and platforms would guarantee their compliance with the local laws, notably the DSA and the DMA, and overcome the challenge of retroactively modifying the existing platforms to effect compliance.
The question of government leadership, and funding, of a counterweight strategy such as this does not seem to have generated much discussion yet in Canada. There is no call from Canadian scholars for the federal government to seize the moment in the way the FID paper is calling for in Europe.
However, what we do have are several initiatives that have emerged from civil society, each focusing heavily on the digital sovereignty of Canadians in relation to US-dominated, corporate social media platforms.
Two of them, Gander and Eh Now, are inviting people to sign up now in order to receive information when they go live, but it seems their user interfaces and back-end functionalities have yet to be set out in detail.
The third project, the Great Canadian Shield, is much more forthcoming in terms of functionalities but, unlike Gander and Eh Now, is not at a near-launch stage. Our understanding from its champion, Sean Luo, is that the plan is to hold off on rolling out such a platform until it can be integrated into a wider “digital democracy” environment, about which discussions are under way, and until sustainable funding can be secured. We have already profiled a 50-minute YouTube explainer on this project; that video can be accessed via the project’s website:
Great Canadian Shield (greatcanadianshield.ca)
Most of us would not expect counterweights to Facebook and X to emerge any time soon or, once they emerge, to become major players quickly. It will take time for alternatives to take hold and grow, and in the meantime the harms of the dominant players will continue. Government regulation must be addressed, and soon, notwithstanding that the newly elected government is not signaling this issue as a priority either in the Prime Minister’s mandate letter to ministers or in the Throne Speech.
Here are three pieces to spark reflection on government regulation.
The first piece was published by TechPolicy.press on April 28, the day the election started, and reviews the state of play of Bill C-63, the Online Harms Bill, which targeted some of the problems of social media (apart from child predation, it also focused on the propagation of hate and on the breeding of extremism leading to terrorism): Many Lau, “Canada’s Online Harms Bill is Dead (Again): Three Questions to Consider for the Next Round.”
Getting regulation right is tricky: protecting freedom of expression and diversity of views while at the same time attending to the real harms of propaganda, the spewing of hate, ideological manipulation by owners over what people see and do not see, and so on. But doing nothing is not an option given the stakes. The second piece, from the Canadian Centre for Policy Alternatives, is by Sondos Kataite and is called “Canada has a disinformation problem—and the tools to fix it.” Here is part of the argument:
The European Union’s regulatory frameworks demonstrate that serious regulation of disinformation is both feasible and enforceable. The Digital Services Act (DSA) and the Digital Markets Act shift responsibility onto platforms, requiring algorithmic transparency, systemic risk assessments, and clear moderation procedures. These aren’t vague guidelines—they come with teeth. Failure to comply can result in fines of up to six per cent of the company’s global revenue.
Canada can follow suit by shifting away from voluntary moderation and moving toward structural reform: mandating transparency about how platforms curate content, requiring that users be allowed to opt out of algorithmic recommendation systems, and ensuring that independent researchers have access to platform data.
The EU’s Code of Practice on Disinformation also shows that voluntary commitments only work when paired with regulatory pressure. U.S. big tech companies, like Meta and X, have resisted transparency even under EU law, proving that self-regulation fails without legal consequences.
Canada has some legal tools, but without holding platforms accountable for how they function—not just what they host—these tools fall short. The EU’s approach offers a model: treat disinformation as a systemic risk and regulate the infrastructure that enables it.
At this point, we imagine many of you have been wondering as you read this post, ‘Doesn’t AI change everything?’ Indeed, AI is a major development that makes effective regulation look like an arms race. Here is a piece by Alexander Martin, published a few days ago by Open Canada, to help us ponder: “The AI Threat to Canadian Democracy: Fighting for Digital Sovereignty: AI-driven interference is outpacing regulation efforts and endangering our democracy.” Martin’s piece concludes:
Canada must urgently develop a robust and adaptive digital governance framework to safeguard its sovereignty and democracy. The Trudeau government made initial efforts through several bills… [T]hese initiatives …now need to be resumed and updated to address challenges like AI threats and mis/disinformation.
Indeed, to effectively combat AI threats and mis/disinformation, Canada must prioritize stronger, comprehensive regulations. Building on past initiatives and taking cues from frameworks like the EU’s Digital Services Act (DSA), the government needs to enforce stricter rules on tech platforms and invest in domestic digital infrastructure to protect Canada’s digital sovereignty.
As Minister of Artificial Intelligence and Digital Innovation, Evan Solomon should carefully consider these issues and ensure they are addressed through effective policies and enforcement. His leadership is essential to making sure Canada’s digital governance keeps pace with evolving risks.
What do you think?
I think it's a great idea. I quit FB and Twitter on principle and realize I'm missing a lot of content that keeps me informed, but I just can't support the oligarch "tech bros" who are actively trying to destroy a democracy that they feel gets in their way.
To retain a strong Canadian identity on our social media, we must prioritize platforms that reflect our values, laws, and culture. This means supporting Canadian-made digital networks or adopting open-source alternatives managed within Canada. We should encourage local content creation, promote bilingual communication (English and French), and ensure our data is stored and governed on Canadian soil. By fostering a digital space rooted in Canadian principles—privacy, inclusivity, and diversity—we can resist foreign influence and strengthen our national voice online.
A. Paradis, Laval, Qc