Federal Bern bows to Brussels officials
New law to allow platforms to be blocked
by Michael Straumann*
(12 December 2025) Around half of the Swiss population has turned its back on traditional media. This is shown by the new Quality of Media Yearbook published by the University of Zurich: 46 per cent are now considered “news deprived” — people who hardly ever consume news and, if they do, only via social media.[1] This is a historic high.
This development is not new. Trust in the established media has been declining for years. This opens a window of opportunity for newly emerging media that see a gap in the market. For the old media themselves — and even more so for the “classe politique” that uses them as its preferred stage — the trend is alarming. The interpretive authority of the political-media complex is eroding in Switzerland, albeit slowly.
Instead of asking themselves self-critically why trust has been crumbling for years, the old media and politicians in this country like to shift the blame away from themselves. Sometimes it’s the Russians or the Chinese, sometimes it’s the unregulated social media with their opaque algorithms. The buzzwords are always the same: disinformation, misinformation.
Fake news is always spread by “the others”
In June 2024, the federal government published a report entitled “Influence activities and disinformation”,[2] warning of the dangers of alleged fake news. And recently, Albert Rösti – SVP Federal Councillor and Head of the Federal Department of the Environment, Transport, Energy and Communications (DETEC) – declared during an appearance at the Swiss Museum of Transport in Lucerne that “disinformation is a crime”.[3]
If this is really the case, then Alain Berset, the former health minister, should logically be prosecuted for his false statements during the coronavirus pandemic – for example, for his appearance on the Arena TV programme on 5 November 2021, when he falsely claimed that the Covid certificate shows “that you are not contagious”.[4] But that will not happen. On the contrary: Berset has become Secretary General of the Council of Europe[5] and was recently awarded an honorary doctorate by the University of Fribourg in Switzerland.[6]
Fake news is always spread by “the others”. Telegram founder Pavel Durov once put it clearly:[7] terms such as “misinformation” and “disinformation” serve as “code words for censorship” to silence unwelcome voices. The Federal Council’s latest push to introduce a new law regulating social media and search engines also appears in this light.
Chronicle of a law foretold
According to the Federal Council, the planned “Federal Act on Communication Platforms and Search Engines”[8] is intended to “strengthen the rights of users in the digital space and oblige very large platforms to be fairer and more transparent”. The EU’s Digital Services Act[9] served as the model. Since August 2023, that act has required internet platforms to act not only against illegal content but also against “disinformation” and “hate speech” – terms that are broadly defined and leave considerable scope for the deletion of politically unwelcome opinions.
Switzerland is now moving in a similar direction, albeit in milder form. The pattern is well known: the country follows the example of others – with a time lag and in watered-down form, but essentially based on the same model. The debate was initiated by Jon Pult, an SP National Councillor from the canton of Graubünden, who submitted a parliamentary initiative in November 2021,[10] at the height of the coronavirus pandemic. Even then, the demand was that “hate speech” and “disinformation” must be consistently combated.
This was followed in December 2022 by a “Joint Statement on Platform Regulation”,[11] written by AlgorithmWatch, Digitale Gesellschaft and Stiftung Mercator Schweiz. The paper advocated adopting key elements of the Digital Services Act and focused on the fight against hate speech and disinformation. It met with resistance from other organisations: the Pirate Party and the Chaos Computer Club warned in a statement of their own[12] that the state should not become the arbiter of truth – otherwise the door would be opened to censorship.
In February 2023, Pult’s parliamentary initiative was rejected by the relevant committee[13] – not least due to public pressure from organisations such as the Chaos Computer Club, the Pirate Party and the Internet Society.
In January 2025, the Federal Media Commission (EMEK) weighed in on the matter.[14] This extra-parliamentary federal commission, several of whose members have close ties to the influential Mercator Foundation (including Angela Müller from AlgorithmWatch[15]), directly referenced the 2022 joint statement and called on the federal government to press ahead with the regulation.
Despite this pressure, the Federal Council took an unusually long time to produce its preliminary draft.[16] The international situation probably also played a role: US President Donald Trump described measures taken against platforms such as X or Meta as discrimination against US companies, and the Federal Council may have wanted to avoid stirring up trouble. The move then followed at the end of October 2025: the Swiss government presented its draft and opened the consultation process.
Authorities could block platforms in future even without a court order
The law would only apply to platforms used by at least ten per cent of the population at least once a month. This would cover YouTube, WhatsApp, LinkedIn, Instagram, Facebook, Snapchat, Pinterest, TikTok and various messenger services. Among search engines, Google is likely to be the most affected.
Article 4 of the draft provides for a reporting procedure. Most large platforms already have corresponding functions, so the instrument is not fundamentally new. Content can already be reported on X, with different reporting categories depending on the location setting.
In terms of content, the reporting procedure targets violations of Swiss criminal law: depictions of violence, defamation, slander, insults, threats, incitement to murder, sexual harassment, public incitement to crime or violence, and discriminatory or hateful statements under Article 261bis of the Swiss Criminal Code. A recent case shows how broadly hate speech provisions are now interpreted: a craftsman from Bern was sentenced to ten days in prison[17] for daring to say that there are biological differences between men and women.
The draft becomes controversial when it introduces very broad criteria. Article 20(2)(c) refers to “negative consequences for the formation of public opinion”. What does that mean in concrete terms? Political slogans? Polemical posts? Opinions that are unwelcome to the state? Formulations such as “negative consequences for election and voting processes”, “for public safety” or “for public health” remain equally vague. The scope for interpretation is considerable – and so is the potential for abuse.
Even more sensitive is the planned out-of-court dispute resolution. Whereas today the police, public prosecutors and courts are responsible, in future a single click on the report button could be enough to make a post disappear without judicial review. Although an appeal procedure is provided for, this would effectively undermine the ordinary legal process: alleged offences would no longer be examined by the courts – the content would simply be deleted.
The draft also closely follows the EU’s Digital Services Act in terms of sanctions. The Federal Office of Communications (OFCOM) would be able to impose heavy fines without a court ruling; companies could appeal to the Federal Administrative Court only after the fact. The penalties are considerable: up to six per cent of global annual turnover, additional fines of up to one per cent – which may exceed the total annual profit – and up to ten per cent for violations of the obligation to provide information. Even refusing to grant certain civil society organisations access to data could be punished.
OFCOM’s powers go furthest when it comes to network blocking. The authority would be able to impose administrative measures without a court order. Article 32(2) is particularly controversial: OFCOM could instruct telecommunications service providers to restrict access to a platform if other measures prove ineffective or there is “reason to believe” that they would be.
In concrete terms, this means that OFCOM could have platforms such as X, Telegram, Facebook or YouTube blocked for Swiss users. Technically, such blocks could be circumvented via VPN, but they would still be network blocks – an instrument otherwise known mainly from authoritarian states. Moreover, it would not be a court that decides, but a federal authority: OFCOM could arbitrarily determine that a platform does not meet the requirements – and order the block. Swiss internet providers such as Swisscom would have to implement it. Article 33 limits such blocks to 30 days, but they can be extended, which could effectively lead to longer-term blocks.
Tighter restrictions loom
The years of work by organisations closely linked to the Mercator Foundation are bearing fruit: the Federal Council has adopted key elements of their demands. The draft does not go as far as some of them would have liked, and one point stands out: it contains no provisions to combat “disinformation”.
Whether this will remain the case is questionable. In the consultation process, the SP, the Greens and the FDP are likely to push for tighter restrictions. In a press release, the Greens have already criticised[18] the draft for not containing any measures against “disinformation campaigns”. It is therefore quite possible that the final version will look less like a “Digital Services Act Light” and move significantly closer to the original.
The very fact that the law empowers a federal authority to block entire platforms in serious cases shows how far-reaching the powers would be – and how authoritarian certain elements of the bill are. It is therefore no exaggeration to refer to it as a censorship law. One thing is clear: this law would further restrict the public debate space, which has been shrinking in Switzerland for years. The consultation process will run until 16 February 2026. It remains to be seen how far politicians will ultimately go.
* Michael Straumann, born in 1998, studies political science and philosophy at the University of Zurich and works as an editorial intern for the magazine “Schweizer Monat”. He is the editor of “StrauMedia”.
Source: https://www.straumedia.ch/p/sowjetisierung-der-debatte, 18 November 2025.
This article also appeared as a column on the portal of the “Freie Akademie für Medien & Journalismus” (Free Academy for Media & Journalism), published by media scholar Prof. Michael Meyen and graduate journalist Antje Meyen.
(Translation “Swiss Standpoint”)
[2] https://www.news.admin.ch/de/nsb?id=101494
[4] https://schweizermonat.ch/verbale-entgleisungen-und-falschaussagen
[5] https://www.coe.int/de/web/portal/-/alain-berset-new-secretary-general
[7] https://x.com/durov/status/1976577486692753837
[8] https://www.news.admin.ch/de/newnsb/6TmEAde4htulaWG9CWYtK
[10] https://www.parlament.ch/de/ratsbetrieb/suche-curia-vista/geschaeft?AffairId=20210532
[13] https://www.parlament.ch/press-releases/Pages/mm-rk-n-2023-02-03.aspx
[14] https://www.emek.admin.ch/de/markt-und-meinungsmacht-von-plattformen
[15] https://algorithmwatch.ch/de/foerderpartnerschaft-stiftung-mercator-schweiz
[17] https://insideparadeplatz.ch/2025/09/22/berner-kommentiert-auf-facebook-schon-hagelts-klagen