The Story

Last week Microsoft confirmed, in response to reporting by TechCrunch and others, that it had handed BitLocker recovery keys for three laptops to the FBI following a valid court order. The underlying case was a fraud investigation in Guam. The laptops were encrypted with BitLocker — the full-disk encryption built into Windows, which many institutions and individuals rely on as their primary protection against unauthorised data access.

The mechanism is simple and was not widely known. When you set up a modern Windows device and sign in with a Microsoft account, BitLocker automatically uploads your recovery key to Microsoft’s cloud. No prominent notification. No opt-in. The key sits there, associated with your account, accessible to Microsoft. When a US court issues a lawful order, Microsoft complies. Redmond confirmed this is policy, not an exception.
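One way to see whether a given machine is affected is to inspect its BitLocker key protectors. On Windows, `manage-bde -protectors -get C:` lists them; a "Numerical Password" protector is the 48-digit recovery key that may have been escrowed to the Microsoft account. A minimal sketch of parsing that output — note that the sample text below is illustrative, not verbatim Windows output, and the exact layout varies by Windows version:

```python
import re

def find_recovery_protectors(manage_bde_output: str) -> list[str]:
    """Return the IDs of 'Numerical Password' (recovery key) protectors
    found in the output of `manage-bde -protectors -get <drive>`.

    Assumes the common layout where each protector type heading is
    followed within a few lines by an 'ID: {GUID}' line.
    """
    ids = []
    lines = manage_bde_output.splitlines()
    for i, line in enumerate(lines):
        if "Numerical Password" in line:
            # Scan forward a few lines for this protector's ID.
            for follow in lines[i + 1 : i + 4]:
                m = re.search(r"ID:\s*(\{[0-9A-Fa-f-]+\})", follow)
                if m:
                    ids.append(m.group(1))
                    break
    return ids

# Illustrative (not verbatim) manage-bde output:
SAMPLE = """\
Volume C: [Windows]
All Key Protectors

    TPM:
      ID: {11111111-2222-3333-4444-555555555555}

    Numerical Password:
      ID: {AAAAAAAA-BBBB-CCCC-DDDD-EEEEEEEEEEEE}
"""

if __name__ == "__main__":
    print(find_recovery_protectors(SAMPLE))
```

The presence of a recovery protector says nothing about whether it was uploaded: that is not visible locally. Checking the recovery key page of the associated Microsoft account is the reliable test.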

Bruce Schneier’s response was characteristically direct: “The lesson here is that if you have access to keys, eventually law enforcement is going to come.” Jennifer Granick at the ACLU called remote key storage in this configuration “quite dangerous,” particularly given that the same mechanism is available to any government that can issue a Microsoft-compatible legal order — not only the US Department of Justice.

That last point is the one European institutions should be reading carefully.


Why This Is a European Problem

The CLOUD Act — the US Clarifying Lawful Overseas Use of Data Act, passed in 2018 — allows US law enforcement to compel US-based companies to produce data held on servers anywhere in the world. If your university stores its BitLocker recovery keys in a Microsoft account, and Microsoft is a US company, the geographic location of the servers those keys sit on does not limit a US court’s reach. The keys are in Virginia, legally, wherever the hardware is.

This is not speculation. It is the explicit structure of US digital law. The European Court of Justice has repeatedly ruled that certain US surveillance frameworks are incompatible with GDPR — the invalidation of Privacy Shield in Schrems II (2020) being the most prominent example. But court rulings about data transfer frameworks do not automatically change the operational reality for an institution whose laptops are running Windows with default settings.

European universities hold exactly the kinds of data that make this a real rather than a theoretical concern:

  • Research data: medical studies, clinical trials, interviews with human subjects, social science datasets — all subject to strict ethical and legal protections
  • Student records: academic performance, personal circumstances, disciplinary proceedings
  • HR data: employment contracts, salary records, health information, union activity — particularly sensitive under German and EU labour law
  • Correspondence and draft documents: research in progress, grant applications, peer review material

If the disk holding any of this is encrypted with BitLocker, and the recovery key has been uploaded to a Microsoft account by default, the encryption provides less protection than it appears to. The key is accessible to a foreign state with a court order. That state is not party to GDPR.


The Structural Problem

The BitLocker story is one instance of a larger pattern. It is not that Microsoft behaved unusually or maliciously — it complied with a lawful order in its home jurisdiction, as it is legally required to do. The problem is structural: when an institution depends on a closed-source, US-headquartered platform for its critical infrastructure, the institution has delegated control over its own data to an entity whose legal obligations lie elsewhere.

This applies beyond encryption. It applies to email (Exchange Online, Outlook), document storage (SharePoint, OneDrive), communication (Teams), identity management (Azure Active Directory, now Microsoft Entra ID), and any service that runs through a Microsoft account or Azure tenant. For each of these: the data is subject to Microsoft’s terms, and Microsoft is subject to US law.

The same argument applies, with different specifics, to Google Workspace and any other US-headquartered platform. The issue is not that these companies are bad actors. It is that their legal accountability and the legal accountability of European public institutions point in incompatible directions, and the institutions mostly have not noticed.


What Sovereign Software Looks Like

The alternative is not paranoia and air-gapped servers. It is a coherent strategy for institutional digital infrastructure, built on software the institution controls.

In Germany, this conversation has a name and a project. OpenDesk — developed under the aegis of the federal and state governments — is a stack of open-source tools (Nextcloud, Collabora Online, Matrix/Element, Jitsi, Keycloak, Open-Xchange) assembled into an integrated workspace alternative to Microsoft 365. The Souveräner Arbeitsplatz (sovereign workspace) concept behind it addresses exactly what the BitLocker story illustrates: if the software is open and self-hosted, the keys stay in your institution, and no foreign court can reach them via a warrant served on a US company.

Several German states and federal agencies have been piloting OpenDesk. The city of Munich’s earlier experiment with Linux (LiMux) and its eventual rollback to Windows is the cautionary tale here — not because open source failed, but because the transition was not sustained politically over time, while the incumbent vendor’s lobbying was. The BitLocker story is a reminder of what is at stake in that political negotiation.

The FSFE’s “Public Money? Public Code!” campaign has articulated the principle cleanly: software developed with public funding should be released as open-source software. The argument is not only about freedom as an abstract value. It is about the practical consequence of being locked into a proprietary platform: your institution loses the ability to audit what the software does, to modify it to meet your requirements, to host it where your data protection law applies, and to switch providers without losing access to your own data.


What I Do, and Why

I work at a publicly funded institution. The software I build for institutional contexts — campus infrastructure, workforce management, archival systems, alert systems — is public.

Not because I am ideologically committed to open source as a movement, but because the alternative is incoherent. If I build tooling for a university with public funds and keep it closed, I have produced a private asset with public money, duplicated by every institution that builds the same thing independently, inspectable by nobody, and ultimately dependent on my continued willingness to maintain it or hand it over. None of those outcomes serve the institutions I am building for.

Here is what that looks like in practice:

zammad-ticket-archiver — automated archival of Zammad support tickets as cryptographically signed PDFs, with RFC 3161 timestamps for non-repudiation. Built for institutions that need legally defensible audit trails of their helpdesk operations. The signing infrastructure is self-hosted; no external party holds the keys.
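The non-repudiation chain in a setup like this rests on a simple property of RFC 3161: the timestamp authority signs a hash of the document, never the document itself. A sketch of the client-side first step, assuming the archive produces plain PDF bytes (building the actual ASN.1 TimeStampReq and talking to a TSA are omitted):

```python
import hashlib

def timestamp_digest(pdf_bytes: bytes) -> bytes:
    """Compute the SHA-256 digest that goes into an RFC 3161
    TimeStampReq. Only this 32-byte digest leaves the archive host;
    the ticket PDF itself is never sent to the timestamp authority,
    which matters when the tickets contain personal data.
    """
    return hashlib.sha256(pdf_bytes).digest()

# Hypothetical archived export:
archived = b"%PDF-1.7 (signed ticket export bytes would go here)"
print(timestamp_digest(archived).hex())
```

Verification later works the same way in reverse: recompute the digest from the stored PDF and check it against the digest inside the TSA’s signed token.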

alarm-broker — a silent panic alarm broker for campus facilities. Receives emergency triggers from hardware devices (Yealink keys), distributes notifications via Zammad, SMS, and Signal, with acknowledgment tracking and escalation scheduling. Runs locally, logs to self-hosted PostgreSQL; no external dependency for the alarm path.
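Acknowledgment tracking with escalation is, at its core, deterministic state: given the seconds elapsed since the trigger, which contacts should have been notified? A hypothetical sketch of that core — the level names, timings, and function are illustrative, not taken from alarm-broker:

```python
from dataclasses import dataclass

@dataclass
class EscalationLevel:
    delay_s: int    # seconds after the trigger before this level fires
    contacts: list  # who is notified at this level

def notified_contacts(levels, elapsed_s, ack_at_s=None):
    """Contacts that should have been notified `elapsed_s` seconds
    after the trigger. An acknowledgment at `ack_at_s` freezes the
    escalation: levels whose delay had not yet passed never fire."""
    cutoff = elapsed_s if ack_at_s is None else min(elapsed_s, ack_at_s)
    out = []
    for level in sorted(levels, key=lambda l: l.delay_s):
        if level.delay_s <= cutoff:
            out.extend(level.contacts)
    return out

# Hypothetical escalation plan:
PLAN = [
    EscalationLevel(0,   ["security-desk"]),        # immediately: ticket + Signal
    EscalationLevel(60,  ["campus-security-sms"]),  # 1 min without acknowledgment
    EscalationLevel(300, ["facility-manager"]),     # 5 min without acknowledgment
]
```

Keeping this logic a pure function of elapsed time and acknowledgment time makes the escalation path testable without clocks or network — the property you want in the one code path that must not fail.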

campus-app-kit — a React Native / Expo starter for university mobile applications, with a pluggable Node.js backend designed for institutional data sources (room booking, events, schedules). The architecture separates institution-specific connectors (which institutions keep private) from the shared foundation (which is public). Any university can take it and build on it without starting from scratch.
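The split between private connectors and a public foundation is an interface question. Sketched here in Python for brevity rather than the kit’s actual TypeScript, and with hypothetical names: the public foundation defines the contract, and each institution ships its own implementation against its own data sources.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass
from datetime import date

@dataclass
class Event:
    title: str
    on: date
    location: str

class EventsConnector(ABC):
    """Public foundation: the contract the shared app code talks to.
    The app never imports an institution-specific class directly."""
    @abstractmethod
    def upcoming(self, limit: int) -> list[Event]: ...

class InMemoryEventsConnector(EventsConnector):
    """Stand-in for an institution-private connector that would
    query the campus events system behind the firewall."""
    def __init__(self, events: list[Event]):
        self._events = sorted(events, key=lambda e: e.on)

    def upcoming(self, limit: int) -> list[Event]:
        return self._events[:limit]
```

Because the contract is public and the implementations are private, institutions can keep their backend details closed without closing the foundation everyone shares.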

cueq — an integrated workforce management system for German universities under TV-L (the collective agreement for public sector employees in the German states). Handles time recording, shift planning, absence management, payroll export, and GDPR-compliant audit trails. Built around NestJS and Next.js, with a PostgreSQL backend and Honeywell terminal integration. The HR data stays on the institution’s own infrastructure.
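The unglamorous core of time recording is pairing terminal punches into worked intervals. A deliberately simplified sketch — real TV-L rules on breaks, caps, and rounding are considerably more involved and not reproduced here:

```python
from datetime import datetime, timedelta

def worked_time(punches: list[datetime]) -> timedelta:
    """Sum worked time from an ordered in/out/in/out punch sequence,
    as a clock terminal delivers it. An odd number of punches (a
    missing clock-out) is an error a human must resolve — payroll
    code should never guess."""
    if len(punches) % 2 != 0:
        raise ValueError("unmatched clock-in: missing clock-out punch")
    total = timedelta()
    for clock_in, clock_out in zip(punches[::2], punches[1::2]):
        if clock_out <= clock_in:
            raise ValueError("punches out of order")
        total += clock_out - clock_in
    return total
```

The GDPR-relevant point is where this runs: the punch data never leaves the institution’s own database on its way to the payroll export.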

These are all boring. They are not research contributions; they are plumbing. But plumbing is what holds institutions together, and the question of who controls the plumbing — and under whose legal jurisdiction — is exactly the question the BitLocker story makes visible.


The Principle

Public money, public code. If an institution funded by public money develops software for its own operations, that software should be released under an open licence, inspectable, forkable, and deployable by any institution with the same needs.

The corollary: institutions funded by public money should prefer software that is itself openly licensed, auditable, and deployable on infrastructure the institution controls. Not as a blanket ban on proprietary tools where they are genuinely the best option, but as a starting presumption that shifts the burden of justification.

The BitLocker story is not a story about Microsoft doing something wrong. It is a story about the logical consequence of a procurement decision that was made without asking “and what happens when a US court sends a subpoena?” That question was available in 2018 when the CLOUD Act passed, in 2020 when Schrems II was decided, and before both. It is still available now, for every institution that has not yet asked it.


The FSFE “Public Money? Public Code!” campaign is at publiccode.eu. The OpenDesk project is at opendesk.de. The original TechCrunch reporting on the BitLocker handover is at techcrunch.com.