The problem of digital determinism
Every click, every search query, every like, every location ping is recorded. This collective digital memory is not preserved for your benefit — it is harvested to predict and steer your future. The aggregation is silent, the inference is invisible, and the consequences are material. Your credit score, your insurance premium, the job adverts you see, the news you read, the political messages you receive: all are shaped by a digital past you neither curated nor consented to.
In 2019, Christopher Wylie published Mindf*ck, his account of working inside Cambridge Analytica. The book documented how psychographic profiles — constructed from the Facebook data of 87 million users — were deployed to influence democratic elections. This was not a speculative risk or a theoretical concern. It was industrial-scale behavioural manipulation, executed through data that individuals had generated casually and unknowingly surrendered. Your digital past, in the hands of those who knew how to weaponise it, became a deterministic engine aimed at your political future.
Cambridge Analytica collapsed, but the infrastructure it exploited did not. The advertising technology ecosystem, the data brokerage industry, the recommendation engines that shape what billions of people see every day — all operate on the same foundational logic: that a sufficiently detailed record of your past behaviour enables sufficiently accurate prediction of your future behaviour. And prediction, in this context, is not passive observation. It is active shaping. The prediction becomes the intervention.
Marx: material conditions as determinism
The idea that external conditions determine individual consciousness is not new. Karl Marx, writing in The German Ideology (1846) and later in Das Kapital (1867), argued that the material conditions of production shape the ideas, beliefs, and possibilities available to individuals. Consciousness does not determine life — life determines consciousness. The worker who owns nothing but their labour power experiences the world differently from the capitalist who owns the means of production, and these different experiences produce different worldviews, not through choice but through structural necessity.
In the digital economy, data has become the means of production. The parallel is not metaphorical — it is structural. Whoever owns the data infrastructure owns the capacity to predict, influence, and monetise human behaviour. The user who generates data but does not control it occupies a position analogous to the worker who generates value but does not own the product of their labour. The asymmetry is not incidental; it is the business model.
Recommendation engines do not merely suggest; they constrain. Credit scoring algorithms do not merely assess; they determine who receives capital and who does not. Predictive policing systems do not merely forecast; they direct the physical deployment of state power. In each case, the owner of the digital memory — the accumulated behavioural data — determines the material possibilities available to the individual whose behaviour was recorded. Marx would recognise the structure instantly, even if the factory floor has been replaced by a server farm.
Marcuse: the eradication of alternatives
Herbert Marcuse, in One-Dimensional Man (1964), diagnosed a subtler form of domination. Industrial society, he argued, does not merely exploit — it absorbs opposition by creating false needs that bind individuals to the existing order. The consumer who believes they are exercising free choice is, in Marcuse’s analysis, choosing from options that have been pre-selected to reproduce the system. Critical thought — the capacity to imagine that things could be fundamentally different — is not suppressed by force. It is rendered unnecessary by comfort and irrelevant by the apparent absence of alternatives.
The digital version of Marcuse’s one-dimensionality is more radical than anything he could have anticipated. The filter bubble does not merely present a curated selection of options. It eliminates the awareness that other options exist. When your news feed, your search results, your social connections, and your entertainment are all algorithmically tailored to reinforce your existing preferences and beliefs, the very concept of an alternative worldview becomes invisible. You are not forbidden from thinking differently. You are simply never shown that there is anything different to think.
This is determinism of a peculiarly insidious kind. It operates not through coercion but through the quiet elimination of choice. It does not destroy your capacity for autonomy — it renders autonomy meaningless by ensuring you never encounter the conditions that would provoke its exercise. Marcuse warned that the greatest threat to freedom is not the tyrant who forbids, but the system that makes forbidding unnecessary.
Lefebvre: the production of space — biological versus digital
Henri Lefebvre, in The Production of Space (1974), argued that space is never a neutral container. It is produced by social relations, power structures, and economic forces. The layout of a city, the design of a building, the organisation of a workplace — all encode assumptions about who belongs, who has access, and who is excluded. Space is political, and it is produced.
The digital realm is a produced space. Its architecture — the protocols, the platforms, the algorithms, the terms of service — is not natural or inevitable. It was designed by specific actors with specific interests. The question Lefebvre would ask is not “how do we use digital space?” but “who produced this space, and in whose interest?”
It is in the overlap between biological and digital space that your autonomy is now contested: every day, every click, every algorithmically curated feed.
Biological space has a natural limit: the body forgets, ages, and dies. Organic memory is selective, lossy, and merciful. Digital space has no such limit. Every data point is preserved indefinitely, searchable instantly, and combinable with every other data point. The biological self fades; the digital self accumulates. When these two spaces overlap — in the lived experience of a person navigating both — the question of who controls the digital record becomes a question of who controls the person.
Prisma’s answer: ODRL as anti-determinism
Digital determinism is not a law of nature. It is a design choice — and design choices can be made differently.
The deterministic loop works because data subjects have no technical mechanism to control how their data is used after collection. Terms of service are legal fictions: unread, unreadable, and unilaterally changeable. The only effective counter is to embed control into the data itself — to make access conditional on machine-readable, machine-enforceable rules that travel with the data wherever it goes.
This is precisely what ODRL 2.2 (the Open Digital Rights Language) provides. An ODRL policy attached to a dataset specifies who may access it, for what purpose, under what temporal and geographical constraints, and with what obligations. These are not guidelines or suggestions. They are executable rules, evaluated at every access point, enforced automatically. A research institution requesting health data receives it with policies that permit epidemiological analysis but prohibit commercial reuse, require anonymisation, and mandate deletion after a specified period. The policy is not a promise — it is a constraint built into the infrastructure.
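To make the health-data example concrete, the following is a minimal sketch of such a policy expressed as JSON-LD in the ODRL 2.2 vocabulary. The action and constraint terms (use, anonymize, delete, purpose, dateTime) are drawn from the ODRL common vocabulary, but every URI, party, and date below is an illustrative placeholder, not Prisma's actual naming scheme:

```python
import json

# Hedged sketch of an ODRL 2.2 policy for the health-data scenario:
# epidemiological analysis is permitted, sale is prohibited, and
# anonymisation plus deletion by a deadline are obligations.
# All identifiers are illustrative placeholders.
policy = {
    "@context": "http://www.w3.org/ns/odrl.jsonld",
    "@type": "Agreement",
    "uid": "https://example.org/policy/health-42",
    "assigner": "https://example.org/party/data-subject",
    "assignee": "https://example.org/party/research-institute",
    "permission": [{
        "target": "https://example.org/data/cohort-study",
        "action": "use",
        "constraint": [{
            "leftOperand": "purpose",
            "operator": "eq",
            "rightOperand": "epidemiological-analysis"
        }],
        # a duty attached to the permission: data must be anonymised
        "duty": [{"action": "anonymize"}]
    }],
    "prohibition": [{
        "target": "https://example.org/data/cohort-study",
        "action": "sell"
    }],
    "obligation": [{
        "target": "https://example.org/data/cohort-study",
        "action": "delete",
        "constraint": [{
            "leftOperand": "dateTime",
            "operator": "lteq",
            "rightOperand": {"@value": "2026-12-31", "@type": "xsd:date"}
        }]
    }]
}

print(json.dumps(policy, indent=2))
```

Because the policy is structured data rather than legal prose, an access point can evaluate each constraint mechanically before releasing a single byte.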
PROV-O (the W3C Provenance Ontology) completes the picture by making every data action traceable. Every query, every transformation, every access decision is recorded with cryptographic integrity. The audit trail is not a log buried in a server room. It is a structured, queryable graph that enables both the data subject and the regulator to verify that policies were respected. Determinism requires opacity; provenance destroys opacity.
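The integrity property described above can be sketched as a hash-chained audit trail whose entries use PROV-O terms (prov:Activity, prov:used, prov:wasAssociatedWith). The record layout and chaining scheme here are illustrative assumptions, not Prisma's actual implementation; the point is only that any retroactive edit to an entry breaks verification:

```python
import hashlib
import json

def record_activity(trail, activity_id, used, agent):
    """Append a PROV-O-style activity entry, linked to its predecessor."""
    prev_hash = trail[-1]["hash"] if trail else "0" * 64
    entry = {
        "@type": "prov:Activity",
        "@id": activity_id,
        "prov:used": used,               # the dataset the activity read
        "prov:wasAssociatedWith": agent, # the agent that performed it
        "prev": prev_hash,               # link to the previous entry
    }
    payload = json.dumps(entry, sort_keys=True).encode()
    entry["hash"] = hashlib.sha256(payload).hexdigest()
    trail.append(entry)
    return entry

def verify(trail):
    """Recompute every hash; any tampered or reordered entry fails."""
    prev = "0" * 64
    for entry in trail:
        body = {k: v for k, v in entry.items() if k != "hash"}
        payload = json.dumps(body, sort_keys=True).encode()
        if entry["prev"] != prev:
            return False
        if entry["hash"] != hashlib.sha256(payload).hexdigest():
            return False
        prev = entry["hash"]
    return True

trail = []
record_activity(trail, "ex:query-1", "ex:cohort-study", "ex:research-institute")
record_activity(trail, "ex:transform-1", "ex:cohort-study", "ex:research-institute")
assert verify(trail)
```

Because each entry commits to its predecessor's hash, a regulator who holds only the latest hash can detect any rewrite of history — which is exactly the opacity-destroying property the text describes.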
W3C Decentralised Identifiers (DID) break the final link in the deterministic chain. When your identity is issued and controlled by a platform, the platform can correlate your activity across contexts, building the very psychographic profiles that Cambridge Analytica exploited. A DID is owned by its holder, not by any platform. It enables authentication and authorisation without creating a centralised record of your movements. Your identity becomes something you carry, not something that is filed about you.
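What "an identity you carry" looks like in practice is a DID document, sketched below using the did:example method that the DID Core specification reserves for illustration. The key value is a placeholder, not real key material, and Prisma's actual documents may carry additional service endpoints and proofs:

```python
import json

# Minimal W3C DID document. The controller of the verification key is
# the DID itself: the holder, not a platform, proves who they are.
# All values are illustrative placeholders.
did = "did:example:123456789abcdefghi"
did_document = {
    "@context": ["https://www.w3.org/ns/did/v1"],
    "id": did,
    "verificationMethod": [{
        "id": f"{did}#key-1",
        "type": "Ed25519VerificationKey2020",
        "controller": did,  # self-controlled, not platform-issued
        "publicKeyMultibase": "z6Mk...placeholder"
    }],
    "authentication": [f"{did}#key-1"]
}

print(json.dumps(did_document, indent=2))
```

Nothing in the document ties the identifier to a platform account, and a holder can present different DIDs in different contexts, which is what frustrates cross-context correlation.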
Determinism requires three things: unrestricted use of a complete record of the past, opacity about how that record is used, and an identity system that enables cross-context correlation. ODRL breaks the first, PROV-O breaks the second, and DID breaks the third.
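How the three standards compose at a single access point can be sketched as follows: the requester authenticates with a DID, the ODRL purpose constraint is evaluated, and the decision is appended to a PROV-O-style audit log. The function name, the policy shape, and all identifiers are hypothetical illustrations, not Prisma's API:

```python
# Hypothetical access-point sketch: DID authenticates the requester,
# the ODRL purpose constraint gates the decision, and the outcome is
# logged as a provenance activity. Names and shapes are illustrative.

def evaluate_access(sample_policy, requester_did, purpose, audit_log):
    permitted = False
    if requester_did == sample_policy.get("assignee"):
        for perm in sample_policy.get("permission", []):
            purpose_constraints = [
                c for c in perm.get("constraint", [])
                if c["leftOperand"] == "purpose"
            ]
            if all(c["rightOperand"] == purpose for c in purpose_constraints):
                permitted = True
    audit_log.append({
        "@type": "prov:Activity",
        "prov:wasAssociatedWith": requester_did,
        "decision": "permit" if permitted else "deny",
        "purpose": purpose,
    })
    return permitted

sample_policy = {
    "assignee": "did:example:research-institute",
    "permission": [{
        "action": "use",
        "constraint": [{"leftOperand": "purpose", "operator": "eq",
                        "rightOperand": "epidemiological-analysis"}],
    }],
}
audit_log = []
assert evaluate_access(sample_policy, "did:example:research-institute",
                       "epidemiological-analysis", audit_log)
assert not evaluate_access(sample_policy, "did:example:ad-broker",
                           "marketing", audit_log)
```

Note that the deny decision is logged just like the permit: the audit trail records attempts, not only successes, so opacity is removed on both sides of the gate.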
This is not utopian aspiration. Prisma implements all three standards in production code, running on EU-sovereign infrastructure, released under MIT and Apache-2.0 licences. The code is auditable on Codeberg. The infrastructure runs on European cloud providers. No data leaves EU jurisdiction. No vendor holds a master key.
Writing a different programme
Marx analysed how material conditions determine consciousness. Marcuse showed how industrial society destroys critical thought — not through repression but through the manufacture of contentment. Lefebvre revealed how space is never neutral but always produced in the interest of power. Cambridge Analytica proved all three right — in code. Psychographic profiling, algorithmic curation, and platform-controlled digital space combined to create a deterministic engine that could steer the political behaviour of millions.
The philosophical tradition tells us that determinism is structural, not inevitable. If the structure changes, the outcome changes. ODRL, PROV-O, and DID are structural interventions: they alter the conditions under which data flows, identities are managed, and accountability is maintained. They do not require individuals to become more vigilant or more technically literate. They change the infrastructure so that vigilance is no longer the only defence.
Prisma is an attempt to write a different programme. Not a programme that predicts your future from your past, but one that gives you the technical means to decide which parts of your past are visible, to whom, for what purpose, and for how long. The question is not whether digital memory will continue to accumulate — it will. The question is whether that memory will remain an instrument of determinism, or whether it can be governed by the person it describes.
The answer is a design choice. We have made ours.
Further reading
- Identity & Sovereignty — detailed wiki page on Prisma’s identity architecture and DID implementation
- Academic References — full bibliography and theoretical foundations
References
- Marx, K. (1846). The German Ideology.
- Marx, K. (1867). Das Kapital: Kritik der politischen Oekonomie. Verlag von Otto Meissner.
- Marcuse, H. (1964). One-Dimensional Man: Studies in the Ideology of Advanced Industrial Society. Beacon Press.
- Lefebvre, H. (1974). The Production of Space. Trans. D. Nicholson-Smith. Blackwell, 1991.
- Nissenbaum, H. (2010). Privacy in Context: Technology, Policy, and the Integrity of Social Life. Stanford University Press.
- Wylie, C. (2019). Mindf*ck: Inside Cambridge Analytica’s Plot to Break the World. Profile Books.
- Zuboff, S. (2019). The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. Profile Books.
Questions or feedback? Reach us at info@prisma-platform.eu or visit the project on Codeberg.