This book reconstructs magic for a probabilistic world.
Drawing on decision theory, cognitive science, mental model frameworks, and the operational tradition of chaos magic, it argues that magical practice is best understood as probability engineering — the deliberate shaping of outcome distributions through attention, belief, and behavioural intervention.
By bringing into dialogue the work of thinkers such as Daniel Kahneman, Annie Duke, and Peter J. Carroll, it develops a rigorous account of influence without recourse to superstition. It shows how perception, decision processes, and symbolic action interact within complex systems to alter the trajectory of events.
Magic, in this framework, is not supernatural power.
It is structured agency exercised within conditions of uncertainty.
The first chapter is being made available to all subscribers so the depth, method, and intellectual foundation of the work can be examined directly. From Chapter 2 onward, the material becomes increasingly technical and operational in character, and will therefore be published behind the paywall.
If you find value in this work, please consider becoming a paid subscriber. Your support makes it possible to continue developing and publishing this research at the level of depth and rigour it demands.
The table of contents is available here.
Part 1 - Ontology of Uncertainty
Chapter 1 - The Death of Mechanistic Reality
1. The Machine That Once Explained Everything
For approximately three centuries, the dominant scientific and philosophical conception of the universe was mechanistic in the strongest possible sense.
This characterisation was not merely metaphorical. It expressed a literal ontological commitment: reality itself was understood as a vast, law-governed system composed of interacting parts whose behaviour was fully determined by precise and invariant principles. Every event was presumed to arise as the necessary consequence of antecedent conditions. Causes generated effects with exact regularity, and, in principle, complete knowledge of the present state of the system would permit equally complete knowledge of its future evolution.
Under this framework, uncertainty possessed no fundamental status. It was interpreted exclusively as epistemic limitation, a consequence of incomplete measurement, insufficient data, or imperfect observational capacity. Nothing was inherently indeterminate; there existed only phenomena not yet fully known.
Importantly, this mechanistic worldview extended far beyond the domain of physics. It shaped broader conceptions of knowledge, agency, and control. If reality operated analogously to a clockwork mechanism, then mastery of nature consisted in discovering the laws governing motion and interaction. Once those laws were known with sufficient precision, prediction would become exact and intervention reliably effective. Control was understood as the natural extension of knowledge.
Determinism, therefore, functioned not merely as a scientific hypothesis but as a comprehensive ontological commitment, a claim concerning the fundamental structure of reality itself.
Within such a framework, uncertainty was provisional rather than structural. It represented a temporary absence of information rather than an intrinsic feature of the world. In the limiting case of complete knowledge, prediction would be perfect, and where prediction is perfect, control follows directly.
This assumption underwrote a remarkably coherent intellectual architecture.
Classical physics formulated exact mathematical descriptions of motion and interaction.
Engineering applied those laws to the design and construction of machines.
Economic theory frequently modelled social systems through mechanical analogies of equilibrium and force.
Psychology sought lawful regularities of behaviour analogous to physical causation.
Metaphysical reflection likewise absorbed the same logic, treating the intelligibility of the world as a consequence of its lawful, stable, and calculable character.
Within such an ontological framework, the conceptual space available for magic was severely constrained. Only two interpretations were logically available.
Magic could be regarded as error, a misapprehension of causal relations that would eventually be corrected through improved knowledge of natural law.
Alternatively, it could be regarded as supernatural intervention, an interruption or suspension of those laws by agencies external to the system itself.
No third category was available. If the universe is fundamentally mechanical, then influence must occur either through lawful mechanism or through violation of mechanism altogether.
Yet this mechanistic conception of reality, intellectually powerful and historically productive as it was, rested upon a foundational assumption that would not survive the developments of twentieth-century science, namely that causal processes are, in principle, perfectly predictable when fully known.
2. The Dream of Total Prediction
The mechanistic conception of the universe reached its most complete and rigorous formulation in the idea of total prediction.
Consider a hypothetical intelligence capable of knowing, at a single instant, the precise position and momentum of every particle in the universe. Given this total specification of the present, together with perfect knowledge of the laws governing motion and interaction, the entire future and the entire past would be calculable with absolute accuracy.
Nothing would remain uncertain. Nothing would be contingent. Nothing would remain genuinely open.
The universe would unfold in strict accordance with its initial conditions, its entire history contained implicitly within its present state.
This formulation represents the logical culmination of classical determinism. It expresses a conception of reality in which unpredictability is purely epistemic, arising only from the limitations of observers rather than from any indeterminacy in the world itself.
Within such a framework, the problem of control becomes conceptually straightforward. To influence the future is simply to modify present conditions. If causal chains are exact, stable, and fully specified, then sufficiently precise intervention yields correspondingly precise and predictable outcomes.
This is the ideal of total control grounded in total knowledge.
It also describes the intellectual environment within which modern science developed and technological power expanded. Deterministic modelling proved extraordinarily successful across a wide range of domains. Planetary motion could be predicted with great precision. The behaviour of rigid bodies could be calculated reliably. Electrical circuits could be designed to perform with consistent regularity. Within many bounded systems, causal relations appeared stable, quantifiable, and reproducible.
The practical success of these models encouraged a far-reaching assumption: if prediction works with remarkable accuracy in many domains, then in principle it must be universally applicable.
Reality, on this view, was fundamentally computable.
Yet an important distinction must be made between systems that can be predicted under specific conditions and a universe that must be predictable in principle. The former is an empirical claim, supported by successful modelling within limited domains. The latter is a philosophical assertion about the ultimate structure of reality.
It is precisely this philosophical claim that would come under sustained pressure from multiple directions during the twentieth century. Developments in physics, mathematics, and the study of complex systems each revealed distinct forms of limitation. Each demonstrated that predictability, even in principle, encounters structural constraints.
Taken together, these developments did not merely complicate the mechanical worldview. They rendered its universal form untenable.
3. The First Fracture: Indeterminacy at the Foundations of Matter
The first decisive rupture in the mechanistic conception of reality did not occur at the scale of planets, machines, or observable macroscopic systems, but at the smallest scales accessible to precise measurement.
Classical physics had assumed that physical entities possessed definite properties such as position, momentum, and energy independently of observation. Measurement, within this framework, functioned merely as revelation. It disclosed values that already existed in determinate form prior to their observation.
This assumption began to break down as experimental inquiry penetrated the domain of atomic and subatomic processes.
At microscopic scales, measurement no longer appeared to function as passive disclosure. Instead, it seemed to participate in the specification of the very properties it recorded. Observed quantities did not behave as fixed, determinate values but as distributions of possible values, each associated with a definable probability. Events that classical physics would have regarded as strictly determined instead appeared intrinsically unpredictable in their individual occurrence.
This development cannot be attributed simply to technical limitation or inadequate instrumentation. The transformation occurred at the level of theoretical description itself. Physical theory no longer sought to specify exact trajectories of individual entities but to characterise the statistical structure of their possible behaviours. Prediction shifted accordingly. Rather than forecasting specific outcomes, theory provided probability distributions over potential outcomes under defined conditions.
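The change in what prediction delivers can be made concrete with a minimal sketch (Python). It is a toy illustration rather than a physical simulation: the probability 0.36 is an arbitrary assumed value standing in for whatever distribution a theory specifies for a hypothetical two-outcome measurement.

```python
import random

# Toy illustration (not a physical simulation): the theory specifies only a
# probability distribution over measurement outcomes, here p(up) = 0.36 and
# p(down) = 0.64 for a hypothetical two-outcome measurement.
P_UP = 0.36

def measure() -> str:
    """A single measurement: individually unpredictable, drawn from the distribution."""
    return "up" if random.random() < P_UP else "down"

trials = 100_000
ups = sum(measure() == "up" for _ in range(trials))
print(f"single outcome: {measure()}")                      # cannot be forecast
print(f"observed frequency of 'up': {ups / trials:.3f}")   # stable, close to 0.36
```

No single run can be forecast; only the statistical structure of many runs is calculable, which is precisely the form of prediction described above.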
The conceptual shift was profound.
The most precise physical theories no longer described what will occur, but what is most likely to occur given a particular configuration of circumstances.
Determinism at the foundational level of matter did not merely weaken; it was replaced by irreducibly probabilistic description.
Calculation remained possible and extraordinarily precise, but its results took a different form. What could be computed were distributions, expectation values, and likelihoods rather than singular, determinate futures.
At its most fundamental level, the physical world could no longer be understood as a mechanism executing a fixed sequence of events. It was more accurately described as a structured field of weighted possibilities within which specific outcomes were realised.
Importantly, this transformation did not entail the disappearance of physical law. Regularity persisted. Mathematical structure persisted. The behaviour of systems remained subject to rigorous description. What changed was the character of predictability. For certain classes of phenomena, the most complete and accurate description available was inherently statistical.
Uncertainty, therefore, could no longer be interpreted solely as a consequence of incomplete knowledge. It was not merely epistemic. It was embedded in the formal structure through which physical processes were described.
Indeterminacy entered not as ignorance, but as a constitutive feature of theoretical representation itself.
4. The Second Fracture: Sensitivity to Initial Conditions
Even if microscopic processes are irreducibly probabilistic, it might still be assumed that large-scale systems would average out such variability and therefore exhibit stable, predictable behaviour. At the scale of everyday experience, objects do not appear to behave randomly. A projectile follows a trajectory that can be calculated with considerable accuracy. A bridge either stands or fails in accordance with measurable structural loads. Many macroscopic processes display regularity sufficient for reliable prediction within practical limits.
However, further developments in the study of dynamical systems revealed that even systems governed entirely by deterministic equations can exhibit behaviour that is effectively unpredictable.
Certain classes of dynamical systems display extreme sensitivity to initial conditions. Infinitesimal differences in starting values, differences far smaller than any measurement could capture in practice, expand over time until system trajectories diverge dramatically. What begins as negligible variation becomes, through repeated amplification, substantial divergence in outcome.
In such systems, the governing equations remain strictly deterministic. The rules themselves contain no randomness. Yet prediction becomes severely limited because initial conditions cannot be specified with infinite precision. Any small uncertainty present at the outset grows exponentially as the system evolves.
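A minimal sketch of this amplification is given below (Python), using the logistic map, a standard textbook example of deterministic chaos. The starting value and the size of the assumed measurement error are illustrative choices only.

```python
# Sensitive dependence on initial conditions in the logistic map
# x_{n+1} = r * x_n * (1 - x_n), with r = 4.0 (a standard chaotic regime).
# The rule is strictly deterministic; only the starting values differ.
r = 4.0

def logistic_trajectory(x0: float, steps: int) -> list[float]:
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.2, 60)            # the "true" initial condition
b = logistic_trajectory(0.2 + 1e-10, 60)    # the same, with a 1e-10 measurement error

for n in (0, 10, 20, 30, 40, 50):
    print(f"step {n:2d}: difference = {abs(a[n] - b[n]):.6f}")
# The difference grows roughly exponentially until the two trajectories are
# effectively unrelated, despite being generated by identical deterministic rules.
```

An error far below any realistic measurement precision dominates the trajectory within a few dozen iterations; no refinement of the rule itself can remove this.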
The result is a fundamental tension.
The system is deterministic in principle.
It is unpredictable in practice.
Even where governing laws are fully known and causally precise, long-term behaviour cannot be forecast reliably because the unavoidable imprecision of measurement is dynamically amplified. Predictive error is not merely accumulated; it is magnified.
This discovery significantly altered the conceptual status of prediction. Determinism alone does not guarantee predictability. The existence of exact causal laws does not ensure that future states can be operationally specified.
Deterministic systems may generate behaviour that is effectively indeterminate for any observer constrained by finite measurement capacity.
Some processes resist prediction not because they are random in their governing structure, but because they are exquisitely sensitive to initial variation. Their trajectories are shaped by conditions that cannot be fully captured, and thus cannot be projected forward with certainty.
In such systems, the future is not fixed in any operationally meaningful sense. It unfolds along pathways that cannot be specified in advance, even when the underlying rules are perfectly understood.
The ideal of total prediction therefore collapses for two distinct reasons. It fails not only because certain physical processes are fundamentally probabilistic, but also because many deterministic systems possess structural instability that renders long-term forecasting impossible in practice.
5. The Third Fracture: Complexity and Emergence
A further challenge to mechanistic certainty emerged from the study of complex systems, that is, systems composed of large numbers of interacting components whose collective behaviour cannot be adequately described through simple linear relations among their parts.
In such systems, interaction itself becomes structurally significant. Components do not operate independently but influence one another through continuous exchange. Feedback loops arise in which outputs re-enter the system as inputs, effects modify causes, and local interactions generate global patterns. Behaviour, under these conditions, is not reducible to the properties of individual elements considered in isolation. It emerges from the organisation of the network as a whole.
A wide range of natural and social phenomena exhibit these characteristics, including atmospheric dynamics, ecological systems, economic markets, neural processes, and large-scale human societies. Each consists of many interdependent units whose interactions produce system-level behaviour that cannot be straightforwardly inferred from component-level analysis alone.
Several structural features distinguish such systems.
Small perturbations may generate disproportionately large effects, particularly when amplified through feedback mechanisms.
Stable patterns can arise spontaneously without central coordination or external direction.
System organisation can shift or reorganise internally as interaction patterns change.
Accurate modelling requires simultaneous consideration of processes operating across multiple temporal and spatial scales.
Even where underlying rules governing local interactions are simple and well-defined, the resulting collective dynamics may be extraordinarily difficult to anticipate. The difficulty does not arise solely from insufficient information, but from the combinatorial and recursive nature of interaction itself.
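As a hedged, minimal illustration of this point, the sketch below (Python) implements the elementary cellular automaton known as Rule 30: each cell's next state depends only on itself and its two neighbours, yet the global pattern that unfolds from a single active cell is notoriously difficult to anticipate from the rule table alone. It is a toy model, not a claim about any particular natural or social system.

```python
# Emergence from simple local rules: the elementary cellular automaton "Rule 30".
RULE = 30  # the 8-bit lookup table, in standard Wolfram numbering

def step(cells: list[int]) -> list[int]:
    n = len(cells)
    nxt = []
    for i in range(n):
        left, centre, right = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
        idx = (left << 2) | (centre << 1) | right   # neighbourhood as a 3-bit index
        nxt.append((RULE >> idx) & 1)
    return nxt

width, generations = 79, 40
cells = [0] * width
cells[width // 2] = 1        # a single active cell as the entire initial condition

for _ in range(generations):
    print("".join("#" if c else "." for c in cells))
    cells = step(cells)
```

The local rule fits in a single byte, yet the printed pattern exhibits structure that is not stated anywhere in the rule itself: the behaviour belongs to the organisation of the whole, not to any component.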
More significantly, complex systems generate novelty. They produce structures, patterns, and modes of organisation that are not explicitly specified in initial conditions but arise through sustained interaction over time. This phenomenon, commonly described as emergence, marks a decisive departure from strictly mechanical conceptions of causality. System-level properties appear that cannot be fully predicted by analysing component parts alone.
Under such conditions, the future is not merely the unfolding of pre-existing structure. It includes the formation of new organisational configurations that cannot be completely specified in advance.
The world, therefore, does not simply change state within fixed parameters. It undergoes transformation in form and organisation.
It evolves.
Evolution, in this sense, is intrinsically path-dependent, contingent upon prior states, and layered through historical development. What becomes possible at any moment is conditioned not only by governing rules, but by the sequence of interactions through which the present configuration has emerged.
6. The Collapse of the Mechanical Universe
Taken together, these three lines of development (probabilistic physical foundations, sensitivity to initial conditions, and emergent complexity) converge upon a single, far-reaching conclusion.
The universe cannot be understood as predictably mechanical.
It remains structured, governed by lawful regularities, and open to systematic investigation. Yet it is not fully calculable, not perfectly forecastable, and not adequately describable in terms of linear chains of cause and effect. The behaviour of many systems cannot be projected with complete precision, even in principle, from knowledge of their governing rules and present states.
For observers embedded within such systems, the future is not determined in any operationally accessible sense.
Reality continues to exhibit order, but that order manifests through probability distributions, statistical tendencies, feedback processes, and dynamic interactions rather than through fixed and uniquely specified trajectories.
This does not entail the wholesale abandonment of the mechanical worldview. Within restricted domains, it retains substantial validity. Many systems remain sufficiently simple, sufficiently isolated, or sufficiently constrained that deterministic modelling provides highly accurate predictions. Engineering practice continues to rely on such models, and modern technological power depends upon their reliability.
What collapses is not the utility of determinism in particular contexts, but its claim to universal applicability.
Determinism no longer functions as the master description of reality.
In its place stands probability as the primary framework through which the behaviour of complex and interacting systems is most adequately understood.
7. What Replaces Determinism?
When deterministic certainty is abandoned, two characteristic interpretive errors frequently emerge.
The first is a form of nihilistic randomness, the assumption that if the world cannot be predicted with precision, it must therefore be arbitrary or without structure.
The second is a form of mystical voluntarism, the assumption that if outcomes are not fixed in advance, they may be shaped directly and without constraint by intention alone.
Both positions arise from a failure to understand the nature of probabilistic order.
Probability does not imply chaos in the sense of absence of structure. It describes structured uncertainty. Possible outcomes do not occur with equal likelihood. They exhibit patterned distributions. Certain configurations arise more frequently than others. Tendencies persist across time. Regularities stabilise behaviour even within systems that remain open to variation and change.
The collapse of certainty does not entail the disappearance of law. Rather, it marks a transformation in the form that law takes.
Law no longer governs singular outcomes with absolute necessity. Instead, it governs the distribution of possible outcomes, constraining their relative likelihood and shaping the patterns through which events occur.
Within such a framework, influence operates not by dictating results, but by shifting likelihoods. Intervention modifies probability distributions rather than determining specific futures.
Control, accordingly, is no longer an all-or-nothing condition. It becomes graded, partial, and continuous. It operates along gradients of influence rather than through absolute determination.
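The difference between determining an outcome and shifting its likelihood can be expressed as a small Monte Carlo sketch (Python). Every number in it is an arbitrary illustrative assumption rather than a model of any real process.

```python
import random

# Influence as shifting a distribution: an outcome depends on many factors
# outside our control plus one small intervention that we do control.
def outcome(intervention: float) -> bool:
    noise = sum(random.gauss(0.0, 1.0) for _ in range(5))  # uncontrolled factors
    return noise + intervention > 2.0                      # arbitrary success threshold

def success_rate(intervention: float, trials: int = 100_000) -> float:
    return sum(outcome(intervention) for _ in range(trials)) / trials

print(f"baseline success rate:    {success_rate(0.0):.3f}")   # roughly 0.19
print(f"with intervention (+0.5): {success_rate(0.5):.3f}")   # roughly 0.25
# No single outcome is guaranteed in either case; the intervention only shifts
# the likelihood of success relative to the baseline.
```

The intervention guarantees nothing in any individual trial, yet across repeated trials it reliably reweights the distribution of outcomes, which is the graded, partial form of control described above.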
8. The New Ontological Condition
The contemporary ontological condition may be stated with precision.
Reality consists of interacting systems whose behaviour can be described in statistical terms but cannot be predicted with complete certainty; whose trajectories depend upon initial conditions that cannot be specified with perfect accuracy; and whose interactions generate emergent structures through processes unfolding over time.
This constitutes the environment within which all action takes place.
No outcome is guaranteed.
No capacity for influence is entirely absent.
Events arise within probability landscapes shaped by physical processes, informational exchange, behavioural dynamics, and networks of feedback.
The future is neither fully determined nor wholly unconstrained.
It is structured through differential weighting.
9. Why This Matters for Agency
Within a strictly mechanistic framework, agency is conceptually unambiguous. One identifies the governing causal laws and applies them to produce desired effects. Influence operates directly, and where knowledge is sufficient, outcomes are predictably determined.
Under conditions of radical randomness, by contrast, agency effectively collapses. If events occur without structured regularity, intervention cannot produce systematic effects. Action may coincide with outcomes, but it cannot reliably influence them.
In probabilistic systems, however, agency assumes a different form.
Interventions do not guarantee specific outcomes. Instead, they modify the distribution of possible outcomes. Actions alter conditions; altered conditions reshape processes; and those processes, in turn, modify the likelihoods associated with future events.
Control thus becomes indirect yet consequential. It operates not through determination, but through structured influence.
Accordingly, the central practical question changes. The relevant issue is no longer whether a particular event can be caused in a determinate sense, but whether its probability can be increased relative to available alternatives.
This shift redefines intentional action.
Power becomes the capacity to reshape probability landscapes across time through sustained and structured intervention.
10. The Opening for a New Kind of Magic
Within a strictly deterministic universe, the concept of magic can be sustained only by positing a violation of natural law. If all events unfold through fixed and fully determined causal relations, then any purported magical effect must either be reducible to ordinary mechanism or attributed to forces operating outside the lawful structure of the system.
Within a universe characterised by pure randomness, by contrast, the concept of magic loses coherence entirely. If outcomes occur without structured regularity, intervention cannot systematically alter them. Under such conditions, no method of action can produce reliable influence, and the notion of magical efficacy becomes indistinguishable from coincidence.
A probabilistic universe, however, introduces a third conceptual possibility.
If outcomes are not fixed but distributed across weighted possibilities, and if human cognition and behaviour participate in the causal networks that shape those distributions, then structured intervention within those networks may alter the likelihoods associated with future events. Under such conditions, influence need not violate natural law in order to be effective. It need only operate within the lawful dynamics of uncertainty.
No appeal to supernatural agency is required. What is required is the capacity to exert leverage within probabilistic structure.
It is within this conceptual space that modern operational magic becomes intelligible. It does not persist as a residual form of pre-scientific belief, but emerges as a practical orientation toward a reality understood to be structured by probability rather than fixed determination.
Within the mechanical universe, magic appeared either unnecessary or impossible. Within the probabilistic universe, influence is not only possible but pervasive.
The central question therefore shifts. The issue is no longer whether reality can be influenced, but whether such influence can be rendered systematic, disciplined, and subject to evaluation and refinement.
11. The End of Certainty, the Beginning of Navigation
Human beings do not occupy a position external to probabilistic systems. They exist within them as embedded components: cognitive, behavioural, and social nodes participating in networks of interaction that continuously shape and are shaped by unfolding processes.
Uncertainty, therefore, is not an external condition from which they may detach themselves. It is the medium within which they operate.
While uncertainty cannot be eliminated, it can be navigated.
Within a probabilistic ontology, navigation replaces prediction as the central metaphor of agency. Action no longer consists in calculating a uniquely determined future, but in adjusting one’s position within a dynamic and evolving field of possibilities.
Sailing does not entail control over the ocean; it involves the strategic use of currents and prevailing conditions.
Strategy does not determine the course of history; it operates by identifying and exploiting persistent tendencies within unfolding processes.
In analogous fashion, magic, understood in its modern operational sense, does not command reality through direct imposition of will. It functions through the strategic exploitation of probability gradients within complex systems.
12. The Foundation of Everything That Follows
This chapter establishes the ontological foundation upon which the entire framework of this work depends.
Reality is not a mechanism executing predetermined outcomes.
It is more accurately understood as a dynamic structure of probability shaped by interacting processes operating across multiple scales.
Several consequences follow from this characterisation.
Prediction is necessarily limited.
Control is inherently partial.
Influence is continuous and distributed.
Human beings therefore do not act within a world of certainty, nor within a world devoid of structure. They act within a world of organised uncertainty, one in which patterns exist but outcomes remain open within constrained ranges.
All subsequent analysis, including the study of cognition, decision-making, mental models, chaos magic, and probability engineering, rests upon this shift in ontological perspective.
Magic does not persist in opposition to scientific understanding. Rather, it becomes intelligible precisely because scientific inquiry has revealed that certainty does not constitute the fundamental structure of reality.
Next Chapter:
Probability as the Structure of Reality
Where we examine what probability actually is, how distributions shape experience, and why understanding them is the first step toward deliberate influence.


«Within a strictly mechanistic framework, agency is conceptually unambiguous. One identifies the governing causal laws and applies them to produce desired effects. Influence operates directly, and where knowledge is sufficient, outcomes are predictably determined.»
But since we are observably not the Laplacean demon, knowledge about world state and rules is never sufficient for perfect prediction and agency. Without believing ourselves to be in contradiction with a fundamentally mechanistic framework, we can still reason with probability and randomness, knowing them to be cognitive tools and necessary, useful simplifications for us non-divine beings. If, however, I assume the universe itself to be inherently probabilistic and non-deterministic, rather than just my necessarily constrained model of it, my reasoning would be full of causality paradoxes instead of just known uncertainty about how things work in detail. And that would be an operational problem.
«Determinism no longer functions as the master description of reality.»
Well, I still assume it does. But we are parts of reality, not masters of it, so it is not a description we have access to (nor would we have the computational resources if we did). We can still evaluate which of our models match the master prediction more closely. Every computational model, including the master one, can be compressed to run on lower-specced hardware with lower-accuracy output. If I reject determinism, I cannot evaluate my models on how closely they match an optimally compressed master model.
Whilst I cannot ask God to verify my evaluation either, I find the framing helpful nevertheless.
Anyway, this may all just be an operationally irrelevant disagreement.