While still in high school, I learned that the tides act as a brake on the Earth’s rotation, gradually slowing it down, and that the angular momentum lost by the rotating Earth is transferred to the Moon, causing it to slowly spiral outwards, away from Earth. I still vividly remember my puzzlement. How, by what mechanism or process, did angular momentum get transferred from Earth to the Moon? Just so Newton’s contemporaries must have wondered at his theory of gravity. Newton’s response is well known:

I have not been able to discover the cause of those properties of gravity from phaenomena, and I frame no hypotheses…. To us it is enough, that gravity does really exist, and act according to the laws which we have explained, and abundantly serves to account for all the motions of the celestial bodies, and of our sea.

In Newton’s theory, gravitational effects were simultaneous with their causes. The time-delay between causes and effects in classical electrodynamics and in Einstein’s theory of gravity made it seem possible for a while to explain “how Nature does it.” One only had to transmogrify the algorithms that served to calculate the effects of given causes into physical processes by which causes produce their effects. This is how the electromagnetic field—a calculational tool—came to be thought of as a physical entity in its own right, which is locally acted upon by charges, which locally acts on charges, and which mediates the action of charges on charges by locally acting on itself.

Today this sleight of hand no longer works. While classical states are algorithms that assign trivial probabilities—either 0 or 1—to measurement outcomes (which is why they can be reinterpreted as collections of possessed properties and described without reference to “measurement”), quantum states are algorithms that assign probabilities *between* 0 and 1 (which is why they cannot be so described). And while the classical laws correlate measurement outcomes *deterministically* (which is why they can be interpreted in causal terms and thus as descriptive of physical processes), the quantum-mechanical laws correlate measurement outcomes *probabilistically* (which is why they cannot be so interpreted). In at least one respect, therefore, physics is back to where it was in Newton’s time—and this with a vengeance. According to Dennis Dieks, Professor of the Foundations and Philosophy of the Natural Sciences at Utrecht University and Editor of *Studies in History and Philosophy of Modern Physics*,

the outcome of foundational work in the last couple of decades has been that interpretations which try to accommodate classical intuitions are impossible, on the grounds that theories that incorporate such intuitions necessarily lead to empirical predictions which are at variance with the quantum mechanical predictions.

But, seriously, how could anyone have hoped to get away for good with passing off computational tools—mathematical symbols or equations—as physical entities or processes? Was it the hubristic desire to feel “potentially omniscient”—capable in principle of knowing the furniture of the universe and the laws by which this is governed?

The question that will be centrally pursued in this book is: what does it take to have stable objects that “occupy space” while being composed of objects that do not “occupy space”? (The latter are commonly referred to as “pointlike.”) And part of the answer at which we shall arrive is: quantum mechanics.

As said, quantum states are algorithms that assign probabilities between 0 and 1. Think of them as computing machines: you enter (i) the actual outcome(s) and time(s) of one or several measurements, as well as (ii) the possible outcomes and the time of a subsequent measurement—and out pop the probabilities of these outcomes. Even though the time dependence of a quantum state is thus clearly a dependence on the times of measurements, it is generally interpreted—even in textbooks that strive to remain metaphysically uncommitted—as a dependence on “time itself,” and thus as the time dependence of something that exists at every moment of time and evolves from earlier to later times. Hence the mother of all quantum-theoretical pseudo-questions: why does a quantum state have (or appear to have) two modes of evolution—continuous and predictable between measurements, discontinuous and unpredictable whenever a measurement is made?
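The "computing machine" picture above can be made concrete with a minimal sketch (not from the text; the spin-1/2 example and the function name `probabilities` are illustrative choices). Given the outcome of a first measurement, the quantum state is nothing but an algorithm that returns the probabilities of the possible outcomes of a subsequent measurement, via the Born rule:

```python
import numpy as np

def probabilities(prepared_state, measurement_basis):
    """Born rule: probability of each outcome = |<outcome|state>|^2."""
    return [abs(np.vdot(b, prepared_state))**2 for b in measurement_basis]

# Input (i): the actual outcome of a first measurement —
# say, spin "up" along the z axis.
up_z = np.array([1, 0], dtype=complex)

# Input (ii): the possible outcomes of a subsequent measurement
# along the x axis.
up_x = np.array([1, 1], dtype=complex) / np.sqrt(2)
down_x = np.array([1, -1], dtype=complex) / np.sqrt(2)

# Output: the probabilities of those outcomes.
probs = probabilities(up_z, [up_x, down_x])
print(probs)  # each outcome gets probability 0.5 — strictly between 0 and 1
```

A "classical state" in this scheme would be one for which the returned list contains only 0s and 1s; the point of the passage is that quantum states generically do not have that form.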

An approach that rejects the very notion of quantum state evolution runs the risk of being dismissed as an ontologically sterile instrumentalism. Yet it is this notion, more than any other, that blocks our view of the ontological implications of quantum mechanics. One of these implications is that the spatiotemporal differentiation of the physical world is incomplete; it does not "go all the way down." The notion that quantum states evolve, on the other hand, implies that it does "go all the way down." This is not simply a case of one word against another, for the incomplete spatiotemporal differentiation of the physical world follows from the manner in which quantum mechanics assigns probabilities, which is *testable*, whereas the complete spatiotemporal differentiation of the physical world follows from an assumption about what is the case *between measurements*, and such an assumption is, in Wolfgang Pauli's famous phrase, "not even wrong," inasmuch as it is neither verifiable nor falsifiable.

For at least twenty-five centuries, theorists—from metaphysicians to natural philosophers to physicists and philosophers of science—have tried to model reality from the bottom up, starting with an ultimate multiplicity and using concepts of composition and interaction as their basic explanatory tools. If the spatiotemporal differentiation of the physical world is incomplete, then the attempt to understand the world from the bottom up—whether on the basis of an intrinsically and completely differentiated space or spacetime, out of locally instantiated physical properties, or by aggregation, out of a multitude of individual substances—is doomed to failure. What quantum mechanics is trying to tell us is that reality is structured from the top down.

This textbook is based on a philosophically oriented course of contemporary physics I have been teaching for the last ten years at the Sri Aurobindo International Centre of Education (SAICE) in Puducherry (formerly Pondicherry), India. This non-compulsory course is open to higher secondary (standards 10–12) and undergraduate students, including students with negligible prior exposure to classical physics.

*Footnote:* I consider this a plus. In the first section of his brilliant Caltech lectures, Richard Feynman raised a question of concern to every physics teacher: "Should we teach the *correct* but unfamiliar law with its strange and difficult conceptual ideas…? Or should we first teach the simple … law, which is only approximate, but does not involve such difficult ideas? The first is more exciting, more wonderful, and more fun, but the second is easier to get at first, and is a first step to a real understanding of the second idea." With all due respect to one of the greatest physicists of the 20th century, I cannot bring myself to agree. How can the second approach be a step to a real understanding of the correct law if "*philosophically we are completely wrong* with the approximate law," as Feynman himself emphasized in the immediately preceding paragraph? To first teach laws that are philosophically wrong cannot but impart a conceptual framework that eventually stands in the way of understanding the correct laws. The damage done by imparting philosophically wrong ideas to young students is not easily repaired.

I wish to thank the SAICE for the opportunity to teach this experimental course in “quantum philosophy” and my students—the “guinea pigs”—for their valuable feedback.

*Note added in 2018:* Today I would call it a mix of enthusiasm and galvanizing perplexity. While discussing the bomb testing experiment, one student exclaimed after a minute of puzzled contemplation: “I like this feeling in the head!”

August 15, 2010