(Tomorrow I’ll publish my first post on Unit Operations. I’ll probably be reading it at a slower pace, though.)
This is Part 4, on “Virtuality and the Laws of Physics”.
Part 1 (on “The Mathematics of the Virtual”)
Part 2 (on “The Actualization of the Virtual in Space”)
Part 3 (on “The Actualization of the Virtual in Time”)
—
Summary of the Sentiment
In a flat ontology of individuals, like the one I have tried to develop here, there is no room for reified totalities. In particular, there is no room for entities like ‘society’ or ‘culture’ in general. Institutional organizations, urban centres or nation states are, in this ontology, not abstract totalities but concrete social individuals, with the same ontological status as individual human beings but operating at larger spatio-temporal scales. Like organisms or species these larger social individuals are products of concrete historical processes, having a date of birth and, at least potentially, a date of death or extinction. And like organisms and species, the relation between individuals at each spatio-temporal scale is one of parts to whole, with each individual emerging from the causal interactions among the members of populations of smaller scale individuals. Although the details of each individuation process need to be described in detail, we can roughly say that from the interactions among individual decision-makers, institutions emerge; from interactions among institutions, cities emerge; and from urban interactions, nation states emerge. The population serving as substratum for the emergence of a larger whole may be very heterogeneous or, on the contrary, highly homogeneous. But even in those cases where the degree of homogeneity at different scales is high enough to suggest the existence of a single ‘culture’ or ‘society’, the temptation to postulate such totalities must be resisted, and the degree of homogeneity which motivated such postulation must be given a concrete historical explanation.
The Deductive-Nomological Approach: An epistemological theory wherein “scientific explanations are treated as logical arguments consisting of several propositions, one of which must be an exceptionless law”. The scientific field becomes axiomatic. Under this paradigm, if we successfully confirm our own predictions, we may not have determined a causal mechanism, but we can subsume the phenomenon as “a particular case under a general category.” This approach ought to be uncommon by now, but it is still prevalent in some corners, DeLanda claims.
When one accepts this model of explanation the structure of the theoretical component of a scientific field takes the form of an axiomatic: from a few true statements of general regularities (the axioms) we deduce a large number of consequences (theorems) which are then compared to the results of observations in the laboratory to check for their truth or falsity. Given that deduction is a purely mechanical way of transmitting truth or falsity, it follows that whatever truth one may find in a theorem must have already been contained in the axioms. It is in this sense that axioms are like essences.
DeLanda then describes an attempt to counter this conception and “reintroduce productive causal relations as an integral part of explanations”:
“In the view of these philosophers, explanations, rather than being simply logical arguments, involve a complex use of mathematical models of different types: models of general relations, models of particular experimental situations, as well as statistical models of the raw data gathered in laboratories. One of the defenders of this new view, Ronald Giere, puts it this way:
Even just a brief examination of classical mechanics as presented in modern textbooks provides a basis for some substantial conclusions about the overall structure of this scientific theory as it is actually understood by the bulk of the scientific community. What one finds in standard textbooks may be described as a cluster (or cluster of clusters) of models, or, perhaps better, as a population of models consisting of related families of models. The various families are constructed by combining Newton’s laws of motion, particularly the second law, with various force functions – linear functions, inverse square functions, and so on. The models thus defined are then multiplied by adding other force functions to the definition. These define still further families of models. And so on.
“I would like to add that the basic idea of thinking of a physical theory as a population of models fits well with the ontological stance I am defending. Such a population is easily conceived as the product of a historical accumulation, subject to all the contingencies of such historical processes, and hence with no pretence that it represents a complete or final set of models. At any rate, the completeness or closure of the set becomes an empirical matter, not something to be assumed at the outset as in axiomatic treatments.”
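(A quick aside from me: Giere’s “population of models” picture is easy to make concrete. The sketch below is my own toy illustration, not anything from DeLanda or Giere, and every name and constant in it is made up. Each model pairs Newton’s second law with a force function, and combining force terms multiplies the family, exactly the proliferation Giere describes.)

```python
# A toy "population of models": each member couples Newton's second law
# (a = F/m) with a different force function. All names are illustrative.
from typing import Callable

Force = Callable[[float, float], float]  # F(x, v) -> force on the body

def linear_spring(k: float) -> Force:
    """Hooke's law: F = -k x."""
    return lambda x, v: -k * x

def inverse_square(gm: float) -> Force:
    """Gravity toward the origin: F = -GM/x^2."""
    return lambda x, v: -gm / (x * x)

def viscous_drag(c: float) -> Force:
    """Linear friction: F = -c v."""
    return lambda x, v: -c * v

def combine(*forces: Force) -> Force:
    """Adding force terms defines still further families of models."""
    return lambda x, v: sum(f(x, v) for f in forces)

def step(force: Force, m: float, x: float, v: float, dt: float):
    """One Euler step of m * a = F(x, v)."""
    a = force(x, v) / m
    return x + v * dt, v + a * dt

# Three members of the growing population, all built from the same law:
undamped_oscillator = linear_spring(k=1.0)
damped_oscillator = combine(linear_spring(k=1.0), viscous_drag(c=0.1))
radial_gravity = inverse_square(gm=1.0)
```

No single member is “the” theory; the population grows historically as new force terms and combinations get added, with no closure assumed in advance.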
[A] fundamental law achieves its generality at the expense of its accuracy. A fundamental law, such as Newton’s law of gravity, is strictly speaking true only in the most artificial of circumstances, when all other forces (like electromagnetic forces) are absent, for instance, or when there is no friction or other nonlinearities. In other words, the law is true but only if a very large ‘all other things being equal’ clause is attached to it. We can compensate for the shortcomings of fundamental laws by adding to the basic equation other equations representing the action of other forces or the complex causal interactions between forces. But then we lose the generality that made the original law so appealing to essentialists. The model becomes more true, describing with increased accuracy the structure of a given experimental phenomenon, but for the same reason it becomes less general. In short, for Cartwright the objective content of physics does not lie in a few fundamental laws, but in a large number of causal models tailored to specific situations. (Giere does not speak of ‘causal models’ but of ‘hypotheses’ linking the abstract models and the world, but the overall thrust of his argument is very close to that of Cartwright.)
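(My own gloss on Cartwright’s point, in symbols: for a body moving under gravity alone, Newton’s second law plus the inverse-square force gives

$$m\ddot{x} = -\frac{GMm}{x^{2}},$$

which is strictly true only when every other force vanishes. Restoring accuracy means adding situation-specific terms, say a drag coefficient $c$ and some electromagnetic contribution $F_{\mathrm{em}}$, both placeholders of mine:

$$m\ddot{x} = -\frac{GMm}{x^{2}} - c\,\dot{x} + F_{\mathrm{em}}(x, \dot{x}) + \cdots$$

Each added term makes the model truer of one experimental situation and less general overall, which is exactly the trade-off Cartwright describes.)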
[…]
To summarize the argument of this section, far from being mere mathematical expressions of linguistic truths, laws must be viewed as models from which the mathematical form cannot be eliminated. The unification brought about by the calculus of variations, for example, cannot be understood otherwise since its techniques do not apply to linguistically stated laws. These irreducibly mathematical models form a growing and heterogeneous population, some members of which carry causal information about productive relations between events, others embody quasi-causal relations between singularities. In other words, the population of models making up the theoretical component of classical mechanics contains a large number of specific causal models which are the vehicles for truth (the part of the population that interfaces with the actual world), and fewer models which do not refer to the actual world (hence are neither true nor false) but which nevertheless do interface with the virtual world by virtue of being well-posed problems. For Deleuze a problem is defined precisely by a distribution of the singular and the ordinary, the important and the unimportant, the relevant and the irrelevant. A well-posed problem gets these distributions right, and a solution always has the truth it deserves according to how well specified the corresponding problem is. In these terms Newton’s achievement would consist not in having discovered general truths about the universe, but in having correctly posed an objective problem defined by the simplest distribution of singularities (unique minima or maxima). This interpretation preserves the objectivity of Newton’s laws but it deflates his achievement somewhat, in the sense that, if the insights of nonlinear dynamics about multiple attractors are correct, the single minimum problem is not the most general one.
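(To pin down what “unique minima or maxima” means here: in the variational reformulation that Euler, Lagrange and Hamilton gave to Newton’s problem, the actual trajectory $q(t)$ is the one that makes the action functional stationary,

$$S[q] = \int_{t_0}^{t_1} L(q, \dot{q}, t)\,dt, \qquad \delta S = 0 \;\Rightarrow\; \frac{d}{dt}\frac{\partial L}{\partial \dot{q}} - \frac{\partial L}{\partial q} = 0.$$

Posing the problem well means choosing the Lagrangian $L$ and the boundary conditions; the solution is whatever extremizes $S$. Newton’s “simplest distribution of singularities” is the case where that extremum is a single minimum. And note that none of this can even be stated for laws treated as sentences, which is DeLanda’s point about the calculus of variations.)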
[…]
[This] alternative approach, a problematic approach, rejects the idea that fundamental laws express general truths and views them instead as posing correct problems. Problems are defined by their presuppositions (what is not being explained) as well as by their contrast spaces (defining what the relevant options for explanation are). In the particular case of explanations in classical physics, where the laws are expressed by differential equations, the presuppositions are the physical quantities chosen as relevant degrees of freedom (which make up the different dimensions of a state space) while the contrast space is defined by a distribution of singularities in state space, that is, by a particular partition of possibilities into distinct basins of attraction. As the example of hydrodynamic regimes of flow shows, however, a contrast space may have a more complex structure: a cascade of symmetry-breaking bifurcations may link several such spaces in such a way that a problem may gradually specify itself as the different contrast spaces it contains reveal themselves, one bifurcation at a time.
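(To see what a “partition of possibilities into distinct basins of attraction” looks like in the simplest possible case, here is a sketch of mine, not DeLanda’s, using the textbook bistable system dx/dt = x - x^3: two point attractors at x = +1 and x = -1, separated by an unstable singularity at x = 0.)

```python
# Bistable system dx/dt = x - x**3: attractors at x = -1 and x = +1,
# an unstable singularity at x = 0. The state space (one-dimensional here)
# is partitioned into two basins of attraction by that singularity.

def flow(x: float, dt: float = 0.01, steps: int = 5000) -> float:
    """Integrate dx/dt = x - x**3 forward and return the final state."""
    for _ in range(steps):
        x += (x - x**3) * dt
    return x

for x0 in (-2.0, -0.5, 0.5, 2.0):
    print(f"x0 = {x0:+.1f}  ->  settles near {flow(x0):+.3f}")

# Every x0 < 0 ends near -1; every x0 > 0 ends near +1.
```

The singularity at zero is what defines the contrast space (settling near -1 versus settling near +1); it structures the possible outcomes before any individual trajectory is computed, and a bifurcation splitting either attractor would reveal a further contrast space nested inside it.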
These conclusions are directly connected with the ontological ideas I explored before, but to see this connection we must expand the conception of problems beyond those involving scientific explanations. In Deleuze’s approach the relation between well-posed explanatory problems and their true or false solutions is the epistemological counterpart of the ontological relation between the virtual and the actual. Explanatory problems would be the counterpart of virtual multiplicities since, as he says, ‘the virtual possesses the reality of a task to be performed or a problem to be solved’. Individual solutions, on the other hand, would be the counterpart of actual individual beings: ‘An organism is nothing if not the solution to a problem, as are each of its differenciated organs, such as the eye which solves a light problem.’ Let me illustrate this idea with a simple example I used before: soap bubbles and salt crystals, viewed as the emergent result of interactions between their constituent molecules. Here the problem for the population of molecules is to find (or compute its way to) a minimal point of energy, a problem solved differently by the molecules in soap films (which collectively solve a minimization problem stated in surface-tension terms) and by the molecules in crystalline structures (which collectively solve a bonding energy problem). It is as if an ontological problem, whose conditions are defined by a unique singularity, ‘explicated’ itself as it gave rise to a variety of geometric solutions (spherical bubbles, cubic crystals).
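(To make “find (or compute its way to) a minimal point of energy” literal, here is another toy of my own: among rectangular boxes of fixed volume, gradient descent on surface area finds the cube, the boxy analogue of the sphere a free soap film relaxes into. All the numbers are placeholders.)

```python
# Toy "matter computing its way to a minimum": among a x b x c boxes of
# fixed volume V, descend the surface-area gradient; the minimum is a cube.
V = 1.0  # fixed volume, illustrative units

def surface_area(a: float, b: float) -> float:
    """Area of an a x b x c box with c fixed by the volume constraint."""
    c = V / (a * b)
    return 2 * (a * b + b * c + a * c)

def gradient(a: float, b: float, h: float = 1e-6):
    """Numerical gradient of surface area in (a, b)."""
    da = (surface_area(a + h, b) - surface_area(a - h, b)) / (2 * h)
    db = (surface_area(a, b + h) - surface_area(a, b - h)) / (2 * h)
    return da, db

a, b = 3.0, 0.2  # start far from the optimum
for _ in range(20000):
    da, db = gradient(a, b)
    a, b = a - 0.001 * da, b - 0.001 * db

print(a, b, V / (a * b))  # all three sides converge to 1.0: a cube
```

The sphere of the bubble and the cube of the crystal are then two actualizations of the same kind of singularity, a unique minimum, reached under different constraints (surface tension in one case, bonding energy in the other).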
This intimate relation between epistemology and ontology, between problems posed by humans and self-posed virtual problems, is characteristic of Deleuze. A true problem, such as the one which Newton posed in relatively obscure geometric terms and which Euler, Lagrange and Hamilton progressively clarified, would be isomorphic with a real virtual problem. Similarly, the practices of experimental physicists, which include among other things the skilful use of machines and instruments to individuate phenomena in the laboratory, would be isomorphic with the intensive processes of individuation which solve or explicate a virtual problem in reality. This conception of the task of theoretical and experimental physicists runs counter to the traditional realist picture which views it as that of producing a corpus of linguistic propositions expressing true facts which mirror reality. In this old and tired view, the relation between the plane of reality and that of physics would be one of similarity. Yet, as Deleuze says, there is ‘no analytic resemblance, correspondence or conformity between the two planes. But their independence does not preclude isomorphism . . .’ Indeed, as I said in the conclusion of the previous chapter, there is a further isomorphism which must be included here: the philosopher must become isomorphic with the quasi-causal operator, extracting problems from law-expressing propositions and meshing the problems together to endow them with that minimum of autonomy which ensures their irreducibility to their solutions.
[…]
I would like to discuss the details of these isomorphisms, one involving the experimental, the other the theoretical component of classical physics. This will imply dealing with both sides of the relation, that is, not only the laboratory and modelling practices of physicists, but also the behaviour of the material phenomena and machinery which inhabit laboratories as well as the behaviour of the mathematical models with which the theorist makes contact with the virtual. I will begin with a discussion of how the capacity of material and energetic systems to self-organize and self-assemble, a capacity which reveals a properly problematic aspect of matter and energy, is concealed when physicists or philosophers focus on linear causality at the expense of more complex forms. Yet, I will also argue that even if a material system under study has been fully linearized and domesticated, the causal relations between experimentalist, machines, material phenomena and causal models are still nonlinear and problematic. Indeed, the physics laboratory may be viewed as a site where heterogeneous assemblages form, assemblages which are isomorphic with real intensive individuation processes.
I will then move on to questions of quasi-causality and compare Deleuze’s epistemological approach to state space, an approach that emphasizes the singularities that define the conditions of a theoretical problem, to those of analytical philosophers who stress the solutions to the problem, that is, who see not the singularities but the trajectories in state space as the conveyors of theoretical knowledge. While trajectories bear a relationship of geometric similarity to quantities measured in the laboratory, the singularities defining a problem in physics are isomorphic with those defining the conditions of a virtual multiplicity. Here too, I will argue that it is the behaviour of linear equations that conceals the problematic aspect of mathematical models. In short, whether we are dealing with causes or quasi-causes, with experimental or theoretical physics, the crucial task is to avoid the subordination of problems to solutions brought about by the search for simple linear behaviour.
And so, as we might expect, Deleuze’s epistemology echoes his ontology. The laboratory is a heterogeneous assemblage of machines, materials, and people, connecting operations to materiality: the “epistemological counterpart to ontological intensities,” attempting to extract the virtual. We don’t live in a closed, clean, unproblematic world based on beautiful, simple universal truths. Ours is a world of nonlinear causes and complex environments, created by (and creating) nonlinear models with multiple attractors, often hidden, unactualized, in an objective delusion.