Occasionalism is a variation on Cartesian metaphysics. The latter is the most notorious case of dualism, that of mind and body. The mind is a “mental substance”. The body is a “material substance”. What permits the complex interactions between these two disparate “substances”? The “unextended mind” and the “extended body” surely cannot interact without a mediating agency, God. The appearance is that of direct interaction, but this is an illusion maintained by Him. He moves the body when the mind is willing and places ideas in the mind when the body comes across other bodies.
Descartes postulated that the mind is an active, unextended thought, while the body is a passive, unthinking extension. The First Substance and the Second Substance combine to form the Third Substance, Man. God, the Fourth and uncreated Substance, facilitates the direct interaction between the two within the third. Foucher raised the question: how can God, a mental substance, interact with a material substance, the body? The answer offered was that God created the body (presumably so that He would be able to interact with it). Leibniz carried this further: his Monads, the units of reality, do not really react and interact.
They just seem to be doing so because God created them with a pre-established harmony. The constant divine mediation was, thus, reduced to a one-time act of creation. This was considered to be both a logical result of occasionalism and its refutation by a reductio ad absurdum argument. But was the fourth substance necessary at all? Could not an explanation of all the known facts be provided without it? The ratio between the number of known facts (the outcomes of observations) and the number of theory elements and entities employed to explain them is the parsimony ratio.
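As a rough illustration, and nothing more, the ratio can be read as “facts accounted for per theory element”. The short Python sketch below compares two hypothetical theories covering the same observations; the function name and the figures are assumptions invented for the example, not part of the argument.

    # Illustrative sketch: the parsimony ratio as facts explained per theory element.
    def parsimony_ratio(facts_explained, theory_elements):
        # Observed facts accounted for, divided by the postulates, entities and
        # ad-hoc curves the theory needs in order to account for them.
        return facts_explained / theory_elements

    # Two hypothetical rival theories covering the same 120 observations:
    theory_a = parsimony_ratio(facts_explained=120, theory_elements=40)  # heavily patched
    theory_b = parsimony_ratio(facts_explained=120, theory_elements=6)   # few postulates

    print(theory_a, theory_b)  # 3.0 versus 20.0

By this crude measure, the worldview with the higher ratio explains the same facts with less machinery.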
Every newly discovered fact either reinforces the existing worldview or forces the introduction of a new one, through a “crisis” or a “revolution” (a “paradigm shift” in Kuhn’s abandoned phrase). The new worldview need not necessarily be more parsimonious. It could be that a single new fact precipitates the introduction of a dozen new theoretical entities, axioms and functions (curves between data points). The very delineation of the field of study serves to limit the number of facts that could exercise such an influence upon the existing worldview and still be considered pertinent.
Parsimony is achieved, therefore, also by fixing the boundaries of the intellectual arena and/or by declaring quantitative or qualitative limits of relevance and negligibility. The world is thus simplified through idealization. Yet, if this is carried too far, the whole edifice collapses. A fine balance must be maintained between the relevant and the irrelevant, what matters and what can be neglected, the comprehensiveness of the explanation and the partiality of the pre-defined limitations on the field of research.
This does not address the more basic issue of why we prefer simplicity to complexity. This preference runs through history: Aristotle, William of Ockham, Newton, Pascal all praised parsimony and embraced it as a guiding principle of scientific work. Biologically and spiritually, we are inclined to prefer things needed to things not needed. Moreover, we prefer things needed to admixtures of things needed and not needed. This is so because things needed encourage survival and enhance its chances. Survival is also assisted by the construction of economical (parsimonious) theories.
We all engage in theory building as a mundane routine. “A tiger beheld means danger” is one such theory. Theories which incorporated fewer assumptions were quicker to process and enhanced the chances of survival. In the aforementioned feline example, the virtue of the theory and its efficacy lie in its simplicity (one observation, one prediction). Had the theory been less parsimonious, it would have taken longer to process, rendering the prediction useless: the tiger would have prevailed.
Thus, humans are Parsimony Machines (Ockham Machines): they select the shortest (and, thereby, most efficient) path to the production of true theorems, given a set of facts (observations) and a set of theories; a toy sketch of such a machine follows below. Another way to describe the activity of Ockham Machines: they produce the maximal number of true theorems in any given period of time, given a set of facts and a set of theories. Poincaré, the French mathematician and philosopher, thought that Nature itself, this metaphysical entity which encompasses all, is parsimonious. He believed that mathematical simplicity must be a sign of truth.
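One minimal way to picture such an Ockham Machine, under stated assumptions, is the following Python sketch: each candidate theory is reduced to a count of assumptions and the set of facts it covers, and the machine keeps the adequate theory with the fewest assumptions. The theory names, fact labels and counts are all invented for the illustration.

    # Toy "Ockham Machine": keep only theories that cover every observed fact,
    # then pick the one that does so with the fewest assumptions.
    facts = {"f1", "f2", "f3"}

    theories = {
        "T1": {"assumptions": 5, "covers": {"f1", "f2", "f3"}},
        "T2": {"assumptions": 2, "covers": {"f1", "f2", "f3"}},
        "T3": {"assumptions": 1, "covers": {"f1", "f2"}},  # simpler, but fails to cover f3
    }

    adequate = {name: t for name, t in theories.items() if facts <= t["covers"]}
    chosen = min(adequate, key=lambda name: adequate[name]["assumptions"])
    print(chosen)  # "T2": all the facts, with the fewest assumptions

Nothing in the sketch decides which observations count as facts in the first place; that question is taken up further on, under parsimonious selection.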
A simple Nature would, indeed, appear this way (mathematically simple) despite the filters of theory and language. The “sufficient reason” (why the world exists rather than not) should then be transformed to read: “because it is the simplest of all possible worlds”. That is to say: the world exists, and THIS world exists (rather than another), because it is the most parsimonious (not the best, as Leibniz put it) of all possible worlds. Parsimony is a necessary (though not sufficient) condition for a theory to be labelled “scientific”. But a scientific theory is neither a necessary nor a sufficient condition for parsimony.
In other words: parsimony is possible within, and can be applied to, a non-scientific framework, and parsimony cannot be guaranteed by the fact that a theory is scientific (a theory could be scientific and not parsimonious). Parsimony is an extra-theoretical tool. Theories are under-determined by data: an infinite number of theories fits any finite set of data. This happens because of the gap between the infinite number of cases dealt with by the theory (the application set) and the finiteness of the data set, which is a subset of the application set.
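Under-determination can be shown in miniature with a sketch like the following, in Python; the two “theories” and the three data points are assumptions chosen only to make the point.

    # Two different "theories" (curves) agree on every observed data point,
    # yet disagree on the rest of the (infinite) application set.
    data = [(0, 0), (1, 1), (2, 2)]                   # the finite data set

    def theory_a(x):
        return x                                      # y = x

    def theory_b(x):
        return x + x * (x - 1) * (x - 2)              # y = x + x(x-1)(x-2)

    # Both fit the data exactly ...
    assert all(theory_a(x) == y and theory_b(x) == y for x, y in data)

    # ... but they diverge outside it:
    print(theory_a(3), theory_b(3))                   # 3 versus 9

Any finite data set leaves room for infinitely many such rival curves, which is exactly why an extra-theoretical tool is needed to choose among them.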
Parsimony is a rule of thumb. It allows us to concentrate our efforts on those theories most likely to succeed. Ultimately, it allows us to select THE theory that will constitute the prevailing worldview, until it is upset by new data. Another question arises which has not hitherto been addressed: how do we know that we are implementing some mode of parsimony? In other words, what are the FORMAL requirements of parsimony? The following conditions must be satisfied by any law or method of selection before it can be labelled “parsimonious”:
Exploration of a higher level of causality: the law must lead to a level of causality which includes the previous one as well as other, hitherto apparently unrelated, phenomena. It must lead to a cause, a reason, which will account for the set of data previously accounted for by another cause or reason AND for additional data. William of Ockham was, after all, a Franciscan monk, constantly in search of a Prima Causa. The law should either lead to, or be part of, an integrative process. This means that as previous theories or models are rigorously and correctly combined, certain entities or theory elements should be made redundant.
Only those which we cannot dispense with should remain incorporated in the new worldview. The outcomes of any law of parsimony should be successfully subjected to scientific tests. These results should correspond with observations and with predictions yielded by the worldviews fostered by the law of parsimony under scrutiny. Laws of parsimony should be semantically correct. Their continuous application should bring about an evolution (or a punctuated evolution) of the very language used to convey the worldview, or at least of important language elements.
The phrasing of the questions to be answered by the worldview should be influenced as well. In extreme cases, a whole new language has to emerge, elaborated and formulated in accordance with the law of parsimony. But, in most cases, there is just a replacement of a weaker language with a more powerful meta-language. The transition from Newtonian dynamics to Einstein’s Special Theory of Relativity is a prime example of such an orderly linguistic transition, the direct result of the courageous application of a law of parsimony. Laws of parsimony should be totally subjected to (actually, subsumed under) the laws of Logic and the laws of Nature.
They must not lead to, or entail, a contradiction, for instance, or a tautology. In physics, they must adhere to laws of causality or correlation and refrain from teleology. Laws of parsimony must accommodate paradoxes. Paradox Accommodation means that theories, theory elements, the language, or a whole worldview will have to be adapted to avoid paradoxes. The goals of a theory, or its domain, for instance, could be minimized to avoid paradoxes. But the mechanism of adaptation is complemented by a mechanism of adoption. A law of parsimony could lead to the inevitable adoption of a paradox.
Both horns of a dilemma are then adopted. This, inevitably, leads to a crisis whose resolution is obtained through the introduction of a new worldview. New assumptions are parsimoniously adopted and the paradox disappears. Paradox accommodation is an important hallmark of a true law of parsimony in operation. Paradox Intolerance is another. Laws of parsimony give theories and worldviews a “licence” to ignore paradoxes which lie outside the domain covered by the parsimonious set of data and rules. It is normal to have a conflict between the non-parsimonious sets and the parsimonious one.
Paradoxes are the results of these conflicts and the most potent weapons of the non-parsimonious sets. But a law of parsimony, to deserve its name, should tell us clearly and unequivocally when to adopt a paradox and when to exclude it. To be able to achieve this formidable task, every law of parsimony comes equipped with a metaphysical interpretation, whose aim is to keep nagging paradoxes and questions plausibly at a distance. The interpretation puts the results of the formalism in the context of a meaningful universe and provides a sense of direction, causality, order and even “intent”.
The Copenhagen interpretation of Quantum Mechanics is an important member of this species. The law of parsimony must apply both to the theory entities AND to observable results, both being parts of a coherent, internally and externally consistent, logical (in short: scientific) theory. It is divergent-convergent: it diverges from strict correspondence to reality while theorizing, only to converge with it when testing the predictions yielded by the theory. Quarks may or may not exist, but their effects do, and these effects are observable. A law of parsimony has to be invariant under all transformations and permutations of the theory entities.
It is almost tempting to say that it should demand symmetry, were this not merely an aesthetic requirement, and one often violated. The law of parsimony should aspire to a minimization of the number of postulates, axioms, curves between data points, theory entities, etc. This is the principle of the maximization of uncertainty. The more uncertainty introduced by NOT postulating explicitly, the more powerful and rigorous the theory or worldview. A theory with one assumption and one theoretical entity renders much of the world an uncertain place.
The uncertainty is expelled by using the theory and its rules and applying them to observational data or to other theoretical constructs and entities. The Grand Unified Theories of physics seek to get rid of four disparate forces and to gain one instead. A sense of beauty, of aesthetic superiority, of acceptability and of simplicity should be the by-products of the application of a law of parsimony. These sensations have often been cited, by practitioners of science, as influential factors weighing in favour of a particular theory.
Laws of parsimony entail the arbitrary selection of facts, observations and experimental results to be related to and included in the parsimonious set. This is the parsimonious selection process, and it is closely tied to the concept of negligibility and to the methodology of idealization and reduction. The process of parsimonious selection is very much like a strategy in a game in which both the number of players and the rules of the game are finite. The entry of a new player (an observation, the result of an experiment) sometimes transforms the game and, at other times, creates a whole new game.
All the players are then moved into the new game, positioned there and subjected to its new rules. This, of course, can lead to an infinite regression. To effect a parsimonious selection, a theory must be available whose rules will dictate the selection. But such a theory must also be subordinated to a law of parsimony (which means that it has to parsimoniously select its own facts, etc.). A meta-theory must, therefore, exist, which will inform the lower-level theory how to implement its own parsimonious selection, and so on and so forth, ad infinitum.