
## Metadata
- Author: [[Robert W. Ulanowicz]]
- Full Title: A Third Window
- Category: #books
## Highlights
- Phenomenology, as used in science, means the encapsulation of regularities into a quantitative formula, achieved in abstraction of any eliciting causes. Hence, phenomenology does not imply understanding, although it often leads in that direction. ([Location 382](https://readwise.io/to_kindle?action=open&asin=B0058UAJRQ&location=382))
- I had the good fortune to read two papers in close succession (Atlan 1974; Rutledge, Basore, and Mulholland 1976) that together provided me with a method to quantify the degree of organization inherent in any collection (network) of interacting processes. This discovery itself proved to be highly useful for assessing the status of an ecosystem, but there was more. The mathematics used to quantify organization was borrowed from the discipline of information theory. The measurement of information is accomplished in a strange, converse fashion whereby, in order to assess how much is known about a situation, it is first necessary to quantify its opposite, i.e., how much is unknown. ([Location 384](https://readwise.io/to_kindle?action=open&asin=B0058UAJRQ&location=384))
- using information theory, it becomes possible to decompose the complexity of any scenario into two separate terms, one that appraises all that is ordered and coherent about the system and a separate one that encompasses all that is disordered, inefficient, and incoherent within it. ([Location 391](https://readwise.io/to_kindle?action=open&asin=B0058UAJRQ&location=391))
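  The decomposition described here is the standard split of a flow network's Shannon entropy into average mutual information (the ordered, coherent part) and a residual conditional entropy (the disordered part). A minimal Python sketch, using a made-up three-compartment flow matrix rather than anything from the book:

```python
import numpy as np

# Hypothetical 3-compartment flow network: T[i, j] = flow from i to j.
# The numbers are invented purely for illustration.
T = np.array([
    [0.0, 40.0, 10.0],
    [5.0,  0.0, 30.0],
    [20.0, 5.0,  0.0],
])

p = T / T.sum()          # joint flow probabilities
p_out = p.sum(axis=1)    # row marginals: share of flow leaving i
p_in = p.sum(axis=0)     # column marginals: share of flow entering j
nz = p > 0               # guard against log(0)

# Total flow diversity: joint Shannon entropy, in bits
H = -np.sum(p[nz] * np.log2(p[nz]))

# Ordered, coherent term: average mutual information of the flows
ami = np.sum(p[nz] * np.log2(p[nz] / np.outer(p_out, p_in)[nz]))

# Disordered, incoherent remainder: conditional entropy
print(f"H = {H:.3f} = AMI {ami:.3f} + residual {H - ami:.3f} (bits)")
```

  Scaled by total system throughput, the two terms correspond to what Ulanowicz elsewhere calls the network's ascendency and its overhead.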
- Our inclination under the monist approach is to drive the aleatoric to extinction, but to do so beyond a certain point is to guarantee disaster. ([Location 408](https://readwise.io/to_kindle?action=open&asin=B0058UAJRQ&location=408))
- The sixteenth century, the “forgotten century” by some historians, was an especially tumultuous and violent time, marked by all manner of bloody strife between parties that mostly were divided by sectarian beliefs. It should not be surprising, then, that maintaining the homogeneity of belief within any particular society became a matter of significant common concern, and such angst afforded special powers to the guardians of the belief structures in the form of what we now would call overweening clericalism. ([Location 533](https://readwise.io/to_kindle?action=open&asin=B0058UAJRQ&location=533))
- Darwin’s basic scheme was simple enough—so extremely simple that some say the resultant scenario became “colonial” in the sense that it could be applied to almost any situation, even many where it does not apply ([Location 688](https://readwise.io/to_kindle?action=open&asin=B0058UAJRQ&location=688))
- A process is the interaction of random events upon a configuration of constraints that results in a nonrandom but indeterminate outcome. ([Location 714](https://readwise.io/to_kindle?action=open&asin=B0058UAJRQ&location=714))
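  A toy model makes this definition concrete: in a Galton board, fixed constraints (the rows of pegs) act on random events (the bounces), and the result is nonrandom in aggregate yet indeterminate for any single ball. The sketch below is my illustration, not the book's:

```python
import random
from collections import Counter

random.seed(7)

# Constraints: a ball must traverse 12 rows of pegs (a Galton board).
# Random events: an independent left/right bounce at every peg.
def drop_ball(rows=12):
    return sum(random.choice((0, 1)) for _ in range(rows))

counts = Counter(drop_ball() for _ in range(10_000))
for slot in sorted(counts):
    print(f"{slot:2d} {'#' * (counts[slot] // 100)}")
# Each ball's landing slot is indeterminate, yet the ensemble reliably
# traces a binomial bell curve: a nonrandom outcome set by the constraints.
```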
- it is taken for granted in the Darwinian narrative (and I have never seen it emphasized) that over the course of affairs between inception and reproduction, organisms strive to compete for resources (Haught 2003). But what is the cause of this striving? To say that the drive is encoded in the organism’s genes explains nothing. The reductio ad absurdum of such a suggestion is revealed in a cartoon from the series Kudzu by Doug Marlette (Jan. 23, 1997). In it the main character, Kudzu, sits next to his parson and asks, “When will we discover the gene that makes us believe that everything is determined by genes?” ([Location 844](https://readwise.io/to_kindle?action=open&asin=B0058UAJRQ&location=844))
- in favor of flow variables. It needs to be acknowledged at the outset that full reconciliation between stasis and change is impossible, notwithstanding the fact that both are readily observable aspects of nature. (The physicist or engineer might remark that the two concepts are fundamentally different entities, like oranges and apples, because the dimensions of change include time, which is absent from the dimensions of stasis.) ([Location 877](https://readwise.io/to_kindle?action=open&asin=B0058UAJRQ&location=877))
- In the last chapter, we discussed two instances (statistical mechanics and the grand synthesis) of how science has attempted to mitigate the challenges posed by stochastic interference. Both reconciliations rested upon the same mathematical tool—probability theory—to retrieve some degree of regularity and predictability over the long run. Probability theory, however, comes with its own vulnerabilities. To apply it forces one to accept a set of assumptions regarding how chance is distributed, e.g., normally, exponentially via power-law … ([Location 891](https://readwise.io/to_kindle?action=open&asin=B0058UAJRQ&location=891))
- More importantly still, probability theory can be used only after a more fundamental set of assumptions has been accepted. These essential preconditions are rarely mentioned in introductions to probability—namely, probability applies only to chance events that are simple, generic, and repeatable. Simple events are atomic in that they occur at the smallest scale and over the shortest time interval under consideration. Generic means that there is no particular characteristic of the events worthy of mention. They are all homogeneous and indistinguishable and lack directionality, save perhaps occasionally exhibiting a binary difference in sign. Finally, it is necessary for one to observe many repetitions of the same chance event; otherwise, one cannot gauge how frequently it occurs (i.e., estimate its probability). As the cliché goes, if one possesses only a hammer, everything begins to look like a nail! Because probability theory works only on simple, generic, and repeatable chance, most tacitly assume that all instances of chance share these characteristics. But, if the burgeoning field of “complexity theory” has taught us anything, it is that matters cannot always be considered simple. Complex systems exist, so why shouldn’t complex chance? ([Location 895](https://readwise.io/to_kindle?action=open&asin=B0058UAJRQ&location=895))
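  The repeatability precondition is easy to demonstrate numerically: a relative-frequency estimate of a probability is trustworthy only after many repetitions of the same simple, generic event. A small sketch with an invented biased coin:

```python
import random

random.seed(42)

# A simple, generic, repeatable chance event: a biased coin, P(heads) = 0.3.
# (The event and its bias are invented for illustration.)
def heads() -> bool:
    return random.random() < 0.3

def estimate(n: int) -> float:
    """Relative-frequency estimate of P(heads) from n repetitions."""
    return sum(heads() for _ in range(n)) / n

for n in (10, 100, 10_000):
    print(f"n = {n:>6}: estimate = {estimate(n):.3f}")
# Few repetitions give wildly scattered estimates; only many repetitions
# of the *same* event pin the probability down. That is why unique or
# unrepeatable chance events fall outside the theory's reach.
```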
- All that Elsasser has been telling us about both chance and heterogeneity can neatly be summarized in a few words: “Combinatorics and heterogeneity overwhelm law.” ([Location 1012](https://readwise.io/to_kindle?action=open&asin=B0058UAJRQ&location=1012))
- Kauffman, Deacon, I, and others are driving at the ostensible paradox that out of a mélange of processes can emerge certain patterns of transformations that endure over time. For how else could the hard material of this world have arisen? We could believe, as did the early Platonists, that matter arose to conform to preexisting “essences,” but contemporary physics paints a considerably different picture of what we regard as hard material. For one, what to our senses seems solid, we are told, is, in large measure, empty space. And so we take recourse in assuring ourselves that the elementary particles found within that space have been around since the beginning of time—well, almost the beginning of time, that is. According to the theory of the big bang, the universe began as a chaotic, incredibly dense mass of extremely high-energy photons—pure flux (Chaisson 2001). As this continuum began to expand, some of the photons came together (collided) to form pairs of closed-looped circulations of energy called hadrons, the initial matter and antimatter. For a while, these hadrons were destroyed by collisions with photons about as fast as they appeared. Continued expansion put space between the elementary particles so that matter and antimatter pairs annihilated each other with decreasing frequency, and the diminishing energy of the photons made their collisions with extant material less destructive. Matter was beginning to appear but was also disappearing at much the same rate. Meanwhile, a very subtle (one in a billion) asymmetry (chance event) produced slightly more matter than antimatter, so that, after most antimatter had been annihilated by matter, a remainder of matter slowly accrued. Further expansion gave rise to yet larger configurations of emerging materials and the appearance of weaker forces. Eventually matter coalesced under gravity (in stars) to a density that ignited chain (feedback) fusion reactions, producing larger, more complex aggregations—the heavier elements. From these, it became possible to construct solid matter. The take-home message here is that the enduring materials we perceive today are actually the endpoints of dynamical configurations of processes, asymmetries, and feedbacks of bygone eons. ([Location 1140](https://readwise.io/to_kindle?action=open&asin=B0058UAJRQ&location=1140))
- “In principle, then, a causal circuit will generate a non-random response to a random event.” Bateson, the cyberneticist, was dealing with “causal circuits,” concatenations of events or processes wherein the last element in the chain affects the first—what commonly is known as feedback. Causal circuits, he implied, have the capability to endure because they can react nonrandomly to random stimuli. This inchoate fragment of an idea, I believe, is the keystone in our efforts to understand the life process and its origins. ([Location 1160](https://readwise.io/to_kindle?action=open&asin=B0058UAJRQ&location=1160))
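  Perhaps the simplest causal circuit is a first-order feedback loop driven by white noise. In this sketch (my construction, not Bateson's), the loop's input is pure chance, yet its output is strongly autocorrelated, i.e., nonrandom:

```python
import random

random.seed(1)

# Feedback: each step's output is fed back into the next step's input,
#   x[t+1] = a * x[t] + shock,   0 < a < 1.
a = 0.9
x, series = 0.0, []
for _ in range(1000):
    shock = random.gauss(0.0, 1.0)   # the purely random event
    x = a * x + shock                # the circuit's response
    series.append(x)

def lag1_autocorr(xs):
    m = sum(xs) / len(xs)
    num = sum((xs[i] - m) * (xs[i + 1] - m) for i in range(len(xs) - 1))
    den = sum((v - m) ** 2 for v in xs)
    return num / den

# With a = 0 the series is independent noise (autocorrelation ~ 0);
# with feedback it is strongly patterned (autocorrelation ~ 0.9).
print(f"lag-1 autocorrelation: {lag1_autocorr(series):.2f}")
```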
- The renowned philosopher Bertrand Russell (1993, 22) was among the first to appreciate the central importance of centripetality to evolution, although he referred to it under the guise of “chemical imperialism”: Every living thing is a sort of imperialist, seeking to transform as much as possible of its environment into itself and its seed…. We may regard the whole of evolution as flowing from this “chemical imperialism” of living matter. (emphasis mine) ([Location 1304](https://readwise.io/to_kindle?action=open&asin=B0058UAJRQ&location=1304))
- One should never lose sight of the fact that the autocatalytic scheme is predicated upon mutual beneficence or, more simply put, upon mutuality. Although facilitation in autocatalysis proceeds in only one direction (sense), its outcome is, nevertheless, mutual in the sense that an advantage anywhere in the autocatalytic circuit propagates so as to share that advantage with all other participants. That competition derives from mutuality and not vice versa represents an important inversion in the ontology of actions. The new ordination helps to clear up some matters. For example, competition has been absolutely central to Darwinian evolution, and that heavy emphasis has rendered the origins of cooperation and altruism obscure, at best. Of course, scenarios have been scripted that attempt to situate cooperative actions within the framework of competitions (e.g., Maynard Smith 1982). But these efforts at reconciliation invariably misplace mutuality in the scheme of things. Properly seen, it is the platform from which competition can launch: without mutuality at some lower level, competition at higher levels simply cannot occur. The reason one rodent is able to strive against its competitors is that any individual animal is a walking “orgy of mutual benefaction” (May 1981, 95) within itself. ([Location 1338](https://readwise.io/to_kindle?action=open&asin=B0058UAJRQ&location=1338))
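  The propagation claim can be checked in a toy loop in which each member facilitates the next around the circuit. In this sketch (the update rule and parameter values are my invention, not the book's), a one-time windfall granted to one member ends up shared equally by all three:

```python
import numpy as np

# Toy autocatalytic loop: C facilitates A, A facilitates B, B facilitates C.
# Each member gains in proportion to its upstream partner and sheds the
# same fraction of itself, so total activity is conserved.
def step(x, r=0.05):
    return x + r * np.roll(x, 1) - r * x

x = np.ones(3)      # members A, B, C, initially equal
x[0] += 1.0         # a one-time chance advantage to A alone
for _ in range(500):
    x = step(x)

print(np.round(x, 3))   # ~[1.333 1.333 1.333]: A's gain is shared by all
```

  The boost enters at A alone, yet because the circuit is closed, it circulates until every participant sits above the old baseline, which is the sense in which one-directional facilitation still yields a mutual outcome.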
- That the conversation between scientists and theologians is so topical owes in some measure to recent monetary support by the John Templeton Foundation, but, in substance, it has been prompted by the postmodern critique of the privileged role that science plays vis-à-vis other ways of knowing. Because discussions at this interface often relate to the very core of a participant’s self-image, they can become animated and, unfortunately at times, vehement. Such emotion derives from the distrust individuals often hold for those expressing opposing opinions. Not infrequently, there exists the fear, too often justified, that one’s opponent is seeking to “seize one’s philosophical position” (Haught 2000). Thus, theists, at least since the time of Thomas Huxley, have felt that some in science wish to use that body of facts to extinguish any possibility of belief in agency beyond the natural. Susskind’s comment in the last chapter revealed that such worry persists and is not outright paranoia (e.g., Dennett 2006, Dawkins 2006). More recently, metaphysical naturalists have felt themselves besieged by many in the public who seek to make part of public education the belief that an Intelligent Being beyond human capacity is required to explain the order in nature (Behe 1996, Dembski 1998). Either of these conflicting intentions is an example of what theologian John Haught (2000) calls “metaphysical impatience”—a desire to establish the certainty of one’s beliefs and to extirpate opposing convictions, an unwillingness on either side to admit that we inhabit, as Ilya Prigogine (and Stengers 1984) put it, a world of radical uncertainty. ([Location 2463](https://readwise.io/to_kindle?action=open&asin=B0058UAJRQ&location=2463))