Deep Simplicity: Chaos, Complexity, and the Emergence of Life
Introduction: The Simplicity of Complexity
- Chaos and complexity are based on two simple ideas:
- The sensitivity of a system to its starting conditions
- Feedback
Order Out of Chaos
- The Greeks were superb geometers, who had a very good understanding of the relationships between stationary things, but they had no understanding of how things move or the laws of motion.
- The three-body problem (3BP) - Equations describing systems of three or more objects can be written down, but they cannot be solved - they are not integrable and are said to have no analytical solutions (the techniques of mathematical analysis do not work on them).
- Leibniz and Newton invented calculus at about the same time
- Equations that can be solved analytically are sometimes said to be deterministic: the equation describing the orbit of a single planet around the sun is deterministic, and has analytical solutions in the form of ellipses. But the lack of solutions to the 3BP is built into the laws of maths.
- The absence of analytical solutions means that nature itself does not "know" exactly how the orbits will change (evolve) as time passes. For the planets of the Solar System, there is the possibility that the orbits may not stay the same for ever as they are today, but may change in literally unpredictable ways.
- Newton and Maxwell provided the mathematical toolkit needed to deal with everything known to physical science in the middle of the 19th century, and Maxwell's equations also produced a description of light as an electromagnetic wave traveling through space at a certain speed.
- There is something odd about the arrow of time - Newton's and Maxwell's equations work whichever way time is running. Time seems to be closely tied in with the laws of thermodynamics and statistical mechanics.
- Count Rumford, in the 1790s, came up with the insight that heat and work are interchangeable. A steam engine converts heat into work; the cannon-boring process turned work into heat.
- Energy cannot be created or destroyed, but only transformed from one form into another.
- But while the total amount of energy in the universe stays the same, the amount of useful energy is always decreasing.
- Entropy measures the amount of disorder in a system, with increasing disorder corresponding to increasing entropy. Disorder increases in any closed system (things wear out) as time passes, and this inevitable increase in entropy defines a direction of time, an arrow pointing from the ordered past into the disordered future. Because this process seemed inevitable and universal, the Victorian thermodynamicists envisaged an ultimate fate of the universe in which all useful energy had been converted into heat, and everything was a bland mixture of stuff at one uniform temperature, a bleak scenario that they called the 'heat death' of the universe.
- Anywhere in the universe that a pocket of order appears, it always does so at the cost of more disorder being produced somewhere else.
- Attractors - The final equilibrium state (which also corresponds to the state of minimum energy) is said to be an attractor, because the system acts as if it is attracted towards that state. Once that state is reached, there is no way to tell how the system got into it - there is no history written into the final, equilibrium state. By the time a system reaches equilibrium it has forgotten its initial conditions. All that matters is where it is now.
- There is no such thing as an isolated system (except for the entire universe) and no system is ever in perfect equilibrium.
- Away from equilibrium, a flow of energy can, under the right circumstances, create order spontaneously.
- In general, a system that is close to equilibrium will be attracted to a state where the rate at which entropy is being produced is a minimum.
- Why do we live in an orderly world? Boltzmann suggested that the heat death of the universe had already occurred, that the universe as a whole is in equilibrium, and that the orderly region we inhabit is a rare, temporary statistical fluctuation away from that equilibrium.
- All the allowed states of gas in a box recur repeatedly, with a characteristic periodicity called the Poincaré recurrence time, or Poincaré cycle time. If the entropy goes up for a while, it must inevitably come down again later on, to bring the gas back to its original state. And this recurrent, cyclic behavior is a direct result of the strict application of Newton's laws, in which past and future have equal status.
The Return of Chaos
- Any number series that does not converge on a specific finite number is said to diverge
- In many cases, there is no way to prove whether the series will diverge or converge.
- Instability is normal, and permanently stable orbits are the exception.
- With a linear system, if I make a small mistake in measuring, or estimating, some initial property of the system, this will carry through my calculations and lead to a small error at the end. But with a non-linear system, a small error at the beginning of the calculation can lead to a very large error at the end of the calculation. A linear system is more or less equal to the sum of its parts; a non-linear system may be either much more, or much less, than the sum of its parts.
- The Lorenz attractor (aka the Butterfly attractor)
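A minimal sketch (not from the book) of the sensitivity just described: the Lorenz system integrated from two starting points that differ by one part in a million, using Lorenz's classic parameter values. The tiny initial gap grows until the two trajectories bear no relation to each other.

```python
# Lorenz system, crude Euler integration; parameters are Lorenz's classic choices.
def lorenz_step(state, dt=0.01, sigma=10.0, r=28.0, b=8.0 / 3.0):
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (r - z) - y
    dz = x * y - b * z
    return (x + dt * dx, y + dt * dy, z + dt * dz)

state_a = (1.0, 1.0, 1.0)
state_b = (1.0, 1.0, 1.000001)   # differs in z by one part in a million

for step in range(3001):
    if step % 500 == 0:
        gap = sum((p - q) ** 2 for p, q in zip(state_a, state_b)) ** 0.5
        print(f"t = {step * 0.01:5.1f}  separation = {gap:.6f}")
    state_a = lorenz_step(state_a)
    state_b = lorenz_step(state_b)
```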
- Simple laws, non-linearity, sensitivity to initial conditions and feedback are what make the world tick.
- Resonance is a way of getting a large return for a relatively small effort, by making the effort at just the right time and pushing a system the way it "wants" to go (like swinging on a swing).
Chaos Out of Order
- In practice, it is impossible to predict in detail what is going to happen more quickly than events unfold in real-time.
- Turbulence in the flow of water in a river where one large rock divides the flow builds through at least three distinct changes:
- First, as the flow builds up, little whirlpools appear behind the rock. These vortices stay in the same place, and a chip of wood floating downstream may get trapped in one of them and go round and round for a long time. In phase space, this kind of attractor is known as a limit cycle, because wherever the system starts out, in the limit it will be attracted to the specific repetitive pattern of behavior.
- At the next stage, as the speed of the water flowing downstream increases, vortices form behind the rock, but don't stay there. They detach themselves (or are detached by the flow), and move away downstream, retaining their separate existence for a while before dissolving into the flow. As they do so, new vortices form behind the rock and detach in their turn. Now, a chip of wood might get caught in one of these eddies and be carried downstream still circling around in the vortex as long as the vortex lasts.
- As the speed of water increases still further, the region behind the rock where the vortices survive gets smaller and smaller, with vortices forming and breaking up almost immediately to produce a choppy surface in which there are seemingly only irregular fluctuations - turbulence.
- Eventually, when the flow is fast enough, all trace of order in the region behind the rock disappears. No vortices form and the entire surface of the water breaks up behind the rock into unpredictable chaotic motion.
- There are two key features of the route from order to chaos:
- Something is changing - in this case, just one parameter, the speed with which water is flowing.
- A whirlpool that detaches from the rock and moves downstream doesn't just disappear. It breaks up inside into smaller vortices, which in turn break up into smaller vortices, eddies within eddies within eddies in what seems to be an endless process of bifurcation. The road to chaos involves what seems to be an infinite number of choices operating on an infinitely small scale.
- Period doubling - when a dripping tap goes from drip, to drip-drip, to drip-drip-drip-drip
- The logistic equation describes how the population of a species changes from one generation to the next: x_next = Bx(1 - x), where x is the population as a fraction of its maximum possible value (see the iteration sketch after this list)
- This process is non-linear as well as involving feedback (through the iteration)
- B < 1 - the population fails to reproduce itself with each adult leaving on average less than one offspring, and will eventually die out
- 1 < B < 3 - there is an attractor: whatever value of the population you start with (any value of x between 0 and 1), after a number of generations it settles down to a steady level, a constant population. For values of B close to 3 it settles down at 0.66, i.e. 2/3 of the max population
- 3 < B - the pattern changes and the population switches between two different, constant levels in alternate generations. The single attractor has split in two (bifurcated) and the period has doubled from 1 to 2
- B = 3.4495 - each of the two prongs bifurcates, producing a system that oscillates between four different populations (period 4)
- B = 3.5441 - period 8
- B = 3.5644 - period 16
- B = 3.56999 - the number of attractors becomes infinite and anyone studying the population changes from year to year will see genuine deterministic chaos.
- But there are small ranges of values of B where order is restored. For example, in a window of values near B = 3.83 (between 3.8 and 3.9) the system settles down into a simple repeating cycle again (the famous period-3 window), reminiscent of the orderly behavior when B is less than 3.
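A small sketch of the period-doubling story above: iterate the logistic map for a few illustrative values of B, throw away the transient, and list the distinct values the population keeps cycling through (the specific B values below are illustrative choices, not figures from the book).

```python
def long_run_values(B, x=0.5, transient=10000, sample=64):
    for _ in range(transient):      # let the population settle onto its attractor
        x = B * x * (1 - x)
    seen = set()
    for _ in range(sample):         # then record the values it keeps visiting
        x = B * x * (1 - x)
        seen.add(round(x, 4))
    return sorted(seen)

for B in (2.9, 3.2, 3.5, 3.56, 3.7):
    values = long_run_values(B)
    label = f"period {len(values)}" if len(values) <= 16 else "chaotic"
    print(f"B = {B}: {label}, sample values {values[:4]}")
```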
- The patterns within patterns are said to be self-similar. In the midst of order, there is chaos; but in the midst of chaos, there is order.
- In a paper called "Period Three Implies Chaos", Li and Yorke show that for certain families of iterated equations (maps like the logistic equation), if there is just one solution which has period three then there must also be an infinite number of periodic solutions, with every possible period, and also an infinite number of non-periodic solutions.
- Feigenbaum showed that the period-doubling route to chaos is not some special feature of the logistic equation, but is a product of the iterative process by which the system feeds back on itself - whether an animal population, an oscillator in an electrical circuit, an oscillating chemical reaction, or, in principle, the business cycle of the economy.
- What mattered was that the systems had to be self-referential. If so, then they followed exactly the same route to chaos. In the interval between period doublings, there is a constant ratio in the size of one step to the next (4.669:1). This is a geometric convergence, taking you closer and closer to some critical point without ever quite getting there.
- We can think of turbulence as resulting from an increasing number of regular periodic cycles added together. In a simple vortex, the motion corresponds to a loop around a simple attractor, the limit cycle. The next step would be to imagine a point in phase space tracing out a circle, while the centre of the circle traces out a larger circle. The resulting attractor would be a torus, like an inner tube. Going regularly round the small circle while the centre of that circle goes regularly round the large circle, the point representing the state of the system follows a path like that of a coiled spring, a slinky, bent round into a circle, in a regular and predictable way. Typically, two periodic motions in phase space interact with one another and get locked into a repeating rhythm.
- Mathematically, it's possible to describe further stages of increasingly complex behavior on the road to turbulence in terms of "tori" in higher numbers of dimensions.
- A limit cycle is a 1-D attractor existing in 2-D. The surface of a torus is a 2-D attractor embedded in a 3-D space. And the logical next step would be a pattern of behavior described by a 3-D attractor embedded in 4-D phase space (but the real world doesn't go any further down this route).
- Turbulence occurs with the point in phase space representing the state of the system staying on the 2-D surface of the torus but moving in a much more complicated way around the surface in a path that can never pass through the same point in phase space twice (for, if it did, the system would be periodic and its behavior would repeat) and therefore never crossing itself.
- Typically, in an echo of the original 3BP, three or more periods in phase space do not get locked into a repeating rhythm, and their combined effect is no more predictable than the orbit of the small particle in the restricted 3BP.
- The trajectory through phase space of the point describing the system would correspond to an infinitely long line wrapped in a complex way, without ever crossing itself, around a finite surface. Ruelle and Takens in 1971 dubbed this a "strange attractor".
- The Peano curve is infinitely long, yet contained within a finite area. There is a clear analogy here with the "space filling" curve that is the attractor, wrapped around the torus in the phase space that describes a turbulent system, although none of that was known in the 1890s.
- Mandelbrot realized that an object like the Peano curve could be described as having an intermediate dimension, in this case somewhere between 1 and 2. A regular line is a 1-D object, and a plane is a 2-D object, but just as there is an infinity of numbers between any two whole numbers, so there are entities with intermediate, non-integer dimensions. If the dimension of such an entity is not an integer, then it must be fractional. Mandelbrot coined the word "fractal" in 1975.
- Three other fractals were known decades before the 1970s (their similarity dimensions are worked out in the sketch after this list):
- The Cantor set - Is a line that is less than a line, "trying" to be a point, a kind of ghost of the original line, like the fading grin on the face of the Cheshire cat in Alice in Wonderland. It is produced by iteration (rub out the middle third, and then repeat and repeat with all the lines left) and is self-similar (made up of two copies of itself, scaled down by a factor of 1/3).
- The Sierpinski gasket - Take a triangle, join the mid-points of its sides to make an upside-down triangle in the middle, erase it, and then repeat and repeat with all the triangles left. It is self-similar, a fractal, with dimension somewhere between 1 and 2.
- The Koch curve - Divide a line into three equal parts and replace the middle third with the two other sides of an equilateral triangle built on it. Then repeat on each remaining segment, and repeat again. Alternatively, start with an equilateral triangle and perform the operation on all three sides. This structure becomes the Koch snowflake and, eventually, the Koch island, which has an infinitely long coastline. The dimension of the Koch curve is 1.2619, more like a line than a plane. The coastline of Britain resembles a fractal, with a dimension of about 1.3, slightly less line-like than the Koch curve.
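The dimensions quoted above come from one line of arithmetic: if a shape is made of N copies of itself, each scaled down by a factor r, its similarity dimension is D = log(N) / log(1/r). A quick check of the three fractals just listed:

```python
from math import log

shapes = [
    ("Cantor set (2 copies, each 1/3 the size)", 2, 1 / 3),
    ("Sierpinski gasket (3 copies, each 1/2 the size)", 3, 1 / 2),
    ("Koch curve (4 copies, each 1/3 the size)", 4, 1 / 3),
]
for name, copies, scale in shapes:
    print(f"{name}: D = {log(copies) / log(1 / scale):.4f}")
# Cantor 0.6309, Sierpinski 1.5850, Koch 1.2619 - matching the 1.2619 quoted above.
```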
- Seemingly complicated systems can be produced, or described, by the repeated application of a very simple rule.
- It is hard to see how even the wealth of DNA contained in the single cell that develops to become a human being, or a pine tree, or whatever, can contain a literal blueprint for all the complex structures involved in the final adult form. But it is much easier to see how that DNA might contain a few simple instructions along the lines of "double in size for n steps, then divide by two, and repeat in each branch." In the chaos games, instructions only slightly more complicated than this make complex structures like ferns by iteration. If there is an attractor for the fern shape, based on that kind of simple rule, it is no surprise that some plants grow into ferns, even though they need not be programmed to have that shape any more than a single die is programmed to make the Sierpinski gasket. It is randomness plus a simple iterative rule (or rules) that makes the complexity of the world.
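A runnable version of the die-rolling "chaos game" referred to above: start anywhere, and on every roll jump halfway towards a randomly chosen corner of a triangle. Pure randomness plus one simple rule, and the Sierpinski gasket appears.

```python
import random

corners = [(0.0, 0.0), (1.0, 0.0), (0.5, 0.866)]
x, y = random.random(), random.random()      # any starting point will do

points = []
for i in range(20000):
    cx, cy = random.choice(corners)          # "roll the die" to pick a corner
    x, y = (x + cx) / 2, (y + cy) / 2        # jump halfway towards it
    if i > 20:                               # skip the first few transient points
        points.append((x, y))

# Crude text rendering: mark the cells of a 60x30 grid that ever get visited.
grid = [[" "] * 60 for _ in range(30)]
for px, py in points:
    grid[29 - int(py * 29)][int(px * 59)] = "*"
print("\n".join("".join(row) for row in grid))
```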
- The Mandelbrot set is based on repeatedly feeding complex numbers through the simple expression z -> z*z + c, yet it is possibly the most complicated entity ever investigated by humankind.
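A point c belongs to the Mandelbrot set if the iterates of that simple expression never run away to infinity. A tiny membership test (the sample points are illustrative):

```python
def in_mandelbrot(c, max_iter=200):
    z = 0j
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2:        # once |z| exceeds 2 it is guaranteed to escape
            return False
    return True

for c in (0j, -1 + 0j, -2 + 0j, 0.5 + 0j, 1 + 1j):
    print(c, "inside" if in_mandelbrot(c) else "outside")
```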
- The way to make complex structures out of simple rules is repetition. The process is a bit like the way a baker kneads dough, stretching it, folding it over on itself, pummeling it flat, stretching and folding again, repeatedly.
- Volume, or size, obeys a power law with exponent 3. The mass of the body of an animal is proportional to its volume, and as you would expect, the more mass there is the higher the metabolic rate, because there is more body burning up its food and releasing energy. But although the mass increases as a power law with exponent 3, the metabolic rate scales in line with a power law with exponent 2.25.
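A short worked version of that claim, taking the two quoted exponents as given (this is the familiar 3/4-power scaling of metabolism, restated in terms of linear size L):

$$
M \propto L^{3}, \qquad R \propto L^{2.25} = \left(L^{3}\right)^{0.75} \quad\Rightarrow\quad R \propto M^{0.75},
$$

so a 16-fold increase in body mass brings only a 16^0.75 = 8-fold increase in metabolic rate.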
- So animals are behaving as if their size scales like something intermediate between a volume and a 2-D surface. This implies that the objects involved are fractal surfaces crumpled up within finite volumes. There are many features of living systems that are like fractals.
- The way that arteries and veins branch, for example, turns out to be essentially fractal, enabling blood to reach and return from every part of the body without the veins and arteries taking up so much room that there is no space for anything else. The kidney is very much a finite 3-D object, yet the veins and arteries inside tend towards the infinite length of a true fractal.
- The most complex and interesting things in the Universe are happening right at the edge of chaos, just before order is destroyed. Here we find taps or leaky gutters dripping with strange and exotic rhythms, whirlpools within whirlpools spinning about in wonderful patterns, and the extraordinary complexity of the kidney, or the multiply folded and refolded near-fractal surface of the cerebral cortex of the human brain.
- Turing's paper "The Chemical Basis of Morphogenesis":
- Argued that patterns could arise in a mixture of chemicals if the catalyst A not only encouraged the production of more A, but also encouraged the formation of another compound B, which was an inhibitor that acted to slow down the rate at which more A is produced.
- Showed how the competition between A and B was the key to pattern formation, and that it was essential that B diffuse through the mixture more quickly than A, so that while the runaway production of A by the autocatalytic feedback process is always a local phenomenon, the inhibition of A by B is a widespread phenomenon. Also, the rapid diffusion of B away from where it is being made means that it doesn't entirely prevent the manufacture of A at its source.
- Self-organization and the spontaneous appearance of patterns out of uniform systems occur on the edge of chaos, and can be described in the language of phase space, limit cycles and attractors. Murray, in Mathematical Biology, found that not only the spots of the leopard but also the stripes on a zebra, the blotches on a giraffe, and even the absence of patterning on the coat of a mouse or an elephant, can all be explained by the same simple process, involving diffusion of activator and inhibitor chemicals across the surface of the developing embryo at a key stage during its growth. "It is an appealing idea that a single mechanism could generate all of the observed mammalian coat patterns."
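A runnable sketch of spontaneous pattern formation by reaction and diffusion. This is not Turing's original chemistry but the closely related Gray-Scott model, with standard textbook parameter values (the specific numbers are illustrative assumptions); the essential ingredients match the description above: local autocatalysis plus a second, faster-diffusing species that provides the long-range competition.

```python
import numpy as np

n, steps = 128, 8000
Du, Dv, F, k = 0.16, 0.08, 0.035, 0.065   # common textbook choices (spot-forming regime)

u = np.ones((n, n))
v = np.zeros((n, n))
# Seed a small square of the autocatalytic species V in an otherwise uniform field.
u[n//2-5:n//2+5, n//2-5:n//2+5] = 0.50
v[n//2-5:n//2+5, n//2-5:n//2+5] = 0.25

def laplacian(a):
    # Five-point stencil with wrap-around boundaries.
    return (np.roll(a, 1, 0) + np.roll(a, -1, 0) +
            np.roll(a, 1, 1) + np.roll(a, -1, 1) - 4 * a)

for _ in range(steps):
    uvv = u * v * v
    u += Du * laplacian(u) - uvv + F * (1 - u)
    v += Dv * laplacian(v) + uvv - (F + k) * v

# Crude text rendering of the final state: '#' where the autocatalyst is abundant.
for row in v[::4, ::2]:
    print("".join("#" if value > 0.2 else "." for value in row))
```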
The Edge of Chaos
- Only in closed systems do we encounter time reversibility and Poincaré recurrences; in open systems we encounter irreversibility and an arrow of time.
- The nearest a living thing ever gets to equilibrium is when it dies.
- Things exist in a steady state in the linear regime; a human being, for example, can maintain his or her integrity for many years using the flow of energy (and food) through their body, although in this case the steady state eventually breaks down, for reasons which are still not understood.
- In a uniformly heated fluid, there is no special place where boiling can start. There is a uniform symmetry across the bottom of the fluid, and it all "wants" to rise at once. At first nothing happens. Then, as the temperature is gradually increased underneath while kept cool on the top (so that the temperature gradient gets steeper), at a critical point the symmetry is broken and the uniform surface of the fluid is suddenly broken up into a pattern of tiny hexagonal convection cells.
- The most interesting patterns appear right at the edge of chaos, and for a thin layer of oil heated in an open pan, thanks to surface tension the specific interesting pattern that happens is a honeycomb arrangement of hexagons. This is happening far from equilibrium, thanks to energy flowing through an open system and being dissipated. This is the secret of the existence of order in the Universe, and specifically the secret of life. Whenever you are confronted by the existence of order, and especially by the existence of life, and you are considering how the world got to be like that, you can always say to yourself "Remember the hexagonal cells of the Bénard convection? It's the same thing!"
- A system can only be held in an interesting state away from equilibrium if it is dissipative and open to the environment, so there has to be an outside source of energy. On Earth, that energy comes, ultimately, from the Sun. And it is understanding why the Sun shines that makes it clear why order has arisen out of chaos and how the Universe came to have an arrow of time.
- It's all thanks to gravity. Gravity has a very curious property - the gravitational energy of any object that has mass is negative. It is literally less than zero, and the more compact the object is, the more negative its gravitational energy is.
- The gravitational field started with zero energy. So now it has less than zero - negative energy. By the time the particles come together to make a star, the gravitational field associated with that star has a lot of negative energy. How much?
- The furthest time back we can see is when the Universe was about 300k years old, and consisted of a very nearly uniform sea of hot gas (strictly speaking, a plasma), at about the same temperature as the surface of the Sun today - roughly 6k degrees C
- When the gravitational forces between the individual particles that make up the gas cannot be ignored, as is the case for large clouds of gas and dust in space, gravity can pull things together in clumps, making more order and at the same time reducing the entropy.
- As Paul Davies has put it, "gravitationally induced instability is a source of information". More information implies less entropy, so you can also regard this as implying that as the information "comes out" of the gravitational field in a collapsing cloud of gas, the gravitational field is swallowing entropy, to go with its negative energy. It is the negative energy of the gravitational field that makes it possible for it to swallow up entropy in this way, explaining why the Universe is not in thermodynamic equilibrium today.
- Ultimately, it is gravity that tells the arrow of time which way to point.
- A planet like the Earth is bathed in the glow of energy from a star, which makes the whole surface of the planet an open, dissipative system. All life on the surface of the Earth makes use of this energy to maintain itself far from equilibrium, on the edge of chaos. Plants get their energy directly from sunlight through photosynthesis; grazers get their energy from plants; carnivores get their energy from other animals. But it is all originally from the Sun, and it is all, originally, thanks to gravity. But the way in which systems organize themselves, using this flow of energy, into what seem to be complex forms, is really quite simple.
- Energy also comes from within the Earth, chiefly as a result of the decay of radioactive elements in the Earth's core. This radioactive material was produced in previous generations of stars, and spread through space when those stars exploded, becoming part of the interstellar cloud from which the Solar System formed. So this energy source, too, ultimately owes its origin to gravity. Life forms that feed off this energy, which escapes through hot vents in the ocean floor, may do so entirely independently of the energy from sunlight, but they are as much a product of gravity as we are.
- Turing showed that there are systems which cannot be "compressed" algorithmically, and whose most compact representations are themselves. In particular, the shortest description of the Universe is the Universe itself.
Earthquakes, Extinctions, and Emergence
- A complex system is really just a system that is made up of several simpler components interacting with each other.
- The simple pieces have to be connected together in the right way, so that they interact with one another to produce something that is greater than the sum of its parts. And that's complexity, founded upon deep simplicity.
- When scientists are confronted by complexity, their instinctive reaction is to try to understand it by looking at the appropriate simpler components and the way they interact with one another. Then they hope to find a simple law (or laws) which applies to the system they are studying. If all goes well, it will turn out that this law also applies to a wider sample of complex systems (as with the atomic model of chemistry, or the way the laws of cogwheels apply both to bicycles and chronometers), and that they have discovered a deep truth about how the world works.
- There are very many small earthquakes, very few large earthquakes, and the number in between lies, for any magnitude you choose, on the straight line joining those two extreme points. This is a power law, now known as the Gutenberg-Richter law - a simple law underlying what looks at first sight to be a complex system.
- A magnitude 8 earthquake, like the San Francisco earthquake of 1906, is 20bn times more energetic than a magnitude 1, which is like the tremor you feel indoors when a heavy lorry passes by in the street outside. Yet the same simple law applies across this vast range of energies.
- There seems to be a connection between the fractal geometry of Norway (and other coastlines) and the frequency with which earthquakes of different sizes occur worldwide (and in individual earthquake zones). Part of that connection lies in the fact that they and fractals are both scale-free.
- Power laws always mean that the thing being described by the law is scale invariant, so that earthquakes of any size are governed by exactly the same rules.
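A tiny numerical illustration of that scale invariance, assuming a Gutenberg-Richter-like power law N(s) proportional to s^(-a) for the frequency of events of size s (the exponent below is illustrative, not a fitted value):

```python
a = 1.0                          # illustrative exponent

def frequency(size):
    return size ** (-a)          # relative number of events of this size

for s in (1, 10, 100, 1000):
    ratio = frequency(s) / frequency(10 * s)
    print(f"events of size {s:5d} are {ratio:.0f}x more common than events of size {10 * s}")
# The ratio is the same whatever size you start from: the law has no built-in scale.
```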
- Large events are more rare and you can express this by saying that:
- The frequency of an event is equal to 1 divided by some power of its size, or
- The size of an event is proportional to 1 over some power of its frequency, f.
- As the exact power is not particularly important, in general this is called "1 over f noise" (or power law behavior, or pink noise), and written as "1/f noise".
- At one extreme, the direct contrast with 1/f noise would be white noise, which is completely random; at the other extreme, the contrast would be with a pure signal containing just one frequency, like a single musical note. The sound of music or of the spoken word is also 1/f noise.
- 1/f noise contains information.
- There are few cities with very large populations, and many more cities with smaller populations, and (for the world as a whole and for individual regions) the way people congregate in cities obeys a power law, consistently over time. Even though we each choose individually where to live, we are subject to the same laws as earthquakes.
- The prices of commodities also obey power laws, and large events like the crash of October 1987 can happen out of the blue as a consequence of small triggers.
- Sometimes human intervention on a large enough scale will have an effect - WWII provided a boost to the US economy that pulled it out of depression
- Diminishing returns is a form of negative feedback, while increasing returns is positive feedback.
- Modern economists are dealing with dynamic changing systems, in which positive feedbacks are involved and through which a form of energy (money) flows. Thanks to feedback, economies are self-organizing systems on the edge of chaos, though we are on the inside looking out, and human beings are an integral part of the system they are trying to analyze.
- On the geological scale, we need to look at the patterns of extinctions throughout history to get a proper feel for whether the death of the dinosaurs was a special event or just one of those things. The K-T event is one of the "Big Five" extinctions in the last 600m years, and by no means the biggest:
- 440m ya - Ordovician/Silurian boundary
- 360m ya - Devonian/Carboniferous boundary
- 250m ya - Permian/Triassic boundary
- 215m ya - Triassic/Jurassic boundary
- 65m ya - K-T boundary
- It is estimated that just over a third of all species that have ever lived on Earth have died out in mass extinctions, but as 99% of species have gone extinct, two thirds have disappeared in lesser events. Extinctions happen on all scales, all the time, and an extinction of any size can happen at any time. Life on Earth is self-organizing, feeds off a flow of energy, and exists at the edge of chaos.
- Power laws and 1/f noise are always associated with large systems that are made up of many component parts (complex systems), which are open, with energy supplied from outside, if they are to arrive at the edge of chaos, a state that came to be known as self-organized criticality.
- The sandpile model - When adding more sand to a sandpile, the average amount of sand in the pile stays the same, with the same amount falling off the edge of the table as is being added from above. The system is in a state of self-organized criticality, feeding off a flow of energy carried by the new sand grains being dropped onto the pile. And, just as real earthquakes of any size can be set off by similar small triggers, adding a single grain of sand may cause one large avalanche, or a series of small avalanches, or just leave the new grain of sand delicately balanced on the pile. But the pile always remains close to critical.
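A minimal sketch of the sandpile model just described, using the standard Bak-Tang-Wiesenbaum-style toppling rule (the grid size and grain counts below are arbitrary choices): any cell holding four or more grains topples, passing one grain to each neighbour, and grains that fall off the edge are lost. Identical single-grain drops produce avalanches of wildly different sizes.

```python
import random

SIZE = 20
grid = [[0] * SIZE for _ in range(SIZE)]

def drop_grain():
    """Add one grain at a random cell and return the number of topplings it causes."""
    i, j = random.randrange(SIZE), random.randrange(SIZE)
    grid[i][j] += 1
    topplings = 0
    unstable = [(i, j)]
    while unstable:
        x, y = unstable.pop()
        if grid[x][y] < 4:
            continue
        grid[x][y] -= 4
        topplings += 1
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < SIZE and 0 <= ny < SIZE:   # grains off the edge are lost
                grid[nx][ny] += 1
                unstable.append((nx, ny))
    return topplings

avalanches = [drop_grain() for _ in range(20000)]
print("largest avalanche:", max(avalanches), "topplings")
print("drops causing no avalanche at all:", avalanches.count(0))
```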
- All complexity is built on networks, interconnections between the simple parts that make up a complex system.
- When looking at a collection of buttons on the floor, and starting to connect random pairs of buttons with threads (simulated in the sketch after this list):
- The size of the largest cluster grows slowly at first, in a linear fashion, as the number of threads connecting pairs of buttons is increased. Because most buttons are not yet connected to anything, each new thread most likely joins two isolated buttons, and there is only a small chance that it will add a button or two to the existing largest cluster.
- But when the number of threads approaches and then exceeds half the number of buttons, the size of the largest cluster increases extremely rapidly (essentially exponentially) as each new thread is added, because with most of the buttons now in clusters there is a good chance that each new connection will link one smaller cluster to the existing largest cluster. Very quickly, a single supercluster forms, a network in which the great majority of the buttons are linked in one component. Then the growth rate tails off.
- As the number of connections in the network exceeds half the number of nodes, it switches very quickly from one rather boring state to another stable state with a lot more structure, but in which there is little scope for further change. This is a phase transition.
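A sketch of the buttons-and-threads experiment (the button count and sampling points below are arbitrary): scatter N buttons, tie random pairs together one thread at a time, and track the size of the largest connected cluster. The jump happens as the number of threads passes roughly half the number of buttons.

```python
import random

N = 10000
parent = list(range(N))              # union-find structure tracking clusters of buttons

def find(i):
    while parent[i] != i:
        parent[i] = parent[parent[i]]
        i = parent[i]
    return i

def largest_cluster():
    counts = {}
    for i in range(N):
        counts[find(i)] = counts.get(find(i), 0) + 1
    return max(counts.values())

threads = 0
for target in (N // 4, N // 2, int(0.6 * N), int(0.75 * N), N):
    while threads < target:
        a, b = random.randrange(N), random.randrange(N)
        parent[find(a)] = find(b)    # tie a thread between two random buttons
        threads += 1
    print(f"{threads:5d} threads for {N} buttons -> largest cluster {largest_cluster()}")
```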
- In the primordial chemical broth, it is quite easy to see how a network of connections between chemicals can arise, an autocatalytic network which sustains itself. This, argues Kauffman, is the way life arose - as a phase transition in a chemical system involving a sufficient number of connections between the nodes (individual chemical compounds) of the network.
- If the network is insufficiently connected, there is no life; but add one or two connections and life becomes not only possible but inevitable.
- In the fully-developed human being, there are some 256 different kinds of specialized cell. In each case, only the appropriate bits of DNA (the appropriate genes) ever get "switched on" during the normal course of life, so that a liver cell is only ever a liver cell. But all the rest of the genetic information is still there.
- The only systems that behave in a way that is both complicated enough to be interesting and stable enough to be understood are those in which each node is connected to exactly two other nodes. Interesting things happen at the edge of chaos, and feedback is an essential ingredient in what makes them interesting. In those systems alone, each state cycle has a length equal to the square root of the number of nodes.
- In these kinds of systems there should be a number of different attractors approx equal to the square root of the number of nodes. With 100k nodes, there will be about 317 different attractors, with 30k nodes, 173 attractors. There are between 30k-100k genes in the human genome, and there are 256 different kinds of cell in the human body. Could it be that each type of cell represents a particular state cycle for the human genome, in which specific genes are turned on and others off?
- The number of cell types increases in proportion to the square root of the amount of DNA present in the organism, and the number of genes is proportional to the amount of DNA. The number of cell types does increase as the square root of the number of genes! There are a couple of hundred state cycles which are attractors in a network of tens of thousands of genes interacting with one another in accordance with the rules of Boolean logic.
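A toy version of the kind of network described above (an illustrative sketch, not code from the book): N on/off genes, each taking input from exactly two other genes through a randomly chosen Boolean rule. From any starting state the deterministic dynamics settle onto a repeating state cycle, and probing from different starting states turns up a modest number of short cycles.

```python
import random

N = 100
inputs = [(random.randrange(N), random.randrange(N)) for _ in range(N)]
rules = [[random.randrange(2) for _ in range(4)] for _ in range(N)]   # one truth table per gene

def step(state):
    return tuple(rules[g][state[inputs[g][0]] * 2 + state[inputs[g][1]]] for g in range(N))

def cycle_length(state, max_steps=5000):
    seen = {}
    for t in range(max_steps):
        if state in seen:
            return t - seen[state]     # length of the state cycle we have fallen onto
        seen[state] = t
        state = step(state)
    return None                        # did not close a cycle within the step budget

lengths = {cycle_length(tuple(random.randrange(2) for _ in range(N))) for _ in range(20)}
print("state-cycle lengths found from 20 random starts:",
      sorted(l for l in lengths if l is not None))
```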
The Facts of Life
- Darwinian evolution is very good at explaining relatively stable (at most slowly evolving) situations involving only a few species, as in the classic example of the Galapagos Islands. Its essence:
- Offspring resemble their parents.
- There are slight (sometimes more than slight) variations between individuals in each generation.
- There are more individuals born in each generation than survive to become adults and reproduce in their turn. The ones who survive are those that fit best with their environment (which may be changing, and in which live other species). It is the ones who "fit" this environment best, not those who are, in some abstract way, fittest, who will survive.
- There is a competition for food between members of the same species.
- Better means "leave more descendants" - this is the only criterion of evolutionary success.
- John Maynard Smith, of the University of Sussex, applied game theory to a model in which a single species splits into two types (see the sketch after this list):
- Hawks - Who behave aggressively.
- Doves - Who are more submissive.
- If everyone is a dove, nobody gets hurt and everyone has enough to eat.
- But if one hawk arises through mutation, it will thrive and produce lots of offspring who share its hawkish tendencies
- If everybody becomes a hawk, then every confrontation ends in a fight and the population could go extinct unless there is plenty of food around.
- Neither extreme is stable, and from either end there is an evolution towards the middle. The balance may be around five doves for every seven hawks, but this stable situation is not necessarily the best scenario (which would be everyone behaving as a dove)
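A sketch of how the hawk-dove balance settles down, using simple replicator dynamics. The payoff numbers are illustrative assumptions chosen so that the contested resource is worth V = 7 and a fight costs C = 12, which puts the stable mixture at V/C = 7/12 hawks, i.e. roughly seven hawks for every five doves as quoted above.

```python
V, C = 7.0, 12.0
payoff = {
    ("hawk", "hawk"): (V - C) / 2,   # share the prize on average, but risk injury
    ("hawk", "dove"): V,             # hawk takes everything
    ("dove", "hawk"): 0.0,           # dove backs down
    ("dove", "dove"): V / 2,         # split peacefully
}

p = 0.01                             # initial fraction of hawks: a rare mutant strategy
for generation in range(201):
    fit_hawk = p * payoff[("hawk", "hawk")] + (1 - p) * payoff[("hawk", "dove")]
    fit_dove = p * payoff[("dove", "hawk")] + (1 - p) * payoff[("dove", "dove")]
    mean_fit = p * fit_hawk + (1 - p) * fit_dove
    p += 0.1 * p * (fit_hawk - mean_fit)       # strategies doing better than average spread
    if generation % 50 == 0:
        print(f"generation {generation:3d}: hawks {p:.3f}")
print(f"long-run hawk fraction {p:.3f}  (V/C = {V / C:.3f})")
```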
- But even an ecosystem that seems to be in a stable state is actually constantly changing, being kept in balance only because both (or all) the components are evolving as fast as they can in order to keep up with the others. This is the Red Queen effect, from the queen who has to run as fast as she can in order to stay in the same place.
- In the case of frogs and flies, frogs evolve to get better at catching flies and flies evolve to get better at escaping frogs. Each has to run as hard as possible to stay in the same place.
- If what happens to one species affects every other species directly, it is impossible for even a pseudo-stable Red Queen situation to develop. There is utter (but deterministic) chaos in the system, because even small changes have large and in practice unpredictable effects. But in real life, what happens to one species affects only a few other species directly, although what then happens to those species may affect others, in a widening ripple of interactions.
- Foxes and grass are not directly connected, but rabbits eat grass, so what happens to foxes, by affecting rabbits, in turn has an effect on grass. More foxes means fewer rabbits which means the grass can grow longer and this will in turn affect other species that feed on the grass.
- A process of co-evolution, in which all the species involved change together when one of them changes, will naturally push complex ecosystems from either extreme towards the interesting region of self-organized criticality in the phase transition on the edge of chaos. If a group of organisms is locked into a stable strategy, a mutation involving one of the species is likely to open up the network, allowing it to evolve. Evolution by natural selection will ensure that a change which is detrimental to the species involved will be washed away over the generations; but a change that is beneficial will spread, and by spreading it will open up more networks, pushing the system towards the edge of chaos.
- On the other side of the phase transition, in the chaotic regime, the same thing happens in reverse. With the rules of the game of life changing in every generation, any group of individuals that manages to insulate itself from the chaos to some extent, by reducing the number of its connections to the outside world, will get a chance to evolve, by natural selection, into a state that takes advantage of the opportunities provided.
- Simply by each individual species acting to maximize its own evolutionary fitness, the whole ecosystem evolves towards the edge of chaos.
- Fitness landscapes - Where hills represent successful evolutionary strategies (good packages of genes working together), and valleys represent unsuccessful evolutionary strategies (bad genes). Any particular individual can be represented by a single point on the landscape, which is really a kind of phase space, with the hills as attractors. The species as a whole will be represented by a cluster of points.
- Suppose the peak the species has evolved to is just a foothill, and there are much higher peaks that it could be on nearby. There is no way for the species to cross the intervening valley floor and climb those peaks, because to do so it would first have to evolve towards a worse (less well fitted) state than the one it is in today - it has evolved up a blind alley, from which there is no escape. Fisher's landscape is not only a static one, but one in which species can only achieve a local equilibrium, and have to stay there.
- Because of co-evolution the position and size of those peaks change all the time. In the static regime, nothing changes; in the chaotic regime, the landscape changes so rapidly that nothing interesting ever gets time to happen. On the edge of chaos, the landscape is always changing, but usually slowly, and this keeps opening up new evolutionary possibilities for individual species, and groups of species, within the ecosystem.
- This is no longer solely about individual species doing the moving towards a better state:
- Genes may work as a unit to control one aspect of the workings of a cell
- Cells work together as an organ such as the liver,
- Organs work together as an individual
- Individuals work together as a species
- Several species may be linked together in an interacting network, somewhere between an ESS (evolutionarily stable strategy) and total independence.
- The cut and try of evolution isn't just to build a good animal, but to find good building blocks that can be put together to make many good animals.
- Co-evolution, not just evolution
- Any genus within a group has a constant probability of going extinct and disappearing from the fossil record. It doesn't matter how long, or how short, a time the genus has existed already, the chances of it going extinct in any chosen interval of geological time are identical. The same applies at other levels in the web of life. Species get neither better at surviving nor worse as time passes; they die at random.
- Successful species actually get better at evolving as time passes - a mutation which enables a species to adapt more quickly to its changing environment is itself likely to be successful and spread through a population. E.g. the evolution of sex, which is a way of reproducing that speeds up evolution, giving large, slow-breeding, multi-celled creatures like us a chance to keep up in the arms race in competition with small, swiftly reproducing organisms such as bacteria and parasites.
- Darwin would have been thoroughly at home with the idea of punctuated equilibrium, and would have found his ideas totally unthreatened by it.
- Changes in the boundary or interface are most likely to occur at places where the value of some appropriate parameter is at its largest or smallest - "extremal dynamics".
- The extreme cases in ecological networks are the species that are best fitted to their environment or the least fitted.
- The model naturally produces a situation in which intervals of stability are separated by mass extinctions, even though the same rules apply during the quiet intervals and the extinctions. The result is described by a power law.
- For all but a few extreme versions, the system settles down into a critical state in which the pattern of extinctions obeys a power law similar to the one in the fossil record.
- Just as tension builds up in an earthquake zone until something has to give, gradual Darwinian evolution builds up more and more tension in the ecological network, through the Red Queen effect, until something snaps and the whole network, or part of it, collapses.
- Over an extremely wide range of possibilities, whatever conditions you start out with and whatever shocks you apply to the living systems (external or internal or both), you arrive at the self-organized critical state on the edge of chaos, where even a small trigger can, on occasion, produce a very large change in the system as a whole. Life really is like that.
- 250m years ago, at the end of the Permian, a small trigger could have caused the extinction of most species around at that time, whereas a comparable small trigger affecting one continent today could not produce such a large global wave of extinctions. The physical environment and the biological environment are interconnected, much more subtly than it seems at first sight.
Life Beyond
- Living systems characteristically bring local order to their surroundings, making entropy "run backwards" locally, as long as they have an external source of energy to feed on.
- For the atmosphere of the Earth to stay in this seemingly stable state for hundreds of millions of years, "something must be regulating the atmosphere and so keeping it at its constant composition. Moreover, if most of the gases came from living organisms, then life at the surface must be doing the regulation."
- Gaia - the Earth as a self-regulating system.
- Daisyworld! - Shows how a little feedback can allow (or require) life to regulate the temperature of a planet.
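A compact sketch of Daisyworld, following the standard Watson and Lovelock (1983) formulation (the constants below are theirs, taken here as given). Black daisies warm their patch and thrive when the planet is cold; white daisies cool their patch and thrive when it is hot; as the "sun" is slowly turned up, the shifting balance of daisies holds the planetary temperature nearly steady over a wide range of luminosities.

```python
SIGMA = 5.67e-8                       # Stefan-Boltzmann constant
S = 917.0                             # Daisyworld's solar constant (W/m^2)
Q = 2.06e9                            # heat-transport parameter (K^4)
ALBEDO_BARE, ALBEDO_WHITE, ALBEDO_BLACK = 0.5, 0.75, 0.25
T_OPT, K_GROWTH, DEATH = 295.5, 0.003265, 0.3

def step(a_white, a_black, lum, dt=0.05):
    bare = 1.0 - a_white - a_black
    a_planet = bare * ALBEDO_BARE + a_white * ALBEDO_WHITE + a_black * ALBEDO_BLACK
    t_eff4 = S * lum * (1 - a_planet) / SIGMA
    def growth(albedo):
        t_local = (Q * (a_planet - albedo) + t_eff4) ** 0.25
        return max(0.0, 1 - K_GROWTH * (T_OPT - t_local) ** 2)
    a_white += dt * a_white * (bare * growth(ALBEDO_WHITE) - DEATH)
    a_black += dt * a_black * (bare * growth(ALBEDO_BLACK) - DEATH)
    return a_white, a_black, t_eff4 ** 0.25

aw = ab = 0.01
for i in range(21):                               # the sun slowly brightens
    lum = 0.6 + 0.05 * i
    for _ in range(2000):                         # let the daisy cover re-equilibrate
        aw, ab, t_planet = step(aw, ab, lum)
        aw, ab = max(aw, 0.01), max(ab, 0.01)     # keep a small seed population alive
    if i % 4 == 0:
        print(f"luminosity {lum:.2f}: temperature {t_planet - 273.15:5.1f} C, "
              f"white cover {aw:.2f}, black cover {ab:.2f}")
```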
- The Earth hasn't just failed to freeze when it was young; it has failed to fry as it has aged, and has actually maintained a remarkably uniform temperature for billions of years as the Sun has grown hotter.
- What you need to do, of course, is to steadily reduce the amount of greenhouse gases in the atmosphere as the Earth ages, to compensate for the increasing heat from the Sun.
- Microscopic lifeforms in the oceans play a key role in controlling the climate of the Earth.
- Gaia posits that the biological and physical components of our planet are part of a single network which operates in a self-organized way to maintain conditions that are broadly suitable for the existence of life, but which undergoes fluctuations on all scales (including Ice Age-Interglacial rhythms, and mass extinctions) analogous to the fluctuations that occur in the sandpile model. The Earth is home to a single living network, and the existence of that network (Gaia) would be easily apparent to any intelligent life on Mars capable of applying the Lovelock test and looking for signs of entropy reduction.
- The four most common reactive elements in the Universe are carbon, hydrogen, oxygen and nitrogen, collectively known as CHON.
- Apart from the overwhelming amount of hydrogen in our Solar System, for every 100 atoms of oxygen there are 57 atoms of carbon and 13 atoms of nitrogen, with silicon, the next most common element, trailing in with half the abundance of nitrogen. But everything except hydrogen and helium together makes up only 0.9% of the mass of the Solar System.
- The proteins of all living things on Earth are composed of various combinations of just twenty amino acids. This kind of material would have rained down upon the young planets in the early stages of the formation of the planetary system, deposited by comets swept up by the gravitational influence of the growing planets. A broth of amino acids has the capacity to organize itself into a network with all the properties of life
- It is natural for simple systems to organize themselves into networks at the edge of chaos, and once they do so it is natural for life to emerge wherever there is a suitable "warm little pond". It is part of a more or less continuous process, with no sudden leap where life begins.
- The most complex things in the known Universe are living creatures, such as ourselves. These complex systems are made from the most common raw materials known to exist in galaxies like the Milky Way. In the form of amino acids, those raw materials naturally assemble themselves into self-organizing systems, where simple underlying causes can produce surface complexity.
- Chaos and complexity combine to make the Universe a very orderly place, just right for life-forms like us. The Universe hasn't been designed for our benefit. Rather, we are made in the image of the Universe itself.