Evolution: From Blasphemy to Consensus (Complexity: A Guided Tour)

The second law of thermodynamics states that entropy in an isolated system always increases. This principle resonates with our perception of history and our own experiences. Yet, life stands as a defiant counterexample, exhibiting increasing complexity. The crucial question is: who or what drives this complexity? Charles Darwin’s theory of evolution by natural selection provides a compelling answer, challenging traditional beliefs and marking a significant scientific breakthrough.

Pre-Darwinian Notions of Evolution: The concept of evolution, meaning gradual change, was long overshadowed by the belief in divine creation. Early thinkers like Georges-Louis Leclerc, Comte de Buffon, and Erasmus Darwin proposed that life forms evolved from common ancestors, but they lacked a clear mechanism. Jean-Baptiste Lamarck suggested that organisms could pass acquired characteristics to their offspring, an idea that was later refuted by empirical evidence.

Origins of Darwin’s Theory: Charles Darwin’s voyage on the H.M.S. Beagle and his subsequent observations led him to propose evolution by natural selection. Influenced by the geologist Charles Lyell and the economists Thomas Malthus and Adam Smith, Darwin developed a theory in which organisms with advantageous traits survive and reproduce, passing those traits to their offspring. Despite initial reluctance due to potential religious backlash, Darwin published “On the Origin of Species” in 1859, establishing a comprehensive and well-supported theory of evolution.

Mendel and the Mechanism of Heredity: Gregor Mendel’s experiments with pea plants uncovered the principles of genetic inheritance, contradicting Lamarck’s ideas. Mendel discovered that traits are inherited through discrete factors (genes) in pairs, with dominant and recessive alleles. His work, initially overlooked, later became fundamental to genetics and supported Darwin’s theory by providing a mechanism for inheritance.

The Modern Synthesis: The integration of Darwin’s theory with Mendelian genetics, facilitated by mathematical biologists like Ronald Fisher, J.B.S. Haldane, and Sewall Wright, formed the Modern Synthesis. This framework unified the concepts of natural selection and genetic inheritance, emphasizing gradual evolution through small variations. It provided a comprehensive explanation of evolutionary processes and established population genetics as a key discipline.

Challenges to the Modern Synthesis: In the 1960s and 1970s, paleontologists like Stephen Jay Gould and Niles Eldredge challenged the Modern Synthesis, proposing punctuated equilibria—long periods of stasis interrupted by rapid evolutionary changes. They argued that natural selection alone could not explain all evolutionary phenomena, emphasizing the roles of historical contingency and biological constraints. Molecular evolution studies, such as Motoo Kimura’s neutral theory, also questioned the exclusive role of natural selection.

Despite ongoing debates and challenges, the fundamental principles of Darwinism remain widely accepted. Evolution through natural selection is recognized as a key force shaping life on Earth, though modern discoveries continue to refine and expand our understanding of evolutionary processes. The interplay of natural selection, genetic drift, and historical contingencies forms a complex and dynamic picture of life’s evolution.

Molecular Biology and the Evolution of Genetics

Challenges to the Modern Synthesis have emerged from molecular biology discoveries, reshaping biologists’ views on evolution. This section provides foundational knowledge in genetics, essential for understanding these shifts.

Cellular Basics: Since the early 1800s, it has been known that all living organisms are composed of cells. By the late 1800s, chromosomes had been identified within cell nuclei, though their function remained unclear. During ordinary cell division (mitosis), chromosomes duplicate, ensuring each new cell receives a complete set. Meiosis, a different process, creates germ cells (sperm and eggs) with half the usual chromosome number; fertilization then combines two such half-sets, and recombination during meiosis shuffles parental chromosomes, producing genetic variation.

Early Genetic Discoveries: In 1902, Walter Sutton proposed that chromosomes are the carriers of heredity, linking them to Mendel’s factors. Thomas Hunt Morgan’s experiments on fruit flies validated Sutton’s hypothesis. By the 1920s, chromosomes were known to contain RNA and DNA, but DNA’s role as the hereditary substrate was confirmed only in the mid-1940s.

DNA and Heredity: Key questions about DNA’s role in trait determination, replication, and variation were answered in the 1950s and 1960s. The discovery of DNA’s double-helix structure by James Watson and Francis Crick in 1953 was pivotal. By the 1960s, the genetic code was deciphered, linking DNA sequences to amino acids and proteins, and elucidating mechanisms of DNA replication and mutation.

Mechanics of DNA: The phenotype, an organism’s traits, is determined by proteins, which are chains of amino acids coded by DNA sequences (genes). DNA is composed of nucleotide bases (A, C, G, T), forming double strands where A pairs with T, and C with G. Gene expression involves transcription (DNA to mRNA) and translation (mRNA to proteins).

Transcription and Translation: During transcription in the cell nucleus, RNA polymerase unwinds DNA, creating mRNA, which then travels to the cytoplasm. Here, ribosomes read mRNA codons, matched by tRNA anticodons carrying corresponding amino acids, forming proteins.
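The two-step flow described above can be sketched in code. This is a hypothetical, minimal Python sketch: it uses only a four-entry subset of the real 64-codon genetic code and treats transcription as simple base-pairing against the DNA template strand, ignoring introns, promoters, and other real-world machinery.

```python
# Tiny subset of the real 64-codon table (illustrative only)
CODON_TABLE = {"AUG": "Met", "UUU": "Phe", "GGC": "Gly", "UAA": "STOP"}

def transcribe(dna_template):
    """Build mRNA complementary to the DNA template strand (A->U, T->A, C->G, G->C)."""
    pairs = {"A": "U", "T": "A", "C": "G", "G": "C"}
    return "".join(pairs[base] for base in dna_template)

def translate(mrna):
    """Read the mRNA three bases (one codon) at a time, stopping at a STOP codon."""
    protein = []
    for i in range(0, len(mrna) - 2, 3):
        amino_acid = CODON_TABLE.get(mrna[i:i + 3])
        if amino_acid is None or amino_acid == "STOP":
            break
        protein.append(amino_acid)
    return protein

mrna = transcribe("TACAAACCGATT")
print(mrna)             # AUGUUUGGCUAA
print(translate(mrna))  # ['Met', 'Phe', 'Gly']
```

The ribosome’s codon-by-codon matching with tRNA anticodons is what the three-base window in `translate` stands in for.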

DNA Replication: Before mitosis, DNA strands unwind and replicate, ensuring each new cell has a complete DNA set. Errors in replication lead to mutations, contributing to genetic variation.
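Replication with occasional copying errors can be illustrated with a toy simulation. This is a sketch under simplifying assumptions: the `error_rate` value is illustrative, not a real biological mutation rate, and only single-base substitutions are modeled.

```python
import random

def replicate(strand, error_rate, rng=random):
    """Copy a DNA strand; with probability error_rate, each base is miscopied
    to one of the three other bases (a point mutation)."""
    bases = "ACGT"
    copy = []
    for base in strand:
        if rng.random() < error_rate:
            copy.append(rng.choice([b for b in bases if b != base]))
        else:
            copy.append(base)
    return "".join(copy)

random.seed(1)
parent = "ACGT" * 2500  # 10,000-base toy genome
child = replicate(parent, error_rate=1e-3)
mutations = sum(a != b for a, b in zip(parent, child))
print(mutations)  # on the order of 10 point mutations expected
```

With `error_rate=0` the copy is perfect; any positive rate injects the variation that selection then acts on.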

Self-Referential Nature of DNA: DNA encodes the machinery for its own replication and protein synthesis, exemplifying a self-referential system that mirrors Turing’s concepts in computation.

Genetic Discoveries and Nobel Prizes: By the mid-1960s, geneticists had unraveled the fundamental processes of DNA transcription, translation, and replication. Notable achievements include Watson, Crick, and Wilkins’ Nobel prize in 1962 for DNA structure, and Khorana, Holley, and Nirenberg’s Nobel in 1968 for decoding the genetic code.

While the major mechanisms of genetics and evolution were thought to be understood by the 1960s, ongoing discoveries continue to reveal the complexity of these processes, challenging and expanding our understanding of evolution.

This book explores complexity, but defining the term rigorously remains challenging. Key questions like “Is a human brain more complex than an ant brain?” and “Has complexity in biological organisms increased over time?” intuitively seem answerable with “yes.” However, achieving a universally accepted definition of complexity has proven elusive.

Panel Discussion on Complexity: In 2004, Mitchell organized a panel at the Santa Fe Institute’s Complex Systems Summer School. Panelists, including renowned scientists like Doyne Farmer, Jim Crutchfield, and Stephanie Forrest, offered diverse definitions of complexity, leading to disagreements and illustrating the term’s ambiguity. The students’ frustration highlighted the difficulty in forming a unified science of complexity, which remains a task for future generations.

Historical Context: Science often progresses without universally accepted definitions of central terms. For instance, Newton lacked a clear definition of force, and geneticists still debate the precise meaning of a gene. Astronomers don’t fully understand dark matter and dark energy, and psychologists struggle with defining concepts like “idea” or “concept.” Scientific terms are refined over time as understanding deepens.

Measuring Complexity: Physicist Seth Lloyd proposed three dimensions to measure complexity:

  1. How hard is it to describe?
  2. How hard is it to create?
  3. What is its degree of organization?

Lloyd identified about forty measures of complexity, derived from various fields, to address these dimensions. Let’s explore some proposed definitions using the example of comparing the human genome with the yeast genome.

Human vs. Yeast Genome:

  • Human genome: ~3 billion base pairs, ~25,000 genes, with only 2% coding for proteins.
  • Yeast genome: ~12 million base pairs, ~6,000 genes.

Complexity as Size: Size can measure complexity. By base pairs, humans are 250 times more complex than yeast. By genes, humans are only four times more complex. However, amoebas have 225 times as many base pairs as humans, and mustard plants have a similar number of genes to humans, indicating size alone isn’t a reliable complexity measure.
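The ratios quoted above follow directly from the figures in the bullet list:

```python
# Figures from the text (approximate)
human_bp, human_genes = 3_000_000_000, 25_000
yeast_bp, yeast_genes = 12_000_000, 6_000

print(human_bp / yeast_bp)        # 250.0  -> "250 times" by base pairs
print(human_genes / yeast_genes)  # ~4.17  -> "about four times" by genes
```

The two ratios disagree by a factor of 60, which is itself a hint that raw size is a shaky yardstick.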

Complexity as Entropy: Shannon entropy measures average information content. A highly ordered sequence (e.g., “A A A…”) has zero entropy, while a random sequence has maximum entropy. However, entropy doesn’t capture complexity intuitively, as random sequences would appear more complex than the structured human genome.

Complexity as Algorithmic Information Content: Proposed by Kolmogorov, Chaitin, and Solomonoff, this measure defines complexity by the shortest computer program that can generate an object’s description. A repeating sequence (e.g., “A C A C…”) has low algorithmic information content, while a random sequence has high content. Like entropy, this measure can misclassify random objects as complex.
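True algorithmic information content is uncomputable, but compressed size is a commonly used rough proxy: a string a short program can generate tends to compress well. A sketch using Python’s standard-library `zlib` (the proxy is an assumption of this illustration, not part of the formal definition):

```python
import random
import zlib

def compressed_size(s):
    """Length in bytes of the zlib-compressed string: a crude stand-in
    for the length of the shortest program generating s."""
    return len(zlib.compress(s.encode(), 9))

ordered = "AC" * 5000  # repeating "A C A C..." pattern, 10,000 bases

random.seed(0)
rand = "".join(random.choice("ACGT") for _ in range(10_000))

# The repeating sequence compresses to far fewer bytes than the random one
print(compressed_size(ordered) < compressed_size(rand))  # True
```

As the text notes, the random string scores highest under this measure, illustrating how algorithmic information content misclassifies randomness as complexity.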

Effective Complexity: Murray Gell-Mann’s measure combines regularity and randomness. Effective complexity is the algorithmic information content of an entity’s regularities. Both highly ordered and random objects have low effective complexity, while the human genome, with many regularities, has high effective complexity. However, identifying regularities can be subjective.

Logical Depth: Charles Bennett’s measure considers how difficult an object is to construct. Ordered or random sequences are easy to construct, but a viable DNA sequence for an organism is not. Logical depth measures the steps a Turing machine needs to generate an object, aligning with the idea that complex objects result from extensive computation or processes. Practical measurement challenges remain.

Thermodynamic Depth: Seth Lloyd and Heinz Pagels proposed measuring the total thermodynamic and informational resources required for an object’s construction. For example, the human genome’s depth includes all evolutionary events leading to modern humans. Like logical depth, this measure is theoretically appealing but practically challenging due to the complexity of tracing all relevant events.

Computational Capacity: Stephen Wolfram suggested that complex systems can perform universal computation. However, Charles Bennett argued that complexity should measure the system’s behavior coupled with its inputs, not just its computational potential.

Statistical Complexity: Jim Crutchfield and Karl Young’s measure quantifies the minimum information needed to predict a system’s future behavior based on its past. This measure is related to Shannon entropy but focuses on creating a predictive model, distinguishing between ordered, random, and complex systems.

Fractal Dimension: Fractals, like coastlines, exhibit self-similarity at different scales. Fractal dimension quantifies this self-similarity, measuring how detail changes with magnification. Real-world objects, though not perfect fractals, can be approximated by fractal dimensions, providing a way to measure ruggedness or “cascade of detail.”
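For exactly self-similar fractals, the dimension has a closed form: an object made of N copies of itself, each scaled down by a factor r, has dimension D = log N / log(1/r). A short sketch (real coastlines need box-counting estimates instead, since they are only statistically self-similar):

```python
import math

def similarity_dimension(copies, scale):
    """Dimension of an exactly self-similar fractal built from `copies`
    pieces, each scaled down by `scale`: D = log(copies) / log(1/scale)."""
    return math.log(copies) / math.log(1 / scale)

# Koch curve: 4 copies at 1/3 scale -> ~1.26, between a line and a plane
print(similarity_dimension(4, 1 / 3))  # 1.2618...
# A plain line segment: 3 copies at 1/3 scale -> exactly 1
print(similarity_dimension(3, 1 / 3))  # 1.0
```

The non-integer value for the Koch curve is the “cascade of detail” the text describes: more ruggedness than a line, less than a filled plane.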

Degree of Hierarchy: Herbert Simon proposed measuring complexity by the degree of hierarchy, where systems are composed of nested subsystems. Evolutionary biologist Daniel McShea developed a hierarchy scale for biological organisms, showing that maximum hierarchy increases over evolutionary time. This measure, while insightful, involves subjective judgments about parts and levels.

No single measure can capture all aspects of complexity. Each proposed measure highlights different dimensions, reflecting the multifaceted nature of complexity. The diversity of measures indicates that complexity encompasses various interacting elements, requiring a multidimensional approach for a comprehensive understanding.

References:

Complexity: A Guided Tour – Melanie Mitchell

"A gilded No is more satisfactory than a dry yes" - Gracian