A new direction in physics, where "information" dominates every process and overrides the rules of physics

The universe is not static, nor is it "simple". It is not merely sliding toward heat death; it is spontaneously forming ever more complex structures – not only in living things, but in non-living systems as well.

In traditional physics, the second law of thermodynamics defines the "arrow of time": entropy always increases. That systems tend toward disorder is one of physics' foundational assumptions. Now a group of interdisciplinary scientists has proposed a different kind of arrow: complexity increases over time – not in opposition to rising entropy, but in parallel with it. It is an evolutionary view built on "functional information" that describes why the universe self-organizes from simple particles into stars, minerals, cells, languages, technologies, and even consciousness.

This is not simply Darwinian natural selection restated. The theory asserts that as long as a complex system has some "selection mechanism", whether the system is living or non-living, its functional information will rise over time. Evolution is not the preserve of living things alone, but a universal process of the physical universe itself.

In 2003, biologist Jack Szostak introduced the concept of "functional information" to measure how rare it is for a molecule to fulfill a specific function, such as binding to a target molecule: the fewer the configurations that can do the job, the higher the functional information. The concept was originally applied to RNA aptamers and has since been extended to algorithmic simulations, mineral evolution, element synthesis, and even the evolution of language and technology.
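In Szostak's formulation (later developed with Hazen and colleagues), the functional information for achieving a function level E_x is I(E_x) = -log2 F(E_x), where F(E_x) is the fraction of all possible configurations whose degree of function meets or exceeds E_x. The sketch below estimates that quantity in Python; the sequence scores and threshold are invented for illustration, and F is estimated from a sampled pool rather than the full configuration space.

```python
import math

def functional_information(function_values, threshold):
    """Estimate I(E_x) = -log2 F(E_x), where F(E_x) is the fraction of
    sampled configurations whose measured function meets or exceeds
    the threshold E_x.  Returns None if no configuration qualifies."""
    hits = sum(1 for v in function_values if v >= threshold)
    if hits == 0:
        return None
    return -math.log2(hits / len(function_values))

# Invented binding scores for a small sampled pool of RNA-like sequences.
scores = [0.10, 0.20, 0.05, 0.90, 0.15, 0.30, 0.85, 0.12]
print(functional_information(scores, threshold=0.8))  # -log2(2/8) = 2.0 bits
```

The rarer the qualifying configurations, the larger the value, which is exactly the sense in which low interchangeability means high functional information.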

Today, Robert Hazen, a mineralogist at the Carnegie Institution for Science in Washington, D.C., and Michael Wong, an astrobiologist, have pushed that thread to its limit. Rather than hunting for the ultimate recipe for the origin of life, they propose a universal framework for the evolution of systems – functional selection that drives irreversible growth in complexity. The emergence of life is only one rung on that ladder. In their view, any system in the universe that can perform a function and can be "chosen" from among the possibilities participates in this evolution of complexity.

This sounds like a generalization of Darwinism, but it is not the blind trial and error of "survival of the fittest." The focus is not on adaptability but on the performance of functions itself. A mineral is "chosen" if its stable crystal structure makes it more common in the Earth's crust. A chemical combination is "chosen" if a simple reaction path makes it occur frequently. This is a universal selection mechanism that transcends the biosphere.

In this framework, biological systems are simply higher-order manifestations of complex self-organization. Mineralogical studies, for example, show that the number of mineral species has increased dramatically over Earth's history and that their distribution has grown more intricate. This is not because "the physical mechanisms changed", but because selection among possible pathways allowed certain structures to be preserved and reproduced. Like DNA replication, the process requires no "consciousness", yet it still accumulates information.

More critically, functional information is not a static quantity fixed within a closed system, but a context-dependent, goal-oriented, dynamic one. An RNA fragment that binds a specific molecule may carry high functional information in its current environment, yet in another environment that binding ability may be useless and the information's value vanishes. The evolutionary process is precisely the continual "creation" of new contexts. What matters is not the information in itself, but whether it is activated and whether it participates in the execution of a function.
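A toy way to see this context dependence is to score the same pool of configurations in two hypothetical environments and apply the same estimator as above. All numbers are invented; the point is only that identical sequences can carry bits of functional information in one context and none in another.

```python
import math

def functional_information(function_values, threshold):
    # Same estimator as in the earlier sketch: I(E_x) = -log2 F(E_x).
    hits = sum(1 for v in function_values if v >= threshold)
    return None if hits == 0 else -math.log2(hits / len(function_values))

# The same eight hypothetical sequences scored in two different environments,
# e.g. binding assays against two different target molecules (numbers invented).
scores_env_A = [0.92, 0.10, 0.15, 0.88, 0.05, 0.20, 0.12, 0.18]
scores_env_B = [0.30, 0.25, 0.35, 0.28, 0.31, 0.27, 0.33, 0.29]

print(functional_information(scores_env_A, threshold=0.8))  # 2.0 bits
print(functional_information(scores_env_B, threshold=0.8))  # None: no function, no information
```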

This is precisely the core characteristic of life: systems don't just adapt to the rules, they rewrite them. Language, culture, and technology are all new dimensions that leap beyond the original rules. Using artificial-life simulations, Szostak and Hazen found that as evolution proceeds, the functional information of the evolving algorithms does not increase linearly but jumps abruptly, a pattern highly consistent with the "major transitions" of biological evolution: eukaryotic cells, multicellularity, nervous systems, and human language.

These transitions correspond to an expansion of "phase space". In physics, phase space is the set of all possible states of a system. For living systems, every jump in functional information opens up an unprecedented region of phase space. Stuart Kauffman calls this "the next floor": from the first floor you cannot predict the layout of the second until you actually get there.

Ricard Solé and Paul Davies go further, arguing that the unpredictability of these transitions to complexity is, at root, a Gödelian incompleteness: no closed system of rules can predict its own entire future. Living systems are self-referential and self-defining, so their evolution cannot be predicted in a closed way.

This is why the evolution of life cannot be modeled as a closed computational system. Davies and colleagues point out that life differs from stars or galaxies, which, however complex, are not self-referential. Once a living system has cognition, experimentation, and language, it begins to "internally simulate" its own evolution, bringing about a higher-order leap. This is a cognitively driven evolution of complexity, no longer a matter of external environmental selection alone, but of internal goal design.

Thus, functional information is not just a measure of complexity; it opens up a new level of causation. Just as Galileo's laws of falling bodies do not, by themselves, explain the flight of a bird, once complexity is sufficient, a system's behavior is no longer determined by the underlying physics alone but by higher-level function. This is the biological version of "causal detachment": new causal laws emerge and override the rules of physics.

Hazen has proposed that information may be one of the universe's fundamental physical quantities, alongside mass, energy, and charge. But this is not Shannon information, nor entropy; it is functional information: context-dependent, goal-oriented, subject to selection.

Sara Walker and Lee Cronin took a different route, proposing "assembly theory", which measures complexity with an assembly index: the minimum number of steps needed to build an object from simpler parts. Both approaches share the same goal: using structural complexity to reveal the trajectory of selection. These theories all point to an evolutionary trajectory in natural systems that goes beyond the known laws of physics and can give rise to new levels of causal law and system behavior.
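Assembly theory is usually stated for molecules, where the assembly index is the minimum number of bond-forming joins needed to build a structure while reusing previously built fragments. The sketch below is a toy string analogue of that idea, not the Cronin group's implementation: it brute-forces the minimum number of concatenation steps, so it is only practical for very short strings.

```python
def assembly_index(target, max_steps=12):
    """Minimum number of join (concatenation) operations needed to build
    `target` from its individual characters, where any previously built
    fragment may be reused.  Exact but brute-force (iterative deepening),
    so only practical for short strings."""
    if len(target) <= 1:
        return 0
    # Only contiguous substrings of the target can appear in a minimal pathway.
    substrings = {target[i:j]
                  for i in range(len(target))
                  for j in range(i + 1, len(target) + 1)}
    basics = frozenset(target)  # single characters are the free building blocks

    def can_build(pool, steps_left):
        if steps_left == 0:
            return False
        for a in pool:          # try every ordered join a + b
            for b in pool:
                new = a + b
                if new == target:
                    return True
                if new in substrings and new not in pool:
                    if can_build(pool | {new}, steps_left - 1):
                        return True
        return False

    for steps in range(1, max_steps + 1):
        if can_build(basics, steps):
            return steps
    raise ValueError("max_steps too small for this target")

# Reuse lowers the index: a repetitive string needs fewer joins than a
# random string of the same length.
print(assembly_index("ABABAB"))  # 3: AB, then ABAB, then ABABAB
print(assembly_index("ABCDEF"))  # 5: nothing to reuse
```

In assembly theory, objects that have a high assembly index and yet are found in abundance are taken as evidence of selection, since undirected chemistry is unlikely to mass-produce deeply nested constructions.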

This is the convergence of systems science, information theory, and evolutionary biology. It aims to explain stellar nucleosynthesis, mineral evolution, the formation of language, technological expansion, and cognitive development within a single framework. The interdisciplinary convergence is reminiscent of the early days of thermodynamics, which began with the efficiency of steam engines and culminated in a deep understanding of entropy, the arrow of time, and the fate of the universe.

Of course, the idea is not without controversy. Many physicists ask: if you cannot calculate it precisely, is it science? Hazen retorts that we cannot exactly solve the gravitational dynamics of the asteroid belt either, yet we still navigate probes through it. Functional information does not need to be quantified exactly, so long as it yields valid approximations of trends, structures, and transition paths.

In astrobiology, the theory is already showing potential applications. If, for example, organic molecules on a planet are found to be distributed far from thermodynamic equilibrium, that distribution likely reflects functional selection at work. This could become a key signature for identifying extraterrestrial life: not how many molecules there are, but whether there are "chosen" molecules.
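One hedged way to picture such a test: compare the observed relative abundances of a set of molecules with the Boltzmann distribution their formation energies would predict at equilibrium, and flag large deviations. The molecules, energies, and the use of a KL divergence below are illustrative assumptions, not an established detection pipeline.

```python
import math

def kl_divergence(p, q):
    """KL divergence D(p || q) in bits between two discrete distributions."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def equilibrium_abundances(energies, temperature):
    """Boltzmann weights for the given formation energies (in units of kT)."""
    weights = [math.exp(-e / temperature) for e in energies]
    total = sum(weights)
    return [w / total for w in weights]

# Hypothetical numbers: relative formation energies of four molecules and
# their observed relative abundances in a planetary sample.
energies = [0.0, 1.0, 2.0, 3.0]       # arbitrary units of kT
observed = [0.05, 0.10, 0.15, 0.70]   # heavily skewed toward a "costly" molecule
expected = equilibrium_abundances(energies, temperature=1.0)

# A large divergence from the equilibrium expectation is the kind of signal
# the article describes: molecules that are "chosen" rather than merely probable.
print(f"D(observed || equilibrium) = {kl_divergence(observed, expected):.2f} bits")
```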

The broader implication is that complexity is not accidental and life is not an isolated event. Once a selection mechanism exists, complexity grows as irreversibly as entropy does. And when complexity crosses a certain threshold, new rules, new goals, and new leaps emerge. Human civilization may be only a waypoint in the universe's evolution of complexity.

The next question to ask is not "Is there extraterrestrial intelligence?" but rather: where in the universe is the "next level" of this functional-information transition? Are other systems already pushing their way up toward it?