AI Models Can Evolve According to New Natural Law
The law of universal evolution encompasses both the living and the non-living, surpassing Darwinian evolution as a fundamental principle. This revelation arrives centuries after Newton’s discovery of the law of universal gravitation and more than a century after Darwin’s revolutionary insights into evolutionary biology. In the 21st century, interdisciplinary science has taken centre stage, culminating in a discovery by a diverse team of nine scientists, including astrobiologists, mineralogists, philosophers of science, and a theoretical physicist. Their findings, presented as the “law of increasing functional information”, recast Darwin’s theory of biological evolution as a special case, much as Einstein’s theory of relativity subsumed Newton’s law of universal gravitation.
This novel natural law asserts that evolution extends beyond terrestrial life to encompass planets, stars, minerals, atoms, and other intricate systems, e.g., AI and AGI. Its core tenet is that any complex natural system evolves towards states of greater structure, diversity, and complexity.
The “Law of Increasing Functional Information” postulates that a system evolves when “numerous configurations of the system are subjected to selection for one or more functions.” Functional information quantifies a system’s state, which may manifest in various configurations, each requiring specific information to achieve a given level of “functionality.” Functionality can range from general stability relative to other states to specific functions, such as the efficiency of a particular enzymatic reaction.
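This quantification can be made concrete. In the literature that the law builds on, the functional information of a function threshold E is I(E) = −log₂ F(E), where F(E) is the fraction of all possible configurations whose functionality meets or exceeds E. The sketch below estimates this from a sample of scored configurations; the function names and the toy scores are illustrative, not taken from the paper.

```python
import math
import random

def functional_information(fitness_values, threshold):
    """I(E) = -log2(F(E)), where F(E) is the fraction of sampled
    configurations whose functionality meets or exceeds the
    threshold E. Returns infinity if no sampled configuration
    qualifies (the sample gives no upper bound)."""
    qualifying = sum(1 for f in fitness_values if f >= threshold)
    if qualifying == 0:
        return math.inf
    return -math.log2(qualifying / len(fitness_values))

# Toy example: 10,000 random "configurations" with uniform scores.
# About 10% exceed 0.9, so I(E) should come out near -log2(0.1) ≈ 3.32 bits.
random.seed(0)
scores = [random.random() for _ in range(10_000)]
print(functional_information(scores, 0.9))
```

The more demanding the function (the higher the threshold), the rarer the qualifying configurations and the more bits of functional information they embody.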
This law applies to systems comprising diverse components (atoms, molecules, cells, etc.) that natural processes repeatedly order and rearrange, generating countless structures. However, only a fraction of these configurations survives through “selection for function.” Whether a system is living or inanimate, evolution occurs when a new configuration performs effectively and enhances its functions.
In biology, Darwin primarily associated function with survival—enabling organisms to live long enough to produce viable offspring. Recent research expands on this perspective, identifying at least three types of functions in nature:
- Stability: The most stable arrangements of components (atoms, molecules, etc.) are selected for continuity.
- Energy Supply: Dynamic systems with a more consistent energy supply are favoured for preservation.
- Novelty: Evolving systems tend to explore new configurations that sometimes lead to remarkably new behaviours or characteristics (e.g., photosynthesis).
This newfound law has profound implications for the search for extraterrestrial life. If the enhancement of functionality in developing physical and chemical systems is a universal natural law, life should be a widespread outcome of planetary evolution throughout the cosmos.
Consequently, previous research by the lead authors of this study, Michael Wong and Stuart Bartlett, exploring “Asymptotic burnout and homeostatic awakening as possible solutions to the Fermi paradox”, takes on heightened significance.
Although this law primarily applies to natural systems, it offers useful guidance for improving AI. AI algorithms can follow it by seeking configurations that maximise functionality and efficiency. Machine learning, the core of modern AI, may advance as systems evolve and adapt, yielding AI that handles a broader range of tasks. The law also aligns with the evolutionary algorithms already used in AI, which promise improved problem-solving and optimisation, and with AI-powered systems that adapt and thrive in new environments. Finally, the interdisciplinary collaboration behind the law’s discovery underscores the value of cross-disciplinary approaches in AI research. In essence, despite its roots in natural systems, the “law of increasing functional information” inspires AI to become more adaptive, efficient, and proficient.
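The parallel with evolutionary algorithms can be sketched concretely. The toy genetic algorithm below (a minimal illustration, not the method of the paper) repeatedly generates variant configurations and then, in the spirit of the law, keeps only those “selected for function”:

```python
import random

def evolve(fitness, genome_len=20, pop_size=30, generations=100, mut_rate=0.05):
    """Minimal genetic algorithm: generate many configurations,
    select the best-functioning half, and refill the population
    with mutated copies of the survivors."""
    pop = [[random.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]          # selection for function
        children = [[1 - g if random.random() < mut_rate else g
                     for g in parent]             # variation by mutation
                    for parent in survivors]
        pop = survivors + children
    return max(pop, key=fitness)

# Toy "function": count of ones in the genome (the classic OneMax task).
random.seed(1)
best = evolve(sum)
print(sum(best))  # typically close to the maximum of 20
```

Each generation, configurations that perform the function better persist and seed the next round of variation, so functional information accumulates in the population over time.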