Monday, February 18, 2013

Information as Reality

I read an interesting book last year by an Oxford physicist, Vlatko Vedral, entitled Decoding Reality: The Universe as Quantum Information.  In this book, Professor Vedral postulates that the ultimate building block of reality is not atoms or subatomic particles, but information, or rather, bits of information.  Now this is not "information" as we use the term in daily conversation, but rather information understood in terms of computer bits ("1s and 0s"), in the way that it was defined by the engineer Claude Shannon when he developed methods for reliably transmitting messages over telephone lines.  Shannon's "information theory" dealt with probabilities and the concept of entropy: the tendency of all things in the universe toward a state of randomness and disorder.

Such entropic forces intrude upon everyday communications, whether over traditional telephone wires and radio waves or across modern fiber-optic systems, in the form of interference, signal degradation, and static that threaten to corrupt or even destroy the information being transferred.  Shannon's insight was that the information content of a message is roughly proportional to its complexity, where "complexity" can be understood as the improbability that the particular string of signs or symbols conveying the message could have been produced through a chance combination of those signs or symbols.  Hence, messages carrying more information require longer strings of such signs and symbols to be effectively conveyed.  Shannon devised methods for encoding messages (before they were sent) and decoding them (after they were received) in a manner that efficiently and effectively preserved their information content, protecting them from line "noise" and degradation that could undermine communications, cause messages to be misinterpreted or misread, or render them unintelligible altogether.
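Shannon's measure can actually be sketched in a few lines of code.  The following Python snippet is my own illustration, not anything from the book: it computes the entropy of a message, the average number of bits of information carried per symbol.  A message built from one endlessly repeated symbol carries almost no information, while a message whose symbols are all equally likely carries the most.

```python
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    """Average information per symbol, in bits: H = -sum(p * log2(p))."""
    counts = Counter(message)
    total = len(message)
    return -sum((c / total) * log2(c / total) for c in counts.values())

# A highly redundant message carries little information per symbol...
print(shannon_entropy("aaaaaaab"))   # low: about half a bit per symbol
# ...while one whose eight symbols are all equally likely carries the most.
print(shannon_entropy("abcdefgh"))   # 3.0 bits per symbol
```

This is why, as noted above, conveying more information requires longer strings of symbols: at a fixed number of bits per symbol, more bits simply demand more symbols.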

Vedral draws a parallel between Shannon's processes (which are now a standard feature of all communication systems) and the transmission of information among living organisms, through genes and chromosomes.  Noting the redundancy in the building blocks of the DNA "messages" transmitted through the genes, Vedral contends that this is a form of "error correction" comparable to the methods that Shannon developed for preserving messages.  It ensures that the characteristics of living organisms are preserved for future generations, and it is a method of preserving information far more effective and enduring than, for example, the methods that human civilizations have used in attempting to preserve their own "messages" across the spans of time: through written records, stored artifacts, or even stone monuments.  (One must question, however, just what kind of universal "message" is being conveyed through this process of information preservation through natural selection.  It seems to be: "I have lived, and I have outdone my competitors in avoiding predators, finding and consuming prey or other food, and surviving long enough to produce offspring.")
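The kind of redundancy Vedral has in mind can be illustrated with a toy example of my own, far cruder than anything Shannon devised or the genome employs: a triple-repetition code, in which each bit is sent three times and recovered by majority vote, so that a single corrupted copy does not destroy the message.

```python
def encode(bits: str) -> str:
    """Protect a bit string by repeating each bit three times."""
    return "".join(b * 3 for b in bits)

def decode(coded: str) -> str:
    """Recover each bit by majority vote over its three copies."""
    triples = [coded[i:i + 3] for i in range(0, len(coded), 3)]
    return "".join("1" if t.count("1") >= 2 else "0" for t in triples)

message = "1011"
sent = encode(message)                 # "111000111111"
corrupted = sent[:4] + "1" + sent[5:]  # noise flips one copy of the second bit
print(decode(corrupted))               # prints "1011": the message survives
```

The price of this protection is length: the encoded message is three times as long as the original, which is just the trade-off between redundancy and efficiency that both engineered codes and (on Vedral's account) the genetic code must negotiate.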

But Vedral goes even further.  He observes that the random mutations that occur in genes and chromosomes across generations do not all constitute a disruptive "noise" that merely degrades the quality of the original messages.  Rather, those mutations which are favored by natural selection are preserved in the descendants that bear them, and in fact lead to an increase in the general complexity of life on earth (e.g., through greater variety within and among different species, and through the enhanced physical attributes and behavior patterns developed by members of these species).  There is a law in physics, the Second Law of Thermodynamics, which states that the processes in a closed system tend to move it toward a greater state of randomness and disorder, a state of higher entropy.  But it has long been observed that the evolution of life on earth - and of human civilization - has constituted a movement in the opposite direction, a sort of "swimming against the tide", or "negentropy", as it has sometimes been termed.  Vedral equates this increase in biological complexity to the creation of "biological information from no prior biological information", and sees in it an example of "creation ex nihilo".  This last observation is an important one, for it serves as one of the springboards for Vedral's boldest claim: that if one accepts that information is the fundamental building block of existence, then one can understand how the universe itself came into existence - how something came out of nothing.

I don't pretend to understand all of the aspects of Vedral's theories, particularly when he proceeds to tie them in with the labyrinthine concepts of modern physics, such as quantum theory.  The linking of information to the beginning of the universe reminds me of a conjecture that my best friend shared with me a long time ago about why he thought the universe was created.  Imagine, he said, that there was a God, but no created universe.  What would God think about?  There would be no perceptions or experiences of an external reality.  There would be no memories to reflect upon, since literally nothing had happened.  There would be no emotions - at least not in any substantial sense - since there was nothing tangible to love, to hate, or to react to in any way.  And there would not even be a meaningful sense of identity or self-concept, because there would be nothing else in existence that could call God by a name, perceive God, love God, hate God, fear God, or worship God.  There would just be this Entity existing in a void.  My friend concluded that God would be compelled (perhaps too strong a word to link to a perfect being not bound by causality - "inspired" might be more appropriate, if even that is allowed) to create something else.

While there is something tantalizing about Vedral's theories, there is also something hollow in his concept of information as the basis of reality.  For the information described by Vedral and Shannon can be characterized in terms of content and complexity, but not really in terms of quality.  There is nothing in their theories that really gets at the heart of why information matters.  I actually think that my friend was closer to the mark - to borrow a phrase from the philosopher Jean-Paul Sartre: "Existence precedes essence."  There is no information in a meaningful sense without an entity, or entities, that use the information.  There has been much talk in science in recent decades about complexity and chaos, but what may appear complex to one organism - an elaborate architectural design to its human admirers, for example - may appear chaotic to other organisms - such as the birds that sit upon it.  For a sufficiently intelligent organism, with a suitable breadth and depth of perception and memory, it could very well be that nothing would appear chaotic, while for others, most of the surrounding environment may be the sheer embodiment of chaos.  Chaos, then, is complexity uncomprehended.  And information - no matter how complex - is only as real as the living beings that use it.