QR2.1.2 Information Theory

The proposed quantum network can be understood in terms of information theory. Modern information theory began with Shannon and Weaver, who measured the information in a choice by the number of options it is made from, expressed as a power of two (Shannon & Weaver, 1949). By this logic, a choice between two physical options is one bit, a choice between 256 options is 8 bits (one byte), and a choice of one option, which is no choice at all, is zero bits. Processing was then defined as changing information, i.e. the event of making a new choice.
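As a minimal sketch of this definition (the code is illustrative, not from the source), the information of a choice among N equally likely options is log2(N) bits:

```python
import math

def bits_of_information(options: int) -> float:
    """Information, in bits, of one choice among equally likely options (Shannon)."""
    return math.log2(options)

print(bits_of_information(2))    # 1.0 -> one bit: a choice between two options
print(bits_of_information(256))  # 8.0 -> one byte: a choice among 256 options
print(bits_of_information(1))    # 0.0 -> one option is no choice, so zero bits
```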

If a bit is a choice between two physical states, then a physical state can register a bit, but by information theory it holds no information at all in itself, as it is just one way. A book can “contain” information but in itself has zero information, because it is fixed in one way. This seems wrong but it isn’t, as hieroglyphics that no one can decipher indeed contain no information. Writing only gives information when a reader decodes it, and that depends entirely on the decoding context, e.g. reading every 10th letter of a book, as in a secret code, gives a different message with different information.
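A hypothetical sketch of this decoding-context point (the text and rules below are invented for illustration): the same fixed string yields different messages, and so different information, under different decoding rules.

```python
# One static string; two decoding rules; two different "messages".
text = "Come home; all is well. No demand. End every year safe."

reading_1 = text        # rule 1: read every character as ordinary English
reading_2 = text[::10]  # rule 2: read every 10th character, as in a secret code

print(reading_1)  # the plain message
print(reading_2)  # a different character sequence, hence different information
```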

It follows that the amount of information “in” a physical state depends on the assumed number of physical states it was chosen from. Hence one electronic pulse sent down a wire can represent one bit, or, as the ASCII character “1”, one byte, or, as the first word in a dictionary, say Aardvark, many bytes. That the information in a physical message depends on the decoding context is what lets data compression store the same data in a smaller signal using a more efficient encoding. If information were purely physical, data compression couldn’t pack the same data into a physically smaller signal! In general, the information in a physical signal is undefined until a reader decodes it. Only when sender and receiver agree on the encoding-decoding context can they agree on the information in a signal, and thus communicate.
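The compression point can be illustrated with a rough sketch using Python's standard zlib module on some invented redundant data: under a more efficient agreed encoding, the same data rides on a physically smaller signal.

```python
import zlib

# Highly redundant data: 1000 bytes of repeating "AB".
data = b"AB" * 500

# A more efficient encoding, agreed between sender and receiver.
compressed = zlib.compress(data)

print(len(data), len(compressed))          # 1000 bytes vs a few dozen bytes
assert zlib.decompress(compressed) == data # the receiver recovers the same data
```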

Information only emerges from a physical state when an observer is added. When we store information as a set of physical states in a book or database, the same applies: those states in themselves contain no information at all. But when one writes a book in, say, English, the writer provides the encoding context, and a reader who uses the same context can extract the same information.

Since processing is defined as creating information, writing a book is processing, as one can write it in many ways, and reading a book is also processing, as one can read it in many ways, but the book itself, being just one way and no other, is empty of information. So information doesn’t exist without processing, which consists of dynamic events, not static physical states. By Shannon and Weaver’s definition, processing is the dynamic means by which static information is encoded and decoded.
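A loose illustration of this encode/decode framing (the message and the shared UTF-8 context are assumptions for the example): the stored bytes are a static state, while writing and reading them are the processing events.

```python
message = "Aardvarks forage at night."

stored = message.encode("utf-8")     # "writing": a processing event that fixes one physical state
recovered = stored.decode("utf-8")   # "reading": a processing event that re-creates the information

print(stored)     # the static state: bytes fixed one way and no other
print(recovered)  # the information re-emerges only through the decoding event
```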
