The relation between matter and information is specified by information theory, which began when Shannon and Weaver defined information as binary choices between physical states [1] (Shannon & Weaver, 1949). One choice from two states is a bit of information, two choices are two bits, and so on. Eight bits, or a byte, is eight choices, so eight electronic on/off switches can store one byte. One byte has 256 possible states, so it can store one text character. All our kilobytes and megabytes are based on choices between physical states.
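As a sketch of this arithmetic, assuming nothing beyond the formula in footnote [1], a few lines of Python make the counts explicit:

```python
import math

# I = log2(N) bits for a choice among N physical states (footnote [1]).
def bits_for_choices(n_states: int) -> float:
    return math.log2(n_states)

print(bits_for_choices(2))    # 1.0 -> one on/off switch holds one bit
print(bits_for_choices(256))  # 8.0 -> eight switches hold one byte
print(2 ** 8)                 # 256 states of a byte, enough for one text character
```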
If only physical states exist, all information depends on them, so there is no software without hardware. Information stored as choices between matter states needs matter to exist, so the claim that information also creates matter is like a daughter giving birth to her mother: impossible.
A choice of one state, which is no choice at all, is zero bits, so anything fixed one way contains no information. Hence, a physical book that only exists one way contains no information in itself. This seems wrong, but it isn’t, as hieroglyphics that no one can read do indeed contain no information. Information only emerges from a book when it is decoded, based on a decoding context. For example, the decoding context for this text is the English language. A different decoding context, like reading every 10th letter, gives different information from the same text.
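A small sketch illustrates this, where the sample sentence and the “every 10th letter” rule are only stand-ins for any decoding context: the same stored text yields different information under different readings.

```python
# The same stored text, read under two decoding contexts.
text = "Information only emerges from a book when it is decoded."

as_english  = text          # context 1: read it as English prose
every_tenth = text[9::10]   # context 2: keep only every 10th character

print(as_english)
print(every_tenth)  # a different, much smaller message from the same signal
```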
The decoding context of a physical signal is how many physical states it was chosen from. It defines the amount of information sent, so one electronic pulse sent down a wire can represent one bit, or one byte if read as the ASCII character “1”, or many bytes if read as the first word in a dictionary. Information isn’t just the physical signal, but also its decoding context. If it weren’t so, data compression couldn’t store the same data in a smaller signal, but it can, by better decoding. In general, the information in a physical signal is undefined until its decoding context is known. The transfer of information between a sender and receiver requires an agreed decoding context, so a receiver can only extract the information a sender put in a signal if they know how to decode it.
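The compression point can be sketched with Python’s standard zlib module (the sample bytes are illustrative): the compressed signal is physically smaller, yet the receiver recovers the same data because both sides share the decoding context.

```python
import zlib

original = b"the same data, the same data, the same data, the same data"

compressed = zlib.compress(original)      # a smaller physical signal
recovered  = zlib.decompress(compressed)  # shared decoding context: DEFLATE

print(len(original), len(compressed))     # the compressed form is smaller here
assert recovered == original              # yet no information is lost
```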
Given the above definition of information, processing can be defined as the act of changing it by making new choices. Writing a book is then processing, as it can be written in many ways, and reading a book is also processing, as it can be read in many ways. Processing lets us save data in a physical state and reload it later, given a decoding context. Information stored in a matter state is then static, while processing as an activity is dynamic.
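A minimal save-and-reload sketch, assuming a throwaway file name and UTF-8 as the agreed decoding context, puts the static/dynamic distinction in code: the stored bytes are the static matter state, and encoding or decoding them is the processing.

```python
note = "Processing makes new choices; storage keeps old ones."

with open("note.bin", "wb") as f:        # save: fix choices in a physical state
    f.write(note.encode("utf-8"))

with open("note.bin", "rb") as f:        # reload the static state later
    reloaded = f.read().decode("utf-8")  # agreed decoding context: UTF-8

assert reloaded == note
```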
[1] Mathematically, information I = log2(N) bits, for a choice among N options.