
The Hard Problem of Consciousness Part 2: Exploring Information Theory in the Context of Consciousness




To prove the existence of consciousness, let's start with the theses given by Chalmers. He proposed two axioms: consciousness is structurally coherent with awareness, and it is organizationally invariant (it does not matter in which carrier the logical structure is realized). If we built a silicon-based human brain, its awareness would be the same as ours.


This assumption simplifies the task of understanding consciousness. We can state that if a phenomenon occupies the same place in a conscious logical system, its awareness will be the same. For example, the color red is perceived as equally red by everyone as long as there are no external disagreements about colors: if two people call one object red, then they perceive it the same way. Such a conclusion is unfounded, since the subjective component of the experience is unverifiable; nevertheless, it will help us build further conclusions.


Chalmers proposed to treat consciousness as one aspect of information processing. It is a very interesting approach, somewhat similar to attempts to explain consciousness using quantum mechanics. Information is a useful idea, and information theory is a well-developed field that lets us tackle many practical problems; however, we still do not have a rigorous definition of information. Although many have attempted to define it, the term "information" remains intuitively understood. We all have a rough idea of what it is; we can even measure the amount of information, and that is enough for many purposes, but not for forming a strong connection with a theory of consciousness.
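
To make the claim that "we can measure the amount of information" concrete, here is a minimal sketch of Shannon's entropy measure in Python. The function name and sample messages are my own illustrative choices, not anything from Chalmers; the point is only that the formula quantifies information without ever saying what information "is".

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Average information per symbol, in bits (Shannon, 1948)."""
    counts = Counter(message)
    total = len(message)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

# A varied source carries more bits per symbol than a repetitive one,
# even though neither measurement tells us what information ultimately "is".
print(shannon_entropy("abcdabcd"))  # 2.0 bits per symbol
print(shannon_entropy("aaaaaaab"))  # ~0.54 bits per symbol
```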


Let's sort out the concept of information. From a human point of view, any object can be represented as a set of information. Scientists are engaged in obtaining information about everything, and any interaction can be pictured as an exchange of information. Take, for example, the interaction between wood and fire. Information is exchanged between the piece of wood and the fire, during which the system strives for an informational equilibrium, which in this case takes a thermodynamic form: the temperatures of the fire and the wood tend to equalize, and if the resulting temperature exceeds the wood's ignition point, the wood catches fire and burns.
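
As a toy illustration of that thermodynamic reading (not a claim from the original argument), the sketch below treats the fire and the wood as two bodies with assumed heat capacities that exchange heat until their temperatures equalize; the wood ignites only if the common temperature exceeds a rough, assumed ignition threshold.

```python
def equilibrium_temperature(t1: float, c1: float, t2: float, c2: float) -> float:
    """Final common temperature of two bodies exchanging heat with no losses."""
    return (c1 * t1 + c2 * t2) / (c1 + c2)

WOOD_IGNITION_POINT_C = 300.0  # rough figure; real ignition depends on moisture, species, etc.

t_final = equilibrium_temperature(t1=600.0, c1=5.0,   # fire / hot gases (assumed values)
                                  t2=20.0,  c2=2.0)   # piece of wood (assumed values)
print(round(t_final), t_final > WOOD_IGNITION_POINT_C)  # ~434, True -> the wood ignites
```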


The information in the example of wood and fire is the temperature difference. However, information is not an absolute value: it consists of many parameters that depend on the external environment and on internal structure. To calculate a single value, such as whether combustion occurs, we would have to determine its conditions, for example the amount of oxygen in the environment and the calorific value of the wood (and hence the wood's organic composition, its moisture content, and so on). I would even venture that the exact value cannot be calculated, since too many factors must be taken into account. In practice, it is enough to know the temperature to within a degree; we must admit that it is impossible to account for the entire volume of data when solving a physical problem, so we round. However, when describing a physical process in informational terms, we cannot assume that any rounding occurs. Physical logic appears to be perfectly exact, with no errors.


If rounding of physical information did occur, we would be living in a terribly chaotic world. The exchange of information happens involuntarily. This can be pictured as an empty glass into which water is poured: we may not know the volume of the glass, but once we pour enough water into it, it will begin to overflow. There is no natural information here; there are only certain characteristics of matter.


Consider an indivisible unit of matter, a quantum. When isolated from the rest of the world, it remains latent, showing no signs of manifestation. However, introduce an influence (temperature, sunlight, or an object falling onto it) and a consequential event becomes inevitable. Something will either change or remain unchanged, and this outcome signifies a characteristic of the matter. This reaction is the only way in which matter responds to a specific type of influence. We can assign a term to this characteristic, such as radioresistance or melting point, and measure it in other objects. By identifying underlying patterns, we can group the quanta of matter. Then we can say that these are not really quanta, that they have components, because this is the only way to explain why some pieces of matter behave one way while others behave differently. And so this process continues over an extended period of time. It is crucial to emphasize that we must approach the studied object as either a quantum or a collection of quanta; otherwise, logical reasoning is not effective. In this study, we define and use information as a universal characteristic.


Going back to our experiment with the piece of wood: we set it on fire, and it burns. This phenomenon is only a manifestation of a characteristic and, by itself, leads nowhere. To form an idea of a process, we need to compare it with another. Let's try to set fire to a stone: it does not burn. Water: the fire goes out. Now we can form a scale of combustion and introduce the concept itself. At this stage, we have obtained information. We can describe the combustion process and compare different materials. We have a universal code designating a group of phenomena, and we can compose a message for Shannon's communication scheme. Before, we could not get any information about what was happening because we had no basis for it. Once such a basis appears, the characteristics become universal and turn into information.
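
As a purely illustrative sketch of that last step: once sender and receiver share the "scale of combustion", each observation maps to a code word, and a sequence of observations becomes a message in Shannon's sense. The category names and the prefix code below are assumptions made for the example.

```python
# Shared code agreed on in advance: the "basis" that turns characteristics into information.
CODE = {"burns": "0", "does_not_burn": "10", "extinguishes_fire": "11"}
DECODE = {bits: name for name, bits in CODE.items()}

observations = ["burns", "does_not_burn", "extinguishes_fire", "burns"]  # wood, stone, water, wood
message = "".join(CODE[obs] for obs in observations)  # sender encodes

decoded, buffer = [], ""
for bit in message:                                   # receiver decodes the prefix code
    buffer += bit
    if buffer in DECODE:
        decoded.append(DECODE[buffer])
        buffer = ""

print(message)                  # "010110"
print(decoded == observations)  # True: the shared code carried the information
```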


We have now proposed one way of using the terms "characteristic" and "information". Interesting conclusions can be drawn from these theses. In physics, if we consider matter not as a collection of quanta but as a continuous field with relative (not absolute!) fractional interaction characteristics, it becomes possible to take a fresh look at the logical conclusions of existing theories. Instead of clustering quanta, we can turn to the types of interaction results, and then the set of characteristics of space becomes much smaller. Such an approach would allow us to describe physical phenomena more effectively.


Bibliography:

  1. Shannon, Claude E. "A Mathematical Theory of Communication." 1948.

  2. Turing, Alan M. "On Computable Numbers, with an Application to the Entscheidungsproblem." 1936.

  3. Boltzmann, Ludwig E. "Kinetic Theory of Gases." 1872.

  4. Rényi, Alfréd. "On Measures of Entropy and Information." 1961.

  5. Clausius, Rudolf J. E. "Über das Wesen der Wärme verglichen mit Licht und Schall." 1857.
