Role of entropy as a measure of chaos

 

Kazhikenova S., Sagyt E.

 

The scientist introduces the concept of entropy as a measure of the uncertainty of knowing something, and a message as a means of increasing knowledge. For convenience in calculating the entropy of messages transmitted in a binary code, Shannon replaced the natural logarithm ln used in thermodynamics with a logarithm to the base two:

 

$$H = -\sum_{i=1}^{n} p_i \ln p_i \,,$$

$$H = -\sum_{i=1}^{n} p_i \log_2 p_i \,.$$

 

He proposed a formula with which it is possible to measure the amount of information in events that occur with different probabilities.

Thanks to this formula, scientists gained the ability to measure the information contained in code characters of very different content. Moreover, since logarithms are chosen as the measure of information, the information contained in each code character of a message can be summed, and thus the amount of information contained in the complete message can be measured. Shannon’s conception made it possible to construct a fundamental theory that has been widely recognized, is used in practice, and continues to develop to this day.
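As a purely illustrative sketch (in Python; the function names and the sample string are ours and are not taken from the original exposition), the calculation can be carried out as follows: symbol probabilities are estimated from the message, the per-symbol entropy H = -∑ p_i log₂ p_i is computed, and the per-symbol information is summed over the whole message.

```python
import math
from collections import Counter

def shannon_entropy(probabilities):
    """Shannon entropy H = -sum(p * log2(p)), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

def message_information(message):
    """Estimate symbol probabilities from the message itself, then return
    (entropy per symbol, total information of the whole message) in bits."""
    counts = Counter(message)
    n = len(message)
    probs = [c / n for c in counts.values()]
    h = shannon_entropy(probs)
    return h, h * n

if __name__ == "__main__":
    h, total = message_information("abracadabra")
    print(f"per symbol: {h:.3f} bits, whole message: {total:.3f} bits")
```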

Thus, information and entropy characterize a complex system from the point of view of order and chaos: if information is a measure of order, then entropy is a measure of disorder. This scale stretches from maximum entropy, i.e. chaos and complete uncertainty, to the highest level of order. Hence, the degree of order is determined by the level of information at which a system is found.

Today entropy is a concept widely used in different fields of science: in the mathematical theory of metric spaces, in control theory, in ecology, in statistical physics, in information theory, etc.

In the present work we rely on the terminological apparatus of information theory, in accordance with which we understand (within the framework of the tradition founded by C. Shannon):

- by “entropy” – a measure of the uncertainty (unpredictability) of a system, characterized by the possibility of selecting, as the next element, any one of a finite number of variants;

- by “information” – the elimination of uncertainty in the system through the actual selection of one variant, a selection that is unpredictable in relation to the previous states of the system;

- by “redundancy” – a measure of the prior predictability of selecting the next element, because of which the selection itself does not produce new information (it is predetermined in advance); moreover, the predictability of the selection turns out to be conditioned by the previous state of the system (a quantitative sketch is given after this list);

- in the triad “entropy – information – redundancy” the first and the third elements are opposed to each other because of their different relation to information; moreover, the former is understood not as a substantive but exclusively as a combinatorial category.
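To make the triad concrete, the following minimal Python sketch (the function name and the sample text are ours, chosen only for illustration) estimates the per-symbol entropy of a sequence and its redundancy, taken here in the usual Shannon sense as R = 1 - H/H_max: the closer R is to one, the more predictable the selection of the next element and the less new information that selection yields.

```python
import math
from collections import Counter

def entropy_and_redundancy(text, alphabet_size=None):
    """Empirical per-symbol entropy of a text and its redundancy
    R = 1 - H / H_max, where H_max = log2(alphabet size)."""
    counts = Counter(text)
    n = len(text)
    h = -sum((c / n) * math.log2(c / n) for c in counts.values())
    k = alphabet_size if alphabet_size is not None else len(counts)
    h_max = math.log2(k)  # entropy of a source with no redundancy at all
    return h, 1 - h / h_max

if __name__ == "__main__":
    h, r = entropy_and_redundancy("to be or not to be")
    print(f"entropy: {h:.3f} bits/symbol, redundancy: {r:.1%}")
```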

Information processes taking place in the material world, in living nature and in human society are studied by almost all scientific disciplines. The growing complexity of research problems has made it necessary to involve scientists of different specialties in solving them. It is worth noting that information was originally studied by two complex fields of science: cybernetics and informatics. At present, information theory has gone beyond the limits of cybernetic processes and widened its sphere of application. Today information theories play a key role in all spheres of human life, and methods of obtaining, storing and expanding knowledge and information are increasingly becoming the main type of human activity.

N. Wiener wrote: “Information is a designation of the content obtained from the external world in the process of our adaptation to it, and of the adaptation of our senses to it. The process of obtaining and using information is the process of our adaptation to the contingencies of the environment and of our vital activity in that environment. The needs and complexities of present-day life place far greater demands on this information process than ever before, and our press, our museums, scientific laboratories, universities, libraries and textbooks must satisfy the demands of this process, or else they will not fulfil their purpose. To live effectively means to live with adequate information. Thus, communication and control are bound up both with the very essence of human existence and with the life of man in society.”

Very significant in this definition is the indication that the content changes in the course of its formation under the influence of various extra-linguistic factors. In this connection it is expedient to turn to the etymology of the word “information”, divided into two parts: “in-formation”, which means the acquisition of form, the shaping of a certain content by an external action.

In philosophy the concept of information is considered a general scientific cognitive category: “Information is a result of the interaction of material systems. It exists objectively in any material system. The sources of information are real material systems, processes, phenomena and events. The highest, most developed form of information is the representation in human consciousness of the organization of material systems, or of the parameters of real processes in the surrounding world, in the form of concepts, ideas, various theories, models, descriptions and schemes of these systems and processes.” Information is the reflection of the external world with the help of signs or signals. The informational value of a message consists in the new data it contains (in the reduction of ignorance). Information is so broad a concept that it cannot be explained in a single phrase. The word has different meanings in engineering, in science and in everyday life. But it is beyond doubt that information is a fundamental scientific concept, alongside matter and energy.

The general scientific category of information contains three main components:

- data, knowledge – the quantitative component;

- order (negentropy) – the qualitative component;

- diversity (reflection of the multiplicity of the real world by means of a certain sign system) – the multiple component.

The quantitative component forms the basis for the existence of information, the qualitative component indicates the final aim of the information process, and the multiple component projects the process of informing onto various situational levels.

Within the theory of reflection, information is considered an ordered reflection and noise a disordered one. Studying the opposition “information – noise”, the well-known scientist A.A. Soshalskiy comes to the conclusion that information is a phenomenon of a personal character, whereas noise is an impersonal phenomenon; information is subject, to a greater or lesser extent, to laws, whereas noise is a spontaneous phenomenon not subject to any laws; information is a structured phenomenon, noise is amorphous; information is a negentropic phenomenon, noise is entropic.

Information entropy is a measure of the randomness of information, the uncertainty of the occurrence of any symbol of the primary alphabet. In the absence of information losses it is numerically equal to the amount of information per symbol of the transmitted message.

Some scientists believe that information is one of the kinds of negentropy. We, however, agree with A.A. Soshalskiy, who considers that negentropy is only a measure of information, while entropy is a measure of the lack of information. Within the framework of our study it is very important to establish the mutual dependences between the concepts of information, entropy and negentropy, as this allows information capacity to be regarded as something that reduces entropy and helps to order the elements of a given system.

Modern studies devoted to the methodology of entropy and information are, in our opinion, based on the works of the doctor of philology M.Yu. Oleshkov. He considers that “entropy is understood as a measure of the uncertainty (unpredictability) of text development in the discourse process, characterized by the possibility of selecting, as the next stage, one stage out of a number of variants. The entropy indicator quantitatively characterizes the level of informational order of the text as a system: the greater it is, the less ordered the system (= text) and the greater its divergence from the ‘ideal’ development. Thus, entropy is a function of state: any state of the system can be assigned a certain entropy value”.

This concept of entropy is also productive in a wider philosophical aspect, because it poses a very important problem: the problem of the value of scientific knowledge, which can change quantitatively (the accumulated sum of data) and qualitatively – knowledge that constantly reduces the growing entropy.

Thus, summing up the above and taking into account the three variants of entropy in modern science, we shall try to determine the meaning of the concept of “entropy” (the corresponding formulas are sketched after the list):

- in thermodynamics (according to Clausius) it is a function of state: entropy is proportional to the amount of energy in the system that is unavailable and cannot be converted into work;

- in statistical physics (according to Boltzmann) it is a measure of chaos, disorder and uniformity of molecular systems;

- in information theory (according to Shannon) it is a measure of the uncertainty of information transmitted through a communication channel: entropy quantitatively characterizes the uncertainty of the transmitted signal and is used to calculate the amount of information.
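For reference, the three variants can be written compactly (standard textbook forms; the notation is ours):

$$dS = \frac{\delta Q_{\text{rev}}}{T} \ \text{(Clausius)}, \qquad S = k \ln W \ \text{(Boltzmann)}, \qquad H = -\sum_{i=1}^{n} p_i \log_2 p_i \ \text{(Shannon)}.$$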

The well-known physicist S.M. Korotayev said: “It is difficult to find concepts more common to all sciences (not only the natural ones) and at the same time sometimes bearing a tinge of mystery than entropy and information.” Calling these concepts universal and general-scientific, fruitfully used in many fields, the scientist expressed the hope that readers would not only see the possibilities of applying the entropic approach in their own field but also develop it further.

Indeed, when considering the concept of information, it is impossible not to touch upon the other, opposite concept, i.e. entropy. The two concepts, tied into an organic whole in the middle of the last century thanks to C. Shannon, are now inseparable.