General theoretical conception of “entropy” and “information”
in modern science
Kazhikenova S., Sagyt E.
At present, science needs a new paradigm, because all aspects of the information structure depend on the mechanisms of real-life activity. In the present-day world, information is one of the most important resources and one of the driving forces of humanity's development. In the middle of the twentieth century two events took place that, in our opinion, largely determined the further development of science, the scientific understanding of the world, and the theoretical and practical perfection of any object of study. We mean the development of information theory, the establishment of the properties of information measures, and the beginning of the analysis of entropy-information criteria and mechanisms, for whose study synergetics draws on the latest achievements of non-equilibrium thermodynamics, information theory and the general theory of systems. The aspiration for a deterministic description of real processes inevitably leads to subjective or objective idealism and thereby introduces into the process and the result of cognition a stochasticity connected with the diversity of points of view, interpretations and versions of different authors.
The emergence of information theory is closely associated with the name of C. Shannon, who solved its central problem: finding the transmission rate that can be achieved with an optimal method of coding and decoding while keeping the probability of error as small as possible. Coding theory is distinguished by the fact that, alongside statistical methods, it uses deep algebraic and combinatorial ideas for constructing concrete codes.
The word "entropy" was first used in 1865 by the German physicist Rudolf Clausius to name a quantity characterizing the processes of conversion of thermal energy into mechanical energy. In his main scientific work, the three-volume monograph "The Mechanical Theory of Heat", R. Clausius explains in detail the expediency of introducing this quite specific new concept. Stating that "heat cannot of itself pass from a colder body to a warmer one", R. Clausius proved that there is no way of transferring heat from a colder body to a warmer one without some change in nature that would compensate for such a transfer. The German scientist explained his choice of name for the new function of state as follows: "trope" means "transformation" in Greek, while the two letters "en" were taken from the word "energy", since the two quantities are close to each other in their physical significance. Using the concept of entropy and Clausius's inequality, the second law of thermodynamics can be formulated as the law of increasing entropy of a closed system in irreversible processes: any irreversible process in a closed system proceeds in such a way that the entropy of the system increases.
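In modern notation (a standard textbook form, not quoted from Clausius's monograph), the entropy introduced by Clausius and his inequality read

dS = δQ_rev / T,   ∮ δQ / T ≤ 0,

with equality only for reversible cycles; for a closed (thermally isolated) system an irreversible process then gives ΔS > 0, which is exactly the statement above.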
However, the entropy formula suggested by the scientist did not reveal the inner mechanisms of the processes leading to the increase of entropy. This problem was solved by Ludwig Boltzmann (1872), who proposed a formula connecting entropy with the logarithm of the probability of the system's state. "General thermodynamics," L. Boltzmann wrote, "adheres to the unconditional irreversibility of all natural processes. It takes a function (entropy) whose value in any event can change only one-sidedly, for example, it can only increase. Thus, any later state of the Universe differs from any earlier state by a larger value of entropy. The difference between entropy and its maximum value, which is the driving force of all natural processes, becomes smaller and smaller. Despite the invariability of the total energy, its capacity for transformation decreases, natural events become more lifeless, and any return to the former amount of entropy is excluded." The entropy of a thermodynamic state of a system is determined through the thermodynamic probability as S = k·lnW, where k is Boltzmann's constant. This expression of entropy through the thermodynamic probability is called "Boltzmann's principle". Thus, the appearance of Boltzmann's constant k can be regarded as a consequence of the connection between the thermodynamic and statistical definitions of entropy.
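For illustration, Boltzmann's formula can be evaluated directly on a toy model (the two-level system below is our assumed example, not taken from the text): for N two-state particles, the macrostate "n particles excited" is realized by W = C(N, n) microstates, and its entropy S = k·lnW is largest for the most disordered macrostate n = N/2.

import math
from math import comb

K_B = 1.380649e-23  # Boltzmann's constant, J/K

def boltzmann_entropy(w):
    """S = k * ln(W): entropy of a macrostate realized by W microstates."""
    return K_B * math.log(w)

# Toy model: N two-state particles; W = C(N, n) microstates for n excited.
N = 100
for n in (0, 10, 50):
    W = comb(N, n)
    print(n, W, boltzmann_entropy(W))

The fully ordered macrostate (n = 0, W = 1) has zero entropy, while the entropy grows with the number of microstates, which is the content of Boltzmann's principle.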
No less interesting discoveries were made by I. Prigogine and I. Stengers in the book "Order out of Chaos", where they give convincing arguments that irreversible processes are a source of order, generating higher levels of organization. In the authors' opinion, entropy is not merely a system's unceasing slide toward a state devoid of organization; under certain conditions it becomes a progenitor of order and, ultimately, of life. The authors of the book emphasize the possibility of the spontaneous emergence of order and organization out of disorder and chaos as a result of the process of self-organization.
Besides, on the basis of an analysis of complicated physical and chemical phenomena, these scientists proved the theorem of minimum entropy production (of a decreasing rate of growth of the disorder measure) in stationary non-equilibrium processes, demonstrating the possibility of the emergence of order out of chaos under the external impacts that always take place in the real world. The order that is formed, in the shape of spatial-temporal structures and functional characteristics, is determined not only by external factors but, to a greater extent, by the properties of the complicated object itself. That is why this law is called self-organization. Later on, many studies by different authors showed the universality of this phenomenon. The evolution of open systems exchanging matter, energy and information with the environment is always accompanied by the process of self-organization.
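In its standard formulation (added here for clarity; the notation is ours, not quoted from the cited work), the theorem concerns the entropy production

P = ∫_V σ dV,   σ = Σ_k J_k·X_k ≥ 0,

where J_k are the thermodynamic flows and X_k the conjugate forces. In the linear regime near equilibrium, with symmetric kinetic coefficients J_k = Σ_l L_kl·X_l, L_kl = L_lk, the evolution satisfies dP/dt ≤ 0, so the stationary state compatible with the imposed constraints corresponds to the minimum of entropy production.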
In the works of I. Prigogine, the main postulate is the second law of thermodynamics formulated at the microscopic level, i.e. the law of increasing entropy and of time asymmetry. Besides, a new concept is introduced, the internal time, which characterizes processes in unstable dynamic systems. Many examples from physics, chemistry and biology demonstrate the constructive role of irreversible processes. For his outstanding contributions to the thermodynamics of irreversible processes, I. Prigogine was awarded the Nobel Prize.
The interaction between information and self-organization, based on the principle of maximum information entropy as applied to a wide range of non-equilibrium processes, is discussed by Hermann Haken, who is rightly considered the founder of synergetics. The author considers a synergetic approach to the problem of pattern recognition, using methods for studying various systems, from biological to quantum. He concludes that "information is the decisive element of the very existence of life" and is connected with the transmission capacity of a communication channel or with the commands through which concrete information is obtained from the environment.
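The maximum entropy principle referred to here can be stated in its standard form (going back to E.T. Jaynes; the notation below is ours, not Haken's): among all probability distributions p_i compatible with the known constraints, choose the one maximizing the information entropy,

maximize H = −Σ_i p_i·ln p_i subject to Σ_i p_i = 1 and Σ_i p_i·f_k(i) = ⟨f_k⟩, k = 1, …, m,

which yields p_i = Z⁻¹·exp(−Σ_k λ_k·f_k(i)), where the λ_k are Lagrange multipliers and Z is the normalization factor.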
Entropy and information, being expressions of two opposite tendencies in the processes of development, are reflected in the formula H + J = 1 (const). If a system evolves in the direction of order, its entropy decreases; this, however, requires purposeful effort and control. One of the founders of cybernetics, the American scientist N. Wiener, writes: "While entropy is a measure of disorganization, the information transmitted by a certain flow of messages determines the measure of organization... We are drifting along the stream, fighting against an enormous flow of disorganization which, in accordance with the second law of thermodynamics, strives to bring everything to heat death, to global balance and sameness, that is, to entropy."
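The relation H + J = 1 becomes concrete if H is read as entropy normalized by its maximum value and J = 1 − H as the information (order) stored in the system; this normalization is our illustrative assumption, since the formula above does not fix it. A short sketch in Python:

import math

def normalized_entropy(probs):
    """H = S / S_max: Shannon entropy of a distribution, normalized to [0, 1]."""
    s = -sum(p * math.log2(p) for p in probs if p > 0)
    return s / math.log2(len(probs))

# Complete disorder: H = 1, so the information J = 1 - H vanishes.
h = normalized_entropy([0.25, 0.25, 0.25, 0.25])
print(h, 1 - h)

# A partially ordered system: entropy falls, information grows.
h = normalized_entropy([0.7, 0.1, 0.1, 0.1])
print(round(h, 3), round(1 - h, 3))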
Many works devoted to this subject can be cited that are of scientific interest not only for a separate branch but for science as a whole. For example, the American mathematical physicist M. Feigenbaum, one of the pioneers of chaos theory, found universal numbers describing the structure of chaos in dynamic systems and proved that any disorder has its own internal order. Feigenbaum's theory permits the conclusion that there is a deep internal connection between chaos and order. An aperiodic, random process arises as the limit of ever more complicated structures; chaos appears as a super-complicated organization. This conclusion is general: it applies to models in ecology, hydrodynamics, chemistry, etc., i.e. to any system in which a sequence of period-doubling bifurcations occurs.
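As an illustration of this universality (a numerical sketch, not taken from Feigenbaum's original papers), the constant δ ≈ 4.669 can be estimated from the superstable parameters of the logistic map x → r·x·(1 − x); the function names and the bracketing heuristic below are our own illustrative choices.

# Estimate Feigenbaum's constant delta from the period-doubling cascade of
# the logistic map x -> r * x * (1 - x).  A superstable parameter r_n is one
# for which the critical point x = 0.5 returns to itself after 2**n steps;
# delta is the limit of the ratio of successive gaps between the r_n.

def g(r, n):
    """f_r^(2^n)(0.5) - 0.5 for the logistic map."""
    x = 0.5
    for _ in range(2 ** n):
        x = r * x * (1.0 - x)
    return x - 0.5

def bisect_root(f, lo, hi, tol=1e-13):
    """Simple bisection; assumes f(lo) and f(hi) have opposite signs."""
    flo = f(lo)
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if hi - lo < tol:
            return mid
        fmid = f(mid)
        if (flo < 0) == (fmid < 0):
            lo, flo = mid, fmid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Superstable parameters for periods 1 and 2 are known in closed form.
rs = [2.0, 1.0 + 5 ** 0.5]
delta = 4.0  # rough initial guess, refined from the data below

for n in range(2, 8):
    width = (rs[-1] - rs[-2]) / delta
    guess = rs[-1] + width          # extrapolate the next superstable point
    r_next = bisect_root(lambda r: g(r, n),
                         guess - 0.25 * width, guess + 0.25 * width)
    rs.append(r_next)
    delta = (rs[-2] - rs[-3]) / (rs[-1] - rs[-2])
    print(f"period 2^{n}: r = {r_next:.9f}, delta estimate = {delta:.6f}")

The successive estimates approach 4.6692..., and the same limit is obtained for any one-dimensional map with a quadratic maximum, which is the sense in which the number is universal.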
An outstanding theoretical physicist who made a great contribution to the development of modern statistical physics and the physics of open systems, Yu.L. Klimontovich, proved that the processes of self-organization are governed by a different law, the law of entropy decrease. In other words, the analogue of Boltzmann's H-theorem for open systems is Klimontovich's S-theorem, whose essence is the following: if "the equilibrium state", corresponding to zero values of the governing parameters, is taken as the reference point of randomness, then, as the system moves away from the equilibrium state due to a change of the governing parameter, the values of entropy referred to the given value of average energy decrease. Thus, Klimontovich's law of entropy decrease gives a key to resolving the fundamental collision of continuity and discreteness, which has not yet found its solution.
R. Hartley should unconditionally be considered a pioneer in the field of information theory. The indubitable merit of this scientist is that he was the first to introduce the concept of "information" (entropy) as a random variable and to define a measure of information. The introduction of a quantitative measure of information was the most important step toward understanding the nature of information and of anti-entropic processes, even though this measure was originally intended only for solving applied problems of communication engineering. The scientist suggested estimating the amount of information by the logarithm of the number of possible events. However, Hartley considered the possibility of non-equally probable outcomes non-essential, even though the probabilities of outcomes affect the amount of information contained in a message. He considered that "the difference between these outcomes cannot be expressed in numbers, and they are determined by psychological, meteorological or some other factors not subject to the authority of mathematics".
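Hartley's measure, the logarithm of the number of equally possible outcomes, is simple to compute; the following sketch (our illustration) uses base-2 logarithms so that the unit is the bit.

import math

def hartley_information(n_outcomes):
    """Hartley's measure: I = log2(N) bits for N equally possible outcomes."""
    return math.log2(n_outcomes)

print(hartley_information(2))    # a fair coin toss: 1.0 bit
print(hartley_information(32))   # one of 32 equally likely symbols: 5.0 bits
print(hartley_information(52))   # a card drawn from a full deck: about 5.7 bits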
Subsequent studies in the fields of physics and biology made it possible to reveal universal methods by which an interconnection between the amount of information and physical entropy can be established and, in the end, to define the essence of a new scientific interpretation of the concept of "information" as a measure of the structural order of systems that are variable in their nature. The American scientist C. Shannon showed that Hartley's point of view was erroneous: any factors, psychological, meteorological and others, can be taken into account using probability theory. Shannon began developing the ideas that became the basis of his well-known information theory. Shannon's aim was to optimize the transmission of information over telephone and telegraph lines. To solve this problem he had to formulate what information is and how its amount can be measured. In his works he defined the amount of information through entropy, a quantity known in thermodynamics and statistical physics as a measure of the disorder of a system. As the unit of information he took what was later called "a bit", i.e. the selection of one variant out of two equally probable ones. On the firm foundation of his definition of the amount of information, the scientist proved a remarkable theorem on the capacity of noisy communication channels.
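A minimal sketch of Shannon's measure (our illustration; the sample string is only an example): the entropy of a distribution p_1, …, p_n is H = −Σ p_i·log2 p_i bits, and it reduces to Hartley's log2 N when all outcomes are equally probable.

import math
from collections import Counter

def shannon_entropy(probs):
    """H = -sum(p * log2(p)), in bits; terms with p = 0 contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))   # 1.0 bit: one choice out of two equally probable
print(shannon_entropy([0.9, 0.1]))   # about 0.469 bits: unequal outcomes carry less

# Empirical entropy of the symbol frequencies of a message
message = "abracadabra"
counts = Counter(message)
probs = [c / len(message) for c in counts.values()]
print(round(shannon_entropy(probs), 2))   # about 2.04 bits per symbol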