In classical thermodynamics, the first field in which entropy was introduced, S is a state function of a system in thermodynamic equilibrium; by quantifying the unavailability of a system's energy to produce work, it is introduced together with the second principle of thermodynamics. On the basis of this definition, we can say, in an explanatory but not rigorous way, that when a system moves from an ordered equilibrium state to a disordered one, its entropy increases; this fact indicates the direction in which a system evolves spontaneously.
Entropy and disorder
The concept of entropy is rather complex, and to fully understand its meaning at least a basic knowledge of thermodynamics and statistical mechanics is necessary. In fact, there are at least two rigorous definitions of entropy: a macroscopic definition, provided by thermodynamics, and a microscopic definition, provided by statistical mechanics.
However, it is possible to give a simplified explanation of entropy by interpreting it as the "degree of disorder" of a system: an increase in the "disorder" of a system is associated with an increase in entropy, while a decrease in the "disorder" of a system is associated with a decrease in entropy. It should be made clear, however, that disorder is relative; for this reason the simplified explanation is not equivalent to the exact one, but it serves to convey the concept.
Systems that can exhibit different degrees of disorder include metallic materials. In fact, they can assume the following structures:
- crystalline structure (ordered): the atoms are arranged in an orderly manner; a crystalline structure is composed of many identical "cells" repeated throughout space; in this case we speak of "long-range order";
- polycrystalline structure (partially ordered): the material contains several "crystals" (ordered regions); in this case we speak of "short-range order";
- amorphous structure (disordered): the atoms are arranged in a completely disordered way; there is neither short-range nor long-range order.
The disorder of metallic structures also increases in the presence of so-called "crystalline defects" (such as the inclusion of atoms of another type, or the absence of an atom from a lattice site), whose presence increases the entropic content of the material.
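As a rough illustration (not part of the source's argument), the configurational entropy contributed by lattice defects can be estimated with Boltzmann's formula S = k_B ln W, where W = N!/(n!(N−n)!) counts the ways of placing n vacancies among N lattice sites; the numbers below are purely illustrative:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K


def vacancy_entropy(n_sites: int, n_vacancies: int) -> float:
    """Configurational entropy S = k_B * ln(N! / (n! * (N - n)!)),
    computed with log-gamma to avoid overflowing factorials."""
    ln_w = (math.lgamma(n_sites + 1)
            - math.lgamma(n_vacancies + 1)
            - math.lgamma(n_sites - n_vacancies + 1))
    return K_B * ln_w


# A perfect crystal admits a single arrangement, so its
# configurational entropy is zero; adding vacancies increases
# the number of arrangements and hence the entropy.
print(vacancy_entropy(10_000, 0))   # 0.0
print(vacancy_entropy(10_000, 10) > vacancy_entropy(10_000, 1))  # True
```

This matches the statement above: the more defective the lattice, the larger the entropic content.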
A fundamental property, also (improperly) called the postulate of entropy, states that in an isolated system the entropy S never decreases and, during an ordinary irreversible process, increases. The proof is as follows: consider a system isolated both mechanically and thermally which, due to an internal perturbation, passes from a state 1 to a state 2. Since entropy is, by definition, a state function, its variation does not depend on the path followed but only on the initial and final states, so it is possible to conceive a reversible process that takes the system back from 2 to 1.
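This argument can be made quantitative with the Clausius inequality applied to the cycle formed by the irreversible path 1→2 (adiabatic, since the system is isolated) and the reversible return path 2→1:

$$\oint \frac{\delta Q}{T} \le 0
\quad\Rightarrow\quad
\underbrace{\int_1^2 \frac{\delta Q}{T}}_{=\,0\ \text{(isolated)}} \;+\; \int_2^1 \frac{\delta Q_{\mathrm{rev}}}{T} \le 0
\quad\Rightarrow\quad
S_1 - S_2 \le 0,$$

hence $S_2 \ge S_1$: the entropy of an isolated system cannot decrease.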
Energy and entropy
Assuming that the whole universe is an isolated system, that is, a system that cannot exchange matter or energy with the outside, the first and second principles of thermodynamics can be summarized as follows:
"the total energy of the universe is constant and the total entropy increases continuously until it reaches an equilibrium,"
a statement valid for any isolated system.
This means not only that energy can neither be created nor destroyed, but also that it cannot be completely transformed from one form to another without a part being dissipated in the form of heat.
If, for example, a piece of coal is burned, its energy is conserved, converted into the energy contained in carbon dioxide, sulfur dioxide and the other combustion residues, as well as into heat. Although no energy has been lost in the process, we cannot reverse the combustion and recreate the original piece of coal from its products.
The second principle of thermodynamics can, therefore, be rewritten as follows:
Whenever a certain amount of energy is converted from one form to another, there is a penalty: the degradation of part of that energy into heat. This part will no longer be usable to produce work.
The state in which entropy reaches its maximum value and no energy is left available to perform work is called the state of equilibrium. For the whole universe, conceived as an isolated system, this means that the progressive conversion of work into heat (by the principle of the increase of total entropy), given the finite mass of the universe, will eventually lead to a state in which the entire universe is at uniform temperature: the so-called heat death of the universe.
Entropy characterizes the direction of any real transformation as an irreversible transformation: even when returning from a final state to one identical to the initial state in temperature, volume, pressure or other parameters, as happens continuously in the cycles of a heat engine, at least one physical variable would differ from the point of departure: entropy (which inevitably increases).
Every real transformation is an irreversible transformation, because entropy increases; conversely, the hypothesis of ideality is equivalent to the hypothesis of zero entropy change.
History and definition of entropy
The concept of entropy was introduced in the nineteenth century, in the context of thermodynamics, to describe a characteristic (whose generality was first observed by Sadi Carnot in 1824) of all the systems then known, in which it was observed that transformations occurred spontaneously in one direction only: the one towards greater disorder.
In particular, the word "entropy" was introduced for the first time by Rudolf Clausius in his Abhandlungen über die mechanische Wärmetheorie (Treatise on the mechanical theory of heat), published in 1864. In German, Entropie derives from the Greek ἐν en, "inside", and τροπή tropé, "change", "turning point" (on the model of Energie, "energy"): for Clausius it indicated where the energy supplied to a system ends up. Clausius explicitly referred to the relationship between the motion internal to the body or system and internal energy or heat, a link expressing the great insight of the Enlightenment age that heat must somehow be related to the mechanical motion of the particles inside the body. Entropy was in fact defined as the sum of the small (infinitesimal) increments of heat, divided by the absolute temperature during the change of state.
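Clausius's definition recalled above can be written compactly as

$$\mathrm{d}S = \frac{\delta Q_{\mathrm{rev}}}{T},
\qquad
\Delta S = \int_1^2 \frac{\delta Q_{\mathrm{rev}}}{T},$$

where $\delta Q_{\mathrm{rev}}$ is the infinitesimal heat exchanged along a reversible path between states 1 and 2, and $T$ is the absolute temperature at which the exchange occurs.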
To clarify the concept of entropy we can present some examples:
- Place a drop of ink in a glass of water: it is observed that, instead of remaining a drop more or less separate from the rest of the liquid (which would be a completely ordered state), the ink begins to spread and, after a certain time, a uniform mixture is obtained (a completely disordered state). It is common experience that, while this process occurs spontaneously, the reverse process, separating the water and the ink, requires external energy.
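The ink-in-water example can be made quantitative with the ideal entropy of mixing, ΔS = −nR Σ xᵢ ln xᵢ; the sketch below (mole numbers chosen purely for illustration) shows that mixing two distinguishable species always produces a positive entropy change:

```python
import math

R = 8.314  # gas constant, J/(mol K)


def mixing_entropy(moles_a: float, moles_b: float) -> float:
    """Ideal entropy of mixing for two species:
    dS = -n R (x_a ln x_a + x_b ln x_b), always positive."""
    n = moles_a + moles_b
    x_a, x_b = moles_a / n, moles_b / n
    return -n * R * (x_a * math.log(x_a) + x_b * math.log(x_b))


# Equimolar mixing of 1 mol total gives dS = R ln 2:
print(round(mixing_entropy(0.5, 0.5), 2))  # 5.76 J/K
```

Reversing the process (un-mixing) would require an equal and opposite entropy change of the mixture, which is why separation never occurs spontaneously.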
- Imagine a perfume contained in a full bottle as a set of point-like molecules moving with a certain velocity derived from the temperature of the perfume. As long as the bottle is corked, that is, isolated from the rest of the universe, the molecules are forced to remain inside and, having no room (the bottle is full), they remain fairly ordered (liquid state). When the bottle is uncorked, the molecules on the surface of the liquid begin to detach from the others and, colliding randomly with each other and with the walls of the bottle, escape from it, dispersing outside (evaporation). After a certain time, all the molecules will have escaped and dispersed. Even if by chance some molecule re-enters the bottle, the overall system is now disordered, and the thermal energy that set the phenomenon in motion is dispersed and therefore no longer recoverable (a dynamic equilibrium exists).
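The evaporation in the perfume example is itself accompanied by a large entropy increase. As a hedged numerical sketch, the standard relation ΔS = ΔH_vap / T_b applies at the boiling point, where vaporization is a reversible phase change at constant temperature; the values below are textbook approximations for water, not data from this article:

```python
# Entropy of vaporization: dS = dH_vap / T_b.
DH_VAP_WATER = 40660.0  # J/mol, enthalpy of vaporization of water (approx.)
T_BOIL_WATER = 373.15   # K, normal boiling point of water

ds_vap = DH_VAP_WATER / T_BOIL_WATER
# Roughly 109 J/(mol K): the vapor phase is far more disordered
# than the liquid, consistent with the example above.
print(round(ds_vap, 1))
```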
The concept of entropy gained great popularity in the nineteenth and twentieth centuries, thanks to the wide range of phenomena it helps to describe, to the point of leaving the purely physical domain and being adopted by the social sciences, signal theory, theoretical computer science and economics. However, it should be kept in mind that there is a class of phenomena, the so-called non-linear phenomena (for example, chaotic phenomena), for which the laws of thermodynamics (and therefore entropy) must be thoroughly revised and no longer have general validity.