9 Answers

  1. If applied to a person's everyday life, “entropy” essentially characterizes how definite our information is; one might say it is a measure of uncertainty. If we know something for sure, the entropy of that event is low. If we are not sure about something and have little information, the entropy is high.

    The term “entropy” was coined in 1865 by the German physicist Rudolf Clausius as a measure of irreversible energy dissipation. Over the past century and a half, the concept has become firmly established in every language and has proved popular and convenient in theories across both the natural sciences and the humanities. By now it has become almost universal, since entropy is closely related to the measure and degree of uncertainty, chaos, and disorder in any system.

    In the natural sciences (physics, chemistry, biology, geography…), it is a measure of the disorder of a system consisting of a large number of individual elements;

    In mathematics, it is a measure of the complexity of an object or process;

    In economics, entropy is a quantitative indicator of disorder: a measure of the excess work expended in reaching a set financial goal, and of the share of unprofitable side processes or phenomena that accompany any activity (inefficiency of economic activity, competition, etc.);

    In the humanities (psychology and sociology), one distinguishes the entropy of personality and social entropy:

    the entropy of personality is the uncertainty of a person's state; in psychology this “mental entropy” is a disorder of consciousness in which our inner attention, focused on some goal, becomes mixed with various thoughts that distract us from that goal;

    social entropy is a measure of the deviation of a social system from its reference state, where the deviation shows up as a decline in the level of organization, the efficiency of functioning, and the pace of development of the system, and therefore in the standard of living.

    In information theory, it is a measure of the uncertainty of an experiment, process, or trial that may have different outcomes (in effect, a measure of the amount of missing information).
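    A minimal sketch of this last, information-theoretic sense (Shannon entropy, H = -Σ p·log2 p); the probability values below are invented purely for illustration:

    ```python
    import math

    def shannon_entropy(probabilities):
        """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    print(shannon_entropy([0.5, 0.5]))  # 1.0 bit: a fair coin toss
    print(shannon_entropy([0.9, 0.1]))  # ~0.47 bits: we are already almost sure
    print(shannon_entropy([1/6] * 6))   # ~2.58 bits: a fair six-sided die
    ```

    The less we know in advance (the more evenly the probabilities are spread), the higher the number that comes out.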

  2. Putting it in simple words won't quite work.

    Since the term entropy is used in the exact sciences, let us focus on its physical meaning: entropy is (up to the Boltzmann constant) the logarithm of the number of states accessible to the system, S = ln N.

    This means that the greater the number of possibilities (possible states), the higher the entropy of the system.

    Example: if you are driving a car, your options are very limited, so the entropy is low. You can't walk, drink coffee, or sleep… but you can turn on the music.

    But if you are not driving and are also free of other concerns, you have far more options, so the entropy is higher.

    The comparison is certainly rough, but it reflects the essence.

    Actually, it did turn out in simple words after all.
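    A minimal numeric sketch of the S = ln N idea (dimensionless, i.e. leaving out the Boltzmann constant); the state counts below are invented only to mirror the car analogy:

    ```python
    import math

    def boltzmann_entropy(n_states):
        """Dimensionless entropy S = ln N for a system with N equally likely accessible states."""
        return math.log(n_states)

    # Few accessible states (stuck driving the car) -> low entropy;
    # many accessible states (a completely free evening) -> high entropy.
    print(boltzmann_entropy(3))    # ~1.10
    print(boltzmann_entropy(100))  # ~4.61
    ```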

  3. Entropy, in very simple words, is the tendency toward self-destruction (chaos). According to the second law of thermodynamics, any new machine (mechanism) tends to break down over time, and this process is irreversible. The change in entropy can be positive (in closed systems) or negative (in open systems). Negative entropy is clearly demonstrated in humans: a human being, as an open system, produces negative entropy through homeostasis.

  4. In simple terms you always run the risk of losing accuracy, but here goes.

    In short, it is the degree of uncertainty. Say some dude tells you: “Guess which hand the pebble is in.” You have 2 options, left or right, i.e. the degree of uncertainty equals the probability raised to the minus first power, 1/(1/2) = 2. If the dude tells you it isn't in the left hand, he has reduced the uncertainty, and with it the entropy. If he says that the pebble might be not only in a hand but, for example, also in his underpants, then there are already 3 options, and the uncertainty has grown.

    Something like this)
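    A tiny sketch of the pebble example, measuring the uncertainty in bits as log2 of the number of equally likely options (the counts 2, 1, and 3 come straight from the story above):

    ```python
    import math

    def uncertainty_bits(n_options):
        """Uncertainty of N equally likely options, in bits: H = log2(N)."""
        return math.log2(n_options)

    print(uncertainty_bits(2))  # 1.0 bit: left hand or right hand
    print(uncertainty_bits(1))  # 0.0 bits: "not in the left hand" leaves no uncertainty
    print(uncertainty_bits(3))  # ~1.58 bits: left hand, right hand, or the underpants
    ```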

  5. Entropy characterizes how close a closed system is to equilibrium. If there is no equilibrium in the system, i.e. there are pressure differences, temperature differences, differences in electric potential, etc., then some work can still be performed in the system. The system thus has potential energy, and its entropy sits at some intermediate level. After all possible work has been done, that energy turns into internal energy, evenly distributed throughout the entire volume of the system. The system reaches complete equilibrium, no more work can be done in it, and its entropy reaches its maximum.
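    A hedged numeric sketch of this idea: while a temperature difference remains, heat flow produces entropy, ΔS = Q/T_cold - Q/T_hot, and once the temperatures equalize nothing more is produced. The heat amount and temperatures below are invented for illustration:

    ```python
    def entropy_change_heat_flow(q_joules, t_hot_k, t_cold_k):
        """Total entropy change when heat Q leaks from a hot reservoir to a cold one:
        dS = Q / T_cold - Q / T_hot, positive whenever T_hot > T_cold."""
        return q_joules / t_cold_k - q_joules / t_hot_k

    # 1000 J flowing from a 400 K body to a 300 K body raises total entropy...
    print(entropy_change_heat_flow(1000, 400, 300))  # ~0.83 J/K
    # ...and once the temperatures are equal, no further entropy is produced.
    print(entropy_change_heat_flow(1000, 350, 350))  # 0.0 J/K
    ```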

  6. How do we even know that the Creator exists?

    Professor George Zinsmeister, a former lecturer at the University of Massachusetts (United States) and a religious person, answers this question with one word: entropy. He refers to what scientists call the “second law of thermodynamics.” This law describes a well-known fact: an ordered system, such as a house, cannot appear on its own. Meanwhile, our universe consists of an infinite number of highly ordered physical and biological systems. All this testifies to the existence of a brilliant and powerful Creator.

  7. In principle, entropy always tends to zero. For example, water always tends to flow down from the heights, i.e. to where the entropy of the water is zero, roughly the same as the concept in thermodynamics.

  8. I understand it this way: entropy is the force (energy) that returns an unbalanced closed system to equilibrium. This may be the movement of a piston in a cylinder, the opening of a “shut-off” valve, or an explosion. I may be wrong; this is my personal understanding.

  9. My own explanation, as it were “from the reverse”: ENTROPY IS THE AMOUNT OF ENERGY NEEDED TO PUT THE SYSTEM BACK IN ORDER. For example, the internal combustion engine on your Priora broke down, so the price list at the repair shop is a measure of entropy for the customer!
