The third law of thermodynamics states that the entropy of a system approaches a constant value as the temperature approaches absolute zero. The entropy of a system at absolute zero is typically zero, and in all cases is determined by the number of distinct ground states (states of minimum energy) available to it. Specifically, the entropy of a pure crystalline substance (perfect order) at absolute zero is zero; this holds provided the perfect crystal has only one state of minimum energy.
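Statistical mechanics makes the dependence on ground states explicit through Boltzmann's relation S = k·ln W, where W counts the microstates available to the system. A minimal Python sketch of the residual entropy at absolute zero:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def residual_entropy(num_ground_states: int) -> float:
    """Boltzmann's relation S = k_B * ln(W): the entropy at absolute
    zero depends only on the number W of degenerate ground states."""
    return K_B * math.log(num_ground_states)

# A perfect crystal with a single ground state has zero entropy at 0 K:
print(residual_entropy(1))       # 0.0
# A system with degenerate ground states retains residual entropy:
print(residual_entropy(2) > 0)   # True
```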
The second law of thermodynamics says that the entropy of any isolated system never decreases. Isolated systems spontaneously evolve toward thermal equilibrium, the state of maximum entropy of the system. More simply put: the entropy of the universe (the ultimate isolated system) can only increase, never decrease.
A simple way to think of the second law of thermodynamics is that a room, if not cleaned and tidied, will invariably become more messy and disorderly with time – regardless of how careful one is to keep it clean. When the room is cleaned, its entropy decreases, but the effort to clean it has resulted in an increase in entropy outside the room that exceeds the entropy lost.
The first law of thermodynamics, also known as the law of conservation of energy, states that energy can neither be created nor destroyed; it can only be transferred or changed from one form to another. For example, turning on a light would seem to produce energy, but no energy is created: electrical energy is converted into light and heat.
A way of expressing the first law of thermodynamics is that any change in the internal energy (∆E) of a system is given by the sum of the heat (q) that flows across its boundaries and the work (w) done on the system by the surroundings:

∆E = q + w
This law says that there are two kinds of processes, heat and work, that can lead to a change in the internal energy of a system. Since both heat and work can be measured and quantified, this is the same as saying that any change in the energy of a system must result in a corresponding change in the energy of the surroundings outside the system. In other words, energy cannot be created or destroyed. If heat flows into a system or the surroundings do work on it, the internal energy increases and the signs of q and w are positive. Conversely, heat flowing out of the system or work done by the system (on the surroundings) comes at the expense of the internal energy, and q and w will therefore be negative.
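The sign convention can be captured in a short sketch; the numerical values below are hypothetical, chosen only to illustrate the bookkeeping:

```python
def internal_energy_change(q: float, w: float) -> float:
    """First law: dE = q + w, with q = heat flowing INTO the system
    and w = work done ON the system (both positive when energy enters)."""
    return q + w

# 50 J of heat flows in and the surroundings do 30 J of work on the system:
print(internal_energy_change(q=50.0, w=30.0))    # 80.0 (J increase)

# The system releases 20 J of heat and does 70 J of work on the surroundings:
print(internal_energy_change(q=-20.0, w=-70.0))  # -90.0 (J decrease)
```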
In order to avoid confusion, scientists discuss thermodynamic values in reference to a system and its surroundings. Everything that is not a part of the system constitutes its surroundings. The system and surroundings are separated by a boundary. For example, if the system is one mole of a gas in a container, then the boundary is simply the inner wall of the container itself. Everything outside of the boundary is considered the surroundings, which would include the container itself.
The boundary must be clearly defined, so one can clearly say whether a given part of the world is in the system or in the surroundings. If matter is not able to pass across the boundary, then the system is said to be closed; otherwise, it is open. A closed system may still exchange energy with the surroundings unless the system is an isolated one, in which case neither matter nor energy can pass across the boundary.
A particularly important concept is thermodynamic equilibrium, in which there is no tendency for the state of a system to change spontaneously. For example, the gas in a cylinder with a movable piston will be at equilibrium if the temperature and pressure inside are uniform and if the restraining force on the piston is just sufficient to keep it from moving. The system can then be made to change to a new state only by an externally imposed change in one of the state functions, such as the temperature by adding heat or the volume by moving the piston.
A sequence of one or more such steps connecting different states of the system is called a process. In general, a system is not in equilibrium as it adjusts to an abrupt change in its environment. For example, when a balloon bursts, the compressed gas inside is suddenly far from equilibrium, and it rapidly expands until it reaches a new equilibrium state. However, the same final state could be achieved by placing the same compressed gas in a cylinder with a movable piston and applying a sequence of many small increments in volume (and temperature), with the system being given time to come to equilibrium after each small increment. Such a process is said to be reversible because the system is at (or near) equilibrium at each step along its path, and the direction of change could be reversed at any point.
This example illustrates how two different paths can connect the same initial and final states. The first is irreversible (the balloon bursts), and the second is reversible. The concept of reversible processes is something like motion without friction in mechanics. It represents an idealized limiting case that is very useful in discussing the properties of real systems. Many of the results of thermodynamics are derived from the properties of reversible processes.
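The contrast between the two paths can be sketched numerically. Assuming an ideal gas held at constant temperature (a simplifying assumption, with illustrative values for n, T, and the volumes), the reversible expansion extracts the work n·R·T·ln(V2/V1), while expanding in a few abrupt steps against a lower external pressure extracts less; as the steps shrink, the stepwise result approaches the reversible limit:

```python
import math

R = 8.314  # molar gas constant, J/(mol*K)

def reversible_work(n, temp, v1, v2):
    """Work done BY an ideal gas in a reversible isothermal
    expansion: w = n R T ln(V2/V1)."""
    return n * R * temp * math.log(v2 / v1)

def stepwise_work(n, temp, v1, v2, steps):
    """Expand in `steps` equal volume increments, each against a
    constant external pressure equal to the gas pressure at the new
    volume (irreversible sub-steps). As steps grows, this approaches
    the reversible result."""
    dv = (v2 - v1) / steps
    work = 0.0
    v = v1
    for _ in range(steps):
        v += dv
        p_ext = n * R * temp / v  # ideal-gas pressure at the new volume
        work += p_ext * dv
    return work

w_rev  = reversible_work(1.0, 300.0, 0.010, 0.020)
w_1    = stepwise_work(1.0, 300.0, 0.010, 0.020, 1)      # one abrupt step
w_many = stepwise_work(1.0, 300.0, 0.010, 0.020, 10000)  # many small steps

# The single-step (irreversible) expansion extracts less work than the
# reversible path; many small steps converge to the reversible limit.
print(w_1 < w_many < w_rev)  # True
```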
The application of thermodynamic principles begins by defining a system that is in some sense distinct from its surroundings. For example, the system could be a sample of gas inside a cylinder with a movable piston, an entire steam engine, a marathon runner, the planet Earth, a neutron star, a black hole, or even the entire universe. In general, systems are free to exchange heat, work, and other forms of energy with their surroundings.
A system’s condition at any given time is called its thermodynamic state. For a gas in a cylinder with a movable piston, the state of the system is identified by the temperature, pressure, and volume of the gas. These properties are characteristic parameters that have definite values at each state and are independent of the way in which the system arrived at that state. In other words, any change in value of a property depends only on the initial and final states of the system, not on the path followed by the system from one state to another. Such properties are called state functions. In contrast, the work done as the piston moves and the gas expands, and the heat the gas absorbs from its surroundings, depend on the detailed way in which the expansion occurs.
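The distinction can be illustrated with hypothetical numbers: the heat q and work w differ between two paths joining the same pair of states, but their sum, the internal energy change ∆E, does not, because internal energy is a state function.

```python
def delta_e(q: float, w: float) -> float:
    """Internal energy change from heat in (q) and work done on the
    system (w): a state function, fixed by the endpoints alone."""
    return q + w

# Two hypothetical paths between the same initial and final states:
path_a = {"q": 100.0, "w": -40.0}  # more heat in, system does 40 J of work
path_b = {"q": 30.0,  "w": 30.0}   # less heat in, 30 J of work done on it

# q and w are path functions and differ, yet Delta E agrees:
print(delta_e(**path_a), delta_e(**path_b))          # 60.0 60.0
print(delta_e(**path_a) == delta_e(**path_b))        # True
```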
The behaviour of a complex thermodynamic system, such as Earth’s atmosphere, can be understood by first applying the principles of states and properties to its component parts—in this case, water, water vapour, and the various gases making up the atmosphere. By isolating samples of material whose states and properties can be controlled and manipulated, properties and their interrelations can be studied as the system changes from state to state.
Thermodynamics is the science of the relationship between heat, work, temperature, and energy. In broad terms, thermodynamics deals with the transfer of energy from one place to another and from one form to another. The key concept is that heat is a form of energy corresponding to a definite amount of mechanical work.
Heat was not formally recognized as a form of energy until about 1798, when Count Rumford (Sir Benjamin Thompson), a British military engineer, noticed that limitless amounts of heat could be generated in the boring of cannon barrels and that the amount of heat generated is proportional to the work done in turning a blunt boring tool. Rumford’s observation of the proportionality between heat generated and work done lies at the foundation of thermodynamics. Another pioneer was the French military engineer Sadi Carnot, who introduced the concept of the heat-engine cycle and the principle of reversibility in 1824. Carnot’s work concerned the limitations on the maximum amount of work that can be obtained from a steam engine operating with a high-temperature heat transfer as its driving force. Later that century, these ideas were developed by Rudolf Clausius, a German mathematician and physicist, into the first and second laws of thermodynamics, respectively.
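Carnot's limitation is usually stated today as the Carnot efficiency: no engine operating between a hot reservoir at absolute temperature T_hot and a cold one at T_cold can convert more than the fraction 1 − T_cold/T_hot of the absorbed heat into work. A minimal sketch, with illustrative temperatures:

```python
def carnot_efficiency(t_hot: float, t_cold: float) -> float:
    """Maximum fraction of absorbed heat convertible to work by any
    engine operating between reservoirs at t_hot and t_cold (kelvins)."""
    if not 0.0 < t_cold < t_hot:
        raise ValueError("require 0 < t_cold < t_hot")
    return 1.0 - t_cold / t_hot

# A steam engine taking in heat at 500 K and rejecting it at 300 K
# can convert at most 40% of that heat into work:
print(carnot_efficiency(500.0, 300.0))  # 0.4
```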
The most important laws of thermodynamics are:
- The zeroth law of thermodynamics. When two systems are each in thermal equilibrium with a third system, the first two systems are in thermal equilibrium with each other. This property makes it meaningful to use thermometers as the “third system” and to define a temperature scale.
- The first law of thermodynamics, or the law of conservation of energy. The change in a system’s internal energy is equal to the difference between heat added to the system from its surroundings and work done by the system on its surroundings.
- The second law of thermodynamics. Heat does not flow spontaneously from a colder region to a hotter region, or, equivalently, heat at a given temperature cannot be converted entirely into work. Consequently, the entropy of an isolated system, or heat energy per unit temperature, increases over time toward some maximum value. Thus, all isolated systems tend toward an equilibrium state in which entropy is at a maximum and no energy is available to do useful work.
- The third law of thermodynamics. The entropy of a perfect crystal of an element in its most stable form tends to zero as the temperature approaches absolute zero. This allows an absolute scale for entropy to be established that, from a statistical point of view, determines the degree of randomness or disorder in a system.
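The entropy bookkeeping behind the second law can be checked directly: when heat q leaves a hot reservoir at T_hot, that reservoir loses q/T_hot of entropy, while the cold reservoir at T_cold gains q/T_cold; the total change is positive whenever T_cold < T_hot, which is why that direction of flow is the spontaneous one. A short sketch with illustrative values:

```python
def entropy_change_of_transfer(q: float, t_hot: float, t_cold: float) -> float:
    """Total entropy change when heat q flows from a reservoir at t_hot
    to one at t_cold: the hot side loses q/t_hot, the cold side gains
    q/t_cold (temperatures in kelvins)."""
    return q / t_cold - q / t_hot

# 1000 J flowing from a 400 K reservoir to a 300 K reservoir:
ds = entropy_change_of_transfer(1000.0, 400.0, 300.0)
print(ds > 0)  # True: total entropy rises, so this direction is spontaneous
```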
Although thermodynamics developed rapidly during the 19th century in response to the need to optimize the performance of steam engines, the sweeping generality of the laws of thermodynamics makes them applicable to all physical and biological systems. In particular, the laws of thermodynamics give a complete description of all changes in the energy state of any system and its ability to perform useful work on its surroundings.
This article covers classical thermodynamics, which does not involve the consideration of individual atoms or molecules. Such concerns are the focus of the branch of thermodynamics known as statistical thermodynamics, or statistical mechanics, which expresses macroscopic thermodynamic properties in terms of the behaviour of individual particles and their interactions. It has its roots in the latter part of the 19th century, when atomic and molecular theories of matter began to be generally accepted.