The universe tends towards disorder. But how come nobody knows why?

Entropy is the physicist’s magic word, invoked to answer some of the biggest questions in cosmology. Yet a quantum rethink may be needed to tell us what it actually is

ALL the King’s horses and all the King’s men couldn’t put Humpty together again. Everyone knows the sorry tale of Humpty Dumpty, but have you ever noticed that the rhyme makes no mention of an egg? In fact, the ill-fated protagonist only assumed egg-man form when he met Alice in Lewis Carroll’s Through the Looking Glass, after which broken eggs became indelibly associated with irreversible damage. So perhaps Carroll deserves to shoulder a share of the blame for scrambling our ideas about entropy.

Entropy is typically thought of as a measure of disorder or randomness, and it is bound up with thermodynamics – the branch of physics that deals with heat and mechanical work. Its propensity to increase forever has granted it exalted status as the pithiest answer to some deep questions, from what life is to how the universe evolved and why time moves ever forward like an arrow. And yet just like Humpty, entropy gets messy as soon as you crack its surface.

For a start, there is no single definition. And even taking it broadly as a measurable quantity, our current conception of entropy fails to describe the very things it purports to, not least the universe as a whole. “It’s all very confusing,” says Anthony Aguirre at the University of California, Santa Cruz.

Now, Aguirre and others are going back to the drawing board in search of a universally valid version of entropy anchored in our most fundamental theory: quantum mechanics. They hope to put our understanding of the universe’s mystifying directionality on firmer footing – or nudge it off a wall.

We might even be in for something akin to the Copernican revolution, when we realised that Earth orbits the sun, rather than the other way around. “That changed the way we view the universe,” says Wojciech Zurek at the Los Alamos National Laboratory in New Mexico. “From then on, one could make connections between phenomena that previously seemed unconnected. It’s the same with the new way of looking at thermodynamics.”

It all started in Carroll’s day, during the industrial revolution, when Victorian engineers were desperately trying to figure out why their coal-powered steam engines were so inefficient. Entropy was essentially a mathematical way to quantify heat that wasn’t available for doing useful mechanical work, such as driving a piston. In the 1860s, Rudolf Clausius defined the change in a system’s entropy as the heat flowing into it divided by the temperature at which that heat is transferred.
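
In modern notation, Clausius’s definition is usually written as follows, where Q is the heat a system absorbs reversibly and T is the absolute temperature at which it flows (a standard textbook statement, added here for reference):

```latex
\Delta S = \int \frac{\delta Q_{\mathrm{rev}}}{T}
\;\approx\; \frac{Q}{T} \quad \text{when the heat } Q \text{ is exchanged at a roughly constant temperature } T
```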

Ludwig Boltzmann soon made it a bit more precise. He knew that the mechanical work done by a hot gas like steam came from the motion of the molecules, but he also recognised that it was impossible to calculate how every individual atom or molecule in a given system moves. So he suggested working with probabilities. Thus Boltzmann defined entropy in terms of the number of different possible ways in which molecules in a closed system could be arranged. The more possible arrangements, the greater the entropy.
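
Boltzmann’s counting argument is captured by the formula now engraved on his tombstone, which relates the entropy S to the number of microscopic arrangements W compatible with what we observe of the system (a standard formula, included here for reference):

```latex
S = k_B \ln W, \qquad k_B \approx 1.38 \times 10^{-23}\,\mathrm{J/K}
```

Doubling the number of available arrangements adds only k_B ln 2 to the entropy, which is why astronomically large values of W translate into modest, measurable entropies.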

Boltzmann’s entropy worked surprisingly well to describe thermal systems such as steam engines – and it is still hard at work, with physicists and chemists using it on a daily basis. But difficult questions were raised as early as 1867, when James Clerk Maxwell devised a thought experiment in which a crafty demon lurked inside a box of gas divided into two compartments. The molecules start off evenly mixed, with no difference in temperature between the compartments, and thus no ability to do useful mechanical work. But the demon uses its knowledge of the molecules’ movements to sort the faster, hotter molecules from the slower, colder ones by opening and closing a door between the two.

That posed a problem. The demon seems to have rendered the system ready to do work: open the door and the energetic molecules will be able to push a piston. In other words, the demon has reduced the system’s entropy, violating the second law of thermodynamics, which holds that the entropy of a closed system will always increase over time.

UNBREAKABLE

First law of thermodynamics

Broadly speaking, energy can’t be created or destroyed. Any energy added to a system as heat either raises its internal energy or goes into the work the system performs (see the symbolic forms after this box).

Second law of thermodynamics

The total amount of entropy in a closed system can never decrease. This is often expressed as the universe tending towards disorder.
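
In their compact textbook forms (added here for reference, with U the internal energy, Q the heat added, W the work done by the system and S the entropy), the two laws read:

```latex
\Delta U = Q - W \quad \text{(first law)}, \qquad\qquad \Delta S \ge 0 \ \text{for any closed system} \quad \text{(second law)}
```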

This is considered the most far-reaching and robust law of nature. “The second law cannot be violated, never, in no situation, under no circumstances,” says Sebastian Deffner at the University of Maryland, Baltimore County. “Every time we run into an apparent violation, we find we have overlooked a contribution to the entropy production.”

That was certainly the case with Maxwell’s demon. What had been overlooked turned out to be a vital component of how we now understand physical systems: information. The demon can only perform its trick if it can store information about the molecules and their movements. It can’t have an infinitely large memory, so it will eventually have to erase some of that information – and in the 1980s, physicist Charles Bennett showed that erasure has an unavoidable physical cost: it dissipates heat and increases entropy in the demon’s surroundings. So the very process that allows the demon to reduce entropy inside the box increases it elsewhere, by at least as much.
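
Bennett’s accounting can be made quantitative with the closely related Landauer bound, which says that erasing a single bit of memory must dissipate at least k_B T ln 2 of heat into the surroundings. At room temperature (taking T ≈ 300 K purely for illustration) that works out to:

```latex
k_B T \ln 2 \approx \left(1.38 \times 10^{-23}\,\mathrm{J/K}\right) \times \left(300\,\mathrm{K}\right) \times 0.693 \approx 2.9 \times 10^{-21}\,\mathrm{J} \ \text{per bit erased}
```

Tiny per bit, but never zero, which is all the second law needs.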

The second law emerged unscathed again, but entropy changed. Bennett’s insight revealed that it isn’t just about heat, or the numbers of ways molecules can be arranged, or work. Deep down, entropy seems to be about information. This has some intriguing implications for how information might ultimately find use as a fuel (see “Running on facts”). It has also raised new questions about how information relates to the second law and the big-picture processes of the universe – questions that have forced physicists to revisit their understanding of entropy yet again.

A revision is long overdue, according to Zurek. He has always been suspicious about Boltzmann’s framing. The consideration of all possible states was, Zurek says, “an inspired ruse”: although it has been useful, there is no real-world justification for it. When dealing with finite systems such as an engine or a chemical reaction, he reckons it makes no sense to frame things in terms of the infinite possible ways you can arrange molecules.

For Zurek, this is nothing short of a “fudge” that has lulled us into a false sense that we understand the behaviour of physical systems. He suspects the reason Boltzmann’s statistical tricks worked was because what we call entropy is secretly something to do with quantum physics. The quantum world is probabilistic, with properties definable only in the statistical terms that Boltzmann stumbled on. Hence the idea that there might be something in this most fundamental theory that gives rise to Boltzmann’s version of entropy.

Quantum roots

And so Zurek has set out to reframe our current, information-based conception in terms of quantum physics. His scheme centres on quantum entanglement, where physically distinct systems have shared properties that mean a measurement on one can affect the outcome of a subsequent measurement on the other.

Last year, he showed that it is possible to derive thermodynamics by considering quantum systems that are entangled with their environment. Essentially, that means a system’s entanglement determines the amount and the nature of the available information about its state, which gives a measure of its entropy. It is a significant step: rooting information and entropy in quantum mechanics not only gives new depth to our understanding of how physical systems behave and interact, but also promises to reinstate entropy as a real measurable quantity.
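
To give a flavour of how entanglement alone can generate entropy, here is a minimal sketch in Python (an illustration of the textbook von Neumann entropy of entanglement, not Zurek’s actual derivation): a qubit maximally entangled with a single “environment” qubit looks, on its own, like a maximally mixed state with entropy ln 2, even though the joint state has no entropy at all.

```python
# A minimal illustration (not Zurek's derivation): when a system is maximally
# entangled with its environment, tracing the environment out leaves the system
# in a mixed state, and the von Neumann entropy quantifies the missing information.
import numpy as np

def von_neumann_entropy(rho):
    """Return S(rho) = -Tr(rho ln rho) in nats, ignoring numerically zero eigenvalues."""
    eigenvalues = np.linalg.eigvalsh(rho)
    eigenvalues = eigenvalues[eigenvalues > 1e-12]
    return max(0.0, float(-np.sum(eigenvalues * np.log(eigenvalues))))

# Bell state (|00> + |11>)/sqrt(2): a "system" qubit entangled with an "environment" qubit.
psi = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
rho_joint = np.outer(psi, psi.conj())

# Partial trace over the environment qubit gives the system's reduced density matrix.
rho_system = np.einsum('ikjk->ij', rho_joint.reshape(2, 2, 2, 2))

print(von_neumann_entropy(rho_joint))   # ~0.0   : the joint pure state carries no entropy
print(von_neumann_entropy(rho_system))  # ~0.693 : ln 2, one full bit for the system alone
```

The joint state stays pure; it is the part, viewed on its own, that acquires entropy as it becomes entangled with its surroundings.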

Zurek is not the only one daring to ask hard questions of the answer to almost everything. Aguirre, together with his UC Santa Cruz colleagues Dominik Safranek and Joshua Deutsch, is also working on a new version, again with information at its core. They call it “observational entropy”, since it is designed to take account of the amount of information that can be gained when you perform a series of measurements on a quantum system.

Intriguingly, the observational entropy of a system will change depending on the way an observer chooses to perform a sequence of measurements. “It’s not something that has a fixed, objective value prior to those measurements,” says Safranek.

This, he explains, is because in quantum mechanics, the properties of any object or system are undefined until they are measured. What’s more, the Heisenberg uncertainty principle means that measuring one property disturbs other, incompatible properties, so the order in which you make measurements will affect the observational entropy of a system. This is a serious recasting of how we think about entropy, but it still connects with the classical concept, where the outcomes of measurements are linked to probability and the possible configurations of the system.
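
For the simplest case of a single coarse-grained measurement, the definition in the Safranek, Deutsch and Aguirre papers takes roughly the following form (a simplified paraphrase, so treat the notation as indicative rather than exact). The projectors describe the possible measurement outcomes, p_i is the probability of seeing outcome i, and V_i counts the microstates compatible with it:

```latex
S_{\mathrm{obs}} = -\sum_i p_i \ln\frac{p_i}{V_i},
\qquad p_i = \mathrm{Tr}\!\left(\hat{P}_i\,\hat{\rho}\right), \quad V_i = \mathrm{Tr}\!\left(\hat{P}_i\right)
```

Choosing a different, or differently ordered, set of measurements changes the projectors, and with them the value of S_obs, which is exactly the observer-dependence Safranek describes.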

It is early days for these ideas, and there is much to work out. Nonetheless, the physicists behind them hope that redefining entropy in quantum terms can put our understanding of it on firmer ground. You might wonder what there is to gain. After all, no one is saying that the tried and trusted second law of thermodynamics no longer applies. But Aguirre is enthusiastic about what this redefinition could mean. “I believe that it will have significant pay off,” he says, and he isn’t alone.

“The main hope is that quantum thermodynamics might shed some new light on the old problems,” says Vlatko Vedral at the University of Oxford. “The arrow of time is one of them, but both the origin of life and the expansion of the universe have also been mentioned in the literature.”

The connection to life might seem odd. But scientists have long puzzled over whether the cellular mechanisms inside living organisms can be seen as exploiting entropy. In recent years, it has even been proposed that life might have its origins in increasing entropy. The idea is that the tendency of atoms to structure themselves in a way that increases entropy inevitably produces complex structures, including living things. It is a speculative idea, but a clearer picture of entropy’s true nature may help put it to the test.

Conceived to improve steam engines, entropy is thought to explain why time moves forward

An equally thorny issue is the arrow of time. That time moves forwards, not backwards, shows up in the irreversibility of everyday processes – you can’t unscramble an egg or unspill a cup of coffee. We often think of this as the cast-iron rule, enshrined in the second law, that entropy always has to increase. The reasoning seems simple: there are more ways to arrange identical molecules in a scrambled egg than in the neat, ordered situation where the yolk sits within the albumen. But such a conclusion involves questionable assumptions, says Safranek: “In certain situations, it’s not clear which state should be considered more ordered.”

Deffner agrees. People often assume that disordered systems have more states, but that isn’t necessarily true, he says. “You can easily construct examples where the number of possible states increases – increasing the Boltzmann entropy – but the states come in a very ordered and structured manner.”

What’s more, Aguirre says that many attempts to apply entropy on a cosmological scale are questionable because entropy as currently defined applies only at or near equilibrium, where a system has settled into an unchanging configuration. “Fundamentally, nothing is in equilibrium,” he says. “The universe is certainly not. In fact, almost every process in the universe we care about relies on the universe being out of equilibrium.”

Deffner reckons this undermines the argument that the arrow of time comes from entropy increasing. The two are equivalent, he suggests: maybe we only see time flow because things move inexorably towards equilibrium, which is a process that increases entropy. “The increase of entropy is just a mathematically convenient reformulation of the universally observed arrow of time,” Deffner says.

The prevalence of this sort of circular reasoning is one reason that Aguirre is excited about observational entropy, which doesn’t make assumptions about equilibrium. “There hasn’t been a quantum version of Boltzmann entropy until we did this work, but we now have a description of what the entropy of that universe looks like. It goes up, too, so that’s a good step towards thinking about issues such as the arrow of time.”

Zurek points to practical benefits, too – not least that quantum entropy will help us better understand and exploit the properties of quantum machines such as nanoscale sensors and quantum computers. “This is an emerging field that is of great importance to nanotechnology and quantum information processing,” he says. And if information really is a resource to be treated like heat or mechanical work, the insight might even give rise to an array of technologies as revolutionary as those that seeded the first industrial revolution. “Maybe quantum [entropy] can do for us what steam did for the Victorians,” says Deffner.

RUNNING ON FACTS

An engine driven by information is, quite frankly, hard to imagine. And yet consider this: there is no way to process information without physical systems using energy. This includes erasing information: wiping a hard drive has an energy cost. Turn this observation on its head, and information starts to look like a potential way to fuel machines.

Imagine you have a device that holds information in binary, 1s and 0s, and that it is blank (meaning it is all zeroes). This is an ordered state, much like the cold environment of a heat engine – something that converts thermal energy into mechanical energy. “You can in principle build a device that would convert this state to the mixture-of-1s-and-0s state,” says Christopher Jarzynski, a chemist at the University of Maryland. Such a device would use, say, heat energy to change this state, and the conversion represents an acquisition of information. The physical act of the zeros changing into ones and zeros could be harnessed to do something mechanical, like lift a mass against gravity or charge a battery. “You can extract work from a thermal reservoir by the very act of writing information onto a blank memory slate,” says Jarzynski.
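
Jarzynski’s blank-slate engine can be given a rough number using the same k_B T ln 2 bound that governs erasure, run in reverse: randomising one blank bit in contact with a heat bath at temperature T lets you extract at most that much work (a standard result; the figures below assume room temperature, T ≈ 300 K, purely for illustration):

```latex
W_{\max} = k_B T \ln 2 \approx 2.9 \times 10^{-21}\,\mathrm{J} \ \text{per bit written},
\qquad \text{so a blank 1-terabyte drive} \ \left(8 \times 10^{12}\ \text{bits}\right) \ \text{could yield at most} \ \sim 2 \times 10^{-8}\,\mathrm{J}
```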

That could prove useful in scenarios where there is no other practical way to fuel a process. Experiments have already backed up that principle. Now the challenge is to explore the possibilities, creating machines fuelled by information. That might be something akin to the biological machines that process genetic information, or quantum-scale sensors that use their information intake to power motors or other mechanisms.