Three Reasons Why Entropy Does Not Conflict With Evolution

March 20, 2017

I recently came across a few strangers who objected to evolution because, according to them, it violates the second law of thermodynamics, the law that describes how entropy behaves over time. There are plenty of refutations of this claim, such as this concise but informative one from Talk Origins. Many of those refutations make the same point, one that I will also address in this piece. There are reasons beyond it, though, showing that evolution not only doesn’t violate the second law, but that the second law actually encourages evolution. I’m an academic who has taken three thermodynamics courses (four, if you count my thermo-heavy chemical separations class). I might as well put that expertise to some use while I procrastinate on getting a paper together. Besides, it will be useful to drop this link whenever the argument comes up again, so that I don’t have to waste time saying the same things over and over.

Reason 0: Abiogenesis Isn’t Evolution

This is Reason 0 because I intend to focus on evolution, and abiogenesis is a separate concept. In conversations with creationists, the two tend to be confused and conflated. Evolution describes the diversification of biological populations over time, whereas abiogenesis is the event in which life arose from non-living chemicals. While it’s true that evolution couldn’t happen without abiogenesis, they are entirely separate processes that run on separate mechanisms. As a very crude analogy, it’s the difference between inventing the automobile and designing a sleeker, safer, or more affordable consumer vehicle. Once the framework for a car is in place, it is open to being developed and refined. In a similar way, once a cell is able to take up chemical energy from the environment and start replicating itself into similar copies, its basic form is available to be changed over time.

Interestingly, in arguments with creationists, abiogenesis is likely where entropy is most relevant. A cell has to assemble, which requires certain chemicals to coalesce into specific forms, a low-probability event. However, we know that at least some of the necessary processes happen spontaneously all the time. For example, lipid bilayers and micelles still form readily today, which reduces entropy: the lipids organize themselves spontaneously into sheets and spheres, the kinds of structures that make up cell membranes and organelles. To someone not well versed in thermodynamics, this process appears to violate the second law of thermodynamics. However, the spontaneous process of vesicle formation reduces the free energy of the system, which roughly means that forming these structures converts some of the lipid molecules’ energy into heat. Whenever heat is involved, entropy is involved. I will discuss this in detail in a later section.

(Edit: I’ve portrayed this part somewhat inaccurately. While lipids do produce heat when they form, it’s not very much and not the driving force for vesicle formation. I’ve written another post providing the details about this system.)
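For reference, the “free energy” criterion mentioned above can be written as ΔG = ΔH − T·ΔS; a process is spontaneous when ΔG is negative. Here is a minimal sketch with purely illustrative numbers, not measured values for any real lipid system:

```python
# Spontaneity at constant temperature and pressure is governed by the
# Gibbs free energy change: dG = dH - T*dS. The numbers below are
# purely illustrative, not data for any real lipid system.

def gibbs_free_energy_change(dH, T, dS):
    """Return dG in kJ/mol given dH (kJ/mol), T (K), and dS (kJ/mol/K)."""
    return dH - T * dS

# A process can be spontaneous (dG < 0) even with a slightly unfavorable
# enthalpy change, provided the T*dS term is large enough -- for lipids,
# think of the entropy gained by water molecules freed from ordering
# around the hydrophobic tails.
dG = gibbs_free_energy_change(dH=1.0, T=298.0, dS=0.05)
print(dG)  # negative (about -13.9 kJ/mol), so the process is spontaneous
```

The point of the sketch is only the sign convention: a negative ΔG marks a spontaneous process, whichever term drives it.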

I’ve left another part out of abiogenesis: the formation of genetic information. The earliest life is widely suspected to have carried this information in the form of RNA, which can both store genetic information and act as a chemical catalyst to encourage certain biochemical reactions*. The argument here is that the genetic sequences and proteins required to create self-replicating life are far too unlikely to have come together spontaneously. This rests on a few faulty assumptions. First, many of the calculations that creationists make assume the formation of a modern cell containing a lot of genetic information, while the earliest cell may have been much simpler. Just last year, microbiologists were able to design an organism with only 473 genes, which is very small compared to the tens of thousands of genes that modern organisms like humans have. A primordial cell may well have been able to function on even less genetic information. Furthermore, creationists ignore that these spontaneous generations of proto-cells were happening over and over at a rapid rate; the Earth didn’t have merely one shot at creating life. No matter how unlikely a particular genome is to form, the ancient oceans only had to get it right once to produce a cell that could eventually develop into modern life. A creationist is imagining a googol-sided die rolled once, and if it fails, that’s it. Biochemists are imagining trillions of septillion-sided dice rolled billions of times over millions of years, and if there is a single success, the planet gets life. For a more in-depth explanation, I recommend this page.
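The dice analogy can be made quantitative. With a per-trial success probability p and N independent trials, the chance of at least one success is 1 − (1 − p)^N. A small sketch with made-up numbers (not an actual abiogenesis estimate):

```python
import math

# P(at least one success) = 1 - (1 - p)**N for N independent trials.
# Computed via log1p/expm1 to stay numerically stable when p is tiny
# and N is astronomically large. All numbers are illustrative.

def p_at_least_one(p, n_trials):
    return -math.expm1(n_trials * math.log1p(-p))

print(p_at_least_one(1e-20, 1))     # ~1e-20: a single roll almost surely fails
print(p_at_least_one(1e-20, 1e22))  # ~1.0: 10^22 rolls almost surely succeed
```

The same vanishingly small per-roll probability gives near-certainty once the number of trials dwarfs 1/p, which is the whole point of the many-dice picture.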

Reason 1: Life Isn’t In An Isolated System

The next time you are in an argument with a creationist and they bring up the second law, ask them if they know what the second law actually says. In all likelihood, they will tell you that it says things cannot get more complex or more organized over time. This is likely a result of education systems and science communication trading away a lot of accuracy for a simple explanation of entropy, which is a bit of a disservice to thermodynamics. We can take this opportunity to see what the second law actually says. There are quite a few ways to accurately phrase the second law of thermodynamics, but I will use one formulation from some class notes from MIT.

For an isolated system the total energy is constant. The entropy can only increase or, in the limit of a reversible process, remain constant.

In this case, we must define exactly what an isolated system is. In an isolated system, no matter or energy can enter or leave. If some matter and energy is sealed inside an isolated system, the total entropy within the system can never decrease. In practical terms, this means that any fluids will mix together, the temperature and pressure will equalize everywhere, and the free energy will be minimized.

In real life, there is no truly isolated system, except perhaps the universe as a whole. Insulated containers always allow a little bit of heat to flow through their boundaries. The Earth radiates energy out into space. Energy is always flowing, constantly shifting from one region of space to another, and there’s not much we can do to halt this process. In the long term, this means the universe will no longer be able to sustain life, which is a bit of a bummer. In the short term, the movement of energy from one place to another keeps us alive. Depending on your perspective, this is more or less of a bummer.

The point is, no species is in an isolated system. There are imbalances of energy everywhere, and therefore energy is constantly flowing from system to system. While everything tends toward more entropy, a local decrease in entropy is entirely possible within a system, as long as the total entropy elsewhere increases, or some energy is added to the system, or both. For a good example, look at a refrigerator, which is essentially an entropy-reducing device. A refrigerator moves energy from one location (the inside of the fridge) to another (the outside). Greater entropy means energy spreading out everywhere, but a refrigerator pumps energy away from its insides, so it is working against entropy locally. A refrigerator isn’t an isolated system, though. It takes in energy through its power cord, turns that electrical energy into mechanical work which pumps heat out of its interior, and that heat escapes from the coils in the back. The entropy inside the refrigerator decreases, but the energy input from outside allows this to happen, and the heat released at the coils more than makes up for the local loss in entropy. If you ran a refrigerator with its door open in a truly isolated system, like a perfectly insulated room, the mechanical work and the waste heat would actually cause the room’s temperature to rise. Remember, heat means entropy.
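To make the refrigerator bookkeeping concrete, here is a minimal sketch with made-up but plausible numbers (temperatures in kelvin, heat and work in joules):

```python
# Entropy bookkeeping for the refrigerator example. The fridge pumps
# heat Q_cold out of its interior (at T_cold) using work W, and rejects
# Q_cold + W to the room (at T_hot). All numbers are illustrative.

def total_entropy_change(Q_cold, W, T_cold, T_hot):
    dS_inside = -Q_cold / T_cold      # local entropy decrease inside the fridge
    dS_room = (Q_cold + W) / T_hot    # entropy dumped into the room at the coils
    return dS_inside + dS_room

# 1000 J pumped from a 275 K interior into a 295 K kitchen using 100 J of work:
dS = total_entropy_change(Q_cold=1000.0, W=100.0, T_cold=275.0, T_hot=295.0)
print(dS)  # positive: the room's entropy gain outweighs the interior's loss
```

The local decrease inside the fridge is real, but the second law only constrains the total, which stays positive as long as enough work is supplied.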

Consider even the earliest life. As long as it was near some continuous energy source, it was free to keep reproducing, and therefore free to continue mutating and changing. If the cell is the system, it is open to heat from a thermal vent, or to chemical food in its immediate surroundings. As life has continued, each organism has been an open system of its own, constantly taking in energy as food or capturing solar energy through photosynthesis. Work and energy are being put into life all the time, so the isolated-system condition of the second law simply doesn’t apply.

Reason 2: Life Is An Efficient Entropy Producer

I keep saying that heat means entropy. Why is this the case? The answer is a tad abstract, which is probably one of the reasons entropy isn’t explained to the general public in a thorough manner, but it has to do with where energy is “placed”. When an object is very hot, its molecules contain a lot of energy compared to their surroundings. If you have a hot cup of coffee, all the molecules inside it are jiggling around, moving quickly, bumping into each other, and rotating rapidly. A cup of coffee is an uncountable number of high-energy molecules gathered in one place, the mug. For that concentrated energy to disperse, the high-temperature liquid must heat up its surroundings. The high-energy water molecules collide with the molecules of the mug, making its walls vibrate at the atomic level so that the mug’s energy increases. The molecular vibrations of the mug in turn impart energy to air molecules that collide with it, transferring energy from the mug to the air. In this manner, energy spreads out from the liquid in one place to the air that is everywhere.

Spontaneous chemical reactions are similar. In the most reductionist sense, life is a series of chemical reactions. The food we take in stores energy in its chemical bonds, largely in sugars and other carbohydrates. These foods “store up” energy in compact molecules, much like a cup of coffee stores a great deal of molecular kinetic energy in one location. When those bonds are broken, the released energy is transported through the body so the body can function. But not 100% of that chemical bond energy can be put to useful work; some of it is lost to the environment. This lost energy is heat, and you can feel it when you touch someone’s skin and sense how warm they are. Biology is remarkably good at driving these sorts of energy-releasing reactions, so much so that a few years ago an MIT researcher proposed that once an external energy source is available, life-like chemical processes will emerge to dissipate heat more and more rapidly over time, and evolution will actually favor the organisms that most effectively disperse heat.

It makes sense, then, if higher entropy is favored, that life would spontaneously adapt to new environments so that it could keep increasing entropy. If an environment changes and the species within it cannot adapt and therefore go extinct, the rate of entropy production plummets, which is not thermodynamically ideal. However, if those species are able to adapt to environmental changes, then life continues. If life continues, heat is dissipated much faster than in an environment without life, which is something the laws of thermodynamics favor. Life is such an efficient producer of heat that any marginal changes in an organism’s genes or phenotype that appear to reduce the entropy of its structure or genetic information are “paid off” by an even greater entropy increase from heat production.

Reason 3: Entropy Isn’t “Complexity”

Let’s go back to the creationist’s argument against evolution. It’s often the case that they’ll say something to the effect of “things can’t get more organized or complex over time”. I’ve already explained what this means above in words, but let’s look at it another way. Brace yourself, I’m about to use… math!

The Boltzmann entropy formula, S = k·log(W), as carved on Ludwig Boltzmann’s gravestone. [via Wikimedia Commons]
  • S – entropy
  • k – the Boltzmann constant; its value is not important for this post, so you are free to ignore it for now
  • W – the number of states or permutations
  • log – the logarithm function

S is proportional to the logarithm of the number of states, arrangements, or permutations a system can be in. For those not familiar with the log function, this is how entropy increases with the number of states (drawn by me in the style of every professor ever):

A graph showing how entropy increases with the logarithm of the number of states

The reason scientists use “disorder” or “chaos” when describing entropy is that, for most systems, there are far more “disorganized” states than “organized” ones. To illustrate this, suppose I organize all my shirts by hanging them in my closet. There is a relatively small number of ways I can arrange my shirts there, because they are confined to a finite number of hangers. There are comparatively far more ways I can have all my clothes strewn about my room. Molecules behave similarly: they are more likely to be found spread throughout a system than confined to one location.
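The closet analogy can be counted directly. Assuming, purely for illustration, 10 hanger spots versus 1000 possible spots around the room for 5 distinguishable shirts:

```python
# Counting arrangements for the closet analogy: five distinguishable
# shirts either confined to 10 hanger spots or strewn across 1000
# possible spots in the room. All numbers are made up for illustration.

n_shirts = 5
W_hung = 10 ** n_shirts      # each shirt independently on one of 10 hangers
W_strewn = 1000 ** n_shirts  # each shirt at any of 1000 spots in the room

print(W_hung)    # 100000 "organized" arrangements
print(W_strewn)  # 10^15 "strewn about" arrangements: vastly more states
```

The disorganized configuration wins not because nature prefers mess, but simply because there are ten orders of magnitude more ways to realize it.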

There are lots of ways to think about “number of states”. When I took statistical thermodynamics, we compared the number of ways molecules could occupy states adsorbed to a surface versus floating around in a free volume. We also considered the number of conformations a polymer chain can adopt. These states may seem abstract to someone unfamiliar with the concept, but fortunately, biology offers an intuitive way to think of states in terms of the genetic code and variations of phenotypes. In the case of evolution, you could think of each species as a state, where each species has more or less the same genome, more or less the same “code”. The more variations exist, the more states exist, and therefore the more entropy.

How does this compare to evolutionary theory? It aligns perfectly well. According to modern evolutionary theory, everything alive today descends from a single common ancestor species, our Last Universal Common Ancestor. In that case, its entropy** would be k*log(1), which is 0. Compare this to the number of species alive today. This is difficult to estimate, but it’s certainly in the millions, and some estimates put it at 8.7 million. Let’s take a conservative estimate of 1 million. The entropy of living species today would then be k*log(1,000,000), which is much larger.
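The arithmetic above fits in a couple of lines. A minimal sketch, with the Boltzmann constant set to 1 so the entropy comes out in units of k:

```python
import math

# Back-of-the-envelope "species entropy" from S = k*log(W), treating
# each species as a state and setting k = 1 so the result is in units
# of the Boltzmann constant.

def species_entropy(n_species):
    return math.log(n_species)

print(species_entropy(1))          # 0.0: a single ancestor species (LUCA)
print(species_entropy(1_000_000))  # ~13.8: a conservative count of living species
```

Whatever the exact species count, any number in the millions gives a strictly larger entropy than the single-ancestor starting point.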

For the number of species to change, there must be mutations, changes, and additions to genetic information. Creationists often say that there is no way to create “new” genetic information, but this is patently absurd. Insertions into gene sequences happen all the time, so the total amount of genetic information is bound to increase over time. This can be described computationally as information entropy, which increases as more and more sequences are added to the pool of genes that species carry. Describing DNA sequences in terms of entropy is possible, but beyond the scope of this post (as well as my knowledge), except to say that the more DNA sequences exist on Earth, the higher the genetic entropy.
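One common way to quantify the information entropy of a sequence is Shannon entropy, in bits per base. A minimal sketch over made-up DNA strings:

```python
import math
from collections import Counter

# Shannon entropy (bits per base) of a DNA string -- one simple way to
# quantify "information entropy" of a sequence. The example sequences
# below are made up purely for illustration.

def shannon_entropy(seq):
    counts = Counter(seq)
    n = len(seq)
    h = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return h + 0.0  # normalize -0.0 to 0.0 for single-symbol strings

print(shannon_entropy("AAAAAAAA"))  # 0.0: one repeated base carries no uncertainty
print(shannon_entropy("ACGTACGT"))  # 2.0: four equally likely bases, 2 bits each
```

A pool of varied sequences scores higher than a pool of identical ones, which is the sense in which added genetic variation means added entropy.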

Hopefully this drives the point home that the second law of thermodynamics is far from a barrier to evolution. In fact, entropy guides biology and its fundamental processes just as strongly as every other part of the natural world. Perhaps this is not intuitive to creationists, especially when they don’t have the requisite knowledge to understand what entropy means at a basic level (or evolution, for that matter). If so, as is usually the case with creationists, they should probably stand aside and defer to the expertise of those who do understand what they are talking about.

*A lot of these discoveries have been the result of work by Nobel laureate Tom Cech, a biochemistry professor from my current institution, University of Colorado, Boulder. If you’re reading this, Dr. Cech, please tell your students to keep the piranha fume hood clean when they use it. Cheers!

**A more accurate global species entropy calculation would be a weighted probability of each species, where each weight is proportional to the number of individuals that exist within a certain species. This would be pretty hard to calculate, considering that we have yet to identify a vast majority of organisms on the planet. Also, this is merely a blog post where I’m trying to introduce difficult concepts simply, and unfortunately I’m not as thorough as Randall Munroe.

"Donald Trump walks into a bar.... And promptly lowers it. I know I know, I've ..."

Left- vs Right-Wing Comedy: Why so ..."
"Clancy's daughter does not sound like an irrational person."

A Church-Going Atheist
"Dammit meant to post the mixed marriage clip. :)"

Left- vs Right-Wing Comedy: Why so ..."
"I don't see tribalism necessarily having any relationship to the quality of thinking. What matters ..."

A Church-Going Atheist

Browse Our Archives

Follow Us!

What Are Your Thoughts?leave a comment