Biological Information, Molecular Structure, and the Origins Debate

by Jonathan K. Watts
Biomolecules contain tremendous amounts of information; this information is “written” and “read” through their chemical structures and functions. A change in the information of a biomolecule is a change in the physical properties of that molecule—a change in the molecule itself. It is impossible to separate the information contained in biomolecules from their structure and function. For molecules such as DNA and RNA, new information can be incorporated into the sequence of the molecules when that new sequence has favorable structural and functional properties. New biological information can arise by natural processes, mediated by the interactions between biomolecules and their environment, using the inherent relationship between structure and information. This fact has important implications for the generation of new biological information and thus the question of origins.

PSCF 63, no. 4 (2011): 231–39

4 comments to Biological Information, Molecular Structure, and the Origins Debate

  • I can’t tell if the author understands the second law of thermodynamics. That the origin of life and the evolution of the first forms of life into mammals violate the second law is obscured by both the intelligent design (ID) community and atheistic opponents of ID.
     
    The second law of thermodynamics is observed when heat flows from a hot object to a cold object. In this process, there is a decrease in information because there is less knowledge about the location of the most energetic molecules. Another word for information is complexity. The bell-shaped curve, which describes the distribution of the speeds of molecules in a gas, is calculated from Stirling’s approximation: log N! ≈ N log N − N. The reciprocal of 52 factorial (1/52!) is the probability of finding a shuffled deck of playing cards totally arranged by suit and by number (see the short calculation after this comment).
     
    The second law is so firmly grounded in experience and logic that we ordinarily assume it is never violated. When a seed grows into a plant, we assume that the complexity of the seed with its DNA is just as great as the complexity of the fully grown plant. More scientifically, the seed is not an isolated system of particles but absorbs energy and material from the environment.
     
    However, it is difficult to imagine that the earth-sun system 5 billion years ago was not a system of particles and, hence, subject to the second law. Thus, evolution violates the second law, as all PhDs in biology know. Advocates of ID know it, but they give the impression that it is a matter of controversy among scientists. It is true that laymen don’t understand this, as the following quote from a layman shows:
     
    “They [Pinker and Bloom] particularly emphasized that language is incredibly complex, as Chomsky had been saying for decades. Indeed, it was the enormous complexity of language that made it hard to imagine not merely how it had evolved but that it had evolved at all.
     
    “But, continued Pinker and Bloom, complexity is not a problem for evolution. Consider the eye. The little organ is composed of many specialized parts, each delicately calibrated to perform its role in conjunction with the others. It includes the cornea,…Even Darwin said that it was hard to imagine how the eye could have evolved.

    “And yet, he explained, it did evolve, and the only possible way is through natural selection—the inestimable back-and-forth of random genetic mutation with small effects…Over the eons, those small changes accreted and eventually resulted in the eye as we know it.” (Christine Kenneally, The First Word: The Search for the Origins of Language, pp. 59–60.)
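
    As a quick numerical check of the figures cited above, here is a minimal Python sketch; it assumes the standard two-term form of Stirling’s approximation, ln N! ≈ N ln N − N:

    ```python
    import math

    # Stirling's two-term approximation: ln(N!) ~= N*ln(N) - N
    N = 52
    exact = math.lgamma(N + 1)        # ln(52!), computed exactly
    stirling = N * math.log(N) - N    # Stirling's estimate

    print(f"ln(52!) exact: {exact:.2f}")      # ~156.36
    print(f"Stirling:      {stirling:.2f}")   # ~153.46

    # Probability that a shuffled deck lands in one particular order: 1/52!
    print(f"1/52! = {1 / math.factorial(N):.3e}")  # ~1.240e-68
    ```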

  • Jonathan Watts

    Hi David – Thanks for your comment.  I’ll respond to a couple of points.

    First of all, entropy is a thermodynamic parameter that doesn’t always correspond to our intuitive ideas of “randomness.” As an example, proteins fold in complex ways, a process that intuitively increases their degree of “orderliness.” Yet the driving force for the folding of some proteins is actually an increase in entropy. In the example I’m thinking of, a complex shell of water molecules surrounds the polypeptide chain; when the protein folds, those water molecules are released into a more random arrangement in solution. This causes a net increase in entropy in spite of the reduction in the degrees of freedom of the protein conformation upon folding (a toy version of this bookkeeping appears after this comment).

    The second law, as you mentioned, applies to only a closed system. Thus we see a seed growing into a tree, I can clean my desk (although I do it far too rarely), a SELEX experiment can isolate a functional sequence from a random pool, and so on. In all of these cases, the entropy of the universe increases overall even though the “orderliness” of a particular place and time has increased. All of these processes require energy input. As you said, the growing plant absorbs energy from the sun, allowing it to give order to biomolecules and cells. Since evolution occurred on earth, which is not a closed system and absorbs energy from the sun, it doesn’t work to disprove evolution with the second law. Can you help me understand why you find this a convincing rebuttal to evolution?
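
    A minimal sketch of that sign bookkeeping in Python; both entropy values are made up and chosen only to illustrate the logic (the real magnitudes depend on the particular protein):

    ```python
    # Toy entropy bookkeeping for an entropy-driven protein fold.
    # Both values below are hypothetical; only the signs matter here.
    dS_chain = -500.0  # J/(mol*K): the chain loses conformational entropy
    dS_water = +700.0  # J/(mol*K): ordered water is released into bulk solution

    dS_net = dS_chain + dS_water
    print(f"Net entropy change on folding: {dS_net:+.0f} J/(mol*K)")
    if dS_net > 0:
        print("Entropy increases overall, so folding can be entropy-driven.")
    ```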

  • Randy Isaac

    Perhaps it’s appropriate for both of you to review thermodynamics a bit. David, you said “Thus, evolution violates the second law, as all PhDs in biology know.” and Jonathan, you said “The second law, as you mentioned, applies to only a closed system.” Neither statement is correct. The second law governs all processes and, as far as we know, is never violated. The entire second law can be succinctly stated as “The Gibbs Free Energy is always minimized.” No conditions or constraints. Every process will reduce the Gibbs Free Energy, which includes terms like G = U − TS + PV, etc., where G is the Gibbs Free Energy, U is the total internal energy, T is the temperature, S is the entropy, P is the pressure, and V is the volume. Other terms cover stress, strain, magnetic moment, etc.

    What most people remember as the second law is “Entropy tends to increase in a closed system.” That follows from the definition above, with the constraint that U is constant in a closed system: if G decreases, then S must increase when U and the other terms are constant. But this is only a special case. In the general case, all parameters can vary. As energy enters the system from the environment, U, T, and other parameters can change, and S can decrease dramatically while G also decreases (see the numerical sketch after this comment).

    In other words, the second law applies everywhere, not just to closed systems. And, perhaps biologists don’t know it, but physicists know that the second law is not violated by any biological process or by evolution in any sense.
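
    A toy numerical sketch of that claim, assuming the G = U − TS + PV form quoted above; every state value below is hypothetical, chosen only so the signs come out as described:

    ```python
    # Hypothetical before/after states for a system absorbing energy.
    # G = U - T*S + P*V, the form quoted in the comment above.
    def gibbs(U, T, S, P, V):
        return U - T * S + P * V

    P, V = 100.0, 1.0  # pressure and volume, held fixed in this toy example

    U1, T1, S1 = 1000.0, 300.0, 5.0  # before: G1 = 1000 - 1500 + 100 = -400
    U2, T2, S2 = 1050.0, 400.0, 4.0  # after energy input: S goes DOWN

    G1 = gibbs(U1, T1, S1, P, V)
    G2 = gibbs(U2, T2, S2, P, V)
    assert S2 < S1 and G2 < G1
    print(f"G fell from {G1:.0f} to {G2:.0f} while S fell from {S1} to {S2}")
    ```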

  • Randy Isaac

    P.S. And perhaps I should add that in the thermodynamic sense, information and entropy scale in the same way, so that when the entropy increases, the total information content increases as well. That seems counterintuitive. After all, entropy is a measure of disorder and uncertainty, while information is a measure of order and a reduction of uncertainty. So how can they both increase at the same time? One way to look at it is to consider that as entropy increases, uncertainty increases. Then any given state of information will reduce a greater amount of uncertainty. Hence, information is greater in the system when entropy is higher.

    Simple example: suppose you have one marble and two holes in which you can put the marble. You have one bit of information. Then suppose you increase the entropy by adding two more holes. You still have only one marble in one hole, but you now have two bits of information. Both entropy and information have increased.
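
    A one-line version of that counting in Python, assuming the usual log2 measure of information:

    ```python
    import math

    # Bits needed to specify which hole holds the single marble.
    def bits(holes: int) -> float:
        return math.log2(holes)

    print(bits(2))  # 2 holes -> 1.0 bit
    print(bits(4))  # 4 holes -> 2.0 bits
    ```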

 
