I was fascinated to learn that the ribosomes inside all living cells rely on something very similar to blockchain proof of work when they build proteins from RNA code. They expend energy to find the right block of code to match the next codon in the RNA, so that the right amino acid is put in place to extend the growing protein chain. This is analogous to cryptocurrency mining, where energy is expended to find the sequence of bits, the nonce, that matches some criterion so the growing blockchain can be extended.
This is the first of a series of articles examining blockchains from the point of view of thermodynamics. We’ll travel over a wide range of topics, pulling together useful concepts, e.g. Maxwell’s Demon, critical phenomena in collective phase transitions, social field theories, Life the Universe and Everything, and finally end up with practical solutions to real problems. I don’t like just presenting the practical solutions without deep explanation.
It’s a long read and if you don’t know the basic concepts you will have to follow the links to external sources to get a general idea. But I will summarise key concepts along the way.
Thermodynamics & Blockchains
I started thinking about blockchains and thermodynamics after noticing that a mining rig pumps out a lot of heat, and also a series of highly ordered electrical signals. A refrigerator pumps out heat and makes ordered collections of water molecules, ice.
Which implies that …
A Bitcoin is like a lump of ice!
That’s a crazy idea!
But I remembered the advice of the Ross Mathematics Program:
Think deeply of simple things
And that relationship seemed a simple observation, which is probably very deep. So I thought about it. There is the obvious connection that it takes energy and costs money to make ice, and it takes energy and costs money to mine Bitcoin, so in a sense they’re exchangeable. But something niggled me, I felt there must be a deeper idea lurking in the background. So I thought deeply about it some more. And then some more.
Blockchains are supposed to be almost irrevocable records, so the second law of thermodynamics must be involved somewhere, because the second law accounts for the irreversibility of time at the macroscopic level (see Wikipedia on the second law). I wondered how the process of creating low entropy records in the blockchain, while at the same time making those records hard to reverse by pumping out lots of high entropy in the form of heat, could be related to making ice in a cooler.
Thermodynamics & Living Things
I remembered reading, a long time ago, a book by Erwin Schrödinger (the guy with the cat) about the thermodynamics of living things. Based on a lecture series in Dublin in 1943 and published in 1944, before people had even worked out that DNA was the basis of heredity, it’s called What Is Life? The Physical Aspect of the Living Cell. I recalled he had some important things to say about thermodynamics and record keeping in biology, so I checked my university library (I’m an alumnus, not an academic) and found, much to my surprise, that they had two copies, both only available on short loan, and a queue to borrow a copy. That implied that a very slim book, published 74 years ago, in which almost everything written has since been superseded, was still in high demand. Must be worth reading again!
After waiting a few weeks I got my copy. And it was worth reading. It has key insights into the thermodynamics of living things that turn out to be very relevant to blockchains. I’ll quote the foreword by Prof Roger Penrose:
When I was a young mathematics student in the early 1950s I did not read a great deal, but what I did read — at least if I completed the book — was usually by Erwin Schrödinger. I always found his writing to be compelling, and there was an excitement of discovery, with the prospect of gaining some genuinely new understanding about this mysterious world in which we live. None of his writings possesses more of this quality than his short classic What is Life? — which, as I now realize, must surely rank among the most influential of scientific writings in this century. It represents a powerful attempt to comprehend some of the genuine mysteries of life, made by a physicist whose own deep insights had done so much to change the way in which we understand what the world is made of. The book’s cross-disciplinary sweep was unusual for its time — yet it is written with an endearing, if perhaps disarming, modesty, at a level that makes it accessible to non-specialists and to the young who might aspire to be scientists. Indeed, many scientists who have made fundamental contributions in biology, such as J. B. S. Haldane and Francis Crick, have admitted to being strongly influenced by (although not always in complete agreement with) the broad-ranging ideas put forward here by this highly original and profoundly thoughtful physicist. Like so many works that have had a great impact on human thinking, it makes points that, once they are grasped, have a ring of almost self-evident truth; yet they are still blindly ignored by a disconcertingly large proportion of people who should know better. How often do we still hear that quantum effects can have little relevance in the study of biology, or even that we eat food in order to gain energy? This serves to emphasize the continuing relevance that Schrödinger’s What is Life? has for us today. It is amply worth rereading!
Roger Penrose, in the foreword to Schrödinger, Erwin. What is Life? (Canto Classics). Cambridge University Press.
The book “What is Life?” asks a question, how do the processes of life and in particular the molecules that hold the genetic code function in the face of onslaughts of random thermal noise? Seen from the perspective of a physicist the mechanism of inheritance has some odd features. First of all, we’re dealing with very small things, molecules and atoms, which are constantly buffeted by random motion and yet the locations of individual atoms relative to other atoms can determine the macroscopic features of an organism in a very precise and ordered way. That’s a big puzzle and it’s very different from other physical processes which are dependent on the law of large numbers (see Law of Large Numbers) to smooth out microscopic random motions.
We in the early 21st Century don’t give such puzzles much thought, we’ve learned that DNA replicates using molecular machines, and DNA using other molecular machines transcribe the code into RNA and then other molecular machines read the RNA and make proteins from it. It’s all been worked out so we don’t puzzle over it. But we should puzzle over it because the processes rely on unusual facets of basic physics, and thinking about the basic physics yields some remarkable insights.
Biology has been creating long-lived records for over three and a half billion years and living things have had to face much the same problems blockchain technologies today have to deal with and for much the same reasons. Most notably how to keep records intact in the face of constant attacks by malicious agents. So it’s worth looking at the similarities to find out how living things have dealt with these problems and perhaps learn from them.
By 1944 scientists knew that the elements of a gene were only a few atoms in size and could be disrupted by a single x-ray or gamma-ray photon. Yet somehow the information stored in these molecules was protected against being degraded by random thermal vibrations for billions of years. Some of the genes in every cell in your body have hardly changed since the last common ancestor of all living things, and we know that because all living things share those very similar genes. How is that information preserved and protected?
Obviously, Schrödinger knew that quantum theory predicts that molecules are somewhat resistant to disruptions due to thermal noise. In addition, he pointed out that the genetic information encoded in molecules is in a state of extremely low entropy, in a sense equivalent to being at absolute zero.
..the laws of physics, as we know them, are statistical laws… They have a lot to do with the natural tendency of things to go over into disorder.
But, to reconcile the high durability of the hereditary substance with its minute size, we had to evade the tendency to disorder by ‘inventing the molecule’, in fact, an unusually large molecule which has to be a masterpiece of highly differentiated order, safeguarded by the conjuring rod of quantum theory. The laws of chance are not invalidated by this ‘invention’, but their outcome is modified. The physicist is familiar with the fact that the classical laws of physics are modified by quantum theory, especially at low temperature. There are many instances of this. Life seems to be one of them, a particularly striking one. Life seems to be orderly and lawful behaviour of matter, not based exclusively on its tendency to go over from order to disorder, but based partly on existing order that is kept up.
To the physicist — but only to him — I could hope to make my view clearer by saying: The living organism seems to be a macroscopic system which in part of its behaviour approaches to that purely mechanical (as contrasted with thermodynamical) conduct to which all systems tend, as the temperature approaches the absolute zero and the molecular disorder is removed.
Schrodinger, Erwin. What is Life? (Canto Classics) (pp. 68–69). Cambridge University Press. Kindle Edition.
In short, he reasoned that although the molecules carrying genetic information are subject to random thermal motion, there is a sense in which they behave as if they’re very near to the absolute zero of temperature. They behave in a mechanical way as if they’re not buffeted by the molecular disorder in which they must exist.
He argued that the genetic information must be encoded in what he called ‘aperiodic crystals’. Crystals because they’re highly ordered and maintain that order against thermal disruption, and aperiodic because they also have to encode lots of information. In 1943 people thought that genetic information was encoded in proteins. It was only during 1944 that it was discovered that DNA was the molecule that encoded genetic information, but it wasn’t widely accepted for a few years (see Discovery of the function of DNA). Once the idea was accepted then the race was on to work out the structure. The notion of ‘aperiodic crystal’ described by Schrödinger gave Crick and Watson a sense of what they should be looking for when working out the structure of DNA.
But maintaining those ‘aperiodic crystals’ in a state equivalent to absolute zero against the constant disruption of thermal noise requires a source of order or, as he phrased it, ‘negative entropy’:
What then is that precious something contained in our food which keeps us from death? That is easily answered. Every process, event, happening — call it what you will; in a word, everything that is going on in Nature means an increase of the entropy of the part of the world where it is going on. Thus a living organism continually increases its entropy — or, as you may say, produces positive entropy — and thus tends to approach the dangerous state of maximum entropy, which is death. It can only keep aloof from it, i.e. alive, by continually drawing from its environment negative entropy — which is something very positive as we shall immediately see. What an organism feeds upon is negative entropy. Or, to put it less paradoxically, the essential thing in metabolism is that the organism succeeds in freeing itself from all the entropy it cannot help producing while alive.
Schrodinger, Erwin. What is Life? (Canto Classics) (p. 71). Cambridge University Press. Kindle Edition.
He made the point that it’s not solely the energy content of the food that we feed on, but the source of order, or negative entropy, inherent in that food. Even plants take in negative entropy in the form of sunlight, because an energy imbalance, light coming from the Sun, is a lower entropy state than perfect equilibrium. For animals eating food, what’s the point of exchanging one collection of carbon, nitrogen, oxygen, etc. atoms for another set? It’s more than the energy content; it’s the ‘negative entropy’ content in the food that we need to preserve the molecular processes of life in a state ‘as if’ at absolute zero.
We can extract a key idea here which I’ll come back to over and over again. Orderliness has a value and because it has a value living things seek it out to use it for themselves.
How does this relate to blockchains?
There are a number of ways in which the processes of life are similar to blockchain technology. We’ll start with two, (1) macroscopic at the whole species level, and (2) right down at the molecular level.
To understand them we’ll have to revise both basic biology and blockchain technology. I’m not going to include a description of the basic processes in blockchain technology. If you don’t already know about it, follow these links to get up to speed:
Bitcoin: A Peer-to-Peer Electronic Cash System, Satoshi Nakamoto
Ever wonder how Bitcoin (and other cryptocurrencies) actually work?
The core idea you need to get is that when adding a new block to the end of a blockchain, cryptocurrency miners have to perform computational work to find a sequence of bits that, combined with the bits in the block, fulfills a certain condition. Other mechanisms differ, e.g. Proof of Stake and Proof of Authority, but they must also follow the laws of thermodynamics; it’s just not as obvious as large mining farms pumping out heat. The thermodynamics of Proof of Stake is in the next article.
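The nonce search can be sketched in a few lines of Python. This is a toy illustration, not real Bitcoin mining: the `difficulty` parameter here is a hypothetical simplification that counts leading zero hex digits, whereas Bitcoin compares a double SHA-256 hash of a structured block header against a 256-bit target.

```python
import hashlib

def mine(previous_hash: str, transactions: str, difficulty: int = 4):
    """Search for a nonce such that sha256(previous_hash + transactions + nonce)
    starts with `difficulty` zero hex digits (a toy stand-in for Bitcoin's target)."""
    nonce = 0
    while True:
        candidate = f"{previous_hash}{transactions}{nonce}".encode()
        digest = hashlib.sha256(candidate).hexdigest()
        if digest.startswith("0" * difficulty):
            return nonce, digest  # work found: the chain can be extended
        nonce += 1  # expend more energy on the next candidate

nonce, digest = mine("0000deadbeef", "alice->bob:1 BTC", difficulty=4)
print(nonce, digest)
```

The only way to find a valid nonce is brute-force trial and error; the energy dissipated by all those failed hashes is exactly the ‘heat pumped out’ that the rest of this article is about.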
If you’ve got the basics of the blockchain, then we can proceed to look at the similarities to living systems. I’ll just discuss two, at the macroscopic level and at the molecular level.
(1) Macroscopic level
Firstly, at the species level, we can envisage the processes of evolution as being analogous to a blockchain. Living things produce new blocks of code, called progeny, which replicate what has gone before, but with some new modifications and additions, mutations. Each generation has to perform work to prove they are viable organisms in order to produce the next generation (block) in the chain. There is a consensus process called ‘natural selection’ which decides which blocks of code (genes) go on to be extended into the next generation. It’s a continually branching process, with invalid or simply unsuccessful branches being cut off by the consensus mechanism of natural selection. Sometimes whole new branches split off and become new species. Branching plus a mechanism for getting rid of ‘invalid’ branches is the driver of evolution, as Charles Darwin realised early in developing his ideas.
If you watch the details of a blockchain as it evolves in time, being continually extended block by block, you’ll see that it has a decidedly organic feel about it. New branches are continually being created and chopped off by the consensus mechanism. Sometimes those branches go a few ‘generations’ before being lopped off by group consensus in a network reorganisation. Sometimes the consensus mechanism breaks down and a completely new chain splits off, e.g. Bitcoin Cash splitting off from Bitcoin.
It’s a nice analogy but rather weak. It does give us a sense that there’s some kind of relationship due to the similarity in the underlying processes, but when we get down to the molecular level the similarities are much stronger.
(2) Down at the molecular level.
Protein synthesis uses Proof Of Work. To understand the similarities we’ll have to revise biology. DNA in the cell nucleus stores genetic information. It’s encoded in a long chain of molecules called nucleotides, of which there are four types (see Wikipedia on nucleotides). The code is written in groups of three nucleotides. Each triplet is called a codon, and each codon codes for one and only one amino acid in a protein. For example, the codon AAA codes for lysine, and CAA codes for glutamine. With 4 nucleotides and 3 nucleotides per codon we have 4³ = 64 possible codons; most code for 1 of only 20 amino acids, but some are punctuation codons that signal either ‘start’ or ‘stop’. A sequence of codons that codes for one protein is called a gene. The correct sequence of amino acids in a protein defines its function. Depending on the sequence it could be an enzyme to catalyse a specific chemical pathway in the cell, a structural element of the cell, or a molecular switch to regulate other reactions (see Wikipedia on the Genetic Code).
Even one amino acid in the wrong place in the protein can mean it won’t work properly. For example, sickle cell anaemia is due to one nucleotide, adenine, being replaced by thymine, which changes one codon in the DNA gene that codes for haemoglobin in your blood from GAG to GTG, which results in glutamic acid being replaced by valine at one position in haemoglobin, which changes its function (see Wikipedia on Sickle Cell disease). This illustrates two points. The first is Schrödinger’s observation that it’s very odd for a macroscopic physical process to be so dependent on tiny changes at the molecular level; normally the law of large numbers smooths things out. The second is the importance of making the translation from genes in DNA to proteins error free, and also of making copies of DNA error free.
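The codon-to-amino-acid lookup can be illustrated with a toy translation function. The table below is a tiny hand-picked subset of the standard genetic code, just enough to show the GAG to GTG swap; a real table has all 64 codons.

```python
# A tiny hand-picked subset of the standard genetic code (DNA codons -> amino acids).
CODON_TABLE = {
    "AAA": "Lys", "CAA": "Gln",
    "GAG": "Glu", "GTG": "Val",
    "TAA": "STOP",
}

def translate(gene: str) -> list:
    """Read a DNA coding sequence codon by codon, stopping at a stop codon."""
    peptide = []
    for i in range(0, len(gene) - 2, 3):
        amino = CODON_TABLE[gene[i:i + 3]]
        if amino == "STOP":
            break
        peptide.append(amino)
    return peptide

# The sickle-cell point mutation changes one codon, GAG -> GTG,
# and therefore one amino acid, Glu -> Val:
print(translate("GAGAAA"))  # ['Glu', 'Lys']
print(translate("GTGAAA"))  # ['Val', 'Lys']
```

A single flipped letter changes a single entry in the output, which is exactly why the cell’s copying and translation machinery needs such extraordinarily low error rates.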
Most of the time the DNA in a cell is in a dormant state; only a few genes are active at any one time. It’s long-term storage, like the hard drive or SSD in a computer. Then, in response to molecular messages, some genes are opened up and transcribed into a different molecule called messenger RNA, or mRNA for short (see Wikipedia on DNA transcription). The mRNA is then used as the template for making a specific protein, which may be structural, e.g. muscle, or an enzyme to catalyse a specific chemical reaction.
That glosses over a rather remarkable set of processes by which the right amino acids are matched to the right codons. All you need to know so far is that information is read out of long-term storage into temporary storage, mRNA, to be used by the ribosome. Ribosomes are like the CPUs of the cell; they’re where the code is acted on. Think of it like copying code out of persistent storage on an SSD into the temporary cache memory in the CPU.
The ribosome is a little molecular machine (or CPU). It starts reading the code in the mRNA and matches each triplet of nucleotides, a codon, with another type of RNA called transfer RNA, or tRNA. tRNA has a set of nucleotides on one side that fit the codons in the mRNA, and an amino acid on the other side that is the right one for that codon (see Wikipedia on ribosomes, the Wikipedia article on mRNA, the Wikipedia article on tRNA, and an excellent animation from the DNA Learning Center).
In the above diagram, the messenger RNA (mRNA) is being read by the ribosome. It pulls in the right transfer RNA (tRNA) to match the next codon in the sequence. The tRNA has the right amino acid attached, and once it’s in place the ribosome links the amino acid to the end of the growing peptide chain. The shapes of the nucleotides mean that they match up in pairs. Adenine (A) matches to thymine (T) in DNA or uracil (U) in RNA, and guanine (G) matches to cytosine (C). Hence in the diagram above the codon UUC in the mRNA is about to be matched with the anticodon AAG in the tRNA.
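The pairing rule is simple enough to sketch. This toy function derives the matching anticodon from a codon by applying the A–U and G–C pairing rules; real anticodons are conventionally written 3′ to 5′, an orientation detail ignored here.

```python
# Watson-Crick pairing rules for RNA: A pairs with U, G pairs with C.
RNA_PAIR = {"A": "U", "U": "A", "G": "C", "C": "G"}

def anticodon(codon: str) -> str:
    """The tRNA anticodon that pairs base-by-base with an mRNA codon."""
    return "".join(RNA_PAIR[base] for base in codon)

print(anticodon("UUC"))  # AAG, as in the diagram
```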
The process takes code stored in DNA to proteins that do the work in the cell.
The diagram and animation make it appear as if it’s a nice clean process: the correct tRNA matching UUC waiting in a queue for its turn to fit neatly into place and add the amino acid phenylalanine to the growing peptide (protein) chain. But it’s not really that clean; tRNAs do not ‘know’ it’s their turn to fit into place. The phrase “…pulls in the right transfer RNA to match the next codon in the sequence. The tRNA has the right amino acid attached…” glosses over subtle complexity. How does the right amino acid get attached to the right tRNA, and how does the right tRNA get matched up to the next codon? Proof Of Work is involved in both these processes.
The codons are matched up by the shapes of the nucleotides, but the difference in energy between the right match and the wrong match is very small. It’s not very different from the energies of the molecules that are constantly bashing into it in random thermal motion. It’s so small a difference in energy that it’s close to the absolute limit imposed by thermodynamics on the ability to differentiate a correct match from an incorrect match. Like listening for the signal ‘Yup, this one fits’ on a noisy building site, it’s hard to hear above the background random noise. Knowing that there exists a theoretical limit imposed by thermodynamics, and knowing the energy difference between good matches and bad matches together with the number of possible tRNAs available, we can calculate the theoretically best possible error rate. It’s 1 error in 10.
Which is very remarkable, because an error of 1 in 10 amino acids in a peptide is nowhere near good enough to make an enzyme with hundreds of amino acids that has to be exactly the right shape to work. Haemoglobin has 574 amino acids, so on average each haemoglobin molecule would have 57 errors. As described earlier, a single point mutation resulting in a single amino acid being out of place is responsible for sickle-cell anaemia. An error rate of 1 in 10 would mean that no useful peptides could be built. Life must have evolved a way to do better than 1 in 10 very early on.
In fact, the observed error rate when a ribosome makes a peptide from the mRNA code is about 1 in 10,000. How does it get to be so good, given that the limit imposed by physical law suggests the error rate should be about 1 in 10? Ribosomes implement two complementary methods to increase selectivity and reduce errors.
Reducing the error rate
Conformational Proofreading. The shape of the ribosome is such that it’s not a perfect fit for the correct tRNA matching the mRNA codon to be read, but it’s an even worse fit for an incorrect match. This increases the energy difference between the right match and the wrong match, favouring the right match (see Wikipedia article on Conformational proofreading).
Kinetic Proofreading. Conformational proofreading increases the probability of getting the right match, but it’s still not enough to improve selectivity from an error rate of 1 in 10 to 1 error in 10,000. Very low error rates occur in many biological processes; the error rate in DNA replication is even lower, about 1 in 10⁹, so there has to be another mechanism besides conformational proofreading to account for such good selectivity.
However, the calculation of the theoretical limit to the error rate imposed by thermodynamics assumes that the possible configurations are in equilibrium. But a cell has access to free energy so it can pump the reaction to a non-equilibrium state. It uses a molecular version of “Maxwell’s Demon” (I’ll come to Maxwell’s Demon in the next article where I’ll look at the thermodynamics of Proof of Stake). Maxwell’s Demon is an idealised mechanism for reducing entropy by making decisions.
Kinetic proofreading (or kinetic amplification) is a mechanism for error correction in biochemical reactions, proposed independently by John Hopfield (1974) and Jacques Ninio (1975). Kinetic proofreading allows enzymes to discriminate between two possible reaction pathways leading to correct or incorrect products with an accuracy higher than what one would predict based on the difference in the activation energy between these two pathways.
Increased specificity is obtained by introducing an irreversible step exiting the pathway, with reaction intermediates leading to incorrect products more likely to prematurely exit the pathway than reaction intermediates leading to the correct product. If the exit step is fast relative to the next step in the pathway, the specificity can be increased by a factor of up to the ratio between the two exit rate constants. (If the next step is fast relative to the exit step, specificity will not be increased because there will not be enough time for exit to occur.) This can be repeated more than once to increase specificity further.
Free energy inside the cell is stored in nucleoside triphosphates, ATP and GTP, which are far out of equilibrium. By coupling the selection reactions to other reactions that are far out of equilibrium, we can drive the selection out of equilibrium. Using available free energy, a cell runs a sequence of hard-to-reverse reactions, each spending energy and each increasing the specificity of the right chemical pathway. So if the specificity of each reaction is, say, 1 error in 10, then 4 such reactions give 1 error in 10⁴.
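The multiplication of specificity across chained irreversible steps can be sketched numerically. This is a back-of-the-envelope model assuming each stage discriminates independently, not a full kinetic treatment of the rate equations.

```python
def error_after_stages(per_stage_error: float, stages: int) -> float:
    """Fraction of wrong products surviving `stages` chained selection steps,
    assuming each irreversible stage discriminates independently."""
    wrong = per_stage_error ** stages
    right = (1 - per_stage_error) ** stages
    return wrong / (wrong + right)

# One pass at 1-in-10 errors vs four chained proofreading stages:
print(error_after_stages(0.1, 1))  # 0.1
print(error_after_stages(0.1, 4))  # roughly 1.5e-4, about 1 in 10^4
```

Each stage costs free energy (an ATP or GTP hydrolysis), which is the thermodynamic price of multiplying the discrimination.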
For the reader wanting to get into the technical details:
1. Wikipedia article on Kinetic proofreading
2. The original paper: Kinetic Proofreading: A New Mechanism for Reducing Errors in Biosynthetic Processes Requiring High Specificity, Hopfield 1974
3. A slightly less technical explanation: Guéron, Maurice. “Enhanced Selectivity of Enzymes by Kinetic Proofreading: The Mystery of the Cell’s Accuracy in Translating Genes to Protein Has Been Unlocked by the Discovery of a Resourceful Method of Heightening Selectivity.” American Scientist 66, no. 2 (1978): 202–08. http://www.jstor.org/stable/27848516
The Key Idea
The key concept is that ribosomes expend energy to find the correct block of code, e.g. AAG to match UUC, from a random soup of possible codes, so that the next amino acid in the growing peptide chain can be added. This is analogous to a Bitcoin miner expending energy to find the correct nonce that, combined with the previous block hash and the set of transactions in the block, fulfills some criterion, so the developing blockchain can be extended.
How it works
Using the example from Guéron, catching fish, but changing the context. Suppose there’s an art exhibition with lots of interconnected rooms, and for an entirely contrived and arbitrary reason, the organisers want to fill one of the rooms with people whose names begin with ‘B’ without them realising they’re being selected.
There are many people visiting the exhibition (assume an infinite number) and they explore the rooms at random (a Poisson process). The organisers work out that they can make the people whose names begin with ‘B’ (the Bs) hang around a bit longer in one room if they offer them a glass of wine. So when people enter the room they give their names; if the name starts with ‘B’ they get a glass of wine, otherwise a nice smile but no wine. The Bs who get a glass of wine hang around in the room for an average of 10 minutes and then leave; the non-Bs, who don’t get any wine, hang around for just 1 minute and then leave. That means the room will slowly fill up with people with names beginning with ‘B’ until an equilibrium is reached where Bs leave at the same rate as Bs enter. Non-Bs also enter and settle at an equilibrium number, but because they don’t get a glass of wine and hang around, their equilibrium number (per letter) will be lower than the Bs’. The proportion of Bs in the room will be enhanced relative to their proportion in the general population.
Which fulfills one of the requirements for kinetic proofreading
reaction intermediates leading to incorrect products more likely to prematurely exit the pathway than reaction intermediates leading to the correct product.
This is like the first stage of the matching process where ribosomes match mRNA to tRNA. The correct match, AAG to UUC, has a lower energy level than incorrect matches, so it binds more tightly than other possible matches, e.g. AGG, which means the correct match spends more time in the decoding site of the ribosome than incorrect matches. But the process isn’t totally accurate, so AGG and other near-matches will also arrive; they just don’t tend to hang around as long. At any one time, you’re more likely to find the correct match in the decoding ‘room’ of the ribosome than an incorrect match.
The next requirement is
Increased specificity is obtained by introducing an irreversible step exiting the pathway,
In the ribosome, when a tRNA fits into the matching slot, it gets tagged via an energy-releasing reaction. Because the reaction releases energy, it increases the molecular vibrations around it (it heats them up), which is an increase of entropy, which makes it hard for the reaction to go backward. The next step in the reaction chain only accepts tRNAs that have been tagged, and that increases the specificity.
In our analogy, we need to do something to drive the system out of equilibrium. In Guéron’s example of catching fish, he lifts the fish-catching kit out of the water, creating an irreversible exit pathway. In our example, the art gallery organisers put things out of equilibrium by stopping people from entering that room, perhaps with a little red rope. People can still leave, but no one can enter. We now have our irreversible exit pathway. Since the non-Bs don’t get wine and leave quickly compared to the Bs, after a few minutes we’ll have mostly Bs, with a few non-Bs left in the room. This fulfills the requirement
If the exit step is fast relative to the next step in the pathway, the specificity can be increased by a factor of up to the ratio between the two exit rate constants.
After waiting a few minutes we open a door to another room, the people in the first room, now have an extra option other than leaving the way they came in, they can move to the next step, where Bs are offered another glass of wine and non-Bs get a nice smile but no wine. By repeating this down a sequence of rooms we can get a room filled with just Bs.
In our example we have a stop-go mechanism: we let people into the first room, let an equilibrium be established, then close it to new entrants, wait a bit, open the door to the next room, let that equilibrium be established, then close that room to new entrants and reopen the first room. That’s the analogy for the two exit rate constants.
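The whole stop-go selection scheme can be simulated. This is a toy Monte Carlo sketch assuming exponentially distributed dwell times, with the wait between closing one door and opening the next doing the selecting; the stay times and wait are the illustrative numbers from the analogy, not biological values.

```python
import math
import random

def room_stage(people, b_stay=10.0, other_stay=1.0, wait=5.0):
    """One 'room': close the door, wait, and keep each visitor with probability
    exp(-wait/stay), the survival chance for an exponential dwell time."""
    survivors = []
    for name in people:
        stay = b_stay if name.startswith("B") else other_stay
        if random.random() < math.exp(-wait / stay):
            survivors.append(name)
    return survivors

random.seed(0)
crowd = ["Bob"] * 500 + ["Alice"] * 500      # 50% Bs entering the first room
for _ in range(3):                           # three chained stop-go rooms
    crowd = room_stage(crowd)
fraction_b = sum(n.startswith("B") for n in crowd) / len(crowd)
print(f"{fraction_b:.2f}")                   # close to 1.00 after a few stages
```

Each closed door is the irreversible step; chaining three of them takes a 50/50 crowd to nearly pure Bs, just as chaining proofreading steps takes a 1-in-10 discrimination to 1 in 10⁴.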
Molecular ratchet effects like this are used in biological processes which require high specificity (low error rates). They use a source of low entropy, molecules such as ATP and GTP which are far away from their equilibrium concentrations in the cell, to create even lower entropy molecules, mRNA, tRNA, DNA, and proteins. But since total entropy always increases they have to expel entropy in the form of heat and waste products.
This is analogous to a freezer and a Bitcoin mining rig. A source of order, in Schrödinger’s phrase ‘negative entropy’, is used to drive a process which produces greater order at one side, and greater disorder at the other. In a freezer and Bitcoin mining rig, the source of order is electricity, it’s out of equilibrium with its environment, and is thus low entropy. In living things the source of order can be food, or in plants, sunlight. This is used inside molecular machines which distill the order into highly ordered molecules and expel disorder as heat and waste products.
Our modern understanding of the physics of molecular biology confirms Schrödinger’s insight that to maintain highly ordered molecules, “aperiodic crystals”, living things have to drink negative entropy from the environment. This process is very similar in form to a freezer or cryptocurrency mining because it is fundamentally a thermodynamic process which distills greater order from a source of order, and expels disorder.
A key concept to get from this is that order, low entropy, has a value. In fact, in cryptocurrencies, low entropy is the value. But it’s not located purely in the bits on computers, it’s mostly lower entropy in the communities using the technology.
Where to next?
The increased entropy in cryptocurrency mining using Proof of Work is obvious: we see lots of heat coming out. But what happens in Proof of Stake systems? They’re supposedly more energy efficient. Where’s the source of order if it’s not electricity? They reach ‘finality’ for each block more quickly, which implies they’re harder to reverse, so by the Second Law of Thermodynamics they must be creating more entropy somewhere. Where does the extra entropy go? I’ll discuss that in the next article in this series.
Since biology has had at least 3.5 billion years creating molecular records then there’s probably something to learn about the solutions that have evolved to deal with malicious agents. In the next article, I’ll discuss malicious agents and we’ll be able to put the whole centralisation vs decentralisation issue in the context of thermodynamics.
Another obvious line of inquiry is that the increase of entropy is responsible for the one-way flow of time at macroscopic levels. (There is one other process, at the level of fundamental particles, which is not time-symmetric, but we can ignore that.) Thinking about the increase in entropy and the creation of records as progress in time leads quite naturally to a set of methods to increase the scaling of blockchains by extending them into block lattices. That will be discussed in the third article in this series.
Thermodynamics of Blockchain and the Thermodynamics of Life was originally published in Data Driven Investor on Medium, where people are continuing the conversation by highlighting and responding to this story.