Based on history alone, we can be certain that our civilization will collapse someday: civilizations simply don’t last forever. In fact, they rarely last very long at all. A great many things can bring down a civilization.
One of these is a declining EROEI (sometimes called EROI).
This stands for Energy Returned On Energy Invested, or Energy Returned on Investment. It refers to how much energy it takes to acquire more usable energy. It’s expressed as a ratio. An EROEI of 1:1 is break-even — it costs one unit of energy to acquire one unit of energy.
For instance, if it costs one bushel of wheat (baked into bread) to feed enough people to farm and produce one bushel of wheat, we’re at 1:1 or break-even: we’ll use up all our food raising more food, with nothing left over. If we can produce two bushels of wheat and eat only one, we have an EROEI of 2:1, and we can sell the other bushel for profit, or pay it as a tax to maintain an army to protect our fields, or use it to invent donuts and waffles. If it costs us more than a bushel to raise another bushel, we’re on our way toward starvation.
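The wheat arithmetic above can be sketched in a few lines (a toy model with made-up numbers, just to make the ratio concrete):

```python
# EROEI as a simple ratio of energy returned to energy invested.
def eroei(energy_returned, energy_invested):
    return energy_returned / energy_invested

# Two bushels harvested for every one consumed: 2:1, with one bushel of surplus.
surplus = 2.0 - 1.0
print(eroei(2.0, 1.0))  # 2.0, i.e. "2:1"
print(eroei(1.0, 1.0))  # 1.0, break-even: nothing left over
```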
Any civilization requires a certain minimum EROEI to survive as a civilization. Below that magic number, there isn’t enough energy left over after bare survival to engage in any of the trappings of civilization. Civilization enters a downward spiral that ends when we become small tribes herding goats in the shadows of the skyscrapers built by the ancients.
Now, if you’ve already guessed that this minimum EROEI is a very fuzzy number, and that precise calculation of any EROEI is impossible, you’d be right. If we figure out a clever way to farm wheat more efficiently, the EROEI of wheat goes up. If the soil starts to degrade from our farming practices, the EROEI of wheat goes down. If locusts find our fields and make them the annual site of their spring-break kegger, the EROEI of wheat goes down. So the exact EROEI of wheat depends on many factors: quality of the soil, rainfall, pests, skill of the farmer.
It also depends on honesty, which is in short supply in energy discussions. We have centuries of tradition within the business world of pocketing profits and passing costs onto everyone else. To be meaningful, EROEI has to account for the costs — all the costs. Like the cost of cleaning up the mess left by farming the wheat: dumping the chaff in your neighbor’s field and saying it costs nothing to dispose of it is dishonest, and artificially boosts the EROEI of wheat. Critics of coal and nuclear energy, for instance, say that the industries claim much higher EROEI than is merited by ignoring or minimizing cleanup costs. Proponents of these industries hotly deny this, and so the EROEI discussion consists of wildly varying numbers and a lot of name-calling.
Fortunately for our purposes in this blog, we need only very rough numbers.
The sweet crude domestic oil pumped from US wells in the 1930s is the “gold standard” that everyone seems to agree on, and this had an EROEI of roughly 100:1 — in simple terms, it cost one gallon of gasoline in 1930 to produce one hundred gallons of gasoline. That left us ninety-nine gallons of gasoline to do anything we liked, such as fly airplanes, drive cars, harvest potatoes, make movies, build computers, fight wars….
Here’s a chart showing one comparison of different energy sources.
Don’t take the numbers too seriously — remember how hard it is to compute an accurate EROEI. Instead, look at the shapes, and positions, and movements.
Blue is from 1930. Purple is from 1970. Red is from 2005.
As technology in extracting and producing a particular form of energy improves, these bubbles move upward toward higher EROEI, and as we build more plants and produce more energy, they move to the right. So wind, way over on the left side, is currently around 20:1 on EROEI, and — being brand new — produces very little energy. As a new technology with growing acceptance, it will certainly move up and to the right. How far it will move on either axis before it reaches its limits is anyone’s guess.
Oil, on the other hand, shows what happens to finite resources as we use them up. Initially, they move up and to the right, just like any new technology. For oil, this phase happened in the late 1800s. Then the bubbles start to move down and to the right as the resource becomes harder to extract, but production increases. As production peaks, they continue to move down in EROEI, but start to move to the left, and eventually vanish. US oil production peaked in the 1970s, and is on its way out. Global oil production peaked in the early 2000s, so — unless we want to fight escalating wars to take the oil away from other paying customers by force — the bubble for imported oil is also headed down and to the left.
The other major thing to notice is where the US sits in terms of total energy usage, way out there on the right. In 1930, the US was at about the 20 mark, and in 1890, it was well under 10. We’re now using more energy every year than all the plant life on the earth creates in a year through photosynthesis. We are doing this by burning coal, oil, and natural gas on a phenomenal scale, and no one fuel source can come close to meeting this level of usage — we need them all, and then some. As each bubble drops and slips to the left, another bubble needs to move up and to the right to replace it — otherwise, the US usage bubble will be dragged to the left by energy shortages, and down toward the line of uncivilization as we lose the ability to use energy for anything other than producing more energy.
There is plenty of discussion taking place regarding how to move the US toward the left of this chart in an orderly way (e.g. insulating your home, using fluorescent light bulbs, driving less), as well as what will happen when we don’t do this (we won’t) and get forced to the left by energy shortages. There’s also plenty of discussion of what happens as the bubbles drift down toward the line of uncivilization.
I don’t really have a lot to add to that discussion.
What I’d like to do instead is explore how something like cold fusion might change this entire outlook.
WARNING: Everything about cold fusion is still speculative — including whether it is even real. So what follows is informed speculation, and nothing more.
Second warning: This is going to get technical.
First, let’s do some basic physics.
There’s a thing called the “nuclear binding energy” that comes from Einstein’s famous equation, E = mc2. If we total up the individual masses of the protons and neutrons in the nucleus of an atom, then weigh the nucleus, we come up short — the total mass is less than the sum of the parts. The difference is the mass that serves as binding energy that holds the nucleus together. So if we add back all the missing mass in the form of energy (a good hard kick with a high-velocity particle) it turns out that this is just exactly enough energy to blow the nucleus to smithereens and create individual particles.
If we then put all those smithereens back together in a different arrangement, mass will vanish again, but a different amount. If even more mass vanishes in this rearrangement, meaning that the particles have all found a more stable arrangement, the difference will show up as free energy, typically in the form of a gamma ray photon and some recoil momentum in the nucleus, both of which ultimately show up as heat.
This makes it surprisingly easy to compute the energy released by a nuclear reaction. It’s exactly equal to the mass lost in the reaction.
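As a concrete warm-up (a standard textbook example, separate from the Rossi story), the helium-4 nucleus weighs measurably less than its parts, and the missing mass is its binding energy:

```python
# Mass defect and binding energy of helium-4, from standard atomic masses.
# Atomic (not bare-nucleus) masses are used here; the electron masses cancel.
AMU_TO_MEV = 931.494      # energy equivalent of 1 atomic mass unit, in MeV

m_H1  = 1.007825          # hydrogen-1, u
m_n   = 1.008665          # free neutron, u
m_He4 = 4.002602          # helium-4, u

mass_defect = 2 * m_H1 + 2 * m_n - m_He4   # ~0.0304 u goes "missing"
binding_energy_mev = mass_defect * AMU_TO_MEV
print(round(binding_energy_mev, 1))        # ~28.3 MeV holds the nucleus together
```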
The Rossi cold fusion reaction might look something like this:

⁶²Ni + ¹H → ⁶³Cu
Masses (in atomic mass units) look like this:

⁶²Ni = 61.928345
¹H = 1.007825
⁶³Cu = 62.929597
The energy released is simply the mass of (Ni + H) minus the mass of (Cu), which leaves 0.006573 atomic mass units. This mass vanishes, and reappears as energy. Since one amu is 1.492417 × 10⁻¹⁰ Joules, this reaction will release 9.810 × 10⁻¹³ Joules (or 6.123 MeV). Not very much energy, but this is for only one atom of copper produced. A nickel — the coin — weighs five grams, and if it were made entirely of pure nickel — the metal — with a molar mass of 58.69 g/mol, it would have 5 ÷ 58.69 × 6.022 × 10²³ ≈ 5.13 × 10²² atoms.
If we were to convert all of the nickel in a five-cent coin into copper, it would release 50.3 × 10⁹ Joules of energy, or 48 MMBtu, or — released slowly as pure electricity — 14 MW-hours. Since it will actually be released slowly as heat, an ideal heat engine running at 1000° C would be about 78% efficient, so this would be about 37 MMBtu, or 11 MW-hours. A typical US suburban home uses 90 MMBtu in a year, so three nickels would power a house for over a year.
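Here is the coin arithmetic in one place, using the numbers from the text (the 0.006573 u mass loss per reaction, and 58.69 g/mol for nickel):

```python
AVOGADRO    = 6.022e23      # atoms per mole
AMU_TO_J    = 1.492417e-10  # joules per atomic mass unit of energy
J_PER_MMBTU = 1.0551e9      # joules per million Btu
J_PER_MWH   = 3.6e9         # joules per megawatt-hour

e_per_atom = 0.006573 * AMU_TO_J   # ~9.81e-13 J released per reaction
atoms = 5.0 / 58.69 * AVOGADRO     # atoms in a 5 g coin of pure nickel

total_j = atoms * e_per_atom
print(total_j)                   # ~5.0e10 J per coin
print(total_j / J_PER_MMBTU)     # ~48 MMBtu
print(total_j / J_PER_MWH)       # ~14 MWh
```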
The United States currently uses about 97 quadrillion (million-billion) Btu per year — let’s call it 100 quadrillion — so to switch our energy economy totally to nickel-hydrogen fusion would require about

100 × 10¹⁵ Btu ÷ 48 × 10⁶ Btu per nickel ≈ 2 × 10⁹ nickels
that is, about two billion nickels, or ten million kg, or 10,000 metric tons of raw nickel per year, all of which would be converted to copper to supply our energy needs. How much nickel is this? As a comparison, the worldwide production of raw nickel in 2011 was 1,800,000 metric tons. So about a half-percent of worldwide annual nickel production — as it exists right now — would theoretically power the US for a year. Even with our profligate and excessive energy use.
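The same back-of-the-envelope, scaled up to the whole US (using the rounded 100 quadrillion Btu figure and the roughly 48 MMBtu of gross heat per coin):

```python
US_BTU_PER_YEAR = 100e15     # rounded US annual energy use, Btu
BTU_PER_COIN    = 48e6       # gross heat from the nickel in one 5 g coin

coins  = US_BTU_PER_YEAR / BTU_PER_COIN   # ~2.1 billion nickels
tonnes = coins * 5.0 / 1e6                # 5 g per coin, 1e6 g per metric ton
world_production_2011 = 1.8e6             # metric tons of nickel mined worldwide

print(coins / 1e9)                            # ~2 billion coins
print(tonnes)                                 # ~10,000 metric tons
print(100 * tonnes / world_production_2011)   # ~0.6% of world production
```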
Now the piece left out of this calculation is how much energy it takes to make this reaction happen. Because it isn’t free: you have to put energy in to get anything out. If, for example, we have to put 1,999,999,999 of the two billion nickels into the reactor just to get the reaction to work, the net energy released is a nickel’s worth, and that powers one house for five months. And it used up 10,000 metric tons of raw nickel to do it. That’s not very useful.
The story we’re hearing from the people doing this, however, is that output is substantially higher than input: some claim as much as six to one, that is, for every six nickels we burn, we get five back, making it over eighty percent efficient. Let’s assume that this is hype and that it’s only ten percent efficient: that is, if we burn ten nickels, we only get to keep one. The US would then have to burn a full five percent of the worldwide annual nickel production every year to maintain its standard of extravagance.
What this means is that total energy production from this reaction could place a bubble completely off the above chart, to the right. Like any non-renewable resource, we’ll eventually run out of nickel, but no one is going to consider that distant event a drawback in the present. People don’t behave that way.
What about EROEI? That’s a little trickier. But we could get a very rough idea by looking at coal. Like nickel, coal is mined, and coal is also “burned” to produce heat energy. To within a factor of two or three, EI should be about the same as coal: in both cases we have to mine it, purify it, ship it, build burners and boilers, attach the boilers to turbines, etc. We can compare the EO by simply estimating overall Btu/ton. Bituminous coal carries about 25 × 10⁶ Btu/metric ton. The Rossi reaction (at 10% efficiency) yields about 10 × 10¹⁵ Btu for 10,000 metric tons, or 1 × 10¹² Btu/metric ton. So the Rossi reaction has an EROEI somewhere around 40,000 times that of coal. In other words, the EROEI of Ni-H fusion is so far off the chart to the top that you can’t even see anything else.
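The per-ton comparison with coal works out like this:

```python
coal_btu_per_tonne  = 25e6             # bituminous coal, Btu per metric ton
rossi_btu_per_tonne = 10e15 / 10_000   # 10 x 10^15 Btu from 10,000 t, at 10% yield

ratio = rossi_btu_per_tonne / coal_btu_per_tonne
print(ratio)    # ~40,000: heat per ton mined, Rossi reaction vs. coal
```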
In short — Ni-H fusion is the magic energy bullet to end all magic energy bullets. And it may not even be the best cold fusion process out there.
So let’s talk about drawbacks — this is “nuclear energy” after all.
Unlike traditional nuclear fission plants (or worse, the experimental hot fusion processes) the Rossi reaction seems to require only low technology. Nickel is plentiful, and is easily mined and refined using pre-industrial technology. Hydrogen is plentiful and easy to produce in the quantities needed using pre-industrial technology. The reactor cells do not appear to be a lot more complicated than a pressure cooker. So entirely unlike traditional “nuclear power,” this technology looks like it can be scaled down to ridiculous extremes — perhaps not quite to the level of a “nuclear battery” that would fit in a cell phone, but they are already talking about hot-water-heater sized Rossi reactor units for powering individual homes.
Unlike traditional nuclear fission plants, there is zero risk of a “runaway reaction” or a nuclear meltdown. Fission plants run on a neutron chain reaction: when a neutron strikes a nucleus of the fissile material in the fuel rods, that nucleus splits and releases two or three new neutrons. Those neutrons spread out and split more nuclei, and with every new generation of neutrons, the reactor gets hotter. The natural tendency of a fission reactor is to melt down in a fiery cataclysm, and most of the technology in the plant is there to prevent this from happening. If the technology fails, the reactor pursues its nature and bores a hole to China.
With the Rossi reaction, if anything fails — say the housing cracks and the hydrogen leaks out — it simply stops working. The nuclear fire goes out. The hydrogen is flammable, and the reaction cell is a kind of pressure cooker, so fires and explosions are possible: the same kinds of risks that gas furnaces and oil-fired boilers pose. But they aren’t nuclear fires or explosions, and nothing radioactive gets released.
Unlike fission reactors, radiation is not a serious problem. This reaction produces only gamma rays. Gamma rays are quite harmful to biological organisms, but they are easily blocked.
Let’s look at this in a little more detail.
Gamma radiation is electromagnetic radiation, like visible light, but at very high frequency — beyond ultraviolet, and beyond X-rays. It interacts with matter in three basic ways.
One is ionization (which is why gamma rays are called “ionizing radiation,” as are X-rays) or the photoelectric effect. In this case, the gamma photon hits an electron in an atom and kicks it loose, typically ejecting it at very high speed. This usually destroys any chemical bond the atom was involved in, so this can by itself slice up DNA or other chemical compounds. The high speed electron — historically called beta radiation — can slam into other atoms and cause more damage to chemical bonds. Electrons also slow down as they pass through the complex electromagnetic fields between atoms in matter, and as they do, they spew out a cascade of electromagnetic radiation (Bremsstrahlung radiation) that moves progressively down into softer gamma rays, X-rays, ultraviolet radiation, visible light, and lower frequencies. All of this secondary radiation can kick out other electrons from other atoms.
It’s a hundred-car pileup on the interstate, complete with the shrieking of brakes (Bremsstrahlung means “braking radiation.”) Like a pileup, however, each collision gets a little slower and a little gentler, and eventually — after a few nanoseconds — the whole mess comes to rest.
While this is all very dangerous to the delicate chemistry of life, it isn’t of much concern in dense materials like lead. There’s no delicate chemistry to worry about, and all the radiation and the free electrons eventually just get absorbed and shuffled back to where they belong, making the lead a tiny bit warmer. Nothing becomes radioactive, so there is no radioactive waste. If the lead is thick enough, no radiation will make it to the other side of the shielding.
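How thick is “thick enough”? Gamma attenuation through shielding is exponential. The coefficient below is an assumed round figure for roughly 1 MeV photons in lead (real values depend strongly on photon energy); the point is how quickly the exponential wins:

```python
import math

MU_LEAD = 0.77    # 1/cm, assumed linear attenuation coefficient, ~1 MeV in lead

def fraction_transmitted(thickness_cm):
    # Beer-Lambert law for photon attenuation: I/I0 = exp(-mu * x)
    return math.exp(-MU_LEAD * thickness_cm)

print(fraction_transmitted(1.0))    # roughly half gets through 1 cm
print(fraction_transmitted(10.0))   # well under 0.1% through 10 cm
```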
The second interaction, called Compton scattering, is much like the first. The details are slightly different in a way that’s important to physicists, but practical end results are identical to the first case.
The third interaction is called “pair production,” and it’s a little dance between the gamma ray and the nucleus of the atom that forms an electron-positron pair out of the energy of the gamma ray. These take off in opposite directions like a divorced couple at someone else’s wedding reception. The positron ends up running into another electron somewhere, where they annihilate each other completely to produce two (lower-energy) gamma rays [Note: wedding reception hook-ups with antiparticles never end well.] End result is, again, exactly the same: some bouncing electrons, a cascade of decreasing-energy electromagnetic radiation, and then it all settles out.
No radioactive waste left over.
In short, this reaction is about as clean a burn as you could ever hope to find in nature.
So far, this is all elementary college physics — well, okay, third-year college physics — and there are no surprises here. There’s no magic, and the numbers are all well-known. The practical question revolves around whether the reaction takes place fast enough to be in any way useful.
Traditional thought in physics would say that the Coulomb barrier — the repulsive electrical force between the hydrogen nucleus and the nickel nucleus — is strong enough to prevent this reaction from having any significant probability of occurring without either high-energy particle collisions, or extreme heat and pressure such as at the core of a star. Lacking that, you might get a spontaneous fusion event every few million (or billion) years due to quantum tunneling, which isn’t even enough to be a laboratory curiosity. To get the probability up to observable levels would take more energy than you could ever get back out of it — the EROEI of this nuclear process itself would be far below break-even.
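For scale, here is a rough estimate of that barrier using the standard textbook formula U = Z₁Z₂ke²/r, with ke² = 1.44 MeV·fm and a touching-spheres nuclear radius (approximate numbers, but the mismatch with thermal energies is the point):

```python
KE2 = 1.44                    # Coulomb constant times e^2, in MeV*fm
Z_H, A_H   = 1, 1             # hydrogen: charge and mass number
Z_NI, A_NI = 28, 62           # nickel-62: charge and mass number

R0 = 1.2                      # fm; nuclear radius r = R0 * A^(1/3)
r_contact = R0 * (A_H ** (1/3) + A_NI ** (1/3))   # ~6 fm at contact

barrier_mev = KE2 * Z_H * Z_NI / r_contact        # ~7 MeV Coulomb barrier
thermal_ev  = 8.617e-5 * 1500                     # kT at 1500 K: ~0.13 eV

print(round(barrier_mev, 1))
print(barrier_mev * 1e6 / thermal_ev)             # barrier-to-thermal ratio: ~5e7
```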
However, the growing experimental evidence indicates that there are other mechanisms that can overcome the Coulomb barrier, and that this reaction can take place at a fair clip at far more modest pressures and temperatures — modest enough to bring the EROEI above break-even in human-habitable environments.
That still does not make this a viable power source. As described so far, this reaction is a heat source, which — apart from parlor tricks like boiling a pot of tea or warming up your shower water — needs to be turned into useful work through a heat engine. There is a hard theoretical limit to the efficiency of a heat engine of any sort. This limit is based on the absolute temperatures of the heat input and the heat outflow. A heat source needs to be able to generate a temperature differential of about 1000° C. to be useful in most industrial applications. Steam is the most common heat carrier, and it stops being steam below 100° C at atmospheric pressure. So the reactor needs to be able to reach and hold at least 1100° C.
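That hard limit is the Carnot efficiency, 1 − Tc/Th with temperatures in kelvin (see the footnote below); a quick check on these temperatures:

```python
def carnot_efficiency(t_hot_c, t_cold_c):
    # Ideal heat-engine efficiency: 1 - Tc/Th, temperatures converted to kelvin
    return 1.0 - (t_cold_c + 273.0) / (t_hot_c + 273.0)

print(round(carnot_efficiency(1000.0, 0.0), 3))   # ~0.786, the ~78% used above
print(round(carnot_efficiency(100.0, 20.0), 3))   # teapot-grade steam: ~0.214
```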
Nickel melts at 1455° C, and at first guess, a “lattice-assisted” nuclear reaction would not work at all once the nickel melts. So the question becomes, how fast does the reaction fall off as the temperature approaches the melting point? That’s a question that will require an empirical answer.
The other question is, how fast can the reactor cell generate heat? It’s one thing to boil a teapot full of water (100° C). It’s another thing to reach 1100° C. It’s yet another to bring an industrial boiler full of cold water to 1100° C. in something less than a year. This is, again, an empirical problem.
There are dozens of other technical issues that need to be figured out, ironed out, painted over, and sold in the marketplace, before cold fusion becomes a viable power source. But those are all technical problems, and people excel at solving those kinds of problems.
So I’m going to assume for discussion’s sake that cold fusion is going to work, that it will have a ludicrously high EROEI (at least several thousand-to-one), and that it will provide obscene — effectively limitless — amounts of energy. None of that may turn out to be true, but I think it’s worth thinking through what it would mean if it were true.
Clearly, this will solve all of our problems. Right?
I don’t think so. This is something I’d like to explore in future posts.
 An ideal heat engine’s efficiency is 1 – Tc/Th, where the temperatures are expressed in kelvins. So an engine running at 1000° C (Th) and exhausting into 0° C (Tc) would be 1 – 273/1273 = 78%. Any real heat engine will have even less efficiency.
 I’m afraid all this physics is from thirty years in my past, and I’m still trying to figure out if there’s a non-zero probability of gamma-induced nuclear transmutation, which could — over time — result in at least trace levels of radioactive waste. The answer seems to be “No,” but I’m still asking questions.