Entropy and thermodynamics are often cited as a reason why diamondoid
mechanosynthesis can't work. Supposedly, the perfection of
the designs violates a law of physics that says things always
have to be imperfect and cannot be improved.
It has always been obvious to me why this argument was wrong.
The argument would be true for a closed system, but nanomachines
always have an energy source and a heat sink. With an external
source of energy available for their use, they can certainly
build near-perfect structures without violating thermodynamics.
This is clear enough that I've always assumed that people
invoking entropy were either too ignorant to be credible critics,
or were arguing in bad faith.
It appears I was wrong. Not about the entropy, but about the
people. Consider John A. N. (JAN) Lee. He's a professor of
computer science at Virginia Tech, has been vice president
of the Association for Computing Machinery, has written a
book on computer history, etcetera. He's obviously intelligent
and well-informed. And yet, he makes the same mistake about
entropy--not in relation to nanotech, but in relation to Babbage,
who designed the first modern computer in the early 1800s.
In Lee's online history of Babbage, he asserts, "the
limitations of Newtonian physics might have prevented Babbage
from completing any Analytical Engine." He points out
that Newtonian mechanics has an assumption of reversibility,
and it wasn't until decades later that the Second Law of Thermodynamics
was discovered and entropy was formalized. Thus, Babbage was
working with an incomplete understanding of physics.
Lee writes, "In Babbage's design for the Analytical Engine,
the discrete functions of mill (in which 'all operations are
performed') and store (in which all numbers are originally
placed, and, once computed, are returned) rely on this supposition
of reversibility." But, says Lee, "information cannot
be shuttled between mill and store without leaking, like faulty
sacks of flour. Babbage did not consider this, and it was
perhaps his greatest obstacle to building the engine."
Translated into modern computer terms, Lee's statement reads,
"Information cannot be shuttled between CPU and RAM without
leaking, like faulty sacks of flour." The fact that my
computer works as well as it does shows that there's something
wrong with this argument.
In a modern computer, the signals are digital; each one is
encoded as a voltage in a wire, above or below a certain threshold.
Transistors act as switches, sensing the incoming voltage
level and generating new voltage signals. Each transistor
is designed to produce either high or low voltages. By the
time the signal arrives at its destination, it has indeed
"leaked" a little bit; it can't be exactly the same
voltage. But it'll still be comfortably within the "high"
or "low" range, and the next transistor will be
able to detect the digital signal without error.
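This restoration step can be sketched in a few lines of Python. The voltage levels, threshold, and noise magnitude below are illustrative assumptions, not real circuit parameters:

```python
import random

V_HIGH, V_LOW = 3.0, 0.0     # regenerated output levels (illustrative)
THRESHOLD = 1.5              # midpoint decision threshold
NOISE = 0.4                  # per-hop "leakage": random analog drift, in volts

def buffer_stage(v_in: float) -> float:
    """Sense the incoming voltage and regenerate a clean digital level."""
    return V_HIGH if v_in > THRESHOLD else V_LOW

def transmit(v: float) -> float:
    """Model the slight analog degradation a signal picks up on a wire."""
    return v + random.uniform(-NOISE, NOISE)

# Send a logical 1 through 1000 noisy wire segments, restoring at each stage.
v = V_HIGH
for _ in range(1000):
    v = buffer_stage(transmit(v))

assert v == V_HIGH  # the bit survives: each stage re-digitizes the signal
```

The key design property is margin: as long as the accumulated drift per hop stays smaller than the distance to the threshold, each stage recovers the exact digital value, so errors never accumulate.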
This does not violate thermodynamics, because a little energy
must be spent to compensate for the uncertainty in the input
signal. In today's designs, this is a small fraction of the
total energy required by the computer. I'm not even sure that
engineers have to take it into account in their calculations,
though as computers shrink further it will become important.
In Babbage's machine, information would move from place to
place by one mechanism pushing on another. Now, it's true
that entropy implies a slightly degraded signal--meaning
that no matter how precisely the machinery was made, the position
of the mechanism must be slightly imprecise. But a fleck of
dust in a bearing would degrade the signal a lot more. In
other words, it didn't matter whether Babbage took entropy
into account or even knew about it, as long as his design
could tolerate flecks of dust.
Like a modern computer, Babbage's machine was designed to
be digital. The rods and rotors would have distinct positions
corresponding to encoded numbers. Mechanical devices such
as detents would correct signals that were slightly out of
position. In the process of correcting the system, a little
bit of energy would be dissipated through friction. This friction
would require external energy to overcome, thus preserving
the Second Law of Thermodynamics. But by including mechanisms
that continually corrected the tiny errors in position caused
by fundamental uncertainty (along with the much larger errors
caused by dust and wear), Babbage's design would never lose
the important, digitally coded information. And, as in modern
computers, the entropy-related friction would have been vastly
smaller than friction from other sources.
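The same restoring principle can be modeled in a short sketch. The ten-position digit wheel and the amount of slop below are my own illustrative assumptions, not Babbage's actual tolerances:

```python
import random

POSITIONS = 10            # a digit wheel has ten discrete positions
STEP = 360.0 / POSITIONS  # degrees between adjacent digit positions

def detent(angle: float) -> float:
    """Snap a slightly misaligned wheel to the nearest digit position,
    dissipating a little energy as friction in the process."""
    return round(angle / STEP) % POSITIONS * STEP

def jostle(angle: float) -> float:
    """Positional error from slop, dust, and wear -- assumed to stay
    well under half a step (18 degrees here)."""
    return angle + random.uniform(-10.0, 10.0)

# Carry the digit 7 through 1000 noisy mechanical transfers.
angle = 7 * STEP
for _ in range(1000):
    angle = detent(jostle(angle))

assert angle == 7 * STEP  # the encoded digit never degrades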
Was Babbage's design faulty because he didn't take entropy
into account? No, it was not. Mechanical calculating machines
already existed, and worked reliably. Babbage was an engineer;
he used designs that worked. There was nothing very revolutionary
in the mechanics of his design. He didn't have to know about
atoms or quantum mechanics or entropy to know that one gear
can push another gear, that there will be some slop in the
action, that a detent can restore the signal, and that all
this requires energy to overcome friction. Likewise, the fact
that nanomachines cannot be 100% perfect 100% of the time
is no more significant than the quantum-mechanical possibility
that part of your brain will suddenly teleport itself elsewhere,
killing you instantly.
Should Lee have known that entropy was not a significant factor
in Babbage's designs, nor any kind of limitation in their
effectiveness? I would have expected him to realize that any
digital design with a power supply can beat entropy by continually
correcting the information. After all, this is fundamental
to the workings of electronic computers. But it seems Lee
didn't extend this principle from electronic to mechanical computers.
The point of this essay is not to criticize Lee. There's no
shame in a scientist being wrong. Rather, the point is that
it's surprisingly easy for scientists to be wrong, even in
their own field. If a computer scientist can be wrong about
the effects of entropy on an unfamiliar type of computer,
perhaps we shouldn't be too quick to blame chemists when they
are likewise wrong about the effects of entropy on nanoscale
machinery. If a computer scientist can misunderstand Babbage's
design after almost two centuries, we shouldn't be too hard
on scientists who misunderstand the relatively new field of molecular manufacturing.
But by the same token, we must realize that chemists and physicists
talking about molecular manufacturing are even more unreliable
than computer scientists talking about Babbage. Despite the
fact that Lee knows about entropy and Babbage did not, Babbage's
engineering was more reliable than Lee's science. How true
it is that "A little learning is a dangerous thing!"
There are several constructive ways to address this problem.
One is to continue working to educate scientists about how
physics applies to nanoscale systems and molecular manufacturing.
Another is to educate policymakers and the public about the
limitations of scientific practice and the fundamental difference
between science and engineering. CRN will continue to pursue
both of these courses.