The Edge is publishing a talk with Seth Lloyd, "Quantum Monkeys". Lloyd is Professor of Mechanical Engineering at MIT and a principal investigator at the Research Laboratory of Electronics, and invented "the first method for physically constructing a computer in which every quantum — every atom, electron, and photon — inside a system stores and processes information".
But according to Lloyd's theory, every atom, every particle, already stores and processes information merely by existing. So what he's saying is, he invented... a rock?
Lloyd talks about quantum mechanics, monkeys, sexual reproduction, mutation, software, and just about everything else he can think of. He tries to pull all these disparate ideas together under the umbrella of "information":
The notion that the universe is, at bottom, processing information sounds like some radical idea. In fact, it's an old discovery, dating back to Maxwell, Boltzmann and Gibbs, the physicists who developed statistical mechanics from 1860 to 1900. They showed that, in fact, the universe is fundamentally about information. They, of course, called this information entropy, but if you look at their scientific discoveries through the lens of twentieth century technology, what in fact they discovered was that entropy is the number of bits of information registered by atoms. So in fact, it's scientifically uncontroversial that the universe at bottom is processing information. My claim is that this intrinsic ability of the universe to register and process information is actually responsible for all the subsequent information processing revolutions.
[emphasis added]
Oh deary deary me. Lloyd may be a professor of engineering, but he's in way over his head here and needs a refresher course on thermodynamics and information theory. Entropy is information? Maybe in Bizarro World, but not in this universe.
If information were entropy, then as a system becomes more and more disordered, more and more uncertain, it would contain more information. That's a bizarre and unhelpful concept. Imagine: you receive a noisy signal, lots of static, high in entropy, with enormous uncertainty. According to Lloyd, you have more information -- more certainty about the state of the signal -- the more uncertain you are.
No. No, I don't think so.
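To make the point concrete, here's a minimal sketch using the standard Shannon entropy formula; the two distributions are invented purely for illustration. The static-filled signal has the higher entropy, i.e. the greater uncertainty. If entropy simply equaled information, the noisier signal would be the more "informative" one, which is exactly the absurdity above.

```python
# A minimal sketch (illustrative distributions, not from Lloyd's talk):
# Shannon entropy in bits measures uncertainty, and the noisier signal has more of it.
from math import log2

def shannon_entropy(probs):
    """Entropy in bits of a discrete probability distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A clean signal: the receiver is nearly certain which of four symbols was sent.
clean = [0.97, 0.01, 0.01, 0.01]

# A noisy signal: heavy static, all four symbols about equally likely.
noisy = [0.25, 0.25, 0.25, 0.25]

print(f"clean signal entropy: {shannon_entropy(clean):.3f} bits")  # ~0.242 bits
print(f"noisy signal entropy: {shannon_entropy(noisy):.3f} bits")  # 2.000 bits
```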
From Dr. Thomas D. Schneider of the Molecular Information Theory Group at the National Institutes of Health:
The confusion comes from neglecting to do a subtraction: Information is always a measure of the decrease of uncertainty at a receiver (or molecular machine).
[emphasis in original]
From the Biological Information Theory FAQ:
If someone says that information = uncertainty = entropy, then they are confused, or something was not stated that should have been. Those equalities lead to a contradiction, since entropy of a system increases as the system becomes more disordered. So information corresponds to disorder according to this confusion.
Remember, class: information = certainty. Entropy = uncertainty. They are not the same thing; they are opposites.
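Schneider's "subtraction" is easy to make concrete. Here's a minimal sketch (the numbers are invented for illustration): the information a receiver gains is the decrease in its uncertainty, the entropy before minus the entropy after.

```python
# A minimal sketch of information as a *decrease* of uncertainty:
# info = H_before - H_after. Numbers are illustrative only.
from math import log2

def shannon_entropy(probs):
    """Entropy in bits of a discrete probability distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Before the message: eight possible symbols, all equally likely -> 3 bits of uncertainty.
h_before = shannon_entropy([1/8] * 8)

# After the message: narrowed down to two equally likely symbols -> 1 bit of uncertainty left.
h_after = shannon_entropy([1/2] * 2)

info = h_before - h_after
print(f"uncertainty before: {h_before:.1f} bits")  # 3.0
print(f"uncertainty after:  {h_after:.1f} bits")   # 1.0
print(f"information gained: {info:.1f} bits")      # 2.0
```

The receiver ends up with less entropy than it started with; the 2 bits of information are that reduction, not the entropy itself.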