At the time, reversible computing was widely considered impossible. A conventional digital computer is assembled from an array of logic gates — ANDs, ORs, XORs and so on — in which, generally, two inputs become one output. The input information is erased, producing heat (a consequence of what is now known as Landauer's principle), and the process cannot be reversed. With Margolus and a young Italian electrical engineer, Tommaso Toffoli, Fredkin showed that certain gates with three inputs and three outputs — what became known as Fredkin and Toffoli gates — could be arranged such that all the intermediate steps of any possible computation were preserved, allowing the process to be reversed on completion. As they set out in a seminal 1982 paper, a computer built with those gates might, theoretically at least, produce no waste heat and thus consume no energy (ref. 1).
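To make the reversibility concrete, here is a minimal Python sketch (an illustration, not code from the 1982 paper). The Toffoli gate flips its third bit only when the first two bits are both 1; the Fredkin gate swaps its last two bits only when its control bit is 1. Because every output pattern corresponds to exactly one input pattern, nothing is erased; each gate also happens to be its own inverse.

```python
from itertools import product

def toffoli(a, b, c):
    """Controlled-controlled-NOT: flip c only when a and b are both 1."""
    return a, b, c ^ (a & b)

def fredkin(c, a, b):
    """Controlled swap: exchange a and b only when the control bit c is 1."""
    return (c, b, a) if c else (c, a, b)

# Reversibility check: applying either gate twice restores the input,
# for every one of the eight possible three-bit states.
for bits in product((0, 1), repeat=3):
    assert toffoli(*toffoli(*bits)) == bits
    assert fredkin(*fredkin(*bits)) == bits
```

Because no information is discarded at any step, a computation built from such gates can, in principle, be run backwards to its starting state, which is why it need not dissipate heat.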
Initially, this seemed no more than a curiosity. Fredkin felt that the concept might help in the development of more efficient computers that wasted less heat, but there was no practical way to realize the idea fully using classical computers. In 1981, however, history took a new turn when Fredkin and Toffoli organized the Physics of Computation Symposium at MIT. Feynman was among the luminaries present. In a now-famous contribution, he suggested that, rather than trying to simulate quantum phenomena with conventional digital computers, physical systems that themselves exhibit quantum behaviour might be better tools.
This talk is widely seen as ushering in the age of quantum computers, which harness the full power of quantum mechanics to solve certain problems — such as the quantum-simulation problem that Feynman was addressing — much faster than any classical computer can. Four decades on, small quantum computers are now in development. The electronics, lasers and cooling systems needed to make them work consume a lot of power, but the quantum logical operations themselves are pretty much lossless.
Digital physics
Reversible computation “was an essential precondition, really, for being able to conceive of quantum computers”, says Seth Lloyd, a mechanical engineer at MIT who in 1993 developed what is considered the first realizable concept for a quantum computer (ref. 2). Although the IBM physicist Charles Bennett had also produced models of reversible computation, Lloyd adds, it was the zero-dissipation versions described by Fredkin, Toffoli and Margolus that became the models on which quantum computation was built.
For the cosmos to have been produced by a system of data bits at the tiny Planck scale — a scale at which present theories of physics are expected to break down — space and time must be made up of discrete, quantized entities. The effect of such a granular space-time might show up, for example, in tiny differences in how long it takes light of various frequencies to propagate across billions of light years (the sketch after this paragraph gives a rough sense of the scale). Really pinning down the idea, however, would probably require a quantum theory of gravity that establishes the relationship between the effects of Einstein’s general theory of relativity at the macro scale and quantum effects at the micro scale. Such a theory has so far eluded theorists.

Here, the digital universe might just help itself out. Favoured routes towards quantum theories of gravitation are gradually starting to look more computational in nature, says Lloyd — for example, the holographic principle introduced by ‘t Hooft, which holds that our world is a projection of a lower-dimensional reality. “It seems hopeful that these quantum digital universe ideas might be able to shed some light on some of these mysteries,” says Lloyd.
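To get a feel for how tiny those differences would be, here is a back-of-envelope Python sketch. It assumes one simple, linear-order phenomenological model (an assumption for illustration, not a model endorsed by the article) in which a photon's speed deviates from c by a fraction of order its energy divided by the Planck energy.

```python
# Back-of-envelope estimate under an assumed linear-order model in which
# a photon's speed differs from c by a fraction of order E / E_Planck.
# The model and numbers are illustrative only.

PLANCK_ENERGY_GEV = 1.22e19   # Planck energy, about 1.22 x 10^19 GeV
SECONDS_PER_YEAR = 3.156e7

def arrival_delay(photon_energy_gev, distance_light_years):
    """Rough arrival-time lag (seconds) relative to a low-energy photon."""
    travel_time_s = distance_light_years * SECONDS_PER_YEAR
    return (photon_energy_gev / PLANCK_ENERGY_GEV) * travel_time_s

# A 10-GeV gamma-ray photon travelling one billion light years
# accumulates a lag of only a few hundredths of a second:
print(f"{arrival_delay(10, 1e9):.3f} s")   # ~0.026 s
```

Even over a billion light years, the predicted lag is a few hundredths of a second, which is why such granularity, if it exists, would be so hard to detect.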
That would be just the latest twist in an unconventional story. Fredkin himself thought that his lack of a typical education in physics was, in part, what enabled him to arrive at his distinctive views on the subject. Lloyd tends to agree. “I think if he had had a more conventional education, if he’d come up through the ranks and had taken the standard physics courses and so on, maybe he would have done less interesting work.”