New Proof Dramatically Compresses Space Needed for Computation
Surprising new work bucks 50 years of assumptions about the trade-offs between computation space and time
Once upon a time, computers filled entire rooms, reading numbers from spinning tapes and churning them through wires to do chains of basic arithmetic. Today they slip into our pockets, performing in a tiny fraction of a second what used to take hours. But even as chips shrink and gain speed, theorists are flipping the question from how much computation space we can pack into a machine to how little is enough to get the job done.
This inquiry lies at the heart of computational complexity, the study of which problems can be solved and at what cost in time and space. For nearly 50 years theorists believed that if solving a problem takes t steps, it should also need roughly t bits of memory, the 0s and 1s that a machine uses to record information. (Technically, that bound was t/log(t), but log(t) is so small relative to t that the memory requirement still grows almost in step with the number of steps.) If a task involves 100 steps, for instance, you’d expect to need at least 100 bits, enough to diligently log each step. Using fewer bits was thought to require more steps, like alphabetizing your books by swapping them one by one on the shelf instead of pulling them all out and reshelving them.

But in a surprising finding described this week at the ACM Symposium on Theory of Computing in Prague, Massachusetts Institute of Technology computer scientist Ryan Williams showed that any problem solvable in time t can also be solved with only about √t bits of memory: a 100-step computation could be compressed and solved with something on the order of 10 bits. “This result shows the prior intuition is completely false,” Williams says. “I thought there must be something wrong [with the proof] because this is extremely unexpected.”
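To make the scale of that gap concrete, here is a short worked comparison of the two bounds in the article’s simplified form; the formal statements carry extra logarithmic factors, which are dropped here, and the numbers are purely illustrative:

```latex
% Old bound: a computation taking t steps needs at most about t / log t bits.
% New bound (Williams, simplified): about sqrt(t) bits suffice.
\[
  \text{space}_{\text{old}}(t) \approx \frac{t}{\log_2 t},
  \qquad
  \text{space}_{\text{new}}(t) \approx \sqrt{t}\,.
\]
% Worked instance at t = 1,000,000 steps, constant factors dropped:
\[
  \frac{10^{6}}{\log_2 10^{6}} \approx \frac{10^{6}}{19.9} \approx 50{,}000 \text{ bits},
  \qquad
  \sqrt{10^{6}} = 1{,}000 \text{ bits}.
\]
```

At a million steps, the new bound shrinks the guaranteed memory requirement by a factor of about 50, and the gap only widens as t grows.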
The breakthrough relies on a “reduction,” a means of transforming one problem into another that may seem unrelated but is mathematically equivalent. With reductions, packing a suitcase maps onto determining a monthly budget: the size of your suitcase represents your total budget, pieces of clothing correspond to potential expenses, and carefully deciding which clothes fit is like allocating your budget. Solving one problem would then directly solve the other. This idea is at the core of Williams’s result: any problem can be transformed into one that can be solved by cleverly reusing space, deftly cramming the necessary information into a number of bits close to the square root of the running time. Because the two problems are equivalent, the original problem must be solvable with the same compact container of memory.
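To make the analogy concrete, here is a minimal sketch in Python; the solver, the translation function and all the numbers are illustrative inventions for this analogy, not anything from Williams’s proof. It shows the essence of a reduction: one translation step lets a single solver answer both the suitcase question and the budget question.

```python
from itertools import combinations

def knapsack(sizes, worths, capacity):
    """Brute-force solver: pick the subset of items with the greatest
    total worth whose total size fits within the capacity."""
    best = (0, ())
    for r in range(len(sizes) + 1):
        for subset in combinations(range(len(sizes)), r):
            if sum(sizes[i] for i in subset) <= capacity:
                best = max(best, (sum(worths[i] for i in subset), subset))
    return best

def budget_to_suitcase(expenses, budget):
    """The reduction: each expense's cost plays the role of a garment's
    size, its importance plays the role of the garment's usefulness on
    the trip, and the monthly budget becomes the suitcase's capacity."""
    sizes = [cost for cost, importance in expenses]
    worths = [importance for cost, importance in expenses]
    return sizes, worths, budget

# A budgeting instance: (cost, importance) of each candidate expense,
# to be weighed against a monthly budget of 1,000.
expenses = [(400, 9), (250, 4), (600, 8), (150, 3)]

# Translating the budget question into a suitcase question and solving
# the latter directly answers the former.
worth, chosen = knapsack(*budget_to_suitcase(expenses, 1000))
print(f"total importance {worth} from expenses at indices {chosen}")
```

The translation itself does almost no work, which is the point: once two problems are shown to be mathematically equivalent, any solver, and any memory guarantee, for one transfers immediately to the other.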
“This progress is unbelievable,” says Mahdi Cheraghchi, a computer scientist at the University of Michigan. “Before this result, there were problems you could solve in a certain amount of time, but many thought you couldn’t do so with such little space.” Williams’s finding, he adds, is “a step in the right direction that we didn’t know how to take.”
While computers have continued to shrink, our theoretical understanding of their efficiency has exploded, suggesting that the real constraint is not how much memory we have but how wisely we use it.