☞ Determining the Properties of Your Computational Universe
HAKMEM and the Simulation Hypothesis
In 1972, several members of the Artificial Intelligence Laboratory at MIT compiled the various techniques, algorithms, and tricks they had been using on the large computers they worked with: how to draw fractals, how to solve mathematical puzzles, and how to play certain games. The result is a memo known as HAKMEM (pdf available here). It’s a veritable bestiary of methodologies, a sort of computational grimoire, which leans into its cryptic nature and vagueness.
And buried within this memo is item 154:
Bill Gosper, who also worked on the Game of Life and is known for discovering the first glider gun in that game, sets out to debunk the notion that “any given programming language is machine independent.” Specifically, if you keep adding together more and more powers of two, what eventually happens reveals the properties of the specific machine you are running on.
The reason that summing powers of two is special—or at least informative—is the way binary works. If you add up consecutive powers of two, starting with 1, you end up with a number whose binary representation is a string of 1’s. For example, 1 + 2 + 4 + 8 + 16 + 32 + 64 = 127, which is 1111111 in binary.
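To make that concrete, here is a minimal Python sketch (Python’s integers are arbitrary-precision, so nothing overflows here; this only illustrates the pattern of the running sum):

```python
# Sum consecutive powers of two and watch the binary form fill up with 1's.
total = 0
for k in range(7):              # powers 2^0 through 2^6
    total += 2 ** k
    print(f"{total:3d} = {total:b}")
# The final line printed is: 127 = 1111111
```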
Because no machine can handle an arbitrarily large number, if you keep on adding powers of two, something will eventually give, and the details of what the sum ends up looking like at that point allow you to determine how the machine stores numbers and how it works.
For example, imagine our theoretical machine can only hold numbers with eight bits. So if you now add 128—or 10000000 in binary—to our 127, one possibility is that the sum turns negative, and depending on exactly what it turns into, this could indicate that numbers are being stored according to something known as two’s complement. In particular, if you add 128 to 127 on a two’s-complement machine, the result is the number -1, as per the chart on Wikipedia.
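Here is a rough sketch of that wraparound, again in Python. Since Python’s own integers never overflow, the eight-bit register is simulated with a small helper (the name to_signed is mine, not anything from HAKMEM) that masks the sum to eight bits and treats the high bit as a sign:

```python
BITS = 8  # the word size of our imagined machine

def to_signed(value: int, bits: int = BITS) -> int:
    """Interpret the low `bits` bits of `value` as a two's-complement integer."""
    value &= (1 << bits) - 1           # keep only the low eight bits
    if value >= 1 << (bits - 1):       # high bit set means negative
        value -= 1 << bits
    return value

total = 0
for k in range(BITS):                  # add 1, 2, 4, ..., 128
    total = to_signed(total + 2 ** k)
    print(f"after adding {2 ** k:3d}: {total:4d}")
# The sum climbs to 127, and adding 128 wraps it around to -1.
```

A machine that stored its numbers differently would land on a different value at this point, which is exactly the kind of machine dependence the memo is poking at.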
But item 154 concludes in a surprising direction: “By this strategy, consider the universe, or, more precisely, algebra.” Gosper, using a simple joke calculation, arrives at the result that “algebra is run on a machine (the universe) which is twos-complement.”
To be clear, Gosper’s mathematics is spurious and presumably done in jest, but it seems he was taking a quick and playful stab at using math to determine something about the very nature of the numerical engine of the cosmos.
What is Gosper actually doing here? He’s saying that if you take an arbitrarily large number (as implied by the ellipses) that is the sum of the powers of two, it will consist of a string of 1’s in binary. And if you add that number to itself, every bit shifts left by one, so you get a string of 1’s ending in a zero, which, read digit for digit, is the original number minus one. So, apparently, this arbitrarily large number added to itself is equal to the original number minus one. If this is true, then some simple algebra forces the original number to be -1, and writing -1 as a string of all 1’s is precisely two’s complement.
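Spelled out, the (deliberately dubious) algebra runs something like this: call the endless sum x, and notice that doubling it reproduces the same sum with its first term, 1, removed:

$$
x = 1 + 2 + 4 + 8 + \cdots = \cdots 1111_2, \qquad
2x = 2 + 4 + 8 + 16 + \cdots = \cdots 1110_2 = x - 1,
$$

so 2x = x - 1, and therefore x = -1.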
This is not actually how numbers work, of course—doubling a number in reality produces a binary representation that is one bit longer than the original, not the same length—so this all breaks down. But it is nonetheless intriguing that Gosper thought of using some mathematical prestidigitation to (inaccurately) imply something about the properties of our universe.
While this approach is playful and tongue in cheek, it turns out that some scientists have actually tried to do this kind of thing more seriously: there are a number of scientific papers devoted to determining the properties of our simulation.
In the end, though, particularly if the cosmos is a machine, we must still remember that all computing is a deeply physical process: the world’s messiness—its physics, for better or worse—impinges on our computers. As much as we think of computing as disembodied symbol manipulation, it is far from that. And if this is true, we might be able to discern the nature of this cosmic machine in the smallest details. ■
Thanks to Will Byrd for feedback on early versions of this essay.
The Enchanted Systems Roundup
Here are some links worth checking out that touch on the complex systems of our world (both built and natural):
🜸 We Rarely Lose Technology: “loss of technology is not impossible. But to an innovative and large culture like modern human civilization, it’s not really something that happens. It’s just a fun trope for stories. Let’s hope it remains that way.”
🝳 Paul Ford’s Father’s Death in 7 Gigabytes: “Death is a lossy process, but something always remains.”
🝖 Miniature scale computer models
🜹 Long Science: “This suggests to me that there may be significant latent experimental supply – and scientific progress – that our society effectively leaves on the table. We could be productively expanding long-term, small-scale scientific exploration to inexpensively generate valuable knowledge.”
🝊 How NASA Writes Space-Proof Code: “In 2006, Gerard Holzmann of the NASA/JPL Laboratory for Reliable Software wrote a paper called The Power of 10: Rules for Developing Safety-Critical Code. The rules focus on testability, readability, and predictability”
🜸 I used to be somebody: “an old Macintosh Classic 2 standing outside the Apple Store. Equipped with a face-tracking webcam in the floppy drive it's desperately looking at people passing by, asking for spare change to survive.”
🜚 Text for Proofing Fonts: “The far more pernicious issue with pangrams, as a means for evaluating typefaces, is how poorly they portray what text actually looks like.”
And here’s a conversation I was part of related to computational tools for thinking.
Until next time.