23 Comments
Aug 8, 2022 · Liked by Maxwell Tabarrok

"Since humans are already universal explainers and constructors, they can already transcend their parochial origins, so there can be no such thing as a superhuman mind as such. . . Artificial scientists, mathematicians and philosophers [will never] wield concepts or arguments that humans are inherently incapable of understanding."

I wonder why Deutsch believes this. It seems possible, if not probable, that there are concepts beyond our understanding.

author

I think it's based on Turing's universal computer. The only things that separate Turing machines are how long their tape is and how fast they can read it. https://en.m.wikipedia.org/wiki/Turing_machine

But they can all run the same programs.
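
A toy illustration of that universality (my own sketch, nothing from Deutsch or Turing specifically): one simulator can run any machine's transition table, and the only machine-specific resources are how much tape you hand it and how many steps you allow.

```python
# Minimal Turing-machine simulator: the same code runs any machine,
# given its transition table; tape and step budget are the only limits.
from collections import defaultdict

def run(table, tape, state="start", max_steps=10_000):
    """table maps (state, symbol) -> (new_symbol, move, new_state)."""
    cells = defaultdict(lambda: "_", enumerate(tape))  # unbounded tape
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        new_symbol, move, state = table[(state, cells[head])]
        cells[head] = new_symbol
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells))

# One example "program": flip every bit, then halt on the blank cell.
flipper = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run(flipper, "10110"))  # -> 01001_
```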

I know this is a very old essay, but this idea:

"The currently accepted cosmological theory an accelerating expansion of the universe allows for an unbounded number of computations in a universe which is infinite in both space and time."

is plainly wrong. Computation is a thermodynamic process, and the laws of thermodynamics predict a final, maximum entropy state of the universe known as "heat death".

author

Fair enough, I am taking this mostly on Deutsch's word. Heat death may not be incompatible with infinite computations. I am thinking of something like Gabriel's Horn where there is a final limiting state where the volume is zero but the surface area is unlimited. Idrk if this applies but that's where my mind went.

Gabriel's Horn just has finite volume, not zero.
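
Concretely (my own numerical check, using the standard y = 1/x, x >= 1 horn): truncate it at x = L and the volume converges to pi while the surface area grows without bound.

```python
# Gabriel's Horn: rotate y = 1/x (x >= 1) about the x-axis, truncate at x = L.
#   volume:        V(L) = pi * (1 - 1/L)   -> pi  (finite, not zero)
#   surface area:  A(L) >= 2*pi*ln(L)      -> infinity
import math

for exponent in (1, 3, 6, 12):
    L = 10**exponent
    volume = math.pi * (1 - 1 / L)
    area_lower_bound = 2 * math.pi * math.log(L)
    print(f"L = 1e{exponent:<3} volume = {volume:.6f}   surface area >= {area_lower_bound:.1f}")
```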

This actually sent me down a bit of a rabbit hole and I found the passage from Deutsch's book. If I were to summarize my thermodynamic objections simply it would be something like, "Doing math with the universe requires the ability to set up an energy distribution whose behavior does the calculation you want. Dark energy is distributed evenly across space and time, and therefore cannot be used to do any calculation."

Good to hear from you btw.

https://home.cern/science/physics/dark-matter

> There are just as many numbers divisible by a trillion as there are odd numbers even though one seems far more common when counting.

This depends on what you mean by "just as many", which is not a precise concept. If you mean cardinality, then the cardinality of the two sets is indeed the same. However, you might mean "natural density", and odd numbers have a natural density 500,000,000,000 times higher than numbers divisible by a trillion. I think when we casually use terms like "just as many" we often mean natural density.
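
A quick sanity check of that density claim (my own snippet, with "a trillion" read as 10^12):

```python
# Natural density up to N: count members of the set in 1..N, divide by N.
# Odds have density 1/2; multiples of 10**12 have density 10**-12, so the
# ratio is 5 * 10**11 -- even though both sets have the same cardinality.
N = 10**15
odd_density = (N // 2) / N              # 0.5
trillion_density = (N // 10**12) / N    # 1e-12
print(odd_density / trillion_density)   # -> 500000000000.0
```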

However I actually agree with your argument. I think there's good reason to believe there are infinitely many humans, either because the universe is infinitely large in space, or there is a multiverse of infinitely many universes (or both). In such a case you can't make arguments based on the distribution of humans.

Aug 3, 2022 · Liked by Maxwell Tabarrok

Hilbert's arguments depend on there being a real difference between "extremely large" and "actually infinite". The distance we travel through is not "infinite" to a physicist because of the fundamental quantum unit of a Planck length. There are a finite (even if very large) number of them between any two points. Similarly, emulations would not be "infinite" because there are physical limits to how many of them could be created in the universe.
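
To put a rough number on "very large" (my own back-of-envelope, using the usual values of a ~1.6e-35 m Planck length and a ~8.8e26 m observable-universe diameter):

```python
# Planck lengths across the observable universe: enormous, but finite.
PLANCK_LENGTH_M = 1.616e-35       # metres, approximate
OBSERVABLE_DIAMETER_M = 8.8e26    # metres, approximate (~93 billion light-years)

print(f"{OBSERVABLE_DIAMETER_M / PLANCK_LENGTH_M:.1e}")  # ~5.4e+61
```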

author

My understanding, which may be flawed, is that particles have a minimum size, but they exist in a continuum with an uncountable number of points. A test of this is whether particles always move in whole multiples of the Planck length or if they can move a fraction of it.

Either way, I agree that you need truly infinite sets for the argument to work. The infinite continuum of spacetime is just one example, but there are a few others in cosmology that I talk about.

The Planck length is the minimum distance, so there's no point in talking about any "fraction" of it. This is a separate issue from the size of "particles". You could have a computer program in which objects are of size X, but they can only be positioned in units of pixels which are much smaller than X. The pixel is the smallest unit for that, so they can never be moved a "fraction" of a pixel.
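
A tiny sketch of that pixel analogy (purely illustrative, not physics): the object's size and the grid's resolution are independent, and position only ever changes in whole grid units.

```python
# An object can be much larger than a pixel, yet its position can only ever
# change in whole-pixel steps -- there is no "half a pixel" move.
from dataclasses import dataclass

@dataclass
class Sprite:
    x: int        # position, in whole pixels
    width: int    # size, in pixels (independent of the grid resolution)

    def move_right(self, pixels: int) -> None:
        self.x += pixels   # fractional displacements are simply not representable

ball = Sprite(x=0, width=40)   # an object 40 pixels wide
ball.move_right(1)             # smallest possible move: one pixel
print(ball.x)                  # -> 1
```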

Quantum mechanics deals with infinities. You can compute with infinities, but those don't deal in pixels (which are finite and discrete units).

Imagine a screen with an infinite number of pixels. Now, try to traverse those infinitely long pixel chains. You can do it. But you cannot divide such a traversal up into infinitely long lengths in a trivial manner.

Quantum mechanics specifically forbids any "information" being resolved at smaller than the Planck scale. So to consider there being information smaller than that scale is to consider something outside of our universe.

"So why is the Planck length thought to be the smallest possible length? The simple summary of Mead's answer is that it is impossible, using the known laws of quantum mechanics and the known behaviour of gravity, to determine a position to a precision smaller than the Planck length."

It includes position. It includes the "sub pixels" so to speak.
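
For what it's worth, the back-of-envelope version of Mead's argument (my own sketch, standard constants) goes: localizing anything to dx costs energy of roughly hbar*c/dx, and once dx drops to around the Planck length the Schwarzschild radius of that energy exceeds dx, so the "measurement" collapses into a black hole before it can resolve anything finer.

```python
# Order-of-magnitude version of "nothing is resolvable below the Planck length":
# localizing to dx takes energy E ~ hbar*c/dx, and that energy's Schwarzschild
# radius r_s = 2*G*E/c**4 overtakes dx at roughly the Planck length.
import math

HBAR = 1.055e-34   # J*s
G = 6.674e-11      # m^3 kg^-1 s^-2
C = 2.998e8        # m/s

def schwarzschild_radius_of_probe(dx):
    energy = HBAR * C / dx           # energy needed to localize to dx
    return 2 * G * energy / C**4     # Schwarzschild radius of that energy

planck_length = math.sqrt(HBAR * G / C**3)   # ~1.6e-35 m
for dx in (1e-20, planck_length, 1e-40):
    r_s = schwarzschild_radius_of_probe(dx)
    verdict = "black hole swallows the measurement" if r_s > dx else "fine"
    print(f"dx = {dx:.2e} m  ->  r_s = {r_s:.2e} m  ({verdict})")
```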

Not really. Consider that there is a limited amount of computation that can be done before a black hole forms. Thus, there can only be a limited number of "steps" between any two points, as it would be impossible to calculate, or produce (in physics, even by the physical universe), those infinite steps. This can be demonstrated in the below example of the measurement problem.

The universe gets around being asked to do infinite steps, in part due to the Heisenberg uncertainty principle, which is part of the Planck-length-scale limitation (again, you'd need an infinitely sized measuring device to get infinite steps, and thus also an infinitely dense universe). Even if we assume there are an infinite number of steps, we, or the universe, could never experience or express them.

Field theory does seem to also sidestep this problem: although we have infinities in the size of the fields and their interactions, any computation or formula expressed via or with them must arrive at a result. Similar to the Halting problem, if we ask a question that takes infinite time, we must be certain it also completes and gives a result.

Actually, I guess it's more complex.

I accept that parts of the universe may be "infinite". But I'm not sure we can say which parts are infinite. Some parts certainly are not!

For reference: PBS Spacetime on the infinite calculations in particle physics https://youtu.be/WZfmG_h5Oyg

Aug 1, 2022 · Liked by Maxwell Tabarrok

I really enjoyed this piece!

I've separately been a fan of both Holden's (and x-risk/EA/longtermist thinking about the future in general) and David Deutsch's thinking about the future for a while now, but I've always felt there's an important disconnect between the two. I have a strong suspicion there's some really useful knowledge that can come out of combining/debating the ideas of the x-risk/longtermist community with David Deutsch's arguments, so I really appreciate this blog!

"The currently accepted cosmological theory an accelerating expansion of the universe allows for an unbounded number of computations in a universe which is infinite in both space and time."

I don't understand this. Are there not physical limits on computation in a universe where there's a finite amount of matter and free energy that will ever be accessible, given the speed of light and cosmic expansion eventually outpacing it and leaving parts of the universe unobservable?

author

Here's what Deutsch says about it:

"Cosmologists were trying to determine whether, despite slowing down, its expansion rate was sufficient to make the universe expand for ever (like a projectile that has exceeded escape velocity) or whether it would eventually recollapse in a ‘Big Crunch’. This is relevant to the question: is there a bound on the number of computational steps that a computer can execute during the lifetime of the universe? If there is, then physics will also impose a bound on the amount of knowledge that can be created – knowledge-creation being a form of computation. The cosmologist Frank Tipler discovered that in certain types of recollapsing universes the Big Crunch singularity is suitable for performing the faster-and-faster trick that we used in Infinity Hotel. To the inhabitants the universe would last for ever because they would be thinking faster and faster, without limit, as it collapsed"

"A small part of the revolution that is currently overtaking cosmology is that the omega-point models have been ruled out by observation. Evidence – including a remarkable series of studies of supernovae in distant galaxies – has forced cosmologists to the unexpected conclusion that the universe not only will expand for ever but has been expanding at an accelerating rate. Depending on what dark energy turns out to be, it may well be possible to harness it in the distant future, to provide energy for knowledge-creation to continue for ever. Because this energy would have to be collected over ever greater distances, the computation would have to become ever slower. In a mirror image of what would happen in omega-point cosmologies, the inhabitants of the universe would notice no slowdown, because, again, they would be instantiated as computer programs whose total number of steps would be unbounded. Thus dark energy, which has ruled out one scenario for the unlimited growth of knowledge, would provide the literal driving force of another."

"Depending on what dark energy turns out to be, it may well be possible to harness it in the distant future, to provide energy for knowledge-creation to continue for ever."

Surely this line stands out to you as the pinnacle of hand-waving bullshit. Given that heat death is considered the most likely end-game for the universe right now, shouldn't the argument under that scenario be strongest? It's well known that computational operations have entropic costs. When the universe is at maximum entropy, there will be no more resources with which to perform even the most elementary computations.
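
To put a number on that entropic cost (my own figures, using Landauer's bound of k_B * T * ln 2 per erased bit):

```python
# Landauer's bound: erasing one bit at temperature T costs at least k_B*T*ln(2).
# A colder universe makes bits cheaper, but never free -- and at maximum entropy
# there is no free energy left to pay even this minimum price.
import math

K_B = 1.381e-23  # Boltzmann constant, J/K

for temperature in (300, 2.7, 1e-10):    # room temperature, today's CMB, far future
    print(f"T = {temperature:>8} K  ->  at least {K_B * temperature * math.log(2):.2e} J per erased bit")
```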

Thanks for posting the full quote.

"The currently accepted cosmological theory an accelerating expansion of the universe allows for an unbounded number of computations"

In order to believe that Heat Death/Big Freeze/Big Crunch (which are all mentioned in the article you linked) are avoidable, you need exotic theories, not accepted ones. Yes, there are gaps in the accepted cosmological theories, such as the lack of an explanation for "why did inflation slow down after the Big Bang?", but all attempts to fill those gaps remain conjectures rather than accepted theories.

In order to do computation, you need usable energy gradients and data storage. In Heat Death/Big Freeze/Big Crunch, there eventually are no usable energy gradients (as there is no mainstream theory for harnessing dark energy to accomplish work). In a Big Crunch, all matter and energy will be randomized by the astronomical temperature, which will strip every atom into its constituent parts so that atoms no longer exist, so there won't be any way to keep data stored (if you store data as energy, it will still get randomized by the astronomically high temperature of the Big Crunch).

author

Deutsch lays out theories for both the Heat Death and Big Crunch scenarios in which an unbounded number of computations can be performed. In the Big Crunch, computations can keep getting faster and faster as the energy density of the universe increases, so simulated minds feel an infinity of time. I remember the other one less well, but it's essentially just the reverse: computations get slower over time, but an infinite number of computations still happen. So they don't need to be avoided for an infinite number of computations to take place. I'm not a physicist, so I don't know how standard or suspicious these theories are, but they make sense to me.
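
A toy version of the two schedules (my own sketch, not Deutsch's actual models): in the collapse case each step takes half the physical time of the previous one, so unboundedly many steps fit before a finite end time; in the dark-energy case each step takes twice as long, so the count still grows without bound as physical time goes to infinity, just ever more slowly.

```python
# Two toy step schedules:
#   "crunch": step k takes 2**-k seconds -> unboundedly many steps finish before t = 2.
#   "dark":   step k takes 2**k seconds  -> steps never stop accumulating; the count
#             grows roughly like log2(t) as physical time t -> infinity.
def steps_completed(schedule, t):
    elapsed, k = 0.0, 0
    while True:
        duration = 2.0**-k if schedule == "crunch" else 2.0**k
        if elapsed + duration > t:
            return k
        elapsed += duration
        k += 1

print(steps_completed("crunch", 1.999999))  # ~20 steps already; unbounded as t -> 2
print(steps_completed("dark", 1e9))         # ~30 steps after a billion seconds
print(steps_completed("dark", 1e18))        # ~60 steps after 1e18 seconds -- still climbing
```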

Aug 4, 2022 · edited Aug 4, 2022 · Liked by Maxwell Tabarrok

Disclaimer: I am also not a physicist. If we take mainstream theories, we have a limited amount of matter, energy, and time before the big freeze/big crunch. So where could DD be getting infinity from? I think he subscribes to a non-standard many-worlds interpretation of quantum mechanics. To DD, not only do universes branch out from each other (the many-worlds interpretation, which is one of many mainstream quantum mechanics interpretations), but those branches can later merge together. To my understanding, the mainstream thinks that branch merging can only occur when all the matter/energy (which includes all data storage) in each universe is in the same state as in the other universe, which means branch merging would not increase the number of useful computations.

author

The idea with the big crunch is that if computation scales with energy density then it can go to infinity, because energy density goes to infinity as everything collapses to a single point. He said something more hand-wavey about how the dark energy, which is powering the expansion of the universe, can be harnessed in the heat death scenario.

The merging thing was also confusing to me, but I think it is just his way of explaining things like the beam splitter or double slit experiments (https://en.wikipedia.org/wiki/Wheeler's_delayed-choice_experiment#Simple_interferometer) where 1/2-probability photons interfere with each other.

A few points.

There is a sense in which "why are we so near the beginning" is more surprising now than at other times.

Like if we found ourselves at time Graham's number, the fact that we were near the beginning would be surprising only in the most technical sense.

If we assume we can observe only finitely many bits of info, then there must be some sequence of bits consistent with arbitrarily late times. The "no, I can't write down any computer program short enough for you to read that can be proven to halt but takes more than the current time to run" situation is something that happens eventually, but it takes a preposterously huge amount of time to happen.

Arguably this just passes the buck: the surprising thing is being a human-sized mind, not a super-vast one.

Also, if our utility function is bounded, then one part of the exponential will have most of the change in the utility function.

The 'where are the aliens' question arises in my mind in response to this post (and when I read David D's book, which is fabulous). Are humans unique in an infinite universe? If not, where are the others? (etc., etc.)
