Pet theory is that our universe is run on some external computational substrate. A lot of the strangeness we see in quantum physics is a side effect of how that computation is executed efficiently.
The inability to reconcile quantum field theory and general relativity comes from gravity being a fundamentally different thing from matter: matter is an information system that's run to execute the laws of physics, while gravity is a side effect of the underlying architecture being parallelized across many compute nodes.
The speed of light limitation is a side effect of it taking a finite time for information to propagate through the underlying computational substrate.
The top-level calculation the universe is running constantly tries to balance computation efficiently among the compute nodes in the substrate, i.e. it tries to maintain a constant complexity density across all compute nodes.
Black holes act as complexity sinks, effectively "garbage collection." The matter that falls below the event horizon is effectively removed from the substrate's computational workload. The cosmological constant can be explained by more compute power becoming available as more and more matter is consumed by black holes.
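As a toy of what I mean by that balancing (nothing physical, just the load-balancing picture, with made-up parameters): a 1-D "complexity density" field that diffuses toward uniformity, with one absorbing cell standing in for a black hole. A dip and gradient form around the sink even though the substrate keeps trying to smooth the field out:

    # Toy sketch only: the diffusion rate D and sink strength are made-up
    # illustration parameters, not anything derived from the idea above.
    import numpy as np

    n_cells, steps, D, sink_rate = 101, 5000, 0.2, 0.5
    c = np.random.uniform(0.8, 1.2, n_cells)  # near-uniform starting complexity
    sink = n_cells // 2                       # "event horizon" cell

    for _ in range(steps):
        lap = np.roll(c, 1) + np.roll(c, -1) - 2 * c  # discrete Laplacian (periodic)
        c += D * lap                                  # relax toward uniformity
        c[sink] *= (1 - sink_rate)                    # complexity removed at the sink

    # a dip/gradient persists around the sink even as diffusion keeps smoothing c
    print(c[sink - 3: sink + 4].round(3))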
This can be introduced into GR by adding a new scalar field whose distribution encodes "complexity density" (using some measure of complexity, e.g. counting micro-states). This scalar field tries to remain spatially uniform in order to best "smooth" computation across the computational substrate. If you apply this to a galaxy with a large central supermassive black hole, you end up with an almost point-like sink of complexity at the center, then a large region of high complexity in the accretion disk, and then a gradient of complexity falling away towards the edges of the galaxy. That is, the scalar field has strong gradients along the radius of the galaxy, and these give rise to varying gravitational effects over the radius (very MOND-like).
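To make that concrete, the kind of thing I have in mind (a guessed minimal form, not derived from anything) is a standard scalar-tensor action where a field φ tracks complexity density and couples to some complexity measure C built from the matter micro-states:

    S = \int d^4x \, \sqrt{-g} \left[ \frac{R}{16\pi G} - \tfrac{1}{2} \nabla_\mu \varphi \nabla^\mu \varphi - V(\varphi) + \lambda \, \varphi \, \mathcal{C}[\Psi_m] \right] + S_m

The kinetic term is what drives φ toward spatial uniformity, C[Ψ_m] is whatever coarse-grained micro-state count gets assigned to the matter fields Ψ_m, and λ sets how strongly complexity gradients feed back into the geometry. All three ingredients are placeholders.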
Some back of the napkin calculations show that adding this complexity density scalar field to GR does replicate observed rotation curves of galaxies. Would love to formalize this and run some numerical simulations.
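As a sketch of the kind of toy calculation I mean (the mass model, the assumed φ(r) ∝ −ln r profile, and the coupling λ below are all made up for illustration, not the napkin numbers): if the complexity field falls off logarithmically with radius, its gradient contributes an extra acceleration λ/r, which shows up as a flat velocity component at large radii:

    # Toy comparison (illustrative parameters only): Newtonian rotation curve
    # for a crude bulge + exponential-disk mass model, versus the same curve
    # with an extra radial acceleration lam/r sourced by the gradient of an
    # assumed complexity field phi(r) ~ -ln(r).
    import numpy as np

    G = 4.30e-6  # gravitational constant in kpc * (km/s)^2 / Msun

    def enclosed_mass(r_kpc, m_bulge=1e10, m_disk=6e10, r_d=3.0):
        """Crude enclosed mass (Msun): point-like bulge + exponential disk."""
        disk = m_disk * (1.0 - np.exp(-r_kpc / r_d) * (1.0 + r_kpc / r_d))
        return m_bulge + disk

    def v_newton(r_kpc):
        # spherical approximation, good enough for a toy
        return np.sqrt(G * enclosed_mass(r_kpc) / r_kpc)

    def v_with_phi_gradient(r_kpc, lam=2.0e4):
        # lam has units of (km/s)^2 and sets the flat-curve amplitude sqrt(lam)
        return np.sqrt(v_newton(r_kpc) ** 2 + lam)

    r = np.linspace(1.0, 30.0, 30)
    for ri, vn, vc in zip(r[::6], v_newton(r)[::6], v_with_phi_gradient(r)[::6]):
        print(f"r = {ri:5.1f} kpc   Newtonian: {vn:6.1f} km/s   with phi gradient: {vc:6.1f} km/s")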
Would hope that fitting the free parameters of GR with this complexity density scalar field would yield some testable predictions that differ from current naive assumptions around dark matter and dark energy.
"External computation substrate" is a useful idea if it leads to falsifiable theories. As a "theory of everything" it sucks because it's clearly not motivated by any specific maths or observations, but by the human need to map nature onto some comprehensible analogue, i.e. taking some simpler subset of nature and pretending the rest of it is like that as well. So far, nature has usually become more incomprehensible the deeper we've looked at it.
Newtonian mechanics and mechanical clocks being the hottest precision technology of their time led scientists to view nature as clockwork. Now that we have computers, we think "nature is like a computer" because it's an appealing analogue.
But it's a false analogue imo. Just as clocks are a thing enabled by nature (a subset of it, in every sense of the word), computers are likewise a subset of nature. So yes, nature can think (with human brains) and nature can run computations (with CPUs loaded with programs), but that too is just a subset of nature.
Now: games of the mind and helpful analogues rock. And asking "how is nature analogous to a Turing machine?" is interesting for sure. But just because a game is fun or an analogue appealing, one should not forget, in the philosophical sense, that one is playing with only a limited subset of the thing.