> continuity of spacetime is a convenient approximation
I disagree, and there's no evidence for this. This is computer science leaking out; physics has no formulation of spacetime in discrete terms, and indeed, all of physics presumes continuity.
In QM, the space of wavefunctions is infinite-dimensional and continuous, and if it weren't, QM wouldn't be linear.
Cognition is discrete, but the world is continuous.
If it really were continuous, so that physical quantities were real numbers as defined in mathematics, that would contradict maximal information density, because almost all real numbers contain an infinite amount of information.
The full argument is elaborated in "Indeterminism in Physics, Classical Chaos and Bohmian Mechanics. Are Real Numbers Really Real?" by Nicolas Gisin: https://arxiv.org/abs/1803.06824
All the people who use things like the word "information" in this context are confusing thermodynamic, logical, probabilistic (and many other) senses of the term, and equivocating between them.
"Information" is not a physical quantity, and there can't be a "volume" of it. Nor does this have anything to do with real numbers.
It is impossible for any system extended in space and time to "zoom infinitely" into a continuous range and hence record an infinite amount of information. No one claims this, and the formulation of physics (entirely on real numbers) does not require it.
Rather, to say, e.g., that space is continuous is to say it's unbroken. There is no physical quantity which is becoming infinite.
I suspect that Gisin has a very clear idea of what he means by "information" in this context, having worked for over 40 years at the forefront of theoretical physics with a specialisation in quantum information theory.
I can link to people who've worked their whole lives in various fields of physics who still talk about perpetual motion. I am not saying he is wrong in this specific case, but an appeal to authority is not very convincing.
I've given your comment quite a lot of thought over the last few days. Maybe too much.
At first I was inclined to agree with you that this is an appeal to authority, with the caveat that such appeals do not always constitute a fallacy. For example, if we both agreed that Gisin is an expert whose opinion on this topic can be trusted, then his statements are valid evidence one way or the other.
But then I realised that the very claim being challenged is whether Gisin knows what he's talking about. Floating his credentials and experience feels like a valid contribution. For what it's worth, back in my PhD days I read several of his papers and saw a couple of his talks at conferences, and can confirm he's one of the leading researchers in the field and is particularly thoughtful and careful in his work.
I don't have much issue with Gisin's solution; the idea that the reals are random is one solution to the problem of how to deal with them that I like (since my meta-issue is whether reality is computable; I say it isn't, and randomness is not computable).
There's a problem for people who think reality can be modelled by computable functions of finite inputs: this makes classical physics non-deterministic, because chaos requires infinite precision for determinism.
So either you go for "reality is deterministic and continuous, and not computable" or "reality is non-deterministic, and discrete, and not computable"
Either option in this fork includes properties that offend the minds of the people who want everything to be discrete.
I lean towards a preference for determinism & continuity (via, in QM, superdeterminism) since that's trivial to justify on our best physics
Well, I don't think Gisin's mind is offended by non-determinism:
> I argue that there is another theory, similar but different from classical mechanics, with precisely the same set of predictions, though this alternative theory is indeterministic
and in the footnote he describes indeterminism to mean:
> given the present and the laws of nature, there is more than one possible future
Out of curiosity, why do you lean towards superdeterminism and not other deterministic interpretations of QM such as Many-Worlds or Bohmian mechanics?
Some people would disagree with dismissing information as non physical. For instance: https://scottaaronson.blog/?p=3327 The argument there would be that stuffing an extra bit of information in an information saturated volume would make it collapse into a black hole.
It's not entirely clear "Energy" is a physical property either. By physical I mean a causal property of a system which is a basic constituent of reality.
For example, in E = 1/2 mv^2, a particle has kinetic energy in virtue of being matter in motion -- it is motion and matter which are basic. Energy is just a system of accounting which tracks motion in the aggregate over time (with kinetic/potential just being the future/past in the accounting), hence energy conservation is just a temporal invariance.
When making arguments about the physical properties reality has (e.g., whether aspects are continuous) you need to be exceptionally clear what your terms mean, and terminology in physics isn't designed for this.
There are no "information saturated volumes", this is a series of abstractions piled on top of each other.
All the words in this area have quite complex formal definitions with semantics that are quite difficult to unpack; you cannot just go around saying "saturated volumes" -- it is this sort of language which breeds cranks, and pop sci does it with abandon.
This entire discussion is a matter of several PhDs, and is only done well by people with PhDs in the matter (philosophy of physics), or equivalent research. It's not possible to scrape fragments of what comp-sci bloggers say and derive much that's likely to be actually correct.
Nothing in physics is more basic than something else, as there are equivalent formulations in other quantities. Energy and momentum are as real as matter and motion. Matter is bundles of energy exhibiting inertia and motion is just some transformational relationship between phenomena in different areas of space-time. There is nothing "real" about any of this, only what animals like humans have evolved to model directly in their brains.
Yes, the thermodynamic properties of information are well established.
Various Hawking-Bekenstein results about black holes relate to information density, especially, shockingly, that information is proportional to surface area, not volume. This makes perfect sense because a black hole has all its incoming matter and energy sprawled, flattened and red-shifted on its horizon (to a distant observer). It can never export its internal state to the outside world, so you might never expect a volume's-worth of states to be exposed.
The idea was generalized by 't Hooft to the Holographic Principle, for 2D screens encoding the state of 3D volumes on the other side.
However, the full AdS/CFT Correspondence only applies to a certain type of AdS space, not our actual dS space. At the moment, it seems half of theoretical physics doctoral students are trying to extend AdS/CFT to dS space (obviously - strings :) and half of observational astrophysics doctoral students are desperately hoping to show we live in AdS space - LOL
> If it really was continuous so that physical quantities were real numbers as defined in mathematics, then it is in contradiction to maximal information density. Because almost all real numbers contain infinite amount of information.
Yeah. As they said, it’s computer science leaking out.
It can be misleading to reason about entropy, which is the relevant physical concept, as if it were strictly equivalent to information as computer scientists understand it. Entropy works perfectly fine with continuous densities of states and real numbers.
There are operators with continuous spectra. The previous commenter was accidentally half-right, in that the usual intro QM picture where everything lives in L2 really isn't fully rigorous, but this is fairly easy to resolve.
The correct setting is a rigged Hilbert space: given an algebra of operators A on a Hilbert space H, let S be the maximal subspace of H such that ||a s|| is finite for any s in S, a in A. These are your states. Operators in A don't necessarily have eigenvectors in H, but they do have eigenvectors in the space S* of all continuous linear functionals on S. So <x|, for instance, is just the map `psi -> delta_x(psi)`.
I take minor issue with the phrase "correct" here. That's one way you can do things, but it also works completely fine not to do that. Another way of setting these things up has your states be honest elements of L2, and says observables are just POVMs (i.e. maps from a space of measurable sets to positive operators which obey some natural restrictions like additivity). Then, given a measurable subset A of the spectrum of some operator, the Born-rule probability is just given by an inner product like <phi | P phi>, where P is the projector you get if you integrate the spectral measure of the operator over A.
This has the advantage of not having any funky "rigged" states suddenly appearing in your calculations and is also exactly how we deal with non-projective measurements in finite dimensional quantum mechanics.
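To make the POVM/projector picture concrete, here's a minimal finite-dimensional sketch (my own toy example, not from the thread): a Hermitian matrix plays the observable, the spectral projector onto a set A of eigenvalues plays P, and the Born-rule probability is the inner product <phi|P phi>.

```python
import numpy as np

H = np.array([[2.0, 1.0],
              [1.0, 2.0]])                 # toy observable; eigenvalues 1 and 3
evals, evecs = np.linalg.eigh(H)

def spectral_projector(subset):
    """Projector onto eigenspaces whose eigenvalue lies in `subset`."""
    idx = [i for i, ev in enumerate(evals)
           if any(np.isclose(ev, a) for a in subset)]
    cols = evecs[:, idx]
    return cols @ cols.conj().T

phi = np.array([1.0, 0.0])                 # normalized state
P = spectral_projector({1.0})              # A = {1}, a subset of the spectrum
prob = float(phi.conj() @ P @ phi)         # Born-rule probability <phi|P phi>
print(prob)                                # 0.5 for this state
```

In the continuum case the sum over eigenspaces becomes an integral of the spectral measure over A, but the probability is computed by exactly this inner product.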
Sure, I should have been clearer: a rigged Hilbert space is the right setting for bras and kets. You can also get rid of them entirely. In my experience QM classes unfortunately tend to split the difference by slinging around suggestive nonsense like \int_{x} |x><x|.
Consider the Hilbert space L^2([0,1]) associated with a physical particle whose position is somewhere between 0 and 1, and the corresponding multiplication operator X which takes a wavefunction f and maps it to Xf, where (Xf)(x) = x f(x). Then X is a bounded self-adjoint operator. It doesn't have any eigenvalues or eigenvectors, but its spectrum is exactly the set of numbers [0,1], as you'd expect (perfect measurements of position return real numbers in [0,1]).
The spectral theorem, rather than decomposing X as a sum of eigenvectors and eigenvalues, instead decomposes it as an integral over the spectrum with respect to the (spectral) projection-valued measure.
Now it is fair to question whether this "observable" is really observable, but it certainly works out mathematically consistently in the normal way we do things in quantum mechanics.
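A numerical sketch of this, for what it's worth (my own illustration; the helper name is made up): discretize X on an n-point grid. The discrete X is diagonal, so its eigenvalues are just the grid points, which crowd into all of [0,1] as n grows, mirroring the continuous spectrum; its eigenvectors are single-point spikes approximating delta functions, which have no limit inside L^2.

```python
import numpy as np

def discrete_position_operator(n):
    """Diagonal matrix approximating (Xf)(x) = x f(x) on an n-point grid."""
    grid = (np.arange(n) + 0.5) / n             # midpoints of n cells in (0, 1)
    return np.diag(grid), grid

for n in (10, 1000):
    X, grid = discrete_position_operator(n)
    ev = np.linalg.eigvalsh(X)
    print(n, float(ev.min()), float(ev.max()))  # spectrum fills out [0, 1]
```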
As the sibling hinted at, Gisin's statement is quite sloppy and – at the very least – confuses "definite" "information"¹ (a given real number) with "uncertain" "information" (entropy), at least if you follow the definition of entropy by the book: The probability distribution for an observable that takes on (exactly) the value of a given real number with probability 1 has entropy 0.
That being said, Gisin's approach is still interesting and his results can still be valid. But he starts with the assumption that real (irrational) numbers are unphysical, i.e. that – in a sense – our observable from above can actually only take on certain (rational) values, and then he derives certain predictions from that.
¹) Putting "information" in quotation marks here because no one really knows what it is.
I am very sympathetic to Gisin and his cause, but he does not propose any sensible resolution. By the way, not a fault, and no blame for him. Pointing out logical deficiencies always comes before a satisfying solution, and he is to be praised for his insight.
There are many interesting ways to probe this problem.... here's one:
Say I tell you to imagine a circle, an ideal Platonic circle in a Cartesian coordinate system (real coordinates, first uneasiness). Let's ignore translation, so it is centered at (0,0). I tell you the radius. Can you imagine the circle with Plato? Model the circle? Reproduce the circle? Do you need pi? Does the circle include or encode pi? But pi has infinite information.
Perhaps all you need is the square root function? But that's also an infinite Taylor series expansion. You can plot and recreate the circle to any precision if you have a square root function. The series will only need to run to the required precision. The circle will always be granular, depending on the number of terms you use in pi, or the square root function. Yeah, right, obvious, so why is that a problem?
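To put the "granularity" point in code (a sketch of my own, not the poster's): Newton's method computes a square root only to the precision you spend iterations on, roughly doubling the number of correct digits each time.

```python
# Reconstruct a circle point to the precision you ask for, no more:
# iterate Newton's method for x^2 = a and stop whenever you're satisfied.
def sqrt_to_precision(a, iterations):
    x = max(a, 1.0)                  # crude starting guess
    for _ in range(iterations):
        x = 0.5 * (x + a / x)        # Newton step for x^2 = a
    return x

# y-coordinate of the unit circle at x = 0.5, i.e. sqrt(1 - 0.25):
for k in (1, 2, 3, 6):
    print(k, sqrt_to_precision(0.75, k))
```

Each extra iteration sharpens the circle; the "ideal" circle is the never-reached limit of this process.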
What if I tell you the circle is the physical manifestation of equipotentials of a stationary charge (say, nucleus), or mass (say Earth), with inverse square law - so basically a geometric fall-off with range determined by spatial (circular, spherical) considerations. What is the force at some distant point? Do you need pi? Do you need square root function? Or reciprocals? How does the other charge or mass feel the 56,323rd decimal place of the force due to the potential?
Maybe it doesn't, because by the time it has felt the second decimal place, time has moved on, the charges/masses have moved on, and the nuance of what would've/should've been felt in a never-changing universe is never experienced. There is a modified differential equation that relates various time derivatives to the precision of experienced forces (this almost sounds like relativity :)
The discrete explanation with photons goes like this: the force is produced by radiating photons. They automatically encode the geometric expansion as inverse square law, because of their pathways, no need for pi, or sqrt functions. But that is statistical, the accuracy is only as good as the number of photons that can arrive from the source. The circular/spherical nature of the force only emerges over time, as photons arrive and act. The accuracy of smooth circularity and inverse square only establishes itself over time...
Elapsed time affects experienced precision - hmmm, interesting.
How would you quantify such a thing, where time changes the precision of what you feel? Well, the other obvious example is the Heisenberg Uncertainty Principle. This is just a simple and obvious example of Fourier Analysis for any theory based on a linear wave equation. It almost doesn't need stating, and if it must have a name, it is certainly Fourier, not Heisenberg. Anyway, any math/physics/engineering student knows Fourier to their core, and it gives a nice solution to the information problem: coordinates may be real-valued degrees of freedom, but there is no way to mathematically or physically resolve all coordinates and their derivatives to infinite precision. It's just not possible, even if the underlying equations/reality maintain the fiction of real-valuedness.
Fourier combines time, waves, amplitude, velocity (momentum, etc.) with a specific expression for possible information. A picture is worth a thousand words at this point, just look at a wave-packet, it's obvious. Fourier is a masterwork, and vastly underappreciated as a fundamental limit on knowledge, in a real world sitting on smooth continuous waves.
So Fourier sets limits on knowledge, even in the wavy world of the smooth continuum. Of course, I do not believe in the smooth continuum anyway, but Fourier is my wingman to fight the real-infinitists on their own smooth turf.
It does not, according to any sane way of defining its information content. For example, the Kolmogorov complexity of pi is clearly finite - I can write down a program for a Turing machine which will run (forever) and keep writing down digits of pi as it does so.
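Here's that program concretely (my own sketch, using Gibbons' unbounded spigot algorithm): a few lines that stream the decimal digits of pi forever, which is all the finiteness claim needs.

```python
# Gibbons' unbounded spigot: a finite program emitting digits of pi forever.
from itertools import islice

def pi_digits():
    """Yield decimal digits of pi one at a time, without end."""
    q, r, t, k, m, x = 1, 0, 1, 1, 3, 3
    while True:
        if 4 * q + r - t < m * t:
            yield m                   # m is the next confirmed digit
            q, r, m = 10 * q, 10 * (r - m * t), (10 * (3 * q + r)) // t - 10 * m
        else:
            q, r, t, k, m, x = (q * k, (2 * q + r) * x, t * x, k + 1,
                                (q * (7 * k + 2) + r * x) // (t * x), x + 2)

print(list(islice(pi_digits(), 10)))  # [3, 1, 4, 1, 5, 9, 2, 6, 5, 3]
```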
Not true either. The original Kolmogorov complexity is for finite strings. Plus, the program would need to record, for whichever special strings you choose, that this function is used for them but not for the others. That would be a giant table forming part of your program.
That's also a different point than the parent's. Seems they're saying if you were to specify pi as the limit of some expansion that describes the physical process of photons arriving in some area, then that specification's information increases with more terms added. Pi, being almost random by every statistical measure, has as much information as a random string, in fact, in any normal conception of information. You cannot wave that away by machine manipulation tricks or by defining a new constant, and this is borne out also by the parent's physical argument that in reality there are no low-complexity universal constants, but that there may be limits to information density (in space and time).
Continuous physics can be a manipulation of limiting quantities without being literal.
> The original Kolmogorov complexity is for finite strings.
I wrote "Kolmogorov complexity" not "original Kolmogorov complexity" so this isn't particularly relevant. The application of the concept to the infinite string which represents pi is essentially trivial.
> Plus, the program would need to store that for whichever special strings you choose you will use this function but for other ones not. That will be a giant table that is part of your program.
I honestly can't parse this.
> Pi, being almost random by every statistical measure, has as much information as a random string
This is wrong. You can consider something like a simple communication task. Alice and Bob share a phone line and she is attempting to tell him a number. Every second the line allows her to send a bit to Bob. For a truly random number she has to use the line infinitely many times to tell him the number. To send pi she can send a finite number of bits which amount to a program to compute pi and he can do the computation on his end.
I urge you to go back and look at how Kolmogorov complexity is defined. It includes the notion that a program needs to decide whether to output the string directly or to generate it from some program.
You're assuming Alice and Bob have already pre-synchronized what kind of computing machine is going to be used, one in which pi is the output of a relatively short program, as opposed to another type of machine where some other random-looking number has that property (random to you, pseudo-randomly generated via some machinery for all you know). You are assuming many things away.
Also it is absolutely not trivial to extend Kolmogorov complexity to infinite strings. There are multiple formulations and they are a lot more difficult than for finite strings. Not the computation part but the complexity assignment part.
I agree there is a bunch of complexity in generalising Kolmogorov complexity to general infinite strings. However, I'm not really trying to do that here; all I want is enough to back up the statement I made before, that the complexity of pi is finite. Doing that is much more trivial than what you're talking about.
There's a bunch of fine detail in getting it down to defining an actual number measuring complexity, which I don't care about at all; all I care about (in the context of this discussion) is that the number is finite.
> physics has no formulation of spacetime in discrete terms
There are some attempts of working with discrete spacetime (e.g. causal set theory), but yeah, all our best descriptions so far very much assume smooth spacetime.
That said, our Turing-machine model of computation is discrete, and the Church-Turing thesis implies human thought can be simulated by a Turing machine.
It's not empirical evidence, but it's something. (I really doubt an empirical test is possible at all, so it seems philosophizing is all we have, unfortunately.) I'm not aware of any (communicable) model of thought that actually can't be reduced to the Turing model (in fact, that is AFAIK precisely the reason he proposed the model).
Analog signals can be approximated to arbitrary precision, so while we conventionally think of it as continuous, it doesn't imply our cognition really has infinite precision floats internally...
I think it's really unfair to only focus on half of the picture (saying there's no evidence for "Cognition is discrete") where in fact we actually have no evidence at all whether anything is fundamentally continuous or merely approximated as such with high precision.
Traditionally the math in physics is continuous, and the math in computing is mostly discrete. If people point to Hilbert space as some kind of justification for believing physics is continuous, then it seems equally valid (or invalid) to use the Turing model as justification to believe cognition is discrete. I think both approaches are misguided, but as I said, it's really unfair to point out only the convenient half of these invalid arguments.
That's the "trivial sense" I'm talking about. If we restrict "cognition" to the stuff we know is discrete then trivially it's discrete. But cognition is a hell of a lot more than that.
I entirely agree. I wasn't making a statement that "what we know is discrete". I was referring to a particular subset of cognition as "what [i.e. the things that] we know are discrete".
There are aspects of cognition that are discrete: a language contains a finite set of phonemes and words, a human mind is capable of (painfully slowly) carrying out purely symbolic algorithms like those a computer performs, etc. My point was that these things are a small subset of cognition, and most of cognition we have no particular reason to think depends on discreteness, which I think is the same point you're making.
Personally I strongly suspect that the "discrete" aspects of cognition are things that have evolved on top of / within a system that is fundamentally continuous (analogue) in nature.
> My point was that these things are a small subset of cognition, and most of cognition we have no particular reason to think depends on discreteness, which I think is the same point you're making.
How do you convince yourself that you have thoughts that cannot be accurately written down no matter how many words you use?
> Personally I strongly suspect that the "discrete" aspects of cognition are things that have evolved on top of / within a system that is fundamentally continuous (analogue) in nature.
How do you tell whether things are really fundamentally continuous, or a really high definition pixel art?
Discrete here means "non-continuous" - i.e. there is no smooth transition function between ideas/thoughts/rationalizations.
For example, a formal proof is a discrete process: it follows step-wise rules that you can assign natural numbers to (this is the first step, this is the second step, this is the third step). A non-discrete process, a continuous one, would have a smooth transition between these steps, which is hard to even imagine.
While I am not convinced it is correct to say that "human reasoning is discrete", human language is definitely discrete. Words don't blend smoothly into each other. If you don't believe me, try to define a function f:[0,1] -> Words, such that f(0) = "red" and f(1) = "blue" and tell me what is f(sqrt(2)/2), or what is df/dx.
Interpretations of QM and measurement have very little to do with whether space-time is discrete or continuous. The simple fact is that no common QM formulation uses discrete mathematics for space-time, and it's unclear if any that does would even work.
Also, your link is not a formulation of QM, it is a different theory which makes different predictions (it is a quantum gravity theory). And, per the sounds of the Wikipedia article at least, it is not actually proven equivalent to QM in the regimes where it needs to be ("There is evidence [1] that, at large scales, CDT approximates the familiar 4-dimensional spacetime", or in other words, it is not fully worked out if this is the case).
Is it though? Does it matter one way or the other? Do we think reality is the math in some way, or is the math a really darn good model of the reality?
Imagine there were a grid for space. For simplicity, consider a regular grid with spacing of 1 unit in one direction and 1 unit in a perpendicular direction. If such a grid existed, one unit of ?something? would move you 1 unit along the axes of the grid, but you'd need 2 units of ?something? to move root-2 units at 45 degrees to the grid. Any discrete grid of any shape, size or pattern would have something like this, some sort of preferred alignment, but as far as we can tell there is no such preference. Physics in free space is rotationally invariant, and thus not on a grid, and thus continuous.
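That step-counting argument can be made concrete with a toy calculation (my own illustration; the function name is made up):

```python
# On a square grid with 4-neighbour moves, the number of unit steps per
# unit of Euclidean distance depends on direction: the grid has preferred
# axes, unlike rotationally invariant free space.
import math

def steps_per_unit_distance(dx, dy):
    """Taxicab steps needed to reach (dx, dy), per unit of straight-line distance."""
    return (abs(dx) + abs(dy)) / math.hypot(dx, dy)

print(steps_per_unit_distance(100, 0))    # along an axis: 1.0
print(steps_per_unit_distance(100, 100))  # along the diagonal: ~1.414 (root 2)
```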
You don't have to imagine an ordered grid. If the grid unit is small enough (say the Planck length, 1.6×10^-35 m) and the grid is chaotic, then at the distances of ~10^-16 m that we can measure, everything will look the same in all directions.
This happens the same way in which steel demonstrates isotropic behavior although its microscopic structure is anisotropic.
So there is no easy way to prove or disprove continuity of space.
The "underlying issue" often at stake in the debate is whether reality is a computer, since it would need to be discrete if so, and often whether a computer can be made to simulate it.
However, what's missed here is that discrete is a necessary but not sufficient condition.
Once you give any sort of plausible account of how reality could be discrete, as you've done here, you end up with non-computable aspects (e.g., typically randomness). So the metagame is lost regardless: reality isn't a computer (/ no complete physical theories of reality are computable).
Though the meta-meta-game around "simulation" is probably internally incoherent in itself -- whether reality is a computer or not would really have nothing to do with whether any properties had by it (eg., mass) are simulated.
Since either you take reality to have this property and hence "simulation" doesn't make sense, or you take it to be faked. If it's faked, being computable or not is irrelevant. There's an infinite number of conceivable ways that, globally, all properties could be faked (eg., by a demon that is dreaming).
I don't see how randomness can make anything non-computable. Sure you may not know the exact random numbers but you get a similar enough universe with any other sequence of random numbers.
Also, continuous doesn't mean uncomputable either, because in many cases the infinite amount of computation for the continuum does not add anything interesting and a finite approximation works well enough.
> So the metagame is lost regardless: reality isnt a computer (/ no complete physical theories of reality are computable).
I don't see any evidence for this. For now we do not have a proof one way or the other. If, for instance, it turns out that quantum computers really can run Shor's algorithm to factor very large numbers, it would be good evidence for the continuum, but we are not there yet.
But even that would not be evidence against reality being a computer, since it would still allow the possibility of reality being a computer that can perform operations on real numbers.
Why is randomness non-computable? In computer science, the theorem is that the set of all Deterministic Finite Automata is equivalent to the set of all Nondeterministic Finite Automata. It is a non-obvious theorem with a one-page proof taught in every junior-level theory of computation course. This theorem is what lets deterministic and nondeterministic Turing machines be used interchangeably in many subsequent proof sketches in these classes.
The "Nondeterministic" in NFA means its transition function goes from states to sets of states, instead of from states to states. Informally, it can explore multiple paths in parallel for the cost of one. They're not probabilistic.
The computational semantics of the NFA simply requires that the next state be one of the allowable next states in the transition function δ: Q × Σ → PowerSet(Q).
Thus, this semantics implicitly encodes the notion that the machine is nondeterministically choosing the next state in each execution.
The decision problem of whether an NFA accepts a string w is what allows for the informal parallel interpretation: it accepts iff you imagine the computation forking off a new thread at each nondeterministic branch. But to say that this is not nondeterministic or not probabilistic is like saying the Many-Worlds Interpretation means there is no real superposition, or something like that. It's like saying a throw of a die does not really involve probability because of a symmetry argument that a die has six equal sides. Mainly, I don't understand that, because I see probability as a way to implement nondeterminism: a system is probabilistic only because it is making nondeterministic choices according to some probability distribution. And checking Sipser 2nd ed. p.368: "A probabilistic Turing machine is a type of nondeterministic Turing machine in which each step is a coin-flip step".
Anyhow, my main issue was that the original commenter casually claimed that probability makes things (physics) uncomputable. But Turing computability has nothing to do with probability, since as I recall the closest concept is the Non-deterministic Turing Machine (NDTM) and with that it is a basic proof to show that NFAs vs. DFAs, as well as NDTMs vs. DTMs, are computationally equivalent and there are theorems for that.
Meaning either they are using an idiosyncratic definition of computability or are ignorant of an introductory course on theory of computing which explains formally what Turing/Church's theories were about when clarifying the concept of computability. Okay or maybe they have a deeper philosophical disagreement with computability and complexity theorists - maybe they reject Sipser's definition above - but these are standard undergraduate curricula in CS by now and it could be argued that perhaps it is the non-CS experts who haven't thought deeply enough about what computability really is and would benefit from actually learning from these subdisciplines. I don't know, as they did not reply.
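For what it's worth, the set-of-states trick behind the NFA/DFA equivalence theorem is short enough to sketch (a toy example of my own devising, not from the thread): simulating an NFA deterministically by tracking the set of states it could be in is exactly the subset construction used in the proof.

```python
# Toy NFA accepting binary strings whose second-to-last symbol is '1'.
def nfa_accepts(s):
    # delta: (state, symbol) -> set of possible next states
    delta = {
        ('q0', '0'): {'q0'},
        ('q0', '1'): {'q0', 'q1'},   # the nondeterministic branch
        ('q1', '0'): {'q2'},
        ('q1', '1'): {'q2'},
    }
    current = {'q0'}                 # all states the NFA might occupy
    for sym in s:
        current = set().union(*(delta.get((q, sym), set()) for q in current))
    return 'q2' in current           # accept if any branch reaches q2

print(nfa_accepts('0110'), nfa_accepts('0101'))  # True False
```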
A chaotic grid would be macroscopically observable because random + random != 2 random, it's equal to 'bell curve'. Everything would be smeared as a function of distance, which we don't see.
This characteristic is observable for metals as well. Steel becomes less flexible as it's worked because its grains become smaller and more chaotic - a microscopic property with a macroscopic effect.
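As a quick numerical aside (my own illustration, not the commenter's): summing independent random offsets does produce a bell-curve spread that grows with the number of steps, which is the "smearing as a function of distance" at issue.

```python
# Sum n independent uniform steps and look at the spread of the total:
# it grows like sqrt(n), i.e. accumulated grid noise widens with distance.
import numpy as np

rng = np.random.default_rng(0)
for n in (1, 16, 256):
    # 20000 independent walks of n steps, each step uniform in [-0.5, 0.5]
    spread = rng.uniform(-0.5, 0.5, size=(20000, n)).sum(axis=1).std()
    print(n, round(float(spread), 3))   # ~ sqrt(n / 12)
```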
If we are talking about a grid with a very small spacing, say around the Planck length, I don't see how we would be able to macroscopically observe it.
Everything we can see move on the grid is at least 20 orders of magnitude bigger than the grid spacing. Any macroscopic objects we can experiment with are more like 30+ orders of magnitude bigger than the grid spacing and consist of numerous atoms that will all be moving within the object due to thermal jiggling over distances orders of magnitude bigger than the grid spacing.
In physics you never have measurements differentiating between distance 2 and say 2+10^-20, and that gives enough space to hide any 'bell curve' you want.
There are many times in physics where people have thought they've had to choose between either one thing or its opposite, where both choices had clear deficiencies. The ultimate solution ended up being a new hybrid that nobody thought of for a long time.
I kind of suspect "is the universe continuous versus discrete" will come down to that. I don't know what a hybrid of such things looks like. With our current conceptions it seems impossible. But it always does, before the breakthrough comes and then in hindsight all the people of the future will get to look back at us going "How could they not see this obvious thing?", to which my only defense is that you, dear future reader, only think it's obvious because it was handed to you on a silver platter and you'd be as confused as we are if you were back here with us.
> Is it though? Does it matter one way or the other?
These are two very different questions. As for the former, I don't believe there is consensus at this point with good arguments for and against. As for the latter, if we can reliably prove that the reality is not analog but digital, it has consequences at various levels, and we might make better choices when using math to describe it/make approximations.
Given that no one, or at least no human, can experience reality as a whole, and as far as we want to honestly recognize the effective scope of our knowledge, we will probably never know in absolute terms.
What matters is a subjective topic. What we all have in common are logistical constraints. So if some people set a goal that requires settling whether reality is more easily handled when modeled in a continuous or a discrete manner, then within that scope it matters. But whatever you settle on, the human brain is built such that it can always assume the perfectly fitting model is only valid in its scope, resting on some subtler level of reality which is in turn better modeled with the antithetic approach.
Now, as a very personal, out-of-the-blue opinion, I fail to see how any causal series could happen without an underlying continuous flow of events. Supposing causal discontinuity is, to my mind, as plausible as supposing that the universe as it is right now just appeared, with nothing we can think about it being relevant, and in the next instant could be completely different or nonexistent, since the universe would not be bound in any remote way to what we might expect from our delusional, just-created sense of causality.
You say you can't comprehend how something can move from 1 to 2 discretely.
But the paradoxical notion of infinite continuous change has been known since antiquity. It's faith either way.
Discrete doesn't mean state changes are wholly, globally arbitrary. Imagine a graph with nodes and edges, a state machine as computer science calls it. I think it's easy to agree that the universe could be parsed by a regex ;-)
Heck, imagine an integer on the number line that can go up or down.
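A toy sketch of that idea (everything here is made up for illustration, not physics): a "particle" whose position is an integer on the number line, updated by a local deterministic rule. Discrete, yet lawful rather than arbitrary.

```python
# Toy discrete dynamics: a particle hopping on the integer number line,
# bouncing between walls at 0 and 10. The update rule is local and
# deterministic -- discreteness does not imply arbitrary state changes.

def step(position: int, velocity: int) -> tuple[int, int]:
    """One tick of the state machine."""
    position += velocity
    if position <= 0 or position >= 10:
        velocity = -velocity                 # elastic bounce
        position = max(0, min(10, position)) # clamp to the walls
    return position, velocity

state = (1, 1)
trajectory = [state[0]]
for _ in range(20):
    state = step(*state)
    trajectory.append(state[0])

# Each successive state follows lawfully from the previous one:
# the position never jumps by more than one unit per tick.
assert all(abs(a - b) <= 1 for a, b in zip(trajectory, trajectory[1:]))
print(trajectory)
```

The regex joke above is apt: a finite state machine is exactly the class of system a regular expression can describe.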
Wolfram has written a ton about this. Despite all his issues, his math is solid. (Which is not to say his physics is true.)
> You say you can't comprehend how something can move from 1 to 2 discretely.
I didn’t mean that; sorry if my words induced you to believe so.
What I want to point out is that, to my mind, if I assume a discrete foundation of the universe, then on a meta-cognitive level I must recognize it implies everything I experience through my current attention might be a just-made-up state without any compelling ontological relation to anything I can recall or think of. So, as far as I’m concerned, believing in continuity is just a lazy way to relax on a metaphysical Gordian knot.
> It's faith either way.
Yes and no. It’s probably easier to change scientific perspective to whatever model applies best for some purpose than to adhere to some philosophy about Nature.
Zeno's """Paradox""" was nonsense even in its own time. Easier now that we understand Newton's laws of motion, but his contemporaries were able to sufficiently dispute his idea even without them.
Not nonsense. The argument goes that if time and space are both infinitely divisible, then to move from A to B in finite time means that you have to perform infinitely many actions in finite time.
Zeno didn't believe that the latter was possible. But he wasn't stupid, he obviously knew that motion was happening all the time in real life. His paradox really only makes sense in the context of Eleatic philosophy which assumes that reality is an illusion because change is fundamentally impossible (how can something come from nothing?).
If you want to reframe it in more modern terms, Zeno's paradox shows a contradiction in axioms. If you want to get rid of the contradiction, you have to change some of the axioms.
In real analysis, loosely speaking, we remove the axiom that an infinite process cannot result in a finite outcome - this way we are allowed to sum (some) infinite series, for example. But we don't "know" if reality behaves that way.
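For instance, the geometric series behind the dichotomy paradox has partial sums we can check exactly with rational arithmetic (a quick sketch, not tied to any particular formulation of the paradox):

```python
from fractions import Fraction

# Zeno's dichotomy: cross half the remaining distance at each step.
# The partial sums 1/2 + 1/4 + ... + 1/2^n equal exactly 1 - 1/2^n,
# so the infinite process converges to exactly distance 1.
def partial_sum(n: int) -> Fraction:
    return sum(Fraction(1, 2**k) for k in range(1, n + 1))

for n in (1, 10, 50):
    assert partial_sum(n) == 1 - Fraction(1, 2**n)

# If each halved distance also takes half the time, the total
# time is finite too -- which is the modern resolution.
print(partial_sum(50))
```

This is precisely the "infinite process with a finite outcome" that real analysis licenses and that Zeno's framework forbade.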
The atomists found a different solution: they argued that reality was fundamentally discrete. This way, Zeno's paradox also doesn't arise.
No, position and time are typically continuous variables in quantum mechanics. You can have formulations in which they are discrete but they are not required and are relatively exotic. QM certainly doesn't say they must be discrete.
I feel that a lot of people here are confusing the math and "reality".
You're definitely correct about the math, i.e. the systems that we humans have invented to model reality. But I guess most of us don't really care about what mathematical model scientists like to use (especially not whether they're "exotic" or not), but rather what reality could be like.
And the quantum properties of QM do seem to suggest that there's some sort of fundamental discreteness in reality. And it seems to run contrary to the resolute claims that reality must be continuous, as if that were a proven fact. What I understand is that the math most commonly used by scientists is definitely continuous, but whatever we can measure seems to have some kind of Planck limitation.
So are we talking about empirical science or science-flavored theology here? Have we actually found empirical evidence or proven the continuousness of space/time?
I responded to a couple of people who claimed with great certainty that QM meant spacetime had to be discrete, when it says nothing of the sort. I haven't claimed we have proof that it is continuous and I doubt we ever will as that seems akin to proving a negative existential.
Your penultimate paragraph suggests some confusion about ideas like Planck scale and quantisation.
Firstly, there is nothing special about the Planck length itself. It's just a unit of length. Around that sort of scale, though, our current theories of physics happen to break down because both quantum and gravitational effects become significant. That doesn't imply spacetime is discrete (or preclude it being discrete) at that scale. It's just a realm that our current theories don't work in.
Secondly, while describing aspects of nature that are quantised was a large part of why quantum mechanics was developed (and the source of its name), it in no sense says anything like "there's some sort of fundamental discreteness in reality". Quantum mechanics deals with both discrete and continuous observables in a single framework: functional analysis, essentially. The set of possible values for an observable is modelled as the spectrum of an operator, which can be either continuous or discrete. Which sort of observable is appropriate for a given physical theory is a choice made in constructing that theory. For things like charge and spin we use discrete (quantised) values because we have evidence that those things are quantised. For things like position we use continuous values and have no evidence that using discrete observables would better match nature.
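The "spectrum of an operator" point can be made concrete with a numerical sketch (assuming numpy is available; grid size and domain are arbitrary choices): discretizing the 1D harmonic oscillator Hamiltonian and diagonalizing it recovers the discrete energy levels E_n = (n + 1/2)ħω, even though position itself is modeled as a continuous variable.

```python
import numpy as np

# Finite-difference sketch of the 1D quantum harmonic oscillator
# (units with hbar = m = omega = 1). The Hamiltonian is an operator;
# "quantization" refers to its spectrum being discrete, not to space.
N = 800
x = np.linspace(-10, 10, N)
dx = x[1] - x[0]

# Kinetic energy -1/2 d^2/dx^2 via the standard three-point stencil.
kinetic = (np.diag(np.full(N, 1.0))
           - 0.5 * np.diag(np.full(N - 1, 1.0), 1)
           - 0.5 * np.diag(np.full(N - 1, 1.0), -1)) / dx**2
potential = np.diag(0.5 * x**2)
H = kinetic + potential

energies = np.linalg.eigvalsh(H)[:4]
# Discrete spectrum: E_n ~= n + 1/2 for n = 0, 1, 2, 3.
for n, E in enumerate(energies):
    assert abs(E - (n + 0.5)) < 1e-2
print(energies)
```

Note the division of labor: the position variable x lives on a continuum (here merely sampled for the computer), while the energy observable comes out quantized because the operator's spectrum is discrete.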
Space could in reality be either discrete or continuous, or not even exist in any form we'd recognise as "space" on those scales. Quantum mechanics doesn't give us any hints one way or another.
Quantum mechanics does not mean everything is quantized. It got its name because the first predictions of quantum mechanics were quantized energy levels in some example systems, but that does not even mean that all energies are quantized in quantum mechanics. There are many systems you can study where energies are continuous, and many examples where other quantities are continuous in quantum mechanics.
I think it's a linguistic difference only. At least where I studied, it was quite normal to call phenomena you can derive from a physical theory "predictions" even if they had been observed before. I agree the photoelectric effect strongly suggested some quantization before quantum mechanics was formalised.
A butterfly also literally has butter in the name. The point of QM is that certain energy levels are quantized. Or more generally that lots of operators/observables on continuous Hilbert spaces have discrete spectra.
What's your point? Everything they did, including Einstein's (and everyone else's at the time) quantum mechanics work, was based on continuous space and time variables.
It is true that there is no experimental evidence, but I think there are some convincing arguments that something must happen at the Planck scale (for very short distances) in a full quantum-gravity theory.
Here are some quotes from "Covariant Loop Quantum Gravity", Rovelli and Vidotto (slightly redacted). I suggest the whole chapter 1, in particular 1.2 to get an idea of why fundamentally spacetime may be discrete.
"In general relativity, any form of energy E acts as a gravitational mass and distorts spacetime around itself. The distortion increases when energy is concentrated, to the point that a black hole forms when a mass M is concentrated in a sphere of radius R ∼ GM/c^2, where G is the Newton constant. If we take L arbitrarily small, to get a sharper localization, the concentrated energy will grow to the point where R becomes larger than L. But in this case the region of size L that we wanted to mark will be hidden beyond a black hole horizon, and we lose localization. Therefore we can decrease L only up to a minimum value, which clearly is reached when the horizon radius reaches L, that is when R = L.
Combining the relations above, [..] we find that it is not possible to localize anything with a precision better than the Planck length (~10^-35 m).
Well above this length scale, we can treat spacetime as a smooth space. Below, it makes no sense to talk about distance. What happens at this scale is that the quantum fluctuations of the gravitational field, namely the metric, become wide, and spacetime can no longer be viewed as a smooth manifold: anything smaller than the Planck length is “hidden inside its own mini-black hole”."
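The "combining the relations above" step can be checked numerically (constants are standard CODATA values; the factors of 2 are dropped, as this is an order-of-magnitude argument): localizing within L costs energy E ~ ħc/L, whose gravitational radius R ~ GE/c^4 equals L exactly when L is the Planck length.

```python
import math

# Standard physical constants (SI units).
hbar = 1.054571817e-34  # J*s
G = 6.67430e-11         # m^3 kg^-1 s^-2
c = 2.99792458e8        # m/s

# Localizing within L needs energy E ~ hbar*c/L; its gravitational
# radius is R ~ G*E/c^4 = G*hbar/(c^3 * L). Setting R = L gives
# L^2 = hbar*G/c^3, i.e. the Planck length:
planck_length = math.sqrt(hbar * G / c**3)

assert 1.5e-35 < planck_length < 1.7e-35
print(planck_length)  # ~1.616e-35 m
```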
"The existence of a minimal length scale gives quantum gravity universal character, analogous to special relativity and quantum mechanics: Special relativity can be seen as the discovery of the existence of a maximal local physical velocity, the speed of light c. Quantum mechanics can be interpreted as the discovery [..] that a compact region of phase space contains only a finite number of distinguishable quantum states, and therefore there is a minimal amount of information in the state of a system. Quantum gravity yields the discovery that there is a minimal length lo at the Planck scale. This leads to a fundamental finiteness and discreteness of the world."
You're talking about minimum lengths, not discrete spacetime.
It may be the case that there's a minimum length beyond which "no meaningful laws of physics apply", but it really says nothing about whether real numbers are indispensable in the formulation of physics, or about whether spacetime is continuous.
There being a minimum length doesn't mean that everything is a discrete multiple of this length, or that space is broken into units of it, or that objects have to be aligned on grid boundaries defined by it.
Whenever people try to do philosophy of physics, the inevitable place everyone lands is a series of equivocations, often caused by the language of physics being ambiguous and polysemous. But "minimum length" here does not mean a sort of grid length.
I agree with all your claims here in the literal sense, but supposing there's a minimum length, it would seem to be at least theoretically possible to use discrete mathematics to formulate an approximation to the "real number formulation of physics"?
The fact that we don't already have a full system using discrete maths doesn't mean it is impossible, because our current system is based on a long tradition of belief in real numbers and the assumption that physical space is continuous.
I'd argue (admittedly unhelpfully) that unless we have actually tried to formulate physics using discrete mathematics and found a barrier that we prove unequivocally that it is impossible to overcome, we can't claim that physics must be formulated using real numbers/continuous math. There's a difference between "we don't know how to do this" vs "we know we can't do this".
I don't understand your point, I never said that "everything is a discrete multiple of this length, or that space is broken into units of it, or that objects have to be aligned on grid boundaries defined by it", I just wanted to mention that "continuity of spacetime is a convenient approximation" may be a correct sentence in the context of quantum gravity.
Also, for what is worth, in QM the space of wavefunctions can also be finite dimensional (for instance the Hilbert space of a spin 1/2 particle).
And that's not what a "minimum length" in this case means. We're not talking about space having a minimum unit. We're talking, at best, about (presumably massive) objects having a minimum extension in space.
Even with a "minimum length" (in this specific sense), you have an object at position p, and can (move/observe) it at any p+dx continuously.
Importantly, the question is whether the best theories of physics in a world with a minimum extension-in-space require continuous mathematics, and there's nothing about this Planck length to suggest they wouldn't.
Discreteness would mean that there exists some base distance p such that the distance between any two objects is Np, with N being a natural number (and any surface is some Mp^2 and any volume is Qp^3 and so on). Continuity is simply the opposite of that. It could be that objects can be at arbitrary real-valued distance d from each other, but that d > p is a precondition for any other law of physics.
By contrast, discreteness has various unintuitive mathematical properties that make it hard to fit into some other theories (particularly those relying on differential equations).
Planck constant would like to have a word with you. But it is true that CS shines a light on the matter of mapping the infinite into bounded spaces.
This matter of ‘cognition’ is the entire matter (npi) of contention. What is the actual relationship between number and perceived phenomena? What is the deeper meaning of the concordance of mathematics and physics? Where do these magical constants come from and what does it all mean?
It seems we bring the ‘world’ into being by partitioning. See Genesis 1 for details.
> the world is continuous
Reality is actually a unified, undivided whole without form, timeless & eternal - that is all we can say with certainty. The “world” is our perception of this reality. Our cognitive machinery is discrete, and a mapping of this reality into the metric & temporal spaces of the mind.
As someone who was once an "I've watched a lot of YouTube videos about physics" person, I think it's very interesting how confident people are in their understanding of what is essentially the edge of physics, something only seen in a master's or doctorate degree. Specifically the whole spacetime and QM thing.
There are so many videos on it that you begin to feel like you really understand it after half a dozen or so repeat the same words at you. But the thing you don't realize is that those are literally the only words they could possibly communicate, and that's an infinitesimal fraction of the nuance of the real thing. Plus, there's a selection bias in that the videos that make you feel good get more views, so you're more likely to stumble upon videos that make you _think_ you understand it than videos that actually help you understand, partly because the only things that could make you understand are a graduate degree's worth of 80-100 hour lecture courses that you'd have to take notes on.
It makes me wonder if this is true for literally every subject. I fancy myself at least politically versed in modern events, but is my entire understanding based on an entertainment-first version of an actual education? How much do I walk around using jargon that I don't really understand to make points whose true depth I'm completely unaware of?
My favorite instance of the phenomenon you speak of is the guys who haven’t thought about math since 11th grade, or physics ever aside from seeing some Joe Rogan clips with Eric Weinstein, but will pound their keyboards with fury that “STRING THEORY IS A BIG LIE!!!”
Well, unfortunately some otherwise great physics educators intentionally stoke that fire, portraying the current particle physics agenda as if it's some conspiracy to waste funding rather than the consensus of thousands of the best minds in physics. Selling people a superficial sense of contrarian insight ends up being a very successful marketing tactic.
Most nerds who "understand quantum mechanics" mistakenly believe (usually due to incorrect explanations) that the Planck units are somehow the fundamental units of reality.
A discrete unit of measure exists in dynamics, & its relationship to position space is informed by Heisenberg’s theorem.
(My actual point is that Reality is neither continuous nor discrete — it is an infinitesimal point and it is our mind — that likes to name and number things and relies on duality to make ‘distinctions’ — that creates the universe, the subjective reality that we perceive ourselves as inhabiting.)
The tao that can be told
is not the eternal Tao
The name that can be named
is not the eternal Name.
The unnamable is the eternally real.
Naming is the origin
of all particular things.
Free from desire, you realize the mystery.
Caught in desire, you see only the manifestations.
Yet mystery and manifestations
arise from the same source.
This source is called darkness.
Darkness within darkness.
The gateway to all understanding.