Okay, I’ve updated my explanation of strong cosmic censorship, both in the article and here on the blog. Thanks!

Thanks! Do you have a favorite introductory paper on this subject?

Right now the condensed matter section is pathetically out of date and pathetically small. I’d really like to see some lists of the top open questions in condensed matter physics.

Sure, for instance by replacing “maximal Cauchy development” with “influence zone” throughout. And replacing “initial data set” with “initial state”.

Thanks for the attempt at a “popular” explanation. I’m very happy with the concept of “Cauchy development” myself, but I’m reluctant to spring this term on the readers of the FAQ. It doesn’t really help people much unless they’ve studied a bit about hyperbolic PDEs and their Cauchy surfaces. But I think it’s possible to say what you said without bringing in this term. I may try something like this.

Of course I understand that you want your FAQ to be lowbrow. Nevertheless, it might be possible to give a rough explanation of the strong cosmic censorship issue, a little less rough than “under reasonable conditions general relativity is a deterministic theory”. Maybe as follows.

Assuming that we know, for some fixed time, the state of our universe at each point in space, can we deduce the state of the universe at all other points in time and space? Most physical theories are indeed deterministic in this sense. The question is whether the same is true in general relativity.

For each possible state at a given time (aka “initial data set”) there is indeed a spacetime region (the “maximal Cauchy development” of the initial data set) whose properties are determined by the initial data set. The question is whether this region is always (for each possible initial data set) the whole universe.
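For readers who want a concrete picture of a “region determined by initial data”, here is a standard toy sketch of mine (not part of the comment above) from a simpler hyperbolic equation: d’Alembert’s solution of the one-dimensional wave equation.

```latex
% 1-D wave equation with initial data (f, g) prescribed at time t = 0
\partial_t^2 u = c^2\,\partial_x^2 u,
\qquad u(x,0) = f(x), \qquad \partial_t u(x,0) = g(x).

% d'Alembert's formula: the solution everywhere is fixed by the initial data
u(x,t) = \tfrac{1}{2}\bigl(f(x-ct) + f(x+ct)\bigr)
       + \frac{1}{2c}\int_{x-ct}^{x+ct} g(s)\,\mathrm{d}s.
```

The value at $(x,t)$ depends only on the data in the interval $[x-ct,\,x+ct]$; the set of all points determined this way plays, for this simple equation, the role that the maximal Cauchy development plays in general relativity.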

The answer is “no”: for certain special initial data sets, the universe extends beyond the maximal Cauchy development. However, Penrose formulated the “Strong Cosmic Censorship Conjecture”, which says that for **almost** all initial data sets the maximal Cauchy development cannot be extended: when we change any given initial data set slightly, the typical result is an initial data set with an inextendible maximal Cauchy development.

Whether this conjecture is true might depend on what precisely counts as an “extension” of a given development. If we demand only that an extension is a “continuous” universe — it is allowed to look like a crumpled piece of paper, just not torn anywhere — then the Strong Cosmic Censorship Conjecture becomes false. However, it might become true when we demand that extensions be smooth, without crumples.

Signal processing has all sorts of comparably spooky phenomena. Information theory didn’t become a mainstream physics topic until well after the rise of quantum theory in the early 1920s, when people like E. T. Jaynes became very concerned about the nature and role of information in physics. The Nyquist rate reeks of an uncertainty principle, though the discrete moiré patterns from aliasing would be more analogous to continuous wave interference.

Nils Abramson, who passed away last year, wrote a rather unusual book, Light in Flight, about those moiré patterns after he made his amazing photographs of a light front in slow motion. Of course everyone thought you couldn’t possibly snap a photo faster than the speed of light, but anything seems possible if you know some trick. So I find it hard to believe in strict no-go theorems. Improbability is much easier to believe in. Those aliasing patterns, the moirés, turned out to be very useful. As useful as wavefunctions became to a discretizing quantum theory? Digital signal processing could just as easily have been called quantum signal processing. DSP and QM both deal with discrete values as technical improvements over classical, and more theoretical, analog techniques.
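As a minimal illustration of the aliasing mentioned above (a NumPy sketch of mine, not the commenter’s): a 130 Hz sine sampled at 100 Hz yields exactly the same sample values as a 30 Hz sine, so the samples cannot distinguish the two tones; this is the discrete analogue of a moiré pattern.

```python
import numpy as np

fs = 100.0                   # sampling rate in Hz; Nyquist rate is fs / 2 = 50 Hz
f_true = 130.0               # signal frequency, well above the Nyquist rate
f_alias = abs(f_true - fs)   # frequency it folds down to: 30 Hz

t = np.arange(32) / fs       # 32 sampling instants
samples_true = np.sin(2 * np.pi * f_true * t)
samples_alias = np.sin(2 * np.pi * f_alias * t)

# The two sampled sequences agree to floating-point precision:
# no processing of the samples can tell the 130 Hz tone from its 30 Hz alias.
print(np.max(np.abs(samples_true - samples_alias)))
```

In exact arithmetic the two sequences are identical, since 130·n/100 and 30·n/100 differ by the integer n for every sample index n.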

And now digital signal processors, DSPs, are the basis of GPUs, which do tensor maths and host the physics engines that simulate virtually real physical effects in simulations and video games. They are making it harder and harder for a sane person to tell the difference between the real world and a deep-fake hallucination. So the next time you see an acausal phenomenon, you might be tempted to call it a one-off miracle, a Houdini escape magic trick, a déjà vu error inside the Matrix. Or just blame the non-determinism on non-associativity, non-commutativity, or Chua circuits. But I suspect that in terms of signal processing they might blame it on something like superluminal propagation in a coax cable emulating a photonic crystal. Or something like that.

Q: Does redundant information survive inside a black hole, or are black holes like PKzip?

Q: Does spaghettification cause amnesia and leave everything else intact?

Q: Are black holes analogous to optical spatial filters (they get rid of noise, but you still have Airy rings, which look like the Gibbs phenomenon for cones instead of square waves)?
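The Gibbs comparison in that last question is easy to demonstrate numerically. Here is a short sketch of mine (not the commenter’s) showing the characteristic overshoot that appears when a square wave is rebuilt from finitely many harmonics, i.e. after an ideal low-pass cut:

```python
import numpy as np

N = 200                          # number of odd harmonics kept (an ideal low-pass cut)
x = np.linspace(0.0, np.pi, 20001)

# Partial Fourier sum of a square wave that jumps between -1 and +1
partial = np.zeros_like(x)
for k in range(1, 2 * N, 2):
    partial += (4.0 / np.pi) * np.sin(k * x) / k

# The square wave equals 1 on (0, pi), but the truncated sum overshoots
# near the jump by roughly 9% of the jump size: the Gibbs phenomenon.
print(partial.max())             # ≈ 1.179, no matter how large N gets
```

The overshoot never shrinks as more harmonics are added; it only moves closer to the jump, which is why ringing artifacts survive any finite-bandwidth filter.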

Happy New Year…
