I seem to be getting pulled into the project of updating this FAQ:

The more I look at it, the bigger the job gets. I started out rewriting the section on neutrinos, and now I’m doing the part on cosmic censorship. There are even bigger jobs to come. But it’s fun as long as I don’t try to do it all in one go!

Here’s the new section on cosmic censorship. If you have any questions or have other good resources to suggest, let me know.

Does Cosmic Censorship hold? Roughly: is general relativity a deterministic theory, and when an object collapses under its own gravity, are the singularities that might develop guaranteed to be hidden behind an event horizon?

Proving a version of cosmic censorship is a matter of mathematical physics rather than physics per se, but doing so would increase our understanding of general relativity. There are actually at least two versions: Penrose formulated the “Strong Cosmic Censorship Conjecture” in 1986 and the “Weak Cosmic Censorship Hypothesis” in 1988. Very roughly, strong cosmic censorship asserts that under reasonable conditions general relativity is a deterministic theory, while weak cosmic censorship asserts that any singularity produced by gravitational collapse is hidden behind an event horizon. Despite their names, strong cosmic censorship does not imply weak cosmic censorship.

In 1991, Preskill and Thorne made a bet against Hawking in which they claimed that weak cosmic censorship was false. Hawking conceded this bet in 1997 when a counterexample was found by Matthew Choptuik. It features finely tuned infalling matter poised right on the brink of forming a black hole. It almost creates a region from which light cannot escape, but not quite. Instead, it creates a naked singularity!

Given the delicate nature of this construction, Hawking did not give up. Instead he made a new bet, which says that weak cosmic censorship holds “generically”: that is, except for very unusual conditions that require infinitely careful fine-tuning to set up. For an overview see:

• Robert Wald, Gravitational Collapse and Cosmic Censorship.

In 1999, Christodoulou proved that for spherically symmetric solutions of Einstein’s equation coupled to a massless scalar field, weak cosmic censorship holds generically. For a review of this and also Choptuik’s work, see:

• Carsten Gundlach, Critical Phenomena in Gravitational Collapse.
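A central quantitative finding of Choptuik's work, reviewed in the Gundlach article above, is a universal scaling law. For a one-parameter family of initial data with tuning parameter p, black holes formed just above the threshold value p* have masses obeying

```latex
M \;\propto\; |p - p_\ast|^{\gamma}
```

where, for the massless scalar field, the critical exponent γ ≈ 0.374 is the same for every family of initial data. Exactly at p = p*, the collapse produces a naked singularity of zero mass: the counterexample that settled the bet.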

While spherical symmetry is a very restrictive assumption, this result is a good example of how, with plenty of work, we can make progress in rigorously settling the questions raised by general relativity.

What about strong cosmic censorship? In general relativity, for each choice of initial data—that is, each choice of the gravitational field and other fields at “time zero”—there is a region of spacetime whose properties are completely determined by this choice. The question is whether this region is always the whole universe. That is: does the present determine the whole future?

The answer is: not always! By carefully choosing the fields at time zero you can manufacture counterexamples. But Penrose, knowing this, claimed only that generically the fields at time zero determine the whole future of the universe.

In 2017, Mihalis Dafermos and Jonathan Luk showed that even this is false if you don’t demand that the fields stay smooth. But perhaps the conjecture can be saved if we require smoothness:

• Kevin Hartnett, Mathematicians Disprove Conjecture Made to Save Black Holes.

• Oscar J.C. Dias, Harvey S. Reall and Jorge E. Santos, Strong Cosmic Censorship: Taking the Rough with the Smooth.

I remember reading somewhere about some students who met with an aging Einstein in the early 1950s. These students supposedly tried to “educate” him about a new technology called “signal processing” (1948 was the “annus mirabilis” of signal processing). I don’t remember reading the outcome of that meeting, or whether it had any impact on Einstein’s thinking. But I tend to think signal processing artifacts such as the Gibbs phenomenon might have rung in his ears alongside thoughts of Dirac deltas, perhaps eliciting another way of looking at evaporating singularities.

Signal processing has all sorts of comparably spooky phenomena. Information theory didn’t become a mainstream physics topic until well after the rise of quantum theory in the early 1920s, when people like E. T. Jaynes became very concerned about the nature and role of information in physics. The Nyquist rate reeks of an uncertainty principle, though the discrete moiré patterns from aliasing would be more analogous to continuous wave interference.
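The aliasing mentioned above is easy to demonstrate numerically. Here is a minimal sketch (the frequencies and sample count are arbitrary choices for illustration): a 9 Hz sine sampled at 8 Hz, which is above the Nyquist rate of 4 Hz, is indistinguishable at the sample points from a 1 Hz sine.

```python
import numpy as np

fs = 8.0            # sampling rate in Hz; Nyquist rate is fs/2 = 4 Hz
n = np.arange(16)   # sample indices
t = n / fs          # sample times in seconds

# A 9 Hz sine is above the Nyquist rate, so it aliases down to 9 - 8 = 1 Hz:
fast = np.sin(2 * np.pi * 9 * t)
slow = np.sin(2 * np.pi * 1 * t)

print(np.allclose(fast, slow))  # True: the two signals agree at every sample
```

The two sampled sequences agree to floating-point accuracy, which is exactly the moiré-like ambiguity the Nyquist criterion is designed to rule out.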

Nils Abramson, who passed away last year, wrote a rather unusual book, Light in Flight, about those moiré patterns, after he made those amazing photographs of a light front in slow motion. Of course everyone thought you couldn’t possibly snap a photo faster than the speed of light, but anything seems possible if you know some trick. So I find it hard to believe in strict no-go theorems; improbability is much easier to believe in. Those aliasing patterns, the moirés, turned out to be very useful. As useful as wavefunctions became to a discretizing quantum theory? Digital signal processing could just as easily have been called quantum signal processing. DSP and QM both deal with discrete values as technical improvements over classical, more theoretical analog techniques.

And now digital signal processors, DSPs, are the basis of GPUs, which do tensor maths and host the physics engines that simulate virtually real physical effects in simulations and video games. They are making it harder and harder for a sane person to tell the difference between the real world and a deep-fake hallucination. So the next time you see an acausal phenomenon, you might be tempted to call it a one-off miracle, a Houdini escape trick, a déjà vu error inside the Matrix. Or just blame the non-determinism on non-associativity, non-commutativity, or Chua circuits. But I suspect that in terms of signal processing they might blame it on something like superluminal propagation in a coax cable emulating a photonic crystal. Or something like that.

Q: Does redundant information survive inside a black hole, or are black holes like PKzip?

Q: Does spaghettification cause amnesia and leave everything else intact?

Q: Are black holes analogous to optical spatial filters (they get rid of noise, but you still have Airy rings, which look like Gibbs phenomena for cones instead of square waves)?

Happy New Year…

Unfortunately I don’t know suitable resources. These issues are highly technical and cannot easily be popularized. That is why each main result has a proof that fills several hundred pages and comes with a long introduction in which the main result is stated only in vague terms. The precise statements always come much later in the article and involve MANY pages of definitions.

Of course I understand that you want your FAQ to be lowbrow. Nevertheless, it might be possible to give a rough explanation of the strong cosmic censorship issue, a little less rough than “under reasonable conditions general relativity is a deterministic theory”. Maybe as follows.

Assuming that we know, for some fixed time, the state of our universe at each point in space, can we deduce the state of the universe at all other points in time and space? Most physical theories are indeed deterministic in this sense. The question is whether the same is true in general relativity.

For each possible state at that fixed time (aka “initial data set”) there is indeed a spacetime region (the “maximal Cauchy development” of the initial data set) whose properties are determined by the initial data set. The question is whether this region is always (for each possible initial data set) the whole universe.

The answer is “no”: for certain special initial data sets, the universe extends beyond the maximal Cauchy development. However, Penrose formulated the “Strong Cosmic Censorship Conjecture”, which says that for almost all initial data sets the maximal Cauchy development cannot be extended: when we change any given initial data set slightly, the typical result is an initial data set with an inextendible maximal Cauchy development.

Whether this conjecture is true might depend on what precisely counts as an “extension” of a given development. If we demand only that an extension is a “continuous” universe (it is allowed to look like a crumpled piece of paper, just not torn anywhere), then the Strong Cosmic Censorship Conjecture becomes false. However, it might become true when we demand extensions to be smooth, without crumples.
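The idea of a region determined by initial data can be illustrated with a simple hyperbolic equation. Here is a minimal sketch (grid sizes and perturbations are arbitrary choices for illustration) using the 1D wave equation: two runs whose initial data agree only in the middle of the grid still agree exactly inside the region that signals from the edited edges cannot have reached.

```python
import numpy as np

# Two sets of initial data for the 1D wave equation u_tt = u_xx that
# agree on the middle of the grid but differ near the edges.
nx, nt = 200, 50          # grid points and time steps; dt = dx, so signal
                          # influence spreads by at most one cell per step
rng = np.random.default_rng(0)
u1 = rng.standard_normal(nx)
u2 = u1.copy()
u2[:40] += rng.standard_normal(40)    # perturb the left edge
u2[-40:] += rng.standard_normal(40)   # perturb the right edge

def evolve(u0, steps):
    """Leapfrog evolution with zero initial velocity and fixed endpoints."""
    prev, cur = u0.copy(), u0.copy()
    for _ in range(steps):
        nxt = np.empty_like(cur)
        nxt[1:-1] = 2*cur[1:-1] - prev[1:-1] + (cur[2:] - 2*cur[1:-1] + cur[:-2])
        nxt[0] = nxt[-1] = 0.0
        prev, cur = cur, nxt
    return cur

a, b = evolve(u1, nt), evolve(u2, nt)

# After nt steps, influence from the edited edges has spread at most nt
# cells inward, so the runs still agree on the shrunken interval [90, 110):
print(np.allclose(a[40+nt:160-nt], b[40+nt:160-nt]))  # True
print(np.allclose(a, b))                              # False overall
```

The shrinking interval of guaranteed agreement is a discrete stand-in for the maximal Cauchy development (the “influence zone”): the data on the middle segment determine the solution only inside the region that outside influences cannot reach.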

Thanks for the attempt at a “popular” explanation. I’m very happy with the concept of “Cauchy development” myself, but I’m reluctant to spring this term on the readers of the FAQ. It doesn’t really help people much unless they’ve studied a bit about hyperbolic PDE and their Cauchy surfaces. But I think it’s possible to say what you said without bringing in this term. I may try something like this.

Sure, for instance by replacing “maximal Cauchy development” with “influence zone” throughout. And replacing “initial data set” with “initial state”.

Okay, I’ve updated my explanation of strong cosmic censorship, both in the article and here on the blog. Thanks!

Just looking at your FAQs, I would like to suggest one that would be particularly well suited for you. One of the big open questions in condensed matter physics is the classification of topological phases of matter. It is a central question about the nature of many body physics. It is also a very mathematically rich field utilizing techniques from algebraic topology to category theory, so this should be right up your alley.

Thanks! Do you have a favorite introductory paper on this subject?

Right now the condensed matter section is pathetically out of date and pathetically small. I’d really like to see some lists of the top open questions in condensed matter physics.