E.g. https://eprint.iacr.org/2014/595 . You can compress a computation trace of any reasonable length to a *constant*-size 374-byte string, such that, given the string, it is possible to verify in *constant* time (about 30 ms) that either the generator performed the computation as described, or else that it violated certain computational assumptions. (Said assumptions are not post-quantum secure, and generating the proof incurs a constant-factor slowdown of roughly 10^11.)
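For intuition only (a toy sketch, not the scheme in the linked paper): a Merkle commitment already lets a prover compress an arbitrarily long computation trace down to a single 32-byte root, and then open any one step of the trace with a proof whose size and checking time grow only logarithmically in the trace length. Real SNARKs reach constant-size proofs and constant-time verification with much heavier cryptographic machinery. Every name below (`merkle_levels`, `open_leaf`, `verify_leaf`) is invented for this sketch.

```python
# Toy Merkle commitment: compress a trace to one hash, spot-check one step.
import hashlib

def H(b: bytes) -> bytes:
    return hashlib.sha256(b).digest()

def merkle_levels(leaves):
    # Build every level of the tree bottom-up; assumes len(leaves) is a power of 2.
    levels = [[H(x) for x in leaves]]
    while len(levels[-1]) > 1:
        prev = levels[-1]
        levels.append([H(prev[i] + prev[i + 1]) for i in range(0, len(prev), 2)])
    return levels

def open_leaf(levels, i):
    # Authentication path for leaf i: one sibling hash per level.
    path = []
    for level in levels[:-1]:
        path.append(level[i ^ 1])
        i //= 2
    return path

def verify_leaf(root, leaf, i, path):
    # Recompute the root from the claimed leaf and its sibling path.
    h = H(leaf)
    for sib in path:
        h = H(h + sib) if i % 2 == 0 else H(sib + h)
        i //= 2
    return h == root

trace = [f"step {k}".encode() for k in range(8)]  # stand-in for a computation trace
levels = merkle_levels(trace)
root = levels[-1][0]                              # the 32-byte "compressed" commitment
proof = open_leaf(levels, 3)
print(verify_leaf(root, trace[3], 3, proof))      # True: step 3 checks out
print(verify_leaf(root, b"forged step", 3, proof))  # False: tampering is caught
```

The gap between this sketch and a SNARK is exactly the interesting part: here the verifier only spot-checks individual steps, whereas a SNARK convinces the verifier about *all* steps at once with a single constant-size string.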

So, presumably nobody here has requested a copy of either the 200 TB or the 68 GB version and verified it themselves. If they had published a 374-byte SNARK (setting aside the implausible amount of computation generating it would take with known techniques), how would that change the status of the result?

• Marijn J. H. Heule, Oliver Kullmann, and Victor W. Marek, Solving and verifying the Boolean Pythagorean triples problem via cube-and-conquer.

I didn’t see a single number giving the total length of all the programs used.

It’s interesting that these 200 terabytes were used to solve a yes-or-no question, whose answer takes a single bit to state: *no*.
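To make the yes-or-no question concrete: can {1, …, n} be split into two parts, neither of which contains a Pythagorean triple a² + b² = c²? The paper's answer is "no" once n reaches 7825. The tiny backtracking search below checks small n directly; it is only an illustration, not the cube-and-conquer SAT method the paper actually uses, and the function names are mine.

```python
# Enumerate Pythagorean triples within {1..n}, then try to 2-colour the
# numbers so that no triple is monochromatic.
def pythagorean_triples(n):
    squares = {k * k: k for k in range(1, n + 1)}
    return [(a, b, squares[a * a + b * b])
            for a in range(1, n + 1)
            for b in range(a, n + 1)
            if a * a + b * b in squares]

def two_colorable(n):
    triples = pythagorean_triples(n)
    vars_ = sorted({v for t in triples for v in t})  # only numbers in some triple matter
    color = {}

    def bad(t):
        # A triple is violated only once all three members share one colour.
        return all(v in color for v in t) and len({color[v] for v in t}) == 1

    def search(i):
        if i == len(vars_):
            return True
        x = vars_[i]
        for c in (0, 1):
            color[x] = c
            if not any(bad(t) for t in triples if x in t) and search(i + 1):
                return True
        del color[x]
        return False

    return search(0)

print(two_colorable(30))   # True: a valid 2-colouring exists for small n
```

For n = 30 this finds a colouring almost instantly; the theorem is that no such colouring exists once n = 7825, and certifying that negative answer is what took the 200 TB.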

If the Turing machine does not halt, then the sequence of written symbols together with the machine's states (recorded on another tape) eventually becomes periodic.
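One concrete version of that idea: record the machine's full configuration (state, head position, tape contents) at each step; since the machine is deterministic, a repeated configuration means it will cycle forever and never halt. This only detects loops whose configurations stay bounded, so it is a sketch of the periodicity observation, not a general halting test; the machine and function names below are made up for illustration.

```python
# Detect non-halting by configuration repetition in a deterministic TM.
def run(delta, state='A', halt='H', max_steps=1000):
    tape, head, seen = {}, 0, set()
    for _ in range(max_steps):
        if state == halt:
            return 'halts'
        config = (state, head, tuple(sorted(tape.items())))
        if config in seen:
            return 'loops'          # repeated configuration => never halts
        seen.add(config)
        symbol = tape.get(head, 0)  # blank cells read as 0
        write, move, state = delta[(state, symbol)]
        tape[head] = write
        head += 1 if move == 'R' else -1
    return 'undecided'

# (state, symbol) -> (write, move, next_state): bounce between two cells forever.
delta = {('A', 0): (0, 'R', 'B'), ('B', 0): (0, 'L', 'A')}
print(run(delta))   # 'loops'
```

The catch, of course, is that a non-halting machine can keep writing fresh tape forever, so its configurations never repeat exactly; that is why this check cannot decide halting in general.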

This looks like a translation of my post into Russian.
