I apologize for this entry, which doesn’t really belong here: it’s a piece of fan fiction, or actually fictional history, with only a slight bit of relevance to the problems this blog is about.
Like many scientists I have a grudging admiration for the Star Trek franchise: grudging because the science is so often silly, and could often have been improved easily without spoiling the stories; admiration because it has created a hopeful vision of the future, some fun stories, and some enduringly interesting characters.
In Discovery we heard about the Logic Extremists, a dissident faction of Vulcans who wanted to leave the Federation. But we didn’t learn much about their core beliefs! They seemed rather similar to the Vulcan Isolationists, who came about a hundred years later. There seemed to be an interesting untold story lurking behind the name.
So, I went to T’Karath and spent a couple of weeks poring through the historical documents on this movement. Here’s a quick sketch of what I found.
In the first half of the 22nd century, the central government had become corrupt, with Romulan operatives infiltrating the Vulcan High Command. Some Vulcans, the Syrannites, attempted to reinstate and develop the original teachings of the Vulcan philosopher Surak. But around 2140, another small group decided that Surak had not developed logic with sufficient thoroughness.
This group of thinkers argued that all deductive reasoning should be formalized, all inductive reasoning should be Bayesian with explicit probabilities on hypotheses, and all decision-making should maximize utility.
This group, who called themselves the Pure Logic movement, moved to Xir’tan and set up a commune there. They began a program of formal concept analysis so that all words would have precise definitions. Before each meal they bowed, seemingly in prayer, but actually to optimize their activities to come. Children were schooled in an even more disciplined way than usual: less high-tech than the skill domes of the 2200s, but with an intense focus on logic, semiotics, probability, and statistics.
Conflicts erupted in 2200 between what we would call Jaynesian-Bayesians and hardcore subjective Bayesians. The former advocated entropy-maximizing priors. The latter argued that no prior counts as “right” without further assumptions, so one is free to start with any prior.
As the Pure Logic movement became established, they spread and set up communes on the main continent, especially in Gol, Xial and Raal. They started influencing the political establishment, first locally and then at the federal level.
As this happened, factions with radical positions gradually gained influence. Especially important were the subjective Bayesians who argued that ethics could not be logically derived, so that instead of maximizing utility, a rational agent was free to maximize any chosen quantity. Their motto was remarkably similar to a saying credited to Hume:
Going further, the most extreme subjective Bayesians adopted spreading the Pure Logic movement as their only goal. All decisions were to be evaluated based on how much they furthered the spread of logical thinking. They took a vow to this effect, and pressed this vow on other citizens as a prerequisite for holding office of any sort. Their opponents dubbed them “Logic Extremists”, and the term stuck.
In 2226, in a hard-fought political struggle, these extremists triumphed and completely pushed the Jaynesian-Bayesians and moderate subjective Bayesians out of power. Two years later V’Arak took control: a charismatic leader who asserted with 100% prior probability that the Federation was trying to subvert Vulcan culture and stop the spread of the Pure Logic movement.
Any attempt to reason with V’Arak and his supporters, or to compromise with them, was interpreted as further evidence of an increasingly elaborate Federation conspiracy. Most Vulcans repudiated this stance, and as the Logic Extremists’ public support shrank, they turned to terrorism.
The violence came to a head around 2256, when V’latak (shown below) attempted to assassinate Sarek before the peace talks on Cancri IV, saying:
Logic above all. Vulcans will soon recognize and withdraw from
the failed experiment known as the Federation.
At this point support for the Logic Extremists rapidly dropped and the movement began to dissipate, though Patar still managed to infiltrate Section 31.
However, the most interesting aspect of the Logic Extremists is their early theoretical writings — especially those of Avarak, and Patar’s father Tesov. These were an extremely bold attempt to plan a society based purely on logic. I hope they’re translated soon.
That was a fun read! I wonder how much the “Jaynesian-Bayesians” are based on today’s rationalists :)
The Jaynesian-Bayesians are based on today’s Jaynesian-Bayesians: the people who follow E. T. Jaynes in using entropy maximization to choose priors. However, the “logic extremists” are my way of trying to gently poke fun at certain excesses I’ve seen here and there in today’s “rationalist” movement… not that any of them go quite that far.
MaxEnt works well for many probability density functions. The approach characterizes a distribution using only the limited information available: first assuming only the mean, then the mean and variance, and so on.
Is that the MO of the Jaynesian-Bayesians as well?
Yes — Jaynes was the guy who came up with max-ent methods in statistics in 1957, and one of his favorite problems was something like guessing the probability that you’ll roll a 6 on a die if the mean value of this die’s rolls is 4. Check this out if you’re not already familiar with him:
• Wikipedia, Edwin Thompson Jaynes.
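Jaynes’ dice puzzle is easy to work out numerically. Here’s a quick sketch, using only the standard library: among all distributions on the faces 1..6 whose mean is 4, find the one with maximum entropy, then read off the probability of rolling a 6. (The maximum-entropy solution is known to take the exponential form p_i ∝ x^i for some x > 0, so we only need to solve for x.)

```python
# Jaynes' dice problem: the die's long-run mean is 4, not the fair 3.5.
# The MaxEnt distribution over faces 1..6 has the form p_i ∝ x**i; we
# bisect for the x that matches the mean constraint.

def mean_for(x):
    """Mean of the distribution p_i ∝ x**i on faces 1..6."""
    weights = [x**i for i in range(1, 7)]
    total = sum(weights)
    return sum(i * w for i, w in zip(range(1, 7), weights)) / total

# Bisect on [1, 2]: mean_for(1) = 3.5 and mean_for(2) ≈ 5.1 bracket 4.
lo, hi = 1.0, 2.0
for _ in range(100):
    mid = (lo + hi) / 2
    if mean_for(mid) < 4.0:
        lo = mid
    else:
        hi = mid
x = (lo + hi) / 2

weights = [x**i for i in range(1, 7)]
total = sum(weights)
probs = [w / total for w in weights]
print("P(6) =", probs[5])  # noticeably above 1/6, since the required mean is high
```

High faces get boosted and low faces suppressed, which is exactly the "least committal" way to honor the mean-4 constraint.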
It’s funny: The same process of reading and late nights that convinced me that subjective probability is good for physics also led me to conclude that it can’t be the theory of all learning that “rationalists” want it to be.
I usually think of myself as an objective Bayesian, because of the importance I see in the relationship between information and probability. If you want to consider the probability that something is true about the real world, then you should think of this as determined by the information that you have about the world. But information is ultimately just as subjective as probability (and not just in the sense that all bayesianism is subjective because it depends on who has the information and whose probability you're considering). As the mathematics of probability doesn’t care how you got your prior, so the mathematics of information theory doesn’t care how you got your information. You can say ‹Suppose that the probabilities are such and such; then what?›, and you can say ‹Suppose that the available information is this and that; then what?›. (Indeed, these are two different ways of saying the same thing.)
And I don’t think that information is always prior to probability either. If the information that you have can be thought of as a list of facts, so that the probabilities are such as to give the maximum entropy consistent with those facts, then very well. But you can also start with a probability distribution and say that the information that you have is that these are the probabilities. To be more concrete, if you know that a coin can land on either heads or tails and you have no further information, then the probability is 1/2 heads and 1/2 tails; but you can also say that what you know about the coin is that the probability is 1/2 heads and 1/2 tails. Or in another situation, you could say that what you know about the coin is that the probability is 1/3 heads and 2/3 tails; in that case, you still know that the coin can land on either heads or tails but you do have further information besides that, and the best way to sum up that additional information is to state the resulting probabilities. You can't describe that information as simply a list of facts; at least, not a list of facts about just the coin.
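The coin examples above can be made concrete with a two-line entropy calculation: with no information beyond “heads or tails,” MaxEnt picks the uniform 1/2–1/2 coin, whose entropy is strictly higher than that of the 1/3–2/3 coin.

```python
# Shannon entropy (in bits) of a discrete distribution, illustrating why
# MaxEnt picks the uniform coin when nothing beyond the outcome set is known.
from math import log2

def entropy(ps):
    return -sum(p * log2(p) for p in ps if p > 0)

print(entropy([0.5, 0.5]))   # 1.0 bit: the maximum for two outcomes
print(entropy([1/3, 2/3]))   # lower: this coin encodes extra information
```

The gap between the two numbers is precisely the “further information” the comment describes, information that can’t be stated as a list of facts about the coin alone.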
So, I’m a subjective objective Bayesian. (And since bayesianism is a subjective philosophy of probability, that makes me a subjective objective subjectivist.) Still, I’m not so subjective as to think that, when considering the probability of a real-world event, it is ever okay to assign exactly 100% probability to any non-tautology!
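The danger of assigning exactly 100% can be seen directly in Bayes’ theorem: a prior of exactly 0 or 1 is immune to any evidence whatsoever. A minimal sketch (the `update` function here is just an illustrative helper, not anyone’s published code):

```python
# One step of Bayes' theorem: P(H | E) from the prior P(H) and the
# likelihoods P(E | H) and P(E | not-H).

def update(prior, likelihood_if_true, likelihood_if_false):
    numerator = prior * likelihood_if_true
    denominator = numerator + (1 - prior) * likelihood_if_false
    return numerator / denominator

# Evidence E that is 99 times likelier when H is false:
print(update(0.5, 0.01, 0.99))   # a moderate prior collapses toward 0
print(update(1.0, 0.01, 0.99))   # a dogmatic prior of exactly 1 stays at 1
```

This is why V’Arak’s “100% prior probability” of a Federation conspiracy made him unreachable by argument: with a dogmatic prior, every update returns the prior.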
I’m glad you’re not a logic extremist! Yes, there are a lot of fascinating subtleties in this business. I wanted to imagine how “logic extremists” could become terrorists, and the easiest way was to imagine them very strictly dividing their reasoning into “deduction rules” and “initial assumptions”, and then saying that since the assumptions aren’t constrained by the deduction rules (except perhaps for the requirement that they not lead to a contradiction), one is free to choose them in any way. Of course this is devastating to ethics, or even the practice of subjective Bayesianism. And it makes some psychological sense that such a logic extremist
might choose the promulgation of their own views as their central goal… since that’s how fanatics work.
Curiously, the injunction against attributing 100% prior probability to any non-tautology is named after Cromwell, who once said:

I beseech you, in the bowels of Christ, think it possible you may be mistaken.
Which just goes to show that you can be not a logical extremist and still be a terrorist.
Cromwell was far from being a terrorist in this instance. He was the voice of liberty (in a limited form, I concede, but this was the 17th century, and even now…) against autocratic rule based on some supposedly “divine right” that ran in a family.
@Carl : I’m generally pro-Parliament in the English Civil War, and I don’t object to what Cromwell wrote in that letter to the Covenanters, but his actions in Ireland qualify as terror in my opinion. (I’m similarly generally pro-Jacobin in the French Revolution, but the word ‘terrorism’ was invented, by them, to describe some of their actions.) It's possible to do great evil even after winning a fight against another evil.
Speaking of droughts (the English drought of 1976): if there’s a film of soap or oil on the ocean surface, how would that affect the ocean waters? Would a lower evaporation rate measurably raise ocean temperatures? Wouldn’t the higher ocean temperatures melt the polar ice caps? Wouldn’t a reduction of ocean evaporation result in a drier atmosphere? Wouldn’t a drier atmosphere result in more evaporation of water from land masses? …
A causative line of reasoning always seems treacherous. Wouldn’t an atrophied peripheral vision, or some neurological defect presenting as a kind of perceptual aliasing, contribute to the attribution of miraculous causes to otherwise easily explainable chains of events? I often get fooled by magic tricks which are designed to evade such analysis.
If you say “There’s a 50:50 chance of winning the lottery,” on some level you’d be right, whether the pre-analyzed chance is 1/2 or 2/3 or 1/million &c: either you win or you don’t.
On another level you’d have to make up the difference by hypothesizing a sort of virtual dark force that restores these extremist numbers back to sanity. For instance, if there’s a 1:6 chance of rolling a ‘6’ and you roll 10,000 times and never get a ‘6’, you might make the gambler’s mistake of betting that this virtual force will restore sanity by letting you roll a ‘6’ on the next roll. That virtual force would act upon the dice as they tumble in mid-air, like some six-state qubit in superposition.
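A quick simulation (a hypothetical sketch, seeded for reproducibility) shows there is no such restoring force: even immediately after a run of non-sixes, a fair die still comes up ‘6’ about 1/6 of the time.

```python
# Gambler's-fallacy check: look at every roll that follows three straight
# non-sixes and measure how often it is a 6. Independence says ~1/6.
import random

random.seed(0)
rolls = [random.randint(1, 6) for _ in range(200_000)]

streak = 0                 # current run of consecutive non-sixes
next_after_streak = []     # outcomes immediately following 3+ non-sixes
for r in rolls:
    if streak >= 3:
        next_after_streak.append(r)
    streak = streak + 1 if r != 6 else 0

freq = next_after_streak.count(6) / len(next_after_streak)
print(f"P(6 | three non-sixes just occurred) ≈ {freq:.3f}")  # close to 1/6
```

The dice have no memory; only our bookkeeping of past rolls does.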
Somewhere the nature of ‘sanity’ always arises. One of the most terroristic of religions features a demigod who clearly suffered from delusions of grandeur, split personality, paranoia, manic depression, and a host of other psychological nuisances.
It seems to me the most recent development is that insurance companies are giving up on fundamentals and resorting to what the stock traders are doing these days (technical trading). The insurance companies are no longer carefully planning for the future based on real measured numbers, but instead fitting an incrementally updated curve and then expecting it to match the future. When it doesn’t, like the market collapse of the late 1990s, they hold a candle-light vigil, chalk it up to quantum tunneling or some other leaky logic, and then proceed confidently, knowing that the sun will rise again tomorrow and that the same weather model will work again.
I know there’s more to it than that, but that’s the impression my limited vision allows. I can’t make much logical sense of it.
As Einstein probably did NOT say: “The definition of insanity is doing the same thing over and over again and expecting different results.” But this seems true enough even though it may be false.
Yeah, I don’t know much about Cromwell, but I was surprised that he was the one saying this, not having it said to him. (Maybe he was beseeching other people to realize they could be wrong.)
Yes, he was beseeching others (Scottish Puritans who had just switched sides and made an alliance with the exiled King); and I’m pretty sure that he would never have said it if they'd still agreed with him!
I consider myself a ‘frequentist’ and view Bayesianism as a subset of that. (They just combine absolute probabilities to get conditional ones. Of course they do, but often you have to make a quick choice on your ‘random walk over rugged landscapes’.)
I view myself as a collection of cells and fundamental particles, with a long genealogy which presumably traces back to Africa and the big bang. But some people call me ‘ishi’ as a summary – that’s my ‘nature’ nickname; other people call me different names based on music or my birth certificate.
It seems one can learn, or at least get exposed to, a lot of science and math in movies or other ‘creative arts’ (fiction, music). I know that much of what little i know of rigorous science I first learned in music lyrics, fiction books, and a few movies – in popular form (though some i think were as profound as insights from current science – for example, one of my favorite quotes is by a notorious writer: ‘men only see what they look at, and they only look at what they already have in mind’).
I came across ET Jaynes ‘maxEnt’ approach from studying some statistical mechanics. (I was sort of amazed that what I considered physics could be viewed as just a branch of probability theory–now it seems obvious in a way).
I probably am a ‘logical extremist’ except i’m on the wrong planet — occasionally in this area birdwatchers (another community i’ve been a member of at times) see ‘strays’ – you see some bird around here that normally is found 1000 or 3000 miles away. ‘What is a snowy owl doing around here?’
There are a lot of ways of measuring information or the ‘volume of phase space’. MaxEnt seems to work well in many cases – this is ‘an inconvenient truth’ (although the formalism is often convenient, especially for the mathematically dis- (or specially) abled). Shannon and Kolmogorov entropies (or information) are often if not always identical. One can create a random or deterministic fractal.
I’m an agnostic (though the term i prefer is stochastic because i disagree with some people who call themselves agnostics — so i sort of have left the ‘faith’). Some people say i am a christian because i sort of believe in and practice the ‘golden rule’ (‘do unto others as you would have them do unto you’, though some interpret it as meaning ‘blondes have more fun’). But many other animals also follow some form of the golden rule but don’t go to church and tithe.
The only part of logical extremism i may disagree with is Hume’s dictum – you can’t get ought from is. From a constructionist view, maybe you can – as a stochastic i can’t decide, as Leibniz could, that ‘this is the best and only possible world’, and whether we have a choice in what the next one may be, if there is or will be one. (I’m also a conventionalist à la Poincaré — so all worlds are the same one, just in different states.)
(Some libertarian philosopher at U Colorado named Huemer has on his blog a post about ‘check your prior’ as a question by Feynman about the future of civilization. I’m definitely not a libertarian or a gun owner and proponent as he is – my area already has way too much gun violence and too many gun stores – but he blogged this interesting Feynman question. fakenous.net. i wonder (why i wonder why…) how you get a job as a philosopher.)
Fascinating. I think there is a lot right with that “Logical Extremist” view. However, even if assumptions are sort of freely choosable, I’d still want them to, besides being contradiction-free, at least vaguely satisfy some variant of Occam’s Razor.
Fewer, weaker assumptions are preferable if they suffice to come to any given conclusion, just like fewer, weaker axioms are preferable. And taking as an assumption that fact X is 100% certain – as in, it’s a fact of the world, rather than something I take as fact because it suits my current purposes (like an axiom) – seems extremely strong.
As for ethics, it seems to me that logic often is ill-equipped to handle that notion. I don’t see any “natural”, “universe-given” reason as to why one set of ethical beliefs should be preferable over any other. There are some vague arguments one could make, but they are all necessarily going to be human-centric. It seems more a matter of agreement/local consensus than one of universal law. All logic could possibly tell us is whether our current ethical ideals are free from contradictions and whether our actions don’t contradict our ethical ideals. But it’s gonna be incredibly hard to even get any given useful system of ethics to be sufficiently clearly defined to do arbitrary logic on top. Most of the questions arising there seem to be mere toy examples.
Though the greatest issue, I think, is the part where they attempt to make every single word in their language defined with 100% precision, such that everybody agrees on the full meaning of every single word. Language is fluid and ever changing and expanding. New contexts arise all the time, and with them comes the need to describe them, usually by either repurposing existing language in analogy or metaphor, or by inventing entirely new words or phrases.
Furthermore, contexts aren’t homogeneously accessed by all people either. Special groups of people are exposed to special contexts, which leads to specialized language.
There certainly is
* language that can be pinned down completely such that it’s just a matter of education whether it’s properly understood. But the vast majority of situations could not possibly be covered by that. I’d assign to this being avoidable a probability of 0 (of the form “almost certainly not”).
* and language that is fully pinned down comes with extra caveats: This pinning only could fully happen under logical omniscience. People are not able to hold arbitrarily complex facts in their mind, so they will briefly forget currently irrelevant information in favor of pieces that are relevant. That means, at any given moment, very often, even “precisely defined” terms aren’t gonna be fully exact in the minds of those people. They might be able to cover the full** scope of a simple term over some time, but not instantly.
** gotta be really careful with “full” as well. Even with simple facts it’s gonna be impossible to, in a single instant, realize everything that follows from them. There’s gotta be some sort of cutoff for the scope. Only in such a limited sense could “the full scope” even just potentially be known.
“So, I went to T’Karath …”
best post ever
I like to think that Spock was aware of non-binary (paraconsistent, relevant) logic. Perhaps the logic extremists were merely “classical” binary logicians, who were naturally driven insane by the paradoxes inherent in such a limited logic. Banach-Tarski? Really? Classical logic is broken. (They even killed an android on Star Trek by saying “I am lying.” https://www.youtube.com/watch?v=wlMegqgGORY)
There’s a paraconsistent version of Bayes’ Theorem, too. The total probability takes into account the probability that P is true, or P is false, or the probability that P is both true and false (think of the boundary between two sets as having some overlap). Quantum mechanics suggests this is the correct way of modeling entangled states (although I’m mystified by the square root of a probability), and presumably the Vulcans, who had FTL travel, knew about quantum mechanics… Those poor logic extremists. You might as well argue that all numbers are rational.
“Logic is a little tweeting bird chirping in a meadow. Logic is a wreath of pretty flowers which smell bad.” — Spock
re: the balance of forces and the protector circa 1650, a short form ref is in wikipedia, see “Wars of the Three Kingdoms”.
thanks for the reminder to actually read jaynes, not just borrow, skim, and return.