Bayesian Analysis of Craig Keener, “Otho: A Targeted Comparison”

[Recently I wrote a critical review of Christian scholar Craig Keener’s new volume Biographies and Jesus: What Does It Mean for the Gospels to Be Biographies?, with emphasis on chapter 6–“Otho: A Targeted Comparison of Suetonius’ Biography and Tacitus’ History, with Implications for the Gospels’ Historical Reliability”–which is written by Keener himself. Ancient historian Richard Carrier sent me some further analysis, which makes both a deductive and inductive critique of Keener’s arguments. Carrier’s feedback can be found below. -MWF]

Applying Bayes’ Theorem to your article’s point:

Keener says we can be sure Suetonius et al. worked from sources, because they say they worked from sources. Then he says we can assume the same of the Gospels, because the Gospels have other similarities to Suetonius et al., except for that one.

This is a straightforward fallacy of false generalization. “All X’s did Y, and all Y’s entail doing Z, therefore all X’s did Z” does not lead by any valid logical inference to “The Gospels are an X,” precisely because the Gospels did not do Y (so the first premise in the argument fails to obtain). So any other similarities there may be are analogically irrelevant to whether the Gospels did Z. Only doing Y can entail Z. He would need to find examples of texts that we can be certain did Z, without doing Y (Y being “naming and discussing sources”). Without arguing in a circle.

So much for the deductive logic.

An apologist will insist the argument is inductive instead. But then Bayes' Theorem enters.

Keener’s argument that “we can be sure Suetonius et al. worked from sources” has this form:

The probability that they would say Y and not have done Z is low; therefore, given Y, the probability they did Z is high. But Keener has no evidence this relation holds for anything other than Y.

Where Y is in e, as are all other similarities between Suetonius et al. and the Gospels, then:

P(Z|e) = P(Z)P(e|Z) / [ P(Z)P(e|Z) + P(~Z)P(e|~Z) ]

Suppose we break the evidence into just Y, and then X for all the other parallels.

For just Y:

P(Z|Y) = P(Z)P(Y|Z) / [ P(Z)P(Y|Z) + P(~Z)P(Y|~Z) ]

Assume we’re neutral on the prior (we might not be, but that depends on other arguments and we are just analyzing this one), so that P(Z) = P(~Z), then:

P(Z|Y) = P(Y|Z) / [ P(Y|Z) + P(Y|~Z) ]

Which → 1 as P(Y|~Z) → 0.
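That limit can be checked numerically with a minimal sketch (the helper function and the sample values are mine, purely illustrative):

```python
def posterior(prior, likelihood, alt_likelihood):
    """Bayes' theorem: P(Z|E) = P(Z)P(E|Z) / [P(Z)P(E|Z) + P(~Z)P(E|~Z)]."""
    numerator = prior * likelihood
    return numerator / (numerator + (1 - prior) * alt_likelihood)

# Neutral prior P(Z) = P(~Z) = 0.5, so the prior cancels out.
# Watch P(Z|Y) climb toward 1 as P(Y|~Z) shrinks toward 0:
for p_y_given_not_z in (0.5, 0.1, 0.01, 0.001):
    print(p_y_given_not_z, posterior(0.5, 1.0, p_y_given_not_z))
```

The whole force of the argument sits in that one assumption, that P(Y|~Z) is low; everything else cancels.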

That’s Keener’s argument.

But when he turns to the Gospels, he falsely treats X as if it were Y. But that doesn’t work. His argument from X is:

P(Z|X) = P(Z)P(X|Z) / [ P(Z)P(X|Z) + P(~Z)P(X|~Z) ]

And with a neutral prior that’s:

P(Z|X) = P(X|Z) / [ P(X|Z) + P(X|~Z) ]

Keener presents no evidence that this → 1 as P(X|~Z) → 0, nor any evidence that P(X|Z) is even high.

What instead he does is argue:

P(Z|Y&X) = P(Y&X|Z) / [ P(Y&X|Z) + P(Y&X|~Z) ]

Which gets him the Y result, then he uses the same argument for the Gospels, but "forgets" the Gospels don't have Y. He is thus conflating Y with X. To argue X correlates with Z in the absence of Y requires actual evidence that that is ever the case. He presents none. Presenting examples that correlate Z with Y&X simply does not constitute evidence that Z correlates with X alone. That's the generic Bayesian analysis of the fallacy of false analogy in a nutshell.
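The conflation can be made concrete with illustrative numbers (all of them assumed for demonstration, not drawn from any survey of texts): even when Y&X strongly predicts Z, X alone may predict nothing.

```python
def posterior(prior, likelihood, alt_likelihood):
    # P(Z|E) = P(Z)P(E|Z) / [P(Z)P(E|Z) + P(~Z)P(E|~Z)]
    numerator = prior * likelihood
    return numerator / (numerator + (1 - prior) * alt_likelihood)

# Assumed: Y&X is rare among non-Z texts, so conditioning on Y&X
# pushes the posterior high...
print(posterior(0.5, 0.9, 0.05))  # high, ~0.95

# ...but X alone may be just as common in non-Z texts as in Z texts,
# in which case conditioning on X leaves you at the prior:
print(posterior(0.5, 0.5, 0.5))   # exactly 0.5
```

The second call is the Gospels' situation on Keener's own evidence: without data showing X is rarer in non-Z texts than in Z texts, X moves the posterior nowhere.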

Conversely, you point out that the generic similarities in X are actually known or credible properties even of fiction, so that the evidence there actually argues *against* any distinct correlation between X and Z (it may be at best 0.5, such that P(Y&X|Z) = P(Y&X|~Z)). So it's even worse than Keener having no evidence that X correlates with Z; the evidence actually is against there being any such correlation, at least in any reliable sense. That's the "at best" 0.5 correlation; but it's possibly worse, if the absence of Y is telltale of fiction, another argument of yours. And that need not be a correlation of 1; it could be, say, 0.8, allowing 20% of examples of no-Y texts still being Z texts. Look what happens when you are even that generous (and still using a neutral prior as if no other considerations mattered, which we know isn't the case), assuming no correlation between no-Z texts and containing X:

P(Z|X) = P(X|Z) / [ P(X|Z) + P(X|~Z) ] = 0.2 / (0.2 + 0.5) ≈ 0.29.

If no-Z texts typically contained X, it’s even worse. Only if no-Z texts rarely contain X would it get better; but that would require the very evidence Keener doesn’t present: that X correlates with Z.
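The worked example and its sensitivity to P(X|~Z) can be reproduced with a short sketch (the 0.2 and 0.5 are the "generous" assumptions stated above; the 0.8 is my own sample value for the "no-Z texts typically contain X" case):

```python
def posterior(prior, likelihood, alt_likelihood):
    # P(Z|X) = P(Z)P(X|Z) / [P(Z)P(X|Z) + P(~Z)P(X|~Z)]
    numerator = prior * likelihood
    return numerator / (numerator + (1 - prior) * alt_likelihood)

# The generous case above: P(X|Z) = 0.2, P(X|~Z) = 0.5, neutral prior.
print(round(posterior(0.5, 0.2, 0.5), 2))  # 0.29

# If no-Z texts typically contain X, the posterior drops further:
print(round(posterior(0.5, 0.2, 0.8), 2))  # 0.2
```

Only driving P(X|~Z) well below P(X|Z) would push the posterior above 0.5, and that is exactly the evidence Keener never presents.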

Introducing Y also changes the result, of course. But that's precisely what the Gospels don't do. Likewise any other generic factor that might raise the odds of Z; which would need to be demonstrated as doing so in other texts (and without circular argument).

-Richard Carrier


Big History: An Introduction

What is big history? This emerging, interdisciplinary field, pioneered and enriched by Dr. David Christian of Macquarie University, encourages a more holistic understanding of human events than does the traditional study of history. While historians are concerned with understanding the past in context, and considering cause and effect in human terms, big historians are concerned with understanding the past not only in its immediate human historical setting, but in the context of scientific and physical laws of nature as well. If history is written by the victor, then big history is written in the stars themselves.

Dr. Christian, bolstered by the support of philanthropist Bill Gates, first injected big history into the public sphere with a 2011 TED Talk, providing an 18-minute overview of world history. In this sensational talk, which has garnered more than 5 million views since its publication, Dr. Christian identifies the basic principles of big history, including the concept of Goldilocks conditions and the various "thresholds" of complexity that we observe in the universe. At various moments in the cosmic past, Christian states, certain Goldilocks conditions have come about, in which "not too little, and not too much" of certain components — usually energy or mass — have allowed the universe to reach states of increasing complexity.

Starting at the Big Bang and the first moment of time itself, Christian traces the cause-and-effect of each moment and identifies these thresholds. He highlights the six universal thresholds of complexity as follows:
