In preparation for an alien invasion, the Earth Defense League has been working on new missiles to shoot down space invaders. Of course, some missile designs are better than others; let's assume that each design has some probability of hitting an alien ship, x.
Based on previous tests, the distribution of x in the population of designs is roughly uniform between 10% and 40%. To approximate this distribution, we'll assume that x is either 10%, 20%, 30%, or 40% with equal probability.
Now suppose the new ultra-secret Alien Blaster 10K is being tested. In a press conference, an EDF general reports that the new design has been tested twice, taking two shots during each test. The results of the test are confidential, so the general won't say how many targets were hit, but they report: "The same number of targets were hit in the two tests, so we have reason to think this new design is consistent."
Is this data good or bad; that is, does it increase or decrease your estimate of x for the Alien Blaster 10K?
Now here's a solution:
I'll start by creating a `Pmf` that represents the four hypothetical values of `x`:
```python
pmf = Pmf([0.1, 0.2, 0.3, 0.4])
pmf.Print()
```

```
0.1 0.25
0.2 0.25
0.3 0.25
0.4 0.25
```
Before seeing the data, the mean of the distribution, which is the expected effectiveness of the blaster, is 0.25.
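We can check that prior mean with plain Python. This is a minimal sketch using a dict in place of the `Pmf` class; the variable names are mine:

```python
# Prior: four equally likely values of x (probability of a hit).
prior = {0.1: 0.25, 0.2: 0.25, 0.3: 0.25, 0.4: 0.25}

# Expected effectiveness before seeing any data:
# the probability-weighted average of the hypothetical values.
prior_mean = sum(x * p for x, p in prior.items())
print(prior_mean)  # 0.25, up to floating-point rounding
```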
Here's how we compute the likelihood of the data. Since the blaster takes two shots in each test, there are three ways the two tests can tie: both get 0 hits, both get 1, or both get 2. If the probability of a hit on any shot is x, the probabilities of these outcomes are:

```
both 0: (1-x)**4
both 1: (2 * x * (1-x))**2
both 2: x**4
```

Here's the likelihood function that computes the total probability of the three outcomes:
```python
def likelihood(hypo, data):
    """Likelihood of the data under hypo.

    hypo: probability of a hit, x
    data: 'tie' or 'no tie'
    """
    x = hypo
    like = x**4 + (2 * x * (1-x))**2 + (1-x)**4
    if data == 'tie':
        return like
    else:
        return 1 - like
```
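Since 'tie' and 'no tie' are the only possible observations, their likelihoods should sum to 1 for every hypothesis. Here's a quick self-contained sanity check (my addition; the function body matches the one above):

```python
def likelihood(hypo, data):
    """Likelihood of the data under hypo: 'tie' or 'no tie'."""
    x = hypo
    like = x**4 + (2 * x * (1-x))**2 + (1-x)**4
    return like if data == 'tie' else 1 - like

# The two possible observations are exhaustive and mutually
# exclusive, so their probabilities must sum to 1 for any x.
for x in [0.1, 0.2, 0.3, 0.4]:
    total = likelihood(x, 'tie') + likelihood(x, 'no tie')
    assert abs(total - 1) < 1e-12
```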
To see what the likelihood function looks like, I'll print the likelihood of a tie for the four hypothetical values of `x`:
```python
data = 'tie'
for hypo in sorted(pmf):
    like = likelihood(hypo, data)
    print(hypo, like)
```

```
0.1 0.6886
0.2 0.5136
0.3 0.4246
0.4 0.3856
```
If we multiply each likelihood by the corresponding prior, we get the unnormalized posteriors:
```python
for hypo in sorted(pmf):
    unnorm_post = pmf[hypo] * likelihood(hypo, data)
    print(hypo, pmf[hypo], unnorm_post)
```

```
0.1 0.25 0.17215
0.2 0.25 0.1284
0.3 0.25 0.10615
0.4 0.25 0.0964
```
Finally, we can do the update by multiplying the priors in `pmf` by the likelihoods:

```python
for hypo in pmf:
    pmf[hypo] *= likelihood(hypo, data)
```
And then normalizing `pmf`. The result is the total probability of the data.
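The whole update can also be sketched without the `Pmf` class, using a plain dict as a minimal stand-in (my sketch; variable names are mine, and the normalization is written out explicitly):

```python
def likelihood(x, data):
    """Probability of 'tie' or 'no tie' given hit probability x."""
    tie = x**4 + (2 * x * (1-x))**2 + (1-x)**4
    return tie if data == 'tie' else 1 - tie

# Uniform prior over the four hypothetical values of x.
pmf = {x: 0.25 for x in [0.1, 0.2, 0.3, 0.4]}

# Multiply each prior by the likelihood of the data...
for x in pmf:
    pmf[x] *= likelihood(x, 'tie')

# ...then normalize.  The normalizing constant is the total
# probability of the data under the prior.
total = sum(pmf.values())
for x in pmf:
    pmf[x] /= total

print('total probability of a tie:', total)
for x in sorted(pmf):
    print(x, pmf[x])
```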
And here are the posteriors:

```
0.1 0.342178493341
0.2 0.255217650566
0.3 0.210991850527
0.4 0.191612005565
```
The lower values of `x` are more likely, so this evidence makes us downgrade our expectation about the effectiveness of the blaster. The posterior mean is 0.225, a bit lower than the prior mean, 0.25.
A tie is evidence in favor of extreme values of `x`: if `x` is close to 0 or 1, the outcome of each test is nearly deterministic, so ties are likely, whereas values near 50% make ties least likely. Since all four hypothetical values are below 50%, the tie shifts probability toward the lower ones.
Finally, here's how we can solve the problem using the Bayesian update worksheet:
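The worksheet columns (prior, likelihood, unnormalized posterior, posterior) can be reproduced programmatically. This is my sketch of the computation, not the worksheet itself; it also computes the posterior mean reported above:

```python
def likelihood(x, data):
    """Probability of 'tie' or 'no tie' given hit probability x."""
    tie = x**4 + (2 * x * (1-x))**2 + (1-x)**4
    return tie if data == 'tie' else 1 - tie

hypos = [0.1, 0.2, 0.3, 0.4]
priors = [0.25] * 4
likes = [likelihood(x, 'tie') for x in hypos]
unnorm = [p * l for p, l in zip(priors, likes)]
total = sum(unnorm)              # total probability of the data
posts = [u / total for u in unnorm]

print('hypo  prior  like    unnorm   posterior')
for row in zip(hypos, priors, likes, unnorm, posts):
    print('{:.1f}   {:.2f}   {:.4f}  {:.5f}  {:.4f}'.format(*row))

post_mean = sum(x * p for x, p in zip(hypos, posts))
print('posterior mean:', round(post_mean, 3))  # approximately 0.225
```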
In the next article, I'll present my solution to The Skeet Shooting problem.