Author: Jason Bryer, Ph.D. (jason@bryer.org)
Website: bryer.org

Running the App

The app is deployed to RStudio's shinyapps.io service at jbryer.shinyapps.io/BayesBilliards.

The Shiny app is included in the DATA606 package on GitHub and, once the package is installed, can be run using the DATA606::shiny_demo('BayesBilliards') function.

Or, run the app directly from GitHub using the shiny::runGitHub('DATA606', 'jbryer', subdir='inst/shiny/BayesBilliards') function.

Problem Statement

Consider a pool table of length one. An 8-ball is thrown such that the likelihood of its stopping point is uniform across the entire table (i.e. the table is perfectly level). The location of the 8-ball is recorded, but not known to the observer. Subsequent balls are thrown one at a time and all that is reported is whether the ball stopped to the left or right of the 8-ball. Given only this information, what is the position of the 8-ball? How does the estimate change as more balls are thrown and recorded?
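
To make the setup concrete, the data-generating process can be simulated directly in R. The sketch below is purely illustrative; the object names and seed are arbitrary and are not used by the app.

set.seed(2112) # arbitrary seed, for reproducibility only
eight_ball <- runif(1) # true (hidden) position of the 8-ball
throws <- runif(10) # positions of ten subsequent throws
observations <- ifelse(throws < eight_ball, 'left', 'right') # all the observer sees
observations

The observer sees only the vector of left/right outcomes, never eight_ball itself; the task is to recover the 8-ball's position from those outcomes.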

Usage

Strategy

For the initial iteration, we have a single observation of left or right. We represent a uniform prior distribution with k random draws from unif(0, 1). For each of these k prior values, we draw another unif(0, 1) value, representing the thrown ball, and record whether it is less than (i.e. to the left of) or greater than (i.e. to the right of) the prior value. The prior values whose simulated outcome matches the observed outcome are kept; these constitute our posterior distribution.

For subsequent iterations, we simply use the posterior distribution from the previous iteration as the prior distribution for the current sampling.

In R, we start with a vector, prior, of k draws from runif.

k <- 5000 # number of simulated 8-ball positions
prior <- runif(k, min=0, max=1) # uniform prior across the length-one table
hist(prior)

Assume the first ball falls to the left of the 8-ball. We randomly draw k values from a uniform distribution, representing the thrown ball, and keep the values from the prior distribution that are greater than the random value (i.e. the candidate 8-ball positions for which the simulated throw would have landed to the left). This is the posterior distribution.

posterior <- prior[prior > runif(k, min=0, max=1)] # keep candidate positions with the simulated throw to their left
hist(posterior)

Now, consider we throw another ball that lands to the right. The prior distribution for this iteration is the posterior from the previous iteration. In order to have k elements in the prior distribution, we sample k values from the posterior with replacement.

prior <- sample(posterior, k, replace=TRUE) # resample the posterior back up to k values

Getting the posterior distribution works the same way as in the initial iteration, except that, because this ball landed to the right, we keep the prior values that are less than the random value.

posterior <- prior[prior < runif(k, min=0, max=1)] # keep candidate positions with the simulated throw to their right
hist(posterior)

With two observations, there is a 50% chance the 8-ball lies between the 25th and 75th percentiles of the posterior distribution.

mean(posterior) # point estimate of the 8-ball's position
quantile(posterior, c(.25, .75)) # 50% credible interval
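
The same resample-and-filter update generalizes to any sequence of observations by repeating the two steps above in a loop. The following is a minimal sketch of that idea; the obs vector is illustrative, and this is one possible implementation of the strategy described above rather than the app's exact code.

k <- 5000
obs <- c('left', 'right', 'left', 'left') # an example sequence of reported outcomes
posterior <- runif(k, min=0, max=1) # start from the uniform prior

for (o in obs) {
  prior <- sample(posterior, k, replace=TRUE) # resample back up to k values
  throws <- runif(k, min=0, max=1) # simulated throws
  if (o == 'left') {
    posterior <- prior[prior > throws] # ball left of the 8-ball: keep larger candidate positions
  } else {
    posterior <- prior[prior < throws] # ball right of the 8-ball: keep smaller candidate positions
  }
}

quantile(posterior, c(.25, .75)) # 50% credible interval after all observations

As more balls are thrown, the surviving values concentrate around the true position of the 8-ball and the credible interval narrows.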

References

Bayes, T. (1763), An essay towards solving a problem in the Doctrine of Chances. Philosophical Transactions of the Royal Society of London, 53. Retrieved from http://www.stat.ucla.edu/history/essay.pdf

Downey, A. (2015). Bayesian Billiards. Retrieved from http://allendowney.blogspot.com/2015/06/bayesian-billiards.html

Eddy, S.R. (2004). What is Bayesian statistics? Nature Biotechnology 22. Retrieved from http://www.nature.com/nbt/journal/v22/n9/full/nbt0904-1177.html

McGrayne, S.B. (2011). The Theory That Would Not Die: How Bayes' Rule Cracked the Enigma Code, Hunted Down Russian Submarines, and Emerged Triumphant from Two Centuries of Controversy. Yale University Press.


