Bayesian updating with a normal distribution
As a simple starting place, I'll assume that the prior distributions for these variables are uniform over all possible values.
I'm going to use a mesh algorithm to compute the joint posterior distribution, so I'll "cheat" and construct the mesh using conventional estimates for the parameters.
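The mesh idea above can be sketched as a grid approximation: lay out a grid of candidate parameter values, evaluate the likelihood at every grid point, and normalize. The data values below are hypothetical, and the grid bounds (mean plus or minus four standard deviations, a factor of four around the sample standard deviation) are one plausible way to "cheat" with conventional estimates, not the only one.

```python
import numpy as np

# Hypothetical data; in practice these are the observed measurements.
data = np.array([9.8, 10.2, 10.4, 9.6, 10.1, 9.9])

# "Cheat": center the mesh on conventional estimates of mu and sigma.
mu_hat, sigma_hat = data.mean(), data.std(ddof=1)
mus = np.linspace(mu_hat - 4 * sigma_hat, mu_hat + 4 * sigma_hat, 201)
sigmas = np.linspace(sigma_hat / 4, sigma_hat * 4, 201)
M, S = np.meshgrid(mus, sigmas)

# Uniform priors over the mesh, so the posterior is proportional to
# the likelihood.  Work in logs for numerical stability.
log_like = np.zeros_like(M)
for x in data:
    log_like += -np.log(S) - (x - M) ** 2 / (2 * S**2)

post = np.exp(log_like - log_like.max())
post /= post.sum()  # normalize over the whole mesh

# Marginal posterior for mu: sum out the sigma axis.
post_mu = post.sum(axis=0)
```

With uniform priors the joint posterior is just the normalized likelihood surface; marginals for either parameter fall out by summing over the other axis of the mesh.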
Ian Hacking noted that traditional "Dutch book" arguments did not specify Bayesian updating: they left open the possibility that non-Bayesian updating rules could avoid Dutch books.
Hacking wrote: "And neither the Dutch book argument, nor any other in the personalist arsenal of proofs of the probability axioms, entails the dynamic assumption. So the personalist requires the dynamic assumption to be Bayesian."
The distribution of belief over the model space may then be thought of as a distribution of belief over the parameter space.
Plugging in a single point estimate of the parameter in this way has the disadvantage that it does not account for any uncertainty in the value of the parameter, and hence underestimates the variance of the predictive distribution.
A generalization is Jeffrey's rule, which extends Bayes' rule to the case where the evidence itself is assigned a probability rather than being known with certainty.
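Jeffrey's rule can be illustrated with a small sketch. When the evidence partition {E_i} is itself uncertain, the new belief in a hypothesis H is P_new(H) = sum over i of P(H | E_i) * P_new(E_i). The events and numbers below are made up for illustration.

```python
# Jeffrey's rule: update P(H) when the evidence E is only probable.
# Hypothetical setup: H = "the cloth will sell", E = the cloth's color,
# glimpsed uncertainly by candlelight.

p_h_given_e = {"green": 0.8, "blue": 0.4, "violet": 0.1}  # fixed conditionals
p_new_e = {"green": 0.70, "blue": 0.25, "violet": 0.05}   # revised belief in E

# P_new(H) = sum_i P(H | E_i) * P_new(E_i)
p_new_h = sum(p_h_given_e[e] * p_new_e[e] for e in p_new_e)
```

Ordinary Bayesian conditioning is the special case where one E_i receives probability 1 and the rest receive 0.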
Bayesian theory calls for the use of the posterior predictive distribution to do predictive inference, i.e., to predict the distribution of a new, unobserved data point.
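The contrast between the posterior predictive distribution and the plug-in shortcut can be made concrete in a conjugate normal model with known observation noise. All numbers below are illustrative, and the formulas are the standard normal-normal conjugate update.

```python
import numpy as np

# Normal model with known observation sd sigma and a normal prior on mu.
sigma = 2.0                  # known observation noise (illustrative)
mu0, tau0 = 0.0, 10.0        # prior mean and sd for mu (illustrative)
data = np.array([1.2, 0.8, 1.5, 1.1])
n = len(data)

# Conjugate update for the posterior of mu.
post_var = 1.0 / (1.0 / tau0**2 + n / sigma**2)
post_mean = post_var * (mu0 / tau0**2 + data.sum() / sigma**2)

# Plug-in predictive variance: treats the point estimate of mu as exact.
plugin_pred_var = sigma**2

# Posterior predictive variance: adds the remaining uncertainty in mu.
posterior_pred_var = sigma**2 + post_var
```

Because posterior_pred_var exceeds plugin_pred_var by exactly the posterior variance of mu, the plug-in approach always understates the spread of a new, unobserved data point.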
Bayesian updating is particularly important in the dynamic analysis of a sequence of data.
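A minimal sketch of this sequential use: with a Beta prior on a Bernoulli success probability, each observation turns the current posterior into the prior for the next step. The data stream here is invented for illustration.

```python
# Sequential Bayesian updating for a Bernoulli stream with a Beta prior.
# After each observation the posterior becomes the prior for the next step.

alpha, beta = 1.0, 1.0          # Beta(1, 1): uniform prior on the success rate
stream = [1, 0, 1, 1, 0, 1, 1]  # hypothetical coin flips: 1 = heads, 0 = tails

for x in stream:
    alpha += x        # a success increments alpha
    beta += 1 - x     # a failure increments beta

# Posterior after the whole sequence is Beta(alpha, beta), identical to
# what a single batch update on all the data would give.
posterior_mean = alpha / (alpha + beta)
```

That the one-at-a-time result matches the all-at-once result is what makes Bayesian updating natural for dynamic, streaming analyses.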
Bayesian inference is a method of statistical inference in which Bayes' theorem is used to update the probability for a hypothesis as more evidence or information becomes available. It has found application in a wide range of activities, including science, engineering, philosophy, medicine, sport, and law.