## Week 5

This week, we began by finishing our discussion of hypothesis testing and p-values; following the book, I gave some examples of how this works in the location and location-scale models.
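As a quick illustration (not one of the book's examples, and the data here are made up), a two-sided p-value in the location model $$X_1,\dots,X_n \sim N(\mu, 1)$$, testing $$H_0: \mu = \mu_0$$, can be computed from the standardized sample mean:

```python
import math

def p_value(xs, mu0):
    """Two-sided p-value for H0: mu = mu0 in the N(mu, 1) location model."""
    n = len(xs)
    z = (sum(xs) / n - mu0) * math.sqrt(n)   # standardized mean under H0
    return math.erfc(abs(z) / math.sqrt(2))  # P(|Z| >= |z|) for Z ~ N(0, 1)

# Hypothetical data centered near 0.1, testing mu0 = 0:
data = [0.3, -0.1, 0.2, 0.0, 0.1]
print(round(p_value(data, 0.0), 4))
```

A large p-value like this one means the data give no reason to reject $$H_0$$ at any common significance level.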
We then went on to introduce one of the most important concepts in statistics: Bayesian models. I explained them in one of two ways (the first assignment question asks you to show formally why they are the same):
• A statistical model, being the data of a family of probability measures $$P_\theta$$, lets you compute the probability of an outcome once you assume a certain parameter to be the true value. A Bayesian model, however, consists of a probability measure $$\Pi$$ on $$\Theta\times S$$. So here the information is that you can determine how likely it is that an outcome and a parameter occur at the same time!
• Alternatively, a Bayesian model consists of a statistical model together with an extra probability measure on the parameter space $$\Theta$$, called the prior. This intuitively corresponds to the assumption that you have a rather good idea of how likely each parameter is!
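The two descriptions above can be checked on a toy example. In this sketch (all numbers are made up for illustration), the parameter space is the two-point set $$\Theta = \{0.3, 0.7\}$$ of coin biases, the model is $$P_\theta(\text{heads}) = \theta$$, and a prior on $$\Theta$$ determines the joint measure $$\Pi$$ on $$\Theta\times S$$:

```python
prior = {0.3: 0.4, 0.7: 0.6}           # assumed probability measure on Theta
outcomes = ["heads", "tails"]

def model(theta, s):                    # the statistical model P_theta
    return theta if s == "heads" else 1 - theta

# Second description -> first: Pi(theta, s) = prior(theta) * P_theta(s)
joint = {(theta, s): prior[theta] * model(theta, s)
         for theta in prior for s in outcomes}

# Pi really is a probability measure on Theta x S:
assert abs(sum(joint.values()) - 1.0) < 1e-12

# First description -> second: marginalizing Pi over S recovers the prior,
# so the two descriptions carry the same information.
marginal = {theta: sum(joint[(theta, s)] for s in outcomes) for theta in prior}
print(marginal)
```

This is essentially the computation the first assignment question asks you to carry out in general.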
There are endless papers on whether or not a situation calls for a Bayesian approach, the reason being that in general we are not exactly sure how to define a prior...
A good example of this was raised in class: suppose I want to make inferences about the midterm. It makes perfect sense (given enough students) to assume the scores are described by a location-scale statistical model. If I wanted to apply Bayesian techniques, however, I would also have to describe how the parameters $$\mu, \sigma^2$$ are distributed, and it is not clear how to do this (the question essentially asks how likely it is that a midterm produces standard results...). Another example, where the Bayesian approach would work well, was the batting average of baseball players: here, after collecting enough data, it seems plausible that we could describe how much more likely one batting average is than another! (In fact, the typical batting average is 0.27, with almost all averages ranging from 0.2 to 0.35.)
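The batting-average example can be sketched concretely. A common (hypothetical) choice is a Beta prior: Beta(81, 219) has mean $$81/300 = 0.27$$ and puts almost all of its mass between 0.2 and 0.35, matching the numbers quoted above. Since the Beta prior is conjugate to the Binomial likelihood, observing hits and at-bats just updates the two parameters (the counts 30 and 100 below are made up):

```python
# Assumed Beta(81, 219) prior for a batting average
a, b = 81.0, 219.0
prior_mean = a / (a + b)                # 0.27

# Hypothetical data: 30 hits in 100 at-bats.
# Conjugate update: posterior is Beta(a + hits, b + misses).
hits, at_bats = 30, 100
post_a, post_b = a + hits, b + (at_bats - hits)
post_mean = post_a / (post_a + post_b)

print(prior_mean, post_mean)
```

Note how the posterior mean sits between the prior mean 0.27 and the raw average 0.30: with only 100 at-bats, the prior still pulls the estimate strongly toward a typical batting average.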