Bayesian linear regression is a form of conditional modeling in which the mean of one variable is described as a linear combination of a set of other variables. The goal is to obtain the posterior distribution of the regression coefficients and, ultimately, to allow out-of-sample prediction of the regressand y conditional on observed values of the regressors X. The simplest and most widely used version of this model is the normal linear model, in which the distribution of y given X is normal. In this model, under a particular choice of prior probabilities for the parameters (so-called conjugate priors), the posterior can be found analytically; in general, however, the posterior has to be approximated.
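
To make the "analytical posterior under conjugate priors" point concrete, here is a minimal sketch in Python/NumPy of the standard normal-inverse-gamma conjugate update for the normal linear model. The function name `conjugate_posterior`, the prior hyperparameters, and the toy data below are illustrative assumptions, not anything stated in the question; the update formulas are the usual closed-form ones for the prior beta | sigma^2 ~ N(mu0, sigma^2 * Lambda0^{-1}), sigma^2 ~ Inv-Gamma(a0, b0).

```python
# Sketch of the conjugate (normal-inverse-gamma) posterior update for
# Bayesian linear regression. Names and hyperparameters are illustrative.
import numpy as np

def conjugate_posterior(X, y, mu0, Lambda0, a0, b0):
    """Posterior hyperparameters (mu_n, Lambda_n, a_n, b_n) for the normal
    linear model y | X, beta, sigma^2 ~ N(X beta, sigma^2 I) with prior
    beta | sigma^2 ~ N(mu0, sigma^2 Lambda0^{-1}), sigma^2 ~ Inv-Gamma(a0, b0)."""
    n = X.shape[0]
    Lambda_n = X.T @ X + Lambda0                      # posterior precision (scaled by sigma^2)
    mu_n = np.linalg.solve(Lambda_n, Lambda0 @ mu0 + X.T @ y)
    a_n = a0 + n / 2.0
    b_n = b0 + 0.5 * (y @ y + mu0 @ Lambda0 @ mu0 - mu_n @ Lambda_n @ mu_n)
    return mu_n, Lambda_n, a_n, b_n

# Toy usage: simulate data from a known line and recover it.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(50), rng.normal(size=50)])   # intercept + one regressor
beta_true = np.array([1.0, 2.0])
y = X @ beta_true + rng.normal(scale=0.5, size=50)

mu_n, Lambda_n, a_n, b_n = conjugate_posterior(
    X, y,
    mu0=np.zeros(2), Lambda0=np.eye(2),   # weakly informative prior (assumed)
    a0=1.0, b0=1.0,
)
print("posterior mean of beta:   ", mu_n)            # close to [1, 2]
print("posterior mean of sigma^2:", b_n / (a_n - 1))  # close to 0.25
```

With a non-conjugate prior (or a non-normal likelihood), no such closed form exists, which is why the posterior generally has to be approximated, e.g. by MCMC or variational methods.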
