Likelihood

From WikiMD's Wellness Encyclopedia

Likelihood is a concept in statistics and probability theory that measures how well a statistical model fits a sample of data for given values of the model's unknown parameters. It is central to the method of maximum likelihood estimation, which seeks the parameter values that maximize the likelihood.

Definition

In the context of a statistical model, the likelihood of a set of parameter values, given some observed outcomes, is equal to the probability of those observed outcomes given those parameter values. More precisely, if the statistical model defines a probability distribution over outcomes, indexed by the parameters, then the likelihood of the parameters is the probability of the observed outcomes under that distribution. Unlike a probability, however, the likelihood is read as a function of the parameters with the observed data held fixed, and it need not sum or integrate to one over the parameter space.
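As a minimal sketch of this definition, consider a hypothetical coin-flip (binomial) model, where the likelihood of a heads-probability p, given an observed sample, can be computed directly (the data and function name here are illustrative, not from the article):

```python
from math import comb

def binomial_likelihood(p, heads, flips):
    """Probability of the observed sample, read as a function of p."""
    return comb(flips, heads) * p**heads * (1 - p)**(flips - heads)

# Observing 7 heads in 10 flips: the data are held fixed, and the
# likelihood is evaluated at different candidate parameter values.
print(binomial_likelihood(0.5, 7, 10))
print(binomial_likelihood(0.7, 7, 10))  # larger, so p = 0.7 fits better
```

Here p = 0.7 yields a higher likelihood than p = 0.5, reflecting that it explains 7 heads in 10 flips better.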

Properties

The likelihood has several important properties that make it useful for statistical inference. These include:

  • Invariance to reparameterization: The likelihood remains the same under one-to-one transformations of the parameter space.
  • Sufficiency: If a statistic is sufficient for the parameter, then the likelihood depends on the data only through that statistic.
  • Factorization: If the data are independent given the parameter, then the likelihood factors into a product of individual likelihoods.
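The factorization property can be sketched with independent normal observations (a hypothetical example; the data values and function names are illustrative). On the log scale, the product of individual likelihoods becomes a sum of per-observation terms:

```python
import math

def normal_logpdf(x, mu, sigma):
    """Log-density of a single normal observation."""
    return -0.5 * math.log(2 * math.pi * sigma**2) - (x - mu)**2 / (2 * sigma**2)

data = [1.2, 0.8, 1.5]  # illustrative independent observations
mu, sigma = 1.0, 1.0

# Because the observations are independent given the parameters, the
# joint likelihood factors into a product of individual likelihoods,
# i.e. the joint log-likelihood is a sum of individual log-likelihoods.
joint_log_likelihood = sum(normal_logpdf(x, mu, sigma) for x in data)
```

Working with the log-likelihood is standard practice: sums are numerically stabler than products of many small densities.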

Applications

The likelihood is used in many areas of statistics, including:

  • Maximum likelihood estimation: This is a method of estimating the parameters of a statistical model by maximizing the likelihood.
  • Likelihood-ratio test: This is a statistical test based on the ratio of the likelihoods under the null and alternative hypotheses.
  • Bayesian inference: In this context, the likelihood is used to update the prior distribution on the parameters to obtain the posterior distribution.
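A minimal sketch of maximum likelihood estimation for a binomial (coin-flip) model, assuming hypothetical data of 7 heads in 10 flips: the log-likelihood is maximized over a grid of candidate values, which recovers the closed-form estimate heads/flips.

```python
import math

heads, flips = 7, 10  # hypothetical observed data

def log_likelihood(p):
    """Binomial log-likelihood of p, up to an additive constant."""
    return heads * math.log(p) + (flips - heads) * math.log(1 - p)

# Maximize the log-likelihood over a grid of candidate values in (0, 1).
grid = [i / 1000 for i in range(1, 1000)]
p_hat = max(grid, key=log_likelihood)
# p_hat agrees with the closed-form estimate heads / flips = 0.7
```

The grid search is purely illustrative; in practice the binomial maximum is found analytically, and general models use numerical optimizers.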


Contributors: Prab R. Tumpati, MD