Observed information


In statistics, the observed information, or observed Fisher information, is the negative of the second derivative (the Hessian matrix) of the log-likelihood (the logarithm of the likelihood function). It is a sample-based version of the Fisher information.

Definition

Suppose we observe random variables X_1, …, X_n, independent and identically distributed with density f(X; θ), where θ is a (possibly unknown) vector. Then the log-likelihood of the parameters θ given the data X_1, …, X_n is

\ell(\theta \mid X_1, \ldots, X_n) = \sum_{i=1}^{n} \log f(X_i \mid \theta).

We define the observed information matrix at \theta^{*} as

\mathcal{J}(\theta^{*}) = - \left. \nabla\nabla^{\top} \ell(\theta) \right|_{\theta = \theta^{*}},
\qquad
\bigl[\mathcal{J}(\theta^{*})\bigr]_{ij} = - \left. \frac{\partial^{2} \ell(\theta)}{\partial\theta_{i}\, \partial\theta_{j}} \right|_{\theta = \theta^{*}}.
In many instances, the observed information is evaluated at the maximum-likelihood estimate.[1]
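
As a concrete sketch (an illustration added here, not part of the article), the observed information for a one-parameter model can be obtained by numerically differentiating the log-likelihood at the MLE. The Poisson model, the finite-difference step h, and the closed-form check sum(x)/theta**2 below are all assumptions of the example:

```python
import numpy as np

# Minimal sketch (an illustration, not from the article): the observed
# information for an i.i.d. Poisson(theta) sample, evaluated at the MLE
# theta_hat = mean(x). The second derivative of the log-likelihood is
# taken numerically via central differences and checked against the
# closed form J(theta) = sum(x) / theta**2.

def log_likelihood(theta, x):
    # Poisson log-likelihood, dropping the additive term -sum(log(x_i!))
    # that does not depend on theta
    return np.sum(x * np.log(theta) - theta)

def observed_information(theta, x, h=1e-4):
    # Negative second derivative of the log-likelihood (scalar theta),
    # approximated by a central finite difference
    ll = lambda t: log_likelihood(t, x)
    second_derivative = (ll(theta + h) - 2.0 * ll(theta) + ll(theta - h)) / h**2
    return -second_derivative

rng = np.random.default_rng(0)
x = rng.poisson(lam=4.0, size=200)
theta_hat = x.mean()  # the Poisson MLE

print(observed_information(theta_hat, x))  # numerical
print(x.sum() / theta_hat**2)              # analytic check; should agree closely
```

The two printed values agree up to finite-difference error; at the MLE the observed information for this model reduces to n / x̄.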

Fisher information

The Fisher information \mathcal{I}(\theta) is the expected value of the observed information given a single observation X distributed according to the hypothetical model with parameter \theta:

\mathcal{I}(\theta) = \mathrm{E}\bigl[\mathcal{J}(\theta)\bigr].
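
As a worked check of this relationship (an example added here, not from the article), take a single Poisson(θ) observation:

```latex
% Illustrative check (assumed example): a single observation X ~ Poisson(theta).
\ell(\theta \mid X) = X \log\theta - \theta - \log X! ,
\qquad
\mathcal{J}(\theta) = -\frac{\partial^{2}\ell}{\partial\theta^{2}} = \frac{X}{\theta^{2}} .
% Taking the expectation under the model, E[X] = theta, so
\mathcal{I}(\theta) = \mathrm{E}\!\bigl[\mathcal{J}(\theta)\bigr]
                    = \frac{\theta}{\theta^{2}} = \frac{1}{\theta} .
```

Note that \mathcal{J}(\theta) = X/\theta^{2} is a random quantity depending on the data, whereas its expectation \mathcal{I}(\theta) = 1/\theta is a fixed function of the parameter.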

Applications

In a notable article, Bradley Efron and David V. Hinkley[2] argued that the observed information should be used in preference to the expected information when employing normal approximations for the distribution of maximum-likelihood estimates.
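
The following is a hedged sketch of the kind of comparison involved (the Cauchy location model, the simulated data, and all numerical details are assumptions of this example, not Efron and Hinkley's computation). For this model the observed and expected information at the MLE generally differ, so the two normal-approximation standard errors differ as well:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def neg_log_lik(theta, x):
    # Cauchy(location=theta, scale=1) negative log-likelihood,
    # dropping the constant n * log(pi)
    return np.sum(np.log1p((x - theta) ** 2))

rng = np.random.default_rng(1)
x = rng.standard_cauchy(100)  # true location 0, scale 1

res = minimize_scalar(neg_log_lik, args=(x,), bounds=(-5.0, 5.0), method="bounded")
theta_hat = res.x

# Observed information: second derivative of the *negative* log-likelihood,
# i.e. minus the second derivative of the log-likelihood itself
h = 1e-4
J = (neg_log_lik(theta_hat + h, x) - 2.0 * neg_log_lik(theta_hat, x)
     + neg_log_lik(theta_hat - h, x)) / h**2

n = len(x)
I_expected = n / 2.0  # per-observation Fisher information of this model is 1/2

print("SE from observed information:", 1.0 / np.sqrt(J))
print("SE from expected information:", 1.0 / np.sqrt(I_expected))
```

Roughly, Efron and Hinkley's argument is that the first of these standard errors better reflects the precision of the estimate conditional on the sample actually observed.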

References

  1. Dodge, Y. (2003). The Oxford Dictionary of Statistical Terms. OUP. ISBN 0-19-920613-9.
  2. Efron, B.; Hinkley, D.V. (1978). "Assessing the accuracy of the maximum likelihood estimator: Observed versus expected Fisher Information". Biometrika. 65 (3): 457–487. doi:10.1093/biomet/65.3.457. JSTOR 2335893. MR 0521817.