Stephen P. King (email@example.com)
Wed, 05 May 1999 10:12:52 -0400
attached mail follows:
In short, when you estimate a parameter, you usually take the estimate to be the
maximum likelihood value. So we get an estimated parameter value, and we know it is
uncertain. Imagine it as a normal
distribution, the center of which is our estimate, and the variance of which is the
uncertainty we have in the location of our estimate. The Fisher Information
essentially describes how sharp that normal distribution is around our estimate.
More Fisher Information roughly implies a more informative estimate (i.e. a tighter
spread around the MLE).
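A small numerical sketch of the idea above (my own illustration, not from the original
message): for a Bernoulli(p) sample of size n, the Fisher Information is
I(p) = n / (p(1-p)), and the MLE's sampling standard deviation is roughly
1 / sqrt(I(p)). Simulating many samples shows the observed spread of the MLE
matching that prediction:

```python
import numpy as np

# Fisher Information for a Bernoulli(p) sample of size n: I(p) = n / (p(1-p)).
# The asymptotic variance of the MLE is 1/I(p): more Fisher Information
# means a tighter spread of the estimate around the true value.

rng = np.random.default_rng(0)
p_true, n, trials = 0.3, 500, 2000

# Draw many samples; the MLE for a Bernoulli proportion is the sample mean.
mles = rng.binomial(n, p_true, size=trials) / n

fisher_info = n / (p_true * (1 - p_true))
predicted_sd = 1 / np.sqrt(fisher_info)
observed_sd = mles.std()

print(f"predicted SD of MLE: {predicted_sd:.4f}")
print(f"observed  SD of MLE: {observed_sd:.4f}")
```

Increasing n raises the Fisher Information and visibly shrinks both the predicted
and observed spread.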
Hope it helps,
This archive was generated by hypermail 2.0b3 on Sun Oct 17 1999 - 22:10:30 JST