Demystifying Fisher Information: What Observation Data Reveal about Our Models
Authors:Judith Schenk  Eileen Poeter  William Navidi
Institution:Colorado School of Mines, Golden, Colorado, 80401
Abstract: Information theory is the basis for understanding how information is transmitted through observations. Observation data can be used to compare the uncertainty in parameter estimates and predictions between models. Jacobian Information (JI) is quantified as the determinant of the weighted Jacobian (sensitivity) matrix. Fisher Information (FI) is quantified as the determinant of the weighted FI matrix. FI measures the relative disorder (entropy) of a model within a set of models. One‐dimensional models are used to demonstrate the relationship between JI and FI, and the resulting uncertainty in estimated parameter values and model predictions, for increasing model complexity, different model structures, different boundary conditions, and over‐fitted models. Greater model complexity results in increased JI accompanied by an increase in parameter and prediction uncertainty. FI generally increases with increasing model complexity unless model error is large. Models with lower FI have a higher level of disorder (greater entropy), which results in greater uncertainty in parameter estimates and model predictions. A constant‐head boundary constrains the heads in the area near the boundary, reducing the sensitivity of simulated equivalents to estimated parameters. JI and FI are lower for this boundary condition than for a constant‐outflow boundary, in which the heads in the area of the boundary can adjust freely. Complex, over‐fitted models, in which the structure of the model is not supported by the observation dataset, result in lower JI and FI because there is insufficient information to estimate all parameters in the model.
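The two determinants described in the abstract can be illustrated with a minimal numerical sketch. The sensitivity values and weights below are hypothetical (not from the paper), and the square Jacobian (equal numbers of observations and parameters) is an assumption made so that its determinant is defined; the Fisher information matrix is taken in its standard weighted least-squares form, F = JᵀWJ.

```python
import numpy as np

# Hypothetical sensitivity (Jacobian) matrix: entry (i, j) is the
# sensitivity of simulated observation i to estimated parameter j.
# Square case assumed (2 observations, 2 parameters); values illustrative.
J = np.array([[0.8, 0.1],
              [0.3, 0.6]])

# Diagonal observation-weight matrix (e.g., inverse error variances).
W = np.diag([1.0, 0.5])

# Jacobian Information: determinant of the weighted Jacobian.
# sqrt(W) applies each observation's weight to its row of J.
JI = np.linalg.det(np.sqrt(W) @ J)

# Fisher Information: determinant of the weighted FI matrix,
# F = J^T W J (the standard weighted least-squares form).
F = J.T @ W @ J
FI = np.linalg.det(F)

print(f"JI = {JI:.4f}, FI = {FI:.4f}")
```

Note that in this square case FI = JI², since det(JᵀWJ) = det(√W·J)²; a larger determinant corresponds to more information in the observations and hence lower parameter and prediction uncertainty, consistent with the trends described in the abstract.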