Abstract
We study the problem of parameter identifiability under the Kullback-Leibler information divergence (KLID) criterion. KLID-identifiability is defined and related to several other notions of identifiability, such as identifiability under Fisher's information matrix criterion, the least-squares criterion, and the spectral density criterion. We also establish a simple check criterion for Gaussian processes and derive an upper bound on the minimal identifiable horizon of a Markov process. Furthermore, we define asymptotic KLID-identifiability and prove that, under certain constraints, KLID-identifiability is a sufficient or necessary condition for asymptotic KLID-identifiability. The consistency of several parameter estimation methods is also discussed.
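The KLID criterion compares the distributions induced by two parameter values via their Kullback-Leibler divergence; the paper's check criterion concerns the Gaussian case, where the divergence has a well-known closed form. As a minimal illustration (the function name and the example parameters below are ours, not from the paper), the KL divergence between two multivariate Gaussians N(μ₀, Σ₀) and N(μ₁, Σ₁) can be computed as:

```python
import numpy as np

def gaussian_kl(mu0, sigma0, mu1, sigma1):
    """Closed-form KL divergence D( N(mu0, sigma0) || N(mu1, sigma1) )
    for k-dimensional Gaussians.

    D = 0.5 * ( tr(S1^-1 S0) + (mu1-mu0)^T S1^-1 (mu1-mu0)
                - k + ln(det S1 / det S0) )
    """
    mu0, mu1 = np.asarray(mu0, float), np.asarray(mu1, float)
    k = mu0.shape[0]
    s1_inv = np.linalg.inv(sigma1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(s1_inv @ sigma0)
                  + diff @ s1_inv @ diff
                  - k
                  + np.log(np.linalg.det(sigma1) / np.linalg.det(sigma0)))

# Two parameter values are KLID-distinguishable when this divergence is
# strictly positive; it vanishes iff the induced distributions coincide.
I = np.eye(2)
same = gaussian_kl([0.0, 0.0], I, [0.0, 0.0], I)      # identical -> 0
shifted = gaussian_kl([0.0, 0.0], I, [1.0, 0.0], I)   # mean shift -> 0.5
```

Note that KL divergence is not symmetric in its arguments, which is why identifiability statements must fix the direction of the divergence.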
| Original language | English |
|---|---|
| Pages (from-to) | 940-960 |
| Number of pages | 21 |
| Journal | International Journal of Adaptive Control and Signal Processing |
| Volume | 23 |
| Issue number | 10 |
| DOIs | |
| State | Published - Oct 2009 |
| Externally published | Yes |
Keywords
- Consistency in probability
- Fisher's information matrix (FIM)
- Kullback-Leibler information divergence (KLID)
- Parameter estimation
- System identification