Parameter identifiability with Kullback-Leibler information divergence criterion

Research output: Contribution to journal › Article › peer-review

6 Scopus citations

Abstract

We study the problem of parameter identifiability with the Kullback-Leibler information divergence (KLID) criterion. The KLID-identifiability is defined and related to many other concepts of identifiability, such as identifiability with the Fisher information matrix criterion, identifiability with the least-squares criterion, and identifiability with the spectral density criterion. We also establish a simple check criterion for Gaussian processes and derive an upper bound for the minimal identifiable horizon of a Markov process. Furthermore, we define the asymptotic KLID-identifiability and prove that, under certain constraints, the KLID-identifiability is a sufficient or a necessary condition for the asymptotic KLID-identifiability. The consistency of several parameter estimation methods is also discussed.
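The core idea behind a KLID-based identifiability criterion can be illustrated with a minimal sketch (this is an illustration, not the paper's construction): a parameterization is identifiable at θ if every θ′ ≠ θ induces a distribution with strictly positive KL divergence from the one induced by θ. The closed-form KL divergence between univariate Gaussians makes both cases easy to exhibit.

```python
import math

def kl_gaussian(mu1, var1, mu2, var2):
    """KL divergence D(N(mu1, var1) || N(mu2, var2)) for univariate Gaussians."""
    return 0.5 * (math.log(var2 / var1) + (var1 + (mu1 - mu2) ** 2) / var2 - 1.0)

# Identifiable parameterization (hypothetical example): mean(theta) = theta.
# Distinct parameter values induce distinct distributions, so KLID > 0.
d_identifiable = kl_gaussian(1.0, 1.0, 2.0, 1.0)   # theta = 1 vs theta = 2

# Non-identifiable parameterization: mean(theta) = theta**2.
# theta = 1 and theta = -1 induce the same distribution, so KLID = 0
# and the two parameter values cannot be distinguished from data.
d_nonidentifiable = kl_gaussian((-1.0) ** 2, 1.0, 1.0 ** 2, 1.0)
```

Here the mean maps are stand-ins chosen for the sketch; the paper treats general stochastic processes (including Gaussian and Markov processes), not just a single Gaussian observation.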

Original language: English
Pages (from-to): 940-960
Number of pages: 21
Journal: International Journal of Adaptive Control and Signal Processing
Volume: 23
Issue number: 10
State: Published - Oct 2009
Externally published: Yes

Keywords

  • Consistency in probability
  • Fisher's information matrix (FIM)
  • Kullback-Leibler information divergence (KLID)
  • Parameter estimation
  • System identification
