Files in this item: BU-THESIS-2016.pdf (application/pdf, 671 kB)


Title: Estimation of KL divergence: optimal minimax rate
Author(s): Bu, Yuheng
Advisor(s): Veeravalli, Venugopal V.
Department / Program: Electrical & Computer Engineering
Discipline: Electrical & Computer Engineering
Degree Granting Institution: University of Illinois at Urbana-Champaign
Subject(s): KL divergence estimation; Best polynomial approximation; Large alphabet; Functional estimation
Abstract: The problem of estimating the Kullback-Leibler divergence D(P||Q) between two unknown distributions P and Q is studied, under the assumption that the alphabet size k of the distributions can scale to infinity. The estimation is based on m independent samples drawn from P and n independent samples drawn from Q. It is first shown that no consistent estimator can guarantee asymptotically small worst-case quadratic risk over the set of all pairs of distributions. A restricted set of distribution pairs, whose density ratio is bounded by a function f(k), is then considered. An augmented plug-in estimator is proposed, and is shown to be consistent if and only if m has an order greater than k ∨ log^2 f(k), and n has an order greater than k f(k). Moreover, the minimax quadratic risk is characterized to be within a constant factor of (k/(m log k) + k f(k)/(n log k))^2 + log^2 f(k)/m + f(k)/n, provided that m and n exceed constant factors of k/log k and k f(k)/log k, respectively. The lower bound on the minimax quadratic risk is established via a generalized Le Cam's method. A minimax optimal estimator is then constructed by combining the polynomial approximation and plug-in approaches.
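To illustrate the plug-in idea the abstract builds on, here is a minimal sketch of an augmented plug-in KL divergence estimator: the empirical distribution is used for P, while the estimate of Q is smoothed by an additive constant so that the density ratio stays finite. The add-constant value c and the function name are illustrative assumptions, not the exact construction in the thesis.

```python
import numpy as np

def augmented_plugin_kl(samples_p, samples_q, k, c=1.0):
    """Sketch of an augmented plug-in estimate of D(P||Q).

    samples_p, samples_q: integer samples with values in {0, ..., k-1}.
    c: add-constant smoothing for Q (illustrative choice), ensuring
       every symbol has positive estimated Q-probability.
    """
    m = len(samples_p)
    n = len(samples_q)
    # Plain empirical distribution for P.
    p_hat = np.bincount(samples_p, minlength=k) / m
    # Add-constant ("augmented") estimate for Q, so q_tilde > 0 everywhere.
    q_counts = np.bincount(samples_q, minlength=k)
    q_tilde = (q_counts + c) / (n + c * k)
    # Sum p_hat * log(p_hat / q_tilde) over symbols with p_hat > 0
    # (terms with p_hat = 0 contribute zero to the KL divergence).
    mask = p_hat > 0
    return float(np.sum(p_hat[mask] * np.log(p_hat[mask] / q_tilde[mask])))
```

When the two empirical distributions coincide, the estimate is zero; in general the smoothing in q_tilde introduces a bias, which is why the thesis combines plug-in with polynomial approximation to attain the minimax rate.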
Issue Date: 2016-11-17
Rights Information: Copyright 2016 Yuheng Bu
Date Available in IDEALS: 2017-03-01
Date Deposited: 2016-12
