Files in this item
Files  Description  Format

Huang_Dayu.pdf (327kB)  (no description provided)  application/pdf
Description
Title:  Mismatched divergence and universal hypothesis testing 
Author(s):  Huang, Dayu 
Advisor(s):  Meyn, Sean P. 
Contributor(s):  Meyn, Sean P. 
Department / Program:  Electrical & Computer Eng 
Discipline:  Electrical & Computer Engr 
Degree Granting Institution:  University of Illinois at Urbana-Champaign 
Degree:  M.S. 
Genre:  Thesis 
Subject(s):  Kullback-Leibler divergence; Hoeffding test; Pinsker's inequality; universal hypothesis testing; robust test; mismatched universal test; mismatched divergence; detection; learning; variance; variational representation 
Abstract:  An important challenge in detection theory is that the size of the state space may be very large. In the context of universal hypothesis testing, two important problems pertaining to a large state space have not been addressed before: (1) What is the impact of a large state space on the performance of tests? (2) How does one design an effective test when the state space is large? This thesis addresses these two problems by developing a generalization of Kullback-Leibler (KL) divergence, called mismatched divergence.
1. We describe a drawback of the Hoeffding test: its asymptotic bias and variance are approximately proportional to the size of the state space, so it performs poorly when the number of test samples is comparable to the size of the state space.
2. We develop a generalization of the Hoeffding test based on the mismatched divergence, called the mismatched universal test. We show that this test has asymptotic bias and variance proportional to the dimension of the function class used to define the mismatched divergence. This dimension can be chosen to be much smaller than the size of the state space, so the proposed test has better finite-sample performance in terms of bias and variance (see the sketch following this abstract).
3. We demonstrate that the mismatched universal test also has an advantage when the distribution of the null hypothesis is learned from data.
4. We develop algebraic properties and geometric interpretations of the mismatched divergence, and show its connection to a robust test.
5. We develop a generalization of Pinsker's inequality, which gives a lower bound on the mismatched divergence. 
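For concreteness, a minimal numerical sketch of the two test statistics contrasted in the abstract is given below. It is not from the thesis: the uniform null distribution pi, the randomly drawn feature functions psi, and the crude random search over the function class are all illustrative assumptions. The mismatched divergence is taken in its variational form, D_MM(mu || pi) = sup_f { mu(f) - log pi(e^f) } with the supremum restricted to a d-dimensional linear function class; removing the restriction recovers the KL divergence used by the Hoeffding test.

# Illustrative sketch (not from the thesis): the Hoeffding test statistic
# versus a mismatched-divergence statistic on a finite alphabet. The
# uniform null pi, the random features psi, and the random search over
# the linear function class are assumptions made for illustration.
import numpy as np

def kl_divergence(mu, pi):
    # D(mu || pi) on a finite alphabet; assumes pi > 0 wherever mu > 0.
    mask = mu > 0
    return float(np.sum(mu[mask] * np.log(mu[mask] / pi[mask])))

def hoeffding_statistic(samples, pi):
    # Hoeffding test: KL divergence of the empirical distribution from pi.
    counts = np.bincount(samples, minlength=pi.size)
    return kl_divergence(counts / counts.sum(), pi)

def mismatched_statistic(samples, pi, psi, n_trials=2000, seed=0):
    # Mismatched divergence restricted to the linear span of the rows of
    # psi: sup_r { mu(f_r) - log pi(exp(f_r)) } with f_r = r @ psi.
    # The objective is concave in r; a crude random search is used here
    # purely for illustration, so this only lower-bounds the supremum.
    counts = np.bincount(samples, minlength=pi.size)
    emp = counts / counts.sum()
    rng = np.random.default_rng(seed)
    best = 0.0  # r = 0 lies in the function class and gives value 0
    for r in rng.normal(size=(n_trials, psi.shape[0])):
        f = r @ psi
        best = max(best, float(emp @ f - np.log(pi @ np.exp(f))))
    return best

# Example: alphabet of size |Z| = 50, d = 2 features, n = 40 samples
# drawn from the null itself, so both statistics measure pure bias.
rng = np.random.default_rng(1)
pi = np.ones(50) / 50
samples = rng.integers(0, 50, size=40)
psi = rng.normal(size=(2, 50))
print(hoeffding_statistic(samples, pi))        # inflated, roughly (|Z|-1)/(2n)
print(mismatched_statistic(samples, pi, psi))  # smaller, roughly d/(2n)

Under the null, the Hoeffding statistic in this sketch inflates roughly like (|Z| - 1)/(2n), while the mismatched statistic grows only like d/(2n), which is the bias-and-variance contrast described in items 1 and 2 of the abstract.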
Issue Date:  2010-01-06 
URI:  http://hdl.handle.net/2142/14758 
Rights Information:  Copyright 2009 Dayu Huang 
Date Available in IDEALS:  2010-01-06; 2012-01-07 
Date Deposited:  December 2 
This item appears in the following Collection(s)

Dissertations and Theses - Electrical and Computer Engineering
Graduate Dissertations and Theses at Illinois