Detailed Information


Bias Reduction and Metric Learning for Nearest-Neighbor Estimation of Kullback-Leibler Divergence

Authors
Noh, Yung-Kyun; Sugiyama, Masashi; Liu, Song; du Plessis, Marthinus C.; Park, Frank Chongwoo; Lee, Daniel D.
Issue Date
Jan-2018
Publisher
MIT PRESS
Citation
NEURAL COMPUTATION, v.30, no.7, pp.1930 - 1960
Indexed
SCIE
SCOPUS
Journal Title
NEURAL COMPUTATION
Volume
30
Number
7
Start Page
1930
End Page
1960
URI
https://scholarworks.bwise.kr/hanyang/handle/2021.sw.hanyang/150666
DOI
10.1162/neco_a_01092
ISSN
0899-7667
Abstract
Nearest-neighbor estimators for the Kullback-Leibler (KL) divergence that are asymptotically unbiased have recently been proposed and demonstrated in a number of applications. However, with a small number of samples, nonparametric methods typically suffer from large estimation bias due to the nonlocality of information derived from nearest-neighbor statistics. In this letter, we show that this estimation bias can be mitigated by modifying the metric function, and we propose a novel method for learning a locally optimal Mahalanobis distance function from parametric generative models of the underlying density distributions. Using both simulations and experiments on a variety of data sets, we demonstrate that this interplay between approximate generative models and nonparametric techniques can significantly improve the accuracy of nearest-neighbor-based estimation of the KL divergence.
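
The sketch below illustrates the kind of nearest-neighbor KL divergence estimator the abstract refers to (the asymptotically unbiased k-NN estimator of Wang, Kulkarni, and Verdú), with an optional user-supplied Mahalanobis matrix M. It is a minimal illustration, not the authors' released code; in particular, the paper's contribution, learning a locally optimal M from parametric generative models, is not reproduced here, and the function name and parameters are illustrative.

import numpy as np
from scipy.spatial import cKDTree

def knn_kl_divergence(X, Y, k=1, M=None):
    """Estimate D_KL(p || q) from samples X ~ p and Y ~ q.

    X : (n, d) array of samples from p
    Y : (m, d) array of samples from q
    k : number of nearest neighbors
    M : optional (d, d) positive-definite matrix defining the
        Mahalanobis distance sqrt((x - y)^T M (x - y))
    """
    X = np.asarray(X, dtype=float)
    Y = np.asarray(Y, dtype=float)
    n, d = X.shape
    m = Y.shape[0]

    if M is not None:
        # Whiten with the Cholesky factor so that Euclidean distance in the
        # transformed space equals the Mahalanobis distance under M.
        L = np.linalg.cholesky(M)
        X = X @ L
        Y = Y @ L

    # rho: distance from each x_i to its k-th nearest neighbor among the other
    # X samples (query k+1 neighbors to skip the point itself at distance 0).
    rho = cKDTree(X).query(X, k=k + 1)[0][:, -1]
    # nu: distance from each x_i to its k-th nearest neighbor in Y.
    nu = cKDTree(Y).query(X, k=k)[0]
    if k > 1:
        nu = nu[:, -1]

    # Asymptotically unbiased k-NN estimator:
    # (d/n) * sum_i log(nu_k(i) / rho_k(i)) + log(m / (n - 1))
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1.0))

With samples drawn from two Gaussians of known parameters, the output of this sketch can be checked against the closed-form KL divergence; the bias-reduction effect described in the abstract corresponds to choosing M locally rather than using the identity metric.
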
Files in This Item
Go to Link
Appears in
Collections
Seoul College of Engineering > Seoul School of Computer Software > 1. Journal Articles


Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

Noh, Yung Kyun
COLLEGE OF ENGINEERING (SCHOOL OF COMPUTER SCIENCE)
