Bias Reduction and Metric Learning for Nearest-Neighbor Estimation of Kullback-Leibler Divergence

Full metadata record
dc.contributor.author: Noh, Yung-Kyun
dc.contributor.author: Sugiyama, Masashi
dc.contributor.author: Liu, Song
dc.contributor.author: du Plessis, Marthinus C.
dc.contributor.author: Park, Frank Chongwoo
dc.contributor.author: Lee, Daniel D.
dc.date.accessioned: 2022-07-12T17:11:35Z
dc.date.available: 2022-07-12T17:11:35Z
dc.date.created: 2021-05-14
dc.date.issued: 2018-01
dc.identifier.issn: 0899-7667
dc.identifier.uri: https://scholarworks.bwise.kr/hanyang/handle/2021.sw.hanyang/150666
dc.description.abstract: Nearest-neighbor estimators for the Kullback-Leibler (KL) divergence that are asymptotically unbiased have recently been proposed and demonstrated in a number of applications. However, with a small number of samples, nonparametric methods typically suffer from large estimation bias due to the nonlocality of information derived from nearest-neighbor statistics. In this letter, we show that this estimation bias can be mitigated by modifying the metric function, and we propose a novel method for learning a locally optimal Mahalanobis distance function from parametric generative models of the underlying density distributions. Using both simulations and experiments on a variety of data sets, we demonstrate that this interplay between approximate generative models and nonparametric techniques can significantly improve the accuracy of nearest-neighbor-based estimation of the KL divergence.
dc.language: English
dc.language.iso: en
dc.publisher: MIT PRESS
dc.title: Bias Reduction and Metric Learning for Nearest-Neighbor Estimation of Kullback-Leibler Divergence
dc.type: Article
dc.contributor.affiliatedAuthor: Noh, Yung-Kyun
dc.identifier.doi: 10.1162/neco_a_01092
dc.identifier.scopusid: 2-s2.0-85048930046
dc.identifier.wosid: 000435657600006
dc.identifier.bibliographicCitation: NEURAL COMPUTATION, v.30, no.7, pp.1930 - 1960
dc.relation.isPartOf: NEURAL COMPUTATION
dc.citation.title: NEURAL COMPUTATION
dc.citation.volume: 30
dc.citation.number: 7
dc.citation.startPage: 1930
dc.citation.endPage: 1960
dc.type.rims: ART
dc.type.docType: Journal article (Article, including Perspective Article)
dc.description.journalClass: 1
dc.description.isOpenAccess: N
dc.description.journalRegisteredClass: scie
dc.description.journalRegisteredClass: scopus
dc.relation.journalResearchArea: Computer Science, Artificial Intelligence; Neurosciences
dc.relation.journalWebOfScienceCategory: Computer Science; Neurosciences & Neurology
dc.subject.keywordPlus: FEATURE-SELECTION
dc.subject.keywordPlus: GENE-EXPRESSION
dc.subject.keywordPlus: INFORMATION
dc.subject.keywordPlus: RELEVANCE
dc.identifier.url: https://direct.mit.edu/neco/article-abstract/30/7/1930/8407/Bias-Reduction-and-Metric-Learning-for-Nearest?redirectedFrom=fulltext
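The abstract describes improving nearest-neighbor estimation of the KL divergence by learning a Mahalanobis metric. As a point of reference, a minimal sketch of the baseline estimator the paper builds on — the classical k-nearest-neighbor KL estimator with the plain Euclidean metric, not the paper's metric-learned variant — might look like this (function name and interface are illustrative, not from the paper):

```python
import numpy as np

def knn_kl_divergence(X, Y, k=1):
    """k-NN estimator of KL(p || q) from samples X ~ p and Y ~ q.

    Classical nearest-neighbor estimator:
        D = (d/n) * sum_i log(nu_i / rho_i) + log(m / (n - 1)),
    where rho_i is the k-NN distance of x_i within X (excluding itself)
    and nu_i is the k-NN distance of x_i to the sample Y.
    """
    X, Y = np.asarray(X, dtype=float), np.asarray(Y, dtype=float)
    n, d = X.shape
    m = Y.shape[0]
    # Pairwise Euclidean distance matrices (n x n and n x m).
    dXX = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    dXY = np.linalg.norm(X[:, None, :] - Y[None, :, :], axis=-1)
    np.fill_diagonal(dXX, np.inf)          # exclude each point's self-distance
    rho = np.sort(dXX, axis=1)[:, k - 1]   # k-th NN distance within X
    nu = np.sort(dXY, axis=1)[:, k - 1]    # k-th NN distance to Y
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1))

# Illustration: two 2D Gaussians with shifted means (true KL = 1.0 here).
rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(1000, 2))
Y = rng.normal(1.0, 1.0, size=(1000, 2))
print(knn_kl_divergence(X, Y))  # roughly near the true value of 1.0
```

The paper's contribution is to reduce the finite-sample bias of this estimator by replacing the Euclidean distance with a locally learned Mahalanobis distance; for a metric matrix A, that is equivalent to transforming the samples by A^{1/2} before computing the nearest-neighbor distances above.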
Appears in Collections
College of Engineering (Seoul) > School of Computer Software (Seoul) > 1. Journal Articles


Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

Noh, Yung Kyun
COLLEGE OF ENGINEERING (SCHOOL OF COMPUTER SCIENCE)
