Divergence Measures

Data science, information theory, probability theory, statistical learning, and other related disciplines benefit greatly from non-negative measures of dissimilarity between pairs of probability measures, known as divergence measures. Exploring their mathematical foundations and diverse applications is of significant interest. The present Special Issue, entitled “Divergence Measures: Mathematical Foundations and Applications in Information-Theoretic and Statistical Problems”, includes eight original contributions and focuses on the mathematical properties and applications of classical and generalized divergence measures from an information-theoretic perspective. It mainly deals with two key generalizations of the relative entropy: namely, the Rényi divergence and the important class of f-divergences. We hope that readers will find this Special Issue of interest and that it will stimulate further research into the mathematical foundations and applications of divergence measures.
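
As a brief orientation, the standard textbook definitions of these quantities, stated here for discrete probability distributions P and Q on a common alphabet (a sketch in conventional notation, not the book's own), are:

\[ D(P \| Q) = \sum_{x} P(x) \log \frac{P(x)}{Q(x)} \qquad \text{(relative entropy)} \]

\[ D_{\alpha}(P \| Q) = \frac{1}{\alpha - 1} \log \sum_{x} P(x)^{\alpha} \, Q(x)^{1-\alpha}, \quad \alpha \in (0,1) \cup (1,\infty) \qquad \text{(Rényi divergence)} \]

\[ D_{f}(P \| Q) = \sum_{x} Q(x) \, f\!\left( \frac{P(x)}{Q(x)} \right), \quad f \text{ convex on } (0,\infty), \; f(1) = 0 \qquad \text{($f$-divergence)} \]

The Rényi divergence recovers the relative entropy in the limit as alpha tends to 1, and the relative entropy is itself the f-divergence generated by f(t) = t log t. Several of the keywords below, such as the chi-squared divergence (f(t) = (t - 1)^2) and the total variation distance (f(t) = |t - 1| / 2), likewise arise as f-divergences.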

This book is included in DOAB.

Downloads

This work has been downloaded 48 times via unglue.it ebook links.
  1. pdf (CC BY) at Unglue.it: 48 downloads.

Keywords

  • Augustin–Csiszár mutual information
  • Bahadur efficiency
  • Bayes risk
  • bootstrap
  • Bregman divergence
  • capacitory discrimination
  • Carlson–Levin inequality
  • chi-squared divergence
  • conditional limit theorem
  • conditional Rényi divergence
  • convexity
  • data transmission
  • difference of convex (DC) programming
  • dimensionality reduction
  • discriminant analysis
  • error exponents
  • f-divergence
  • horse betting
  • hypothesis testing
  • information contraction
  • information geometry
  • information inequalities
  • information measures
  • Jensen diversity
  • Jensen–Bregman divergence
  • Jensen–Shannon centroid
  • Jensen–Shannon divergence
  • Kelly gambling
  • large deviations
  • Markov chains
  • Mathematics & science
  • maximal correlation
  • maximum likelihood
  • method of types
  • minimum divergence estimator
  • mixture family
  • mutual information
  • Pinsker’s inequality
  • Reference, information & interdisciplinary subjects
  • relative entropy
  • Rényi divergence
  • Rényi entropy
  • Rényi mutual information
  • Research & information: general
  • skew-divergence
  • statistical divergences
  • statistical inference
  • strong data-processing inequalities
  • total variation
  • α-mutual information

Links

DOI: 10.3390/books978-3-0365-4331-4
