Information and Divergence Measures
The concept of distance is central to quantifying the similarity or closeness of functions, populations, or distributions. Distances therefore underpin inferential statistics, both estimation and hypothesis testing, as well as modelling, with applications in regression analysis, multivariate analysis, actuarial science, portfolio optimization, survival analysis, reliability theory, and many other areas. Entropy and divergence measures are thus a continuing concern for scientists, researchers, medical experts, engineers, industrial managers, computer experts, data analysts, and other professionals. This reprint focuses on recent developments in information and divergence measures, presenting new theoretical results as well as solutions to practical problems and case studies that illustrate the broad applicability of these techniques and methods. The contributions highlight the diversity of topics in this field.
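As a minimal illustration of the kind of measure the reprint studies (not taken from the book itself), the following Python sketch computes the Kullback–Leibler divergence, one of the divergences listed among the keywords below, between two discrete distributions. The helper name `kl_divergence` is hypothetical, and NumPy is assumed to be available:

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) for discrete distributions.

    p and q are probability vectors over the same support. q must be
    strictly positive wherever p is positive, otherwise the divergence
    is infinite. (Illustrative sketch, not code from the reprint.)
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0  # terms with p[i] == 0 contribute 0 by convention
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Example: a skewed distribution vs. the uniform distribution
p = [0.5, 0.4, 0.1]
q = [1/3, 1/3, 1/3]
print(kl_divergence(p, q))  # positive: the distributions differ
print(kl_divergence(p, p))  # 0.0: a distribution is at zero "distance" from itself
```

The divergence is nonnegative and vanishes only when the two distributions coincide, which is exactly the "degree of similarity" role the description above assigns to such measures; note that it is not symmetric, so it is a divergence rather than a metric.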
This book is included in DOAB.
Downloads
This work has been downloaded 29 times via unglue.it ebook links.
- pdf (CC BY) at Unglue.it: 29 downloads.
Keywords
- academic evaluation
- Awad entropy
- bi-logistic growth
- bootstrap discrepancy comparison probability (BDCP)
- Clustering
- concomitants
- conditional independence
- COVID-19 data
- cross tabulations
- Differential Geometry
- discrepancy comparison probability (DCP)
- divergence-based tests
- divergences
- double index divergence test statistic
- empirical survival Jensen–Shannon divergence
- epidemic waves
- exponential family
- extremal combinatorics
- FGM family
- Fisher information
- Fisher–Tsallis information number
- geodesic
- GOS
- graphs
- Han’s inequality
- information inequalities
- joint entropy
- Kaniadakis logarithm
- Kolmogorov–Smirnov two-sample test
- Kullback–Leibler divergence
- Kullback–Leibler divergence (KLD)
- Lauricella D-hypergeometric series
- likelihood ratio test (LRT)
- LPI
- Mathematics & science
- minimum Rényi’s pseudodistance estimators
- model selection
- moment condition models
- multiple power series
- Multivariate Cauchy distribution (MCD)
- multivariate data analysis
- multivariate Gaussian
- p-value
- passive interception systems
- past entropy
- Physics
- polymatroid
- radar waveform
- rank function
- Rao-type tests
- Reference, information & interdisciplinary subjects
- Rényi’s pseudodistance
- Research & information: general
- residual entropy
- restricted minimum Rényi’s pseudodistance estimators
- robustness
- set function
- Shannon entropy
- Shearer’s lemma
- skew logistic distribution
- statistical divergence
- statistical K-means
- statistical manifold
- submodularity
- transversality
- truncated exponential family
- truncated normal distributions
- Tsallis divergence
- Tsallis entropy
- Tsallis logarithm
- weighted Kaniadakis divergence
- weighted Tsallis divergence