New Developments in Statistical Information Theory Based on Entropy and Divergence Measures

This book presents new and original research in Statistical Information Theory, based on minimum divergence estimators and test statistics, from both a theoretical and an applied point of view, for a variety of statistical problems with special emphasis on efficiency and robustness. Divergence statistics based on maximum likelihood estimators, as well as Wald's statistic, the likelihood ratio statistic and Rao's score statistic, share several optimal asymptotic properties, but they are highly non-robust under model misspecification or in the presence of outlying observations. It is well known that even a small deviation from the underlying model assumptions can have a drastic effect on the performance of these classical tests. In particular, this book presents a robust version of the classical Wald test, for testing simple and composite null hypotheses in general parametric models, based on minimum divergence estimators.
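As a rough sketch of the kind of statistic described above (the general form below is standard in the robust-testing literature; the notation Σ_β and θ̂_β is illustrative, not taken from the book itself), a Wald-type test of the simple null hypothesis H_0: θ = θ_0 built on a minimum divergence estimator typically has the form

\[
W_n \;=\; n\,\big(\hat{\theta}_\beta - \theta_0\big)^{\top}\,\Sigma_\beta(\theta_0)^{-1}\,\big(\hat{\theta}_\beta - \theta_0\big),
\]

where \(\hat{\theta}_\beta\) is a minimum divergence estimator (for example, the minimum density power divergence estimator with tuning parameter \(\beta\)) and \(\Sigma_\beta(\theta_0)\) is its asymptotic covariance matrix. For \(\beta = 0\) the estimator reduces to maximum likelihood and \(W_n\) to the classical Wald statistic, while \(\beta > 0\) trades a small loss of efficiency for robustness to outlying observations.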

This book is included in DOAB.


Downloads

This work has been downloaded 129 times via unglue.it ebook links.
  1. 56 downloads - PDF (CC BY-NC-ND) at Unglue.it.
  2. 45 downloads - PDF (CC BY-NC-ND) at res.mdpi.com.

Keywords

  • 2-alternating capacities
  • φ-divergence
  • association models
  • asymptotic normality
  • asymptotic property
  • Bayesian nonparametric
  • Bayesian semi-parametric
  • Bernstein-von Mises theorem
  • bootstrap distribution estimator
  • Bregman divergence
  • Bregman information
  • centroid
  • Chernoff-Stein lemma
  • composite hypotheses
  • composite likelihood
  • composite minimum density power divergence estimator
  • compressed data
  • consistency
  • correlation models
  • corrupted data
  • divergence
  • divergence based testing
  • divergence measure
  • Efficiency
  • general linear model
  • generalized linear model
  • generalized Rényi entropy
  • goodness-of-fit
  • Hellinger distance
  • Hölder divergence
  • hypothesis testing
  • indoor localization
  • influence function
  • Information geometry
  • iterated limits
  • Kullback-Leibler distance
  • least-favorable hypotheses
  • local-polynomial regression
  • location-scale family
  • log-linear models
  • logarithmic super divergence
  • maximum composite likelihood estimator
  • measurement errors
  • minimum disparity methods
  • minimum divergence inference
  • minimum divergence methods
  • minimum penalized φ-divergence estimator
  • misspecified hypothesis and alternative
  • mixture index of fit
  • MM algorithm
  • model assessment
  • model check
  • Neyman Pearson test
  • non-quadratic distance
  • nonparametric test
  • ordinal classification variables
  • quasi-likelihood
  • relative entropy
  • relative error estimation
  • representation formula
  • robust
  • robust estimation
  • robust testing
  • robustness
  • semiparametric model
  • single index model
  • sparse
  • statistical distance
  • thematic quality assessment
  • total variation
  • two-sample test
  • Wald statistic
  • Wald test statistic
  • Wald-type test
  • Wald-type test statistics

Links

DOI: 10.3390/books978-3-03897-937-1
