Please use this identifier to cite or link to this item:
http://thuvienso.vanlanguni.edu.vn/handle/Vanlang_TV/32683
Title: | New Developments in Statistical Information Theory Based on Entropy and Divergence Measures |
Author: | Pardo, Leandro (editor) |
Keywords: | Statistical information theory; Entropy; Divergence measures |
Issue Date: | 2019 |
Publisher: | MDPI |
Abstract: | "The idea of using functionals of Information Theory, such as entropies or divergences, in statistical inference is not new. In fact, the so-called Statistical Information Theory has been the subject of much statistical research over the last fifty years. Minimum divergence estimators, or minimum distance estimators, have been used successfully in models for continuous and discrete data due to their robustness properties. Divergence statistics, i.e., those obtained by replacing either one or both arguments in the measures of divergence by suitable estimators, have become a very good alternative to the classical likelihood ratio test in both continuous and discrete models, as well as to the classical Pearson-type statistic in discrete models. Divergence statistics based on maximum likelihood estimators, as well as Wald's statistics, likelihood ratio statistics, and Rao's score statistics, enjoy several optimal asymptotic properties, but they are highly non-robust under model misspecification or in the presence of outlying observations. It is well known that a small deviation from the underlying assumptions of the model can have a drastic effect on the performance of these classical tests. Thus, the practical importance of a robust test procedure is beyond doubt, and it is helpful for solving several real-life problems in which the observed sample contains outliers. For this reason, in recent years a robust version of the classical Wald test statistic, for testing simple and composite null hypotheses in general parametric models, has been introduced and studied for different problems in the statistical literature. These test statistics are based on minimum divergence estimators instead of maximum likelihood estimators and have been considered in many different statistical problems: censoring, equality of means in normal and lognormal models, logistic regression models in particular and GLM models in general, etc.
The scope of the contributions to this Special Issue is to present new and original research based on minimum divergence estimators and divergence statistics, from both a theoretical and an applied point of view, in different statistical problems, with special emphasis on efficiency and robustness. Manuscripts summarizing the most recent state of the art on these topics are also welcome." |
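The divergence statistics described in the abstract can be illustrated with the Cressie-Read power-divergence family, which contains both the classical Pearson chi-square statistic (λ = 1) and the likelihood-ratio (G) statistic (λ → 0) as special cases. A minimal, self-contained sketch, not taken from the book; the function name and example counts are illustrative:

```python
import math

def power_divergence(observed, expected, lam):
    """Cressie-Read power-divergence statistic for discrete counts.

    lam = 1   -> Pearson chi-square statistic
    lam -> 0  -> likelihood-ratio (G) statistic (handled as a limiting case)
    lam -> -1 -> modified likelihood-ratio statistic (limiting case)
    """
    if lam == 0:   # limit: 2 * sum O * log(O / E)
        return 2.0 * sum(o * math.log(o / e)
                         for o, e in zip(observed, expected) if o > 0)
    if lam == -1:  # limit: 2 * sum E * log(E / O)
        return 2.0 * sum(e * math.log(e / o)
                         for o, e in zip(observed, expected))
    return (2.0 / (lam * (lam + 1.0))) * sum(
        o * ((o / e) ** lam - 1.0) for o, e in zip(observed, expected)
    )

# Illustrative observed counts vs. expected counts under a hypothesised model
obs = [10, 20, 30]
exp = [20, 20, 20]
print(power_divergence(obs, exp, 1.0))  # Pearson chi-square -> 10.0
print(power_divergence(obs, exp, 0.0))  # likelihood-ratio (G) statistic
```

Varying λ trades efficiency against robustness, which is the practical point the abstract makes about divergence-based alternatives to the classical Pearson and likelihood-ratio tests.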
Description: | https://doi.org/10.3390/books978-3-03897-937-1 CC BY-NC-ND. |
URI: | http://thuvienso.vanlanguni.edu.vn/handle/Vanlang_TV/32683 |
ISBN: | 9783038979371 |
Collection: | Khoa học cơ bản_TLNM_SACH |
Files in this item:
Use of materials in the Digital Library must comply with the Copyright Law.