Please use this identifier to cite or link to this item: http://thuvienso.vanlanguni.edu.vn/handle/Vanlang_TV/32683
Title: New Developments in Statistical Information Theory Based on Entropy and Divergence Measures
Author: Pardo, Leandro (editor)
Keywords: New Developments Based on Entropy
Divergence Measures
Publication year: 2019
Publisher: MDPI
Abstract: "The idea of using functionals of Information Theory, such as entropies or divergences, in statistical inference is not new. In fact, the so-called Statistical Information Theory has been the subject of much statistical research over the last fifty years. Minimum divergence estimators, or minimum distance estimators, have been used successfully in models for continuous and discrete data due to their robustness properties. Divergence statistics, i.e., those obtained by replacing one or both arguments of a divergence measure by suitable estimators, have become a very good alternative to the classical likelihood ratio test in both continuous and discrete models, as well as to the classical Pearson-type statistic in discrete models. Divergence statistics based on maximum likelihood estimators, as well as Wald's statistics, likelihood ratio statistics, and Rao's score statistics, enjoy several optimal asymptotic properties, but they are highly non-robust under model misspecification or in the presence of outlying observations. It is well known that a small deviation from the underlying model assumptions can have a drastic effect on the performance of these classical tests. Thus, the practical importance of a robust test procedure is beyond doubt, and such procedures are helpful for solving several real-life problems in which the observed sample contains outliers. For this reason, in recent years robust versions of the classical Wald test statistic, for testing simple and composite null hypotheses in general parametric models, have been introduced and studied for different problems in the statistical literature. These test statistics are based on minimum divergence estimators instead of maximum likelihood estimators, and have been considered in many different statistical problems: censoring, equality of means in normal and lognormal models, logistic regression models in particular and GLM models in general, etc.
The scope of the contributions to this Special Issue is to present new and original research based on minimum divergence estimators and divergence statistics, from both a theoretical and an applied point of view, in different statistical problems, with special emphasis on efficiency and robustness. Manuscripts summarizing the most recent state of the art on these topics are also welcome."
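The divergence statistics described above can be illustrated with the classical Cressie–Read power-divergence family for multinomial goodness-of-fit, which contains both the Pearson-type statistic (λ = 1) and the likelihood-ratio statistic (λ → 0) as special cases. A minimal sketch (the function name and the toy data are illustrative, not from the book):

```python
import numpy as np

def power_divergence(observed, expected, lam):
    """Cressie-Read power-divergence statistic for multinomial counts.

    lam = 1 recovers Pearson's X^2; lam -> 0 recovers the
    likelihood-ratio statistic G^2 = 2 * sum(O * log(O / E)).
    """
    o = np.asarray(observed, dtype=float)
    e = np.asarray(expected, dtype=float)
    if abs(lam) < 1e-12:
        # limiting case lam -> 0: the likelihood-ratio statistic G^2
        return 2.0 * np.sum(o * np.log(o / e))
    return (2.0 / (lam * (lam + 1.0))) * np.sum(o * ((o / e) ** lam - 1.0))

# Toy goodness-of-fit problem: 4 equiprobable cells, n = 40 observations.
obs = np.array([14, 6, 11, 9])
exp = np.full(4, 10.0)

pearson = power_divergence(obs, exp, 1.0)  # equals sum((O - E)^2 / E)
g2 = power_divergence(obs, exp, 0.0)       # likelihood-ratio statistic
```

Both statistics are asymptotically chi-squared with 3 degrees of freedom under the null here; members with other λ values trade efficiency against robustness, which is the theme of several chapters below.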
Description: https://doi.org/10.3390/books978-3-03897-937-1 CC BY-NC-ND.
URI: http://thuvienso.vanlanguni.edu.vn/handle/Vanlang_TV/32683
ISBN: 9783038979371
Collection: Khoa học cơ bản_TLNM_SACH

Files in this item:
File | Description | Size | Format
SA10773_1.New Developments in Statistical Information Theory Based on Entropy and Divergence Measures_Cover.pdf
  Restricted access
Cover | 733.58 kB | Adobe PDF | View/Open | Request a copy
SA10773_2.New Developments in Statistical Information Theory Based on Entropy and Divergence Measures_Copyright.pdf
  Restricted access
Copyright | 538.86 kB | Adobe PDF | View/Open | Request a copy
SA10773_3.New Developments in Statistical Information Theory Based on Entropy and Divergence Measures_Contents.pdf
  Restricted access
Contents | 518.23 kB | Adobe PDF | View/Open | Request a copy
SA10773_4.New Developments in Statistical Information Theory Based on Entropy and Divergence Measures_About the editor.pdf
  Restricted access
About the editor | 513.37 kB | Adobe PDF | View/Open | Request a copy
SA10773_5.New Developments in Statistical Information Theory Based on Entropy and Divergence Measures_Chapter 1.pdf
  Restricted access
New Developments in Statistical Information Theory Based on Entropy and Divergence Measures | 599.94 kB | Adobe PDF | View/Open | Request a copy
SA10773_6.New Developments in Statistical Information Theory Based on Entropy and Divergence Measures_Chapter 2.pdf
  Restricted access
A Generalized Relative (α, β)-Entropy: Geometric Properties and Applications to Robust Statistical Inference | 788.61 kB | Adobe PDF | View/Open | Request a copy
SA10773_7.New Developments in Statistical Information Theory Based on Entropy and Divergence Measures_Chapter 3.pdf
  Restricted access
Asymptotic Properties for Methods Combining the Minimum Hellinger Distance Estimate and the Bayesian Nonparametric Density Estimate | 999.96 kB | Adobe PDF | View/Open | Request a copy
SA10773_8.New Developments in Statistical Information Theory Based on Entropy and Divergence Measures_Chapter 4.pdf
  Restricted access
Composite Likelihood Methods Based on Minimum Density Power Divergence Estimator | 711.04 kB | Adobe PDF | View/Open | Request a copy
SA10773_9.New Developments in Statistical Information Theory Based on Entropy and Divergence Measures_Chapter 5.pdf
  Restricted access
Composite Tests under Corrupted Data | 753.53 kB | Adobe PDF | View/Open | Request a copy
SA10773_10.New Developments in Statistical Information Theory Based on Entropy and Divergence Measures_Chapter 6.pdf
  Restricted access
Convex Optimization via Symmetrical Hölder Divergence for a WLAN Indoor Positioning System | 1.42 MB | Adobe PDF | View/Open | Request a copy
SA10773_11.New Developments in Statistical Information Theory Based on Entropy and Divergence Measures_Chapter 7.pdf
  Restricted access
Likelihood Ratio Testing under Measurement Errors | 631.36 kB | Adobe PDF | View/Open | Request a copy
SA10773_12.New Developments in Statistical Information Theory Based on Entropy and Divergence Measures_Chapter 8.pdf
  Restricted access
Minimum Penalized φ-Divergence Estimation under Model Misspecification | 675.83 kB | Adobe PDF | View/Open | Request a copy
SA10773_13.New Developments in Statistical Information Theory Based on Entropy and Divergence Measures_Chapter 9.pdf
  Restricted access
Non-Quadratic Distances in Model Assessment | 701.42 kB | Adobe PDF | View/Open | Request a copy
SA10773_14.New Developments in Statistical Information Theory Based on Entropy and Divergence Measures_Chapter 10.pdf
  Restricted access
φ-Divergence in Contingency Table Analysis | 660.42 kB | Adobe PDF | View/Open | Request a copy
SA10773_15.New Developments in Statistical Information Theory Based on Entropy and Divergence Measures_Chapter 11.pdf
  Restricted access
Robust and Sparse Regression via γ-Divergence | 722.67 kB | Adobe PDF | View/Open | Request a copy
SA10773_16.New Developments in Statistical Information Theory Based on Entropy and Divergence Measures_Chapter 12.pdf
  Restricted access
Robust-BD Estimation and Inference for General Partially Linear Models | 1.15 MB | Adobe PDF | View/Open | Request a copy
SA10773_17.New Developments in Statistical Information Theory Based on Entropy and Divergence Measures_Chapter 13.pdf
  Restricted access
Robust Estimation for the Single Index Model Using Pseudodistances | 723.09 kB | Adobe PDF | View/Open | Request a copy
SA10773_18.New Developments in Statistical Information Theory Based on Entropy and Divergence Measures_Chapter 14.pdf
  Restricted access
Robust Inference after Random Projections via Hellinger Distance for Location-Scale Family | 1.16 MB | Adobe PDF | View/Open | Request a copy
SA10773_19.New Developments in Statistical Information Theory Based on Entropy and Divergence Measures_Chapter 15.pdf
  Restricted access
Robustness Property of Robust-BD Wald-Type Test for Varying-Dimensional General Linear Models | 861.94 kB | Adobe PDF | View/Open | Request a copy
SA10773_20.New Developments in Statistical Information Theory Based on Entropy and Divergence Measures_Chapter 16.pdf
  Restricted access
Robust Relative Error Estimation | 1.32 MB | Adobe PDF | View/Open | Request a copy
SA10773_21.New Developments in Statistical Information Theory Based on Entropy and Divergence Measures_Back cover.pdf
  Restricted access
Back cover | 564.93 kB | Adobe PDF | View/Open | Request a copy


Use of materials in the Digital Library must comply with copyright law.