Detailed book information

Statistical Implications of Turings Formula

ISBN 9781119237068

Author: Zhiyi Zhang

Publisher: Wiley

Availability: 3-6 weeks

Price: 583.80 zł

Please contact us by email to confirm the price before placing an order.


ISBN13: 9781119237068

ISBN10: 1119237068

Author: Zhiyi Zhang

Binding: Hardback

Publication date: 2016-12-27

Number of pages: 296

Dimensions: 249 x 158 mm

Subjects: PB

Features a broad introduction to recent research on Turing's formula and presents modern applications in statistics, probability, information theory, and other areas of modern data science.

Turing's formula is, perhaps, the only known method for estimating the underlying distributional characteristics beyond the range of observed data without making any parametric or semiparametric assumptions. This book presents a clear introduction to Turing's formula and its connections to statistics. Topics relevant to a variety of fields are included, such as information theory, statistics, probability, computer science (including artificial intelligence and machine learning), big data, biology, ecology, and genetics. The author examines many core statistical issues within modern data science from Turing's perspective. A systematic approach to long-standing problems such as entropy and mutual information estimation, diversity index estimation, domains of attraction on general alphabets, and tail probability estimation is presented in light of the most up-to-date understanding of Turing's formula. Featuring numerous exercises and examples throughout, the author summarizes the known properties of Turing's formula and explains how and when it works well; discusses the approach derived from Turing's formula for estimating a variety of quantities, most of which come from information theory but are also important for machine learning and for ecological applications; and uses Turing's formula to estimate certain heavy-tailed distributions.
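To make the idea concrete, the following is a minimal illustrative sketch (not taken from the book) of the basic Good–Turing estimate: the total probability of all species never observed in a sample is estimated by the fraction of observations that are singletons.

```python
from collections import Counter

def missing_mass_estimate(sample):
    """Basic Good-Turing estimate of the total probability of unseen
    species: the fraction of observations that are singletons, i.e.
    species observed exactly once in the sample."""
    counts = Counter(sample)
    n1 = sum(1 for c in counts.values() if c == 1)  # number of singletons
    return n1 / len(sample)

# "b" and "d" each appear exactly once, so the estimate is 2/7.
sample = ["a", "a", "b", "c", "c", "c", "d"]
print(missing_mass_estimate(sample))  # ≈ 0.2857
```

Note that no parametric model of the species distribution is assumed anywhere; the estimate is computed entirely from the observed frequency counts, which is the property the blurb above emphasizes.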

In summary, this book:

Features a unified and broad presentation of Turing's formula, including its connections to statistics, probability, information theory, and other areas of modern data science

Provides a presentation of the statistical estimation of information-theoretic quantities

Demonstrates, from Turing's perspective, the estimation problems of several statistical functions, such as Simpson's indices, Shannon's entropy, general diversity indices, mutual information, and Kullback–Leibler divergence

Includes numerous exercises and examples throughout, with a fundamental perspective on the key results of Turing's formula
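As an illustration of two of the estimation targets listed above, here is a minimal sketch (not from the book) of the naive plug-in estimators for Shannon's entropy and Simpson's index; the book's contribution is bias-corrected alternatives to these, developed in Turing's perspective.

```python
import math
from collections import Counter

def plug_in_entropy(sample):
    """Naive plug-in (maximum-likelihood) estimate of Shannon's entropy,
    in nats, using the observed relative frequencies."""
    n = len(sample)
    return -sum((c / n) * math.log(c / n) for c in Counter(sample).values())

def plug_in_simpson(sample):
    """Naive plug-in estimate of Simpson's index: the probability that
    two independent draws land on the same species."""
    n = len(sample)
    return sum((c / n) ** 2 for c in Counter(sample).values())

sample = ["a"] * 5 + ["b"] * 3 + ["c"] * 2  # relative frequencies 0.5, 0.3, 0.2
print(plug_in_entropy(sample))  # ≈ 1.0297
print(plug_in_simpson(sample))  # ≈ 0.38
```

Both plug-in estimators are biased in small samples, particularly when many species are unobserved, which is precisely the regime where Turing's formula becomes useful.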

Statistical Implications of Turing's Formula is an ideal reference for researchers and practitioners who need a review of the many critical statistical issues of modern data science. The book is also an appropriate learning resource for biologists, ecologists, and geneticists who work with the concept of diversity and its estimation, and it can be used as a textbook for graduate courses in mathematics, probability, statistics, computer science, artificial intelligence, machine learning, big data, and information theory.

Zhiyi Zhang, PhD, is Professor of Mathematics and Statistics at The University of North Carolina at Charlotte. He is an active consultant in both industry and government on a wide range of statistical issues, and his current research interests include Turing's formula and its statistical implications; probability and statistics on countable alphabets; nonparametric estimation of entropy and mutual information; tail probability and biodiversity indices; and applications involving extracting statistical information from low-frequency data space. He earned his PhD in Statistics from Rutgers University.



Contents

Dedication

Preface

Chapter 1: Turing's Formula

1.1 Turing's Formula

1.2 Univariate Normal Laws

1.3 Multivariate Normal Laws

1.4 Turing's Formula Augmented

1.5 Goodness-of-Fit by Counting Zeros

1.6 Remarks

1.7 Exercises

Chapter 2: Estimation of Simpson's Indices

2.1 Generalized Simpson's Indices

2.2 Estimation of Simpson's Indices

2.3 Normal Laws

2.4 Illustrative Examples

2.5 Remarks

2.6 Exercises

Chapter 3: Estimation of Shannon's Entropy

3.1 A Brief Overview

3.2 The Plug-in Entropy Estimator

3.2.1 When K is Finite

3.2.2 When K is Countably Infinite

3.3 Entropy Estimator in Turing's Perspective

3.3.1 When K is Finite

3.3.2 When K is Countably Infinite

3.4 Appendix

3.4.1 Proof of Lemma 3.2

3.4.2 Proof of Lemma 3.5

3.4.3 Proof of Corollary 3.5

3.4.4 Proof of Lemma 3.14

3.4.5 Proof of Lemma 3.18

3.5 Remarks

3.6 Exercises

Chapter 4: Estimation of Diversity Indices

4.1 A Unified Perspective on Diversity Indices

4.2 Estimation of Linear Diversity Indices

4.3 Estimation of Rényi's Entropy

4.4 Remarks

4.5 Exercises

Chapter 5: Estimation of Information

5.1 Introduction

5.2 Estimation of Mutual Information

5.2.1 The Plug-in Estimator

5.2.2 Estimation in Turing's Perspective

5.2.3 Estimation of Standardized Mutual Information

5.2.4 An Illustrative Example

5.3 Estimation of Kullback–Leibler Divergence

5.3.1 The Plug-in Estimator

5.3.2 Estimation in Turing's Perspective

5.3.3 Symmetrized Kullback–Leibler Divergence

5.4 Tests of Hypotheses

5.5 Appendix

5.5.1 Proof of Theorem 5.12

5.6 Exercises

Chapter 6: Domains of Attraction on Countable Alphabets

6.1 Introduction

6.2 Domains of Attraction

6.3 Examples and Remarks

6.4 Appendix

6.4.1 Proof of Lemma 6.3

6.4.2 Proof of Theorem 6.2

6.4.3 Proof of Lemma 6.6

6.5 Exercises

Chapter 7: Estimation of Tail Probability

7.1 Introduction

7.2 Estimation of Pareto Tail

7.3 Statistical Properties of AMLE

7.4 Remarks

7.5 Appendix

7.5.1 Proof of Lemma 7.7

7.5.2 Proof of Lemma 7.9

7.6 Exercises

Appendix

Bibliography

Index




Contact

Gambit
Centrum Oprogramowania i Szkoleń Sp. z o.o.

Al. Pokoju 29b/22-24

31-564 Kraków

Bookstore premises:

ul. Kordylewskiego 1

31-542 Kraków

+48 12 410 5991

+48 12 410 5987

+48 12 410 5989
