Information Geometry (Volume 45) (Handbook of Statistics, Volume 45)

Download the book Information Geometry (Volume 45) (Handbook of Statistics, Volume 45)

Price: 38,000 Toman (in stock)

Information Geometry (Volume 45) (Handbook of Statistics, Volume 45), original-language edition

The download link for Information Geometry (Volume 45) (Handbook of Statistics, Volume 45) becomes available after payment.
The book's description appears in the details section below.


This book is the original-language edition; it is not in Persian.




About the book Information Geometry (Volume 45) (Handbook of Statistics, Volume 45)

Title: Information Geometry (Volume 45) (Handbook of Statistics, Volume 45)
Edition: 1
Title in Persian: هندسه اطلاعات (جلد 45) (دفترچه راهنمای آمار، جلد 45)
Series:
Authors:
Publisher: North Holland
Publication year: 2021
Pages: 250
ISBN: 0323855679, 9780323855679
Language: English
Format: PDF
File size: 7 MB



After the payment process is complete, the download link will be provided. If you register and log in to your account, you can view the list of books you have purchased.


Table of contents:


Front Cover
Information Geometry
Copyright
Contents
Contributors
Preface
Section I: Foundations of information geometry
Chapter 1: Revisiting the connection between Fisher information and entropy's rate of change
1. Introduction
2. Fisher information and Cramer–Rao inequality
3. Fisher information and the rate of change of Boltzmann–Gibbs entropy
3.1. Brownian particle with constant drag force
3.2. Systems described by an N-dimensional Fokker–Planck equation
4. Possible lines for future research
5. Conclusions
References
Chapter 2: Pythagoras theorem in information geometry and applications to generalized linear models
1. Introduction
2. Pythagoras theorems in information geometry
3. Power entropy and divergence
4. Linear regression model
5. Generalized linear model
6. Discussion
References
Further reading
Chapter 3: Rao distances and conformal mapping
1. Introduction
2. Manifolds
2.1. Conformality between two regions
3. Rao distance
4. Conformal mapping
5. Applications
Acknowledgments
References
Chapter 4: Cramer-Rao inequality for testing the suitability of divergent partition functions
1. Introduction
2. A first illustrative example
2.1. Evaluation of the partition function
2.2. Instruction manual for using our procedure
2.3. Evaluation of r
2.4. Dealing with r²
2.5. Obtaining the Fisher information measure
2.6. The six steps to obtain a finite Fisher's information
2.7. Cramer-Rao inequality (CRI)
2.8. Numerical example
3. A Brownian motion example
3.1. The present partition function
3.2. Mean values of x-powers
3.3. Tackling Fisher
3.4. The present Cramer-Rao inequality
4. The harmonic oscillator (HO) in Tsallis statistics
4.1. The HO-Tsallis partition function
4.2. HO-Tsallis mean values for r²
4.3. Mean value of r
4.4. Variance V
4.5. The HO-Tsallis Fisher information measure
5. Failure of the Boltzmann-Gibbs (BG) statistics for Newton's gravitation
5.1. Tackling Z_ν
5.2. Mean values derived from our partition function (PP)
5.2.1. r-Value
5.2.2. The r² instance
5.3. Variance Δr = ⟨r²⟩ − ⟨r⟩²
5.4. Gravitational FIM
5.5. Incompatibility between Boltzmann-Gibbs statistics (BGS) and long-range interactions
6. Statistics of gravitation in Tsallis statistics
6.1. Gravity-Tsallis partition function
6.2. Gravity-Tsallis mean values for r and r²
6.3. Tsallis Gravity treatment and Fisher's information measure
6.4. Tsallis Gravity treatment and Cramer-Rao inequality (CRI)
7. Conclusions
References
Chapter 5: Information geometry and classical Cramér–Rao-type inequalities
1. Introduction
2. I-divergence and Iα-divergence
2.1. Extension to infinite X
2.2. Bregman vs Csiszár
2.3. Classical vs quantum CR inequality
3. Information geometry from a divergence function
3.1. Information geometry for α-CR inequality
3.2. An α-version of Cramér–Rao inequality
3.3. Generalized version of Cramér–Rao inequality
4. Information geometry for Bayesian CR inequality and Barankin bound
5. Information geometry for Bayesian α-CR inequality
6. Information geometry for Hybrid CR inequality
7. Summary
Acknowledgments
Appendix
A.1. Other generalizations of Cramér–Rao inequality
References
Section II: Theoretical applications and physics
Chapter 6: Principle of minimum loss of Fisher information, arising from the Cramer-Rao inequality: Its role i
1. Introduction (Fisher, 1922; Frieden 1998, 2004; Frieden and Gatenby, 2019)
1.1. On learning, energy, sensory messages
1.2. On variational approaches
1.3. Vital role played by information
2. Overview and comparisons of applications
2.1. Classical dynamics (Frieden, 1998, 2004; Frieden and Gatenby, 2007)
2.2. Quantum physics (Frieden, 1998, 2004)
2.3. Biology (Darwin, 1859; Fisher, 1922; Frieden and Gatenby, 2020; Gatenby and Frieden, 2016; Hodgkin and Huxley, 1952)
2.4. Thermodynamics (Frieden et al., 1999)
2.5. Extending use of the principle of natural selection (Popper, 1963)
2.6. From biological cell to earth to solar system, galaxy, universe, and multiverse
2.7. Creation of a multiverse (Popper, 1963) by requiring its Fisher I to be maximized
2.8. Analogy of a cancer "universe"
2.9. What ultimately causes a multiverse to form?
2.10. Is there empirical evidence for a multiverse having formed?
2.11. Details of the process of growing successive universes (Frieden and Gatenby, 2019)
2.12. How many universes N might exist in the multiverse?
2.13. Annihilation of universes
2.14. Growth of a bubble of nothing
2.15. Counter-growth of new universes
2.16. Possibility of many annihilation waves
2.17. How large a number N of universes exist (Linde and Vanchurin, 2010)?
2.18. Is the multiverse merely a theoretical construct?
2.19. Should the fact that we do not, and have not observed life elsewhere in our universe affect a belief that we exist ...
3. Derivation of principle of maximum Fisher information (MFI)
3.1. Cramer-Rao (C-R) inequality (Frieden, 1998, 2004; Frieden and Gatenby, 2020)
3.2. On derivation of the C-R inequality
3.3. What do such data values (augmented by knowledge of a single equality obeyed by the system physics) have to say abou ...
3.3.1. Dependence of system knowledge on the arbitrary nature of forming the data
3.3.2. Dependence on dimensionality
3.3.3. Dependence of system complexity (or order) upon Fisher I.
4. Kantian view of Fisher information use to predict a physical law
4.1. How principle of maximum information originates with Kant
4.2. On significance of the information difference I-J
5. Principle of minimum loss of Fisher information
5.1. Verifying that minimum loss is actually achieved by the principle
5.2. Summary and foundations of the Fisher approach to knowledge acquisition
5.3. What is accomplished by use of the Fisher approach
6. Commonality of information-based growths of cancer and viral infections
6.1. MFI applied to early cancer growth
6.2. Later-stage cancer growth
6.3. MFI applied to early covid-19 growth
6.4. Common biological causes of cancer- and covid-19 growth; the ACE2 link
References
Chapter 7: Quantum metrology and quantum correlations
1. Quantum correlations
2. Parameter estimation
3. Cramer–Rao bound
4. Quantum Fisher information
5. Quantum correlations in estimation theory
5.1. Heisenberg limit
5.2. Interferometric power
6. Conclusion
References
Chapter 8: Information, economics, and the Cramér-Rao bound
1. Introduction
2. Shannon entropy and Fisher information
3. Financial economics
3.1. Discount factors and bonds
3.2. Derivative securities
4. Macroeconomics
5. Discussion and summary
Acknowledgments
References
Chapter 9: Zipf's law results from the scaling invariance of the Cramer–Rao inequality
1. Introduction
2. Our goal
3. Fisher's information measure (FIM) and its minimization
4. Derivation of Zipf's law
5. Zipf plots
6. Summary
References
Further reading
Section III: Advanced statistical theory
Chapter 10: λ-Deformed probability families with subtractive and divisive normalizations
1. Introduction
1.1. Deformation models
1.2. Deformed probability families: General approach
1.3. Chapter outline
2. λ-Deformation of exponential and mixture families
2.1. λ-Deformation
2.2. Deformation: Subtractive approach
2.3. Deformation: Divisive approach
2.4. Relation between the two normalizations
2.5. λ-Exponential and λ-mixture families
3. Deforming Legendre duality: λ-Duality
3.1. From Bregman divergence to λ-logarithmic divergence
3.2. λ-Deformed Legendre duality
3.3. Relationship between λ-conjugation and Legendre conjugation
3.4. Information geometry of λ-logarithmic divergence
4. λ-Deformed entropy and divergence
4.1. Relation between potential functions and Rényi entropy
4.2. Relation between λ-logarithmic divergence and Rényi divergence
4.3. Entropy maximizing property of λ-exponential family
5. Example: λ-Deformation of the probability simplex
5.1. λ-Exponential representation
5.2. λ-Mixture representation
6. Summary and conclusion
References
Chapter 11: Some remarks on Fisher information, the Cramer–Rao inequality, and their applications to physics
1. Introduction
2. Diffusion equation
3. Connection with Tsallis statistics
4. Conclusions
Appendix
A.1. The Cramer–Rao bound (Frieden, 1989)
References
Index
Back Cover



