Monte Carlo Statistical Methods

Book Information

Publisher: World Publishing Corporation (世界圖書出版公司); 2nd edition (October 1, 2009)
Original title: Monte Carlo Statistical Methods (2nd Edition)
Paperback: 645 pages
Language: English
Format: 24mo
ISBN: 9787510005114, 7510005116
Barcode: 9787510005114
Dimensions: 22 x 15 x 3 cm
Weight: 980 g

About the Authors

Authors: Christian P. Robert (France) and George Casella (USA)

Synopsis

Monte Carlo Statistical Methods (2nd Edition, English edition) covers: Introduction, Statistical Models, Likelihood Methods, Bayesian Methods, Deterministic Numerical Methods, Optimization, Integration, Comparison, Problems, Notes, Prior Distributions, Bootstrap Methods, Random Variable Generation, Introduction, Uniform Simulation, The Inverse Transform, Alternatives, Optimal Algorithms, General Transformation Methods, Accept-Reject Methods, The Fundamental Theorem of Simulation, The Accept-Reject Algorithm, Envelope Accept-Reject Methods, The Squeeze Principle, Log-Concave Densities, and more.

Table of Contents

Preface to the Second Edition
Preface to the First Edition
1 Introduction
1.1 Statistical Models
1.2 Likelihood Methods
1.3 Bayesian Methods
1.4 Deterministic Numerical Methods
1.4.1 Optimization
1.4.2 Integration
1.4.3 Comparison
1.5 Problems
1.6 Notes
1.6.1 Prior Distributions
1.6.2 Bootstrap Methods
2 Random Variable Generation
2.1 Introduction
2.1.1 Uniform Simulation
2.1.2 The Inverse Transform
2.1.3 Alternatives
2.1.4 Optimal Algorithms
2.2 General Transformation Methods
2.3 Accept-Reject Methods
2.3.1 The Fundamental Theorem of Simulation
2.3.2 The Accept-Reject Algorithm
2.4 Envelope Accept-Reject Methods
2.4.1 The Squeeze Principle
2.4.2 Log-Concave Densities
2.5 Problems
2.6 Notes
2.6.1 The KISS Generator
2.6.2 Quasi-Monte Carlo Methods
2.6.3 Mixture Representations
3 Monte Carlo Integration
3.1 Introduction
3.2 Classical Monte Carlo Integration
3.3 Importance Sampling
3.3.1 Principles
3.3.2 Finite Variance Estimators
3.3.3 Comparing Importance Sampling with Accept-Reject
3.4 Laplace Approximations
3.5 Problems
3.6 Notes
3.6.1 Large Deviations Techniques
3.6.2 The Saddlepoint Approximation
4 Controlling Monte Carlo Variance
4.1 Monitoring Variation with the CLT
4.1.1 Univariate Monitoring
4.1.2 Multivariate Monitoring
4.2 Rao-Blackwellization
4.3 Riemann Approximations
4.4 Acceleration Methods
4.4.1 Antithetic Variables
4.4.2 Control Variates
4.5 Problems
4.6 Notes
4.6.1 Monitoring Importance Sampling Convergence
4.6.2 Accept-Reject with Loose Bounds
4.6.3 Partitioning
5 Monte Carlo Optimization
5.1 Introduction
5.2 Stochastic Exploration
5.2.1 A Basic Solution
5.2.2 Gradient Methods
5.2.3 Simulated Annealing
5.2.4 Prior Feedback
5.3 Stochastic Approximation
5.3.1 Missing Data Models and Demarginalization
5.3.2 The EM Algorithm
5.3.3 Monte Carlo EM
5.3.4 EM Standard Errors
5.4 Problems
5.5 Notes
5.5.1 Variations on EM
5.5.2 Neural Networks
5.5.3 The Robbins-Monro Procedure
5.5.4 Monte Carlo Approximation
6 Markov Chains
6.1 Essentials for MCMC
6.2 Basic Notions
6.3 Irreducibility, Atoms, and Small Sets
6.3.1 Irreducibility
6.3.2 Atoms and Small Sets
6.3.3 Cycles and Aperiodicity
6.4 Transience and Recurrence
6.4.1 Classification of Irreducible Chains
6.4.2 Criteria for Recurrence
6.4.3 Harris Recurrence
6.5 Invariant Measures
6.5.1 Stationary Chains
6.5.2 Kac’s Theorem
6.5.3 Reversibility and the Detailed Balance Condition
6.6 Ergodicity and Convergence
6.6.1 Ergodicity
6.6.2 Geometric Convergence
6.6.3 Uniform Ergodicity
6.7 Limit Theorems
6.7.1 Ergodic Theorems
6.7.2 Central Limit Theorems
6.8 Problems
6.9 Notes
6.9.1 Drift Conditions
6.9.2 Eaton's Admissibility Condition
6.9.3 Alternative Convergence Conditions
6.9.4 Mixing Conditions and Central Limit Theorems
6.9.5 Covariance in Markov Chains
7 The Metropolis-Hastings Algorithm
7.1 The MCMC Principle
7.2 Monte Carlo Methods Based on Markov Chains
7.3 The Metropolis-Hastings Algorithm
7.3.1 Definition
7.3.2 Convergence Properties
7.4 The Independent Metropolis-Hastings Algorithm
7.4.1 Fixed Proposals
7.4.2 A Metropolis-Hastings Version of ARS
7.5 Random Walks
7.6 Optimization and Control
7.6.1 Optimizing the Acceptance Rate
7.6.2 Conditioning and Accelerations
7.6.3 Adaptive Schemes
7.7 Problems
7.8 Notes
7.8.1 Background of the Metropolis Algorithm
7.8.2 Geometric Convergence of Metropolis-Hastings Algorithms
7.8.3 A Reinterpretation of Simulated Annealing
7.8.4 Reference Acceptance Rates
7.8.5 Langevin Algorithms
8 The Slice Sampler
8.1 Another Look at the Fundamental Theorem
8.2 The General Slice Sampler
8.3 Convergence Properties of the Slice Sampler
8.4 Problems
8.5 Notes
8.5.1 Dealing with Difficult Slices
9 The Two-Stage Gibbs Sampler
9.1 A General Class of Two-Stage Algorithms
9.1.1 From Slice Sampling to Gibbs Sampling
9.1.2 Definition
9.1.3 Back to the Slice Sampler
9.1.4 The Hammersley-Clifford Theorem
9.2 Fundamental Properties
9.2.1 Probabilistic Structures
9.2.2 Reversible and Interleaving Chains
9.2.3 The Duality Principle
9.3 Monotone Covariance and Rao-Blackwellization
9.4 The EM-Gibbs Connection
9.5 Transition
9.6 Problems
9.7 Notes
9.7.1 Inference for Mixtures
9.7.2 ARCH Models
10 The Multi-Stage Gibbs Sampler
10.1 Basic Derivations
10.1.1 Definition
10.1.2 Completion
……
11 Variable Dimension Models and Reversible Jump Algorithms
12 Diagnosing Convergence
13 Perfect Sampling
14 Iterated and Sequential Importance Sampling
A Probability Distributions
B Notation
References
Index of Names
Index of Subjects
