000 06036cam a2200517Mu 4500
001 9780429202322
003 FlBoTFG
005 20240213122829.0
006 m d
007 cr cnu---unuuu
008 191130s2019 flu o 000 0 eng d
040 _aOCoLC-P
_beng
_cOCoLC-P
020 _a9780429510946
020 _a0429510942
020 _a9780429202322
_q(electronic bk.)
020 _a0429202326
_q(electronic bk.)
020 _a9780429514371
_q(electronic bk. : EPUB)
020 _a0429514379
_q(electronic bk. : EPUB)
035 _a(OCoLC)1129160328
035 _a(OCoLC-P)1129160328
050 4 _aQA276.4
072 7 _aBUS
_x061000
_2bisacsh
072 7 _aCOM
_x021030
_2bisacsh
072 7 _aMAT
_x029000
_2bisacsh
072 7 _aUFM
_2bicssc
082 0 4 _a519.50285
_223
100 1 _aZwanzig, Silvelyn.
245 1 0 _aComputer Intensive Methods in Statistics
_h[electronic resource].
260 _aBoca Raton :
_bCRC Press LLC,
_c2019.
300 _a1 online resource (227 p.)
500 _aDescription based upon print version of record.
505 0 _aCover; Half Title; Title Page; Copyright Page; Contents; Preface; Introduction; 1. Random Variable Generation; 1.1 Basic Methods; 1.1.1 Congruential Generators; 1.1.2 The KISS Generator; 1.1.3 Beyond Uniform Distributions; 1.2 Transformation Methods; 1.3 Accept-Reject Methods; 1.3.1 Envelope Accept-Reject Methods; 1.4 Problems; 2. Monte Carlo Methods; 2.1 Independent Monte Carlo Methods; 2.1.1 Importance Sampling; 2.1.2 The Rule of Thumb for Importance Sampling; 2.2 Markov Chain Monte Carlo; 2.2.1 Metropolis-Hastings Algorithm; 2.2.2 Special MCMC Algorithms; 2.2.3 Adaptive MCMC
505 8 _a2.2.4 Perfect Simulation; 2.2.5 The Gibbs Sampler; 2.3 Approximate Bayesian Computation Methods; 2.4 Problems; 3. Bootstrap; 3.1 General Principle; 3.1.1 Unified Bootstrap Framework; 3.1.2 Bootstrap and Monte Carlo; 3.1.3 Conditional and Unconditional Distribution; 3.2 Basic Bootstrap; 3.2.1 Plug-in Principle; 3.2.2 Why is Bootstrap Good?; 3.2.3 Example where Bootstrap Fails; 3.3 Bootstrap Confidence Sets; 3.3.1 The Pivotal Method; 3.3.2 Bootstrap Pivotal Methods; 3.3.2.1 Percentile Bootstrap Confidence Interval; 3.3.2.2 Basic Bootstrap Confidence Interval
505 8 _a3.3.2.3 Studentized Bootstrap Confidence Interval; 3.3.3 Transformed Bootstrap Confidence Intervals; 3.3.4 Prepivoting Confidence Set; 3.3.5 BCa-Confidence Interval; 3.4 Bootstrap Hypothesis Tests; 3.4.1 Parametric Bootstrap Hypothesis Test; 3.4.2 Nonparametric Bootstrap Hypothesis Test; 3.4.3 Advanced Bootstrap Hypothesis Tests; 3.5 Bootstrap in Regression; 3.5.1 Model-Based Bootstrap; 3.5.2 Parametric Bootstrap Regression; 3.5.3 Casewise Bootstrap in Correlation Model; 3.6 Bootstrap for Time Series; 3.7 Problems; 4. Simulation-Based Methods; 4.1 EM Algorithm; 4.2 SIMEX; 4.3 Variable Selection
505 8 _a4.3.1 F-Backward and F-Forward Procedures; 4.3.2 FSR-Forward Procedure; 4.3.3 SimSel; 4.4 Problems; 5. Density Estimation; 5.1 Background; 5.2 Histogram; 5.3 Kernel Density Estimator; 5.3.1 Statistical Properties; 5.3.2 Bandwidth Selection in Practice; 5.4 Nearest Neighbor Estimator; 5.5 Orthogonal Series Estimator; 5.6 Minimax Convergence Rate; 5.7 Problems; 6. Nonparametric Regression; 6.1 Background; 6.2 Kernel Regression Smoothing; 6.3 Local Regression; 6.4 Classes of Restricted Estimators; 6.4.1 Ridge Regression; 6.4.2 Lasso; 6.5 Spline Estimators; 6.5.1 Base Splines
505 8 _a6.5.2 Smoothing Splines; 6.6 Wavelet Estimators; 6.6.1 Wavelet Base; 6.6.2 Wavelet Smoothing; 6.7 Choosing the Smoothing Parameter; 6.8 Bootstrap in Regression; 6.9 Problems; References; Index
520 _aThis textbook gives an overview of statistical methods that have been developed in recent years as a result of increasing computer use, including random number generators, Monte Carlo methods, Markov Chain Monte Carlo (MCMC) methods, Bootstrap, EM algorithms, SIMEX, variable selection, density estimators, kernel estimators, orthogonal and local polynomial estimators, wavelet estimators, splines, and model assessment. Computer Intensive Methods in Statistics is written for students at the graduate level, but can also be used by practitioners. Features: Presents the main ideas of computer-intensive statistical methods; Gives the algorithms for all the methods; Uses various plots and illustrations to explain the main ideas; Covers the theoretical background of the main methods; Includes R code for the methods and examples. Silvelyn Zwanzig is an Associate Professor of Mathematical Statistics at Uppsala University. She studied Mathematics at the Humboldt University in Berlin. Before coming to Sweden, she was an Assistant Professor at the University of Hamburg in Germany. She received her Ph.D. in Mathematics at the Academy of Sciences of the GDR. Since 1991, she has taught statistics to undergraduate and graduate students. Her research interests have moved from theoretical statistics to computer-intensive statistics. Behrang Mahjani is a postdoctoral fellow with a Ph.D. in Scientific Computing, with a focus on Computational Statistics, from Uppsala University, Sweden. He joined the Seaver Autism Center for Research and Treatment at the Icahn School of Medicine at Mount Sinai, New York, in September 2017 and was formerly a postdoctoral fellow at the Karolinska Institutet, Stockholm, Sweden. His research is focused on solving large-scale problems through statistical and computational methods.
588 _aOCLC-licensed vendor bibliographic record.
650 7 _aBUSINESS & ECONOMICS / Statistics
_2bisacsh
650 7 _aCOMPUTERS / Database Management / Data Mining
_2bisacsh
650 7 _aMATHEMATICS / Probability & Statistics / General
_2bisacsh
650 0 _aStatistics
_xData processing.
700 1 _aMahjani, Behrang.
856 4 0 _3Taylor & Francis
_uhttps://www.taylorfrancis.com/books/9780429202322
856 4 2 _3OCLC metadata license agreement
_uhttp://www.oclc.org/content/dam/oclc/forms/terms/vbrl-201703.pdf
999 _c5456
_d5456