Doing Bayesian Data Analysis: A Tutorial with R, JAGS, and Stan

Hardcover | November 3, 2014

by John Kruschke

There is an explosion of interest in Bayesian statistics, primarily because recently developed computational methods have finally made Bayesian analysis accessible to a wide audience. Doing Bayesian Data Analysis: A Tutorial with R, JAGS, and Stan provides an accessible approach to Bayesian data analysis, as material is explained clearly with concrete examples. The book begins with the basics, including essential concepts of probability and random sampling, and gradually progresses to advanced hierarchical modeling methods for realistic data. Included are step-by-step instructions on how to conduct Bayesian data analyses in the popular and free software R and WinBUGS. This book is intended for first-year graduate students or advanced undergraduates. It provides a bridge between undergraduate training and modern Bayesian methods for data analysis, which is becoming the accepted research standard. Knowledge of algebra and basic calculus is a prerequisite.

New to this Edition (partial list):

- All new programs in JAGS and Stan. The new programs are designed to be much easier to use than the scripts in the first edition. In particular, there are now compact high-level scripts that make it easy to run the programs on your own data sets. This new programming was a major undertaking by itself.
- The introductory Chapter 2, regarding the basic ideas of how Bayesian inference re-allocates credibility across possibilities, is completely rewritten and greatly expanded.
- Completely new chapters on the programming languages R (Ch. 3), JAGS (Ch. 8), and Stan (Ch. 14). The lengthy new chapter on R includes explanations of data files and structures such as lists and data frames, along with several utility functions. It also has a new poem that I am particularly pleased with. The new chapter on JAGS includes explanation of the runjags package, which executes JAGS on parallel computer cores.
- The new chapter on Stan provides a novel explanation of the concepts of Hamiltonian Monte Carlo. The chapter on Stan also explains conceptual differences in program flow between it and JAGS.
- Chapter 5 on Bayes' rule is greatly revised, with a new emphasis on how Bayes' rule re-allocates credibility across parameter values from prior to posterior.
- The material on model comparison has been removed from all the early chapters and integrated into a compact presentation in Chapter 10.
- What were two separate chapters on the Metropolis algorithm and Gibbs sampling have been consolidated into a single chapter on MCMC methods (Ch. 7).
- There is extensive new material on MCMC convergence diagnostics in Chapters 7 and 8, including explanations of autocorrelation and effective sample size, and exploration of the stability of the estimates of the HDI limits. New computer programs display the diagnostics as well.
- Chapter 9 on hierarchical models includes extensive new and unique material on the crucial concept of shrinkage, along with new examples.
- All the material on model comparison, which was spread across various chapters in the first edition, is now consolidated into a single focused chapter (Ch. 10) that emphasizes its conceptualization as a case of hierarchical modeling.
- Chapter 11 on null hypothesis significance testing is extensively revised. It has new material introducing the concept of a sampling distribution, and new illustrations of sampling distributions for various stopping rules and for multiple tests.
- Chapter 12, regarding Bayesian approaches to null-value assessment, has new material about the region of practical equivalence (ROPE), new examples of accepting the null value by Bayes factors, and a new explanation of the Bayes factor in terms of the Savage-Dickey method.
- Chapter 13, regarding statistical power and sample size, has an extensive new section on sequential testing, and on making the research goal precision of estimation instead of rejecting or accepting a particular value.
- Chapter 15, which introduces the generalized linear model, is fully revised, with more complete tables showing combinations of predicted and predictor variable types.
- Chapter 16, regarding estimation of means, now includes extensive discussion of comparing two groups, along with explicit estimates of effect size.
- Chapter 17, regarding regression on a single metric predictor, now includes extensive examples of robust regression in JAGS and Stan. New examples of hierarchical regression, including quadratic trend, graphically illustrate shrinkage in estimates of individual slopes and curvatures. The use of weighted data is also illustrated.
- Chapter 18, on multiple linear regression, includes a new section on Bayesian variable selection, in which various candidate predictors are probabilistically included in the regression model.
- Chapter 19, on one-factor ANOVA-like analysis, has all new examples, including a completely worked-out example analogous to analysis of covariance (ANCOVA), and a new example involving heterogeneous variances.
- Chapter 20, on multi-factor ANOVA-like analysis, has all new examples, including a completely worked-out example of a split-plot design that involves a combination of a within-subjects factor and a between-subjects factor.
- Chapter 21, on logistic regression, is expanded to include examples of robust logistic regression, and examples with nominal predictors.
- There is a completely new chapter (Ch. 22) on multinomial logistic regression. This chapter fills in a case of the generalized linear model (namely, a nominal predicted variable) that was missing from the first edition.
- Chapter 23, regarding ordinal data, is greatly expanded.
- New examples illustrate single-group and two-group analyses, and demonstrate how interpretations differ from treating ordinal data as if they were metric.
- There is a new section (25.4) that explains how to model censored data in JAGS.
- Many exercises are new or revised.

Key Features:

- Accessible, including the basics of essential concepts of probability and random sampling
- Examples with the R programming language and JAGS software
- Comprehensive coverage of all scenarios addressed by non-Bayesian textbooks: t-tests, analysis of variance (ANOVA) and comparisons in ANOVA, multiple regression, and chi-square contingency table analysis
- Coverage of experiment planning
- R and JAGS computer programming code on website
- Exercises have explicit purposes and guidelines for accomplishment
- Provides step-by-step instructions on how to conduct Bayesian data analyses in the popular and free software R and WinBUGS
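The central idea the description keeps returning to, Bayes' rule re-allocating credibility across candidate parameter values (Chapters 2 and 5), can be sketched in a few lines. This is an illustrative toy in Python, not code from the book: the grid of candidate coin biases and the data (7 heads in 10 flips) are made up for the example.

```python
# Hedged illustration (not the book's code) of how Bayes' rule
# re-allocates credibility across candidate parameter values.
# The parameter is a coin's bias theta on a small discrete grid.

thetas = [i / 10 for i in range(1, 10)]      # candidate biases 0.1 .. 0.9
prior = [1 / len(thetas)] * len(thetas)      # uniform prior credibility

# Made-up data for illustration: 7 heads in 10 flips.
heads, flips = 7, 10
likelihood = [t**heads * (1 - t)**(flips - heads) for t in thetas]

# Posterior = prior * likelihood, re-normalized so credibilities sum to 1.
unnorm = [p * l for p, l in zip(prior, likelihood)]
total = sum(unnorm)
posterior = [u / total for u in unnorm]

# Credibility has shifted from flat toward theta near 0.7.
best = thetas[max(range(len(thetas)), key=lambda i: posterior[i])]
```

The same re-allocation happens in the book's continuous-parameter models; the discrete grid just makes the bookkeeping visible.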
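Likewise, the Metropolis sampling and HDI-limit estimation mentioned for Chapter 7 can be sketched minimally. Again a hypothetical Python toy rather than the book's programs (which use JAGS and Stan): a random-walk Metropolis chain targets the same coin-bias posterior, and the 95% HDI is taken as the narrowest interval containing 95% of the retained samples.

```python
import random

# Hedged sketch (not the book's code) of a random-walk Metropolis
# sampler, targeting the posterior of a coin bias theta after
# 7 heads in 10 flips under a uniform prior on (0, 1).

def unnorm_post(theta, heads=7, flips=10):
    """Unnormalized posterior: binomial likelihood times a flat prior."""
    if not 0.0 < theta < 1.0:
        return 0.0
    return theta**heads * (1.0 - theta)**(flips - heads)

random.seed(1)                                    # reproducible chain
theta, samples = 0.5, []
for _ in range(20000):
    proposal = theta + random.gauss(0.0, 0.1)     # symmetric jump
    # Accept with probability min(1, posterior ratio).
    if random.random() < unnorm_post(proposal) / unnorm_post(theta):
        theta = proposal
    samples.append(theta)

# 95% HDI estimated as the narrowest interval holding 95% of the
# post-burn-in samples -- the quantity whose stability the book's
# convergence diagnostics examine.
s = sorted(samples[2000:])
k = int(0.95 * len(s))
i = min(range(len(s) - k), key=lambda j: s[j + k] - s[j])
hdi = (s[i], s[i + k])
```

With a flat prior and these data the exact posterior is Beta(8, 4), so the chain's mean should land near 0.67, with an HDI roughly spanning 0.4 to 0.9; the proposal width and burn-in length here are arbitrary choices for the sketch.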

Pricing and Purchase Info

$112.77 online
$140.06 list price (save 19%)
In stock online
Ships free on orders over $25



John K. Kruschke is Professor of Psychological and Brain Sciences, and Adjunct Professor of Statistics, at Indiana University in Bloomington, Indiana, USA. He is an eight-time winner of Teaching Excellence Recognition Awards from Indiana University. He won the Troland Research Award from the National Academy of Sciences (USA), and the Rem...


Format: Hardcover
Dimensions: 776 pages, 9.41 × 7.24 × 0.98 in
Published: November 3, 2014
Publisher: Academic Press
Language: English

The following ISBNs are associated with this title:

ISBN-10: 0124058884

ISBN-13: 9780124058880


Table of Contents

1. What's in This Book (Read This First!)

PART I: The Basics: Models, Probability, Bayes' Rule, and R

2. Introduction: Credibility, Models, and Parameters
3. The R Programming Language
4. What Is This Stuff Called Probability?
5. Bayes' Rule

PART II: All the Fundamentals Applied to Inferring a Binomial Probability

6. Inferring a Binomial Probability via Exact Mathematical Analysis
7. Markov Chain Monte Carlo
8. JAGS
9. Hierarchical Models
10. Model Comparison and Hierarchical Modeling
11. Null Hypothesis Significance Testing
12. Bayesian Approaches to Testing a Point ("Null") Hypothesis
13. Goals, Power, and Sample Size
14. Stan

PART III: The Generalized Linear Model

15. Overview of the Generalized Linear Model
16. Metric-Predicted Variable on One or Two Groups
17. Metric Predicted Variable with One Metric Predictor
18. Metric Predicted Variable with Multiple Metric Predictors
19. Metric Predicted Variable with One Nominal Predictor
20. Metric Predicted Variable with Multiple Nominal Predictors
21. Dichotomous Predicted Variable
22. Nominal Predicted Variable
23. Ordinal Predicted Variable
24. Count Predicted Variable
25. Tools in the Trunk

Bibliography
Index

Editorial Reviews

"I think it fills a gaping hole in what is currently available, and will serve to create its own market as researchers and their students transition towards the routine application of Bayesian statistical methods." - Prof. Michael Lee, University of California, Irvine, and President of the Society for Mathematical Psychology

"Kruschke's text covers a much broader range of traditional experimental designs ... has the potential to change the way most cognitive scientists and experimental psychologists approach the planning and analysis of their experiments." - Prof. Geoffrey Iverson, University of California, Irvine, and past President of the Society for Mathematical Psychology

"John Kruschke has written a book on Statistics. It's better than others for reasons stylistic. It also is better because it is Bayesian. To find out why, buy it -- it's truly amazin'!" - James L. (Jay) McClelland, Lucie Stern Professor & Chair, Dept. of Psychology, Stanford University