Bayesian Methods for Management and Business: Pragmatic Solutions for Real Problems
ISBN: 9781118637555
384 pages
September 2014

Description
HIGHLIGHTS THE USE OF BAYESIAN STATISTICS TO GAIN INSIGHTS FROM EMPIRICAL DATA
Featuring an accessible approach, Bayesian Methods for Management and Business: Pragmatic Solutions for Real Problems demonstrates how Bayesian statistics can help to provide insights into important issues facing business and management. The book draws on multidisciplinary applications and examples and utilizes the freely available software WinBUGS and R to illustrate the integration of Bayesian statistics within data-rich environments.
Computational issues are discussed and integrated with coverage of linear models, sensitivity analysis, Markov Chain Monte Carlo (MCMC), and model comparison. In addition, more advanced models, including hierarchical models, generalized linear models, and latent variable models, are presented to further bridge theory and application in real-world usage.
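The computational thread described above begins with closed-form conjugate updates before moving on to MCMC. As a flavor of that starting point, here is a minimal sketch of a conjugate Beta-Binomial update in plain Python (the book itself uses WinBUGS and R; the prior and data values here are hypothetical, chosen only for illustration):

```python
# Conjugate Beta-Binomial update: with a Beta(a, b) prior on a success
# probability and k successes observed in n trials, the posterior is
# Beta(a + k, b + n - k) -- no numerical integration needed.
a, b = 1.0, 1.0                        # uniform Beta(1, 1) prior (assumed)
k, n = 7, 10                           # hypothetical data: 7 successes in 10 trials
post_a, post_b = a + k, b + (n - k)    # posterior parameters: Beta(8, 4)
post_mean = post_a / (post_a + post_b) # posterior mean of the success probability
print(post_a, post_b, round(post_mean, 3))  # prints 8.0 4.0 0.667
```

Models without such conjugate structure are where the simulation methods covered later (Monte Carlo integration, Gibbs sampling, Metropolis-Hastings) come in.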
Bayesian Methods for Management and Business: Pragmatic Solutions for Real Problems also features:
• Numerous real-world examples drawn from multiple management disciplines such as strategy, international business, accounting, and information systems
• An incremental skill-building presentation based on analyzing data sets with widely applicable models of increasing complexity
• An accessible treatment of Bayesian statistics that is integrated with a broad range of business and management issues and problems
• A practical problem-solving approach to illustrate how Bayesian statistics can help to provide insight into important issues facing business and management
Bayesian Methods for Management and Business: Pragmatic Solutions for Real Problems is an important textbook for Bayesian statistics courses at the advanced MBA level and for business and management PhD candidates taking a first course in methodology. The book is also a useful resource for business and management scholars and practitioners who seek to broaden their methodological skill sets.
Table of Contents
Preface xv
1 Introduction to Bayesian Methods 1
1.1 Bayesian Methods: An Aerial Survey 1
1.1.1 Informal Example 3
1.2 Bayes’ Theorem 4
1.3 Bayes’ Theorem and the Focus Group 6
1.4 The Flavors of Probability 8
1.4.1 Common Ground 9
1.4.2 Frequency-Based Probability 9
1.4.3 Subjective Probability 10
1.5 Summary 11
1.6 Notation Introduced in this Chapter 11
2 A First Look at Bayesian Computation 12
2.1 Getting Started 12
2.2 Selecting the Likelihood Function 13
2.3 Selecting the Functional Form 16
2.4 Selecting the Prior 17
2.5 Finding the Normalizing Constant 18
2.6 Obtaining the Posterior 19
2.7 Communicating Findings 23
2.8 Predicting Future Outcomes 26
2.9 Summary 28
2.10 Exercises 28
2.11 Notation Introduced in this Chapter 29
3 Computer-Assisted Bayesian Computation 30
3.1 Getting Started 30
3.2 Random Number Sequences 31
3.3 Monte Carlo Integration 33
3.4 Monte Carlo Simulation for Inference 36
3.4.1 Testing for a Difference in Proportions 37
3.4.2 Predicting Customer Behavior 38
3.4.3 Predicting Customer Behavior Part 2 40
3.5 The Conjugate Normal Model 40
3.5.1 The Conjugate Normal Model: Mean with Variance Known 40
3.5.2 The Conjugate Normal Model: Variance with Mean Known 42
3.5.3 The Conjugate Normal Model with Mean and Variance Both Unknown 44
3.6 In Practice: Inference for the Conjugate Normal Model 45
3.6.1 Conjugate Normal Mean with Variance Known 46
3.6.2 Conjugate Normal Variance with Mean Known 47
3.6.3 Conjugate Normal Mean and Variance Both Unknown 48
3.7 Count Data and the Conjugate Poisson Model 52
3.7.1 In Detail: Conjugate Poisson Model Development 53
3.7.2 In Practice: Inference for the Conjugate Poisson Model 54
3.8 Summary 56
3.9 Exercises 56
3.10 Notation Introduced in this Chapter 58
3.11 Appendix—In Detail: Finding Posterior Distributions for the Normal Model 58
3.11.1 Analysis of the Normal Mean with Variance Known 59
3.11.2 Analysis of the Normal Variance with Mean Known 61
3.11.3 Analysis of the Conjugate Normal Model with Mean and Variance Both Unknown 62
4 Markov Chain Monte Carlo and Regression Models 64
4.1 Introduction to Markov Chain Monte Carlo 64
4.2 Fundamentals of MCMC 66
4.3 Gibbs Sampling 67
4.3.1 Gibbs Sampling for the Normal Mean 69
4.3.2 Output Analysis 70
4.4 Gibbs Sampling and the Simple Linear Regression Model 73
4.5 In Practice: The Simple Linear Regression Model 76
4.6 The Metropolis Algorithm 79
4.6.1 In Practice: Simulating from a Standard Normal Distribution Using the Metropolis Algorithm 81
4.6.2 In Practice: Regression Analysis Using the Metropolis Algorithm 85
4.7 Hastings’ Extension of the Metropolis Algorithm 87
4.7.1 In Practice: The Metropolis–Hastings Algorithm 89
4.7.2 The Relationship Between the Gibbs Sampler and the Metropolis–Hastings Algorithm 90
4.8 Summary 91
4.9 Exercises 92
5 Estimating Bayesian Models With WinBUGS 93
5.1 An Introduction to WinBUGS 94
5.2 In Practice: A First WinBUGS Model 95
5.3 In Practice: Models for the Mean in WinBUGS 104
5.3.1 Examining the Single-Sample Mean 104
5.3.2 The Two-Sample t-Test 106
5.3.3 An Alternative Parameterization of the Two-Sample t-Test 108
5.4 Examining the Prior’s Influence with Sensitivity Analysis 111
5.4.1 Sensitivity Analysis with Informative Priors 111
5.4.2 Sensitivity Analysis with Noninformative Priors 113
5.4.3 In Practice: Presensitivity Analysis: Graphically Examining a Mean Parameter’s Prior and Posterior Distribution 114
5.4.4 In Practice: Presensitivity Analysis—Graphically Examining a Precision Parameter 117
5.4.5 In Practice: Sensitivity Analysis for a Mean Parameter 118
5.4.6 In Practice: Sensitivity Analysis for a Precision Parameter 118
5.5 In Practice: Examining Proportions in WinBUGS 120
5.5.1 Analyzing Differences in Proportions 121
5.5.2 Predicting Customer Behavior: Part 2 Revisited 124
5.6 Analysis of Variance Models 125
5.6.1 In Practice: One-Way ANOVA 126
5.6.2 In Practice: One-Way ANOVA with Effects Coding 132
5.6.3 In Practice: One-Way ANOVA with Unequal Variances 133
5.6.4 Indexing Parameters by Group Membership Variables 136
5.7 Higher Order ANOVA Models 137
5.7.1 In Practice: Two-Way ANOVA with Structured Data 139
5.7.2 Two-Way ANOVA with Group Indicator Variables 140
5.7.3 Using Columnar Data in WinBUGS 143
5.8 Regression and ANCOVA Models in WinBUGS 144
5.8.1 In Practice: Simple Linear Regression Using WinBUGS 145
5.8.2 In Practice: ANCOVA Models Using WinBUGS 147
5.8.3 In Practice: “Undifferenced” ANCOVA Models Using WinBUGS 150
5.9 Summary 152
5.10 Chapter Appendix: Exporting WinBUGS MCMC Output to R 152
5.11 Exercises 153
6 Assessing MCMC Performance in WinBUGS 155
6.1 Convergence Issues in MCMC Modeling 155
6.2 Output Diagnostics in WinBUGS 158
6.2.1 The Quantiles Tool 158
6.2.2 The Autocorrelation Function Tool 159
6.3 Reparameterizing to Improve Convergence 161
6.4 Number and Length of Chains 165
6.4.1 Number of Chains 165
6.4.2 Length of Chains 173
6.5 Metropolis–Hastings Acceptance Rates 175
6.6 Summary 177
6.7 Exercises 178
7 Model Checking and Model Comparison 180
7.1 Graphical Model Checking 180
7.1.1 In Practice: Graphical Fit Plots 181
7.1.2 In Practice: Residual Analysis 183
7.2 Predictive Densities and Checking Model Assumptions 185
7.2.1 The Posterior Predictive p-Value 186
7.2.2 In Detail: Comparing Posterior Predictive p-Value Test Statistics 190
7.3 Variable Selection Methods 192
7.3.1 Kuo and Mallick’s Method 192
7.3.2 In Practice: Kuo and Mallick Variable Selection 194
7.3.3 Gibbs Variable Selection 196
7.3.4 In Practice: Gibbs Variable Selection 197
7.3.5 Reversible Jump MCMC 197
7.3.6 In Practice: Reversible Jump MCMC with WinBUGS 198
7.4 Bayes Factors and Bayesian Information Criterion 201
7.4.1 In Practice: Calculating the Marginal Likelihood for a Simple Proportion 204
7.4.2 Bayesian Information Criterion 205
7.5 Deviance Information Criterion 208
7.5.1 AIC and Classical Nonnested Model Selection 208
7.5.2 DIC: A Bayesian Version of AIC 209
7.5.3 In Practice: DIC for Variable Selection 211
7.5.4 In Practice: Likelihood Transformations and DIC 213
7.6 Summary 214
7.7 Exercises 214
8 Hierarchical Models 217
8.1 Fundamentals of Hierarchical Models 218
8.1.1 In Detail: Hierarchical Model Error Terms 222
8.1.2 In Practice: The One-Way Random-Effects ANOVA Model 223
8.1.3 In Practice: Hierarchical Centering 225
8.1.4 In Practice: Examining Alternative Priors for Variance Components 226
8.1.5 In Practice: Longitudinal Modeling 227
8.2 The Random Coefficients Model 228
8.2.1 In Practice: Structuring Data for Hierarchical Models 231
8.2.2 In Practice: The Random Coefficients Model 233
8.2.3 In Practice: Changing Random Coefficients to Be Nonrandom 236
8.2.4 In Practice: Multiple-Predictor Random Coefficients Models 237
8.3 Hierarchical Models for Variance Terms 238
8.4 Functional Forms at Multiple Hierarchical Levels 242
8.4.1 In Practice: Second-Level Functional Forms 245
8.4.2 In Practice: Interpreting Second-Level Coefficients 247
8.5 In Detail: Modeling Covarying Hierarchical Terms 249
8.5.1 Specifying Priors for the Bivariate Normal 250
8.5.2 In Practice: The Covarying Random Coefficients Model 252
8.5.3 In Practice: Case Studies in the Covarying Random Coefficients Model 254
8.6 Summary 256
8.7 Exercises 256
8.8 Notation Introduced in this Chapter 257
9 Generalized Linear Models 259
9.1 Fundamentals of Generalized Linear Models 259
9.2 Count Data Models: Poisson Regression 262
9.3 Models for Binary Data: Logistic Regression 266
9.4 The Probit Model 271
9.5 In Detail: Multinomial Logistic Regression for Categorical Outcomes 274
9.5.1 In Practice: Multinomial Logit for Contingency Tables 277
9.5.2 In Practice: Multinomial Logit with Continuous Predictors 279
9.6 Hierarchical Models for Count Data 281
9.6.1 The Negative Binomial Regression Model 282
9.6.2 In Practice: Simulating from the Negative Binomial Distribution 282
9.6.3 In Practice: Negative Binomial Regression 285
9.7 Hierarchical Models for Binary Data 287
9.7.1 In Practice: Logistic Regression with Random Intercepts 288
9.8 Summary 290
9.9 Exercises 291
9.10 Notation Introduced in this Chapter 292
10 Models for Difficult Data 294
10.1 Living with Outliers—Robust Regression Models 294
10.1.1 Another Look at the t-Distribution 296
10.1.2 In Practice: Robust Regression with the t-Distribution 297
10.1.3 In Detail: Placing a Prior on ν 301
10.2 Handling Heteroscedasticity by Modeling Variance Parameters 304
10.2.1 In Practice: Modeling Heteroscedasticity 305
10.3 Dealing with Missing Data 309
10.4 Types of Missing Data 311
10.4.1 Missing Completely at Random Data 311
10.4.2 In Practice: Analyzing MCAR Data 312
10.4.3 Missing at Random Data 314
10.4.4 In Practice: Analyzing MAR Data 315
10.4.5 Missing Not at Random Data 317
10.5 Missing Covariate Data and NonNormal Missing Data 318
10.6 Summary 319
10.7 Exercises 320
10.8 Notation Introduced in this Chapter 321
11 Introduction to Latent Variable Models 322
11.1 Not Seen but Felt 322
11.2 Latent Variable Models for Binary Data 323
11.2.1 In Practice: The Probit Model Using Latent Variables 325
11.3 Structural Break Models 327
11.3.1 In Practice: Estimating Structural Break Models 329
11.3.2 In Practice: Adding Covariates to Structural Break Models 332
11.3.3 In Detail: Improving Parameter Mixing in Structural Break Models 333
11.4 In Detail: The Ordinal Probit Model 335
11.4.1 Posterior Simulation in the Ordinal Probit Model 336
11.4.2 In Practice: Modeling Credit Ratings with Ordinal Probit 339
11.5 Summary 341
11.6 Exercises 342
Appendix A Common Statistical Distributions 344
References 346
Author Index 357
Subject Index 361
Author Information
Eugene D. Hahn, PhD, is Associate Professor in the Department of Information and Decision Systems in the Franklin P. Perdue School of Business at Salisbury University. He has published in leading business and management journals as well as in journals that discuss Bayesian methods.