University of Washington - Department of Economics
Model uncertainty is central to economics, where researchers attempt to discriminate among alternative theories in robustness analyses. Bayesian Model Averaging (BMA) is an approach designed to address model uncertainty as part of the empirical strategy. Applications of BMA in economics are widespread; however, it is often unclear whether subtle differences in the choice of parameter and model priors affect inference. We present an integrated procedure, based on 12 popular noninformative parameter priors and any given model prior, for conducting sensitivity analysis in BMA.
Our procedure is then applied to a growth dataset that has been a showcase of model uncertainty. We find that small differences in parameter priors influence inference profoundly. In contrast, we show that in a variety of (well-behaved) simulated datasets, the effects of priors are felt only when the number of candidate regressors becomes large relative to the number of observations. We evaluate the predictive performance of alternative priors in the context of economic growth and find that neither Fernandez, Ley and Steel's (2001a) "benchmark" parameter prior nor the preferred subjective model prior of Sala-i-Martin et al. (2004) is optimal. Our results suggest the Unit Information Prior (UIP) as a reference prior, in the sense that it consistently achieves high-quality predictive performance. Nevertheless, true to the Bayesian spirit, for any given application there always exists a specific prior that further optimizes predictive performance. This demonstrates the importance of our integrated procedure, which allows researchers to compare and evaluate the effects of alternative prior distributions.
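To illustrate the kind of parameter-prior sensitivity check described above, the sketch below compares posterior model probabilities under several common choices of g in Zellner's g-prior, using the closed-form Bayes factor against the null model (as in Liang et al. 2008). This is not the authors' procedure or code; the function names and the toy dataset are hypothetical, and only three of the many candidate priors are shown.

```python
import itertools
import numpy as np

def log_bf(y, X, cols, g):
    """Log Bayes factor of the model with regressors `cols` versus the
    intercept-only null model, under Zellner's g-prior (closed form)."""
    n = len(y)
    k = len(cols)
    if k == 0:
        return 0.0
    # R^2 of the OLS fit with an intercept (via centered variables)
    Xc = X[:, cols] - X[:, cols].mean(axis=0)
    yc = y - y.mean()
    beta, *_ = np.linalg.lstsq(Xc, yc, rcond=None)
    r2 = 1.0 - np.sum((yc - Xc @ beta) ** 2) / np.sum(yc ** 2)
    return 0.5 * (n - 1 - k) * np.log1p(g) - 0.5 * (n - 1) * np.log1p(g * (1.0 - r2))

def posterior_model_probs(y, X, g):
    """Enumerate all 2^p candidate models under a uniform model prior
    and return normalized posterior model probabilities."""
    p = X.shape[1]
    models = [c for r in range(p + 1) for c in itertools.combinations(range(p), r)]
    logs = np.array([log_bf(y, X, m, g) for m in models])
    w = np.exp(logs - logs.max())          # stabilize before normalizing
    return models, w / w.sum()

# Toy data: only regressor 0 truly matters
rng = np.random.default_rng(0)
n, p = 60, 4
X = rng.standard_normal((n, p))
y = 1.0 + 2.0 * X[:, 0] + rng.standard_normal(n)

for name, g in [("UIP (g=n)", n), ("RIC (g=k^2)", p ** 2),
                ("BRIC (g=max(n, k^2))", max(n, p ** 2))]:
    models, probs = posterior_model_probs(y, X, g)
    i = int(np.argmax(probs))
    print(f"{name}: top model {models[i]}, posterior prob {probs[i]:.3f}")
```

With ample data relative to the number of regressors, as here, the ranking of models is typically stable across these priors; the sensitivity discussed above emerges when the candidate-regressor count grows large relative to the sample size.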
This work is joint with Chris Papageorgiou (Research Department, International Monetary Fund, and Department of Economics, Louisiana State University) and Adrian E. Raftery (Departments of Statistics and Sociology, Center for Statistics and the Social Sciences, University of Washington).