
Big Data for Twenty-First-Century Economic Statistics
edited by Katharine G. Abraham, Ron S. Jarmin, Brian C. Moyer, and Matthew D. Shapiro
University of Chicago Press, 2022
Cloth: 978-0-226-80125-4 | eISBN: 978-0-226-80139-1
Library of Congress Classification HB143.5.B54 2022
Dewey Decimal Classification 330.0727

The papers in this volume analyze the deployment of Big Data to solve both existing and novel challenges in economic measurement. 

The existing infrastructure for producing key economic statistics relies heavily on data collected through sample surveys and periodic censuses, together with administrative records generated in connection with tax administration. The increasing difficulty of obtaining survey and census responses threatens the viability of these data collection approaches. At the same time, the growing availability of new sources of Big Data, such as scanner data on purchases, credit card transaction records, payroll information, and prices scraped from the websites of online sellers, has changed the data landscape. These new sources hold the promise of allowing statistical agencies to produce more accurate, more disaggregated, and more timely economic statistics to meet the needs of policymakers and other data users. This volume documents the progress made toward that goal and the challenges that must be overcome to realize the full potential of Big Data in the production of economic statistics. It will be of interest to statistical agency staff, academic researchers, and serious users of economic statistics.
