The ABCs of RBCs is the first book to provide a basic introduction to Real Business Cycle (RBC) and New-Keynesian models. These models argue that random shocks—new inventions, droughts, and wars, in the case of pure RBC models, and monetary and fiscal policy and international investor risk aversion, in more open interpretations—can trigger booms and recessions and can account for much of observed output volatility.
George McCandless works through a sequence of these Real Business Cycle and New-Keynesian dynamic stochastic general equilibrium models in fine detail, showing how to solve them, and how to add important extensions to the basic model, such as money, price and wage rigidities, financial markets, and an open economy. The impulse response functions of each new model show how the added feature changes the dynamics.
The ABCs of RBCs is designed to teach the economic practitioner or student how to build simple RBC models. Matlab code for solving many of the models is provided, and careful readers should be able to construct, solve, and use their own models.
In the tradition of the “freshwater” economic schools of Chicago and Minnesota, McCandless enhances the methods and sophistication of current macroeconomic modeling.
Advanced Econometrics is both a comprehensive text for graduate students and a reference work for econometricians. It will also be valuable to those doing statistical analysis in the other social sciences. Its main features are a thorough treatment of cross-section models, including qualitative response models, censored and truncated regression models, and Markov and duration models, as well as a rigorous presentation of large sample theory, classical least-squares and generalized least-squares theory, and nonlinear simultaneous equation models.
Although the treatment is mathematically rigorous, the author has employed the theorem-proof method with simple, intuitively accessible assumptions. This enables readers to understand the basic structure of each theorem and to generalize it for themselves depending on their needs and abilities. Many simple applications of theorems are given either in the form of examples in the text or as exercises at the end of each chapter in order to demonstrate their essential points.
This text prepares first-year graduate students and advanced undergraduates for empirical research in economics, and also equips them for specialization in econometric theory, business, and sociology.
A Course in Econometrics is likely to be the text most thoroughly attuned to the needs of your students. Derived from the course taught by Arthur S. Goldberger at the University of Wisconsin–Madison and at Stanford University, it is specifically designed for use over two semesters, offers students the most thorough grounding in introductory statistical inference, and offers a substantial amount of interpretive material. The text brims with insights, strikes a balance between rigor and intuition, and provokes students to form their own critical opinions.
A Course in Econometrics thoroughly covers the fundamentals—classical regression and simultaneous equations—and offers clear and logical explorations of asymptotic theory and nonlinear regression. To accommodate students with various levels of preparation, the text opens with a thorough review of statistical concepts and methods, then proceeds to the regression model and its variants. Bold subheadings introduce and highlight key concepts throughout each chapter.
Each chapter concludes with a set of exercises specifically designed to reinforce and extend the material covered. Many of the exercises include real microdata analyses, and all are ideally suited to use as homework and test questions.
The celebrated economist Zvi Griliches’s entire career can be viewed as an attempt to advance the cause of accuracy in economic measurement. His interest in the causes and consequences of technical progress led to his pathbreaking work on price hedonics, now the principal analytical technique available to account for changes in product quality.
Hard-to-Measure Goods and Services, a collection of papers from an NBER conference held in Griliches’s honor, is a tribute to his many contributions to current economic thought. Here, leading scholars of economic measurement address issues in the areas of productivity, price hedonics, capital measurement, diffusion of new technologies, and output and price measurement in “hard-to-measure” sectors of the economy. Furthering Griliches’s vital work that changed the way economists think about the U.S. National Income and Product Accounts, this volume is essential for all those interested in the labor market, economic growth, production, and real output.
Contributors: John Aldrich, Jeff E. Biddle, Olav Bjerkholt, Marcel Boumans, Chao-Hsi Huang, Robert W. Dimand, Duo Qin, Ariane Dupont-Kieffer, Hsiang-Ke Chao, Aiko Ikeo, Francisco Louçã, Mary S. Morgan, Daniela Parisi, Alain Pirotte, Charles G. Renfro, Thomas Stapleford, Sofia Terlica
Marcel Boumans is Associate Professor of Economics at the University of Amsterdam. Ariane Dupont-Kieffer is a Researcher at the French National Institute of Research on Transport and Safety. Duo Qin is Reader of Economics at the University of London.
This book is a full-scale exposition of Charles Manski's new methodology for analyzing empirical questions in the social sciences. He recommends that researchers first ask what can be learned from data alone, and then ask what can be learned when data are combined with credible weak assumptions. Inferences predicated on weak assumptions, he argues, can achieve wide consensus, while those that require strong assumptions are almost inevitably subject to sharp disagreements.
Building on the foundation laid in the author's Identification Problems in the Social Sciences (Harvard, 1995), the book's fifteen chapters are organized in three parts. Part I studies prediction with missing or otherwise incomplete data. Part II concerns the analysis of treatment response, which aims to predict outcomes when alternative treatment rules are applied to a population. Part III studies prediction of choice behavior.
Each chapter juxtaposes developments of methodology with empirical or numerical illustrations. The book employs a simple notation and mathematical apparatus, using only basic elements of probability theory.
This book provides a language and a set of tools for finding bounds on the predictions that social and behavioral scientists can logically make from nonexperimental and experimental data. The economist Charles Manski draws on examples from criminology, demography, epidemiology, social psychology, and sociology as well as economics to illustrate this language and to demonstrate the broad usefulness of the tools.
There are many traditional ways to present identification problems in econometrics, sociology, and psychometrics. Some of these are primarily statistical in nature, using concepts such as flat likelihood functions and nondistinct parameter estimates. Manski's strategy is to divorce identification from purely statistical concepts and to present the logic of identification analysis in ways that are accessible to a wide audience in the social and behavioral sciences. In each case, problems are motivated by real examples of real policy importance, the mathematics is kept to a minimum, and the deductions on identifiability are derived in ways that yield fresh insights.
Manski begins with the conceptual problem of extrapolating predictions from one population to some new population or to the future. He then analyzes in depth the fundamental selection problem that arises whenever a scientist tries to predict the effects of treatments on outcomes. He carefully specifies assumptions and develops his nonparametric methods of bounding predictions. Manski shows how these tools should be used to investigate common problems such as predicting the effect of family structure on children's outcomes and the effect of policing on crime rates.
Successive chapters deal with topics ranging from the use of experiments to evaluate social programs, to the use of case-control sampling by epidemiologists studying the association of risk factors and disease, to the use of intentions data by demographers seeking to predict future fertility. The book closes by examining two central identification problems in the analysis of social interactions: the classical simultaneity problem of econometrics and the reflection problem faced in analyses of neighborhood and contextual effects.
This outstanding text by a foremost econometrician combines instruction in probability and statistics with econometrics in a rigorous but relatively nontechnical manner. Unlike many statistics texts, it discusses regression analysis in depth. And unlike many econometrics texts, it offers a thorough treatment of statistics. Although its only mathematical requirement is multivariate calculus, it challenges the student to think deeply about basic concepts.
The coverage of probability and statistics includes best prediction and best linear prediction, the joint distribution of a continuous and discrete random variable, large sample theory, and the properties of the maximum likelihood estimator. Exercises at the end of each chapter reinforce the many illustrative examples and diagrams. Believing that students should acquire the habit of questioning conventional statistical techniques, Takeshi Amemiya discusses the problem of choosing estimators and compares various criteria for ranking them. He also evaluates classical hypothesis testing critically, giving the realistic case of testing a composite null against a composite alternative. He frequently adopts a Bayesian approach because it provides a useful pedagogical framework for discussing many fundamental issues in statistical inference.
Turning to regression, Amemiya presents the classical bivariate model in the conventional summation notation. He follows with a brief introduction to matrix analysis and multiple regression in matrix notation. Finally, he describes various generalizations of the classical regression model and certain other statistical models extensively used in econometrics and other applications in social science.
This is a textbook for the standard undergraduate econometrics course. Its only prerequisites are a semester course in statistics and one in differential calculus. Arthur Goldberger, an outstanding researcher and teacher of econometrics, views the subject as a tool of empirical inquiry rather than as a collection of arcane procedures. The central issue in such inquiry is how one variable is related to one or more others. Goldberger takes this to mean "How does the average value of one variable vary with one or more others?" and so takes the population conditional mean function as the target of empirical research.
The structure of the book is similar to that of Goldberger's graduate-level textbook, A Course in Econometrics, but the new book is richer in empirical material, makes no use of matrix algebra, and is primarily discursive in style. A great strength is that it is both intuitive and formal, with ideas and methods building on one another until the text presents fairly complicated ideas and proofs that are often avoided in undergraduate econometrics.
To help students master the tools of econometrics, Goldberger provides many theoretical and empirical exercises and real micro- and macroeconomic data sets. The data sets, available for download at www.hup.harvard.edu/features/golint/, deal with earnings and education, money demand, firm investment, stock prices, compensation and productivity, and the Phillips curve.
Innovative new approaches for improving GDP measurement to better gauge economic productivity.
Official measures of gross domestic product (GDP) indicate that productivity growth has declined in the United States over the last two decades. This has led to calls for policy changes from pro-business tax reform to stronger antitrust measures. But are our twentieth-century economic methods actually measuring our twenty-first-century productivity?
The Measure of Economies offers a synthesis of the state of knowledge in productivity measurement at a time when many question the accuracy and scope of GDP. With chapters authored by leading economic experts on topics such as the digital economy, health care, and the environment, it highlights the inadequacies of current practices and discusses cutting-edge alternatives.
Pragmatic and forward-facing, The Measure of Economies is an essential resource not only for social scientists, but also for policymakers and business leaders seeking to understand the complexities of economic growth in a time of rapidly evolving technology.
Rational Expectations and Econometric Practice was first published in 1981. Minnesota Archive Editions uses digital technology to make long-unavailable books once again accessible; these books are published unaltered from the original University of Minnesota Press editions.
Assumptions about how people form expectations for the future shape the properties of any dynamic economic model. To make economic decisions in an uncertain environment, people must forecast such variables as future rates of inflation, tax rates, government subsidy schemes, and regulations. The doctrine of rational expectations uses standard economic methods to explain how those expectations are formed.
This work collects the papers that have made significant contributions to formulating the idea of rational expectations. Most of the papers deal with the connections between observed economic behavior and the evaluation of alternative economic policies.
Robert E. Lucas, Jr., is professor of economics at the University of Chicago. Thomas J. Sargent is professor of economics at the University of Minnesota and adviser to the Federal Reserve Bank of Minneapolis.
The Solutions Manual to Elements of Econometrics, Second Edition provides chapter-by-chapter solutions to the exercises in the college textbook Elements of Econometrics, Second Edition, by Jan Kmenta.
This book gives a practical, applications-oriented account of the latest techniques for estimating and analyzing large, nonlinear macroeconomic models. Ray Fair demonstrates the application of these techniques in a detailed presentation of several actual models, including his United States model, his multicountry model, Sargent's classical macroeconomic model, autoregressive and vector autoregressive models, and a small (twelve-equation) linear structural model. He devotes a good deal of attention to the difficult and often neglected problem of moving from theoretical to econometric models. In addition, he provides an extensive discussion of optimal control techniques and methods for estimating and analyzing rational expectations models.
A computer program that handles all the techniques in the book is available from the author, making it possible to use the techniques with little additional programming. The book presents the logic of this program. A smaller program for personal microcomputers for analysis of Fair's United States model is available from Urban Systems Research & Engineering, Inc. Anyone wanting to learn how to use large macroeconomic models, including researchers, graduate students, economic forecasters, and people in business and government both in the United States and abroad, will find this an essential guidebook.