
Uncertainty in Economic Analysis and
the Economic Analysis of Uncertainty
Lars Peter Hansen, University of Chicago
Real knowledge is to know the extent of one’s ignorance.
—Confucius
When I think about knowledge, I find it virtually impossible to avoid thinking about uncertainty. Uncertainty adds a new dimension to discussions of knowledge, one that is especially important in economic analyses when we seek a better
understanding of markets and the resulting outcomes for society and
quantitative answers to important policy questions. It has been im-
portant in my research and also more generally in economic scholar-
ship to take inventory, not only of what we know, but also of the gaps in this knowledge. Thus, part of economic research assesses what we
know about what we do not know and how we confront what we do
not know. Uncertainty matters not only for how economic research-
ers interpret and use evidence, but also for how the consumers and
enterprises that we seek to model confront the future.
Modeling systems as they play out over time is pervasive in many
scientific disciplines, including economics. To use this approach to
address real-world problems requires that we impose a specific struc-
ture on the models, guided by economic analysis and empirical evi-
dence. Econometrics is the subfield of economics in which models
grounded in economic theory are considered in light of real-world
measurements. The productive examination of these models requires
adaptation and modification of statistical methods to understand bet-
ter their successes and defects. My own research interests explore how
to build and assess the implications of dynamic economic models by
developing and applying statistical methods for analyzing time series
data. The dynamic models are necessarily abstractions and purpose-
fully simplified along some dimensions. They are wrong by choice of
the builder, yet they aim to be revealing. The evidence that I (and
others) often use for studying dynamic economic models is quite nat-
urally time series data. These data are spaced over time by taking snapshots or averages of measures of macroeconomic outcomes and finan-
cial market returns. Especially recently, important extensions have
included the time series measurements of the distributions of the
actions of consumers and enterprises as they interact in markets. In
the aggregate, their actions affect economic outcomes, and some mod-
els incorporate these distributional dynamics in nontrivial ways. This
field and my own research necessarily confront uncertainty in multiple
ways. Over the course of my career, my perspective and focus have
shifted toward broader notions of uncertainty and their consequences
for market outcomes and prudent policy making. At the same time, it
has been advantageous to bring in, adapt, and modify insights from
other disciplines, including statistics, decision theory, and control theory.
Quantitatively oriented economists build what are called structural
models. This ambition takes us beyond the pure forecasting problems
that occupy considerable attention in the private sector. In his Nobel
address, Milton Friedman wrote, “Positive scientific knowledge that
enables us to predict the consequences of a possible course of action
is clearly a prerequisite for the normative judgement whether that
course of action is desirable.” 1 Notice that Friedman refers to a “pos-
sible course of action.” This requires making predictions that are pos-
sibly outside of the range of the historical data and hence for which
we may not have direct evidence on economic impacts of the possible
courses of action of interest. Economists refer to these predictions as
“counterfactual” predictions as distinct from just forecasting the fu-
ture without entertaining alternative policies with economic conse-
quences. Structural models aim to make counterfactual predictions,
and they do so by using formal economic models in conjunction with
data that support the exploration of alternative courses of action.
These models are meant to inform us as to what happens when we
explore changes in economic policy, such as government subsidies
or taxes, that are outside the realm of historical experience. Such
models also aim to help assess the economic consequences when
we entertain alternative monetary or fiscal policies and when we con-
sider the impact of governmental oversight of financial markets and
other forms of regulation. These are policy questions with conse-
quences to the entire economic system. The counterfactual predic-
tions are an explicit form of policy analysis for an interdependent sys-
tem typical of dynamic economic models. They are meant to answer
policy-relevant questions, but to do so rigorously, they require clear
statements of what is maintained as constant or invariant when we
alter other parts of the system as stipulated by the policy under con-
sideration. 2 The credible development and application of structural
dynamic models in economics relevant for policy analysis remain
an important research challenge. This development includes incor-
porating uncertainty in both the model development and the answers
that the models are used to provide. The remainder of my essay dis-
cusses this challenge.
I. Uncertainty in Economic Dynamics
Counterfactual predictions are most appropriately framed by using
probabilities. While we might wish that a counterfactual prediction
be a simple number, this is typically not a credible ambition. One
source of uncertainty confronted in economics is “external random
impulses” or “shocks” that are taken as exogenously specified model-
ing inputs aimed at capturing unanticipated changes from outside
the economic system being modeled. Decades ago, Ragnar Frisch fea-
tured dynamic economic models characterized by the transmission
of random impulses over time to economic variables of interest. 3 Fol-
lowing on the insights of previous scholars such as Eugen Slutsky,
random or unanticipated changes to the economic environment have
influences that persist over time. 4 Random changes in the weather
can have a lasting impact on agricultural production. Random changes
in technology, including, say, information technology, take time to
fully absorb and exploit, and they can have durable impacts on the
economic system. Formalizing these surprise changes as random im-
pulses when incorporated into an economic model makes predic-
tions probabilistic. This gives one formal way for the builders and
users of dynamic economic models to incorporate uncertainty. Given
that the inherent random impulses have lasting impact, the resulting
modeling outcome is a stochastic process with temporal dependence
in the economic variables. By assumption, we cannot know in advance
the outcome of these random shocks. An additional source of uncer-
tainty emerges because we only know model inputs imperfectly. This
source is often and conveniently captured by so-called subjective probabilities. Observations that we accumulate over time and from a variety
of sources help in learning or resolving this specification uncertainty,
but this learning may occur slowly.
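To make the impulse-and-propagation idea concrete, here is a minimal sketch in Python; the first-order autoregressive specification and its persistence coefficient are illustrative choices of mine rather than anything taken from the essay. A single unanticipated shock continues to influence outcomes for many periods, which is the sense in which random impulses have lasting impact.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative AR(1) transmission mechanism in the spirit of Slutsky and
# Frisch: y_t = rho * y_{t-1} + shock_t. Each unanticipated shock keeps
# influencing future outcomes, decaying geometrically at rate rho.
rho = 0.9   # hypothetical persistence parameter
T = 40
shocks = rng.normal(size=T)

y = np.zeros(T)
for t in range(1, T):
    y[t] = rho * y[t - 1] + shocks[t]

# Impulse response: a one-unit shock at date 0 moves y_t by rho**t,
# a lasting but fading impact that also induces temporal dependence.
impulse_response = rho ** np.arange(T)
print(impulse_response[:5])  # [1.0, 0.9, 0.81, 0.729, 0.6561]
```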
In spite of a strong conceptual basis for structural models, some
critics go so far as to dismiss the quantitative aspiration of structural
models as hopeless. 5 At the very least there is a justifiable concern
that our quantitative models might miss something important, as
they are at best rough approximations to a more complex economic
system.Therearemultiplecomponentstouncertainty ineconomicanal-
yses, and it is a challenge for researchers to characterize the full na-
ture and magnitude of these components.
Economic models contain people making decisions often in the
presence of uncertainty. For instance, any investment in human or fi-
nancial capital requires a forward-looking perspective to determine
the nature and magnitude of the investment. Even in modern-day farm-
ers’ markets, suppliers confront uncertainty by deciding how much to
bring to the marketplace, and they make guesses as to the likely de-
mands for their goods. For tractability, economists are led to embrace
simplified models of decision making for how individuals cope with this
uncertainty, recognizing that they are at best approximations. Such
models are well understood not to do justice to the full set of insights
from the psychology of individual decision-making.
One relevant concept in decision-making is the risk-aversion par-
adigm commonly used in economic analyses that endows decision
makers with known probabilities over possible events that can be re-
alized in the future. But market environments can be complex, and
this complexity makes it challenging to assign probabilities when us-
ing a risk-aversion model to capture individual behavior. While some
model builders may prefer to use so-called rules of thumb in structural economic models, these rules of thumb still must specify how
these rules adapt to the environmental complexity and changes as we
explore alterations in the underlying economic environment. The eco-
nomic analysis of uncertainty becomes a central ingredient in the construc-
tion of dynamic economic models. It has ramifications for prices that
clear markets and for how resources are allocated through the use of
these markets. Instead of positing rules of thumb, I will explore more dis-
ciplined ways to extend the elegant and valuable risk-aversion model
used pervasively in economics. In less formal terms, imagine entertain-
ing multiple views (in my case, models) of the economic system with un-
certainty about which might be the best one. These views are relevant
because they provide inputs into the forward-looking decisions we
make. Instead of committing to just one view, all might be considered,
but with different weights attached to their validity. The choice of weights
may not be obvious and in fact may even be influenced by the implica-
tions of the alternative views. Taking this one step further, add in an ac-
knowledgment that each of the possible views is a simplified guess and
not a fully complete or accurate picture of the economic system. To con-
nect to the formalization that I use in this essay, think of views as models
with implied probabilities of outcomes. How to weight the predictions
of these models and to capture their potential limitations adds to un-
certainty about opportunities decision makers might face in the future. I take such considerations to be pervasive and applicable to individuals,
to businesses, and to the design and conduct of economic policy.
II. Formalizing the Components of Uncertainty
Scholars have long wrestled with uncertainty and its consequences,
and I find it valuable to draw on some of their perspectives. For in-
stance, initial contributions of probability as an application of math-
ematics were to games of chance, such as flipping coins, throwing
dice, drawing colored balls randomly from an urn with a known num-
ber of each contained in the urn, and complex extensions of such
games. For coin flipping, we are confident in a 50-50 chance of the coin
coming up heads, and when rolling one die we are comfortable with
presuming that there is a one in six chance of rolling a five. The for-
malization of probability in conjunction with games of chance has a
long history. The study of potentially complicated games of chance
drew in eminent mathematicians, including Blaise Pascal and Pierre
de Fermat in their famed exchange about the so-called “problem of
points” or “division of stakes.” The analysis proceeded with given or
prespecified probabilities. This component of uncertainty where we
know probabilities but not outcomes is what I will call risk within a
model, building on a distinction made by Frank Knight and others. 6 I
include "within a model" to remind us that we take the probabilities as given. The case of known probabilities is a key part of how economists and others conceive of risk aversion. In dynamic contexts, the
random impulses that I mentioned previously when modeled for-
mally with probability specifications provide sources of macroeco-
nomic risk confronted by individuals, markets, and governments.
An original contributor to the use of probability theory for the anal-
ysis of social science data is Jacob Bernoulli, one in a family of math-
ematicians, over three hundred years ago (see fig. 1). His discovery is
the Law of Large Numbers, along with some refinements. His funda-
mental result characterizes how unknown probabilities are revealed
by repeated sampling, say, from an urn with an unknown fraction
of white and red balls. Bernoulli was not motivated by games of
chance but instead by the application of probability theory to repre-
sent and understand social scientific data. These are data in which
probabilities are unknown ex ante and revealed only imperfectly
by actual data. These probabilities or their implications are presumed
targets of the empirical investigation. 7
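A minimal simulation of Bernoulli's urn makes the point concrete; the "true" fraction below is an arbitrary illustrative value that the simulated observer does not know.

```python
import numpy as np

rng = np.random.default_rng(1)

# An urn with an unknown fraction of white balls. The value below is
# known only to the simulation, not to the observer doing the sampling.
true_fraction_white = 0.3

# Repeated sampling with replacement: by the Law of Large Numbers, the
# empirical frequency of white draws converges to the unknown fraction.
for n in [10, 100, 10_000, 1_000_000]:
    draws = rng.random(n) < true_fraction_white
    print(n, draws.mean())
```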
Bernoulli confronted a common situation in which we do not know
probabilities but seek to learn about them. Sometimes this learning
occurs so quickly as to reveal the answer we seek, but often not. This
is why we have a field of statistics to study more complicated ver-
sions of the question that intrigued Bernoulli. For the purposes of this
essay, the conceptual contributions of Bruno de Finetti and Leonard
Savage stand out. 8 They provided a framework for subjective probabil-
ity. If you take n draws from an urn with an unknown fraction of balls,
subjectivists argue that draw n + 1 should not be viewed as inde-
pendent of the previous draw because this draw will be informative
about the unknown probability. Statistical independence as commonly used in model building is a conditional statement, one that conditions on
the actual probability. Bernoulli’s calculations were made conditioned
on the probability of, say, the fraction of white balls in the urn, treating
draw n + 1 as independent of draw n. To complete the probability
specification from the de Finetti and Savage perspectives requires a
"subjective probability" (prior) over the possible fractions, which induces
a form of dependence, but it also allows for the formal probabilistic
statement of what we know about the fractions of white balls after ob-
serving n draws from an urn. While I use urns in this illustration, what
really interested Bernoulli was what we can learn from data about the
probabilities of outcomes.
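As an illustrative sketch of this completion, the conjugate Beta-Binomial case (my choice for tractability; the prior and data below are hypothetical) shows how a subjective prior over the unknown fraction both induces dependence across draws and yields a formal probabilistic statement after n observations.

```python
from scipy.stats import beta

# Subjective prior over the unknown fraction of white balls: Beta(a, b).
# A uniform prior (a = b = 1) is one illustrative choice.
a, b = 1.0, 1.0

# Suppose n draws from the urn yield k white balls.
n, k = 20, 6

# Conjugate Bayesian update: the posterior is Beta(a + k, b + n - k).
posterior = beta(a + k, b + (n - k))

# Predictive probability that draw n + 1 is white: the posterior mean.
# It depends on the observed draws, so unconditionally the draws are
# dependent (exchangeable), even though they are independent *given*
# the true fraction -- the conditioning Bernoulli implicitly imposed.
prob_next_white = (a + k) / (a + b + n)
print(prob_next_white)           # 7/22, roughly 0.318
print(posterior.interval(0.95))  # what we "know" about the fraction
```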
More generally, when external analysts such as econometricians
are unsure which among a family of possible models is correct, sub-
jective probability suggests that we assign weights to the alternative
models. Given an initial weighting, we open the door to the elegant
Bayesian approach to learning. I use the term ambiguity about a model
for the component of uncertainty that pertains to how we assign
weights across alternative models.

[Figure 1. Two perspectives on uncertainty: the left part of the figure depicts Bernoulli as a statistician looking at a marketplace from the outside, seeking empirical evidence to analyze. The right part gives a painting by Pissarro, completed in 1898, in which participants inside the marketplace in Rouen face uncertainty about the demands for their goods and the prices they might receive for these goods. The original of the Pissarro painting is in the Metropolitan Museum of Art, New York.]

While de Finetti and Savage were
both proponents of subjective probability, both also acknowledge the
challenge of doing this in practice. This challenge is the impetus for
robust Bayesian methods that explore the sensitivity of the analysis to sub-
jective probability inputs. For instance, probabilities of the potential
outcomes or predictions of interest could be very sensitive to the
initial subjective weighting of alternative models. A robust Bayesian
seeks to characterize that sensitivity. 9
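A minimal sketch of that sensitivity characterization, continuing the hypothetical urn example from above: sweep over a family of priors and report the induced range of the prediction of interest rather than any single number.

```python
# Robust Bayesian sensitivity check: vary the Beta(a, b) prior and
# record the predictive probability that the next draw is white.
# Priors and data are illustrative, not calibrated to anything.
n, k = 20, 6  # hypothetical draws and white-ball count, as before

predictions = []
for a, b in [(0.5, 0.5), (1, 1), (2, 2), (8, 2), (2, 8)]:
    predictions.append((a + k) / (a + b + n))

# A robust Bayesian reports the whole range, not a point estimate.
print(min(predictions), max(predictions))  # roughly 0.27 to 0.47
```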
Models in economics and elsewhere derive their value in part from
their simplifications or abstractions. They are necessarily wrong or
equivalently misspecified along some dimensions. However, this ob-
servation by no means destroys their value. In economic applications,
this misspecification is often transparent, and we hope that it does not
distort too much the answer to the questions we address. But the po-
tential for model misspecification gives a third component to uncertainty,
one that is perhaps the most difficult to address or quantify. Some of
the more interesting attempts to address this challenge come out of
the extensive literature on robust control theory. An example that I
found to be particularly revealing and valuable in my own research is
a paper by Ian Petersen, Matthew James, and Paul Dupuis, where there
is uncertainty about how to specify the probabilities for the outcomes
of the random shocks. 10 As I noted previously, following Slutsky and
Frisch, 11 these random shocks are pervasive in modeling economic
time series. Uncertainty about the probabilities of these random shocks
includes an incomplete understanding of intertemporal dependencies
in the constructed dynamic economic models. Recognizing the limita-
tions of the existing models alters their prudent usage.
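One tractable device from that literature can be sketched in a few lines. When concern about misspecification is expressed as a penalty on relative entropy, the worst-case probabilities have a known exponential-tilting form; the baseline probabilities, utilities, and penalty parameter below are illustrative inputs of mine, not numbers from the cited paper.

```python
import numpy as np

# Worst-case tilting under a relative-entropy penalty: minimizing
# E_q[u] + theta * KL(q || p) over distributions q gives
# q_i proportional to p_i * exp(-u_i / theta).
p = np.array([0.25, 0.50, 0.25])  # baseline model probabilities
u = np.array([1.0, 2.0, -3.0])    # utility of each outcome (hypothetical)
theta = 1.0                       # penalty parameter; a smaller theta
                                  # expresses more distrust of the baseline

weights = p * np.exp(-u / theta)
q = weights / weights.sum()       # worst-case probabilities

print(q)             # probability shifts toward the low-utility outcome
print(p @ u, q @ u)  # expected utility: baseline vs. worst case
```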
III. Who Confronts Uncertainty?
Like others, I think of uncertainty from two vantage points, both of
which are important in building, assessing, and using dynamic eco-
nomic models. 12 One perspective is that of researchers who estimate
some unknown parameters, just as Bernoulli envisioned, and they assess or test the model implications. I call this the perspective of an external analyst, one who comes from outside the models and evaluates them based on evidence or prior judgment. This is the typical vantage
point of the discipline of statistics, and a rich array of methods has
been developed with this in mind.
Economists’ models include economic agents making decisions.
For instance, investment decisions are in part based on people's views
of the future possible benefits. Decisions on how much to produce
when production takes time depend in part on perceived prices or
economic rewards for selling the goods in the future. Once economic
decision makers are included in formal dynamic economic models,
their expectations come into play and become an important ingredi-
ent to the model as well as the uncertainties they confront. This chal-
lenge was well-appreciated by economists such as Arthur Pigou, John
Maynard Keynes, and John Hicks. Thus, economic agents inside the
models that economists build face challenges that bear similarity to
those of statisticians. What are sensible ways to forecast the future,
and how much confidence should we have in those forecasts?
When building models, some researchers make simplistic connections to psychology, but we make no pretense of capturing all of
the psychological complexities faced by individuals in different situ-
ations. We make bold simplifications to keep the analysis of the inter-
dependent system tractable. An elegant, pervasively used simplifica-
tion is the imposition of rational expectations. This is an equilibrium
construct that imposes model-consistent beliefs on the individuals
inside models. This approach was initiated within macroeconomics
by John Muth and Robert Lucas. 13 Following Lucas’s paper, in partic-
ular, rational expectations became an integral part of an equilibrium
for a stochastic economic model. This approach makes the analysis of
risk aversion tractable and provides an operational way to analyze
counterfactuals using dynamic economic models. There is a direct extension of the rational expectations paradigm that includes unknown
parameters or states confronted by economic agents using subjective
probabilities and Bayesian learning. The rational expectations hy-
pothesis and its extension to Bayesian learning abstract from ambigu-
ity about subjective inputs and concerns about potential model mis-
specification.
A substantial literature has evolved on econometric implications
of dynamic models with rational expectations with a variety of differ-
ent implementations. One important line quantifies the impact of
alternative shocks featured originally by Slutsky and Frisch to the
macroeconomy by inferring these shocks from data and measuring
how they are transmitted to the macroeconomy. 14 An initial impor-
tant contributor to this extensively used approach is Christopher
Sims. 15 An empirical counterpart to rational expectations is implicit
in much of this work, as the shocks that are identified through econo-
metric methods are also the ones pertinent to the economic system
being analyzed. A complementary approach imposes more a priori
structure on the underlying transition mechanisms while imposing
rational expectations in deriving and assessing testable restrictions
on the data generation. 16 As featured in “Nobel Lecture: Uncertainty
Outside and Inside Economic Models,” I along with several coauthors
explored and applied a third approach aimed at studying part of a dy-
namic economic system while seeking to be agnostic about the rest. 17
Even though the implicit model of the economy was that of an inter-
related dynamic system, it proved advantageous to have econometric
methods that allow the researcher to “do something without doing
everything.” My own interest focused on the implied linkages be-
tween the macroeconomy and financial markets. The featured rela-
tions captured the forward-looking investment decisions of indi-
viduals and enterprises. This approach also imposed an empirical
counterpart to rational expectations by, in this case, presuming
the beliefs of the economic agents are consistent with historical
time series data. Although not their original aim, empirical investi-
gations, including my own, produced characterizations of empirical
puzzles rather than confirmation of models. This pushed me and others
to think harder about the potential for model misspecification and
its consequences. If I, as a researcher, have to struggle in selecting
good models of the economy, perhaps the people inside the models
that I study face similar challenges. Thinking about uncertainty in
broader terms became an attractive extension of the rational expec-
tations perspective.
The perceived complexity of the economic environment alters how individuals make forward-looking decisions. This is self-evident from
statistical decision theory and looks equally pertinent to external analysts as well as to the economic decision makers in the models
we build. I find the tools of decision theory and statistics to be valuable
in thinking about both challenges. It is easier to imagine behavioral
anomalies persisting in complex environments in which model selec-
tion is known to be truly challenging even for sophisticated statisti-
cians. There is a rather extensive literature on decision theory under
uncertainty that draws on insights from economics, statistics, and
control theory that are valuable guides for thinking through such is-
sues. 18 My own research and applications have found value in the axiomatic approaches common in economics, the more practically oriented control theory methods, and the insights from applied probability theory that feature characterizations of statistical complexity and resulting difficulties in learning from evidence. Decision theory provides
two attributes relevant for building and using dynamic economic mod-
els. It gives a formal language to discuss decision making in an uncertain
environment, and it provides justifications for tractable ways to repre-
sent preferences to be used in formal statements of decision problems.
It is challenging to understand financial markets using the risk-
aversion model under rational expectations. Asset pricing theory in-
forms us that it is the exposure to macroeconomic risks that requires
market compensation. These are the risks that cannot be diversified
by averaging over large cross-sections of exposures. The risk com-
pensations are sometimes observed to be large and puzzling. More-
over, in some of the existing economic models, exposure to long-
term macroeconomic risks can have even short-term consequences
for financial markets. 19 Thus, the models implicitly impose a burden
on investors inside the model to assign credible probabilities to
events that will only be realized far into the future. Motivated in part
by the empirical shortcomings that I mentioned previously, a litera-
ture is emerging that uses advances in decision theory to study the
impact of uncertainty, broadly conceived, on market prices and the
resulting outcomes. Adding in components of uncertainty other than
risk provides a different perspective on this evidence. For instance,
economists currently debate the possibility of a permanent “secular
stagnation” in the macroeconomy. By the term secular stagnation,
economists refer to the possibility that future growth rates will be on
average smaller than past ones. The alternative views of the pros-
pects for permanently sluggish growth may be conceptualized as
alternative models of the economy with uncertainty as to which of
these views gives the best approximation. Uncertainty of this na-
ture spills over to private sector investor decisions and financial mar-
ket returns. A broad perspective on uncertainty adds a richness to
how we capture investor behavior inside economic models. Investor
struggles in the presence of ambiguity aversion or concerns with
model misspecification aid our understanding of why financial mar-
kets reflect more caution in bad macroeconomic times than in good
times. 20
This more general perspective on uncertainty also provides a way
to capture investor confidence. A fully confident investor may commit completely to a single model, while a less confident investor may en-
tertain multiple models with uncertainty as to how to weight them or
suspect each of them to be, at best, a coarse approximation (and
therefore misspecified). Such a formulation could also be a way to in-
troduce investor heterogeneity in economic models, heterogeneity
that captures differences in how confident investors are in their views
of the future. I next explore how a more sober perspective on uncer-
tainty could enrich the analysis of prudent policy design.
IV. Uncertainty and Policy
The connection between uncertainty and incomplete knowledge and
the design of economic policy has long been discussed in informal
ways. If structural econometric models are to provide quantitative inputs into decision-making, how will uncertainty alter how these mod-
els should be used as formal guides for policy making? The impact of
uncertainty has been recognized by scholars, but less so when econ-
omists play advisory roles. Indeed, years ago when Friedrich Hayek
wrote on the pretense of knowledge, he warned of the dangers of try-
ing to satisfy what the public seeks: "Even if true scientists should rec-
ognize the limits of studying human behaviour, as long as the public
has expectations, there will be people who pretend or believe that
they can do more to meet popular demand than what is really in their
power.” 21
From my standpoint, there are two elaborations of these state-
ments that intrigue me. First, I am inclined to think in terms of uncer-
tainty in our understanding of human behavior and its economic con-
sequences. Second, I am concerned about unproductive policies
premised on a projected overconfidence in a particular model or per-
spective of the economic system. Going further, I see at least two in-
terrelated questions:
(a) Does incomplete knowledge or understanding of complicated
policy problems enhance the appeal of simple solutions?
(b) How socially detrimental is complexity in policy implementa-
tion in light of the resulting uncertainties faced by the private
sector?
Regarding the first question, with a complete and confident under-
standing of an interdependent complex economic system, we might
well be led to embrace a complex policy to improve social well-being.
How does this perspective change when our understanding is incom-
plete, and does uncertainty or incomplete knowledge make simple
solutions to complex problems more appealing? 22
Decades ago, Friedman made reference to “long and variable lags”
in the mechanism by which money influences prices and the macro-
economy. 23 He used this observation to argue for simple policy rules
instead of more ambitious attempts at more subtle management of
the macroeconomy. The reference to long and variable lags was a
statement of skepticism about the knowledge needed to credibly im-
plement a more complicated policy rule. Monetary policy is different
now from when Friedman was writing, and some of Friedman’s own
perspectives on monetary transition mechanisms have since been
challenged in important ways. 24 But Friedman’s concern that there
will be unproductive outcomes induced by overstating our under-
standing of a basic mechanism continues to be relevant to current-
day macroeconomic policy making. Learning more about the eco-
nomic system potentially opens the door to more reliable policy levers,
but there remains an important task: to assess when the uncertainty
is sufficiently resolved to justify a more finely tuned approach to
the conduct of policy.
Regarding the second question pertaining to policy and complexity,
part of the practical ramifications of complexity in the design of policy
is to provide additional flexibility to policy makers in their implemen-
tation. For instance, it might well be desirable that policy authorities
have some discretionary powers in times of crisis or extreme events
that were not appropriately planned for. But this same complexity bur-
dens the private sector as it is left guessing about implementation in
the future. Counterproductive aspects of regulatory discretion are known
from the important work of George Stigler and others. 25 A different
twist on discretion occurs in the dynamic macroeconomic policy set-
ting. When policy makers are unable to make long-term commitments,
there is a repeated temptation for them to act in discretionary ways.
Finn Kydland and Edward Prescott have studied the resulting adverse
consequences relative to rule-based commitments. 26 Both contribu-
tions are fundamental, but my interest in this essay is to add to this dis-
cussion by suggesting an interplay between complexity and uncertainty.
Going beyond these two questions, I find it both attractive and
challenging to provide a more systematic analysis of uncertainty
and its consequences for the design and conduct of policy. In what fol-
lows, I will talk briefly about two policy challenges for which I find a
broad perspective on uncertainty to be revealing. No doubt each one
deserves its own essay or, more likely, a treatise; but let me at least place
them on the radar screen of readers to provide some more specific
context to my discussion.
A. Financial Market Oversight
The term “systemic risk” has shown up prominently in the academic
literature and in discussions related to financial market oversight
since the advent of the financial crisis. Prior to the crisis, the term
was rarely used. Mitigating systemic risk is a common defense under-
lying the need for macro-prudential policy initiatives. How to design
and implement such policies remains an open question. When it
comes to systemic risk, perhaps we should defer and trust our governmental officials engaged in regulation and oversight to "know it when
they see it,” but this opens the door to counterproductive regulatory
discretion and policy uncertainty. 27
I have written previously on the challenges in identifying and mea-
suring systemic risk. 28 There, I argue for thinking more broadly in
terms of systemic uncertainty instead of the more narrow construct
of risk. I exposit some of the many challenges that are pertinent
to building quantitative models to support the conduct of macro-
prudential policy. While I am an enthusiastic supporter of model de-
velopment in this area, currently we face counterparts to Friedman’s
concerns about long and variable lags because of our limited under-
standing of the underlying phenomenon. People on the front lines
of policy making have also noted important limitations both in our
understanding of systemic risk and in making it a guiding principle
for financial oversight. 29 How best to provide governmental oversight
of financial markets is arguably a hard and complex problem. Given
limitations in our knowledge base, it is not at all apparent that a com-
plex solution is the best course of action. Friedman's appeal for simple
and transparent rules for monetary policy may be equally applicable
to the design and conduct of macro-prudential policy.
B. Climate Economics
Federal agencies use estimates of the “social cost of carbon” to assess
the climate impacts of various programs and regulations. Economists
applaud cost-benefit analysis, and the aim to be numerate looks at-
tractive. The current computations come from simulations from
alternative models of the interplay between the climate and the eco-
nomic system. There is a weighting across models, a reported sen-
sitivity to the choice of the discount factor used in computing present-value measures, an attempt to make probabilistic statements, and an
acknowledgment of some omissions in the measure of climate dam-
ages. This all has the appearance of good quantitative social science
in action. Unfortunately, the current calculations also abstract from
some critical sources of uncertainty about the timing and magnitude
of how human inputs influence the climate, and they run the danger
of conveying a deeper understanding than truly exists.
Let me start with the basic construct. How useful is it to think of the
social cost of carbon divorced from the benefits? How far can we push
microeconomic reasoning without thinking through the macroeco-
nomicsystem-wideconsequences? Thus,itisnotcleartomeconcep-
tually what should be meant by the social cost of carbon net of bene-
fits and system-wide implications. We can determine what is actually
measured by opening the hoods, so to speak, of the models used to
generate the computations. By so doing, we obtain at least partial answers to these questions.
But let’s take a step back. First, while basic physical considerations
play important roles in the construction of climate models, there are
important gaps in the ability to translate these insights into reliable
quantitative predictions. Second, once merged with economic com-
ponents, the carbon-temperature linkage is dramatically simplified
for reasons of tractability with only limited understanding of the con-
sequences of this simplification. Third, it is well known from the the-
ory of asset pricing that there should be an important link between
uncertainty and discounting when computing intertemporal valua-
tions that balance off costs over time. Thus, the uncertain social im-
pact of carbon in the future should alter the stochastic discounting
of inputs used to measure the net social cost of carbon. While so-
called local or small changes are amenable to stochastic counterparts
to the discount formulations used in deterministic cost-benefit anal-
yses, more global changes in policy require more comprehensive cal-
culations. 30 These three points just scratch the surface of some truly
important modeling challenges that climate scientists and econo-
mists continue to grapple with. 31 Perhaps the most productive out-
come of regulatory discussions of the social cost of carbon is the nur-
turing of future research in this important area rather than the actual
reported numbers.
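As one small numerical illustration of the third point, the link between uncertainty and discounting, consider discounting a damage far in the future when the discount rate itself is uncertain; all numbers below are made up. By Jensen's inequality, averaging discount factors across uncertain rates gives distant damages more present weight than discounting at the average rate does.

```python
import numpy as np

rng = np.random.default_rng(2)

# A damage arriving T years out, discounted at an uncertain (but
# constant-per-path) rate r. The distribution of r is illustrative.
T = 100.0
rates = rng.normal(loc=0.03, scale=0.01, size=1_000_000)

discount_at_mean_rate = np.exp(-rates.mean() * T)  # about exp(-3)
mean_discount_factor = np.exp(-rates * T).mean()   # about exp(-2.5)

# Jensen's inequality: E[exp(-r*T)] > exp(-E[r]*T), so rate uncertainty
# raises the present weight placed on far-off damages.
print(discount_at_mean_rate, mean_discount_factor)
```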
Acknowledging uncertainty and our limits to understanding does
not imply a call for inaction. Asking which aspects of the uncertainty we find to be most consequential to society helps us to better frame a discussion of policy making in the future. The possibility of
major adverse impacts of, say, carbon emissions on the climate can
suffice for justifying policy responses such as carbon taxation or cap
and trade. Even though we are uncertain as to the magnitude, timing,
and climate impacts from carbon emissions, this alone does not ratio-
nalize a wait-and-see attitude. Indeed, it may well be less costly so-
cially to act now than to defer policy responses to the future. Such trade-
offs are of critical importance to explore, and this exploration is best done cognizant of the limits in our understanding and the uncertainty in our analyses.
V. How Might Decision Theory Contribute?
Arguably, Pascal’s wager about the existence of God is an initial for-
malization of decision theory. This dramatic illustration shows the
role of assessing consequences of actions in the face of uncertainty
when determining the rational decision. Indeed, as formulated in
Pascal’s example, the best course of action, behaving as if God exists,
is independent of any probabilistic detail beyond the possibility that
God might exist. More generally, decision theory captures the impor-
tant and sometimes subtle interplay between how uncertain we are
about the future and the consequences of alternative actions we
might take that affect that future.
I have already discussed how decision theory targeted to broad
notions of uncertainty helps us to understand better the behavior
of financial markets. In concluding this essay, let me discuss how
decision theory can also help shape discussions of prudent policy
making. Some of the insights from decision theory will appear to
be self-evident and of little surprise, but the formalism is still of con-
siderable value in both building and using models. Some examples
of unsurprising insights include the following. When we are unsure
about some modeling inputs, it makes good sense to perform a sen-
sitivity analysis by computing the consequences of changing the in-
puts. The target of this analysis should be the potential conse-
quences that the decision maker truly cares about.
A sensible decision or a good course of action is one that performs
relatively well across a range of model specifications. I will refer to this property as "robustness." When there are multiple models to consider
and we are unsure of how to weight them, decision theory pushes us
to ask what the consequences are of a course of action under each of
the possible weightings of models that are entertained. Policies that
work well under the alternative models become attractive even if they
cease to be the best course of action under any of the specific models.
Unless we have a compelling a priori way to weight models, caution or
aversion to ambiguity translates into looking at the adverse conse-
quences of alternative possible weighting schemes in evaluating al-
ternative policies. Potential misspecification can be conceptualized
similarly but places an extra and perhaps unwieldy burden in guess-
ing the myriad ways the models might be wrong. If the ways that a model could be wrong are small in scope and easy to delineate, then
presumably it would be tractable and preferable to fix the models
rather than just acknowledge the potential flaws. I am not claiming
that confronting potential model misspecification is an easy task,
but I am suggesting that it not be forgotten. As I have already men-
tioned, robust control theory has already produced some tractable
and revealing ways to confront model misspecification in dynamic
settings. The survey paper by Massimo Marinacci and myself describes
research, including some of my own with Thomas Sargent and others,
thatbuildsonsomeoftheinsightsfromcontroltheoryandincorporates
them formally into decision theory and economic analysis. 32
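A minimal numerical sketch of this max-min logic, with a payoff matrix that is entirely hypothetical: the robust choice is attractive even though it is the best action under no single model.

```python
import numpy as np

# Rows are candidate policies, columns are candidate models; entries
# are hypothetical payoffs.
payoffs = np.array([
    [4.0, 4.0, -5.0],   # strong under model 0, disastrous under model 2
    [2.0, 2.0,  1.5],   # never the best under any single model
    [1.0, 6.0, -1.0],   # best under model 1
])

# A fully confident Bayesian with prior weights w maximizes the
# weighted-average payoff; a robust decision maker guards against the
# least favorable model by maximizing the minimum payoff.
w = np.array([1 / 3, 1 / 3, 1 / 3])
bayes_choice = np.argmax(payoffs @ w)           # policy 2 under equal weights
robust_choice = np.argmax(payoffs.min(axis=1))  # policy 1

print(bayes_choice, robust_choice)
```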
Applying decision theory sharpens the questions and frames the
analysis, but it is not a panacea that makes prudent decision-making
necessarily easy. For instance, even with decision theory I am un-
aware of any general proposition linking incomplete knowledge of a
social or economic problem to the desirability of a simple course of
action. However appealing this link may seem, my guess is that justi-
fying it formally may turn out to be context specific and may depend
on the details of the actual policy problem.
The decision-theoretic approach raises interesting challenges about
how to communicate uncertainty in a policy realm. A robust statisti-
cian might just report ranges of potential probabilities for important
outcomes that are computed by looking across alternative ways to
weight model implications. This, as you might imagine, can quickly
overwhelm the attempt to communicate uncertainty. For decision
problems with a sufficiently nice mathematical structure, there are
so-called “worst-case” weighting schemes that depend on the details
of the decision problem, including the delineation of potential ways
to weight the models that are of interest. By construction, the so-
called robust course of action is actually the best course under this worst-case weighted family of models. This result essentially defines what is meant by the worst case, and it is derived as part of the solution to the
decision problem. In other words, it is an outcome of the decision
problem and not a hardwired input to that problem. This worst-case
weighting reflects caution induced by adopting a broad notion of un-
certainty. A fully committed Bayesian would only entertain one such
weighting scheme as implied by the subjective prior probabilities im-
posed on the problem. The chosen robust course of action, however,
would agree with that of a Bayesian fully committed to the worst-case
weighting.
The worst-case prior over a family of models is not just subjectively determined. Its computation relies on the details of the decision problem, and the resulting weighting is slanted toward models with adverse consequences for the decision maker. It is the result of the aver-
sion to ambiguity or a concern about model misspecification. In
economic decision problems that confront possible secular stagna-
tion, the implied worst-case probabilities of macro growth are tilted
in a pessimistic direction. These probabilities are among a range of
possibilities entertained in the decision problem. Reporting (con-
strained) worst-case computations opens the door to claims of a bi-
ased treatment of the data. Indeed, this claim is accurate, but pur-
posefully so. The worst-case prior deliberately slants how models
are weighted and is part of the output when solving a decision prob-
lem. It also understates the underlying uncertainty.
A policy advisor may be tempted to slant model choices along the
lines of this worst-case weighting in order to defend a course of ac-
tion. Conveying formally the worst-case weighting as a weighting
scheme of particular interest may be too subtle for communication
pertinent in the policy arena. Perhaps we should expect our policy
makers to engage in a form of “noble falsehood,” conceding that a
broad notion of uncertainty is itself too complex for public discourse. 33
Indeed, projecting views with great confidence is perhaps the easiest
way to persuade policy makers and the public even when this confi-
dence is not real. If only we had the requisite knowledge that allowed
us to avoid such tricky issues and to embrace simple models with full
confidence. Unfortunately, we are seldom that lucky. But naively ignor-
ing uncertainty opens the door to ill-conceived policies that fail to de-
liver on their intended ambition.
VI. Conclusion
Like other areas of science, the study of economic dynamics seeks to
provide quantitative answers to important policy questions. In so do-
ing, uncertainty is prevalent in a variety of ways, as I have described in
this essay. We should not shunt aside this uncertainty nor leave it in
the background even if it can be challenging to acknowledge and act
upon. The pretense of knowledge carries social costs that may only
be realized in the long term. But these costs undermine the modeling
developments and the integrity of the resulting applications. We
should be bold enough to bring uncertainty to the forefront in discus-
sions of what we know about the economy and the implications of
that knowledge for the conduct of policy.
Notes
Amy Boonstra, William Brock, Jonathan Lear, Stephen Stigler, Grace Tsiang, and
Shadi Bartsch-Zimmer provided valuable feedback on earlier drafts of this essay.
1. See M. Friedman, “Nobel Lecture: Inflation and Unemployment,” Journal of
Political Economy 85, no. 3 (1977): 451–72.
2. The formal definition of a structural model was elegantly articulated by
L. Hurwicz, “On the Structural Form of Interdependent Systems,” in Logic, Meth-
odology, and Philosophy of Science: Proceedings of the 1960 International Congress, ed.
P. Suppes, E. Nagel, and A. Tarski, Studies in Logic and the Foundations of Math-
ematics 44 (Amsterdam: Elsevier, 1966), 232–39.
3. R. Frisch, “Propagation Problems and Impulse Problems in Dynamic Eco-
nomics,” in Economic Essays in Honour of Gustav Cassel (London: Allen & Unwin,
1933), 171–205.
4. E. Slutsky, “The Summation of Random Causes as the Source of Cyclic Pro-
cesses,” in Problems of Economic Conditions, vol. 3 (Moscow: The Conjuncture Institute,
1927).
5. For prominent historical examples, see J. M. Keynes’s skepticism of
Tinbergen’s initial efforts at econometric model building in “Professor Tinber-
gen’s Method,” Economic Journal 49 (1939): 558–68; and F. A. von Hayek’s discus-
sion of quantitative models and their implied measurements in his Nobel ad-
dress: “The Pretense of Knowledge” (Nobel Prize in Economics Documents 1974-2,
Nobel Prize Committee, 1974).
6. F. H. Knight, Risk, Uncertainty and Profit (Boston: Houghton Mifflin, 1921).
7. See S. Stigler, “Soft Questions, Hard Answers: Jacob Bernoulli’s Probability
in Historical Context,” International Statistical Review 82, no. 1 (2014): 1–16, for a
thoughtful discussion of Bernoulli’s accomplishments and influence; J. Bernoulli,
Ars conjectandi, opus posthumum. Accedit Tractatus de seriebus infinitis, et epistola
gallicè scripta de ludo pilae reticularis (Basel: Thurneysen Brothers, 1713).
8. B. de Finetti, "La Prévision: Ses Lois Logiques, Ses Sources Subjectives," Annales de l'Institut Henri Poincaré 7 (1937): 1–68, English translation in Studies in
Subjective Probability, ed. H. E. Kyburg and H. E. Smokler (New York: Wiley, 1964);
L. J. Savage, The Foundations of Statistics (New York: Wiley, 1954).
9. For instance, see the discussion in J. Berger, “The Robust Bayesian View-
point,” in Robustness of Bayesian Analyses, ed. J. B. Kadane (Amsterdam: North-
Holland, 1984), 63–124.
10. I. R. Petersen, M. R. James, and P. Dupuis, “Minimax Optimal Control of
Stochastic Uncertain Systems with Relative Entropy Constraints,” IEEE Transac-
tions on Automatic Control 45 (2000): 398–412.
11. Slutsky, “The Summation of Random Causes”; Frisch, “Propagation
Problems.”
12. For further discussion, see L. P. Hansen, “Nobel Lecture: Uncertainty Out-
side and Inside Economic Models,” Journal of Political Economy 122, no. 5 (2014):
945–87.
13. J. H. Muth, “Rational Expectations and the Theory of Price Movements,”
Econometrica 29, no. 3 (1961): 315–35; R. E. Lucas, “Expectations and the Neutrality
of Money,” Journal of Economic Theory 4, no. 2 (1972): 103–24.
14. Slutsky, “The Summation of Random Causes”; Frisch, “Propagation
Problems.”
15. C. A. Sims, “Macroeconomics and Reality,” Econometrica 48 (1980): 1–48.
16. For instance, see T. J. Sargent, “Rational Expectations, the Real Rate of In-
terest, and the Natural Rate of Unemployment," Brookings Papers on Economic Ac-
tivity 4, no. 2 (1973): 429–80. As I described in “Nobel Lecture,” early in my career
I also contributed to this line of research in collaboration with Sargent.
17. These include John Cochrane, Robert Hodrick, Ravi Jagannathan, Scott
Richard, and in particular Ken Singleton. See L. P. Hansen, “Large Sample Proper-
ties of Generalized Method of Moments Estimators,” Econometrica 50, no. 4 (1982):
1029–54, for an original methodological contribution in support of this research.
18. For recent surveys, see I. Gilboa and M. Marinacci, “Ambiguity and the
Bayesian Paradigm,” in Advances in Economics and Econometrics: Theory and Applica-
tions, ed. D. Acemoglu, M. Arellano, and E. Dekel (Cambridge: Cambridge Univer-
sity Press, 2013); L. P. Hansen and M. Marinacci, “Ambiguity Aversion and Model
Misspecification: An Economic Perspective,” Statistical Science 31, no. 4 (2016):
511–15.
19. See, e.g., R. Bansal and A. Yaron, “Risks for the Long Run: A Potential Res-
olution of Asset Pricing Puzzles,” Journal of Finance 59, no. 4 (2004): 1481–1509.
20. For a recent illustration of this mechanism, see L. P. Hansen and T. J. Sar-
gent, “Prices of Macroeconomic Uncertainties with Tenuous Beliefs,” technical
report, University of Chicago and New York University, 2016.
21. Von Hayek, “The Pretense of Knowledge,” 6.
22. This question is distinct from a related argument for simple models associated with Ockham's razor. Ockham's razor is intended to help guide how we con-
struct alternative models, an important question in its own right, but one that I
have largely sidestepped in this essay. I have chosen instead to discuss how we
confront uncertainty in the answers that our models are meant to provide.
23. M. Friedman, “The Lag in Effect of Monetary Policy,” Journal of Political
Economy 69, no. 5 (1961): 447–66.
24. See, e.g., C. A. Sims, “Statistical Modeling of Monetary Policy and Its Ef-
fects,” American Economic Review 102, no. 4 (2012): 1187–1205.
25. G. Stigler, “The Theory of Economic Regulation,” Bell Journal of Economics 2,
no. 1 (1971): 3–21.
26. F. E. Kydland and E. C. Prescott, “Rules Rather than Discretion: The Incon-
sistency of Optimal Plans,” Journal of Political Economy 85, no. 3 (1977): 473–91.
27. Recall Justice Potter Stewart’s treatment of pornography.
28. See L. P. Hansen, “Challenges in Identifying and Measuring Systemic
Risk,” in Risk Topography: Systemic Risk and Macro Modeling, ed. M. Brunnermeier
and A. Krishnamurthy (Chicago: University of Chicago Press, 2012).
29. See, e.g., A. Haldane and V. Madouros, “The Dog and the Frisbee,” speech
given at the Federal Reserve Bank of Kansas City's 36th Economic Policy Symposium
“The Changing Policy Landscape,” Jackson Hole, WY, August 31, 2012. In a speech
on March 31, 2011, entitled "Regulating Systemic Risk," Daniel Tarullo, United
States Federal Reserve Board of Governors, argued: “There is also need for more
study of the dynamics by which stress at large, interconnected institutions can
have negative effects on national and global financial systems. In fact, what may
be needed is a new subdiscipline that combines the perspectives of industrial or-
ganization economics with finance. Without work of this sort, it may be difficult
to fashion the optimally strong, sensible, post-crisis regulatory regime.”
30. For some illustrations of local welfare analysis from an asset-pricing per-
spective, see L. P. Hansen, T. J. Sargent, and T. D. Tallarini, “Robust Permanent
Income and Pricing,” Review of Economic Studies 66, no. 4 (1999): 873–907; and
F. Alvarez and U. J. Jermann, “Using Asset Prices to Measure the Cost of Business
Cycles,” Journal of Political Economy 112, no. 6 (2004): 1223–56.
31. For a more extensive discussion of uncertainty as conceived in this essay
and the modeling challenges in climate economics, see W. Brock and L. P. Han-
sen, “Uncertainty and Climate Economic Models: Wrestling with Decision The-
ory,” technical report, University of Chicago, 2017.
32. Hansen and Marinacci, “Ambiguity Aversion.”
33. This term is reminiscent of the Noble Lie of Socrates and Plato except
that I use J. Lear’s preferred translation (Wisdom Won from Illness [Cambridge,
MA: Harvard University Press, 2017]). Lear's discussion adds more subtlety to the
discussion by arguing: “What is striking about the Noble Falsehood, in contrast
to other myths and ideologies that are meant to legitimate the status quo, is
that this allegory does its work by generating dissatisfaction.”
