Goodhart's law

Goodhart's law is an adage named after economist Charles Goodhart, which has been phrased by Marilyn Strathern as: "When a measure becomes a target, it ceases to be a good measure."[1] One way in which this can occur is when individuals anticipate the effect of a policy and then take actions that alter its outcome.[2]

Formulation

Goodhart first advanced the idea in a 1975 paper, which was later used popularly to criticize the United Kingdom government of Margaret Thatcher for trying to conduct monetary policy on the basis of targets for broad and narrow money. His original formulation was:

Any observed statistical regularity will tend to collapse once pressure is placed upon it for control purposes.[3]

However, parts of the concept considerably pre-date Goodhart's 1975 statement.[4] Shortly after Goodhart published his paper, others suggested closely related ideas, including Campbell's law (1976) and the Lucas critique (1976).

As applied in economics, the law is implicit in the idea of rational expectations, a theory which holds that entities who are aware of a system of rewards and punishments will optimize their actions within that system to achieve their desired results. For example, employees whose performance in a company is measured by some known quantitative measure (such as cars sold in a month) will attempt to optimize with respect to that measure regardless of whether or not their behavior is profit-maximizing. While it originated in the context of market responses, the law has profound implications for the selection of high-level targets in organizations.[5]

Jón Daníelsson quotes the law as "Any statistical relationship will break down when used for policy purposes", and suggests a corollary to the law for use in financial risk modelling: "A risk model breaks down when used for regulatory purposes."[6] Mario Biagioli has related the concept to the consequences of using citation impact measures to estimate the importance of scientific publications:

All metrics of scientific evaluation are bound to be abused. Goodhart's law (named after the British economist who may have been the first to announce it) states that when a feature of the economy is picked as an indicator of the economy, then it inexorably ceases to function as that indicator because people start to game it.[7]
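The dynamic described above can be sketched with a small simulation. The following Python snippet is a hypothetical illustration only (it does not come from Goodhart's paper or any of the sources cited here): an agent divides a fixed budget of effort between genuine work and activity that merely inflates a proxy metric. While the agent optimizes true value, the metric tracks it; once the metric itself becomes the target, the optimum shifts entirely toward gaming and the metric stops reflecting real productivity.

    # Hypothetical toy model of Goodhart's law (illustrative sketch only).
    def true_value(genuine_effort: float) -> float:
        """Real productivity depends only on genuine work."""
        return genuine_effort

    def measured_score(genuine_effort: float, gaming_effort: float) -> float:
        """The proxy metric rewards gaming three times as strongly as genuine work."""
        return genuine_effort + 3.0 * gaming_effort

    def best_allocation(total_effort: float, target_the_metric: bool):
        """Grid-search the effort split that maximizes the agent's chosen objective."""
        splits = [(total_effort * i / 100.0, total_effort * (100 - i) / 100.0)
                  for i in range(101)]
        if target_the_metric:
            return max(splits, key=lambda s: measured_score(s[0], s[1]))
        return max(splits, key=lambda s: true_value(s[0]))

    for target_the_metric in (False, True):
        genuine, gaming = best_allocation(10.0, target_the_metric)
        print(f"metric is the target: {target_the_metric} | "
              f"measured score = {measured_score(genuine, gaming):5.1f} | "
              f"true value = {true_value(genuine):5.1f}")

Run as written, the measured score triples once the metric becomes the target while the true value falls to zero: the "observed statistical regularity" linking the metric to real performance collapses, as the original formulation predicts.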

References

  1. Strathern, Marilyn. "'Improving Ratings': Audit in the British University System" (PDF). European Review. 5: 305–321.
  2. Manheim, David; Garrabrant, Scott (2018). "Categorizing Variants of Goodhart's Law". arXiv:1803.04585 [cs.AI].
  3. Goodhart, Charles (1981). "Problems of Monetary Management: The U.K. Experience". In Anthony S. Courakis (ed.), Inflation, Depression, and Economic Policy in the West. Rowman & Littlefield: 111–146.
  4. "Overpowered Metrics Eat Underspecified Goals". Ribbonfarm. Accessed 26 January 2017.
  5. Goodhart, C.A.E. (1975). "Problems of Monetary Management: The U.K. Experience". Papers in Monetary Economics. Vol. I. Reserve Bank of Australia.
  6. Daníelsson, Jón (July 2002). "The Emperor Has No Clothes: Limits to Risk Modelling". Journal of Banking & Finance. 26 (7): 1273–1296. doi:10.1016/S0378-4266(02)00263-7 – via ScienceDirect (subscription required).
  7. Biagioli, Mario (12 July 2016). "Watch out for cheats in citation game". Nature. 535 (7611): 201. Bibcode:2016Natur.535..201B. doi:10.1038/535201a. PMID 27411599.
