The unprecedented financial crisis of 2008 revealed the uncritical dependence of the global financial community -- and, by extension, the economic system -- on the innovative risk-management formula of David X. Li known as the Gaussian Copula function, subsequently described as "the secret formula that destroyed Wall Street".
The debate on climate change, informed by the reports of the United Nations Intergovernmental Panel on Climate Change and culminating in the United Nations Climate Change Conference (Copenhagen, 2009), has essentially been based on the Kaya Identity. This is an equation, little known to climate debaters and negotiators, relating the factors that determine the level of human impact on climate in the form of emissions of the greenhouse gas carbon dioxide.
As metrics, both formulae have been positioned, with little question, as fundamental to global strategic initiatives. The first example indicates the danger of such dependence. The second raises questions about the coherence of the debate on climate change -- at a time when scientists are vigorously claiming a global consensus on the matter. Of greater concern is the possible existence of other such metrics at the heart of strategic initiatives -- unknown to those who might question the tendency to single-metric dependence, the validity of the metric itself, or the use of other such metrics as the basis for other vital strategies, now or in the future.
The theme follows from an earlier exploration of Comprehensive Formulations and their Cognitive Challenge (2009), itself part of a continuing interest summarized in Unexplored Potential of Mathematics and Geometry -- in reframing psycho-social challenges (2008). It also relates to concern for indicators of the capacity to act in the light of indicators of problematic conditions, as originally explored in the context of the Goals, Processes and Indicators of Development (GPID) project of the United Nations University (Remedial Capacity Indicators Versus Performance Indicators, 1981).
The innovative formula of David X. Li with regard to the Gaussian Copula function is of interest since its successful use is alleged to be at the root of the overconfidence of the global financial community in taking the high orders of investment risk which led to the global financial crisis of 2008, and its consequences. It is admirably described by Felix Salmon (Recipe for Disaster: the formula that killed Wall Street, Wired, 17.03, March 2009) -- presented on the title page of that issue as The Secret Formula that Destroyed Wall Street. As Li had indicated in 2005: "Very few people understand the essence of the model" (Mark Whitehouse, Slices of Risk, The Wall Street Journal, 12 September 2005). A second description is offered by Kevin Drum (The Gaussian Copula, Mother Jones, 24 February 2009).
Li's original paper (On Default Correlation: a Copula Function Approach, Journal of Fixed Income, 9, 2000, pp. 43-54) marked the first appearance of Gaussian Copula models for the pricing of collateralized debt obligations (CDOs). The formula quickly became the tool with which financial institutions modelled the correlations between multiple securities -- allowing CDOs to be priced, seemingly accurately, for a wide range of investments that were previously too complex to price, such as bundles of mortgages. In this respect it was at the core of the subprime crisis.
Expressed succinctly, the formula is:
$$\Pr[T_A < 1,\ T_B < 1] \;=\; \Phi_2\left(\Phi^{-1}(F_A(1)),\ \Phi^{-1}(F_B(1)),\ \gamma\right)$$
An articulated expression of the formula is:

Gaussian Copula

$$\Pr[T_A < 1,\ T_B < 1] = \Phi_2\left(\Phi^{-1}(F_A(1)),\ \Phi^{-1}(F_B(1)),\ \gamma\right)$$

| Term | Interpretation |
| --- | --- |
| $\Pr$ | probability that any two potential sources of risk (A and B) in a proposed investment will both default |
| $\Phi_2$ | bivariate normal distribution, coupling the individual probabilities associated with A and B |
| $T_A$ | amount of time between now and when A may be expected to default |
| $T_B$ | amount of time between now and when B may be expected to default |
| $\Phi^{-1}(F_A(1))$ | probability of how long A is likely to survive before defaulting |
| $\Phi^{-1}(F_B(1))$ | probability of how long B is likely to survive before defaulting |
| $\gamma$ | correlation parameter, reducing correlation to a single constant |
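As an illustration of what the formula computes, the following is a minimal sketch (not Li's production model) coupling two hypothetical one-year default probabilities through the bivariate normal distribution; the marginal probabilities and the correlation value are invented for the example:

```python
# Minimal sketch of the Gaussian Copula coupling of two default
# probabilities -- an illustration, not Li's production model.
from scipy.stats import norm, multivariate_normal

def joint_default_probability(p_a: float, p_b: float, gamma: float) -> float:
    """Pr[T_A < 1, T_B < 1]: probability that both A and B default
    within one year, given marginal probabilities F_A(1), F_B(1)
    and the single correlation constant gamma."""
    a = norm.ppf(p_a)          # Phi^-1(F_A(1))
    b = norm.ppf(p_b)          # Phi^-1(F_B(1))
    cov = [[1.0, gamma], [gamma, 1.0]]
    return float(multivariate_normal(mean=[0.0, 0.0], cov=cov).cdf([a, b]))

# Hypothetical example: two assets, each with a 5% one-year default risk.
print(joint_default_probability(0.05, 0.05, 0.0))  # independent: 0.0025
print(joint_default_probability(0.05, 0.05, 0.7))  # correlated: far higher
```

The danger highlighted by Salmon lies largely in that single constant γ, reportedly estimated in practice from market data drawn from an unusually benign period.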
The Kaya Identity was developed by the Japanese energy economist Yoichi Kaya. It is the subject of his book Environment, Energy, and Economy: strategies for sustainability, co-authored with Keiichi Yokobori as the output of the Conference on Global Environment, Energy, and Economic Development (Tokyo, 1993). It states that the total emission level can be expressed as the product of four inputs: population, GDP per capita, energy use per unit of GDP, and carbon emissions per unit of energy consumed. Algebraically the identity is trivial -- the intermediate terms cancel, reducing the right-hand side back to total emissions -- but it is deliberately expanded into these four factors because that is the form in which the relevant data is typically available.
Kaya Identity

$$F = P \times \frac{G}{P} \times \frac{E}{G} \times \frac{F}{E} = P \cdot g \cdot e \cdot f$$

where F is total carbon dioxide emissions, P is population, g = G/P is GDP per capita, e = E/G is energy intensity (energy use per unit of GDP), and f = F/E is carbon intensity (emissions per unit of energy consumed).
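A minimal sketch of the identity in code, with illustrative round-number inputs (not official statistics):

```python
# Minimal sketch of the Kaya Identity: F = P * g * e * f.
# Input values below are illustrative round numbers, not official data.

def kaya_emissions(population: float,
                   gdp_per_capita: float,
                   energy_intensity: float,
                   carbon_intensity: float) -> float:
    """Total CO2 emissions as the product of the four Kaya factors.

    Any consistent units work: e.g. persons, $/person, MJ/$ and
    kg CO2/MJ together yield kg CO2.
    """
    return population * gdp_per_capita * energy_intensity * carbon_intensity

# Illustrative: 6.4e9 people, $8,000/person, 8 MJ/$, 0.07 kg CO2/MJ
print(kaya_emissions(6.4e9, 8_000, 8.0, 0.07))  # ~2.9e13 kg CO2 per year
```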
The Kaya identity plays a core role in the development of future emissions scenarios in the IPCC Special Report on Emissions Scenarios prepared for the Third Assessment Report (TAR) in 2001. The scenarios set out a range of assumed conditions for future development of each of the four inputs. Population growth projections are available independently from demographic research; GDP per capita trends are available from economic statistics and econometrics; similarly for energy intensity and emission levels. The projected carbon emissions can drive carbon cycle and climate models to predict future CO2 concentration and climate change.
As a contribution to the Fourth Assessment Report (Rogner, H.-H., D. Zhou, R. Bradley, P. Crabbé, O. Edenhofer, B. Hare, L. Kuijpers and M. Yamaguchi, Introduction. In: Climate Change 2007: Mitigation. Contribution of Working Group III to the Fourth Assessment Report of the Intergovernmental Panel on Climate Change, 2007) it is stated that:
The Kaya identity (Kaya, 1990) is a decomposition that expresses the level of energy related CO2 emissions as the product of four indicators: (1) carbon intensity (CO2 emissions per unit of total primary energy supply (TPES)), (2) energy intensity (TPES per unit of GDP), (3) gross domestic product per capita (GDP/cap) and (4) population. The global average growth rate of CO2 emissions between 1970 and 2004 of 1.9% per year is the result of the following annual growth rates: population 1.6%, GDP/cap 1.8%, energy-intensity -1.2% and carbon-intensity -0.2% ....
At the global scale, declining carbon and energy intensities have been unable to offset income effects and population growth and, consequently, carbon emissions have risen....
The challenge - an absolute reduction of global GHG emissions - is daunting. It presupposes a reduction of energy and carbon intensities at a faster rate than income and population growth taken together. Admittedly, there are many possible combinations of the four Kaya identity components, but with the scope and legitimacy of population control subject to ongoing debate, the remaining two technology-oriented factors, energy and carbon intensities, have to bear the main burden.... [emphasis added]
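A one-line check of the quoted figures: because the identity is a product, the annual growth rates of the four factors add to first order (in logarithmic form), so the component rates should roughly sum to the emissions growth rate:

$$1.6\% + 1.8\% - 1.2\% - 0.2\% = 2.0\% \approx 1.9\%\ \text{per year}$$

The small discrepancy reflects rounding of the published rates and the effect of compounding.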
With regard to the Kaya Identity, at the time of the Copenhagen negotiations on climate change, might its originator, Yoichi Kaya, have echoed the words of David Li regarding the Gaussian Copula before the financial crash of 2008: "Very few people understand the essence of the model"?
Following the consequences of any "binding agreement" in Copenhagen, as with the financial crisis, might there be occasion for an article entitled something like Recipe for Disaster: the formula that crashed the climate -- or perhaps The Secret Formula that Destroyed the Global Climate?
Is the unquestioning global confidence in the Kaya Identity to be compared with that in the Gaussian Copula -- notably in the light of the unprecedented resources invested in each?
In a White Paper on All the Options for Managing a Systemic Bank Crisis (2008), Bernard Lietaer, Robert Ulanowicz and Sally Goerner provide an overview of an approach to the financial system in the light of a single metric derived more generally from the stability and sustainable viability of complex flow systems:
There is now scientific evidence that a structural issue is indeed involved. The theoretical origin of this evidence may be surprising to the economic or financial community, although it wouldn't be such a surprise for scientists familiar with natural ecosystems, thermodynamics, complexity or information theory. The science that explains this issue rests on a thermodynamic approach with deep historical roots in economics. In this view, complex systems, such as ecosystems, living organisms, and economies are all seen as matter-, energy-, and information-flow systems. For example, the famous food chain is actually a matter/energy flow-network built of complex relationships among organisms. Plants capture the sun's energy with photosynthesis; animals eat the plants; species then eat one another in a chain to top predator, only to have all organisms die, decompose, and their energy/matter be recycled by bacteria. Similarly, economies are circulation networks consisting of millions of businesses and billions of customers exchanging different products and services, which, when taken as a whole, are supposed to meet the needs of all participants....
Consequently, as Goerner (1999) says about universality: 'all [flow] systems, no matter how complex, fall into one of a few classes. All members of a class share certain common patterns of behavior.' Similarly, Cvitanovic explains: 'The wonderful thing about this universality is that it does not matter much how close our equations are to the ones chosen by nature, as long as the model is in the same universality class…as the real system. This means that we can get the right physics out of very crude models.' The existence of parallel patterns and dynamics explains why similar energy flow concepts and analysis methods apply to economic systems as well as natural ones....
Sustainability of a complex flow system can therefore be defined as the optimal balance between efficiency and resilience of its network. With these distinctions we are able to define and precisely quantify a complex system's sustainability in a single metric. Indeed, there is now a way of quantitatively measuring all the relevant components separately: total throughput, efficiency, and resilience. Furthermore, the underlying mathematics are well-behaved enough so that there exists only one single maximum for a given network system....
It is critical to understand that the findings described so far arise from the very structure of a complex network system, and therefore that they remain valid for any complex network with a similar structure, regardless of what is being processed in the system: It can be biomass in an ecosystem, information in a biological system, electrons in an electrical power distribution network, or money in an economic system. This is precisely one of the strong points of using a web-like network approach instead of machine-like metaphor. [emphasis added]
The details of this systemic approach are described in a separate paper, with technical and mathematical proof (Robert Ulanowicz, Sally Goerner, Bernard Lietaer and Rocio Gomez, Quantifying Sustainability: resilience, efficiency and the return of information theory, Ecological Complexity, 6, 2009, pp. 27-36). Complementary arguments are provided separately by the authors (Sally J. Goerner, Bernard Lietaer and Robert E. Ulanowicz, Quantifying Economic Sustainability: implications for free-enterprise theory, policy and practice, Ecological Economics, 69, 2009, pp. 76-81).
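As a hedged sketch of the kind of single-peaked metric those papers describe -- assuming the -α ln α "robustness" form associated with Ulanowicz's work, where α is the ratio of a network's efficiency-related "ascendency" to its total capacity -- the single maximum is readily exhibited:

```python
# Hedged sketch of a single-peaked sustainability metric of the kind
# described above, assuming the -a*ln(a) "robustness" form associated
# with Ulanowicz's work (a = ascendency/capacity, a relative efficiency
# in (0, 1)). An illustration, not the papers' calibrated formula.
import math

def robustness(alpha: float) -> float:
    """Single-peaked on (0, 1); maximal at alpha = 1/e, about 0.37."""
    return -alpha * math.log(alpha)

for a in (0.10, 1 / math.e, 0.70, 0.95):
    print(f"alpha = {a:.2f}  robustness = {robustness(a):.3f}")
```

Too little efficiency (low α) and too much (high α) both score poorly; the existence of a single optimum between efficiency and resilience is what allows sustainability to be expressed as one number.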
Depending on how "metric" is understood, the following suggest other possibilities for a single metric:
Mathematics might be understood as the study of metrics, and as such might itself constitute a (meta)metric. Insights into its nature might even be achieved through metaphor (Towards a Periodic Table of Ways of Knowing -- in the light of metaphors of mathematics, 2009). Various approaches are taken to reduce multiple metrics to a single metric.
One concern is the extent to which any single metric might be elaborated and used in secret for competitive advantage. This is evident in the approach taken by institutions in developing formulae used in trading on the financial markets. Such management of risk is similarly evident in gambling, where gamblers may each develop and use their own secret formula.
It is to be expected that intelligence agencies would develop a secret metric to assess "points of interest" calling for heightened attention. Of particular interest is the metric through which levels of threat are assessed to determine an appropriate Defense Readiness Condition (DEFCON), especially in response to terrorism (Distinguishing degrees of fear and terror, 2004). The threat of terrorism itself offers a kind of singular "metric" (Promoting a Singular Global Threat -- Terrorism: Strategy of choice for world governance, 2002).
A related approach is evident in the use of patents or copyright to protect a formula, or to prevent others from using it -- as in the practice of "patent squatting". The possibility that the world might be held to ransom for its food by potential "terminator seeds" has long been recognized with concern -- but more intriguing is the possibility that access to vital insights might be withheld by practices analogous to such genetic use restriction technology. Perhaps a form of "memetic use restriction technology"? (Future Coping Strategies: beyond the constraints of proprietary metaphors, 1992).
Variants of such secretive possibilities are evident in secret societies organized to offer progressive access to deeper knowledge -- presumably more integrative and singular -- through a succession of initiations. It is however also the case that commercial barriers to access to copyrighted publications may render the models they articulate effectively "secret" to the majority.
Another form of metric is that used for the ranking of results from a search engine query, as with Google. This could prove to be of increasing relevance in shaping the global knowledge society through secret rules for including, excluding or weighting certain results -- possibly under commercial, religious, political or security pressures (as recently highlighted in the debate over access to Google in China). According to Wikipedia, a generic PageRank equation is as follows:
$$PR(p_i) = \frac{1-d}{N} + d \sum_{p_j \in M(p_i)} \frac{PR(p_j)}{L(p_j)}$$

where $PR(p_i)$ is the rank of page $p_i$, $N$ is the total number of pages, $d$ is a damping factor, $M(p_i)$ is the set of pages linking to $p_i$, and $L(p_j)$ is the number of outbound links on page $p_j$.
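A minimal power-iteration sketch of this equation, on a small hypothetical link graph (the page names and link structure are invented for illustration):

```python
# Minimal power-iteration sketch of the generic PageRank equation above.
# The four-page link graph is hypothetical, invented for illustration.

def pagerank(links: dict[str, list[str]], d: float = 0.85,
             iterations: int = 50) -> dict[str, float]:
    pages = list(links)
    n = len(pages)
    pr = {p: 1.0 / n for p in pages}             # uniform start
    for _ in range(iterations):
        new = {p: (1.0 - d) / n for p in pages}  # (1 - d)/N term
        for p, outlinks in links.items():
            for q in outlinks:
                new[q] += d * pr[p] / len(outlinks)  # d * PR(p_j)/L(p_j)
        pr = new
    return pr

graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
print(pagerank(graph))  # "C" accumulates the highest rank
```

The strategic point is that a real engine's weighting rules need not be this transparent: the same iteration can quietly incorporate whatever inclusion, exclusion or weighting terms its operator chooses.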
The challenge with respect to any singular metric of global strategic significance is whether it has a "human face". This was the theme of the classic challenge of UNICEF to the metric used by the International Monetary Fund in determining the necessary "structural adjustment" in developing countries (UNICEF, Development with a Human Face, 1997). A related argument has been made by the UNFPA (Thalif Deen, Development: UNFPA Puts Human Face on Climate Blowback, IPS, 18 November 2009) and with respect to climate change more generally (Ellen Goodman, The 'human' factor is missing in Copenhagen, The Boston Globe, 11 December 2009).
There is thus a sense in which an essentially human dimension is inappropriately diluted by the use of the kinds of metrics favoured by economists and related disciplines:
The issue seems to be a combination of:
These suggest a reason for the use of images with "human interest" to frame a period.
Ergodicity and Economics?

Time to move beyond average thinking: "Taking the expectation value of an observable is not the same as averaging over time" (Nature Physics, 15, 2019, p. 1207). When confronted with this argument, many economists tend to get defensive. They will readily acknowledge the limitations of excessively narrow definitions for measures of economic performance, while also pointing out that much of the criticism they receive is itself based on caricaturing the work that they do.
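A standard illustration of the distinction between expectation value and time average (a sketch from the ergodicity-economics literature, not drawn from the cited editorial itself):

```python
# Standard illustration of expectation vs time average: a multiplicative
# gamble from the ergodicity-economics literature (not drawn from the
# cited editorial itself).
import random

random.seed(1)

# Each round: wealth * 1.5 on heads, * 0.6 on tails.
# Expectation per round:  0.5*1.5 + 0.5*0.6 = 1.05   -> "grows 5%"
# Time-average growth:    sqrt(1.5 * 0.6) ~= 0.949   -> shrinks ~5%
wealth = 1.0
for _ in range(10_000):
    wealth *= 1.5 if random.random() < 0.5 else 0.6
print(wealth)  # almost surely collapses toward zero
```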
Future global implication of unknown "black box" algorithms?

- Rebecca Heilweil: New York City couldn't pry open its own black box algorithms (Recode, 18 December 2019)
- Rashida Richardson (Ed.): Confronting Black Boxes: a shadow report of the New York City Automated Decision System Task Force (AI Now Institute, 4 December 2019)
- Neil C. Renic: Death of efforts to regulate autonomous weapons has been greatly exaggerated (Bulletin of the Atomic Scientists, 18 December 2019)
- Brad Allenby: 5G, AI, and big data: we're building a new cognitive infrastructure and don't even know it (Bulletin of the Atomic Scientists, 19 December 2019)
- Matt Field: As the US, China, and Russia build new nuclear weapons systems, how will AI be built in? (Bulletin of the Atomic Scientists, 20 December 2019)
Continued in Part 2: Relevance of Mythopoeic Insights to Global Challenges