<< Computers are nice, but ... >>
Here is a fascinating excerpt from an article about how banks price the risk in their large loan portfolios:
<< Just as the economics profession has become dominated by neoclassical thought, which holds that all market participants are rational and fully informed, the risk community has embraced quantitative methods to such an extreme degree that the idea of performing a classical credit analysis on obligors is not even considered. Fascination with models and the mathematical tools borrowed from quantum physics has turned the economics profession -- and, we suggest, their brethren in the risk analytics community -- into what Philip Ball, writing in the FT on Monday, calls purveyors of "neoclassical idiocies."
For example, in their IMF working paper (06/134) on "Risk Models of the Financial Sector Assessment Program," Avesani, Liu, Mirestean and Salvati state that "over the last ten years, we have witnessed major advances in the field of modeling credit risk. There are now three main approaches to quantitative credit risk modeling: the "Merton-style" approach, the purely empirical econometric approach, and the actuarial approach… Each of these approaches has, in turn, produced several models that are widely used by financial institutions around the world."
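To give a sense of what a "Merton-style" model actually computes, here is a minimal, illustrative sketch (ours, not the paper's): default is treated as the event that a firm's asset value ends up below the face value of its debt at some horizon, so a default probability falls out of nothing more than asset value, leverage, asset volatility, and a drift assumption. All of the figures below are hypothetical placeholders.

```python
from math import log, sqrt
from statistics import NormalDist

def merton_pd(asset_value, debt_face, asset_vol, horizon_years, drift):
    """Illustrative Merton-style probability of default.

    Default occurs if the firm's asset value falls below the face value
    of its debt at the horizon; assets are assumed lognormal.
    """
    # Distance to default: how many standard deviations the expected
    # log asset value sits above the default point (the debt face value).
    d2 = (
        (log(asset_value / debt_face)
         + (drift - 0.5 * asset_vol ** 2) * horizon_years)
        / (asset_vol * sqrt(horizon_years))
    )
    # PD is the probability that assets finish below the debt barrier.
    return NormalDist().cdf(-d2)

# Hypothetical obligor: $120 of assets against $100 of debt, 25% asset
# volatility, one-year horizon, 4% drift.
print(round(merton_pd(120.0, 100.0, 0.25, 1.0, 0.04), 4))
```

Note what is absent from such a calculation: any fundamental analysis of the specific obligor beyond its balance-sheet leverage and the volatility inferred from market prices.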
Notice that these respected researchers do not even mention the idea of using fundamental financial factors to track the behavior of a specific obligor. In the same paper, the authors then articulate "three main approaches to estimating the probabilities of default. One approach is to use historical frequencies of default events for different classes of obligors in order to infer the probability of default for each class. Alternatively, econometric models such as logit and probit could be used to estimate probabilities of default. Finally, when available, one could use probabilities implied by market rates and spreads."
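The other two approaches the authors list reduce to equally compact arithmetic. As a rough, hedged illustration (again ours, not the paper's): a historical default frequency is simply defaults divided by obligors within a rating class, and under the common approximation that a credit spread compensates only for expected loss, the market-implied default probability is roughly the spread divided by loss given default. The spread, recovery rate, and obligor counts below are hypothetical.

```python
def historical_pd(defaults_in_class, obligors_in_class):
    """Historical default frequency for a class of obligors."""
    return defaults_in_class / obligors_in_class

def spread_implied_pd(credit_spread, recovery_rate):
    """Approximate annual default probability implied by a credit spread.

    Uses the rough identity spread ~= PD * (1 - recovery), i.e. the
    spread compensates the lender for expected loss.
    """
    return credit_spread / (1.0 - recovery_rate)

# Hypothetical inputs: a class of 5,000 obligors with 60 defaults in the
# observation year, and a 200 bp spread with 40% expected recovery.
print(round(historical_pd(60, 5000), 4))          # 0.012
print(round(spread_implied_pd(0.02, 0.40), 4))    # ~0.0333
```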
Again, the concept of focusing attention on the behavior of a specific obligor is not even considered among the methods in use today by risk professionals. Instead, derivative indicators and models, built on aggregate studies of past default experience and other data, are employed to estimate the possible future behavior of obligors. So hungry are the modelers for data to feed the great engine of statistical analysis that they even admit to using bond spreads as data inputs when equity market prices are unavailable!
Contemporary risk models, to paraphrase Ball, assume a degree of homogeneity and stability among subjects that simply does not exist. Most market professionals know this statement to be true, yet the risk vendors and the largest financial institutions have such an enormous investment in quantitative methods that they are unwilling to give them up.
The situation today reminds us of another era not too long ago, when the "whiz kids" of Wall Street made the mistake of relying upon "mark to model" methods for managing the risk in their bond portfolios -- and wound up damaging their trading desks and running from the law in shame. How quickly we forget the damage done by those SPARC 2 workstations in the hands of math geniuses. When the models and real life diverged, the trade tickets were found hidden in desk drawers as the guilty parties prayed that the real world would move back towards their risk position before anyone found out. >>
| us1.institutionalriskanalytics.com |