Executives at financial institutions have always worried about two major risks: credit risk, in which a big borrower (such as a country) defaults on debt, and market risk, in which a collapse wipes out the value of certain investments. In recent years, a third risk has done its part to keep bankers and other finance executives awake at night. That is operational risk: the chance that an institution may be laid low by an internal system failure – such as failing to guard against a rogue trader’s activities – or by an external event. In the aftermath of September 11’s terrorist atrocities, it will surprise few people to learn that operational risk has become the topic du jour among financial institutions and regulators.


Assessing and managing operational risk was the theme of the annual Risk Roundtable series organized this year by the Wharton Financial Institutions Center.

Co-sponsored by Oliver, Wyman Institute, the academic liaison of Oliver, Wyman & Company, the series focuses on risks associated with current market developments and industry trends. This year attention centered on the lessons learned from recent operational losses, methods to measure operational risks, and the steps needed to manage them.

Concern over operational risk has grown during the past few years, fueled by a variety of factors. These include the use of more highly automated technology; large-scale mergers and acquisitions that test the viability of newly integrated systems; the emergence of banks as very large-volume service providers; and the increased prevalence of outsourcing and the greater use of financing techniques that reduce credit and market risk, but enhance operational risk.

The Fallout from September 11
September 11 and its impact on the financial markets dominated discussions during the first part of the conference, while the Bank for International Settlements’ proposed capital charge for operational risk stirred the emotions of the market operators during the second half of the conference. Wharton finance professor Richard J. Herring, co-director of the Wharton Financial Institutions Center, and Frank Diebold, professor of economics and statistics at the University of Pennsylvania and founding member of the Oliver Wyman Institute, moderated the discussions at the conference.

John Drzik, chairman of Oliver, Wyman & Company, defined the cornerstones of the debate about operational risk: inconsistency in the definition of operational risks, the paucity of data and measurement techniques, the limitations of holding capital against such risks, and the primacy of internal controls and market discipline in managing operational risks. “While there has been a lot of progress in efforts to quantify operational risk, it is being over-emphasized as a solution,” Drzik noted in his keynote address. “Operational risk is more about improving management practices than about measurement,” he added.


The first session, “What Have We Learned about Operational Risk from Sept. 11?”, discussed three central concerns: the provision of redundancy for business continuity planning at the firm level; and, at the industry level, the danger of cyber-terrorism to the financial system and the management of the market’s liquidity needs in the week following the September 11 attacks.


Richard Berner, chief U.S. economist and managing director of Morgan Stanley, described the firm’s response immediately after the Sept. 11 attacks. “People are our most valuable asset,” he said. “We soon realized that business was not the immediate issue. Our focus was on our people, their families and our clients. We had to keep communication lines flowing, and it was equally important to control rumors.” Attention then shifted to clients, to assure them that their funds were intact, and so was their relationship with Morgan Stanley.


James Koster, managing director at the Depository Trust & Clearing Corporation (DTCC), discussed how critical DTCC’s functions are to the financial services industry. Established in September 1999, DTCC oversees two principal subsidiaries – The Depository Trust Company (DTC) and the National Securities Clearing Corporation (NSCC). These two firms provide the primary infrastructure for the clearance, settlement and custody of the vast majority of equity, corporate debt and municipal bond transactions in the U.S.


Koster listed the preparations that DTCC has made to minimize losses from a Sept. 11-type event:

  • Establish a primary data center outside of Manhattan and, in the longer term, implement a three-data-center solution – two in New York City and one outside the metro area.
  • Relocate a significant number of operational and technical staff permanently outside of lower Manhattan. Before Sept. 11, while non-critical staff was located in backup sites, key management was centralized; a minimum of two top-level executives is now dispersed between locations.
  • Secure the internet addresses of its employees and its counterparties as an alternative means of communication in a crisis. Sept. 11 exposed wide gaps in the redundancy of the telecommunications network, while the Internet proved more reliable.
  • Hone the capability to recover data center operations within one hour, improving on the three-hour limit mandated by the Federal Reserve Bank.


Sandra C. Krieger, senior vice president and head of domestic reserves management and discount at the Federal Reserve Bank of New York, noted that the emerging split-business model – which involves maintaining backup sites – can have a tremendous impact on costs. She suggested several questions that must be asked while implementing this model: What are the critical business activities? Is it practical to keep personnel in alternate sites if they do not have enough business? Does the backup site have the capability to interface with customers? What happens if the primary site shuts down and you use the backup site? Do you need to back up the backup site? What is the connectivity between your backup site and a counterparty’s backup site?


Krieger argued that it was essential to pay attention to details to ensure that disaster recovery goes smoothly. “How does your staff feel about the firm’s emergency plan? Do they know the arrangements you have made and the risks? Does the staff know which staircase has windows, which goes to the basement and which goes to the first floor? Do you have a crisis management safety team? Do you have a policy for handling natural disasters? Do you have a leadership team responsible for taking charge during a crisis? Do you have a public address system, gas masks, oxygen cylinders, flashlights and water? Does your staff know the safest route out?”


Krieger also discussed the Fed’s response to the liquidity needs of the financial system after the terrorist attacks. The New York Fed kept its wholesale payments system, Fedwire, functioning without interruption on September 11. The other major dollar payments system, CHIPS (Clearing House Interbank Payments System), was also unharmed in the attack and continued operating on September 11.


Nonetheless, each of the systems faced significant strains. To meet the market’s liquidity needs in the week following the attack, the New York Fed injected tens of billions of dollars into the financial system through discount window loans and open market operations. To cope with potential shortages of dollar liquidity outside the U.S. that could not be met through the correspondent banking network, the Fed entered into temporary swap arrangements with the European Central Bank and the Bank of England and augmented its existing swap arrangement with the Bank of Canada.


Krieger noted that September 11 also brought into focus the collaborative effort of financial institutions and regulators, especially in determining when the stock market would reopen. When the markets opened after four days, payments and securities flowed smoothly enough to accommodate the largest volume of trading that had ever occurred on a single day in the New York Stock Exchange’s history. Some 2 billion shares changed hands that day without a hitch.


Measuring, Managing and Insuring Operational Risk

In the second session, Doug Hoffman, president of Operational Risk Advisors, said the goal of operational risk management (ORM) is to enhance management performance through early identification and avoidance of business disruption. Drawing upon his book, Managing Operational Risk, Hoffman emphasized six key tenets of ORM: enterprise-wide culture and commitment; governance for operational risk management; potential responses to operational risk; dynamic risk identification, measurement and response; the role of regulation; and technological changes that improve a firm’s ability to measure and manage operational risk. He said it was important for firms to move towards best practices in ORM.


Peter Ulrich, managing director of Enterprise Risk Management, said events of the scale of September 11 should be described as “supercatastrophes.” He noted that while in the past property losses exceeded human losses, in supercatastrophes the human loss typically exceeds property losses. Ulrich estimated that of the $53 billion loss at the World Trade Center, property accounted for just 27% and business interruption (BI) for 24%; by contrast, of the $15 billion loss from the Northridge earthquake, property accounted for 88% and BI just 7%.
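A quick back-of-the-envelope calculation turns Ulrich’s percentages into dollar figures; the residual “other” bucket is an assumption on my part, simply collecting whatever loss categories (liability, workers’ compensation and so on) the quoted shares leave uncovered:

```python
# Split each catastrophe loss using the percentages Ulrich quoted.
# Totals are in billions of dollars; "other" is the residual bucket.
events = {
    "World Trade Center": {"total": 53.0, "property": 0.27, "bi": 0.24},
    "Northridge earthquake": {"total": 15.0, "property": 0.88, "bi": 0.07},
}

for name, e in events.items():
    prop = e["total"] * e["property"]
    bi = e["total"] * e["bi"]
    other = e["total"] - prop - bi  # all remaining loss categories
    print(f"{name}: property ${prop:.1f}bn, BI ${bi:.1f}bn, other ${other:.1f}bn")
```

The striking contrast is in the residual: roughly half of the World Trade Center loss fell outside property and BI, versus only a few percent for Northridge.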


Dwelling on the Basel norms for operational risk, Andy Kuritzkes, vice chairman of Oliver, Wyman & Company, explained the complex taxonomy of the different risks a bank faces and talked about where operational risk fits into the framework. He classified all risk as either financial or non-financial and explained how operational risk, a non-financial risk, is divided into three main categories:

  • Internal event risk (such as that experienced at institutions like Barings, where Nick Leeson, a rogue trader, racked up enormous losses);
  • External event risk (where losses result from an uncontrollable external event, such as a terrorist attack or a natural disaster);
  • Business event risk, a catch-all category that includes risks associated with price wars, depressed levels of activity and a downturn in the stock market, among others. (For example, Credit Suisse First Boston experienced a $1 billion operating loss during the fourth quarter of 2001 because of the downturn on Wall Street.)
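Kuritzkes’ three-way split can be sketched as a small lookup structure. The example events attached to each category come from the cases cited above, and the `classify` helper is purely illustrative – my own construction, not anything presented at the conference:

```python
# Kuritzkes' taxonomy of non-financial ("operational") risk, with the
# example events from the text slotted into each category.
TAXONOMY = {
    "internal event risk": ["rogue trading"],       # e.g. Barings / Nick Leeson
    "external event risk": ["terrorist attack", "natural disaster"],
    "business event risk": [                        # catch-all category
        "price war",
        "depressed activity",
        "stock market downturn",                    # e.g. CSFB's Q4 2001 loss
    ],
}

def classify(event: str) -> str:
    """Return the category whose example list contains the given event."""
    for category, examples in TAXONOMY.items():
        if event in examples:
            return category
    return "unclassified"

print(classify("terrorist attack"))   # external event risk
print(classify("price war"))          # business event risk
```

The catch-all nature of business event risk is what makes the boundaries porous, which is exactly Kuritzkes’ point below.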


“The risk categories are not rigid, and what starts as an external risk can quickly bleed into a business risk,” said Kuritzkes. “And business risk matters. A back-of-the-envelope calculation shows that business risk, roughly speaking, is 60% of the total non-financial risk. Yet the Bank for International Settlements (BIS) ignores it.”


The BIS, in its 1999 consultative paper and periodic revisions to it, decided that operational risk should be subject to a capital charge. This drew a lot of flak at the conference. “I do not think that the BIS or any other regulatory authority can come up with any rules for how much capital banks should hold against operational risk,” Kuritzkes said. “The first line of defense for such risk is internal controls.”


The Advanced Measurement Approach (AMA) to regulatory capital, which allows banks to use the output of their internal operational risk measurement systems, subject to qualitative and quantitative standards set by the Basel Committee, is particularly flawed, according to Kuritzkes. “The AMA not only tries to specify a rule-based approach but tries to do that in a highly structured and sophisticated way that I think stretches the bounds of what operational risk measurement can deliver. The game seems not worth the candle. There is a much better payback for the BIS in concentrating on other components of the risk framework, such as credit risk.”


Alexander Muermann, a professor of insurance and risk management at Wharton, discussed his time at an investment bank where he participated in a task force charged with measuring operational risk. He also suggested that perhaps regulatory capital is not necessary, since operational risk is bank specific. He ended with a provocative question: Could capital allocation have avoided major risk events such as the collapse of Barings?


Private and Public Initiatives

Richard A. Koss, senior vice president and head of global fixed income management at Brown Brothers Harriman (BBH), discussed the methods that private partnerships such as BBH employ to contain operational risks. Private partnerships are effective structures for mitigating operational risk because they do not face the principal/agent problem that confronts most firms, in which the incentives of managers are not perfectly aligned with those of owners. In a partnership, managers are the owners, so they have strong incentives to deal with operational risks.


Koss discussed conditions after the September 11 attacks in the market for repurchase agreements. (Repurchase agreements, or repos, as they are usually called, are often used as a method of investing or raising funds.) The repo market carries little price or credit risk, since it is a short-term – and sometimes overnight – market for buying and selling the same asset. But after Sept. 11 the repo market faced high operational risks because of the heavy market concentration among the clearing banks. A few firms were responsible for more than 50% of repo holdings, so when one was knocked out for a while after Sept. 11, the ripple effects reverberated through the whole market. The Treasury market was affected because Treasury securities are usually the collateral used in these transactions. The market functioned, but liquidity was limited.


The Fed intervened and added an immense amount of liquidity to the financial system to overcome this problem. In addition, it provided technical assistance to troubled institutions, and some semblance of normalcy returned after about a week. Some interbank firms reported that it took about a month before all the issues in that market were settled.


Koss concluded that operational risk and market risk are related, and it would be useful for market participants and regulators to have a better understanding of these inter-relationships to make better decisions for investment and public policy.


Conference participants seemed to agree that the greatest threat to financial markets comes from cyber-terrorism. Koster emphasized that DTCC does not outsource IT, since it is too important a component of the organization. Still, organizations must constantly guard against potential internal threats.


Stefan Walters, vice president and head of markets and liquidity at the Federal Reserve Bank of New York and a Basel Committee member, conceded that in an ideal world, regulatory capital could be made redundant by a non-formulaic approach involving internal processes for assessing capital adequacy and risk profiles along with regulatory review and supervision.


The key issue in ORM – as well as in the development of regulatory capital requirements – is the collection and analysis of loss data. Walters explained the intricate methods used by the Committee to encourage banks to develop data collection and analysis. He made a strong case for the AMA, stating that it attempts to provide a balance between flexibility and structure for the banking industry to develop ORM quantification techniques.


Karen Shaw Petrou, executive director of the Financial Guardian Group (FGG), a coalition of U.S. banks that focuses on asset management, custody, payment and other specialized fee-based services, made the final presentation. She said she would play devil’s advocate to Walters and the Basel Committee’s viewpoint: “The role of regulators is not to prevent individual bank failures caused by unwise risk-taking. Rather, it should be to prevent systemic market failure due to poor industry-wide risk management.”


Petrou further pointed out that there is no agreed-upon, internationally accepted way to measure operational risk. In addition, the Basel proposals do little, if anything, to recognize the value of operational risk management and mitigation. Petrou also noted that the problem of adverse selection would have negative implications in the U.S.: companies need not be financial holding companies to own banks and engage in a wide range of financial services, and firms like Morgan Stanley and Goldman Sachs have eschewed the financial holding company structure and its accompanying Federal Reserve regulation. She added that the banking systems in the U.S., Europe and Japan differ considerably, and that while capital might offer an effective solution in Europe and Japan, in the U.S. internal controls and better measurement of operational risk would be a better solution.


A lively discussion followed. Ulku G. Oktem, a senior fellow at Wharton, suggested that the near-miss management system, which has been successfully used in the chemicals and airline industries, could be used by the finance industry as a measure of internal control. Frank Loverso of the National Bank of Canada argued that developing an effective system of early warning signals for the banking industry would be more effective than forcing institutions to allocate regulatory capital to hedge risk.