Better Data Brings a Renewal at the Bank of England

A venerable banking institution is using data in new ways to refine its view of the UK economy.

by: Michael Fitzgerald

Introduction

In June 2014, the Bank of England — one of the world’s oldest central banks — was preparing to announce its policy recommendations about the United Kingdom’s housing market. At the time, a dearth of new housing starts and a recovering economy were driving up housing prices, notably in London.1 This had raised concerns of a repeat of the market behaviors that had led to an economic crisis five years earlier.2 The Bank’s recommendations would be closely watched by the financial sector.

Several executives inside the Bank saw the policy recommendation as a watershed moment for the institution: It was one of the first times the Bank would make a major policy recommendation based in part on data from Britain’s Financial Conduct Authority (FCA), which was formed under the Financial Services Act 2012 as part of the UK’s response to its banking crisis during the recession. The FCA, which regulates the marketing of financial services products, has a memorandum of understanding to share data with the Bank of England.

Chapter 1

Policy That Hits Home

In particular, the Bank was using microeconomic data to form a detailed picture of the UK’s housing market. It had aggregated transactional data at the level of the country’s various local authorities, like the boroughs in London. One of these datasets, FCA’s product sales database, tracked every mortgage for owner/occupiers issued in the UK. Another was the Land Registry data, which included a housing price index and datasets with transaction data such as prices paid.

Access to these datasets had enabled the Bank to refine its models of how housing market behavior influenced risks to lenders’ overall financial health. For instance, the Bank’s analysis showed that the UK’s local housing markets varied a great deal. While there were concerns that another housing bubble was inflating in London and other parts of the southern UK, most of the rest of the country was not seeing similar price increases.

But overall indebtedness was a concern that the Bank wanted to address — in particular, the pace of lending for mortgages with high loan-to-income ratios. (See “Analytics in Action.”) The number of these high loan-to-income mortgages was growing rapidly, and the Bank recommended that lenders limit them and stress-test individual borrowers to see how they would fare if interest rates rose by as much as 3% over a five-year period.
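
The arithmetic behind both checks is simple enough to sketch. The following snippet is a minimal illustration, not the Bank's methodology: the 25-year term, the 40% payment-to-income affordability threshold, and the borrower figures are assumptions invented for the example; only the 3-percentage-point rate rise echoes the recommendation described above.

```python
def loan_to_income(loan: float, gross_income: float) -> float:
    """Ratio of mortgage size to the borrower's gross annual income."""
    return loan / gross_income


def monthly_payment(principal: float, annual_rate: float, years: int = 25) -> float:
    """Standard repayment-mortgage annuity payment."""
    r = annual_rate / 12
    n = years * 12
    return principal * r / (1 - (1 + r) ** -n)


def passes_rate_stress(loan: float, gross_income: float, current_rate: float,
                       rate_rise: float = 0.03, max_payment_share: float = 0.40) -> bool:
    """Could the borrower still service the loan if rates rose by 3 points?"""
    stressed = monthly_payment(loan, current_rate + rate_rise)
    return stressed <= (gross_income / 12) * max_payment_share


# Hypothetical borrower: a 250,000 GBP loan on a 45,000 GBP income at a 3% rate.
loan, income, rate = 250_000, 45_000, 0.03
print(round(loan_to_income(loan, income), 2))  # 5.56 -- a high loan-to-income loan
print(passes_rate_stress(loan, income, rate))  # False once rates rise by 3 points
```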

Pulling together the data for the housing policy recommendation required collaboration among many groups at the Bank; it represented one of the early triumphs of a new way of working together at the institution, which at times had struggled to bring diverse perspectives together. At one point, the only room large enough to accommodate the number of people collaborating was an underground chamber deep beneath the Bank’s home on Threadneedle Street in the center of London.

Driving this change in behavior was Mark Carney, the Bank’s Governor since July 2013, and the first non-British leader in its history. Soon after he arrived from the Bank of Canada, Carney had organized the Bank of England’s mission — maintaining monetary and financial stability for the good of the people of the UK — around a “One Bank” structure that encouraged staff to build on each other’s expertise.3

In a very tangible sense, the Bank was changing the way it behaved to take better advantage of the data to which it had access. In Carney’s first year, the Bank had established a high-level data council, set up a data lab, hired a chief operating officer, and formed a new advanced analytics unit. It was looking to hire its first-ever chief data officer (CDO) as well. Data had always played a key role in the Bank’s work, but to realize the full potential of its access to new data, the Bank was changing its structure, its behavior, and its approach to problem solving.

The housing market recommendations were part of the Bank’s June 2014 Financial Stability Report.4 At a testy press conference after it was released, Carney was peppered with questions about its recommendations, centering on the lack of immediate action to cool off the housing market. Carney argued that the Bank’s recommendations left room for banks to make some risky loans, which helps first-time buyers, among others. But they also created what he called a “firebreak.”5

The response from the British press was lukewarm, perhaps because the press is largely based in London, a market that was seeing sharp rises in housing prices. One analyst called the Bank’s recommendations a “paper tiger.”6 But many more were willing to give the Bank’s moves the benefit of the doubt. The stock market responded by driving up homebuilder stock prices, and the head of the House of Commons Treasury Committee, Andrew Tyrie, said: “While apparently modest in its initial impact, it breaks new ground.”7

Chapter 2

Winds of Change

The Financial Stability Report had come out of a period of extraordinary institutional change. The Bank had restructured in the wake of the global recession of 2008, and its expanded writ directly regulating Britain’s banks and insurers meant it had to get up to speed with the volumes of data this would require, especially for performing stress tests on the financial health of the companies. Analytics became one of the four pillars of its reorganization. (See “The Bank of England's Strategic Plan — One Bank, One Mission.”) Analytics also would help with a cultural issue at the Bank: openness.8 One of Carney’s goals was to make the Bank more transparent in its decision making. Data could help explain some of its thinking.

In his prior post at the Bank of Canada, Carney had established himself as one of the world’s top central bankers.9 Joining him in London was Charlotte Hogg, who left a post running Banco Santander SA’s UK retail operations to become the Bank of England’s first chief operating officer.10 She was part of a push by Carney to diversify the Bank, traditionally dominated by white men. Another powerful woman was Nemat “Minouche” Shafik, who was a deputy managing director at the International Monetary Fund until Britain’s Chancellor of the Exchequer, George Osborne, named her the Bank of England’s deputy governor, responsible for markets and banking, in March 2014.11 That made Shafik a member of the Bank’s nine-member Monetary Policy Committee, which sets interest rates and other monetary policy for the UK.

The Bank’s expanded remit, regulating the UK’s banks and insurers, was a return to part of its past. Regulating retail banks (though not insurers) had been part of its purview until 1997, when the Labour Party took power and decided to separate the Bank’s monetary and regulatory roles as part of giving the Bank control over monetary policy. (Previously, the government set monetary policy, meaning interest rate cuts often coincided with political need.)12 But now, overseeing these institutions meant running regular stress tests to assess their financial fitness under extreme conditions, a new and newly data-intensive mandate.13

The Bank of England has long been a trendsetter (see “A Brief History of the Bank of England”). It stands today as one of the world’s most influential central banks, not least because Britain is the world’s fifth-largest economy, with a $2.98 trillion gross domestic product.14 The UK is also an integral part of the $18.5 trillion European Union. The Bank of England plays a special role within the European Union, vis-à-vis the European Central Bank, because the UK is the only large European economy that is not part of the eurozone. While the UK is Europe’s third-most populous country with nearly 65 million inhabitants, it is Europe’s second-largest economy, and on a per-capita income basis is neck-and-neck with Germany, the most prosperous of the large European nations.15 London itself is one of the world’s most important financial centers.16

Opening Up the Bank

Carney and his five deputies,17 including Hogg, had developed the One Bank platform, meant to create a central bank that was both more diverse and more able to align around its goals. Hogg drove many of these sweeping changes in order to send a message: We are one bank, dedicated to the common good.

Hogg is steeped in the Bank; her first job was there in the early 1990s, after which she spent time in the U.S. at McKinsey & Company, Morgan Stanley, and its spinoff, Discover Financial Services, before returning to the UK to top positions at Experian and Santander. She knows where the saying “Not for ourselves, but for others” appears on the Bank’s elaborately tiled floors. She knows the messages central bankers are meant to draw from the myriad busts of gods and goddesses and images of white owls (the messengers of the gods). Her own office came with a bas-relief of Pegasus above the door, which for central bankers is meant to symbolize swiftness in making decisions for the good of the people. When she started, she asked that a framed copy of the Bank’s original 1694 charter be hung where she could see it from her desk, keeping in sight the Bank’s mission, which in colloquial terms is: “To promote the public good through financial and monetary stability.”

Her aim now is not so much to build a new Bank as to upgrade the current one. “It’s a matter of protecting and strengthening, enabling the heritage and responsibilities whilst losing some of the things that are past, legacy,” she says.

A big part of the upgrade process involves rethinking the way the Bank manages data. In the past, datasets were not always easy to find, so underlying the One Bank strategy are improved data sharing, IT, and analytics. The Bank set up a data council, initially chaired by Hogg and made up of senior-level officials interested in data, including the chief information officer (CIO), the CDO, the head of advanced analytics, and the head of statistics and regulatory data. The data council guides the Bank’s decisions about data, including its strategy for what to collect. “My argument is we cannot collect the world’s data,” says Hany Choueiri, the Bank’s first-ever CDO. He was hired in January 2015, reporting to the CIO. Choueiri was part of a push to remake IT from a service bureau to a driver of operational change when it comes to data.

To do big-data analytics requires a strong link between IT and the analytics units and tools. “The larger the datasets and the messier the datasets, the more important IT becomes,” says Paul Robinson, head of advanced analytics at the Bank. “Inappropriate or inadequate technology can lead to situations where people wait for hours to see results that shouldn’t take nearly so long.”

One novel way in which the Bank has used analytics techniques is to apply them to its own structure. Hogg wanted to see just how close the Bank of England was to acting as one bank. Personnel in the newly formed data lab took all the Bank’s Outlook communications, aggregated them, applied modeling tools to assess them, and then used a visualization tool to create a chart that displayed organizational communications at the Bank. (The data was not used to show which individuals were most connected, but how people within departments connect.) What emerged was what she calls a “Kandinsky chart” of the Bank, after the abstract artist and theorist who plotted the geometry of paintings. “It describes — in a way that nothing else could — how people are interacting in the Bank,” she says. “You could see who was really talking to whom, and whether that made sense.”
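
In outline, the exercise resembles standard network analysis of message metadata. The Python sketch below is a hypothetical reconstruction, assuming anonymized, department-level sender and recipient records and using the networkx and matplotlib libraries; it is not the data lab's actual tool, and the department names and message counts are invented.

```python
from collections import Counter

import matplotlib.pyplot as plt
import networkx as nx

# Hypothetical, anonymized records: (sender department, recipient department).
messages = [
    ("Monetary Analysis", "Financial Stability"),
    ("Financial Stability", "Supervision"),
    ("Monetary Analysis", "Financial Stability"),
    ("Markets", "Supervision"),
    ("Markets", "Monetary Analysis"),
]

# Aggregate traffic between departments, not between individuals.
edge_weights = Counter(tuple(sorted(pair)) for pair in messages)

G = nx.Graph()
for (a, b), weight in edge_weights.items():
    G.add_edge(a, b, weight=weight)

# Draw the departments; heavier lines mean more messages between them.
pos = nx.spring_layout(G, seed=42)
widths = [G[u][v]["weight"] for u, v in G.edges()]
nx.draw_networkx(G, pos, width=widths, node_color="lightsteelblue")
plt.axis("off")
plt.show()
```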

Chapter 3

A New Balance

Of course, the Bank of England has long used analytics. But prior to Carney’s arrival and its new supervisory mandates, the pace of analytics was not typically at the speed associated with big-data analytics, where large volumes of data can be gathered at frequencies approaching real time.

To help create datasets to support One Bank’s analytics goals, the Bank took its first-ever data inventory to see what kinds of datasets it had in house. Inventory “sounds quite boring,” says Hogg, “but it’s pretty fundamental. We need to know what we’ve got to know how to manage it.” Another reason the inventory was important: It would make it easier to aggregate datasets to help with policy decisions.

The inventory took most of a year and turned up nearly 1,000 datasets. Choueiri says he set up a data inventory tool to tag each dataset across a list of 14 categories, which are searchable on the Bank’s intranet. The inventory would make it clear which datasets can be used for which purposes; for instance, when the Bank collects data from an external source, the inventory also captures the purpose for which the Bank has agreed to use the data. The data inventory thus helps ensure that the Bank is compliant with legal restrictions on the data it has.
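
In spirit, such an inventory is a tagged catalogue in which each entry carries a permitted-use field. The sketch below illustrates the idea in Python; the record fields, tags, and entries are hypothetical and are not Choueiri's actual schema or his 14 categories.

```python
from dataclasses import dataclass, field


@dataclass
class DatasetRecord:
    name: str
    owner: str
    source: str                                       # internal, FCA, Land Registry, ...
    permitted_uses: list[str] = field(default_factory=list)
    tags: list[str] = field(default_factory=list)     # searchable categories


catalogue = [
    DatasetRecord("FCA product sales database", "Statistics", "FCA",
                  permitted_uses=["financial stability analysis"],
                  tags=["mortgages", "granular", "restricted"]),
    DatasetRecord("Land Registry price paid data", "Advanced Analytics", "Land Registry",
                  permitted_uses=["housing market analysis", "research"],
                  tags=["housing", "public"]),
]


def search(tag: str) -> list[str]:
    """Intranet-style lookup: which datasets carry this tag?"""
    return [r.name for r in catalogue if tag in r.tags]


def may_use(name: str, purpose: str) -> bool:
    """Compliance check: has the Bank agreed to use this dataset for this purpose?"""
    record = next((r for r in catalogue if r.name == name), None)
    return record is not None and purpose in record.permitted_uses


print(search("housing"))                                      # ['Land Registry price paid data']
print(may_use("FCA product sales database", "marketing"))     # False
```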

Analytics requires a balancing act of sorts at the Bank, given the different missions of the institution. Monetary policy and insurance regulation, for instance, use vastly different data and aim to accomplish different goals. While various parts of the Bank often need access to the same data, some of it has to be kept restricted for limited use because of regulatory provisions or because the Bank has agreed to use it for only certain purposes. But much of the data does not need to be restricted, and creating broader access can boost policy making because of reduced duplication of effort. Better policy making, in turn, expands the value of the information.

Along with the data inventory, the Bank’s IT department was also putting in place the tools and structures it wants for advanced, big data–style analytics. The datasets used for the Bank’s macroeconomic charter — measures like unemployment, consumer pricing, and productivity — are comprehensive for their purposes, but they are neither especially large nor available in anything approaching real time. “Historically, data collection has been very specific, with systems built for each one of the collections,” Choueiri says. The Bank is moving to more general tools to increase its flexibility, a move it is undertaking as part of the three-year One Bank data architecture program.

Next stages for data management will include building a data architecture to more effectively handle the various kinds of structured and especially unstructured data, such as text, that the Bank has or expects to get in order to help policy makers. And the Bank has worked to consolidate the use of tools for analyzing data and to move people off of Excel as the primary analytics tool. Reducing the number of specialized data tools in use at the Bank should make it easier for people from different parts of the Bank to share data and even work together on certain projects, with the end result being better policy decisions.

“This Stuff Is Brilliant”

Any time an organization tries to centralize control, it runs the risk of rebellion. Choueiri says he’s aware of CDOs who find themselves fighting pitched battles within their organization as they try to bring data together. He says the One Bank platform has largely helped him to avoid this at the Bank of England.

It helps, says Hogg, both that the Bank is analytically inclined by its nature and that people who work there do so out of a sense of public service. “I’ve found here that if what you’re doing is clearly in the interests of the mission of the institution, people tend to welcome it,” she says. “And this stuff is brilliant, right? I mean, your ability to be able to get a handle on different sources of data is really powerful, and people can see how that will benefit their work.”

Sujit Kapadia, head of research, is one of those beneficiaries. An economist who has been at the Bank since 2005, he says One Bank offers “a natural mechanism” for bringing together different perspectives from within the Bank. Adding this kind of diversity is valuable as the Bank looks to apply lessons learned in the 2008 economic crisis, which Kapadia says “caused us to rethink some of the conventional ways of approaching economics and finance and regulation.” There were also huge increases in the quantity and level of detail in the available data.

In July 2014, those factors all went into a workshop called “Big Data and Central Banks,” where the Bank brought together people from 20 central banks across the globe and external topic experts to discuss the impact of big data, defined as “datasets that are granular, high frequency, and/or non-numeric,” on central bank policy making.18 Breaking these characteristics down: granular means item-level (each loan or each security), high frequency means frequently updated, and non-numeric means data, such as text, drawn from widely varied sources. Historically, the Bank of England has used little in the way of big data; its datasets were typically highly structured and, when reported, arrived quarterly. But the Bank had been a fairly advanced adopter, for a central bank, of nontraditional data — for example, using Google data to look at housing and employment market conditions in 2011, examining the impact of high-frequency trading on stock markets by looking at equity transactions, and looking at credit swaps and liquidity management using high-frequency datasets.19

Such high-frequency datasets have not traditionally been in wide use for macroeconomic policy recommendations by the Bank. Speakers at the workshop (who were not identified by name) showed results from their work demonstrating that micro data could yield macro patterns, especially when visualization tools were used effectively.

For the Bank of England, with 300 years of macroeconomic data available on its website, the addition of much higher-frequency data represents an interesting development. It opens datasets that, for instance, can enhance understanding of how a monetary policy action like changing an interest rate affects the financial system.

Joy and Stress (Tests)

The mere creation of a chief data office sparked unusual emotion in some corners of the Bank. “I was whooping with joy, literally,” says Nathanael Benjamin, head of division for financial risk and resilience at the Bank. He knew that a CDO would give him easier access to the data and tools he needed to do his job. A major reason it mattered was the stress tests for banks and insurers. Stress tests use analytics to look at a bank’s financial structure and evaluate whether it could withstand different kinds of severe but plausible financial shocks: from short, sharp ones like a stock market crash to waves of bad economic developments that play out over months or even years. If it can’t, policy makers look at why, and tell the bank what it must do to prepare itself.
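
At its simplest, the question a stress test answers is whether capital stays above a floor after a scenario's losses are absorbed. The sketch below is a deliberately stylized illustration; the balance-sheet figures, loss numbers, and the 4.5% floor are assumptions for the example, not the Bank's models or thresholds.

```python
def stressed_capital_ratio(capital: float, risk_weighted_assets: float,
                           credit_losses: float, trading_losses: float) -> float:
    """Capital ratio left after absorbing a scenario's losses."""
    remaining_capital = capital - credit_losses - trading_losses
    return remaining_capital / risk_weighted_assets


# Hypothetical scenarios: a short, sharp shock and a drawn-out downturn (figures in billions).
scenarios = {
    "market crash":   {"credit_losses": 2.0, "trading_losses": 6.0},
    "slow recession": {"credit_losses": 9.0, "trading_losses": 1.5},
}

bank = {"capital": 16.0, "risk_weighted_assets": 160.0}   # stylized balance sheet
MINIMUM_RATIO = 0.045                                     # illustrative floor

for name, shock in scenarios.items():
    ratio = stressed_capital_ratio(bank["capital"], bank["risk_weighted_assets"], **shock)
    verdict = "passes" if ratio >= MINIMUM_RATIO else "needs remedial action"
    print(f"{name}: {ratio:.1%} -> {verdict}")
```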

The rise to prominence of stress tests was triggered by the 2008 crisis, and Benjamin was involved in the early days of this evolution due to his experience in quantitative risk analysis and in regulation. He was on temporary assignment to the Federal Reserve Bank of New York from 2008 to 2010, and took part in the very first supervisory stress tests of major U.S. banks. “That worked really well, but it was painful,” he says. “It was the first time we were asking firms for this type of data, and it was the first time the firms had to provide it to us — and even to themselves sometimes. We found ourselves in a lot of situations where firms weren’t able to get hold of the data in a timely manner and really struggled to drill down and aggregate that risk data.”

In the end, it worked. But Benjamin saw the need to manage — with conviction — a well-defined data strategy for the risk-related data relevant to stress testing. The Bank of England now has such a strategy. Although it involves a great deal more data collection than before, it is being carried out under a very different regime from the one in place in the U.S., where central banks tend to seek out and gather every morsel of data. At the Bank of England, the data will be big, but it won’t be all-encompassing.

For stress tests, “we’re trying to get the cut of the data that tells us what we need to know, but not necessarily much more,” says Benjamin. He says vacuuming up large quantities of this data is very resource-intensive, requiring processing, checking, translating, and analyzing. There can be diminishing returns in asking for more. “We are, on purpose, targeting a middle ground in terms of the data we ask for,” he says.

The Bank of England is now running stress tests concurrently at seven banks each year. When it started, it could only perform them sequentially and could only do two banks a year. The increase is valuable, not just for scope but also because concurrent stress tests provide a better idea of the overall strength of the banking sector and permit the consistent exercise of supervisory judgment through benchmarking. In short, these tests help regulators ask the right questions.

Advancing Analytics Across the Bank

Part of the charge for analytics has fallen to Andrew Haldane, who had been executive director for financial stability prior to Carney’s arrival and is now the Bank’s chief economist. Haldane is a prolific researcher who has established himself as a bold, almost maverick, economist, talking publicly about setting negative interest rates20 and replacing cash with digital currency. As part of Carney’s sweeping reorganization, announced in March 2014, Haldane swapped jobs with then-chief economist Spencer Dale.21 Haldane has embraced a cross-departmental structure and created the research unit that Kapadia heads. In this unit, five or six full-time employees are charged with working on cross-cutting research projects spanning all of the Bank’s responsibilities. Additionally, members of different departments at the Bank rotate in for various project periods, almost like research fellows. Haldane also brought Paul Robinson on board to build the advanced analytics unit — basically a center of analytics expertise within the Bank.

Robinson is another returnee, having left for several years for the private sector. When he arrived back at the Bank, the advanced analytics unit had just four people. Now, it has 12 to 13 people, mostly new. They come from nontraditional backgrounds like physics and computer science.

“We have lots of extremely sophisticated, very highly qualified, and highly numerate economists. We wanted to supplement them with people who had a different background and were used to modeling other sorts of phenomena,” Robinson says. Bringing in people from other spheres of knowledge also expanded techniques that could be used; for instance, among the techniques being used more frequently now are agent-based modeling and network analysis.
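
For a flavor of what network analysis adds, the toy example below traces a default cascade through an invented web of interbank exposures; it is a hypothetical sketch rather than a Bank model, and an agent-based approach would go further by giving each institution behavioral rules of its own.

```python
# Invented interbank exposures: lender -> {borrower: amount the borrower owes the lender}.
exposures = {
    "A": {"B": 5.0, "C": 3.0},
    "B": {"C": 4.0},
    "C": {"D": 6.0},
    "D": {},
}
capital = {"A": 4.0, "B": 3.0, "C": 2.0, "D": 1.0}


def cascade(initial_failure: str) -> set[str]:
    """Propagate losses: a failed bank wipes out what other banks lent to it."""
    failed = {initial_failure}
    changed = True
    while changed:
        changed = False
        for lender, loans in exposures.items():
            if lender in failed:
                continue
            loss = sum(amount for borrower, amount in loans.items() if borrower in failed)
            if loss >= capital[lender]:     # losses exceed the lender's capital buffer
                failed.add(lender)
                changed = True
    return failed


print(cascade("C"))   # {'A', 'B', 'C'}: C's failure topples B, then A, but not D
```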

The analysis underlying the recommendations set out in the June 2014 Financial Stability Report helped underscore that the new analytics wasn’t just a nice set of tools for the already analytical parts of the organization. It helped to show just what the Bank might be able to do with its new access to transaction data. And it reinforced the Bank’s efforts to improve cross-group work. The potential for improved policy decisions is obvious, as is the likelihood that the Bank will be able to respond more quickly to market events.

Unstructuring the Data

The Bank’s experience with analytics was largely derived from its use of structured data. It ran a creative experiment analyzing new kinds of unstructured data when Scottish voters were preparing to vote on whether to leave the United Kingdom in September 2014. One IT staff member built a feed from Twitter to look for signs of a potential run on Scottish banks.22

The feed looked for terms like “run” and financial institutions such as “RBS” (Royal Bank of Scotland) and the like. The Sunday before the referendum, Twitter saw a spike of “RBS” mentions. It turns out that it wasn’t a sign people were planning to flood the Royal Bank of Scotland the next morning, but that an American football game was starting — the mentions of “RBS” identified in the Twitter feed were references to running backs. The football players didn’t stiff-arm the whole test, however; there were enough relevant tweets to show that unstructured data sources could provide useful information to Bank policy makers if they needed to quickly respond to something — an important lesson about unstructured data. Plus, it hadn’t been hard to do; the IT developer who built the feed did it working at home, in his bedroom.
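
The published account gives the flavor of the feed rather than its code, but the screen can be sketched as a simple keyword filter. The Python below is a hypothetical reconstruction: the term lists, the football-noise rule, and the sample tweets are invented for illustration.

```python
BANKS = {"rbs", "royal bank of scotland", "lloyds", "barclays"}
RUN_TERMS = {"run", "withdraw", "queue", "cash out"}
FOOTBALL_NOISE = {"running back", "touchdown", "yards", "vikings"}


def looks_like_bank_run_chatter(tweet: str) -> bool:
    """Crude screen: bank name plus run-related language, minus obvious football talk."""
    text = tweet.lower()
    mentions_bank = any(term in text for term in BANKS)
    mentions_run = any(term in text for term in RUN_TERMS)
    is_football = any(term in text for term in FOOTBALL_NOISE)
    return mentions_bank and mentions_run and not is_football


tweets = [
    "Huge queue outside RBS this morning, people want to withdraw everything",
    "Ranking the top RBs this week: that running back went 60 yards, what a game",
]
print([looks_like_bank_run_chatter(t) for t in tweets])   # [True, False]
```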

Another experiment for analytics was to set up a Hadoop data framework, an open-source platform for handling large amounts of data on relatively inexpensive hardware. It was built as part of a data lab project meant to give the Bank an analytics sandbox to play in, a tool to experiment with cutting-edge analytics techniques on things like what an entire day’s trading records on a stock exchange might mean for bank stability.

Zinging results out of a Hadoop cluster sparked active debate within the traditional IT department. Some members raised valid concerns: The cluster didn’t have the typical IT controls; it was unclear how it would be secured or managed; and it wasn’t even clear who would handle system backups, since the cluster was set up outside of IT. This kind of discussion often takes place when an organization adopts a dual IT structure, adding a group for emerging technologies to run in parallel to the traditional organization. In this case, the issues were resolved by isolating the Hadoop cluster from key regulatory systems.

Overcoming Overfitting

Central banks deal not just in real-world economic conditions but also in theoretical scenarios and in rare events like major financial crises, making it harder to use actual circumstances to prove the models are accurate. Robinson calls this a key challenge for his unit, one that means the unit has to show rigor and robust explanations for its decisions. Organizations expanding into big-data analytics must have someone looking out for some decidedly abstract concerns, such as overfitting in the models. In overfitting, as the number of variables that might explain a set of observations increases, the chances grow that the models will come up with spurious relationships among the variables. “Then, as soon as you start using them outside the sample, they are utterly hopeless,” Robinson says.
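
The failure mode is easy to reproduce on synthetic data. The sketch below, a generic illustration unrelated to the Bank's models, fits polynomials of increasing degree to a noisy linear relationship and measures how badly each does outside the sample it was fitted on.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 30)
y = 2 * x + rng.normal(0, 0.2, size=x.shape)   # the true relationship is linear

train, test = slice(0, 20), slice(20, 30)       # fit on one stretch, test on another


def out_of_sample_error(degree: int) -> float:
    coeffs = np.polyfit(x[train], y[train], degree)   # fit within the sample...
    pred = np.polyval(coeffs, x[test])                # ...then predict outside it
    return float(np.mean((pred - y[test]) ** 2))


for degree in (1, 3, 9):
    print(degree, round(out_of_sample_error(degree), 3))
# The high-degree fit hugs the training points, but its out-of-sample error is
# typically far worse than the simple linear fit's.
```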

Choueiri says the Bank isn’t yet really doing big data. He says there’s a huge variety of data analyzed at the Bank, but the volumes are not yet anywhere near what the private sector examines. That will change. His big-data platform will launch in 2016 and run in parallel to the existing systems to make sure it’s ready to handle heavier data analytical workloads. The other twist with big data: as data gets more granular, it also demands speed, and that might mean a drop in accuracy. “We’re potentially getting away from the notion that data has to be 100% accurate,” says Choueiri. “What we cannot do is wait months to obtain a highly accurate dataset” in some instances. That’s a huge cultural shift for central bankers.

Chapter 4

Changing the Climate With Analytics

All the reorganization and the adoption of new tools have already sparked new kinds of analysis from the Bank. Some of it wades into uncharted territory for a central bank. The insurance supervision department, for instance, has been doing long-term scenario planning around climate change. Their work has led Carney to warn insurers that they may need to do more to protect themselves from related financial risks.

In a March 2015 speech to insurance market Lloyd’s of London, for instance, Carney presented data from a report warning that the frequency of catastrophic weather events was increasing, a trend that could affect insurers’ financial stability.23 He called on insurers to develop a disclosure committee that would push companies to disclose their exposure to climate change. He also called on companies to plan for the potential of a collapse in the fossil fuel industry, because of the risks such a scenario presents for insurers.

The potential effects of climate change are one way the Bank wants to use big-data analytics to address what Carney calls “the tragedy of the horizon,” the inability of private-sector firms to look more than two or three years ahead, and even of central banks like the UK’s to look out past a decade.24

Such uses of analytics have Hogg optimistic about its role at the Bank. While she cautions that “there’s still quite a long journey to go,” she’s seen over the last six months a jump in demand for analytics and datasets from various functional areas of the Bank. She thinks the housing crisis model, which combines granular data (that is, geographically aggregated data about individual loans) with macroeconomic policy data, “is perhaps where analytics is going to be the most powerful” for the Bank.

In a way, the Bank must become its own oracle — and the bet on analytics is that it will help make the policy game less opaque. In the case of the housing recommendation, the Bank was able to implement its limits on loan-to-income ratios in October 2014.25 Several months later, Parliament expanded the Bank’s powers in this area.26

Hogg says analytics helps bridge the macroeconomic question, “Is there a housing bubble?,” the narrower question of the balance sheets of individual banks, and the ultimate micro question about the level of debt of individuals. Other banks have weighed in, like UBS AG, the large private bank headquartered in Switzerland, which in October 2015 called London’s housing market the most overvalued in the world and said it was a bubble ripe for bursting.27 Hogg takes the comments in stride. “Pretty much everything we do, people will say something about us, and sometimes in a derisory way,” she says. Her own analysis is that the Bank is on a three-year journey toward remaking itself for this new set of responsibilities, and that analytics is its new weathervane for the economy.

References

1. D. Beckett, “Trends in the United Kingdom Housing Market, 2014,” September 22, 2014, www.ons.gov.uk; and H. Osborne, “UK House Price Rises For 2014 Almost Twice As High As Predicted,” Guardian, Aug. 26, 2014.

2. C. Berg, “The Global Financial Crisis and the Great Recession: Causes, Effects, Measures, and Consequences For Economic Analysis and Policy” (presentation at the Workshop on Monetary Policy, Macroprudential Policy and Fiscal Policy, London, May 17-19, 2011). Note especially: “However, it is important to note that the crisis was not triggered by the global imbalances as many observers, such as the IMF, had warned. The trigger was instead the downturn in the housing market in the U.S. and the panic and credit contraction in the financial system. It is also probable that some of the global imbalances were due to the US price rise in housing dampening household savings. In that way the housing price bubble probably helped to increase the U.S. current account deficit, which also means that the adjustment in the U.S. housing market should make some contribution to adjusting the global imbalances.” Also see A. Bennett, “World Must Act to Stop Another Massive Housing Crash, Warns IMF,” June 12, 2014, www.huffingtonpost.co.uk; and J. Titcomb, “Bank of England Inflation Report As It Happened,” Daily Telegraph, May 14, 2014 (note number of questions focused on housing).

3. “The Bank’s Strategic Plan: One Bank, One Mission,” n.d., www.bankofengland.co.uk.

4. “Financial Stability Report,” June 2014, www.bankofengland.co.uk.

5. F. Bermingham, “Carney: Bank of England At the Limit of Its Tolerance Over Housing Market,” International Business Times, June 26, 2014.

6. A variant comment from the Guardian was less aggressive: “Mark Carney’s been taking lessons from his chum Mario Draghi. The president of the European Central Bank has become a dab hand at getting what he wants just by talking tough. Carney is trying to turn the same trick with Britain’s housing market.” See L. Elliott, “Mark Carney’s Housing Pill Needs Time to Let Economy Digest It,” Guardian, June 26, 2014; for “paper tiger,” see S.P. Chan, “Bank of England Cracks Down On Mortgages,” Telegraph, June 26, 2014.

7. “Bank of England Financial Stability Report, As It Happened,” Guardian, June 26, 2014; also see J. Treanor and L. Elliott, “Bank Will Not Act On Housing Prices Yet, Says Carney,” Guardian, June 26, 2014.

8. See, for instance, L. Elliott and J. Treanor, “Inside the Bank of England,” Guardian, November 10, 2015.

9. K. Carmichael, S. Silcoff, and B. Erman, “How Mark Carney Became a Star Player In a Global Financial Arena,” Globe and Mail, November 30, 2012.

10. C. Giles, “Carney Picks Santander’s Charlotte Hogg For Bank of England Post,” Financial Times, June 18, 2013.

11. HM Treasury and G. Osborne, “Chancellor Announces Three Senior Bank of England Appointments,” news release, March 18, 2014, www.gov.uk. (Note: Dr. Shafik did not take on her role until August 1, 2014.)

12. L. Elliott and M. White, “Brown Gives Bank Independence to Set Interest Rates,” Guardian, May 7, 1997.

13. Stress testing emerged as a tool for risk management in the late 1990s but, as noted by Andrew Haldane in a 2009 speech, financial services firms used them more to manage regulation than to manage risk. Regular stress testing of UK banks by authorities was recommended by the Financial Policy Committee in March 2013. See A. Haldane, “Why Banks Failed the Stress Test” (speech given at the Marcus-Evans Conference on Stress-Testing, London, Feb. 13, 2009); and “Stress Testing,” 2013, www.bankofengland.co.uk.

14. World Bank, “Gross Domestic Product 2014,” Feb. 17, 2016, http://databank.worldbank.org.

15. This assertion excludes Russia and Turkey, which are transcontinental but would be listed as first and third if included in Europe.

16. Statista and Z/Yen, “Leading Financial Centres Globally as of June 2015,” n.d., www.statista.com.

17. “Governors,” n.d., www.bankofengland.co.uk.

18. D. Bholat, “Big Data and Central Banks,” November 10, 2014, www.bankofengland.co.uk.

19. N. McLaren and R. Shanbhogue, “Using Internet Search Data As Economic Indicators,” Bank of England Quarterly Bulletin 51, no. 2 (2011): 134-140; D. Pimlott and T. Bradshaw, “Bank of England Googles to Track Latest Trends,” Financial Times, June 13, 2011; E. Benos and S. Sagade, “High-Frequency Trading Behavior and Its Impact On Market Quality: Evidence From the UK Trading Market,” working paper no. 469, Bank of England, London, December 2012, www.bankofengland.co.uk; and E. Benos, A. Wetherilt, and F. Zikes, “The Structure and Dynamics of the UK Credit Default Swap Market,” Financial Stability Paper no. 25, Bank of England, London, November 2013, www.bankofengland.co.uk.

20. Negative interest rates involve a bank charging its depositors for holding their money; they are meant to encourage lending. The European Central Bank and central banks in Denmark, Sweden, and Switzerland all have negative interest rates, and in late January 2016, the Bank of Japan adopted them as well. See, for instance, J. Randow and S. Kennedy, “Negative Interest Rates: Less Than Zero,” March 18, 2016, www.bloombergview.com.

21. See, for instance, R. Peston, “All Change At the Bank of England,” March 18, 2014, www.bbc.com.

22. D. Bradnum, C. Lovell, P. Santos, and N. Vaughan, “Tweets, Runs, and the Minnesota Vikings,” August 18, 2015, http://bankunderground.co.uk.

23. Prudential Regulation Authority, “The Impact of Climate Change On the UK Insurance Sector,” September 2015, www.bankofengland.co.uk.

24. M. Carney, “Breaking the Tragedy of the Horizon: Climate Change and Financial Stability” (speech delivered at Lloyd’s of London, London, September 29, 2015).

25. Prudential Regulation Authority, “Implementing the Financial Policy Committee’s Recommendation On Loan to Income Ratios in Mortgage Lending,” consultation paper CP 11/14, Bank of England, London, June 2014, www.bankofengland.co.uk; and Prudential Regulation Authority, “Implementing the Financial Policy Committee’s Recommendation On Loan to Income Ratios in Mortgage Lending,” policy statement PS9/14, Bank of England, London, October 2014, www.bankofengland.co.uk.

26. HM Treasury, A. Leadsom, and G. Osborne, “Government Confirms New Powers For Bank of England to Guard Against Future Financial Risks,” news release, February 2, 2015, www.gov.uk.

27. “UBS Wealth Management Launches UBS Global Real Estate Bubble Index For Select Urban Housing Markets Worldwide; Most Cities Overvalued,” news release, October 29, 2015, https://www.ubs.com; see also, for instance, P. Gallagher, “London House Prices Are the Most Overvalued in the World, Report Says,” Independent, Oct. 29, 2015.

Reprint #:

57471
