Introduction
In June 2014, the Bank of England — one of the world’s oldest central banks — was preparing to announce its policy recommendations about the United Kingdom’s housing market. At the time, a dearth of new housing starts and a recovering economy were driving up housing prices, notably in London.1 This had raised concerns of a repeat of the market behaviors that had led to an economic crisis five years earlier.2 The Bank’s recommendations would be closely watched by the financial sector.
Several executives inside the Bank saw the policy recommendation as a watershed moment for the institution: It was one of the first times the Bank would make a major policy recommendation based in part on data from Britain’s Financial Conduct Authority (FCA), which was formed under the Financial Services Act 2012 as part of the UK’s response to its banking crisis during the recession. The FCA, which regulates the marketing of financial services products, has a memorandum of understanding to share data with the Bank of England.
Policy That Hits Home
In particular, the Bank was using microeconomic data to form a detailed picture of the UK’s housing market. It had aggregated transactional data at the level of the country’s various local authorities, like the boroughs in London. One of these datasets, the FCA’s product sales database, tracked every owner-occupier mortgage issued in the UK. Another was the Land Registry data, which included a housing price index and datasets with transaction data such as prices paid.
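To make the shape of that work concrete, here is a minimal sketch, in Python with pandas, of the kind of loan-level aggregation involved. The column names and figures are invented for illustration, not the Bank’s actual schema or pipeline.

```python
# A minimal sketch (not the Bank's actual pipeline) of rolling loan-level
# mortgage records up to local-authority level; all names and figures invented.
import pandas as pd

# Hypothetical loan-level records, in the spirit of a product sales database.
loans = pd.DataFrame({
    "local_authority": ["Camden", "Camden", "Hackney", "Leeds"],
    "loan_amount":     [420_000, 310_000, 280_000, 150_000],
    "borrower_income": [95_000, 70_000, 64_000, 52_000],
})

loans["loan_to_income"] = loans["loan_amount"] / loans["borrower_income"]

# Aggregate to one row per local authority, e.g. for comparison with a price index.
by_authority = loans.groupby("local_authority").agg(
    mortgages=("loan_amount", "count"),
    median_loan=("loan_amount", "median"),
    median_lti=("loan_to_income", "median"),
)
print(by_authority)
```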
Access to these datasets had enabled the Bank to refine its models of how housing market behavior influenced risks to lenders’ overall financial health. For instance, the Bank’s analysis showed that the UK’s local housing markets varied a great deal. While there were concerns that another housing bubble was inflating in London and other parts of the southern UK, most of the rest of the country was not seeing similar price increases.
But overall indebtedness was a concern that the Bank wanted to address — in particular, the pace of lending for mortgages with high loan-to-income ratios. (See “Analytics in Action.”) The number of these high loan-to-income mortgages was growing rapidly, and the Bank recommended that lenders limit them and apply a stress test to individual borrowers to see how they would fare if interest rates rose by as much as 3% over a five-year period.
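As a rough illustration of the two checks described above, the sketch below flags high loan-to-income mortgages against a hypothetical 4.5 cap and stresses a borrower’s repayments against a 3-percentage-point rate rise. All thresholds and figures are assumptions for illustration, not the Bank’s published rules.

```python
# Illustrative only: flag high loan-to-income mortgages and stress a borrower's
# repayments against a rate rise; the 4.5 cap and all figures are hypothetical.

def monthly_payment(principal: float, annual_rate: float, years: int = 25) -> float:
    """Standard repayment-mortgage annuity formula."""
    r = annual_rate / 12
    n = years * 12
    return principal * r / (1 - (1 + r) ** -n)

def assess_borrower(loan: float, income: float, current_rate: float,
                    lti_cap: float = 4.5, rate_shock: float = 0.03) -> dict:
    stressed = monthly_payment(loan, current_rate + rate_shock)
    return {
        "loan_to_income": round(loan / income, 2),
        "high_lti": loan / income > lti_cap,
        "stressed_payment_share_of_income": round(stressed / (income / 12), 2),
    }

print(assess_borrower(loan=300_000, income=60_000, current_rate=0.035))
```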
Pulling together the data for the housing policy recommendation required collaboration among many groups at the Bank; it represented one of the early triumphs of a new way of working together at the institution, which at times had struggled to bring diverse perspectives together. At one point, the only room large enough to accommodate the number of people collaborating was an underground chamber deep beneath the Bank’s home on Threadneedle Street in the center of London.
Driving this change in behavior was Mark Carney, the Bank’s Governor since July 2013, and the first non-British leader in its history. Soon after he arrived from the Bank of Canada, Carney had organized the Bank of England’s mission — maintaining monetary and financial stability for the good of the people of the UK — around a “One Bank” structure that encouraged staff to build on each other’s expertise.3
In a very tangible sense, the Bank was changing the way it behaved to take better advantage of the data to which it had access. In Carney’s first year, the Bank had established a high-level data council, set up a data lab, hired a chief operating officer, and formed a new advanced analytics unit. It was looking to hire its first-ever chief data officer (CDO) as well. Data had always played a key role in the Bank’s work, but to realize the full potential of its access to new data, the Bank was changing its structure, its behavior, and its approach to problem solving.
The housing market recommendations were part of the Bank’s June 2014 Financial Stability Report.4 At a testy press conference after it was released, Carney was peppered with questions about its recommendations, centering on the lack of immediate action to cool off the housing market. Carney argued that the Bank’s recommendations left room for banks to make some risky loans, which helps first-time buyers, among others, but that they also created what he called a “firebreak.”5
The response from the British press was lukewarm, perhaps because the press is largely based in London, a market that was seeing sharp rises in housing prices. One analyst called the Bank’s recommendations a “paper tiger.”6 But many more were willing to give the Bank’s moves the benefit of the doubt. The stock market responded by driving up homebuilder stock prices, and the head of the House of Commons Treasury Committee, Andrew Tyrie, said: “While apparently modest in its initial impact, it breaks new ground.”7
Winds of Change
The Financial Stability Report had come out of a period of extraordinary institutional change. The Bank had restructured in the wake of the global recession of 2008, and its expanded writ directly regulating Britain’s banks and insurers meant it had to get up to speed with the volumes of data this would require, especially for performing stress tests on the financial health of the companies. Analytics became one of the four pillars of its reorganization. (See “The Bank of England's Strategic Plan — One Bank, One Mission.”) Analytics also would help with a cultural issue at the Bank: openness.8 One of Carney’s goals was to make the Bank more transparent in its decision making. Data could help explain some of its thinking.
In his prior post at the Bank of Canada, Carney had established himself as one of the world’s top central bankers.9 Joining him in London was Charlotte Hogg, who left a post running Banco Santander SA’s UK retail operations to become the Bank of England’s first chief operating officer.10 She was part of a push by Carney to diversify the Bank, traditionally dominated by white men. Another powerful woman was Nemat “Minouche” Shafik, who was a deputy managing director at the International Monetary Fund until Britain’s Chancellor of the Exchequer, George Osborne, named her the Bank of England’s deputy governor, responsible for markets and banking, in March 2014.11 That made Shafik a member of the Bank’s nine-member Monetary Policy Committee, which sets interest rates and other monetary policy for the UK.
The Bank’s expanded remit, regulating the UK’s banks and insurers, was a return to part of its past. Regulating retail banks (though not insurers) had been part of its purview until 1997, when the Labour Party took power and decided to separate the Bank’s monetary and regulatory roles as part of giving the Bank control over monetary policy. (Previously, the government set monetary policy, meaning interest rate cuts often coincided with political need.)12 But now, overseeing these institutions meant running regular stress tests to assess their financial fitness under extreme conditions, a new and newly data-intensive mandate.13
The Bank of England has long been a trendsetter (see “A Brief History of the Bank of England”). It stands today as one of the world’s most influential central banks, not least because Britain is the world’s fifth-largest economy, with a $2.98 trillion gross domestic product.14 The UK is also an integral part of the $18.5 trillion European Union. The Bank of England plays a special role within the European Union, vis-à-vis the European Central Bank, because the UK is the only large European economy that is not part of the eurozone. While the UK is Europe’s third-most populous country with nearly 65 million inhabitants, it is Europe’s second-largest economy, and on a per-capita income basis is neck-and-neck with Germany, the most prosperous of the large European nations.15 London itself is one of the world’s most important financial centers.16
Opening Up the Bank
Carney and his five deputies,17 including Hogg, had developed the One Bank platform, meant to create a central bank that was both more diverse and more able to align around its goals. Hogg drove many of these sweeping changes in order to send a message: We are one bank, dedicated to the common good.
Hogg is steeped in the Bank; her first job was there in the early 1990s, after which she spent time in the U.S. at McKinsey & Company, Morgan Stanley, and its spinoff, Discover Financial Services, before returning to the UK for top positions at Experian and Santander. She knows where the saying “Not for ourselves, but for others” is on the Bank’s elaborately tiled floors. She knows the messages central bankers are meant to draw from the myriad busts of gods and goddesses and images of white owls (the messengers of the gods). Her own office came with a bas-relief of Pegasus above the door, and she knows that for central bankers, it was meant to symbolize swiftness in making decisions for the good of the people. When she started, she asked that a framed copy of the Bank’s original 1694 charter be hung where she could see it from her desk, a reminder of the Bank’s mission, which in colloquial terms is: “To promote the public good through financial and monetary stability.”
Her aim now is not so much to build a new Bank as to upgrade the current one. “It’s a matter of protecting and strengthening, enabling the heritage and responsibilities whilst losing some of the things that are past, legacy,” she says.
A big part of the upgrade process involves rethinking the way the Bank manages data. In the past, datasets were not always easy to find, so underlying the One Bank strategy are improved data sharing, IT, and analytics. The Bank set up a data council, initially chaired by Hogg and made up of senior-level officials interested in data, including the chief information officer (CIO), the CDO, the head of advanced analytics, and the head of statistics and regulatory data. The data council guides the Bank’s decisions about data, including its strategy for what to collect. “My argument is we cannot collect the world’s data,” says Hany Choueiri, the Bank’s first-ever CDO. He was hired in January 2015, reporting to the CIO. Choueiri was part of a push to remake IT from a service bureau to a driver of operational change when it comes to data.
To do big-data analytics requires a strong link between IT and the analytics units and tools. “The larger the datasets and the messier the datasets, the more important IT becomes,” says Paul Robinson, head of advanced analytics at the Bank. “Inappropriate or inadequate technology can lead to situations where people wait for hours to see results that shouldn’t take nearly so long.”
One novel way in which the Bank has used analytics techniques is to apply them to its own structure. Hogg wanted to see just how close the Bank of England was to acting as one bank. Personnel in the newly formed data lab took all the Bank’s Outlook communications, aggregated them, applied modeling tools to assess them, and then used a visualization tool to create a chart that displayed organizational communications at the Bank. (The data was not used to show which individuals were most connected, but how people within departments connect.) What emerged was what she calls a “Kandinsky chart” of the Bank, after the abstract artist and theorist who plotted the geometry of paintings. “It describes — in a way that nothing else could — how people are interacting in the Bank,” she says. “You could see who was really talking to whom, and whether that made sense.”
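For readers curious what such an analysis looks like in practice, here is a minimal sketch, on invented data, of building a department-level communication network from anonymized message metadata. It is illustrative only, not the data lab’s actual code, and the department names are made up.

```python
# A minimal sketch of department-level communication analysis on invented data:
# count messages between departments, build a weighted graph, inspect centrality.
import networkx as nx
import pandas as pd

# Hypothetical, anonymized email metadata: one row per message.
messages = pd.DataFrame({
    "sender_dept":   ["Markets", "Markets", "Statistics", "Supervision", "Markets"],
    "receiver_dept": ["Statistics", "Supervision", "Markets", "Statistics", "Statistics"],
})

# Weighted edges: how often each pair of departments exchanges messages.
edges = (messages.groupby(["sender_dept", "receiver_dept"])
                 .size().reset_index(name="messages"))

G = nx.Graph()
for _, row in edges.iterrows():
    a, b, w = row["sender_dept"], row["receiver_dept"], row["messages"]
    if G.has_edge(a, b):
        G[a][b]["weight"] += w
    else:
        G.add_edge(a, b, weight=w)

# Which departments sit at the center of the Bank's conversations?
print(nx.degree_centrality(G))
```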
A New Balance
Of course, the Bank of England has long used analytics. But prior to Carney’s arrival and its new supervisory mandates, the pace of analytics was not typically at the speed associated with big-data analytics, where large volumes of data can be gathered at frequencies approaching real time.
To help create datasets to support One Bank’s analytics goals, the Bank took its first-ever data inventory to see what kinds of datasets it had in house. Inventory “sounds quite boring,” says Hogg, “but it’s pretty fundamental. We need to know what we’ve got to know how to manage it.” Another reason the inventory was important: It would make it easier to aggregate datasets to help with policy decisions.
The inventory took most of a year and turned up nearly 1,000 datasets. Choueiri says he set up a data inventory tool to tag each dataset across a list of 14 categories, which are searchable on the Bank’s intranet. The inventory would make it clear which datasets can be used for which purposes; for instance, when the Bank collects data from an external source, the inventory also captures the purpose for which the Bank has agreed to use the data. The data inventory thus helps ensure that the Bank is compliant with legal restrictions on the data it has.
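A rough sketch of how such an inventory entry might be represented and searched appears below. The record fields and tags are hypothetical, since the Bank’s 14 categories are not spelled out here; only the idea of recording permitted purposes alongside each dataset comes from the text above.

```python
# A minimal sketch of a searchable data inventory; fields and tags are invented.
from dataclasses import dataclass, field

@dataclass
class DatasetRecord:
    name: str
    source: str
    permitted_purposes: list = field(default_factory=list)
    tags: dict = field(default_factory=dict)   # e.g. owner, frequency, sensitivity

inventory = [
    DatasetRecord("FCA product sales database", "FCA",
                  permitted_purposes=["financial stability analysis"],
                  tags={"frequency": "quarterly", "sensitivity": "restricted"}),
    DatasetRecord("Land Registry price paid data", "Land Registry",
                  permitted_purposes=["housing market analysis"],
                  tags={"frequency": "monthly", "sensitivity": "public"}),
]

def search(term: str):
    term = term.lower()
    return [r for r in inventory
            if term in r.name.lower() or term in str(r.tags).lower()]

for record in search("housing"):
    print(record.name, "->", record.permitted_purposes)
```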
Analytics requires a balancing act of sorts at the Bank, given the different missions of the institution. Monetary policy and insurance regulation, for instance, use vastly different data and aim to accomplish different goals. While various parts of the Bank often need access to the same data, some of it has to be kept restricted to limited uses because of regulatory provisions or because the Bank has agreed to use it for only certain purposes. But much of the data does not need to be restricted, and creating broader access can boost policy making because of reduced duplication of effort. Better policy making expands the value of the information.
Along with the data inventory, the Bank’s IT department was also putting in place the tools and structures they want for advanced, big data–style analytics. The datasets used for the Bank’s macroeconomic charter — measures like unemployment, consumer pricing, and productivity — are comprehensive for their purposes, but are neither especially large nor do they operate in anything approaching real time. “Historically, data collection has been very specific, with systems built for each one of the collections,” Choueiri says. The Bank is moving to more general tools to increase its flexibility, a move it is undertaking as part of the three-year One Bank data architecture program.
Next stages for data management will include building a data architecture to more effectively handle the various kinds of structured and especially unstructured data, such as text, that the Bank has or expects to get in order to help policy makers. And the Bank has worked to consolidate the use of tools for analyzing data and to move people off of Excel as the primary analytics tool. Reducing the number of specialized data tools in use at the Bank should make it easier for people from different parts of the Bank to share data and even work together on certain projects, with the end result being better policy decisions.
“This Stuff Is Brilliant”
Any time an organization tries to centralize control, it runs the risk of rebellion. Choueiri says he’s aware of CDOs who find themselves fighting pitched battles within their organization as they try to bring data together. He says the One Bank platform has largely helped him to avoid this at the Bank of England.
It helps, says Hogg, both that the Bank is analytically inclined by its nature and that people who work there do so out of a sense of public service. “I’ve found here that if what you’re doing is clearly in the interests of the mission of the institution, people tend to welcome it,” she says. “And this stuff is brilliant, right? I mean, your ability to be able to get a handle on different sources of data is really powerful, and people can see how that will benefit their work.”
Sujit Kapadia, head of research, is one of those beneficiaries. An economist who has been at the Bank since 2005, he says One Bank offers “a natural mechanism” for bringing together different perspectives from within the Bank. Adding this kind of diversity is valuable as the Bank looks to apply lessons learned in the 2008 economic crisis, which Kapadia says “caused us to rethink some of the conventional ways of approaching economics and finance and regulation.” There were also huge increases in the quantity and level of detail in the available data.
In July 2014, those factors all went into a workshop called “Big Data and Central Banks,” where the Bank brought together people from 20 central banks across the globe and external topic experts to discuss the impact of big data, defined as “datasets that are granular, high frequency, and/or non-numeric,” on central bank policy making.18 Breaking these characteristics down, granular means item-level (for each loan or each security), high frequency means frequently updated, and non-numeric means data, such as text, drawn from widely varied sources. Historically, the Bank of England has used little in the way of big data; its datasets were typically highly structured and, where reported, arrived quarterly. But the Bank had been a fairly advanced adopter, for a central bank, of nontraditional data — for example, using Google data to look at housing and employment market conditions in 2011, examining the impact of high-frequency trading on stock markets by looking at equity transactions, and looking at credit swaps and liquidity management using high-frequency datasets.19
Such high-frequency datasets have not traditionally been in wide use for macroeconomic policy recommendations by the Bank. Speakers at the workshop (who were not identified by name) showed results from their work demonstrating that micro data could yield macro patterns, especially when visualization tools were used effectively.
For the Bank of England, with 300 years of macroeconomic data available on its website, the addition of much higher-frequency data represents an interesting development. It opens datasets that, for instance, can enhance understanding of how a monetary policy action like changing an interest rate affects the financial system.
Joy and Stress (Tests)
The mere creation of a chief data office sparked unusual emotion in some corners of the Bank. “I was whooping with joy, literally,” says Nathanael Benjamin, head of division for financial risk and resilience at the Bank. He knew that a CDO would give him easier access to the data and tools he needed to do his job. A major reason that mattered was the stress tests for banks and insurers. Stress tests use analytics to look at a bank’s financial structure and evaluate whether it could withstand different kinds of severe but plausible financial shocks: from short, sharp ones like a stock market crash to waves of bad economic developments that play out over months or even years. If it can’t, policy makers look at why, and tell the bank what it must do to prepare itself.
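In highly simplified form, a stress test of this kind amounts to applying scenario loss rates to a balance sheet and checking the resulting capital ratio, as in the sketch below. Every figure, threshold, and asset class here is invented; real supervisory stress tests are vastly more detailed.

```python
# A highly simplified, illustrative stress test: apply hypothetical loss rates
# from a severe-but-plausible scenario to a stylized balance sheet and check
# whether the capital ratio stays above a threshold. All figures are invented.

def stress_test(assets: dict, capital: float, scenario: dict,
                min_capital_ratio: float = 0.045) -> dict:
    losses = sum(assets[k] * scenario.get(k, 0.0) for k in assets)
    stressed_capital = capital - losses
    stressed_assets = sum(assets.values()) - losses
    ratio = stressed_capital / stressed_assets
    return {"losses": round(losses, 1),
            "capital_ratio": round(ratio, 3),
            "passes": ratio >= min_capital_ratio}

balance_sheet = {"mortgages": 200.0, "corporate_loans": 120.0, "trading_book": 80.0}
housing_crash = {"mortgages": 0.05, "corporate_loans": 0.03, "trading_book": 0.10}

print(stress_test(balance_sheet, capital=30.0, scenario=housing_crash))
```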
The rise to prominence of stress tests was triggered by the 2008 crisis, and Benjamin was involved in the early days of this evolution due to his experience in quantitative risk analysis and in regulation. He was on temporary assignment to the Federal Reserve Bank of New York from 2008 to 2010, and took part in the very first supervisory stress tests of major U.S. banks. “That worked really well, but it was painful,” he says. “It was the first time we were asking firms for this type of data, and it was the first time the firms had to provide it to us — and even to themselves sometimes. We found ourselves in a lot of situations where firms weren’t able to get hold of the data in a timely manner and really struggled to drill down and aggregate that risk data.”
In the end, it worked. But Benjamin saw the need to manage — with conviction — a well-defined data strategy for the risk-related data relevant to stress testing. The Bank of England now has such a strategy. Although it involves a great deal more data collection than before, it is being carried out under a very different regime from the one in place in the U.S., where central banks tend to seek out and gather every morsel of data. At the Bank of England, the data will be big, but it won’t be all-encompassing.
For stress tests, “we’re trying to get the cut of the data that tells us what we need to know, but not necessarily much more,” says Benjamin. He says vacuuming up large quantities of this data is very resource-intensive, requiring processing, checking, translating, and analyzing. There can be diminishing returns in asking for more. “We are, on purpose, targeting a middle ground in terms of the data we ask for,” he says.
The Bank of England is now running stress tests concurrently on seven banks each year. When it started, it could perform them only sequentially, on two banks a year. The increase is valuable, not just for scope but also because concurrent stress tests provide a better idea of the overall strength of the banking sector and permit the consistent exercise of supervisory judgment through benchmarking. In short, these tests help regulators ask the right questions.
Advancing Analytics Across the Bank
Part of the charge for analytics has fallen to Andrew Haldane, who had been executive director for financial stability prior to Carney’s arrival and is now the Bank’s chief economist. Haldane is a prolific researcher who has established himself as a bold, almost maverick, economist, talking publicly about setting negative interest rates20 and replacing cash with digital currency. As part of Carney’s sweeping reorganization, announced in March 2014, Haldane swapped jobs with then-chief economist Spencer Dale.21 Haldane has embraced a cross-departmental structure and created the research unit that Kapadia heads. In this unit, five or six full-time employees are charged with working on cross-cutting research projects spanning all of the Bank’s responsibilities. Additionally, members of different departments at the Bank rotate in for various project periods, almost like research fellows. Haldane also brought Paul Robinson on board to build the advanced analytics unit — basically a center of analytics expertise within the Bank.
Robinson is another returnee, having left the Bank for several years to work in the private sector. When he arrived back at the Bank, the advanced analytics unit had just four people. Now, it has 12 to 13 people, mostly new. They come from nontraditional backgrounds like physics and computer science.
“We have lots of extremely sophisticated, very highly qualified, and highly numerate economists. We wanted to supplement them with people who had a different background and were used to modeling other sorts of phenomena,” Robinson says. Bringing in people from other spheres of knowledge also expanded techniques that could be used; for instance, among the techniques being used more frequently now are agent-based modeling and network analysis.
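To give a flavor of what agent-based modeling means in this context, the toy sketch below lets thousands of heterogeneous borrower agents decide whether to take on a mortgage as interest rates change, so that aggregate credit demand emerges from individual behavior. It is purely illustrative, with invented parameters, and is not a model the Bank uses.

```python
# A toy agent-based sketch: heterogeneous borrowers decide whether a mortgage
# is affordable as rates change; aggregate demand emerges from micro behavior.
import random

random.seed(1)

class Borrower:
    def __init__(self):
        self.income = random.uniform(25_000, 90_000)
        self.max_payment_share = random.uniform(0.25, 0.40)  # tolerance varies

    def wants_loan(self, loan: float, rate: float, years: int = 25) -> bool:
        r, n = rate / 12, years * 12
        payment = loan * r / (1 - (1 + r) ** -n)
        return payment <= self.max_payment_share * self.income / 12

agents = [Borrower() for _ in range(10_000)]
for rate in (0.02, 0.035, 0.05, 0.065):
    demand = sum(a.wants_loan(loan=4 * a.income, rate=rate) for a in agents)
    print(f"rate {rate:.1%}: {demand} of {len(agents)} agents would borrow")
```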
The analysis underlying the recommendations set out in the June 2014 Financial Stability Report helped underscore that the new analytics wasn’t just a nice set of tools for the already analytical parts of the organization. It helped to show just what the Bank might be able to do with its new access to transaction data. And it reinforced the Bank’s efforts to improve cross-group work. The potential for improved policy decisions is obvious, as is the likelihood that the Bank will be able to respond more quickly to market events.
Unstructuring the Data
The Bank’s experience with analytics was largely derived from its use of structured data. It ran a creative experiment analyzing new kinds of unstructured data when Scottish voters were preparing to vote on whether to leave the United Kingdom in September 2014. One IT staff member built a feed from Twitter to look for signs of a potential run on Scottish banks.22
The feed looked for terms like “run” and financial institutions such as “RBS” (Royal Bank of Scotland) and the like. The Sunday before the referendum, Twitter saw a spike of “RBS” mentions. It turns out that it wasn’t a sign people were planning to flood the Royal Bank of Scotland the next morning, but that an American football game was starting — the mentions of “RBS” identified in the Twitter feed were references to running backs. The football players didn’t stiff-arm the whole test, however; there were enough relevant tweets to show that unstructured data sources could provide useful information to Bank policy makers if they needed to quickly respond to something — an important lesson about unstructured data. Plus, it hadn’t been hard to do; the IT developer who built the feed did it working at home, in his bedroom.
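The logic of such a filter is simple enough to sketch in a few lines, with the lesson of the false spike applied: screen out obviously irrelevant contexts before counting mentions. The keyword lists and tweets below are invented, and the Bank’s real feed was presumably more sophisticated.

```python
# A minimal keyword filter with crude context screening; all terms and tweets
# are invented for illustration, not taken from the Bank's actual feed.
BANK_TERMS = {"rbs", "royal bank of scotland", "bank run", "withdraw"}
NOISE_TERMS = {"running back", "touchdown", "yards", "nfl"}

def relevant(tweet: str) -> bool:
    text = tweet.lower()
    if any(noise in text for noise in NOISE_TERMS):
        return False
    return any(term in text for term in BANK_TERMS)

tweets = [
    "Queueing outside RBS first thing tomorrow to withdraw everything",
    "That RBS had a monster game, 120 yards and two touchdowns",
    "Is a bank run on Scottish banks actually plausible this week?",
]
print([t for t in tweets if relevant(t)])
```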
Another experiment for analytics was to set up a Hadoop data framework, an open-source platform for handling large amounts of data on relatively inexpensive hardware. It was built as part of a data lab project meant to give the Bank an analytics sandbox to play in, a tool to experiment with cutting-edge analytics techniques on things like what an entire day’s trading records on a stock exchange might mean for bank stability.
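A sandbox query of that sort might look roughly like the following PySpark sketch, one common way of working against a Hadoop-backed cluster. The file path, column names, and numbers are assumptions for illustration, not the Bank’s actual setup.

```python
# A minimal sketch of a sandbox query over a day's trading records stored on a
# Hadoop-style cluster; path and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("trading-day-sandbox").getOrCreate()

# One day's trade records, e.g. held on the cluster's distributed file system.
trades = spark.read.csv("hdfs:///sandbox/trades_2014-06-13.csv",
                        header=True, inferSchema=True)

# Which instruments saw the heaviest turnover, and how wide were price swings?
summary = (trades.groupBy("instrument")
                 .agg(F.sum("quantity").alias("volume"),
                      F.min("price").alias("low"),
                      F.max("price").alias("high")))

summary.orderBy(F.desc("volume")).show(10)
```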
Zinging results out of a Hadoop cluster sparked active debate within the traditional IT department. Some members raised valid concerns: The cluster didn’t have the typical IT controls; it was unclear how it would be secured or managed; and it wasn’t even clear who would handle system backups, since the cluster was set up outside of IT. This kind of discussion often takes place when an organization adopts a dual IT structure, adding a group for emerging technologies to run in parallel to the traditional organization. In this case, the issues were resolved by isolating the Hadoop cluster from key regulatory systems.
Overcoming Overfitting
Central banks deal not just in real-world economic conditions but also in theoretical scenarios and in rare events like major financial crises, making it harder to use actual circumstances to prove the models are accurate. Robinson calls this a key challenge for his unit, one that means the unit has to show rigor and robust explanations for its decisions. Organizations expanding into big-data analytics must have someone looking out for some decidedly abstract concerns, such as overfitting in the models. In overfitting, as the number of variables that might explain a set of observations increases, the chances grow that the models will come up with spurious relationships among the variables. “Then, as soon as you start using them outside the sample, they are utterly hopeless,” Robinson says.
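Overfitting is easy to demonstrate: fit a linear model to pure noise with many candidate variables and few observations, and the in-sample fit looks impressive while the out-of-sample fit collapses, as in the sketch below. All data here is random by construction, so any apparent relationship is spurious.

```python
# Demonstration of overfitting: many variables, few observations, pure noise.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n_train, n_test, n_features = 60, 60, 50

X_train = rng.normal(size=(n_train, n_features))
X_test = rng.normal(size=(n_test, n_features))
y_train = rng.normal(size=n_train)   # unrelated to the features by design
y_test = rng.normal(size=n_test)

model = LinearRegression().fit(X_train, y_train)
print("in-sample R^2:     ", round(model.score(X_train, y_train), 2))  # spuriously high
print("out-of-sample R^2: ", round(model.score(X_test, y_test), 2))    # near zero or negative
```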
Choueiri says the Bank isn’t yet really doing big data. He says there’s a huge variety of data analyzed at the Bank, but the volumes are not yet anywhere near what the private sector examines. That will change. His big-data platform will launch in 2016 and run in parallel with the existing systems to make sure it’s ready to handle heavier data analytical workloads. The other twist with big data: as data gets more granular, it also demands speed, which may mean accepting a drop in accuracy. “We’re potentially getting away from the notion that data has to be 100% accurate,” says Choueiri. “What we cannot do is wait months to obtain a highly accurate dataset” in some instances. That’s a huge cultural shift for central bankers.
Changing the Climate With Analytics
All the reorganization and the adoption of new tools have already sparked new kinds of analysis from the Bank. Some of it wades into uncharted territory for a central bank. The insurance supervision department, for instance, has been doing long-term scenario planning around climate change. Its work has led Carney to warn insurers that they may need to do more to protect themselves from related financial risks.
In a March 2015 speech to the insurance market Lloyd’s of London, Carney presented data from a report warning that the frequency of catastrophic weather events was increasing, a trend that could affect insurers’ financial stability.23 He called on insurers to develop a disclosure committee that would push companies to disclose their exposure to climate change. He also called on companies to plan for the potential of a collapse in the fossil fuel industry, because of the risks such a scenario presents for insurers.
The potential effects of climate change are one way the Bank wants to use big-data analytics to address what Carney calls “the tragedy of the horizon,” the inability of private-sector firms to look more than two or three years ahead, and even of central banks like the UK’s to look out past a decade.24
Such uses of analytics have Hogg optimistic about its role at the Bank. While she cautions that “there’s still quite a long journey to go,” she’s seen over the last six months a jump in demand for analytics and datasets from various functional areas of the Bank. She thinks the housing crisis model, which combines granular data (that is, geographically aggregated data about individual loans) with macroeconomic policy data, “is perhaps where analytics is going to be the most powerful” for the Bank.
In a way, the Bank must become its own oracle — and the bet on analytics is that it will help make the policy game less opaque. In the case of the housing recommendation, the Bank was able to implement its limits on loan-to-income ratios in October 2014.25 Several months later, Parliament expanded the Bank’s powers in this area.26
Hogg says analytics helps bridge the macroeconomic question, “Is there a housing bubble?,” the intermediate question of the balance sheets of individual banks, and the ultimate micro question about the level of debt for individuals. Other banks have weighed in, like UBS AG, the large private bank headquartered in Switzerland, which in October 2015 called London’s housing market the most overvalued in the world and said it was a bubble ripe for bursting.27 Hogg takes the comments in stride. “Pretty much everything we do, people will say something about us, and sometimes in a derisory way,” she says. Her own analysis is that the Bank is on a three-year journey toward remaking itself for this new set of responsibilities, and that analytics is its new weathervane for the economy.