Session 2 - New data needs for financial stability
Formal Metadata
Number of Parts: 5
License: CC Attribution 3.0 Unported: You are free to use, adapt and copy, distribute and transmit the work or content in adapted or unchanged form for any legal purpose as long as the work is attributed to the author in the manner specified by the author or licensor.
Identifier: 10.5446/14049 (DOI)
Transcript: English (auto-generated)
00:00
ECB and particularly to the statistics team and to Aurel for inviting me to be chair of this panel. It's a real honor, and I think it's interesting that you're marking 20 years of ECB statistics. Those conferences tend to be somewhat retrospective, but I think it's nice that we actually turned it around and said let's look to the future, and I think we're covering
00:24
several areas of the demand side for statistics, which are very relevant for central bankers. We talked about monetary policy, and now we talk about financial stability. I think the overarching aim is to better assess systemic risks, not only in a monitoring
00:43
perspective but also as an input into policy making to calibrate various and design various policy tools which are increasingly being rolled out both in Europe and globally. I think there's a sense that supervisory data and let's say traditional financial sector
01:02
data, the MFI, aggregated MFI perspective, for instance, is not enough. We need to better understand linkages, linkages between the financial sector and the real economy and also linkages within the financial sector to understand exactly how shocks get transmitted and where the real fragilities might lie.
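As a stylised illustration of the kind of within-system linkage that transmits shocks, consider a toy loss cascade over a hypothetical interbank exposure matrix. All names and figures below are invented, and real network models (e.g. Eisenberg-Noe clearing models) are far richer; this is only a sketch of the mechanism:

```python
# Toy contagion sketch: when a bank's capital is wiped out, its creditors
# write down their claims on it, which can wipe them out in turn.
# All exposures and capital figures are hypothetical.

def cascade(exposures, capital, shocked, loss_given_default=1.0):
    """exposures[i][j] = amount bank i has lent to bank j.
    Returns the sorted list of banks that fail after `shocked` defaults."""
    n = len(capital)
    capital = list(capital)          # work on a copy
    failed = set()
    capital[shocked] = 0.0           # initial shock wipes out one bank
    frontier = {shocked}
    while frontier:
        failed |= frontier
        nxt = set()
        for j in frontier:           # each newly failed debtor j ...
            for i in range(n):       # ... imposes losses on its creditors i
                if i in failed:
                    continue
                capital[i] -= loss_given_default * exposures[i][j]
                if capital[i] <= 0:
                    nxt.add(i)
        frontier = nxt - failed
    return sorted(failed)

E = [[0, 0, 5],   # bank 0 has lent 5 to bank 2
     [2, 0, 0],   # bank 1 has lent 2 to bank 0
     [0, 0, 0]]
capital = [4, 10, 3]
failed = cascade(E, capital, shocked=2)  # bank 2's default drags down bank 0
```

Without the granular who-to-whom exposure data, the second-round failure (bank 0) is invisible to an aggregate view.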
01:24
There's I think a need to cut the cake in different ways. We can have a national view, but there's also, since we're dealing with cross-border institutions, you need to view these institutions from a slightly broader perspective. There's the issue of non-bank financial intermediation.
01:41
We're now talking about things like crowdfunding, fintech, new forms of funds and so on, which, if you just focus on the narrow MFI or banking-sector view, is going to leave important issues on the table. There seem to be specific needs, I think, also from the borrower-based requirements that are being implemented. Some of the speakers will address this point about new data, and I think we had actually in
02:06
Estonia some good experience when we were rolling out these borrower-based requirements. We wanted to establish the instrument, but we didn't want to do it in a binding way, so you had to calibrate it to be pretty much in line with the practice in the market at the time.
02:22
And only through using this fairly granular view was that possible. And I think the rollout went pretty well. And as was discussed in the first panel, we're going to repeat a lot of things already, the greater interest in granular data, and as many speakers mentioned this issue about benefits versus costs, or maybe the data needs versus data wishes issue, I think, for
02:45
analysts and people, if you say, would a new data series be useful? Of course, since the price is zero, the demand can be almost infinite. And somebody somewhere will always say it's a great idea to have an additional data source, but the sector, I guess, is already chafing under quite a bit of reporting responsibility.
03:05
Sometimes one thinks reporting almost becomes as important as the core business for a financial institution. So finding that balance is very important. I think the first session put some ideas on the table. So to discuss these issues, we have a great and diverse panel.
03:23
I won't go long into the bios, because I think most people here know, are familiar with the speakers. We'll start with Philip Lane, Governor of Central Bank of Ireland, and more recently, the Chair of the Advisory Technical Committee of the ESRB, and Richard Berner is Executive-in-Residence
03:40
and Adjunct Professor at the New York University Stern School. Third, then, Luiz Pereira da Silva, Deputy General Manager of the BIS. And as discussant, we have Hans-Helmut Kotz, who divides his time between Frankfurt and Cambridge, Massachusetts, between Goethe and Harvard Universities.
04:01
So I'll give each speaker a maximum of 15 minutes, and we'll try to reserve a fair bit of time for questions from you as well. So, Philip, please lead us off. Thank you, Ardo. It's a pleasure to be on this panel. I think, of course, we are following an interesting session this morning on monetary policy.
04:24
And Natacha Valla, I think, made the interesting point about how much you really need in terms of data to run monetary policy. Maybe you just need the traditional macro time series. But, of course, if there's a case for granularity in monetary policy,
04:44
that case is even stronger when we want to talk about financial stability. Let me just say, first of all, in terms of thinking about central banking and financial stability, of course, there are different dimensions.
05:00
There's essentially the ex ante dimension: if conditions are financially stable, let's see what we can do to preserve that. So that's a microprudential issue in terms of supervision, and, of course, then you think about the data you need for effective supervision. And it's a macroprudential policy issue in terms of what data you need to, as Ardo just said,
05:23
calibrate macroprudential measures. But, in addition, ex post, if a crisis arises, the conduct of financial stability policies, which for central banks is going to be the conduct of liquidity policies, also has a heavy data requirement.
05:43
And I think that's an interesting issue. So whether it's ex ante or ex post, this is a big topic. I think the organisers have really begged the question in terms of the composition of this panel, which is global in nature. So through our work here in Europe, through the work of the BIS, and, of course, with the US authorities,
06:08
it's clear that financial stability has a significant global component. And what's interesting is what can we do at an EU level, so going above nation by nation data,
06:22
and then what can we do at a global level. So within the EU, the European Systemic Risk Board has a unique role in terms of the coordination of macroprudential policies and also in terms of having oversight of data streaming out of every nation state. So the ESRB has, for example, a unique window into
06:43
the derivatives data collected under EMIR. And, of course, here at the ECB, in terms of macroprudential policy, the ECB has top-up powers, for example in relation to the counter-cyclical capital buffer. I would heavily endorse the idea that, in addition to the national-level
07:04
use of macroprudential policy, the option to top up at ECB level provides an additional layer of discipline, and therefore the ECB needs to have that overview as well. And, of course, macroprudential policy is quite immature, as in its widespread use is fairly recent, and so the more we can learn from each other the better, I think; and to learn from each other, the common data platform, I think, is quite important.
07:25
it's widespread use, and so the more we can learn from each other, I think that's quite important, and to learn from each other, the common data platform, I think is quite important. So there's a lot going on within the European system, and, of course, there's going to be a parallel conversation at the global level through the BIS, the FSB, the IMF, and so on.
07:47
But, of course, the extent of data sharing at a global level is quite limited. In fact, I think the exception proves the rule, I mean, the most important exception is the BIS data hub, which shares in a very limited way the firm level data on
08:04
globally systemically important financial institutions among a small group of supervisors. It's a big step forward, it's happening, but the fact that it's so heavily limited and ring-fenced does indicate the limits of the near-term potential for much wider
08:20
sharing of data, I think. Let me emphasize, in terms of big steps forward, one thing to notice, and something you see more and more if you look at the ESRB working paper series, which is the value of the EMIR data. So, again, this is, I think, a big
08:43
reporting burden on those who are reporting, and it's vital that we demonstrate that, in fact, data are useful, these statistics are useful. And I think it's proving that way, that in terms of understanding what's going on, say, in the interest rate swap market, FX derivatives,
09:03
credit default swaps and so on, I think it's proving already quite valuable. And I think the same is going to be true of AnaCredit; there's going to be a little bit of a lag, because the researchers and the analysts need to learn the new data set, need to clean it up and so on. But I think we can be increasingly confident of the value of that.
09:24
Now, of course, because it's European-level regulation, it doesn't capture everything we would like to know at a global level. And I think the more we can, over a sufficient time span, push for corresponding data collection and data sharing at a global level in derivatives,
09:42
I think that's really important. And, of course, projects such as common identifiers like the LEI are an important part of that journey. Let me switch, actually, now to the real side of the international economy and the role of multinational firms. So multinational firms in production, we understand in terms
10:06
of global value chains and all of that, but more and more multinational firms are very important in the global financial system. You've noticed that some of these firms make a lot of money and they have very large treasury operations in terms of managing
10:23
that cash. So if you think about global savings and investments, understanding how multinationals allocate their cash is quite important. So I think there's a lot going on here. So recently I co-authored with some BIS colleagues and a colleague at the Central Bank of Ireland.
10:43
It's in the March quarterly review of the BIS. I think we put together our thoughts on this. But I think really understanding the balance of payments these days when you have the multinational firms, not just in the, as I say, the real side in terms of the trade balance,
11:00
but also in the financial account, I think, is really important. You know, these firms' treasury operations, their use of, say, special purpose entities in international financial centres, are kind of interesting to look at, but they complicate really understanding the headline BOP,
11:24
and understanding the headline national accounts. And it's not just, you know, in my own country where this matters; it's much broader than that. So I think the agility, maybe the question here is agility. It's good to have stability. It's good that we know the rules for national accounting. It's good we're on Balance of Payments Manual 6 (BPM6) as
11:44
opposed to having one every quarter. But the agility of how quickly can the world's statisticians catch up with what's going on in terms of the evolving practices of multinational firms, I think, is a challenge for this community.
12:01
So, you know, coming back to the value of international data sharing, that, of course, builds on trust. And, you know, that's a scarce commodity. So I'm not saying there's any easy route. Often I hear, in certain circumstances, people say, well, just do it. Just do it. It's the Nike view of international data sharing: just do
12:24
it. But that's not realistic. You do have to build up to that point. And so the kind of slog of working towards that I think is quite important. Let me turn to the borrower-based measures because here we know the risk is in the tail of the distribution.
12:40
So knowing the average loan-to-value ratio in the population is not super helpful. You do need to think about what fraction of loans are in the different parts of the distribution. So I don't really see how you can have a reliable system of borrower-based measures without the granular loan-by-loan data. It's also, I think, even better if you can match
13:04
that with other data characteristics like income level, employment status, non-mortgage debt, and so on. And this goes to acceptability. We talked a little bit about acceptability this morning. And, for example, for us, I can tell you, because we have, say, the loan-to-value,
13:21
we have loan-to-value ceilings and loan-to-income ceilings, but we allowed the banks to do a certain amount of lending above those ceilings. And what we do is we're able to demonstrate that, in fact, the use of those exceptions, the above-ceiling lending, is in line with what people might think is desirable, as in younger people are more
13:42
likely to get a bigger exception from loan-to-income, because their incomes grow over time. First-time borrowers get more of an exception to loan-to-value, because, again, that's where the case for flexibility is made. So I think it's not just a question of what data we need to set our policies.
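The distributional check described here can be sketched on invented loan-level records. The field names and the 0.90 ceiling below are illustrative assumptions, not the actual Irish rules:

```python
# Hypothetical loan-level records of the kind needed for borrower-based
# measures: loan-to-value (ltv), loan-to-income (lti), borrower type.
loans = [
    {"ltv": 0.92, "lti": 3.2, "first_time_buyer": True},
    {"ltv": 0.78, "lti": 4.1, "first_time_buyer": False},
    {"ltv": 0.85, "lti": 3.8, "first_time_buyer": True},
    {"ltv": 0.70, "lti": 2.5, "first_time_buyer": False},
]

def share_above(loans, field, ceiling, group=None):
    """Fraction of loans (optionally within a borrower group) above a ceiling."""
    pool = [l for l in loans if group is None or group(l)]
    if not pool:
        return 0.0
    return sum(l[field] > ceiling for l in pool) / len(pool)

# What fraction of lending exceeds an (assumed) 0.90 LTV ceiling overall,
# and among first-time buyers only?
overall = share_above(loans, "ltv", 0.90)
ftb = share_above(loans, "ltv", 0.90, group=lambda l: l["first_time_buyer"])
```

The average LTV of these four loans says nothing about the tail; the group-wise exceedance shares are exactly what monitoring the above-ceiling allowance requires.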
14:00
It's also a question of what's needed to demonstrate that these policies are reasonable from a social point of view. So let me, in passing, just mention that we've just introduced a consumer credit register, and this is, I think, a case where what's helpful for banks, in terms of having a CCR that they can look at, can also have value for
14:21
statisticians. Let me also mention, I think, the value of cooperation, say, with tax authorities, because what would really be ideal is to have not just the point-in-time data, when you take out a mortgage loan, what are your characteristics, but also updates. So five years into your loan, ten years into your loan, what remains,
14:45
and is there an update on your characteristics such as employment status and so on. Let me turn to an obvious financial stability risk, which is the boom-bust cycle in property markets, whether that's residential or commercial property. So this is, again, I think, an area
15:03
where AnaCredit should be quite useful, because AnaCredit should deliver much more information about the distribution of exposures to the property sector, the interconnections across banks in relation to property exposures, the value of property collateral, and so on.
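The kind of distributional question such granular credit data can answer might be sketched like this. The lender names, sector labels and amounts are invented for illustration:

```python
from collections import defaultdict

# Hypothetical loan-level records: (lender, borrower sector, amount).
records = [
    ("Bank A", "commercial_property", 40),
    ("Bank A", "manufacturing", 60),
    ("Bank B", "commercial_property", 90),
    ("Bank B", "households", 10),
]

def property_exposure_shares(records, sector="commercial_property"):
    """Each bank's share of total lending that goes to the given sector."""
    total = defaultdict(float)
    prop = defaultdict(float)
    for bank, sec, amount in records:
        total[bank] += amount
        if sec == sector:
            prop[bank] += amount
    return {bank: prop[bank] / total[bank] for bank in total}

shares = property_exposure_shares(records)
```

Aggregated sector totals would show both banks lending to property; only the loan-by-loan view reveals that one bank's balance sheet is heavily concentrated in it.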
15:21
So I think you can really see a tremendous potential here in that area. In relation to pricing, I think there's more maturity in how we think about the construction of residential property price indices, but I think the ESRB has highlighted the data gaps of commercial
15:41
property prices, and there's an interesting issue about a purist approach, where you want to build a comprehensive price index based on all commercial property transactions, versus maybe a risk-adjusted approach, where you may say, well, maybe the bigger concerns are certain slices of the commercial property sector, such as prime real estate. So I think there's a lot
16:05
of work to be done there. And then maybe just in trying to close, Ardo mentioned the rising role of non-banks in the financial sector, and although a certain amount of information is collected from investment funds and so on, I do think in terms of financial stability
16:23
policies, having a better and more uniform way of thinking about the leverage and liquidity positions of investment funds may turn out to be helpful in the future. And really, we can maybe say we've accomplished a lot in terms of banking data. Maybe we need
16:42
to pivot our attention towards these other sectors. And then finally, the other issue is interconnections, the issue of who-to-whom. So actually, it remains the case that what we've collected can be less than fully exploited, because knowing the ultimate owner,
17:00
the ultimate destination, is very necessary. So it's kind of dissatisfying if you look at some data set and you say, okay, well, now I know that, say, the Cayman Islands is a big investor in some country. You know the Cayman Islands is just a conduit. So really, understanding what is behind that remains a challenge. So it's 20 years into the ESCB's
17:25
work in these areas. The glass is basically still half empty, if not more than half empty. So I think there's still a big uphill mountain to climb. And so this goes back to cost issues.
17:42
We have to reconcile the need for more data with the actual cost, which falls not just on the reporters; it's also on us as central banks, who've got to have the systems in place to take in the data. And there's the efficiency issue, where maybe the merger of statisticians with data scientists, with IT and so on,
18:02
is just as important for the financial institutions as for us ourselves. Thank you for a very wide-ranging set of thoughts and a very concise executive summary
18:22
at the end saying we've still got a long way to go. Richard Berner, please. Thanks very much. First, I want to thank the organizers for setting up this conference and for having me here. I'm honored to be here. And second, I want to congratulate Aurel Schubert, with whom I've worked over the past several years, for his leadership and all
18:44
his accomplishments as Director General of Statistics here at the ECB. Aurel, when he asked me to come here, asked me to focus on potential threats to financial stability and how statistics can inform decision-making about them. Here today, I'm going to talk about both system-wide and enterprise risk assessment,
19:03
because I think that they should complement each other and use the same basic data. Philip's example of the multinational firms is a great one: how multinationals manage their risk is extremely important for us to understand, whether as policymakers or when we're doing research in the area. And using the same basic data makes perfect sense, not just from an
19:25
efficiency standpoint, but also because we need to use the same basic facts in talking about the phenomena we're discussing. So today, I'm going to talk a little bit about, let's see, I need to advance this. Is that the way? Do you know which button to push?
19:48
There we go. The big one. Okay. So it works. First, I'm going to talk about some financial system vulnerabilities,
20:03
and I've identified five just to try to keep the list relatively short. Second, I'll talk about ways to achieve that efficiency and effectiveness and best practices to improve data quality, scope, and accessibility. I'll talk about some critical data needs and give three examples. And finally, I'll make some comments about some requirements to realize the
20:26
potential of using big data, new analytics, and new technology to improve system-wide and enterprise risk assessment and compliance. I think first, the data must be fit for purpose, something we've already talked about, but I'll say some more. In addition,
20:42
I think an effective partnership between and among regulators and between regulators and industry is essential to align their use of these tools and to standardize them to make them interoperable. So first, to the vulnerabilities. I think first, there are still vulnerabilities in securities financing transactions, market-based finance and shadow banking,
21:04
and I draw the distinction between those. We can still see that the default of a broker-dealer can create fire sale externalities on an ex post basis. So I'll talk a little bit about those data needs. Second, we've mentioned earlier in the session the transition from LIBOR to
21:23
alternative reference rates. LIBOR's foundation still remains fragile, and I think its widespread and ongoing use is going to make that transition a challenging one. Market participants must have confidence in the new reference rates, which, among other things, I think involves the integrity of the data that underlie them. So I'll talk a little bit about the U.S. experience
21:41
there. There are three other vulnerabilities that are also top of mind. First, operational and cyber threats, including those from cyber incidents, they may accelerate, especially against a backdrop of rapid innovation and technical change. Second, the current move to either tapering or
22:00
actually normalizing monetary policy may expose vulnerabilities in rising corporate leverage and deteriorating credit underwriting and credit quality. That's certainly true in the United States. And third, fragmentation and even conflict among national policies, I think, may test the resilience of cross-border arrangements in global financial markets in response to
22:21
external shocks. So I'm not going to spend a lot of time on any of those, but I'll just make three comments. First, the heterogeneity comment. As Philip mentioned, I think that all five of these require the use of granular data because we are interested in tail risk and because we can't understand that unless the data are fairly granular. Second,
22:42
this list is not exhaustive. And therefore, when we think about the many vulnerabilities that we might face in allocating our resources to collect data, we should think about how we can use the same data for many purposes and make that efficiency really work for us. And third, because financial stability and vulnerabilities in the financial system
23:05
are multidimensional, which is why we have a macroprudential toolkit, I think the data needs are obviously going to be multidimensional as well. Here again, we need to think about how best to use those. So that brings me to best practices for filling data needs.
23:22
This is really not rocket science, but it does involve thinking hard about what we ought to be doing. The goals are to improve the quality, the scope, and the accessibility of financial data. And to my way of thinking, that means we need to align the interests and activities of both officials and industry practitioners. What does that really mean? I think it means
23:42
there's an important complementarity between the interests that each of them share. Obviously, one at the micro level, another at the system-wide level. But having good risk management practices is going to help both microprudential and macroprudential goals. And understanding what each is doing is going to help both parties in the work that they do.
24:06
In addition, I think there's an important complementarity between analytics and data. And I think Jan Smets referred to this earlier in the panel. Theory alone is not going to suffice when we think about the needs that we have for data, nor will pure observation.
24:24
Theory must provide a rigorous framework for hypothesis tests, and observation has to ground it in reality. Equally, I think we need rigor in how we go about filling our data needs. So some of the best practices involve the following four steps. First, we need to identify the data needed and their business purpose. Do we really need the data? What is the purpose for
24:44
which we're going to use the data? Is there more than one purpose? Are the data that exists sufficient for doing that, or do we need to change those data in some way? Second, we should design a template for the collection of those data. And that means being very specific and
25:00
prescriptive about the data that we're going to collect and the way that we're going to collect them. Third, I think we need to develop clear and precise definitions of the data that we need so that they align with the purpose that we have in mind. And here, it's extremely important to use industry standards. Philip alluded to that. I'll talk a little bit more about that. And finally, creating collection specifications for the way that we collect data. And I have more
25:26
on this in a paper that I'm going to submit, but I think all four of these best practices are things we should keep in mind when we go to collect data. A few additional thoughts on best practices. One, we should focus on collecting data, not reports. And what do I mean by that?
25:42
I mean, in the past, regulators have focused on collecting reports, and in some of the technologies that have been used, for example the SEC's EDGAR in the United States, the electronification looks just like the paper reports. We should really think, as I mentioned earlier, about collecting the same data that industry uses to manage their risk
26:04
that we're going to use to assess where the risks are in the financial system. Of course, we already do that with swap data. We set up swap data repositories, and we were involved very deeply in that when I was in the government. But with swap data repositories, or trade repositories, the problem was that we didn't use effective standards. We didn't use some of
26:23
the other best practices that were needed to make those data coherent. So both are needed. Second, we should conduct industry outreach very early in the process to understand whether or not what we have in mind when we go to collect data really reflects
26:40
what industry is doing. And I think there, the alignment and the conversations without involving capture by industry are extremely important. That dialogue is very important. Third, in my experience, performing a pilot collection is invaluable in informing how we go about doing the larger permanent data collection. We learn a lot from the pilot.
27:04
In securities financing transactions, specifically in repo in the US, we learned a lot from a pilot collection that we did. That's very important. And last, I think we need to engage in evaluating, especially when things are changing, the structure of the financial system, the way that it's evolving to meet new client needs,
27:25
new technologies. We need to engage in a continuous lifecycle of assessment and improvement in data. And this chart illustrates that kind of lifecycle improvement, where we start with data requirements. We implement those data requirements. We assess the data. We identify
27:44
data gaps. We propose changes to them. And then we evolve in new directions. That may involve no longer collecting some data that are currently being collected because they're no longer relevant. That's all part of the lifecycle process. Let me spend a moment on data standards,
28:03
which was alluded to just briefly. I think they're essential for the quality of data in order to compare, aggregate, and link data sets. And that's been discussed in other panels here. We need to have standardization of data so we know exactly what the data represent
28:21
so that they can be compared. It's even more critical today to use data standards in representing data because if we're going to automate some of the processes, if we're going to have smart contracts, if we're going to use fintech effectively, if we're going to use technology for compliance and regulatory purposes,
28:40
then the data need to be standardized and precisely defined. And of course, you're familiar with some of the identification criteria, both in terms of the LEI, the Legal Entity Identifier, which helps us identify who is who. But equally, we need to know who owns what, as Philip alluded to. So the UTI and the UPI work that's gone forward under the FSB and other
29:08
organizations has been very important in that. And our engagement in the past with Aurel and the Bank of England in that regard, I think, has been really instrumental in moving that work forward. Let me give you three examples of critical data needs for financial
29:23
stability in which I've been involved. The first, which I alluded to earlier, is in the US repo markets and specifically in the bilateral market, which constitutes roughly half of the market, so this is pretty important for assessing securities financing transactions in the United
29:40
States. It's critical as an ingredient in constructing the new US reference rate, the so-called Secured Overnight Financing Rate (SOFR). And I think it's essential for financial stability analysis. Philip alluded to the fact that we spent a lot of time on collecting data from banks, but not so much on data outside of the banking system. We obviously need to do more of that.
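The volume-weighted-median construction behind such a transaction-based reference rate can be sketched as follows. The trades below are invented, and the actual published methodology involves additional filtering and rounding rules; this shows only the core calculation:

```python
def volume_weighted_median(trades):
    """trades: non-empty list of (rate, volume) pairs.
    Sort by rate, then return the rate at which cumulative volume
    first reaches half of total volume: the volume-weighted median."""
    trades = sorted(trades)                 # ascending by rate
    half = sum(vol for _, vol in trades) / 2
    cum = 0.0
    for rate, vol in trades:
        cum += vol
        if cum >= half:
            return rate

# Hypothetical overnight repo trades: (rate in percent, volume).
rate = volume_weighted_median([(2.40, 100), (2.45, 300), (2.50, 50)])
```

Unlike a simple median of quotes, this weights each transaction by its size, so the benchmark reflects where actual volume traded and is harder to move with a few small trades.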
30:06
If we focus on activities wherever they occur, like repo transactions, then I think we have a good shot at doing that. Related to that, US money market fund holdings, very similar to what's been done here in Europe with the MMSR, the SEC in 2010 started to collect data, but
30:25
those data were difficult to access, so the value my organisation added was to make them much more accessible, both through visualisation and by creating a database where people could look at the time series of data available on a monthly basis
30:42
and compare and contrast money fund characteristics by who owned them, who issued them, and understanding both on the buy and the sell side of the market what's going on. The third area is in swap transactions. Obviously that's extremely important, understanding
31:01
derivatives markets, making them more transparent, has been an extremely important and beneficial development. It helps us assess risk in markets and counterparties, specifically in centrally cleared counterparties. As I alluded to, in the United States, and to some extent here, we didn't adequately use data standards and specifics on how to collect
31:27
those data, so now we're recognising that that has to be done, and the CFTC, to their credit, has gone back and they're going to issue new regulations on how to collect swap data. There's a very good publication that was put out this spring called Swaps Regulation 2.0
31:45
that I think is a good roadmap to looking at that. One area that we obviously find a challenge, and I alluded to operational and cyber risks, data on cyber risks are hard to come by. We lack data on the scope of incident and their
32:03
cost. I think in the past there's been a reluctance to report them, reluctance to understand, or some difficulty in understanding exactly what data should we report and how should we categorise on those data. I don't pretend to have all the answers, but there are some straws in the wind that are pretty helpful in that regard. The IMF has done some estimates using data from experts
32:27
and doing some tail risk analysis to assess the cost of a cyber shock in the financial system. In the United States, the institutional framework has improved for collecting and sharing data through institutions like the so-called FS-ISAC and FSARC, which are joined together.
32:43
Those are industry groups that are aimed at sharing and collecting data within the industry. At the regulator level, the so-called FBIIC, the Financial and Banking Information Infrastructure Committee, is also sharing data, looking at best practices and collecting them.
33:01
And within firms, firms are setting up what they call fusion centres, which draw together an interdisciplinary group of people from across the firm to look at how operational risk can affect every aspect of the firm. So in identifying, detecting, understanding and recovering from cyber incidents, all these things are extremely important.
33:23
The governance around this really matters: training employees, making sure that people are aware of what the incident response ought to be. Thinking about how we ought to collect data in this arena is really important, and there's been some good work on data. People are thinking about those issues here in Europe with the Euro Cyber Resilience Board and the
33:45
Threat Intelligence-based Ethical Red Teaming framework, or TIBER-EU, but those are in their infancy and more work, I think, needs to be done on that. I'll talk about partnerships again.
34:00
When we talk about the goals of having a dialogue, either among regulators themselves or between regulators and the industry, I think the goals are really important. They are to use the same data and facts for several purposes, for example, making policy decisions and risk management decisions. And I think it's also essential to have that partnership because
34:23
we need to think about how we're going to create a new approach to regulatory reporting and compliance. Technology can facilitate that. And I have a vision. The vision really does involve using the same data at the firm level that we can then aggregate up for policy makers
34:42
and regulators to have. But in order to do that, we need to make our systems interoperable. So there are two venues for collaboration and coordination, I think, that are needed. Collaboration among regulators is extremely important so that we can share data. In the United States, as all of you know, we have a fragmented regulatory system.
35:01
Solving the collective action problem of sharing data among the regulators is still a challenge that regulators are grappling with. And then between regulators and industry, we need to build trust between the two groups so that that can happen. Final points. I think we need to start expanding the collaboration now, well before that
35:23
technological revolution is complete. If we don't, then we'll have to redo some of the things that are now being contemplated. Second, I think we need to start improving quality through more extensive use of data standards now. And third, not just data standards, but I think also
35:41
technology standards are essential at the start to assure the interoperability of the technological marvels that we're putting in place. Thanks very much. And thank you for your views, again, on a range of issues. I thought these issues about
36:01
operational risk were quite interesting, because I guess we usually think in terms of balance sheets and income statements and so on, but these other risks that are emerging now might be coming from some totally different area and require different skills. Luis, the floor is yours. Thank you. Thank you very much. It's a pleasure to be here.
36:28
I think we can notice already a statistical oddity. I mean, the previous panel, on data for monetary policy purposes, was a standing panel, and the panel on data for financial stability
36:43
is a sitting panel. I'm not sure this means anything, but let's see. Stability, right? We sit. Okay, so I'm going to try to make the point that collecting data, and new data, is of course necessary, but so is connecting it, to understand what the financial
37:07
stability implications are. So obviously, I think we all know that financial stability has been a concern of policy makers, but it obviously has become something much more
37:25
present after the global financial crisis. And maybe let me introduce this speech by making a bit of a provocation with my friends from the previous panels, the monetary policy committee
37:43
guys. I think you probably have an easier life than the guys sitting in financial stability committees. In what sense? I mean, we know that, even with the caveat of not understanding perfectly well where r-star is, price stability is perhaps easier,
38:02
more measurable, more direct than financial stability, which is a pretty complicated, multi-dimensional concept, right? So in MPCs, in monetary policy committees, you have a procedure, a clear objective, a metric, and, as was discussed previously, a reliance on
38:24
relatively straightforward data sets on inflation, expectations, output, wages, and so on and so forth. Now, of course, as Ivo mentioned, there are some complications. You want to think of housing costs, or if you want to get more granular, you use scanner data. But overall,
38:45
life is probably simpler. If you are sitting in a financial stability committee, you have this complex task of first finding the metric. What is, after all, financial stability? Is there anyone who can sort of define it properly, given that it faces
39:05
very complicated issues of understanding what systemic risk is and the capacity for this risk to change? So you can say, look, obviously, if you have more data, you can help policymakers with that, and you can anticipate and probably manage the next
39:24
financial crisis, but it might not be sufficient because of this complexity. And what I will try to argue throughout this presentation is that as much as data is necessary, the theoretical analytical framework with which you analyze it is also paramount.
39:43
So why is that? Because scientific discovery is not just assessing data and trying to pull something out of it; it doesn't happen by accident. Even if sometimes accidents help,
40:02
accidents only help, as Louis Pasteur used to say, those who can interpret what these accidents are. And, well, Yogi Berra, as you know, the baseball player, used to say it in a more humorous way. If you really don't know exactly what you're trying to achieve, even if
40:22
you have a pile of data, it doesn't help you, right? You just get more confused. And somebody was mentioning the cost of collecting data, so you're going to spend a lot of money without it helping you to understand what financial stability is about. Now, hopefully, I think we have evolved in financial stability analysis from the sort of
40:44
very nice, broad narratives that you can find in classical works, in Kindleberger, in Minsky, to something that is much more specific, which is to identify things that we can say, look, these are early warning indicators of crises. And this is data that pertains to this
41:05
category because it has some predictive power. And, of course, the more you advance in the understanding of modern financial crises, the more you try to find exactly what data can allow you to fill these gaps, to identify the early warning indicators.
41:25
And we all know that ex post it is very easy to say, look, this was precisely the stuff that created the financial crisis that we are analyzing. What is complicated is ex ante. What exactly is the type of vulnerability that is hidden, that is slowly undermining
41:47
the financial stability of the system, creating systemic risk? And it's not necessarily showing up in the type of information that we have. Think of, for example, the mispricing of risk.
42:02
Think of the analysis of rare disaster events that are very difficult to predict. So, in other words, the point is that data is, of course, relevant and important, but you need a theory to be capable of fitting the data into something that is meaningful
42:24
for preventing financial crises and helping you to maintain financial stability. So, as much as collecting data, collecting dots, is necessary, connecting them is also very important. Now, even if you have a theory, mind you, it might still be
42:46
tricky, because it doesn't mean that if you have a theoretical framework for financial crises you'll be able to interpret things in a straightforward way. Why is that? Because we know that the theories behind the interpretations of financial crises are full
43:06
of false beliefs. Remember this famous line from Ken Rogoff and Carmen Reinhart, that there is always a story that, oh, this time is different. In other words, the data that we have doesn't
43:21
tell us exactly the truth, or we don't really interpret the data the way it should be interpreted, because this time is different. For example: in Latin America, countries do not go bankrupt. During the Asian crisis, remember, people said, oh, we have so much growth, high savings and solid public finances that they can support higher levels
43:44
of debt. And of course, we all know that during the GFC we used to say, look, financial innovation spreads risk and is something that enhances stability. So
44:03
in all these episodes, we had data, we had the theory, but we were not really capable of interpreting them and, after interpreting them, of taking corrective action. So where do we stand? I think, look, we do have some stuff that resembles data that can
44:23
be considered early warning indicators of episodes that disrupt financial stability. We're still missing a bigger, robust theoretical analytical framework to understand the endogeneity of financial crises, but we do have, let's say, rough estimates at an aggregate level of things
44:48
that obviously we all know create unsustainable imbalances, and we have data about that: credit-to-GDP gaps, the pro-cyclicality of borrowing, and excessive risk-taking. But
45:05
probably the more we understand financial crises, the more we understand also that beyond these aggregates you need granularity, as was discussed here in the previous panel, and granularity on financial exposures. We know that everybody all of a
45:29
sudden discovered that they were exposed to Lehman. Interconnectedness between G-SIBs: now we know about it, but before that we neglected OTC derivatives and so on and so forth.
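The credit-to-GDP gap mentioned a moment ago is concrete enough to sketch. Below is a minimal illustration, assuming the common BIS-style construction: the gap is the ratio's deviation from a one-sided Hodrick-Prescott trend, with a smoothing parameter of 400,000 on quarterly data. The series here is invented; real exercises use long official credit and GDP statistics.

```python
import numpy as np

def hp_trend(y, lam=400_000):
    """Hodrick-Prescott trend: solve (I + lam * D'D) tau = y, where D
    is the (n-2) x n second-difference matrix."""
    n = len(y)
    D = np.zeros((n - 2, n))
    for i in range(n - 2):
        D[i, i], D[i, i + 1], D[i, i + 2] = 1.0, -2.0, 1.0
    return np.linalg.solve(np.eye(n) + lam * (D.T @ D), y)

def credit_to_gdp_gap(ratio, lam=400_000):
    """Gap = credit-to-GDP ratio minus its one-sided HP trend, i.e. the
    trend is re-estimated each quarter using only the data available at
    the time, as a real-time early-warning indicator would be."""
    ratio = np.asarray(ratio, dtype=float)
    gap = np.full(len(ratio), np.nan)
    for t in range(3, len(ratio)):  # need a few observations to filter
        gap[t] = ratio[t] - hp_trend(ratio[: t + 1], lam)[-1]
    return gap

# Invented quarterly credit-to-GDP ratios: a steady trend, then a boom.
ratio = np.concatenate([np.linspace(100, 120, 32),
                        np.linspace(120, 140, 8)])
gap = credit_to_gdp_gap(ratio)
print("gap in final quarter (pp of GDP):", round(gap[-1], 2))
```

During the late boom the ratio pulls away from its stiff trend, so the gap turns positive, which is exactly the kind of aggregate early-warning signal being described.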
45:44
So the hope is that we are assembling this data set about early warnings, and the whole idea is to see if, with the stuff that we know now, it can help policymakers to anticipate, to prevent crises, because that's the game, right? You don't want just to understand
46:04
crises; you want data in which you can see the early warning and then take remedial action before the crisis blows up. Obviously, we are increasingly aware that the international dimension of crises is paramount. Needless to say, the globalization of
46:26
finance is obvious as a factor of risk in all our analyses, the interconnectedness. We know that the transmission was sudden, brutal from the subprime in the US to European banks.
46:45
We need essentially to understand the proliferation of these products and we do have now data on that, on bond markets, on non-bank intermediaries. I think Philip and Richard mentioned the need to assemble enterprise data on this. I think we understand now
47:04
the importance of collecting data on asset managers' positions, on global CCPs, and also on some very big emerging markets that I would call systemically important middle-income
47:20
countries. Why? Because they are highly interconnected with our financial systems, and because they are prone to, let's say, crises in the periphery of the system, they can produce spillbacks that are severe enough to cause
47:41
financial crisis in the advanced economies. And of course, then we need data about these cross-border flows, cross-border exposures, which hopefully now we begin to have. Finally, a tricky thing is that you have stuff that you don't necessarily know now that is a
48:02
vulnerability in the making. You have a suspicion that this is dangerous, dangerous for the stability of the system, but not necessarily an awareness of where it fits into the building of risk. Well, everybody spoke here about innovation, financial innovation, but think of
48:26
having more data on networks, on interconnectedness, on trade repositories, and on fintech, of course, on crypto assets, on algorithmic trading. And of course,
48:41
I think it was mentioned before, one avenue to explore is to look at big data. What could this bring in terms of new information for financial stability? Now, if you don't know stuff in your own area, which is economics and finance, maybe you can ask for
49:05
little help from other disciplines, like physics, for example. We know that there are some people looking at network analysis, and I think Ardo mentioned at the beginning the necessity
49:21
to understand contagion, linkages. But the point is, okay, you have a network: is it a stable network? Is it something that you can observe as a static thing, or something that will shift and change immediately if market positions move? In the same line,
49:44
you now have physicists exploring ways to model what exactly causes the changes from solid states, which you can consider more stable, to liquid states, or states
50:03
that you can consider more unstable. Should we as economists and policymakers try to explore a little what this is bringing to us? And would that be useful for policymakers? We don't know. And of course, last but not least, what can big data bring us to understand
50:27
financial stability better? I think we all know that this is coming, that we can have real-time data on the major agents in the system, and I think we should explore that,
50:44
global banks, the interconnectedness of financial institutions. It was mentioned here: mortgage debt, household debt, of course, and how they can transmit instability into the system, and how granular data about credit can help us. And this, as we know,
51:07
is a challenge that goes much beyond just the capacity to accumulate this information. It's not just an IT problem. Now, to move toward the end of this presentation, I think,
51:26
well, we have, let's say, some stuff about early warnings. We need to assemble new data. We can use big data. We can use different types of disciplines to understand the dynamics behind this complex notion of financial stability. But we also perhaps need tools to
51:44
prevent financial crises. And I'm here thinking of the array of macroprudential tools that we are learning to understand. I think Philip mentioned this as an important element to reduce excessive risk-taking. And obviously, if you want to measure the effectiveness of these
52:06
instruments, of these macroprudential tools, and there is an array of them in our toolkit, you need data sets to understand how they interact with other macro and financial variables, credit growth, and the balance sheets of institutions. And fortunately, there are.
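One way to picture the data sets being described, which record each macroprudential action as a signed indicator and relate the resulting policy stance to credit variables, is a minimal sketch. All numbers here are invented; the plus-one/minus-one/zero coding follows the action databases the speaker refers to, and a real study would use panel data and proper identification rather than this toy regression.

```python
import numpy as np

# Hypothetical quarterly record of macroprudential actions, using the
# binary coding of the action databases discussed here:
# +1 = tightening, -1 = loosening, 0 = no change.
actions = np.array([0, 0, 1, 0, 1, 0, 0, -1, 0, 0, 1, 0])

# Cumulating the actions gives a rough index of the policy stance.
stance = np.cumsum(actions)

# Invented year-on-year credit growth (percent) for the same quarters.
credit_growth = np.array([5.0, 5.5, 6.0, 5.2, 4.8, 4.1, 3.9, 4.0,
                          4.6, 5.1, 5.3, 4.4])

# Simplest possible effectiveness check: regress credit growth on the
# one-quarter-lagged stance. A negative slope would be (weak) evidence
# that tightening bends risk-taking in the financial cycle.
X = np.column_stack([np.ones(len(stance) - 1), stance[:-1]])
intercept, slope = np.linalg.lstsq(X, credit_growth[1:], rcond=None)[0]
print(f"slope on lagged stance: {slope:.2f} pp per notch")
```

The coarseness is visible immediately: a one-notch tightening of a loan-to-value cap and of a countercyclical buffer both enter as the same +1, which is the econometric limitation the speaker goes on to flag.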
52:25
We are at the beginning of a process of developing several data sets, with researchers, with an effort by the BIS, but also, of course, the IMF and the FSB. And there are several data sets available where you can understand the interplay between the fact that
52:45
you use an instrument, a macroprudential instrument, and the capacity to bend excessive risk-taking in a financial cycle. The problem with these data sets is that they are binary. So you have only plus one or minus one for tightening or loosening, and zero for no change. It might not
53:05
be enough for doing a sound econometric exercise with this type of data set, but I think it's very important to have them. Finally, to conclude: with all this, what I would say is
53:21
this. The first policy implication, in order to gather data that helps us maintain financial stability, is, as I mentioned at the beginning, to improve data for early warning purposes, which means starting to look at a diversity of things, maybe a little outside the domain that we
53:50
know. Because I think we need to be a bit more creative. Focusing on financial stability means focusing on rare, large disasters. And if you take the analogy of climate change,
54:08
you need to be capable of modeling and understanding, with data, these rarer events, which means that you need to start looking at them with a different lens. And the
54:26
data that you need for that is perhaps different. Just to make myself clear, think of one development, which is the rise of big tech firms in payments; payment systems are now very
54:42
reliant on them. Take, for example, Alipay and Tencent's WeChat Pay in China. These firms have half a billion customers. They are basically handling the transactions, the monetary transactions, the payment transactions, of very, very large segments
55:02
of the consumer market in Asia. And if by any chance they have a reputational failure, remember the Facebook episode, where there was this distrust because of the leakage of private data. Suppose that one of these big players also has one of these episodes.
55:22
What would be the consequences for the stability of the payments system in China? And of course, if it happens in China, it happens to the whole world. So this is the angle, the novel angle, which I think we should get into if we want data that gives us an early
55:43
warning. And finally, of course, we need data in financial stability to quantify policy trade-offs. For that, I think you need, as I mentioned at the beginning, a sound theoretical framework. So you need data to feed this theoretical framework. For me,
56:02
to explore policy trade-offs, you need to be working in a general equilibrium framework. But you need, above all, to understand how the real economy interacts with the financial sector. So whether it is formally modeled as a financial sector within a general equilibrium framework or whether it is just a financial friction, you need to have data for that.
56:24
You need to have an explicit way in which you understand this interaction so that you can test policy trade-offs. And of course, while you do all this, you need to work on the resilience of the system. And for that, you need, of course, data to understand if the system,
56:45
as it is now with your capital requirements, with your reserve requirements, with your deposit insurance schemes, is resilient enough. Fortunately, because of the crisis, we've been doing just that in the G20, in the FSB, at the BIS, and other forums, so that we can now have a core
57:04
of the financial system that is more resilient than before the crisis. So thank you, I stop here. Thank you, Luis. And I liked the point about the lens and the theory becoming more and
57:24
more important as all this stuff becomes much more complex. The point about knowing where you're going before you start, which I think linked to Richard's best-practice principles, was along the same lines. So, Hans-Helmut, why don't you bring all this together in 10 minutes. Thank you.
57:43
Thank you very much, Otto. It's a great honor and pleasure to be on this panel, and it was really inspiring, because the privilege in discussing these arguments was having access to the papers beforehand. And since this, by the way, is also a little bit about celebrating August,
58:05
who gave me, by the way, the assignment to be provocative: so if I'm not polite, it's his fault. My idea was basically to summarize very briefly what we heard. There's lots of overlap
58:20
between this panel and, by the way, the panel before. And then to bark up the same tree as basically all of us did. It's ultimately theory which we have to care about because, to quote a famous economist, Marshall, facts don't tell their own story. So in order to make
58:45
data meaningful, as you said, you have to think about what we are trying to think about here. So what I'd like to do, and I'll be impolite, is summarize the three presentations in one page. Frugal, austere, impolite. And I'll summarize them the way I received the presentations.
59:08
So Dick Berner was starting with what we should ultimately care about if you're thinking about financial stability issues, namely, vulnerabilities. And they might come up in
59:22
interesting places which we wouldn't have thought of. And I'll try to highlight or emphasize this point in a few slides later on. Then he carefully insists upon improving data quality, which means scope as well as access,
59:40
and also new means to read and interpret data. One can easily get very excited about these new ways of interpreting data. But I do think it's important to start from an idea which is
01:00:00
an old one: technologies change; the fundamental laws of economics often don't, rarely do. So this is often deeply rooted in what we knew or learned before. Philip Lane stressed the cross-border,
01:00:20
the multidimensional: the financial stability dimension which arises out of repercussions and interactions. That's a very important part, in particular within the euro area, where you still have lots of national idiosyncrasies and, as a result, repercussions. Internationally it means supporting all those efforts
01:00:43
which have been launched in the wake of the crisis in 2009 in the environment of the FSB. The FSB, by the way, did not start from data but from a list of seven issues which should be addressed; that was in September 2009.
01:01:01
So the core of policies conceived at that time was about addressing underlying real economic problems and, in order to do that, thinking about what data you would need. Luis highlighted the issue that
01:01:25
it's much easier to work in the monetary policy committee than in the financial stability committee, because you do not have a well-defined objective. It's fuzzy, it's complex. As a result of that, you don't really know
01:01:42
what type of data you need. Ultimately you might not even know which type of data you need ex post. There's still discussion about what was at the source of the Great Depression in the early 30s. So you can have contentious debates about things long back in history.
01:02:05
So I would like to start with an incident which was important in terms of redesigning the institutions we now have, which is basically what happened in 2007, 2008.
01:02:24
So the main questions I do think we have to care about are: do we have the right data? Which data should we look for? What do we do with these data? And how should we derive policies from there?
01:02:44
So what I'd like to start with is summer 2007. These are data, it's a graph. Here you see spreads of unsecured over secured interbank money.
01:03:04
For a long stretch of history, barely five to seven basis points, and suddenly it shot up dramatically. So what do we do with this? How do we read this? We were sitting at the time in the monetary committee and thinking about
01:03:20
what's behind that. So what is unusual? We have data, but how do we interpret them? One argument behind that has been that this is the implosion of interbank money markets, and it was driven by the unraveling of subprime insurance products.
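The indicator in this chart, the spread between unsecured and secured interbank rates, is mechanical to monitor once both series are collected. A toy sketch with invented rates mimicking the pattern just described, a few basis points for a long stretch and then a sudden jump (the threshold rule is purely illustrative):

```python
# Invented 3-month unsecured (interbank) and secured (repo) rates in
# percent, mimicking the summer-2007 pattern described in the talk.
unsecured = [4.10, 4.11, 4.12, 4.12, 4.13, 4.70, 4.95, 5.10]
secured   = [4.05, 4.06, 4.06, 4.07, 4.07, 4.10, 4.12, 4.13]

# Spread in basis points.
spread_bp = [(u - s) * 100 for u, s in zip(unsecured, secured)]

# Flag periods where the spread exceeds three times its earlier norm,
# a crude mechanical stand-in for "suddenly it shot up".
norm = sum(spread_bp[:4]) / 4
alerts = [t for t, s in enumerate(spread_bp) if s > 3 * norm]
print("spread (bp):", [round(s, 1) for s in spread_bp])
print("alert periods:", alerts)
```

The harder part, which the speaker turns to next, is not detecting the break but interpreting it: information asymmetry or a wholesale run.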
01:03:46
And a major issue there is what I would like to call foresight knowledge. We always read these graphs from right to left. But we are sitting here and have to think about
01:04:02
what's going to happen next. That's why an anchoring in theory is definitely needed. This is important because theory should ultimately lead to policy advice. In 2007, there were two views of what was going on.
01:04:24
One was: this is information asymmetry, markets will find an equilibrium, let's stay on the sidelines. The other was: it's a run. It's a run of wholesale banks on each other.
01:04:42
The conclusion which the ECB at the time drew was to inject liquidity. That's here, August 8 and 9, 2007. At the time, it was, by the way, criticized as hyperactive and panicky. I don't know what you would call it.
01:05:03
So theory should ultimately inform policy. In the meantime, we've achieved many, many institutional innovations. We've been filling data gaps.
01:05:22
I'll just click through them. In particular, in terms of the European System of Central Banks: integrated reporting frameworks, an integrated vocabulary, so a common view. So now, let me dig a little bit deeper into theory.
01:05:46
Financial crises are, and he has been quoted by you, Luis, according to Kindleberger, a hardy perennial. It usually comes down to three points: overleverage, mismatches, and underpricing of risk,
01:06:05
plus interconnectedness, which has over the last decades become very much international. Philip has highlighted that. So that's where we should look in terms of which data we would like to see. And it is about how markets can become dysfunctional,
01:06:24
how intermediaries can become dysfunctional. There's the credit gap. It could also be mispricing of term premia and risk premia, and the interaction between funding and market liquidity.
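Interconnectedness, the last of the perennials listed above, is where the network data discussed earlier in the session come in. Below is a minimal sketch of a default cascade on a hypothetical interbank exposure matrix; all figures are invented, and real exercises use supervisory exposure data and richer loss mechanics than this simple write-off rule.

```python
import numpy as np

def cascade(exposures, capital, first_default, lgd=1.0):
    """Iterate a simple default cascade: when a bank defaults, each
    creditor writes off lgd times its exposure to it; any bank whose
    accumulated write-offs exceed its capital defaults in turn."""
    defaulted = np.zeros(len(capital), dtype=bool)
    defaulted[first_default] = True
    changed = True
    while changed:
        changed = False
        for i in range(len(capital)):
            if defaulted[i]:
                continue
            loss = lgd * exposures[i, defaulted].sum()
            if loss > capital[i]:
                defaulted[i] = True
                changed = True
    return defaulted

# Four hypothetical banks: exposures[i, j] is bank i's claim on bank j.
E = np.array([[0, 8, 2, 0],
              [1, 0, 6, 1],
              [0, 2, 0, 5],
              [3, 0, 1, 0]], dtype=float)
capital = np.array([5.0, 4.0, 6.0, 3.0])

# The failure of bank 2 topples bank 1, then 0, then 3: the network
# position, not any single balance sheet, drives the outcome.
print("defaulted:", np.where(cascade(E, capital, first_default=2))[0])
```

The same matrix with thicker capital buffers produces no contagion at all, which is the point about systems shifting between stable and unstable states.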
01:06:43
So those were the places where I do think we need theory as well as data to understand what ultimately is important for us, namely what's gonna happen in the real economy. So what does this lead me to suggest?
01:07:05
We have to develop and critically assess analytical tools, and there is not one right model. This, of course, also holds true for the new devices,
01:07:21
and it holds true for thinking about how we derive arguments from looking at individual cases. There is no one right model. You might recall the Jackson Hole conference in 2005, when Raghu Rajan suggested that, as a result
01:07:41
of using micro insurance instruments, macro trouble could easily arise. He was heavily criticized. So one suggestion would be: beware of groupthink.
01:08:06
Try to integrate critical views. There is also, and Luis pointed to that, a need not only for intra- but, I would say, also for interdisciplinary debates.
01:08:26
Finally, I think it goes much beyond cognition. Early on in the crisis, there was talk about a window of opportunity which was about to be lost.
01:08:42
And now there is talk about rollback. So I don't completely agree with Dick in terms of this complementarity between industry and policymakers. I do think policymakers are providing a public good, financial stability, which does not always align
01:09:03
with what private sector entities would like to see. For example, capital requirements have been criticized as much too high; that's why you have this rollback argument. So central banks are somehow in the role of benevolent dictators.
01:09:21
And that means they need discretion, they need judgment, and they have to communicate with the general public. So maybe this, the policy dimension, is the most difficult one. That's what I want to conclude with: data never tell their own story.
01:09:41
Thank you very much. Thank you, Hans-Helmut. Would any of the speakers like to react, Philip or Dick or Luis? Okay, then we'll open up. One of the recurrent issues in this panel
01:10:04
so far has been this issue of the interaction between data and analytics. Now, I think that is real. It means that within our organizations it's very important that the statisticians have a vibrant dialogue,
01:10:22
both with the macro-monetary economists and with the supervisors. So in terms of having a single data set to collect, how to interpret it, and what questions to ask, I think that's very important. But I think it's also probably important to emphasize that it's very much a dynamic relationship,
01:10:42
because theory is not static. I mean, there's an old joke about economics, which is the empirical guy saying to the modeler, tell me your model and I will find a way to look for it in the data, and the modeler saying, tell me the data
01:11:00
and I'll trim up the model to match the data. So there's always going to be that dynamic. The models written now are heavily data-driven: often the first page of a theory paper is going to say, well, based on the data we see, I'm now going to write a model to replicate some features of the data. So it's a two-way street there. And I think that's quite important.
01:11:22
And then the other thing that Hans-Helmut raised, which Peter Praet this morning also signaled, is: what is the role of market data? At one level, maybe it's a substitute for collecting institutional data. If the market is telling you what's going on,
01:11:42
and we do think one of the pillars of financial stability is market discipline, then we can see what we can infer from market data. But it does run into a limit, because the markets also learn from us, and because the public good of statistics is not something that any individual market trader or institution can replicate on their own.
01:12:05
I mean, this goes back to what some of you were saying, which is essentially that it's in the interest of industry that there are good data sets that they can also use for their own analysis. So again, to come back to what I said earlier on, I do think we're in this kind of intermediate phase
01:12:23
where a lot has been done, but the interaction between the markets, the institutions, the academics, is a very fluid interaction, and we're far from having a true view of the world with which we can guide that. We have to recognize the dynamic element of that.
01:12:43
Okay. As much as I, like everybody here, emphasized the need for a theory, a framework, to collect the right set of data, to look at the right set of data, there is also a very important role that statisticians and data bring to policy makers
01:13:02
in a more common-sense way. I remember when I was sitting in the Financial Stability Committee of the Central Bank of Brazil and people were bringing us data about loans for used cars with LTVs of 150%
01:13:21
and maturities of seven years. It doesn't take a rocket scientist to understand that this is absolutely wrong and that you need to act on it. So you don't need a big theory to use the appropriate set of data that people are bringing you and take immediate action on a localized financial stability problem, which in this case was the used-car market in Brazil.
01:13:44
So I'd make two comments, one in response to what Hans said. I wanna emphasize the fact that when I talk about a partnership or alignment of interest between industry and regulators, there's an important asymmetry there obviously because system-wide financial stability policies
01:14:02
are needed because neither risk management at the micro level nor micro-prudential policies are really sufficient to deal with the system-wide externalities and market failures that really can arise from asymmetric information and mispriced guarantees, mispriced credit, totally acknowledge that.
01:14:20
But there does need to be a dialogue, I think, between them so that they can better understand what their respective goals are. Second, none of us have really mentioned a key use of financial stability data, namely stress testing, a workhorse tool for implementing macroprudential policies and filling in the gaps
01:14:41
where we don't have counterfactuals to observe and to understand what the impact of our tools can be. So we need to work on the framework for stress testing. I think that's particularly true with respect to operational risk, which I mentioned earlier. It's particularly true with respect to CCPs
01:15:02
and how to do stress testing for CCPs. Does it make sense to test them one by one? I think not. I think we have to do stress testing for CCPs in the context of their relationship with their clearing members and other counterparties and with the system as a whole
01:15:21
because they're so highly interconnected. Those are really important issues. And we ought to think about the granularity that we really need in stress testing, which is intense for CCPs, but it may not be so intense for less complex, smaller entities, such as some of the smaller community banks that we have in the United States.
01:15:43
We ought to differentiate the way that we use these tools by where we think the risk is. Thank you very much. So we have about 15 minutes officially. I don't think we want to eat too much into the lunch break. So we'll collect maybe three questions at a time
01:16:01
and then address them to specific people on the panel.