My first two posts in 2018 have included numbered lists, but this one – ironically, about data – is intentionally number-free. That is because, instead of predictions for 2018 around regulation generally and Brexit specifically, I want to write about what the attached FCA Insight article tells us about regulators' data struggles.

The first thing to say is that it's well worth a read, and tells us a good deal about what's happening with consumer credit (and what isn't). But it also reveals how hard it still is for regulators to conduct this sort of analysis and, because of the remaining gaps in that analysis (which the authors freely acknowledge), it raises almost as many questions as it answers.

The article exposes some of the limitations of the regular data returns regulators receive from firms. This is the inevitable result of seeking to use that data for a range of purposes it wasn't designed to meet, some of which weren't even envisaged at the time. So it's not that the existing data isn't valuable, just that it doesn't particularly help answer these questions. Instead, the regulators have obtained a different set of data from CRAs (credit reference agencies) that enables them, for example, to look at individuals' borrowing across different products.

Analysis of this data has enabled the authors to answer some important questions: Is consumer credit growth driven by subprime borrowers? (No.) Is it linked to mortgage borrowing? (No.) Do consumers stay in debt longer than product-level data implies? (Yes.)
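
To see why that third answer matters, here is a minimal, purely illustrative Python sketch (hypothetical numbers, not drawn from the article) of how product-level data can understate time in debt when a borrower simply moves a balance between products, while an individual-level, CRA-style view shows one continuous spell.

```python
# Illustrative sketch (hypothetical data): why product-level returns can
# understate how long people stay in debt. Each product sees only a short
# spell, but linking the data at the individual level shows the same
# borrower in debt throughout - e.g. after a balance transfer.

card_a = [500] * 6 + [0] * 6     # in debt in months 1-6
card_b = [0] * 6 + [500] * 6     # in debt in months 7-12

def longest_spell(balances):
    """Longest unbroken run of months with a positive balance."""
    best = run = 0
    for b in balances:
        run = run + 1 if b > 0 else 0
        best = max(best, run)
    return best

# Product-level view: neither lender ever sees more than 6 months of debt.
print(longest_spell(card_a), longest_spell(card_b))   # 6 6

# Individual-level view: combining the borrower's products shows
# 12 continuous months in debt.
combined = [a + b for a, b in zip(card_a, card_b)]
print(longest_spell(combined))                         # 12
```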

The first two of these answers are broadly positive news for us all – consumers, firms and regulators – while the third is evidently more worrying.

As far as I can see, however, the new set of data doesn't yet really tell us about the vulnerability of those who are borrowing. It could be, for example, that growth in consumer credit is being driven by younger borrowers – without enough credit history to register as subprime and not (yet) able to get a mortgage. Some media commentary seems to support this, but what I've seen so far is largely anecdotal. The main point, however, is how hard this type of analysis still is – which partly explains, for instance, the various bespoke data requests the FCA makes, which often create significant frustration and cost for firms.

Looking ahead, the recent FCA/Bank of England TechSprint on regulatory reporting offers a glimmer of hope. It was only a proof of concept, but it offered the future prospect of regulatory reporting that, because it would be model-driven and machine-executable, would enable both regulators and firms to ask and answer these sorts of questions quickly and cheaply. No more data warehouses, no more three-year-plus lead times to change the content of reporting.
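
To make that prospect a little more concrete, here is a small, hypothetical Python sketch (the rule, field names and data are all my own invention, not anything from the TechSprint) of what a machine-executable reporting rule could look like: the regulator's question lives as a definition that runs directly against a firm's records, so changing the question means changing the rule rather than rebuilding a return.

```python
# Hypothetical sketch of a "machine-executable" reporting rule: a named,
# declarative definition the regulator could publish and firms could run
# directly against their own records. All names and data are invented.

from dataclasses import dataclass

@dataclass
class CreditAccount:
    borrower_id: str
    product: str          # e.g. "credit_card", "personal_loan"
    balance: float
    months_in_debt: int   # consecutive months carrying a balance

def persistent_debt_rule(accounts, min_months=18):
    """Share of borrowers whose longest spell in debt on any product is at
    least `min_months` months, counted once per individual, not per account."""
    longest = {}
    for a in accounts:
        longest[a.borrower_id] = max(longest.get(a.borrower_id, 0), a.months_in_debt)
    if not longest:
        return 0.0
    return sum(1 for m in longest.values() if m >= min_months) / len(longest)

# Example run against a firm's (invented) records.
sample = [
    CreditAccount("b1", "credit_card", 1200.0, 24),
    CreditAccount("b1", "personal_loan", 4000.0, 6),
    CreditAccount("b2", "credit_card", 300.0, 3),
]
print(f"Share of borrowers in persistent debt: {persistent_debt_rule(sample):.0%}")
```

Because the definition is executable, tightening a threshold or redefining the population becomes a change to the rule, not a multi-year change to a data collection.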

In the meantime, we should properly value the hard yards this sort of analysis reflects…