We first shared this on LinkedIn a few days ago, but it’s worth repeating here: we believe this data is fundamental to the success of the CMA9 Open Banking API initiatives and the banks, ASPSPs and TPPs who run them. In this post, we dive into the core data and what it’s telling us.
Improvement, but still a way to go…
Over the course of this year, we’ve found a consistent proportion of what we deem to be underperforming CMA9 Open APIs from UK banks: roughly 30%. In a welcome development, that figure dropped dramatically in July, to 20%, and the change is mostly down to improvements at just one of those underperforming banks.
The graph shows the CASC (Cloud API Service Consistency) scores for the main bank brands we track as part of the CMA9 OBIE regulated open APIs.
Open Banking API, CMA9 & The CASC Score
As a reminder, we use our CASC system for our UK CMA9 Open Banking API comparisons. This allows us to do some interesting things:
- Firstly, the sample is homogeneous. We have Open Banking APIs from similar types of institutions, all doing the same things.
- Secondly, the sample is bound by geography. We can create a sample based on calls made from the same clouds and locations.
- Thirdly, the APIs are open and public. We don’t have any security effects.
We assumed this would lead to similar scores between essentially identical APIs, perhaps with some minor variations. We also assumed that because some banks are larger than others, they would return significantly more data, and that this would show in the timings.
Both of those assumptions turned out to be wrong, which tells us a LOT about how different organizations approach this type of problem.
With CASC, we take into account not just overall speed and pass rates but also the general consistency of performance, using machine learning algorithms to identify outliers and performance trends. We then normalize those results into a score out of 999. Scores over 800 are generally good, with few problems; scores between 600 and 800 are concerning; and scores below 600 are completely unacceptable.
We feel fairly confident that in the case of the CMA9 Open APIs, all of the providers should be able to maintain a score over 800.
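To make the banding concrete, here is a minimal sketch of how a normalized score could be mapped to the bands described above. The thresholds come from the text; the actual CASC normalization (outlier detection, trend weighting) is our own pipeline and isn’t reproduced here, and the function name `casc_band` is purely illustrative.

```python
def casc_band(score: int) -> str:
    """Classify a CASC-style score (0-999) into the bands described above.

    Illustrative only: the real scoring pipeline that produces the
    normalized score is not shown here.
    """
    if not 0 <= score <= 999:
        raise ValueError("CASC scores are normalized to the range 0-999")
    if score > 800:
        return "good"          # generally good, few problems
    if score >= 600:
        return "concerning"
    return "unacceptable"
```

On this banding, every CMA9 provider should be sitting in the "good" band.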
From the outset, one bank has been head and shoulders above the rest: Danske. They have been the best-performing bank for the last three months, and consistently among the top performers before that. They have a smaller footprint in terms of branch and ATM locations, but they deliver a consistent service with few issues.
HSBC and Royal Bank of Scotland have also maintained fairly consistent, but weaker, scores. As you can see from the graph, HSBC is consistent, but at the very low end of good performance, with some individual APIs scoring in the low 600s.
On the Up
Barclays Bank have shown the most significant improvement during July, following networking changes made over the course of June. These have removed some ongoing DNS lookup issues and given all their APIs a boost.
Nationwide are this month’s loser, with a dramatic drop in their overall score. We can trace this to a significant API failure at the end of July, which went unnoticed for several days while the API returned hard server errors (HTTP 500).
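A sustained hard failure like that is easy to flag automatically. The sketch below, using an invented `consecutive_failures` helper and an illustrative threshold, shows one simple approach: alert when the most recent calls are an unbroken run of HTTP 5xx responses.

```python
def consecutive_failures(status_codes, threshold=3):
    """Return True if the most recent calls form an unbroken run of
    server errors (HTTP 5xx) at least `threshold` long.

    Threshold and helper name are illustrative, not the monitoring
    logic we actually run in production.
    """
    run = 0
    for code in reversed(status_codes):
        if 500 <= code <= 599:
            run += 1
        else:
            break
    return run >= threshold
```

In practice you would tune the threshold to your call frequency so that a transient blip doesn’t page anyone, but a multi-day outage never goes unnoticed.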
One area we also looked at this month was the specific result breakdowns: how much of each total call time was spent doing what. Our system analyzes DNS lookup, connection time, handshake, upload, processing time and download time.
These are all simple GET requests, so the upload time is effectively zero, making the interesting data the time to first byte and the download time. Added together, these give a good approximation of the work the server is doing.
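For readers who want to reproduce this kind of breakdown: the timestamps exposed by curl (and the libcurl bindings most monitoring stacks use) are cumulative, so each phase is the difference between adjacent marks. The field names below follow curl’s timing variables; the numbers are invented for illustration, not measured values.

```python
# Hypothetical cumulative timestamps (seconds) for a single GET request,
# in the style of curl's --write-out timing variables. Values invented.
timings = {
    "namelookup": 0.021,      # DNS lookup complete
    "connect": 0.048,         # TCP connection established
    "appconnect": 0.110,      # TLS handshake complete
    "starttransfer": 0.380,   # first byte received (TTFB)
    "total": 0.455,           # last byte received
}

def phase_breakdown(t: dict) -> dict:
    """Convert cumulative timestamps into per-phase durations."""
    return {
        "dns": t["namelookup"],
        "tcp_connect": t["connect"] - t["namelookup"],
        "tls_handshake": t["appconnect"] - t["connect"],
        "server_processing": t["starttransfer"] - t["appconnect"],
        "download": t["total"] - t["starttransfer"],
    }
```

The per-phase durations telescope back to the total call time, which is a handy sanity check when wiring up a new collector.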
The final metric we thought would be interesting is the returned content size. Again, we assumed there would be a correlation between the speed of an API and the amount of data it returns. There is a loose relationship, but it’s actually quite complicated, with some banks returning significantly larger payloads much faster than others.
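As a sanity check on a “loose relationship” claim like this, a plain Pearson coefficient is enough. The payload sizes and call times below are invented for illustration; they are not our measured data.

```python
from statistics import mean

def pearson(xs, ys):
    """Plain Pearson correlation coefficient; no external dependencies."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Invented illustrative data: payload size (KB) vs. total call time (ms).
sizes = [12, 35, 8, 60, 22, 45]
times = [180, 260, 210, 300, 450, 240]
```

A coefficient near 1 would mean bigger payloads reliably take longer; a weakly positive value, as in this toy sample, is what a “loose” relationship looks like.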
We also noted that there was a fairly significant difference between the different banks in how they applied the specifications, which we’ll cover in a separate post. It raises the question – how optional is optional?
Feel free to reach out, or check out http://apimetrics.io if you’d like more details on what we do and how we do it.