
Thinking about proxies

Recent conversations at work have got me thinking about the proxy metrics that we use, and how much nuance and detail they hide.


Last week, we had a look at a tool that presented a ‘cybersecurity dashboard’ for our organisation. It is a powerful tool, with lots of capabilities for investigating and remediating security issues across our IT infrastructure estate. But what struck me was a big number presented front-and-centre on the first page. It looked a bit like this¹:

It was simply a percentage. I’ve been pondering it since, wondering if it is useful or not.

80.4%. Is this good? If that’s my organisation’s score, can I sleep well at night? When I was at university, an average score of 70% in your exams and coursework meant that you were awarded a first-class degree. So that number has always stayed with me and has felt intrinsically ‘good’. 80.4% is substantially higher than this. But what about that other 19.6%? Can we relax, or do we need to keep pushing to 100%? Can you ever truly be 100% secure if you’re running any kind of IT system?

Perhaps it is meant as a useful jumping-off point for investigation. Or it is meant to be used over time, e.g. last week we were 78.9% and now we’re 80.4%, so things are going in the right direction. Maybe both. I’m not sure.

It’s a common idea that executives don’t want the detail. They simply want to see a big green light that says that things are ok. If there’s no green, they want to know that things are being dealt with in order to bring the amber or red thing back to green again. In the example above, although the ‘speed gauge’ is blue, it is still an attempt to aggregate all of the cybersecurity data across an organisation into a simple number. To me, it feels dangerous to boil it down to a single proxy metric.
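To make that danger concrete, here’s a toy example. The numbers and the scoring formula are entirely invented (nothing to do with how the real product works): two estates that produce an identical ‘percentage of checks passed’, even though one of them is failing every critical check.

```python
# Hypothetical, made-up check counts to show how a single percentage
# can hide what actually matters.
estate_a = {"critical": (0, 5), "low": (98, 495)}   # (failed, total) checks
estate_b = {"critical": (5, 5), "low": (93, 495)}   # every critical check failing!

def score(estate):
    """Naive aggregate score: percentage of all checks passed."""
    failed = sum(f for f, _ in estate.values())
    total = sum(t for _, t in estate.values())
    return 100 * (total - failed) / total

print(score(estate_a), score(estate_b))  # both estates score 80.4
```

Both estates report 80.4%, yet only one of them should let anyone sleep at night.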

I likened the single score to a song being reduced to its average frequency. Music can make us laugh, sing or cry. It can make our pulses race and our hearts throb. But the beauty and nuance is completely lost if you take the average of all of the sound and boil it down to one long continuous tone. (Someone has actually done this so you can hear examples for yourself.²)
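For the curious, here’s roughly what that ‘averaging’ could look like in code. This is a toy sketch using a power-weighted mean of the spectrum (one plausible definition of a song’s ‘average frequency’), applied to synthetic tones rather than a real recording:

```python
import numpy as np

def average_frequency(signal, sample_rate):
    """Power-weighted mean frequency of a signal (a spectral centroid)."""
    power = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    return float(np.sum(freqs * power) / np.sum(power))

sample_rate = 44100
t = np.arange(sample_rate) / sample_rate  # one second of audio

# A C major chord: three distinct notes (C4, E4, G4)...
chord = sum(np.sin(2 * np.pi * f * t) for f in (261.63, 329.63, 392.00))

# ...collapses to a single number, indistinguishable from one steady tone.
print(f"{average_frequency(chord, sample_rate):.1f} Hz")
```

Whatever number comes out, it tells you nothing about the chord, the rhythm, or why the music moved you.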


Food writer, journalist and activist Jack Monroe wrote an incredibly insightful thread on the latest inflation figures. The news headlines were screaming that the inflation number is 5.4% — a 30-year high. However, this hides the nuance of what exactly has been increasing in price and what has remained static. As usual, the poorest in society bear a disproportionate brunt of the increase. For people who depend on the cheapest goods, inflation is much higher, as the cost of those goods has been increasing at a much higher rate. Her original thread is well worth a read:

It was wonderful to see this thread get so much attention. Today Monroe announced that the Office for National Statistics will be making changes:
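The arithmetic behind Monroe’s point is simple enough to sketch. The prices below are illustrative (not real ONS data), but they show how the same ‘inflation’ calculation gives wildly different answers depending on whose basket you measure:

```python
# Illustrative prices in GBP, in the spirit of Monroe's examples
# (not real ONS data).
price_changes = {
    "value pasta":   (0.29, 0.70),  # cheapest lines rose sharply
    "branded pasta": (1.50, 1.55),  # premium lines barely moved
}

def inflation_pct(basket):
    """Percentage change in the total cost of a basket of items."""
    old = sum(before for before, _ in basket)
    new = sum(after for _, after in basket)
    return 100 * (new - old) / old

budget_basket = [price_changes["value pasta"]]
premium_basket = [price_changes["branded pasta"]]

print(f"budget shopper:  {inflation_pct(budget_basket):.1f}%")
print(f"premium shopper: {inflation_pct(premium_basket):.1f}%")
```

On these numbers the budget shopper’s basket has more than doubled in price while the premium shopper barely notices, yet both experiences are folded into one headline figure.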

Financial data

I’ve been working in Financial Services for over 20 years. During the financial crisis of 2007–2008, I was employed by one of the banks that suffered terrible losses. In the lengthy report that was published to shareholders, it was notable that there was a dependency on a number of metrics, such as Value at Risk, which were in effect ‘green’ even as the global financial system started to unravel. The actual problem was the sheer volume of toxic financial products on the balance sheet; as soon as the assumption of how much they were worth was revised, it triggered eye-watering losses.

From the report:

UBS’s Market Risk framework relies upon VaR and Stress Loss to set and monitor market risks at a portfolio level. [p19]

In the context of the CDO structuring business and Negative Basis and AMPS trades, IB MRC [Market Risk Control] relied primarily upon VaR and Stress limits and monitoring to provide risk control for the CDO desk. As noted above, there were no Operational limits on the CDO Warehouse and throughout 2006 and 2007, there were no notional limits on the retention of unhedged Super Senior positions and AMPS Super Senior positions, or the CDO Warehouse… [p20]

In other words, the amount of ‘good quality’ collateralised debt obligations (CDOs) that could be held on the balance sheet wasn’t subject to a cap. These were the instruments that were later found to be ‘toxic’.

MRC VaR methodologies relied on the AAA rating of the Super Senior positions. The AAA rating determined the relevant product-type time series to be used in calculating VaR. In turn, the product-type time series determined the volatility sensitivities to be applied to Super Senior positions. Until Q3 2007, the 5-year time series had demonstrated very low levels of volatility sensitivities. As a consequence, even unhedged Super Senior positions contributed little to VaR utilisation. [p20]

This means that the model, which produced a ‘green’ status for Value at Risk, was based on historical data that said ‘everything is fine’. No consideration seems to have been given to the sheer volume of CDOs being held. As the financial crisis unfolded and it became clear that the assets were no longer worth 100% of their face value, the revaluations resulted in nearly USD 50bn in losses.
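That mechanism is easy to sketch. Below is a toy historical-simulation VaR with invented numbers (not UBS’s actual model): because the measure scales with the volatility observed in the input time series, a benign history makes even a USD 50bn position look almost riskless.

```python
import numpy as np

def historical_var(position, daily_returns, confidence=0.99):
    """1-day historical-simulation VaR: the loss at the chosen
    confidence level when the position is revalued under each
    historical daily return."""
    pnl = position * np.asarray(daily_returns)
    return -np.percentile(pnl, 100 * (1 - confidence))

rng = np.random.default_rng(seed=0)
position = 50e9  # a hypothetical USD 50bn of 'AAA' Super Senior paper

# Pre-2007, the AAA time series showed almost no volatility...
calm_history = rng.normal(0.0, 0.0001, size=1250)  # ~5 years of trading days
# ...so even an enormous, unhedged position produces a tiny VaR.
print(f"VaR on calm history:     USD {historical_var(position, calm_history) / 1e6:.0f}m")

# Feed in a stressed series and the very same position looks very different.
stressed_history = rng.normal(0.0, 0.02, size=1250)
print(f"VaR on stressed history: USD {historical_var(position, stressed_history) / 1e6:.0f}m")
```

The position never changed; only the history fed into the model did. That is exactly why a ‘green’ VaR said so little about USD 50bn of unhedged Super Senior risk.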

Proxies should be a jumping off point

Proxies are attractive because they boil complex things down into simple metrics that we think we can interpret and understand. Wherever I see or use them, I need to think about the assumptions they are built on and check that they are not being substituted for the important details.

  1. Taken from the Microsoft Windows Defender ATP Fall Creators Update 
  2. Lots of people on forums seem baffled as to why anyone would want to do that. I love that someone has done it. 
