Is a core part of your business secretly dependent on Excel? If so, then you’re not alone – Excel is still the most common tool for data analysis across a range of industries, and Insurance is no exception. Many businesses still rely on Excel for critical business functions. But that doesn’t make it OK, because Excel wasn’t designed for that purpose.
Imagine a scenario where all your Excel spreadsheets suddenly disappear overnight. How bad would that be? If you’re thinking “we’ve just lost some critical data” then that’s a real concern. If you’re thinking “we’ve just lost a key capability that runs part of our business” - you have a serious issue.
Why Excel probably isn’t the answer.
Excel is awesome for many things, for many reasons:
It’s pretty much ubiquitously available for everyone to use
It’s highly flexible
It’s relatively easy to get started with
It offers a front end, database and calculation engine in one application.
But this accessibility is part of the problem. Spreadsheets that perform key business functions - or that contain critical data - introduce a range of business risks, from GDPR failures to error-prone decision-making. Perhaps, months or years ago, someone built a spreadsheet to help calculate loss ratios, to price a niche asset class, or to create management information reports. Fast forward to today, and that brilliant spreadsheet has become a black hole of data and logic: it has been iterated on many times, and the provenance and parameters of the data and algorithms it uses are unclear.
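To make the contrast concrete, here is a minimal sketch of the alternative: the kind of loss-ratio formula that typically hides in a spreadsheet cell, rewritten as an explicit, testable function. The field names and the simple incurred-over-earned definition are illustrative, not a prescribed standard.

```python
# A hidden spreadsheet formula made explicit: named inputs, a documented
# definition, and input validation that a cell reference cannot give you.
def loss_ratio(incurred_claims: float, earned_premium: float) -> float:
    """Incurred claims divided by earned premium, as a decimal."""
    if earned_premium <= 0:
        raise ValueError("earned premium must be positive")
    return incurred_claims / earned_premium

# Unlike a cell formula, this logic can be version-controlled and unit-tested:
assert loss_ratio(650_000, 1_000_000) == 0.65
```

The point is not the arithmetic, which is trivial, but that the definition, its assumptions, and its edge cases now live somewhere reviewable.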
Furthermore, data in Excel files is rarely available centrally or in a standard format for later use in more advanced data analytics, keeping potential value locked away in folder structures. The distribution of these files also carries risk from a governance perspective and makes consolidated reporting on real-time KPIs extremely difficult and fragile. However, this isn’t Excel’s fault - it was never designed to be a core system in terms of:
Scalability: how big can it safely grow? (Remember the UK’s Covid test-and-trace incident, where thousands of case records were lost because an old Excel file format hit its row limit?)
Traceability and auditability: how easy is it to check the decision-making process?
Availability: how ready is it for 24x7x365 usage at scale?
Why is this a particular challenge in Insurance?
Let’s be clear – this challenge exists across every industry. You wouldn’t believe the number of data analytics projects that have started with “so, we have this spreadsheet…” before a more industrial solution is designed. But insurance tends to be rife with these spreadsheets because of a few factors:
Insurance is data rich. There are so many sources of data available to support decision-making. Excel is often used to bring these sources together, or to do quick analyses to support thinking.
Insurance businesses need to get to answers quickly. Success in Insurance is fundamentally tied to the quality of decision-making. Because of the drive to get to an answer, particularly in more niche asset classes (or in reinsurance), we often turn to Excel to help us make a decision. And because we need to make the decision quickly, we might do ‘one-off’ analyses rather than build a robust tool.
Actuaries are skilled in working with data, not building products. Historically, actuarial education has focused almost exclusively on building domain knowledge, supported by fostering skills in data analysis. However, formal programming skills have never been part of the equation, and there is still nothing on the syllabus today to help actuaries build stable, well-designed data products. This means that Excel often becomes the go-to solution in Insurance for anything related to analysis.
Much of this relates to what Excel was designed for – quick, one-off analyses. But it’s when these spreadsheets become reusable assets, or the foundations for repeatable analysis, that we run into problems.
So, what should we do next?
Let’s meet the challenges of today in a way that positions us for the future. Here are some positive steps that could, and should, be taken if any of the above rings true:
Step 1: Tactical risk mitigation.
Understand the role that Excel currently plays and make tactical changes.
Audit Excel to clarify which decisions are made where
Audit your data to identify if there is valuable (or sensitive) data that needs to be moved quickly to a more appropriate data management platform
Review instances where Excel provides a critical business capability to understand what you are trying to achieve
Prioritise activities to reduce initial risks.
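As a hypothetical starting point for the audit above, a short script can inventory every workbook on a shared drive before any manual review begins. The root path is a placeholder; a real audit would also capture owners, linked data sources, and the decisions each workbook supports.

```python
# Walk a folder tree and inventory Excel workbooks: path, size, last modified.
# Sorting by size is a crude but useful proxy for "this grew into a system".
import os
import time

def inventory_workbooks(root: str) -> list:
    rows = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            if name.lower().endswith((".xls", ".xlsx", ".xlsm")):
                path = os.path.join(dirpath, name)
                stat = os.stat(path)
                rows.append({
                    "path": path,
                    "size_bytes": stat.st_size,
                    "modified": time.strftime(
                        "%Y-%m-%d", time.localtime(stat.st_mtime)
                    ),
                })
    # Largest first, so the review starts with the likeliest problem files
    return sorted(rows, key=lambda r: r["size_bytes"], reverse=True)
```

The output is a simple list of dictionaries that can be dropped into a register or a reporting tool to drive the prioritisation in Step 1.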
Step 2: Lay the foundation for success.
Migrate to the right tools and start using Excel for the things it was designed for.
Implement design standards to ensure the right tools are used in the right way, for the right thing
Implement bias and ethics standards to ensure your data is representative and used ethically
Review and revamp your security around Excel to minimise impact of cyber-attacks.
Step 3: Prove and release value quickly.
Enable your organisation to release value from data.
Upskill your people to be confident using your new data tools and processes
Remove Excel from critical business functions in a prioritised order
Invest in a strategic data platform and migrate there in a pragmatic and phased way
Pilot meaningful and achievable solutions that show that data can be used correctly to release value quickly.
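The phased migration in the steps above can be sketched in miniature: spreadsheet tabs exported to CSV and loaded into a single queryable store. SQLite here is just a stand-in for whichever strategic data platform you choose, and the table and column names are hypothetical.

```python
# Load a CSV export of a spreadsheet tab into a central SQLite database.
# One small, repeatable step of a pragmatic, phased migration.
import csv
import sqlite3

def load_csv(conn: sqlite3.Connection, table: str, csv_path: str) -> int:
    """Create the table from the CSV header (if needed) and insert all rows."""
    with open(csv_path, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)
        cols = ", ".join(f'"{h}"' for h in header)
        placeholders = ", ".join("?" for _ in header)
        conn.execute(f'CREATE TABLE IF NOT EXISTS "{table}" ({cols})')
        rows = [tuple(r) for r in reader]
        conn.executemany(f'INSERT INTO "{table}" VALUES ({placeholders})', rows)
        conn.commit()
    return len(rows)
```

Once the data sits in one queryable place, consolidated KPI reporting becomes a query rather than a fragile chain of linked workbooks.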
With some strategic thought and a thorough review of existing processes and behaviours, you can move Excel back into the space where it’s great: short, sharp bursts of creativity that help you understand something quickly. But it’s time to start using proven 21st-century data and digital best practices that will set your organisation up for ongoing success. And if you need any support, don’t hesitate to get in touch.