How can we answer ‘what if’ questions while, at the same time, embracing the automation that opens up huge data sources through analytics or machine learning? When we see things more clearly, we can help shape the future, and doing so now requires the skills of both forecaster and data scientist.
Predictive analysis is like forecasting, but it is no longer just for planners. Firstly, we now need more depth of analysis, a shift that is clear if we look back over past years. To properly understand variances in this world of rapid change, and to model their impacts, we also want to tap into vastly more data, of different kinds. There are complex operating models and data relationships to map. Planning models are no longer simple smoothing forecasts.
Secondly, we are losing our dependence on complex spreadsheets that can only be changed by their creators. Now, models are more like computer programs, built with rule-based languages, with structured, automated data feeds and the governance that comes with this approach. At RSA, for instance, the models in Anaplan are programmed by planning analysts using iterative development and agile sprints. Scrum teams involve subject experts from across the business. Once set up, multiple scenarios can be managed with astonishing speed. Data is trusted, coming from a source shared with Finance, HR and others. This matters: budget planning cycles are often an organisation’s only truly cross-functional alignment.
Thirdly, we want to tackle problems before they happen, by creating forward-looking performance indicators and dashboards. Forecasts are built in, the result of joining up your drivers. Think: if X occurs, then this is the impact on Y or Z. We want to predict revenue, costs, and customer and employee satisfaction. It’s all about risk and downstream consequences. In theory, with the right data and resources, we can produce predictions for any of our metrics. Crucially, this work is likely to be replicated across the operation, so pool resources and don’t work in silos. You need analytical, programming and planning skills, plus business understanding, but these can come from collaboration across any mix of teams.
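The “if X, then this is the impact on Y or Z” idea can be sketched as a tiny driver-based model. All the driver names and numbers below are illustrative assumptions, not figures from any of the organisations mentioned:

```python
# Minimal driver-based forecast: change an upstream driver (X, here
# contact volume) and see the modelled impact on downstream metrics
# (Y: operating cost, Z: revenue). All figures are illustrative.

def forecast(contact_volume, handle_time_mins, cost_per_agent_hour,
             conversion_rate, avg_order_value):
    """Derive downstream metrics from upstream drivers."""
    agent_hours = contact_volume * handle_time_mins / 60
    cost = agent_hours * cost_per_agent_hour                       # Y
    revenue = contact_volume * conversion_rate * avg_order_value   # Z
    return {"agent_hours": agent_hours, "cost": cost, "revenue": revenue}

baseline = forecast(10_000, 6, 30.0, 0.05, 80.0)
surge = forecast(13_000, 6, 30.0, 0.05, 80.0)  # what if volume rises 30%?
```

Once the drivers are joined up like this, running a new scenario is just a change of inputs, which is what makes multi-scenario planning fast.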
Predicting weather impacts
Weather is a good example, as many operations face impacts from extreme weather events, and a wealth of publicly available source data can be used to build models that predict the impacts in a specific operational scenario. At LV=, for instance, the new connected planning system for Home Claims is a sector first. Developed with Guy Carpenter and their reinsurer, it is transforming surge response to major storms. Projections from forecasting agency EuroTempest are overlaid on a map of UK home insurance customers, and a model predicts surge impacts to evaluate resourcing contingencies. Data drives the decisions: “You see what’s waiting to happen, before it happens, and we can quickly model multiple scenarios”. At Anglian Water, the data science team’s weather impact analysis during the 2018 ‘Beast from the East’ raised their profile massively in the business. In planning, the team are using AI and machine learning forecasts, developed with their analytics partner CACI.
Data: the key to analytics
If you have joined-up data, analytics and machine learning are powerful tools. They work like any predictive analysis, but can process large volumes of data automatically and at speed, trying many models to find the best fit. In customer operations, you want data for the whole end-to-end journey, with no gaps, and data tags that link it up. Even better, use metadata that can link to wider public data sources: age, gender, postcode and so on. Statistical methods like regression are common, testing the hypothesis that independent variables are correlated with the outcome you want to predict. When you find the right combination, you can build a predictive model. You can also think of data science and speech or text analytics as tools for root cause analysis, just working at large scale in an automated way. Nor is analytics limited to strategic planning: at Jet2, real-time analytics from QStory analyses all key performance drivers to explain the underlying reasons if an SLA is missed. Remember, any statistical analysis is based on historical correlation, yet the relationship between variables can change. So it’s important to fully understand the assumptions built into your model (see box). Risks need to be understood and governed, as models can soon become outdated if assumptions start to change.
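The regression step described above can be sketched in a few lines. This is a minimal ordinary-least-squares fit of one driver against one outcome; the volumes and costs are made-up data, and a real analysis would test the strength of the correlation before trusting the fitted line:

```python
# Simple least-squares regression: test whether a driver (e.g. daily
# contact volume) is linearly related to an outcome (e.g. cost), then
# use the fitted line to predict. The data points are invented.

def fit_line(xs, ys):
    """Ordinary least squares for y = slope * x + intercept."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    return slope, intercept

volumes = [100, 200, 300, 400]      # independent variable (driver)
costs = [1050, 2000, 3020, 3950]    # observed outcome

slope, intercept = fit_line(volumes, costs)
predicted = slope * 500 + intercept  # predict cost at an unseen volume
```

Machine learning tools automate exactly this kind of fitting, but across many candidate models and far more variables at once.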
What is machine learning?
AI, machine learning and analytics all use complex algorithms to build models. You may ask: does my problem require machine learning? If so, take time to study some of the distinctions and features; there has been an explosion of innovation in recent years. For instance, not all AI is machine learning, and plain analytics is powerful if you know how to set up the model. With machine learning, computers re-program themselves, learning from external inputs, so they get better and better at key tasks over time. At Openreach (pg 40), Network Analytics uses decision trees, a type of machine learning often seen in medical imaging. Local engineers had known of, and been frustrated by, high volumes of unnecessary work, so the team worked with them to find cases where this could be removed. Automated algorithms were created to identify these in the data and change the work allocation rules accordingly. More broadly, supervised learning can be via classification (labelling data) or via regression (as above). Neural networks are another common method, with ‘deep learning’ a layering of these.
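To make the decision-tree idea concrete, here is a toy classifier in the spirit of the Openreach example: a tree of yes/no splits that flags a work order as likely unnecessary. The features, thresholds and rules are invented for illustration; a real decision tree would be learned automatically from labelled historical data, not hand-written:

```python
# A toy hand-built decision tree: classify a work order as likely
# 'unnecessary' or 'required' from a few features. Feature names and
# thresholds are illustrative assumptions only.

def classify(order):
    """Walk a small tree of yes/no splits."""
    if order["fault_cleared_remotely"]:
        return "unnecessary"  # no site visit needed
    if order["repeat_visit_within_days"] < 7 and not order["new_fault_code"]:
        return "unnecessary"  # likely a duplicate of a recent job
    return "required"

orders = [
    {"fault_cleared_remotely": True, "repeat_visit_within_days": 30, "new_fault_code": False},
    {"fault_cleared_remotely": False, "repeat_visit_within_days": 3, "new_fault_code": False},
    {"fault_cleared_remotely": False, "repeat_visit_within_days": 30, "new_fault_code": True},
]
labels = [classify(order) for order in orders]
```

This is classification, labelling each record; what machine learning adds is choosing the splits and thresholds itself, by finding the ones that best separate the labelled training data.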
This article was first published in the 2020 Best Practice Guide - 2020 Vision: Crystallising your knowledge