Before the pandemic, companies had several tools to predict future outcomes. They could plan from several weeks to several years in advance and shape their machine learning models to deliver insights on what the future might hold, based on what those models had learned from historical data. Long-term forecasting was never completely accurate, but now those models and plans are producing scenarios that aren't realistic, making them unreliable for companies in need of direction amid the current economic disruption.
It's time to throw out the three-year plans. Today's world calls for a sharp short-term shift toward predictive models that focus on recent, real-time data. Companies need to be nimble and act quickly on reliable, accurate predictions to shape their decision-making, especially as the importance of safety is at an all-time high and the margin for error within budgets continues to thin.
As businesses look to recapture accurate data and analytics on their customers, they should embrace a new approach to their planning process: real-time forecasting.
Past results can no longer provide accurate future guidance
Instead of taking the traditional approach of relying on historical data, this new approach uses only the data that is relevant under the pandemic's new conditions. Model your most recent data with current variables only. Historical data from before the pandemic won't generate the accurate, high-value insights that your business needs. It's also crucial that these models react quickly to keep pace with the rapid changes in consumer behavior taking place.
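In practice, restricting a model's training set to a recent window can be as simple as filtering observations by a cutoff date before retraining. The sketch below is a hypothetical illustration in plain Python; the `recent_window` helper, the 90-day cutoff, and the synthetic daily sales series are assumptions for demonstration, not details from the article:

```python
from datetime import datetime, timedelta

def recent_window(rows, days=90, now=None):
    """Keep only (timestamp, value) observations from the last `days`
    days, discarding pre-pandemic history that no longer reflects
    current consumer behavior."""
    now = now or datetime.utcnow()
    cutoff = now - timedelta(days=days)
    return [(t, v) for t, v in rows if t > cutoff]

# Hypothetical daily sales series spanning one year (most recent first).
now = datetime(2020, 9, 1)
rows = [(now - timedelta(days=d), float(d)) for d in range(365)]

recent = recent_window(rows, days=90, now=now)
print(len(recent))  # 90 most recent observations survive the cutoff
```

The filtered series, rather than the full year of history, then feeds whatever forecasting model the team already uses, so pre-pandemic patterns never reach the training step.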
Retraining models is an investment that will outlast the pandemic
Adapting models for the current situation is essential to short-term accuracy, but making the move now will pay dividends as conditions continue to evolve over the long term. By retraining machine learning models and continuously testing their effectiveness, organizations will build resiliency against the next crisis. Companies running outdated models are seeing significant drift in accuracy, highlighting the need for updated models that deliver more effective predictions. Data science teams must keep their models current to create a truly "real-time" forecast that showcases the scenarios most likely to unfold.
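One lightweight way to operationalize this is to monitor the model's recent error against the error it achieved at validation time, and trigger retraining when the gap widens. The sketch below is a minimal illustration, assuming a mean-absolute-error metric and a tolerance multiplier; the `should_retrain` helper, the threshold of 1.5x, and the example error windows are hypothetical, not from the article:

```python
import statistics

def should_retrain(recent_errors, baseline_error, tolerance=1.5):
    """Flag model drift: recommend retraining when the mean absolute
    error over the most recent window exceeds the baseline error
    (measured at validation time) by more than `tolerance`x."""
    if not recent_errors:
        return False
    return statistics.mean(recent_errors) > tolerance * baseline_error

# Hypothetical example: a model validated at a baseline MAE of 10 units
# starts missing badly as consumer behavior shifts.
baseline_mae = 10.0
stable_window = [9.5, 10.2, 9.8, 10.1]
drifting_window = [14.0, 16.5, 18.2, 20.1]

print(should_retrain(stable_window, baseline_mae))    # False
print(should_retrain(drifting_window, baseline_mae))  # True
```

Running a check like this on a schedule turns "retrain when things feel off" into a concrete, automatable trigger.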
Significant change has already started, but it’s not too late to catch up
Across industries, the pandemic is producing challenges that companies have never faced before. How can models relying on past data predict an accurate future when the present is an untraveled trail? Capgemini research shows how drastically buying behaviors have shifted within transportation and retail, with an overwhelming focus on safety, hygiene, and ways to avoid contact. Producing a touchless customer experience with the latest automation capabilities is paramount to earning loyal consumers who appreciate efficiency rather than added shopping stress during this difficult time. Companies must adapt quickly to these emerging trends and keep a finger on the pulse of the industries they operate within. Real-time forecasting could go a long way toward delivering those insights.
In the past six months, countless companies have seen budgets shrink, sales stall, supply chains fall into disarray, and operations shift into overdrive. Organizations must gain accurate visibility into what could be coming next. Customer patterns aren't done changing, and many may not settle into new, consistent routines for years to come. IT and data science teams must work in unison to produce relevant models that identify the most likely outcomes and put businesses in a position to make informed decisions that set them on the path toward economic recovery.
No one truly knows what’s coming next, but there’s plenty of technology at our disposal to come close with our predictions. Real-time forecasting is a way to effectively prepare for what’s to come. With so little data and even less history on this pandemic-influenced world, there aren’t many alternatives.
Dan Simion leads the AI & Analytics practice for Capgemini North America. He has more than 25 years of experience in data science, advanced analytics, and technology-enabled applications and solutions. Dan's focus areas are artificial intelligence and machine learning, and his publications include "Marketing Analytics Capabilities," "Harnessing the Power of Private Label," and "Systems and Tools to Track Marketing Effectiveness."
The InformationWeek community brings together IT practitioners and industry experts with IT advice, education, and opinions.