Deep beneath the surface of the cold, stormy North Sea, countless pumps churn away in the dark, feeding the nearly 200 oil rigs scattered atop the dozens of major oil and natural gas fields discovered in the region since the 1960s. It’s the most active offshore drilling region in the world, located in some of the most inhospitable waters for drill rigs. Storms sweep through regularly in the winter and active shipping and fishing lanes bring plenty of traffic through the area as well.
Three hundred feet down, the pressure is ten times what it is on the surface. But the electric submersible pumps (ESPs) sitting atop the wellheads have to contend with even greater pressures exerted on them by the upwelling oil and gas trapped below. The varying viscosity of the oil, the amount of gas mixed into it, and the particulate count all work to shorten the lifespan of the pumps.
Sixty percent of the world’s oil production runs through ESPs, and replacing a broken one in North Sea conditions is no picnic, so major oil producers like the Apache Corporation would love to know when and how to maintain them to avoid unexpected production shutdowns. A one percent improvement in operational efficiency could result in an additional $53 million per day in revenue. But despite the profit motive, there is no easy formula for making that determination, since each well is unique.
Enter prescriptive analytics… the ability to combine large, disparate, near-real-time datasets to not only make predictions about pump performance and wear, but also to provide recommendations on servicing them. Apache plans to combine vast datasets on operating parameters, installation and configuration options, and oil and gas extraction data into a prescriptive analysis to help it eke out that additional efficiency.
Prescriptive analytics isn’t coming only to the oil and gas industry, although it has many more uses there than just reducing unplanned pump failures. It’s a method that can be viewed as the natural evolution of all types of analytics, and a natural area of deep focus in any data science graduate program.
The Natural Evolution of Analytics Leads to Prescriptive Analytics
Although prescriptive analytics exists as a distinct conceptual approach to data analytics, the term “prescriptive analytics” has been trademarked by Ayata, a Texas-based firm that focuses on analytics solutions for the oil and gas industry. Because of the trademark, it can be difficult to find other organizations or data scientists who are using that approach—even the Wikipedia article on prescriptive analytics is largely focused on Ayata’s practice area.
But it’s an idea whose time has come even if you aren’t allowed to call it that.
Prescriptive analytics represents the final logical stage of data-based analysis in business analytics.
- Stage One – Descriptive Analytics: Collected data is analyzed to describe a certain outcome or to determine what the current state of affairs may be; for instance, tallying sales data to arrive at total numbers for revenue and profit.
- Stage Two – Predictive Analytics: Making use of data to forecast trends and outcomes, such as taking historic sales data and using it to predict the likely numbers for the next quarter.
- Stage Three – Prescriptive Analytics: Using data analysis techniques to simultaneously forecast and influence outcomes in a self-optimizing loop that looks at incoming information and uses instant analysis to improve the performance of the system.
In this way, prescriptive analytics incorporates a feedback loop in which descriptive and predictive models are combined to influence one another and direct the trends instead of simply detecting them.
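The three stages can be sketched on toy quarterly sales data. Everything here is a hypothetical illustration: the sales figures, the naive linear-trend forecast, and the capacity threshold are assumptions, not numbers from the article.

```python
# Toy quarterly sales data (revenue in $M, oldest quarter first) - hypothetical.
quarterly_sales = [120.0, 132.0, 141.0, 155.0]

# Stage 1 - Descriptive: summarize what has already happened.
total = sum(quarterly_sales)
average = total / len(quarterly_sales)

# Stage 2 - Predictive: forecast next quarter with a naive linear trend.
deltas = [b - a for a, b in zip(quarterly_sales, quarterly_sales[1:])]
trend = sum(deltas) / len(deltas)
forecast = quarterly_sales[-1] + trend

# Stage 3 - Prescriptive: turn the forecast into a recommended action.
def recommend(forecasted_sales, capacity=150.0):
    """Suggest an action based on forecasted demand versus current capacity."""
    if forecasted_sales > capacity:
        return "expand capacity"
    return "hold capacity steady"

print(f"total={total:.1f}, avg={average:.1f}, forecast={forecast:.1f}")
print("recommendation:", recommend(forecast))
```

The prescriptive step is what separates stage three from stage two: the forecast is not just reported, it is mapped directly to an action.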
Prescriptive analytics uses the power of data science not only to anticipate outcomes, but to understand why and how they come to pass, and how to influence them to take the most desirable path. Optimization is the goal, and prescriptive analysis is particularly useful in stochastic optimization: using the trends available in large, frequently unstructured datasets to account for the random variables that can affect outcomes.
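As a toy illustration of stochastic optimization, the Monte Carlo sketch below searches for a pump service interval that minimizes long-run cost per day when failure times are random. The cost figures and the normal failure-time distribution are hypothetical assumptions, not field data:

```python
import random

random.seed(42)
PLANNED_COST = 1.0    # hypothetical cost of a scheduled service
FAILURE_COST = 10.0   # hypothetical cost of an unplanned pump failure

def cost_per_day(interval_days, trials=20_000):
    """Estimate long-run cost per day of servicing every `interval_days` days."""
    total_cost = 0.0
    total_days = 0.0
    for _ in range(trials):
        # Hypothetical random failure time, in days since the last service.
        failure_day = random.gauss(90, 20)
        if failure_day < interval_days:
            total_cost += FAILURE_COST          # pump failed before service
            total_days += max(failure_day, 1.0)
        else:
            total_cost += PLANNED_COST          # serviced before failure
            total_days += interval_days
    return total_cost / total_days

# Prescribe the cheapest candidate interval.
candidates = range(30, 121, 10)
best = min(candidates, key=cost_per_day)
print("recommended service interval (days):", best)
```

The trade-off is the essence of the technique: servicing too often wastes planned-maintenance cost, servicing too rarely risks expensive failures, and simulation over the random variable finds the balance point.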
This process can be applied not just to ESP maintenance, but to broader areas such as:
- Fracking
- Self-driving cars
- Medical diagnosis and treatment
- Supply chain and delivery optimization
- Pricing of commodity goods
Prescriptive Analytics Closes a Key Gap in Modern Data Science
In one sense, almost all analytics are used prescriptively. The entire point of producing detailed historical reports is to influence future courses of action by decision-makers. But prescriptive analytics differs from these traditional uses by closing the feedback loop: it makes the process almost or entirely automated and collapses the decision matrix back into the analytical engine that generates the reports.
This technology closes a gap that has become an open secret in data science circles: vast amounts of data are collected and never constructively analyzed. And much of the careful analysis and reporting that data scientists do generate is ignored or produced after the fact, too late to influence the decisions that could have benefited from it.
Prescriptive analytics takes much of the delay out of the analysis and incorporates concepts of uncertainty into decision making to optimize outcomes based on the numbers. In fact, the ideal prescriptive analytics solution is one that essentially codifies the observation/analysis/action loop algorithmically… exactly the way that Google is programming its self-driving cars to operate, a classic example of prescriptive analysis.
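A minimal version of that observation/analysis/action loop, with hypothetical vibration readings and thresholds standing in for real ESP telemetry, might look like:

```python
def observe(raw):
    """Ingest one raw sensor reading (already numeric in this sketch)."""
    return float(raw)

def analyze(history, window=3):
    """Predict the next reading with a simple moving average."""
    recent = history[-window:]
    return sum(recent) / len(recent)

def act(predicted, threshold=0.7):
    """Prescribe an action based on the prediction."""
    return "schedule maintenance" if predicted > threshold else "continue"

# Closed loop over a stream of hypothetical pump vibration readings.
readings = [0.41, 0.44, 0.52, 0.61, 0.77, 0.90]
history = []
decision = "continue"
for raw in readings:
    history.append(observe(raw))   # observation
    predicted = analyze(history)   # analysis
    decision = act(predicted)      # action
print("decision:", decision)
```

The point of the sketch is the shape of the loop, not the model: each new observation immediately updates the prediction, and the prediction immediately drives an action, with no analyst sitting between report and decision.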
If prescriptive analytics were easy, though, everyone would already be using the technique. But doing it properly requires a lot of advanced data science techniques. Data scientists who want to work in prescriptive analytics have to become experts at:
- Evaluating and assimilating large amounts of unstructured data
- Creating algorithms for processing and decision support
- Using artificial intelligence and machine-learning techniques for data processing
All of this requires an advanced education in which you refine your programming prowess and your advanced mathematical and statistical skills. A master’s degree in data science is one of the only types of programs to combine all of those skills in a single package. And it’s a package that is sure to be in greater demand as more and more industries advance to the prescriptive stage of analytics.