
Solving Upstream's Data Trust Gap

Posted 16.03.2018

By Giles Edward, CEO, M-Flow. First published in E&P on 1st February 2018.

Accurate measurement has perennially been one of upstream’s biggest challenges, and many in upstream believe that data and digitalisation could help solve their measurement dilemmas. But this will only become reality if the data is reliable and reproducible.

Oil company senior managers are often involved in disputes over what has been measured and how much, and it’s very difficult to prove to a legal standard. Even during relatively straightforward operations, like sending crude from storage to measuring stations, there can be disagreements about the volumes sent and received. And as soon as a dispute gets to court, it usually costs millions to resolve, with decisions essentially based upon unreliable data.

 

Measurement ambitions

In particular, when a combination of fluids such as oil, water, and gas flows from a well, it has traditionally been difficult to measure the individual phases correctly without separating them. Multiphase metering started around 30 years ago with the aim of addressing inaccuracy and reliability, providing valuable data, and delivering cost reductions. Those newly realised savings would also facilitate single-well tiebacks, shared use of existing pad and pipeline facilities, and continuous online monitoring of economically marginal wells – or so the theory went.

Upstream is an industry that takes great pride in its use of technology, science, and risk assessment. However, at the moment, there’s a data trust gap between what’s being promised and what’s being delivered. That’s because over the years the multiphase metering market has not fulfilled its potential.

At present, it’s rare to find a business claiming better than ±10% accuracy from its multiphase meters. More astonishingly, it’s very common to hear people in the field say that they’re lucky to get ±20-30%.

Such a level of uncertainty matters, as I recently heard from an operator out in the field. He told me that as soon as his wells go above 40% water cut he gets emulsion formation. When that happens his storage tanks all fill up, and he has to shut the field in and empty everything. This is just one example, but it’s a ubiquitous issue. Even on a 1,000 barrel-a-day field an emulsion incident can really hurt the bottom line.
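
To put those uncertainty bands into numbers, here is a rough, purely illustrative Python sketch (the figures are hypothetical and the calculation is mine, not the operator’s or M-Flow’s): on a 1,000 barrel-a-day well at 40% water cut, the nominal oil rate is 600 bbl/day, and the width of the band around it depends entirely on the meter’s uncertainty.

def oil_rate_band(total_bpd, water_cut, uncertainty):
    """Return (low, nominal, high) oil rate in bbl/day for a given
    relative uncertainty on the reported oil rate."""
    nominal = total_bpd * (1.0 - water_cut)
    return nominal * (1 - uncertainty), nominal, nominal * (1 + uncertainty)

# Hypothetical 1,000 bbl/day well at 40% water cut
for u in (0.10, 0.30):
    low, nominal, high = oil_rate_band(1_000, 0.40, u)
    print(f"±{u:.0%}: reported {nominal:.0f} bbl/day could really be {low:.0f}-{high:.0f}")
# ±10%: reported 600 bbl/day could really be 540-660
# ±30%: reported 600 bbl/day could really be 420-780

At ±10% the operator is working with a 120 bbl/day spread; at ±30% the spread is 360 bbl/day – more than half the nominal rate – which is why decisions such as when to treat for emulsion start to feel like guesswork.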

In large part this inaccuracy is because traditional meters haven’t been optimised for reliability. Instead, manufacturers have prioritised expensive technology that embeds uncertainty into flow-rate measurement, rather than accuracy and repeatability in the parameters that can be measured directly. This inherently leads to complexity, human intervention, and validation-hungry systems.

To address these weaknesses, cumbersome and expensive test separators remain in operation, but they provide only piecemeal, fragmented information that rarely delivers more than limited value. So while oil and gas companies seek the benefits of access to data, they have been consistently unable to access lower-cost, reliable, and reproducible information sets.

 

Quantifying change

There is no doubt that as an industry we need to spend less per barrel. It’s a theme that’s coming straight from the top: How can we reduce our biggest costs? Can technology stop our salary graph going up, and our production graph going down?

In the onshore industry, where costs are often well-driven, operators who want to reduce expenditure generally have to cut people or workovers; that’s where the biggest costs are. But if they don’t do workovers, they’ll run into trouble later on. Workovers and other well interventions need to be done at some point, and once they have been done, knowing how well they worked determines future strategy. Without measurement it’s very difficult to get a clear picture of what has been effective. The biggest improvement on this front will come from continuous well-by-well data.

We all agree that meters at every well would be best practice if it were cost-effective, and today that’s the reality with M-Flow. Operators can manage an oilfield without having to amend the operational pattern every time something changes. But as with other process industries, this will involve smart monitoring systems throughout the upstream production chain.

One of the biggest factors governing the future price of oil is the extent to which the standardised, manufacturing-like processes that characterise tight oil production are implemented across the industry. In the shale plays of the USA’s ‘lower 48’ states, best practice has spread swiftly between operators and operations. Pad drilling, high-volume completions, and tighter well spacing have all made statistically visible differences to costs and to how quickly and successfully projects are brought to commercialisation.

At the heart of this process has been the requirement for reliable data: the reproducibility and tight process control it enables deliver marginal gains that incrementally compound into significant improvements.

 

Removing statistical doubt

By taking a step back and re-thinking the challenges that have inhibited the growth of multiphase data in the production optimisation market, we have developed a technology solution that provides confidence at the wellhead. Today M-Flow is delivering reliability, repeatability, and accuracy, while simultaneously slashing capital and operational expenditure.

We have focused on understanding well performance through phase fraction measurement, because direct measurement of the phase fractions delivers the key parameters that quantify and signal production change. Moreover, it can be combined with other measurement systems and data points to provide a more complex understanding.
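
To make that concrete for readers who don’t live with these terms, the short sketch below shows how two of those key parameters – water cut and gas volume fraction – fall straight out of measured phase fractions. It is a simplified illustration under common assumptions (the three fractions describe the same pipe cross-section, sum to one, and slip between the phases is ignored); the function and the numbers are illustrative only, not a description of M-Flow’s internal processing.

def derive_kpis(oil_frac, water_frac, gas_frac):
    """Derive water cut and gas volume fraction (GVF) from phase fractions."""
    total = oil_frac + water_frac + gas_frac
    if abs(total - 1.0) > 1e-6:
        raise ValueError(f"phase fractions must sum to 1, got {total:.3f}")
    liquid = oil_frac + water_frac
    water_cut = water_frac / liquid if liquid > 0 else 0.0
    return {"water_cut": water_cut, "gvf": gas_frac}

# Hypothetical wellhead reading: 30% oil, 20% water, 50% gas by volume
print(derive_kpis(0.30, 0.20, 0.50))  # {'water_cut': 0.4, 'gvf': 0.5}

Tracked continuously, even these two numbers are enough to flag the production changes – rising water cut, gas breakthrough – that operators most need to act on.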

M-Flow’s unique carbon fibre construction creates a transparent window on the pipe flow, making it possible to deploy sensor systems fully protected from aggressive oil well fluids. This means that our meters experience none of the harsh-fluid-induced degradation or calibration change that are the main drivers for multiphase flow meter intervention.

In contrast to traditional meters, our new carbon fibre multiphase meters require minimal manpower, lower capital expenditure, and almost zero operational expenditure, with five-year lifecycle costs that are on average 20% of those of traditional multiphase flowmeters – all while delivering directly measured, continuous data on water cut and gas fraction, with confidence in its accuracy and repeatability.

The proven reliability of our technology facilitates a shift in the traditional engagement. By providing this information as a packaged, discrete, and highly valuable data set, we have enabled a shift in focus to where it needs to be: moving dialogue and engagement within the multiphase market away from the meter itself, and onto the impact of accurate, reliable data in redefining upstream operations.

The foundation of the modern, industrialised oilfield is reliable data. But unless the information that's derived at the wellhead is consistently reliable and replicable, we won't fix today’s problems – let alone solve tomorrow’s.

There is broad recognition that digging into a well-managed dataset reveals insights, trends, and patterns that will help increase return on investment, decrease HSSE incidents, and create the foundation for future achievement. In this environment, data trust is a competitive advantage. Success is built upon actionable insights, and that starts with credible data.
