April 4, 2016

How Big and Fast Data Can Transform the Oil and Gas Industry

Theo Kambouris

(Stuart Miles/Shutterstock)

The oil industry has generally lagged behind other industries when it comes to new technology. Most other blue chip industries, for example, have gone paperless for most things, with the obvious benefit of faster, more efficient movement of information across the organization. With so many internal processes in place, not to mention the vast amount of data exchanged between oil operators' and suppliers' firewalls, these companies seem like perfect candidates for going paperless, right? Yet the industry has struggled to catch up even to these now-legacy technologies; it is still inundated with mounds of spreadsheets, paper purchase orders, and invoices. Why? Clearly it's not a cost factor: the funds are there, and we've seen companies like SAP do very well. The answer is simple: it's risky, and for very good reason.

Oil & Gas expert Mark LaCour, who co-hosts a great podcast out here in Houston, summed it up well during one of his shows. He explained that this industry is unique because its processes and procedures are generally tied to much higher consequences, even life or death for folks in the field. So if something is working and everyone is safe, there is an incredibly high resistance to change. Considering the engineers in this industry make others look like they're playing with children's toys, it's hard to believe they aren't sharp enough to pick up on new concepts. With that, I have to agree with Mark: this is not an industry of laggards; it is an industry of very careful people, doing very difficult work, who are not eager to take on jeopardizing risk.

Despite all of this, we might be in for a big change in big oil. While many drivers may consider filling their gas tanks for $20 a dream come true, the recent plunge in oil prices is a nightmare for the upstream side of the industry. The price of crude has fallen from a high of around $140 per barrel in 2008 to $30 or less. Faced with this dramatic drop, the oil industry is looking for ways to cut costs and become more operationally efficient, and analysts have been hammering the industry for years on the same point. When the barrel price recovers, you have to believe the market will be watching closely to see which companies are equipped to succeed in the information age, and which ones are dinosaurs.

While most companies understand they need to make changes and take advantage of their data, the big question is where to focus. More specifically: what's going to move the needle? The big data use cases most E&P companies (and some service providers) are focused on right now are directly related to reducing lost revenue and op-ex caused by non-productive time in the field. Predictive maintenance has to be your number-one priority; if you haven't started tackling it, you're already behind in the game.

In addition to volatile market forces, a major source of loss for oil production companies is Non-Productive Time (NPT): time wasted due to technical or physical difficulties in extraction. Production delays can cost several million dollars per day, depending on their impact. To survive and thrive, energy companies must protect themselves from NPT, and they can do so by using data to improve efficiency and productivity. In our increasingly data-dominated environment, monitoring data output is a practice the oil industry, like many other heavy industries, should have adopted much earlier. The goal now is to catch up and start seeing value quickly.

Mike Jensen, president of 4Atmos, explained that in today's upstream Oil & Gas industry, every minute of uptime counts. The cost of a rig going down is far more than the day rate lost; it may mean the difference between keeping a contract and losing it, or between maintaining and eroding ever-shrinking profit margins. The potential savings run into the billions of dollars each year. We have to use big data to alert the rig, as quickly as possible, to the hidden signatures that lead to catastrophic failure. The singular goal is to hit Total Depth on each well on time, on budget, and safely; there is no room for error. Big data is no longer a "nice to have" technology; it's a must-have in today's economy.

A digitized oilfield can produce over 500,000 data points per second, and many companies have hundreds of wells to monitor and analyze. Big data, and particularly fast data, is the key to countering NPT, improving efficiency, and increasing production revenue. Before big data monitoring became widespread, NPT was dealt with mostly manually, and the financial loss was treated as an inevitable cost of doing business.

Fast data is data that is generated, synchronized, and acted upon in real time. Companies using fast data ingest millions of data points per second, transmitted directly from sensors embedded in drilling operations. As new data arrives, big data platforms compare it to historical data sets to catch anomalies as they occur. Once a defect is identified at its root, companies can automate an appropriate response, whether that means dispatching an engineer to the field or adjusting a drill's trajectory. Not only does this lead to better safety, faster logistics, and more efficient and targeted operations; it can also drastically reduce the high impact NPT has on these companies.
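The compare-to-history pattern described above can be sketched in a few lines. This is a minimal illustration, not any vendor's implementation: the class name, window size, and z-score threshold are all assumptions chosen for clarity.

```python
from collections import deque
from statistics import mean, stdev

class SensorAnomalyDetector:
    """Compare each new sensor reading to a rolling historical baseline
    and flag readings that deviate beyond a z-score threshold."""

    def __init__(self, window: int = 100, threshold: float = 3.0):
        self.history = deque(maxlen=window)  # recent readings form the baseline
        self.threshold = threshold

    def ingest(self, reading: float) -> bool:
        """Return True if the reading is anomalous versus the baseline."""
        anomalous = False
        if len(self.history) >= 30:  # wait for enough history to be stable
            mu, sigma = mean(self.history), stdev(self.history)
            if sigma > 0 and abs(reading - mu) / sigma > self.threshold:
                anomalous = True
        self.history.append(reading)
        return anomalous

# Simulated drill-torque feed: steady cyclical values, then a sudden spike.
detector = SensorAnomalyDetector()
feed = [100.0 + (i % 5) * 0.5 for i in range(60)] + [140.0]
alerts = [i for i, r in enumerate(feed) if detector.ingest(r)]
print(alerts)  # only the spike at index 60 is flagged
```

In production, the flagged index would trigger the automated response the paragraph describes, such as paging a field engineer; a real platform would also compare against much longer historical baselines, not just a rolling window.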

By utilizing fast data, oil companies can realistically save between $500,000 and $1 million per day on average by cutting down on high-impact NPT, which adds up to significant long-term savings in the face of falling market value. Energy companies that implement a fast data architecture will also prevent the frequent, minor losses in day-to-day production, effectively clearing the path for the oil industry to operate in, and benefit from, a big data-dominated environment.

The good news is that a project like this should never require you to "rip and replace" any of your crucial processes. The right data platform should let you connect easily to all of your legacy applications and databases, including historical data, without disrupting your current processes. Once you've created a landing place for all vital data, including your unstructured and semi-structured data, you can begin to tackle the problem with the help of data science and the right partners. Keep in mind that you will need a highly available, performant platform with streaming capability, or you are unlikely to succeed any time soon. The game changer is, for example, the ability to identify issues in your blowout preventer stack before something goes wrong, rather than being alerted when, or after, it does.
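The difference between predictive and reactive alerting can be illustrated with a simple trend extrapolation. This is a hedged sketch only: the function, the accumulator-pressure figures, and the 3,000 psi limit are hypothetical, and a real system would use far richer models than a least-squares line.

```python
def time_to_threshold(samples, threshold, dt=1.0):
    """Fit a least-squares line to recent sensor samples and estimate
    how many time steps remain before the trend crosses `threshold`.
    Returns None if the trend is flat or moving away from the limit."""
    n = len(samples)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(samples) / n
    denom = sum((x - x_mean) ** 2 for x in xs)
    slope = sum((x - x_mean) * (y - y_mean)
                for x, y in zip(xs, samples)) / denom
    if slope <= 0:
        return None           # not trending toward the limit
    if samples[-1] >= threshold:
        return 0.0            # already past the limit: reactive territory
    return (threshold - samples[-1]) / slope * dt

# Hypothetical pressure readings creeping up by 2 psi per step toward 3000 psi.
readings = [2900 + 2 * i for i in range(20)]
eta = time_to_threshold(readings, threshold=3000)
print(eta)  # ≈ 31 steps of headroom: time to act before failure, not after
```

The point of the sketch is the contrast: a reactive system fires when `samples[-1] >= threshold`, while a predictive one surfaces the estimated headroom early enough to schedule an intervention.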

The fickle market has undoubtedly damaged the behemoth oil industry of late, but the move to data-driven automation is expanding to a large scale, and the opportunities to recover lost profit by cutting down on day-to-day hindrances are enormous. Other blue chip verticals, from manufacturing to railroads, have an amazing opportunity to follow in the oil industry's footsteps and invest in their core equipment with fast data. By adapting to our increasingly data-dominated environment, even the most longstanding industries can benefit from keeping their fingers on the pulse of their data output, and from the vast advantages that come with that knowledge.

 

About the author: Theo Kambouris is an Industry Director and Energy Specialist at MapR Technologies, where he is responsible for managing the oil and gas team in Houston, Texas. Theo has an in-depth knowledge of the oil and gas industry, with particular expertise in use cases related to oil and gas exploration and production, completion engineering, sensors, and supply chains. Prior to joining MapR, Theo was an enterprise sales director for Actian Corporation. Earlier in his career, Theo was a software sales executive specializing in the oil and gas sector for Pervasive Software. In his free time, Theo volunteers for PIDX (Petroleum Industry Data Exchange) as the Western Hemisphere Events Coordinator & New Technologies Advisor. Theo holds a Bachelor of Science degree in business and organizational management from the Vanguard University of Southern California.

Related Items:

Making Big Data Work for a Major Oil & Gas Equipment Manufacturer

How Data Analytics Can Help Frackers Find Oil
