Author Archives: Certis Inc.

About Certis Inc.

Digital solutions architects. Serving the energy industry since 2001. This blog contains the views and opinions of the founder and CEO of Certis Inc.

Jobs Are Changing In Oil and Gas. Here Is How and Why…

If we fast-forward 10 years, what types of jobs will still be relevant in upstream oil and gas? Would your current job description read the same for a new applicant? Will it still exist?

There is no denying that technological change is upon us. When it comes to digital data technology, there are two waves of change in plain sight, one behind the other. The first wave is a bundle of technologies enhancing and enabling one another: faster connectivity, IoT/IIoT, Cloud, AI (Artificial Intelligence) and robots. This change is already strongly underway. The second wave follows closely behind the first and will probably have the same impact, if not a larger one: distributed ledger and crypto technology, specifically Blockchain.

The Why

The industry is undergoing a crew change, with baby boomers retiring and younger employees stepping in. This crew change coincides with lower commodity prices, opening the door, and arms, to any technology that could improve the bottom line. These new technologies are proving themselves many times over, resulting in ever greater shifts in daily tasks, if not entire job descriptions.

Major technological change like this always calls for adaptation or job abolition. New skills will arise, some will become obsolete, and yet others will morph. Reskilling or upskilling is no longer optional.

Back to the question, will your current job description read the same in 10 years? It’s a question worth pondering and answering.

The What and How

From our recent experience (and research), here are some of the many jobs that may morph into something different, in the same way that map draftsmen morphed into geotechs in the previous digital revolution from paper to digital:

  • Shale Reservoir Engineering: This job has increasingly focused on reserves estimation and economics. Granular estimates require customized production analyses, a task that today is time consuming and takes the bulk of a reservoir engineer's time. As capabilities in AI modeling advance, I believe supervised or reinforcement Machine Learning (ML) could automate this work, shifting the job description toward detailed management and recovery efficiency of the reservoir.
  • Surveillance and Maintenance Engineering: Today, for many digitally maturing organizations, automated surveillance and management by exception have become the norm. However, predictive and prescriptive maintenance enabled by ML and Augmented Reality (AR) will become the new norm. Furthermore, with blockchain or otherwise, vendors can be integrated into the ecosystem to ensure optimal equipment performance for their customers (as an additional service). A combination of IIoT, ML, AR and an expanded ecosystem alters in-house surveillance and maintenance jobs, yet opens up new opportunities for service companies.
  • Lease Brokers and Land Administrators: With increased connectivity to source systems and with verifiable, trusted data on an immutable blockchain ledger, much of the due-diligence and verification work could be reduced, if not fully automated, in the future. These jobs could morph into something new and exciting.
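To make the reservoir engineering example concrete, here is a toy sketch of the kind of supervised ML that could automate production analysis. Everything here is hypothetical: the feature names, the data and the relationship between them are synthetic, and a real model would be trained on curated field data.

```python
# Toy sketch: supervised ML estimating well EUR from completion features.
# Features, data and the underlying relationship are all synthetic.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n_wells = 300
# Hypothetical features: lateral length (ft), proppant loading (lb/ft), stage count
X = np.column_stack([
    rng.uniform(4000, 12000, n_wells),   # lateral_length_ft
    rng.uniform(1000, 3000, n_wells),    # proppant_lb_per_ft
    rng.integers(20, 60, n_wells),       # stage_count
])
# Synthetic EUR (MBOE) loosely tied to the features, plus noise
y = 0.05 * X[:, 0] + 0.1 * X[:, 1] + 2.0 * X[:, 2] + rng.normal(0, 50, n_wells)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print(f"R^2 on held-out wells: {model.score(X_test, y_test):.2f}")
```

The point is not the particular algorithm: once a workflow like this is trusted and automated, the engineer's time moves from running analyses to judging and acting on them.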

Summary: Stay Relevant

IoT, IIoT, Cloud, AR, VR, Bots, ML-AI, and Blockchain are all here to stay. But as powerful as the digital world and machines are, they are not without flaws or challenges. The human mind will always be more flexible and better able to deal with unforeseen scenarios. Nonetheless, some jobs will become obsolete, others will morph, and new ones will arise.

Knowing what could change, how and when, will help you to prepare a strategic road map to stay ahead and relevant.

What jobs do you think will be different in 10 years? What new and exciting jobs will become available? Alternatively, let me know your job and I will share my thoughts on how it would morph (info@certisinc.com).

Here are some curated references:

  • SPE-194746-MS: The End of Petroleum Engineering as We Know It. From the SPE library
  • What to Do When Industry Disruption Threatens Your Career. MIT Sloan Management Review, Spring 2019

Harnessing The Newly Found Trust

In our previous blog, we presented blockchain as a welcome addition to the E&P landscape and added our voice to those extolling the value of baking data integrity in from the ground up when re-engineering processes with it. In this article, we look at how a business can be reshaped simply by having all business partners trust the same data.

Trusted data is key to streamlining wasteful processes and reshaping how we do business together. The Economist in 2015 correctly identified blockchain as "the trust machine". As the technology underpinning Bitcoin, it allows organizations or individuals with no particular trust in one another to collaborate without going through a neutral central authority. This capacity allows complex transactions to be completed more quickly and, at the same time, lets companies align on common interests and share risks and returns on new opportunities.

One company that is harnessing this shift in the trust paradigm and advancing the ball on performance-based contracts is Data Gumbo (DG). This team's solution elegantly aligns operators, rig owners and service providers on Key Performance Indicators (KPIs).

Last week I attended the Oil and Gas Smart Contract conference. A demo from DG showed their smart contract feature using drilling KPIs. The contract is prepared, negotiated and signed, all on their platform. Once signed, the contract is locked in an immutable distributed ledger (blockchain). This means all parties possess the exact same copy on the same network, with no permission to update it unless all agree. This alone is an improvement over the abilities of most current systems.

But blockchain takes us further: it allows us to codify contracts and automatically trigger workflows, in what are called "Smart Contracts". Once the contract is codified (and the data connections are put in place with the rig and ERPs), data from the rig can feed the codified contract directly, which in turn can trigger the release of payments. Notice that in this example no invoices are needed, no data reconciliation is required, and there is no chasing after Accounts Payable or Accounts Receivable. That is needle-moving change for organizations that handle hundreds of thousands of payments per year.
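As an illustration of the idea only (this is not Data Gumbo's actual platform or API), a codified KPI clause can be as simple as measured data in, payment decision out. The clause, target and amounts below are invented:

```python
# Illustrative sketch of a codified KPI clause: rig data in, payment out.
# Not any vendor's real API; names and numbers are hypothetical.
from dataclasses import dataclass

@dataclass
class KpiClause:
    kpi: str            # e.g. average rate of penetration
    target: float       # agreed contractual target
    payment_usd: float  # amount released when the target is met

def settle(clause: KpiClause, measured: float) -> float:
    """Return the payment triggered by measured rig data; no invoice needed."""
    return clause.payment_usd if measured >= clause.target else 0.0

rop_clause = KpiClause(kpi="avg_rop_ft_per_hr", target=150.0, payment_usd=25_000.0)
print(settle(rop_clause, measured=162.5))  # KPI met: payment released
print(settle(rop_clause, measured=140.0))  # KPI missed: nothing released
```

On a blockchain platform the same logic would run against ledger data that all parties already agree on, which is what removes the invoicing and reconciliation steps.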

With blockchain, this could be taken even further, to "crypto assets", but we are not sure the industry is ready yet. That is a discussion for another blog; meanwhile, check out Ziyen Energy, who are staying ahead of the curve (https://www.ziyen.com/ziyencoin_future_currency_of_oil_industry/).

What is your interest in blockchain? Interested in knowing more? Contact us for a free consultation: info@certisinc.com

Considering Blockchain for E&P? Don’t Forget to Bake-in Data Integrity

A Welcomed Addition to the E&P Digital Landscape, Blockchain

Over the decades, energy companies have adopted many technologies and tools to enhance and advance their competitive advantage and business margins. Information Technology (IT) has played a significant role in those enhancements; the most recent examples in the news come from AI, IoT and Cloud computing. Nonetheless, oil and gas is still struggling with a few persistent data issues:

  1. security and data leakage,
  2. siloed systems and/or a spaghetti of ETL connections, and
  3. unreliable data quality.

As a consequence, workflows and processes are still filled with inefficiency and rework. But an emerging technology, Blockchain[1], may have the answer we've been waiting for.

Blockchain is a distributed ledger and crypto technology that brings with it the promise of connectivity, security and data reliability that we have not been able to achieve with past technology, and that is desperately needed in this very specialized industry.

Data Integrity Baked In Business Processes

Because it is possible to enter wrong data into blockchain applications, thinking through how to capture and disseminate good-quality data with them is important to ensure quality analysis and sound, informed decision making.

Blockchains can live alongside the current data architecture as a data governance layer and passively monitor quality. However, baking data integrity into business applications from the ground up is the best way forward.

Along with the distributed ledger and crypto signatures, blockchain also offers a smart contract feature that allows us to re-engineer and re-imagine processes and how we all work together. Thinking through the last-mile problem, to build data integrity into the application from the ground up, will be one of the key success factors that make or break your ultimate solution. New jobs will arise to ensure that correct information is being captured, as specialists who can verify the validity of information will be a must. These specialists will be similar to insurance adjusters who confirm a claim is valid before the insurance payment is made.

The further good news is that blockchain data only needs to be verified once. No more verifying the same data over and over again! Who has time for that?
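For readers curious why a blockchain record is so hard to alter after the fact, here is a minimal, simplified sketch of the hash-chaining idea. Production blockchains add consensus, digital signatures and distribution on top of this; the records below are made up:

```python
# Minimal sketch of the immutability idea behind a blockchain ledger:
# each block carries the hash of the previous one, so edits are detectable.
import hashlib
import json

def block_hash(record: dict, prev_hash: str) -> str:
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

def build_chain(records):
    chain, prev = [], "0" * 64  # genesis hash
    for rec in records:
        h = block_hash(rec, prev)
        chain.append({"record": rec, "prev_hash": prev, "hash": h})
        prev = h
    return chain

def verify(chain) -> bool:
    prev = "0" * 64
    for blk in chain:
        if blk["prev_hash"] != prev or block_hash(blk["record"], prev) != blk["hash"]:
            return False
        prev = blk["hash"]
    return True

chain = build_chain([{"well": "A-1", "status": "producing"},
                     {"well": "A-2", "status": "shut-in"}])
print(verify(chain))                         # True: chain is intact
chain[0]["record"]["status"] = "abandoned"   # tamper with "verified" data
print(verify(chain))                         # False: tampering is detected
```

This is why data on the ledger only needs verifying once: any later change breaks every hash downstream of it.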

Recommended Resources

[1] A plethora of material describing blockchain is available everywhere; make sure you are reading curated articles. One good reference is from IBM (this is not an endorsement of their technology; Certis is vendor agnostic): Blockchain for Dummies


For additional information, or for architecting, implementing and training on successful blockchain projects, please contact us at info@certisinc.com

Why it’s time for Petroleum Geologists and Engineers to move away from generic data scientists

The industry has finally warmed up to Artificial Intelligence (AI) technologies. This is good news. AI technologies are a great opportunity for operators to increase their efficiency and boost their competitive advantage and safe operations. And, in fact, many have already done just that.

But not every company will deploy AI at the same pace or with the same capabilities. Skills to build meaningful and impactful AI models are still scarce. Also, the people who understand the physics of the earth and of hydrocarbons are not always the same ones who know how to use Data Science (DS) and AI tools. For that reason, some Exploration & Production (E&P) companies opt for a centralized DS team with skills in statistics and building AI models. In this centralized model, engineers' "exploratory" and "what-if" analyses go to a central team, which builds a predictive model for each hypothesis.

This process is not ideal: it takes long to reach an acceptable, final model even when a request is prioritized (long cycle time). How can you speed it up? While a central DS team may seem the only option at the moment, I would argue that eventually those who understand the physics should be the ones building and confirming their own AI and Machine Learning (ML) models. Some of the industry's brightest engineering and geoscience ideas are yet to come, once those professionals learn the tools.

Why Change?

For some petro-professionals, learning a new coding language (Python, for example) is not fun. After all, if they wanted to be coders, they would be in a different place. We have been here before: the situation is analogous to the old days of DOS commands, which were the domain of specialists and enthusiasts. As soon as software (like Microsoft Office) came out with a Graphical User Interface (GUI), where we clicked and typed in natural language, the engineers, geologists and the whole world came on board. AI and ML tools are now making that same turn and no longer require any coding.

It was once a stretch to get petroleum engineers to use Excel spreadsheets; now it is an expectation. We should have the same expectation for advanced analytics, AI and ML tools.


Some suggestions

I leave you with some tips for reaching that goal (if you agree with the above premises):

  • Look for advanced analytics tools with graphical user interfaces (as opposed to those that require you to learn a new language like Python). I like the interface of DataRobot (www.datarobot.com).
  • Make sure your software can easily connect to data sources; a good GUI alone is not enough.
  • Automate the flow of information and facilitate access to it; this is key to success, especially for data warehouses, lakes or hubs.
  • Make sure your users are trained in advanced analytics principles and on the software. At the very least, engineers should be able to build some basic predictive models.
  • Consider cloud data warehouses and tools for the power to run large datasets such as logs and seismic. We are following Snowflake (https://www.snowflake.com) and MemSQL (https://www.memsql.com/).
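As a taste of how little code a "basic predictive model" can take, here is a short sketch fitting an Arps hyperbolic decline curve to synthetic monthly rates. The data, noise level and starting values are all made up for illustration:

```python
# Fitting an Arps hyperbolic decline curve to synthetic monthly rates.
import numpy as np
from scipy.optimize import curve_fit

def arps_hyperbolic(t, qi, di, b):
    """Arps hyperbolic decline: q(t) = qi / (1 + b*di*t)^(1/b)."""
    return qi / np.power(1.0 + b * di * t, 1.0 / b)

t = np.arange(0, 36)  # months on production
true_q = arps_hyperbolic(t, qi=1000.0, di=0.15, b=0.8)
rng = np.random.default_rng(1)
observed = true_q + rng.normal(0, 10, t.size)  # noisy synthetic rates

# Bounds keep the optimizer in a physically sensible region
params, _ = curve_fit(arps_hyperbolic, t, observed,
                      p0=[800.0, 0.1, 0.5],
                      bounds=([1.0, 1e-4, 0.01], [5000.0, 1.0, 2.0]))
qi, di, b = params
print(f"qi={qi:.0f} bbl/mo, di={di:.3f} 1/mo, b={b:.2f}")
```

Whether an engineer writes this in a notebook or clicks through a GUI that does the same thing, the barrier today is training, not tooling.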

Certis consults and delivers data services. We are systems agnostic, and we focus on helping oil and gas clients set up their backbone data and processes. Find out more about our services.

The Way To Maximize Value from M&A Assets

In North America, the only constant when it comes to oil and gas companies is change. With mergers and acquisitions (M&A), hydrocarbon assets constantly change hands. The value of an acquired asset will then be maintained, increased, decreased or maximized depending on how it is managed under the new owners. It is generally agreed that the value can only be maximized when the asset's geological models, reservoir dynamics and well behavior are understood in minute detail. The new owner takes over the asset but is not guaranteed the people with that knowledge.

Building a clear understanding of the new asset therefore becomes an urgent matter for the new owner. This understanding is typically hidden under the mountain of data and files that change hands together with the asset. How and when the data is migrated to the new organization can build up, or bring down, the value.

Typically, when an asset changes hands, the field staff remain, but the geologists and geoscientists who strategized the asset's management may not follow it. A great deal of knowledge is potentially lost in transition. This makes the data and documents delivered after the transaction is complete that much more important to understanding the details of the acquisition. Obtaining as much of this data as possible is crucial. As a geologist who has been through multiple mergers put it:

“Knowledge like drilling through a fault is only known to the asset team operating the asset. This information is not publicly available. During the transition, getting any maps or reports from the geologists will help the acquiring company develop the right models and strategies to increase value. We want all the data we can get our hands on.”

Another key consideration is software licenses and versions, which may or may not transfer.  We find that the risk of losing the information permanently due to software incompatibility, licensing, or structure issues is very real. Migrating the technical data during the transitioning period will help protect the new owner from data loss.

Per Harvey Orth, a geophysicist and former CIO who has been through three mergers and acquisitions:

In many cases, companies made an agreement with the software vendor to maintain a read-only copy of all the data, just in case they needed to extract data they had not loaded into their production systems (for the new owner) or needed the data for legal or tax reasons later (for the seller). In fact, keeping a read-only copy can be easily negotiated within a purchase agreement if you are divesting an asset. When acquiring, everything and anything you can get your hands on can be essential to getting the most value from the field and should be migrated.

Tips to Protect the Value of New Assets 

Experts like us can help ensure that data is migrated quickly and efficiently and that the right data is obtained from the acquisition target. However, if you are inclined to manage the data transfer yourself, we share the following tips:

Make it Manageable, Prioritize it Right:

While all of the data and information is important, time is of the essence. Most companies will prioritize migrating "accounting" data, and rightly so, but to maximize value, technical data must also be at the top of the priority list: production volumes and pressure data, land and lease data, well construction and intervention data (drilling, completions and intervention history), and reservoir characterization data (logs, petrophysics, core, etc.).

Do Due Diligence with a Master List

Getting your hands on all the data starts with a master list of all the assets, including such things as active wells and their statuses. This list is the first stop for every department that needs to build its knowledge and processes to manage the new assets. It is also the checklist against which to assess received information. If you have invested in an MDM (Master Data Management) system, then adding the new assets to the database should be one of your first steps.
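The master-list check itself can be very simple once the list exists. A hypothetical sketch (the well identifiers below are made up):

```python
# Checking received M&A data against the master asset list with set algebra.
# Well identifiers are invented for illustration.
master_wells = {"42-123-00001", "42-123-00002", "42-123-00003", "42-123-00004"}
received_wells = {"42-123-00001", "42-123-00003", "42-123-00005"}

missing = sorted(master_wells - received_wells)     # on the master list, not delivered
unexpected = sorted(received_wells - master_wells)  # delivered, but not on the list

print("Missing from delivery:", missing)
print("Not on master list:", unexpected)
```

Both lists matter: the missing items drive follow-up requests to the seller, and the unexpected items often reveal gaps in the master list itself.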

Know What is Good Quality and What Is Not.

One of the biggest obstacles companies face is the realization that their own data is not standardized and clean. They are then faced with the prospect of adding bad to bad.

Much can be said for investing in data quality standards and governance practices: they make folding in any new asset easier, faster and more cost effective. If you don't have strong data standards yet, see if you can inherit them from the selling company, or alternatively get help from IM experts to create these standards and merge legacy data with the new acquisitions.

Make it Findable: Tag Your Electronic files

Documents like geological maps, logs, lab reports, management presentations, and other files contain a wealth of information. Finding the right file can take considerable time, especially if you were not the one who organized it. Take advantage of Artificial Intelligence and "tag" files based on their content. This creates a layer of metadata and makes finding the right file using natural petroleum language easier.

For additional information or a free consultation on migrating M&A data please contact us at info@certisinc.com

How To Turbocharge Oil & Gas Analyses With Machine Learning and The Right EIM Foundation

It is generally accepted that good analysis of oil and gas data yields actionable insights, which in turn lead to better profits and growth. With today's advancements in technology and processing power, more data and better analysis are easily achievable, but they require the right EIM (Enterprise Information Management) foundation to make "all" data available and "analysis-ready".

The evidence for such analytics is clear and ubiquitous. In a JPT (Journal of Petroleum Technology) article by Stephen Rassenfoss, "Four Answers to the Question: What Can I Learn From Analytics?", Devon Energy concludes it is possible to increase production by 25% by drilling the lateral toe-up in the Cana-Woodford Shale. Range Resources, answering a different question with Machine Learning (ML) analysis, concluded that more production in the Marcellus is associated with wells fracked with as much sand volume as the reservoir can handle.

All Data All The Time = More Studies More Return

Looking closer at the article, both studies were based on relatively small data sets; Devon Energy and Range Resources used only 300 and 156 wells respectively. Both companies stated that a larger data set would help their respective studies. So why do some studies rely on a small population of wells when thousands more could have been included to reach a deeper understanding?

While the answer depends on the study itself, we find two key data "preparation" problems that may contribute: a) data findability and availability, and b) data readiness for analysis. In some E&P companies, data preparation can consume over 50% of a study's total time. This is where I believe EIM can make a difference by taking a proactive role.

Three Strategic EIM Initiatives to Turbocharge Your Organization's Analytics

Preparing information for exploratory analytics like the above requires oil and gas companies to embrace a new paradigm in EIM. Traditional "data management" has its applications but can be rigid and limiting because it requires predefined schemas.

We share our three favorite strategic EIM initiatives for delivering more trustworthy, analysis-ready information:

  • Strategic and Selective Information Governance Program – A strong data governance model ensures data can be trusted, correlated and integrated. This is a foundational step and will take standardizing and mastering key entities and attributes. Tip: a key enabling technology is Master Data Management (MDM).
  • Multi-Stream Data Correlation – Together with MDM, "Big Data" technology and processes enable the inclusion and further correlation of data from a variety of streams, without the prejudice of a predefined data schema.
  • Collaborative Process and Partnership – From years of lessons learned, we've noticed that none of the above will move the needle much if implemented in isolation. A collaborative process, with the sole purpose of fostering a close partnership between IM engineers and architects, data scientists and the business, is what differentiates success from failure. As the organization finds new "nuggets of insight," the EIM team's role is to put the necessary structure in place to capture the required data systematically and then embed it in the organization's DNA.
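As a small illustration of multi-stream correlation on a mastered identifier (the data and well IDs below are made up), an outer join immediately surfaces the gaps a governance program has to chase down:

```python
# Correlating two data streams on a mastered well identifier — the join key
# an MDM system would standardize. Data and IDs are invented for illustration.
import pandas as pd

production = pd.DataFrame({
    "well_id": ["W-001", "W-002", "W-003"],
    "oil_bbl": [1200, 800, 950],
})
completions = pd.DataFrame({
    "well_id": ["W-001", "W-002", "W-004"],
    "proppant_lb": [4.2e6, 3.1e6, 5.0e6],
})

# An outer join with an indicator column shows which wells are missing
# from either stream — exactly the gaps governance has to close.
merged = production.merge(completions, on="well_id", how="outer", indicator=True)
print(merged)
```

Without a mastered `well_id`, the same join silently drops or duplicates wells, which is why the governance initiative above comes first.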

New analytics are positively changing how we produce and manage oil and gas fields. Companies that invest in getting their EIM foundation right will lead the race against their competition.

Disclosure:

For help on defining and implementing EIM strategy please contact us.
With Petroleum Engineers, Geoscientists, Data Scientists and Enterprise Information Architects on the Certis team, we help companies design and implement EIM solutions that support their business goals. For more information on our services, please email us at info@certisinc.com.

Why Connecting Silos With Better IM Architecture Is Important

If you work in an oil and gas company, then you are familiar with the functional divides. We are all familiar with the jokes about geologists vs. engineers. We laugh and even create our own. But jokes aside, oil and gas companies operate in silos and with reason.

But while organizational silos may be necessary to excel and maintain standards of excellence, collaboration and connection across the silos are crucial for survival.

For an energy company to produce hydrocarbons from an asset, it needs all departments to work together (geoscience, engineering, finance, land, supply chain, etc.). This requires sharing detailed information and collaborating beyond meeting rooms and email attachments. But the reality in many oil and gas companies today is different: functional silos extend to information silos.

Connected Silos Are Good. Isolated Silos Are Bad

In an attempt to connect silos, "Asset Team" or "Matrix" organizations are formed and incentive plans are carefully crafted to share goals between functions. These are great strides, but no matter the organizational structure or the incentives provided, miscommunication, delays and poor information hand-over are still commonplace. Until we solve the problem of seamless information sharing, the gap between functional departments will persist, because we are human and we rationalize our decisions differently. This is where technology and automation (if architected correctly) can play a role in closing the gap between the silos.

Asset team members and supporting business staff have an obligation to share information not only through meetings and email attachments but by organizing and indexing asset files throughout the life of the asset. Fit-for-purpose IM architecture has a strategic role to play in closing the gap between the functional silos.

Connecting Functional Silos With IM Takes Vision & Organizational Commitment 

Advancements in IM (Information Management) and BPMS (Business Process Management Systems) can easily close a big part of the remaining gap. Yet many companies have not succeeded in doing so, despite significant investments in data and process projects. There can be many reasons for this; I share two of the most common pitfalls I come across:

  • Siloed IM projects or systems – Architecting and working on IM projects within one function without regard to the impact on other departments. I have seen millions of dollars spent to solve isolated geoscience data needs without accounting for the impact on engineering and land departments, or spent on exploration IM projects without regard to the appraisal and development phases of the asset. Quite often, organizations do not take the time to look at end-to-end processes and their impact on company goals. As a result, millions of dollars are spent on IM projects without bringing the silos any closer. Connecting silos through an IM architecture requires a global vision.
  • Lack of commitment to enterprise standards – If each department defines and collects information according to its own needs without regard for the company's needs, it is up to other departments to translate and reformat. This often means rework and repeated verification whenever information reaches a new departmental "checkpoint".

The above pitfalls can be mitigated by recognizing the information dependencies and commonalities between departments, then architecting global solutions based on accepted standards and strong technology. It takes a solid vision and commitment.

For a free consultation on how to connect silos effectively, please schedule your appointment with a Certis consultant. Email us at info@certisinc.com or call us on 281-377-5523.