Category Archives: Knowledge Management

The Way To Maximize Value from M&A Assets

In North America, the only constant when it comes to Oil and Gas companies is change. With mergers and acquisitions (M&A), hydrocarbon assets constantly change hands. The value of an acquired asset will then be maintained, increased, decreased, or maximized depending on how it is managed under the new owners. It is generally agreed that the value can only be maximized when the asset’s geological models, reservoir dynamics, and well behavior are understood in minute detail. The new owner takes over the asset but is not guaranteed the people with the knowledge.

Building a clear understanding of the new asset becomes an urgent matter for the new owner.  This understanding is typically hidden under the mountain of data and files that change hands together with the asset. How and when the data is migrated to the new organization, therefore, can build up or bring down the value.

Typically, when an asset changes hands, the field staff remains, but the geologists and geoscientists who strategized the asset’s management may not follow it. This means a great deal of knowledge is potentially lost in transition, which makes the data and documents delivered after the transaction closes that much more important to understanding the details of the acquisition. Obtaining as much of this data as possible is crucial. As a geologist who has been through multiple mergers put it:

“Knowledge like drilling through a fault is only known to the asset team operating the asset. This information is not publicly available. During the transition, getting any maps or reports from the geologists will help the acquiring company develop the right models and strategies to increase value. We want all the data we can get our hands on.”

Another key consideration is software licenses and versions, which may or may not transfer. We find that the risk of losing information permanently due to software incompatibility, licensing, or structure issues is very real. Migrating the technical data during the transition period helps protect the new owner from data loss.

Per Harvey Orth, a geophysicist and former CIO who has been through three mergers and acquisitions:

“In many cases, companies made an agreement with the software vendor to maintain a read-only copy of all the data, just in case they needed to extract some data they had not loaded into their production systems (for the new owner) or needed the data for legal or tax reasons later (for the seller). In fact, keeping a read-only copy can be easily negotiated within a purchase agreement if you are divesting an asset. When acquiring, anything and everything you can get your hands on can be essential to getting the most value from the field and should be migrated.”

Tips to Protect the Value of New Assets 

Experts like us can help ensure that data is migrated quickly and efficiently and that the right data is obtained from the acquisition target. If you are inclined to manage the data transfer yourself, however, we offer the following tips:

Make it Manageable, Prioritize it Right:

While all of the data and information is important, time is of the essence. Most companies will prioritize migrating “accounting” data, and rightly so, but to maximize value, technical data must also be at the top of the priority list: production volumes and pressure data; land and lease data; well construction and intervention data (drilling, completions, and intervention history); and reservoir characterization data (logs, petrophysics, core, etc.).

Do Due Diligence with a Master List

Getting your hands on all the data starts with a master list of all the assets, including such things as active wells and their statuses. This list is the first stop for every department that needs to build its knowledge and processes to manage the new assets. It is also the checklist against which to assess received information. If you have invested in an MDM (Master Data Management) system, then adding the new assets to the database should be one of your first steps.
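Using the master list as a checklist can be as simple as a set comparison. The sketch below is illustrative only; the well names and the `reconcile` helper are hypothetical, not from any real system:

```python
def reconcile(master, received):
    """Compare the master asset list against what was actually delivered.

    Returns (missing, extra): assets still owed by the seller, and
    unexpected items that need investigation.
    """
    master, received = set(master), set(received)
    return master - received, received - master

missing, extra = reconcile(
    ["SMITH 1H", "SMITH 2H", "JONES 3"],   # from the master list
    ["SMITH 1H", "JONES 3", "DOE 7"],      # wells found in the delivered files
)
print("Missing from delivery:", sorted(missing))  # chase these with the seller
print("Unexpected extras:", sorted(extra))        # bonus data, or bad names?
```

In practice each department would run this reconciliation against its own slice of the master list (leases, wells, files) as deliveries arrive.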

Know What Is Good Quality and What Is Not

One of the biggest obstacles companies face is the realization that their own data is not standardized and clean. They are then faced with the prospect of adding bad data to bad data.

Much can be said for investing in data quality standards and governance practices: they make folding in any new assets easier, faster, and more cost-effective. If you don’t have strong data standards yet, see if you can inherit them from the selling company, or get help from IM experts to create these standards and merge legacy data with the new acquisitions.
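A data standard only helps if it is enforceable. One lightweight approach is to express each rule as a small check that incoming records must pass. The field names and the 14-digit API-number format below are illustrative assumptions, not a real company standard:

```python
import re

# Hypothetical standard expressed as per-field validation rules.
RULES = {
    "api_number": lambda v: bool(re.fullmatch(r"\d{14}", v)),  # assume 14-digit API format
    "well_name":  lambda v: v == v.strip().upper(),            # upper case, no padding
}

def validate(record):
    """Return the list of fields in `record` that break a rule."""
    return [field for field, ok in RULES.items()
            if field in record and not ok(record[field])]

bad = validate({"api_number": "4201334567", "well_name": "smith 1h"})
print(bad)  # both fields fail this (hypothetical) standard
```

Running every delivered record through such rules before loading it into production systems is what keeps you from "adding bad to bad."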

Make It Findable: Tag Your Electronic Files

Documents like geological maps, logs, lab reports, management presentations, and other files contain a wealth of information, but finding the right file can take considerable time, especially if you were not the one who organized them. Take advantage of artificial intelligence to “tag” the files based on their content. This creates a layer of metadata and makes it easier to find the right file using “petroleum natural language.”

For additional information or a free consultation on migrating M&A data please contact us at info@certisinc.com

To Build Fit Enterprise Solutions, Be Physical …

The British and the Americans speak the same language. But say “I have a flat” to a Briton and it means something completely different than it does to an American. The former would congratulate you; the latter would feel sorry for you. A flat in the UK is an apartment; a flat in Houston is a flat tire. The same four words, arranged in exactly the same way, in what is ostensibly the same language, and yet either speaker would confuse their audience if the audiences were transposed.

It is the same thing in business – if you cross different corporate cultures or even inter-organizational boundaries, industry terminology might sound the same but mean very different things. Sometimes we think we are communicating, but we are not.

Why is this a problem? Because it is not possible to build an enterprise data management solution to serve all departments without addressing variations in expectations for the same word. Especially if the term in question is one that defines your organization’s values and activities.

“Sometimes we think we are communicating, but we are not”

In the corporate world of Energy E&P, the word “completion” means different things to the different departments. If you mention a “Completion” to a Landman, he will assume you are referring to the subsurface horizon for his leases (it is more complex than this, but for the sake of this argument we need not dive into details). If a “Completion” is referenced to a Production Engineer, she immediately thinks of the intersection of a wellbore and a reservoir horizon. To a Completion Engineer, the same term means the process of completing a well after the well has reached final depth.

As organizations’ data management practices mature, they make their way toward the right of the EIM MM (Enterprise Information Management Maturity Model). Centralized solutions such as Master Data Management (MDM) are important and are designed to serve ALL departments, breaking down as many silos as possible.

Naturally, to create a centralized solution that addresses needs across the enterprise, you must first reach consensus on how to build that solution. The solution must ensure that the data is NOT LOST, NOT OVERWRITTEN and is FULLY CAPTURED and useful to EVERYONE. What is the best way to reach consensus without the risk of losing data?

Get Physical

To answer the above question, many agree that information systems need to be built based on the physical reality to gather granular data …

By basing your data on the physical world and capturing data at as granular a level as practical, you make it possible not only to capture all related information but also to report it in any combination of groupings and queries. See the example in Figure 1.

Focus on Enterprise Needs and Departmental needs will follow…

I have seen systems that ignore wellbore data and store only completions per well. At other clients, I have seen systems that take shortcuts by storing well, wellbore, and wellbore completion data in one line (which necessitates overwriting old completion data with new every time there is a change). These are “fit-for-purpose” systems. They are not enterprise-level solutions; they serve departmental needs.

Too often systems are designed for the need of one group/department/purpose rather than for the need of the company as a whole. However, if the needs of the whole are defined and understood, both company and groups will have what they need and then some.

Let’s look at an example to clarify this position:

Figure 1 Multi lateral well


In Figure 1 above, how would you store the data for the well in your organization or your department? Would you define the data captured as one well, three bores, and three completions? Or maybe two completions? One?
Depending on your department or organizational definitions, any of the above could be correct for its purpose. Accounting might track only one completion if it made payroll and tax sense, while Land may track only two completions if the bores are in two zones. An engineer would track three completions, one per wellbore. The regulatory department may want you to report something entirely different.
How do we decide the number of completions so that the information is captured accurately, yet remains useful to a Landman, Accountant, Engineer, and Geoscientist? Build based on the physical reality and stay granular.
In Figure 1, physically speaking, we see one well with three paths (three wellbores). Each bore has its own configuration that opens to the reservoir (a completion). In total, this well has three different ‘Completions’, one for each of the horizontal bores.
Accounting can query how many different cost centers the well has; depending on production (and other complex rules), the answer could be three or one. Depending on the lease agreement, Land could get a result of one or three completions. An engineer can easily query and graph this data to find the three pathways and determine each completion job per wellbore.
While it could be argued that data needs to be presented differently to each department, the underlying source data must reflect the physical truth. After all, we cannot control what people call things and certainly cannot change the lingo.
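The granular model from Figure 1 can be sketched as nested records: one well, three wellbores, one completion per bore. The departmental answers then become queries over the same physical data. Class and zone names below are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class Completion:
    zone: str            # reservoir zone the bore opens into

@dataclass
class Wellbore:
    name: str
    completions: list = field(default_factory=list)

@dataclass
class Well:
    name: str
    wellbores: list = field(default_factory=list)

# Figure 1 as data: one well, three lateral bores, one completion each.
well = Well("DEMO 1", [
    Wellbore("DEMO 1-BH1", [Completion(zone="A")]),
    Wellbore("DEMO 1-BH2", [Completion(zone="A")]),
    Wellbore("DEMO 1-BH3", [Completion(zone="B")]),
])

# Engineering view: one completion per wellbore.
eng_count = sum(len(wb.completions) for wb in well.wellbores)

# Land view: completions grouped by zone (two zones here).
land_count = len({c.zone for wb in well.wellbores for c in wb.completions})

print(eng_count, land_count)
```

Because the stored records mirror the physical well, nothing is overwritten or lost; each department simply aggregates differently.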

Data and Processes are your two friends in fat or skinny margin times – Some tools and ideas to weather low oil-prices

Well, 2014 is ending with oil prices down and an upward trend in M&A activity. For those nearing retirement age, this is not all bad news. For those of us still building our careers and companies, well, we have uncertain times ahead. This got me asking: is it a double whammy to have the most knowledgeable staff retiring when oil prices are low? I think it is.

At the very least, companies will no longer have the “fat margins” to forgive errors or to sweep costly mistakes under the rug! While costs must be watched closely, with the right experience some costs can be avoided altogether. That experience is about to retire.

E&P companies that have already invested (or are investing) in the right data and processes, capturing knowledge in their analysis and opportunity prioritization, will be better equipped to weather low prices. On the other hand, companies that have been making money “despite themselves” will be living on their savings, hoping to weather the storm. If the storm stays too long or is too strong, they will not survive.

Controlling cost the right way

Blanket cost cutting across all projects is not good business. Some wells, for example, cannot withstand shutdowns or deferred repairs; you would risk losing the wells altogether. Selectively prioritizing capital and operational spending on projects with higher margins and controllable risks, however, is good business. Supporting this practice is a robust foundation of systems, processes, and data practices that empower a company to watch the important metrics and act fast!

We also argue that without relevant experience some opportunities may not be recognized or fully realized.

Here are some good tools to weather these low prices:

Note that this is a quick list of things you can do NOW for a few tens or hundreds of thousands of dollars (rather than million-dollar projects that may not be agile in these times):

  • If you do not have this already, consider implementing a system that will give you a 360-degree view of your operations and capital projects. Systems like these need the capability to bring in data from various data systems, including spreadsheets. We love the OVS solution (http://ovsgroup.com/): it is lean, comes with good processes right out of the box, and can be implemented to get you up and running within 90 days.
  • When integrating systems, you may need some data cleaning. Don’t let that deter you; in a few weeks you can get the data cleaned. Companies like us (Certisinc.com) will take thousands of records, validate and de-duplicate them, correct errors, complete what is missing, and give you a pristine set. So consider outsourcing data cleaning efforts. By outsourcing, you can have 20, maybe 40, data admins go through thousands of records in a matter of days.
  • Weave the about-to-retire knowledge into your processes before it is too late. Understand retirees’ workflows and decision-making processes, take what is good, and implement it in systems, processes, and automated workflows. It takes a bit of time to discover them and put them in place, but now is the time to do it. Examples include ESP surveillance, well failure diagnosis, and identifying sweet frac’ing spots. There are thousands upon thousands of workflows that can be implemented to forge almost error-proof procedures for new-on-the-job staff.
  • Many of your resources are retiring. Consider hiring retirees back, but if they would rather be on the beach than sitting around the office after 35+ years of work, then leverage systems like OGmentorsTM (http://youtu.be/9nlI6tU9asc).
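The de-duplication mentioned in the data-cleaning tip above often comes down to normalizing identifiers before comparing records. A minimal sketch, with an illustrative record layout and a hypothetical `normalize` helper:

```python
import re

def normalize(name):
    """Collapse case, spacing, and punctuation so that
    'Smith #1-H' and 'SMITH 1H' compare as equal."""
    return re.sub(r"[^A-Z0-9]", "", name.upper())

records = [
    {"well": "Smith #1-H", "api": "42013345670000"},
    {"well": "SMITH 1H",   "api": "42013345670000"},  # duplicate in disguise
    {"well": "Jones 3",    "api": "42013999990000"},
]

deduped = {}
for rec in records:
    deduped.setdefault(normalize(rec["well"]), rec)   # keep first occurrence

print(len(deduped), "unique wells remain")
```

Real cleanup projects layer on more rules (API-number matching, survey coordinates, fuzzy matching), but the normalize-then-key pattern is the core of most of them.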

In short, timely and efficient access to the right and complete data, with the right knowledge woven into systems and processes, is just as important, if not more important, during skinny-margin times.

Good luck. I wish you all Happy Holidays and a Happy 2015.

A master’s degree in Petroleum Data Management?

I had dinner with one of the VPs of a major E&P company last week. One of the hot topics on the table was universities agreeing to offer an MSc in Petroleum Data Management. Great idea! I thought. But it brought many questions to my mind.

 

Considering where our industry is with information management (way behind many other industries), I asked who will define the syllabus for this PDM MSc? The majors? The service companies? Small independents? Boutique specialized service providers? IMS professors? All of the above?

 

Should we allow ourselves to be swayed by the majors and the giant service companies? With their funding they certainly have the capability to influence the direction, but is theirs the correct (or only) direction? I can think of a few areas where the majors’ implementation of DM would be overkill for small independents; they would get bogged down with processes that make it difficult to stay agile, and that agility is the very reason the independents succeeded with the unconventionals.

 

What should the prerequisites be? A science degree? Any science degree? Is a degree required at all? I know at least a couple of exceptional managers who run data management projects, set up DM from scratch for oil and gas companies, and manage billions of dollars’ worth of data. They do not have a degree; what happens to them?

 

It takes technology to manage data, and an MSc in Petroleum Data Management is no different. But unlike petroleum engineering and geoscience technologies, the technology for managing data progresses fast; what is valid today may not be valid next year! Are we going to teach technology, or are we teaching about Oil and Gas data? This is an easy one, at least in my mind: we need both, but with more emphasis on the data itself and how it is used to help operators and their partners be safer, find more, and grow. We should encourage innovation to support what companies need.

 

PPDM (http://www.ppdm.org/) is still trying to define some standards. POSC (the Petrotechnical Open Software Corporation) came and went; Energistics (http://www.energistics.org/) is here and is making a dent; OpenSpirit (http://www.tibco.com/industries/oil-gas/openspirit) made a dent but is no longer non-profit. Will there be standards that are endorsed by the universities?

The variation from company to company in how data management is implemented today is large. Studying and comparing the different approaches would make a good thesis, I think…

I am quite excited about the potential of this program and will be watching developments with interest.

 

Capture The Retiring Knowledge

The massive amount of knowledge retiring, or about to retire, in the next five years will bring some companies to a new low in productivity. The U.S. Bureau of Labor Statistics reported that 60% of job openings from 2010 to 2020 across all industries will result from retirees leaving the workforce, and it is estimated that up to half of the current oil and gas workforce could retire in the next five to ten years.

For companies that do not have their processes defined and woven into their everyday culture and systems, relying instead on their engineers’ and geoscientists’ knowledge, the retirement of these professionals will cause a ‘brain drain,’ potentially costing them real downtime and real money.

One way to minimize the impact of “brain drain” is to document a company’s unique technical processes and weave them into training programs and, where possible, into automating technology. Why are process flows important to document? Process flow maps and documents are the geographical maps that give new employees the direction and transparency they need, not only to climb the learning curve faster, but also to repeat the success that experienced resources deliver with their eyes closed.

For example, if a reservoir engineer decides to commission a transient test, equipment must be transported to location, the well is shut in, a pressure gauge is run in, pressure buildup is measured, data is interpreted, and BHP is extrapolated and Kh is calculated.
The above transient test process, if well mapped, would consist of: 1) decisions, 2) tasks/activities, 3) a sequence flow, 4) responsible and accountable parties, 5) clear inputs and outputs, and 6) possible reference materials and safety rules. These process components, when well documented and defined, allow a relatively new engineer to run the operation from start to end without downtime.
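These components lend themselves to being captured as structured data rather than prose, so the sequence, ownership, and inputs/outputs survive the expert who defined them. The step wording and owners below are illustrative, not a prescribed procedure:

```python
# Hypothetical transient-test process captured as ordered steps.
steps = [
    {"task": "Transport test equipment to location", "owner": "Field Ops",
     "inputs": ["work order"], "outputs": ["equipment on site"]},
    {"task": "Shut in well and run pressure gauge", "owner": "Field Ops",
     "inputs": ["equipment on site"], "outputs": ["raw pressure data"]},
    {"task": "Interpret buildup, extrapolate BHP, calculate Kh", "owner": "Reservoir Eng.",
     "inputs": ["raw pressure data"], "outputs": ["BHP", "Kh"]},
]

def runbook(steps):
    """Render the sequence flow as a numbered checklist a new engineer can follow."""
    return [f"{i}. {s['task']} ({s['owner']})" for i, s in enumerate(steps, 1)]

for line in runbook(steps):
    print(line)
```

Once the process lives in a structure like this, it can feed training material, workflow software, and audit checklists from a single source.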

When documenting this knowledge, some of the rules will make their way into contracts and sometimes into technology enablers, such as software and workflow applications. The retiring knowledge can be woven into the rules, reference materials, the sequence flow, and information systems.

Documenting technical processes is one of the tools to minimize the impact of a retiring workforce. Another equally important way to capture and preserve knowledge is to ensure that data in personal network drives is accumulated, merged with mainstream information, and put in context early enough for the retiring workforce to verify its accuracy before they leave.

A company’s processes and data form the foundation of its competitive edge, cut back on rework and errors, and help it quickly identify new opportunities.

To learn more about our services on Processes or Data contact us at info@certisinc.com