Category Archives: Document Management

Data and Processes are your two friends in fat or skinny margin times – Some tools and ideas to weather low oil-prices

Well, 2014 is ending with oil prices down and an upward trend in M&A activity. For those nearing retirement age, this is not all bad news. For those of us still building our careers and companies, well, we have uncertain times ahead of us. This got me asking: is it a double whammy to have the most knowledgeable staff retiring when oil prices are low? I think it is.

At the very least, companies will no longer have the “fat margins” to forgive errors or to sweep costly mistakes under the rug! While costs must be watched closely, with the right experience some costs can be avoided altogether. That experience is about to retire.

E&P companies that have already invested (or are investing) in putting in place the right data and processes, the kind that capture knowledge into their analysis and opportunity prioritization, will be better equipped to weather low prices. On the other hand, companies that have been making money “despite themselves” will be living on their savings, hoping to weather the storm. If the storm stays too long or is too strong, they will not survive.

Controlling cost the right way

Blanket cost cutting across all projects is not good business. For example, some wells do not withstand shutting down or deferring repairs; you would risk losing the wells altogether. Selectively prioritizing capital and operational spending toward higher margins and controllable risks, however, is good business. Supporting this good business practice is a robust foundation of systems, processes, and data practices that empower a company to watch the important metrics and act fast!

We also argue that without relevant experience some opportunities may not be recognized or fully realized.

Here are some good tools to weather these low prices:

Note that this is a quick list of things you can do “NOW” for just a few tens or a few hundred thousand dollars (rather than the million-dollar projects that may not be agile at times like these):

  • If you do not have one already, consider implementing a system that gives you a 360-degree view of your operations and capital projects. Systems like these need the capability to bring in data from various data systems, including spreadsheets. We love the OVS solutions (http://ovsgroup.com/ ): lean, with good processes right out of the box, and it can be implemented to get you up and running within 90 days.
  • When integrating systems you may need some data cleaning. Don’t let that deter you; in less than a few weeks you can get the data cleaned. Companies like ours, Certisinc.com, will take thousands of records, then validate, de-duplicate, correct errors, and complete what is missing to give you a pristine set. So consider outsourcing data cleaning efforts. By outsourcing, you can have 20, maybe 40, data admins go through thousands of records in a matter of a few days.
  • Weave the about-to-retire knowledge into your processes before it is too late. Basically, understand their workflows and decision-making processes, take what is good, and implement it into systems, processes, and automated workflows. It takes a bit of time to discover these workflows and put them in place, but now is the time to do it. Examples are ESP surveillance, well failure diagnosis, identifying sweet frac’ing spots, etc. There are thousands upon thousands of workflows that can be implemented to forge almost error-proof procedures for “new-on-the-job” staff.
  • Many of your resources are retiring; consider hiring retirees. But if they would rather be on the beach than sitting around the office after 35+ years of work, then leverage systems like OGmentorsTM (http://youtu.be/9nlI6tU9asc ).
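The data-cleaning step in the list above (validate, de-duplicate, complete) can be sketched in a few lines of code. This is a minimal illustration, not our actual tooling; the record fields (api, well_name, operator) and the "keep the most complete record" rule are assumptions for the example.

```python
# Minimal sketch of a well-record de-duplication pass. Field names and
# the tie-breaking rule ("most complete record wins") are illustrative.
import re

def normalize_api(api: str) -> str:
    """Strip punctuation from an API well number so '42-123-45678'
    and '4212345678' match as the same well."""
    return re.sub(r"\D", "", api or "")

def deduplicate(records):
    """Keep the most complete record per normalized API number."""
    best = {}
    for rec in records:
        key = normalize_api(rec.get("api", ""))
        if not key:
            continue  # no identifier: flag for manual review instead of guessing
        completeness = sum(1 for v in rec.values() if v)
        if key not in best or completeness > best[key][0]:
            best[key] = (completeness, rec)
    return [rec for _, rec in best.values()]
```

Even this trivial rule (normalize the key, keep the richest record) collapses the most common duplicates; real projects layer fuzzy name matching and manual review on top.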

In short, timely and efficient access to the right, complete data, and the right knowledge woven into systems and processes, are just as important, if not more important, during skinny-margin times.

Good luck. I wish you all Happy Holidays and a Happy 2015.

A master’s degree in Petroleum Data Management?

I had dinner with one of the VPs of a major E&P company last week. One of the hot topics at the table was universities agreeing to offer an MSc in Petroleum Data Management. Great idea! I thought. But it brought so many questions to my mind.


Considering where our industry is with information management (way behind many other industries), I asked: who will define the syllabus for this PDM MSc? The majors? The service companies? Small independents? Boutique specialized service providers? IMS professors? All of the above?


Should we allow ourselves to be swayed by the majors and the giant service companies? With their funding they certainly have the capability to influence the direction, but is this the correct (or only) direction? I can think of a few areas where a major’s implementation of DM would be overkill for a small independent, which would get bogged down with processes that make it difficult to be agile, the very quality that made the independents successful with the unconventionals.


What should the prerequisites be? A science degree? Any science degree? Is a degree required at all? I know at least a couple of exceptional managers who are managing data management projects and setting up DM from scratch for oil and gas companies; they manage billions of dollars’ worth of data. They do not have degrees. What happens to them?


It takes technology to manage data, and an MSc in Petroleum Data Management is no different. But unlike petroleum engineering and geoscience technologies, the technology to manage data progresses fast; what is valid today may not still be valid next year! Are we going to teach technology, or are we teaching about oil and gas data? This is an easy one, at least in my mind: we need both, but more about the data itself and how it is used to help operators and their partners be safer, find more, and grow. We should encourage innovation to support what companies need.


PPDM (http://www.ppdm.org/ ) is still trying to define some standards. POSC (the Petrotechnical Open Software Corporation) came and went. Energistics (http://www.energistics.org/ ) is here and is making a dent. OpenSpirit (http://www.tibco.com/industries/oil-gas/openspirit ) made a dent but is no longer non-profit. Will there be standards that are endorsed by the universities?

The variation from company to company in how data management is implemented today is large. Studying and comparing the different variations would make a good thesis, I think…

I am quite excited about the potential of this program and will be watching developments with interest.


Change Coming Our Way, Prepare Data Systems to Store Lateral’s Details.

During the past decade, oil and gas companies have aimed their spotlight on efficiency. But should this efficiency come at the expense of data collection? Many companies are now realizing that it shouldn’t.

Consider the increasingly important re-fracturing effort. It turns out, in at least one area, that only 45% of re-fracs were considered successful when the candidates were selected using production data alone. However, when additional information (such as detailed completion, production, well integrity, and reservoir characterization data) was also used, a success rate of 80% was observed (see the Society of Petroleum Engineers paper SPE 134330, M.C. Vincent, 2010).


Prepare data systems to store details that are otherwise left in files.

Measurements while drilling (MWD), mud log cuttings analysis, and granular frac data are some of the data that can be collected without changing the drilling or completion workflow or the efficiency already achieved. This information, when acquired in the field, will make its way to petrophysicists and engineers. Most likely it ends up in reports, folders, and project databases. Many companies do not think about storing this data beyond that.

We argue, however, that to take advantage of this opportunity, archival databases should also be expanded to store this information in a structured manner, and the information should funnel its way to the various analytical tools. This practice allows technical experts to dive straight into analyzing the well data instead of diverting a large portion of their time to looking for and piecing data together. Selecting the best re-frac candidates in a field will require the above well data and then some, and many companies are starting to study those opportunities.

Good data practices to consider

To maximize economic success from re-stimulation (or from first stimulation for that matter) consider these steps that are often overlooked:

  1. Prepare archival databases to specifically capture and retain data from the lateral portions of wells. This data may include cuttings analysis, mud log analysis, rock mechanics analysis, rock properties, granular frac data, and well integrity data.
  2. Don’t stop at archiving the data; expose it to engineers and make it readily accessible to statistical and artificial intelligence tools, such as TIBCO Spotfire.
  3. Integrate, integrate, integrate. Engineers depend on ALL data sources (internal, partners, third party, the latest research and media) to find new correlations and possibilities. Analytic platforms that can bring together a variety of data sources and types should be made available. Consider big data platforms.
  4. Clean, complete, and accurate data will integrate well. If you are not there yet, engage a company that will clean the data for you.
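As one possible shape for the archival store in step 1, the sketch below keeps lateral measurements in structured, depth-indexed tables rather than buried in report files. The table and column names (lateral_sample, frac_stage, toc_pct, brittleness) are illustrative assumptions, not an industry standard schema.

```python
# Hypothetical archival schema for lateral data, keyed by well and
# measured depth so analytics tools can join samples to frac stages.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE well (
    uwi TEXT PRIMARY KEY,          -- unique well identifier
    name TEXT
);
CREATE TABLE lateral_sample (
    uwi TEXT REFERENCES well(uwi),
    md_ft REAL,                    -- measured depth along the lateral
    toc_pct REAL,                  -- total organic carbon from cuttings
    brittleness REAL,              -- geomechanical brittleness index
    source TEXT,                   -- e.g. 'mud log', 'MWD'
    PRIMARY KEY (uwi, md_ft, source)
);
CREATE TABLE frac_stage (
    uwi TEXT REFERENCES well(uwi),
    stage_no INTEGER,
    top_md_ft REAL,
    bottom_md_ft REAL,
    proppant_lb REAL,
    PRIMARY KEY (uwi, stage_no)
);
""")
```

The payoff of a structure like this is the join: a single query can line up cuttings-derived rock properties against the frac stage that covers the same measured-depth interval, which is exactly the correlation work re-frac screening needs.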

Quality, granular well data is the cornerstone of increasing re-frac success in horizontal wells, and of other processes as well. Collecting data and managing it well, even if you do not need it immediately, is an exercise in discipline, but it is also a strategic decision that must be made and committed to from the top down, whether you are drilling to “flip” or developing for the long term. Data is your asset.


Capture The Retiring Knowledge

The massive amount of knowledge that has retired or is about to retire in the next five years will bring some companies to a new low in productivity. The U.S. Bureau of Labor Statistics reported that 60% of job openings from 2010 to 2020 across all industries will result from retirees leaving the workforce, and it’s estimated that up to half of the current oil & gas workforce could retire in the next five to ten years.

For companies that do not have their processes defined and woven into their everyday culture and systems, relying instead on their engineers’ and geoscientists’ knowledge, the retirement of these professionals will cause a ‘brain drain,’ potentially costing these companies real downtime and real money.

One way to minimize the impact of “brain drain” is to document a company’s unique technical processes and weave them into training programs and, where possible, into automating technology. Why are process flows important to document? Process flow maps and documents are the geographical maps that give new employees the direction and the transparency they need, not only to climb the learning curve faster, but also to repeat the success that experienced staff deliver with their eyes closed.

For example, if a reservoir engineer decides to commission a transient test, equipment must be transported to the location, the well is shut in and a gauge is run in, the pressure buildup is measured, the data is interpreted, and the BHP is extrapolated and kh is calculated.
The above transient test process, if well mapped, would consist of: 1) decisions, 2) tasks/activities, 3) a sequence flow, 4) responsible and accountable parties, 5) clear inputs and outputs, and 6) possible reference materials and safety rules. These process components, when well documented and defined, allow a relatively new engineer to run the operation from start to end without downtime.
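The interpretation step in the example above (extrapolating BHP and calculating kh from the buildup) is exactly the kind of expert routine that can be captured in a workflow tool. As one hedged illustration, a classic Horner analysis can be coded in a few lines; the function name and inputs are hypothetical, and the standard oilfield-unit constant 162.6 assumes q in STB/d, B in RB/STB, viscosity in cP, and pressure in psi.

```python
# Sketch of a Horner buildup analysis: fit the straight-line portion of
# shut-in pressure vs. log10((tp + dt) / dt), extrapolate p* at infinite
# shut-in time, and compute kh from the slope. Field units assumed.
import numpy as np

def horner_analysis(tp_hours, dt_hours, pws_psi, q, B, mu):
    """Return the extrapolated pressure p* (psi) and kh (md-ft)
    from shut-in pressures pws_psi measured at times dt_hours
    after a producing time of tp_hours."""
    horner_time = (tp_hours + dt_hours) / dt_hours
    x = np.log10(horner_time)
    # straight-line fit; in practice only the infinite-acting points are used
    slope, intercept = np.polyfit(x, pws_psi, 1)
    m = -slope                   # buildup slope in psi per log cycle
    p_star = intercept           # pressure at Horner time = 1 (log10 = 0)
    kh = 162.6 * q * B * mu / m  # permeability-thickness, md-ft
    return p_star, kh
```

Encoding even this much of the procedure means a new engineer inherits the sequence and the sanity checks, not just a blank spreadsheet.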

When documenting this knowledge, some of the rules will make their way into contracts and sometimes into technology enablers, such as software and workflow applications. The retiring knowledge can easily be woven into the rules, the reference materials, the sequence flow, and the information systems.

Documenting technical processes is one tool to minimize the impact of a retiring workforce. Another, equally important, way to capture and preserve knowledge is to ensure that data in personal network drives is accumulated, merged with mainstream information, and put into context early enough for the retiring workforce to verify its accuracy before they leave.

Processes and data form the foundation of a company’s competitive edge: they cut back on rework and errors and help in quickly identifying new opportunities.

To learn more about our services on Processes or Data contact us at info@certisinc.com

More Shale Data Should Equal More Production, Unless Your Data is an Unusable Mess

As the U.S. overtakes Russia in Oil & Gas production because of its unconventional fields, new operators flood the industry. Inevitably, competition increases. The need for predictable and optimized well performance is greater than ever. The fastest route to optimization is quality data that can be used to understand unconventional fields better and drive the flow of operations efficiently.

However, as more data pours in, the cracks in many E&P companies’ data management systems are exposed. Geoscientists and engineers are left to make their own integrations and correlations between disparate systems, and left digging through folders trying to find documents and knowledge.

Some of the trouble lies in new methods of analyzing a vast array of data that was not considered as prominent in conventional fields. For example, geoscientists break shale down by geology, geochemistry, and geomechanics, and engineers now look into fracs through a microseismic lens. While this data was used in conventional fields before, the emphasis on it and the ways of analyzing it are different now; new parameters, such as TOC and brittleness, have emerged as key measures. When it comes to shale fields, the industry is still learning from the acquired data.

Well-organized, quality information that is easily found and flows efficiently through internal departments and supplying vendors will not only allow faster reaction to operational needs and opportunities, it will turn into a better strategy to increase EUR per well through a better understanding of the reservoirs.

How you take care of your data directly impacts your engineers’ and geoscientists’ efficiency and the speed at which they can find good production opportunities. Fast and efficient is the name of the game in the unconventional, competitive world.

It is not enough to provide a place to store new unconventional information and flow it to analytical systems; while those are the first steps, they must fit into a holistic approach that takes integrated unconventional operational processes to the next level of efficiency.

Cut Search Time for Critical Documents from Days to Seconds. It is Time to Stop Digging in Folder Structures

It wasn’t long ago that geoscientists and petroleum engineers at one renowned oil company might spend days searching for documents. “Searching” meant digging through folders (as many as 1,500 of them!) and discerning whether a “found” file was an official report or only an earlier draft. To give you an idea, some critical HSE documents were buried as deep as the 13th sub-folder (and then the correct version still had to be selected!).

Obviously, in this situation, emergency and critical decision cycle times were lengthened by the difficulty of finding the “buried” technical documents. The average time to locate and validate the accuracy of a document was calculated at 3 days.

When Certis arrived, the company’s folder system looked like an episode of “Hoarders.” The hoarder believes there is an organized system to his “madness,” but nobody else in the home can quite figure it out. Over the years, more than 2,000,000 documents had been amassed at this location, and the total was growing fast. As engineers and geoscientists floated in and out, the system fell victim to hundreds of interpretations. Unlike the hoarder’s goods, these documents contained vital information representing years of studies and billions of dollars of data acquisition. Years of knowledge, buried, literally.

In today’s competitive and fast-paced oil and gas operations, data is accumulating faster than ever, and decisions must be made faster than ever by petro-professionals who are already overextended. Compound that with the fact that a large portion of the knowledge resides in a workforce that may soon retire, and oil and gas companies that want to stay exceptional and competitive cannot afford to waste petro-professionals’ time hunting for critical records.

So, how do you get to a point where your organization can locate the right document instantly? We believe it is all about the processes, technology, and people put in place (a cliché, but so true).

When Certis completed this project, the technical community could locate their documents within a few seconds using a “Google-like” search. More importantly, they were (and are now) able to locate the latest version and trust it. The solution had to address three elements: people, processes, and technology.

The final solution meant collapsing the folders from 2,000 down to 150, adopting a DRM system without burdening the technical community, and implementing complete processes with a service element that ensured sustainability.

Centralized, standardized and institutionalized systems and processes were configured to take full advantage of the taxonomy and DRM systems. Once the ease of use and the value were demonstrated to the people, buy-in was easy to get.

Technology advances faster than our ability to keep up. This is especially true when working with professionals whose focus is (and should be!) on their projects, not on data management. We had to break the fear of change by proving there is a better way to work, one that increases efficiency and makes employees’ lives easier.

Legacy documents: what do you do with them?

Because solving operational issues in the field requires access to complete historical information, exhuming legacy technical documents, physical or electronic, from their buried locations was the next task.

On this project, the work involved prioritizing, locating, removing duplicates, clustering, and tagging files with standard metadata. With a huge number of files accumulated on network drives and in library rooms, a company must keep an eye on the cost/benefit ratio. How to prioritize and how to tag technical files became two key success factors in designing a cost-effective migration project.

This topic could go on and on, since there were so many details that made this project successful. But that may be for another post.

Read more about Certis and our oil and gas DRM services: http://ow.ly/oRQ5f