Category Archives: Digital Field

Considering Blockchain for E&P? Don’t Forget to Bake-in Data Integrity

A Welcome Addition to the E&P Digital Landscape: Blockchain

Over the decades, energy companies have adopted many technologies and tools to enhance and advance their competitive advantage and business margins. Information Technology (IT) has played a significant role in those enhancements; the most recent examples in the news come from AI, IoT and cloud computing. Nonetheless, oil and gas is still struggling with a few persistent data issues:

  1. security and data leakage
  2. siloed systems and/or a spaghetti of connections from ETLs, and
  3. reliability of data quality.

As a consequence, workflows and processes are still filled with examples of inefficiency and rework. But an emerging technology, blockchain[1], may have the answer we’ve been waiting for.

Blockchain is a distributed ledger and cryptographic technology that brings with it the promise of connectivity, security and data reliability that we have not been able to achieve with past technology, and that is desperately needed in this very specialized industry.

Data Integrity Baked Into Business Processes

Because it is still possible to enter wrong data into a blockchain application, thinking through how to capture and disseminate good-quality data with these applications is important to ensure sound analysis and informed decision making.

Blockchains can live alongside the current data architecture as a data governance layer and passively monitor quality. However, baking data integrity into business applications from the ground up is the best way forward.

Along with the distributed ledger and cryptographic signatures, blockchain also offers a smart contract feature that allows us to re-engineer and re-imagine processes and how we all work together. Thinking through the last-mile problem to build data integrity into the application from the ground up will be one of the key success factors that will make or break your ultimate solution. New jobs will arise to ensure that correct information is being captured, as specialists who can verify the validity of information will be a must. These specialists will be similar to insurance adjusters who confirm a claim is valid prior to insurance payment.

The further good news is that blockchain data will only need to be verified once. No more verifying the same data over and over again! Who has time for that?
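To make the verify-once idea concrete, here is a minimal sketch in Python of how a ledger entry’s hash ties a record to the chain so that any later edit is detectable. The well-test record and field names are hypothetical; this illustrates the principle, not a production blockchain.

```python
import hashlib
import json

def block_hash(record: dict, previous_hash: str) -> str:
    """Hash a data record together with the previous block's hash,
    so any later change to the record breaks the chain."""
    payload = json.dumps(record, sort_keys=True) + previous_hash
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()

# Hypothetical well-test record: entered once, verified by a specialist,
# then referenced by everyone downstream without re-verification.
record = {"well": "WELL-001", "test_date": "2019-03-14", "oil_rate_bopd": 850}
h1 = block_hash(record, previous_hash="0" * 64)

# If anyone edits the record later, the recomputed hash no longer matches.
tampered = dict(record, oil_rate_bopd=1850)
assert block_hash(tampered, previous_hash="0" * 64) != h1
```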

Recommended Resources

[1] A plethora of material describing blockchain is available; make sure you are reading curated articles. One good reference is from IBM (we are not endorsing their technology; Certis is vendor agnostic): Blockchain for Dummies.


For additional information, or for help architecting, implementing and training on successful blockchain projects, please contact us at info@certisinc.com

Why it’s time for Petroleum Geologists and Engineers to move away from generic data scientists

The industry has finally warmed up to Artificial Intelligence (AI) technologies. This is good news. AI technologies are a great opportunity for operators to increase their efficiency and boost their competitive advantage and safe operations. And, in fact, many have already done just that.

But not every company will deploy AI at the same pace or with the same capabilities. Skills to build meaningful and impactful AI models are still scarce. Also, the people who understand the physics of the earth and of hydrocarbons are not always the same ones who know how to use Data Science (DS) and AI tools. For that reason, some Exploration & Production (E&P) companies opt for a centralized DS team that has skills in statistics and building AI models. In this centralized model, engineers’ “exploratory” and “what-if” analyses go to a central team that builds a predictive model for their hypothesis.

This process is not ideal. It takes longer to reach an acceptable, final model even when a request is prioritized (long cycle time). How can you speed it up? While a central DS team may seem to be the only option at the moment, I would argue that eventually those who understand the physics should be the ones building and confirming their own AI and Machine Learning (ML) models. I would also argue that some of the brightest engineering and geoscience ideas are yet to come, once those professionals learn the tools.

Why Change?

For some petro-professionals, learning a new coding language (Python, for example) is not fun. After all, if they wanted to be coders they would be in a different profession. We have been there before. The situation is analogous to the old days of DOS commands, which were the domain of specialists and enthusiasts. But as soon as software (like Microsoft Office) came out with a Graphical User Interface (GUI) – where we clicked and typed in natural language – the engineers, geologists and the whole world came onboard. Well, AI and ML tools are making that same turn and no longer require any coding.

It was once a stretch to get petroleum engineers to use Excel spreadsheets; now it is an expectation. We should have the same expectation for advanced analytics, AI and ML tools.

 

Some suggestions

I leave you with some tips to reach that goal (if you agree with the above premises):

  • Look for advanced analytics tools with graphical user interfaces (as opposed to those that require you to learn a new language like Python). I like the interface of DataRobot (www.datarobot.com).
  • Make sure your software can easily connect to data sources – it’s not enough to have software with a good GUI.
  • Automate the flow of information and facilitate access to it – this is key to success, especially with data warehouses, lakes or hubs.
  • Make sure your users are trained in advanced analytics principles and on the software. At the very least, engineers should be able to build some basic predictive models (see the sketch after this list).
  • Consider cloud data warehouses and tools to get the power to run large datasets such as logs and seismic. We are following Snowflake (https://www.snowflake.com) and MemSQL (https://www.memsql.com/).
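Even if you never intend to code, it helps to know what a “basic predictive model” looks like under the hood, because the GUI platforms automate exactly these steps. Below is a minimal sketch in Python/scikit-learn; the file and column names (well_completions.csv, lateral_length_ft, proppant_lbs, stage_count, ip90_boepd) are hypothetical placeholders for your own well attributes.

```python
# Minimal sketch of a basic predictive model an engineer might prototype:
# predict 90-day initial production from a few completion attributes.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

wells = pd.read_csv("well_completions.csv")          # hypothetical export
features = wells[["lateral_length_ft", "proppant_lbs", "stage_count"]]
target = wells["ip90_boepd"]

X_train, X_test, y_train, y_test = train_test_split(
    features, target, test_size=0.25, random_state=42)

model = RandomForestRegressor(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

print("Hold-out R^2:", r2_score(y_test, model.predict(X_test)))
```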

Certis consults and delivers data services. We are systems agnostic. We focus on helping oil and gas clients set up their backbone data and processes. Find out more about our services

The Way To Maximize Value from M&A Assets

In North America, the only constant when it comes to oil and gas companies is change. With mergers and acquisitions (M&A), hydrocarbon assets constantly change hands. The value of acquired assets will then either be maintained, increased, decreased or maximized depending on how they are managed under the new owners. It is generally agreed that value can only be maximized when the asset’s geological models, reservoir dynamics and well behavior are understood down to the minute details. The new owner takes over the asset but is not guaranteed the people with the knowledge.

Building a clear understanding of the new asset becomes an urgent matter for the new owner.  This understanding is typically hidden under the mountain of data and files that change hands together with the asset. How and when the data is migrated to the new organization, therefore, can build up or bring down the value.

Typically, when an asset changes hands, the field staff remains, but the geologists and geoscientists who set the asset’s management strategy may not follow it. This means a great deal of knowledge is potentially lost in transition, which makes the data and documents delivered after the transaction is complete that much more important to understanding the details of the acquisition. Obtaining as much of this data as possible is crucial. As a geologist who has been through multiple mergers put it:

“Knowledge like drilling through a fault is only known to the asset team operating the asset. This information is not publicly available. During the transition, getting any maps or reports from the geologists will help the acquiring company develop the right models and strategies to increase value. We want all the data we can get our hands on.”

Another key consideration is software licenses and versions, which may or may not transfer.  We find that the risk of losing the information permanently due to software incompatibility, licensing, or structure issues is very real. Migrating the technical data during the transitioning period will help protect the new owner from data loss.

Per Harvey Orth, a geophysicist and former CIO who has been through three mergers and acquisitions:

In many cases, companies made an agreement with the software vendor to maintain a read-only copy of all the data; just in case they needed to extract some data they had not loaded into their production systems (for the new owner) or need the data for legal or tax reasons later (for the seller). In fact, keeping a read-only copy can be easily negotiated within a purchase agreement if you are divesting an asset. When acquiring, then everything and anything you can get your hands on can be essential to getting the most value from the field and should be migrated.

Tips to Protect the Value of New Assets 

Experts like us can help ensure that data is migrated quickly and efficiently and that the right data is obtained from the acquisition target. However, if inclined to manage the data transfer yourself, we share the following tips:

Make it Manageable, Prioritize it Right:

While all of the data and information is important, time is of the essence. Most companies will prioritize migrating “accounting” data, and rightly so, but to maximize value, technical data must also be at the top of the priority list. The following should top it: production volumes and pressure data; land and lease data; well construction and intervention data (drilling, completions and intervention history); and reservoir characterization (logs, petrophysics, core, etc.).

Do Due Diligence with a Master List

Getting your hands on all the data starts with a master list of all the assets, including such things as active wells and their statuses. This list is the first stop for every department that needs to build its knowledge and processes to manage the new assets. It is also the checklist against which to assess received information. If you have invested in an MDM (Master Data Management) system, then adding the new assets to the database should be one of your first steps.
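As an illustration, here is a minimal sketch in Python of that reconciliation step: comparing received well headers against the master list to flag what is still missing. The file and column names are hypothetical.

```python
# Minimal sketch: reconcile received data files against the acquisition's
# master well list. File and column names are hypothetical.
import pandas as pd

master = pd.read_csv("master_well_list.csv")          # one row per acquired well
received = pd.read_csv("received_well_headers.csv")   # wells present in the delivery

master_ids = set(master["api_number"].astype(str).str.strip())
received_ids = set(received["api_number"].astype(str).str.strip())

missing = sorted(master_ids - received_ids)      # on the master list, no data received
unexpected = sorted(received_ids - master_ids)   # received, but not on the master list

print(f"{len(missing)} wells still missing data, e.g. {missing[:5]}")
print(f"{len(unexpected)} wells not on the master list, e.g. {unexpected[:5]}")
```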

Know What is Good Quality and What Is Not.

One of the biggest obstacles that companies face is the realization that their own data is not standardized and clean.  So now they are faced with the prospect of adding bad to bad.

Much can be said about investing in data quality standards and governance practices. It makes folding in any new assets easier, faster and more cost effective. If you don’t have strong data standards yet, see if you can inherit them from the selling company, or get help from IM experts to create these standards and merge legacy data with the new acquisitions.

Make it Findable: Tag Your Electronic Files

Documents like geological maps, logs, lab reports, management presentations, and other files contain a wealth of information. Finding the right file can take considerable time, especially if the organizer was not you. Take advantage of Artificial Intelligence and “tag” the files based on their content. This will create a layer of metadata and make finding the right file based on “petroleum natural language” easier.
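As a simple illustration of the metadata idea, the sketch below tags text files by matching their content against a small tag vocabulary. The folder name and tag list are hypothetical, and a production solution would use NLP or a commercial tagging service rather than keyword matching.

```python
# Minimal sketch of content-based tagging to build a metadata layer.
# The tag vocabulary and folder path are hypothetical placeholders.
from pathlib import Path

TAGS = {
    "completion": ["frac", "proppant", "perforation", "stimulation"],
    "geology": ["formation", "porosity", "log", "core"],
    "well_integrity": ["casing", "cement", "pressure test"],
}

def tag_file(path: Path) -> list[str]:
    text = path.read_text(errors="ignore").lower()
    return [tag for tag, words in TAGS.items() if any(w in text for w in words)]

index = {str(p): tag_file(p) for p in Path("data_room").rglob("*.txt")}
for filename, tags in index.items():
    print(filename, "->", tags or ["untagged"])
```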

For additional information or a free consultation on migrating M&A data please contact us at info@certisinc.com

Coming Current with E&P Data Management Efforts

During the PNEC 2015 conference last week, we managed to entice some of the attendees passing by our booth to take part in a short survey. As an incentive we offered a chance to win a prize, and we kept the survey brief; we couldn’t make it too long and risk getting little or no intelligence.

I’m not sure if any of you will find the results to be a revelation or offer anything you did not already know anecdotally. But if nothing else, they may substantiate “feelings” with some numbers.

You will be pleased to know that more than 60% of the replies came from operators or NOCs. In this week’s blog I share the results and offer my thoughts on the first survey question.

[Survey graph: “Which data projects are of high priority in your mind?”]

In the question and graph above, “Which data projects are of high priority in your mind?”, it appears the industry continues to pursue data integration projects; the majority of the participants (73%) consider them the highest priority. Following closely on the priority list were “data quality” projects (data governance and legacy data cleaning), which 65% consider a priority.

Thoughts…

Integration will always be at the top of the priority list in the E&P world until we truly connect the surface measurements with the subsurface data in real time. Also, given that data integration cannot be achieved without pristine data, it is no surprise that data quality follows integration as a close second.

Because many “data cleaning” projects are driven by the need to integrate, data quality efforts are still focused on incoming data and mostly on “identification” data, as is the case in MDM projects.

Nonetheless, how a well was configured 20 years earlier and what failures (or not) were encountered during those 20 years are telling facts to engineers. Therefore, the quality of “legacy” technical data is just as important as that of new incoming data.

Reaching deeper than identification and header data to ensure technical information is complete and accurate is not only important for decision making; as my friend at a major company would say, it is important firstly for safety reasons, then for removing waste (a lean principle), and then for decisions. Of course, chipping away slowly at the large mountain of data is a grueling task and can be demotivating if there are only limited results.

To get them done right, with impactful E&P business results, these projects should be tackled with a clear vision and a holistic approach. As an industry we need to think about legacy data preparation strategically: do it once and be done with it.

Legacy data cleanup projects are temporary (with a start and an end date); experience tells me they are best accomplished by outsourcing them to professional data cleaning firms that fully understand E&P data.

This blog is getting long, so I’d better cover the results of the rest of the survey in the next one.

Please share your thoughts and correct me where you feel I got it wrong….

 

Part 3 of 3: Are we progressing? Oil & Gas Data Management Journey, the 2000s

The 1990s shopping spree for applications produced a spaghetti of links between databases and applications while also chipping away at the petro-professional’s effective time with manual data entry. Then, a wave of mega M&As hit the industry in the late ’90s and early 2000s.

Mega M&As (mergers and acquisitions) continued into the first part of the 2000s, bringing with them, at least for those on the acquiring side, a new level of data management complexity.

With mega M&As, the acquiring companies inherited more databases and systems, and many physical boxes upon boxes of data. This influx of information proved to be too much at the outset, and companies struggled – and continue to struggle – to check the quality of the technical data they’d inherited. Unknown at the time, the data quality issues present at the outset of these M&As would have lasting effects on current and future data management efforts. In some cases they gave rise to lawsuits that were settled for millions of dollars.

Early 2000s

Companies started to experiment with the Internet.  At that time, that meant experimenting with simple reporting and limited intelligence on the intranet.  Reports were still mostly distributed via email attachments and/or posted in a centralized network folder.

I am convinced that it was the Internet that necessitated cleaning technical data and key header information, for two reasons: 1) Web reports forced integration between systems, as business users wanted data from multiple siloed databases on one page; more often than not, real-time integration could not be realized without cleaning the data first. 2) Web reports linked directly to databases exposed more “holes” and multiple “versions of the same data”; they revealed how necessary it was to have only ONE VERSION of information, and that it had better be the truth.

The majors were further ahead, but at many other E&P companies engineers were still integrating technical information manually, taking a day or more to get a complete view and understanding of their wells; Excel was mostly the tool. Theoretically, with these new technologies, it should have been possible to automate and instantaneously give a 360-degree view of a well, field, basin and what have you. In practice it was a different story because of poor data quality. Many companies started data cleaning projects; some efforts were massive, in the tens of millions of dollars, and involved merging systems from many past acquisitions.

In the USA, in addition to the Internet, the collapse of Enron in October 2001 and the Sarbanes–Oxley Act enacted on July 30, 2002, forced publicly traded oil and gas companies to document their processes and provide better transparency into operations and finances. Data management professionals were busy implementing their understanding of SOX in the USA. This required tighter definitions and processes around data.

Mid 2000s

By the mid-2000s, many companies had started looking into data governance. Sustaining data quality was now at the forefront. The need for both sustainably good data and data integration gave rise to Well Master Data Management initiatives. Projects on well hierarchy, data definitions, data standards, data processes and more all evolved around reporting and data cleaning projects. Each company worked on its own standards, sharing success stories from time to time. Organizations such as Energetics, PPDM and DAMA came in handy but were not fully relied on.

Late 2000s

When working on sustaining data quality, one runs into the much-debated question of who owns the data. While for years the IT department tried to lead “data management” efforts, it was not fit to clean technical oil and gas data alone; it needed heavy support from the business. However, the engineers and geoscientists did not feel it was their priority to clean “company-wide” data.

CIOs and CEOs started realizing that separating data from systems is a better proposition for E&P. Data lives forever while systems come and go. We started seeing a movement towards a data management department, separate and independent from IT but working closely with it. A few majors made this move in the mid-2000s with good success stories; others started in the late 2000s, first by having a Data Management Manager report to the CIO (perhaps with a dotted line to a business VP), then report directly to the business.

Who would staff a separate data management department? You guessed it; resources came from both the business and IT. In the past, each department or asset had its own team of technical assistants (“Techs”) who supported its data needs (purchase, clean, load, massage, etc.). Now many companies are consolidating Techs into one data management department supporting many departments.

Depending on how the DM department is run, this can be a powerful model if it is truly run as a service organization with the matching sense of urgency that E&P operations see. In my opinion, this could result in cheaper, faster and better data services for the company, and a more rewarding career path for those who are passionate about data.

In late 2008 and throughout 2009, gas prices started to fall, more so in the USA than in other parts of the world: shale natural gas had caught up with demand and was exceeding it. In April 2010, we woke up to witness one of the largest offshore oil spill disasters in history, when a BP well, Macondo, exploded and gushed oil.

Companies that had put all their bets on gas fields or offshore fields had no appetite for data management projects. For those well diversified or more focused on onshore liquids, data management projects ran either at full speed or as business as usual.

 2010 to 2015 ….

Companies that had enjoyed the high oil prices since 2007 started investing heavily in “digital” oilfields. More than 20 years had passed since the majors started this initiative (I was on this type of project with Schlumberger for one of the majors back in 1998), but now it was more justifiable than ever. Technology prices had come down, system capacities were up, network reliability was strong, wireless connections were reasonably steady, and more. All came together like a perfect storm to resurrect the “smart” field initiatives like never before. Even the small independents were now investing in this initiative; high oil prices justified the price tag (multiple millions of dollars) of these projects. A good part of these projects is in managing and integrating real-time data streams and intelligent calculations.

Two more trends appeared in the first half of the 2010s:

  • Professionalizing petroleum data management. This seemed like a natural progression now that data management departments are in every company. The PPDM organization has a competency model that is worth looking into. Some of the majors have their own models that are tied to their HR structure. The goal is to reward a DM professional’s contribution to the business’s assets. (Also please see my blog on the MSc in Petroleum DM.)
  • Larger companies starting to experiment with and harness the power of Big Data, and the integration of structured with unstructured data. Metadata and managing unstructured content have become more important than ever.

Both trends have tremendous contributions that are yet to be fully harnessed. The Big Data trend in particular is nudging data managers to start thinking about more sophisticated “analysis” than they did before, although one could argue that the Technical Assistants who helped engineers with some analysis were already nudging toward data analytics initiatives.

By December 2015, the oil price had collapsed more than 60% from its peak.

But to my friends’ disappointment, standards are still being defined. Well hierarchy, while it seems simple to business folks, will seemingly require the intervention of the UN to get automated and running smoothly across all asset types and locations. And with all the data quality commotion, some data management departments are a bit detached from operational reality and take too long to deliver.

This concludes my series on the history of Petroleum Data Management. Please add your thoughts; I would love to hear your views.

For Data Nerds

  1. Data ownership has now come full circle, from the business to IT and back to business.
  2. The rise of shale and coal-bed methane properties and the fast evolution of field technologies are introducing new data needs. Data management systems and services need to stay nimble and agile; the old way of taking years to come up with a usable system is too slow.
  3. Data cleaning projects are costly, especially when cleaning legacy data, so prioritizing and having a complete strategy that aligns with the business’s goals are key to success. Starting with well-header data is a very good start, but aligning with what operations really need will require paying attention to many other data types, including real-time measurements.
  4. When instituting governance programs, having a sustainable, agile and robust quality program is more important than temporarily patching problems based on a specific system.
  5. Tying data rules to business processes, starting from the wellspring of the data, is prudent for sustainable solutions.
  6. Consider outsourcing all your legacy data cleanups if they take resources away from supporting day-to-day business needs. Legacy data cleaning outsourced to specialized companies will almost always be faster, cheaper and more accurate.
  7. Consider leveraging standardized data rules from organizations like PPDM instead of building them from scratch, and consider adding to the PPDM rules database as you define new ones. When rules are standardized, sharing and exchanging data becomes easier and more cost effective (a minimal sketch of what such a rule might look like in code follows this list).
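For illustration only, here is a minimal sketch in Python of what reusable well-header rules might look like in practice. The rule names, thresholds and file name are hypothetical and are not the PPDM rules themselves; the point is that each rule is small, shareable and testable.

```python
# Minimal sketch of reusable well-header data rules, in the spirit of a
# shared rules library. Example rules and thresholds are illustrative only.
import pandas as pd

def rule_api_present(df):
    # API number populated and at least 10 characters long
    return df["api_number"].notna() & (df["api_number"].astype(str).str.len() >= 10)

def rule_spud_before_completion(df):
    spud = pd.to_datetime(df["spud_date"], errors="coerce")
    comp = pd.to_datetime(df["completion_date"], errors="coerce")
    return spud <= comp

RULES = {
    "API number populated": rule_api_present,
    "Spud date precedes completion date": rule_spud_before_completion,
}

headers = pd.read_csv("well_headers.csv")   # hypothetical extract
for name, rule in RULES.items():
    passed = rule(headers)
    print(f"{name}: {passed.mean():.0%} of {len(headers)} wells pass")
```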

Data and Processes are your two friends in fat or skinny margin times – Some tools and ideas to weather low oil prices

Well, 2014 is ending with oil prices down and an upward trend in M&A activity. For those nearing retirement age, this is not all bad news. For those of us still building our careers and companies, well, we have uncertain times ahead of us. This got me asking: is it a double whammy to have the most knowledgeable staff retiring when oil prices are low? I think it is.

At the very least, companies will no longer have the “fat margins” to forgive errors or to sweep costly mistakes under the rug! While costs must be watched closely, with the right experience some costs can be avoided all together. This experience is about to retire.

E&P companies that have already invested (or are investing) in putting in place the right data and processes, capturing knowledge into their analysis and opportunity prioritization, will be better equipped to weather low prices. On the other hand, companies that have been making money “despite themselves” will be living on their savings, hoping to weather the storm. If the storm stays too long or is too strong, they will not survive.

Controlling cost the right way

Blanket cost cutting across all projects is not good business. For example, some wells cannot withstand being shut down or having repairs deferred; you would risk losing the wells altogether. Selectively prioritizing capital and operational spending with higher margins and controllable risks, however, is good business. Supporting this good business practice is a robust foundation of systems, processes and data practices that empower a company to watch the important metrics and act fast!

We also argue that without relevant experience some opportunities may not be recognized or fully realized.

Here are some good tools to weather these low prices:

Note that this is a quick list of things that you can do “NOW” for just a few tens or a few hundred thousand dollars (rather than the million-dollar projects that may not be agile at these times):

  • If you do not have this already, consider implementing a system that will give you a 360-degree view of your operations and capital projects. Systems like these need the capability to bring in data from various data systems, including spreadsheets. We love the OVS solution (http://ovsgroup.com/). It is lean, comes with good processes right out of the box, and can be implemented to get you up and running within 90 days.
  • When integrating systems you may need some data cleaning. Don’t let that deter you; in less than a few weeks you can get some data cleaned. Companies like us, Certisinc.com, will take thousands of records, validate and de-duplicate them, correct errors, complete what is missing, and give you a pristine set. So consider outsourcing data cleaning efforts: by outsourcing you can have 20, maybe 40, data admins go through thousands of records in a matter of a few days (a sketch of a typical de-duplication pass follows this list).
  • Weave the about-to-retire knowledge into your processes before it is too late. Basically, understand their workflows and decision-making processes, take what is good, and implement it in systems, processes and automated workflows. It takes a bit of time to discover them and put them in place, but now is the time to do it. Examples are ESP surveillance, well failure diagnosis, identifying sweet frac’ing spots, etc. There are thousands upon thousands of workflows that can be implemented to forge almost error-proof procedures for “new-on-the-job” staff.
  • Many of your resources are retiring; consider hiring retirees back. But if they would rather be on the beach than sitting around the office after 35+ years of work, then leverage systems like OGmentors™ (http://youtu.be/9nlI6tU9asc).
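As a small illustration of what a de-duplication pass looks like, here is a minimal sketch in Python: normalize the matching fields, then flag likely duplicates for human review. The file and column names are hypothetical.

```python
# Minimal sketch of one data-cleaning pass: normalize well identifiers and
# flag likely duplicates before human review. Column names are hypothetical.
import pandas as pd

wells = pd.read_csv("well_master_extract.csv")

# Normalize the fields used for matching.
wells["name_norm"] = (wells["well_name"].str.upper()
                      .str.replace(r"[^A-Z0-9]", "", regex=True))
wells["api_norm"] = wells["api_number"].astype(str).str.replace("-", "")

# Exact duplicates on API, plus candidate duplicates on normalized name.
dupe_api = wells[wells.duplicated("api_norm", keep=False)]
dupe_name = wells[wells.duplicated("name_norm", keep=False)]

print(f"{len(dupe_api)} rows share an API number")
print(f"{len(dupe_name)} rows share a normalized well name (review manually)")
```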

In short, timely and efficient access to the right and complete data, and the right knowledge woven into systems and processes, are just as important, if not more so, in skinny-margin times.

Good luck. I wish you all Happy Holidays and a Happy 2015.

A master’s degree in Petroleum Data Management?

I had dinner with one of the VPs of a major E&P company last week. One of the hot topics on the table was universities agreeing to offer an MSc in Petroleum Data Management. Great idea, I thought! But it brought so many questions to my mind.

 

Considering where our industry is with information management (way behind many other industries), I asked who will define the syllabus for this PDM MSc? The majors? The service companies? Small independents? Boutique specialized service providers? IMS professors? All of the above?

 

Should we allow ourselves to be swayed by the majors and giant service companies? With their funding they certainly have the capability to influence the direction, but is this the correct (or only) direction? I can think of a few areas where the majors’ implementation of DM would be overkill for small independents; they would get bogged down with processes that make it difficult to be agile, and that agility is the very reason the independents succeeded with the unconventionals.

 

What should the prerequisite be? A science degree? Any science degree? Is a degree required at all? I know at least a couple of exceptional managers who run data management projects and set up DM from scratch for oil and gas companies; they manage billions of dollars’ worth of data. They do not have a degree; what happens to them?

 

It takes technology to manage the data, and an MSc in Petroleum Data Management is no different. But unlike petroleum engineering and geoscience technologies, technology to manage data progresses fast; what is valid today may not still be valid next year! Are we going to teach technology, or are we teaching about oil and gas data? This is an easy one, at least in my mind: we need both, but more about the data itself and how it is used to help operators and their partners be safer, find more and grow. We should encourage innovation to support what companies need.

 

PPDM (http://www.ppdm.org/) is still trying to define some standards; POSC (the Petrotechnical Open Standards Consortium) came and went; Energistics (http://www.energistics.org/) is here and is making a dent; OpenSpirit (http://www.tibco.com/industries/oil-gas/openspirit) made a dent but is no longer non-profit. Will there be standards that are endorsed by the universities?

The variation from company to company in how data management is implemented today is large. Studying and comparing the different approaches would make a good thesis, I think…

I am quite excited about the potential of this program and will be watching developments with interest.

 

Change Coming Our Way: Prepare Data Systems to Store the Lateral’s Details

During the past decade, oil and gas companies have aimed their spotlight on efficiency. But should this efficiency come at the expense of data collection? Many companies are now realizing that it shouldn’t.

Consider the increasingly important re-fracturing effort. It turns out, in at least one area, that only 45% of re-fracs were considered successful when the candidates were selected using production data alone. However, when additional information (such as detailed completion, production, well integrity and reservoir characterization data) was also used, a success rate of 80% was observed (see the Society of Petroleum Engineers paper SPE 134330 by M.C. Vincent, 2010).


Prepare data systems to store details otherwise left in files.

Measurement while drilling (MWD), mud log and cuttings analysis, and granular frac data are some of the data that can be collected without changing the drilling or completion operations workflow or the efficiency achieved. This information, when acquired in the field, will make its way to petrophysicists and engineers; most likely it ends up in reports, folders and project databases. Many companies do not think about storing this data beyond that.

We argue, however, that to take advantage of this opportunity, archival databases should also be expanded to store this information in a structured manner, and the information should funnel its way to various analytical tools. This practice allows technical experts to dive straight into analyzing the wells’ data instead of diverting a large portion of their time to looking for and piecing data together. Selecting the best re-frac candidates in a field will require the above well data and then some, and many companies are starting to study those opportunities.

Good data practices to consider

To maximize economic success from re-stimulation (or from first stimulation for that matter) consider these steps that are often overlooked:

  1. Prepare archival databases to specifically capture and retain data from the lateral portions of wells. This data may include cuttings analysis, mud log analysis, rock mechanics analysis, rock properties, granular frac data, and well integrity data (see the sketch after this list).
  2. Don’t stop at archiving the data; expose it to engineers and make it readily accessible to statistical and Artificial Intelligence tools. One such tool is TIBCO Spotfire.
  3. Integrate, integrate, integrate. Engineers depend on ALL data sources – internal, partner, third-party, the latest research and media – to find new correlations and possibilities. Analytic platforms that can bring together a variety of data sources and types should be made available. Consider Big Data platforms.
  4. Clean, complete and accurate data will integrate well. If you are not there yet, get a company that will clean the data for you.
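To give a flavor of what “structured” means here, below is a minimal sketch in Python of a per-stage lateral record that an archival database could hold instead of leaving the values buried in report files. The field names are illustrative, not a standard schema.

```python
# Minimal sketch of a structured record for per-stage lateral data.
# Field names are illustrative placeholders, not an industry standard.
from dataclasses import dataclass
from typing import Optional

@dataclass
class LateralStageRecord:
    well_id: str
    stage_number: int
    measured_depth_top_ft: float
    measured_depth_base_ft: float
    proppant_lbs: float
    fluid_bbl: float
    avg_treating_pressure_psi: float
    cuttings_lithology: Optional[str] = None      # from mud log / cuttings analysis
    youngs_modulus_mpsi: Optional[float] = None   # rock mechanics, if measured

stage = LateralStageRecord("WELL-001", 12, 14250.0, 14480.0,
                           250_000, 5_800, 8_900, "calcareous shale", 3.1)
print(stage)
```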

Quality, granular well data is the cornerstone of increasing re-frac success in horizontal wells, and of other processes as well. Collecting data and managing it well, even if you do not need it immediately, is an exercise in discipline, but it is also a strategic decision that must be made and committed to from the top down. Whether you are drilling to “flip” or developing for the long term, data is your asset.

 

Capture The Retiring Knowledge

The massive knowledge that is retiring and about to retire in the next five years will bring some companies to a new low in productivity. The U.S. Bureau of Labor Statistics reported that 60% of job openings from 2010 to 2020 across all industries will result from retirees leaving the workforce, and it’s estimated that up to half of the current oil & gas workforce could retire in the next five to ten years.

For companies that do not have their processes defined and woven into their everyday culture and systems, relying instead on their engineers’ and geoscientists’ knowledge, retirement of these professionals will cause a ‘brain drain,’ potentially costing these companies real downtime and real money.

One way to minimize the impact of “brain drain” is by documenting a company’s unique technical processes and weaving them into training programs and, where possible, into automation technology. Why are process flows important to document? Process flow maps and documents are the geographical maps that give new employees the direction and the transparency they need, not only to climb the learning curve faster, but also to repeat the success that experienced resources deliver with their eyes closed.

For example, if a reservoir engineer decides to commission a transient test, equipment must be transported to location, the well is shut in and entered, the pressure buildup is measured, the data is interpreted, the BHP is extrapolated, and kh is calculated.
The above transient test process, if well mapped, would consist of: 1) decisions, 2) tasks/activities, 3) a sequence flow, 4) responsible and accountable parties, 5) clear inputs and outputs, and 6) possible reference materials and safety rules. These process components, when well documented and defined, allow a relatively new engineer to run the operation from start to end without downtime.
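For context, the calculation at the end of that process is standard pressure-transient analysis. Below is a minimal sketch in Python of the classic radial-flow buildup relationship in oilfield units, assuming the Horner straight-line slope has already been read from the plot; the input values are hypothetical.

```python
# Minimal sketch of the end-of-process calculation for a pressure buildup test
# (single-phase radial flow, oilfield units). Input values are hypothetical.
def kh_from_horner_slope(q_stb_d, mu_cp, b_rb_stb, m_psi_per_cycle):
    """Permeability-thickness (md.ft) from the Horner straight-line slope m."""
    return 162.6 * q_stb_d * mu_cp * b_rb_stb / m_psi_per_cycle

kh = kh_from_horner_slope(q_stb_d=500, mu_cp=1.2, b_rb_stb=1.25, m_psi_per_cycle=40)
print(f"kh = {kh:.0f} md.ft")  # net thickness h from logs then gives permeability k
```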

When documenting this knowledge, some of the rules will make their way into contracts and sometimes into technology enablers, such as software and workflow applications. The retiring knowledge can easily be woven into the rules, the reference materials, the sequence flow, and information systems.

Documenting technical processes is one of the tools to minimize the impact of a retiring workforce. Another equally important way to capture and preserve knowledge is to ensure that data in personal network drives is accumulated, merged with mainstream information, and put in context early enough for the retiring workforce to verify its accuracy before they leave.

A company’s processes and data form the foundation of its competitive edge, cut back on rework and errors, and help in quickly identifying new opportunities.

To learn more about our services on Processes or Data contact us at info@certisinc.com

Bring It On Sooner & Keep It Lifting Longer. Solutions To Consider For ESPs (Or Any Field Equipment)

Set on average 6,000 feet below the surface, electrical submersible pumps (a.k.a. ESPs) provide artificial lift for liquid hydrocarbons in more than 130,000 wells worldwide.
Installing the correct ESP system for the well, installing it precisely, and carefully monitoring the system are paramount to reducing the risk of a premature end to an ESP’s life cycle. But the increasingly long laterals of horizontal wells, along with rapid drilling in remote areas, are creating challenges for efficient operations and for the ESP’s life span. Implementing the correct processes and data strategies will, undoubtedly, be the cheapest and fastest way to overcome some of these challenges.

1- Implement A Process Flow That Works, Break The Barriers

When a decision is made to install an ESP in a well, a series of actions is triggered: preparing specifications, arranging for power, ordering equipment, scheduling operations, testing, and finally installing it in the well, to name a few. These actions and decisions involve individuals from multiple departments within an organization as well as external vendors and contractors. This series of actions forms a process flow that is sometimes inefficient and drawn out, causing delays in producing revenue. In addition, processes sometimes fall short, causing premature pump failures that interrupt production and raise operational costs.
Research into many industry processes shows communication challenges are one of the root causes of delays, according to LMA Consulting Group Inc. Furthermore, communication challenges increase exponentially when actions change hands and departments. A good workflow will cut across departmental barriers to focus on the ultimate goal: making sure engineering, procurement, logistics, accounting, vendors, contractors and field operations are all on the same page and have a simple, direct means to communicate effectively. More importantly, the workflow will allow the team to share the same level of urgency and keep stakeholders well informed with correct information about their projects. If you are still relying on phones, paper and email to communicate, look for workflow technology that will bring all parties onto one page.

A well-thought through workflow coupled with fit-for-purpose technology and data is critical, not only to ensure consistent successful results each time but also to minimize delays in revenue.

2- ESP Rented Or Purchased, It Does Not Matter… QA/QC Should Be Part Of Your Process

Although ESPs are rented and the vendor will switch out non-performing ones, ensuring that the right ESP is installed for a well should be an important step in the operator’s process and procedures. Skipping this step means operators will incur the cost of shutdowns and of disturbing reservoir conditions that may otherwise have stabilized, not to mention exposure to risks each time a well is entered.
More importantly, a thoughtful workflow ensures a safe and optimal life span for ESPs regardless of the engineers or vendors involved, especially in this age of mass-retiring knowledge.

At today’s oil prices, interrupted production from a 1,000-barrel-per-day well will cost an operator at least $250,000 in delayed revenue for a 5-day operation (1,000 bbl/day × 5 days × roughly $50/bbl). Predictive and prescriptive analytics, in addition to efficient processes, can keep the interruption to a minimum, if not avoid it altogether.

3- Know Why And How It Failed Then Improve Your Processes – You Need The Data And The Knowledge

One last point in this blog: because ESPs consist of several components (a motor, a pump, a cable, an elastomer, etc.), an ESP failure can be electrical, mechanical, thermal, or related to fluid/gas composition. Capturing and understanding the reasons for a failure, in a system that allows for effective data analysis, provides insight that can be carried forward to future wells and to monitoring systems. Integrating this knowledge into predictive or even prescriptive analytics to guide new engineers will have an effect on an operator’s bottom line. A few vendors in the market offer this kind of technology; weaving the right technology, data and processes to work in synergy is where the future is.
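As a small illustration, the sketch below shows one way to capture ESP failure reasons as structured records rather than free text, so they can later feed surveillance and predictive models. The field names and categories are illustrative, not an industry-standard schema.

```python
# Minimal sketch: capture ESP failure reasons as structured records rather than
# free text, so they can feed later surveillance and predictive analytics.
# Field names and categories are illustrative only.
from dataclasses import dataclass, asdict
from datetime import date

@dataclass
class EspFailure:
    well_id: str
    install_date: date
    failure_date: date
    component: str        # e.g. "motor", "pump", "cable", "elastomer"
    failure_mode: str     # e.g. "electrical", "mechanical", "thermal", "fluid/gas"
    run_life_days: int
    notes: str = ""

failures = [
    EspFailure("WELL-001", date(2014, 2, 1), date(2014, 11, 15),
               "cable", "electrical", 287, "insulation breakdown near packer"),
]
print([asdict(f) for f in failures])
```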

On how to implement these solutions please contact our team at info@certisinc.com.
