Category Archives: Process Improvement

Why Connecting Silos With Better IM Architecture Is Important

If you work in an oil and gas company, then you are familiar with the functional divides. We all know the jokes about geologists vs. engineers; we laugh and even invent our own. But jokes aside, oil and gas companies operate in silos, and with good reason.

But while organizational silos may be necessary to excel and maintain standards of excellence, collaboration and connection across the silos are crucial for survival.

For an energy company to produce hydrocarbons from an asset, it needs all the departments to work together (geoscience, engineering, finance, land, supply chain, etc.). This requires sharing detailed information and collaborating beyond meeting rooms and email attachments. But the reality in many oil and gas companies today is different: functional silos extend into information silos.

Connected Silos Are Good. Isolated Silos Are Bad

In an attempt to connect silos, “Asset Team” or “Matrix” organizations are formed and incentive plans are carefully crafted to share goals between functions. These are great strides, but no matter the organizational structure or the incentives provided, miscommunication, delays, and poor information hand-over are still commonplace. Until we solve the problem of seamless information sharing, the gap between functional departments will persist, because we are human and we rationalize our decisions differently. This is where technology and automation (if architected correctly) can play a role in closing the gap between the silos.

Asset team members and supporting business staff have an obligation to share information not only through meetings and email attachments but also by organizing and indexing asset files throughout the life of the asset. Fit-for-purpose IM architecture has a strategic role to play in closing the gap between the functional silos.

Connecting Functional Silos With IM Takes Vision & Organizational Commitment 

Advancements in IM (Information Management) and BPMS (Business Process Management Systems) can easily close a big part of the remaining gap. Yet many companies have not been successful in doing so, despite significant investments in data and process projects. There can be many reasons for this; here are two of the most common pitfalls I come across:

  • Silo IM projects or systems – Architecting and working on IM projects within one function without regard to the impact on other departments. I have seen millions of dollars spent to solve isolated geoscience data needs without accounting for the impact on engineering and land departments, or spent on exploration IM projects without regard to the appraisal and development phases of the asset. Quite often, organizations do not take the time to look at end-to-end processes and their impact on the company’s goals. As a result, millions of dollars are spent on IM projects without bringing the silos any closer. Connecting silos through an IM architecture requires a global vision.
  • Lack of commitment to enterprise standards – If each department defines and collects information according to its own needs without regard for the company’s needs, it is up to other departments to translate and reformat. This often means rework and repeated verification whenever information reaches a new departmental ‘checkpoint’.

The above pitfalls can be mitigated by recognizing the information dependencies and commonalities between departments, then architecting global solutions based on accepted standards and strong technology. It takes a solid vision and commitment.

For a free consultation on how to connect silos effectively, please schedule your appointment with a Certis consultant. Email us at info@certisinc.com or call us on 281-377-5523.

Juicy Data Aligned


Around the corner from my house is a local shop selling an excellent assortment of fresh vegetable and fruit juices. Having tried their product, I was hooked and thought it would be a good daily addition to my diet. But I knew, with my schedule, that unless I made a financial commitment and paid ahead of time, I would simply forget to return on a regular basis. For this reason, I broached the subject of a subscription with the vendor. If the juice was already paid for, and all I had to do was drop in and pick it up, I’d save time and have an incentive to stop by (or waste money).

However, the owner of the shop did not have a subscription model and had no set process for handling one. But as any great businessperson does when dealing with a potential long-term loyal customer, the owner accommodated my proposition: she simply wrote the subscription terms on a piece of paper (my name, total number of juices owed, and date of first purchase) and communicated the arrangement to her staff. This piece of paper was tacked to the wall behind the counter. I could now walk in at any time and ask for my juice. Yes!

Of course, this wasn’t a perfect system, but it aligned with business needs (more repeat business) and worked without fail, until, of course, it eventually failed. On my second-to-last visit, the clerk behind the counter could not find the paper. Whether or not I got the juice owed to me that day is irrelevant to the topic at hand; the business response, however, is not.

When I went in today, they had a bigger piece of paper, with a fluorescent tag and large fonts. More importantly, they had also added another data point, labeled ‘REMAINING DRINKS’. This simple addition to their data, and slight change to the process, made it easier and faster for the business to serve a client. Previously, the salesperson would have to count the number of drinks I had had to date, add the current order, then deduct the sum from the total subscription. Now, at a glance, a salesperson can tell whether I have drinks remaining, and as you can imagine, deducting the two juices I picked up today from the twelve remaining is far simpler. The data and process adjustment also helped them avoid liability and improved their margins (more time to serve other customers). To me, this is a perfect example of aligning data solutions to business needs.
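The juice-shop fix is, in effect, storing a derived value instead of recomputing it from history on every visit. A minimal sketch of the idea; the class, names, and numbers are illustrative, not the shop's actual ledger:

```python
class Subscription:
    """Sketch of the juice-shop ledger; names and numbers are illustrative."""

    def __init__(self, name, total_drinks):
        self.name = name
        self.history = []               # every pickup, kept for auditability
        self.remaining = total_drinks   # the new 'REMAINING DRINKS' data point

    def pick_up(self, count):
        if count > self.remaining:
            raise ValueError("not enough drinks left on the subscription")
        self.history.append(count)
        self.remaining -= count         # one subtraction at the counter

sub = Subscription("Regular Customer", total_drinks=12)
sub.pick_up(2)
print(sub.remaining)  # 10 -- read at a glance, no recounting of history
```

The history is still kept, so the derived field can always be audited against it; the day-to-day transaction just no longer depends on replaying it.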

There are several parallels between the above analogy and our business, the oil and gas industry, albeit with a great deal more complexity. The data needs of our petro-professionals in land, geoscience, and engineering have been proven to translate directly into financial gains, but are we listening closely enough to the real needs of the business? See our blog post Better Capital Allocation With A Rear-View Mirror – Look Back for an example of what it takes to align data to corporate needs.

There is real value to harvest inside an organization when data strategies are elevated to higher standards. Like the juice shop, oil and gas companies can reap benefits from improved data handling in terms of response time, reduced overhead, and client (stakeholder) satisfaction, but on a far larger scale. If the juice shop had not adapted their methodology in response to the process failure (even if it wasn’t likely to recur), the customer perception might be that they didn’t care to provide better service. Instead, they might just get unofficial advertising from readers asking where I get my juice. I’d suggest that the oil and gas industry could benefit from similar data-handling improvements. Most companies today align their data management strategies to departmental and functional needs. Unless the data is also aligned to corporate goals, many companies will continue to leave money on the table.

We have been handicapped by high margins; will this happen again, or will we learn?

About 15 to 20 years ago, we started to discuss and plan the implementation of databases in oil and gas, in hopes of reaping all their promised benefits. And we did plan and deploy those databases. It is now inconceivable to draw geological maps by hand or to store production volumes in books. In the last ten years, we have also moved beyond simple storage of digital content and started managing data quality more aggressively. Here too, we have made inroads. But have we done enough?

Have you ever wondered why companies are still cleaning their data over and over again? Or why we are still putting up building blocks such as standards for master well lists and hierarchies? It seems to me that the industry as a whole is unable to break through the foundational stages of enterprise information management. Because they cannot break through, companies fail to achieve the sustainable, robust foundation that would let their systems keep pace with business growth or asset diversification.

Perversely, I believe this is because the oil and gas industry has been handicapped by high margins. When a company is making money despite itself, throwing additional bodies and resources at a pressing issue seems like the fastest and most effective solution in the moment. And because the industry is structured so that opportunities must be seized in the moment, there is often little time to wait for the right solution to be implemented.

Throwing money at a problem is not always the wrong thing to do. However, if it becomes your go-to solution, you are asking for trouble.

I would argue that highly leveraged companies have put themselves at high risk of bankruptcy because they do not invest sufficiently in efficiency and agility through optimized processes and quality information flow. For example, coming up with the most effective completion for your reservoir requires access to quality, granular technical data. This data does not just happen; it takes a great deal of wiring and plumbing work across your organization’s data and processes. Luckily, if done right, it is largely a one-time investment with minimal operational upkeep.

According to Bloomberg, CNN and Oil & Gas 360 reports, during this ongoing downturn, at least 60 companies have entered chapter 11 in the USA alone. Ultra, Swift, Sabine, Quicksilver, American Energy are just a few of these highly leveraged but otherwise technically excellent companies.

Without the required behind-the-scenes investment, engineers and geoscientists will find a way to get the data they need to make decisions. They will, and often do, work hard to pull data from many siloed systems. But having each engineer massage data individually is throwing money at the problem. With the correct platform implemented in your company, this information would flow like clockwork to everyone who needs it, with little to no manual work.

WHAT COULD HAVE BEEN DONE?

We all know it is never the wrong time to make a profit. Consequently, it is never the wrong time to invest in the right foundation. During a downturn, lower demand creates an abundance of the only resource unavailable during an upturn – time. This time, spent wisely, could bring huge dividends during the next upswing in prices. Conversely, during a period of high prices, it is the other resources we cannot afford to waste. During a boom, we cannot ignore building sustainable, long-term data and process solutions the RIGHT way.

It is never the wrong time to make a profit. Consequently, it is never the wrong time to invest in the right foundation.

Of course, there is no single “right way” that will work for everyone. The right way for your organization is entirely its own; the only rule is that it must align with your company’s operating model and goals. By contrast, the only truly wrong way is to do nothing, or to invest nothing at all.

If your organization has survived more than ten years, then it has seen more than one downturn, along with prosperous times. If you’ve been bitten before, it’s time to be twice shy. Don’t let the false security of high margins handicap you from attaining sustainable and long-term information management solutions.

Here are some key pointers that you probably already know:

  • Track and automate repeatable tasks – Many of your organization’s manual, repeatable tasks have become easier to track and automate with the help of BPMS solutions. Gain transparency into your processes, automate them, and make them leaner wherever possible.

  • Avoid duplication of effort – Siloed systems and departmental communication issues result in significant duplicated effort and rework of the same data. Implementing a strong data QA process upstream can resolve this; the farther upstream, the better. For example, geoscientists are forced to rework their maps when they discover inaccuracies in the elevation or directional survey data. These are low-hanging fruit, easy to remove by implementing controls at the source and at each stop along the way.

  • Take an enterprise view – Most E&P companies fall under the enterprise category. Even the smaller players often employ more people than the average small-to-medium business (especially during a boom) and deal with a large number of vendors, suppliers, and clients. Your organization should deploy enterprise solutions that match your company’s enterprise operating model. Most E&P companies fall in the lower-right quadrant of the MIT matrix below.

[Image: MIT operating model matrix]
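The “controls at the source” idea from the pointers above can be sketched in a few lines. Here is a hypothetical quality gate for directional-survey rows, applied before the data reaches mapping and engineering tools; the field names and tolerances are my assumptions, not a standard:

```python
def validate_survey(rows):
    """Gate directional-survey rows of (measured_depth, inclination, azimuth).

    Returns (clean, rejected). Tolerances are illustrative assumptions.
    """
    clean, rejected = [], []
    last_md = float("-inf")
    for md, inc, azi in rows:
        ok = (
            md > last_md               # measured depth must strictly increase
            and 0.0 <= inc <= 180.0    # inclination in degrees
            and 0.0 <= azi < 360.0     # azimuth in degrees
        )
        (clean if ok else rejected).append((md, inc, azi))
        if ok:
            last_md = md
    return clean, rejected

clean, bad = validate_survey([(0, 0.0, 0.0), (500, 2.1, 45.0), (450, 91.0, 400.0)])
# the out-of-order depth with an impossible azimuth is caught at the gate,
# instead of surfacing later as a mis-posted well path on a geoscientist's map
```

The same pattern repeats at each stop along the way: each consumer trusts the gate upstream rather than re-verifying the data itself.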

Data and processes are your two friends in fat or skinny margin times – some tools and ideas to weather low oil prices

Well, 2014 is ending with oil prices down and an upward trend in M&A activity. For those nearing retirement age, this is not all bad news. For those of us still building our careers and companies, well, we have uncertain times ahead. This got me asking: is it a double whammy to have the most knowledgeable staff retiring when oil prices are low? I think it is.

At the very least, companies will no longer have the “fat margins” to forgive errors or to sweep costly mistakes under the rug! While costs must be watched closely, with the right experience some costs can be avoided altogether. That experience is about to retire.

E&P companies that have already invested (or are investing) in the right data and processes, ones that weave captured knowledge into their analysis and opportunity prioritization, will be better equipped to weather low prices. On the other hand, companies that have been making money “despite themselves” will be living on their savings, hoping to weather the storm. If the storm stays too long or is too strong, they will not survive.

Controlling cost the right way

Blanket cost cutting across all projects is not good business. For example, some wells cannot withstand shut-ins or deferred repairs; you would risk losing the wells altogether. Selectively prioritizing capital and operational spending toward higher margins and controllable risks, however, is good business. Supporting this good business practice is a robust foundation of systems, processes, and data practices that empowers a company to watch the important metrics and act fast!

We would also argue that without relevant experience, some opportunities may not be recognized or fully realized.

Here are some good tools to weather these low prices:

Note that this is a quick list of things you can do NOW for just a few tens or hundreds of thousands of dollars (rather than the million-dollar projects that may not be agile in these times).

  • If you do not have one already, consider implementing a system that gives you a 360-degree view of your operations and capital projects. Systems like these need the capability to bring in data from various data systems, including spreadsheets. We love the OVS solutions (http://ovsgroup.com/). It is lean, comes with good processes right out of the box, and can be implemented to get you up and running within 90 days.
  • When integrating systems, you may need some data cleaning. Don’t let that deter you; in less than a few weeks you can get the data cleaned. Companies like ours, Certisinc.com, will take thousands of records and validate, de-duplicate, correct errors, and complete what is missing, handing you back a pristine set. So consider outsourcing data cleaning efforts. By outsourcing, you can have 20, maybe 40, data admins go through thousands of records in a matter of days.
  • Weave the about-to-retire knowledge into your processes before it is too late. Basically, understand retirees’ workflows and decision-making processes, take what is good, and implement it in systems, processes, and automated workflows. It takes a bit of time to discover these and put them in place, but now is the time to do it. Examples include ESP surveillance, well failure diagnosis, and identifying sweet frac’ing spots, among others. There are thousands upon thousands of workflows that can be implemented to forge almost error-proof procedures for “new-on-the-job” staff.
  • Many of your resources are retiring; consider hiring retirees. But if they would rather be on the beach than sitting around the office after 35+ years of work, then leverage systems like OGmentorsTM (http://youtu.be/9nlI6tU9asc).
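To illustrate the kind of cleaning work described in the list above, here is a hedged sketch of one common step: normalizing well identifiers and de-duplicating records, keeping the most complete copy. The record shape and helper names are illustrative assumptions, not any particular vendor's format:

```python
import re

def normalize_api(api):
    """Strip punctuation from an API well number, e.g. '42-501-20130' -> '4250120130'."""
    return re.sub(r"\D", "", api or "")

def dedupe_wells(records):
    """records: dicts with 'api', 'name', 'operator' (any value may be missing).

    Keep one record per normalized API number, preferring the most complete.
    """
    best = {}
    for rec in records:
        key = normalize_api(rec.get("api"))
        if not key:
            continue  # in practice, route key-less records to manual review
        completeness = sum(1 for v in rec.values() if v)
        if key not in best or completeness > best[key][0]:
            best[key] = (completeness, rec)
    return [rec for _, rec in best.values()]

wells = [
    {"api": "42-501-20130", "name": "Smith 1H", "operator": None},
    {"api": "4250120130", "name": "Smith 1H", "operator": "Acme Oil"},
]
print(dedupe_wells(wells))  # one record survives: the more complete one
```

Validation and error correction follow the same pattern: a deterministic rule applied uniformly across thousands of records, which is exactly why the work parallelizes well across an outsourced team.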

In short, timely and efficient access to the right, complete data, with the right knowledge woven into systems and processes, is just as important, if not more so, during skinny-margin times.

Good luck. I wish you all Happy Holidays and a Happy 2015.

A master’s degree in Petroleum Data Management?

I had dinner with one of the VPs of a major E&P company last week. One of the hot topics at the table was universities agreeing to offer an MSc in Petroleum Data Management. Great idea! I thought. But it brought many questions to my mind.


Considering where our industry is with information management (way behind many other industries), I asked: who will define the syllabus for this PDM MSc? The majors? The service companies? Small independents? Boutique specialized service providers? IMS professors? All of the above?


Should we allow ourselves to be swayed by the majors and giant service companies? With their funding they certainly have the capability to influence the direction, but is this the correct (or only) direction? I can think of a few areas where a major’s implementation of DM would be overkill for small independents; they would get bogged down with processes that make it difficult to be agile, the very agility that made the independents successful with unconventionals.


What should the prerequisites be? A science degree? Any science degree? Is a degree required at all? I know at least a couple of exceptional managers running data management projects and setting up DM from scratch for oil and gas companies; they manage billions of dollars’ worth of data. They do not have a degree. What happens to them?


It takes technology to manage data, and an MSc in Petroleum Data Management will have to reflect that. But unlike petroleum engineering and geoscience technologies, the technology to manage data progresses fast; what is valid today may not still be valid next year! Are we going to teach the technology, or teach about oil and gas data itself? This is an easy one, at least in my mind: we need both, but more about the data itself and how it is used to help operators and their partners be safer, find more, and grow. We should encourage innovation to support what companies need.


PPDM (http://www.ppdm.org/) is still trying to define some standards; POSC (the Petrotechnical Open Software Corporation) came and went; Energistics (http://www.energistics.org/) is here and is making a dent; OpenSpirit (http://www.tibco.com/industries/oil-gas/openspirit) made a dent but is no longer non-profit. Will there be standards endorsed by the universities?

The variation from company to company in how data management is implemented today is large. Studying and comparing the different approaches would make a good thesis, I think…

I am quite excited about the potential of this program and will be watching developments with interest.


E&P Companies Look to New Ways to Deliver Data Management Services and Improve Efficiency and Transparency

Effective data management, specifically in the exploration and production (E&P) business, has a significant positive impact on the operational efficiency and profitability of oil and gas companies. Upstream companies are now realizing that a separate “Data Management” or “Data Services” department is needed in addition to the conventional IT department. These departments’ key responsibility is to professionally and effectively manage E&P technical data assets worth millions, and in some cases billions, of dollars.

Traditional Data Management Processes Cannot Keep up with Today’s Industry Information Flow 

Currently, day-to-day data management tasks in the oil and gas industry are directed and partially tracked using Excel spreadsheets, emails, and phone calls. One of the companies I visited last month was using Excel to validate receipt of seismic data against contracts and POs, covering all surveys and all their associated data. Another used Excel to maintain a list of all the wireline log data ordered by petrophysicists in a month, to compare against end-of-month invoices.

Excel might be adequate if an E&P company is small and has little ambition to grow. However, the larger a company’s capital (and ambitions), the more information and processes are involved in managing the life cycle of data and documents. Consider that more than 20,000 drilling permits are issued a year in Texas alone. When you try to manage that much information with a spreadsheet, some tasks are bound to fall through the cracks.
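The reconciliations those spreadsheets perform are simple enough to automate. A minimal sketch, assuming hypothetical (item, amount) records rather than any real contract or invoice format:

```python
def reconcile(orders, invoice_lines):
    """Compare ordered items against invoice lines; each item is (item_id, amount).

    Returns (not_invoiced, not_ordered, mispriced) item-id lists.
    """
    ordered = dict(orders)
    invoiced = dict(invoice_lines)
    not_invoiced = sorted(set(ordered) - set(invoiced))   # ordered, never billed
    not_ordered = sorted(set(invoiced) - set(ordered))    # billed, never ordered
    mispriced = sorted(
        k for k in set(ordered) & set(invoiced) if ordered[k] != invoiced[k]
    )
    return not_invoiced, not_ordered, mispriced

# hypothetical month of wireline-log orders vs. the end-of-month invoice
orders = [("LOG-001", 1200.0), ("LOG-002", 800.0)]
invoices = [("LOG-002", 950.0), ("LOG-003", 400.0)]
print(reconcile(orders, invoices))
# (['LOG-001'], ['LOG-003'], ['LOG-002'])
```

The point is not the twenty lines of code; it is that once the check is codified, it runs on 20,000 records as reliably as on 20, which a hand-maintained spreadsheet cannot promise.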

Senior managers are more interested than ever in the source, integrity, and accuracy of the information that affects and influences their HSE and financial statements.
Providing senior managers with such information requires transparent data management processes that are clearly defined, repeatable, and verifiable, and that allow managers to identify, evaluate, and address or alleviate any obvious or potential risks before they become problems. Preferably all delivered efficiently and cost-effectively.

Choosing the Right E&P Data Management Workflow Tools

It’s tempting to stay with the old way of doing things – the “way we have always done it” – because you have already established a certain (and personal) rhythm of working, inputting, and managing data. Even the promise of improved profitability and efficiency is often not enough to convince people to try something new. But the advantages of new workflow tools and programs cannot and should not be underestimated.

For example, a workflow tool can help automate the creation of data management tasks, log and document technical metadata, track data-related correspondence, and alert you to brewing issues. When all is said and done, the data management department would be set for growth, able to handle a larger workload without skipping a beat. Growing by simply adding more people is not sustainable.

So, where to start?

There are multiple data management workflow tools available from a variety of different vendors, so how do you know which one will work best for your company? You will need to ensure that your workflow tool or software is able to do the following:

  • Keep detailed technical documentation of incoming data from vendors, thereby minimizing duplication of work associated with “cataloging” or “tagging”;
  • Integrate with other systems in your organization, such as seismic, contracts, and accounting, including proprietary software programs;
  • Allow sharing of tasks between data managers;
  • Enable collaboration and discussion to minimize scattered email correspondence; and,
  • Automatically alert others of issues, such as requests that still need to be addressed.
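As a rough sketch of the minimum such a workflow tool should model, per the checklist above: shareable tasks, a discussion trail in place of scattered email, and automatic flagging of stale requests. All names here are illustrative, not any vendor's data model:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class DataTask:
    title: str
    assignee: str
    due: date
    status: str = "open"                          # open / in-progress / done
    comments: list = field(default_factory=list)  # discussion trail, not email

    def overdue(self, today):
        return self.status != "done" and today > self.due

def stale_alerts(tasks, today):
    """Return titles of tasks the tool should automatically raise alerts for."""
    return [t.title for t in tasks if t.overdue(today)]

tasks = [
    DataTask("Tag incoming seismic survey", "dm1", date(2015, 3, 1)),
    DataTask("Verify log curve headers", "dm2", date(2015, 3, 20)),
]
print(stale_alerts(tasks, today=date(2015, 3, 10)))
# ['Tag incoming seismic survey']
```

Reassigning a task is then a field update visible to everyone, and the alerting loop runs on a schedule rather than depending on someone remembering to follow up.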


Change Is Coming Our Way: Prepare Data Systems to Store Laterals’ Details

During the past decade, oil and gas companies have aimed their spotlight at efficiency. But should this efficiency come at the expense of data collection? Many companies are now realizing that it shouldn’t.

Consider the increasingly important re-fracturing effort. It turns out, in at least one area, that only 45% of re-fracs were considered successful when candidates were selected using production data alone. However, when additional information (such as detailed completion, production, well integrity, and reservoir characterization data) was also used, a success rate of 80% was observed. See the excerpt below from the Society of Petroleum Engineers paper SPE 134330 (M.C. Vincent, 2010).

[Image: excerpt from SPE 134330, M.C. Vincent, 2010]

Prepare data systems to store details, otherwise left in files.

Measurement while drilling (MWD), mud log and cuttings analysis, and granular frac data are some of the data that can be collected without changing drilling or completion workflows or sacrificing the efficiency already achieved. This information, when acquired in the field, will make its way to petrophysicists and engineers, most likely ending up in reports, folders, and project databases. Many companies do not think about storing this data beyond that.

We argue, however, that to take advantage of this opportunity, archival databases should also be expanded to store this information in a structured manner, and it should funnel its way into various analytical tools. This practice allows technical experts to dive straight into analyzing well data instead of diverting a large portion of their time to finding and piecing data together. Selecting the best re-frac candidates in a field will require the above well data and then some. Many companies are starting to study those opportunities.

Good data practices to consider

To maximize economic success from re-stimulation (or from first stimulation, for that matter), consider these steps that are often overlooked:

  1. Prepare archival databases to specifically capture and retain data from the lateral portions of wells. This data may include cuttings analysis, mud log analysis, rock mechanics analysis, rock properties, granular frac data, and well integrity data.
  2. Don’t stop at archiving the data; expose it to engineers and make it readily accessible to statistical and artificial-intelligence tools, such as TIBCO Spotfire.
  3. Integrate, integrate, integrate. Engineers depend on ALL data sources (internal, partner, third-party, the latest research and media) to find new correlations and possibilities. Analytic platforms that can bring together a variety of data sources and types should be made available; consider big data platforms.
  4. Clean, complete, and accurate data will integrate well. If you are not there yet, engage a company that will clean the data for you.
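Step 1 above can be made concrete with a small archival table for per-stage lateral data. This is an illustrative sketch only; the columns are my assumptions, and a production model would follow an accepted standard such as PPDM:

```python
import sqlite3

# In-memory database for illustration; an archival store would be persistent.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE lateral_stage (
        well_api      TEXT NOT NULL,
        stage_number  INTEGER NOT NULL,
        top_md_ft     REAL,   -- measured depth at top of stage
        bottom_md_ft  REAL,
        proppant_lbs  REAL,   -- granular frac data
        fluid_bbl     REAL,
        cuttings_note TEXT,   -- mud log / cuttings analysis summary
        PRIMARY KEY (well_api, stage_number)
    )
""")
conn.execute(
    "INSERT INTO lateral_stage VALUES (?, ?, ?, ?, ?, ?, ?)",
    ("4250120130", 1, 9500.0, 9750.0, 250000.0, 4800.0, "calcite-filled fractures"),
)
row = conn.execute(
    "SELECT proppant_lbs FROM lateral_stage WHERE well_api = ? AND stage_number = ?",
    ("4250120130", 1),
).fetchone()
print(row[0])  # 250000.0
```

Once the data lives in a structured table rather than in report PDFs, feeding it to statistical and AI tools (step 2) becomes a query instead of a data-hunting exercise.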

Quality, granular well data is the cornerstone of increasing re-frac success in horizontal wells, and of other processes as well. Collecting data and managing it well, even if you do not need it immediately, is an exercise in discipline, but it is also a strategic decision that must be made and committed to from the top down, whether you are drilling to “flip” or developing for the long term. Data is your asset.