Tag Archives: oil and gas

More Shale Data Should Equal More Production, Unless Your Data is an Unusable Mess

As the U.S. overtakes Russia in oil and gas production on the strength of its unconventional fields, new operators flood the industry and competition inevitably increases. The need for predictable, optimized well performance is greater than ever. The fastest route to optimization is quality data that can be used to better understand unconventional fields and drive the flow of operations efficiently.

However, as more data pours in, the cracks in many E&P companies’ data management systems are exposed. Geoscientists and engineers are left to build their own integrations and correlations between disparate systems, and left digging through folders trying to find the documents that hold the knowledge they need.

Some of the trouble lies in new methods of analyzing vast arrays of data that were not considered as prominent in conventional fields. For example, geoscientists break shale down by geology, geochemistry, and geomechanics, and engineers now look into fracs through a microseismic lens. While this data was used in conventional fields before, the emphasis on it and the ways of analyzing it are different now; new parameters such as TOC and brittleness have emerged as key measures. When it comes to shale fields, the industry is still learning from acquired data.

Well-organized, quality information that is easily found and flows efficiently through internal departments and supplying vendors will not only allow faster reaction to operational needs and opportunities; it will translate into a better strategy for increasing EUR per well through a better understanding of the reservoirs.

How you take care of your data directly impacts your engineers’ and geoscientists’ efficiency and the speed at which they can find good production opportunities. Fast and efficient is the name of the game in the unconventional, competitive world.

It is not enough to provide a place to store new unconventional information and flow it into analytical systems. While those are the first steps, they must fit into a holistic approach that takes integrated unconventional operational processes to the next level of efficiency.

Cut Search Time for Critical Documents from Days to Seconds. It is Time to Stop Digging in Folder Structures

It wasn’t long ago that geoscientists and petroleum engineers at one renowned oil company might spend days searching for documents. “Searching” meant digging through folders (as many as 1,500 of them!) and discerning whether a “found” file was an official report or only an earlier draft. To give you an idea, some critical HSE documents were buried as deep as the 13th sub-folder, and then the correct version still had to be selected!

Obviously, in this situation, emergency and critical decision cycle times were lengthened by the difficulty of finding the “buried” technical documents. The average time to locate a document and validate its accuracy was calculated at three days.

When Certis arrived, the company’s folder system looked like an episode of “Hoarders”. The hoarder believes there is an organized system to his “madness”, but nobody else in the home can quite figure it out. Over the years, more than 2,000,000 documents had been amassed at this location, and that total was growing fast. As engineers and geoscientists floated in and out, the system fell victim to hundreds of interpretations. Unlike the hoarder’s goods, these documents contained vital information representing years of studies and billions of dollars of data acquisitions. Years of knowledge, buried, literally.

In today’s competitive and fast-paced oil and gas operations, data is accumulating faster than ever, and decisions must be made faster than ever by petro-professionals who are already overextended. Compound that with the fact that a large portion of the knowledge resides in a workforce that may soon retire, and it is clear that oil and gas companies that want to stay exceptional and competitive cannot afford to waste petro-professionals’ time hunting for critical records.

So, how do you get to a point where your organization can locate the right document instantly? We believe it is all about the processes, technology, and people put in place (a cliché, but so true).

When Certis completed this project, the technical community could locate documents within a few seconds using a “Google-like” search. More importantly, they were (and are now) able to locate the latest version and trust it. The solution had to address three elements: people, processes, and technology.
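The post does not say which search technology was used, so as a minimal sketch of the idea behind any “Google-like” document search, here is a full-text inverted index in Python. The document IDs and titles are hypothetical; a production DRM system would use a dedicated search engine with ranking, but the core data structure is the same.

```python
import re
from collections import defaultdict

def tokenize(text):
    """Lowercase the text and split it into word tokens."""
    return re.findall(r"[a-z0-9]+", text.lower())

class InvertedIndex:
    """Maps each word to the set of document IDs that contain it."""

    def __init__(self):
        self.postings = defaultdict(set)

    def add(self, doc_id, text):
        """Index a document's text under its ID."""
        for word in tokenize(text):
            self.postings[word].add(doc_id)

    def search(self, query):
        """Return the documents containing every query word (AND semantics)."""
        sets = [self.postings.get(w, set()) for w in tokenize(query)]
        return set.intersection(*sets) if sets else set()

# Hypothetical document titles, for illustration only.
index = InvertedIndex()
index.add("RPT-001", "Final HSE audit report, Eagle Ford wells")
index.add("RPT-002", "Draft frac design study, microseismic data")
index.add("RPT-003", "HSE incident log, drilling operations")

print(index.search("hse report"))  # only RPT-001 contains both words
```

Looking up a query this way is a set intersection over precomputed postings, which is why the search returns in seconds rather than the days a manual folder dig took.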

The final solution meant collapsing folders from 2,000 down to 150, using a DRM system without burdening the technical community, and implementing complete processes with a service element that ensured sustainability.

Centralized, standardized and institutionalized systems and processes were configured to take full advantage of the taxonomy and DRM systems. Once the ease of use and the value were demonstrated to the people, buy-in was easy to get.

Technology advances faster than our ability to keep up. This is especially true for professionals whose focus is (and should be!) on their projects, not on data management. We had to break the fear of change by proving there is a better way to work, one that increases efficiency and makes employees’ lives easier.

Legacy Documents: What Do You Do with Them?

Because solving operational issues in the field requires access to complete historical information, exhuming technical legacy documents, physical or electronic, from their buried locations was the next task.

On this project the work involved prioritizing, locating, removing duplicates, clustering, and tagging files with standard metadata. With a huge number of files accumulated in network drives and library rooms, a company must keep an eye on the cost/benefit ratio. How to prioritize files and how to tag them become two key success factors in designing a cost-effective migration project.

This topic can go on and on since there were so many details that made this project successful. But that may be for another post.

Read more about Certis and our oil and gas DRM services at http://ow.ly/oRQ5f

How an E&P Subsidiary Took Its Information Communications from Risky to Efficient

It starts with chatter around the workplace. A company is growing. Procedures that were once “nice to have” are now serious money bleeds. That is exactly what Certis found when they revamped a major E&P subsidiary’s communication procedures.

When an oil and gas company plants itself in a nation to explore for business opportunities, its communications with the nation’s government and with its JV partners can be, understandably, informal during the early stages of the project. As the company moves from the Exploration and Appraisal phases toward full-fledged Development and Operations, the lax communications that once worked become a risky endeavor.

While these risks can be underplayed next to health and safety hazards, we discovered they warranted immediate action if the company was to survive long term. Consider these two real situations:

1) Sensitive information leaks. At the early stages of exploration efforts, any discovery would have a large impact on a company’s stock price (if public) and serious implications for its competitors’ behavior.

2) Growing companies watch millions of dollars become billions of dollars almost overnight. Those large dollar amounts require complete technical data and timely communications to satisfy the government and the JV partner. The flow of information becomes crucial.

Knowing something is broken isn’t the same as understanding how it is broken and how to fix it.

Most employees can feel the weak spots in their company. When you start to sense problems, the cost of fixing them seems outlandish. But over time the scales tip. Often, by then, the problem has grown too overwhelming for employees to handle alone.

The scales had long ago tipped for this client. Our team’s role was to quickly identify the causes of the communication problems and orchestrate a long-term plan and processes to mitigate risks.

Over a period of a few weeks, we surveyed the office, field, and rigs on two different continents. We went through a full cycle of process improvement. In the end, we were able to divide their information communications needs into four process categories: 1) Documents and Data Management, 2) Decisions Documentation, 3) Security and Access Management, and 4) Request Management.

Our plan started with “quick wins” that changed the way the subsidiary did business within the first month. Imagine being able to institute relevant changes in your company in one month. Yes, it was that easy to solve. The rest of the implementation plan spanned four months, during which communication policies, standards, and procedures were defined and complied with across the organization.

We all know that the cost of fixing is cheap compared to the cost of cleaning up a huge mess later.

The costs of missed opportunities, reduced stock prices, and million-dollar lawsuits make this kind of project important; combined with the relatively low cost of fixing the problem, they make it a high priority.

I believe a company needs to do more than simply comply with government or JV partner contracts. To build strong relationships, you must be able to readily prove your compliance. That’s just good business.

Our client’s new transparent business practices allow the government to view them as a serious and trusted part of the country’s future. It is impossible to put a price on a valued relationship. But successful business people know that gaining trust means big business over time.

What about your company? Is it starting to feel the risks of outdated communication systems?

$250 Million Oil Take-Over Deal Implodes Due To Disastrous Data Management

As professionals in the oil and gas sector, we all know that when it comes to a merger and acquisition (M&A), having access to quality data is essential. In its absence, deals don’t get made, investors lose hundreds of thousands of dollars, and livelihoods are put at risk.

So we were taken aback recently to hear of one deal, involving a public company, that fell through because the organization couldn’t even list its complete assets with confidence, such was the mess of its data.

We were talking with a CEO recently who “vented” about a failed acquisition. A major player who has worked in the sector since the mid-1970s, he told us why the $150 million to $250 million investment his company was prepared to make didn’t just fall flat, but imploded: “Despite asking this company repeatedly to give us access to their ‘complete’ data sets, they failed to deliver time and again. We became increasingly frustrated and discouraged, to the extent that we wouldn’t even make a proposal in the region of $80 million for the company. What was so galling to us was that it was obvious this company badly needed an investor and had approached us to bid.”

We all know what data is needed for M&A investments to happen, some of which we can get from public records and from commercial organizations such as IHS and Drilling Info (in the USA). But those sources alone are not nearly sufficient. So what were they thinking? Did they think the data would take care of itself? Or was someone just not doing his or her job well?

The CEO continued: “… in the past, when companies were under pressure, a lot of data typically got swept under the rug, as it were. Today, though, investors demand tighter regulation of data, and I suspect that, because of this, in ten years’ time some companies just aren’t going to make it. If our company had been allowed to invest and take over, we could have solved many of the organization’s problems, saved some jobs, and even added value. Sadly, due to poor management of critical data, that scenario was never allowed to take place. The deal never even got past the first hurdle. No one is going to invest millions when they don’t have a clue about (or confidence in the data of) what they’re buying.”

Considering this was a company with responsibility for public money, the management team should never have been allowed free rein without critical data management regulations or, at the very least, “guidelines”.

What is your opinion?