
Crossing the Border: From a Mere Change to a Cultural Expectation for Quality Data

Culture sets certain expectations of behavior, and once they are accepted, there is no deviation. Even if you are removed from your cultural origin, these behaviors remain ingrained and follow you long after.
I recently experienced this firsthand when a dear friend of mine was diagnosed with cancer. Of course, I was very distraught. When, a few weeks later, he was admitted to hospital for surgery, I visited him and his wonderful wife. This was natural to me: visiting a sick or injured friend at home or in the hospital is not only a kind gesture but an expected social obligation, ingrained in me since childhood. What seemed to my friends a thoughtful gesture was something I could not imagine not doing, or imagine friends from my culture not doing for me.
 
It made me wonder: what makes a behavior “culturally” accepted and ingrained? When did visiting a sick friend become more than a thoughtful gesture and cross the barrier into social obligation? How did this transition occur?
 
These musings extended to Oil and Gas corporate culture. What behaviors were so ingrained at work that they had become second nature? Did they serve a purpose, such as improving data quality? If not, what would it take to weave in these behaviors and make them the expected social norm, a clear moral obligation or expected practice within an organization? In an ideal world, these cultural obligations would lead to employees and employers alike feeling that it is “on them” to report and correct data quality issues, no matter where in the process they are discovered.
 
I thought it might be a good idea to ask my readers these questions. Are such behaviors ingrained deeply enough in your workplace to be considered cultural? If not, how would you weave them in? If they are part of your corporate culture, can you point to any policies and practices that may have led to this?

Reminded Again: Narrow Focus Leads to Failure Every Time. Why Do Some Data Projects Never Make It?

In 1993, an incident occurred in the Toronto Dominion Bank Tower that caught national attention, enough so that it made the infamous “Darwin Awards”. A lawyer, in an attempt to demonstrate the safety and strength of the building’s windows to visiting law students, threw his shoulder against a pane of glass and fell 24 floors to his death. The glass reportedly did not even break; it popped out of its frame.
 
The lawyer made a classic mistake: he focused on one specific thing to the exclusion of the big picture. Had he looked at his hypothesis from a wider angle, he might have considered the numerous other factors that could contribute to his doomed demonstration: the bond between the glass and the frame, the yielding of the material after repeated impacts, or simply the view of the courtyard below (the high stakes should the test fail). Any one of these might have been enough to make him reconsider his “leap of logic”. Instead, he focused on a single item and ignored all the other factors.
 
Such a narrow focus is equally risky in an information management project, or any project really. Although we are getting better, we often focus on one thing, the technology implementation, and ignore other aspects.
From my experience, many factors contribute to the success or failure of information management in Oil & Gas projects: people, technology, processes, legacy data, integration, a company’s culture, operational model, infrastructure, time constraints, and external influences such as vendors and partners, just to name a few. Each has a degree of influence on the project, but rarely will any of them cause the demise of the project – unless they are ignored! The key to success in any project is the consideration of all these aspects, and an assessment of the risks they pose, before spending millions.
As an example, let’s look at survey data. How would you manage that data?
Often, companies focus on two elements:
  • Finding the technology to host the data
  • Migration of the data to the new solution
Success is declared at the end of these two steps, but two years down the road the business has not embraced the solution, or worse yet, it continues to see incomplete surveys, the very problem the new technology was supposed to solve. Failure, in this case, is less abrupt than an appointment with the Toronto Dominion Courtyard, but it is failure nonetheless.
 
More often than not, projects like the one above fail to take into consideration the other aspects that will keep data quality intact.
Even more often, these projects fail to consider external factors such as data acquisition vendors, who have their own processes and formats. If your project ignores our increasingly integrated world and cannot cooperate with the processes, technology, and data formats of key external vendors and business partners, it will yield very limited results and will not be sustainable.

To achieve sustainable success in data management projects or any projects for that matter, it is necessary to consider the context surrounding the project, not just the specifics. Without this context, like the unfortunate lawyer, your project too can look forward to a rather significant fall.

A master’s degree in Petroleum Data Management?

I had dinner with one of the VPs of a major E&P company last week. One of the hot topics on the table was universities agreeing to offer an MSc in Petroleum Data Management. A great idea, I thought! But it brought so many questions to my mind.

 

Considering where our industry is with information management (way behind many other industries), I asked: who will define the syllabus for this PDM MSc? The majors? The service companies? Small independents? Boutique specialized service providers? IMS professors? All of the above?

 

Should we allow ourselves to be swayed by the majors and the giant service companies? With their funding they certainly have the capability to influence the direction, but is this the correct (or only) direction? I can think of a few areas where the majors’ implementation of DM would be overkill for small independents; they would get bogged down with processes that make it difficult to be agile, the very agility that made the independents successful with the unconventionals.

 

What should the prerequisite be? A science degree? Any science degree? Is a degree required at all? I know at least a couple of exceptional managers who are running data management projects and setting up DM from scratch for oil and gas companies; they manage billions of dollars’ worth of data. They do not have a degree. What happens to them?

 

It takes technology to manage the data, and an MSc in Petroleum Data Management is no different. But unlike petroleum engineering and geoscience technologies, the technology to manage data progresses fast; what is valid today may not still be valid next year! Are we going to teach technology, or are we teaching about Oil and Gas data? This is an easy one, at least in my mind: we need both, but more about the data itself and how it is used to help operators and their partners be safer, find more, and grow. We should encourage innovation to support what companies need.

 

PPDM – http://www.ppdm.org/ – is still trying to define some standards. POSC (the Petrotechnical Open Standards Consortium, if I recall the name correctly) came and went. Energistics – http://www.energistics.org/ – is here and is making a dent. OpenSpirit – http://www.tibco.com/industries/oil-gas/openspirit – made a dent but is no longer a non-profit. Will there be standards that are endorsed by the universities?

The variation from company to company in how data management is implemented today is large. Studying and comparing the different approaches would make a good thesis, I think…

I am quite excited about the potential of this program and will be watching developments with interest.

 

E&P Companies Looking to New Ways to Deliver Data Management Services to Improve Efficiency and Transparency

Effective data management, specifically in the exploration and production (E&P) business, has a significant positive impact on the operational efficiency and profitability of oil and gas companies. Upstream companies are now realizing that a separate “Data Management” or “Data Services” department is needed in addition to the conventional IT department. These departments’ key responsibility is to “professionally” and “effectively” manage E&P technical data assets worth millions, and in some cases billions, of dollars.

Traditional Data Management Processes Cannot Keep up with Today’s Industry Information Flow 

Currently, day-to-day “data management” tasks in the oil and gas industry are directed and partially tracked using Excel spreadsheets, emails, and phone calls. One of the companies I visited last month was using Excel to validate the receipt of seismic data against contracts and POs, i.e., all surveys and all their associated data. Another used Excel to maintain a list of all wireline log data ordered by petrophysicists in a month, to compare against end-of-month invoices.
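As a concrete illustration of that kind of validation, here is a minimal sketch of how the same reconciliation might look once it is scripted rather than kept in a spreadsheet. The file names and column names (po_items.csv, received_items.csv, survey, item) are hypothetical placeholders, not a format any particular company or vendor actually uses.

import csv
from collections import defaultdict

def load_items(path, key_fields):
    """Load rows from a CSV file and group them by the given key fields."""
    grouped = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            key = tuple(row[k].strip().lower() for k in key_fields)
            grouped[key].append(row)
    return grouped

def reconcile(po_path, received_path):
    """Compare PO line items against received deliverables.

    Returns (missing, unexpected): items ordered but never received,
    and items received that are not on any PO.
    """
    key_fields = ("survey", "item")   # hypothetical key: survey name plus deliverable type
    ordered = load_items(po_path, key_fields)
    received = load_items(received_path, key_fields)
    missing = [k for k in ordered if k not in received]
    unexpected = [k for k in received if k not in ordered]
    return missing, unexpected

if __name__ == "__main__":
    missing, unexpected = reconcile("po_items.csv", "received_items.csv")
    for survey, item in missing:
        print(f"Not yet received: {item} for survey {survey}")
    for survey, item in unexpected:
        print(f"Received but not on a PO: {item} for survey {survey}")

Nothing here is sophisticated; the gain over a spreadsheet is simply that the check is repeatable and leaves no room for a missed row.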

Excel might be adequate if an E&P company is small and has little ambition to grow. However, the larger a company’s capital (and ambitions), the more information and processes are involved in managing the data and document life cycle. Consider the more than 20,000 drilling permits issued each year in Texas alone. Try to manage that much information with a spreadsheet and some tasks are bound to fall through the cracks.

Senior managers are more interested than ever in the source, integrity, and accuracy of the information that affects and influences their HSE and financial statements.
Providing senior managers with such information requires transparent data management processes that are clearly defined, repeatable, and verifiable, and that allow managers to identify, evaluate, and address or alleviate any obvious or potential risks before they become problems. Preferably all of this is delivered efficiently and cost-effectively.

Choosing the Right E&P Data Management Workflow Tools

It’s tempting to stay with the old way of doing things – the “way we have always done it” – because you have already established a certain (and personal) rhythm of working with, inputting, and managing data. Even the promise of improved profitability and efficiency is often not enough to convince people to try something new. But the advantages of new workflow tools and programs cannot and should not be underestimated.

For example, a workflow tool can help automate the creation of data management tasks, log and document technical metadata, track data-related correspondence, and alert you to brewing issues. When all is said and done, the data management department is set up for growth and for handling a larger workload without skipping a beat. Growing by simply adding more people is not sustainable.

So, where to start?

There are multiple data management workflow tools available from a variety of vendors, so how do you know which one will work best for your company? You will need to ensure that your workflow tool or software is able to do the following (a rough sketch of what such a task record might look like follows the list):

  • Keep detailed technical documentation of incoming data from vendors, thereby minimizing the duplication of work associated with “cataloging” or “tagging”;
  • Integrate with other systems in your organization, such as seismic, contracts, and accounting systems, including proprietary software;
  • Allow sharing of tasks between data managers;
  • Enable collaboration and discussion to minimize scattered email correspondence; and
  • Automatically alert others to issues, such as requests that still need to be addressed.
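To give a flavour of what the first and last items imply, here is a minimal sketch of a task record that carries its own technical metadata, links to related correspondence, and can flag overdue requests. Everything in it (DataManagementTask, overdue_alerts, the example tasks) is made up for illustration; real workflow tools model this in far more depth.

from dataclasses import dataclass, field
from datetime import date

@dataclass
class DataManagementTask:
    """A single data-management task as a workflow tool might track it."""
    title: str
    assigned_to: str
    due: date
    metadata: dict = field(default_factory=dict)        # e.g. survey name, log run, format
    correspondence: list = field(default_factory=list)  # links to related emails and notes
    status: str = "open"

    def is_overdue(self, today=None):
        today = today or date.today()
        return self.status == "open" and today > self.due

def overdue_alerts(tasks):
    """Yield a short alert message for every open task past its due date."""
    for t in tasks:
        if t.is_overdue():
            yield f"ALERT: '{t.title}' assigned to {t.assigned_to} was due {t.due}"

if __name__ == "__main__":
    # Made-up example tasks
    tasks = [
        DataManagementTask("Load new 3D survey into the seismic store", "j.smith",
                           date(2015, 3, 1), metadata={"survey": "Example-3D"}),
        DataManagementTask("Verify wireline log invoice against orders", "a.lee",
                           date(2015, 2, 15)),
    ]
    for message in overdue_alerts(tasks):
        print(message)

The point is not the particular fields but that every task, its metadata, and its discussion live in one queryable place instead of in a spreadsheet and an inbox.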

 

 

What Impact Does Big Data Technology Bring To Oil and Gas?

Dealing with the massive influx of information gathered from exploration projects or from real-time gauges in established fields is pushing traditional data-management architecture to its limits in the oil and gas industry. More sensors, from 4D seismic or from fiber optics in wells, widen the gap between advancements in data capture and the traditional ways of managing and analyzing data. It is the challenge of managing the sheer volume of collected data, and the need to sift through it in a timely fashion, that Big Data technologies promise to help us solve. This was just one of the suggestions on the table at the recent Data Management Workshop I attended in Turkey earlier this month.

For me, one of the main issues with the whole Big Data concept within the oil and gas industry is that, while it sounds promising, it has yet to deliver the tangible returns that companies need to see in order to prove its worth. To overcome this dilemma, Big Data vendors such as Teradata, Oracle, and IBM should consider demonstrating concrete new examples of real-life oil & gas wins. By new I mean challenges that are not possible to solve with traditional data architecture and tools. Vendors should also be able to offer Big Data technology at a price that makes it viable for companies to “try” it and experiment.

The Oil and Gas industry is notoriously slow to adopt new software technology, particularly anything that tries to take the place of traditional methods that have already proven to work, unless its value is apparent. To quote my good friend: “we operate with fat margins; we don’t feel the urgency.” However, E&P companies should put their creative hats on and work alongside Big Data technology vendors. Big Data may just be the breakthrough we need to make a tangible step-change in how we consume and analyse subsurface and surface data with agility.

If either side, vendors or E&P companies, fails to deliver, Big Data becomes a commercial white elephant and is doomed to very slow adoption.

At the workshop we had Oracle, Teradata, and IBM all showing interesting tools. However, they showed examples from other industries and occasionally referred to problems that could be solved with conventional data technology. They left the audience still wondering!

One Big Data example that is relevant and hits home was presented by CGG. CGG used pattern recognition (on Teradata technology) to find all logs that exhibit a specific pattern a petrophysicist may be interested in. This type of analysis requires scanning through millions of log curves, not just the metadata we had been bound to in traditional architectures. It opens up new horizons for serendipity and, who knows, maybe for new discoveries.
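CGG did not walk through their implementation, so the following is only an illustrative sketch of the underlying idea, using numpy and synthetic curves: slide a template pattern along each log curve and keep the curves whose best normalized correlation with the template exceeds a threshold. Every name, number, and curve in it is an assumption made for the example, not CGG’s or Teradata’s method.

import numpy as np

def best_match_score(curve, template):
    """Slide the template along the curve; return the best normalized correlation."""
    n = len(template)
    t = (template - template.mean()) / (template.std() + 1e-9)
    best = -1.0
    for start in range(len(curve) - n + 1):
        window = curve[start:start + n]
        w = (window - window.mean()) / (window.std() + 1e-9)
        best = max(best, float(np.dot(w, t)) / n)   # Pearson correlation, in [-1, 1]
    return best

def find_matching_logs(curves, template, threshold=0.8):
    """Return the ids of all log curves whose best match exceeds the threshold."""
    return [log_id for log_id, curve in curves.items()
            if best_match_score(curve, template) >= threshold]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    template = np.sin(np.linspace(0, np.pi, 50))        # the shape the petrophysicist cares about
    curves = {f"well_{i}": rng.normal(size=500) for i in range(20)}
    curves["well_7"][100:150] += 10 * template          # hide the pattern in one synthetic curve
    print(find_matching_logs(curves, template))         # expected to report well_7

At industry scale this brute-force loop would of course be replaced by a parallel, in-database implementation; the sketch is only meant to show why the curves themselves, not just their metadata, have to be scanned.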