- Finding the technology to host the data
- Migration of the data to the new solution
To achieve sustainable success in data management projects, or any projects for that matter, it is necessary to consider the context surrounding the project, not just the specifics. Without this context, like the unfortunate lawyer, your project too can look forward to a rather significant fall.
I had dinner with one of the VPs of a major E&P company last week. One of the hot topics at the table was universities agreeing to offer an MSc in Petroleum Data Management. A great idea, I thought! But it brought so many questions to my mind.
Considering where our industry is with information management (way behind many other industries), I asked who will define the syllabus for this PDM MSc? The majors? The service companies? Small independents? Boutique specialized service providers? IMS professors? All of the above?
Should we allow ourselves to be swayed by the majors and the giant service companies? With their funding they certainly have the capability to influence the direction, but is this the correct (or only) direction? I can think of a few areas where a major's implementation of DM would be overkill for a small independent: it would get bogged down in processes that make it difficult to be agile, the very agility that made the independents successful with unconventionals.
What should the prerequisite be? A science degree? Any science degree? Is a degree required at all? I know at least a couple of exceptional managers who run data management projects and set up DM from scratch for oil and gas companies; they manage billions of dollars' worth of data. They do not have a degree. What happens to them?
It takes technology to manage data, and an MSc in Petroleum Data Management is no different. But unlike petroleum engineering and geoscience technologies, the technology for managing data progresses fast; what is valid today may no longer be valid next year! Are we going to teach the technology, or are we teaching about oil and gas data? This is an easy one, at least in my mind: we need both, but more about the data itself and how it is used to help operators and their partners be safer, find more, and grow. We should encourage innovation to support what companies need.
PPDM (http://www.ppdm.org/) is still trying to define some standards; POSC (the Petrotechnical Open Software Corporation, if memory serves) came and went; Energistics (http://www.energistics.org/) is here and is making a dent; OpenSpirit (http://www.tibco.com/industries/oil-gas/openspirit) made a dent but is no longer non-profit. Will there be standards that are endorsed by the universities?
The variation from company to company in how data management is implemented today is large. Studying and comparing the different approaches would make a good thesis, I think…
I am quite excited about the potential of this program and will be watching developments with interest.
Effective data management, specifically in the exploration and production (E&P) business, has a significant positive impact on the operational efficiency and profitability of oil and gas companies. Upstream companies are now realizing that a separate “Data Management” or “Data Services” department is needed in addition to the conventional IT department. These departments' key responsibility is to “professionally” and “effectively” manage E&P technical data assets worth millions, and in some cases billions, of dollars.
Traditional Data Management Processes Cannot Keep up with Today’s Industry Information Flow
Currently, day-to-day “data management” tasks in the oil and gas industry are directed and partially tracked using Excel spreadsheets, emails and phone calls. One of the companies I visited last month was using Excel to validate receipt of seismic data against contracts and purchase orders, i.e. all surveys and all their associated data. Another used Excel to maintain a list of all wireline log data ordered by petrophysicists in a month, to compare against end-of-month invoices.
Excel might be adequate if an E&P company is small and has little ambition to grow. However, the larger a company's capital (and ambitions), the more information and processes are involved in managing the life cycle of data and documents. Consider the more than 20,000 drilling permits issued each year in Texas alone. When that much information is managed in a spreadsheet, some tasks are bound to fall through the cracks.
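To make the problem concrete, here is a minimal sketch of the ordered-versus-invoiced check that one of those spreadsheets performs, written as a script instead. The well names, curve codes and prices are invented for illustration and do not come from any company mentioned above:

```python
# Hypothetical reconciliation of wireline log orders against a vendor invoice.
# Keys are (well, curve) pairs; values are the agreed/invoiced prices.

# Logs ordered by petrophysicists this month (from a request register)
ordered = {
    ("WELL-001", "GR"):  1200.00,
    ("WELL-001", "RES"): 1500.00,
    ("WELL-002", "GR"):  1200.00,
}

# Line items on the vendor's end-of-month invoice
invoiced = {
    ("WELL-001", "GR"):  1200.00,
    ("WELL-002", "GR"):  1350.00,   # price mismatch
    ("WELL-003", "GR"):  1200.00,   # never ordered
}

# Set arithmetic does in three lines what VLOOKUP columns do in a sheet
never_ordered = sorted(set(invoiced) - set(ordered))
not_invoiced  = sorted(set(ordered) - set(invoiced))
mismatched    = sorted(k for k in set(ordered) & set(invoiced)
                       if ordered[k] != invoiced[k])
```

The point is not the ten lines of Python; it is that once the check lives in code rather than a personal spreadsheet, it can run on every invoice automatically instead of depending on someone remembering to do it.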
Senior managers are more interested than ever in the source, integrity and accuracy of the information that affects and influences their HSE and financial statements.
Providing senior managers with such information requires transparent data management processes that are clearly defined, repeatable and verifiable, and that allow managers to identify, evaluate and address or alleviate any obvious or potential issues before they become risks. Preferably all delivered efficiently and cost-effectively.
Choosing the Right E&P Data Management Workflow Tools
It’s tempting to stay with the old way of doing things, the “way we have always done it”, because you have already established a certain (and personal) rhythm of working, inputting and managing data. Even the promise of improved profitability and efficiency is often not enough to convince people to try something new. But the advantages of new workflow tools and programs should not be underestimated.
For example, a workflow tool can help automate the creation of data management tasks, log and document technical metadata, track data-related correspondence, and alert you to brewing issues. When all is said and done, the data management department is set up for growth, able to handle a heavier workload without skipping a beat. Growing by adding more people is not sustainable.
So, where to start?
There are multiple data management workflow tools available from a variety of different vendors, so how do you know which one will work best for your company? You will need to ensure that your workflow tool or software is able to do the following:
- Keep detailed technical documentation of incoming data from vendors, thereby minimizing duplication of work associated with “cataloging” or “tagging”;
- Integrate with other systems in your organization such as Seismic, Contracts, Accounting, etc., including proprietary software programs;
- Allow sharing of tasks between data managers;
- Enable collaboration and discussion to minimize scattered email correspondence; and,
- Automatically alert others of issues such as requests that still need to be addressed.
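The last bullet is the easiest to picture in code. Below is a small sketch of what automatic alerting might look like inside such a tool; the class, field names and statuses are my own invention, not the API of any specific vendor product:

```python
from dataclasses import dataclass, field
from datetime import date

# Hypothetical data request record, as a workflow tool might model it.
@dataclass
class DataRequest:
    request_id: str
    description: str
    assignee: str
    due: date
    status: str = "open"                # "open" | "in_progress" | "done"
    comments: list = field(default_factory=list)  # collaboration thread

def overdue_alerts(requests, today):
    """Return one alert line per unresolved request past its due date."""
    return [f"ALERT: {r.request_id} ({r.assignee}) overdue since {r.due}"
            for r in requests
            if r.status != "done" and r.due < today]

# Example: one open request slips past its due date, one is already done.
reqs = [
    DataRequest("DR-1", "Load seismic survey", "amina", date(2014, 3, 1)),
    DataRequest("DR-2", "QC wireline logs", "omar", date(2014, 3, 20),
                status="done"),
]
alerts = overdue_alerts(reqs, today=date(2014, 3, 15))
```

Run daily, a check like this replaces the end-of-month scramble through a spreadsheet with a standing list of exactly which requests are slipping and whose desk they are on.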
Dealing with the massive influx of information gathered from exploration projects or from real-time gauges in established fields is pushing the traditional data-management architecture in the oil and gas industry to its limits. More sensors, from 4D seismic or from fiber optics in wells, widen the gap between advances in data capture and the traditional ways of managing and analyzing data. It is the challenge of managing the sheer volume of collected data, and the need to sift through it in a timely fashion, that Big Data technologies promise to help us solve. This was just one of the topics on the table at the Data Management Workshop I attended in Turkey earlier this month.
For me, one of the main issues with the whole Big Data concept in the oil and gas industry is that, while it sounds promising, it has yet to deliver the tangible returns that companies need to see in order to prove its worth. To overcome this dilemma, Big Data vendors such as Teradata, Oracle and IBM should consider demonstrating concrete new examples of real-life oil and gas wins. By new I mean challenges that are not possible to solve with traditional data architectures and tools. Vendors should also be able to offer Big Data technology at a price that makes it viable for companies to try it and experiment.
The oil and gas industry is notoriously slow to adopt new software technology, particularly anything that tries to take the place of traditional methods that have already proven to work, unless its value is apparent. To quote my good friend: “we operate with fat margins; we don’t feel the urgency.” However, E&P companies should put on their creative hats and work alongside Big Data technology vendors. Big Data may just be the breakthrough we need to make a tangible step change in how we consume and analyse subsurface and surface data with agility.
If either side, vendors or E&P companies, fails to deliver, Big Data will become a commercial white elephant, doomed to very slow adoption.
At the workshop, Oracle, Teradata and IBM all showed interesting tools. However, they showed examples from other industries and occasionally referred to problems that can be solved with conventional data technology. They left the audience still wondering!
One Big Data example that is relevant and hits home was presented by CGG. CGG used pattern recognition (on Teradata technology) to find all logs that exhibit a specific pattern a petrophysicist may be interested in. This type of analysis requires scanning through millions of log curves, not just the metadata that traditional architectures had bound us to. It opens up new horizons for serendipity and, who knows, maybe for new discoveries.
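To give a feel for what “finding a pattern in a log curve” means computationally, here is a toy sketch of one common approach: slide a window along the curve and compare each window's z-normalized shape to a template, so that matches are found regardless of absolute scale or baseline. This is my own illustrative version of the idea, not CGG's actual method, and the data is made up:

```python
import math

def znorm(xs):
    """Z-normalize a window so shape matters, not scale or baseline."""
    m = sum(xs) / len(xs)
    sd = math.sqrt(sum((x - m) ** 2 for x in xs) / len(xs)) or 1.0
    return [(x - m) / sd for x in xs]

def matches(curve, template, threshold=1.0):
    """Indices where the curve's local shape is within `threshold`
    (Euclidean distance between z-normalized windows) of the template."""
    t, n = znorm(template), len(template)
    hits = []
    for i in range(len(curve) - n + 1):
        w = znorm(curve[i:i + n])
        d = math.sqrt(sum((a - b) ** 2 for a, b in zip(w, t)))
        if d < threshold:
            hits.append(i)
    return hits

# A symmetric "bump" template, and a curve containing a scaled copy of it
template = [0, 1, 2, 1, 0]
curve = [5, 5, 5, 10, 20, 30, 20, 10, 5, 5]
hits = matches(curve, template, threshold=0.5)
```

The hard part at industry scale is not this loop; it is running it across millions of curves quickly, which is exactly where massively parallel platforms like the one CGG used earn their keep.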