
Why Connecting Silos With Better IM Architecture Is Important

If you work in an oil and gas company, you are familiar with the functional divides. We all know the jokes about geologists vs. engineers; we laugh and even invent our own. But jokes aside, oil and gas companies operate in silos, and with reason.

But while organizational silos may be necessary to maintain standards of excellence, collaboration and connection across those silos are crucial for survival.

For an energy company to produce hydrocarbons from an asset, it needs all of its departments (geoscience, engineering, finance, land, supply chain, etc.) to work together. This requires sharing detailed information and collaborating beyond meeting rooms and email attachments. But the reality in many oil and gas companies today is different: functional silos extend into information silos.

Connected Silos Are Good. Isolated Silos Are Bad

In an attempt to connect silos, “Asset Teams” or “Matrix” organizations are formed and incentive plans are carefully crafted to share goals between functions. These are great strides, but no matter the organizational structure or the incentives provided, miscommunication, delays, and poor information handover are still commonplace. Until we solve the problem of seamless information sharing, the gap between functional departments will persist, because we are human and we rationalize our decisions differently. This is where technology and automation (if architected correctly) can play a role in closing the gap between the silos.

Asset team members and supporting business staff have an obligation to share information not only through meetings and email attachments but also by organizing and indexing asset files throughout the life of the asset. A fit-for-purpose IM architecture has a strategic role to play in closing the gap between the functional silos.

Connecting Functional Silos With IM Takes Vision & Organizational Commitment 

Advancements in IM (Information Management) and BPMS (Business Process Management Systems) can close a big part of the remaining gap. Yet many companies have not succeeded in doing so, despite significant investments in data and process projects. There can be many reasons for this; here are two of the most common pitfalls I come across:

  • Silo IM projects or systems – Architecting and running IM projects within one function without regard for the impact on other departments. I have seen millions of dollars spent to solve isolated geoscience data needs without accounting for the impact on engineering and land departments, or spent on Exploration IM projects without regard for the Appraisal and Development phases of the asset. Quite often, organizations do not take the time to look at their end-to-end processes and the impact on the company’s goals. As a result, millions of dollars are spent on IM projects without bringing the silos any closer. Connecting silos through an IM architecture requires a global vision.
  • Lack of commitment to enterprise standards – If each department defines and collects information according to its own needs, without regard for the company’s needs, it is left to other departments to translate and reformat. This often means rework and repetitive verification whenever information reaches a new departmental ‘checkpoint’.

The above pitfalls can be mitigated by recognizing the information dependencies and commonalities between departments, then architecting global solutions based on accepted standards and strong technology. It takes a solid vision and commitment.
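To make the idea concrete, here is a minimal sketch of what committing to an enterprise standard can look like in practice: a single well master record, keyed by a common identifier (the API well number in this example), that every departmental record must reconcile against before it is accepted. All names and fields below are hypothetical and purely illustrative, not a reference to any particular product or schema.

```python
from dataclasses import dataclass

# Hypothetical enterprise well master: one agreed-upon record per well,
# keyed by the API well number (a widely used US well identifier).
@dataclass(frozen=True)
class WellMaster:
    api_number: str            # enterprise-wide key, e.g. "42-501-20098"
    well_name: str
    datum_elevation_ft: float  # agreed reference datum for all depths

# One shared registry instead of per-department copies.
well_registry = {
    "42-501-20098": WellMaster("42-501-20098", "Smith 1H", 2412.0),
}

def validate_departmental_record(record: dict) -> WellMaster:
    """Reject any departmental record that does not reconcile with the
    enterprise well master before it is loaded downstream."""
    master = well_registry.get(record.get("api_number"))
    if master is None:
        raise ValueError(f"Unknown well: {record.get('api_number')!r}")
    if record.get("well_name") != master.well_name:
        raise ValueError("Well name does not match the enterprise master")
    return master

# A geoscience record referencing the shared key passes the checkpoint
# without translation or rework by the receiving department.
geoscience_log = {"api_number": "42-501-20098", "well_name": "Smith 1H",
                  "top_depth_ft": 8450.0}
validate_departmental_record(geoscience_log)  # passes
```

The point is not the code but the contract: when every department writes against the same key and the same datum, information crosses departmental checkpoints without translation, reformatting, or repetitive verification.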

For a free consultation on how to connect silos effectively, please schedule an appointment with a Certis consultant. Email us at info@certisinc.com or call us at 281-377-5523.

What Impact Does Big Data Technology Bring To Oil and Gas?

Dealing with the massive influx of information gathered from exploration projects or from real-time gauges in established fields is pushing the traditional data-management architecture of the oil and gas industry to its limits. More sensors, from 4-D seismic or from fiber optics in wells, widen the gap between advances in data capture and the traditional ways of managing and analyzing data. It is the challenge of managing the sheer volume of collected data, and the need to sift through it in a timely fashion, that Big Data technologies promise to help us solve. This was just one of the topics on the table at the Data Management Workshop I attended in Turkey earlier this month.

For me, one of the main issues with the whole Big Data concept within the oil and gas industry is that, while it sounds promising, it has yet to deliver the tangible returns that companies need to see in order to prove its worth. To overcome this dilemma, Big Data vendors such as Teradata, Oracle, and IBM should consider demonstrating concrete new examples of real-life oil and gas wins. By new I mean challenges that are not possible to solve with traditional data architecture and tools. Vendors should also offer Big Data technology at a price that makes it viable for companies to try it and experiment.

The oil and gas industry is notoriously slow to adopt new software technology, particularly anything that tries to replace traditional methods that already work, unless its value is apparent. To quote a good friend: “we operate with fat margins; we don’t feel the urgency.” However, E&P companies should put on their creative hats and work alongside Big Data technology vendors. Big Data may just be the breakthrough we need to make a tangible step-change in how we consume and analyze subsurface and surface data with agility.

If either side, vendors or E&P companies, fails to deliver, Big Data becomes a commercial white elephant and is doomed to very slow adoption.

At the workshop, Oracle, Teradata, and IBM all showed interesting tools. However, they drew their examples from other industries and occasionally pointed to problems that conventional data technology can already solve. They left the audience still wondering!

One Big Data example that is relevant and hits home was presented by CGG. CGG used pattern recognition (on Teradata technology) to find all logs that exhibit a specific pattern a petrophysicist may be interested in. This type of analysis requires scanning through millions of log curves, not just the metadata that traditional architectures had bound us to. It opens up new horizons for serendipity and, who knows, maybe for new discoveries.
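CGG did not share its implementation, but the core idea is easy to sketch. Below is a minimal, hypothetical illustration of curve-level pattern matching: sliding-window normalized correlation of a template shape against a log curve, written in Python with NumPy. The function name, threshold, and synthetic data are all assumptions for illustration, not CGG’s method.

```python
import numpy as np

def find_pattern(curve: np.ndarray, template: np.ndarray,
                 threshold: float = 0.9) -> list[int]:
    """Return start indices where the curve matches the template shape,
    scored by normalized cross-correlation over a sliding window."""
    n = len(template)
    t = (template - template.mean()) / (template.std() + 1e-12)
    hits = []
    for i in range(len(curve) - n + 1):
        w = curve[i:i + n]
        w = (w - w.mean()) / (w.std() + 1e-12)
        score = float(np.dot(w, t) / n)   # Pearson correlation, in [-1, 1]
        if score >= threshold:
            hits.append(i)
    return hits

# Synthetic example: a 'gamma-ray' curve with the template shape buried in it.
rng = np.random.default_rng(0)
template = np.sin(np.linspace(0, np.pi, 50))   # the shape we are looking for
curve = rng.normal(0.0, 0.2, 1000)
curve[400:450] += 3 * template                 # hide one occurrence at ~400
print(find_pattern(curve, template))           # indices clustered near 400
```

In production, the same scoring would be pushed down to a parallel platform and run across millions of curves; the sketch only shows the matching logic that makes curve-level search, rather than metadata search, possible.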