There is increased demand from our Oil & Gas clients for regular consolidation of subsurface data from multiple sources. The data then needs to be distributed across internal and external networks and databases. So with a complex range of data sources, how do we effectively extract the exact data streams required? How can we aggregate, replicate and transform the data appropriately into the required target data store or application?

In the complex Exploration & Production (E&P) environment, our experience has shown that rules and standards are a crucial factor. In a future blog post, we will look at methods to minimise duplication, reduce implementation effort and maximise the integrity and maintainability of these rules. Alongside rules, we also need to make sure that exactly the same data is shared across the enterprise, and then with joint venture partners, service organisations and regulatory bodies. Organisations across the Oil & Gas industry require data to be quality checked, cleansed and, where necessary, aggregated or transformed to meet the needs of an application or of compliance reporting.
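
To make the idea of a data quality rule concrete, here is a minimal sketch in Python of the kind of check we mean: a well header record is verified to carry a unique identifier and plausible coordinates before it is aggregated or passed on for reporting. The field names (uwi, latitude, longitude) and the rule itself are illustrative assumptions, not DataHub's actual rule syntax.

```python
# Illustrative only: a minimal data quality check for a well header record.
# Field names (uwi, latitude, longitude) are hypothetical, not DataHub internals.

def check_well_header(record: dict) -> list[str]:
    """Return a list of rule violations found in one well header record."""
    violations = []
    if not record.get("uwi"):
        violations.append("missing unique well identifier (UWI)")
    lat, lon = record.get("latitude"), record.get("longitude")
    if lat is None or not -90 <= lat <= 90:
        violations.append(f"latitude out of range: {lat}")
    if lon is None or not -180 <= lon <= 180:
        violations.append(f"longitude out of range: {lon}")
    return violations

if __name__ == "__main__":
    sample = {"uwi": "1001-12345", "latitude": 57.2, "longitude": 200.0}
    print(check_well_header(sample))  # -> ['longitude out of range: 200.0']
```

A record that fails such a rule can be routed back for correction rather than being loaded into the target application or compliance report.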

Our solution is a managed service that has been employed in other industries and is now available to the E&P domain. DataHub is an extensible and fully scalable service that manages data in a range of file formats, such as CSV, Excel and XML, as well as databases built on common data models such as PPDM, on custom data models, and on industry standards including WITSML and PRODML. Using an application that acts as a hub, our consultants extract, manipulate, verify, aggregate and then migrate data to the selected application or third party.
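
As an illustration of what extraction and consolidation can look like for two of the formats mentioned above, here is a minimal sketch that reads well records from CSV and XML into one common shape and de-duplicates them on a well identifier. The file layouts, element names and target record shape are assumptions made for the example; DataHub's own connectors and data models are not shown here.

```python
# A minimal sketch: read well records from two source formats (CSV and XML)
# into one common shape, then consolidate. Column and element names are
# hypothetical examples, not DataHub internals.
import csv
import xml.etree.ElementTree as ET

def wells_from_csv(path: str) -> list[dict]:
    with open(path, newline="") as f:
        return [{"uwi": row["uwi"], "name": row["well_name"]}
                for row in csv.DictReader(f)]

def wells_from_xml(path: str) -> list[dict]:
    root = ET.parse(path).getroot()
    return [{"uwi": w.findtext("uwi"), "name": w.findtext("name")}
            for w in root.iter("well")]

def consolidate(*batches: list[dict]) -> dict[str, dict]:
    """Merge batches of well records, de-duplicating on the well identifier."""
    merged = {}
    for batch in batches:
        for well in batch:
            merged[well["uwi"]] = well  # later sources win on conflict
    return merged
```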

The DataHub software implemented by our consultants consists of a core, which forms the central part of the hub, and any number of points, each of which is a data source; the hub can share data bi-directionally with any number of databases. Data is securely extracted from these points over the internet or an intranet and consolidated at our data centre or at a location of the client's choice. The data is cleansed using data quality rules and then shared with the client or third party. Once established, the hub operates automatically through a central configuration management facility, which manages any changes in functionality or software and updates each location automatically. This reduces the need for maintenance and leaves a full audit trail that provides a record of historic data.
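
The sketch below illustrates the hub pattern described in this paragraph under stated assumptions: a central configuration lists the points, each point is extracted and cleansed in turn, the result is published to the target, and an audit record is kept for every run. The configuration keys and helper functions are hypothetical; this is a simplified illustration, not the DataHub implementation.

```python
# A minimal sketch of a configuration-driven hub run with an audit trail.
# Configuration keys and the extract/cleanse/publish helpers are hypothetical.
import datetime
import json

HUB_CONFIG = {
    "points": [
        {"name": "north-sea-db", "transport": "https"},
        {"name": "onshore-db", "transport": "intranet"},
    ],
    "target": "client-datastore",
}

def run_hub(config: dict, extract, cleanse, publish) -> list[dict]:
    """Pull from every configured point, cleanse, publish, and record an audit entry."""
    audit = []
    for point in config["points"]:
        records = extract(point)           # secure pull from the point
        clean = cleanse(records)           # apply data quality rules
        publish(config["target"], clean)   # share with the client or third party
        audit.append({
            "point": point["name"],
            "records": len(clean),
            "timestamp": datetime.datetime.utcnow().isoformat(),
        })
    return audit

if __name__ == "__main__":
    trail = run_hub(
        HUB_CONFIG,
        extract=lambda point: [{"uwi": "1001-12345"}],
        cleanse=lambda records: records,
        publish=lambda target, records: None,
    )
    print(json.dumps(trail, indent=2))
```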

Using DataHub gives E&P organisations confidence that their data is consistent and quality checked across the enterprise. Read more about DataHub.
