
Aggregating And Standardizing Disjointed Integrity Management Data

An organization's regulatory and business needs are the primary drivers behind the decisions made in a pipeline integrity management data plan, and how data is acquired and stored can vary greatly depending on those factors. Two production databases and multiple remote sensor vendor databases are referenced in this text, each with a business case to operate independently.

Product Number: 51322-17646-SG
Authors: Robert Floria, Alfonso Garcia
Publication Date: 2022
$0.00
$20.00

Data management is a critical component of an integrity management plan. Current products and services in the integrity management sector can generate an enormous amount of uncontrolled and disjointed data, housed on multiple platforms. This text examines methods of mapping and linking multiple data sources to achieve optimal data usefulness, while reducing redundant data, through the use of spatial and relational techniques. By defining relationships between fixed points, linear values can be generated from calibrated routes. Developing methods to introduce new data, standardized from spatial data, serves to maintain data quality. Recurring data transfer logistics, using relational keys in conjunction with ETL procedures, serve to link the databases. Value is achieved on a large scale by using girth welds to automate the generation of mile post values for point features. Data generated at remote sensors are aggregated from multiple vendors and populated through an exchange governed by universally unique identifiers (UUIDs).
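
The girth-weld approach described in the abstract can be read as a form of linear referencing: each weld pairs a continuous route measure along the calibrated centerline with a surveyed milepost, and any point feature carrying only a route measure can be assigned a milepost by interpolating between the two welds that bracket it. The Python sketch below illustrates that idea only; the weld measures, milepost values, and function names are illustrative assumptions, not the authors' schema.

# Minimal sketch: interpolate milepost values for point features from
# girth-weld control points. Each weld pairs a route measure (e.g. stationing
# along the calibrated centerline) with a surveyed milepost. All names and
# numbers here are illustrative assumptions, not the authors' schema.
from bisect import bisect_right

# Hypothetical girth welds as (route_measure_ft, milepost), sorted by measure.
GIRTH_WELDS = [
    (0.0, 0.000),
    (5280.0, 1.000),
    (10613.0, 2.010),   # surveyed milepost absorbs historical route adjustments
    (15840.0, 3.000),
]
_MEASURES = [m for m, _ in GIRTH_WELDS]

def milepost_for_measure(measure_ft):
    # Clamp features that fall outside the weld coverage to the nearest weld.
    i = bisect_right(_MEASURES, measure_ft)
    if i == 0:
        return GIRTH_WELDS[0][1]
    if i == len(GIRTH_WELDS):
        return GIRTH_WELDS[-1][1]
    # Linear interpolation between the two bracketing welds.
    (m0, mp0), (m1, mp1) = GIRTH_WELDS[i - 1], GIRTH_WELDS[i]
    return mp0 + (measure_ft - m0) / (m1 - m0) * (mp1 - mp0)

if __name__ == "__main__":
    # A point feature (e.g. a valve) located 7,946 ft along the route
    # falls between the second and third welds.
    print(round(milepost_for_measure(7946.0), 3))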

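The UUID-governed exchange can be pictured in the same spirit: the operator issues a UUID per monitored asset, each vendor echoes that UUID back with its readings, and aggregation joins on the UUID rather than on vendor-specific naming. The sketch below assumes a simple in-memory registry and two made-up vendor payloads purely for illustration; it is not the authors' exchange format.

# Minimal sketch: aggregate remote-sensor readings from multiple vendor
# feeds using a UUID issued per monitored asset. The registry, field names,
# and payloads are illustrative assumptions, not the authors' exchange format.
import uuid
from collections import defaultdict

# Operator-owned asset registry: UUID -> descriptive attributes.
ASSET_REGISTRY = {
    uuid.UUID("2f1c9c2e-9a9a-4d7b-8c1e-0b3f4a5d6e7f"): {"name": "Rectifier R-101"},
    uuid.UUID("7b8d6f40-1234-4cde-9abc-00ff11ee22dd"): {"name": "Test Point TP-17"},
}

# Simulated vendor payloads; each record echoes the shared asset UUID.
vendor_a = [
    {"asset_uuid": "2f1c9c2e-9a9a-4d7b-8c1e-0b3f4a5d6e7f", "reading": -1.12, "units": "V"},
]
vendor_b = [
    {"asset_uuid": "7b8d6f40-1234-4cde-9abc-00ff11ee22dd", "reading": -0.87, "units": "V"},
    {"asset_uuid": "2f1c9c2e-9a9a-4d7b-8c1e-0b3f4a5d6e7f", "reading": -1.10, "units": "V"},
]

def aggregate(*feeds):
    # Merge all feeds into one structure keyed purely on the asset UUID.
    merged = defaultdict(list)
    for feed in feeds:
        for record in feed:
            key = uuid.UUID(record["asset_uuid"])   # normalizes the string form
            if key in ASSET_REGISTRY:               # ignore assets the operator never issued
                merged[key].append(record["reading"])
    return merged

if __name__ == "__main__":
    for asset_id, readings in aggregate(vendor_a, vendor_b).items():
        print(ASSET_REGISTRY[asset_id]["name"], readings)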

Also Purchased
Available for download

New Frontiers for the Pipeline Integrity Management

Product Number: MPWT19-15016
Authors: Matteo Mattioli, Paolo Cherubini, Andrea Baldoni
Publication Date: 2019
$0.00