After a series of high-profile accidents involving gas transmission pipelines, in 2014 the US National Transportation Safety Board (NTSB) commissioned a study to see what could be done to lower the incidence rate. The resulting report, “Integrity Management of Gas Transmission Pipelines in High Consequence Areas,” included an analysis of how pipeline quality data is gathered, used, and shared. A close look at the report offers some interesting insight into engineering data management issues.
The NTSB report on Integrity Management (IM) published 33 findings; seven of them specifically mention data management issues. Following the findings, the report listed 22 recommendations to the Pipeline and Hazardous Materials Safety Administration, seven of which specifically mention data handling.
NTSB Report Findings
Reading the document closely reveals strong interest by the NTSB in geospatial data. There are recommendations to various government agencies about improving the national repository of geospatial data. There were discoveries about data integration, and how lax standards or low compliance with existing guidelines were a problem. The strength of an effective IM system depends on “its ability to merge and utilize multiple data elements obtained from several sources to provide an improved confidence” about possible safety threats.
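The data-integration idea the report describes, merging records from several sources so that independent agreement raises confidence in a possible threat, can be sketched in a few lines. The snippet below is purely illustrative and is not drawn from the NTSB report or any operator's system; the segment IDs, source names, and threshold are hypothetical.

```python
# Illustrative sketch (not from the NTSB report): merging anomaly
# reports from several hypothetical data sources, keyed by pipeline
# segment, so that agreement across independent sources raises
# confidence in a possible safety threat.
from collections import defaultdict

# Hypothetical records: (segment_id, data_source, anomaly_reported)
records = [
    ("SEG-101", "inline_inspection", True),
    ("SEG-101", "corrosion_survey", True),
    ("SEG-102", "inline_inspection", True),
    ("SEG-103", "patrol_report", False),
]

def merge_by_segment(records):
    """Group the sources reporting an anomaly for each segment."""
    merged = defaultdict(set)
    for segment, source, anomaly in records:
        if anomaly:
            merged[segment].add(source)
    return merged

def high_confidence_threats(merged, min_sources=2):
    """Segments where at least min_sources independent sources agree."""
    return sorted(seg for seg, srcs in merged.items()
                  if len(srcs) >= min_sources)

merged = merge_by_segment(records)
print(high_confidence_threats(merged))  # prints ['SEG-101']
```

The point is only the shape of the problem: each source alone is noisy, but a segment flagged by two or more sources (here, SEG-101) is a much stronger signal, which is why the report stresses the ability to merge data elements across sources.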
Later in the report, the authors note the pipeline industry uses three standards for creating and storing geospatial data, two of which are open and one is proprietary. One common theme the report team found was a desire to use GIS to provide operators with “a single source of authoritative data accessible throughout all parts of a gas transmission company.” One company they interviewed seemed to be the exception. The unnamed company “showed that their GIS capabilities allow the IM division to maintain version control of the company’s authoritative pipeline data and to share this information easily… from the chief executive officer to the local engineer at the pipeline facilities.”
There were also three recommendations suggesting changes to existing report forms the pipeline companies are required to use. The report shows that the unnamed company combined the engineering side (geospatial data) with the operational side (forms and reports). This can’t be done with engineering software alone; it takes purpose-built engineering data management software. But such software specifics were not included in the report.
How Sunoco Logistics Meets Regulatory Compliance
Sunoco is an energy company that knows first-hand how to marry the engineering data with the operations side. The Logistics division found itself with multiple demands for greater regulatory compliance, higher security and safety standards, and more detailed reporting, but its older document management system was not up to the complexity.
Sunoco Logistics replaced the outdated system with Synergis Adept. It was initially used in Engineering to manage and store AutoCAD files, but quickly grew into being the central repository organizing and controlling access to virtually all file types, from pictures to scanned engineering drawings and maps. Today Sunoco Logistics uses Adept to store and manage CAD, TIFF, PDF, JPG, CALS, old raster files, AutoCAD, AutoCAD Raster Design, MicroStation, Esri Shapefile, Excel, and Word documents. It now also uses Adept to manage logistical data for its role as a common carrier of oil, and it takes advantage of Adept’s security capabilities to comply with all relevant DOT security standards.
If you want more information, Synergis has prepared a case study on Sunoco Logistics’ engineering data management upgrade.
Randall S. Newton is the principal analyst and managing director at Consilia Vektor, a consulting firm serving the engineering software industry. He has been directly involved in engineering software in a number of roles since 1985.