What is an "Industrial Plant Dataset"?
The "Industrial Plant Dataset" is a complex amalgam of synchronous and asynchronous data types and data sources which must be collected, checked, structured and organized to serve the business and operational scenarios of users, applications and business processes.
- Operating Conditions: flows, pressures, temperatures and levels which represent the state of materials and equipment at any given time or over a given time interval.
- Product Quality Data: lab results, typically coming from a LIMS, which arrive out of order relative to the operating conditions and often cross reconciliation periods. This category also includes analyzer results and any property sensor (e.g., viscosity, density).
- Inventory Data: volume-to-mass conversion and compensation, strapping tables, etc.
- Material Transfers and Movements: the list of all material movements (receipts, shipments, internal transfers, feed-ins, rundowns) with their respective start time, end time, origin, destination and material transferred.
- Utilities, Chemicals, Catalysts and Energy Consumption: all ancillary measurements made on systems supporting mainstream operations. Similar in nature to all of the above but originating from different physical systems or networks.
- Planning & Scheduling Data: data about expectations, which can relate to any of the above categories and helps assess actual vs. planned results.
- Design Data of Equipment and Systems: used to understand the capability of the plant, create a reference for checks and balances, and assess performance.
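To make the inventory category more concrete, the sketch below shows how a strapping table is typically applied: tank level is converted to volume by interpolating the table, and volume is converted to mass using a measured or standard density. The table values, tag names and density are illustrative assumptions, not real plant data or a Sigmafine API.

```python
# Minimal sketch of volume-to-mass conversion via a strapping table.
# All numeric values below are illustrative, not real plant data.

from bisect import bisect_right

# Hypothetical strapping table: tank level (m) -> gross volume (m^3)
STRAPPING_TABLE = [
    (0.0, 0.0),
    (1.0, 120.0),
    (2.0, 255.0),
    (3.0, 395.0),
    (4.0, 540.0),
]

def level_to_volume(level_m: float) -> float:
    """Linearly interpolate gross volume from the strapping table."""
    levels = [lv for lv, _ in STRAPPING_TABLE]
    if not levels[0] <= level_m <= levels[-1]:
        raise ValueError(f"level {level_m} m is outside the table range")
    i = bisect_right(levels, level_m)
    if i == len(levels):  # level is exactly at the top table entry
        return STRAPPING_TABLE[-1][1]
    (l0, v0), (l1, v1) = STRAPPING_TABLE[i - 1], STRAPPING_TABLE[i]
    return v0 + (v1 - v0) * (level_m - l0) / (l1 - l0)

def volume_to_mass(volume_m3: float, density_kg_m3: float) -> float:
    """Convert volume to mass using the observed (or standard) density."""
    return volume_m3 * density_kg_m3

# Example: a level of 2.5 m interpolates to 325 m^3; at 850 kg/m^3
# that is 276,250 kg of material in the tank.
mass_kg = volume_to_mass(level_to_volume(2.5), density_kg_m3=850.0)
```

In practice, density itself is temperature-compensated, which is why the list above mentions compensation alongside the conversion.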
Assembling these disparate data types into a usable dataset requires skill, knowledge, and experience. This is the job of Sigmafine and a core competency of Pimsoft.
A well-conditioned Industrial Plant Dataset is the only way to enable users, applications and business processes to perform according to expectations and beyond.
"Only trusted data, available at the right time to the right people who can take decisions, is really useful to increase the company's income."
Walter Mantelli, Technical Director, IPLOM S.p.A., Sigmafine Users Meeting, 2015
Defining Data Quality in the Process Industries
Data Quality is an abstract concept until we are confronted with bad data or information which is not usable, not credible, not presented correctly, not accurate, etc. Then Data Quality becomes a very concrete experience. While there are many definitions of Data Quality, all of them converge on the same focal point: "fitness for use" by people, applications and business processes.
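One way to make "fitness for use" tangible is to express it as explicit checks on each measurement, for example that a value is within design limits and recent enough to act on. The sketch below illustrates that idea; the tag name, limits and staleness threshold are assumptions for illustration, not part of any specific product.

```python
# Minimal sketch of a "fitness for use" check on a single measurement:
# usable data must be both accurate (within design limits) and timely
# (not stale). All names and limits here are illustrative assumptions.

from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

@dataclass
class Measurement:
    tag: str
    value: float
    timestamp: datetime

def is_fit_for_use(m: Measurement, low: float, high: float,
                   max_age: timedelta, now: datetime) -> bool:
    """True when the value is within limits and fresh enough to use."""
    in_range = low <= m.value <= high
    fresh = (now - m.timestamp) <= max_age
    return in_range and fresh

now = datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc)
flow = Measurement("FI-101", 42.0, now - timedelta(minutes=2))
usable = is_fit_for_use(flow, low=0.0, high=100.0,
                        max_age=timedelta(minutes=5), now=now)
```

Real data-quality rules layer many more dimensions on top of these two (consistency across sources, completeness, reconciliation against balances), but they follow the same pattern of explicit, testable criteria.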
Tolerance for bad data
Whether it is called "Industry 4.0", "Edge Computing", "Big Data" or "IoT", the tolerance of people, applications and business processes for poor data quality in industrial plants is diminishing rapidly. Modern process industries thrive on readily usable information and credible data to deliver sustainable, tangible business results.