{{Short description|A technology to correct measurements in industrial processes}}
'''Industrial process data validation and reconciliation''', or more briefly, '''process data reconciliation (PDR)''', is a technology that uses process information and mathematical methods in order to automatically ensure [[data validation]] and reconciliation by correcting measurements in industrial processes. The use of PDR makes it possible to extract accurate and reliable information about the state of industrial processes from raw measurement [[data]] and produces a single consistent set of data representing the most likely process operation.
==Models, data and measurement errors==
<gallery>
File:Normal_with_bias.jpg|Normally distributed measurements with bias.
</gallery>
Data originates typically from [[measurements]] taken at different places throughout the industrial site, for example temperature, pressure, volumetric flow rate measurements etc. To understand the basic principles of PDR, it is important to recognize that measured data are never exact; they are subject to measurement errors, of which there are two basic types:
# [[random error]]s due to intrinsic [[sensor]] [[accuracy]] and
# [[systematic errors]] (or gross errors) due to sensor [[calibration]] or faulty data transmission.
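The distinction between the two error types can be illustrated by a small simulation (illustrative numbers only, not taken from any real plant): random errors average out over repeated readings, whereas a systematic error does not.
<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)
true_value = 10.0                          # true flow (hypothetical)
noise = rng.normal(0.0, 0.1, size=1000)   # random error, sigma = 0.1

random_only = true_value + noise          # well-calibrated sensor
with_bias = true_value + 0.5 + noise      # mis-calibrated sensor (+0.5 offset)

print(random_only.mean())  # ~10.0: random errors cancel in the mean
print(with_bias.mean())    # ~10.5: the bias remains, a gross error
</syntaxhighlight>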
==History==
PDR matured further through the consideration of general nonlinear equation systems coming from thermodynamic models.<ref name="Stanley-Mah-1977">G.M. Stanley and R.S.H. Mah, [http://gregstanleyandassociates.com/AIChEJ-1977-EstimationInProcessNetworks.pdf ''Estimation of Flows and Temperatures in Process Networks''], AIChE Journal 23: 642–650, 1977.</ref><ref>P. Joris, B. Kalitventzeff, ''Process measurements analysis and validation'', Proc. CEF’87: Use Comput. Chem. Eng., Italy, 41–46, 1987.</ref> Quasi steady state dynamics for filtering and simultaneous parameter estimation over time were introduced in 1977 by Stanley and Mah.<ref name="Stanley-Mah-1977"/> Dynamic PDR was formulated as a nonlinear optimization problem by Liebman et al. in 1992.<ref>M.J. Liebman, T.F. Edgar, L.S. Lasdon, ''Efficient Data Reconciliation and Estimation for Dynamic Processes Using Nonlinear Programming Techniques'', Computers Chem. Eng. 16: 963–986, 1992.</ref>
==Data reconciliation==
The term <math>\left(\frac{y_i^*-y_i}{\sigma_i}\right)^2\,\!</math> is called the ''penalty'' of measurement ''i''. The objective function is the sum of the penalties, which will be denoted in the following by <math>f(y^*)=\sum_{i=1}^n\left(\frac{y_i^*-y_i}{\sigma_i}\right)^2</math>.
In other words, one wants to minimize the overall correction (measured in the least squares term) that is needed in order to satisfy the [[constraint (mathematics)|system constraints]]. Additionally, each least squares term is weighted by the [[standard deviation]] of the corresponding measurement. The standard deviation is related to the accuracy of the measurement. For example, at a 95% confidence level, the standard deviation is about half the accuracy, since the 95% interval spans approximately ±1.96 standard deviations.
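As an illustration, the minimization can be carried out with a general-purpose constrained optimizer. The following sketch (hypothetical flows and standard deviations; SciPy is only one possible solver choice) reconciles three flow measurements subject to the single mass balance <math>a+b=c\,\!</math>:
<syntaxhighlight lang="python">
import numpy as np
from scipy.optimize import minimize

y = np.array([10.2, 5.1, 14.8])      # raw measurements of a, b, c (hypothetical)
sigma = np.array([0.2, 0.1, 0.3])    # standard deviations of the sensors

# objective: sum of penalties ((y*_i - y_i) / sigma_i)^2
f = lambda y_star: np.sum(((y_star - y) / sigma) ** 2)

# equality constraint: reconciled values must satisfy a + b = c
cons = [{"type": "eq", "fun": lambda v: v[0] + v[1] - v[2]}]

res = minimize(f, y, constraints=cons)  # SLSQP is selected automatically
print(res.x)                            # reconciled values; the balance holds exactly
</syntaxhighlight>
Note that the measurement with the largest standard deviation (here <math>c\,\!</math>) absorbs the largest share of the correction, since its penalty term carries the smallest weight.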
===Redundancy===
Redundancy can be due to [[redundancy (engineering)|sensor redundancy]], where sensors are duplicated in order to have more than one measurement of the same quantity. Redundancy also arises when a single variable can be estimated in several independent ways from separate sets of measurements at a given time or time averaging period, using the algebraic constraints.
Redundancy is linked to the concept of [[observability]]: a variable (or system) is observable if its value can be uniquely determined from the available measurements and the model constraints. A measurement is redundant if its removal causes no loss of observability.
Topological redundancy is intimately linked with the [[degrees of freedom (physics and chemistry)|degrees of freedom]] (<math>dof\,\!</math>) of a mathematical system,<ref name="vdi">VDI-Gesellschaft Energie und Umwelt, ''Guidelines – VDI 2048 Blatt 1: Uncertainties of Measurement during Acceptance Tests on Energy-Conversion and Power Plants – Fundamentals'', Association of German Engineers, 2000.</ref> i.e., the minimum number of pieces of information (i.e., measurements) that are required in addition to the constraints in order to calculate all of the system variables.
When speaking about topological redundancy we have to distinguish between measured and unmeasured variables. In the following let us denote by <math>x\,\!</math> the unmeasured variables and <math>y\,\!</math> the measured variables. Then the system of the process constraints becomes <math>F(x,y)=0\,\!</math>, which is a nonlinear system in <math>y\,\!</math> and <math>x\,\!</math>.
We incorporate only flow conservation constraints and obtain <math>a+b=c\,\!</math> and <math>c=d\,\!</math>. It is possible that the system <math>F(x,y)=0\,\!</math> is not calculable, even though <math>p-m\ge 0\,\!</math>.
If we have measurements for <math>c\,\!</math> and <math>d\,\!</math>, but not for <math>a\,\!</math> and <math>b\,\!</math>, then the system cannot be calculated (knowing <math>c\,\!</math> does not give information about <math>a\,\!</math> and <math>b\,\!</math>). On the other hand, if <math>a\,\!</math> and <math>c\,\!</math> are measured, but not <math>b\,\!</math> and <math>d\,\!</math>, then the system can be calculated, since <math>b=c-a\,\!</math> and <math>d=c\,\!</math>.
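For linear balance constraints, calculability of the unmeasured variables can be checked mechanically through the rank of the corresponding columns of the constraint matrix. The following sketch reproduces the two cases above (an illustrative check, not part of any standard PDR package):
<syntaxhighlight lang="python">
import numpy as np

# constraint matrix for a + b - c = 0 and c - d = 0; columns are (a, b, c, d)
A = np.array([[1, 1, -1,  0],
              [0, 0,  1, -1]])

def calculable(unmeasured_columns):
    # the unmeasured variables are uniquely determined by the constraints
    # iff the sub-matrix formed by their columns has full column rank
    sub = A[:, unmeasured_columns]
    return np.linalg.matrix_rank(sub) == len(unmeasured_columns)

print(calculable([0, 1]))  # a, b unmeasured (c, d measured) -> False
print(calculable([1, 3]))  # b, d unmeasured (a, c measured) -> True
</syntaxhighlight>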
In 1981, observability and redundancy criteria were proven for these sorts of flow networks involving only mass and energy balance constraints.<ref name="Stanley-Mah-1981b">G.M. Stanley and R.S.H. Mah, ''Observability and Redundancy Classification in Process Networks: Theorems and Algorithms'', Chemical Engineering Science 36: 1941–1954, 1981.</ref>
===Benefits===
The individual test compares each penalty term in the objective function with the critical values of the normal distribution. If the normalized correction <math>(y_i^*-y_i)/\sigma_i\,\!</math> of the <math>i</math>-th measurement falls outside the 95% confidence interval of the standard normal distribution, i.e., if its absolute value exceeds 1.96, then there is reason to believe that this measurement has a gross error.
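A minimal sketch of the individual test, assuming reconciled values <math>y^*\,\!</math> from a previous reconciliation of the small network <math>a+b=c\,\!</math> (all numbers hypothetical):
<syntaxhighlight lang="python">
import numpy as np

y      = np.array([10.2, 5.1, 17.0])   # raw measurements; c carries a gross error
y_star = np.array([10.69, 5.22, 15.91])  # reconciled values (satisfy a + b = c)
sigma  = np.array([0.2, 0.1, 0.3])

z = (y_star - y) / sigma       # normalized corrections
suspect = np.abs(z) > 1.96     # outside the 95% interval of N(0, 1)
print(np.where(suspect)[0])    # [0 2]: the gross error in c also smears into a
</syntaxhighlight>
That both <math>a\,\!</math> and <math>c\,\!</math> are flagged here illustrates how a single gross error can contaminate the corrections of neighboring measurements.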
==Advanced process data validation and reconciliation==
Advanced process data validation and reconciliation (advanced PDR) is an integrated approach combining data reconciliation and data validation techniques, which is characterized by
* complex models incorporating, besides mass balances, also thermodynamics, momentum balances, equilibrium constraints, hydrodynamics etc.,
* gross error remediation techniques to ensure meaningfulness of the reconciled values,
===Gross error remediation===
[[image:scheme reconciliation.jpg|thumb|350px|The workflow of an advanced data validation and reconciliation process.]]
Gross errors are systematic measurement errors that may [[bias]] the reconciliation results. Therefore, it is important to identify and eliminate these gross errors from the reconciliation process. After the reconciliation, [[statistical tests]] can be applied that indicate whether or not a gross error exists somewhere in the set of measurements. These techniques of gross error remediation are based on two concepts:
* gross error elimination
* gross error relaxation.
Gross error relaxation aims at relaxing the estimate for the uncertainty of suspicious measurements so that the reconciled value lies in the 95% confidence interval. Relaxation typically finds application when it is not possible to determine which measurement around one unit is responsible for the gross error (equivalence of gross errors); the measurement uncertainties of all measurements involved are then increased.
It is important to note that the remediation of gross errors reduces the quality of the reconciliation: either the redundancy decreases (elimination) or the uncertainty of the measured data increases (relaxation). Therefore, it can only be applied when the initial level of redundancy is high enough to ensure that the data reconciliation can still be done (see Section 2 of<ref name="vdi" />).
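The relaxation criterion can be sketched as follows (an illustrative scheme, not a standardized algorithm; the inflation factor is an assumption, and in practice the reconciliation is re-run after each relaxation, which changes the reconciled value as well):
<syntaxhighlight lang="python">
def relax(y_i, y_star_i, sigma_i, factor=1.5):
    # inflate the uncertainty of a suspicious measurement until its
    # normalized correction falls back into the 95% confidence interval
    while abs((y_star_i - y_i) / sigma_i) > 1.96:
        sigma_i *= factor
    return sigma_i

print(relax(17.0, 15.91, 0.3))  # relaxed sigma for the flagged measurement c
</syntaxhighlight>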
===Workflow===
Advanced PDR solutions offer a workflow consisting of the following steps:
# data acquisition from data historian, data base or manual inputs
# data validation and filtering of raw measurements
# data reconciliation of the raw measurements
# result check
#* gross error remediation (and go back to step 3)
# result storage (raw measurements together with reconciled values)
The result of an advanced PDR workflow is a coherent set of validated and reconciled process data.
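A compact end-to-end sketch of steps 3 to 5 on the small network <math>a+b=c\,\!</math> (all values hypothetical; remediation is done here by relaxation only, doubling the uncertainty of the most suspicious measurement before going back to step 3):
<syntaxhighlight lang="python">
import numpy as np
from scipy.optimize import minimize

def reconcile(y, sigma):
    # step 3: weighted least-squares reconciliation subject to a + b = c
    cons = [{"type": "eq", "fun": lambda v: v[0] + v[1] - v[2]}]
    obj = lambda v: np.sum(((v - y) / sigma) ** 2)
    return minimize(obj, y, constraints=cons).x

y = np.array([10.2, 5.1, 17.0])    # raw measurements; c carries a gross error
sigma = np.array([0.2, 0.1, 0.3])

while True:
    y_star = reconcile(y, sigma)           # step 3: reconciliation
    z = np.abs((y_star - y) / sigma)       # step 4: result check (individual test)
    if z.max() <= 1.96:
        break
    sigma[z.argmax()] *= 2.0               # remediation by relaxation, back to step 3
print(y_star)                              # step 5: store reconciled values
</syntaxhighlight>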
==Applications==
As PDR makes it possible to extract accurate and reliable information about the state of industrial processes from raw measurement data, its results are used, among other things, for monitoring and optimizing the performance of industrial processes.
==See also==
{{Reflist}}
* Alexander, Dave, Tannar, Dave & Wasik, Larry "Mill Information System uses Dynamic Data Reconciliation for Accurate Energy Accounting" TAPPI Fall Conference 2007.[http://www.tappi.org/Downloads/Conference-Papers/2007/07EPE/07epe87.aspx]{{Dead link|date=July 2019 |bot=InternetArchiveBot |fix-attempted=yes }}
* Rankin, J. & Wasik, L. "Dynamic Data Reconciliation of Batch Pulping Processes (for On-Line Prediction)" PAPTAC Spring Conference 2009.
* S. Narasimhan, C. Jordache, ''Data reconciliation and gross error detection: an intelligent use of process data'', Gulf Publishing Company, Houston, 2000.
* V. Veverka, F. Madron, ''Material and Energy Balancing in the Process Industries'', Elsevier Science BV, Amsterdam, 1997.
* J. Romagnoli, M.C. Sanchez, ''Data processing and reconciliation for chemical process operations'', Academic Press, 2000.
[[Category:Data management]]