{{short description|Branch of engineering and mathematics}}
{{About|control theory in engineering|control theory in linguistics|control (linguistics)|control theory in psychology and sociology|control theory (sociology)|and|Perceptual control theory}}
{{Use mdy dates|date=July 2016}}
'''Control theory''' is a field of [[control engineering]] and [[applied mathematics]] that deals with the [[control system|control]] of [[dynamical system]]s in engineered processes and machines. The objective is to develop a model or algorithm governing the application of system inputs to drive the system to a desired state, while minimizing any ''delay'', ''overshoot'', or ''steady-state error'' and ensuring a level of control [[Stability theory|stability]]; often with the aim to achieve a degree of [[Optimal control|optimality]].
To do this, a '''controller''' with the requisite corrective behavior is required. This controller monitors the controlled [[process variable]] (PV), and compares it with the reference or [[Setpoint (control system)|set point]] (SP). The difference between the actual and desired value of the process variable, called the ''error'' signal, or SP-PV error, is applied as feedback to generate a control action to bring the controlled process variable to the same value as the set point. Other aspects which are also studied are [[controllability]] and [[observability]]. Control theory is used in [[control system engineering]] to design automation that has revolutionized manufacturing, aircraft, communications and other industries, and created new fields such as [[robotics]].
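The SP-PV feedback loop described above can be sketched numerically; the proportional gain and the first-order "plant" below are illustrative assumptions, not taken from any particular system.

```python
# Minimal numeric sketch of the SP-PV feedback loop: the controller measures
# the process variable (PV), forms the SP - PV error, and applies a
# proportional corrective action to a toy first-order plant.

def run_loop(setpoint=1.0, kp=0.5, steps=50):
    pv = 0.0                          # process variable starts away from SP
    for _ in range(steps):
        error = setpoint - pv         # SP - PV error signal
        u = kp * error                # proportional control action
        pv += 0.2 * (u - 0.1 * pv)    # toy first-order plant response
    return pv

final = run_loop()
print(final)  # settles near, but not exactly at, the set point
```

Note that a proportional-only controller leaves a steady-state offset; adding integral action, as in a PID controller, removes it.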
Control theory dates from the 19th century, when the theoretical basis for the operation of governors was first described by [[James Clerk Maxwell]].<ref>{{cite journal |first=J. C. |last=Maxwell |author-link=James Clerk Maxwell |title=On Governors |date=1868 |journal=Proceedings of the Royal Society |volume=100 |url=https://upload.wikimedia.org/wikipedia/commons/b/b1/On_Governors.pdf |archive-url=https://web.archive.org/web/20081219051207/http://upload.wikimedia.org/wikipedia/commons/b/b1/On_Governors.pdf |archive-date=2008-12-19 |url-status=live}}</ref> Control theory was further advanced by [[Edward Routh]] in 1874, [[Jacques Charles François Sturm|Charles Sturm]], and in 1895 [[Adolf Hurwitz]], who all contributed to the establishment of control stability criteria; and from 1922 onwards, by the development of [[PID control]] theory by [[Nicolas Minorsky]].<ref>{{cite journal |last=Minorsky |first=Nicolas |author-link=Nicolas Minorsky |title=Directional stability of automatically steered bodies |journal=Journal of the American Society of Naval Engineers |year=1922 |volume=34 |pages=280–309 |issue=2 |doi=10.1111/j.1559-3584.1922.tb04958.x}}</ref>
Although a major application of mathematical control theory is in [[control systems engineering]], which deals with the design of [[process control]] systems for industry, other applications range far beyond this. As the general theory of feedback systems, control theory is useful wherever feedback occurs.
==History==
{{see also|Control engineering#History}}
[[File:Boulton and Watt centrifugal governor-MJ.jpg|thumb|right|[[Centrifugal governor]] in a [[Boulton & Watt engine]] of 1788]]
Although control systems of various types date back to antiquity, a more formal analysis of the field began with a dynamics analysis of the [[centrifugal governor]], conducted by the physicist [[James Clerk Maxwell]] in 1868, entitled ''On Governors''.<ref name="Maxwell1867">{{cite journal|author=Maxwell, J.C.|year=1868|title=On Governors|journal=Proceedings of the Royal Society of London|volume=16|pages=270–283|doi=10.1098/rspl.1867.0055|jstor=112510|doi-access=free}}</ref> This described and analyzed the phenomenon of self-oscillation, in which lags in the system may lead to overcompensation and unstable behavior. This generated a flurry of interest in the topic, during which Maxwell's classmate, [[Edward John Routh]], abstracted Maxwell's results for the general class of linear systems.<ref name="Routh1975">{{cite book |author=Routh, E.J. |author2=Fuller, A.T. |title=Stability of Motion |publisher=Taylor & Francis |year=1975}}</ref>
* ''[[Nonlinear control theory]]'' – This covers a wider class of systems that do not obey the superposition principle, and applies to more real-world systems because all real control systems are nonlinear. These systems are often governed by [[nonlinear differential equation]]s. The few mathematical techniques which have been developed to handle them are more difficult and much less general, often applying only to narrow categories of systems. These include [[limit cycle]] theory, [[Poincaré map]]s, [[Lyapunov function|Lyapunov stability theorem]], and [[describing function]]s. Nonlinear systems are often analyzed using [[numerical method]]s on computers, for example by [[simulation|simulating]] their operation using a [[simulation language]]. If only solutions near a stable point are of interest, nonlinear systems can often be [[linearization|linearized]] by approximating them by a linear system using [[perturbation theory]], and linear techniques can be used.<ref>{{cite web| url = http://www.mathworks.com/help/toolbox/simulink/slref/trim.html| title = trim point}}</ref>
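The linearization mentioned above can be sketched numerically; the damped pendulum below is an illustrative nonlinear system, not one from the text, and near its stable point <math>\theta = 0</math> the approximation <math>\sin\theta \approx \theta</math> gives its linearization.

```python
# Numeric sketch of linearizing a nonlinear system near a stable point.
# Nonlinear pendulum: theta'' = -sin(theta) - 0.5*theta'
# Linearization:      theta'' = -theta      - 0.5*theta'
import math

def simulate(accel, theta0=0.1, dt=0.001, steps=5000):
    th, om = theta0, 0.0
    for _ in range(steps):
        th, om = th + dt * om, om + dt * accel(th, om)  # forward Euler
    return th

nonlinear = simulate(lambda th, om: -math.sin(th) - 0.5 * om)
linear = simulate(lambda th, om: -th - 0.5 * om)
print(nonlinear, linear)  # nearly identical for a small initial angle
```

For large initial angles the two trajectories diverge, which is why the linear approximation is valid only near the stable point.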
==Analysis techniques==
Mathematical techniques for analyzing and designing control systems fall into two different categories:
* ''[[Frequency ___domain]]'' – In this type the values of the [[state variable]]s, the mathematical [[variable (mathematics)|variables]] representing the system's input, output and feedback are represented as functions of [[frequency]]. The input signal and the system's [[transfer function]] are converted from time functions to functions of frequency by a [[transform (mathematics)|transform]] such as the [[Fourier transform]], [[Laplace transform]], or [[Z transform]]. The advantage of this technique is that it results in a simplification of the mathematics; the ''[[differential equation]]s'' that represent the system are replaced by ''[[algebraic equation]]s'' in the frequency ___domain which is much simpler to solve. However, frequency ___domain techniques can only be used with linear systems, as mentioned above.
* ''[[Time-___domain state space representation]]'' – In this type the values of the [[state variable]]s are represented as functions of time. With this model, the system being analyzed is represented by one or more [[differential equation]]s. Since frequency ___domain techniques are limited to [[linear function|linear]] systems, time ___domain is widely used to analyze real-world nonlinear systems. Although these are more difficult to solve, modern computer simulation techniques such as [[simulation language]]s have made their analysis routine.
In contrast to the frequency-___domain analysis of classical control theory, modern control theory utilizes the time-___domain [[State space (controls)|state-space representation]], a mathematical model of a physical system as a set of input, output and state variables related by first-order differential equations.
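The frequency-___domain idea above can be illustrated with a small check: the Laplace transform turns the first-order ODE <math>\tau \dot y + y = u</math> into the algebraic relation <math>Y(s) = U(s)/(\tau s + 1)</math>, and for a unit-step input inverting that algebra gives <math>y(t) = 1 - e^{-t/\tau}</math>. The time constant below is an arbitrary illustrative value.

```python
# Compare direct time-___domain integration of tau*y' + y = 1 against the
# closed-form step response obtained via the frequency-___domain (Laplace) route.
import math

tau, dt, T = 2.0, 1e-4, 5.0
y, t = 0.0, 0.0
while t < T:
    y += dt * (1.0 - y) / tau           # time-___domain integration of the ODE
    t += dt

closed_form = 1.0 - math.exp(-T / tau)  # from the frequency-___domain solution
print(y, closed_form)
```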
==System interfacing==
Control systems can be divided into different categories depending on the number of inputs and outputs.
* [[Single-input single-output system|Single-input single-output]] (SISO) – This is the simplest and most common type, in which one output is controlled by one control signal. Examples include the cruise control system discussed above, and an [[audio system]], in which the control input is the input audio signal and the output is the sound waves from the speaker.
* [[Multiple-input multiple-output system|Multiple-input multiple-output]] (MIMO) – These are found in more complicated systems. For example, modern large [[telescope]]s such as the [[Keck telescopes|Keck]] and [[MMT Observatory|MMT]] have mirrors composed of many separate segments each controlled by an [[actuator]]. The shape of the entire mirror is constantly adjusted by a MIMO [[active optics]] control system using input from multiple sensors at the focal plane, to compensate for changes in the mirror shape due to thermal expansion, contraction, stresses as it is rotated and distortion of the [[wavefront]] due to turbulence in the atmosphere. Complicated systems such as [[nuclear reactor]]s and human [[cell (biology)|cells]] are simulated by a computer as large MIMO control systems.
===Classical SISO system design ===
The scope of classical control theory is limited to single-input and single-output (SISO) system design, except when analyzing for disturbance rejection using a second input. The system analysis is carried out in the time ___domain using [[differential equations]], in the complex-s ___domain with the [[Laplace transform]], or in the frequency ___domain by transforming from the complex-s ___domain. Many systems may be assumed to have a second-order, single-variable system response in the time ___domain. A controller designed using classical theory often requires on-site tuning due to design approximations. Yet, due to the easier physical implementation of classical controller designs as compared to systems designed using modern control theory, these controllers are preferred in most industrial applications. The most common controllers designed using classical control theory are [[PID controller]]s. A less common implementation may include a lead or lag filter, or both. The ultimate goal is to meet requirements typically provided in the time ___domain as the step response, or at times in the frequency ___domain as the open-loop response. The step response characteristics applied in a specification are typically percent overshoot, settling time, etc. The open-loop response characteristics applied in a specification are typically gain and phase margins and bandwidth. These characteristics may be evaluated through simulation including a dynamic model of the system under control coupled with the compensation model.
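A minimal discrete sketch of the PID controllers mentioned above follows; the gains and the first-order plant are illustrative assumptions, not tuned values from the text.

```python
# Discrete PID controller driving a toy first-order plant to a unit set point.
# The integral term accumulates past error; the derivative term reacts to the
# error's rate of change.

def pid_step_response(kp=2.0, ki=1.0, kd=0.1, dt=0.01, steps=2000):
    sp, pv = 1.0, 0.0                     # set point and process variable
    integral, prev_error = 0.0, sp - pv
    for _ in range(steps):
        error = sp - pv
        integral += error * dt            # integral of the error
        derivative = (error - prev_error) / dt
        u = kp * error + ki * integral + kd * derivative
        pv += dt * (u - pv)               # toy first-order plant: pv' = u - pv
        prev_error = error
    return pv

print(pid_step_response())  # integral action removes the steady-state offset
```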
===Modern MIMO system design===
Modern control theory is carried out in the [[State space (controls)|state space]], and can deal with multiple-input and multiple-output (MIMO) systems. This overcomes the limitations of classical control theory in more sophisticated design problems, such as fighter aircraft control, with the limitation that no frequency ___domain analysis is possible. In modern design, a system is represented to the greatest advantage as a set of decoupled first order [[differential equation]]s defined using [[state variables]]. [[Nonlinear control|Nonlinear]], [[multivariable control|multivariable]], [[adaptive control|adaptive]] and [[robust control]] theories come under this division. Being fairly new, modern control theory has many areas yet to be explored. Scholars like [[Rudolf E. Kálmán]] and [[Aleksandr Lyapunov]] are well known among the people who have shaped modern control theory.
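The state-space form described above, a set of first-order differential equations <math>\dot x = Ax + Bu,\; y = Cx</math>, can be sketched as follows; the matrices form an arbitrary stable example, not a system from the text.

```python
# State-space simulation of x' = A*x + B*u, y = C*x by forward Euler.
import numpy as np

A = np.array([[0.0, 1.0], [-2.0, -3.0]])   # eigenvalues -1 and -2: stable
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])

def step_output(u=1.0, dt=1e-3, steps=10000):
    x = np.zeros((2, 1))
    for _ in range(steps):
        x = x + dt * (A @ x + B * u)       # forward-Euler integration
    return (C @ x).item()

print(step_output())  # approaches the DC gain -C A^{-1} B = 0.5
```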
==Topics in control theory==
The difference between the two cases is simply due to the traditional method of plotting continuous time versus discrete time transfer functions. The continuous Laplace transform is in [[Cartesian coordinates]] where the <math>x</math> axis is the real axis and the discrete Z-transform is in [[circular coordinates]] where the <math>\rho</math> axis is the real axis.
When the appropriate conditions above are satisfied a system is said to be [[asymptotic stability|asymptotically stable]]; the variables of an asymptotically stable control system always decrease from their initial value and do not show permanent oscillations. Permanent oscillations occur when a pole has a real part exactly equal to zero (in the continuous time case) or a [[Absolute value#Complex_numbers|modulus]] equal to one (in the discrete time case). If a simply stable system response neither decays nor grows over time, and has no oscillations, it is [[marginal stability|marginally stable]]; in this case the system transfer function has non-repeated poles at the complex plane origin (i.e. their real and complex component is zero in the continuous time case). Oscillations are present when poles with real part equal to zero have an imaginary part not equal to zero.
If a system in question has an [[impulse response]] of <math>x[n] = 0.5^n u[n]</math>, then the Z-transform is <math>X(z) = \frac{1}{1 - 0.5z^{-1}}</math>, which has a pole at <math>z = 0.5</math> (zero imaginary part). This system is BIBO (asymptotically) stable since the pole is inside the unit circle. However, if the impulse response were <math>x[n] = 1.5^n u[n]</math>, then the Z-transform would be <math>X(z) = \frac{1}{1 - 1.5z^{-1}}</math>, which has a pole at <math>z = 1.5</math> and is not BIBO stable since the pole has a modulus strictly greater than one.
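The discrete-time pole condition above can be checked numerically: a geometric impulse response <math>x[n] = a^n u[n]</math> has a single pole at <math>z = a</math>, and it is BIBO stable exactly when <math>|a| < 1</math>.

```python
# The magnitude of x[n] = a**n * u[n] far along the response decays when the
# pole a lies inside the unit circle and diverges when it lies outside.

def impulse_tail(a, n=200):
    return abs(a) ** n                 # magnitude of x[n] for large n

print(impulse_tail(0.5))               # pole inside the unit circle: decays
print(impulse_tail(1.5))               # pole outside the unit circle: grows
```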
===Model identification and robustness===
A control system must always have some robustness property. A [[robust control]]ler is one whose properties do not change much when it is applied to a system slightly different from the mathematical model used for its synthesis. This requirement is important, as no real physical system truly behaves like the series of differential equations used to represent it mathematically. Typically a simpler mathematical model is chosen in order to simplify calculations; indeed, the true [[system dynamics]] can be so complicated that a complete model is impossible.
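The robustness idea above can be sketched numerically: an integral-action controller designed for a nominal plant gain still drives the tracking error to zero when the true gain differs modestly from the model. All numeric values below are illustrative assumptions.

```python
# PI controller on a first-order plant pv' = g*u - pv; the controller was
# "designed" assuming g = 1.0, and we evaluate it on a perturbed plant too.

def final_error(true_gain, kp=1.0, ki=0.5, dt=0.01, steps=5000):
    pv, integral = 0.0, 0.0
    for _ in range(steps):
        error = 1.0 - pv
        integral += error * dt
        u = kp * error + ki * integral
        pv += dt * (true_gain * u - pv)   # plant with true (possibly off) gain
    return 1.0 - pv

print(final_error(1.0), final_error(1.3))  # nominal vs. perturbed plant
```

Integral action makes zero steady-state error insensitive to the gain mismatch, though transient performance and stability margins do degrade as the mismatch grows.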
;System identification
;List of the main control techniques
*[[Optimal control]] is a particular control technique in which the control signal optimizes a certain "cost index": for example, in the case of a satellite, the jet thrusts needed to bring it to a desired trajectory while consuming the least amount of fuel. Two optimal control design methods have been widely used in industrial applications, as it has been shown they can guarantee closed-loop stability. These are [[Model Predictive Control]] (MPC) and [[linear-quadratic-Gaussian control]] (LQG). The first can more explicitly take into account constraints on the signals in the system, which is an important feature in many industrial processes. However, the "optimal control" structure in MPC is only a means to achieve such a result, as it does not optimize a true performance index of the closed-loop control system. Together with PID controllers, MPC systems are the most widely used control techniques in [[process control]].
*[[Robust control]] deals explicitly with uncertainty in its approach to controller design. Controllers designed using ''robust control'' methods tend to be able to cope with small differences between the true system and the nominal model used for design.<ref>{{cite journal|last1=Melby|first1=Paul|display-authors=etal|title=Robustness of Adaptation in Controlled Self-Adjusting Chaotic Systems |journal=Fluctuation and Noise Letters |volume=02|issue=4|pages=L285–L292|date=2002|doi=10.1142/S0219477502000919}}</ref> The early methods of [[Hendrik Wade Bode|Bode]] and others were fairly robust; the state-space methods invented in the 1960s and 1970s were sometimes found to lack robustness. Examples of modern robust control techniques include [[H-infinity loop-shaping]] developed by Duncan McFarlane and [[Keith Glover]], [[Sliding mode control]] (SMC) developed by [[Vadim Utkin]], and safe protocols designed for control of large heterogeneous populations of electric loads in Smart Power Grid applications.<ref name='TCL1'>{{cite journal|title=Safe Protocols for Generating Power Pulses with Heterogeneous Populations of Thermostatically Controlled Loads |author=N. A. Sinitsyn, S. Kundu, S. Backhaus |journal=[[Energy Conversion and Management]]|volume=67|year=2013|pages=297–308|arxiv=1211.0248|doi=10.1016/j.enconman.2012.11.021|bibcode=2013ECM....67..297S |s2cid=32067734 }}</ref> Robust methods aim to achieve robust performance and/or [[Stability theory|stability]] in the presence of small modeling errors.
*[[Stochastic control]] deals with control design under uncertainty in the model. In typical stochastic control problems, it is assumed that random noise and disturbances exist in the model and the controller, and the control design must take these random deviations into account.
*[[Adaptive control]] uses on-line identification of the process parameters, or modification of controller gains, thereby obtaining strong robustness properties. Adaptive controls were applied for the first time in the [[aerospace industry]] in the 1950s, and have found particular success in that field.
*A [[hierarchical control system]] is a type of [[control system]] in which a set of devices and governing software is arranged in a [[hierarchical]] [[tree (data structure)|tree]]. When the links in the tree are implemented by a [[computer network]], then that hierarchical control system is also a form of [[networked control system]].
*[[Intelligent control]] uses various AI computing approaches like [[artificial neural networks]], [[Bayesian probability]], [[fuzzy logic]],<ref>{{cite journal | title=A novel fuzzy framework for nonlinear system control| journal=Fuzzy Sets and Systems | year=2010 | last1=Liu |first1=Jie |author2=Wilson Wang |author3=Farid Golnaraghi |author4=Eric Kubica | volume=161 | issue=21 | pages=2746–2759 | doi=10.1016/j.fss.2010.04.009}}</ref> [[machine learning]], [[evolutionary computation]] and [[genetic algorithms]] or a combination of these methods, such as [[neuro-fuzzy]] algorithms, to control a [[dynamic system]].
*[[Self-organized criticality control]] may be defined as attempts to interfere in the processes by which the [[self-organized]] system dissipates energy.
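The optimal-control entry in the list above can be sketched with a linear-quadratic regulator (LQR), whose steady-state gain is obtained by iterating the discrete Riccati recursion. The double-integrator plant and cost weights below are illustrative choices, not taken from the text.

```python
# Discrete-time LQR: iterate the Riccati recursion to a steady-state feedback
# gain K, then regulate a double-integrator plant with u = -K x.
import numpy as np

A = np.array([[1.0, 0.1], [0.0, 1.0]])     # double integrator, sample time 0.1
B = np.array([[0.005], [0.1]])
Q, R = np.eye(2), np.array([[1.0]])        # state and control cost weights

P = Q.copy()
for _ in range(500):                       # Riccati recursion to steady state
    K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
    P = Q + A.T @ P @ (A - B @ K)
K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)

x = np.array([[1.0], [0.0]])               # initial state offset
for _ in range(300):
    x = (A - B @ K) @ x                    # closed loop with u = -K x
final_norm = float(np.linalg.norm(x))
print(final_norm)                          # regulated toward the origin
```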
* [[Richard Bellman]] developed [[dynamic programming]] in the 1940s.<ref>{{cite magazine |author=Richard Bellman |date=1964 |title=Control Theory |doi=10.1038/scientificamerican0964-186 |magazine=[[Scientific American]] |volume=211 |issue=3 |pages=186–200|author-link=Richard Bellman }}</ref>
* [[Warren E. Dixon]], control theorist and a professor at the [[University of Florida]]
* [[Kyriakos G. Vamvoudakis]], developed synchronous reinforcement learning algorithms to solve optimal control and game theoretic problems
* [[Andrey Kolmogorov]] co-developed the [[Wiener filter|Wiener–Kolmogorov filter]] in 1941.
* [[Norbert Wiener]] co-developed the Wiener–Kolmogorov filter and coined the term [[cybernetics]] in the 1940s.
;Other related topics
{{colbegin}}
* {{annotated link|Outline of management}}
{{colend}}
*{{cite book | author= Robert F. Stengel | title= Optimal Control and Estimation | publisher= Dover Publications | year= 1994 | isbn=978-0-486-68200-6 }}
* {{cite book |last=Franklin |title=Feedback Control of Dynamic Systems |edition=4 |year=2002 |publisher=Prentice Hall |___location=New Jersey |isbn=978-0-13-032393-4 |display-authors=etal }}
* {{cite book |author1=Joseph L. Hellerstein |author2=Dawn M. Tilbury|author2-link= Dawn Tilbury |author3=Sujay Parekh | title= Feedback Control of Computing Systems | publisher= John Wiley and Sons | year= 2004 | isbn=978-0-471-26637-2}}
*{{cite book | author= [[Diederich Hinrichsen]] and Anthony J. Pritchard | title= Mathematical Systems Theory I – Modelling, State Space Analysis, Stability and Robustness | publisher= Springer | year= 2005 | isbn=978-3-540-44125-0 }}
*{{cite book | last = Sontag | first = Eduardo | author-link = Eduardo D. Sontag | year = 1998 | title = Mathematical Control Theory: Deterministic Finite Dimensional Systems. Second Edition | publisher = Springer | url = http://www.sontaglab.org/FTPDIR/sontag_mathematical_control_theory_springer98.pdf | isbn = 978-0-387-98489-6 }}
* {{cite book | last = Goodwin | first = Graham | year = 2001 | title = Control System Design | publisher = Prentice Hall | isbn = 978-0-13-958653-8 }}
* {{cite book | author= Christophe Basso | year = 2012 | title = Designing Control Loops for Linear and Switching Power Supplies: A Tutorial Guide.| publisher = Artech House | url = http://cbasso.pagesperso-orange.fr/Spice.htm | isbn = 978-1608075577 }}
[[Category:Control engineering]]
[[Category:Computer engineering]]