Control (optimal control theory)

In optimal control theory, a control is a variable chosen by the controller or agent to manipulate state variables, analogous to a physical control such as a valve. Unlike the state variable, it does not have a predetermined equation of motion.[1] The goal of optimal control theory is to choose controls as functions of time so that the state variables follow an optimal path with respect to a loss function.
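The distinction can be illustrated with a minimal sketch, not drawn from the source: a discrete-time linear-quadratic problem in which the state follows a fixed equation of motion while the control is freely chosen each period to minimize a quadratic loss. All particular values here (the dynamics coefficients a and b, the cost weights q and r, the horizon T) are hypothetical example choices.

```python
# Scalar linear-quadratic control sketch (hypothetical example values).
# The state x obeys a fixed equation of motion x_{t+1} = a*x_t + b*u_t;
# the control u_t is chosen by the controller to minimize the loss
# sum_t (q*x_t^2 + r*u_t^2).

def lqr_gains(a, b, q, r, T):
    """Backward Riccati recursion; returns feedback gains K_t (u_t = -K_t * x_t)."""
    P = q  # terminal cost weight
    gains = []
    for _ in range(T):
        K = (a * b * P) / (r + b * b * P)
        P = q + a * a * P - (a * b * P) ** 2 / (r + b * b * P)
        gains.append(K)
    gains.reverse()  # gains[t] is the gain applied at time t
    return gains

a, b, q, r, T, x0 = 1.1, 1.0, 1.0, 1.0, 20, 5.0
gains = lqr_gains(a, b, q, r, T)

x, states, controls = x0, [x0], []
for K in gains:
    u = -K * x          # control: freely chosen by the controller
    x = a * x + b * u   # state: determined by its equation of motion
    controls.append(u)
    states.append(x)
```

Here the uncontrolled system (u = 0) is unstable since a > 1, but the chosen controls drive the state toward zero, which is what "an optimal path for the state variables" means in this quadratic setting.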

References

  1. Ferguson, Brian S.; Lim, G. C. (1998). Introduction to Dynamic Economic Problems. Manchester: Manchester University Press. p. 162. ISBN 0-7190-4996-2.