Uncertainty and risk are integral to engineering because real systems exhibit inherent ambiguities, arising either naturally or from our inability to model complex physics. In this book, the authors discuss probability theory, stochastic processes, estimation, and stochastic control strategies, and show how probability can be used to model uncertainty in control and estimation problems. The material is practical and rich in research opportunities.
The book provides a comprehensive treatment of stochastic systems, from the foundations of probability to stochastic optimal control. It covers discrete- and continuous-time stochastic dynamic systems, leading to the derivation of the Kalman filter, its properties, and its relation to the frequency-domain Wiener filter. It also presents the dynamic programming derivation of the linear quadratic Gaussian (LQG) and linear exponential Gaussian (LEG) controllers, along with their relation to H2 and H∞ controllers and to system robustness.
This book is divided into three related sections. First, it treats the concepts of probability theory, random variables, and stochastic processes, which lead to the topics of expectation, conditional expectation, discrete-time estimation, and the Kalman filter. After establishing this foundation, stochastic calculus and continuous-time estimation are introduced. Finally, dynamic programming for both discrete-time and continuous-time systems leads to the solution of optimal stochastic control problems, yielding controllers with significant practical application.
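To give a flavor of the estimators developed in the first section, the following is a minimal sketch of a scalar discrete-time Kalman filter. It is not taken from the book; the random-walk model and all parameter values (`q`, `r`, `x0`, `p0`) are assumptions chosen purely for illustration.

```python
import numpy as np

def kalman_filter(measurements, q=1e-5, r=0.1, x0=0.0, p0=1.0):
    """Filter a scalar random-walk state observed in Gaussian noise.

    Assumed model (illustrative only):
        x_{k+1} = x_k + w_k,  w_k ~ N(0, q)   # process
        y_k     = x_k + v_k,  v_k ~ N(0, r)   # measurement
    """
    x, p = x0, p0          # state estimate and its error variance
    estimates = []
    for y in measurements:
        # Predict: state is unchanged; uncertainty grows by process noise.
        p = p + q
        # Update: Kalman gain blends the prediction with the measurement.
        k = p / (p + r)
        x = x + k * (y - x)
        p = (1.0 - k) * p
        estimates.append(x)
    return estimates

# Usage: recover a constant signal from 200 noisy observations.
rng = np.random.default_rng(0)
true_value = 1.0
ys = true_value + rng.normal(0.0, 0.3, size=200)
est = kalman_filter(ys, r=0.09)
```

After enough measurements, the estimate settles near the true value, with the gain `k` shrinking as the filter's confidence grows.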
The book is suitable for postgraduate students in electrical, mechanical, chemical, and aerospace engineering specializing in systems and control. Students in computer science, economics, and business management will also find it useful.

ISBN: 9788120346826
Pages: 400