
Mathematical Theory of Control Systems Design

by V.N. Afanasiev, V. Kolmanovskii, V.R. Nosov

Paperback | December 6, 2010


About

The many interesting topics covered in Mathematical Theory of Control Systems Design are spread over an Introduction and four parts. Each chapter concludes with a brief review of the main results and formulae, and each part ends with an exercise section. Part One treats the fundamentals of modern stability theory. Part Two is devoted to the optimal control of deterministic systems. Part Three is concerned with problems of the control of systems under random disturbances of their parameters, and Part Four provides an outline of modern numerical methods of control theory. The many examples included illustrate the main assertions, teaching the reader the skills needed to construct models of relevant phenomena, to design nonlinear control systems, to explain the qualitative differences between various classes of control systems, and to apply what they have learned to the investigation of particular systems.

Audience: This book will be valuable to both graduate and postgraduate students in such disciplines as applied mathematics, mechanics, engineering, automation and cybernetics.
Title: Mathematical Theory of Control Systems Design
Format: Paperback
Pages: 694
Published: December 6, 2010
Publisher: Springer Netherlands
Language: English

The following ISBNs are associated with this title:

ISBN-10: 9048146151

ISBN-13: 9789048146154


Table of Contents

Preface. Introduction.

Part One: Stability of control systems.
I. Continuous and discrete deterministic systems.
II. Stability of stochastic systems.

Part Two: Control of deterministic systems.
III. Description of control problems.
IV. The classical calculus of variations and optimal control.
V. The maximum principle.
VI. Linear control systems.
VII. Dynamic programming approach. Sufficient conditions for optimal control.
VIII. Some additional topics of optimal control theory.

Part Three: Optimal control of dynamical systems under random disturbances.
IX. Control of stochastic systems. Problem statements and investigation techniques.
X. Optimal control on a time interval of random duration.
XI. Optimal estimation of the state of the system.
XII. Optimal control of the observation process.

Part Four: Numerical methods in control systems.
XIII. Linear time-invariant control systems.
XIV. Numerical methods for the investigation of nonlinear control systems.
XV. Numerical design of optimal control systems.

General references. Subject index.