Optimal Control and the Calculus of Variations by Enid R. Pinch

Optimal Control and the Calculus of Variations

by Enid R. Pinch

Paperback | March 1, 1993


Pricing and Purchase Info

$63.85 online 
$107.50 list price save 40%

About

A paperback edition of this successful textbook for final-year undergraduate mathematicians and control-engineering students, this book contains exercises and many worked examples, with complete solutions and hints, making it ideal not only as a class textbook but also for individual study. The introduction to optimal control begins by considering the problem of minimizing a function of many variables, before moving on to the main subject: the optimal control of systems governed by ordinary differential equations.
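The book's starting point, minimizing a function of several variables, can be sketched numerically. The function, step size, and iteration count below are illustrative choices, not taken from the text:

```python
# Minimal sketch: minimizing a function of two variables by gradient
# descent. The quadratic f and its step size are illustrative only.

def f(x, y):
    return (x - 1.0) ** 2 + 2.0 * (y + 3.0) ** 2

def grad_f(x, y):
    # Analytic partial derivatives of f.
    return 2.0 * (x - 1.0), 4.0 * (y + 3.0)

def gradient_descent(x, y, step=0.1, iterations=200):
    # Step against the gradient until (near) a critical point.
    for _ in range(iterations):
        gx, gy = grad_f(x, y)
        x, y = x - step * gx, y - step * gy
    return x, y

x_min, y_min = gradient_descent(0.0, 0.0)
# Converges toward the unique minimizer (1, -3).
```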

About The Author

Enid R. Pinch is at the University of Manchester.

Details & Specs

Title: Optimal Control and the Calculus of Variations
Format: Paperback
Dimensions: 242 pages, 9.21 × 6.14 × 0.59 in
Published: March 1, 1993
Publisher: Oxford University Press

The following ISBNs are associated with this title:

ISBN-10: 0198514891

ISBN-13: 9780198514893



Table of Contents

Part 1: Introduction
1.1. The maxima and minima of functions
1.2. The calculus of variations
1.3. Optimal control
Part 2: Optimization in ℝⁿ
2.1. Functions of one variable
2.2. Critical points, end-points, and points of discontinuity
2.3. Functions of several variables
2.4. Minimization with constraints
2.5. A geometrical interpretation
2.6. Distinguishing maxima from minima
Part 3: The calculus of variations
3.1. Problems in which the end-points are not fixed
3.2. Finding minimizing curves
3.3. Isoperimetric problems
3.4. Sufficiency conditions
3.5. Fields of extremals
3.6. Hilbert's invariant integral
3.7. Semi-fields and the Jacobi condition
Part 4: Optimal Control I: Theory
4.1. Introduction
4.2. Control of a simple first-order system
4.3. Systems governed by ordinary differential equations
4.4. The optimal control problem
4.5. The Pontryagin maximum principle
4.6. Optimal control to target curves
Part 5: Optimal Control II: Applications
5.1. Time-optimal control of linear systems
5.2. Optimal control to target curves
5.3. Singular controls
5.4. Fuel-optimal controls
5.5. Problems where the cost depends on x(t1)
5.6. Linear systems with quadratic cost
5.7. The steady-state Riccati equation
5.8. The calculus of variations revisited
Part 6: Proof of the Maximum Principle of Pontryagin
6.1. Convex sets in ℝⁿ
6.2. The linearized state equations
6.3. Behaviour of H on an optimal path
6.4. Sufficiency conditions for optimal control
Appendix: Answers and hints for the exercises
Bibliography
Index
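In the spirit of Part 4's simple first-order system, a bounded bang-bang control can be sketched numerically. The system x' = -x + u, the bound |u| ≤ 1, and all numeric parameters here are illustrative assumptions, not worked examples from the book:

```python
# Illustrative sketch: driving the first-order system x' = -x + u toward
# the origin with a bang-bang control held at its bounds, |u| <= 1.

def simulate(x0, dt=0.001, t_max=10.0):
    # Forward-Euler integration; the control switches sign with the state.
    x, t = x0, 0.0
    while abs(x) > 1e-3 and t < t_max:
        u = -1.0 if x > 0 else 1.0   # control saturated at a bound
        x += dt * (-x + u)
        t += dt
    return x, t

x_final, t_final = simulate(2.0)
# From x0 = 2 the exact solution x(t) = 3*exp(-t) - 1 reaches the
# origin at t = ln 3, roughly 1.1 time units.
```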

From Our Editors

This book provides the core material for undergraduate courses on optimal control, the modern development that has grown out of the calculus of variations and classical optimization theory.

Editorial Reviews

"The author has achieved his aim. Anyone who is curious to know what optimal control theory is all about, or who wishes to begin specializing in this field, would benefit by having this book close at hand. Technical libraries should acquire it, too. . . . highly recommended." --Applied Mechanics Review