Theory of Global Random Search by Anatoly A. Zhigljavsky

Theory of Global Random Search

by Anatoly A. Zhigljavsky. Editor: János D. Pintér

Paperback | September 27, 2012


'Et moi, ..., si j'avait su comment en revenir, je n'y serais point allé.' ('And I, ..., had I known how to come back, I would never have gone.') Jules Verne

'One service mathematics has rendered the human race. It has put common sense back where it belongs, on the topmost shelf next to the dusty canister labelled "discarded nonsense".' Eric T. Bell

'The series is divergent; therefore we may be able to do something with it.' O. Heaviside

Mathematics is a tool for thought. A highly necessary tool in a world where both feedback and nonlinearities abound. Similarly, all kinds of parts of mathematics serve as tools for other parts and for other sciences. Applying a simple rewriting rule to the quote above one finds such statements as: 'One service topology has rendered mathematical physics ...'; 'One service logic has rendered computer science ...'; 'One service category theory has rendered mathematics ...'. All arguably true. And all statements obtainable this way form part of the raison d'être of this series.
Title: Theory of Global Random Search
Format: Paperback
Dimensions: 341 pages
Published: September 27, 2012
Publisher: Springer-Verlag/Sci-Tech/Trade
Language: English

The following ISBNs are associated with this title:

ISBN-10: 940105519X

ISBN-13: 9789401055192


Table of Contents

Part 1. Global Optimization: An Overview

1. Global Optimization Theory: General Concepts
1.1. Statements of the global optimization problem
1.2. Types of prior information about the objective function and a classification of methods
1.2.1. Types of prior information
1.2.2. Classification of principal approaches and methods of global optimization
1.2.3. General properties of multiextremal functions
1.3. Comparison and practical use of global optimization algorithms
1.3.1. Numerical comparison
1.3.2. Theoretical comparison criteria
1.3.3. Practical optimization problems

2. Global Optimization Methods
2.1. Global optimization algorithms based on the use of local search techniques
2.1.1. Local optimization algorithms
2.1.2. Use of local algorithms in constructing global optimization strategies
2.1.3. Multistart
2.1.4. Tunneling algorithms
2.1.5. Methods of transition from one local minimizer into another
2.1.6. Algorithms based on smoothing the objective function
2.2. Set covering methods
2.2.1. Grid algorithms (passive coverings)
2.2.2. Sequential covering methods
2.2.3. Optimality of global minimization algorithms
2.3. One-dimensional optimization, reduction and partition techniques
2.3.1. One-dimensional global optimization
2.3.2. Dimension reduction in multiextremal problems
2.3.3. Reducing global optimization to other problems in computational mathematics
2.3.4. Branch and bound methods
2.4. An approach based on stochastic and axiomatic models of the objective function
2.4.1. Stochastic models
2.4.2. Global optimization methods based on stochastic models
2.4.3. The Wiener process case
2.4.4. Axiomatic approach
2.4.5. Information-statistical approach

Part 2. Global Random Search

3. Main Concepts and Approaches of Global Random Search
3.1. Construction of global random search algorithms: Basic approaches
3.1.1. Uniform random sampling
3.1.2. General (nonuniform) random sampling
3.1.3. Ways of improving the efficiency of random sampling algorithms
3.1.4. Random coverings
3.1.5. Formal scheme of global random search
3.1.6. Local behaviour of global random search algorithms
3.2. General results on the convergence of global random search algorithms
3.3. Markovian algorithms
3.3.1. General scheme of Markovian algorithms
3.3.2. Simulated annealing
3.3.3. Methods based on solving stochastic differential equations
3.3.4. Global stochastic approximation: Zielinski's method
3.3.5. Convergence rate of Baba's algorithm
3.3.6. The case of high dimension

4. Statistical Inference in Global Random Search
4.1. Some ways of applying statistical procedures to construct global random search algorithms
4.1.1. Regression analysis and design
4.1.2. Cluster analysis and pattern recognition
4.1.3. Estimation of the cumulative distribution function, its density, mode and level surfaces
4.1.4. Statistical modelling (Monte Carlo method)
4.1.5. Design of experiments
4.2. Statistical inference concerning the maximum of a function
4.2.1. Statement of the problem and a survey of the approaches for its solution
4.2.2. Statistical inference construction for estimating M
4.2.3. Statistical inference for M, when the value of the tail index α is known
4.2.4. Statistical inference, when the value of the tail index α is unknown
4.2.5. Estimation of F(t)
4.2.6. Prior determination of the value of the tail index α
4.2.7. Exponential complexity of the uniform random sampling algorithm
4.3. Branch and probability bound methods
4.3.1. Prospectiveness criteria
4.3.2. The essence of branch and bound procedures
4.3.3. Principal construction of branch and probability bound methods
4.3.4. Typical variants of the branch and probability bound method
4.4. Stratified sampling
4.4.1. Organization of stratified sampling
4.4.2. Statistical inference for the maximum of a function based on its values at the points of a stratified sample
4.4.3. Dominance of stratified over independent sampling
4.5. Statistical inference in random multistart
4.5.1. Problem statement
4.5.2. Bounded number of local maximizers
4.5.3. Bayesian approach
4.6. An approach based on distributions neutral to the right
4.6.1. Random distributions neutral to the right and their properties
4.6.2. Bayesian testing about quantiles of random distributions
4.6.3. Application of distributions neutral to the right to construct global random search algorithms

5. Methods of Generations
5.1. Description of algorithms and formulation of the basic probabilistic model
5.1.1. Algorithms
5.1.2. The basic probabilistic model
5.2. Convergence of probability measure sequences generated by the basic model
5.2.1. Assumptions
5.2.2. Auxiliary statements
5.2.3. Convergence of the sequences (5.2.7) and (5.2.8) to the limit measure
5.3. Methods of generations for eigen-measure functional estimation of linear integral operators
5.3.1. Eigen-measures of linear integral operators
5.3.2. Closeness of eigen-measures to the limit measure
5.3.3. Description of the generation methods
5.3.4. Convergence and rate of convergence of the generation methods
5.4. Sequential analogues of the methods of generations
5.4.1. Functionals of eigen-measures
5.4.2. Sequential maximization algorithms
5.4.3. Narrowing the search area

6. Random Search Algorithms for Solving Specific Problems
6.1. Distribution sampling in random search algorithms for solving constrained optimization problems
6.1.1. Basic concepts
6.1.2. Properties of D(x)
6.1.3. General remarks on sampling
6.1.4. Manifold defined by linear constraints
6.1.5. Uniform distribution on an ellipsoid
6.1.6. Sampling on a hyperboloid
6.1.7. Sampling on a paraboloid
6.1.8. Sampling on a cone
6.2. Random search algorithm construction for optimization in functional spaces, in discrete and in multicriterial problems
6.2.1. Optimization in functional spaces
6.2.2. Random search in multicriterial optimization problems
6.2.3. Discrete optimization
6.2.4. Relative efficiency of discrete random search

Part 3. Auxiliary Results

7. Statistical Inference for the Bounds of Random Variables
7.1. Statistical inference when the tail index of the extreme value distribution is known
7.1.1. Motivation and problem statement
7.1.2. Auxiliary statements
7.1.3. Estimation of M
7.1.4. Confidence intervals for M
7.1.5. Testing statistical hypotheses about M
7.2. Statistical inference when the tail index is unknown
7.2.1. Statistical inference for M
7.2.2. Estimation of α
7.2.3. Construction of confidence intervals and statistical hypothesis test for α
7.3. Asymptotic properties of optimal linear estimates
7.3.1. Results and consequences
7.3.2. Auxiliary statements and proofs of Theorem 7.3.2 and Proposition 7.1.3
7.3.3. Proof of Theorem 7.3.1

8. Several Problems Connected with Global Random Search
8.1. Optimal design in extremal experiments
8.1.1. Extremal experiment design
8.1.2. Optimal selection of the search direction
8.1.3. Experimental design applying the search direction (8.1.15)
8.2. Optimal simultaneous estimation of several integrals by the Monte Carlo method
8.2.1. Problem statement
8.2.2. Assumptions
8.2.3. Existence and uniqueness of optimal densities
8.2.4. Necessary and sufficient optimality conditions
8.2.5. Construction and structure of optimal densities
8.2.6. Structure of optimal densities for nondifferentiable criteria
8.2.7. Connection with the regression design theory
8.3. Projection estimation of multivariate regression
8.3.1. Problem statement
8.3.2. Bias and random inaccuracies of nonparametric estimates
8.3.3. Examples of asymptotically optimal projection procedures with deterministic designs
8.3.4. Projection estimation via evaluations at random points

References
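To give a flavour of the book's subject, the simplest scheme in its taxonomy, uniform random sampling with record-keeping (section 3.1.1, often called pure random search), can be sketched as follows. This is an illustrative sketch only, not code from the book; the test function, bounds, sample budget, and all names below are assumptions chosen for the example.

```python
import random

def pure_random_search(f, bounds, n_samples=10_000, seed=0):
    """Pure random search: draw points uniformly in a box and keep the
    best (record) value seen so far. Illustrative sketch; names and
    defaults are assumptions, not taken from Zhigljavsky's book."""
    rng = random.Random(seed)
    best_x, best_val = None, float("inf")
    for _ in range(n_samples):
        # Sample each coordinate uniformly within its bounds.
        x = [rng.uniform(lo, hi) for lo, hi in bounds]
        val = f(x)
        if val < best_val:
            best_x, best_val = x, val
    return best_x, best_val

# Example: minimize a multiextremal function on the box [-5, 5]^2.
# f has two global minimizers, near (+sqrt(2), -1) and (-sqrt(2), -1).
f = lambda x: (x[0] ** 2 - 2) ** 2 + (x[1] + 1) ** 2
x_star, f_star = pure_random_search(f, [(-5, 5), (-5, 5)])
```

Convergence of this baseline scheme is almost sure but slow, which is what motivates the book's more refined algorithms (nonuniform sampling, Markovian schemes, stratified sampling) and its statistical inference about the unattained optimum M.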