Classics in Foreign Mathematics Series 6 (Photoreprint Edition): Numerical Optimization

Book Overview

Publisher: Science Press (科学出版社)

ISBN: 9787030166753

Edition: 1

Product code: 11928475

Binding: Hardcover

Series: Classics in Foreign Mathematics (Photoreprint Edition)

Original title: Numerical Optimization

Format: 16开

Publication date: 2006-01-01

Paper: offset paper

Pages: 636

Word count: 779,000

Language of text: English (photoreprint of the English original)

Book Description

Editorial Recommendation

Intended readers: advanced undergraduate mathematics students, and graduate students in operations research, applied mathematics, and related fields.

This book is essential reading for advanced undergraduates and graduate students in operations research and computational mathematics, and is a classic of numerical optimization.

Synopsis

The author is a professor at Northwestern University in the United States and serves as editor-in-chief or associate editor of several international journals. Drawing on experience from teaching, research, and consulting, the author has written a book suited to both students and practitioners. It provides a comprehensive, up-to-date account of most of the effective methods of continuous optimization; each chapter starts from basic concepts and builds up to the techniques in current use.

The book emphasizes practical methods and contains a wealth of illustrations and exercises, making it accessible to a wide readership. It can serve as a graduate textbook in engineering, operations research, mathematics, computer science, and business, and as a handbook for researchers and practitioners in the field.

In short, the author has striven to make the book readable, rich in content, and rigorous in its treatment, while bringing out the practical value of numerical methods.
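To give a concrete taste of the practical methods the book emphasizes, here is a minimal sketch of a backtracking line search enforcing the Armijo sufficient-decrease condition, one of the step-length criteria treated in Chapter 3 of the contents below. The Rosenbrock test objective, the starting point, and all parameter values are illustrative assumptions, not code from the book.

```python
import numpy as np

def rosenbrock(x):
    """Illustrative test objective (an assumed example, not from the book)."""
    return 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2

def rosenbrock_grad(x):
    """Gradient of the Rosenbrock function."""
    return np.array([
        -400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0]**2),
    ])

def backtracking_line_search(f, grad, x, p, alpha0=1.0, shrink=0.5, c=1e-4):
    """Shrink alpha until the Armijo sufficient-decrease condition
    f(x + alpha*p) <= f(x) + c*alpha*grad(x)'p holds."""
    alpha = alpha0
    fx, gtp = f(x), grad(x) @ p   # gtp < 0 for a descent direction p
    while f(x + alpha * p) > fx + c * alpha * gtp:
        alpha *= shrink
    return alpha

# Steepest descent driven by the backtracking search, from an assumed start.
x = np.array([-1.2, 1.0])
for _ in range(100):
    p = -rosenbrock_grad(x)   # steepest-descent direction
    x = x + backtracking_line_search(rosenbrock, rosenbrock_grad, x, p) * p
print(x)   # drifts slowly toward the minimizer (1, 1)
```

As the book's Chapter 3 discusses, plain steepest descent converges slowly on ill-conditioned problems like this one; the quasi-Newton directions of Chapter 8 address exactly that weakness.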

About the Author

The author is a professor at Northwestern University and serves as editor-in-chief or associate editor of several leading international journals. Drawing on experience from teaching, research, and consulting, the author has written a book suited to both students and practitioners.

Contents

Preface

1 Introduction

Mathematical Formulation

Example: A Transportation Problem

Continuous versus Discrete Optimization

Constrained and Unconstrained Optimization

Global and Local Optimization

Stochastic and Deterministic Optimization

Optimization Algorithms

Convexity

Notes and References

2 Fundamentals of Unconstrained Optimization

2.1 What Is a Solution?

Recognizing a Local Minimum

Nonsmooth Problems

2.2 Overview of Algorithms

Two Strategies: Line Search and Trust Region

Search Directions for Line Search Methods

Models for Trust-Region Methods

Scaling

Rates of Convergence

R-Rates of Convergence

Notes and References

Exercises

3 Line Search Methods

3.1 Step Length

The Wolfe Conditions

The Goldstein Conditions

Sufficient Decrease and Backtracking

3.2 Convergence of Line Search Methods

3.3 Rate of Convergence

Convergence Rate of Steepest Descent

Quasi-Newton Methods

Newton's Method

Coordinate Descent Methods

3.4 Step-Length Selection Algorithms

Interpolation

The Initial Step Length

A Line Search Algorithm for the Wolfe Conditions

Notes and References

Exercises

4 Trust-Region Methods

Outline of the Algorithm

4.1 The Cauchy Point and Related Algorithms

The Cauchy Point

Improving on the Cauchy Point

The Dogleg Method

Two-Dimensional Subspace Minimization

Steihaug's Approach

4.2 Using Nearly Exact Solutions to the Subproblem

Characterizing Exact Solutions

Calculating Nearly Exact Solutions

The Hard Case

Proof of Theorem 4.3

4.3 Global Convergence

Reduction Obtained by the Cauchy Point

Convergence to Stationary Points

Convergence of Algorithms Based on Nearly Exact Solutions

4.4 Other Enhancements

Scaling

Non-Euclidean Trust Regions

Notes and References

Exercises

5 Conjugate Gradient Methods

5.1 The Linear Conjugate Gradient Method

Conjugate Direction Methods

Basic Properties of the Conjugate Gradient Method

A Practical Form of the Conjugate Gradient Method

Rate of Convergence

Preconditioning

Practical Preconditioners

5.2 Nonlinear Conjugate Gradient Methods

The Fletcher-Reeves Method

The Polak-Ribière Method

Quadratic Termination and Restarts

Numerical Performance

Behavior of the Fletcher-Reeves Method

Global Convergence

Notes and References

Exercises

6 Practical Newton Methods

6.1 Inexact Newton Steps

6.2 Line Search Newton Methods

Line Search Newton-CG Method

Modified Newton's Method

6.3 Hessian Modifications

Eigenvalue Modification

Adding a Multiple of the Identity

Modified Cholesky Factorization

Gershgorin Modification

Modified Symmetric Indefinite Factorization

6.4 Trust-Region Newton Methods

Newton-Dogleg and Subspace-Minimization Methods

Accurate Solution of the Trust-Region Problem

Trust-Region Newton-CG Method

Preconditioning the Newton-CG Method

Local Convergence of Trust-Region Newton Methods

Notes and References

Exercises

7 Calculating Derivatives

7.1 Finite-Difference Derivative Approximations

Approximating the Gradient

Approximating a Sparse Jacobian

Approximating the Hessian

Approximating a Sparse Hessian

7.2 Automatic Differentiation

An Example

The Forward Mode

The Reverse Mode

Vector Functions and Partial Separability

Calculating Jacobians of Vector Functions

Calculating Hessians: Forward Mode

Calculating Hessians: Reverse Mode

Current Limitations

Notes and References

Exercises

8 Quasi-Newton Methods

8.1 The BFGS Method

Properties of the BFGS Method

Implementation

8.2 The SR1 Method

Properties of SR1 Updating

8.3 The Broyden Class

Properties of the Broyden Class

8.4 Convergence Analysis

Global Convergence of the BFGS Method

Superlinear Convergence of BFGS

Convergence Analysis of the SR1 Method

Notes and References

Exercises

9 Large-Scale Quasi-Newton and Partially Separable Optimization

9.1 Limited-Memory BFGS

Relationship with Conjugate Gradient Methods

9.2 General Limited-Memory Updating

Compact Representation of BFGS Updating

SR1 Matrices

Unrolling the Update

9.3 Sparse Quasi-Newton Updates

9.4 Partially Separable Functions

A Simple Example

Internal Variables

9.5 Invariant Subspaces and Partial Separability

Sparsity vs. Partial Separability

Group Partial Separability

9.6 Algorithms for Partially Separable Functions

Exploiting Partial Separability in Newton's Method

Quasi-Newton Methods for Partially Separable Functions

Notes and References

Exercises

……

10 Nonlinear Least-Squares Problems

11 Nonlinear Equations

12 Theory of Constrained Optimization

13 Linear Programming: The Simplex Method

14 Linear Programming: Interior-Point Methods

15 Fundamentals of Algorithms for Nonlinear Constrained Optimization

16 Quadratic Programming

17 Penalty, Barrier, and Augmented Lagrangian Methods

18 Sequential Quadratic Programming

A Background Material

References

Index
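To complement the sketch above, here is an equally minimal illustration of the quasi-Newton machinery of Chapter 8: a BFGS loop applying the standard inverse-Hessian update H <- (I - rho*s*y')H(I - rho*y*s') + rho*s*s' with rho = 1/(y's). The driver loop, the curvature guard, and the reuse of the Armijo backtracking idea are illustrative assumptions, not the book's own implementation.

```python
import numpy as np

def bfgs_minimize(f, grad, x0, iters=50, c=1e-4, shrink=0.5):
    """Minimal BFGS sketch: inverse-Hessian update plus Armijo backtracking."""
    n = x0.size
    H = np.eye(n)                      # initial inverse-Hessian approximation
    x, g = x0.astype(float), grad(x0)
    for _ in range(iters):
        p = -H @ g                     # quasi-Newton search direction
        alpha, fx, gtp = 1.0, f(x), g @ p
        while f(x + alpha * p) > fx + c * alpha * gtp:   # Armijo condition
            alpha *= shrink
        x_new = x + alpha * p
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g    # step taken and change in gradient
        if s @ y > 1e-10:              # skip update unless curvature s'y > 0
            rho = 1.0 / (s @ y)
            I = np.eye(n)
            H = ((I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s))
                 + rho * np.outer(s, s))
        x, g = x_new, g_new
    return x

# Example (reusing the rosenbrock functions from the earlier sketch):
# bfgs_minimize(rosenbrock, rosenbrock_grad, np.array([-1.2, 1.0]))
```

With exact line searches, BFGS finds the minimizer of a strongly convex quadratic in at most n steps; on the Rosenbrock example it converges far faster than the steepest-descent loop shown earlier.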

Preface / Foreword

For China's mathematical enterprise to flourish, mathematicians must remain indifferent to fame and fortune and work still harder. At the same time, more favorable external conditions must be created for them to advance the discipline, above all by strengthening support for and investment in mathematics so that mathematicians enjoy better working and living conditions; this includes improving and strengthening mathematical publishing.

Science Press's photoreprinting of a batch of good new books, making them available to Chinese mathematicians at lower prices, and in particular putting them within reach of mathematicians working in remote regions, is without doubt a great service to mathematical research and teaching in China.

This time Science Press has purchased the rights and photoreprinted, in a single batch, 23 mathematics books published by Springer-Verlag; this is a good deed, and one worth continuing. Roughly classified, the 23 titles comprise 5 in pure mathematics, 6 in applied mathematics, and 12 in computational mathematics, with some of an interdisciplinary character. All are quite recent: the great majority, 16 in all, appeared after 2000, and the rest after 1990.

These books let readers quickly reach the frontier of a given area of mathematics. For example, the three pure mathematics volumes on number theory, algebra, and topology are all installments of the "Encyclopaedia of Mathematical Sciences" compiled by leading mathematicians in those fields, and they are very helpful to researchers seeking a view of the frontier and the overall landscape of those areas. In keeping with the character of each discipline, the pure mathematics titles are mainly "classics", while the applied and computational titles are mainly "frontier" works. Most of the authors are internationally renowned mathematicians; the author of Topology, Novikov, for instance, is a member of the Russian Academy of Sciences and a recipient of the Fields Medal and the Wolf Prize in Mathematics. The works of such eminent mathematicians will undoubtedly provide excellent guidance to researchers in China.