An Introduction to Optimization

Praise for the Second Edition: "...an excellent introduction to optimization theory..." (Journal of Mathematical Psychology, 2002); "A textbook for a one-semester course on optimization theory and methods at the senior undergraduate or beginning graduate level."...


Bibliographic Details
Main Author: Chong, Edwin Kah Pin
Other Authors: Zak, Stanislaw H.
Format: Electronic book
Language: English
Published: Hoboken, N.J.: Wiley-Interscience, c2008.
Edition: 3rd ed.
Series: Wiley-Interscience series in discrete mathematics and optimization.
Subjects:
View at Biblioteca Universitat Ramon Llull: https://discovery.url.edu/permalink/34CSUC_URL/1im36ta/alma991009665120506719
Table of Contents:
  • An Introduction to Optimization; CONTENTS; PART I MATHEMATICAL REVIEW; 1 Methods of Proof and Some Notation; 1.1 Methods of Proof; 1.2 Notation; Exercises; 2 Vector Spaces and Matrices; 2.1 Vector and Matrix; 2.2 Rank of a Matrix; 2.3 Linear Equations; 2.4 Inner Products and Norms; Exercises; 3 Transformations; 3.1 Linear Transformations; 3.2 Eigenvalues and Eigenvectors; 3.3 Orthogonal Projections; 3.4 Quadratic Forms; 3.5 Matrix Norms; Exercises; 4 Concepts from Geometry; 4.1 Line Segments; 4.2 Hyperplanes and Linear Varieties; 4.3 Convex Sets; 4.4 Neighborhoods; 4.5 Polytopes and Polyhedra
  • Exercises; 5 Elements of Calculus; 5.1 Sequences and Limits; 5.2 Differentiability; 5.3 The Derivative Matrix; 5.4 Differentiation Rules; 5.5 Level Sets and Gradients; 5.6 Taylor Series; Exercises; PART II UNCONSTRAINED OPTIMIZATION; 6 Basics of Set-Constrained and Unconstrained Optimization; 6.1 Introduction; 6.2 Conditions for Local Minimizers; Exercises; 7 One-Dimensional Search Methods; 7.1 Golden Section Search; 7.2 Fibonacci Search; 7.3 Newton's Method; 7.4 Secant Method; 7.5 Remarks on Line Search Methods; Exercises; 8 Gradient Methods; 8.1 Introduction
  • 8.2 The Method of Steepest Descent; 8.3 Analysis of Gradient Methods; Exercises; 9 Newton's Method; 9.1 Introduction; 9.2 Analysis of Newton's Method; 9.3 Levenberg-Marquardt Modification; 9.4 Newton's Method for Nonlinear Least Squares; Exercises; 10 Conjugate Direction Methods; 10.1 Introduction; 10.2 The Conjugate Direction Algorithm; 10.3 The Conjugate Gradient Algorithm; 10.4 The Conjugate Gradient Algorithm for Nonquadratic Problems; Exercises; 11 Quasi-Newton Methods; 11.1 Introduction; 11.2 Approximating the Inverse Hessian; 11.3 The Rank One Correction Formula; 11.4 The DFP Algorithm
  • 11.5 The BFGS Algorithm; Exercises; 12 Solving Linear Equations; 12.1 Least-Squares Analysis; 12.2 The Recursive Least-Squares Algorithm; 12.3 Solution to a Linear Equation with Minimum Norm; 12.4 Kaczmarz's Algorithm; 12.5 Solving Linear Equations in General; Exercises; 13 Unconstrained Optimization and Neural Networks; 13.1 Introduction; 13.2 Single-Neuron Training; 13.3 The Backpropagation Algorithm; Exercises; 14 Global Search Algorithms; 14.1 Introduction; 14.2 The Nelder-Mead Simplex Algorithm; 14.3 Simulated Annealing; 14.4 Particle Swarm Optimization; 14.5 Genetic Algorithms; Exercises
  • PART III LINEAR PROGRAMMING; 15 Introduction to Linear Programming; 15.1 Brief History of Linear Programming; 15.2 Simple Examples of Linear Programs; 15.3 Two-Dimensional Linear Programs; 15.4 Convex Polyhedra and Linear Programming; 15.5 Standard Form Linear Programs; 15.6 Basic Solutions; 15.7 Properties of Basic Solutions; 15.8 Geometric View of Linear Programs; Exercises; 16 Simplex Method; 16.1 Solving Linear Equations Using Row Operations; 16.2 The Canonical Augmented Matrix; 16.3 Updating the Augmented Matrix; 16.4 The Simplex Algorithm; 16.5 Matrix Form of the Simplex Method
  • 16.6 Two-Phase Simplex Method