About the book Nonlinear Optimization: Models and Applications
Title: Nonlinear Optimization: Models and Applications
Persian title: بهینه سازی غیرخطی: مدل ها و برنامه ها
Series: Textbooks in Mathematics
Author: William P. Fox
Publisher: CRC Press
Publication year: 2020
Pages: 417
ISBN: 9780367444150, 9781003009573
Language: English
Format: PDF
File size: 14 MB
Table of Contents:
Cover
Half Title
Series Page
Title Page
Copyright Page
Dedication
Table of Contents
Preface: Nonlinear Optimization—Models and Applications
Acknowledgments
Author
1 Introduction to Optimization Models
1.1 Introduction
1.1.1 History
1.1.2 Applications of Optimization
1.1.3 Modelling
1.2 Classifying Optimization Problems
1.3 Review of Mathematical Programming with Excel Technology
1.3.1 Excel Using the Solver
1.3.2 Examples for Integer, Mixed-Integer, and Nonlinear Optimization
1.4 Exercises
1.5 Review of the Simplex Method in Excel Using Revised Simplex
1.5.1 Steps of the Simplex Method
References and Suggested Further Reading
2 Review of Differential Calculus
2.1 Limits
2.2 Continuity
2.3 Differentiation
2.3.1 Increasing and Decreasing Functions
2.3.2 Higher Derivatives
2.4 Convex and Concave Functions
Exercises
References and Suggested Reading
3 Single-Variable Unconstrained Optimization
3.1 Introduction
3.2 Single-Variable Optimization and Basic Theory
3.3 Basic Applications of Max-Min Theory
Exercises
3.4 Applied Single-Variable Optimization Models
Exercises
Projects
References and Suggested Reading
4 Numerical Search Techniques in Single-Variable Optimization
4.1 Single-Variable Techniques
4.1.1 Unrestricted Search
4.1.2 Exhaustive Search
4.1.3 Dichotomous Search
4.1.4 Golden Section Search
4.1.5 Finding the Maximum of a Function Over an Interval with Golden Section
4.1.6 Golden Section Search with Technology
4.1.6.1 Excel Golden Search
4.1.6.2 Maple Golden Search
4.1.6.3 MATLAB Golden Search
4.1.7 Illustrious Examples with Technology
4.1.8 Fibonacci’s Search
4.1.8.1 Finding the Maximum of a Function Over an Interval with the Fibonacci Method
4.2 Interpolation with Derivatives: Newton’s Method
4.2.1 Finding the Critical Points (Roots) of a Function
4.2.2 The Basic Application
4.2.3 Newton’s Method to Find Critical Points with Technology
4.2.4 Excel: Newton’s Method
4.2.5 Maple: Newton’s Method
4.2.6 Newton’s Method for Critical Points with MATLAB
4.2.7 The Bisection Method with Derivatives
Exercises
Projects
References and Suggested Further Readings
5 Review of Multivariable Differential Calculus
5.1 Introduction: Basic Theory and Partial Differentiation
5.2 Directional Derivatives and the Gradient
Exercises
References and Suggested Reading
6 Models Using Unconstrained Optimization: Maximization and Minimization with Several Variables
6.1 Introduction
6.2 The Hessian Matrix
6.3 Unconstrained Optimization
Exercises
6.4 Eigenvalues
Exercises
Reference and Further Suggested Reading
7 Multivariate Optimization Search Techniques
7.1 Introduction
7.2 Gradient Search Methods
7.3 Examples of Gradient Search
7.4 Modified Newton’s Method
7.4.1 Modified Newton with Technology
Exercises
7.5 Comparisons of Methods
7.5.1 Maple Code for Steepest Ascent Method (See Fox and Richardson)
7.5.2 Newton’s Method for Optimization in Maple
Exercises
Projects Chapter 7
References and Suggested Reading
8 Optimization with Equality Constraints
8.1 Introduction
8.2 Equality Constraints Method of Lagrange Multipliers
8.3 Introduction and Basic Theory
8.4 Graphical Interpretation of Lagrange Multipliers
8.5 Computational Method of Lagrange Multipliers
Lagrange Method with Technology
8.6 Applications with Lagrange Multipliers
Exercises
Projects
References and Suggested Reading
9 Inequality Constraints: Necessary/Sufficient Kuhn–Tucker Conditions (KTC)
9.1 Introduction to KTC
9.2 Basic Theory of Constrained Optimization
9.2.1 Necessary and Sufficient Conditions
9.3 Geometric Interpretation of KTC
9.3.1 Spanning Cones (Optional)
9.4 Computational KTC with Maple
9.5 Modelling and Application with KTC
Exercises
Project
Manufacturing
References and Suggested Reading
10 Specialized Nonlinear Optimization Methods
10.1 Introduction
10.1.1 Numerical and Heuristic Methods
10.1.2 Technology
10.2 Method of Feasible Directions
Exercises
10.3 Quadratic Programming
Exercises
10.4 Separable Programming
10.4.1 Adjacency Assumptions
10.4.2 Linearization Property
Exercises
References and Suggested Reading
11 Dynamic Programming
11.1 Introduction: Basic Concepts and Theory
11.1.1 Characteristics of Dynamic Programming
11.1.2 Working Backwards
11.2 Continuous DP
11.3 Modelling and Applications of Continuous DP
Exercises
11.4 Models of Discrete Dynamic Programming
11.5 Modelling and Applications of Discrete DP
Exercises
References and Suggested Readings
12 Data Analysis with Regression Models, Advanced Regression Models, and Machine Learning Through Optimization
12.1 Introduction and Machine Learning
12.1.1 Machine Learning
12.1.1.1 Data Cleaning and Breakdown
12.1.1.2 Engineering
12.1.1.3 Model Fitting
12.2 The Different Curve Fitting Criterion
12.2.1 Fitting Criterion 1: Least Squares
12.2.2 Fitting Criterion 2: Minimize the Sum of the Absolute Deviations
12.2.3 Fitting Criterion 3: Chebyshev’s Criterion or Minimize the Largest Error
Exercises
12.3 Introduction to Simple Linear and Polynomial Regression
12.3.1 Excel
12.3.2 Regression in Maple
12.3.3 MATLAB
Exercises
12.4 Diagnostics in Regression
12.4.1 Example for the Common-Sense Test
12.4.1.1 Exponential Decay Example
12.4.2 Multiple Linear Regression
Exercises
12.5 Nonlinear Regression Through Optimization
12.5.1 Exponential Regression
12.5.1.1 Newton–Raphson Algorithm
12.5.2 Sine Regression Using Optimization
12.5.3 Illustrative Examples
12.5.3.1 Nonlinear Regression (Exponential Decay)
Exercises
12.6 One-Predictor Logistic and One-Predictor Poisson Regression Models
12.6.1 Logistic Regression and Poisson Regression with Technology
12.6.1.1 Logistic Regression with Technology
12.6.1.2 Simple Poisson Regression with Technology
12.6.2 Logistic Regression Illustrious Examples
12.6.3 Poisson Regression Discussion and Examples
12.6.3.1 Normality Assumption Lost
12.6.3.2 Estimates of Regression Coefficients
12.6.4 Illustrative Poisson Regression Examples
12.6.4.1 Maple
Exercises
Projects
12.7 Conclusions and Summary
References and Suggested Reading
Answers to Selected Problems
Index