Optimization and Linear Algebra/Math from the Internet
First-order Taylor approximation formula?
https://www.thestudentroom.co.uk/showthread.php?t=1247928
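As a quick illustration (not from the linked thread; the function exp and the point x0 = 0 are illustrative choices), the first-order Taylor approximation f(x0 + h) ≈ f(x0) + f'(x0)·h can be checked numerically:

```python
import math

def taylor_first_order(f, df, x0, h):
    """First-order Taylor approximation of f at x0 + h."""
    return f(x0) + df(x0) * h

# Approximate exp(0.1) by expanding around x0 = 0:
# exp(h) ~ exp(0) + exp(0) * h = 1 + h
approx = taylor_first_order(math.exp, math.exp, 0.0, 0.1)
exact = math.exp(0.1)
print(approx, exact)  # the error shrinks like O(h^2) as h -> 0
```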
Hessian Matrix
https://en.wikipedia.org/wiki/Hessian_matrix
"In mathematics, the Hessian matrix or Hessian is a square matrix of second-order partial derivatives of a scalar-valued function, or scalar field. It describes the local curvature of a function of many variables."
Use in optimization
"Hessian matrices are used in large-scale optimization problems within Newton-type methods because they are the coefficient of the quadratic term of a local Taylor expansion of a function. That is,

y = f(x + Δx) ≈ f(x) + ∇f(x)ᵀ Δx + (1/2) Δxᵀ H(x) Δx
"
"Newton's method in optimization"
In calculus, Newton's method is an iterative method for finding the roots of a differentiable function F, which are solutions to the equation F(x) = 0. In optimization, Newton's method is applied to the derivative f′ of a twice-differentiable function f to find the roots of the derivative (solutions to f′(x) = 0), also known as the stationary points of f. These solutions may be minima, maxima, or saddle points.[1]
https://en.wikipedia.org/wiki/Newton%27s_method_in_optimization
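A minimal one-dimensional sketch of the update described above, x ← x − f′(x)/f″(x); the test function f(x) = x⁴ − 3x² and the starting point are illustrative assumptions:

```python
def newton_minimize(df, d2f, x0, iterations=20):
    """Newton's method in optimization: find a stationary point of f
    by applying Newton's root-finding iteration to its derivative f'."""
    x = x0
    for _ in range(iterations):
        x = x - df(x) / d2f(x)
    return x

# Example: f(x) = x**4 - 3*x**2, so f'(x) = 4x^3 - 6x and f''(x) = 12x^2 - 6.
# Starting from x0 = 1, the iteration converges to the stationary point
# x = sqrt(3/2) ~ 1.2247, which is a local minimum (f'' > 0 there).
x_star = newton_minimize(lambda x: 4*x**3 - 6*x, lambda x: 12*x**2 - 6, 1.0)
print(x_star)
```

In n dimensions the division by f″(x) becomes solving a linear system with the Hessian, which is exactly where the Hessian matrix quoted above enters Newton-type methods.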
SOLVING LINEAR DIFFERENTIAL EQUATIONS WITH THE LAPLACE TRANSFORM
https://onlinelibrary.wiley.com/doi/pdf/10.1002/9781118733639.app6
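As a worked example in the spirit of the linked appendix (the specific ODE is chosen here for illustration), the transform rule L{y′} = sY(s) − y(0) reduces a linear ODE with constant coefficients to algebra:

```latex
\text{Solve } y' + 2y = 0,\quad y(0) = 1.\\[4pt]
\mathcal{L}\{y'\} + 2\,\mathcal{L}\{y\} = 0
  \;\Longrightarrow\; \bigl(sY(s) - y(0)\bigr) + 2Y(s) = 0\\[4pt]
Y(s) = \frac{1}{s + 2}
  \;\Longrightarrow\;
  y(t) = \mathcal{L}^{-1}\!\left\{\frac{1}{s + 2}\right\} = e^{-2t}
```

The last step uses the standard transform pair L{e^{−at}} = 1/(s + a).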
Pointwise supremum of a collection of convex functions (is convex)
"I think it is either assumed that the 𝑓𝑖 are defined on the same domain 𝐷, or that (following a common convention) we set 𝑓𝑖(𝑥)=+∞ if 𝑥∉Dom(𝑓𝑖). You can easily check that under this convention, the extended 𝑓𝑖 still remain convex and the claim is true."
"The supremum of a set is its least upper bound and the infimum is its greatest
lower bound."
https://www.math.ucdavis.edu/~hunter/m125b/ch2.pdf
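A small numerical sanity check of the claim (the two affine functions chosen are illustrative): the pointwise supremum of the convex functions f1(x) = x and f2(x) = −x is |x|, and midpoint convexity holds everywhere on a test grid:

```python
def sup_f(x):
    """Pointwise supremum of the convex functions f1(x) = x and f2(x) = -x."""
    return max(x, -x)  # equals |x|, which is convex

# Check midpoint convexity: f((a + b) / 2) <= (f(a) + f(b)) / 2
points = [i / 10 for i in range(-20, 21)]
for a in points:
    for b in points:
        assert sup_f((a + b) / 2) <= (sup_f(a) + sup_f(b)) / 2 + 1e-12
print("midpoint convexity holds on the grid")
```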
Sine and Cosine Values
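For quick reference, the standard values at the common angles can be generated with Python's math module (a small illustrative sketch):

```python
import math

# Standard sine and cosine values at common angles (degrees -> radians).
for deg in (0, 30, 45, 60, 90):
    rad = math.radians(deg)
    print(f"{deg:>3} deg: sin = {math.sin(rad):.4f}, cos = {math.cos(rad):.4f}")
# e.g. sin(30 deg) = 1/2, sin(45 deg) = cos(45 deg) = sqrt(2)/2, cos(60 deg) = 1/2
```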
Barrier Function
"In constrained optimization, a field of mathematics, a barrier function is a continuous function whose value on a point increases to infinity as the point approaches the boundary of the feasible region of an optimization problem."
https://en.wikipedia.org/wiki/Barrier_function
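A minimal sketch of one common choice, the logarithmic barrier (the constraint x > 1 is an illustrative assumption): the barrier value blows up as x approaches the boundary of the feasible region:

```python
import math

def log_barrier(x, mu=0.1):
    """Logarithmic barrier for the illustrative constraint x > 1.

    The term -mu * log(x - 1) increases to infinity as x approaches
    the boundary x = 1 from inside the feasible region.
    """
    return -mu * math.log(x - 1)

# The barrier value grows without bound near the boundary x = 1.
for x in (2.0, 1.1, 1.01, 1.001):
    print(x, log_barrier(x))
```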
Trace: Matrix
https://en.wikipedia.org/wiki/Trace_(linear_algebra)
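The trace is simply the sum of the diagonal entries of a square matrix; a pure-Python sketch:

```python
def trace(matrix):
    """Trace of a square matrix: the sum of its diagonal entries."""
    return sum(matrix[i][i] for i in range(len(matrix)))

a = [[1, 2],
     [3, 4]]
print(trace(a))  # 1 + 4 = 5
```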
Determinant
"The determinant of a matrix A is denoted det(A), det A, or |A|. Geometrically, it can be viewed as the volume scaling factor of the linear transformation described by the matrix. This is also the signed volume of the n-dimensional parallelepiped spanned by the column or row vectors of the matrix. The determinant is positive or negative according to whether the linear mapping preserves or reverses the orientation of n-space."
Ref: https://en.wikipedia.org/wiki/Determinant
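For the 2×2 case the determinant ad − bc is the signed area of the parallelogram spanned by the column vectors, matching the volume-scaling description quoted above; a minimal sketch:

```python
def det2(m):
    """Determinant of a 2x2 matrix [[a, b], [c, d]]: a*d - b*c."""
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

# A diagonal matrix scaling x by 2 and y by 3 scales areas by 6.
print(det2([[2, 0], [0, 3]]))  # 6
# Swapping the columns reverses orientation, flipping the sign.
print(det2([[0, 2], [3, 0]]))  # -6
```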
Note: Older short-notes from this site are posted on Medium: https://medium.com/@SayedAhmedCanada
*** . *** *** . *** . *** . ***
Sayed Ahmed
BSc. Eng. in Comp. Sc. & Eng. (BUET)
MSc. in Comp. Sc. (U of Manitoba, Canada)
MSc. in Data Science and Analytics (Ryerson University, Canada)
Linkedin: https://ca.linkedin.com/in/sayedjustetc
Blog: http://Bangla.SaLearningSchool.com, http://SitesTree.com
Online and Offline Training: http://Training.SitesTree.com (free and low-cost options are sometimes available)
Facebook Group/Form to discuss (Q & A): https://www.facebook.com/banglasalearningschool
Our free or paid training events: https://www.facebook.com/justetcsocial
Get access to courses on Big Data, Data Science, AI, Cloud, Linux, System Administration, Web Development, and related topics. You can also create your own course to sell to others. http://sitestree.com/training/
If you want to contribute to occasional free and/or low-cost online/offline training, or to charitable/non-profit work in the education, health, or social-service sectors, you can contribute financially to: safoundation at salearningschool.com using PayPal or Credit Card (on http://sitestree.com/training/enrol/index.php?id=114 ).