Misc Math, Data Science, Machine Learning, PCA, FA

"In mathematics, a set B of elements (vectors) in a vector space V is called a basis, if every element of V may be written in a unique way as a (finite) linear combination of elements of B. The coefficients of this linear combination are referred to as components or coordinates on B of the vector. The elements of a basis are called basis vectors."

"

Equivalently B is a basis if its elements are linearly independent and every element of V is a linear combination of elements of B.[1] In more general terms, a basis is a linearly independent spanning set.

A vector space can have several bases; however all the bases have the same number of elements, called the dimension of the vector space.

"

https://en.wikipedia.org/wiki/Basis_(linear_algebra)
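A small worked example may help (my own sketch, not from the quoted article): in R, the columns of the matrix B below are linearly independent and span R^2, so they form a basis, and solve() recovers the coordinates of a vector on that basis.

# Sketch (assumed example): the columns of B form a basis of R^2
B <- matrix(c(1, 1,
              1, -1), nrow = 2)   # basis vectors as columns
qr(B)$rank == ncol(B)             # TRUE: the columns are linearly independent
x <- c(3, 1)
coords <- solve(B, x)             # coordinates of x on the basis B
B %*% coords                      # reconstructs x as a linear combination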

Positive Semidefinite Matrix
"A positive semidefinite matrix is a Hermitian matrix all of whose eigenvalues are nonnegative. SEE ALSO: Negative Definite Matrix, Negative Semidefinite Matrix, Positive Definite Matrix, Positive Eigenvalued Matrix, Positive Matrix."

http://mathworld.wolfram.com/PositiveSemidefiniteMatrix.html
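As a quick numerical illustration (a sketch, not from the quoted pages): a matrix of the form A^T A is always positive semidefinite, and eigen() confirms that its eigenvalues are nonnegative.

# Sketch (assumed example): t(A) %*% A is positive semidefinite
set.seed(1)
A <- matrix(rnorm(12), nrow = 4)      # arbitrary 4 x 3 real matrix
M <- t(A) %*% A                       # symmetric positive semidefinite
eigen(M)$values                       # all nonnegative (up to rounding error)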

"

Hermitian Matrix

A square matrix is called Hermitian if it is self-adjoint. Therefore, a Hermitian matrix A=(a_(ij)) is defined as one for which

A=A^(H), (1)

where A^(H) denotes the conjugate transpose. This is equivalent to the condition

a_(ij)=a^__(ji),

"

http://mathworld.wolfram.com/HermitianMatrix.html
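To check the condition A = A^H numerically (a sketch with an assumed example matrix), R's Conj() and t() give the conjugate transpose directly:

# Sketch (assumed example): verify that A equals its conjugate transpose
A <- matrix(c(2+0i, 1-1i,
              1+1i, 3+0i), nrow = 2, byrow = TRUE)
all(A == Conj(t(A)))   # TRUE: A is Hermitian
eigen(A)$values        # eigenvalues of a Hermitian matrix are real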

Definiteness of a matrix

"In linear algebra, a symmetric n\times n real matrix M is said to be positive definite if the scalar {\displaystyle z^{\textsf {T}}Mz} is strictly positive for every non-zero column vector z of n real numbers. Here {\displaystyle z^{\textsf {T}}} denotes the transpose of z.[1] When interpreting {\displaystyle Mz} as the output of an operator, M, that is acting on an input, z, the property of positive definiteness implies that the output always has a positive inner product with the input, as often observed in physical processes."
https://en.wikipedia.org/wiki/Definiteness_of_a_matrix
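One way to see this numerically (a sketch with an assumed 2 × 2 matrix): evaluate the quadratic form z^T M z for a non-zero z, check that all eigenvalues are positive, or attempt a Cholesky factorization, which only succeeds for positive definite matrices.

# Sketch (assumed example): checking positive definiteness
M <- matrix(c( 2, -1,
              -1,  2), nrow = 2)
z <- c(0.5, -1.3)              # any non-zero vector
t(z) %*% M %*% z               # strictly positive quadratic form
all(eigen(M)$values > 0)       # TRUE: all eigenvalues are positive
chol(M)                        # Cholesky succeeds only when M is positive definite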

"

Singular value decomposition

Illustration of the singular value decomposition UΣV* of a real 2×2 matrix M.

  • Top: The action of M, indicated by its effect on the unit disc D and the two canonical unit vectors e1 and e2.
  • Left: The action of V*, a rotation, on D, e1, and e2.
  • Bottom: The action of Σ, a scaling by the singular values σ1 horizontally and σ2 vertically.
  • Right: The action of U, another rotation.

"In linear algebra, the singular value decomposition (SVD) is a factorization of a real or complex matrix that generalizes the eigendecomposition of a square normal matrix to any m × n matrix via an extension of the polar decomposition.

Specifically, the singular value decomposition of an m × n real or complex matrix M is a factorization of the form UΣV*, where U is an m × m real or complex unitary matrix, Σ is an m × n rectangular diagonal matrix with non-negative real numbers on the diagonal, and V is an n × n real or complex unitary matrix. If M is real, U and V^T = V* are real orthonormal matrices."

https://en.wikipedia.org/wiki/Singular_value_decomposition
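A minimal sketch in R (example matrix assumed, not from the article): svd() returns U, the singular values d, and V, and multiplying them back together recovers the original matrix.

# Sketch (assumed example): compute and verify an SVD
set.seed(2)
M <- matrix(rnorm(6), nrow = 3)            # arbitrary 3 x 2 real matrix
s <- svd(M)                                # s$u, s$d (singular values), s$v
s$d                                        # non-negative singular values
all.equal(M, s$u %*% diag(s$d) %*% t(s$v)) # TRUE: U diag(d) V^T recovers M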

PCA using Python (scikit-learn)

https://towardsdatascience.com/pca-using-python-scikit-learn-e653f8989e60

Random R code in relation to PCA

# Example data (assumed here so the snippet runs; any numeric matrix works)
pca_data <- USArrests

# Center and scale the data before PCA
normalized_mat <- scale(pca_data)

# Calculate the covariance matrix
cov_mat <- cov(normalized_mat)

# Calculate eigenvalues/eigenvectors using the built-in eigen function
# (no need here to implement our own eigendecomposition)
eig <- eigen(cov_mat)

# Verify with prcomp from R (principal components function);
# scale. = TRUE matches the normalization above
prcomp(pca_data, scale. = TRUE)

# Eigenvectors (principal component loadings), one component per column
eig$vectors

# Transposed eigenvectors, as used to project data onto the components
t(eig$vectors)
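To make the prcomp check explicit (a sketch that continues the assumed example above): projecting the normalized data onto the eigenvectors gives the principal component scores, which should match prcomp's scores up to sign.

# Sketch (continues the assumed example above): compare scores up to sign
scores_eigen  <- normalized_mat %*% eig$vectors
scores_prcomp <- prcomp(pca_data, scale. = TRUE)$x
all.equal(abs(unname(scores_eigen)), abs(unname(scores_prcomp)))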

Some more information on PCA and FA (Factor Analysis)
https://www.cs.rutgers.edu/~elgammal/classes/cs536/lectures/i2ml-chap6.pdf

*** *** ***
Note: Older short-notes from this site are posted on Medium: https://medium.com/@SayedAhmedCanada

*** . *** *** . *** . *** . ***

Sayed Ahmed

BSc. Eng. in Comp. Sc. & Eng. (BUET)
MSc. in Comp. Sc. (U of Manitoba, Canada)
MSc. in Data Science and Analytics (Ryerson University, Canada)
Linkedin: https://ca.linkedin.com/in/sayedjustetc

Blog: http://Bangla.SaLearningSchool.com, http://SitesTree.com
Online and Offline Training: http://Training.SitesTree.com (some offerings are free or low cost)

Facebook Group/Forum for discussion (Q & A): https://www.facebook.com/banglasalearningschool

Our free or paid training events: https://www.facebook.com/justetcsocial

Get access to courses on Big Data, Data Science, AI, Cloud, Linux, System Admin, Web Development, and related topics. Also, create your own course to sell to others. http://sitestree.com/training/