# Topic 17 – Linear Algebra

Why do I need to learn about linear algebra?

Linear algebra is a fundamental tool for understanding many modern theories and techniques, such as artificial intelligence, machine learning, deep learning, data mining, security, digital image processing, and natural language processing.

What can I do after finishing learning about linear algebra?

You will be prepared to learn modern theories and techniques to create security, machine learning, data mining, image processing, or natural language processing software.

That sounds useful! What should I do now?

Please read this David C. Lay et al. (2022), Linear Algebra and Its Applications, Pearson Education book.

Alternatively, please watch this MIT 18.06 Linear Algebra, Spring 2005 course. While watching, please read the Lecture Notes and this Gilbert Strang (2016), Introduction to Linear Algebra, Wellesley-Cambridge Press book to better understand the more complex topics.

Terminology Review:

• Triangular matrix is a square matrix in which all the entries either above or below the main diagonal are zero.
• Diagonal matrix is a matrix in which the entries outside the main diagonal are all zero.
• Column space, C(A) consists of all combinations of the columns of A and is a vector space in ℝᵐ.
• Nullspace, N(A) consists of all solutions x of the equation Ax = 0 and lies in ℝⁿ.
• Row space, C(Aᵀ), consists of all combinations of the row vectors of A and forms a subspace of ℝⁿ; it equals the column space of the transpose of A, which is why we write it as C(Aᵀ).
• The left nullspace of A, N(Aᵀ) is the nullspace of Aᵀ. This is a subspace of ℝᵐ.
• A basis for a vector space is a sequence of vectors with two properties:
• They are independent.
• They span the vector space.
• Given a space, every basis for that space has the same number of vectors; that number is the dimension of the space.
• Dot product.
• Orthogonal vectors.
• Orthogonal subspaces.
• The row space of A is orthogonal to the nullspace of A.
• Orthogonal complements.
• Projection matrix: P = A(AᵀA)⁻¹Aᵀ. Properties of the projection matrix: Pᵀ = P and P² = P. The projection of b is p = Pb = A(AᵀA)⁻¹Aᵀb = Ax̂, where x̂ = (AᵀA)⁻¹Aᵀb.
• Linear regression, least squares, and normal equations: when Ax = b has no solution, we instead solve AᵀAx̂ = Aᵀb (equivalently Ax̂ = p, where p is the projection of b onto the column space of A).
• Orthogonal matrix.
• Orthogonal basis.
• Orthonormal vectors.
• Orthonormal basis.
• Gram–Schmidt process.
• Determinant: a number associated with any square matrix. It tells us whether the matrix is invertible, appears in the formula for the inverse matrix, and gives the volume of the parallelepiped whose edges are the column vectors of A. The determinant of a triangular matrix is the product of the diagonal entries (the pivots).
• The big formula for computing the determinant.
• The cofactor formula rewrites the big formula for the determinant of an n by n matrix in terms of the determinants of smaller matrices.
• Formula for inverse matrix.
• Cramer’s rule.
• Eigenvectors are vectors for which Ax is parallel to x: Ax = λx, where λ is an eigenvalue of A. The eigenvalues satisfy det(A − λI) = 0.
• Diagonalizing a matrix: AS = SΛ ⇒ S⁻¹AS = Λ ⇒ A = SΛS⁻¹, where S is the matrix of n linearly independent eigenvectors and Λ is the diagonal matrix of eigenvalues.
• Matrix exponential eᴬᵗ.
• Markov matrices: All entries are non-negative and each column adds to 1.
• Symmetric matrices: Aᵀ = A.
• Positive definite matrices: symmetric matrices whose eigenvalues are all positive; equivalently, all pivots are positive, or all leading (upper-left) determinants are positive.
• Similar matrices: A and B are similar if B = M⁻¹AM for some invertible matrix M.
• Singular value decomposition (SVD) of a matrix: A = UΣVᵀ, where U and V are orthogonal and Σ is diagonal with non-negative entries (the singular values).
• Linear Transformations: T(v + w) = T(v) + T(w) and T(cv) = cT(v). For any linear transformation T we can find a matrix A so that T(v) = Av.
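As a small illustration of the least-squares idea above, the normal equations AᵀAx̂ = Aᵀb can be solved with Cramer's rule when A has just two columns. The sketch below fits a line y = c + dt to a few sample points; the function name and the data are illustrative choices, not taken from the books.

```python
# Least-squares line fit y = c + d*t via the normal equations A^T A x̂ = A^T b.
# A has columns [1, t], so A^T A is 2x2 and can be solved with Cramer's rule.

def least_squares_line(ts, ys):
    """Fit y = c + d*t to the points (ts[i], ys[i]) by least squares."""
    n = len(ts)
    s_t = sum(ts)                            # sum of t values
    s_tt = sum(t * t for t in ts)            # sum of t^2
    s_y = sum(ys)                            # sum of y values
    s_ty = sum(t * y for t, y in zip(ts, ys))  # sum of t*y
    # Normal equations:
    #   [ n    s_t  ] [c]   [ s_y  ]
    #   [ s_t  s_tt ] [d] = [ s_ty ]
    det = n * s_tt - s_t * s_t               # determinant of A^T A
    c = (s_y * s_tt - s_t * s_ty) / det      # Cramer's rule for c
    d = (n * s_ty - s_t * s_y) / det         # Cramer's rule for d
    return c, d

# Points lying exactly on the line y = 1 + 2t, so the fit recovers c = 1, d = 2.
c, d = least_squares_line([0, 1, 2], [1, 3, 5])
print(c, d)
```

For data that does not lie on a line, the same formulas return the slope and intercept that minimize the sum of squared errors, which is exactly the projection p = Ax̂ of b onto the column space of A.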

After finishing the books please click Topic 18 – Probability & Statistics to continue.

# Topic 16 – Calculus

Why do I need to learn about calculus?

Calculus is a fundamental tool for understanding the modern theories and techniques behind software such as artificial intelligence, machine learning, deep learning, data mining, security, digital image processing, and natural language processing.

What can I do after finishing learning about calculus?

You will then be prepared to learn modern theories and techniques to create security, data mining, image processing, or natural language processing software.

What should I do now?

Please watch
– this MIT 18.01 Single Variable Calculus, Fall 2007 course (Lecture Notes), then watch
– this MIT 18.02 Multivariable Calculus, Fall 2007 course (Lecture Notes). You will need some Linear Algebra knowledge (specifically Inverse Matrix and Determinant) to understand Multivariable Calculus.
When you watch these courses please be sure to refer to
– this George F. Simmons (1996), Calculus With Analytic Geometry, McGraw-Hill book or
– this C. Henry Edwards and David E. Penney (2008), Calculus – Early Transcendentals, Pearson book when you have any difficulty understanding the lectures.

After that please watch this Highlights of Calculus course to review many core concepts of Calculus.

What is the difference between calculus and analysis?

Calculus means a method of calculation. Calculus is about differentiation and integration.

Real analysis includes calculus as well as other topics that may interest pure mathematicians more than engineers, such as measure theory, the Lebesgue integral, topology, functional analysis, complex analysis, ODEs, PDEs, and proofs of theorems.

What does early transcendentals mean?

Transcendentals in this context refers to functions like the exponential, logarithmic, and trigonometric functions.

The early transcendentals approach means that the book introduces polynomial, rational, exponential, logarithmic, and trigonometric functions at the beginning, then uses them as examples when developing differential calculus. This approach suits students who do not need much rigorous math.

The classical approach is late transcendentals: the book develops differential calculus using only polynomials and rational functions as examples, then introduces the other functions afterwards. This approach suits students who need more rigorous definitions of the transcendental functions.

Terminology Review:

• Fundamental Theorem of Calculus.
• L’Hôpital’s Rule.
• Improper Integrals.
• Infinite Series.
• Taylor Series.
• Dot Product.
• Cross Product.
• Inverse Matrix.
• Determinant.
• Equations of Planes: ax + by + cz = d.
• Parametric Equations: the trajectory of a moving point.
• Velocity Vector.
• Acceleration Vector.
• Functions of Several Variables.
• Partial Derivatives.
• Second Derivatives.
• Second Derivative Test.
• Differentials.
• Power Series.
• Euler’s Formula.
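Two items above, the Taylor series and Euler's formula, can be checked numerically in a few lines. This is a minimal sketch; the function name, number of terms, and tolerances are my own choices.

```python
import cmath
import math

def exp_taylor(x, terms=20):
    """Approximate e^x with the partial Taylor sum of x^k / k!."""
    return sum(x ** k / math.factorial(k) for k in range(terms))

# The Taylor series converges quickly for moderate x.
print(exp_taylor(1.0))   # close to e = 2.71828...

# Euler's formula: e^{i*theta} = cos(theta) + i*sin(theta).
theta = math.pi / 3
lhs = cmath.exp(1j * theta)
rhs = complex(math.cos(theta), math.sin(theta))
print(abs(lhs - rhs))    # essentially zero
```

Twenty terms already give e to far more digits than double precision can display, since the tail of the series is bounded by 1/20!.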

After finishing the books please click Topic 17 – Linear Algebra to continue.