LA (Linear Algebra)

1. I. Linear systems and matrices

1.1. 1. Systems of linear equations

1.1.1. Introduction

1.1.2. Systems of linear equations

1.1.2.1. System of linear equations, or linear system

1.1.2.2. Solution to a system

1.1.2.3. Lines in R^2

1.1.2.4. Planes in R^3

1.1.3. Solving systems

1.1.3.1. Main questions

1.1.3.2. Consistent system

1.1.3.3. Equivalent systems

1.1.3.4. Coefficient matrix

1.1.3.5. Augmented coefficient matrix

1.1.3.6. Elementary row operations

1.1.3.7. Row equivalent matrices

1.1.3.8. Row echelon form

1.1.3.8.1. Definition REF

1.1.3.8.2. not unique

1.1.3.8.3. Pivot column/variable

1.1.3.8.4. Free variable

1.1.3.9. Reduced row echelon form

1.1.3.9.1. Definition RREF

1.1.3.9.2. unique

1.1.3.10. Gaussian and Gauss–Jordan elimination
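
A minimal NumPy sketch of Gauss-Jordan elimination to RREF, for reference; the function name `rref`, the pivot tolerance, and partial pivoting are my own illustrative choices, not prescribed by the map:

```python
import numpy as np

def rref(M, tol=1e-12):
    """Reduce M to reduced row echelon form (Gauss-Jordan with partial pivoting)."""
    A = M.astype(float).copy()
    rows, cols = A.shape
    r = 0
    for c in range(cols):
        if r == rows:
            break
        p = r + np.argmax(np.abs(A[r:, c]))  # best pivot row in this column
        if abs(A[p, c]) < tol:               # no pivot here -> free variable
            continue
        A[[r, p]] = A[[p, r]]                # swap rows
        A[r] /= A[r, c]                      # scale pivot to 1
        for i in range(rows):                # eliminate above and below the pivot
            if i != r:
                A[i] -= A[i, c] * A[r]
        r += 1
    return A

print(rref(np.array([[1., 2., 3.], [2., 4., 8.]])))
```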

1.1.4. Non-singular matrices

1.1.4.1. Definition

1.1.4.2. Theorem

1.1.5. Rank of a matrix

1.1.6. Existence and uniqueness of solutions

1.1.6.1. Homogeneous systems Ax = 0

1.1.6.1.1. unique (trivial) solution x = 0

1.1.6.1.2. infinitely many solutions

1.1.6.2. Non-homogeneous systems Ax = b

1.1.6.2.1. no solution

1.1.6.2.2. consistent (unique or infinitely many solutions)
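
These alternatives can be checked with the rank criterion: the system is consistent iff rank A = rank [A|b], and the solution is unique iff that rank equals the number of unknowns. A small NumPy sketch with made-up data:

```python
import numpy as np

A = np.array([[1., 2.], [2., 4.]])
b = np.array([3., 6.])

rA  = np.linalg.matrix_rank(A)
rAb = np.linalg.matrix_rank(np.column_stack([A, b]))  # augmented matrix [A|b]
n   = A.shape[1]

if rA < rAb:
    print("inconsistent: no solution")
elif rA == n:
    print("consistent: unique solution")
else:
    print("consistent: infinitely many solutions,", n - rA, "free variable(s)")
```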

1.2. 2. Matrix algebra

1.2.1. Introduction

1.2.1.1. Definition

1.2.1.2. Basic operations

1.2.1.3. Properties

1.2.2. Inner product of vectors

1.2.3. Matrix-vector multiplication

1.2.3.1. Row form

1.2.3.2. Column form
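
A quick NumPy check that the row form (inner products with rows) and the column form (linear combination of columns) of Ax agree; the matrices are illustrative:

```python
import numpy as np

A = np.array([[1., 2.], [3., 4.], [5., 6.]])
x = np.array([10., -1.])

# Row form: each entry of Ax is an inner product of a row of A with x.
row_form = np.array([A[i] @ x for i in range(A.shape[0])])

# Column form: Ax is a linear combination of the columns of A,
# with the entries of x as coefficients.
col_form = x[0] * A[:, 0] + x[1] * A[:, 1]

assert np.allclose(row_form, col_form) and np.allclose(row_form, A @ x)
```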

1.2.4. Linear combination

1.2.5. Linear independence
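
One standard computational test, sketched with illustrative vectors: columns are linearly independent exactly when the rank of the matrix they form equals their number:

```python
import numpy as np

v1, v2, v3 = np.array([1., 0., 1.]), np.array([0., 1., 1.]), np.array([1., 1., 2.])
V = np.column_stack([v1, v2, v3])

# Independent iff rank(V) equals the number of columns.
independent = np.linalg.matrix_rank(V) == V.shape[1]
print(independent)  # False here: v3 = v1 + v2
```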

1.2.6. Linear span

1.2.7. Properties of matrix multiplication

1.2.8. Matrix-matrix multiplication

1.2.9. Vector-matrix multiplication

1.2.10. Corollaries for linear systems

1.2.10.1. Homogeneous

1.2.10.2. Nonhomogeneous

1.2.11. Invertibility

1.2.11.1. Invertible matrices

1.2.11.2. Right- and left-invertibility

1.2.11.3. Invertibility of a product

1.2.11.4. Elementary row operations

1.2.11.5. Products of lower-triangular matrices

1.2.11.6. LU-factorization

1.2.11.7. Method of finding A^-1

1.2.11.8. Characterization of invertible matrices

1.2.11.9. Corollaries
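
A toy sketch of two items above: LU-factorization (via Doolittle elimination without pivoting, so it assumes nonzero pivots) and computing A^-1; `lu_nopivot` is a hypothetical helper name, not a library routine:

```python
import numpy as np

def lu_nopivot(A):
    """Doolittle LU factorization A = LU; assumes all pivots are nonzero."""
    n = A.shape[0]
    L, U = np.eye(n), A.astype(float).copy()
    for k in range(n - 1):
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]  # multiplier that clears entry (i, k)
            U[i] -= L[i, k] * U[k]       # elementary row operation on U
    return L, U

A = np.array([[2., 1.], [4., 5.]])
L, U = lu_nopivot(A)
assert np.allclose(L @ U, A)

# A^-1 in practice (conceptually: row-reduce [A | I] to [I | A^-1]).
Ainv = np.linalg.inv(A)
assert np.allclose(A @ Ainv, np.eye(2))
```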

1.3. 3. Determinants

1.3.1. In dimension 2

1.3.1.1. properties

1.3.1.2. Geometrical meaning

1.3.2. In dimension 3

1.3.2.1. Mnemonic rules for 3 × 3 matrices

1.3.2.2. Geometrical meaning

1.3.3. In any dimension

1.3.3.1. Abstract definition

1.3.3.2. Further properties

1.3.4. Inverses etc.

1.3.4.1. Minors and cofactors

1.3.4.1.1. Row Cofactor Expansion

1.3.4.1.2. Column Cofactor Expansion

1.3.4.2. Cross-product of vectors

1.3.4.2.1. Properties

1.3.4.3. Formula for the determinant

1.3.4.4. Characteristic polynomial

1.3.4.5. Inverse matrix via determinants

1.3.4.5.1. Adjugate

1.3.4.6. Cramer’s rule
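
A worked 2 × 2 sketch tying together determinants, the adjugate formula A^-1 = adj(A)/det(A), and Cramer's rule; the numbers are made up:

```python
import numpy as np

A = np.array([[2., 1.], [5., 3.]])
b = np.array([4., 7.])

d = np.linalg.det(A)  # det A = 2*3 - 1*5 = 1

# Adjugate of a 2x2 matrix, written out by hand; A^-1 = adj(A) / det(A).
adj = np.array([[ A[1, 1], -A[0, 1]],
                [-A[1, 0],  A[0, 0]]])
assert np.allclose(adj / d, np.linalg.inv(A))

# Cramer's rule: x_i = det(A with column i replaced by b) / det(A).
x = np.array([np.linalg.det(np.column_stack([b if j == i else A[:, j]
                                             for j in range(2)])) / d
              for i in range(2)])
assert np.allclose(A @ x, b)
```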

2. II. Linear spaces and transformations

2.1. 4. Linear vector spaces

2.1.1. Definition

2.1.2. Examples

2.1.3. Theorem 1

2.1.4. Subspaces

2.1.4.1. Definition

2.1.4.2. Remark

2.1.4.3. Examples

2.1.5. Linear combinations and linear spans

2.1.5.1. Linear combination

2.1.5.2. Linear span

2.1.5.2.1. Properties

2.1.5.2.2. Coincidence of linear spans

2.1.5.2.3. Linear independence

2.1.5.2.4. Examples

2.1.5.2.5. Wronskian and independence

2.2. 5. Bases

2.2.1. Bases in vector spaces

2.2.1.1. Definition

2.2.1.2. Example

2.2.1.3. Lemma

2.2.1.4. Coordinates

2.2.2. Dimensions

2.2.2.1. Finite- and infinite-dimensional vector spaces

2.2.2.1.1. Theorem

2.2.2.2. Definition

2.2.2.3. Examples

2.2.2.4. Sufficient conditions for a basis

2.2.2.5. Warning on dimensions

2.2.2.5.1. Example

2.2.3. Four subspaces

2.2.3.1. Column space

2.2.3.1.1. dimension

2.2.3.2. Row space

2.2.3.2.1. dimension

2.2.3.3. Left nullspace

2.2.3.3.1. dimension

2.2.3.4. Nullspace

2.2.3.4.1. dimension

2.2.4. Rank

2.2.4.1. Definition

2.2.4.2. Properties of rank
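
The dimensions of the four subspaces follow from the rank r alone: dim C(A) = dim C(A^T) = r, dim N(A) = n - r (rank-nullity), dim N(A^T) = m - r. A sketch with an illustrative rank-1 matrix:

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [2., 4., 6.]])       # m = 2, n = 3, rank 1
m, n = A.shape
r = np.linalg.matrix_rank(A)

print("dim C(A)   =", r)           # column space
print("dim C(A^T) =", r)           # row space: same rank
print("dim N(A)   =", n - r)       # nullspace (rank-nullity)
print("dim N(A^T) =", m - r)       # left nullspace
```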

2.2.5. Coordinate maps

2.2.5.1. Definition of coordinate map

2.2.5.2. Linear maps and isomorphism

2.2.5.3. Isomorphic linear vector spaces

2.2.5.4. Isomorphism to R^n

2.2.5.4.1. Properties of T_S

2.2.5.4.2. Corollary 1

2.2.5.4.3. Corollary 2

2.2.6. Changes of basis

2.2.6.1. Theorem

2.2.6.2. Computing the transition matrices in R^n
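
A sketch of computing a transition matrix in R^n: with the basis vectors stored as columns of B and C, coordinates transform by P = C^-1 B. The bases are illustrative:

```python
import numpy as np

# Two bases of R^2, stored as columns.
B = np.array([[1., 1.], [0., 1.]])
C = np.array([[2., 0.], [0., 1.]])

# Transition matrix from B-coordinates to C-coordinates: P = C^-1 B.
P = np.linalg.solve(C, B)

xB = np.array([3., -1.])   # coordinates of some x relative to B
x  = B @ xB                # the vector itself (standard coordinates)
xC = P @ xB                # its coordinates relative to C
assert np.allclose(C @ xC, x)
```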

2.3. 6. Linear transformations

2.3.1. Coordinates and change of basis

2.3.1.1. Coordinate map

2.3.1.2. Change of basis

2.3.1.2.1. Theorem

2.3.1.2.2. Computing the transition matrices in R^n

2.3.2. Linear transformations between R^n and R^m

2.3.2.1. Linear transformations = matrix multiplications

2.3.2.2. Linear transformations of the plane

2.3.2.3. Further properties

2.3.2.3.1. Composition of linear transformations

2.3.2.3.2. General linear transformation

2.3.2.3.3. Transformation in different bases
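
A sketch of representing one transformation in two bases: if A is the matrix of T in the standard basis and S holds the new basis vectors as columns, the matrix of T in the new basis is S^-1 A S. Values are illustrative:

```python
import numpy as np

A = np.array([[2., 1.], [0., 3.]])  # matrix of T in the standard basis
S = np.array([[1., 1.], [0., 1.]])  # new basis vectors as columns

# Matrix of the same transformation in the new basis: A' = S^-1 A S.
A_new = np.linalg.solve(S, A @ S)

# Sanity check on one vector: compute T(x) both ways.
x  = np.array([1., 2.])
xB = np.linalg.solve(S, x)                   # coordinates of x in the new basis
assert np.allclose(S @ (A_new @ xB), A @ x)  # same T(x), expressed two ways
```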

3. III. Orthogonality

3.1. 7. Orthogonal vectors and subspaces

3.1.1. Distances and norms

3.1.1.1. Distances in R^n

3.1.1.1.1. Euclidean

3.1.1.1.2. Block/Manhattan

3.1.1.1.3. Maximum-coordinate

3.1.1.1.4. Properties

3.1.1.2. Norm/length

3.1.1.2.1. Definition

3.1.1.2.2. Example
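
The three distances above, computed with NumPy's norm on an illustrative vector:

```python
import numpy as np

x = np.array([3., -4.])
y = np.array([0., 0.])

print(np.linalg.norm(x - y))              # Euclidean: sqrt(9 + 16) = 5
print(np.linalg.norm(x - y, ord=1))       # block/Manhattan: |3| + |-4| = 7
print(np.linalg.norm(x - y, ord=np.inf))  # maximum-coordinate: max(3, 4) = 4
```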

3.1.2. Inner product

3.1.2.1. Definition

3.1.2.1.1. Remark

3.1.2.2. Properties

3.1.2.2.1. Transposition

3.1.3. Cosine theorem

3.1.3.1. n = 2

3.1.3.2. General case

3.1.3.3. Geometric interpretation

3.1.3.3.1. Hyperplane in R^n

3.1.4. Orthogonal vectors and subspaces

3.1.4.1. Basic inequalities and theorems

3.1.4.1.1. Cauchy–Bunyakovsky–Schwarz inequality

3.1.4.1.2. Triangle inequality

3.1.4.1.3. Pythagorean theorem

3.1.4.1.4. Parallelogram identity
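
A quick numerical spot-check of these inequalities and identities on random vectors (a sanity check, not a proof):

```python
import numpy as np

rng = np.random.default_rng(0)
x, y = rng.standard_normal(5), rng.standard_normal(5)

# Cauchy-Bunyakovsky-Schwarz: |<x, y>| <= ||x|| ||y||
assert abs(x @ y) <= np.linalg.norm(x) * np.linalg.norm(y)

# Triangle inequality: ||x + y|| <= ||x|| + ||y||
assert np.linalg.norm(x + y) <= np.linalg.norm(x) + np.linalg.norm(y)

# Parallelogram identity: ||x + y||^2 + ||x - y||^2 = 2||x||^2 + 2||y||^2
lhs = np.linalg.norm(x + y)**2 + np.linalg.norm(x - y)**2
rhs = 2 * np.linalg.norm(x)**2 + 2 * np.linalg.norm(y)**2
assert np.isclose(lhs, rhs)
```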

3.1.4.2. Orthogonal vectors/subspaces

3.1.4.2.1. Definitions

3.1.4.2.2. Properties

3.1.4.2.3. Orthogonal sum

3.1.4.3. Four orthogonal subspaces

3.1.4.3.1. Second fundamental theorem of LA

3.1.4.4. Pythagorean theorem

3.1.5. Shortest distance

3.1.5.1. to a line

3.1.5.2. to a plane/subspace

3.2. 8. Orthogonal projections

3.2.1. Distances to a subspace

3.2.1.1. Pythagorean theorem and shortest distance

3.2.1.1.1. Parallelogram rule

3.2.1.1.2. Pythagorean theorem

3.2.1.2. Shortest distance to a line

3.2.1.3. Shortest distance to a plane

3.2.2. Projections

3.2.2.1. Orthogonal decomposition

3.2.2.2. Projectors

3.2.2.2.1. Definition

3.2.2.2.2. Properties

3.2.2.2.3. Extremal properties of projections

3.2.2.3. Standard error, covariance, and correlation via LA

3.2.2.4. Projection onto a subspace
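
A sketch of the orthogonal projector onto a column space, P = A(A^T A)^-1 A^T, with its defining properties checked numerically; the matrix is illustrative and assumed to have full column rank:

```python
import numpy as np

A = np.array([[1., 0.], [1., 1.], [1., 2.]])  # full column rank

# Orthogonal projector onto C(A): P = A (A^T A)^-1 A^T.
P = A @ np.linalg.solve(A.T @ A, A.T)

assert np.allclose(P @ P, P)  # idempotent
assert np.allclose(P.T, P)    # symmetric

b = np.array([1., 3., 2.])
p = P @ b                              # projection of b onto C(A)
assert np.allclose(A.T @ (b - p), 0)   # error b - p is orthogonal to C(A)
```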

3.2.3. Least squares

3.2.3.1. Approximate solution to a linear system Ax = b

3.2.3.2. Least squares solution to Ax = b

3.2.3.3. Linear regression
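
A least-squares sketch fitting a line to toy data, solving the normal equations by hand and cross-checking against NumPy's lstsq:

```python
import numpy as np

# Fit y ~ c0 + c1 * t by least squares (toy data).
t = np.array([0., 1., 2., 3.])
y = np.array([1., 2.1, 2.9, 4.2])

A = np.column_stack([np.ones_like(t), t])  # design matrix [1, t]

# Normal equations: A^T A x = A^T y ...
x_ne = np.linalg.solve(A.T @ A, A.T @ y)
# ... or the library routine, which minimizes ||Ax - y||.
x_ls, *_ = np.linalg.lstsq(A, y, rcond=None)

assert np.allclose(x_ne, x_ls)
print("intercept, slope:", x_ls)
```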

3.3. 9. Orthogonalization and QR

3.3.1. Orthonormal bases and orthogonal matrices

3.3.1.1. Orthogonal sets

3.3.1.1.1. Definition

3.3.1.1.2. Remark

3.3.1.1.3. Properties

3.3.1.2. Orthonormal basis

3.3.1.2.1. Orthogonal basis

3.3.1.3. Orthogonal columns

3.3.1.3.1. Orthonormal columns

3.3.1.3.2. Least squares solution

3.3.1.3.3. Projection

3.3.1.4. Orthogonal projectors

3.3.1.5. Orthogonal matrices

3.3.1.5.1. Criterion for orthogonality

3.3.1.5.2. Properties

3.3.2. Gram–Schmidt orthogonalization and QR

3.3.2.1. QR factorization

3.3.2.1.1. Properties

3.3.2.1.2. Full QR factorization

3.3.2.1.3. Application of QR to least squares

3.3.2.2. Householder’s reflections

3.3.2.3. Givens rotations
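
A classical Gram-Schmidt sketch (assuming independent columns; no re-orthogonalization, so it can lose accuracy on ill-conditioned input), checked against the QR factorization it produces:

```python
import numpy as np

def gram_schmidt(A):
    """Classical Gram-Schmidt on the columns of A (assumed independent)."""
    Q = np.zeros_like(A, dtype=float)
    R = np.zeros((A.shape[1], A.shape[1]))
    for j in range(A.shape[1]):
        v = A[:, j].astype(float).copy()
        for i in range(j):
            R[i, j] = Q[:, i] @ A[:, j]  # component along earlier q_i
            v -= R[i, j] * Q[:, i]       # subtract it off
        R[j, j] = np.linalg.norm(v)
        Q[:, j] = v / R[j, j]            # normalize
    return Q, R

A = np.array([[1., 1.], [1., 0.], [0., 1.]])
Q, R = gram_schmidt(A)
assert np.allclose(Q @ R, A)
assert np.allclose(Q.T @ Q, np.eye(2))   # orthonormal columns
# np.linalg.qr computes the same factorization (signs may differ).
```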

3.3.3. Fourier transform

3.3.3.1. Hilbert space

3.3.3.1.1. Examples

3.3.3.2. Fourier series

3.3.3.3. Fourier transform

4. IV. Eigenvalues and eigenvectors

4.1. 10. Eigenvalues and eigenvectors

4.1.1. Definition and properties

4.1.2. Eigenvalues of upper- and lower-triangular matrices

4.1.2.1. Theorem

4.1.3. How to compute A^n

4.1.4. Diagonalization

4.1.4.1. Theorem

4.1.4.2. Solving k-dimensional systems

4.1.4.2.1. Theorem

4.1.4.3. The powers of a matrix

4.1.4.3.1. Theorem

4.1.4.4. Markov processes
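
A sketch of diagonalization A = S Λ S^-1, matrix powers through it, and a two-state Markov chain's steady state; the transition matrix is made up:

```python
import numpy as np

P = np.array([[0.9, 0.5],
              [0.1, 0.5]])  # column-stochastic Markov matrix

lam, S = np.linalg.eig(P)   # P S = S diag(lam)
Lam = np.diag(lam)
assert np.allclose(S @ Lam @ np.linalg.inv(S), P)

# P^k = S Lam^k S^-1: cheap matrix powers via diagonalization.
k = 50
Pk = S @ np.diag(lam**k) @ np.linalg.inv(S)
assert np.allclose(Pk, np.linalg.matrix_power(P, k))

# Long-run state: eigenvector for eigenvalue 1, scaled to sum to 1.
steady = S[:, np.isclose(lam, 1.0)].ravel()
print(steady / steady.sum())  # approx [0.833, 0.167]
```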

4.1.5. Non-diagonalizable matrices

4.1.5.1. Criterion for diagonalizability

4.1.5.2. Definition

4.1.5.3. Jordan blocks

4.1.5.3.1. Definition

4.1.5.4. Jordan normal form

4.1.5.5. Linear differential equations and stability

4.1.5.6. Invariants of similar matrices
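
A closing numerical spot-check that similar matrices B = S^-1 A S share trace, determinant, rank, and eigenvalues; the random matrices are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
S = rng.standard_normal((3, 3))  # generically invertible
B = np.linalg.solve(S, A @ S)    # B = S^-1 A S, similar to A

# Similar matrices share trace, determinant, rank, and eigenvalues.
assert np.isclose(np.trace(A), np.trace(B))
assert np.isclose(np.linalg.det(A), np.linalg.det(B))
assert np.linalg.matrix_rank(A) == np.linalg.matrix_rank(B)
assert np.allclose(np.sort_complex(np.linalg.eigvals(A)),
                   np.sort_complex(np.linalg.eigvals(B)))
```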