LW Review

1. Outline

1.1. Basics

1.1.1. Python

1.1.1.1. basics

1.1.1.1.1. control structures

1.1.1.1.2. data types

1.1.1.1.3. classes

1.1.1.1.4. interactive usage

1.1.1.1.5. exceptions

1.1.1.1.6. iterators

1.1.1.2. libraries

1.1.1.2.1. io

1.1.1.2.2. string processing

1.1.1.2.3. regular expressions

1.1.1.2.4. numpy

1.1.1.2.5. pylab

1.1.1.2.6. scipy... (selected submodules)

1.1.1.2.7. scikit... (selected submodules)

1.1.2. Array Processing

1.1.2.1. indexing and slicing

1.1.2.2. shaping and reshaping

1.1.2.3. copying and sharing

1.1.2.4. recycling

1.1.2.5. element-wise operations

1.1.2.6. transpose

1.1.2.7. dot and outer

1.1.2.8. reductions and scans

1.1.2.9. rolling and shifting

1.1.2.10. transforming loops into arrays

1.1.2.11. array concatenation

1.1.2.12. meshgrid, mgrid, r_, c_

1.1.2.13. put, take, subscripting by arrays

1.1.2.14. data parallelism
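
Most of these array-processing topics fit in a few lines of NumPy (a sketch; "recycling" is taken here to mean NumPy's broadcasting rules):

```python
import numpy as np

a = np.arange(12).reshape(3, 4)            # shaping: 1D range -> 3x4 array
assert a[1, 2] == 6                        # indexing
assert a[:, 1].tolist() == [1, 5, 9]       # slicing a column

view = a[0]                                # slices share memory with the original
view[0] = 99
assert a[0, 0] == 99
dup = a.copy()                             # an explicit copy does not share memory
dup[0, 0] = 0
assert a[0, 0] == 99

mask = np.array([1, 0, 1, 0])
b = a * mask                               # element-wise product; mask is "recycled"
assert b[:, 1].tolist() == [0, 0, 0]       # (broadcast) across the rows

assert a.T.shape == (4, 3)                 # transpose
assert np.outer(a[0], a[1]).shape == (4, 4)  # outer product
col_sums = a.sum(axis=0)                   # reduction along an axis
scans = np.cumsum(a[1])                    # scan (running sum)
assert scans.tolist() == [4, 9, 15, 22]
shifted = np.roll(a[1], 1)                 # rolling/shifting
assert shifted.tolist() == [7, 4, 5, 6]
both = np.concatenate([a, a], axis=0)      # array concatenation
assert both.shape == (6, 4)
assert np.take(a[1], [3, 0]).tolist() == [7, 4]  # subscripting by an index array
```

Transforming loops into arrays (and data parallelism generally) amounts to replacing explicit Python loops with whole-array expressions like these.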

1.2. Image Processing

1.2.1. Images, Pixels, Colors

1.2.1.1. image types

1.2.1.1.1. element types

1.2.1.1.2. binary, gray, color

1.2.1.2. image shape

1.2.1.3. channels

1.2.1.4. image I/O

1.2.1.4.1. imread, imsave

1.2.1.4.2. image formats and their properties: PNG, JPEG

1.2.1.5. gamma correction

1.2.1.6. RGB and HSV color spaces

1.2.1.7. image blending with alpha channels
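
A minimal sketch of gamma correction and alpha blending on float images with values in [0, 1] (gamma = 2.2 is an assumed display gamma, not something fixed by the outline):

```python
import numpy as np

img = np.linspace(0.0, 1.0, 5)        # a tiny grayscale ramp in [0, 1]

gamma = 2.2                            # assumed display gamma
encoded = img ** (1.0 / gamma)         # gamma-encode (brightens midtones)
decoded = encoded ** gamma             # the inverse exponent undoes it
assert np.allclose(decoded, img)
assert encoded[2] > img[2]             # the midtone 0.5 is raised

# image blending with an alpha channel: out = alpha*fg + (1-alpha)*bg
fg = np.ones(5)
bg = np.zeros(5)
alpha = 0.25
out = alpha * fg + (1.0 - alpha) * bg
assert np.allclose(out, 0.25)
```

On the I/O side, note that PNG is lossless while JPEG is lossy, so an imsave/imread round trip is only exact for PNG.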

1.2.2. Smoothing Filters

1.2.2.1. box filters, gaussian filters

1.2.2.2. boundary conditions

1.2.2.3. temporal smoothing for noise reduction

1.2.2.4. spatial smoothing for noise reduction

1.2.2.5. statistical justification for smoothing

1.2.2.6. definition of linear filters

1.2.2.7. impulse response

1.2.2.8. relationship between linearity and impulse response and convolution

1.2.2.9. algebraic properties of linear filters

1.2.2.10. composition of filters

1.2.2.11. separability

1.2.2.12. Gaussian filters, definition of Gaussian kernel

1.2.2.13. expressing convolution with data-parallel operations

1.2.2.14. identify / predict the effect of these filters on images
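
Several of these smoothing-filter facts can be checked directly with scipy.ndimage (a sketch; boundary handling defaults to 'reflect'):

```python
import numpy as np
from scipy.ndimage import gaussian_filter, gaussian_filter1d, uniform_filter

rng = np.random.default_rng(0)
img = rng.normal(size=(64, 64))        # a pure-noise image

box = uniform_filter(img, size=5)      # box filter
g = gaussian_filter(img, sigma=2.0)    # Gaussian filter

# statistical justification: averaging reduces noise variance
assert box.var() < img.var()
assert g.var() < img.var()

# separability: one 2D Gaussian pass == two 1D passes, one per axis
sep = gaussian_filter1d(gaussian_filter1d(img, 2.0, axis=0), 2.0, axis=1)
assert np.allclose(g, sep)

# impulse response of a linear filter: filter an impulse, read off the kernel
imp = np.zeros((33, 33))
imp[16, 16] = 1.0
kernel = gaussian_filter(imp, sigma=2.0)
assert np.isclose(kernel.sum(), 1.0)   # a normalized Gaussian kernel
assert kernel[16, 16] == kernel.max()  # peaked at the center
```

For a linear shift-invariant filter, the impulse response fully determines the filter: filtering any image is convolution with that kernel.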

1.2.3. Edge Detection

1.2.3.1. Prewitt, Sobel, Gaussian derivative filters

1.2.3.2. model-based derivation from a step edge

1.2.3.3. gradients in 2D, intensity surfaces

1.2.3.4. Laplace filters

1.2.3.5. identify / predict the effect of these filters on images
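
These derivative filters can be checked on a synthetic step edge (a sketch using scipy.ndimage):

```python
import numpy as np
from scipy.ndimage import sobel, laplace

img = np.zeros((16, 16))
img[:, 8:] = 1.0                      # a vertical step edge

gx = sobel(img, axis=1)               # derivative across columns
gy = sobel(img, axis=0)               # derivative across rows
mag = np.hypot(gx, gy)                # gradient magnitude

assert np.allclose(gy, 0.0)           # a vertical edge has no vertical gradient
assert mag[:, 7].max() > 0            # strong response next to the step

lap = laplace(img)                    # second-derivative (Laplace) filter
assert lap[5, 7] > 0 and lap[5, 8] < 0   # its zero crossing straddles the edge
```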

1.2.4. Images in the Frequency Domain

1.2.4.1. Fourier transform

1.2.4.2. coefficient ordering for 2D Fourier transform (i.e., where are the low/high frequencies in a 2D FFT)

1.2.4.3. basis functions for the 2D Fourier transform, impulse response of the 2D Fourier transform

1.2.4.4. frequency domain filtering

1.2.4.4.1. lowpass, highpass, bandpass

1.2.4.4.2. relationship to smoothing, edge detection

1.2.4.5. computing fast image convolutions with 2D FFTs

1.2.4.6. Gaussian filters vs lowpass filters; ringing

1.2.4.7. identify / predict the effect of these filters on images

1.2.4.8. see 1D FFT later in the course
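
A frequency-domain filtering sketch with numpy.fft (the cutoff radius of 8 is an arbitrary choice for illustration):

```python
import numpy as np
from scipy.signal import fftconvolve, convolve2d

rng = np.random.default_rng(1)
img = rng.normal(size=(64, 64))

# fft2 puts the low frequencies in the corners; fftshift moves them to the center
F = np.fft.fftshift(np.fft.fft2(img))
y, x = np.mgrid[-32:32, -32:32]
lowmask = (x**2 + y**2) <= 8**2               # ideal circular lowpass mask
low = np.fft.ifft2(np.fft.ifftshift(F * lowmask)).real
high = np.fft.ifft2(np.fft.ifftshift(F * ~lowmask)).real

assert low.var() < img.var()                  # lowpass removes energy
assert np.allclose(low + high, img)           # lowpass + highpass = identity

# fast convolution: pointwise multiplication in the frequency domain
k = np.ones((5, 5)) / 25.0
assert np.allclose(fftconvolve(img, k, mode='same'),
                   convolve2d(img, k, mode='same'))
```

The hard cutoff of an ideal lowpass filter is what produces ringing; a Gaussian's soft frequency falloff avoids it.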

1.2.5. Convolution, Template Matching

1.2.5.1. template matching by sliding windows and Euclidean distance of window contents to template

1.2.5.2. see nearest neighbor classification later in the course

1.2.5.3. relationship between convolution, correlation, and template matching

1.2.5.4. normalized cross correlation

1.2.5.5. peak finding

1.2.5.5.1. via local pixel comparisons

1.2.5.5.2. (note: also possible via comparison with local maximum filters; see morphology)

1.2.5.6. controlling the number and spacing of peaks via Gaussian filtering ("scale space")
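
A template-matching sketch: correlation with a template equals convolution with the flipped template, and (by Cauchy-Schwarz) the sliding-window score peaks at the true location in this zero-background example:

```python
import numpy as np
from scipy.signal import correlate2d, convolve2d

rng = np.random.default_rng(2)
template = rng.normal(size=(5, 5))
img = np.zeros((32, 32))
img[10:15, 20:25] = template          # embed the template in an empty image

score = correlate2d(img, template, mode='valid')   # sliding-window inner products
peak = np.unravel_index(np.argmax(score), score.shape)
assert peak == (10, 20)               # the best match is where we put the template

# correlation == convolution with the flipped kernel
flipped = convolve2d(img, template[::-1, ::-1], mode='valid')
assert np.allclose(score, flipped)
```

Normalized cross correlation additionally divides each window score by the window's norm, making the score robust to local brightness and contrast; smoothing the score map with a Gaussian before peak finding controls how many peaks survive and how far apart they are ("scale space").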

1.2.6. Median and Rank Filters

1.2.6.1. definition of median, maximum, minimum, and rank filters

1.2.6.2. separability of these filters

1.2.6.3. applications and properties of these filters

1.2.6.4. proof of non-linearity, impulse response of nonlinear filters

1.2.6.5. identify / predict the effect of these filters on images
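
Rank-filter behavior on small synthetic images (a sketch with scipy.ndimage):

```python
import numpy as np
from scipy.ndimage import median_filter, maximum_filter, maximum_filter1d

img = np.zeros((9, 9))
img[4, 4] = 1.0                                   # an isolated impulse

assert np.all(median_filter(img, size=3) == 0)    # the median filter removes it
assert maximum_filter(img, size=3).sum() == 9.0   # the max filter grows it to 3x3

# max (and min) filters are separable; the median filter is not
m1 = maximum_filter(img, size=3)
m2 = maximum_filter1d(maximum_filter1d(img, 3, axis=0), 3, axis=1)
assert np.array_equal(m1, m2)

# proof of non-linearity by counterexample: a step plus a line
a = np.zeros((9, 9)); a[:, 5:] = 1.0
b = np.zeros((9, 9)); b[:, 4] = 1.0
lhs = median_filter(a + b, size=3)
rhs = median_filter(a, size=3) + median_filter(b, size=3)
assert lhs[4, 4] == 1.0 and rhs[4, 4] == 0.0      # median(a+b) != median(a)+median(b)
```

Because these filters are nonlinear, the impulse response (identically zero for the median filter here) does not characterize them the way it does for linear filters.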

1.2.7. Image Components

1.2.7.1. connected component labeling

1.2.7.1.1. define and describe (we didn't cover the algorithm)

1.2.7.1.2. measurements.label

1.2.7.2. operations over connected components

1.2.7.2.1. measurements.sum

1.2.7.2.2. measurements.find_objects
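
A sketch of component labeling and per-component measurements. In current SciPy these functions live directly in scipy.ndimage; scipy.ndimage.measurements is the older module name used in the outline:

```python
import numpy as np
from scipy import ndimage

img = np.zeros((8, 8), dtype=int)
img[1:3, 1:3] = 1                    # a 2x2 blob
img[5:8, 5:7] = 1                    # a 3x2 blob

labels, n = ndimage.label(img)       # connected component labeling
assert n == 2

# operations over components: per-label pixel counts and bounding boxes
sizes = ndimage.sum(img, labels, index=[1, 2])
assert sizes.tolist() == [4.0, 6.0]
boxes = ndimage.find_objects(labels)
assert boxes[0] == (slice(1, 3, None), slice(1, 3, None))
```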

1.2.8. Morphological Image Processing

1.2.8.1. binary morphological operations

1.2.8.2. algebraic properties of binary morphology

1.2.8.3. hit or miss transform

1.2.8.4. grayscale morphology, definition

1.2.8.5. tophat filters

1.2.8.6. identify / predict the effect of these filters on images

1.2.8.7. distance transform via grayscale morphology

1.2.8.8. distance transform via brushfire algorithm (cf dynamic programming later in the class)
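
Binary and grayscale morphology plus the distance transform in scipy.ndimage (a sketch; the default structuring element is the 3x3 cross):

```python
import numpy as np
from scipy import ndimage

img = np.zeros((9, 9), dtype=bool)
img[2:7, 2:7] = True                            # a 5x5 square

eroded = ndimage.binary_erosion(img)            # shrink by one pixel
assert eroded.sum() == 9                        # only the 3x3 interior survives
assert np.array_equal(ndimage.binary_closing(img), img)  # closing fills gaps; a
                                                # solid square has none to fill
opened = ndimage.binary_opening(img)            # erosion then dilation
assert opened.sum() == 21                       # the cross rounds off the 4 corners

# distance transform: each foreground pixel's distance to the background
dist = ndimage.distance_transform_edt(img)
assert dist[4, 4] == 3.0                        # the square's center is 3 away
assert dist[2, 2] == 1.0                        # corners touch the background

# grayscale tophat: keeps small bright details, removes the smooth background
g = np.zeros((9, 9)); g[2:7, 2:7] = 1.0; g[4, 4] = 5.0
th = ndimage.white_tophat(g, size=3)
assert th[4, 4] == 4.0
```

distance_transform_edt computes the exact Euclidean distance; the brushfire approach computes a similar map by dynamic-programming sweeps outward from the boundary.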

1.3. Pattern Recognition

1.3.1. Introduction to Classification

1.3.1.1. object-oriented view of classification

1.3.1.2. nature vs program

1.3.1.3. noisy samples

1.3.1.4. training set, test set

1.3.1.5. intrinsic error rates

1.3.1.6. MNIST data

1.3.2. Nearest Neighbor Methods

1.3.2.1. nearest neighbor classification

1.3.2.2. asymptotic error rate of nearest neighbor classification
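
Nearest neighbor classification in a few lines (a sketch; the asymptotic result referenced above is Cover and Hart's bound that 1-NN error approaches at most twice the Bayes error as the training set grows):

```python
import numpy as np

def nearest_neighbor_predict(train_X, train_y, x):
    """Classify x by the label of its nearest training sample
    (Euclidean distance)."""
    d = np.sum((train_X - x) ** 2, axis=1)
    return train_y[np.argmin(d)]

train_X = np.array([[0.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
train_y = np.array([0, 1, 1])
assert nearest_neighbor_predict(train_X, train_y, np.array([0.1, 0.1])) == 0
assert nearest_neighbor_predict(train_X, train_y, np.array([0.9, 0.8])) == 1
```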

1.3.3. Feature Extraction

1.3.3.1. preprocessing, filtering

1.3.3.2. normalization and metrics

1.3.3.3. deskewing and canonicalization

1.3.4. Linear Classifiers

1.3.4.1. definition of linear classifiers

1.3.4.2. perceptron learning algorithm

1.3.4.3. homogeneous coordinates for linear classifiers

1.3.4.4. two-class to one-class trick

1.3.4.5. the perceptron criterion function

1.3.4.6. derivation of the perceptron learning algorithm by gradient descent

1.3.4.7. logistic regression via gradient descent

1.3.4.8. sigmoid functions
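
The linear-classifier pieces above fit in one short sketch: homogeneous coordinates fold the bias into the weight vector, and gradient descent on the perceptron criterion (the sum of -y*(w.x) over misclassified samples) yields the classic update w += y*x:

```python
import numpy as np

def perceptron(X, y, epochs=100):
    """Perceptron learning for labels y in {-1, +1}: add each misclassified
    sample (times its label) to the weight vector. A constant 1 is appended
    to every sample (homogeneous coordinates) so the bias lives inside w."""
    Xh = np.hstack([X, np.ones((len(X), 1))])
    w = np.zeros(Xh.shape[1])
    for _ in range(epochs):
        updated = False
        for xi, yi in zip(Xh, y):
            if yi * np.dot(w, xi) <= 0:    # misclassified sample
                w += yi * xi               # gradient step on the criterion
                updated = True
        if not updated:
            break                          # converged (linearly separable data)
    return w

X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
y = np.array([-1, -1, -1, 1])              # AND-like, linearly separable
w = perceptron(X, y)
preds = np.sign(np.hstack([X, np.ones((4, 1))]) @ w)
assert preds.tolist() == y.tolist()
```

Logistic regression replaces the hard threshold with the sigmoid 1/(1 + exp(-w.x)) and descends the log-loss gradient instead, which gives probabilistic outputs.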

1.3.5. Nonlinear Classifiers, Kernel Methods

1.3.5.1. define linear and non-linear classification problems

1.3.5.2. direct learning of non-linear classifiers

1.3.5.3. non-linear classifiers by linear classification on non-linear transformations of the input data

1.3.5.4. the kernel trick

1.3.5.5. the perceptron learning algorithm with kernels
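
The kernel trick applied to the perceptron (a sketch with an RBF kernel; XOR is the standard example of a problem that is not linearly separable in the input but is separable after the kernel map):

```python
import numpy as np

def kernel_perceptron(X, y, kernel, epochs=50):
    """Dual-form perceptron: instead of an explicit weight vector, keep one
    coefficient per training sample; the decision function is
    f(x) = sum_i alpha_i * y_i * K(x_i, x)."""
    n = len(X)
    alpha = np.zeros(n)
    K = np.array([[kernel(p, q) for q in X] for p in X])  # Gram matrix
    for _ in range(epochs):
        changed = False
        for i in range(n):
            if y[i] * np.sum(alpha * y * K[:, i]) <= 0:    # misclassified
                alpha[i] += 1.0
                changed = True
        if not changed:
            break
    return alpha

def rbf(p, q):
    return np.exp(-np.sum((p - q) ** 2))

X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([-1., 1., 1., -1.])                           # XOR labels
alpha = kernel_perceptron(X, y, rbf)
preds = [np.sign(np.sum(alpha * y * np.array([rbf(xi, q) for xi in X])))
         for q in X]
assert preds == y.tolist()
```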

1.3.6. Model-Based Classifiers

1.3.6.1. decision regions for the normal density

1.3.6.1.1. what regions are possible?

1.3.6.1.2. what are the conditions under which the different kinds of decision regions occur?

1.3.6.1.3. derive the form of the decision regions from the normal density

1.3.6.2. linear discriminant analysis, quadratic discriminant analysis

1.3.6.3. differences between LDA/QDA and linear or quadratic perceptrons

1.3.6.4. formulas for estimation of mean and covariance matrices of Gaussians
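
Maximum-likelihood estimation of a Gaussian's parameters, the per-class ingredient LDA and QDA need (a sketch on synthetic data; tolerances are loose because the estimates are from a finite sample):

```python
import numpy as np

rng = np.random.default_rng(0)
true_mu = np.array([1.0, -1.0])
true_Sigma = np.array([[2.0, 0.5], [0.5, 1.0]])
X = rng.multivariate_normal(true_mu, true_Sigma, size=5000)

mu = X.mean(axis=0)                   # ML estimate of the mean
Xc = X - mu
Sigma = Xc.T @ Xc / len(X)            # ML estimate of the covariance (1/N)

assert np.allclose(mu, true_mu, atol=0.2)
assert np.allclose(Sigma, true_Sigma, atol=0.3)
```

LDA ties one covariance matrix across all classes, which makes the decision boundaries linear; QDA estimates one covariance per class, which makes them quadratic.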

1.3.7. Unsupervised Learning

1.3.7.1. k-Means Clustering

1.3.7.1.1. describe Lloyd's algorithm

1.3.7.1.2. describe how k-means clustering can be used for classification
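
Lloyd's algorithm alternates two steps until nothing changes (a sketch; real k-means initializes the centers randomly, while deterministic starting centers are passed in here to keep the example reproducible):

```python
import numpy as np

def lloyd(X, centers, iters=20):
    """Lloyd's algorithm: (1) assign each point to its nearest center,
    (2) move each center to the mean of its assigned points; repeat."""
    for _ in range(iters):
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        assign = d.argmin(axis=1)
        centers = np.array([X[assign == j].mean(axis=0)
                            for j in range(len(centers))])
    return centers, assign

rng = np.random.default_rng(5)
A = rng.normal(size=(50, 2))            # blob near (0, 0)
B = rng.normal(size=(50, 2)) + 8.0      # blob near (8, 8)
X = np.vstack([A, B])

centers, assign = lloyd(X, centers=X[[0, 50]])
assert set(assign[:50]) == {assign[0]}  # each blob gets a single label
assert set(assign[50:]) == {assign[50]}
assert assign[0] != assign[50]          # and the labels differ
```

For classification, fit centers per class (or label each cluster by majority vote) and classify new points by their nearest center.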

1.3.7.2. PCA

1.3.7.2.1. describe properties of PCA

1.3.7.2.2. describe computation of PCA using eigenvectors

1.3.7.2.3. projection and reconstruction with PCA

1.3.7.2.4. using PCA as preprocessing for classifiers
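
PCA via eigenvectors, with projection and reconstruction (a sketch on a synthetic elongated cloud):

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(200, 2)) * np.array([3.0, 0.3])   # elongated cloud
X = X - X.mean(axis=0)                                  # PCA assumes centered data

# PCA: eigenvectors of the covariance matrix, sorted by eigenvalue
C = X.T @ X / len(X)
evals, evecs = np.linalg.eigh(C)          # eigh returns ascending eigenvalues
order = np.argsort(evals)[::-1]
evals, evecs = evals[order], evecs[:, order]
assert evals[0] > evals[1]                # the first component carries most variance

# projection onto the first component, then reconstruction from it
proj = X @ evecs[:, :1]
recon = proj @ evecs[:, :1].T
mse = np.mean(np.sum((X - recon) ** 2, axis=1))
assert np.isclose(mse, evals[1])          # reconstruction error == discarded eigenvalue
```

As preprocessing for classifiers, keeping only the top-k projections cuts dimensionality and noise while preserving most of the variance.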

1.4. Audio and Speech

1.4.1. Speech and Audio Signals

1.4.1.1. what do they represent?

1.4.1.2. why are they commonly considered as being composed of sine waves?

1.4.1.3. frequency and phase

1.4.2. Fourier Transformation

1.4.2.1. sines and cosines

1.4.2.2. sines and cosines on discrete arrays

1.4.2.3. quadrature, how many linearly independent sine/cosine functions are there?

1.4.2.4. orthogonality of sine and cosine vectors

1.4.2.5. Fourier transform as projection onto sine/cosine basis

1.4.2.6. complex numbers and complex arithmetic

1.4.2.7. combining the sine/cosine basis functions into a complex Fourier transform

1.4.2.8. defining equation for the Fourier transform

1.4.2.9. inverse Fourier transforms
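
The projection view of the Fourier transform can be verified numerically (a sketch):

```python
import numpy as np

N = 8
n = np.arange(N)

# orthogonality of sine/cosine vectors on a discrete array
c2 = np.cos(2 * np.pi * 2 * n / N)
c3 = np.cos(2 * np.pi * 3 * n / N)
s3 = np.sin(2 * np.pi * 3 * n / N)
assert abs(np.dot(c2, c3)) < 1e-12        # different frequencies: orthogonal
assert abs(np.dot(c2, s3)) < 1e-12        # cosine vs sine: orthogonal

# the DFT as a projection onto this basis, via complex exponentials:
# X[k] = sum_n x[n] * exp(-2j*pi*k*n/N)
x = np.random.default_rng(4).normal(size=N)
X = np.array([np.dot(x, np.exp(-2j * np.pi * k * n / N)) for k in range(N)])
assert np.allclose(X, np.fft.fft(x))      # matches the library FFT
assert np.allclose(np.fft.ifft(X), x)     # the inverse transform recovers x
```

The real and imaginary parts of exp(-2j*pi*k*n/N) are exactly the cosine and sine basis vectors, which is how the two real bases combine into one complex transform.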

1.4.3. FFT Algorithm

1.4.3.1. algorithmic structure of the FFT

1.4.3.2. derivation of the Cooley-Tukey lemma

1.4.4. DTW

1.4.4.1. dynamic programming

1.4.4.2. dynamic time warping

1.4.4.3. edit distance algorithm
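
Both distances above are instances of the same dynamic-programming pattern (a sketch):

```python
import numpy as np

def dtw(a, b):
    """Dynamic time warping: D[i, j] = local cost + best of the three
    predecessor cells (advance both sequences, or repeat a sample of
    either one)."""
    D = np.full((len(a) + 1, len(b) + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[len(a), len(b)]

def edit_distance(s, t):
    """Levenshtein distance: the same recurrence with unit
    insert/delete/substitute costs."""
    D = np.zeros((len(s) + 1, len(t) + 1), dtype=int)
    D[:, 0] = np.arange(len(s) + 1)
    D[0, :] = np.arange(len(t) + 1)
    for i in range(1, len(s) + 1):
        for j in range(1, len(t) + 1):
            D[i, j] = min(D[i - 1, j - 1] + (s[i - 1] != t[j - 1]),
                          D[i - 1, j] + 1,
                          D[i, j - 1] + 1)
    return D[len(s), len(t)]

assert dtw([0, 1, 2, 1, 0], [0, 1, 1, 2, 1, 0]) == 0.0  # timing differs, shape matches
assert dtw([0, 1, 2, 1, 0], [0, 3, 2, 3, 0]) > 0.0      # shape differs
assert edit_distance("kitten", "sitting") == 3
```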

1.4.5. HMMs

1.4.5.1. Markov chains: definitions and properties

1.4.5.2. Hidden Markov Models: definitions and properties

1.4.5.3. forward algorithm

1.4.5.4. forward-backward algorithm

1.4.5.5. definition and explanation of EM algorithms
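
The forward algorithm in a few lines, checked against brute-force summation over all state sequences (the transition and emission numbers are made up for illustration):

```python
import numpy as np

def forward(A, B, pi, obs):
    """Forward algorithm for an HMM: alpha[i] = P(o_1..o_t, state_t = i),
    updated left to right; summing the final alpha gives P(observations)."""
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
    return alpha.sum()

A = np.array([[0.7, 0.3], [0.4, 0.6]])    # transition probabilities (made up)
B = np.array([[0.9, 0.1], [0.2, 0.8]])    # emission probabilities (made up)
pi = np.array([0.5, 0.5])                 # initial state distribution

obs = [0, 1, 0]
p = forward(A, B, pi, obs)

# brute force: enumerate all 2^3 state sequences
total = 0.0
for s0 in range(2):
    for s1 in range(2):
        for s2 in range(2):
            total += (pi[s0] * B[s0, obs[0]] * A[s0, s1] * B[s1, obs[1]]
                      * A[s1, s2] * B[s2, obs[2]])
assert np.isclose(p, total)
```

The forward-backward algorithm adds a symmetric right-to-left beta pass; together the two passes give per-time state posteriors, which form the E-step of EM (Baum-Welch) training of an HMM.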