I’d guess that my math background is my shakiest foundation for grasping ML. I have a formal education in creative writing, and more professional experience as an English teacher than anything else. The course lists multivariable calculus and linear algebra as prereqs. I get algebra. I’ve never taken a calculus class. Let’s see how it goes.

Shape of data

  • Scalar: single value, 0 dimensions
  • Vector: a list of values, as a row or a column
    • row v col is conceptual
    • One dimensional; its only dimension is its length
  • Matrix: 2 dimensional grid of values
    • Rows x Cols
  • Tensors
    • Any n dimensional collection of values
    • 4+ dimensions is hard to visualize, but math still works
    • All matrices are tensors, not all tensors are matrices
  • Indices
    • Values in matrices (and tensors?) are identified via their indices, starting with 1 (not 0) in math notation
    • So x22 would refer to the value in the second row and second column of matrix x
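To make these shapes concrete, here’s a quick sketch using NumPy (covered in more detail below). One caveat: NumPy indexes from 0, while the math notation above starts at 1.

```python
import numpy as np

scalar = np.array(5)                  # 0 dimensions
vector = np.array([1, 2, 3])          # 1 dimension; its length is 3
matrix = np.array([[1, 2],
                   [3, 4]])           # 2 dimensions: 2 rows x 2 cols
tensor = np.zeros((2, 3, 4))          # 3 dimensions; any n works

print(scalar.ndim, vector.ndim, matrix.ndim, tensor.ndim)  # 0 1 2 3
print(matrix.shape)                   # (2, 2)

# Math notation x22 (row 2, column 2) becomes matrix[1, 1] in NumPy
print(matrix[1, 1])                   # 4
```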

Matrix Operations

  • Apply a scalar operator to entire matrix
    • For example, normalize a matrix of RGB values to the 0–1 range by dividing it by 255
  • Matrix on matrix operations:
    • Values at the same indices operate on each other, and the result is stored at the same indices in the results matrix
    • Matrices must be the same shape to operate on each other
  • Using NumPy (np) instead of plain Python loops to perform these operations is much quicker
  • ‘reset’ an array, or convert all its values to 0, by multiplying by 0: m *= 0
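The operations above look something like this in NumPy:

```python
import numpy as np

# Scalar operation applied to the entire matrix: normalize RGB values
rgb = np.array([[255, 128, 0],
                [0, 64, 255]])
normalized = rgb / 255            # every element is divided by 255

# Matrix-on-matrix operation: same shapes, same indices operate on each other
a = np.array([[1, 2], [3, 4]])
b = np.array([[10, 20], [30, 40]])
c = a + b                         # [[11, 22], [33, 44]]

# 'Reset' an array by multiplying by 0
a *= 0                            # a is now all zeros
```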

Dot product

  • Multiply corresponding elements of two vectors, then add the products
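A quick sketch of the dot product, both via NumPy and spelled out by hand:

```python
import numpy as np

u = np.array([1, 2, 3])
v = np.array([4, 5, 6])

# Multiply corresponding elements, then add: 1*4 + 2*5 + 3*6 = 32
print(np.dot(u, v))    # 32
print((u * v).sum())   # same thing, spelled out: 32
```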

Matrix Multiplication

  • Usually, matrix multiplication refers to ‘matrix product’
  1. Find the dot product of each row in the left matrix with each column in the right matrix
  2. Create a results matrix where the dot product from the previous step is stored at the index of (row, column)
  • To multiply matrices, # of columns in left matrix must equal number of rows in right matrix
  • To check multiplicability, look at shapes of matrices, side by side
  • For example, matrix A is a 4 by 3 matrix and matrix B is 3 by 7, so side by side they look like:

4 X 3 * 3 X 7

  • If inner numbers match, they’re multiplicable
  • Dimensions of result matrix can be predicted by outer numbers
    • Results matrix has same number of rows as left matrix, and same number of columns as right matrix
    • So result matrix of above example would be a 4 x 7 matrix
  • Changing order of operands changes outcome
    • Matrix multiplication is not commutative.

Matrix Transpose

  • If matrix A’s columns match the rows of matrix B, matrix A can be called a transpose of matrix B.
    • Matching rows and columns means row 1 of matrix B is column 1 of matrix A.
  • If matrix A is conceptually organized as rows, then its transpose is organized as columns
  • Transposes can be useful for reshaping a matrix to make it multiplicable
  • Not always a helpful solution. Determine case-by-case
  • A transpose will only be useful if the two related matrices are initially organized by rows
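A sketch of using a transpose to make two same-shaped matrices multiplicable:

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])    # 2 x 3
B = np.array([[1, 2, 3],
              [4, 5, 6]])    # also 2 x 3: can't multiply (inner 3 != 2)

# Transposing B makes the shapes line up: 2 x 3 times 3 x 2
C = np.matmul(A, B.T)
print(C.shape)      # (2, 2)

# Row 1 of B is column 1 of B's transpose
print(B.T[:, 0])    # [1 2 3]
```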

Math with NumPy

  • A Python library, implemented in C, for performant math
  • Conventionally imported as np (import numpy as np)
  • np gives access to ndarray, an array data structure supporting any number of dimensions
  • Expands Python’s primitive number types with its own numeric types
  • Fully interoperable with Python primitives
  • Create multi-dimension arrays as you might expect:

    np.array([[1,2,3], [4,5,6], [7,8,9]])
  • Possible to ‘reshape’ tensors w/ np.reshape method
  • np.matmul is the matrix multiplication function
  • np.dot is another matrix multiplication function that produces the same results as np.matmul for 2-d arrays only
  • A matrix’s transpose is exposed in a T attribute.
    • References same matrix in memory, so mutating one affects the other.
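A small sketch of the points above, including the gotcha that T is a view, not a copy:

```python
import numpy as np

m = np.array([[1, 2, 3],
              [4, 5, 6]])    # 2 x 3

# Reshape: same 6 values, new dimensions
r = np.reshape(m, (3, 2))
print(r.shape)    # (3, 2)

# The transpose is exposed as the T attribute
t = m.T
print(t.shape)    # (3, 2)

# T references the same data in memory: mutating one affects the other
t[0, 0] = 99
print(m[0, 0])    # 99
```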

Yeeee I know all the math!