Mathematics for Data Science

Please use ONLY my Inria email address (marcella.bonazzoli@inria.fr) if you need to contact me (do NOT use my @universite-paris-saclay.fr address!).

12/09/24 Class 1 topics: vector spaces, the example of R^n and its geometrical interpretation; linear combination, span, linearly independent vectors, spanning list, basis, dimension, the canonical basis of R^n; linear transformations, rank, image, kernel, the rank-nullity theorem; subspaces (examples and a proposition about their dimensions); surjective, injective, and bijective functions, and the case of linear transformations.

19/09/24 Class 2 topics: matrices, diagonal matrices, identity matrices, symmetric matrices, the transpose of a matrix; addition of matrices, multiplication by a scalar, matrix-vector multiplication, matrix multiplication; matrices and linear transformations; rank, range, and kernel of a matrix, a theorem of equivalent statements about the kernel and range of a square matrix; the inverse of a matrix; linear systems and matrices, a first example of Gaussian elimination.

26/09/24 Class 3 topics: Gaussian elimination; norms, examples of vector norms, the scalar product; orthogonal vectors, orthogonal basis, orthonormal vectors, orthonormal basis, orthogonal matrices (definition).

03/10/24 Class 4 topics: orthogonal matrices; eigenvalues and eigenvectors, diagonalizable matrices, the spectral theorem; positive definite matrices, positive semi-definite matrices, the Gram matrix; the polar decomposition; the singular value decomposition (for a square matrix), singular values, geometrical interpretation of the SVD, SVD of rectangular matrices; the trace of a matrix.

10/10/24 Class 5 topics: matrix norms, definition as a supremum, formulas for the 2-, 1-, and infinity matrix norms, the Frobenius norm, properties of matrix norms; the condition number, the example of cond_2 of orthogonal matrices; the determinant, several of its properties, the Laplace expansion, the link with singular values and eigenvalues; the link of the SVD with the 2-norm and cond_2.
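As a small illustration of some of the Class 4 and Class 5 topics (this NumPy sketch is not part of the course material; matrix entries are chosen arbitrarily), the 2-norm of a matrix equals its largest singular value, and cond_2 equals the ratio of the largest to the smallest singular value, so cond_2 of an orthogonal matrix is 1:

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 2.0]])

# Singular value decomposition: A = U @ diag(s) @ Vt,
# with the singular values s sorted in decreasing order.
U, s, Vt = np.linalg.svd(A)

# The 2-norm of A is its largest singular value.
assert np.isclose(np.linalg.norm(A, 2), s[0])

# cond_2(A) = sigma_max / sigma_min.
assert np.isclose(np.linalg.cond(A, 2), s[0] / s[-1])

# All singular values of an orthogonal matrix equal 1, so cond_2(Q) = 1.
Q = np.linalg.qr(np.random.default_rng(0).normal(size=(3, 3)))[0]
print(np.round(np.linalg.cond(Q, 2), 6))  # 1.0
```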
17/10/24 Class 6 topics: expression of the SVD as a sum of column-row products, link with the rank, low-rank approximation; tensors (quick notions); convergence of a sequence of vectors; continuous functions, little-o notation, the derivative, differentiable functions, the first-order Taylor formula; partial derivatives, the gradient, the direction of maximum increase, level sets, the Hessian matrix; the Jacobian matrix, differentiable functions (vector-valued and of several variables), the differential.

Some references:

To complement the lectures, you can find plenty of good material on the web about Linear Algebra and Multivariable Calculus, such as:
- Lectures "Introduction to Probability, Statistics, and Machine Learning" by Samuel S. Watson, Brown University (lectures 2, 3, 4): https://data1010.github.io/class/
- Lectures "Mathematics for Machine Learning" by Garrett Thomas, University of California, Berkeley: https://gwthomas.github.io/docs/math4ml.pdf
- Book "Mathematics for Machine Learning" by M. P. Deisenroth, A. A. Faisal, and C. S. Ong, Cambridge University Press: https://mml-book.github.io/book/mml-book.pdf
- Notes "Linear Algebra Review and Reference" by Zico Kolter (updated by Chuong Do), Stanford University: https://cs229.stanford.edu/section/cs229-linalg.pdf
- Videos of Gilbert Strang's lectures on Linear Algebra, MIT: https://ocw.mit.edu/courses/18-06-linear-algebra-spring-2010/video_galleries/video-lectures/
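Going back to the Class 6 topics, the expression of the SVD as a sum of rank-one column-row products, and its truncation to obtain a low-rank approximation, can be sketched in NumPy as follows (an illustration with an arbitrary random matrix, not part of the course material):

```python
import numpy as np

rng = np.random.default_rng(42)
A = rng.normal(size=(6, 4))

# Thin SVD: A = U @ diag(s) @ Vt, singular values s in decreasing order.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# A equals the sum of the rank-one column-row products sigma_i * u_i v_i^T.
A_sum = sum(s[i] * np.outer(U[:, i], Vt[i, :]) for i in range(len(s)))
assert np.allclose(A, A_sum)

# Keeping only the first k terms gives a rank-k approximation of A;
# its 2-norm error is the (k+1)-th singular value (Eckart-Young theorem).
k = 2
A_k = sum(s[i] * np.outer(U[:, i], Vt[i, :]) for i in range(k))
assert np.linalg.matrix_rank(A_k) == k
assert np.isclose(np.linalg.norm(A - A_k, 2), s[k])
```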