18.065

Matrix Methods in Data Analysis, Signal Processing, and Machine Learning

Prof. Gilbert Strang | Spring 2018
Tags: Science & Math, Mathematics, Engineering, Electrical Engineering, Applied Mathematics, Computation, Linear Algebra, Signal Processing
Go to Original Course
CC BY-NC-SA 4.0
Course Description
Linear algebra concepts are key for understanding and creating machine learning algorithms, especially as applied to deep learning and neural networks. This course reviews linear algebra with applications to probability, statistics, and optimization, and above all a full explanation of deep learning.
Course Information
Source: MIT OpenCourseWare
Department: Mathematics
Language: English
Videos: 36
Course Videos (36)
1. Course Introduction of 18.065 by Professor Strang
2. An Interview with Gilbert Strang on Teaching Matrix Methods in Data Analysis, Signal Processing,...
3. Lecture 1: The Column Space of A Contains All Vectors Ax
4. Lecture 2: Multiplying and Factoring Matrices
5. 3. Orthonormal Columns in Q Give Q'Q = I
6. 4. Eigenvalues and Eigenvectors
7. 5. Positive Definite and Semidefinite Matrices
8. 6. Singular Value Decomposition (SVD)
9. 7. Eckart-Young: The Closest Rank k Matrix to A
10. Lecture 8: Norms of Vectors and Matrices
11. 9. Four Ways to Solve Least Squares Problems
12. Lecture 10: Survey of Difficulties with Ax = b
13. Lecture 11: Minimizing ‖x‖ Subject to Ax = b
14. 12. Computing Eigenvalues and Singular Values
15. Lecture 13: Randomized Matrix Multiplication
16. 14. Low Rank Changes in A and Its Inverse
17. 15. Matrices A(t) Depending on t, Derivative = dA/dt
18. 16. Derivatives of Inverse and Singular Values
19. Lecture 17: Rapidly Decreasing Singular Values
20. Lecture 18: Counting Parameters in SVD, LU, QR, Saddle Points
21. 19. Saddle Points Continued, Maxmin Principle
22. 20. Definitions and Inequalities
23. Lecture 21: Minimizing a Function Step by Step
24. 22. Gradient Descent: Downhill to a Minimum
25. 23. Accelerating Gradient Descent (Use Momentum)
26. 24. Linear Programming and Two-Person Games
27. 25. Stochastic Gradient Descent
28. 26. Structure of Neural Nets for Deep Learning
29. 27. Backpropagation: Find Partial Derivatives
30. Lecture 30: Completing a Rank-One Matrix, Circulants!
31. 31. Eigenvectors of Circulant Matrices: Fourier Matrix
32. Lecture 32: ImageNet is a Convolutional Neural Network (CNN), The Convolution Rule
33. 33. Neural Nets and the Learning Function
34. 34. Distance Matrices, Procrustes Problem
35. 35. Finding Clusters in Graphs
36. Lecture 36: Alan Edelman and Julia Language