[ 100k Special ] Transformers: Zero to Hero
- Published May 15, 2024
- Let's talk about transformers from scratch.
CODE: github.com/ajhalthor/Transfor...
0:00 Thank you for 100K!
0:29 Transformer Overview
12:27 Self Attention
26:40 Multihead Attention
39:31 Position Encoding
48:51 Layer Normalization
1:00:50 Architecture Deep Dive
1:27:56 Encoder Code
2:16:10 Decoder Code
2:54:11 Sentence Tokenization
3:12:26 Training and Inference
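The chapters above walk through self-attention, multi-head attention, and the rest of the transformer pipeline. As a quick orientation before watching, here is a minimal sketch of scaled dot-product self-attention in NumPy (an illustrative sketch, not the repo's code; function and variable names are my own):

```python
import numpy as np

def scaled_dot_product_attention(q, k, v, mask=None):
    """q, k, v: (seq_len, d_k) arrays. Returns (output, attention weights)."""
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)        # pairwise similarity, scaled by sqrt(d_k)
    if mask is not None:
        scores = scores + mask             # e.g. -inf above the diagonal for a decoder
    # row-wise softmax (subtract max for numerical stability)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ v, weights

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))            # 4 tokens, model dim 8
out, w = scaled_dot_product_attention(x, x, x)
print(out.shape)                           # (4, 8)
```

Multi-head attention (chapter at 26:40) runs this same operation several times in parallel on learned projections of q, k, and v, then concatenates the results.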
MATH COURSES (7 day free trial)
📕 Mathematics for Machine Learning: imp.i384100.net/MathML
📕 Calculus: imp.i384100.net/Calculus
📕 Statistics for Data Science: imp.i384100.net/AdvancedStati...
📕 Bayesian Statistics: imp.i384100.net/BayesianStati...
📕 Linear Algebra: imp.i384100.net/LinearAlgebra
📕 Probability: imp.i384100.net/Probability
OTHER RELATED COURSES (7 day free trial)
📕 ⭐ Deep Learning Specialization: imp.i384100.net/Deep-Learning
📕 Python for Everybody: imp.i384100.net/python
📕 MLOps Course: imp.i384100.net/MLOps
📕 Natural Language Processing (NLP): imp.i384100.net/NLP
📕 Machine Learning in Production: imp.i384100.net/MLProduction
📕 Data Science Specialization: imp.i384100.net/DataScience
📕 Tensorflow: imp.i384100.net/Tensorflow
#chatgpt #deeplearning #machinelearning #bert #gpt