Mathematics for Machine Learning Tutorial (3 Complete Courses in 1 video)

  • Published Jan 11, 2025

COMMENTS • 146

  • @MyLesson007  3 years ago  +245

    ------------ TIME STAMP -------------
    In the first course on Linear Algebra we look at what linear algebra is and how it relates to data.
    Then we look through what vectors and matrices are and how to work with them.
    COURSE 1
    MATHEMATICS FOR MACHINE LEARNING: LINEAR ALGEBRA
    INTRODUCTION TO LINEAR ALGEBRA AND TO MATHEMATICS FOR MACHINE LEARNING
    0:00:00 Introduction: Solving data science challenges with mathematics
    0:02:27 Motivations for linear algebra
    0:05:57 Getting a handle on vectors
    0:15:03 Operations with vectors
    0:26:32 Summary
    VECTORS ARE OBJECTS THAT MOVE AROUND SPACE
    0:27:37 Introduction to module 2 - Vectors
    0:28:27 Modulus & inner product
    0:38:28 Cosine & dot product
    0:44:21 Projection
    0:51:09 Changing basis
    1:02:34 Basis, Vector space, and linear independence
    1:06:47 Application of changing basis
    1:10:16 Summary
    MATRICES IN LINEAR ALGEBRA: OBJECTS THAT OPERATE ON VECTORS
    1:11:36 Matrices, Vectors, and solving simultaneous equation problems
    1:17:08 How matrices transform space
    1:22:49 Types of matrix transformation
    1:31:28 Composition or combination of matrix transformations
    1:40:28 Solving the apples and bananas problem: Gaussian elimination
    1:48:29 Going from Gaussian elimination to finding the inverse matrix
    1:57:07 Determinants and inverses
    2:07:44 Summary
    MATRICES MAKE LINEAR MAPPINGS
    2:08:43 Introduction: Einstein summation convention and the symmetry of the dot product
    2:18:37 Matrices changing basis
    2:29:52 Doing a transformation in a changed basis
    2:34:30 Orthogonal matrices
    2:41:10 The Gram-Schmidt process
    2:47:18 Example: Reflecting in a plane
    EIGENVALUES AND EIGENVECTORS: APPLICATION TO DATA PROBLEMS
    3:01:28 Welcome to Module 5
    3:02:20 What are eigenvalues and eigenvectors
    3:06:45 Special eigen-cases
    3:10:17 Calculating eigenvectors
    3:20:25 Changing to the eigenbasis
    3:26:17 Eigenbasis example
    3:33:43 Introduction to PageRank
    3:42:27 Summary
    3:43:42 Wrap up of this linear algebra course
    ----------------------------------------------
    The second course, Multivariate Calculus, builds on this to look at how to optimize fitting functions to get good fits to data.
    It starts from introductory calculus and then uses the matrices and vectors from the first course to look at data fitting.
    Course 2
    MULTIVARIATE CALCULUS
    WHAT IS CALCULUS?
    3:45:39 Welcome to Multivariate Calculus
    3:47:29 Welcome to Module 1
    3:48:33 Functions
    3:52:51 Rise Over Run
    3:57:48 Definition of a derivative
    4:08:30 Differentiation example & special cases
    4:16:19 Product rule
    4:20:27 Chain rule
    4:25:50 Taming a beast
    4:31:29 See you next module!
    MULTIVARIATE CALCULUS
    4:32:09 Welcome to Module 2!
    4:33:13 Variables, constants & context
    4:41:09 Differentiate with respect to anything
    4:45:53 The Jacobian
    4:51:42 Jacobian applied
    4:58:05 The Sandpit
    5:02:48 The Hessian
    5:08:27 Reality is hard
    5:13:04 See you next module!
    MULTIVARIATE CHAIN RULE AND ITS APPLICATIONS
    5:13:28 Welcome to Module 3!
    5:14:04 Multivariate chain rule
    5:16:43 More multivariate chain rule
    5:22:21 Simple neural networks
    5:28:13 More simple neural networks
    5:32:25 See you next module!
    TAYLOR SERIES AND LINEARISATION
    5:32:59 Welcome to Module 4!
    5:33:35 Building approximate functions
    5:37:03 Power Series
    5:40:41 Power series derivation
    5:49:50 Power series details
    5:56:04 Examples
    6:01:24 Linearisation
    6:06:41 Multivariate Taylor
    6:13:08 See you next module!
    INTRO TO OPTIMISATION
    6:13:36 Welcome to Module 5!
    6:21:51 Gradient Descent
    6:30:58 Constrained optimisation
    6:39:32 See you next module!
    REGRESSION
    6:41:40 Simple linear regression
    6:51:52 General non-linear least squares
    6:59:05 Doing least squares regression analysis in practice
    7:05:24 Wrap up of this course
    -------------------------------------------------------
    The third course, Dimensionality Reduction with Principal Component Analysis,
    uses the mathematics from the first two courses to compress high-dimensional data.
    This course is of intermediate difficulty and will require Python and numpy knowledge.
    COURSE 3
    Mathematics for Machine Learning: PCA
    STATISTICS OF DATASETS
    7:06:12 Introduction to the Course
    7:09:59 Welcome to module 1
    7:10:41 Mean of a dataset
    7:14:41 Variance of one-dimensional datasets
    7:19:36 Variance of higher-dimensional datasets
    7:24:52 Effect on the mean
    7:29:38 Effect on the (co)variance
    7:33:08 See you next module!
    INNER PRODUCTS
    7:33:35 Welcome to module 2
    7:35:24 Dot product
    7:40:07 Inner product definition
    7:45:09 Inner product: length of vectors
    7:52:17 Inner product: distance between vectors
    7:55:59 Inner product: angles and orthogonality
    8:01:41 Inner product of functions and random variables (optional)
    8:09:03 Heading for the next module!
    ORTHOGONAL PROJECTIONS
    8:09:38 Welcome to module 3
    8:10:19 Projection onto 1D subspaces
    8:18:02 Example projection onto 1D subspaces
    8:21:28 Projections onto higher-dimensional subspaces
    8:30:01 Example projection onto a 2D subspace
    8:33:53 This was module 3!
    PRINCIPAL COMPONENT ANALYSIS
    8:34:26 Welcome to module 4
    8:35:35 Problem setting and PCA objective
    8:43:20 Finding the coordinates of the projected data
    8:48:49 Reformulation of the objective
    8:59:15 Finding the basis vectors that span the principal subspace
    9:06:55 Steps of PCA
    9:11:02 PCA in high dimensions
    9:16:51 Other interpretations of PCA (optional)
    9:24:33 Summary of this module
    9:25:16 This was the course on PCA
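
The "Steps of PCA" entries above can be compressed into a short sketch (a toy NumPy example of my own, not the course's notebook): center the data, form the covariance matrix, take the top eigenvectors as the principal subspace, then project and reconstruct.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))             # toy dataset: 100 points in 3-D

# 1. Center the data
Xc = X - X.mean(axis=0)

# 2. Covariance matrix
S = Xc.T @ Xc / len(Xc)

# 3. Eigendecomposition; the principal axes are the top eigenvectors
vals, vecs = np.linalg.eigh(S)            # eigh returns ascending eigenvalues
order = np.argsort(vals)[::-1]
B = vecs[:, order[:2]]                    # orthonormal basis of the 2-D principal subspace

# 4. Project onto the principal subspace (codes), then reconstruct
codes = Xc @ B
X_hat = codes @ B.T + X.mean(axis=0)
print(X_hat.shape)  # prints (100, 3)
```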

    • @Rixnex  2 years ago  +3

      Can you add it to the description so it's easily accessible?

    • @RelaxationHavensys  1 year ago  +1

      No differential equations? I thought machine learning required differential equations

    • @metaverse413  1 year ago

      I really wish they'd add this to the description box

    • @i_chandanpatel  1 year ago  +1

      At least pin this comment

    • @a.p.i.bharathbhooshanmenon  6 months ago  +1

      A small correction - 8:21:28 Projections onto higher-dimensional subspaces

  • @LoicMat  1 year ago  +11

    Thank you so much for the video! For anyone wondering, the image is mirrored: check out 1:25:40, his (actual) right hand is the left hand in the video (on the side of the wrist watch)

    • @LoicMat  1 year ago  +3

      Oh and it's also explained explicitly at 2:47:35 :)

  • @marktahu2932  1 year ago  +41

    Really appreciate this as a refresher that was needed after 40 odd years. Thanks heaps for the straightforward approach and the clarity.

  • @annorome  1 year ago  +34

    Thank you! :) I clicked solely because of the very honest title of this video. MATHEMATICS for Machine Learning. That's how you generate the right "expectation" of students for a certain lecture/topic.

    • @joelausten  7 months ago

      yup the keyword machine learning makes me motivated, this video is legend anyways.

    • @GurijalaSudhakarrao  5 months ago

      Was this worthwhile for you in learning machine learning?

  • @MyLesson007  3 years ago  +8

    Deep Learning Specialization
    ua-cam.com/play/PLtS8Ubq2bIlUOQoopGBa_F2mQvdk6QeBw.html

    • @LaughnFurry  1 year ago  +1

      Kindly also share machine learning course...

  • @omar-elgammal  1 year ago  +23

    This is extremely useful! Thanks a lot, it helps me a lot with my MSc.

  • @alishera4673  1 year ago  +9

    this video is incredible. thank you all very much.

  • @mabd10  7 months ago  +4

    first lesson - 1:02:35
    second - 1:44:50

  • @NoamGonen  1 year ago  +15

    Awesome lecture, great refresher for long forgotten theory

  • @007myzorro  11 months ago  +4

    AWESOME, GENIAL. Makes things SO comprehensible, and so playfully.

  • @avgspaceloveronasabbatical  1 year ago  +10

    I learned a lot from this, thank you so much!

    • @joelausten  7 months ago

      what lesson do you take before this, I find this hard to understand.

  • @arpita1shrivas  10 months ago  +12

    im in 10th grade, thank you so much for making this, i took notes for the entire 9 hour lecture and finished today.

  • @karlodjo6040  3 years ago  +11

    😭😭😭 Thanks a lot guys. GOD BLESS YOU.

  • @masoomladkaaproudvegan2876  3 years ago  +13

    awesome professor thanks a lot

  • @rihanbouhaddouch5418  1 year ago  +5

    Thank you so much, this is really helpful.

    • @joelausten  7 months ago

      Do you have any suggestions to help me understand this math? I find it hard from the first 30 minutes, confusing.

  • @frederickmloka5454  6 months ago  +1

    Wow, I'm impressed by the way you explain things. I have learned a lot on this complex topic. Please put more videos on AI.

  • @WildOne777  3 years ago  +13

    Thank you sir. 😊 (Your videos on UA-cam and classes are great!)

    • @MyLesson007  3 years ago  +5

      Glad you like them!

    • @firstname4337  1 year ago  +1

      these are NOT his videos -- he stole them from coursera

  • @medrouabhi5699  1 year ago  +1

    thank you for this course, I find it very interesting

  • @mdfarhadhussain  1 year ago  +3

    This is the best 9 hours you will ever spend on linear algebra

  • @koffiflaimoiye5276  1 year ago  +2

    Thank you very much.

  • @moukailasadikou2554  10 months ago  +1

    Good job. Thank you so much

    • @joelausten  7 months ago

      How'd you understand the lesson? I find these lessons hard to understand, do you have any suggestions?

  • @constantin2002  6 months ago

    Thanks for the great explanation. Helped me a lot.

  • @kartiksinghvi9949  3 years ago  +4

    Thanks for the lecture 😊

  • @MajesticONE-ds4zt  5 months ago

    1:30:12 Why do some sources say the opposite? Even ChatGPT.
    They say counter-clockwise is positive, not negative. But here you said the positive one is the clockwise movement.
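
For what it's worth, the widely used convention is that a positive angle in the standard 2D rotation matrix rotates counter-clockwise, and a clockwise rotation corresponds to a negative angle (a quick NumPy check, my own sketch, not from the lecture):

```python
import numpy as np

def rotation(theta):
    # Standard 2D rotation matrix: positive theta rotates counter-clockwise
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

# Rotating the x-axis unit vector by +90 degrees lands on the y-axis (counter-clockwise)
v = rotation(np.pi / 2) @ np.array([1.0, 0.0])
print(np.round(v, 6))  # prints [0. 1.]
```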

  • @Code65115  2 years ago  +3

    This channel is really a savior

  • @komlatselougou8369  5 months ago

    great lecture. thanks a lot.

  • @vicentegabrielscalisi6182  1 year ago  +1

    Very good. Congratulations 🎈

  • @t.gossmann8731  11 months ago

    @7:24:47
    In the bottom right of the screen var(D) is calculated. If I hear it correctly, the speaker refers to it as the covariance matrix. Is this correct? Without a 2nd index it can only be the variance, as also written on the screen.
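
For reference: for a dataset of D-dimensional points, the natural generalisation of the variance is the covariance matrix, and its diagonal holds the per-dimension variances, so the two readings are consistent. A minimal NumPy check (toy data of my own, not from the video):

```python
import numpy as np

# Toy 2-D dataset: rows are data points, columns are dimensions
D = np.array([[1.0, 2.0],
              [3.0, 0.0],
              [5.0, 4.0]])

cov = np.cov(D, rowvar=False, bias=True)   # covariance matrix (population convention)
var_per_dim = D.var(axis=0)                # per-dimension variances

# The diagonal of the covariance matrix is exactly the per-dimension variance
print(np.allclose(np.diag(cov), var_per_dim))  # prints True
```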

  • @ramkirpallodhi3126  11 months ago  +1

    Great course

  • @0xRonin  6 months ago

    7:55:25 it should be square root of 4 = 2

  • @lelouchlamperouge9220  2 years ago  +179

    Pls add timestamps

  • @SonTran-bh5tt  4 months ago

    A little confused by the formula at 4:44:39, can you explain the alternative approach again? Thanks!

  • @robopsychology  1 year ago  +1

    excellent presentation, except perhaps the last part.

  • @SphereofTime  17 days ago

    4:45:53 Jacobian

  • @chrissemenov8901  1 year ago  +2

    02:25:00

  • @kathryncassidy1500  1 year ago  +1

    Amazing

  • @WithinEpsilon  1 year ago  +2

    Why is the third course exclusive to this compilation? I don't see it on the Imperial College London channel.

  • @TheOlmesartan  4 months ago

    Thanks 🙏

  • @007myzorro  1 year ago

    AWESOME PROFESSOR 🎉🎉🎉

  • @openyard  2 months ago

    Thank you.
    This technique of writing backwards from behind a glass wall is distracting; it is so unreal, and the handwriting becomes worse and smaller with time as fatigue sets in.
    But many people like it. And also this course is free 🙂 so no need to complain.

    • @dzlfiqar  5 days ago

      I think they just edit it by mirror-flipping the video

  • @BonesFrielinghaus  8 months ago

    Guys.... timestamps essential. Come onnn!

  • @Am_is_here_boi  4 months ago

    2:42:28 the funniest part, sir seems scared for a moment 🙂

  • @user_375a82  11 months ago

    Suddenly changes vector meaning from House [20 (cost), 40 (area), 45 (heating)] to spatial r, s etc. Where are i and j for the house example? So the price or heating has two orthogonal components? He skips over that point cleverly and sticks to the high-school-known routes of spatial vectors where i and j are x and y - obviously. It's not obvious for the price, heating. But he avoids that or says "it's the same thing".
    Then later r1s1 + r2s2 = r.s - lol, he's mad.
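
On that last identity: in 2-D, r1s1 + r2s2 is exactly the definition of the dot product, whatever the components represent, so the equation holds (a one-line NumPy check with made-up numbers):

```python
import numpy as np

r = np.array([3.0, 4.0])
s = np.array([2.0, -1.0])

# Componentwise products summed vs. the built-in dot product
assert r[0] * s[0] + r[1] * s[1] == np.dot(r, s)
print(np.dot(r, s))  # prints 2.0
```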

  • @arslanjutt4282  1 year ago

    Sir, please give one like this on statistics, geared towards data science rather than for statisticians

  • @abrilgonzalez7892  5 months ago

    Is there any site where we could pay for a certificate to certify the knowledge obtained in this course?
    Thank you!
    Awesome course, I love it!

  • @subramanianchenniappan4059  1 year ago

    Timestamps relieve headaches 😅😅

  • @sarkersaadahmed  1 year ago  +1

    13:14 could anyone explain what we are trying to understand from here

    • @joelausten  7 months ago

      How? I'm confused even in the first 20 minutes

  • @glennisholcomb592  1 year ago  +1

    I love the math.

  • @harshnaik6989  1 year ago  +2

    Thanks a LOTTTTTTTTT🙏🙏🙏🙏🙏🙏

  • @S.Carton_Esq.  1 year ago  +1

    A 9+ hour video needs a concept/lesson index in the run-time clock. Put one in, and I might consider watching.

    • @Turtlenigma  1 year ago  +4

      he put time stamps in the comments

    • @pectenmaximus231  9 months ago

      What an honour is your consideration

  • @karthikm627  1 year ago

    Please add timelines of topics 😊

  • @darrenlefcoe  4 months ago

    44:06.... r.s not s.s

  • @Shreya...1  11 months ago  +1

    So basically 11th and 12th maths is required.

  • @SphereofTime  1 year ago

    5:15:37

  • @PhantumKai  4 months ago

    Where are we supposed to find examples to work on?

  • @ppanda8427  8 months ago

    How did he write backwards through the whole video!

  • @mahender2517  3 years ago  +5

    Course material please, thank you

  • @alexe3332  1 year ago

    Seriously frustrating to see no application

  • @Cat_Sterling  1 year ago

    Which specialization is being addressed in the video? Is it a specialization on EdX or Coursera?

    • @kevinmcfarlane2752  1 year ago

      I don’t know about EdX but I recently completed this specialization (free track) on Coursera. Next up is Machine Learning Specialization. But I’m taking a diversion into some Generative AI courses.

    • @Cat_Sterling  1 year ago

      @@kevinmcfarlane2752 do you remember the name of this specialization which you completed on Coursera?

    • @svdfxd  11 months ago

      @@kevinmcfarlane2752 Do you have any suggestions for Generative AI course ?

  • @noah3528  8 months ago

    02:34:33

  • @whosestone  3 months ago

    21:44 , that's funny I'm only up 2 Js so far 😂

  • @pesgamer00  9 months ago  +2

    Is this video enough to learn machine learning?
    Anyone please give a reply 😢

    • @Jishanthegodev  7 months ago

      Yes

    • @joelausten  7 months ago

      @@Jishanthegodev do you understand the lesson that is taught in this video? I find it hard to understand. Do you have any recommendations?

    • @Jishanthegodev  7 months ago

      @@joelausten first start with the new video from @freecodecamp "linear algebra for ml, dl, gen ai" and solve easy to medium questions to get a good understanding of the topics and build confidence.
      Don't try to grasp all of this in one go; give this linear algebra a good amount of time, divide all the topics into different sections and complete it in 1-2 months.
      Then pick the "statistics for ds" from @datatab (yt - both inferential and descriptive) and understand the topics.
      After that pick calculus. For calculus I didn't find any good yt video, so I suggest you learn from books which have a good number of questions.

  • @Unknown-eh7rx  4 months ago

    I'm a second-year college student, can I understand it?

  • @warisulimam3440  10 months ago

    Where can I find the exercises they mention throughout the lectures?

    • @alokshandilya104  9 months ago

      on coursera

    • @joelausten  7 months ago

      @@alokshandilya104 needs payment

    • @joelausten  7 months ago

      I find this lesson so confusing, do you have any recommendations on what to learn first?

  • @rohanrana7067  9 months ago

    can anyone tell me what I should learn in maths for coding? I am working on statistics and probability.. do you need to know algebra, trigonometry, etc? As for simple interest and many more, we can just use the formula from ChatGPT or Google.. I am new here in coding.. I am in my 2nd year of BA.. can someone tell me if I need a BTech or IT background, or whether just skill and a certificate matter? I am doing a certificate course in web dev and dsa

    • @sugarmy9683  7 months ago

      you don't need any math for web dev, just high school level maths is more than enough

    • @rohanrana7067  7 months ago

      @@sugarmy9683 my question was about coding at an advanced level, like cybersecurity, ml, ai, deep learning, cloud etc

    • @Beyond-Kevin  7 months ago

      @@rohanrana7067 for ml, dl or some data analytics, this video is enough for you

  • @TheGladiator123  2 years ago  +4

    Does anyone know of a course like this for statistics?

    • @matattz  1 year ago

      ua-cam.com/video/rm9SYX4MDu0/v-deo.html&ab_channel=CarlosFernandez-Granda
      Carlos teaches statistics for Data Science and for me his way of teaching just works.

    • @kevinmcfarlane2752  1 year ago

      I've just done one on Coursera. Probability and Statistics. It’s also part 3 of their Math for Machine Learning Specialization.

    • @culturapoliticaycomputador9999  3 months ago

      ua-cam.com/video/LJa4_yGOmwo/v-deo.htmlsi=TFWaw8gZagTs20OK

  • @brospore7897  1 year ago

    Is he left handed and the image is mirrored? Or is he right handed and able to write on glass in reverse?

    • @LoicMat  1 year ago

      The image is mirrored: check out 1:25:40, he's using his right hand, which is the one with the watch

  • @PANDURANG99  6 months ago

    where is python code?

  • @sida_g567  3 years ago  +2

    Cool

  • @mawkuri5496  6 months ago

    dont tell me he was nancy pi when he was younger

  • @shuvraneelroy5  3 months ago

    Is this guy left handed or right handed? my take: left

  • @carlosruizmora3111  4 months ago

    Thanks Coursera, ehmmm I mean... Thanks My Lesson!

  • @CyberSec-zx6oy  1 year ago  +1

    Already lost at the start

    • @joelausten  7 months ago

      howww, I want to understand but I don't, even from the beginning

  • @amritwt  1 year ago  +2

    holy shit what a course

  • @master_braure  10 months ago

    15:27

  • @tarshsingh4198  1 year ago  +2

    8:09:12

  • @al-muktadir2543  3 months ago

    one question though, isn't that a panda?

  • @elemayelemay4229  1 year ago

    the car looks like alien's spaceship lmao

  • @LottoSense-h5m  3 months ago

    Lmao he was my lecturer during my time at imperial

  • @frafranildo  11 months ago

    I don't like that you spent so much time selling me why I should want to learn about vectors. Almost like I'm supposed to dislike it.

  • @alexzhang9318  1 year ago  +1

    Your 'b' looks like 6

  • @InfiniteWaveMusic  1 year ago

    Lol

  • @vedmishravlogging  1 year ago  +51


    • @egosumcamax  1 year ago  +2

      Up !

    • @aibui3087  9 months ago

      you're so cool

    • @curious42367  5 months ago

      man, man eased my problems
      hall of fame worthy............

  • @ravalik2103  9 days ago

    Thank you

  • @lokeshgubbala622  4 months ago

    Thanks🤝