Jake Potter
Field Axioms
Worst video quality ever. The audio isn't great either.
Fields! They've got everything you would ever want in a mathematical object.
This video is for the introduction to mathematical proof class that I am teaching.
Views: 186

Videos

Finding Polynomials of Best Fit Using Least Squares
Views: 5K · 4 years ago
This video explains how to find a polynomial of at most degree d which is as close to going through a set of points as possible. This video is part of a series of video lectures for the linear algebra class I am teaching.
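A minimal sketch of this kind of fit in NumPy; the points and the degree bound below are made up for illustration, not the video's example:

```python
import numpy as np

# Four made-up data points; we want the polynomial of degree at most 2
# that comes closest (in the least-squares sense) to passing through them.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 2.0, 0.0, 5.0])

# Vandermonde-style matrix whose columns are 1, x, x^2.
A = np.vander(x, 3, increasing=True)

# lstsq solves the normal equations A^T A c = A^T y for the coefficients.
c, *_ = np.linalg.lstsq(A, y, rcond=None)
print(c)  # coefficients of c0 + c1*x + c2*x^2 minimizing the sum of squared residuals
```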
Sum and Product of Eigenvalues
Views: 421 · 4 years ago
The trace of a matrix is the sum of the eigenvalues. The determinant is the product of the eigenvalues. I prove both of these facts in a relatively simple way. This video is part of a series of video lectures for the linear algebra class I am teaching.
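A quick numerical check of both facts; the random matrix here is arbitrary, not an example from the video:

```python
import numpy as np

# Verify trace = sum of eigenvalues and det = product of eigenvalues.
rng = np.random.default_rng(1)
A = rng.random((4, 4))
eigs = np.linalg.eigvals(A)

print(np.isclose(np.trace(A), eigs.sum()))        # True: trace = sum
print(np.isclose(np.linalg.det(A), eigs.prod()))  # True: det = product
```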
Polar Decomposition of 2x2 Matrices
Views: 668 · 4 years ago
The how and why of decomposing a 2x2 matrix which has non-real eigenvalues into a rotation matrix, a scalar matrix, and a pair of change of basis matrices. This video is part of a series of video lectures for the linear algebra class I am teaching.
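A sketch of this decomposition in NumPy, following the standard real 2x2 result A = P C P^{-1} with C = [[a, -b], [b, a]] for a complex eigenvalue a - bi; the matrix below is made up and may not match the video's example:

```python
import numpy as np

# A made-up 2x2 real matrix with non-real eigenvalues a +/- bi.
A = np.array([[1.0, -2.0],
              [1.0,  1.0]])

lam, vecs = np.linalg.eig(A)
i = int(np.argmin(lam.imag))            # pick the eigenvalue a - bi (b > 0)
a, b = lam[i].real, -lam[i].imag
v = vecs[:, i]

P = np.column_stack([v.real, v.imag])   # change-of-basis matrix
C = np.array([[a, -b], [b, a]])         # = r * (rotation by theta)
r = np.hypot(a, b)                      # scaling factor
theta = np.arctan2(b, a)                # rotation angle

print(np.allclose(A, P @ C @ np.linalg.inv(P)))   # True
print(np.allclose(C, r * np.array([[np.cos(theta), -np.sin(theta)],
                                   [np.sin(theta),  np.cos(theta)]])))  # True
```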
Markov Chains With Diagonalization
Views: 335 · 4 years ago
Analyzing Markov chain's long term behavior using diagonalization of the Markov matrix. This video is part of a series of video lectures for the linear algebra class I am teaching.
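A small illustration with a made-up 2-state Markov matrix (not necessarily the video's example); here the eigenvalues are 1 and 0.6, so the eigenvalue-1 term dominates in the long run:

```python
import numpy as np

# A made-up 2-state Markov matrix (columns sum to 1).
M = np.array([[0.9, 0.3],
              [0.1, 0.7]])

lam, P = np.linalg.eig(M)

# M^k = P D^k P^{-1}; the eigenvalue 0.6 term dies off as k grows.
k = 50
Mk = P @ np.diag(lam**k) @ np.linalg.inv(P)
print(Mk @ np.array([1.0, 0.0]))   # long-term distribution starting from state 1

# Steady state directly: the eigenvector for eigenvalue 1, rescaled to sum to 1.
v = P[:, np.argmin(np.abs(lam - 1))]
print(v / v.sum())                 # [0.75, 0.25]
```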
Fibonacci Formula Using Diagonalization
Views: 203 · 4 years ago
The process for solving a linear difference equation using diagonalization of a matrix. Also, there is an example where this is not quite possible. This video is part of a series of video lectures for the linear algebra class I am teaching.
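A sketch of the closed form (Binet's formula) that diagonalization produces; the matrix convention below is one common choice and may differ from the video's F:

```python
import numpy as np

# The Fibonacci recurrence f_{n+1} = f_n + f_{n-1} in matrix form.
F = np.array([[1, 1],
              [1, 0]])

# Diagonalizing F gives the closed form: its eigenvalues are
# phi = (1 + sqrt(5))/2 and psi = (1 - sqrt(5))/2.
phi = (1 + np.sqrt(5)) / 2
psi = (1 - np.sqrt(5)) / 2

def fib(n):
    return round((phi**n - psi**n) / np.sqrt(5))

print([fib(n) for n in range(10)])   # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
print(np.linalg.matrix_power(F, 9))  # [[f_10, f_9], [f_9, f_8]] = [[55, 34], [34, 21]]
```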
Eigenbases
Views: 146 · 4 years ago
Theorems about diagonalization and bases which contain only eigenvalues. This video is part of a series of video lectures for the linear algebra class I am teaching.
Diagonalization
Views: 59 · 4 years ago
Examples of how to diagonalize a matrix and an example of when we might want to do so. This video is part of a series of video lectures for the linear algebra class I am teaching.
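A minimal example of both steps in NumPy; the matrix is made up:

```python
import numpy as np

# Diagonalize a made-up matrix: A = P D P^{-1}.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
lam, P = np.linalg.eig(A)          # eigenvalues 5 and 2
D = np.diag(lam)
Pinv = np.linalg.inv(P)
print(np.allclose(A, P @ D @ Pinv))   # True

# One reason to diagonalize: powers become cheap, A^k = P D^k P^{-1}.
k = 6
print(np.allclose(np.linalg.matrix_power(A, k), P @ np.diag(lam**k) @ Pinv))  # True
```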
Eigenvectors and Eigenvalues (part II)
Views: 36 · 4 years ago
Theorems related to eigenvectors and eigenvalues. Example of a 2-d eigenspace. This video is part of a series of video lectures for the linear algebra class I am teaching.
Eigenvectors and Eigenvalues (part I)
Views: 64 · 4 years ago
Definitions and first examples of eigenvectors and eigenvalues. This video is part of a series of video lectures for the linear algebra class I am teaching.
Solving Nonhomogeneous LDEs
Views: 46 · 4 years ago
What to do when solving a linear difference equation which is not homogeneous. This video is part of a series of video lectures for the linear algebra class I am teaching.
Solving Homogeneous LDEs
Views: 57 · 4 years ago
Examples of solving homogeneous linear difference equations in the cases where the auxiliary equation has only real roots. This video is part of a series of video lectures for the linear algebra class I am teaching.
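A sketch of the distinct-real-roots case with a made-up recurrence (not necessarily the video's example):

```python
import numpy as np

# Made-up homogeneous LDE: y_{n+2} = y_{n+1} + 6*y_n.
# Auxiliary equation r^2 - r - 6 = 0 has real roots 3 and -2.
r1, r2 = np.roots([1, -1, -6])

# General solution y_n = c1*r1^n + c2*r2^n; pin down c1, c2 using y_0, y_1.
y0, y1 = 1.0, 8.0
c1, c2 = np.linalg.solve(np.array([[1.0, 1.0], [r1, r2]]), np.array([y0, y1]))

def y(n):
    return c1 * r1**n + c2 * r2**n

# Compare the closed form against iterating the recurrence directly.
seq = [y0, y1]
for _ in range(8):
    seq.append(seq[-1] + 6 * seq[-2])
print([round(y(n)) for n in range(10)])
print([round(v) for v in seq])        # same sequence
```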
Linear Difference Equations
Views: 214 · 4 years ago
A look at some properties of the set of sequences which satisfy this particular type of recurrence relation. This video is part of a series of video lectures for the linear algebra class I am teaching.
Change of Basis
Views: 35 · 4 years ago
An example illustrating what a change of basis matrix is.
Null and Column Spaces
Views: 50 · 4 years ago
Kernel and range of a linear transformation along with how these relate to null and column spaces of a matrix transformation. Examples of calculating bases of these spaces. This video is part of a series of video lectures for the linear algebra class I am teaching.
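A minimal computational illustration, using SciPy's null_space for the null-space basis; the matrix is made up:

```python
import numpy as np
from scipy.linalg import null_space

# A made-up matrix whose second row is twice the first, so rank 1.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])

N = null_space(A)                  # columns form an orthonormal basis of Null(A)
print(N.shape)                     # (3, 2): nullity 2, matching rank 1
print(np.allclose(A @ N, 0))       # True

# Col(A) is spanned by the pivot columns; with rank 1, the first column suffices.
print(np.linalg.matrix_rank(A))    # 1
```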
Abstract Vector Spaces
Views: 62 · 4 years ago
Abstract Vector Spaces
Review of Bases
Views: 36 · 4 years ago
Review of Bases
Review of Linear Independence
Views: 123 · 4 years ago
Review of Linear Independence

COMMENTS

  • @egonwombat8234 · 3 months ago

    Amazingly insightful - you finally opened my eyes! Thanks!

  • @naturelover4148 · 11 months ago

    Thanks a lot, but could you please explain why you multiplied by the transpose matrix on both sides and how that leads to the best-fitting least-squares polynomial...

  • @omkarparishwad5410 · 1 year ago

    Loved your session! Could you please explain why you say at the end that the equation from this method has the lowest sum of squares? What if a higher-degree polynomial or some other ML method gives a better curve fit?

    • @drjakepotter · 1 year ago

      The problem statement itself limits the scope of answers we want to consider. There is a cubic polynomial that exactly fits the 4 points, which would mean that the sum of the squares would be 0. If you allow any polynomial, you can fit any (finite) number of points (as long as the x-values are distinct). The least-squares method lets us do the "best" we can with a limit on the degree of the desired polynomial. In this case, the problem statement limits us to polynomials of degree at most 2, so higher-degree polynomials are not considered.

      The answer we get yields the lowest possible sum of squares because the result of the least-squares process is a projection. We are projecting the original point in \R^4 (thought of as the vector of 4 y-values) onto the 3-dimensional subspace of \R^4 which contains only those vectors corresponding to polynomials of degree at most 2. Whenever we project a point onto a subspace, the result is the point in the subspace closest to the original point. Closeness (distance) here means the square root of the sum of the squares of the residuals z_i, but this is minimized if and only if the sum of the squares (without the square root) is minimized. For why this minimization happens with projections, check out Khan Academy's video here: ua-cam.com/video/b269qpILOpk/v-deo.htmlsi=SnjOnOt7dy_KCJ78.
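
A sketch of this projection argument in symbols. The notation here is an assumption for illustration: A is the matrix whose columns are the powers of the x-values, y the vector of y-values, c the coefficient vector, and A^T A is invertible because the x-values are distinct:

```latex
% Multiplying both sides by A^T is exactly the orthogonality condition:
% the residual y - Ac is shortest when it is perpendicular to Col(A).
\[
  \hat{c} = \arg\min_{c} \lVert y - Ac \rVert^{2}
  \iff A^{\mathsf{T}}(y - A\hat{c}) = 0
  \iff A^{\mathsf{T}}A\,\hat{c} = A^{\mathsf{T}}y,
\]
\[
  A\hat{c} = A\,(A^{\mathsf{T}}A)^{-1}A^{\mathsf{T}}y
           = \operatorname{proj}_{\operatorname{Col}(A)} y.
\]
```

The first line is also, in effect, the answer to the transpose question asked above.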

  • @ironhammersgaming1391 · 2 years ago

    Great explanation!

  • @julianfludwig · 2 years ago

    That's exactly what I was looking for - thank you!

  • @drjakepotter · 3 years ago

    I hope you enjoyed this video! While the process and the formulas are correct, there is a mistake early in the example which throws off the numbers a bit. Correction (starting at @): the angle theta should be arctan(.54/.72) = arctan(3/4). In the video, I accidentally switched a and b. This also means that in the matrix R, each of the .6's should be .8's and each of the .8's should be .6's. Sorry!
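
A quick numerical check of the corrected values in the comment above:

```python
import numpy as np

theta = np.arctan2(0.54, 0.72)        # = arctan(3/4)
print(np.hypot(0.72, 0.54))           # 0.9, the scaling factor r
print(np.cos(theta), np.sin(theta))   # 0.8, 0.6 -- the corrected entries of R
```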

  • @jakesmith8778 · 3 years ago

    Correction: At 30:26, the formula for the volume of the new shape should instead be vol(AS) = |det(A)| * vol(S) to account for negative determinants.
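
A small illustration of why the absolute value is needed; the matrix is made up:

```python
import numpy as np

# A matrix with negative determinant.
A = np.array([[1.0, 2.0],
              [3.0, 1.0]])
print(np.linalg.det(A))            # -5.0

# The unit square S (volume 1) maps to the parallelogram spanned by the
# columns of A, whose area is |det(A)| = 5; a volume cannot be negative.
print(abs(np.linalg.det(A)))       # 5.0
```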

  • @drjakepotter · 3 years ago

    Hi! I hope you enjoyed the video. Here is a small correction: both entries in the bottom row of the matrix F should be 1's. This is a typo that remains on the board for the first 12.5 minutes of the video, but it does not affect any of the math. Sorry!