Linear Regression 1 [Python]

  • Published 14 Oct 2024

COMMENTS • 17

  • @CausticTitan
    @CausticTitan 4 years ago +3

    I'm still trying to wrap my head around the camera/display setup that allows him to write in front of himself but keeps it from looking backwards for us
    It really adds to the lecture!

    • @heinsaar
      @heinsaar 4 years ago +2

      There is no magical setup. It's done by flipping the original video (where everything _is_ backwards for us) along the vertical center.

    • @navinbondade5365
      @navinbondade5365 4 years ago

      Leo Heinsaar can you please explain in detail

    • @SkielCast
      @SkielCast 3 years ago +2

      @Tech Axis and @CausticTitan
      This is done following these steps:
      1. Use a transparent board compatible with markers.
      2. Put the camera behind it.
      3. Use a really dark background and also make the host wear dark clothes.
      4. Since the camera was behind the board, the recording comes out mirrored, so flip it horizontally (about its vertical axis) in post.
      5. Make some color adjustments so that the background and the clothes disappear without corrupting what's written.

  • @vivekkumbhar3809
    @vivekkumbhar3809 4 years ago

    Thanks for this video. I am just starting with machine learning, and this helped me understand Linear Regression along with a nice example. Also a very nice display setup.

  • @PaulNielan
    @PaulNielan 11 months ago

    For those experimenting in Mathematica, try these commands (from PEN):
    x = 3                                          (* true slope *)
    delta = .25                                    (* grid spacing *)
    a = Range[-2, 2, delta]                        (* independent variable *)
    b = x*a + 1*RandomReal[{-1, 1}, Length[a]]     (* noisy observations *)
    p1 = ListPlot[Transpose[{a, b}], PlotStyle -> Red,
      PlotLegends -> {"Raw Data"}]
    p2 = ListLinePlot[{Transpose[{a, x*a}]}, PlotStyle -> Blue,
      PlotLegends -> {"Exact"}]
    Show[p1, p2]
    amat = Transpose[{a}]                          (* data as a single-column matrix *)
    {U, S, V} = SingularValueDecomposition[amat, MatrixRank[amat]]
    xtilde = V . Inverse[S] . Transpose[U] . b // First   (* least-squares slope via the SVD pseudoinverse *)
    p3 = ListLinePlot[{Transpose[{a, xtilde*a}]}, PlotStyle -> Green,
      PlotLegends -> {"Regression"}]
    Show[p1, p2, p3]
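
    For those following along in Python instead of Mathematica, the same SVD-based fit can be sketched roughly as follows with NumPy (variable names mirror the snippet above; this is an illustrative sketch, not the lecture's notebook code):

    import numpy as np
    import matplotlib.pyplot as plt

    x = 3
    delta = 0.25
    a = np.arange(-2, 2 + delta, delta)                  # same grid as Range[-2, 2, delta]
    b = x * a + np.random.uniform(-1, 1, len(a))         # noisy observations

    amat = a.reshape(-1, 1)                              # data as a single-column matrix
    U, S, Vt = np.linalg.svd(amat, full_matrices=False)  # economy SVD
    xtilde = (Vt.T @ np.diag(1 / S) @ U.T @ b)[0]        # least-squares slope via the pseudoinverse

    plt.plot(a, b, 'ro', label='Raw Data')
    plt.plot(a, x * a, 'b-', label='Exact')
    plt.plot(a, xtilde * a, 'g-', label='Regression')
    plt.legend()
    plt.show()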

  • @GauravSharma-ui4yd
    @GauravSharma-ui4yd 4 years ago +2

    Does normalizing the data before applying SVD really make a difference? I mean, does it help to normalize first?

    • @alial-musawi9898
      @alial-musawi9898 4 years ago +2

      No. Ordinary Least Squares is scale-invariant.
      However, for Ridge Regression, you should normalize.
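
      A quick way to see this in scikit-learn: rescale one column and compare predictions. Ordinary least squares gives the same fit either way, while Ridge changes because its penalty acts on the (scale-dependent) coefficients. A minimal sketch with made-up data:

      import numpy as np
      from sklearn.linear_model import LinearRegression, Ridge

      rng = np.random.default_rng(0)
      X = rng.normal(size=(100, 2))
      y = 3 * X[:, 0] - 2 * X[:, 1] + rng.normal(scale=0.1, size=100)

      X_scaled = X.copy()
      X_scaled[:, 0] *= 1000  # rescale the first feature

      # OLS: predictions are unchanged (the coefficient rescales inversely)
      ols_a = LinearRegression().fit(X, y).predict(X)
      ols_b = LinearRegression().fit(X_scaled, y).predict(X_scaled)
      print(np.allclose(ols_a, ols_b))      # True

      # Ridge: the penalty depends on coefficient size, so rescaling changes the fit
      ridge_a = Ridge(alpha=1.0).fit(X, y).predict(X)
      ridge_b = Ridge(alpha=1.0).fit(X_scaled, y).predict(X_scaled)
      print(np.allclose(ridge_a, ridge_b))  # False (in general)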

  • @erockromulan9329
    @erockromulan9329 1 year ago

    I understand the reason behind only using some of the data to verify your model, but would it still be 'cheating' to use all of the data to develop a more robust model to predict other homes outside of the data set?

  • @threedworld2319
    @threedworld2319 4 years ago +2

    Where is the Jupyter notebook? I cannot find it on the website databookuw.com

    • @nskmodnar
      @nskmodnar 3 years ago

      pip install jupyter

  • @PetersonCharlesMONSTAH
    @PetersonCharlesMONSTAH 4 years ago +1

    Hey bro, can you help me out with from sklearn.linear_model import LinearRegression? I get an error message when using it.
    This is the error message I'm getting (from R and python):
    ModuleNotFoundError Traceback (most recent call last)
    in ()
    ----> 1 from sklearn.Linear_model import LinearRegression
          2 regressor = LinearRegression()
          3 regressor.fit(X_train, y_train)
    ModuleNotFoundError: No module named 'sklearn.Linear_model'
    Thanks bro.
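
    The traceback points at the capitalization: the scikit-learn module is lowercase linear_model, not Linear_model. A minimal self-contained sketch of the corrected import (synthetic data stands in for the notebook's X_train / y_train):

    import numpy as np
    from sklearn.linear_model import LinearRegression  # lowercase 'linear_model'

    # Synthetic stand-in for the training data used in the lecture notebook
    X_train = np.arange(10, dtype=float).reshape(-1, 1)
    y_train = 2.0 * X_train.ravel() + 1.0

    regressor = LinearRegression()
    regressor.fit(X_train, y_train)
    print(regressor.coef_, regressor.intercept_)  # approximately [2.0] and 1.0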

  • @Alexjohnson7929-p2n
    @Alexjohnson7929-p2n 1 year ago

    Color, LineWidth, MarkerSize:
    these should all be lowercase in the new version of matplotlib (see the sketch after this thread).

    • @dr.gordontaub1702
      @dr.gordontaub1702 2 months ago

      THANK YOU. This is the comment I was looking for to find out why my version of the code wasn't working.
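
    A minimal sketch of the lowercase keyword spelling that current matplotlib expects (synthetic data, not the book's code):

    import numpy as np
    import matplotlib.pyplot as plt

    x = np.linspace(0, 1, 20)
    y = 2 * x + 0.1 * np.random.randn(20)

    # Line2D properties are passed as lowercase keyword arguments
    plt.plot(x, y, 'o', color='red', linewidth=2, markersize=6)
    plt.show()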

  • @apoorvshrivastava3544
    @apoorvshrivastava3544 4 years ago

    good morning sir