Carlos Fernandez-Granda
Linear Regression: Coefficient Error
In this video we study the coefficients of linear regression models. Using real data, we show that these coefficients recover meaningful linear structure when the number of data points is large, but can be very noisy otherwise. We then provide a theoretical analysis and simulated-data experiments (where the response is a noisy linear function of the features) showing that (1) ordinary-least-squares estimation is unbiased, (2) the coefficient error decreases as the number of data points increases, and (3) the coefficient variance is large in directions where the features have low variance.
5 views
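The simulated-data experiment described above can be sketched in a few lines of NumPy (illustrative data and variable names, not the video's code): the response is a noisy linear function of the features, and averaging the OLS estimates over many trials recovers the true coefficients, illustrating unbiasedness.

```python
import numpy as np

rng = np.random.default_rng(0)
w_true = np.array([2.0, -1.0])          # true coefficients (illustrative)
n_samples, n_trials = 100, 500
estimates = np.zeros((n_trials, 2))
for t in range(n_trials):
    X = rng.normal(size=(n_samples, 2))                    # features
    y = X @ w_true + rng.normal(scale=0.5, size=n_samples) # noisy response
    estimates[t] = np.linalg.lstsq(X, y, rcond=None)[0]    # OLS fit

print(estimates.mean(axis=0))  # averages close to w_true
```

The spread of `estimates` across trials gives a direct look at the coefficient variance discussed in the video.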

Videos

Linear Regression: Explained Variance
17 views · 7 hours ago
We derive a decomposition of variance for linear models, which enables us to define the fraction of variance explained by the model. We also show how to compute the explained variance as a function of the mean squared error and explain how to use it to compare different linear models.
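The relationship mentioned above, that explained variance can be computed from the mean squared error, can be sketched as follows (simulated data and names are illustrative, not the video's):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=200)
y = 3.0 * x + rng.normal(size=200)       # noisy linear response
slope, intercept = np.polyfit(x, y, 1)   # least-squares line
mse = np.mean((y - (slope * x + intercept)) ** 2)
explained = 1.0 - mse / np.var(y)        # fraction of variance explained
```

A smaller MSE relative to the variance of the response yields a larger explained fraction, which is what makes this quantity useful for comparing linear models.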
LINEAR REGRESSION: ORDINARY LEAST SQUARES
29 views · 14 hours ago
We explain how to fit linear models to data, describing the relationship between ordinary least squares and linear minimum mean-squared-error estimation. We then apply ordinary least squares to obtain a linear model of premature mortality in the United States, as a function of smoking and income.
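A minimal ordinary-least-squares sketch (with illustrative simulated data, not the mortality dataset): solving the normal equations XᵀXw = Xᵀy gives the same coefficients as a library least-squares solver.

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(50, 3))
y = X @ np.array([1.0, 2.0, 3.0]) + rng.normal(size=50)  # noisy response
w_normal = np.linalg.solve(X.T @ X, X.T @ y)    # normal equations
w_lstsq = np.linalg.lstsq(X, y, rcond=None)[0]  # library solver
```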
Linear Regression: Linear Minimum MSE Estimation
23 views · 21 hours ago
We derive the optimal linear estimator, in terms of mean squared error, of a response given multiple features. We compute the estimator for a simple example, and discuss its properties when the features are uncorrelated, and when the features and the response are jointly Gaussian. Finally, we provide geometric intuition about the estimator from a linear-algebra perspective.
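For zero-mean features and response, the linear MMSE coefficients are the solution of Σₓβ = σₓᵧ, where Σₓ is the feature covariance and σₓᵧ the feature–response covariances. A sketch with simulated data (illustrative, not the video's example) shows that estimating these covariances recovers the true linear relationship:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5000
x = rng.normal(size=(n, 2))
y = 2.0 * x[:, 0] - 1.0 * x[:, 1] + rng.normal(size=n)  # noisy linear response
Sigma_x = np.cov(x, rowvar=False)                       # feature covariance
sigma_xy = np.array([np.cov(x[:, i], y)[0, 1] for i in range(2)])
beta = np.linalg.solve(Sigma_x, sigma_xy)               # LMMSE coefficients
```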
OVERVIEW OF DISCRETE AND CONTINUOUS RANDOM VARIABLES
48 views · 21 days ago
In this video we explain how to jointly represent and manipulate discrete and continuous quantities within the same probabilistic model, providing several examples with real data. We also describe two important applications of these ideas: mixture models for classification and regression, and Bayesian parametric modeling.
CLUSTERING VIA GAUSSIAN MIXTURE MODELS
48 views · 21 days ago
We explain how to perform clustering using Gaussian mixture models, describing the expectation-maximization algorithm, and showing examples on two simple real-world datasets.
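The two steps of the expectation-maximization algorithm can be made concrete with a bare-bones EM loop for a two-component one-dimensional Gaussian mixture (an illustrative sketch, not the video's code or datasets):

```python
import numpy as np

rng = np.random.default_rng(4)
# Two well-separated clusters (illustrative data)
data = np.concatenate([rng.normal(-3.0, 1.0, 300), rng.normal(3.0, 1.0, 300)])
mu = np.array([-1.0, 1.0])          # initial means
sigma = np.array([1.0, 1.0])        # initial standard deviations
weights = np.array([0.5, 0.5])      # initial mixing weights
for _ in range(50):
    # E-step: responsibility of each component for each point
    dens = weights * np.exp(-(data[:, None] - mu) ** 2 / (2 * sigma ** 2)) \
        / (sigma * np.sqrt(2 * np.pi))
    resp = dens / dens.sum(axis=1, keepdims=True)
    # M-step: re-estimate weights, means, and standard deviations
    nk = resp.sum(axis=0)
    weights = nk / len(data)
    mu = (resp * data[:, None]).sum(axis=0) / nk
    sigma = np.sqrt((resp * (data[:, None] - mu) ** 2).sum(axis=0) / nk)
```

After convergence, each point is assigned to the component with the highest responsibility, which yields the clustering.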
Classification via Gaussian Discriminant Analysis with an Application to Alzheimer's Diagnosis
26 views · 28 days ago
We explain how to use Gaussian mixture models to perform classification, describing both quadratic and linear discriminant analysis. We illustrate these methods by using them to diagnose Alzheimer's disease using brain-region volumes extracted from 3D magnetic-resonance imaging scans.
OVERVIEW OF PRINCIPAL COMPONENT ANALYSIS AND LOW-RANK MODELS
53 views · a month ago
The video describes how to analyze data with multiple features, using the covariance matrix and principal component analysis (PCA) to identify directions of maximum variance. It also introduces low-rank models, describes their connection to PCA, and explains how to use them for collaborative filtering and matrix completion.
MATRIX COMPLETION FOR COLLABORATIVE FILTERING
35 views · a month ago
We show that collaborative filtering in recommender systems can be formulated as a matrix completion problem. Then, we explain how to use low-rank models to solve this problem and apply them to estimate movie ratings from simulated and real data.
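One standard way to complete a matrix with a low-rank model is iterative SVD imputation: fill the missing entries with a truncated-SVD approximation and repeat. The sketch below uses a simulated rank-2 "ratings" matrix (an illustrative assumption, not the video's exact method or data):

```python
import numpy as np

rng = np.random.default_rng(5)
M = rng.normal(size=(30, 2)) @ rng.normal(size=(2, 20))  # rank-2 ground truth
mask = rng.random(M.shape) < 0.5                         # observed entries
X = np.where(mask, M, 0.0)                               # start: missing = 0
for _ in range(200):
    u, s, vt = np.linalg.svd(X, full_matrices=False)
    low_rank = (u[:, :2] * s[:2]) @ vt[:2]               # best rank-2 fit
    X = np.where(mask, M, low_rank)                      # keep observed entries
```

The unobserved entries of `low_rank` serve as the estimated ratings.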
Low-Rank Models
63 views · a month ago
We describe low-rank models, explain their connection to principal component analysis and show how to fit them to data using the singular value decomposition. As an illustration, we apply a low-rank model to a real-world temperature dataset, where it automatically identifies meaningful patterns such as seasonality.
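Fitting a low-rank model with the singular value decomposition can be sketched as follows: truncating the SVD gives the best rank-k approximation in Frobenius norm (the Eckart–Young theorem). The synthetic near-rank-3 matrix is illustrative, not the temperature dataset.

```python
import numpy as np

rng = np.random.default_rng(6)
A = rng.normal(size=(40, 3)) @ rng.normal(size=(3, 25))  # rank-3 signal
A += 0.01 * rng.normal(size=A.shape)                     # small perturbation
u, s, vt = np.linalg.svd(A, full_matrices=False)
rank3 = (u[:, :3] * s[:3]) @ vt[:3]                      # rank-3 approximation
residual = np.linalg.norm(A - rank3) / np.linalg.norm(A) # relative error
```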
COBRA: Confidence-based Anomaly Detection for Medicine
204 views · 6 months ago
We present a method to perform automatic assessment of impairment and disease severity using AI models trained only on healthy individuals. The COnfidence-Based chaRacterization of Anomalies (COBRA) score exploits the decrease in confidence of these models when processing data from impaired or diseased patients to quantify their deviation from the healthy population. We show that the COBRA scor...
Dimensionality Reduction via Principal Component Analysis
241 views · a year ago
Datasets with high dimensionality, where each data point is associated with many features, can be difficult to process and analyze. The goal of dimensionality reduction is to embed such data in a lower-dimensional space, while preserving as much information as possible. In this video we explain how to use principal component analysis (PCA) for this purpose, illustrating the technique on two datas...
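The PCA embedding described above can be sketched in NumPy (illustrative data): center the points, eigendecompose the sample covariance, and project onto the top principal directions.

```python
import numpy as np

rng = np.random.default_rng(7)
X = rng.normal(size=(200, 5)) @ rng.normal(size=(5, 5))  # correlated features
Xc = X - X.mean(axis=0)                 # center the data
cov = Xc.T @ Xc / (len(Xc) - 1)         # sample covariance matrix
eigval, eigvec = np.linalg.eigh(cov)    # eigenvalues in ascending order
W = eigvec[:, ::-1][:, :2]              # top-2 principal directions
Z = Xc @ W                              # 2-D embedding
```

The sample variance of each column of `Z` equals the corresponding eigenvalue, which is the "preserved information" PCA maximizes.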
The Mathematics Behind Principal Component Analysis
208 views · a year ago
We provide an intuitive proof of the spectral theorem, which is the mathematical foundation of principal component analysis. This theorem states that the maximum of a quadratic form on the unit sphere lies in the direction of an eigenvector of the corresponding matrix. We show that this is necessarily the case using a simple geometric argument. Photo by Michal Matlon on Unsplash
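The statement proved in the video can be written compactly as follows (standard notation, assuming A is a symmetric matrix with largest eigenvalue λ_max and corresponding unit eigenvector u₁):

```latex
\max_{\|v\|_2 = 1} v^{\mathsf{T}} A v \;=\; \lambda_{\max}(A),
\qquad \text{with the maximum attained at } v = u_1 .
```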
Principal Component Analysis
153 views · a year ago
We describe principal component analysis (PCA), a popular technique to process multidimensional data. We explain that PCA identifies the directions of maximum variance of a random vector via the eigendecomposition of its covariance matrix. We then show that applying PCA to the sample covariance matrix of a dataset reveals the components with maximum sample variance. Photo by Steve Smith on Unsp...
The Covariance Matrix
266 views · a year ago
In this video we define the covariance matrix of a random vector. We show that this matrix encodes the variance of any linear combination of the random vector, and also the variance of the vector in any direction. We then explain how to compute the sample covariance matrix from a dataset, where each data point consists of multiple features. Finally, we show that the sample variance encodes the ...
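The property described above, that the covariance matrix encodes the variance in any direction, can be checked numerically: for any unit vector v, the quadratic form vᵀSv computed from the sample covariance S equals the sample variance of the data projected onto v. (The data below are illustrative.)

```python
import numpy as np

rng = np.random.default_rng(8)
X = rng.normal(size=(500, 3)) @ np.array([[2.0, 0.0, 0.0],
                                          [1.0, 1.0, 0.0],
                                          [0.0, 0.0, 0.5]])
S = np.cov(X, rowvar=False)            # sample covariance matrix
v = np.array([1.0, 2.0, -1.0])
v /= np.linalg.norm(v)                 # unit direction
directional_var = v @ S @ v            # predicted by the covariance matrix
empirical_var = np.var(X @ v, ddof=1)  # variance of the projected data
```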
Overview of Hypothesis Testing
90 views · a year ago
P-Value Abuse! Practical Significance and P Hacking
143 views · a year ago
HYPOTHESIS TESTING AND CAUSAL INFERENCE
179 views · a year ago
MULTIPLE TESTING: EVALUATING NBA PLAYERS
75 views · a year ago
No Model? No Problem! The Permutation Test
151 views · a year ago
THE POWER OF HYPOTHESIS TESTS
89 views · a year ago
Statistical Significance: Why the P Value Controls False Positives
140 views · a year ago
A Two-Sample Test for Antetokounmpo's Free Throws
105 views · a year ago
THE NULL HYPOTHESIS AND THE P VALUE
156 views · a year ago
ESTIMATION OF POPULATION PARAMETERS FROM RANDOM SAMPLES
129 views · a year ago
Bootstrap Confidence Intervals
155 views · a year ago
THE BOOTSTRAP
144 views · a year ago
CONFIDENCE INTERVALS FOR PROBABILITIES AND PROPORTIONS
83 views · a year ago
CONFIDENCE INTERVALS
116 views · a year ago
How Not To Estimate Risk
108 views · a year ago

COMMENTS

  • @ikk_ikk
    @ikk_ikk 4 days ago

    Excellent

  • @MatjazZupančičMuc
    @MatjazZupančičMuc 5 days ago

    Brilliant! Thanks!

  • @neelotpaldas2710
    @neelotpaldas2710 8 days ago

    I learnt how you give presentations

  • @MatjazZupančičMuc
    @MatjazZupančičMuc 14 days ago

    Thanks! There is a typo at 8:40; vector x should be d-dimensional, not n-dimensional.

  • @matjazmuc-7124
    @matjazmuc-7124 18 days ago

    Thanks! The bootstrap percentile confidence interval part was a bit hard to follow, however I think I got the idea after watching it a second time. I think it might be easier to understand if the derivation was hand-written (and thus presented at a slower pace) :)

  • @matjazmuc-7124
    @matjazmuc-7124 20 days ago

    Thanks for the great explanation! I think there is a small mistake at 28:24 -1/2 * (a^2 + (s-a)^2) should equal -a^2 + as - s^2/2.

  • @markneumann381
    @markneumann381 a month ago

    Enjoy these sessions on probability very much. Thank you for both these videos and your very well written book.

  • @xiamojq621
    @xiamojq621 a month ago

    Thanks very much sir, it helps clarify a lot of what is in your textbook

  • @thomasbates9189
    @thomasbates9189 2 months ago

    Thank you for this video!

  • @SILOETTE100page
    @SILOETTE100page 2 months ago

    Really amazing and intuitive explanations Carlos 🙏 Thank you so much for your helpful videos, they are lifesavers

  • @paulasuero9169
    @paulasuero9169 3 months ago

    You're amazing! I'm taking nonparametric methods and your videos help a lot

  • @rohitaitou553
    @rohitaitou553 3 months ago

    Are there any recommended exercises to go along with the videos? Or a summary project in R?

  • @thomasbates9189
    @thomasbates9189 5 months ago

    Thank you for this video!

  • @thomasbates9189
    @thomasbates9189 5 months ago

    Thank you for making these videos available to the public!

  • @psinity
    @psinity 5 months ago

    thanks a lot !

  • @ahmadmahagna1255
    @ahmadmahagna1255 5 months ago

    somebody give this guy a nobel prize ASAP.

  • @matjazmuc-7124
    @matjazmuc-7124 5 months ago

    brilliant!

  • @thuinanutshelll
    @thuinanutshelll 5 months ago

    thank you so much for making this video! it’s so clear and easy to understand

  • @stavrostsalapatis5030
    @stavrostsalapatis5030 5 months ago

    Great work Mr. Fernandez- Granda!

  • @OuailOucherif
    @OuailOucherif 6 months ago

    Thank you very much for sharing such valuable knowledge with us Mister Carlos.

  • @johnsutor203
    @johnsutor203 6 months ago

    Nothing gets me more hype than a new CFG video dropping.

  • @loukaskontogiannis7115
    @loukaskontogiannis7115 7 months ago

    Carlos, this video is a gem! As you mentioned in the beginning of the video, I came across the mathematical definition of Borel sets in the first few pages of the book "Statistical Inference" by Casella and Berger, and I was stumped. Now, after listening to your explanation I totally get it. I especially liked your early slide on motivation, as it provided the context for appreciating why Borel sets are useful. I can't thank you enough. Brilliant work 👍

  • @muttdev
    @muttdev 9 months ago

    Very good explanation!

  • @ConstanceQuinn-rg2we
    @ConstanceQuinn-rg2we 9 months ago

    thank you!

  • @user-wr4yl7tx3w
    @user-wr4yl7tx3w 9 months ago

    This is really high-quality content.

  • @SaitovicJan
    @SaitovicJan 9 months ago

    Awesome, thank you

  • @OtterMorrisDance
    @OtterMorrisDance a year ago

    All the videos in this series are very good, thanks for creating them!

  • @OtterMorrisDance
    @OtterMorrisDance a year ago

    Nicely explained!

  • @ytenergy444
    @ytenergy444 a year ago

    Thank you

  • @jakeaustria5445
    @jakeaustria5445 a year ago

    I really need this in my research, Thank you.

  • @matattz
    @matattz a year ago

    I can’t believe I am enjoying watching statistics videos. The way you teach is perfect for me! Let’s see how far I’ll come

  • @matattz
    @matattz a year ago

    Typo at @16:12 event C should be h-h and t-t, not t-h.

  • @matattz
    @matattz a year ago

    Amazing!

  • @matattz
    @matattz a year ago

    If the rest of the playlist is like the first video, I literally found gold! Thank you so much; the way you teach is so chill and easy to follow, yet so informative. That's the way you should do it!

  • @InoceramusGigas
    @InoceramusGigas a year ago

    Spectacular video. Thank you for posting this explanation!

  • @kenrutherford1109
    @kenrutherford1109 a year ago

    Use of all caps makes it look like you're shouting

  • @ericklopez9754
    @ericklopez9754 a year ago

    Best video on youtube on this subject!!!

  • @chaoyinggu747
    @chaoyinggu747 a year ago

    very clear explanation and very helpful video! thanks!

  • @nicolasrojas8178
    @nicolasrojas8178 a year ago

    I had some doubts and this was super useful. Thank you Carlos. Subscribed!

  • @suryatripathi3246
    @suryatripathi3246 a year ago

    So well explained! Loved it

  • @thefantasticman
    @thefantasticman a year ago

    Thank you sir, I have been searching for this topic for hours

  • @senzhan221
    @senzhan221 a year ago

    great content

  • @ankitrana2748
    @ankitrana2748 a year ago

    While computing the subgradient of |X| at x=0, why did you write the inequality as f(0 +y) >= f(0) + g(y-0)? On the LHS, how is it f(0+y) and not just f(y) ?

  • @ad2409
    @ad2409 a year ago

    Thank you so much for this video! I've not seen such a comprehensive guide on modelling with earthquakes anywhere else!!!

  • @divyaagarwal6114
    @divyaagarwal6114 a year ago

    Can you please tell me if this playlist is different from the one you marked as an old playlist? Will we miss some topics if we study only one of them?

  • @axelsamuelsantillanmartine5144

    Thank you for your material on YouTube. I have been using your book on probability and statistics for data science and it helps me quite a bit. Greetings from Mexico.

  • @tybaltmercutio
    @tybaltmercutio a year ago

    Very nice explanations and lecture. Thank you! Are the lecture notes/slides available anywhere?

  • @user-wr4yl7tx3w
    @user-wr4yl7tx3w a year ago

    Great topic. Very interested in causal inference

  • @AnthonyChen-gz7bx
    @AnthonyChen-gz7bx a year ago

    The sun would have engulfed 😂

  • @AryoZare
    @AryoZare a year ago

    Thanks a lot.