Bill Basener
M1 Code Walkthrough
Walking through the code for Module 1 in Statistical Learning for Remote Sensing. We take the time to discuss details of many common Python commands, especially for working with image arrays and plots.
Views: 33
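As a taste of the kinds of commands covered, here is a small illustrative sketch in Python (not the module's actual notebook; the image array here is synthetic):

```python
import numpy as np
import matplotlib.pyplot as plt

# A synthetic "image": 100 x 120 pixels with 3 bands, shape (rows, cols, bands)
im = np.random.rand(100, 120, 3)

print(im.shape)       # (100, 120, 3): dimensions of the array
print(im[0, 0, :])    # the pixel values (all bands) at row 0, column 0

# Display the image next to a histogram of its first band
fig, axes = plt.subplots(1, 2, figsize=(8, 3))
axes[0].imshow(im)
axes[0].set_title("Image")
axes[1].hist(im[:, :, 0].ravel(), bins=30)
axes[1].set_title("Band 0 histogram")
plt.tight_layout()
plt.show()
```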

Videos

HyperspectralPy - Open image and Create Regions of Interest
Views 44 • 5 months ago
This is a quick tutorial on how to open a hyperspectral image, create ROIs, and view a spectral library using the open-source HyperspectralPy GUI-based software that can be installed via pip install.
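For readers who would rather script similar steps than use the GUI, here is a rough sketch using the separate Spectral Python (spectral) package rather than HyperspectralPy's own API; the ENVI file name is a placeholder:

```python
import spectral

img = spectral.open_image("scene.hdr")   # ENVI header + data file pair
cube = img.load()                        # full (rows, cols, bands) array in memory
print(cube.shape)

pixel_spectrum = cube[50, 60, :]         # spectrum at row 50, column 60
band10 = img.read_band(10)               # a single band as a 2-D array
```
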
YOLO8 Object Detection with LiDAR - Part 4
Views 331 • 9 months ago
This is video #4 in our series on object detection in LiDAR data with YOLO8. I will demonstrate how to train your own YOLO8 model on your labeled LiDAR data.
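As a rough sketch of the training step with the Ultralytics package (the dataset YAML name is a placeholder, and the hyperparameters are illustrative rather than the exact ones used in the video):

```python
from ultralytics import YOLO

# Start from a pretrained checkpoint and fine-tune on the labeled LiDAR rasters.
model = YOLO("yolov8n.pt")
model.train(data="lidar_dataset.yaml",   # points to train/val images and class names
            epochs=100,
            imgsz=640)

metrics = model.val()                              # evaluate on the validation split
results = model.predict("new_lidar_raster.png")    # run detection on a new raster
```
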
YOLO8 Object Detection with LiDAR - Part 3
Views 430 • 9 months ago
This is video #3 in our series on object detection in LiDAR data with YOLO8. I will demonstrate how to label LiDAR raster files using Roboflow.
YOLO8 Object Detection with LiDAR - Part 2
Views 535 • 9 months ago
This is video #2 in our series on object detection in LiDAR data with YOLO8. I will demonstrate how to convert a LiDAR point cloud to a raster file for use in YOLO8.
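A minimal sketch of one way to do this conversion in Python with laspy, NumPy, and Pillow (not necessarily the exact approach from the video; the file names and the 1-unit cell size are assumptions):

```python
import laspy
import numpy as np
from PIL import Image

las = laspy.read("points.las")                      # laspy 2.x API
x, y, z = np.asarray(las.x), np.asarray(las.y), np.asarray(las.z)

res = 1.0                                           # grid cell size in map units
cols = ((x - x.min()) / res).astype(int)
rows = ((y.max() - y) / res).astype(int)            # flip y so north is up

# Keep the highest return in each cell (a simple digital surface model).
grid = np.full((rows.max() + 1, cols.max() + 1), -np.inf)
np.maximum.at(grid, (rows, cols), z)

# Scale the filled cells to 0-255 and write a grayscale raster image.
valid = np.isfinite(grid)
img = np.zeros(grid.shape, dtype=np.uint8)
img[valid] = np.interp(grid[valid], (grid[valid].min(), grid[valid].max()), (0, 255))
Image.fromarray(img).save("lidar_raster.png")
```

Choosing the cell size trades spatial detail against empty cells; for real data you would typically match it to the point density.
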
YOLO8 Object Detection with LiDAR - Part 1
Views 336 • 9 months ago
This is video #1 in our series on object detection in LiDAR data with YOLO8. I will demonstrate how to download LiDAR data from the USGS website.
TensorFlow Tutorial Pt.1
Views 215 • 1 year ago
Demo of how to code and optimize neural networks in TensorFlow. In this first part we discuss a classification network. All code is available in my GitHub at: github.com/wbasener/Neural-Netork-From-Scratch-in-Python/blob/main/M2_6_Tutorial_neural_nets_with_keras.ipynb. I assume you know what a neural network is, but little prior coding is required. The goal is to get over the initial challenge ...
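A minimal classification sketch in Keras along these lines (the dataset and layer sizes here are assumptions, not necessarily those in the notebook):

```python
from tensorflow import keras

# Fashion-MNIST as a stand-in image classification dataset.
(x_train, y_train), (x_test, y_test) = keras.datasets.fashion_mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0   # scale pixels to [0, 1]

model = keras.Sequential([
    keras.layers.Flatten(),                          # 28x28 image -> 784 vector
    keras.layers.Dense(128, activation="relu"),
    keras.layers.Dense(10, activation="softmax"),    # one probability per class
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=5, validation_split=0.1)
print(model.evaluate(x_test, y_test))
```
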
TensorFlow Tutorial Pt.2
Views 21 • 1 year ago
Demo of how to code and optimize neural networks in TensorFlow - Part 2 focuses on regression using a California house price dataset. All code is available in my GitHub at: github.com/wbasener/Neural-Ne.... I assume you know what a neural network is, but little prior coding is required. The goal is to get over the initial challenge in coding using TensorFlow that many people encounter. We go ov...
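A minimal regression sketch along these lines, using scikit-learn's built-in California housing data (the architecture is an assumption, not necessarily the notebook's):

```python
from tensorflow import keras
from sklearn.datasets import fetch_california_housing
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

# 8 numeric features per district; the target is median house value.
X, y = fetch_california_housing(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

scaler = StandardScaler()                 # standardize features for stable training
X_train = scaler.fit_transform(X_train)
X_test = scaler.transform(X_test)

model = keras.Sequential([
    keras.layers.Dense(30, activation="relu"),
    keras.layers.Dense(1),                # single linear output for regression
])
model.compile(loss="mse", optimizer="adam", metrics=["mae"])
model.fit(X_train, y_train, epochs=20, validation_split=0.1)
print(model.evaluate(X_test, y_test))
```
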
TensorFlow Tutorial Pt.3
Views 21 • 1 year ago
Demo of how to code and optimize neural networks in TensorFlow - Part 3 focuses on optimizing parameters using random grid search and Bayesian optimization. All code is available in my GitHub at: github.com/wbasener/Neural-Ne.... I assume you know what a neural network is, but little prior coding is required. The goal is to get over the initial challenge in coding using TensorFlow that many peo...
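A minimal sketch of this kind of tuning using the KerasTuner package (an assumption; the notebook may use a different tool, model, and search space), with synthetic data to keep it self-contained:

```python
import numpy as np
import keras_tuner as kt
from tensorflow import keras

rng = np.random.default_rng(0)
X_train = rng.random((500, 8))
y_train = X_train @ rng.random(8) + 0.1 * rng.standard_normal(500)

def build_model(hp):
    # The tuner picks the layer width and learning rate from these ranges.
    model = keras.Sequential([
        keras.layers.Dense(hp.Int("units", 16, 128, step=16), activation="relu"),
        keras.layers.Dense(1),
    ])
    lr = hp.Float("lr", 1e-4, 1e-2, sampling="log")
    model.compile(optimizer=keras.optimizers.Adam(learning_rate=lr), loss="mse")
    return model

# Random search over the space; swap in kt.BayesianOptimization(...) with the same
# arguments for the Bayesian-optimization variant.
tuner = kt.RandomSearch(build_model, objective="val_loss", max_trials=10, overwrite=True)
tuner.search(X_train, y_train, epochs=20, validation_split=0.2, verbose=0)

best_model = tuner.get_best_models(num_models=1)[0]
print(tuner.get_best_hyperparameters(1)[0].values)
```
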
LIDAR GRAND CHALLENGE
Views 37 • 2 years ago
temp upload for LiDAR Grand Challenge
Bills D Smother Ravens - were they good or lucky?
Views 72 • 3 years ago
In this video we take a look at the Bills' performance in 2020 and see what most predictions of the Bills-Ravens game got wrong. Hint - the Bills have a great defense now, and we can see it is a trend by peeking a little at their season stats. (Also, sorry about the audio. I used a headset instead of my mic to avoid some background noise.)
Josh Allen, by the Numbers
Views 427 • 3 years ago
Who are the best QBs in the NFL? Is Josh Allen elite? We take a look at the numbers to see who makes the cut after the 13th week of the 2020 season. I am a Prof. of Data Science at the University of Virginia and Emeritus Prof. of Math at the Rochester Institute of Technology.
Machine Learning 10.1 - Exploratory Data Analysis
Views 111 • 3 years ago
In this video, you will learn tools for exploratory data analysis. These tools allow a person to view data and look for trends and structures. Here, you will explore the terminology and goals for visualizations and unsupervised learning.
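As a small illustration of such tools in Python with pandas and seaborn (the course labs themselves are in R, and the iris table here is just an example dataset):

```python
import seaborn as sns
import matplotlib.pyplot as plt

df = sns.load_dataset("iris")        # small example table (downloaded on first use)

print(df.head())                     # peek at the raw rows
print(df.describe())                 # summary statistics for each numeric column
print(df["species"].value_counts())  # class balance

sns.pairplot(df, hue="species")      # pairwise scatterplots to look for structure
plt.show()
```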
Machine Learning 10.2 - PCA Visualizations
Views 207 • 3 years ago
In this video, we will use PCA (Principal Components Analysis) for dimension reduction and to view high-dimensional data. We used principal components in Module 4 as part of the underlying math for Gaussian regression methods, and we used PCA for regularization in Module 6. Principal components provide a useful mathematical framework for modeling/measuring the shape of data, and, in this module...
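A quick illustration of PCA for two-dimensional visualization with scikit-learn (an example sketch in Python; the lecture's own lab is in R):

```python
import matplotlib.pyplot as plt
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

X, y = load_iris(return_X_y=True)
X_std = StandardScaler().fit_transform(X)   # PCA is sensitive to feature scale

pca = PCA(n_components=2)
scores = pca.fit_transform(X_std)           # project onto the top two components
print(pca.explained_variance_ratio_)        # share of variance each PC captures

plt.scatter(scores[:, 0], scores[:, 1], c=y)
plt.xlabel("PC 1")
plt.ylabel("PC 2")
plt.show()
```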
Machine Learning 9.4 - R Lab Support Vector Machines
Views 180 • 3 years ago
Machine Learning 9.3 - Support Vector Machines
Views 329 • 3 years ago
Machine Learning 9.1 - Maximum Margin Classifier
Views 6K • 3 years ago
Machine Learning 9.2 - Soft Margins and the Support Vector Classifier
Views 407 • 3 years ago
Machine Learning 8.5 - R Lab, Random Forest and Tree Ensembles
Views 144 • 3 years ago
Machine Learning 8.4 - Boosting Ensembles
Views 107 • 3 years ago
Machine Learning 8.2 - Random Forests
Views 185 • 3 years ago
Machine Learning 8.1 - Bagging
Views 147 • 3 years ago
Machine Learning 7.4 - R Lab, Decision Trees
Views 277 • 3 years ago
Machine Learning 7.3 - Advantages and Disadvantages of Trees
Views 160 • 3 years ago
Machine Learning 7.2 - Classification Trees
Views 137 • 3 years ago
Machine Learning 7.1 - Regression Trees
Views 186 • 3 years ago
Machine Learning 6.4 - R Lab, Nonlinear Regression
Views 115 • 3 years ago
Machine Learning 6.3 - Generalized Additive Models
Views 740 • 3 years ago
Machine Learning 6.2 - Regression Splines and Local Regression
Views 882 • 3 years ago
Machine Learning 6.1 - Polynomial Regression and Step Functions
Views 799 • 3 years ago

COMMENTS

  • @Sam1998Here
    @Sam1998Here 2 months ago

    Thank you for your explanation. I also think at 8:15 the multivariate normal distribution's probability density function should have $\sqrt{|\Sigma|}$ in the denominator (rather than $|\Sigma|$ as you have currently) and it also may be helpful to viewers to let them know that $p$ represents the dimension of the space we are considering
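    For reference, the standard multivariate normal density, consistent with this correction, is

    $$
    f(\mathbf{x}) = \frac{1}{(2\pi)^{p/2}\,|\Sigma|^{1/2}}
    \exp\!\left(-\tfrac{1}{2}(\mathbf{x}-\boldsymbol{\mu})^{\top}\Sigma^{-1}(\mathbf{x}-\boldsymbol{\mu})\right),
    $$

    where $p$ is the dimension of $\mathbf{x}$, $\boldsymbol{\mu}$ is the mean vector, and $\Sigma$ is the covariance matrix; note the $|\Sigma|^{1/2} = \sqrt{|\Sigma|}$ in the denominator.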

  • @janice3766
    @janice3766 2 months ago

    Thank you so much! 🙏🙏👍👍❤️❤️Are you able to provide slides for your videos, Prof Basener?

  • @user-qp9so1by1j
    @user-qp9so1by1j 3 months ago

    Super clear and simple. Thanks!

  • @gingerderidder8665
    @gingerderidder8665 4 months ago

    This beats my MIT lecture. Will be coming back for more!

  • @jaafarelouakhchachi6170
    @jaafarelouakhchachi6170 6 months ago

    Can you share the slides in the videos with me?

  • @deema_c
    @deema_c 7 months ago

    good explanation, funny that whenever he receives an email notification I go check my inbox ='')

  • @neftalisalazar2352
    @neftalisalazar2352 7 months ago

    I enjoyed watching your video, thank you. I will watch more of your videos on machine learning, thank you!

  • @billbasener8784
    @billbasener8784 8 months ago

    Here is the link to the download site: apps.nationalmap.gov/lidar-explorer/#/

  • @aditihumne5147
    @aditihumne5147 8 months ago

    Sir, where can I find yololidarTool.py? Can you provide that file?

  • @man9mj
    @man9mj 10 months ago

    Thank you for sharing this. RF and SVM are the way to go with point clouds.

  • @ankanpaul2904
    @ankanpaul2904 11 months ago

    ❤❤

  • @geo123473
    @geo123473 11 months ago

    Very great video! Thank you professor!! :)

  • @praveenm5723
    @praveenm5723 11 months ago

    Thank you

  • @saunokchakrabarty8384
    @saunokchakrabarty8384 1 year ago

    How do you get the values of 0.15 and 0.02? I'm getting different values.

    • @rmharp
      @rmharp 1 year ago

      Agreed. I got approximately 0.18 and 0.003, respectively.

  • @Spiegeldondi
    @Spiegeldondi 1 year ago

    A very good and concise explanation, even starting with the explanation of likelihood. Very well done!

  • @asdfafafdasfasdfs
    @asdfafafdasfasdfs 1 year ago

    Why do the stepwise functions have diagonals (slope != 0) joining the parts? shouldn't they all be joined by vertical lines, since they are continuous and yield either a 0 or a constant value?

  • @BluedvdMaster
    @BluedvdMaster 1 year ago

    The NFL is changing Bill! Let's up the weight on rushing yards (...I'll admit I'm a Baltimore fan).

  • @AnaCcarita
    @AnaCcarita 1 year ago

    Perfect

  • @mustafizurrahman5699
    @mustafizurrahman5699 1 year ago

    Excellent

  • @spencerantoniomarlen-starr3069

    10:48 ohhhhh, I was just going back and forth between the sections on LDA and QDA in three different textbooks (An Introduction to Statistical Learning, Applied Predictive Analytics, and Elements of Statistical Learning) for well over an hour and that multivariate normal pdf was really throwing me off big time. Mostly because of the capital sigma to the negative 1st power term, I didn't realize it was literally a capital sigma, I kept thinking it was a summation of something!

  • @Dhdhhhjjjssuxhe
    @Dhdhhhjjjssuxhe 1 year ago

    Good job. It is very easy to follow and understand

  • @ofal4535
    @ofal4535 2 years ago

    I was trying to read it myself, but you made it so much simpler

  • @may-yc6qn
    @may-yc6qn 2 years ago

    Good explanation, I hope there is always an example of the implementation

  • @clintonlabrador6386
    @clintonlabrador6386 2 years ago

    yoooo. This really helped me, my guy. Good work.

  • @hassanrevel
    @hassanrevel 2 years ago

    Thanks professore

  • @黃楷翔-h8j
    @黃楷翔-h8j 2 years ago

    Very useful information, thank you professor!

    • @billbasener8784
      @billbasener8784 2 years ago

      I am glad it's helpful! Thanks for the kind words.

  • @Nader95
    @Nader95 2 years ago

    13:42 correction: higher p-values indicate not very good predictors (insignificant); predictors with low p-values, actually, are good

    • @billbasener8784
      @billbasener8784 2 years ago

      Thanks for pointing that out. You are exactly right - I said the opposite of what I should have said!

  • @Nader95
    @Nader95 2 years ago

    9:15 you say we should expect 51% since up/(up+down) days equals 51%, but we should expect 50% accuracy with random guessing (via frequentist inference); 51% does not represent the number of times you correctly call the market up AND the number of times you correctly call the market down, which is what the confusion matrix does. So (up days / (up days + down days)) does not represent accuracy; the confusion matrix represents accuracy when up==up and down==down over the total number of days. So the confusion matrix is not the same as your 51%; you cannot compare 52% with 51%.

  • @Nader95
    @Nader95 2 years ago

    thanks, can you do a video on neural networks from ISLR textbook?

  • @Nader95
    @Nader95 2 years ago

    So basically, ridge/lasso regression penalize for the *size* of the coefficients while AIC/BIC subset selection penalizes for the *number* of coefficients
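    In symbols (standard definitions, for reference): ridge and lasso solve

    $$
    \hat{\beta}^{\text{ridge}} = \arg\min_{\beta} \sum_{i=1}^{n} \big(y_i - x_i^{\top}\beta\big)^2 + \lambda \sum_{j=1}^{p} \beta_j^2,
    \qquad
    \hat{\beta}^{\text{lasso}} = \arg\min_{\beta} \sum_{i=1}^{n} \big(y_i - x_i^{\top}\beta\big)^2 + \lambda \sum_{j=1}^{p} |\beta_j|,
    $$

    so the penalty grows with the magnitude of the coefficients, while

    $$
    \text{AIC} = 2k - 2\ln\hat{L}, \qquad \text{BIC} = k\ln n - 2\ln\hat{L}
    $$

    penalize the count $k$ of fitted parameters regardless of their size.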

  • @iqm901
    @iqm901 2 years ago

    This is an excellent series. Thank you so much for taking the time to make these

  • @pol4624
    @pol4624 3 years ago

    very good video, thank you professor

    • @billbasener8784
      @billbasener8784 3 years ago

      I am glad it is helpful. Thank you for the kind words!

  • @MrRynRules
    @MrRynRules 3 years ago

    Thank you sir, well explained.

  • @js913
    @js913 3 years ago

    Blender!!! Shocked and Surprised !! Awesome 👍👍👍

  • @zhengcao6529
    @zhengcao6529 3 years ago

    You are so great. Keep it up, please.

  • @haitaoxu3468
    @haitaoxu3468 3 years ago

    could you share the slide?

  • @JappieYow
    @JappieYow 3 years ago

    Interesting and clear explanation! Thank you very much, this will help me in writing my thesis!

  • @mirohorvath
    @mirohorvath 3 years ago

    Thank you for sharing this, and thumbs up for visualization in Blender :)

  • @kaym2332
    @kaym2332 3 years ago

    Hi! If the classes are assumed to be normally distributed, does that subsume that the features making up an observation are normally distributed as well?

    • @billbasener8784
      @billbasener8784 3 years ago

      Yes. If each class has a multivariate normal distribution, then each individual feature variable has a univariate normal distribution.

  • @vi5hnupradeep
    @vi5hnupradeep 3 years ago

    Thank you so much! Cleared a lot of my doubts

  • @lizzy1138
    @lizzy1138 3 years ago

    Thanks for this! I needed to clarify these methods in particular, was reading about them in ISLR

  • @benjamincameron90
    @benjamincameron90 3 years ago

    THANK YOU SO MUCH!!

  • @Crash-xz6hw
    @Crash-xz6hw 3 years ago

    Great video. Many thanks

  • @alfibima4247
    @alfibima4247 3 years ago

    How to classify LiDAR point cloud using machine learning in R.

  • @cjb3377
    @cjb3377 3 years ago

    Great video. I love crunching numbers and building reports like this. I'm definitely not anywhere close to being on your level considering you're a professor, but I've always felt like I should be in a similar field just based on my interests and skillset. How cool would it be to be a sports data analyst?

    • @billbasener8784
      @billbasener8784 3 years ago

      Passion and knowing your field is more important than education. I had a few students who now work for MLB teams as analysts, and part of me is jealous of them. I love that places like PFF put out enough stats that anyone can be an analyst now.