Chris Kimmer
  • 25
  • 219 974
i427webcrawling
Views: 108

Videos

427 crawler proj
Views: 78 · 9 years ago

VB Direct Manipulation using Timers
Views: 32 · 9 years ago

Alternative metrics for association rules in Weka
Views: 2K · 10 years ago

leverage and association rules
Views: 2.6K · 10 years ago

covariance and correlation for association rules
Views: 466 · 10 years ago

pca weka
Views: 47K · 10 years ago

Fitts Law Test
Views: 1.8K · 10 years ago

MLP in Weka with Iris data
Views: 14K · 10 years ago

1D Fitts' Law Fitting
Views: 2.1K · 10 years ago

2d Fitts' Law Analysis
Views: 1.7K · 10 years ago

naive bayes example
Views: 3K · 10 years ago

PRISM (part 1 of 2)
Views: 2.2K · 10 years ago

PRISM (part 2 of 2)
Views: 749 · 10 years ago

support and confidence
Views: 43K · 10 years ago

1r numeric
Views: 2.9K · 10 years ago

1r
Views: 251 · 10 years ago

Sample Cognitive Walkthrough
Views: 81K · 10 years ago

How to convert decision trees to rules
Views: 13K · 10 years ago

Graphical Interpretation of decision trees
Views: 1K · 10 years ago

Basic statistical measures in Excel
Views: 145 · 10 years ago

data mining schemes with iris data
Views: 445 · 10 years ago

Concepts and concept descriptions
Views: 425 · 10 years ago

INFO-I300 Course Orientation
Views: 60 · 10 years ago

INFO-I421 Orientation
Views: 63 · 10 years ago

COMMENTS

  • @vasanthztube · 7 months ago

    Crisp and clear. Thanks for explaining.

  • @chaitanyasoma9059 · 8 months ago

    I didn't get how you considered t=4 for outlook

  • @akshayreliance · 8 months ago

    The video is useful, but I doubt the interpretation of the variance explained by the model. Shouldn't we be looking at the proportion and cumulative columns for variance explained, rather than at the ranked-attributes figure?

  • @ziyanalicodes · 8 months ago

    Best video ever the best I say

  • @TerriMiller-m5q · 1 year ago

    Is this new moon app safe from third-party apps? Is it safe to leave my card no.? Also, I just wanted to try it for one month first.

  • @segarus95 · 1 year ago

    Vtu guys give a like 👍

  • @oubayeghammate9484 · 1 year ago

    anyone from scc 202?

  • @ahmadramishrefa9068 · 1 year ago

    Great explanation, Thank you

  • @vishnukarna438 · 1 year ago

    Hi, regarding feature extraction using PCA: I applied the PCA filter to an input dataset of 13 predictors and got 18 principal components. Is that possible? At first the dataset had 6 missing values and produced 12 PCs; when I replaced the missing values with the majority value of each feature, I got 18 PCs. May I know the concept behind this? By the way, I used the Cleveland heart dataset and the WEKA platform. Thank you.
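
A likely explanation for the question above, offered as an assumption since it depends on how the filter was configured: PCA operates on numeric columns, so nominal attributes are typically converted to one binary column per value before the analysis, which is how 13 predictors can yield more than 13 principal components. A minimal NumPy sketch of that expansion (toy column, not the Cleveland heart data):

```python
import numpy as np

# toy nominal column with three distinct values
chest_pain = np.array(["typical", "atypical", "nonanginal", "atypical"])

values = sorted(set(chest_pain))  # ['atypical', 'nonanginal', 'typical']
onehot = np.array([[v == val for val in values] for v in chest_pain],
                  dtype=int)

# one nominal column became three numeric columns, so PCA now sees
# more inputs than the original predictor count
print(onehot.shape)  # (4, 3)
```

Changing how missing values are handled can change which columns survive this conversion, which may account for getting 12 PCs in one run and 18 in another.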

  • @AveRegina_ · 2 years ago

    My ranked attribute values aren't adding up to 100... could anyone explain what I am doing wrong?

  • @patrickojeh · 2 years ago

    Awesome. Thank you!

  • @maheshvenkat9956 · 2 years ago

    Poor audio

  • @drewjd27 · 3 years ago

    Shouldn't, for example, distance 2 be distance 2 - distance 1?

  • @darthravan9 · 3 years ago

    Thanks, my lecturer spent 15 minutes explaining this through a long winded example about a boy and a girl going to lunch and the correlation relating to whether the boy was in love with the girl and was purposely trying to see her at lunch. Thankyou for not wasting my time like my $2000 uni course.....

  • @dio_liu5155 · 3 years ago

    Great job!

  • @Jonpaulim · 3 years ago

    Hi Chris great video can I ask a question please

  • @DeadMarina · 3 years ago

    Hi

  • @Estheryaaa · 3 years ago

    One question: are these just some random questions associated with the task, or are they kind of standard...

  • @Estheryaaa · 3 years ago

    thank you very much for explaining the concept with a very easy and understandable example!

  • @marwaa.6759 · 3 years ago

    Great explanation, sir. But what comes after that? How do I use this data in my model? I'm making an IDS model for detecting attacks, and I need to reduce the features to get faster and more accurate detection.

  • @draconov_alt · 3 years ago

    comment

  • @SrAbrahamAugusto · 4 years ago

    You are the best explainer of this topic I've found on UA-cam. The Indian guys I've found are really amazing, but the bad English and the lack of focus are barriers to comprehension. Congratulations, and thank you for having a script before recording the video and for being so patient during it.

  • @mrtlgz · 4 years ago

    Great video, clear to understand...

  • @zachirnayq4990 · 4 years ago

    Hey, thanks a lot, the tutorial is really helpful!!

  • @guilhermec9615 · 5 years ago

    Great content. Thank you so much for explaining it. But I think it's worth pointing out that the ranked values don't add up to 1. When you try other datasets, these values are all less than 1 but certainly don't add up to 1. For example, I got 0.74, 0.52, and 0.39 for the first 3 attributes (there were more) for the diabetes dataset. I would say the values in the video were just a coincidence, but even in the cpu dataset used in the video they don't add to 1 either. They appear to at first glance, but when you actually add them up you get 1.0473. Close, but not 1. Thanks anyway, all the rest is really helpful.
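
The observation above can be checked outside Weka: the eigenvalues of the covariance matrix are not expected to sum to 1, but the proportions of variance (each eigenvalue over the total) always are. A minimal NumPy sketch on toy random data (not the cpu dataset from the video):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 4))            # toy data: 100 samples, 4 features

Xc = X - X.mean(axis=0)                  # center each column
cov = np.cov(Xc, rowvar=False)           # 4x4 covariance matrix
eigvals = np.linalg.eigvalsh(cov)[::-1]  # eigenvalues, largest first

proportion = eigvals / eigvals.sum()     # fraction of variance per component
cumulative = np.cumsum(proportion)

print(round(proportion.sum(), 6))  # 1.0 -- proportions always sum to 1
print(eigvals.sum())               # total variance; in general not 1
```

So if ranked figures in a PCA output don't add up to 1, they are most likely eigenvalue-based scores rather than the proportion column.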

  • @teislarsen8303 · 5 years ago

    Hey. Is there somewhere i can find this test? :)

  • @TheFibrewire · 5 years ago

    Was there a reason play button was not replaced to stop button while recording?

  • @upasananagar8335 · 5 years ago

    This is such a great explanation. Thank you

  • @RupeshNathU · 5 years ago

    Can you explain how you captured movement time in the data?

  • @aumifast9432 · 6 years ago

    congnintgoivhfsdgb walktwefuiah

  • @chiefbac0n511 · 6 years ago

    First of all, thanks for the great video! It was very helpful in understanding a cognitive walkthrough. But one question :P Obviously the camera symbol is too close to the stop button, but isn't that the reason they actually use a circular symbol instead of the camera symbol for recording (since it's also closest to the familiar analog button)? And don't you think an "X" would be misleading, suggesting canceling the action rather than just stopping the video?

  • @hamzasaad8431 · 6 years ago

    In rule building you have two things: cause and result. You can guess the leaf is a result because it carries a decision. For example, your diabetes data starts splitting on the age attribute, and after two splits using different attributes you get a pure subset. Thus: if Age >= 30, the person is fat, and there is no regular exercising, then yes, the patient has diabetes.

    • @datascienceds7965 · 6 years ago

      What about when a dummy variable appears as <=0.5? For example, I have a decision tree built in Python. One of the nodes is EnglishEducation_Partial College<=0.5. EnglishEducation was a variable and Partial College was one of its values. Would I interpret that node as (Data[EnglishEducation] is not equal to Partial College)?

    • @hamzasaad8431 · 6 years ago

      I think a dummy path or decision is only there to keep the structure of the tree. If your dummy only serves to keep a sequence in the tree, then ignore it and build your rules from the real variables named in your dataset.

    • @datascienceds7965 · 6 years ago

      Thanks
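
The interpretation worked out in this thread can be sketched in code: in a tree built on one-hot (dummy) columns, a split such as "EnglishEducation_Partial College <= 0.5" sends rows whose original value is not Partial College down the left branch. A hedged Python sketch (the helper below is illustrative, not part of any library):

```python
def node_to_condition(feature, threshold, go_left):
    """Turn a tree split into a readable rule condition.

    For a 0/1 dummy column named "Base_Value", a threshold of 0.5 means:
      left branch  (<= 0.5) -> Base != Value
      right branch (>  0.5) -> Base == Value
    """
    if "_" in feature and abs(threshold - 0.5) < 1e-9:
        base, value = feature.split("_", 1)
        op = "!=" if go_left else "=="
        return f"{base} {op} {value!r}"
    op = "<=" if go_left else ">"
    return f"{feature} {op} {threshold}"

print(node_to_condition("EnglishEducation_Partial College", 0.5, True))
# EnglishEducation != 'Partial College'
print(node_to_condition("Age", 30.0, False))
# Age > 30.0
```

The "_" check assumes dummy columns follow the common "variable_value" naming convention; with differently named columns you would need another way to recognize them.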

  • @jennyhernandez7 · 6 years ago

    Awesome job with this tutorial, very helpful thanks

  • @Garpsta · 6 years ago

    not my proudest fap

  • @vonjd · 6 years ago

    Thank you for this excellent video. If you are working with R you can use the OneR package for that, it is on CRAN. Please watch this video for a step-by-step introduction: ua-cam.com/video/AGC0oRlXxgU/v-deo.html You can find out more about it here: cran.r-project.org/web/packages/OneR/vignettes/OneR.html
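
For readers not using R, the 1R scheme from the 1r videos fits in a few lines of Python: for each attribute, build one rule per attribute value (predict the majority class for that value), then keep the attribute whose rules make the fewest errors on the training data. A minimal sketch with made-up weather-style rows (not the video's dataset):

```python
from collections import Counter

def one_r(rows, target):
    """1R: pick the single attribute whose value->majority-class
    rules misclassify the fewest training rows.
    rows: list of dicts; target: name of the class attribute."""
    best = None
    for attr in rows[0]:
        if attr == target:
            continue
        by_value = {}
        for r in rows:
            by_value.setdefault(r[attr], Counter())[r[target]] += 1
        rule = {v: c.most_common(1)[0][0] for v, c in by_value.items()}
        errors = sum(r[target] != rule[r[attr]] for r in rows)
        if best is None or errors < best[2]:
            best = (attr, rule, errors)
    return best

data = [
    {"outlook": "sunny",    "windy": "no",  "play": "no"},
    {"outlook": "sunny",    "windy": "yes", "play": "no"},
    {"outlook": "rainy",    "windy": "no",  "play": "yes"},
    {"outlook": "rainy",    "windy": "yes", "play": "no"},
    {"outlook": "overcast", "windy": "no",  "play": "yes"},
]

attr, rule, errors = one_r(data, "play")
print(attr, errors)
```

Ties between attributes are broken here by whichever comes first, which is an arbitrary choice of this sketch; real implementations may break ties differently.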

  • @vonjd · 7 years ago

    Thank you for this excellent video. If you are working with R you can use the OneR package for that, it is on CRAN. Please watch this video for a step-by-step introduction: ua-cam.com/video/AGC0oRlXxgU/v-deo.html You can find out more about it here: cran.r-project.org/web/packages/OneR/vignettes/OneR.html

  • @robiparvez · 7 years ago

    nice, mate..superb explanation...

  • @desisto007 · 7 years ago

    thanks for sharing, very good tutorial!

  • @desisto007 · 7 years ago

    Hello, quick question. Should the data be normalized before I use PCA? Thanks for sharing.

  • @chicopapass · 7 years ago

    what type of pen is that? I had one of them in highschool and it was my favorite pen of all time.

    • @HamzaHafeez7292 · 7 years ago

      I think it's a ballpoint with a less viscous gel. In my area we just ask for a gel pen to get one of these.

  • @spacecloud4547 · 7 years ago

    Thanks, very intuitive...

  • @TSUKILORD · 8 years ago

    Hello. Really practical video. I have one question: how do you define the threshold? I mean, why 50% or 40%? Do these numbers come out at random? During the first phase I realized that it is easy to select features since they are ranked (so it uses something like an entropy algorithm). But what about these percentages you mention? Where do they come from? Thank you in advance!

  • @mohsinash1134 · 8 years ago

    Great . Amazing Job

  • @mohsinash1134 · 8 years ago

    I watched the whole video & I swear I still don't know what it is ! :/

    • @GogogoFolowMe · 8 years ago

      It's the validation of an interface by observing a user during its usage.

  • @ericolive452 · 8 years ago

    Very well done. Great reference for students and those new to UX.

  • @technomad900 · 9 years ago

    Very Simplified , Great stuff

  • @amrshady · 9 years ago

    Rainy should be 3/9

  • @NURFAIZAfaizal90818 · 9 years ago

    Thank you so much! It's so easy to understand! Keep it up #usability #engineering

  • @hozaifahzafar6975 · 9 years ago

    For some reason my ranked attribute variance doesn't add up. Does anyone know what the issue is?

    • @gears257 · 9 years ago

      Hozaifah Zafar I got the same problem

    • @fathimtiaz · 7 years ago

      I think the one that adds up to 1 is the proportion part of the PC, not the one he pointed out.

  • @basantsub1234 · 10 years ago

    Can I use PCA with dataset containing non-numeric attributes?