Finding Reducts, Heuristics Attribute Selection, KDD Algorithms, Rough Sets

  • Published 9 Nov 2024

COMMENTS • 21

  • @vincentroy1810 • 6 years ago +1

    Thank you, Laurel, really good and simple explanation of reducts.

  • @abdullateefbalogun4043 • 8 years ago

    Thanks, Laurel... a very good one. I am expecting more videos from you.

  • @IvanJRamirez • 6 years ago +1

    Hi Laurel, as usual, thanks for this video. Have you published a paper using or explaining this technique? I would like to reference your work.

  • @behnam2534 • 8 years ago

    Thanks so much. It was amazingly clear and helpful.
    I was just wondering whether you have used R to do feature selection with Rough Set Theory? I would like to do that in R and am pretty unsure how to go about it.
    Appreciate it again.

  • @samking6691 • 7 years ago

    Thanks so much. Could you display and explain the steps of rough sets using an example?

  • @ahohl84 • 9 years ago

    Hi Laurel! Awesome videos, I appreciate what you're doing! Which/how many attributes do I combine to extract rules if the core is empty? Thanks so much.

    • @laurelpowell1398 • 9 years ago +2

      +Alexander Hohl Hi Alexander! Thank you for the comment! The process does not change if the core is empty. You still add one attribute at a time, check it, and move on to adding two or more attributes until you have fully explained the decision feature. The only change is that when you start trying to create rules, you begin with one attribute rather than one attribute plus the core. I hope this helps.
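
A minimal sketch of the subset-growing check described in this reply, in Python. The decision table, the attribute names, and the helper names is_consistent and find_reducts are hypothetical illustrations, not from the video; the idea is simply to test attribute subsets of size one, then two, and so on, and keep each minimal subset that fully explains the decision attribute, whether or not the core is empty.

```python
from itertools import combinations

def is_consistent(rows, attrs, decision):
    # A subset "fully explains" the decision if no two rows agree on
    # every attribute in attrs but disagree on the decision attribute.
    seen = {}
    for row in rows:
        key = tuple(row[a] for a in attrs)
        if key in seen and seen[key] != row[decision]:
            return False
        seen.setdefault(key, row[decision])
    return True

def find_reducts(rows, attributes, decision):
    # Try subsets of size 1, then 2, and so on; keep only minimal
    # consistent subsets (supersets of reducts already found are skipped).
    reducts = []
    for size in range(1, len(attributes) + 1):
        for subset in combinations(attributes, size):
            if any(set(r) <= set(subset) for r in reducts):
                continue
            if is_consistent(rows, subset, decision):
                reducts.append(subset)
    return reducts

# Hypothetical toy decision table (not the example used in the video):
rows = [
    {"Headache": "yes", "Temp": "high",   "Flu": "yes"},
    {"Headache": "yes", "Temp": "normal", "Flu": "no"},
    {"Headache": "no",  "Temp": "high",   "Flu": "yes"},
    {"Headache": "no",  "Temp": "normal", "Flu": "no"},
]
print(find_reducts(rows, ["Headache", "Temp"], "Flu"))  # -> [('Temp',)]
```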

  • @alifawzi4566 • 8 years ago

    Thanks, Laurel, the video is amazing and useful, like you. Could you make another one on fuzzy rough sets?

    • @toobafatima9311 • 5 years ago

      Can you help me understand this concept? I'm not clear about it.

  • @neildatu2787 • 7 years ago

    Thank you, this made me clearly understand reducts :-)

  • @donl7737 • 9 years ago

    Thank you! I like it.

  • @vikramvins7 • 8 years ago

    Hi ma'am, thank you for your time giving this session. Is there any software for rough sets, or MATLAB code to follow? If there is, please suggest it. Thank you, ma'am.

  • @samking6691 • 7 years ago

    Could you make a tutorial on how to get started with rough sets, please?

  • @TheLivingstoneProject • 7 years ago

    Is the decision attribute always the right-most value?

    • @laurelpowell1398 • 7 years ago +1

      No, that is just a formatting method that I like to use; it makes it easier to keep track of the decision attribute visually. Thanks!

    • @TheLivingstoneProject • 7 years ago

      Laurel Powell Thank you!

  • @vincentroy1810 • 6 years ago

    When searching for the core, what happens if we need to hide 2 or 3 attributes before we find an indiscernibility relation? Can we conclude that the core has multiple attributes?

    • @laurelpowell1398 • 6 years ago

      Regarding your question, sometimes there is not one single indispensable attribute but several. One method to find the core is, after trying to remove the attributes in sets of one, to try removing attributes in sets of two, then three, and so on. This can tell you whether the core has multiple attributes. In larger, more complex datasets, a multi-attribute core is more common.
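
A minimal sketch of the single-removal check that this reply builds on, reusing the hypothetical is_consistent helper from the earlier sketch. An attribute belongs to the core when dropping just that attribute leaves a set that no longer determines the decision, so the loop below can return several attributes, i.e. a multi-attribute core.

```python
def find_core(rows, attributes, decision):
    # An attribute is indispensable (in the core) if removing it on its own
    # breaks the remaining attributes' ability to determine the decision.
    core = []
    for a in attributes:
        remaining = [b for b in attributes if b != a]
        if not is_consistent(rows, remaining, decision):
            core.append(a)
    return core

# With the toy table from the earlier sketch, removing "Temp" breaks
# consistency while removing "Headache" does not, so the core is ["Temp"].
```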

    • @vincentroy1810 • 6 years ago

      Thanks, Laurel, it will help me in my research.

  • @peeintea • 5 years ago

    Laurel, I love you.