Varsha's engineering stuff
  • 214
  • 1 128 710

Videos

Logistic Regression, Sigmoid Function, Binary Classification
152 views · 2 months ago
Multivariate Linear Regression, Multivariate Simple & Multiple Linear Regression
231 views · 2 months ago
Multivariate Linear Regression, Multivariate Single/Simple Regression, Multivariate Multiple Linear Regression
Linear Regression: Gradient Descent Approach, Learning rate, parameters, Simple and Multiple
153 views · 2 months ago
Linear Regression: Gradient Descent Approach, Gradient, Learning rate, parameters
Part 6: Support Vector Machine, Soft Margin, Parameter C, ξi, Penalty, Linearly separable
111 views · 3 months ago
Part V: Support Vector Machine, Quadratic Programming, Toy Example, Support vectors, w, b, unseen sample
216 views · 3 months ago
Part IV: Support Vector Machine, Quadratic Programming
192 views · 3 months ago
Part 3: Support Vector Machine, Multiclass, One-vs-Rest, One-vs-One, Error-Correcting Output Codes
274 views · 3 months ago
Part 2: Support Vector Machine, Margin, Support Vectors, Boundary, Hyperplane, Simple Problem in 2D
164 views · 3 months ago
Part 2: Support Vector Machine, Margin, Boundary, Hyperplane
Part 1: Support Vector Machine, Constrained Optimization, Margin, Support vectors, decision boundary
779 views · 3 months ago
Constrained Optimization, Introduction to Support Vector Machine (SVM), Optimal Decision Boundary, Margins and Support Vectors
Part 9: Introduction to NLP, Challenges in NLP Application development
332 views · 5 months ago
Part 9: Introduction to NLP, Challenges in NLP Application development: Accuracy, Scalability, Efficiency, Robustness, Adaptability, Contextual Understanding, Bias Mitigation, Interoperability, Multimodal Integration, Data Privacy, User Personalization, Resource Optimization
Part 8: Introduction to NLP at various Levels in Marathi and Hindi Languages
241 views · 5 months ago
PART 7: Introduction to NLP, Ambiguities, English, Lexical, Syntactic, Semantic, Pragmatic, Discourse
649 views · 5 months ago
Part 6: Introduction to NLP, Why NLP is hard, Textual Humor, Sarcasm, Idioms, Neologisms, Tokenization
185 views · 5 months ago
Textual Humor, Sarcasm, Tricky Entity Names, Idioms, Neologisms, Segmentation Issues, New Senses of a Word, Non-standard Use of English (e.g. informal, short forms), Words or Phrases with Multiway Interpretation (Confusing Meanings), Language Imprecision and Vagueness, Extreme Examples of Lexical Ambiguity
Part 5: Introduction to NLP, Language, Grammar and Knowledge in NLP
249 views · 5 months ago
Part 4: History of NLP, First Era, Second Era, Third Era, Fourth Era, Introduction to NLP
177 views · 5 months ago
NLP Introduction Part 3: Generic NLP System, Parser, Semantic, Pragmatic & Discourse, Reasoner
615 views · 5 months ago
NLP Introduction Part I: Definition, Natural Language Generation (NLG) & Understanding (NLU), Need, Goals
518 views · 6 months ago
Semantic Analysis Part 3: Relations among Lexemes & their Senses, NLP, Homonymy, Polysemy, Synonymy
586 views · 6 months ago
Part 1: Semantic Analysis, NLP, Computational, Distributional, Formal Semantics, Lexicon & Lexeme
765 views · 6 months ago
Part 7: Earley Parser, Top Down Parser, NLP, Predict, Scan, Complete, Chart, Table, CFG Rule
2.6K views · 6 months ago
Parser Part 6: Predictive Parser, Top Down, NLP, First, Follow, Stack, look ahead, Predictive Table
931 views · 6 months ago
Part 5: PCFG parser, Bottom Up Parser, NLP, CFG, Probability, CYK Algorithm, Parse Trees Exercises
2.7K views · 6 months ago
Part 4: Bottom Up Parser, Shift Reduce Parser, Stack, Shift, Reduce, Ambiguity, Backtracking exercise
971 views · 6 months ago
Parser Part 3: Bottom Up, COCKE-YOUNGER-KASAMI (CYK or CKY Parser), NLP, CNF, CFG, Tree, Dynamic
2.2K views · 6 months ago
Part 2: NLP Parsers, Modelling Constituency, CFG, Chomsky Normal Form (CNF), Top down & Bottom Up
1.5K views · 6 months ago
Part 1: Parsers in NLP, Parsers Role, Words & Word Groups (Constituency), Types of Parsers, Ambiguity
1.6K views · 6 months ago
Good Turing Discounting, Smoothing, C*, P*GT, Backoff, Interpolation, Laplace, MLE, NLP
2K views · 6 months ago
Part 6: Image Processing Introduction, Connectivity, Adjacency, Euclidean, City Block, Chess Board,m
230 views · 6 months ago
Part 5: Image Processing Introduction, IMAGE FILE FORMATS, TIFF, BMP, JPEG, Features, Adv, Disadv
63 views · 6 months ago

COMMENTS

  • @rakesharora3353 · 8 days ago

    In Q 2) BP(3): min_sup is 2, so it should be frequent, right? In the table it is × ✓ ✓ (frequent, closed, maximal) for BP

  • @sathiskumargovindaraj5573 · 18 days ago

    Awesome, you are my God. Wonderful explanation and a one-stop portal for NLP basics....

  • @shewitamare198 · 18 days ago

    ex11 is correct?

  • @abazmenfes · 26 days ago

    good job.

  • @alokpatel7577 · 28 days ago

    thank you mam

  • @ANKITKUMAR-cv7zf · 29 days ago

    how S->aA?

  • @ANKITKUMAR-cv7zf · 29 days ago

    how to divide ?

  • @dr.md.abdus_salam · a month ago

    Thanks. Which test can be used for one dependent variable and multiple independent variables?

  • @syedz7755 · a month ago

    Not even one line understood

  • @22gauravkumar70 · a month ago

    good video

  • @TheBharatOfToday · a month ago

    it was easy with example

  • @gollarambabu3317 · a month ago

    Thank you mam

  • @rohanwarghade7111 · a month ago

    22:00 important point, please note: Laplace smoothing
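
For anyone revisiting this timestamp, here is a minimal sketch of add-one (Laplace) smoothing for bigram probabilities; the tiny corpus is hypothetical and only illustrates the formula P(w2|w1) = (count(w1, w2) + 1) / (count(w1) + V):

```python
# Add-one (Laplace) smoothed bigram probabilities on a hypothetical toy corpus.
from collections import Counter

corpus = ["<s>", "i", "like", "nlp", "</s>", "<s>", "i", "like", "ai", "</s>"]
V = len(set(corpus))                          # vocabulary size

unigrams = Counter(corpus)
bigrams = Counter(zip(corpus, corpus[1:]))    # includes one boundary pair; fine for a sketch

def p_laplace(w1, w2):
    """P(w2 | w1) with add-one smoothing: (c(w1, w2) + 1) / (c(w1) + V)."""
    return (bigrams[(w1, w2)] + 1) / (unigrams[w1] + V)

print(p_laplace("i", "like"))   # seen bigram
print(p_laplace("i", "nlp"))    # unseen bigram still gets non-zero probability
```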

  • @arghaauddy · a month ago

    Excellent Explanation.

  • @KadhaiThalam-w7h · a month ago

    Mam, after table formation (using the Manhattan formula) we probably get values as whole numbers. If e = 1.9 and there are no points less than 1.9, how do we continue the problem?

  • @Lost70000 · a month ago

    When calculating (Justin will spot Will), in the second part of the formula P(POS2|POS1), how exactly are you reading the state transition matrix? Sometimes you say it is from top to left side, but you take values from left to top side: for 'Justin' you took the value from left to top, for 'will' you took left to top, and for 'spot' you suddenly took from top to left. It is confusing how to read the transition matrix, please specify, because when I take the value from the top to left side the answers become different.

    • @Lost70000 · a month ago

      Please reply fast ma'am
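
For readers with the same doubt about P(POS2|POS1): a minimal sketch assuming the common convention that the transition matrix is row-stochastic, with the row indexing the previous tag and the column the next tag (the tags and numbers below are hypothetical, not the ones from the video):

```python
# Looking up P(next_tag | previous_tag) in a transition matrix, assuming
# row = previous tag, column = next tag, and each row sums to 1.
# The tags and values are hypothetical.
import numpy as np

tags = ["N", "M", "V"]                      # noun, modal, verb
idx = {t: i for i, t in enumerate(tags)}
A = np.array([
    [0.1, 0.3, 0.6],                        # from N to N, M, V
    [0.2, 0.1, 0.7],                        # from M to N, M, V
    [0.5, 0.4, 0.1],                        # from V to N, M, V
])

def p_transition(prev_tag, next_tag):
    """Pick the ROW of the previous tag, then the COLUMN of the next tag."""
    return A[idx[prev_tag], idx[next_tag]]

print(p_transition("N", "M"))               # 0.3 = P(M | N)
assert np.allclose(A.sum(axis=1), 1.0)      # every row is a probability distribution
```

If a particular source writes the matrix the other way around (columns = previous tag), the same lookup applies with the roles swapped; the safe check is that the probabilities conditioned on one previous tag must sum to 1.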

  • @shaikhtaqueveem6819 · a month ago

    You just Yapp

    • @varshasengineeringstuff4621 · a month ago

      It is very difficult to get material for this topic and to understand it. The initial part is theory, but it is necessary.

  • @DailyBetterByOnePercent · a month ago

    great explanation

  • @Sunshine-hg5ww · a month ago

    best video for this topic by far. Exactly what came in my exams ://

  • @SnackssTime · a month ago

    At 18:30, why have you written 1/3 for P(M|N) when it is actually 1/4 in the table?

    • @varshasengineeringstuff4621 · a month ago

      I will check; a mistake is possible on my part. Please see the other steps.

    • @Lost70000 · a month ago

      @varshasengineeringstuff4621 yeah, I was stuck there too; the table says 1/4

  • @akash_assist · a month ago

    we should always take centroid, right??

  • @akash_assist · a month ago

    why is the average not taken in the first example but taken in the other examples for the relevant and non-relevant calculations?
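
Assuming the video follows the standard Rocchio relevance-feedback formula, the averages in question are the centroids of the relevant and non-relevant document vectors: q_new = α·q + (β/|Dr|)·Σ d_r − (γ/|Dnr|)·Σ d_nr. A minimal sketch with hypothetical vectors and weights:

```python
# Rocchio relevance feedback: move the query toward the centroid of relevant
# documents and away from the centroid of non-relevant ones.
# All vectors and weights below are hypothetical.
import numpy as np

alpha, beta, gamma = 1.0, 0.75, 0.15

q = np.array([1.0, 0.0, 1.0])                       # original query vector
relevant = np.array([[1.0, 1.0, 0.0],
                     [1.0, 0.0, 1.0]])              # Dr: judged relevant
non_relevant = np.array([[0.0, 1.0, 1.0]])          # Dnr: judged non-relevant

q_new = (alpha * q
         + beta * relevant.mean(axis=0)             # centroid (average) of Dr
         - gamma * non_relevant.mean(axis=0))       # centroid (average) of Dnr
q_new = np.maximum(q_new, 0.0)                      # negative weights are usually clipped to 0

print(q_new)
```

Whether a single relevant document is "averaged" or not makes no numerical difference, since the centroid of one vector is the vector itself; that may explain why the first example looks different from the others.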

  • @vedikamishra4733 · a month ago

    Just wanted to mention that I'm referring to your NLP playlist, and it has been incredibly helpful in my preparation. Your explanations are clear and easy to follow, which has made even the complex topics much easier to understand. I truly appreciate the effort you put into making these resources available. Thanks a lot ma'am!

  • @pujethpallapu · a month ago

    🙌

  • @megatron1560 · a month ago

    you are a life saver mam, can't thank you enough. I am from Fr.CRCE and I have an IR exam tomorrow.

    • @varshasengineeringstuff4621 · a month ago

      Thank you so much for your kind words! 😊 Wishing you the very best for your IR exam tomorrow. If you find my content helpful, feel free to share it with your friends and team, and don’t forget to subscribe to the channel for more support. 🌟 All the best!

  • @rishabhsinha7250 · a month ago

    After searching for 4 hours, I finally understood Good Turing after your video. Thank you

  • @andrejohnv · a month ago

    Ma'am, is the formula for the calculation of probabilities wrong? I think a few operations need to be performed in the power of e...

    • @varshasengineeringstuff4621 · a month ago

      I have taken the NPTEL reference. Yes, different calculations are possible. The main aim is to understand that we can use not only a limited window of POS tags (forward or backward from the current word) but also features defined with reference to other words.

  • @NOOB_141 · a month ago

    one of the worst videos on internet

  • @shubhampatil5822 · a month ago

    :)

  • @sohamsarkar7592 · a month ago

    thank you this video helped a lot

  • @mumtahinaislammahi2162 · a month ago

    +45 and -45 are swapped

  • @hibro1729 · a month ago

    9:19 In the 3-itemsets, the frequency of BCD should be 3 and not 4. As BC is a closed itemset, none of its immediate supersets can have the same support.

    • @varshasengineeringstuff4621 · a month ago

      I will check. Some minor mistake may be possible from my side; my only purpose is to explain the basic concept.
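
For the frequent/closed/maximal questions in this thread, a minimal sketch of the three definitions (frequent: support ≥ min_sup; closed: frequent with no proper superset of the same support; maximal: frequent with no frequent proper superset), run on a hypothetical transaction set rather than the one in the video:

```python
# Frequent, closed and maximal itemsets on a hypothetical transaction database.
from itertools import combinations

transactions = [{"B", "P"}, {"B", "P", "D"}, {"B", "D"}, {"P", "D"}]
min_sup = 2
items = sorted(set().union(*transactions))

def support(itemset):
    """Number of transactions containing the itemset."""
    return sum(1 for t in transactions if itemset <= t)

candidates = [frozenset(c) for r in range(1, len(items) + 1)
              for c in combinations(items, r)]
frequent = [x for x in candidates if support(x) >= min_sup]

def is_closed(x):
    # No proper superset may have the same support.
    return all(support(y) != support(x) for y in candidates if x < y)

def is_maximal(x):
    # No proper superset may be frequent at all.
    return not any(x < y for y in frequent)

for x in frequent:
    print(sorted(x), support(x),
          "closed" if is_closed(x) else "-",
          "maximal" if is_maximal(x) else "-")
```

Note that an itemset meeting min_sup is frequent regardless of whether it is also closed or maximal; "closed" compares supports of supersets, not min_sup.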

  • @ARYANUGHADE-kw8pw · a month ago

    Great explanation!👍👍

  • @shubhayandas5127 · a month ago

    Are we multiplying by the average everywhere (I mean in all equations)? I am asking because in the formula it is given as sigma, which means sum.

  • @shubhayandas5127 · a month ago

    How is 2/log(5) = 0 in the final table? Any specific reason for that, or is it just a mistake?
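
One possible reading of that entry, assuming the table is a tf-idf weighting with idf = log10(N/df): if the term occurs in every one of the N = 5 documents, then df = 5, idf = log10(5/5) = 0, and tf · idf = 2 · 0 = 0 regardless of the term frequency. The numbers below reflect that assumption, not the actual table from the video:

```python
# tf-idf weight of a term that appears in all documents (hypothetical numbers).
import math

N, df, tf = 5, 5, 2                 # 5 documents, term occurs in all 5, tf = 2
idf = math.log10(N / df)            # log10(1) = 0.0
print(tf * idf)                     # 0.0
```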

  • @prxnxv-y6p · a month ago

    great ❤‍🔥🔥

  • @huchchayyamkelur6842 · a month ago

    Good morning mam. Do you have notes and PPTs of information retrieval? Please share. Thank you in advance, mam.

  • @MUTHUKUMARR-ts4df · 2 months ago

    Good explanation mam🎉

  • @santoshpatil3614 · 2 months ago

    I was waiting for this topic, very nicely explained. Thank you for sharing. 🎉🎉

  • @santoshpatil3614 · 2 months ago

    Varsha mam, awesome content, very knowledgeable, kindly share more videos. Thank you madam 🎉

  • @Subhash231 · 2 months ago

    Good one

  • @supersonicplus4442 · 2 months ago

    nice PPT presentation mam, but if you provide the PPT in the description it would be better for us

  • @huzaifa_swati · 2 months ago

    Thank you ❤

    • @varshasengineeringstuff4621 · 2 months ago

      Thanks. I guess you are from the Swat valley of Pakistan. It is very beautiful.

    • @huzaifa_swati · 2 months ago

      @varshasengineeringstuff4621 I am the 18th descendant of the last King of the Swat Sultanate, but unfortunately the royal family had to leave Swat, so we are living in a nearby city now. Thank you for the amazing words about our valley ❤️❤️

    • @varshasengineeringstuff4621 · 2 months ago

      Thank you for sharing your connection to such a rich heritage. It’s fascinating to learn that you are the 18th descendant of the last King of Swat Sultanate, and I can only imagine the legacy your family carries. Swat Valley’s beauty, culture, and history are truly remarkable, and it’s heartening to know that your family remains part of its story.

    • @huzaifa_swati · 2 months ago

      @varshasengineeringstuff4621 We are now landlords of the neighboring Mansehra District, which is way more beautiful than Swat. Famous valleys like the Naran Kaghan Valley, Siran Valley, etc. are situated in Mansehra.

    • @varshasengineeringstuff4621 · 2 months ago

      Mansehra truly sounds captivating, especially with gems like Naran, Kaghan, and Siran Valley. The Mansehra Rock Edicts are fourteen edicts of the Mauryan emperor Ashoka. It’s wonderful that you are part of such a beautiful and renowned region!

  • @MadanAK0509 · 2 months ago

    now i understand mam thank you

  • @luciferstark467 · 2 months ago

    Ppt ?

  • @Saeeda-v9k · 2 months ago

    GREAT

  • @ShaikShaik-mn6kh · 2 months ago

    I have subscribed to your channel and I hope that you will post more and more useful videos regarding digital image processing for JNTUA B.Tech ASAP. Thank you and good job 🎉 ... You explained it in a nice way.... ❤

    • @varshasengineeringstuff4621 · 2 months ago

      Sure. If possible, share a link to the syllabus.

    • @ShaikShaik-mn6kh · 2 months ago

      @varshasengineeringstuff4621 Sure, thank you very much

    • @ShaikShaik-mn6kh · 2 months ago

      dap.jntua.ac.in/wp-content/uploads/2022/07/JNTUA-R20-B.Tech_.-ECE-III-IV-Course-structure-Syllabus.pdf

    • @ShaikShaik-mn6kh · 2 months ago

      Please upload videos on digital image processing ASAP

    • @ShaikShaik-mn6kh · 2 months ago

      Week 2: Pixel relationship; Week 3: Camera models & imaging geometry; Week 4: Image interpolation; Week 5: Image transformation; Week 6: Image enhancement I; Week 7: Image enhancement II; Week 8: Image enhancement III; Week 9: Image restoration I; Week 10: Image restoration II & image registration; Week 11: Colour image processing; Week 12: Image segmentation; Week 13: Morphological image processing; Week 14: Object representation, description and recognition. BOOKS AND REFERENCES: Digital Image Processing by Rafael C Gonzalez & Richard E Woods, 3rd Edition; Fundamentals of Digital Image Processing by Anil K Jain; Digital Image Processing by William K Pratt. If you do videos on NPTEL topics in digital image processing I will surely give you 20 subscribers. I found your explanation simple and clear in all the videos, which is why I am requesting this, and I hope you will try 😊 thank you once again 🎉❤

  • @ash-xl3bb · 2 months ago

    Very well explained. Thank you madam.

  • @shrutisingh4232 · 3 months ago

    unclear

  • @tolhadamola5995 · 3 months ago

    Hello ma'am. I love your teaching 🎉🎉 Is there any textbook you got this from or just from the internet? Could you please recommend the textbook(s) if any?

    • @varshasengineeringstuff4621 · 3 months ago

      Thanks for the nice compliment. Yes: the Manning and Daniel Jurafsky books, and Pawan Goyal's NPTEL course on NLP.

    • @tolhadamola5995 · 3 months ago

      @varshasengineeringstuff4621 Thanks a lot ma'am.