Karen Janice Mazidi
  • 82
  • 172,928
CA04 - MIPS and machine code
Converting MIPS instructions to machine code, and reverse engineering machine code to MIPS instructions; MIPS instruction formats
6,519 views
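The instruction-to-machine-code conversion this video covers can be sketched in Python. The field widths and the opcode/funct values for `add` below are the standard MIPS32 R-type encoding; the helper name `encode_r_type` is mine:

```python
# Sketch: encode the R-type instruction `add $t0, $t1, $t2` into machine code.
# R-type layout: opcode(6) | rs(5) | rt(5) | rd(5) | shamt(5) | funct(6)
def encode_r_type(opcode, rs, rt, rd, shamt, funct):
    return (opcode << 26) | (rs << 21) | (rt << 16) | (rd << 11) | (shamt << 6) | funct

# Standard MIPS register numbers: $t0=8, $t1=9, $t2=10; add has opcode 0, funct 0x20.
word = encode_r_type(0, 9, 10, 8, 0, 0x20)
print(f"0x{word:08x}")  # prints 0x012a4020
```

Reverse engineering is the same arithmetic in the other direction: mask out each field with shifts and `&` and look the numbers back up in the register and funct tables.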

Videos

CA01 - Introduction to Computer Architecture
15K views · 3 years ago
What is Computer Architecture?
nlp25 - Embeddings
242 views · 4 years ago
Content from Chapter 25 of Exploring NLP with Python, available on Amazon: www.amazon.com/dp/B08P8QKDZK/ Training your own embeddings or using pretrained embeddings like GloVe, ELMo, BERT.
CA23 - Dependability and security
797 views · 4 years ago
Parity and ECC for error detection; virtual machines, containers and cloud; hardware vulnerabilities
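The parity scheme mentioned in the description can be illustrated in a few lines of Python (a minimal even-parity sketch for single-bit error detection, not the full ECC covered in the video; the helper names are mine):

```python
# Sketch: even parity for simple error detection.
def parity_bit(bits):
    """Return the even-parity bit for a list of 0/1 data bits."""
    return sum(bits) % 2

def has_error(bits_with_parity):
    """With even parity, the total number of 1s should be even."""
    return sum(bits_with_parity) % 2 != 0

data = [1, 0, 1, 1]
p = parity_bit(data)            # 1, so the stored word has an even number of 1s
assert not has_error(data + [p])

corrupted = [1, 0, 0, 1] + [p]  # one bit flipped in transit
assert has_error(corrupted)     # a single flipped bit is detected
```

Note that parity detects a single flipped bit but cannot locate it, which is why ECC adds more check bits.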
nlp26 - Sequence to sequence models
211 views · 4 years ago
Content from Chapter 26 of Exploring NLP with Python, available on Amazon: www.amazon.com/dp/B08P8QKDZK/ Encoders and decoders for seq-2-seq models.
nlp24 - Deep Learning Variations
264 views · 4 years ago
Content from Chapter 24 of Exploring NLP with Python, available on Amazon: www.amazon.com/dp/B08P8QKDZK/ Adding an Embedding layer in Keras: RNN, CNN, LSTM, GRU in Keras
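What a Keras Embedding layer does at lookup time can be sketched without Keras: it is a table of trainable vectors indexed by token id. The vocabulary size, dimension, and the `embed` helper below are illustrative, not from the book:

```python
import random

# Sketch of an Embedding layer's forward pass: a lookup table mapping
# integer token ids to dense vectors (in Keras these are trained weights).
vocab_size, embed_dim = 10, 4
random.seed(0)
embedding = [[random.uniform(-1, 1) for _ in range(embed_dim)]
             for _ in range(vocab_size)]

def embed(token_ids):
    """Map a sequence of token ids to a sequence of dense vectors."""
    return [embedding[t] for t in token_ids]

vecs = embed([3, 1, 4])
print(len(vecs), len(vecs[0]))  # prints 3 4: three tokens, each a 4-d vector
```

In a real model this layer feeds the RNN/CNN/LSTM/GRU layers the chapter describes, and the table entries are updated by backpropagation.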
nlp23 Deep Learning
295 views · 4 years ago
Content from Chapter 23 of Exploring NLP with Python, available on Amazon: www.amazon.com/dp/B08P8QKDZK/ Deep learning for text classification with Keras; Keras API; Keras Functional API
nlp21 - Logistic Regression
323 views · 4 years ago
Content from Chapter 21 of Exploring NLP with Python, available on Amazon: www.amazon.com/dp/B08P8QKDZK/ Logistic Regression for text classification; underfitting and overfitting; gradient descent; odds versus probability; log odds; sigmoid function
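The odds / log-odds / sigmoid relationships listed in the description can be checked directly in Python (a minimal sketch of the math, not the book's code):

```python
import math

# Logistic-regression quantities: odds vs. probability, log odds,
# and the sigmoid function that inverts the log odds.
def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def log_odds(p):
    return math.log(p / (1.0 - p))

p = 0.8
print(p / (1 - p))             # odds: 4.0, i.e. 4 to 1
print(sigmoid(log_odds(p)))    # 0.8: sigmoid undoes the log odds
```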
nlp22 - Neural Networks
344 views · 4 years ago
Content from Chapter 22 of Exploring NLP with Python, available on Amazon: www.amazon.com/dp/B08P8QKDZK/ Neural networks in sklearn; perceptrons; neurons; layers; activation functions; feed forward network; back propagation; epochs; network design
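A minimal feed-forward pass over the pieces named in the description (neurons, layers, activation functions) can be sketched as follows; the weights are illustrative and untrained:

```python
import math

# One feed-forward pass: a hidden layer and an output layer,
# each neuron computing a weighted sum plus bias, then a sigmoid.
def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def layer(inputs, weights, biases):
    """One dense layer: weighted sum per neuron, then activation."""
    return [sigmoid(sum(w * x for w, x in zip(ws, inputs)) + b)
            for ws, b in zip(weights, biases)]

x = [1.0, 0.5]
hidden = layer(x, weights=[[0.2, -0.4], [0.7, 0.1]], biases=[0.0, -0.1])
output = layer(hidden, weights=[[1.0, -1.0]], biases=[0.0])
print(output)  # a single value in (0, 1)
```

Backpropagation, which the video also covers, is the reverse sweep that adjusts these weights from the output error.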
nlp19 - Converting text to data
456 views · 4 years ago
Content from Chapter 19 of Exploring NLP with Python, available on Amazon: www.amazon.com/dp/B08P8QKDZK/ Using sklearn CountVectorizer() and TfidfVectorizer()
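A rough pure-Python sketch of what `CountVectorizer()` produces (sklearn's real implementation also handles tokenization options, stop words, and sparse output; `TfidfVectorizer()` additionally reweights these counts):

```python
# Bag-of-words counting: build a sorted vocabulary, then turn each
# document into a vector of term counts over that vocabulary.
docs = ["the cat sat", "the cat ate the fish"]

vocab = sorted({w for d in docs for w in d.split()})
index = {w: i for i, w in enumerate(vocab)}

def to_counts(doc):
    vec = [0] * len(vocab)
    for w in doc.split():
        vec[index[w]] += 1
    return vec

matrix = [to_counts(d) for d in docs]
print(vocab)   # ['ate', 'cat', 'fish', 'sat', 'the']
print(matrix)  # [[0, 1, 0, 1, 1], [1, 1, 1, 0, 2]]
```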
nlp20 - Naive Bayes
363 views · 4 years ago
Content from Chapter 20 of Exploring NLP with Python, available on Amazon: www.amazon.com/dp/B08P8QKDZK/ Running Naive Bayes in sklearn for text classification. Metrics: accuracy, precision, recall, Kappa, ROC and AUC.
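The count-based metrics named in the description (accuracy, precision, recall) can be computed by hand for a small binary run; Kappa, ROC and AUC are omitted from this sketch:

```python
# Metrics from the confusion-matrix counts for a binary classifier
# (1 = positive class).
def metrics(y_true, y_pred):
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
    precision = tp / (tp + fp)   # of predicted positives, how many were right
    recall = tp / (tp + fn)      # of actual positives, how many were found
    return accuracy, precision, recall

acc, prec, rec = metrics([1, 1, 0, 0, 1], [1, 0, 0, 1, 1])
print(acc, prec, rec)  # 0.6, then 2/3 for both precision and recall
```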
nlp17 ML intro
417 views · 4 years ago
Content from Chapter 17 of Exploring NLP with Python, available on Amazon: www.amazon.com/dp/B08P8QKDZK/ An overview of machine learning; supervised v. unsupervised learning; terminology
nlp18 Libraries
374 views · 4 years ago
Content from Chapter 18 of Exploring NLP with Python, available on Amazon: www.amazon.com/dp/B08P8QKDZK/ An introduction to NumPy, pandas, Seaborn, sklearn.
nlp14 - Vector space model
683 views · 4 years ago
Content from Chapter 14 of Exploring NLP with Python, available on Amazon: www.amazon.com/dp/B08P8QKDZK/ Implementing a vector space model from scratch or using sklearn. Explanation of cosine similarity.
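Cosine similarity from scratch, as the chapter describes: the dot product of two document vectors divided by the product of their magnitudes:

```python
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

print(cosine_similarity([1, 0], [1, 0]))  # 1.0: same direction
print(cosine_similarity([1, 0], [0, 1]))  # 0.0: orthogonal, nothing shared
```

Because it measures angle rather than length, two documents with the same word proportions score 1.0 even if one is much longer.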
nlp15 topic modeling
284 views · 4 years ago
Content from Chapter 15 of Exploring NLP with Python, available on Amazon: www.amazon.com/dp/B08P8QKDZK/ Topic modeling with gensim.
nlp12 - Corpora
429 views · 4 years ago
nlp13 - Information Extraction
1.3K views · 4 years ago
nlp - syntax parsers
337 views · 4 years ago
nlp09 - CFG
659 views · 4 years ago
nlp10 syntax parsers
699 views · 4 years ago
nlp08 - ngrams
583 views · 4 years ago
nlp07 - Relationships between words
705 views · 4 years ago
nlp06-POS tagging
1.6K views · 4 years ago
nlp05-Words and Counting
517 views · 4 years ago
nlp03-NLTK
834 views · 4 years ago
nlp04-Linguistics
465 views · 4 years ago
nlp02-Python
659 views · 4 years ago
nlp01-Welcome to Natural Language Processing
1.7K views · 4 years ago
ML27 - Markov Models to Reinforcement Learning
203 views · 4 years ago
ML26 - Bayes network
207 views · 4 years ago

COMMENTS

  • @lucyheartfilia6948 · 24 days ago

    When I'm really tired with MIPS, I find this video. It really helps me ❤ Thank you, teacher. It's 2:37am in my country 😢 You are the first woman teacher I have learned from. You influence me so much!!! Thank you ❤

  • @Bloxicorn · 1 month ago

    Thank you, I'm behind on my comp org and arch class and your videos are helpful. This one is just review for me but it's helpful to see MIPS with assembly code as my professor doesn't give any code examples.

    • @KJMazidi · 1 month ago

      Glad it was helpful!

  • @박의찬-p5g · 6 months ago

    Nice

  • @madelineluray7553 · 6 months ago

    thank you, this was very helpful :)

    • @KJMazidi · 2 months ago

      Glad it was helpful!

  • @DriveTru-US · 11 months ago

    Hello dear professor, I am MD HELAL HOSSEN. I am taking your NLP course this spring 2024. I learn a lot from your videos.

  • @armaankhatri3608 · 1 year ago

    great explanation

  • @xxnotmuchxx · 1 year ago

    instead of typing the code, is there a place to copy code? or where can i get those ppts?

    • @KJMazidi · 1 year ago

      You can find some of it in my GitHub: github.com/kjmazidi

  • @Satya-jr6ml · 1 year ago

    This deserves much more views, professor! I'm taking this course in my university right now and I don't understand one bit of what my professor talks about! Thank you for this!

  • @praneetkomandur5313 · 1 year ago

    thank u

  • @Octavian999 · 1 year ago

    Thank you so much professor, this was very helpful.

  • @codehorror8076 · 1 year ago

    Interesting fact: The R3000A in the PS1 didn't have a FPU.

  • @codehorror8076 · 1 year ago

    Thanks Karen. Watching this because I'm doing MIPS of a PS1. You're a star!

  • @daviddeng9743 · 1 year ago

    Thanks Professor Mazidi, I'm having a hard time learning from my current professor and your videos are making it easy for me to keep up in class

  • @practicefirsttheorylater · 1 year ago

    finally a good explanation of signed and unsigned

  • @qt6969 · 1 year ago

    Thank you professor!

  • @sniro1984 · 1 year ago

    I wish I could be in your class, your students are so lucky! :) For the exercise at 14:32 this is what I did:
        lw $t1, a
        lw $t2, b
        la $t0, a
        sw $t1, 4($t0)
        sw $t2, 0($t0)
    Is there a simpler way?

  • @beckembrown7002 · 1 year ago

    When you say copy through the first 1, do you mean that we copy from the right until we reach the first 1 than flip the remaining bits on the left of that 1?

  • @caine7024 · 2 years ago

    explained very simply and quickly! it's perfect

  • @altblogru · 2 years ago

    Thank you very much for video!

  • @slazy9219 · 2 years ago

    Great explanation! Thanks :)

  • @rasikpokharel77 · 2 years ago

    ily. I've learned more computer architecture from these videos in a few hours than all semester.

  • @ashleylove6840 · 2 years ago

    Thank you so much! You are so good at explaining things

  • @DailyDemo1 · 2 years ago

    its totally amazing

  • @DailyDemo1 · 2 years ago

    thanks for your content

  • @saeedbarari2207 · 2 years ago

    thanks.

  • @varinderbarmi · 2 years ago

    wonderful explanation!!! thank you :)

  • @cheetahcheetos7497 · 2 years ago

    you're the best

  • @luispedromorales3242 · 3 years ago

    Great video! Very helpful, thanks

    • @KJMazidi · 3 years ago

      Glad it was helpful!

  • @antonchigurh8102 · 3 years ago

    pycharm > jupyter notebook

  • @joel2527 · 3 years ago

    Hey I need help with my code

  • @pernarhvincent907 · 3 years ago

    this is good !!

  • @cerealgrapist7386 · 3 years ago

    im still lost but at least its not looking as scary anymore lol

  • @proomm986 · 3 years ago

    This is the best explanation about this topic that I can find on YT

  • @gopavaramsaimadhuree520 · 3 years ago

    superb explanation thank you

  • @amaanrampath4100 · 3 years ago

    Thank you for the nice explanation

    • @KJMazidi · 3 years ago

      You are welcome!

  • @opicaskorica564 · 3 years ago

    Absolutely fantastic video! I couldnt find main control unit logic anywhere on the internet, you helped me very much

  • @mazinashfaq631 · 3 years ago

    Very clear explanation! Thank you.

    • @KJMazidi · 3 years ago

      Glad it was helpful!

  • @4edgy8me · 3 years ago

    Great video!

  • @NOLAMarathon2010 · 3 years ago

    Regarding the code shown at 6:59. As you've written it, it doesn't work for me. These are the errors I get on lines 11, 12, 13 and 14:
        11 --> Too few or incorrectly formatted operands. Expected: lw $t1,-100($t2)
        12 --> Too few or incorrectly formatted operands. Expected: lw $t1,-100($t2)
        13 --> Too few or incorrectly formatted operands. Expected: sw $t1,-100($t2)
        14 --> Too few or incorrectly formatted operands. Expected: sw $t1,-100($t2)

    • @KJMazidi · 3 years ago

      These are pretty standard lw and sw instructions. Did you put in the commas? Did you define a, b, c, d in the data section? Are you using MARS?

  • @efi13efi · 3 years ago

    The guy at the end of the line!

  • @veganphilosopher1975 · 4 years ago

    Next time I'm at a ML conference proceeding I'll be thanking u

  • @HH-ip5zc · 4 years ago

    What is the control signal for lbu? Is it the same as for lw?

    • @KJMazidi · 4 years ago

      This simplified MIPS implementation doesn't have the lbu instruction, but if it did, the control signals would be the same.

    • @HH-ip5zc · 4 years ago

      Ok Thx ☺️

  • @finnwilliams1808 · 4 years ago

    aayyhh quotes are back, I was missing them

  • @habibollalatifizadeh2842 · 4 years ago

    Hi Dear Dr. Mazidi, Thanks. Your channel was very helpful for me. Would you please make a video on Bayesian Network, specially in "kruskal.test" ,"pairwise.wilcox.test" and "anova test" ?

  • @veganphilosopher1975 · 4 years ago

    I havent used java in like two years lol

  • @veganphilosopher1975 · 4 years ago

    For those of you that forgot Cosine Similarity ( Number of like items ) / [ SQRT ( x1 ^ 2 + ... + xn^2) * SQRT( x1^2 + ... + xm^2 ) ]

  • @benjaminray4833 · 4 years ago

    Best explanation of Cosine Similarity I've ever seen. Thanks Dr. Mazidi.

  • @guljemalrahmanova1890 · 4 years ago

    "Information is not knowledge"

  • @guljemalrahmanova1890 · 4 years ago

    Wow the last quote was very impressive!