Karen Janice Mazidi
United States
Joined 28 Jul 2010
Online materials for students in computer architecture, natural language processing, and machine learning.
See my books on Amazon: www.amazon.com/~/e/B08PG4TGM2
CA04 - MIPS and machine code
Converting MIPS instructions to machine code, and reverse engineering machine code back into MIPS instructions; MIPS instruction formats. (A small encoding sketch follows this entry.)
6,519 views
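As a concrete, hedged illustration of the encoding direction (this sketch is mine, not code from the video): the Python below packs the six R-format fields of add $t0, $s1, $s2 into its 32-bit machine word, using the standard MIPS register numbers and funct code.

# R-format: opcode(6) | rs(5) | rt(5) | rd(5) | shamt(5) | funct(6)
def encode_r_format(opcode, rs, rt, rd, shamt, funct):   # helper name is my own
    return (opcode << 26) | (rs << 21) | (rt << 16) | (rd << 11) | (shamt << 6) | funct

# add $t0, $s1, $s2  ->  opcode 0, rs=$s1(17), rt=$s2(18), rd=$t0(8), funct 0x20
print(hex(encode_r_format(0, 17, 18, 8, 0, 0x20)))   # 0x2324020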
Videos
CA01 - Introduction to Computer Architecture
15K views · 3 years ago
What is Computer Architecture?
nlp25 - Embeddings
242 views · 4 years ago
Content from Chapter 25 of Exploring NLP with Python, available on Amazon: www.amazon.com/dp/B08P8QKDZK/ Training your own embeddings or using pretrained embeddings like GloVe, ELMo, BERT.
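A minimal hedged sketch of the "training your own embeddings" half of this topic, using gensim's Word2Vec (gensim 4.x API assumed; the toy corpus is invented for illustration):

from gensim.models import Word2Vec

# Each document is a list of tokens; real corpora would be much larger.
sentences = [["nlp", "is", "fun"],
             ["deep", "learning", "for", "nlp"],
             ["nlp", "and", "machine", "learning"]]

model = Word2Vec(sentences, vector_size=50, window=3, min_count=1)  # 50-d vectors
print(model.wv["nlp"][:5])            # first few components of the "nlp" vector
print(model.wv.most_similar("nlp"))   # nearest neighbors in the embedding space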
CA23 - Dependability and security
797 views · 4 years ago
Parity and ECC for error detection; virtual machines, containers and cloud; hardware vulnerabilities
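To make the parity idea concrete, here is a small sketch of my own (not from the video) that computes an even-parity bit and uses it to detect a single-bit flip:

def parity_bit(byte):
    # Even parity: 0 if the byte already has an even number of 1s, else 1.
    return bin(byte).count("1") % 2

data = 0b10110010
stored_parity = parity_bit(data)        # computed when the byte is stored

corrupted = data ^ 0b00000100           # one bit flips in transit
if parity_bit(corrupted) != stored_parity:
    print("single-bit error detected")  # parity catches any odd number of flips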
nlp26 - Sequence to sequence models
211 views · 4 years ago
Content from Chapter 26 of Exploring NLP with Python, available on Amazon: www.amazon.com/dp/B08P8QKDZK/ Encoders and decoders for seq-2-seq models.
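Below is a minimal hedged sketch of the encoder-decoder wiring (a standard Keras pattern, not code from the book; the vocabulary sizes and latent dimension are invented placeholders):

from tensorflow import keras
from tensorflow.keras import layers

num_encoder_tokens, num_decoder_tokens, latent_dim = 50, 60, 64  # hypothetical

# Encoder: read the source sequence, keep only the final LSTM states.
encoder_inputs = keras.Input(shape=(None, num_encoder_tokens))
_, state_h, state_c = layers.LSTM(latent_dim, return_state=True)(encoder_inputs)

# Decoder: generate the target sequence, initialized with the encoder's states.
decoder_inputs = keras.Input(shape=(None, num_decoder_tokens))
x = layers.LSTM(latent_dim, return_sequences=True)(
    decoder_inputs, initial_state=[state_h, state_c])
decoder_outputs = layers.Dense(num_decoder_tokens, activation="softmax")(x)

model = keras.Model([encoder_inputs, decoder_inputs], decoder_outputs)
model.compile(optimizer="rmsprop", loss="categorical_crossentropy")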
nlp24 - Deep Learning Variations
264 views · 4 years ago
Content from Chapter 24 of Exploring NLP with Python, available on Amazon: www.amazon.com/dp/B08P8QKDZK/ Adding an Embedding layer in Keras: RNN, CNN, LSTM, GRU in Keras
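A hedged minimal example of adding an Embedding layer in Keras, with an LSTM on top (vocabulary size and sequence length are invented; a GRU or 1-D CNN would slot in the same way):

from tensorflow import keras
from tensorflow.keras import layers

vocab_size, seq_len = 10000, 100   # hypothetical values

model = keras.Sequential([
    keras.Input(shape=(seq_len,)),            # integer token ids per document
    layers.Embedding(vocab_size, 64),         # ids -> trainable 64-d vectors
    layers.LSTM(32),                          # the recurrent variant; swap for GRU etc.
    layers.Dense(1, activation="sigmoid"),    # binary classifier head
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])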
nlp23 Deep Learning
295 views · 4 years ago
Content from Chapter 23 of Exploring NLP with Python, available on Amazon: www.amazon.com/dp/B08P8QKDZK/ Deep learning for text classification with Keras; Keras API; Keras Functional API
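As a hedged sketch of the Keras Functional API mentioned here (layer sizes are arbitrary, and the input is imagined as a TF-IDF vector per document):

from tensorflow import keras
from tensorflow.keras import layers

inputs = keras.Input(shape=(5000,))                 # one TF-IDF vector per document
x = layers.Dense(64, activation="relu")(inputs)     # layers are called like functions
outputs = layers.Dense(1, activation="sigmoid")(x)

model = keras.Model(inputs, outputs)                # wire inputs to outputs explicitly
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])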
nlp21 - Logistic Regression
323 views · 4 years ago
Content from Chapter 21 of Exploring NLP with Python, available on Amazon: www.amazon.com/dp/B08P8QKDZK/ Logistic Regression for text classification; underfitting and overfitting; gradient descent; odds versus probability; log odds; sigmoid function
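A small sketch of my own connecting the terms in this description: probability, odds, log odds, and the sigmoid that maps log odds back to a probability:

import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))   # squashes any real number into (0, 1)

p = 0.8
odds = p / (1 - p)            # 4.0, i.e., "4 to 1" in favor
log_odds = np.log(odds)       # ~1.386; logistic regression models this linearly
print(sigmoid(log_odds))      # ~0.8, recovering the original probability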
nlp22 - Neural Networks
344 views · 4 years ago
Content from Chapter 22 of Exploring NLP with Python, available on Amazon: www.amazon.com/dp/B08P8QKDZK/ Neural networks in sklearn; perceptrons; neurons; layers; activation functions; feed forward network; back propagation; epochs; network design
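For the sklearn side, a hedged minimal run with MLPClassifier (the data here is synthetic, generated just for illustration):

from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=200, n_features=10, random_state=0)

# One hidden layer of 16 neurons; max_iter raised so the toy run converges.
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0)
clf.fit(X, y)
print(clf.score(X, y))   # accuracy on the training data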
nlp19 - Converting text to data
456 views · 4 years ago
Content from Chapter 19 of Exploring NLP with Python, available on Amazon: www.amazon.com/dp/B08P8QKDZK/ Using sklearn CountVectorizer() and TfidfVectorizer()
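A quick hedged look at the two vectorizers named here (the three-document corpus is invented):

from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer

corpus = ["the cat sat", "the dog sat", "the cat ran"]

counts = CountVectorizer().fit_transform(corpus)  # raw term counts per document
tfidf = TfidfVectorizer().fit_transform(corpus)   # counts reweighted by rarity (idf)

print(counts.toarray())
print(tfidf.toarray().round(2))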
nlp20 - Naive Bayes
363 views · 4 years ago
Content from Chapter 20 of Exploring NLP with Python, available on Amazon: www.amazon.com/dp/B08P8QKDZK/ Running Naive Bayes in sklearn for text classification. Metrics: accuracy, precision, recall, Kappa, ROC and AUC.
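A hedged end-to-end sketch of this pipeline: vectorize, fit MultinomialNB, and score with several of the listed metrics (the four-document corpus is invented, so the scores mean nothing):

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.metrics import (accuracy_score, cohen_kappa_score,
                             precision_score, recall_score)

texts = ["great movie", "awful movie", "great fun", "awful bore"]
labels = [1, 0, 1, 0]

X = CountVectorizer().fit_transform(texts)
pred = MultinomialNB().fit(X, labels).predict(X)

print(accuracy_score(labels, pred), precision_score(labels, pred),
      recall_score(labels, pred), cohen_kappa_score(labels, pred))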
nlp17 ML intro
417 views · 4 years ago
Content from Chapter 17 of Exploring NLP with Python, available on Amazon: www.amazon.com/dp/B08P8QKDZK/ An overview of machine learning; supervised v. unsupervised learning; terminology
nlp18 Libraries
374 views · 4 years ago
Content from Chapter 18 of Exploring NLP with Python, available on Amazon: www.amazon.com/dp/B08P8QKDZK/ An introduction to NumPy, pandas, Seaborn, sklearn.
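As a hedged one-line taste of each library (representative calls only, nothing from the book itself):

import numpy as np
import pandas as pd
import seaborn as sns
from sklearn.linear_model import LinearRegression

arr = np.arange(6).reshape(2, 3)                     # NumPy: n-dimensional arrays
df = pd.DataFrame({"x": [1, 2, 3], "y": [2, 4, 6]})  # pandas: labeled tables
sns.scatterplot(data=df, x="x", y="y")               # Seaborn: statistical plots
model = LinearRegression().fit(df[["x"]], df["y"])   # sklearn: fit/predict API
print(model.coef_)                                   # slope ~2.0 for this toy data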
nlp14 - Vector space model
683 views · 4 years ago
Content from Chapter 14 of Exploring NLP with Python, available on Amazon: www.amazon.com/dp/B08P8QKDZK/ Implementing a vector space model from scratch or using sklearn. Explanation of cosine similarity.
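A minimal hedged sketch covering both routes the description mentions, with cosine similarity via sklearn and from scratch (documents invented):

import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

docs = ["the cat sat on the mat", "the cat sat", "dogs run fast"]
X = TfidfVectorizer().fit_transform(docs)    # one TF-IDF vector per document
print(cosine_similarity(X[0], X[1]))         # high: shared vocabulary
print(cosine_similarity(X[0], X[2]))         # ~0: no words in common

# From scratch: dot product divided by the product of the vector norms.
a, b = np.array([1.0, 2.0, 0.0]), np.array([2.0, 4.0, 1.0])
print(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))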
nlp15 topic modeling
284 views · 4 years ago
Content from Chapter 15 of Exploring NLP with Python, available on Amazon: www.amazon.com/dp/B08P8QKDZK/ Topic modeling with gensim.
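A hedged minimal example of LDA topic modeling with gensim (the three tokenized toy documents are invented):

from gensim import corpora, models

texts = [["cat", "dog", "pet"],
         ["python", "code", "nlp"],
         ["dog", "pet", "vet"]]

dictionary = corpora.Dictionary(texts)            # word <-> id mapping
corpus = [dictionary.doc2bow(t) for t in texts]   # bag-of-words per document

lda = models.LdaModel(corpus, num_topics=2, id2word=dictionary, random_state=0)
print(lda.print_topics())                         # top words for each topic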
nlp01-Welcome to Natural Language Processing
1.7K views · 4 years ago
nlp01-Welcome to Natural Language Processing
ML27 - Markov Models to Reinforcement Learning
203 views · 4 years ago
ML27 - Markov Models to Reinforcement Learning
When I'm really tired of MIPS, I find this video. It really helps me ❤ Thank you, teacher. It's 2:37 am in my country 😢 You are the first woman teacher I've learned from. You influence me so much!!! Thank you ❤
You got this!
Thank you, I'm behind in my comp org and arch class and your videos are helpful. This one is just review for me, but it's helpful to see MIPS assembly code examples, as my professor doesn't give any.
Glad it was helpful!
Nice
Thanks
thank you, this was very helpful :)
Glad it was helpful!
Hello dear professor, I am MD HELAL HOSSEN. I am taking your NLP course this spring 2024. I learn a lot from your videos.
great explanation
instead of typing the code, is there a place to copy code? or where can i get those ppts?
You can find some of it in my GitHub: github.com/kjmazidi
This deserves much more views, professor! I'm taking this course in my university right now and I don't understand one bit of what my professor talks about! Thank you for this!
thank u
Thank you so much professor, this was very helpful.
Interesting fact: The R3000A in the PS1 didn't have an FPU.
Thanks Karen. Watching this because I'm doing MIPS of a PS1. You're a star!
Thanks Professor Mazidi, I'm having a hard time learning from my current professor and your videos are making it easy for me to keep up in class
finally a good explanation of signed and unsigned
Thank you professor!
I wish I could be in your class, your students are so lucky! :) For the exercise at 14:32, this is what I did:

lw $t1, a
lw $t2, b
la $t0, a
sw $t1, 4($t0)
sw $t2, 0($t0)

Is there a simpler way?
When you say copy through the first 1, do you mean that we copy from the right until we reach the first 1, then flip the remaining bits to the left of that 1?
explained very simply and quickly! it's perfect
Thank you very much for video!
Great explanation! Thanks :)
ily. I've learned more computer architecture from these videos in a few hours than all semester.
Thank you so much! You are so good at explaining things
its totally amazing
thanks for your content
thanks.
wonderful explanation!!! thank you :)
you're the best
Great video! Very helpful, thanks
Glad it was helpful!
pycharm > jupyter notebook
Hey I need help with my code
this is good !!
I'm still lost, but at least it's not looking as scary anymore lol
This is the best explanation about this topic that I can find on YT
superb explanation thank you
Thank you for the nice explanation
You are welcome!
Absolutely fantastic video! I couldn't find the main control unit logic anywhere on the internet; you helped me very much.
Great to hear!
Very clear explanation! Thank you.
Glad it was helpful!
Great video!
Thanks!
Regarding the code shown at 6:59: as you've written it, it doesn't work for me. These are the errors I get on lines 11-14:

11 --> Too few or incorrectly formatted operands. Expected: lw $t1,-100($t2)
12 --> Too few or incorrectly formatted operands. Expected: lw $t1,-100($t2)
13 --> Too few or incorrectly formatted operands. Expected: sw $t1,-100($t2)
14 --> Too few or incorrectly formatted operands. Expected: sw $t1,-100($t2)
These are pretty standard lw and sw instructions. Did you put in the commas? Did you define a, b, c, d in the data section? Are you using MARS?
The guy at the end of the line!
Next time I'm at an ML conference, I'll be thanking u
What is the control signal for lbu? Is it the same as for lw?
This simplified MIPS implementation doesn't have the lbu instruction, but if it did, the control signals would be the same.
Ok Thx ☺️
aayyhh quotes are back, I was missing them
Hi dear Dr. Mazidi, thanks! Your channel was very helpful for me. Would you please make a video on Bayesian networks, especially on "kruskal.test", "pairwise.wilcox.test", and the "anova" test?
I haven't used Java in like two years lol
For those of you who forgot cosine similarity: ( x1*y1 + ... + xn*yn ) / [ sqrt( x1^2 + ... + xn^2 ) * sqrt( y1^2 + ... + yn^2 ) ], i.e., the dot product (for binary vectors, the number of shared items) divided by the product of the two vector lengths.
Best explanation of Cosine Similarity I've ever seen. Thanks Dr. Mazidi.
"Information is not knowledge"
Wow the last quote was very impressive!