Vivian Lobo
Joined 20 Nov 2020
Videos
McCulloch-Pitts (MP) neuron for XOR gate
5K views · 3 years ago
McCulloch-Pitts (MP) neuron for NOR gate continued... (Plotting a graph)
729 views · 3 years ago
McCulloch-Pitts (MP) neuron for NOR gate
6K views · 3 years ago
McCulloch-Pitts (MP) neuron for AND NOT function
2.2K views · 3 years ago
McCulloch-Pitts (MP) neuron for NOT gate
3.1K views · 3 years ago
McCulloch-Pitts (MP) neuron for OR gate
5K views · 3 years ago
McCulloch-Pitts (MP) neuron for AND gate
4.1K views · 3 years ago
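The MP-neuron videos above all follow the same recipe: binary inputs, fixed weights, and a hard threshold, with XOR being the one gate a single MP neuron cannot realize. A minimal sketch of that idea for the AND gate, assuming the usual textbook threshold of 2 (the videos' exact notation may differ):

```python
# McCulloch-Pitts neuron: outputs 1 iff the weighted sum of its
# binary inputs reaches the threshold theta, else 0.
def mp_neuron(inputs, weights, theta):
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= theta else 0

# AND gate: two excitatory weights of 1 and threshold 2, so the
# neuron fires only when both inputs are 1.
for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, "->", mp_neuron((x1, x2), (1, 1), theta=2))
```

Lowering the threshold to 1 turns the same neuron into the OR gate; XOR needs a second layer (for example, an OR over two AND NOT neurons), which is presumably the construction the XOR video builds.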
Module 06 | Application Layer | Part 1
68 views · 3 years ago
TCP State Transition or Connection Modeling
6K views · 3 years ago
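The connection-modeling video covers the standard RFC 793 TCP state machine. A heavily abridged sketch of it as a transition table, covering only the normal open and close paths (the event labels here are informal shorthand, not the video's notation):

```python
# A slice of the RFC 793 TCP state machine:
# (current_state, event) -> next_state.
TRANSITIONS = {
    ("CLOSED", "passive open"): "LISTEN",
    ("CLOSED", "active open / send SYN"): "SYN_SENT",
    ("LISTEN", "recv SYN / send SYN+ACK"): "SYN_RCVD",
    ("SYN_SENT", "recv SYN+ACK / send ACK"): "ESTABLISHED",
    ("SYN_RCVD", "recv ACK"): "ESTABLISHED",
    ("ESTABLISHED", "close / send FIN"): "FIN_WAIT_1",
    ("ESTABLISHED", "recv FIN / send ACK"): "CLOSE_WAIT",
    ("FIN_WAIT_1", "recv ACK"): "FIN_WAIT_2",
    ("FIN_WAIT_2", "recv FIN / send ACK"): "TIME_WAIT",
    ("CLOSE_WAIT", "close / send FIN"): "LAST_ACK",
    ("LAST_ACK", "recv ACK"): "CLOSED",
}

def step(state, event):
    # Stay in the current state for events the table does not model.
    return TRANSITIONS.get((state, event), state)

# A client's active open: three-way handshake to ESTABLISHED.
s = step("CLOSED", "active open / send SYN")    # -> SYN_SENT
s = step(s, "recv SYN+ACK / send ACK")          # -> ESTABLISHED
print(s)
```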
Module 05 | TCP Header | Computer Networks
106 views · 3 years ago
Experiment 08 | CNL | Implementation of Network Topology (Star Topology) in NS2 via TCL
641 views · 3 years ago
Leaky Bucket Algorithm | Computer Networks
335 views · 3 years ago
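The leaky-bucket algorithm from the video is short enough to state directly in code: bursts fill a fixed-capacity bucket, the output drains at a constant rate, and overflow is dropped. A minimal sketch (the capacity and leak rate are illustration values, not the video's numbers):

```python
# Leaky bucket traffic shaper: arriving packets fill the bucket up to
# its capacity (overflow is dropped); each tick drains at most
# leak_rate packets, smoothing bursts into a constant output rate.
def leaky_bucket(arrivals, capacity, leak_rate):
    level = 0
    for tick, arriving in enumerate(arrivals):
        dropped = max(0, level + arriving - capacity)
        level = min(capacity, level + arriving)
        sent = min(leak_rate, level)
        level -= sent
        print(f"tick {tick}: in={arriving} sent={sent} dropped={dropped} queued={level}")

# A burst of 10 packets is smoothed to at most 2 packets per tick.
leaky_bucket([10, 0, 0, 0, 0], capacity=8, leak_rate=2)
```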
Learning by Classification using Gaussian Naive Bayes | Machine Learning Workshop
35 views · 3 years ago
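Gaussian Naive Bayes fits one Gaussian per class to each feature and predicts the class with the highest posterior. A minimal sketch with scikit-learn on an invented toy dataset (the workshop's actual dataset and tooling are not shown on this page, so this is only an assumption of the typical setup):

```python
# Gaussian Naive Bayes: per class, each feature is modeled as a
# Gaussian; prediction picks the class with the largest posterior.
from sklearn.naive_bayes import GaussianNB

X = [[1.0, 2.1], [1.2, 1.9], [0.9, 2.0],   # class 0 cluster
     [3.0, 0.5], [3.2, 0.7], [2.9, 0.4]]   # class 1 cluster
y = [0, 0, 0, 1, 1, 1]

model = GaussianNB()
model.fit(X, y)
print(model.predict([[1.1, 2.0], [3.1, 0.6]]))  # expected: [0 1]
```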
Bayesian Belief Network and Markov Model
1.4K views · 3 years ago
SVM and Maximum Margin Linear Separators
707 views · 3 years ago
Expectation Maximization Algorithm | Part 1
87 views · 3 years ago
Expectation Maximization Algorithm | Part 2
44 views · 3 years ago
Expectation Maximization Algorithm | Numerical
178 views · 3 years ago
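A commenter below asks how the initial values of theta A and theta B in the EM numerical were chosen: they are arbitrary starting guesses that the E- and M-steps then refine. A compact sketch of the classic two-coin EM example that those thetas suggest the numerical follows (the flip data and starting values here are invented for illustration):

```python
# EM for two biased coins: heads counts are observed per session, but
# not which coin produced them. E-step: compute each coin's
# responsibility for a session from the current thetas. M-step:
# re-estimate each theta from the responsibility-weighted counts.
from math import comb

def em_two_coins(heads_per_session, flips, theta_a, theta_b, iters=20):
    for _ in range(iters):
        ha = ta = hb = tb = 0.0
        for h in heads_per_session:
            like_a = comb(flips, h) * theta_a**h * (1 - theta_a)**(flips - h)
            like_b = comb(flips, h) * theta_b**h * (1 - theta_b)**(flips - h)
            w = like_a / (like_a + like_b)          # P(coin A | session)
            ha += w * h
            ta += w * (flips - h)
            hb += (1 - w) * h
            tb += (1 - w) * (flips - h)
        theta_a = ha / (ha + ta)
        theta_b = hb / (hb + tb)
    return theta_a, theta_b

# Five sessions of 10 flips; 0.6 and 0.5 are arbitrary initial guesses.
print(em_two_coins([5, 9, 8, 4, 7], flips=10, theta_a=0.6, theta_b=0.5))
```

Different starting guesses generally converge to the same estimates here (up to swapping the two coins' labels), which is why the choice in the numerical can be arbitrary.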
👏 Thank you, sir!
Sir, a12 is positive and a11 is negative???
🙏🙏
Can you please elaborate why you have taken -1/√2 in lambda 2 for the value of x1?
Thankkk you so muchhh!
It's been 3 years since the video was uploaded, but everyone is commenting a day before the exam 😂😂
If the factors are 0, then sigma2 will be 0 and u2 will be infinite.
Nice explanation, sir.
yess
Indeed!
This is definitely wrong.
Your calculations are not correct.
Simple and straightforward explanation. Thank you!
Buy Windows, bro.
The calculation is wrong.
Center points = 0, 0.5, 1; evaluation points = 0, 0.5, 0.6, 0.7, 0.8, 1; shape parameter = 3; function = e^(sin πx). Solve this using the MQ RBF interpolation technique.
Your solution is totally wrong. If you don't know how to find the distance, the answer will be wrong. The solution should be (phi1, phi2) = {(1, 0.1353), (0.3678, 0.3678), (0.3678, 0.3678), (0.1353, 1)}.
Excellent tutorial... keep it up!
FOR ITALIANS: I have made a playlist on RBF networks! :) here -> ua-cam.com/video/fcBz-3NchCI/v-deo.html
How did you get the 0.3678 value when you considered x = (0,1) and µ = (0,0)? x - µ = [0 1] - [0 0] = 0*0 - 1*0 = 0, so how did it come to 0.3678? Please clear up that problem.
Hello Ankit, as per the formula, once you calculate x - µ1, whose answer is [0 1], we need to take the norm of [0 1], which is 1 (i.e., sqrt(0^2 + 1^2) = 1). Then, as per the formula, it is exp(-1), which is 0.3678.
@@vivianlobo8440 True, but there is an error in the phi2(x) calculation for the (0,0) case and the phi1(x) calculation for the (1,1) case. In both these cases the phi value is e^(-2). Hence the (1,1) values you got for the (0,0) and (1,1) inputs (the first and fourth cases) should be (1, 0.135) and (0.135, 1) respectively.
@@kevinvigi9791 Kevin is right, actually.
@@kevinvigi9791 I agree with you
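For anyone verifying this thread numerically: with phi(x) = exp(-||x - µ||²) and centers µ1 = (0,0), µ2 = (1,1) (the centers the replies above imply), the four XOR inputs give exactly the corrected pairs. A quick check:

```python
# Gaussian RBF features for the XOR inputs with
# phi(x) = exp(-||x - mu||^2), centers mu1 = (0,0) and mu2 = (1,1).
from math import exp

def phi(x, mu):
    return exp(-sum((xi - mi) ** 2 for xi, mi in zip(x, mu)))

mu1, mu2 = (0, 0), (1, 1)
for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, round(phi(x, mu1), 4), round(phi(x, mu2), 4))
# Prints 1.0/0.1353 for (0,0), 0.3679/0.3679 for (0,1) and (1,0),
# and 0.1353/1.0 for (1,1), matching the corrected solution up to rounding.
```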
Thank you so much! It was lit. And can you please explain on what basis you took the initial values of theta A and theta B that we assumed?
Very helpful video, sir!
One of the best faculty members at St. John 😅 Keep it up, sir 👍