At 9:08 he compares a simple problem (predicting electrical current at a given voltage) with the much more complicated problem of image recognition. IMHO, to differentiate between computationally and statistically big data, the targeted problem should be the same. So, for instance, in the plane recognition problem, thousands of pictures of the same type of aeroplane are computationally large but statistically insufficient. However, maybe a reduced number of pictures of various models would suffice statistically.
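A toy numerical sketch of that point (my own illustration, not from the talk; the resistance, noise level, and voltage ranges are made up): thousands of near-duplicate readings at one voltage pin down the current there but barely constrain the slope, while a handful of readings spread across the range estimates it well.

```python
# Fitting I = V / R from noisy readings: "big" data that is all near
# one voltage is computationally large but statistically weak.
import numpy as np

rng = np.random.default_rng(0)
true_R = 2.0                       # ohms (made up); current = voltage / R
noise = 0.05                       # measurement noise (made up)

def readings(voltages):
    return voltages / true_R + rng.normal(0, noise, size=voltages.shape)

# 5000 near-duplicate measurements, all near 1.0 V
V_big = rng.uniform(0.99, 1.01, 5000)
# 10 varied measurements spread between 0 V and 10 V
V_small = rng.uniform(0.0, 10.0, 10)

for name, V in [("5000 near-duplicates", V_big), ("10 varied", V_small)]:
    I = readings(V)
    slope = np.polyfit(V, I, 1)[0]          # least-squares line fit
    print(f"{name}: estimated 1/R = {slope:.3f} (true {1/true_R:.3f})")
```

Running this typically shows the 10 varied readings recovering 1/R far more tightly than the 5000 near-duplicates, even though the latter set is 500 times larger.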
1:30 If anybody wants this super long list lol
generative adversarial network
Markov random field
K-means clustering
radial basis functions
decision trees
logistic regression
Kalman filter
kernel PCA
random forest
deep networks
principal components
hidden Markov model
convolutional networks
support vector machines
Gaussian mixture
linear regression
independent component analysis
Gaussian process
factor analysis
Boltzmann machines
32:33 He said it the wrong way; actually the red is better than the green, and that's why it is amazing.
It was unexpected to hear SVM called a "funny approach for machine learning" (14:03). Vladimir Vapnik made an exceptional contribution to the development of statistical learning theory, and I believe Professor Bishop did not mean what he said.
He deliberately used that expression because Western people can't handle the truth: while they were still playing with the McCulloch-Pitts neuron in the 70s, Vapnik had already discovered Support Vector Machines more than ten years earlier.
Excellent!
You say that the model of a person restricts the degrees of freedom, so that less data is needed to conclude that some data represents a person - fine. But then you say that a convolutional layer represents a model of a person, when it clearly does not. Rather, the convolutional layer has parameters which iteratively change according to an algorithm so as to converge towards a structured representation of a person. Clearly the product is the model we're after, and the convolutional layer is more like a 'meta model'. It seems reasonable that the more degrees of freedom this meta model allows, the more kinds of models it can derive, right?
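A minimal sketch of that distinction, assuming PyTorch (the layer sizes and toy target are arbitrary stand-ins): the convolutional layer defines a parameterized family of functions - the 'meta model' - and each gradient step moves its weights towards one particular fitted model.

```python
import torch
import torch.nn as nn

conv = nn.Conv2d(1, 4, kernel_size=3, padding=1)   # the 'meta model'
opt = torch.optim.SGD(conv.parameters(), lr=0.1)

x = torch.randn(8, 1, 16, 16)        # stand-in images
target = torch.relu(x)               # stand-in for the structure to learn

for step in range(100):
    # Each step nudges the parameters towards one specific function
    # drawn from the family the architecture can express.
    loss = ((conv(x).mean(dim=1, keepdim=True) - target) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

# After training, conv.weight holds one particular model from the family;
# more channels or layers = more degrees of freedom = more models the
# same architecture can derive.
```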
Why is he holding water for the whole talk? lol
His concentration on the topic made him forget about the bottle.
Because he's exploring uncharted waters. XD