This video dives into how convolutional neural networks (CNNs) work and gives an intro to kernels, padding, strides, and max pooling using the MNIST handwritten digits dataset.
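As a rough sketch of those four ideas in plain NumPy (the 3x3 edge kernel, sizes, and function names here are illustrative, not taken from the video):

```python
# Minimal NumPy sketch: 2D convolution with padding and stride,
# followed by 2x2 max pooling, on a 28x28 MNIST-sized input.
import numpy as np

def conv2d(image, kernel, stride=1, padding=0):
    """Cross-correlation (what deep-learning libraries call 'convolution')."""
    if padding:
        image = np.pad(image, padding)  # zero-pad all four borders
    kh, kw = kernel.shape
    oh = (image.shape[0] - kh) // stride + 1
    ow = (image.shape[1] - kw) // stride + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            patch = image[i*stride:i*stride+kh, j*stride:j*stride+kw]
            out[i, j] = np.sum(patch * kernel)  # each output is a dot product
    return out

def maxpool2d(x, size=2):
    """Non-overlapping max pooling: keep the largest value in each window."""
    h, w = x.shape[0] // size, x.shape[1] // size
    return x[:h*size, :w*size].reshape(h, size, w, size).max(axis=(1, 3))

img = np.random.rand(28, 28)                   # stand-in for one MNIST digit
edge = np.array([[1, 0, -1],
                 [1, 0, -1],
                 [1, 0, -1]])                  # illustrative vertical-edge kernel
feat = conv2d(img, edge, stride=1, padding=1)  # 28x28 -> 28x28
pooled = maxpool2d(feat)                       # 28x28 -> 14x14
print(feat.shape, pooled.shape)
```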
Discrete convolutions, weighted sums, and fast transforms like the FFT are all dot products. Max pooling is switching, and ReLU is a switch 🤔: f(x)=x is connect, f(x)=0 is disconnect. A light switch in your house is binary on/off, yet it connects or disconnects a continuously variable AC voltage signal. A dot product of dot products is still a dot product, so once all the switch states in a ReLU net are known, the net collapses to a simple matrix: a linear mapping from the input vector to the output vector, to which plenty of metrics and further math can be applied.
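A quick numerical check of that collapse, as a sketch with made-up layer sizes, random weights, and biases omitted for simplicity:

```python
# Once an input fixes which ReLUs are "on", the whole net is a single matrix.
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.standard_normal((16, 8))   # first layer weights (illustrative sizes)
W2 = rng.standard_normal((4, 16))   # second layer weights

x = rng.standard_normal(8)
h = W1 @ x
mask = (h > 0).astype(float)        # switch states: 1 = connect, 0 = disconnect
y = W2 @ (mask * h)                 # ordinary ReLU forward pass

# With the switches frozen, ReLU is a diagonal 0/1 matrix, so the net
# collapses to the single matrix M = W2 @ diag(mask) @ W1.
M = W2 @ np.diag(mask) @ W1
print(np.allclose(y, M @ x))        # True: same linear mapping
```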