CNNs: kernels and maxpooling

  • Published Sep 15, 2024
  • This video dives into how convolutional neural networks (CNNs) work and gives an intro to kernels, padding, strides, and maxpooling using the MNIST handwriting dataset.
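The operations the video covers can be sketched in a few lines of NumPy. This is a minimal, illustrative implementation (function names and the toy input are my own, not from the video): a convolution slides a kernel over the image and takes a dot product per window, and max pooling keeps only the largest value in each non-overlapping window.

```python
import numpy as np

def conv2d(image, kernel, stride=1, padding=0):
    """2D convolution (cross-correlation, as in most CNN libraries)."""
    if padding:
        image = np.pad(image, padding)  # zero-pad the border
    kh, kw = kernel.shape
    oh = (image.shape[0] - kh) // stride + 1
    ow = (image.shape[1] - kw) // stride + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            patch = image[i*stride:i*stride+kh, j*stride:j*stride+kw]
            out[i, j] = np.sum(patch * kernel)  # one dot product per window
    return out

def maxpool2d(x, size=2):
    """Non-overlapping max pooling: keep the largest value per window."""
    h, w = x.shape
    x = x[:h - h % size, :w - w % size]  # drop any ragged edge
    return x.reshape(h // size, size, w // size, size).max(axis=(1, 3))

# Toy example: a 6x6 "image" through a 3x3 kernel, then 2x2 pooling.
img = np.ones((6, 6))
feat = conv2d(img, np.ones((3, 3)))   # shape (4, 4)
pooled = maxpool2d(feat)              # shape (2, 2)
```

With `padding=1` the 6x6 input keeps its 6x6 shape after the 3x3 convolution, and `stride=2` halves each output dimension, which is how those two knobs trade spatial resolution for compute.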

COMMENTS • 1

  • @hoaxuan7074 · 3 years ago

    Discrete convolutions, weighted sums, and fast transforms like the FFT are dot products. Max pooling is switching. ReLU is a switch 🤔: f(x)=x is connect, f(x)=0 is disconnect. A light switch in your house is binary on/off, yet it connects or disconnects a continuously variable AC voltage signal. The dot product of a number of dot products is still a dot product. Once all the switch states in a ReLU net become known, the net collapses to a simple matrix: a linear mapping from the input vector to the output vector. There are a lot of metrics you can apply and further math that can be done.
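The commenter's collapse claim can be checked numerically. A minimal sketch with made-up random weights (a two-layer net standing in for any ReLU network): once an input fixes which ReLU "switches" are on, the whole network equals a single matrix applied to that input.

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(5, 4))   # hypothetical first-layer weights
W2 = rng.normal(size=(3, 5))   # hypothetical second-layer weights
x = rng.normal(size=4)

h = W1 @ x
mask = (h > 0).astype(float)   # ReLU switch states for this input: 1=connect, 0=disconnect
y = W2 @ (mask * h)            # ordinary forward pass with ReLU applied

# With the switch states known, the net collapses to one matrix M:
M = W2 @ (np.diag(mask) @ W1)
assert np.allclose(y, M @ x)   # same output as the full forward pass
```

The matrix `M` is only valid for inputs that produce the same switch pattern, which is why a ReLU net is piecewise linear rather than globally linear.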