Why Deep Representations? (C1W4L04)

  • Published 16 Jan 2025

COMMENTS • 13

  • @codingwithsam4992 • 1 year ago • +3

    Am I the only one hearing a high-frequency sound in the background?

  • @azunia4 • 6 years ago • +31

    The image he used is actually a picture of his wife, the self-driving-car AI entrepreneur Carol Reiley. How cute!

  • @arjungoud3450 • 2 years ago • +1

    Can we split the single network into multiple networks with an if condition? Does that make sense for distributed computing?

  • @supreethmv • 5 years ago • +7

    So Deep!!:P

  • @phytasea • 5 years ago • +3

    Guys, focus on the class, not on how lucky he is ^^ hahaha

  • @sandipansarkar9211 • 3 years ago

    Great explanation

  • @machinelearning3518 • 3 years ago

    very Deep

  • @RH-mk3rp • 2 years ago

    How do I arrive at the 2^n number of hidden units if only 1 hidden layer is allowed? I think the 2 comes from the fact that it's binary (true/false) and n is the number of input units, but why the power notation 2^n?

    • @RH-mk3rp • 2 years ago

      After some more thinking, I think 2^n is the number of possible outcomes. E.g., for 4 input units there are 2^4 = 16 possible combinations of true/false that the neural network must enumerate.
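      The intuition above matches the lecture's XOR/parity example: an exact single-hidden-layer construction with threshold units must in effect dedicate a unit to each of the 2^n input patterns, while a tree of pairwise XORs computes the same function with only n-1 gates in about log2(n) layers. A minimal sketch (the specific construction counts are assumptions for illustration, not the only possible ones):

      ```python
      from itertools import product

      def parity(bits):
          """n-bit XOR (parity): 1 if an odd number of inputs are 1."""
          return sum(bits) % 2

      n = 4
      patterns = list(product([0, 1], repeat=n))

      # Shallow view: the exact one-hidden-layer construction dedicates one
      # pattern-detector unit per input combination, so it grows like 2^n.
      shallow_hidden_units = len(patterns)  # 2^4 = 16 for n = 4

      def deep_xor(bits):
          """Deep view: a balanced tree of 2-input XOR gates.

          Returns (parity, number_of_layers); uses n-1 gates in ~log2(n) layers.
          """
          layer, depth = list(bits), 0
          while len(layer) > 1:
              paired = [a ^ b for a, b in zip(layer[0::2], layer[1::2])]
              layer = paired + (layer[-1:] if len(layer) % 2 else [])
              depth += 1
          return layer[0], depth

      # The deep tree agrees with parity on all 2^n inputs.
      assert all(deep_xor(p)[0] == parity(p) for p in patterns)

      print(shallow_hidden_units)    # 16 hidden units in the shallow construction
      print(deep_xor((1, 0, 1, 1)))  # (1, 2): parity 1, computed in 2 layers
      ```

      The point is the growth rates: the shallow layer scales exponentially in n, the deep tree only logarithmically in depth and linearly in gate count.
      
      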

  • @rp88imxoimxo27 • 4 years ago • +1

    Who disliked the video?