What is Cross Entropy - Data Science Terminologies - DataMites institute

  • Published 4 Mar 2020
  • #WhatisCrossEntropy
    #DataScience #Terminologies
    #MachineLearning
    Watch the video to understand what cross entropy is in machine learning.
    #DataMites is a leading training institute for data science and machine learning courses. Learn data science with machine learning, NLP, AI, deep learning, Tableau, statistics, maths, Python/R programming, and data mining.
    Work on live projects and get certified as a "data scientist".
    For more details about the course, visit: datamites.com/
    DataMites classroom training centers in India.
    Data Science course in Chennai: datamites.com/data-science-co...
    Data Science in Bangalore: datamites.com/data-science-co...
    Data Science in Pune: datamites.com/data-science-co...
    All the best.

COMMENTS • 19

  • @galabpokharel6833 4 years ago +1

    Clear as crystal

  • @zarena50 3 years ago +1

    Best explanation so far. Thank you!

  • @ramavinodanm8754 4 years ago +1

    Thanks, sir. Such a clear explanation with a basic example.

  • @Adinasa2 1 year ago +1

    Great

  • @sarukdon5 4 years ago +1

    Great Explanation

  • @pthube 4 years ago +1

    Thank you! The mathematical calculations helped me understand the concept.

  • @TheJonathanLugo 3 years ago

    Thank you, your video clearly helped me understand the term.

  • @sofluzik 3 years ago +1

    Simple and effective...

  • @oscarsal433 3 years ago +1

    Thank you, your material is awesome :)

  • @narendraparmar1631 5 months ago

    Thanks

  • @usama57926 2 years ago

    Nice explanation, but what is the difference between *entropy* and *cross-entropy*?

    • @DataMites 2 years ago +1

      Hi Usama Iftikhar Butt, entropy is a measure of randomness, while cross-entropy is a measure from information theory that builds upon entropy: it generally calculates the difference between two probability distributions.
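The distinction in the reply above can be made concrete with a minimal sketch (not from the video; the distributions `p` and `q` are invented for illustration): entropy measures the randomness of a single distribution, while cross-entropy measures the average coding cost when the true distribution is `p` but the model assumes `q`.

```python
import math

def entropy(p):
    # H(p) = -sum(p_i * log(p_i)): randomness of one distribution
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def cross_entropy(p, q):
    # H(p, q) = -sum(p_i * log(q_i)): cost of describing p using q
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.7, 0.2, 0.1]   # hypothetical true label distribution
q = [0.5, 0.3, 0.2]   # hypothetical model prediction

print(entropy(p))            # randomness of p alone
print(cross_entropy(p, q))   # always >= entropy(p); the gap is the KL divergence
print(cross_entropy(p, p))   # equals entropy(p) when the model matches the truth
```

When `q` matches `p` exactly, the cross-entropy reduces to the entropy of `p`; any mismatch adds an extra penalty, which is why cross-entropy is a natural loss function for classifiers.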