Fake News Detection Project with BERT Fine-tuning | Deep Learning for NLP | Project#11

  • Published Sep 8, 2022
  • Applying transfer learning to train a Fake News Detection Model with the pre-trained BERT (a minimal code sketch of this setup is included after the links below).
    This is a three-part transfer learning series, in which we have covered:
    - Basic intuition behind Transfer Learning, link: • Introduction to Transf...
    - Deep-dive into Google’s BERT Model for NLP: • BERT Model | Transfer ... , and finally,
    - Build a Fake News Detection Model using the pre-trained BERT Model: • Fake News Detection Pr...
    Happy learning :)
    🔥 Resources:
    Project Folder: drive.google.com/drive/folder...
    Dataset: www.kaggle.com/datasets/clmen...
    BERT Tokenizer brief: huggingface.co/docs/transform...
    🔥 Our other popular ML Projects:
    1. Sentiment Analysis Project using LSTM: • Sentiment Analysis wit...
    2. Sentiment Analysis Project (End-to-end) with ML Model Building + Deployment (using Flask):
    ---- a. Model Building: • Sentiment Analysis Mac... (Part-1)
    ---- b. Model Deployment: • PIP + Virtual Environm... (Part-2)
    3. Sentiment Analysis Project using Traditional ML: • Sentiment Analysis Pro...
    4. Analytics-enabled Marketing: • Marketing Analytics Pr...
    5. Credit Scoring Project: • Credit Scoring Project...
    6. Face Recognition Project: • Face Detection Machine...
    🔥 Do like, share & subscribe to our channel. Keep in touch:
    Discord Server: / discord
    Email: skillcate@gmail.com
    Whatsapp: +91-7404139793
    Medium: / skillcate
    GitHub: github.com/skillcate
    Facebook: / mlprojects
    Instagram: / skillcate
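
The sketch below illustrates the transfer-learning setup described above: load a pre-trained BERT encoder, freeze its weights, and train a small classification head for real vs. fake news. It is an illustrative assumption of the approach, not the notebook's exact code; the checkpoint name, layer sizes, and class name are example choices.

    import torch
    import torch.nn as nn
    from transformers import AutoModel, AutoTokenizer

    # Pre-trained BERT encoder + tokenizer (transfer learning: reuse its language knowledge)
    bert = AutoModel.from_pretrained("bert-base-uncased")
    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

    # Freeze the encoder so only the new classification head is trained
    for param in bert.parameters():
        param.requires_grad = False

    class FakeNewsClassifier(nn.Module):
        """Illustrative BERT + feed-forward head for real (0) vs. fake (1) news."""
        def __init__(self, bert):
            super().__init__()
            self.bert = bert
            self.head = nn.Sequential(
                nn.Linear(768, 128),  # 768 = hidden size of BERT-base
                nn.ReLU(),
                nn.Linear(128, 2),    # 2 output classes
            )

        def forward(self, input_ids, attention_mask):
            out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
            cls_vector = out.last_hidden_state[:, 0, :]  # [CLS] token representation
            return self.head(cls_vector)

    model = FakeNewsClassifier(bert)

    # Quick smoke test on a single news title
    batch = tokenizer(["Breaking: scientists discover a talking cat"],
                      padding=True, truncation=True, return_tensors="pt")
    print(model(batch["input_ids"], batch["attention_mask"]).shape)  # torch.Size([1, 2])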

COMMENTS • 36

  • @jadhavpruthviraj5788 • 1 year ago • +1

    Man, what a video. Loved it... The way you explain, and your patience, is outstanding. Keep on dropping videos.

  • @hemalshah1410 • 1 year ago

    Thank you for sharing such great content with great explanation. 🙌 Keep posting!

    • @skillcate • 1 year ago

      Glad that you find it useful!

  • @MarutiBhakth7147 • 1 year ago • +1

    Thank you so much sir... I literally fell in love with the BERT model. You really have great teaching skills; only an awesome teacher can teach ML in such an interesting, easy, and detailed way.... ❤🥰

    • @skillcate • 1 year ago

      Thanks for the lovely feedback Rakesh ❤️

  • @mryoso25 • 11 months ago

    Very insightful. Thank you so much for this! Hope we can have an NLP deep learning project for text classification.

  • @MuhammadAbdullah-gx2ou • 6 months ago

    Thanks for this wonderful project. Please clarify one point about transfer learning: have we used it here? If yes, then where and how? Thanks.

  • @AfeezeeHQ • 5 months ago

    Thank you for this video. It's very enlightening. However, I have a question. This is a fake news detector based solely on linguistic cues. The research I'm working on goes beyond predicting from linguistic cues: source credibility analysis and contextual analysis are involved too. How can I develop a fake news prediction model that will effectively detect whether news is fake or real based on an extensive dataset such as the Twitter Truth Seeker dataset?

  • @user-qf6gr6hx5h • 11 months ago

    I tried running this notebook and hit a bunch of errors on the sklearn imports. I then tried running it with an older version of sklearn and still hit errors. Any help? I am fairly new to coding.

  • @sadik3611 • 1 year ago • +1

    Hi sir, your videos are just awesome. I used this code for sentiment analysis and for detecting spam email.

    • @skillcate • 1 year ago

      Great 👍 Glad that you find it useful.

  • @edilgin622 • 1 year ago

    I followed the notebook and trained my model for 30 epochs, but in every epoch the training loss was 0.001 and the test loss was 0.003. Any reason why that might be the case?

  • @hooman3563 • 1 year ago • +3

    Thanks for the video! How can we save the fine-tuned model so that we can run it next time without going through fine-tuning the model again?

    • @skillcate • 1 year ago • +3

      Hey Hooman 😊, I'm sure you are a dog lover!!
      Within the Fake News Detection code file, under the 'Model Training' section, the # Train & Predict code script has this line:
      torch.save(model.state_dict(), 'c2_new_model_weights.pt')
      With this line of code, the model weights get stored in your working directory. For loading these saved weights, you may use (also shown in the tutorial at 15:24):
      # load weights of best model
      path = 'c2_new_model_weights.pt'
      model.load_state_dict(torch.load(path))
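
A self-contained sketch of this save/load pattern, for reference; it assumes a Hugging Face AutoModelForSequenceClassification as a stand-in for the BERT-based model built in the notebook, so the class and file names are illustrative:

    import torch
    from transformers import AutoModelForSequenceClassification

    # Illustrative stand-in for the notebook's BERT-based classifier (2 labels: real / fake)
    model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

    # ... fine-tune the model here ...

    # Save only the learned weights (the state_dict), not the whole Python object
    torch.save(model.state_dict(), "c2_new_model_weights.pt")

    # Later, or in a fresh session: rebuild the same architecture, then load the saved weights
    model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)
    model.load_state_dict(torch.load("c2_new_model_weights.pt", map_location="cpu"))
    model.eval()  # switch to inference mode before predicting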

  • @mayanklohani19 • 9 months ago

    What will change if we want to use the article text instead of the title?

  • @user-gb1jq8ep1s • 3 months ago

    Sir, I am unable to see the dataset from the link you provided; it is showing 'no results found'.

  • @venkatdatta.g9105 • 3 months ago

    Can you let me know how to save the model?

  • @danielihenacho • 1 year ago • +1

    Is it possible to use Hugging Face Transformers for a project similar to this?

    • @skillcate • 1 year ago • +1

      Hey Daniel, how are you mate!
      Yes, sure you can. Not just similar ones, you can actually use it for "almost" any problem you may have, be it NLP / CV / Audio. Here's their documentation: huggingface.co/docs/transformers/index
      For similar problems within NLP, you may do Sentiment Analysis, Movie/Book Recommendation, Text Summarization, Topic Modeling, etc.
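
As a quick illustration of the point above, here is a minimal sketch using the Transformers pipeline API for text classification; the checkpoint named below is a publicly available sentiment model used only as an example, not the model built in this video:

    from transformers import pipeline

    # Load a ready-made text-classification pipeline.
    # The checkpoint is an example sentiment model; swap in any fine-tuned classifier.
    classifier = pipeline("text-classification",
                          model="distilbert-base-uncased-finetuned-sst-2-english")

    result = classifier("This tutorial finally made BERT click for me!")
    print(result)  # e.g. [{'label': 'POSITIVE', 'score': 0.99...}]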

    • @danielihenacho • 1 year ago

      @skillcate Thanks for the information. May I ask if a TensorFlow version of this episode is available?

  • @harissaeed5811 • 1 year ago • +1

    Hi sir, I need your help. I have only 10 days left for my FYP submission. Can you please help me with this? I don't know how to implement BERT on a Roman Urdu dataset, and otherwise my FYP will be rejected. Can you please make a video on it?

    • @skillcate • 1 year ago

      Let's keep in touch on this..

  • @maleeshamendis5924 • 6 months ago

    Hi Sir. I used the same code and ran it, but even with epochs = 2 the code takes a long time to run and never seems to finish. I even tried increasing the number of epochs. No progress. How can I resolve it? :(

    • @skillcate • 6 months ago

      Where are you running the code? It is likely that you are training on a CPU machine, where it takes much longer than on a GPU.
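
For anyone running into this, here is a minimal sketch of the usual PyTorch device check (assuming a standard PyTorch setup, e.g. a Colab notebook with a GPU runtime enabled):

    import torch

    # Use the GPU if one is available, otherwise fall back to the CPU
    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
    print("Training on:", device)

    # Move the model and every batch of tensors to the same device, e.g.:
    # model = model.to(device)
    # input_ids, attention_mask, labels = [t.to(device) for t in batch]

In Colab, the GPU also has to be enabled via Runtime → Change runtime type before the check above will report cuda.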

  • @Siriiiiii • 5 months ago

    Also do a project on fake reviews.

  • @rashdakhanzada8058 • 1 year ago

    How can I work with a Roman Urdu dataset using BERT?

    • @skillcate • 1 year ago • +1

      Dear Rashda, we'll soon be doing a video on this problem statement. Allow us a week's time.

    • @rashdakhanzada8058 • 1 year ago

      @skillcate Sure, that would be great. Another thing I want to know: Colab offers a 12-hour runtime per session. Does that mean that after 12 hours we cannot run the same notebook for free?

    • @skillcate • 1 year ago • +1

      @rashdakhanzada8058 The 12-hour runtime per session in Colab means you can use the free version of Colab for up to 12 consecutive hours. Once the time is up, you can start a new session and continue your work. It's like driving a rental car for a specific duration and then getting a new car if you need more time.

    • @rashdakhanzada8058 • 1 year ago

      @skillcate So now I have to create a new notebook, insert the same code into it, and work on it for 12 more hours? And do this again and again...

    • @skillcate • 1 year ago • +1

      @rashdakhanzada8058 Not really. The notebook remains as is. Just think of it like this: you're training a very heavy deep learning model whose training runs for, let's say, 24 hours. With the free Colab account you have a runtime limit of 12 hours, so you won't be able to train such heavy deep learning models.
      But for beginner-level data science work you will face absolutely no problem, as most of your training would take minutes to a couple of hours. 🤗

  • @basmahhyder5695 • 1 year ago • +1

    I implemented this code for fake review detection and got an accuracy of 57%, which was quite unexpected.. :(

    • @skillcate • 1 year ago

      Dear Basmah, how many epochs did you train your model for? I think you might have done 2 epochs. If so, you should do 2-4 more epochs for your results to get better.

    • @basmahhyder5695 • 1 year ago

      @skillcate Yes, I changed that, and with epochs = 10 I got an accuracy of 71%. That's the maximum accuracy I could get.