Getting Started With Hugging Face in 15 Minutes | Transformers, Pipeline, Tokenizer, Models
- Published 11 Jun 2024
- Learn how to get started with Hugging Face and the Transformers Library in 15 minutes! Learn all about Pipelines, Models, Tokenizers, PyTorch & TensorFlow integration, and more!
Get your Free Token for AssemblyAI Speech-To-Text API 👇www.assemblyai.com/?...
Hugging Face Tutorial
Hugging Face Crash Course
Sentiment Analysis, Text Generation, Text Classification
Resources:
Website: huggingface.co
Course: huggingface.co/course
Finetune: huggingface.co/docs/transform...
▬▬▬▬▬▬▬▬▬▬▬▬ CONNECT ▬▬▬▬▬▬▬▬▬▬▬▬
🖥️ Website: www.assemblyai.com
🐦 Twitter: / assemblyai
🦾 Discord: / discord
▶️ Subscribe: ua-cam.com/users/AssemblyAI?...
🔥 We're hiring! Check our open roles: www.assemblyai.com/careers
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
Timestamps:
00:00 Intro
00:40 Installation
01:02 Pipeline
04:37 Tokenizer & Model
08:32 PyTorch / TensorFlow
11:07 Save / Load
11:35 Model Hub
13:25 Finetune
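As a taste of what the Pipeline section covers, here is a minimal sketch (assuming `transformers` and a backend such as PyTorch are installed; with no model argument, the default sentiment checkpoint is downloaded on first run):

```python
from transformers import pipeline

# With no model argument, pipeline() falls back to a default
# sentiment-analysis model, downloaded locally on first use.
classifier = pipeline("sentiment-analysis")

result = classifier("Getting started with Hugging Face was easier than I expected!")
print(result)  # a list like [{'label': ..., 'score': ...}]
```

The same one-liner pattern works for the other tasks shown in the video (text generation, classification, and so on) by changing the task string.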
HuggingFace Tutorial
HuggingFace Crash Course
#MachineLearning #DeepLearning #HuggingFace
I love the fact you're smiling like a genuinely happy individual. Good vibes. We need more of that in the world.
That is because you can't see the gun behind him....
@@10xFrontend Cuz he is using Happying Face,,,
Thank you for this introduction! The Finetuning part at the end was a little fast, so I'll be checking out your other videos to learn more. Thanks again.
I've been trying to figure this out for days and in the first couple minutes of the video you helped me a lot.
Thank you for this quick tutorial! It's exactly what I was looking for :)
This is a very helpful introduction! Thank you so much for putting it up!
Glad it was helpful!
Great and concise explanation! Thank you so much!
Awesome summary / introduction. Thank you so much!
I like your clear voice, easy explanation and smile.😄
He covered every detail a beginner needs, and it's great.
I can't figure out by myself how to save the chat history for a conversation.
Good sir, You have genuinely solved the question that was bugging me for the past two months. I have had no trouble training models using the Huggingface Trainer API but never knew what to do afterwards and the course is extremely ambiguous about this. I am not a very good student generally. You have saved me
This is awesome. Thanks for the excellent walk through transformers library
Very helpful tutorial! Many thanks for it!
By the way, you speak very clear English and that is extremely helpful for me as a 'non-native speaker' 😊
Thanks for making this task more inviting. Now I definitely want to know the details of how to do the fine-tuning over custom content.
Simple and straight to the point!
What an awesome introduction! Thanks a lot mate! 💪
Thank you so much for putting together the important things in an easy way. Waiting for more such videos from you.
Thank you so much for the tutorial. It was very helpful. The explanation was precise and concise!
Amazing explanation. Thanks for sharing your knowledge!
Exactly what I was looking for. Thank you for sharing!
It's the best tutorial I've ever seen. Thanks!
Excellent tutorial. Cannot believe only 3.7k likes for this. I recommend typing the code out in vscode or other and running in parallel while watching the video.
Thanks for clearly explained tutorial !
Thank you very much for the very informative video and the great examples.
This is a superb tutorial. Many thanks.
Great lecture. Thank you for sharing your knowledge.
Thanks! Clear and informative
Nice summary! Thanks!
Awesome tutorial, to the point 🔥
Thx a lot man!!!!
complete and short. awesome.
Awesome video, thanks so much!
Totally appreciate this, thank you - had to install tensorflow==2.13.0 and keras==2.13.1 - worked great
This is amazing! Thanks :)
Thanks for the tutorial!
In only 1 year we came from 34K models available to almost 300K models, almost 10x increase! 😮
thanks for using VSC, it feels much better to me
really useful intro, thank you!
Great explanation, danke!
This is fantastic I hope you can make more videos covering other pipelines for HuggingFace transformers package.. Many thanks.. 😀
I love this video, very useful!
Dude I so love you❤ for making such useful videos😊😊😊
love ur vids man
This is a brilliant video!!!
Great stuff, thank you.
Very kool. It's on my watch list
Just found you. So grateful
great video! Thank you
eyebrows are talking in their own language
😅
You mean German?
Actually the eyebrows are communicating in Morse code, and the message is "I'm being held hostage, help me escape!"
Hahaha I love YouTube comments
yeah, what the hell is going on here
this is way better than what the AWS guys do ...
Great video!
you are eye blinking that's awesome 😊😊
outstanding, Thank You
thank you so much
Great video! Do you know how I would be able to view a classification report and confusion matrix of a sentiment analysis pipeline?
Still the best intro to HF in my opinion!
PS: would love to know which theme is used in VS Code, it's great.
I think it is Night Owl
Very helpful, thanks!
Glad it was helpful!
Just found you. So happy rn LOL
I'm watching this video in October 2023 and it's so cool to see that now there are 376,348 models
THANK YOU!!!
brilliant!!!
Hey! Thanks for the great tutorial. How can you make the AI remember the full conversation, so it remembers the outputs it gave? For example, it says my name is Andrew, but then will say that its name is Anna when asked again.
Sorry if it's a dumb question :D
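One common way to get that behavior (a sketch, not code from the video): keep the whole conversation in a list and rebuild the prompt every turn, so the model sees its earlier answers. The helper below is plain Python; a real `text-generation` pipeline call would go where the comment indicates, and the `User:`/`Bot:` labels are just an illustrative convention.

```python
# Keep the full conversation and rebuild the prompt each turn,
# so the model always sees what it (and the user) said before.
history = []  # list of (speaker, text) tuples

def build_prompt(history, user_msg):
    lines = [f"{speaker}: {text}" for speaker, text in history]
    lines.append(f"User: {user_msg}")
    lines.append("Bot:")  # the model completes from here
    return "\n".join(lines)

history.append(("User", "My name is Andrew."))
history.append(("Bot", "Nice to meet you, Andrew!"))
prompt = build_prompt(history, "What is my name?")
# Feed `prompt` to a generator so earlier turns stay in context:
#   generator = pipeline("text-generation")
#   reply = generator(prompt)[0]["generated_text"]
# ...then append the reply to `history` before the next turn.
print(prompt)
```

Small models still forget or contradict themselves, but without re-sending the history they have no chance at all of remembering.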
Great intro video!
The best explanation
Glad you think so!
great video! thanks!
Bro gives off good vibes
muchas gracias!!!!
3:01 I understood up to this point, the first example 🤗
Gracias
Too good 👍
I enjoyed the video but perhaps need a "100-level" one with specific use-cases and how you would plan out your development and configuration to get after that use-case. :)
Ur best teacher
Thank you :)
I'm running Windows on my PC. Is there any problem downloading Linux and running the other code you described?
Two questions: I get a warning about "right padding" and don't know how to fix it.
Also, I was trying to use the conversational pipelines, but why are they such trash? 50% of the time they just repeat themselves, and I can't do anything about it.
Send help.
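For what it's worth, the "right padding" warning with decoder-only (generation) models is usually silenced by switching the tokenizer to left padding and giving it a pad token. A sketch, using `gpt2` as a stand-in for whatever checkpoint is actually in use:

```python
from transformers import AutoTokenizer

# gpt2 stands in for any decoder-only checkpoint here.
tokenizer = AutoTokenizer.from_pretrained("gpt2")

# GPT-2 ships without a pad token; reuse EOS and pad on the left
# so generation continues from real tokens rather than padding.
tokenizer.pad_token = tokenizer.eos_token
tokenizer.padding_side = "left"

batch = tokenizer(["Hello", "A longer example sentence"], padding=True)
print(batch["input_ids"])  # shorter sequence is padded on the left
```

The repetition problem is a separate issue; sampling options on `generate()` such as `do_sample=True` or `repetition_penalty` tend to help.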
Do you know how to use the BPE tokenizer and have it keep the spaces? I've tried every combination of parameters imaginable and my transformer always outputs a string with no spaces.
Great. 🎉
For other models, I sometimes get something like the below:
pytorch_model-00001-of-00002.bin
pytorch_model-00002-of-00002.bin
How do I convert them into safetensors format so they can be loaded into the web UI?
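One way this conversion is commonly done (a sketch, not from the video): load the sharded `.bin` weights once, then re-save them, since recent `transformers` versions write `model.safetensors` when `safe_serialization=True`. The tiny public checkpoint below is only a stand-in for the local directory holding your shards.

```python
from transformers import AutoModelForCausalLM

# "sshleifer/tiny-gpt2" is only a tiny stand-in; in practice, point
# from_pretrained at the local directory containing the .bin shards.
model = AutoModelForCausalLM.from_pretrained("sshleifer/tiny-gpt2")

# Re-saving with safe_serialization=True writes model.safetensors
# (sharded if large) instead of pytorch_model*.bin.
model.save_pretrained("converted_model", safe_serialization=True)
```

This assumes the shards load as a causal LM; use the matching `AutoModel*` class for other architectures.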
Does the finetuning part still work or do I need to make any changes ?
If you do it the way it's shown in the video, you will need to install tf_keras as well.
Hi, I want to use a tokenizer along with some other AI models (not decided yet). I have a set of text files (source code). They are already classified for multiple attributes; for example, file A has x and y true, file B has x true and y false. I have two questions about that. 1. I want to tokenize the file contents, but in my case groups of words (such as groups of 3 or 4 tokens) make sense rather than single words, so I want to tokenize in a way that groups those sets of words, if that is possible. 2. Each file has a different length, and thus a different number of tokens. How can I use these sets of merged tokens for prediction? I mean, what is the best AI technique to use afterwards to build a model for prediction?
How do you get VS Code to show the completions (in a python virtual environment)?
Is there a pipeline for translating a text into a dense vector representation?
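Regarding the dense-vector question above: `transformers` does ship a `"feature-extraction"` pipeline task that returns the model's hidden states, one vector per token. A sketch, with `distilbert-base-uncased` picked arbitrarily as a small encoder (768-dimensional hidden states):

```python
from transformers import pipeline

# "feature-extraction" returns raw hidden states: a nested list
# with one 768-dim vector per input token for DistilBERT-base.
extractor = pipeline("feature-extraction", model="distilbert-base-uncased")

vectors = extractor("Hugging Face is great")
print(len(vectors[0]), len(vectors[0][0]))  # token count, hidden size
```

For a single sentence-level vector, the per-token vectors are typically pooled (e.g. averaged); dedicated sentence-embedding models usually give better vectors than a plain encoder, though.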
What a sweet guy. Amazing work
do you have a repo for the code shown in the video?
How to do text generation without the pipeline? I'm not sure why everyone everywhere is doing sentence classification.
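For the question above, a rough sketch of text generation without `pipeline`, calling the tokenizer and model directly (assumes PyTorch; `sshleifer/tiny-gpt2` is just a tiny stand-in checkpoint — swap in `gpt2` or any causal LM for output that actually reads well):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Tiny stand-in checkpoint; use "gpt2" or larger for real output.
tokenizer = AutoTokenizer.from_pretrained("sshleifer/tiny-gpt2")
model = AutoModelForCausalLM.from_pretrained("sshleifer/tiny-gpt2")

# Encode a prompt, let the model extend it, decode back to text.
inputs = tokenizer("Hugging Face makes it easy to", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=20,
                            pad_token_id=tokenizer.eos_token_id)
text = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(text)
```

This is the same encode → generate → decode loop the pipeline wraps; working at this level exposes sampling knobs like `do_sample`, `temperature`, and `top_k` directly on `generate()`.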
your smile is so cute😍😚
I like your smile
How come I didn't see any labeled data used for training or fine-tuning in your examples?
Man, such a chirpy guy. I loved your energy. Thanks for the video.
Can you please help explain what to do if you have a Mac OS and your Jupyter Notebook keeps crashing when you try to import pipeline from transformers? The kernel keeps dying.
What is the best natural-language model for answering multiple-choice questions in mathematics? ChatGPT is not so good at this..
what app did you switch to one minute in?
Nice
It throws an SSL certificate error when I run the commands in Python. Please help!
What software are you using? I love the interface, it's so streamlined; I'm using Jupyter notebooks.
It's OK, I found it: Visual Studio. Thanks 🙂
Can I use these transformers for commercial use?
This should be on hugging face
Thanks for this video. But around the 11:32 timestamp, why do we save the tokenizer as well? Isn't it a static module that we can simply import again?
Tokenizers can be rather complicated; in many cases they're learned/trained, so you'd want to save them after fine-tuning (and just to have them offline, I reckon).
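To make the exchange above concrete, a save/load round trip looks roughly like this (a sketch; `saved_tokenizer` is just an example directory, and the DistilBERT tokenizer stands in for any fine-tuned one):

```python
from transformers import AutoTokenizer

# The tokenizer carries learned state (vocab, merges, special
# tokens), so it is saved next to the model rather than re-imported.
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
tokenizer.save_pretrained("saved_tokenizer")

# Later (or fully offline), reload it from disk alone:
reloaded = AutoTokenizer.from_pretrained("saved_tokenizer")
print(reloaded.tokenize("Hello Hugging Face"))
```

`model.save_pretrained(...)` to the same directory keeps the pair together, which is what the Model Hub expects when you push a checkpoint.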
A tutorial for people who already know Transformers, Pipeline, Tokenizer and Models :)
Do you run this in Visual Studio?
It says that no model was supplied and it's using a default model. Did it actually download that default model, or did it use an API to connect to it remotely? If it's the latter, is that free of charge?
It took me an entire day to read the docs and implement this, while I could have learned the same in 15 minutes.
NOICE!