Thanks bro, that is what I was looking for.
Very informative, thanks bro.
I wouldn't suggest Colab, now that I've looked into it more. It isn't just me who can't train my A.I. on it; many people around the world can't either.
What would you recommend to a student who is just starting out in deep learning 😅?
I bought Colab Pro and I got the SAME OOM error. I really felt it was too expensive and not worth the price.
Thinking the same too, I can't lie.
I suggest you set limits on CPU and GPU usage when using TensorFlow. While you can control TensorFlow's use of OpenMP threads, doing so may result in a trade-off with processing time.
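To make the suggestion concrete, here is a minimal sketch of capping TensorFlow's CPU threads and GPU memory appetite; the thread count of 2 and the single visible GPU are arbitrary example values, not recommendations:

```python
import os

# These environment variables must be set before TensorFlow is imported.
os.environ["OMP_NUM_THREADS"] = "2"        # cap OpenMP worker threads (CPU)
os.environ["CUDA_VISIBLE_DEVICES"] = "0"   # expose only the first GPU

import tensorflow as tf  # noqa: E402

# Cap TensorFlow's own CPU thread pools (must run before any op executes).
tf.config.threading.set_intra_op_parallelism_threads(2)
tf.config.threading.set_inter_op_parallelism_threads(2)

# Grow GPU memory on demand instead of grabbing it all up front.
for gpu in tf.config.list_physical_devices("GPU"):
    tf.config.experimental.set_memory_growth(gpu, True)
```

Fewer threads generally means slower data preprocessing, which is the processing-time trade-off mentioned above.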
Why would I set limits on a system already failing even at maximum capacity?
@stars_ai Sometimes you need to reduce resource usage during model training, because TensorFlow tries to use the maximum memory available for fast training. When training on a 3-4 TB dataset, it will definitely hit an OOM (Out of Memory) error, because it tries to load all the data into memory and copy it to the GPU, which leads to a crash. I also suggest using the garbage collector to remove unnecessary variables and release memory, alongside OpenMP limits so that data is loaded according to the given conditions.
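The core idea of not loading a multi-terabyte dataset at once can be sketched with the standard library alone: stream records in fixed-size batches from a generator so only one batch is resident, and nudge the garbage collector between batches. The function name and sizes here are illustrative assumptions:

```python
import gc

def batch_stream(records, batch_size=4):
    """Yield fixed-size batches lazily so only one batch sits in memory."""
    batch = []
    for rec in records:
        batch.append(rec)
        if len(batch) == batch_size:
            yield batch
            batch = []
            gc.collect()  # reclaim any cyclic garbage left by the last batch
    if batch:  # flush the final, possibly short, batch
        yield batch

# Simulate a large dataset with a generator: it is never materialized in full.
big = (i for i in range(10))
batches = list(batch_stream(big, batch_size=4))
# batches -> [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9]]
```

In real training you would iterate over `batch_stream(...)` directly and feed each batch to the model; the `list()` call here only exists to show the batch boundaries.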
Alright, I'll try @abdumoez2077
What would be the best option for training LLMs? A PC with excellent hardware or Colab+?
Neither. A PC with OP stats sounds better, since Colab+ is not great, but then you would need a lot of money for that type of PC (buying multiple A100s, or T4s, or whatnot) - if you're training an LLM, that is. A smaller model may work fine, but an LLM is, by definition, a Large Language Model.
TL;DR: I'd recommend Kaggle (free). I've gotten better results with it than with any other service.
Thank you! I'm not gonna waste my money and get disappointed.
Brother, how many GB of GPU memory do you get in the Pro plan?
Thanks bro, I'm stuck in the same situation. I am currently using Colab Pro for training transformers and LLMs, and I have exhausted all the compute limits. I don't know what the best long-term solution is. Are cloud services worth it, or is there anything else I can do?
I have been recommended Kaggle Kernels. They say it gives you around 30 free hours of TPU (or dual-GPU) usage per week, or something like that. I have tried to load it, in vain so far. But try it: if it works, perfect. I'm still trying to get it running myself too.
@stars_ai thanks bro
Thanks for the information. What is your opinion of Microsoft Azure?
Another thing: do you think Copilot Studio is worth $200?
I honestly don't know about Azure.
Research more about it. Look for reviews on YouTube and everywhere else.
Don't trust it so easily, it is a lot of money after all.
Thank you, I won't waste my money on it then.
Yes, please don't. I did, and it was pretty bad.
I bought Colab Pro+ to run processes in the background as advertised, and it was pretty useless.
Honestly, I agree. As I said near the end of the video, my A.I. system failed on their best A100 system. I thought it was only me having these issues, but it turns out that Colab is just a bad service.
nice
What are you training exactly? An LLM?
Specifically what am I training? I'm training a language model, though it's not that large, since I can't get such a huge dataset to work with the limited computational resources that I have
Colab pro 9.9us$ increase disk size??
Form a coherent sentence first
I can't use dynamic parallelism on the GPU in Colab free! Is it possible in Colab Pro?
You want parallel processing? That means multiple GPUs, and Colab unfortunately only offers one. I would not recommend Google Colab at all. Go for Kaggle, honestly.
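If you do get access to a multi-GPU machine elsewhere, multi-GPU data parallelism in TensorFlow can be sketched with `tf.distribute.MirroredStrategy`. This is a minimal illustration, not tied to any particular service; the tiny Dense model is a placeholder, and on a single-GPU or CPU-only machine the strategy simply runs with one replica:

```python
import tensorflow as tf

# MirroredStrategy replicates the model across all visible GPUs and
# splits each training batch between them.
strategy = tf.distribute.MirroredStrategy()
print("replicas in sync:", strategy.num_replicas_in_sync)

with strategy.scope():
    # Any model and optimizer built inside the scope are mirrored.
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(4,)),
        tf.keras.layers.Dense(8),
    ])
    model.compile(optimizer="adam", loss="mse")
```

A subsequent `model.fit(...)` call would then distribute each batch across the replicas automatically.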
Your English is good, bro.