Up to 150x GPU Pandas Speedup with No Code Changes
- Published Nov 7, 2023
- In this video I introduce the latest version of cuDF, which delivers up to 150x speedups for Pandas workloads on a GPU, with no code changes.
You can now do this directly in Colab, without any additional install: developer.nvidia.com/blog/rapids-cudf-instantly-accelerates-pandas-up-to-50x-on-google-colab/
NVIDIA Demo: colab.research.google.com/git...
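The "no code changes" claim means the pandas API itself stays untouched: you load the `cudf.pandas` accelerator first, then write ordinary pandas. A minimal sketch (assuming a recent cuDF release with the `cudf.pandas` proxy layer, as in the Colab demo; the DataFrame contents are just an illustration):

```python
# Ordinary pandas code -- it runs unchanged whether or not the
# cudf.pandas accelerator is active.  In a Colab/Jupyter notebook
# you would first run `%load_ext cudf.pandas`; from a script,
# launch with `python -m cudf.pandas script.py` instead.
import pandas as pd

df = pd.DataFrame({"key": ["a", "b", "a", "b"],
                   "val": [1, 2, 3, 4]})

# With the accelerator loaded, this groupby executes on the GPU,
# transparently falling back to CPU pandas for unsupported ops.
result = df.groupby("key")["val"].sum()
print(result)
```

The key design point is that `cudf.pandas` proxies the `pandas` module itself, so existing scripts and notebooks need no `import cudf` or API changes.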
~~~~~~~~~~~~~~~ LINKS ~~~~~~~~~~~~~~~
Colab Notebook Link: nvda.ws/3ZPmfMy
AI & Data Science Virtual Summit: nvda.ws/46Kscwq
~~~~~~~~~~~~~~~ CONNECT ~~~~~~~~~~~~~~~
🖥️ Website: www.heatonresearch.com/
🐦 Twitter - / jeffheaton
😸🐙 GitHub - github.com/jeffheaton
📸 Instagram - / jeffheatondotcom
🦾 Discord: / discord
▶️ Subscribe: ua-cam.com/users/heatonresea...
~~~~~~~~~~~~~~ SUPPORT ME 🙏~~~~~~~~~~~~~~
🅿 Patreon - / jeffheaton
🙏 Other Ways to Support (some free) - www.heatonresearch.com/suppor...
~~~~~~~~~~~~~~~~~~~~~~~~~~~~
#pandas #cudf #cuda #nvidia #rapids #machinelearning - Science & Technology
Just the technique I’m looking for! My cpu is saved 😂😂😂
Quick note, at 7:15, the final bullet point can be ignored. Everything is installed directly from conda/pip.
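For local installs (outside Colab), the RAPIDS install guide offers pip and conda paths. A hedged sketch, assuming a CUDA 12 driver is already present; package names follow the RAPIDS install selector and may differ for your CUDA version:

```shell
# pip route: cuDF wheels are served from NVIDIA's package index
pip install cudf-cu12 --extra-index-url=https://pypi.nvidia.com
```

Use the RAPIDS release selector to generate the exact command for your CUDA version, Python version, and OS.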
Just wanted to chime in, I really enjoy your work on this channel! Purely out of curiosity though, any thoughts on eventually making videos about RAG or LLM tuning, or are you not interested?
Much more to come here; for now, this is my playlist on LLMs. ua-cam.com/video/miTpIDR7k6c/v-deo.html&ab_channel=JeffHeaton
It would be nice to see how this works during network training. I mean, when the dataset is too big to load and preprocess in memory (or saving a preprocessed copy of the dataset is also out of the question), we need to take it part by part (batches). But during training the GPU is busy with the model, so I wonder how that access to the GPU will look.
I recently started learning cuDF. A friend gifted me a Tesla P4 8GB for my learning lab. Will the P4 do the job? I'm new to GPUs.
This is stupid hard to install/get working on a standard pip Windows machine... too many errors.
Nvidia has shit drivers for Linux, yet writes code that needs to run on Linux. I know, let's run it on Linux inside Windows. What a joke.