Apache Spark - JDBC Source and Sink | Spark Tutorial | Part 9
- Published Feb 5, 2025
- This video demonstrates how to use the JDBC data source to connect to a MySQL database.
JDBC acts as both a data source and a sink for Apache Spark.
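For reference, a minimal sketch of the read/write pattern the video walks through. The connection URL, database name, table names, and credentials below are placeholder assumptions, not the values used in the video, and the MySQL Connector/J jar must be on the Spark classpath:

```scala
import org.apache.spark.sql.{SaveMode, SparkSession}

object JdbcSourceSinkExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("JdbcSourceSink")
      .master("local[*]")
      .getOrCreate()

    // JDBC as a source: read a MySQL table into a DataFrame.
    val employeesDf = spark.read
      .format("jdbc")
      .option("url", "jdbc:mysql://localhost:3306/testdb")
      .option("driver", "com.mysql.cj.jdbc.Driver")
      .option("dbtable", "employees")
      .option("user", "root")
      .option("password", "password")
      .load()

    // JDBC as a sink: write the DataFrame to another MySQL table.
    employeesDf.write
      .format("jdbc")
      .option("url", "jdbc:mysql://localhost:3306/testdb")
      .option("driver", "com.mysql.cj.jdbc.Driver")
      .option("dbtable", "employees_copy")
      .option("user", "root")
      .option("password", "password")
      .mode(SaveMode.Append)
      .save()

    spark.stop()
  }
}
```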
🙏🏻SUPPORT THE CHANNEL🙏🏻
Buy me a coffee: ko-fi.com/bigt...
Subscribe: bit.ly/2A2h6sJ
Facebook: / bigtechtalk
Telegram: t.me/bigtechtalk
#spark #jdbc #bigtech
This is really helpful, thanks
You're welcome!
Very good bro
thanks
Hi, are nulls allowed when loading, and is a bulk load possible? Please suggest.
Good 😌
Thanks 😅
I loaded data from MySQL into a DataFrame using spark.read.format, then did some transformations and created a final DataFrame to load back into the same MySQL table. I truncated the table and used overwrite with write.mode(), but I am not able to see the data in MySQL. If I load into a different table it works, but not the same table. Please provide your suggestions.
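One likely cause (not covered in the video): the JDBC read is lazy, so when the same table is both source and sink, the overwrite truncates the table before Spark actually fetches the rows, leaving nothing to write. A common workaround is to materialize the transformed DataFrame first; the sketch below uses a hypothetical helper with placeholder connection details:

```scala
import org.apache.spark.sql.{DataFrame, SaveMode, SparkSession}
import org.apache.spark.storage.StorageLevel

object SafeOverwrite {
  // Hypothetical helper: read a MySQL table, transform it, and overwrite the same table.
  def overwriteSameTable(spark: SparkSession, url: String, table: String,
                         user: String, password: String)
                        (transform: DataFrame => DataFrame): Unit = {
    val source = spark.read
      .format("jdbc")
      .option("url", url)
      .option("driver", "com.mysql.cj.jdbc.Driver")
      .option("dbtable", table)
      .option("user", user)
      .option("password", password)
      .load()

    // Materialize the transformed rows before touching the target table;
    // otherwise the lazy JDBC read would only run after the overwrite has
    // already truncated the table it reads from.
    val result = transform(source).persist(StorageLevel.MEMORY_AND_DISK)
    result.count() // force evaluation

    result.write
      .format("jdbc")
      .option("url", url)
      .option("driver", "com.mysql.cj.jdbc.Driver")
      .option("dbtable", table)
      .option("user", user)
      .option("password", password)
      .option("truncate", "true") // keep the table definition, just empty it
      .mode(SaveMode.Overwrite)
      .save()

    result.unpersist()
  }
}
```

Note that persist plus count() is a best-effort fix; if cached partitions are evicted, Spark may re-read from the now-empty table, so writing to a staging table and swapping it in on the database side is the more robust option.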
Nice. Congrats!
What's the name of the IDE that you're using?
Thanks, Tiago.
The IDE used in the video is Eclipse. You can also use Scala IDE, which is built on top of Eclipse.
Python or Scala for big data, or should I learn both?
Hi M. Adel
If you are interested in Data Science/Machine Learning/AI, then I would say it's better to learn Python. But if you are only interested in big data, then either language (Scala/Python) will work.
Hi, I want to connect to an Oracle DB using Python and Spark in PyCharm. Can you help me?
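The video's examples are in Scala, but the same JDBC options work for Oracle, and the identical option keys can be passed to spark.read.format("jdbc") in PySpark. A minimal sketch for the spark-shell, where the host, port, service name (XEPDB1), table, and credentials are placeholder assumptions and the Oracle JDBC driver (ojdbc) jar must be on the classpath:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("OracleJdbcRead")
  .master("local[*]")
  .getOrCreate()

// Read an Oracle table into a DataFrame over JDBC.
val oracleDf = spark.read
  .format("jdbc")
  .option("url", "jdbc:oracle:thin:@//localhost:1521/XEPDB1")
  .option("driver", "oracle.jdbc.OracleDriver")
  .option("dbtable", "EMPLOYEES")
  .option("user", "scott")
  .option("password", "tiger")
  .load()

oracleDf.show()
```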
Please provide this example in Java as well.
@Big Tech Talk
How easily you wrote the whole code! It's going to take me ages to get there.
Thanks, but it took some time for me as well.
We need an end-to-end big data pipeline.
Hi M. Adel
Can you tell me your exact requirement?
I love Scala more than Python.
Me too