CloudAndDataUniverse
India
Joined Oct 9, 2021
The CloudAndDataUniverse channel was created to help people learn and scale up in the Cloud and Data domain. So far, playlists have been created for Azure, Azure Data Factory, Big Data, Spark, PySpark, Databricks, Scala, Python, SQL, Excel and Power BI, each in English and Hindi. Share within your network. Keep learning!
We offer courses on our portal: www.cloudanddatauniverse.com
Live meet conducted on 19-Jan-2025
This is a recording of the live session we conduct every weekend for students enrolled in our courses.
Views: 67
Videos
Orientation Session on End-to-end affordable courses!
Views: 222 · 14 days ago
In this video we explain our course offerings at affordable prices! Azure Data Engineering (65 hours): INR 3,999/- only. Data Analytics (40 hours): INR 2,499/- only. Power BI: INR 1,299/- only. WhatsApp us on +91 9028 411 640 for enrollment and details. www.cloudanddatauniverse.com
100% Money-back courses!
Views: 36 · 21 days ago
Register for course details: docs.google.com/forms/d/1EBL4fUGDHAaXJrCjmL1EFlvVG5Gbr21s6YdBhwcSmu4/preview
4. Integrate Data Factory With Azure DevOps
Views: 44 · 21 days ago
4. Integrate Data Factory With Azure DevOps
Views: 36 · 21 days ago
Thank you for the video
Welcome
Right 👍
How do we do this using a stored procedure?
Sir, when we want to apply for a data engineer job, the job requirements are Snowflake and the DBT tool. If you make these videos, there will be no obstacle to getting a job. Please make the videos in Hindi, sir. 🙏🙏
Yes, I understand those are in demand, but I haven't explored those tools yet, hence I can't create videos on them. My focus is mainly on the Azure stack.
@@cloudanddatauniverse ok sir 🥰🥰
This will not work if a record is updated; it only works for new insertions to the files. Please help with the update logic.
Yes, for that you will have to use an upsert.
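A minimal sketch of such an upsert, using the Delta Lake MERGE API in PySpark; the paths, the "id" key column and the incoming-file format below are hypothetical placeholders, not taken from the video:

    from delta.tables import DeltaTable
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # Hypothetical names: an existing Delta table plus a DataFrame of
    # incoming records, matched on the key column "id".
    target = DeltaTable.forPath(spark, "/mnt/data/target")
    updates_df = spark.read.parquet("/mnt/data/incoming")

    (target.alias("t")
        .merge(updates_df.alias("s"), "t.id = s.id")
        .whenMatchedUpdateAll()      # record exists  -> update it
        .whenNotMatchedInsertAll()   # record missing -> insert it
        .execute())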
Sir, please continue the Databricks Hindi videos 🙏🙏🙏
Thanks for watching. We aren't releasing new videos in the Databricks playlist, since PySpark is the widely adopted option going forward, so watch the PySpark playlist.
@@cloudanddatauniverse Thank you sir 🥰
Welcome
Thanks 🙏🙏🙏 Happy New Year 2025🕛🎊❤💐🎁🎉
Welcome 😊 Happy New Year!
noise issue
What is an RDD?
Why did you put year and day in lowercase but month in capital?
Those are predefined format patterns: lowercase y and d stand for year and day, while month has to be capital M because lowercase m denotes minutes.
@@cloudanddatauniverse okk
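For readers who want to see those pattern letters in action, here is a small PySpark sketch (illustrative only; the column aliases are made up):

    from pyspark.sql import SparkSession
    from pyspark.sql.functions import current_date, current_timestamp, date_format

    spark = SparkSession.builder.getOrCreate()

    spark.range(1).select(
        date_format(current_date(), "yyyy-MM-dd").alias("date"),     # capital M = month
        date_format(current_timestamp(), "HH:mm:ss").alias("time"),  # lowercase m = minutes
    ).show()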
What is upsert, as opposed to insert?
Upsert means a combination of update and insert: when a record exists it updates it, and when it doesn't, it inserts it.
@@cloudanddatauniverse thank u
Please make a playlist on ETL.
Check the Azure Data Factory playlist; for in-depth PySpark and Azure data engineering you can enroll in our affordable courses at only INR 3,999/-.
excellent
Thank you for watching
@@cloudanddatauniverse you are most welcome
Nice ❤
The most effective channel for big data courses.
Thank you
Yusuf sir's words are really 100000% correct. I joined and watched the course videos, picking random samples, and I felt they were awesome. Even live classes are not as good as this self-paced course. If we strictly follow this course without getting distracted, it can surely shape our lives....
Thank you, brother; glad to see your comments and to have you as an active member of our channel. Wishing you great success.
Happy new year Yusuf
Happy new year brother
DONE
You focus on what is important, which is good; no stretching videos out with filler. One request: combine all your videos into 2-3 hour mega videos; that makes interview revision easier for us. The "Learn by do it" and "Ansh Lamba" channels made mega videos like that, but in English. If you make pure interview-question mega videos, subject-wise for ADF, Synapse, SQL and PySpark, it would help a lot.
Glad to know your experience; we will consider your suggestion. You are a new member on our channel and we appreciate your feedback on our videos. I would also suggest that you, and everyone else, have a look at our end-to-end data engineering course, now available at only INR 4k! Other courses are discounted as well; grab this New Year offer and build a strong career!
Could you please make it using PySpark?
Love it Yusuf bhai
Thank you
I have done some web deployments through Terraform in Azure DevOps. Here is my doubt: similarly, is there any provision to create ADF pipelines in Terraform and run them in Azure DevOps?
I haven't worked with Terraform, hence I can't comment on it.
You haven't created a service principal, and you haven't raised a request for an Azure managed pool for the Azure DevOps CI/CD pipelines in place of the self-hosted pool.
Brother, love you; what an amazing tutorial, brother, hats off to you.
Glad to hear it, brother; keep learning.
May I know whether you are explaining this series using Scala?
I am really thankful to you for your valuable time in making this easy for us....
You're welcome; glad to contribute and help the community learn and grow.
Can't we clone and edit the entire dataflow (source + filter + sink) instead of branching? I mean, if I need 5 tables for 5 countries, are dynamic parameters really the best option?
Brother... the background is so noisy... please avoid it in future recordings.
Azure made everything so complex.... It would be better if one common Azure CLI command were introduced to mount/connect any source (ADLS, SQL DB, Snowflake, etc.) with Databricks. In this mount case, I guess it's one-time within a single ADLS account, but for many ADLS accounts we need to repeat the whole mount process for each, correct?
It's a one-time process for each ADLS account. Our perspectives are always narrow and incomplete; they know these things better, and that's how it is.
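A minimal sketch of that one-time mount in a Databricks notebook, assuming a service principal is used; every name below (account, container, secret scope, secret key, tenant) is a placeholder to replace:

    # OAuth settings for a service principal that can access the ADLS account.
    configs = {
        "fs.azure.account.auth.type": "OAuth",
        "fs.azure.account.oauth.provider.type":
            "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        "fs.azure.account.oauth2.client.id": "<application-id>",
        "fs.azure.account.oauth2.client.secret":
            dbutils.secrets.get(scope="my-scope", key="sp-secret"),
        "fs.azure.account.oauth2.client.endpoint":
            "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
    }

    # One mount per container; repeat for each ADLS account/container you need.
    dbutils.fs.mount(
        source="abfss://mycontainer@myaccount.dfs.core.windows.net/",
        mount_point="/mnt/mydata",
        extra_configs=configs,
    )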
Is an Editor or Storage Account Admin RBAC role needed to write data into ADLS from Databricks?
You need the Storage Blob Data Contributor role.
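Once that role is assigned to the identity Databricks runs under, a write along these lines should succeed (a sketch; the container, account and output path are hypothetical):

    # Assumes the identity holds Storage Blob Data Contributor on the account.
    df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "val"])
    df.write.mode("overwrite").parquet(
        "abfss://mycontainer@myaccount.dfs.core.windows.net/out/demo")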
Such a long gap since the previous video.
Lovely explanation
Glad you liked it!
👍
Very informative, keep posting this content, sir 👍
1.5 GB taking 45 minutes... sometimes Azure is so bad. Instead of this, it would be better if the copy were fast via a CLI/PowerShell command. In GCP, import/export between an on-premises PostgreSQL DB and GCP is fast with the gcloud CLI or a Python script, GCS <-> Cloud SQL. For some scenarios GCP is just better.
Yes, those alternatives are faster. But we have options here too, like choosing a bigger cluster size to make it run faster. One thing to note is that setting up the self-hosted runtime is a one-time task; after that you can move data seamlessly. Beyond that, use whatever is best for the situation; it's not about right or wrong.
Please also provide the CSV files via a Google Drive or GitHub link for our practice.
Download the files from here: drive.google.com/drive/folders/1fvtcltPl214pbNEyUE-CDE_oN_Ja446q?usp=sharing
Why haven't you used Append inside ForEach? And can we use multiple (20) Lookups in parallel (like multiple sources in a dataflow) to read 1 lakh rows?
It is too early in the playlist to talk about that; the focus of this video was just to demonstrate a few properties of the Lookup activity. We generally don't use Lookup to read lakhs or millions of rows (its output is capped at 5,000 rows and about 4 MB); you will understand this through experience and real scenarios.
Everyone just shows demos, but no one explains this clearly.... Excellent
Thank you for watching and for your feedback; appreciate your kind words.
I have already watched the whole playlist of small, topic-wise videos like these on Maheer's WafaStudies channel.... Please combine all these functions and build 3-4 complex projects. I haven't found mega projects on your channel: ADLS, ADF, Synapse, Power BI pipelines, with the whole data pipeline run through an Azure DevOps CI/CD pipeline. That is exactly the kind of project I need, in Hindi, bro; I haven't found such a playlist anywhere.... I watched foreigners' videos but understood only 50% there....
Yes, Maheer's WafaStudies channel is excellent; he is also doing great work for the community. We haven't released projects on the channel yet; for now, the end-to-end project like the one you mention is released in the paid course. It may be released on the channel too in the future.
Here it is! End-to-end project ua-cam.com/video/qVLz7U4E4Qs/v-deo.html
Sir, please make the video in Hindi, sir 🙏🙏😭
We have released it in Hindi as well.
@@cloudanddatauniverse Thank you sir 🥰
Welcome
Excellent..... I watched 10 channels before but didn't understand. Now it is fully clear to me what RDDs and Spark are.
Glad to know😊
Brother... you started directly with labs, without boring theory; I really like your teaching style and content..... Thank you so much for creating this channel; it's my favourite. Until now I hadn't found a full Azure Data Engineer video series in Hindi.
You're welcome, brother, glad to hear it. Just keep learning; there is a treasure trove on our channel that will build your skills.
Very helpful course content for data engineering and data analytics aspirants 👍
Thank you for your feedback, Pratik; glad to have you as a member of our community.
No video on the Internet until now has demonstrated ADF file management in this process. You are the only one, so you are a great teacher. Muhammad Khan, North America.
Appreciate your comments; keep learning. There is plenty of treasure on our channel, and you will be amazed how easily you can learn so much! Thanks, brother.
Due to my poor understanding and limited knowledge, I find it very difficult to follow others' English-language teaching on ADF. I love your beautiful Urdu, which helps me understand your great online training; I feel very comfortable applying this professional knowledge on any project. Thank you; you are a great teacher, teaching for free.
Appreciate your feedback and glad to know your experience with us. I knew Hindi/Urdu videos would help people a lot in the Indian subcontinent, and we received a great response, as many were able to understand technical things easily. Happy to contribute to the community so people can learn and grow. God bless you; keep learning. There is a lot of content on our channel that will propel you forward!
I am a beginner searching for Azure Databricks lectures. Is this the same?
Welcome to our channel. Yes, you can check this, or better, follow the PySpark playlist on our channel; this series uses the Scala language. First check the Python playlist, then PySpark; we have covered both on Databricks.
I think when you use the last-date function as the start date, it considers Jan 2019 as the first month, so (Nov 2018 + Dec 2018 + Jan 2019). 7:28
Sir, will this work even if we have multiple different file formats, such as Parquet, JSON and CSV?