Great Work. Keep it up.
Thanks.
Saptadeep.
Thank you so much, highly appreciated 😀🙌
brooo, this is crazy donation
Sir send me some money also
@@salman1098😂😂😂
Bro, I haven't eaten in two days, please send 200 rupees
TL;DW:
Databases have low throughput while Kafka has high throughput, but databases offer far better long-term storage and querying than Kafka.
Kafka is a messaging service that can act as both a message queue and a pub-sub system; it was created at LinkedIn and is now an Apache project.
Kafka uses ZooKeeper (another Apache service) for cluster coordination.
Kafka has topics, and each topic is further split into partitions.
One consumer can consume from multiple topics, and partitions are auto-balanced by Kafka, but within a group one partition can't be consumed by multiple consumers; this is how Kafka acts as a message queue.
To act as pub-sub, Kafka has the concept of consumer groups: consumers in the same group that subscribe to a topic share its partitions (balanced by Kafka), but every group receives all the partitions, so each group gets the full stream; this is how it acts as pub-sub as well.
Kafka has an admin (whose job is creating topics, partitions, etc.), producers that produce data, and consumers that consume data; a minimal sketch of the three roles is below.
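A minimal sketch of those three roles with the kafkajs client, assuming a local broker on localhost:9092; the topic and group names are made up:

```ts
import { Kafka } from "kafkajs";

// Assumed: a local broker on localhost:9092; topic/group names are made up.
const kafka = new Kafka({ clientId: "tldw-demo", brokers: ["localhost:9092"] });

async function main() {
  // Admin: creates topics and partitions.
  const admin = kafka.admin();
  await admin.connect();
  await admin.createTopics({ topics: [{ topic: "rider-updates", numPartitions: 2 }] });
  await admin.disconnect();

  // Producer: produces data to a topic.
  const producer = kafka.producer();
  await producer.connect();
  await producer.send({
    topic: "rider-updates",
    messages: [{ key: "rider-1", value: "north" }],
  });
  await producer.disconnect();

  // Consumer: consumers sharing a groupId split the topic's partitions
  // between them (queue behaviour); a second group with a different
  // groupId gets its own full copy of every partition (pub-sub).
  const consumer = kafka.consumer({ groupId: "group-a" });
  await consumer.connect();
  await consumer.subscribe({ topic: "rider-updates", fromBeginning: true });
  await consumer.run({
    eachMessage: async ({ partition, message }) => {
      console.log(`partition ${partition}: ${message.value?.toString()}`);
    },
  });
}

main().catch(console.error);
```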
Are these topics involved in system design?
@@growmoreyt4192 yes sir
@@growmoreyt4192 yes
Bro.. this guy is crazyyyyyyyyy. This was way better than other code videos i watched
The way you explained it made me learn this topic, *Kafka*, which I had never heard of before I scrolled onto this random video… hats off to you for teaching like a friend 👏🙌👏
Was waiting for the answer to "Why does Kafka have such high throughput?"
Great explanation, thank you for the effort.
Piyush, I am Usman from Pakistan and I really love your videos. The way you explain things from the very basics and use drawings for visualization and clarity is literally amazing and easy to follow. Thank you so much for such valuable content. Lots of love from Pakistan to you 💖
The way you explained a complex concept like kafka with such simple and clear examples is really fascinating. Thank you!
Although I do not know Node.js, the way you taught Kafka is simply awesome.
What a true sentence "Half knowledge is very dangerous". Good work buddy. Sweetly swallowing the bitter medicine.
I have watched the full video from start to end and enjoyed it. It's amazing to see how clearly you understand the concept, and your command of Node syntax 🎉
@15:16 Just a correction: Kafka storage is not temporary. You can keep data in Kafka forever, since Kafka persists data on disk; it is not an in-memory store like Redis.
Yes, but we can't rely on Kafka for long-term storage and querying the way we can rely on a database system.
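Both comments hold because retention is just a per-topic setting. A sketch of configuring it with kafkajs, assuming the same local broker and an invented topic name; `retention.ms` is a real per-topic Kafka config:

```ts
import { Kafka } from "kafkajs";

// Assumed local broker and topic name. retention.ms = "-1" keeps records
// forever; a millisecond value like "604800000" keeps roughly 7 days.
const admin = new Kafka({ brokers: ["localhost:9092"] }).admin();

async function createWithRetention() {
  await admin.connect();
  await admin.createTopics({
    topics: [
      {
        topic: "payments",
        numPartitions: 3,
        configEntries: [{ name: "retention.ms", value: "-1" }],
      },
    ],
  });
  await admin.disconnect();
}

createWithRetention().catch(console.error);
```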
Bro... hats off to your explanation. You explained Kafka in the simplest way. Thanks bro
What a video you have made, brother... top notch... the full hour well utilised..
bro, you are the only Indian YouTuber explaining/teaching real-world tech.
It came randomly to my recommendation and I clicked on it…before I had only heard of Kafka but you explained it so well. I don’t know if I’ll be using this stuff anytime soon but sure as hell won’t forget how it works ❤
same
Great lecture… in one hour you can easily visualise and learn all about Kafka, in and out. The best part of the video is when he is coding while visualising what he is coding.
In my whole life I've never opened so many terminals... It was a fantastic video..
This is the best kafka course I have watched on youtube
Great work. You have provided an example that will never be forgotten.
What a brilliant explanation! I wasn't familiar with Kafka before, and you explained it in layman's language superbly! Kudos to you. Keep creating videos like this!
Your visual way of teaching is too good, thanks mate
Bhai, I rarely comment on a YouTube video.. I am a mechanical engineer by profession and recently got inspired to learn the data engineering stack... and you made my day. What a great video for understanding Kafka fundamentals... God bless you, man, with lots of happiness and success... You are a great teacher... Your crash courses would be my first pick for anything new I plan to learn in the future....
The way @Piyush teaches is amazing. I just opened the video to skim through it but ended up watching it completely. I'm working as a Sr. Software Engineer and can say that this is better than many paid videos.
Thanks Man.
This man is way better than most of the Edtech Academies out there.
Please do this whole scenario with a database. Thanks in advance for this insightful knowledge....
Such a basic explanation with basic examples; it's really easy to understand.
Got this channel randomly, you’re a gem ❤
BEST EXPLANATION. I'm so sad that I discovered this channel so late.
😮 The way you explain is so cool even at 1.75x. I never knew what Kafka was; now I feel like a Kafka expert.
In an Iron Man movie, Kafka could have been useful in a scene where Tony Stark (Iron Man) needs to process and manage a large volume of real-time data or communications. Here's a hypothetical scenario where Kafka could play a role:
Scene: Tony Stark is in his high-tech lab, and he's remotely controlling his Iron Man suit, which is deployed in a distant location to handle a crisis. He needs to receive and process real-time data from various sensors on the suit, such as vital signs, telemetry data, and external environmental data, while also receiving live video feeds.
How Kafka could be useful:
1. **Real-time Data Ingestion**: Kafka could be used to ingest data from these sensors and video feeds in real-time. Each type of data (vital signs, telemetry, video) could be treated as a separate Kafka topic.
2. **Data Processing**: Tony needs to process this data for real-time decision-making. Kafka Streams, a component of Kafka, could be used to perform real-time data processing, such as analyzing vital signs for signs of distress, stabilizing the suit's functions, and identifying threats in the video feed.
3. **Reliability**: In a high-stakes situation like this, Kafka's reliability ensures that no data is lost. If there are network interruptions or delays, Kafka can buffer and replay messages, ensuring that Tony has access to all the critical data.
4. **Scalability**: If the crisis intensifies and more data needs to be processed, Kafka can scale horizontally by adding more Kafka brokers, allowing Tony to handle the increased data flow without performance issues.
5. **Monitoring**: Kafka provides extensive monitoring capabilities, which could be depicted in the movie as Tony monitoring the health of the data pipeline in real-time, ensuring that he has a clear view of the suit's status.
In this scenario, Kafka would enable Tony Stark to efficiently manage and respond to real-time data, enhancing his ability to control the Iron Man suit and handle the crisis effectively. It would add a layer of realism to the technological aspects of the movie.
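Just for fun, point 1's per-sensor topics might look like this in kafkajs; every name here (broker, client id, topics, payloads) is invented for the scene:

```ts
import { Kafka } from "kafkajs";

// All names invented for the bit: broker, client id, topics, payloads.
const kafka = new Kafka({ clientId: "mark-42", brokers: ["stark-lab:9092"] });
const producer = kafka.producer();

// Point 1 above: each data type gets its own topic.
async function streamSuitData() {
  await producer.connect();
  await producer.send({
    topic: "suit-vitals",
    messages: [{ key: "tony", value: JSON.stringify({ heartRate: 142 }) }],
  });
  await producer.send({
    topic: "suit-telemetry",
    messages: [{ key: "mark-42", value: JSON.stringify({ altitude: 9100 }) }],
  });
}

streamSuitData().catch(console.error);
```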
Team Iron Man!!!
Please mark this to @marvel as they need this data :0
It's great, in-depth knowledge about Kafka from Piyush, and he is putting in (hard + smart) work to spread the best knowledge......
Really, you literally took me into the world of kafka. Seriously your explanation is superb.
I work remotely from India as a web developer for a US company; this is a very informative video. I deal with large data, using Kafka & Elasticsearch.
I really learn from your videos. Can you please make videos on these topics:
1. design pattern in js
2. React with typescript
3. Typescript standalone course
4. Next js
5. Full stack application build from scratch
6. Asynchronous processing
7. Messages queue
8. Redis
9. Jenkins
10. Advanced React.js
You don't need a course to learn TypeScript or React with TypeScript; you just need to use it.
For 6th and 7th, I would highly recommend Namaste Javascript playlist of Akshay Saini
@@caresbruh Bro, I need a job but don't have the skills. Can you tell me how I can build the skills, and suggest a Hindi YouTube channel for freshers? I want to get into frontend fast. Please reply.
Really awesome video, easy to understand. Since it's in Hindi, my attention span held up well too! Great job
All my Kafka concepts are finally clear today!
I have no words; this video is amazing. Please make a whole series on Kafka, please...
I'm so happy that I found your channel. I was browsing through it and found you have created really good content on things which are currently trending. Thank you ❤
Bro, just watched 7 mins of your video and I had to come to the comment section to say that the language you are using is really simple and easy to understand. "Database fat jayega" (the database will blow up) 😃. Perfect style of explanation for a core developer like me. Good job.
Omg! This is a lifesaver. I am interning at Goldman Sachs from IIT Delhi and didn't have a web dev background at all, so I had to quickly learn about Kafka, and this is the best thing one could get.
Thank you so much bro, you really don't know how much you have helped by explaining this concept so easily 🙏🙏
Lemme try:
Kafka is a distributed message queue, generally used to decouple microservices where one API produces some data and many others consume it. Simple.
I don't know why he confused people with DB vs Kafka. Kafka is really a queue; you only read data sequentially. Kafka CAN hold data for an infinite period if you want, but it's not made for 'querying' data. Instead, it holds the data temporarily so that the producer API doesn't need to wait for the consumer API to 'read' this data; it can simply dump the data onto the queue and let the consumer API read it whenever it finds time.
If you're even a little confused between Kafka and a DB, this guy is to blame for that.
Kafka Streams and ksqlDB are built on top of Kafka. But yes, you are right!
He is right; he is explaining the working of Kafka. It seems that you only use the technology without understanding how it works.
I assume you have better videos on YouTube for explaining Kafka, if you are blaming him for confusing others.
If not, then appreciate his efforts in teaching for free.
@@BANANAS2011 Should we appreciate the efforts of someone who gives you alcohol when you're thirsty? Why are you braindead?
He was trying to explain why we need Kafka at all instead of inserting the data directly into the database: the throughput of the database is not high enough to handle fast streaming data. So what if we only used Kafka and discarded the database? That is not a good option either, because Kafka is not the best solution for storage and querying. So I thought his explanation was perfect.
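To make that concrete, a rough sketch of the usual pattern: a consumer group drains Kafka at its own pace and bulk-inserts into the database. `db.batchInsert` is a hypothetical stand-in for whatever bulk API your database client offers, and the broker/topic names are assumptions:

```ts
import { Kafka } from "kafkajs";
// `db` is a hypothetical client exposing a bulk-insert call; swap in your own.
import { db } from "./db";

const consumer = new Kafka({ brokers: ["localhost:9092"] })
  .consumer({ groupId: "db-writers" });

async function drainToDatabase() {
  await consumer.connect();
  await consumer.subscribe({ topic: "orders", fromBeginning: true });
  await consumer.run({
    // eachBatch hands over many records at once, so one bulk insert
    // replaces hundreds of slow per-record writes.
    eachBatch: async ({ batch, resolveOffset, heartbeat }) => {
      const rows = batch.messages.map((m) => JSON.parse(m.value!.toString()));
      await db.batchInsert("orders", rows); // hypothetical bulk-insert API
      for (const m of batch.messages) resolveOffset(m.offset);
      await heartbeat();
    },
  });
}

drainToDatabase().catch(console.error);
```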
Thank you. You teach the Kafka concepts in a crystal clear way. Love from Bangladesh 🤍🇧🇩
I tried many YouTube courses for Kafka, but your way of explaining is so great... please keep up the good work..
Thanks for the crash course; this may be the first course I have watched completely and understood completely.
I've watched the full video through to the finish and really enjoyed it. Your command of node syntax and your obvious comprehension of the subject are outstanding. 🎉
Thanks for this content. For the first time I attempted to listen to tech content while I was busy with mundane work, and I got a wonderful idea and understanding of Kafka. Tomorrow morning I shall implement this and also read further on event-driven architecture. 🎉
Thanks again
Best video on Kafka ever. Loved your simple explanation and demo with visualization.
Thanks for the clear and detailed explanation of Apache Kafka. Your insights really helped me understand its architecture and use cases better!
Omg, I have never watched this kind of video, so helpful to anyone who wants to make a career in this sector. Superb! I like your explanation techniques. Thanks a lot!
Bhaiya, you are a great teacher. Your videos are way better than paid courses. Love your videos.
Bro!!!, very well explained, the first CS lecture I have watched with excitement till the end. Thank You
Thank you for the hard work. DB throughput is low because of ACID properties, which are not possible in the case of bulk inserts (that is, without redo/undo), so there is a huge risk of data loss; for junk data (Facebook, Discord, etc.) you accept that risk. In the case of financial transactions you need a strategy where, when Kafka storage is full or the database fails, producers stop producing until consumer errors are fixed, to avoid data loss.
Had a blast; after a long time I've watched such a good technical explanation video. You have a teaching style similar to Brad Traversy.
Thank you. I am learning Spring Boot and this is very helpful.
What a great explanation! I don’t think I will forget it after watching this. Thanks!
Very nicely articulated internals of Kafka and how it works with a DB. Thanks a lot Piyush, keep up the good work.
2 questions:
1) Why and when should one bulk-insert Kafka records into the DB (Zomato, Uber) instead of updating record by record?
2) What is the basic reason Kafka prefers consumer groups over standalone consumers?
this channel is a gold mine.
Amazingly simplified explanation, superb!
WOW!!!!!!!!!!!!! Excellent , You deserve more than what i'm writing.....
Hi Piyush,
This is just to let you know that you are doing great work. Your teaching style is unmatchable; you keep everything very simple and don't repeat yourself, which makes your videos so enjoyable to watch.
May all your dreams come true, as you are helping us achieve our goals.
What a great explanation! Hats off to you, sir. I'm wondering why you have only 11K subscribers. You deserve millions!
Not many people interested in Kafka know Hindi, and not many people who know Hindi want to learn Kafka.
he is already at 49k in 2months
@@DanielSmith-hd9iq 60k when i joined
68k when I joined .. nice explanation
@@sandeep-rai07 72k when I joined
It's a great video for improving backend knowledge, and I have one request: please make a video on Node.js with Kafka in one big project. THANK YOU SIR 👍👍
I was tired at the beginning of the video, but by the end I felt fresh.
Masha Allah, great explanation sir
Love from Pakistan
So first of all we'll understand what Kafka is.
We'll see what Kafka is.
It's very important that we know what Kafka is.
I'll tell you what Kafka is.
We'll see what Kafka is. :) Very nice!
Genuinely one of the best crash courses on YouTube I have come across in a while.
Thank you
Can you also make further videos on topics like data indexing and sharding?
Very Clear and Concise explanation! The way you dumb down complex topics for your audience is remarkable! Keep up the good work :)
Excellent explanations, appreciate your effort; I understood 80% even though I started from scratch. But you could improve a few things:
1. ZooKeeper could be simplified more.
2. Queue vs pub-sub could be simplified (the example was not easy to relate to).
3. If the producer can specify a target partition, why can't the consumer? For example, if there are 2 partitions, then 1 of 2 consumers cannot receive all the data (it will receive only partial data, from one partition).
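On point 3: a producer may pin a partition (or let a key decide), while a consumer's partitions are handed out by its group coordinator, which is why a consumer that is alone in its group still receives all partitions. A sketch, with the broker/topic/group names assumed:

```ts
import { Kafka } from "kafkajs";

// Assumed broker, topic, and group names.
const kafka = new Kafka({ brokers: ["localhost:9092"] });

async function demo() {
  // Producer side: you MAY pin a message to a partition explicitly...
  const producer = kafka.producer();
  await producer.connect();
  await producer.send({
    topic: "rider-updates",
    messages: [{ value: "south", partition: 1 }], // explicit target
  });
  // ...or set a key and let Kafka hash it to a partition.
  await producer.send({
    topic: "rider-updates",
    messages: [{ key: "rider-7", value: "north" }],
  });

  // Consumer side: you subscribe to the topic; the group coordinator
  // assigns partitions. A consumer that is ALONE in its group receives
  // every partition; add a second consumer and they split them.
  const consumer = kafka.consumer({ groupId: "group-a" });
  await consumer.connect();
  await consumer.subscribe({ topic: "rider-updates" });
  await consumer.run({
    eachMessage: async ({ partition, message }) => {
      console.log(partition, message.value?.toString());
    },
  });
}

demo().catch(console.error);
```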
Really useful video. Please explain more on how:
1. offset values and records work,
2. data is saved in a partition; that would be very helpful.
Excellent video of Kafka purpose, use case and produce, consumer demo. Well done.
Thanks so much for this amazing course. Your way of teaching makes it super easy to understand the concept.
Bro, what an explanation, man; I had a blast.
Bro, you have a special talent for making difficult topics easy. It was my first Kafka tutorial and I understood it well enough. Thanks a lot ❤❤
agree
Totally agree ❤❤😥😥
Crisp explanation. A great way of describing each concept.
Hi Piyush,
This is indeed a crystal-clear video about Apache Kafka. Thanks for sharing this wonderful knowledge with the IT developer fraternity, and for making the concept so clear.
bhai boht acha explain karte ho aap. Ek baar me samajh aa jaata haii.
Great job Piyush on explaining the details both visually and with the code approach 💯
While getting the hang of Kafka as we went further, I started wondering how Kafka does the message transfer, and when you talked about the queue and pub-sub architecture approach it clicked in my brain. It's similar to how the Amazon SNS and SQS services work, and from that point things got a lot more relatable.
I had a chance to work with the Pusher messaging service, which had a similar pub/sub messaging architecture.
This Kafka session turned out to be really amazing.
Looking forward to more of something like this in the future.
I always wanted to learn Kafka.
Thank you so much for putting out such great content.
You made me understand every topic where I was struggling.
Much thanks 🙏
Very clear and concise explanation - I watched at 2x speed and was still able to grasp everything :D
This is one of the best tutorial on Kafka, Thank you :)
It was very easy to understand Kafka for beginners like me. Thanks a lot!!
Greatly appreciated; you are one of my favourite YouTubers. Keep making this type of video regularly. Thank you so much.
One of the best channels I have found on YouTube. You are doing great work bro ❤❤
Thank you so much Piyush for such a great explanation!! I think for the first time I didn't think for a second before subscribing to someone!!
This is really a wonderful explanation. Easy to understand for non-tech person as well. Thanks😀
This is the first video I discovered in this channel.
Had a blast, bro.
Thank You!
Amazing explanation, nothing can be better than this.
Bhai, thank you so much. I am a non-IT person practicing algo trading in the stock market, and all these concepts are solving many of my problems.
You make it very simple to understand.
Thank You Bro!
The way of explaining was extremely good, idk why your Channel is so underrated 💥
Your explanation was great, brother.
I understood every single part of the video.
You have explained it in a very simple way. Thanks to you, man ☺
Great watch just before an interview!
Please also make a video on the Elastic stack: why do we need to use it, and what kinds of problems does it solve?
+1
+1
Please appreciate this guy, so goooooooood explanation!!!!!!!!!
How lovingly you explained it, brother 🙌
One of the great crash course video on Kafka. Great work Piyush 🏆
Great lecture. I only watched the first 45 mins and it was amazing. Please find time to make similar videos on design patterns and low-level design.
Explained in a very easy and smooth manner !!!
A random recommendation resulted in learning Kafka. Thank you ❣
What an explanation, bro; really impressive!
You got a new subscriber.