Want to learn more Big Data technology courses? You can get lifetime access to our courses on the Udemy platform. Visit the link below for discounts and coupon codes.
www.learningjournal.guru/courses/
Sir, I would like to touch your feet as my teacher/guruji. Very useful and one of the best ways to teach. Please make more videos in the same style.
Your series is very useful. You deserve more likes than I see registered.
This is one of the best videos on how to create producers with the API.
Excellent, I will never forget Kafka. Please upload videos related to cracking big data interviews. Really awesome.
Extremely great. Even if we just listen to the audio with our eyes closed, our mind visualizes the entire Kafka system. Great work, sir! Keep posting and keep growing.
This is the first time I have found a channel where everything is explained clearly and from scratch. You are doing really nice work, sir.
Thank you so much.
If you have a Spark tutorial as well, please tell me.
@Ankit, Spark is coming soon :-)
Awesome, waiting for the Spark tutorials too :)
Nice video, please upload more Kafka videos.
Awesome, very simple, and so effective. Sir is the Bruce Lee of Kafka!
Excellent. Best video, and concise too. I really enjoyed it a lot. Please make a video on Spark with Scala using SBT and Eclipse, and the R integration part as well.
I feel the Python package gives an easy way to create a producer in just a few lines of code. Wonderful, thanks :)
Very nice tutorial and well explained. Useful for beginners and experts alike.
Very good video and easy to comprehend.
Seriously Awesome Tutorial...!!!
Clear explanation of the Kafka producer API.
Very nice. Simple, and yet it covers the important topics. Thanks.
Simply amazing. Thank you so much. Please keep posting such videos.
Amazing 😍
Excellent video. This is what I was looking for.
Dear sir, you are my hero. Thank you so much for sharing this knowledge!
Awesome videos! If you have Hive videos as well, please tell us.
Very nicely explained. Thanks for the hard work you put in sharing this.
Awesome video.
Thanks a lot.
Wow, wow, wow! Period!
I'm getting an import error and I don't know what to do. I'm using Eclipse Oxygen, created a Maven project, and added the dependencies inside pom.xml as well.
Very well explained.
Excellent explanation and clear. Thank you.
Hi sir, I wanted to know if there is an upper limit on the data that can be stored in a single partition of a topic at any time.
I think yes, and we can specify it using the log.retention.bytes broker configuration.
Amazing video! Well explained
Hi, I tried this example but I am getting an exception (Invalid value SupplierDeserializer for configuration value.deserializer: Class SupplierDeserializer could not be found.). Please help me. This is the dependency I am using:
org.apache.kafka : kafka-clients : 0.10.1.1
In a previous video you had mentioned that all clients connect to the master for the partition. How does the producer connect to the master?
Good question. If I am not wrong, what I understood from his tutorial is that this master (called the leader in Apache Kafka) is the only one that receives the message, so there is no need to specify in the arguments which replica the message should go to. How the producer delivers the message is presented in the next video tutorial.
Hi, I have a doubt. While storing messages in Kafka in the form of key-value pairs, is it not necessary for the key to be unique? If so, how can we extract all the values for a particular key from Kafka?
Kafka is not a key-value store. We never extract messages based on a key. I suggest you watch all the videos and note down your queries. Hold on to your questions until you reach the end of the series; by then, you will have answers for most of them. If something is still not answered, send me those questions and I will answer them.
Good job, sir, but there are four core APIs, not two. The other two are the Streams API and the Connector API. Did you make any video/tutorial for these? Please share the link. Additionally, do you know of any .NET implementation of Kafka?
Kindly confirm: do I need to create a Maven project in the Eclipse IDE and then add the consumer and producer API dependency to run all the examples described in the videos?
Using a build tool like Maven to resolve the dependency is the ideal method. However, you can also download the JAR and include it in your project. You can find the JAR here:
mvnrepository.com/artifact/org.apache.kafka/kafka-clients/0.10.1.0
Nice explanation, thanks.
Good one thx...!
Love you, sir 🙏🙏🙏 Great explanation.
Thank you a lot for this nice tutorial. Good work!
Can I get the same thing in Python?
Hello sir, I'm pretty new to Kafka and I'm trying to create a sample application using .NET code. Can you please help me with a step-by-step implementation guide like this one?
You are great, sir :)
Awesome videos :)
What are the JAR files required to create a producer using Java? Please send me the link.
Does the client code need to be updated every time a new broker is added?
No. Not at all.
If we use the ProducerRecord API with both the key and the partition field, which one is preferred for selecting the partition?
It's a small test; I encourage you to do it yourself and let everyone know the answer.
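If anyone wants to try it, here is a rough sketch of such a test (the topic name and broker address are placeholders; assume the topic has at least two partitions):

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.clients.producer.RecordMetadata;

public class PartitionVsKeyTest {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        KafkaProducer<String, String> producer = new KafkaProducer<>(props);
        // A record with an explicit partition (1) and a key: print where it actually lands.
        ProducerRecord<String, String> record =
                new ProducerRecord<>("PartitionTestTopic", 1, "SomeKey", "SomeValue");
        RecordMetadata metadata = producer.send(record).get();
        System.out.println("Record went to partition " + metadata.partition());
        producer.close();
    }
}

Compare the printed partition with the explicit partition number, and with where the same key lands when you omit the partition argument.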
Thanks for the videos. Can we use Kafka for long-running processing? For example, suppose a customer can have anywhere from 1K to 1 million orders, and processing each customer takes about 2 hours on average. Can we use Kafka here? Presently we are using RabbitMQ with 50 threads, with all customers in one queue.
That's what Kafka is designed for :-) Go for it.
Hi sir, I am from a non-Java platform. Can we program in any other language?
In one of your previous videos you mentioned that a producer only speaks with a master broker of sorts. Why is it then normal to specify more than one broker?
The producer only speaks to the leader of the partition. That's correct.
But how would the producer know who the leader is?
So, we give it the IP of one or more brokers (any broker), which is used to query metadata. The metadata contains the list of topic partitions and the leader addresses.
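For example, a minimal sketch of a producer configured with more than one broker in bootstrap.servers (the broker host names and topic name below are just placeholders):

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class MultiBrokerProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Any one of these brokers can answer the initial metadata request;
        // after that, the producer sends each record to the leader of its partition.
        props.put("bootstrap.servers", "broker1:9092,broker2:9092,broker3:9092");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        KafkaProducer<String, String> producer = new KafkaProducer<>(props);
        producer.send(new ProducerRecord<>("SimpleProducerTopic", "Hello from a multi-broker configuration"));
        producer.close();
    }
}

Listing more than one broker is only for fault tolerance during that first metadata lookup; the list does not have to contain every broker in the cluster.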
Sir, can you add a tutorial on Spring/Spring Boot? Thanks.
Hello sir, your videos are very informative, but I have one doubt related to the role of Kafka in a production environment. Suppose I am creating a streaming application for McDonald's which will take the customer/order/offer JSON data from the point of sale. I want to process this through Spark Streaming. So where exactly do the Kafka producer and consumer APIs sit here? Will Kafka be a bridge between the McDonald's POS and my streaming application? If yes, will the producer API be the bridge, or the consumer API?
I am not able to picture it; please enlighten me.
The producer at the POS can send data to the Kafka broker, and Spark Streaming should consume it directly from the Kafka broker. You may not need a separate consumer in this case.
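To make that concrete, a small sketch of what the POS side could look like (the broker address, topic name, and JSON payload are made up for illustration):

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class PosOrderProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        KafkaProducer<String, String> producer = new KafkaProducer<>(props);
        // The POS application publishes each order as a JSON string;
        // Spark Streaming subscribes to the same topic on the broker side.
        String orderJson = "{\"orderId\": 101, \"item\": \"burger\", \"qty\": 2}";
        producer.send(new ProducerRecord<>("pos-orders", orderJson));
        producer.close();
    }
}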
Hi, can someone please explain how we can use a directory/folder in Unix as a Kafka producer? I need to continuously read multiple files that are arriving in a directory using Kafka. How can I do that? Can someone please explain? Thanks in advance!
Thank you so much, sir.
Thanks for putting together such wonderful tutorials for us. I have around 6 years of experience in an ETL, DW, and reporting background and have been trying to learn big data technologies for a long time. I have learnt Scala (the basics for Spark) and Apache Spark on my own, and I am trying to learn Apache Kafka, but I am a bit worried about Java as I don't have any knowledge of it. Can you please tell me how much Java I will have to learn in order to write efficient code for Apache Kafka, or is it doable through Scala too?
Kafka is written in Java. There are language bindings and wrappers for other languages, but the Scala clients are not in good shape. I recommend learning core Java; that's going to help you in the long run. Java is a huge ecosystem with many frameworks, but core Java should be a must for a big data developer.
Awesome. I will learn the basics of Java too, then.
Hello sir,
Thank you for giving such useful information.
My query:
When we write our own producer Java program, how will we run that JAR?
Something related to kafka/libs/..?
You can run your JAR using the java command. Include your JAR location in your classpath. Most of the Kafka dependencies are available in the Kafka libs directory; you may want to include them as well.
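For example, something along these lines should work (the JAR name is a placeholder and the Kafka installation path will differ on your machine; SimpleProducer is the class from the video):

java -cp "my-producer.jar:/path/to/kafka/libs/*" SimpleProducer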
Hello sir, can you please share text documentation on Kafka that helps with cracking interviews and learning Kafka?
I haven't made any document, but it looks like people want one. I will make one by the end of this month and share it.
Hi,
Great session! Kudos.
Any news about sharing the text documents for these Kafka sessions?
Great, thanks again. Well done!
I ran the code. These are the last lines after the code executed, but when I try to read the topic contents through the Kafka console consumer, it shows up empty:
[main] INFO org.apache.kafka.common.utils.AppInfoParser - Kafka version : 0.11.0.0
[main] INFO org.apache.kafka.common.utils.AppInfoParser - Kafka commitId : cb8625948210849f
[main] INFO org.apache.kafka.clients.producer.KafkaProducer - Closing the Kafka producer with timeoutMillis = 9223372036854775807 ms.
SimpleProducer Completed.
Hi sir,
I am getting an error when executing in the terminal. Please can you tell me how to execute it in the terminal?
What error are you getting?
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic tn (enter)
It does not come out of that, and it is not displaying any messages.
Sir, do you have a step-by-step process for executing the producer in the console on GitHub?
I have documented all the steps, commands, etc. on my website. Check this page; it might help you resolve the issue.
www.learningjournal.guru/courses/kafka/kafka-foundation-training/quick-start-demo/
One more thing: start the consumer first in one terminal and then start the producer in another terminal. The producer should send messages while the consumer is already listening. Sometimes that helps.
I have done the above video, sir. Actually, it is really helpful, and I understand the process in the above video. I am preparing my own notes based on your tutorial, sir.
My doubt is: how can I run the producer API and consumer API in the terminal?
The process for compiling and executing the producer and consumer API is also explained in the video. Source code is available on the website and also on GitHub. I used SBT to compile and run the code. If you have any doubt about SBT, I have a full SBT tutorial as part of my Scala tutorial playlist.
Sir, this is very informative. Do you have any tutorial for Java?
No Java yet.
Is it available on Udemy?
No. It's free on YouTube.
Gold
Can anyone please help with how to send an XML file?
The tutorial is good, but figuring out how to import his project and build it is killing me.
Didn't get you.
I am trying with Maven now as I am aware of Maven and Gradle, but I don't know SBT. Please post Spark soon. Thanks again.
Hi