Had zero understanding of Kafka on a coding level; now it looks much clearer. Thanks, buddy...
Wow, fantastic! A very easy and understandable explanation. I now know the right route to develop a basic consumer that reads and handles a set of data that a producer (a backend, for example) sends on a topic after, for example, a classic query search triggered by a REST request. Now I will try to replicate this example, and next I will learn how to set up a simple Spring Boot producer with a listener to send a result set to a consumer client. This is my objective now. Thanks!
Super tutorial to get started with Kafka.
One of the good videos on a real-world Kafka example. Keep up the great work!
Thank you for this video. This was really helpful and saved a lot of time. Clear explanation along with the issues we will run into. Keep up the good work!!
Very helpful for real-world projects. If possible, could you please make videos on Kafka Streams?
Thank you so much for this tutorial, bro, very helpful.
Very nice tutorial. Can you please make a video on a Kafka producer and consumer that use SSL/SASL with Kerberos for security?
very very very very good tutorial
If it doesn't work, try adding these to your pom.xml:

<dependency>
  <groupId>com.fasterxml.jackson.core</groupId>
  <artifactId>jackson-databind</artifactId>
</dependency>
<dependency>
  <groupId>com.fasterxml.jackson.core</groupId>
  <artifactId>jackson-core</artifactId>
</dependency>
<dependency>
  <groupId>com.fasterxml.jackson.core</groupId>
  <artifactId>jackson-annotations</artifactId>
</dependency>

Have a good one!
Can you please post videos on unit testing or integration testing these real-time codes? Thanks for all these videos
Sure Saurav.
@20:10 Any reason why the consumer is showing the JSON in single quotes, compared to the producer message being sent in double quotes?
That is the message deserialized into an object and displayed via toString(), which is hardcoded with single quotes.
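To make that concrete, here is a minimal sketch (the class name User and its fields follow the video; the exact toString() body is an assumption): the single quotes are literal characters written inside toString(), not part of the JSON.

```java
// Hypothetical User class mirroring the one in the video. The single quotes
// in the printed output come from the string literals inside toString().
public class User {
    private final String name;
    private final String dept;

    public User(String name, String dept) {
        this.name = name;
        this.dept = dept;
    }

    @Override
    public String toString() {
        // Single quotes are hardcoded here; the JSON on the wire used double quotes.
        return "User{name='" + name + "', dept='" + dept + "'}";
    }

    public static void main(String[] args) {
        System.out.println(new User("John", "IT")); // → User{name='John', dept='IT'}
    }
}
```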
Congratulations on your tutorial. Very informative for getting started. Could you provide an example of when Kafka throws a deserialization error, e.g. when invalid JSON is produced and delivered?
Nice tutorial. Do you know how to consume using the concurrent consumer pattern (multiple consumer instances)?
Hi, please create a video on catching exceptions like broker down, timeouts, or any unhandled exception using Spring Cloud handlers... without a DLQ.
For the same, can you do one video on storing the data in a Cassandra database or any other?
simple and clear. Thanks!
Hi, can you also make a video on setting up security for Kafka consumers and producers?
Thanks...clear example
Can you explain connecting to all partitions (dynamically) when using a Kafka listener?
Nice and very informative video. Could you also please share a video on how to create the ZooKeeper/Kafka/producer/listener setup using the command line, which you are continuously using during your video? Thanks for this video.
Awesome tutorial.
Hi, I do not follow why the group ID is defined twice (discussion at 7:10): once for the consumer and once for the configuration. If the two group IDs are different, say group_1 for the config factory and group_2 for the consumer, what impact would that have on message consumption? Would it work, or do both values have to be the same? If the same, then why?
In the config we are creating a group with the specified group ID, say group_1; in the consumer we then use that group via the corresponding group ID, which is group_1. If you want to use group_2 in the consumer, then you should also have a group with that corresponding ID configured.
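For reference, a hedged sketch of the two places the group ID shows up in a typical Spring Kafka setup (bean, topic, and group names below are made up; in recent spring-kafka versions the groupId attribute on @KafkaListener, if set, overrides the group.id configured in the consumer factory):

```java
import java.util.HashMap;
import java.util.Map;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.EnableKafka;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;

@EnableKafka
@Configuration
class KafkaConsumerConfig {
    @Bean
    public ConsumerFactory<String, String> consumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "group_1"); // group id on the factory
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        return new DefaultKafkaConsumerFactory<>(props);
    }
}

class Listener {
    // groupId here, if set, takes precedence over the factory's group.id for this listener
    @KafkaListener(topics = "example_topic", groupId = "group_1")
    public void consume(String message) {
        System.out.println("Consumed: " + message);
    }
}
```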
1st comment and 1st view... Loving your videos, man. Thank you!
Hey man, you are very knowledgeable. Can you create some complex apps using Spring Boot, Kafka, multithreading and so on, like handling huge volumes of request traffic, apart from these simple CRUD apps?
Nice tutorials. I've got a question though: why did you separate the producer example and the consumer one into separate projects? Is it because you applied a "microservices" pattern? Thanks!
Thinking about the same! +1
Because from a code perspective, the consumer and producer might be developed by independent teams. So they define the contract for communication and each starts implementing on their own.
Nice tutorials @Tech Primers! But please note that Spring Boot lets us avoid all the boilerplate code we introduce by creating a Java class annotated with @Configuration. You can use application.properties / application.yml instead (using properties is also easier than declaring your own beans). Here is an example: github.com/didorg/kafka-consumer
Cheers... & excellent videos!
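For reference, a minimal application.yml sketch of that property-based approach (the group id, trusted package, and deserializer choices below are placeholder assumptions; the spring.kafka.* keys are standard Spring Boot properties):

```yaml
spring:
  kafka:
    bootstrap-servers: localhost:9092
    consumer:
      group-id: group_id                 # placeholder group id
      auto-offset-reset: earliest
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: org.springframework.kafka.support.serializer.JsonDeserializer
      properties:
        spring.json.trusted.packages: "com.example.model"   # placeholder package
```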
Nice shortcut, bro. Please, how do we configure the producer using the same approach?
Thank you so much. I'm new to Kafka, and it's being used at my work... You helped a lot ;)
Please also add an example of Avro with Kafka and Spring Boot.
Thank you for your tutorial; it is really helpful. But I have one question regarding the consumer: is it possible to consume the data as a list of entities (e.g. a List in your demo)? I have tried that, but nothing works on my own.
Hi, thanks for the video. Can you please explain a scenario where multiple Kafka partitions are required? Also, I want to know whether the message still exists in the cluster even after the clients have consumed it. If so, when will it be cleared?
Multiple partitions are useful when you want to publish a specific pattern of messages onto a partition, and the consuming process can differ per partition. The purge policy of the data is configured in the Kafka cluster.
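As an illustration of that purge policy (the topic name is made up; retention.ms is a standard topic-level config, and --bootstrap-server works on newer Kafka releases), retention can be set per topic from the command line:

```shell
# Keep messages on 'example_topic' for 7 days (604800000 ms), regardless of
# whether consumers have already read them; Kafka deletes them afterwards.
kafka-configs.sh --bootstrap-server localhost:9092 \
  --alter --entity-type topics --entity-name example_topic \
  --add-config retention.ms=604800000
```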
Good video. Do we need to control the listener explicitly using the KafkaListenerEndpointRegistry if we want to pause the listener when needed? Also, on restart of the application, how can the consumer read the messages sent while it was down? Is it based on the offset config (latest)?
Thank you for the clear explanation.
Really good... is there any video on Kafka offset management and partitioning samples?
Not yet. Will do
Thank you for the content. In this example, the broker host name in the configuration is hard-coded. For a multi-broker cluster, what should we pass?
Thanks for making the video. I have a doubt about the KafkaConsumer class: why didn't you specify any containerFactory in @KafkaListener when you consumed the String message, but you did specify it when you consumed the JSON message? Thanks in advance.
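My understanding, sketched below (topic, group, and bean names are assumptions): when @KafkaListener has no containerFactory attribute, Spring falls back to the bean named kafkaListenerContainerFactory, which in this setup handles plain Strings; the JSON listener has to name the custom User factory explicitly:

```java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Service;

@Service
class KafkaConsumer {

    // No containerFactory: uses the default bean named "kafkaListenerContainerFactory",
    // which is wired with String deserializers.
    @KafkaListener(topics = "example_topic", groupId = "group_id")
    public void consumeString(String message) {
        System.out.println("Consumed string: " + message);
    }

    // Explicitly names the custom factory that carries the JSON deserializer for User.
    @KafkaListener(topics = "example_json_topic", groupId = "group_json",
                   containerFactory = "userKafkaListenerFactory")
    public void consumeJson(User user) {
        System.out.println("Consumed JSON: " + user);
    }
}
```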
Thanks for the nice session; it's very crisp and clear. One doubt: if our consumer needs to read from multiple topics with different schemas, can we use the same listener?
Great explanation and example! Cheers mate!
Firstly, thanks for this great, concise video. What I wonder is: do we have to create separate consumer and listener factories for each different type of object, or is there a way to make a common factory that maps to the corresponding object?
Simple and clean.
Can you please create a video on using Camel with Kafka, and also on handling message delivery failures with retry logic?
Simple and best.
Hi, very nice video and thanks for the GitHub link. I am facing an issue with running my Kafka installation on Windows.
Very Nice tutorial. Thanks
I followed this tutorial and tried to consume a message from the command line into my Spring Boot application. I'm not receiving any message, but it's working in the consumer CLI.
same here, did you ever figure out what the issue was?
Can you make a video on how to use Spring Data JPA with MySQL and Apache Kafka: fetching the data from MySQL using Spring Data and showing that data in a view, and also a chat system? I want the view in real time...
Do we need to predefine the topic names in the @KafkaListener annotation? What if I have 100 topics, and over a period of time the topic names may grow; how do we handle this scenario?
I cannot consume the messages in any way. I checked the group ID, topic name, everything, but my KafkaListener is not working. Do you have any idea about that?
Nice video. Will you please create more videos on Kafka or JMS?
Sure will do
Simple and clear! Thank you!!!
Good video. Maybe you could make a video about configuring Kafka && ZooKeeper like in a real-world app?
Hi, how can I set up Kafka on a local Windows 10 system?
Please use a white background in STS; otherwise it will create a headache.
Hello. I would like to know if I can have both a consumer and a producer in one Spring Boot application?
Yes Lucas. You can have both
Hi. I have created an application with both, and it worked. Do you intend to create a tutorial about integrating Avro schemas with Kafka? Thank you for your tutorial; it helped me figure out the Kafka integration in Spring Boot.
Awesome ❤️🔥❤️🔥❤️❤️❤️❤️❤️
Nice tutorial! thank you...
Hi, any idea on how to read Kafka messages on an interval basis? Like, I want to read messages once every 5 minutes...
Song was Flix and Chill 2 : Millenials
thank u buddy! amazing job!
You are amazing 👍
Why a ConcurrentKafkaListenerContainerFactory in the consumer and not in the publisher?
I am getting an error: "Error while fetching metadata with correlation id : {exampletopic=UNKNOWN_TOPIC_OR_PARTITION}". I have verified that the topic is present. Can you please help?
Good tutorial, wasn't expecting it.
Hi, can I consume messages from multiple different topics with one listener or one consumer method?
Hi, I'm getting User{name='null', dept='null'} printed. Can you please help debug this?
I am getting an error while starting the application... In the Kafka server terminal, it shows an error related to API key 3 and API version 2. I am not finding a solution for it.
Hi, I need to create a topic, but I do not have Kafka on my system. I have created a Spring Boot microservice using gpscaffold, and in that I added a Kafka consumer. Now I need to create a topic, but I am confused about where I should create it.
You have to set up Kafka, buddy. Check the Kafka getting-started documentation.
Hi, is it possible to create a listener on a Kafka topic dynamically to read, and then destroy it based on some condition? Thanks.
I never did it dynamically. I assume it's possible with Spring, but you may have to understand how the listeners are initialized and registered in Spring internally.
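A possible starting point, hedged (the listener id and wiring below are assumptions): in recent spring-kafka versions, the KafkaListenerEndpointRegistry exposes containers that can be started, stopped, paused, and resumed at runtime:

```java
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.kafka.config.KafkaListenerEndpointRegistry;
import org.springframework.kafka.listener.MessageListenerContainer;
import org.springframework.stereotype.Service;

@Service
class ListenerControl {

    @Autowired
    private KafkaListenerEndpointRegistry registry;

    // Assumes a listener declared elsewhere as @KafkaListener(id = "myListener", ...)
    public void pauseListener() {
        MessageListenerContainer container = registry.getListenerContainer("myListener");
        if (container != null && container.isRunning()) {
            container.pause();   // temporary suspension; resume() undoes it
        }
    }

    public void stopListener() {
        MessageListenerContainer container = registry.getListenerContainer("myListener");
        if (container != null) {
            container.stop();    // full stop; start() brings it back
        }
    }
}
```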
Can the previous producer and current consumer work in tandem?
Yes buddy. It does
A little finding regarding the "no default constructor" error: alternatively, you can annotate the parameterized constructor with @JsonCreator so that a redundant default constructor can be avoided. This way it'll be helpful to keep the immutability of the User class, should you decide to make it so later on.
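A sketch of that idea (field names follow the video's User example; @JsonCreator and @JsonProperty are standard Jackson annotations):

```java
import com.fasterxml.jackson.annotation.JsonCreator;
import com.fasterxml.jackson.annotation.JsonProperty;

public class User {
    private final String name;  // final fields: the class stays immutable
    private final String dept;

    // @JsonCreator tells Jackson to use this constructor for deserialization,
    // so no default (no-arg) constructor is needed.
    @JsonCreator
    public User(@JsonProperty("name") String name,
                @JsonProperty("dept") String dept) {
        this.name = name;
        this.dept = dept;
    }

    public String getName() { return name; }
    public String getDept() { return dept; }
}
```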
Thanks. How can I create a deserializer for all models?
Very helpful tutorial!! Where does Kafka store data? Is it a kind of DB as well?
It's stored in the filesystem.
Thanks for this tutorial
Can we have multiple group IDs in one consumer config method?
Hello, I have a question. Why did we start a consumer when we created the producer Spring Boot app, and now we're starting a producer when this tutorial is about a consumer app?
He started a consumer because he wanted to check whether the messages his producer Spring Boot app sends to the topic are actually received by the consumer, and vice versa! :)
How do we commit the offset after receiving each message?
How can we create consumer listeners dynamically per topic?
Awesome Man!
I tried with only the JSON listener, but it is showing me a "no bean found" error:
Parameter 1 of method kafkaListenerContainerFactory in org.springframework.boot.autoconfigure.kafka.KafkaAnnotationDrivenConfiguration required a bean of type 'org.springframework.kafka.core.ConsumerFactory' that could not be found.
- Bean method 'kafkaConsumerFactory' in 'KafkaAutoConfiguration' not loaded because @ConditionalOnMissingBean (types: org.springframework.kafka.core.ConsumerFactory; SearchStrategy: all) found bean 'userConsumerFactory'
Looks like you missed an annotation. Can you cross-check with my code on GitHub?
Bro, your intro music is lit. Any link?
Great videos, sir.
Could you please share the Kafka publish video?
Here you go ua-cam.com/video/NjHYWEV_E_o/v-deo.html
How can a consumer receive messages if it connected late (after the message was sent from the producer)? Please help.
That's done using the offsets in Kafka.
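Concretely (a config fragment, not a full program; props is assumed to be the consumer factory's property map): what a late-joining consumer group sees is governed by committed offsets together with the standard auto.offset.reset property:

```java
// "earliest": a consumer group with no committed offset starts from the
// beginning of the topic, so it also receives messages sent before it joined;
// "latest" (the default) only delivers messages sent after it joins.
// A group that already committed offsets simply resumes from where it left off.
props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
```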
I didn't understand how you created the producer!
That's a command-line based producer. You can check my other video on the Kafka producer to see how it can be done via code.
I'm getting this error: "org.apache.kafka.common.errors.SerializationException: Error deserializing key/value for partition". I don't know why it keeps looping. Anyone, please help.
Hi. Have you fixed this problem? I have the same error, and I don't know what to do.
Can you make a video on how to integrate a web application with Paytm?
Hi, I tried both the publisher and the consumer steps mentioned here as two different Spring Boot applications. I could publish the message easily, but while trying to consume the message I am getting the error below:
org.apache.kafka.common.errors.SerializationException: Error deserializing key/value for partition kafka_example-0 at offset 1. If needed, please seek past the record to continue consumption.
Caused by: java.lang.IllegalArgumentException: The class 'com.aexp.dbkafka.entity.ClientUserDTO' is not in the trusted packages: [java.util, java.lang, com.aexp.kafkaconsumer.bean]. If you believe this class is safe to deserialize, please provide its name. If the serialization is only done by a trusted source, you can also enable trust all (*).
Note that the object in the publisher application is com.aexp.dbkafka.entity.ClientUserDTO, while in the consumer application it is in com.aexp.kafkaconsumer.bean. How should I solve this?
Hi Kaushik, even I am getting the same error; did you manage to solve it?
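One possible fix, sketched from spring-kafka's JsonDeserializer API (the package name follows the error message; ClientUserDTO is the commenter's own class, and the wiring around this snippet is assumed): either add the producer's package to the trusted packages, or ignore the producer's type headers and always deserialize into the consumer's local class.

```java
import org.springframework.kafka.support.serializer.JsonDeserializer;

class TrustedPackagesFix {

    // Option 1: explicitly trust the producer's package (it is outside the
    // deserializer's default trusted list, which caused the exception).
    JsonDeserializer<ClientUserDTO> trusted() {
        JsonDeserializer<ClientUserDTO> d = new JsonDeserializer<>(ClientUserDTO.class);
        d.addTrustedPackages("com.aexp.dbkafka.entity"); // or "*" to trust everything
        return d;
    }

    // Option 2: pass useHeadersIfPresent = false so the type headers written by
    // the producer are ignored and the payload is always mapped to the local class.
    JsonDeserializer<ClientUserDTO> localTypeOnly() {
        return new JsonDeserializer<>(ClientUserDTO.class, false);
    }
}
```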
pretty good, thanks
Big Thanks
thanks bro
Hi Tech Primers, in the example we are using the User model, so we created a new consumer factory. I have 20 models; do I need to create 20 consumer factories, or is there another solution? Could you please guide me on that?
Thank you
Am I the only one who is looking for the part where he explains how to install Kafka?
Nice movie
Awesome
Kafka without SSL is not Kafka
A good Indian guy
Indian guy
Do you know how to configure the JSON deserializer through the application properties file rather than configuring it in Java?
thank you