while running in my local same code is giving the serializationException,unable to figure it out every thing is downloaded from GIT.pls help me to resolve
Wasted a whole day reading a dzone article regarding kafka, this tutorial made my day, beautifully explained!!!!
Glad it was helpful!
Absolutely
bahut sahi bhai
Yes ,dzone is waste to read. Only that author will know what he is trying to say
Watched so many Vidoes on Kafka to understand but finally end up with this video for kafka understanding.. Thank you for serving IT community.
Great and simple explanation. Nothing extra, just straight on point all the time. Thank you!
This is pure gold, the way you explained using a simple GET and showing how to pass a String and then a JSON make the example perfect, great job.
Great video.. Easily understood.. I am a beginner to Kafka. Didn't understand all the theory part. But this practical example useful very much.
Perfect video to start playing with Kafka & spring boot
Amazing channel!!! ONE OF THE BEST I've SEEN!!! THANK YOU SO MUCH! WITH LOVE & RESPECT FROM KAZAKHSTAN!!!!
thats exactly what i was looking for, thank you so much!
It is very useful video; I learned about Kafka from this video tutorial. Thank you so much and god bless you.
Very straightforward explanation. Thank you so much. It's very useful for me!
Very simple and straight forward explanation .... Thanks...
Short and sweet tutorial. Thanks 😊
Cool!! Very useful video; it makes Kafka simple to understand. May I request you to make a video on multiple microservices communicating with each other using Kafka? Thank you once again for the tutorial.
Yes sir, please do the same.
Do it do it do it
Awesome - simple and great content
Thank you for clear explanation along with example
Very clear explanation. Thank you
Glad I found this channel
Great and simple explanation
If possible, please share a real-time use case of Kafka in Spring Boot µservices.
It's already shared, Naveena.
Take a look at my video on Kafka Streams
Really good explanation. It was working perfectly for me.
How do we work with multiple cluster nodes with ZooKeeper and Spring Boot (Kafka)?
How does it prioritize incoming requests, given that requests go to all nodes (1..n) and one node gets priority? How does Kafka handle requests and pass them to ZooKeeper with Spring Boot?
One of the great tutorials on Kafka, clear and concise
Awesome video bro. Really helpful. Thanks.
great, very nice explanation
Thanks for simple and clear explanation
Greatly explained. Please post more videos on Kafka. Thank you.
The @Bean above the producerFactory method is not required, right? Since it is getting called from the kafkaTemplate method, which has @Bean.
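For context, a minimal sketch of what such a config class might look like (class name, topic, and broker address here are assumptions, not taken from the video). Because the class is annotated with @Configuration, Spring proxies it, so calling producerFactory() from kafkaTemplate() returns the same singleton bean rather than a new instance; autowiring would work too, and the direct call is simply the common idiom.

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;

@Configuration
public class KafkaProducerConfig {

    @Bean
    public ProducerFactory<String, String> producerFactory() {
        Map<String, Object> config = new HashMap<>();
        config.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        config.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        config.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        return new DefaultKafkaProducerFactory<>(config);
    }

    @Bean
    public KafkaTemplate<String, String> kafkaTemplate() {
        // Because @Configuration classes are CGLIB-proxied, this call is
        // intercepted and returns the singleton producerFactory bean,
        // not a brand-new factory on every invocation.
        return new KafkaTemplate<>(producerFactory());
    }
}
```

So the @Bean on producerFactory() is only required if you also want to inject the factory elsewhere; the proxying is what makes the direct call safe either way.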
We can also configure in application.properties instead of configuration class.
very nice...neatly explained
Glad you liked it
BEST and CLEAN Thanks
Hi, can you also cover the Sync and Async producer send options with Future and Callbacks?
Simple and good explanation.
Lovely explanation 👌🙏🏼
Instead of the config class, we can use the properties file by adding two entries to application.properties:
spring.kafka.producer.value-serializer=org.springframework.kafka.support.serializer.JsonSerializer
spring.kafka.producer.key-serializer=org.apache.kafka.common.serialization.StringSerializer
Spring Boot will take care of the DI.
What if I want to use my own serializer, like MapStruct for example?
Thank you! You solved my problem!
Thank you for your video. I want some clarification on Kafka: is it suitable for financial services, like the banking sector?
If you use the necessary security measures for authentication, authorization, data encryption/masking, storage, etc., then you can use it.
Here is a case study from Capital One bank: www.confluent.io/kafka-summit-sf18/building-an-enterprise-streaming-platform-capital-one/
@@TechPrimers thank you for your reply
Good stuff, straight to the point. Thanks.
Thanks. It worked like a charm!!!!
I have a doubt: at 16:26 we created a bean for the producer factory, but instead of autowiring it into the KafkaTemplate, why do we need to call the producerFactory() function?
Good and clear explanation..
Thanks for the video. It's very useful.
Awesome video !! Hello from Colombia
Hi Carlos. Great to see you.
Welcome to TechPrimers
Really thankful for your videos; I have somehow started my journey in microservices. One query: *where will we use Kafka in real-world projects? If any of you guys could share an example, it would be really helpful.* Thank you.
Usually Kafka is used in the Big Data space, where huge streams of messages need to be processed. It goes well with Spark Streaming and Kafka Streams.
Thank you Ajay for the instant reply
It can also be used for general event-driven programming. You may have many microservices in your project, and they may be waiting for some event to be triggered asynchronously. By putting messages onto a Kafka topic, the consumer will detect that a message has arrived on that specific topic, consume it immediately, and enable that flow to begin, whilst some other unrelated process continues on as part of its own flow.
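As a hypothetical sketch of that pattern (the listener class, topic, and group id are made-up names, not from the video): the consuming service just declares a listener method, and Spring invokes it whenever a message lands on the topic, so the publishing service never waits for this flow to run.

```java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Service;

@Service
public class OrderEventListener {

    // Invoked asynchronously whenever a message arrives on "order_events";
    // the publisher is not blocked while this flow executes.
    @KafkaListener(topics = "order_events", groupId = "order_group")
    public void onOrderEvent(String message) {
        System.out.println("Starting order flow for event: " + message);
    }
}
```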
Thanks man, simple but detailed and informative. Keep it up, wish you all the best...
Thank you so much.. it is very simple and I made it in 30 mins using this tutorial
Thanks for this video. Simple and useful. it was very helpful!
Great session!
Awesome explanation. Able to understand the concepts clearly. Thanks.
Good work guys!
Thank you so much, bro.
This video helped me a lot :)
Awesome explanation, thanks! Could you do some for Spring Boot reactive WebFlux?
You are my tech guru
It was an amazing tutorial. You have explained the producing side, but what do you think about the Kafka consumer? Did you cover any video for that?
Good tutorial with detailed info. While you are importing packages, please show the imported classes once you are done coding the class.
Amazing video!
Excellent tutorial. I followed along, but the only difference is that I use Kafka inside a Docker container.
@cafeta Can you please help me with sample code? I am new to Kafka and trying to use it with Docker. The Kafka broker and ZooKeeper are running in Docker, and I am trying to produce and consume from another Docker container.
very nice tutorial.
One thing I'd request you to add is Kafka testing.
Super good and straightforward. My only question is: why are you using BOOTSTRAP_SERVERS_CONFIG with port 9092 if the default port is 2181? Thanks.
Very nice and useful video
Can you make a video on rebalancing and how to handle rebalances at the consumer end?
The way you are explaining is really good; you are making things really easy...
I have one question though. In the first example, when we are sending a string message to Kafka, we didn't configure anything: no server (bootstrap servers), no serialization info (key and value serializers).
How does that happen? Does KafkaTemplate do some magic inside by default?
What happens if I don't send serializer info to the Kafka broker while publishing a message?
Thank you TechPrimers for this video; it was helpful for me to get started. Also, I would request that you explain the concepts a bit slower than the present speed in this video.
You can leverage the speed control in YouTube to slow down the video to 0.5x or 0.25x 🤓
Thanks for the great content
awesome Video !!! thanks dude!
I am facing an issue running the Kafka server. My ZooKeeper is up, but when I try to bring up the Kafka server, nothing is displayed on the screen. I have even checked the PATH variable in the user/system variables.
very good tutorial!
Great! Thanx for your effort!
What if the Kafka server is down while we are pushing data? How do we handle this situation?
Great. Thanks for your work, as always.
Understanding it clearly, bro. But my question is: in my case I want to send file content to a topic and then consume it. What kind of value serializer do I have to use?
Good tutorial. Maybe you could make a new video about consuming Kafka messages with Spring Boot.
Yes, I uploaded the consumption part now. Check out the latest video.
Ooooouu so fast.
Very good video and a nice explanation in simple terms. I have one query: can we publish a flat file to a Kafka topic? If yes, please provide details or an example.
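One possible sketch for the flat-file question (the helper class and the kafkaTemplate usage in the comment are assumptions, not from the video): since a flat file is just text, you can read it into a String and publish it through the same KafkaTemplate<String, String> with the default StringSerializer. Very large files would be better split into per-line or chunked messages, since brokers cap message size (about 1 MB by default).

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class FlatFileReader {

    // Read the whole flat file into one String; the result can then be sent
    // as a single message, e.g. kafkaTemplate.send("flat_file_topic", body);
    public static String readFlatFile(Path path) throws IOException {
        return Files.readString(path);
    }
}
```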
Thanks very much for this video.. Simple and effective
Hi sir, I am unable to print the data in the console while hitting the localhost URL.
The browser shows "Published successfully", but nothing is publishing to the console. Please assist me with this.
Thanks in advance.
Nice video. Can you please tell me why we need to start a consumer when we are creating a producer?
That’s just for demo
Sweet and simple , thanks for the video :)
Sir, please mention the order of the videos. For self-learners it's too hard to follow the playlist.
Hello Sir, how do we decide when to use messaging versus when to use RestTemplate? Can you give me some practical use cases for messaging?
Hi,
I got one error when I tried to publish the User object:
kafka.controller.User cannot be cast to java.lang.String
I set the configuration exactly the same way as in the video.
I followed exactly the same thing as yours, but why did I get an error saying "cannot convert value User to class org.apache.kafka.common.serialization.StringSerializer specified in value.serializer"?
Add this to the application.properties file:
spring.kafka.producer.value-serializer=org.apache.kafka.common.serialization.StringSerializer
spring.kafka.producer.key-serializer=org.apache.kafka.common.serialization.StringSerializer
Nice explanation!
Now I need to find out how to receive a published message (serialized as JSON) in another Spring application.
What do you think about having a common module between two or more applications, containing the message templates?
That would be handy, but also dangerous if somebody changes them without checking who is receiving them.
There is a concept of a Schema Registry, where the message contract is stored. Confluent Kafka provides it.
@@TechPrimers thx
Very good! Thank you friend! 🇧🇷
The Kafka topic was created on port 2181, so why was the producer config set to 9092?
In the consumer video the config also had the same port, 9092. The producer should be posting on port 2181 and the consumer should be listening on port 9092, right?
How do I work with Kafka on Windows? It keeps throwing a file-access-denied error.
What is the meaning of the key.serializer config? Is it the topic name? Since the topic name is going to be a string, is that why we provide key.serializer = StringSerializer.class?
Hey, please tell me: the Kafka console is not displaying the message, but localhost:8081 shows the message "Published".
Hi, can we publish XML directly to a Kafka topic? If yes, can you provide me an example? I am in need of it. Thanks in advance.
Very good.
From localhost I am not able to publish my message. What can I do?
Thank you for simple and complete explanation. :-)
Please, what is the last song in this video called?
Awesome ❤️❤️❤️🔥❤️🔥❤️❤️❤️❤️
How did Spring Boot initially connect to Kafka without us mentioning anything?
Very good topic, but in my case it was not working for the JSON format; I was getting a ClassCastException. I changed to the following code and it started working:
@Autowired
private KafkaTemplate<String, String> kafkaTemplate;

private static final String TOPIC = "Kafka_Example";

User user = new User(name, "Mechanical", 10000L);
ObjectWriter ow = new ObjectMapper().writer().withDefaultPrettyPrinter();
String json = null;
try {
    json = ow.writeValueAsString(user);   // serialize the User to a JSON string
} catch (JsonProcessingException e) {
    e.printStackTrace();
}
kafkaTemplate.send(TOPIC, json);
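An alternative sketch to the hand-rolled ObjectMapper approach (assuming spring-kafka's JsonSerializer is on the classpath, and reusing the User class and broker address from this thread as assumptions): configure the producer's value serializer as JsonSerializer, and then a KafkaTemplate<String, User> can send the POJO directly with no manual JSON step.

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;
import org.springframework.kafka.support.serializer.JsonSerializer;

@Configuration
public class JsonProducerConfig {

    @Bean
    public ProducerFactory<String, User> userProducerFactory() {
        Map<String, Object> config = new HashMap<>();
        config.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        config.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        // JsonSerializer turns the User POJO into JSON bytes on send.
        config.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
        return new DefaultKafkaProducerFactory<>(config);
    }

    @Bean
    public KafkaTemplate<String, User> userKafkaTemplate() {
        return new KafkaTemplate<>(userProducerFactory());
    }
}
```

With this in place, userKafkaTemplate.send(TOPIC, user) publishes the serialized JSON directly, and the ClassCastException from the StringSerializer goes away.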
Very helpful videos.
Do we have something that can post an Object directly to the topic?
Nope buddy. Not directly
While running the same code locally I am getting a SerializationException and am unable to figure it out; everything was downloaded from Git. Please help me resolve it.
How do I change the port number 9092? 9092 is the default; if I use anything other than 9092, it doesn't work.
awesome video...
That explanation was really really good mate! Cheers for explaining step by step! All the best :)
Good explanation
Very good. Thank you
Can we use a JSON ObjectMapper instead of the serialization class?