Spring Cloud Stream With Apache Kafka Binder | Example | JavaTechie
- Published Feb 4, 2025
- This video will guide you through building highly scalable, event-driven microservices connected to a shared messaging system using the Kafka binder
#javatechie #SpringCloud #ApacheKafka #kafkaStream
GitHub:
github.com/Jav...
Blogs:
/ javatechie
Facebook:
/ 919464521471923
Guys, if you like this video please do subscribe and press the bell icon so you don't miss any updates from Java Techie
Disclaimer/Policy:
--------------------------------
Note: All content uploaded to this channel is mine and is not copied from any community. You are free to use the source code from the GitHub account mentioned above.
Basant thank you. You are a Genius and a very technical and kind person.
This is really awesome.
I just created a similar producer-consumer that produces data as an object using plain Kafka, which required lots of configuration on both sides.
With Spring Cloud Stream we specified almost no configuration.
Yes exactly
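For anyone curious what that "almost no configuration" looks like: with the Kafka binder on the classpath, the legacy Source/Sink pair from the video needs little more than a destination mapping in application.yml. A minimal sketch; the topic name javatechie-demo is a placeholder:

spring:
  cloud:
    stream:
      bindings:
        output:                      # Source channel (producer side)
          destination: javatechie-demo
        input:                       # Sink channel (consumer side)
          destination: javatechie-demo

Everything else, broker address, serialization, and channel wiring, falls back to the binder's defaults.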
What a relief, I was struggling to understand these. Thank you so much, excellent explanation :-)
Thanks Kamal ☺️
Simply super, nice videos 👍👌
Awesome... Thank you for sharing your precious knowledge with us.
Great tutorial with such a good explanation. Thank you for making this kind of tutorial.
Very helpful tutorial, you are doing great work, keep it up...
Awesome explanation... thank you
Two questions I faced in an interview on Kafka/RabbitMQ:
1) How do you avoid duplicate messages being consumed on the consumer side?
2) How do you handle a topic that is full of messages?
Did you find a solution to 1?
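A common answer to 1) is an idempotent consumer: record an identifier for every processed message and skip repeats. A minimal sketch under the video's legacy Sink model; OrderEvent and its eventId field are hypothetical, and a real system would keep the seen-id set in a persistent store such as a DB or Redis:

import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;
import org.springframework.cloud.stream.annotation.StreamListener;
import org.springframework.cloud.stream.messaging.Sink;
import org.springframework.stereotype.Component;

@Component
public class IdempotentConsumer {

    // In-memory set of already-processed ids; swap for a persistent store in production
    private final Set<String> processedIds = ConcurrentHashMap.newKeySet();

    @StreamListener(Sink.INPUT)
    public void consume(OrderEvent event) { // OrderEvent is a hypothetical payload type
        // add() returns false when the id was already seen, i.e. a duplicate
        if (!processedIds.add(event.getEventId())) {
            return; // skip the duplicate silently
        }
        // ... actual business logic goes here ...
    }
}

For 2), a "full" topic is usually a retention and consumer-lag question rather than overflow: Kafka keeps messages until the retention period expires, and consumers catch up by offset.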
The content you showcase is amazing and really easy to understand, thank you sir
At last my doubts are clarified... Thanks a lot
Excellent Bro..👍
Awesome . Very clean explanation. Thank you Bro🙏
Superb, explained in a very easy way, and it helped me remember this concept. Please make a video on Kafka vs RabbitMQ with a use case.
Very informative. Could you also make a video on data transfer using Avro instead of JSON? Also, some use case where multiple input and output topics are involved would be great. Thanks.
Excellent explanation as always sir..!
Extremely helpful for beginners, thanks
Quality video tutorial. Just curious to know which software you are using for screen recording and which device you are using for voice recording.
God Bless You.
nicely explained, very helpful
Excellent .
Thank you so much 🎉🎉🎉
Thank you. This is quite detailed.
Hi @Java Techie, I have a specific question. Let's assume that for some reason your consumer application was down. How would you know that a message sent through the message bus (Kafka) has not been consumed, and how do you mitigate that?
It doesn't matter whether your consumer is up or not once the message has reached the topic; whenever your consumer comes back and listens, it will immediately receive the events
@@Javatechie, thank you. I will check this out.
Clear and concise explanation! Thanks!
Yes, its clear and very easy to understand.
@@raghavendrabhandari6150 true
Awesome
Thank you, I was wondering how Spring Boot and Kafka are related.
Well explained. Thank you for this explanation
awesome broo.. waiting for this tutorial.. clear explanation
Hello @Basant, thanks for posting such an informative, detailed video in a simple way.
Just one small doubt: when the consumer was down initially, your first request from the producer should have been parked in the topic (18:39), but it was not reflected when the consumer came up (18:55). Did we lose the message published initially? The consumer should automatically have picked up the message parked at 18:39 when it started. Please guide.
The consumer should automatically pick it up when it's back online, buddy
Great video.. Thank you !!!
Hi Java Techie, all your videos are worth trying. Could you please make a video on the recent Spring Cloud Stream (3.1.3) with Kafka, because certain annotations you used, like @EnableBinding and @StreamListener, are deprecated.
Hi Asha, definitely, I will try with the updated one
@@Javatechie Dear, thanks a lot for the amazing Kafka videos. It would be great if you could make a video as per the above request.
@@subhrajeetpadhy Sure buddy, will do this
Very informative. Can you show how to configure multiple bindings in the same service?
Yes will definitely try
@@Javatechie Can you also show a case where a Sink/Consumer can also publish to another topic after it is done processing? Great video!
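That consume-then-publish case is what the legacy model's Processor binding is for: one input channel plus one output channel on the same service. A rough sketch; the actual topics are mapped under the input/output bindings in application.yml:

import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.annotation.StreamListener;
import org.springframework.cloud.stream.messaging.Processor;
import org.springframework.messaging.handler.annotation.SendTo;

@EnableBinding(Processor.class)
public class ForwardingService {

    // Consumes from the input-bound topic, and the return value is
    // published to the output-bound topic
    @StreamListener(Processor.INPUT)
    @SendTo(Processor.OUTPUT)
    public String process(String message) {
        return message.toUpperCase();
    }
}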
Very useful videos. Where do we add the Kafka properties?
In application.yml
Kindly do a video on Spring Cloud Stream with Supplier, Function, and Consumer
Hi Basant,
I like watching all your YouTube videos.
I have a recommendation for you, and it should be quick and easy:
can you please dockerize these projects (current video)?
A producer-consumer of Apache Kafka with Spring Cloud on DOCKER,
on Windows OS (please)
Ok I will do this
You created the topic in Kafka, and I believe Kafka is running on some port. How are the publisher and subscriber communicating with the Kafka server to produce/consume from the topic, given that we did not provide any configuration on the producer/consumer side?
Amazing and very helpful, thanks so much... it's just superb. I would like to achieve exactly the same thing but using Docker instead of running locally. Do you have a tutorial that explores something of that nature?
I haven't tried it using Docker
Hi, @EnableBinding is deprecated. Can you please let us know what alternative we can use?
Thanks for accepting my request
Thanks for this. It's a shame @EnableBinding is already deprecated, and beyond the note "@deprecated as of 3.1 in favor of functional programming model" there is little documentation. There don't seem to be many reliable resources to go on.
Did you find out how to replace it?
@Bean(name = "input")
Function<String, Transformer> uppercase() {
    return in -> new Transformer(in, in.toUpperCase());
} // it will read from topic input-in-0, perform the transformation, and publish to topic input-out-0 // no configuration needed in the properties file or a Java class
Thank you :)
Thanks for the video, if possible make an example with RabbitMq also, so that we can differentiate easily
Will do
@@Javatechie thank you bro
Nice video. I have one question here: if I need to consume more than one object, how can I do it?
Publish a list of objects from the source and consume the list in the sink
Hi @Basant, nice presentation, and I am trying to do the same. But the consumer is not listening to the message. I haven't added the .iml file. Is there any reason the statement in that method is not printing?
Is the Sink annotation there?
@@Javatechie Yes, @StreamListener(Sink.INPUT), bindings enabled for the class, and annotated with @Component. It shows no errors but never enters that method
Thanks. Can you share a video on how to configure multiple brokers?
@EnableBinding is deprecated now. What is the alternative? I want to build a Kafka consumer in Spring Boot. What are the options?
Same query.
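Since 3.1 the documented replacement is the functional model: expose a java.util.function.Consumer (or Function/Supplier) bean and let Spring Cloud Stream derive the binding from the bean name, following the <beanName>-in-0 convention. A minimal consumer sketch; the topic name is a placeholder:

import java.util.function.Consumer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class ConsumerFunctions {

    // Binds to channel "consume-in-0"; map it to a topic with
    // spring.cloud.stream.bindings.consume-in-0.destination=javatechie-demo
    @Bean
    public Consumer<String> consume() {
        return message -> System.out.println("Received: " + message);
    }
}

With more than one functional bean in the app, also set spring.cloud.function.definition so the binder knows which ones to bind.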
We didn't mention the Kafka IP and host details here; isn't that required with the binder?
It's not required; that's the beauty of Spring Cloud Stream. Based on your binder, Spring Cloud is smart enough to identify which messaging system you are using (the Kafka binder defaults to localhost:9092)
I really respect you sir.
You always reply on comments thanks.
What if the Kafka server and the application are on different servers?
Then you need to configure the bootstrap server details, buddy, but again that is a one-line change
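For reference, that binder-level property would look something like this in application.yml; the host and port are placeholders:

spring:
  cloud:
    stream:
      kafka:
        binder:
          brokers: remote-host:9092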
Sir, the pieces other than the binder look tightly coupled. In a real-world scenario we expect innumerable back-and-forth interactions between the microservices. Configuring one-way messaging channels seems quite expensive from a development standpoint. Could you please suggest an alternative solution offering a balanced approach?
Hi Basant,
I have a question: how do we get the data from a topic in a consumer application?
Usually we get it in the form of a JSON string
@@Javatechie OK. How do we read the data from the topic and respond with it / provide it in a REST API?
Please check out my Kafka producer and consumer example
Bro, you used Spring Cloud Stream but not KStream. Isn't it necessary to implement KStream while working with Cloud Stream?
Streaming is done using Streams, right?
Hi Satya,
may I know the purpose of using KStream here?
I'm new to Kafka. I implemented your video, but when I started to read about Kafka Streams, all the code was written using KStream. That's why I was asking
Nice one. Can we configure multiple topic channels on the consumer side?
Yes we can
@@Javatechie Can you please give some sample code for this? I have some requirements.
@@Javatechie Can you please give your contact details?
Hi, nice explanation. I am new to the Spring Cloud Stream concept. I have one doubt: can we use the same input channel with more than one @StreamListener? Please clarify.
You can, but only one of the running applications will read the message
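Note that the one-reader-per-message behaviour across application instances holds when the instances share a consumer group; without a group, each anonymous consumer receives every message. A sketch in application.yml, with placeholder names:

spring:
  cloud:
    stream:
      bindings:
        input:
          destination: javatechie-demo
          group: order-service   # exactly one member of this group receives each message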
If we don't add the Source binding in the publisher, will the consumer not work? Or if a message was already published by someone else and I'm writing a consumer with the Sink binding, will I be able to receive the message?
On your first question: how will the consumer consume the message if you don't publish it?
Source -- publisher
Sink -- Consumer
If possible, make a video on Kafka async produce and consume; it would help us a lot.
Definitely I will check this out
@@Javatechie thank you Bro
Is there any specific reason we have to create application.yml when we already have application.properties? Can't we specify the bindings inside application.properties?
It's up to you ☺️. I prefer application.yml
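For anyone who wants to stick to a single file, the same bindings translate directly to application.properties form; the topic name is a placeholder:

spring.cloud.stream.bindings.input.destination=javatechie-demo
spring.cloud.stream.bindings.output.destination=javatechie-demo

YAML just nests the same keys, which is why it tends to read better once the bindings section grows.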
Hi,
in my project I am using the Spring Cloud Stream binder with Azure Event Hubs as the broker.
Everything works well for me, meaning I am able to publish the event.
But when I load-tested the producer using JMeter, say with 1000 threads, the app publishes some random number of events to Event Hubs and then throws an emission exception with a message like "doNext or doError must be signaled serially".
Any idea on this?
Haven't tried with a cloud messaging queue, buddy, will check and update
Getting this error when running the producer while passing data through Postman: Failed to convert message: 'GenericMessage [payload=com.creator.cloud.stream.api.model..... Can you help? There is a change in the Spring version, I guess.
Hi, you might have forgotten to add the Apache Kafka link in the description.
Where do we specify the Kafka server details (host, port) it needs to connect to?
Not required; Spring Cloud will identify it based on the binder you added in your pom
I hope this is the same for STS? Instead of YAML we can use application.properties; I am not clear why you require both files.
Yes, we can use just one, no issues
If we have 2 producers in one service and 2 consumers in another service, then how will it work?
You need to define that info in the application.yml bindings section
@@Javatechie Thanks for your answer, but can you share an example?
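A rough sketch of how that bindings section could look with two channels on each side; all channel and topic names here are placeholders:

spring:
  cloud:
    stream:
      bindings:
        orderOutput:
          destination: orders-topic
        paymentOutput:
          destination: payments-topic
        orderInput:
          destination: orders-topic
        paymentInput:
          destination: payments-topic

In the legacy model those channel names come from a custom interface with @Output/@Input methods that you register via @EnableBinding.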
Hi. Is it possible to show an example with manual acknowledgment functionality along with some other custom configurations, just like a plain Kafka consumer? I am not able to find any reference.
How to configure Kafka if it is AWS Kafka (MSK)? Do we need to use the MSK URL in bootstrap_server_config?
Hi Abhishek, I am not sure. I believe we should use an AWS-provided messaging queue like SQS
Hello Java Techie, can you also explain the case where Kafka is on one server and the producer and consumer are on another? I think we then need to define configuration somewhere so that the producer and consumer can send messages through that server.
Yes, both are running on different servers.
We could separate them into 2 different applications to understand it in a better way
Any reference for remote partitioning using Spring Cloud Stream?
Where are you specifying the broker details of Kafka?
Can we define multiple topics in the YAML file? If we can, then how can we choose a specific topic for a channel?
I tried this, but the IDE is not working with Spring Boot; for you it's working like anything. Is there any video for this?
It will work; add the Spring Assistant plugin to it.
@@Javatechie I tried that way, bro, but it failed, for the recent IDE version 2020.2 I think. Which version works, please?
I was using 2019.3
@@Javatechie Thanks bro. I installed the same version and the Spring Assistant plugin as well; I'm getting the Spring Initializr option, but autocomplete is still not coming up for application.yml
Can we connect on Google Meet? javatechie4u@gmail.com
It looks like some configuration problem
What if we want to publish data on multiple topics from 1 microservice to multiple microservices from multiple controllers? Can we deploy the Kafka server on Kubernetes pods? Could you please explain the same using GCP Pub/Sub with multiple topics? Also, how can we be sure that the subscriber has executed for that payload; do we get any acknowledgment ID? And can we see the unacknowledged messages in the Kafka server which were not passed to the subscriber (consumer)?
Also, with GCP I have faced a problem using Pub/Sub: until the acknowledgment comes, it keeps executing the same message, and in my code, due to some error, it never returns the ack, so the same message is executed continuously. Please help me, as I have not configured Pub/Sub at my company before.
I would appreciate it if we could discuss this over Google Meet. Thanks
If you create a separate class for the publisher instead of putting the publisher logic in the main class, do you have to put @EnableBinding(Source.class) on the main class or on the publisher class?
You need to add both Source and Sink in @EnableBinding
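In other words, something like this on the main configuration class; a legacy-model sketch where the class name is a placeholder:

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.messaging.Sink;
import org.springframework.cloud.stream.messaging.Source;

@SpringBootApplication
@EnableBinding({Source.class, Sink.class})   // binds both the publisher and consumer channels
public class SpringCloudStreamApplication {
    public static void main(String[] args) {
        SpringApplication.run(SpringCloudStreamApplication.class, args);
    }
}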
Hi bro, I'm trying to follow your code,
but @EnableBinding has been deprecated here. Is there an alternative way? Please suggest
I haven't checked the updated version
If we have more than one controller in the publisher and more than one consumer method, how will it execute the respective consumer method?
In the consumer we are using @StreamListener, right? So you need to specify the channel name
`@StreamListener` is deprecated now.
How do we configure the consumer service for multiple topics and a group?
Where do we add the broker details when the broker is running remotely?
The same place, in application.yml
Where do we mention the IP and port the Kafka server is running on?
Can you make a video on integrating Kafka with a Spring Boot microservice architecture on Kubernetes?
I am not very familiar with Kubernetes
@@Javatechie No problem! Please keep making great videos on Java Concepts! :)
Yes Lau I will do
How can we use a DB?
Can we bind the Source and Sink classes in the same application?
I have one more doubt: how do we configure multiple topics in the producer and consumer?
Is the GitHub link for this application accessible? I am not able to access it. Can someone share the working GitHub link for this application?
Yes, it's a public repository everyone can access.
Log in to GitHub, then try to access it
Please post the producer error-handling methodologies with a DLQ
Please make a video on RabbitMQ with the UI, thanks
Will do 👍
Sir, can you tell me how to retrieve the data from two different services at a time, please?
Hi Yogesh, are you trying this using a REST API call?
@@Javatechie Yes, I am trying a REST API
How do we configure more than one destination topic?
Kinda weird to see Lombok used without @Log or something when you need a logger
@Slf4j is enough
Could you explain what Cloud Pub/Sub is?
I never tried this, will check and let you know
Are you working only with Java?
Yes, working in Java
Please help me with this: I have 2 methods behind separate API calls, and both consume the same topic. When I publish data, only the first method is called; the 2nd is not. What do I need to do? Sample code:
@StreamListener("inputchannel")
public void consumeMessage(String message) {}
@StreamListener("inputchannel")
public void consumeMessage2(String message) {}
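With two @StreamListener methods subscribed to the same channel, each message is typically dispatched to just one of the subscribers, so this behaviour is expected. One documented way to route specific messages to specific methods is @StreamListener's condition attribute, a SpEL expression evaluated against the message; the "type" header below is an assumption about what your producer sets:

@StreamListener(target = "inputchannel", condition = "headers['type']=='order'")
public void consumeMessage(String message) { /* handles order messages */ }

@StreamListener(target = "inputchannel", condition = "headers['type']=='payment'")
public void consumeMessage2(String message) { /* handles payment messages */ }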
Awesome
Very nicely explained. Thank you