thank you
I used Docker to run Kafka, and your directions work fine with that too
😊
Quite helpful video. Thanks
Thank you 😊
Excellent
Rajani is awesome
Thanks 😊
Hi, very nice tutorial. If we need to read from multiple topics with different schemas, can we use the same listener in the consumer?
If multiple topics are on the same Kafka server, then you can.
But if the two topics are on two different servers, you need to configure them separately for each server. I have already done this and will upload it.
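For the same-server case, a minimal sketch of a single listener subscribed to several topics, assuming Spring for Apache Kafka; the topic and group names here are placeholders, not from the video:

```java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Service;

@Service
public class MultiTopicConsumer {

    // One listener method can subscribe to several topics on the same broker.
    // "topic-a" and "topic-b" are hypothetical topic names.
    @KafkaListener(topics = {"topic-a", "topic-b"}, groupId = "demo-group")
    public void listen(String message) {
        System.out.println("Received: " + message);
    }
}
```

For topics on different servers, you would instead define a separate `ConsumerFactory` (with its own `bootstrap.servers`) per cluster and point each listener container at the right factory.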
@@JavaShastra OK, but the schema will be different for each topic, no? Then how can we use the same listener for all topics?
@@pavanim6258 I guess you may want to use polymorphism/interfaces and have wrapper methods to retrieve the schema-specific data
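The polymorphism idea above can be sketched without Kafka at all: each schema implements a common interface, so one shared handler works for every topic. This is a hypothetical illustration; the class and field names are made up, not from the video:

```java
// Common interface that every topic's payload implements.
interface TopicEvent {
    String summary(); // each schema knows how to describe its own data
}

// Schema for one topic (hypothetical fields).
class AddressUpdated implements TopicEvent {
    final String newAddress;
    AddressUpdated(String newAddress) { this.newAddress = newAddress; }
    public String summary() { return "address -> " + newAddress; }
}

// Schema for another topic.
class PhoneUpdated implements TopicEvent {
    final String newPhone;
    PhoneUpdated(String newPhone) { this.newPhone = newPhone; }
    public String summary() { return "phone -> " + newPhone; }
}

public class SharedListener {
    // A single handler works for every topic because it depends only on
    // the TopicEvent interface, not on any concrete schema.
    public static String handle(TopicEvent event) {
        return "processed: " + event.summary();
    }

    public static void main(String[] args) {
        System.out.println(handle(new AddressUpdated("Pune")));
        System.out.println(handle(new PhoneUpdated("999-000")));
    }
}
```

In a real consumer, the deserializer (e.g. a JSON type-mapping deserializer) would produce the right concrete class per topic, and the listener method would accept the interface type.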
One doubt: currently in the KafkaListener the topic name is always the same, so it will consume from the same topic. Is there a way to consume from a particular topic if I pass the topic name via the API? Assumption: all topics are on the same Kafka server.
Hey, I have created a publisher on one port and a consumer on another port. Both are running fine separately. How can I link them together? Can you please help me with that?
Hi, as they are running on different ports here, you need to configure a separate producer factory and consumer factory.
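Note that the two applications don't talk to each other directly: they are linked through the Kafka broker by producing to and consuming from the same topic. A minimal configuration sketch, assuming Spring Boot's `spring.kafka.*` properties (ports and broker address are example values):

```yaml
# producer application (e.g. running on port 8081)
server:
  port: 8081
spring:
  kafka:
    bootstrap-servers: localhost:9092   # same broker as the consumer app
---
# consumer application (e.g. running on port 8082)
server:
  port: 8082
spring:
  kafka:
    bootstrap-servers: localhost:9092   # same broker as the producer app
    consumer:
      group-id: demo-group
```

As long as both apps point at the same `bootstrap-servers` and use the same topic name, the consumer will receive what the producer publishes, regardless of the HTTP ports they run on.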
First of all, very nice videos! But please tell me one thing: here you are implementing a REST controller with Apache Kafka. Can you give use cases for these videos, where we can apply this Kafka concept at the architecture level, and how we can relate this concept to real projects? Then I'd be very thankful 😊👍
Java Shastra, tell me the use cases, bro. Can you relate this concept to any project-level scenario?
1. First of all, it will decouple multiple systems.
2. Real-time project scenario:
1. We have Aadhaar, Driving Licence, PAN, and Voter ID. For each document we have a separate application, API, or portal.
2. Consider Aadhaar the main component. If any Aadhaar info changes, like address, email, or phone number, then those changes should propagate to all the other portals.
3. How you can achieve this:
Whenever any update happens to the Aadhaar details, post that event to a queue.
E.g., you have updated the address via the UI or a REST call in the Aadhaar portal; then we should post that info, as a message, to the queue.
4. Then every other application that wants to update that info just needs to listen to that topic and update its own DB.
Example 2:
Suppose in an insurance project, each month they identify who has not paid the premium, because they want to terminate those policies.
How we can approach it:
1. Fetch the records that need to be terminated.
2. Push those lakhs of records to a queue.
3. Read from the queue one by one and process each record.
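The three steps of the insurance example can be sketched with an in-memory queue standing in for the Kafka topic. This is purely illustrative; the policy IDs and method names are hypothetical, and in a real system step 2 would be a Kafka producer and step 3 a consumer:

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.List;
import java.util.Queue;

public class PolicyTerminationSketch {

    // Step 1: pretend these policy IDs were fetched from the DB as unpaid.
    static List<String> fetchUnpaidPolicies() {
        return List.of("POL-1", "POL-2", "POL-3");
    }

    public static List<String> run() {
        // Step 2: push the records onto a queue (stand-in for a Kafka topic).
        Queue<String> queue = new ArrayDeque<>(fetchUnpaidPolicies());

        // Step 3: read from the queue one by one and process each record.
        List<String> terminated = new ArrayList<>();
        while (!queue.isEmpty()) {
            terminated.add("terminated:" + queue.poll());
        }
        return terminated;
    }

    public static void main(String[] args) {
        System.out.println(run());
    }
}
```

The point of the queue is that fetching and processing are decoupled: the batch job can dump lakhs of records quickly, while consumers work through them at their own pace.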
Check the comment below; I have explained the real-time scenarios there.
@@JavaShastra Thank you so much ❤️👍
Excellent and lucid way of describing
Whenever I post a message I get the following error:
Can't convert value of class java.lang.String to class org.apache.kafka.common.serialization.ByteArraySerializer specified in value.serializer
Can you please help me out here? I have followed your entire code.
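That error usually means the producer's `value.serializer` is configured as `ByteArraySerializer` while the code is sending a `String` value. A likely fix, assuming a Spring Boot setup (these are standard `spring.kafka.*` keys; adjust if your serializers are configured in a `ProducerFactory` bean instead):

```properties
# application.properties for the producer app:
# make the serializers match the String key/value being sent
spring.kafka.producer.key-serializer=org.apache.kafka.common.serialization.StringSerializer
spring.kafka.producer.value-serializer=org.apache.kafka.common.serialization.StringSerializer
```

Alternatively, keep `ByteArraySerializer` and send `message.getBytes(StandardCharsets.UTF_8)` instead of the raw `String`; either way, the configured serializer and the value type must agree.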
How can I send messages to a Kafka topic via the HEC event API? Can you please give some information?
I have never come across the HEC API.