Microservice | CQRS Design Pattern with SpringBoot & Apache Kafka | JavaTechie
- Published 3 Aug 2023
- In this tutorial, we will implement the CQRS design pattern using Spring Boot & Kafka.
CQRS stands for Command and Query Responsibility Segregation, a pattern that separates read and update operations for a data store. Implementing CQRS in your application can maximize its performance, scalability, and security.
👉 What is CQRS
👉 Why CQRS: problem & solution
👉 Use-case
👉 Hands on coding implementation
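The flow the video builds (a command service owns all writes and publishes an event for each one; a query service consumes those events into its own store and serves all reads) can be sketched in plain Java. All names here are illustrative, not taken from the video's repository, and a simple List stands in for the Kafka topic:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Write-model record shared by both sides in this sketch
class Product {
    final String id; final String name; final double price;
    Product(String id, String name, double price) {
        this.id = id; this.name = name; this.price = price;
    }
}

// Command side: handles create/update, never serves reads
class CommandService {
    final Map<String, Product> writeStore = new HashMap<>();
    final List<Product> outbox = new ArrayList<>(); // stands in for a Kafka topic

    void createProduct(Product p) {
        writeStore.put(p.id, p);
        outbox.add(p); // publish the event for the query side to consume
    }
}

// Query side: applies events to its own read store and serves all reads
class QueryService {
    final Map<String, Product> readStore = new HashMap<>();
    void onProductEvent(Product p) { readStore.put(p.id, p); } // the "consumer"
    Product getProduct(String id) { return readStore.get(id); }
}
```

In the real setup the outbox loop is replaced by a Kafka producer on the command side and a @KafkaListener on the query side, and the two stores live in separate databases.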
#JavaTechie #Microservice #SpringBoot #DesignPattern
Spring Boot microservice premium course launched with 70% off 🚀 🚀
Hurry up & register today!
COURSE LINK : javatechie5246.ongraphy.com/
PROMO CODE : JAVATECHIE50
GitHub:
github.com/Java-Techie-jt/cqr...
Blogs:
/ javatechie
Facebook:
/ javatechie
Join this channel to get access to perks:
ua-cam.com/users/javatechiejoin
Guys, if you like this video please subscribe and press the bell icon so you don't miss any updates from Java Techie.
Disclaimer/Policy:
--------------------------------
Note: All content uploaded to this channel is mine and is not copied from any community.
You are free to use the source code from the above-mentioned GitHub account.
You show something complex in an easy way and using simple but great diagram which is also key. Thanks a lot !
Clean and clear explanation. I have watched other YouTube CQRS videos; none is up to this mark. I will recommend my team watch this for the CQRS design pattern. Thank you.
you are so great my friend, short and to the point! That is what makes me follow your channel and take your lessons.
Thanks buddy 😊. keep learning 👍
Bro, excellent explanation!! First time seeing a different version of CQRS!!
Such an awesome explanation!! Thank you so much 🙏
Thank you for giving me this great opportunity to learn new concepts from you, Sir...
Hi Basant, thank you.
One day I was just randomly searching on YouTube about Generics in Java, expecting your channel in the search results. But I found you had never covered that on your channel.
Such an awesome tutorial!!!! Congrats
Excellent. Thank you for taking us through the code. Appreciate it.
Good day greetings!
Thanks for the video on CQRS.
Appreciate your efforts Basant, God bless you….😊
wow!! This is what I really wanted. Thank you Basant :)
Hats off for your hardwork and explanation.
Your explanation is entirely superb, boss. Can you do more on microservices? These videos are helping a lot. Great work.
The best Kafka tutorial I have ever seen.
Your teaching skills 👌
Excellent..just awesome
In one word Awesome 👍
Thanks so much :), clear explanation!
It was helpful.....plz upload more design patterns.
thank you very much , excellent explanation.
Thank you very much, sir. You are posting wonderful videos. Best Spring Boot tutorial channel on YouTube.
Thanks Vino 😃. Keep learning 👍
Thanks a ton man, you have made my life easy. Hats off
Thank you so much 👋👍🙏🏻👌👏🤝🫡🫰🏻✌️🥳
"Why is the service crying?" With that question you made us laugh, team. Interesting as always. Keep up the good work.
Awesome video! Just one thing: when creating a Product entity within the Product-Query-Microservice, there is no need to specify @GeneratedValue for the id field, because the id is always provided by the Product-Command-Microservice when consuming the product from the Kafka topic. This way you wouldn't have the problem with id 53 in the video.
Yes, agreed with you. Thank you for the correction.
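A hedged sketch of what the commenter describes (field names are assumptions, not taken from the repository): since the id arrives in the Kafka event from the command service, the query-side entity assigns it explicitly and omits @GeneratedValue entirely.

```java
import jakarta.persistence.Entity;
import jakarta.persistence.Id;

// Query-side entity sketch: the id is copied from the consumed event,
// never generated locally, so no @GeneratedValue is declared.
@Entity
public class Product {
    @Id
    private Long id;      // set from the ProductEvent consumed off the topic
    private String name;
    private double price;
    // getters/setters omitted for brevity
}
```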
I love your tutorials!
Can you consider making one tutorial on aggregating microservices' Swagger UI into the API Gateway Swagger UI? It would be awesome to manage all APIs from a centralized interface. Thanks!
Hi, Do you have a Course with Swagger UI with Custom Logo ?
Great video!
very clear and nice explanation, great job
Thankyou very much for explaining
Best explanation Ever
Thank you so much! it helped a lot!!
Your videos are simply awesome.
I am an experienced IT professional with more than 10 years of experience. I always refer to your videos for quick reference or for brushing up a few concepts.
Thanks a lot.
Thank you so much buddy 😊. Really so glad to have you as a Javatechie follower 🥰
Fantastic tutorial
Again great thanks 🎉
Fantastic!!! Thanks!!!
Thanks for the detailed explanation of CQRS pattern. Excellent !!
One quick question: the query microservice is also writing the data eventually, so what is the advantage of segregating the query and command microservices? I know it is an asynchronous call, but it is writing the data into the same DB it is being queried/fetched from.
You made it easy👍
Great thanks!
Great tutorial. Just a small doubt: can't we use slave DBs for the query service?
It is thus very easy for the keys to become desynchronized when consuming the topic. It would be better to use UUIDs generated in the command service and reused in the query-side consumer. Thank you very much for your amazing videos.
Thanks buddy, yes we should use a random key 🗝️
@@Javatechie Can you please show how to do that in another small video?
Simple: instead of sending only the value to Kafka, send both the key and the value.
To make it even better, I would write the UUID as the key of the kafka message to apply the correct order of changes for the same entity. It must be used as entity key in the DB query
@@Javatechie thanks
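A minimal sketch of the suggestion in this thread. The names here (ProductEvent, kafkaTemplate, the "product-events" topic) are illustrative assumptions, presuming a KafkaTemplate<String, ProductEvent> bean is configured:

```java
// Sketch only: generate the UUID on the command side and send it as the
// Kafka message key, so both databases share the same id and all changes
// to one product land on the same partition, preserving their order.
public void publishCreate(String name, double price) {
    String productId = java.util.UUID.randomUUID().toString();
    ProductEvent event = new ProductEvent("CreateProduct", productId, name);
    // send(topic, key, value): the key drives partition assignment
    kafkaTemplate.send("product-events", productId, event);
}
```

The query-side consumer would then use the same productId as the entity key when saving to the read database.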
There is a lot of interest in CQRS and the Mediator pattern these days. Good. 😀
Thank you for the explanation! Can you please also explain how it will handle distributed transactions?
Please check out my Saga design pattern video.
very good explanation
He is great with spring framework because his name is Basant
Hi Basant. Please tell how can we get yml auto complete in Intellij ? Which plugin is required.
good. clear.
This is sheer CQRS without event sourcing (ES). In the real world CQRS is almost never implemented without ES.
nice tutorial
Even with CQRS you are putting the same amount of load on the DB: through Kafka events you have to write the data again to sync the DBs, so what is the use of it? Correct me if I am wrong.
I became a fan of you :)
Awesome explanation.
Would you please help to understand handling failure scenario in this design pattern?
In case of failure we need to implement a retry mechanism; I will cover that as part of the Kafka series.
Excellent tutorial, but the one thing I am greatly concerned about is duplication of data. You need double the amount of DB space, which is a big concern for me. Is there a way to address this? Do you have any suggestion?
No such mechanism is available to avoid this, buddy.
When should we use CQRS and when a read replica? What are the benefits of CQRS over a read replica? Looking for an answer from an architecture point of view, not coding.
This is really awesome. Thank you so much for putting all efforts in making these informative videos. I have one question. Deleting the tables might not be a great solution for different sequence ids issue. Is there any other way to fix that problem? Please let us know.
Yes, we need to set the strategy type to IDENTITY to avoid sequence issues.
@@Javatechie thank you so much. we would be really grateful if you can provide that fix. I strongly believe that will help a lot of people out there. It is a real world practical problem.
@anands53 it's very simple buddy, just use the below:
@Id
@GeneratedValue(strategy = GenerationType.IDENTITY)
private Long id;
Thanks!
Thank you buddy 🙂
Sir, it would be helpful if you could add one more step where we keep the model classes in one module and use them across the whole project, as it was one of the interview questions asked.
That is a multi-module project; please check my Saga pattern video and you will get an idea 💡 of how to do that.
@@Javatechie Thank You loved ur video
Just another thanks
How do we scale up reads and make them more performant? Will the Kafka listeners obstruct the reading?
Yes, Kafka is well capable of tackling this.
Interestingly, you didn't implement mechanism for handling delete endpoint, since in that case you would need to have delete command in the product read service.
How would you solve that?
That's not a problem at all; on the read side we also have save and update, right? In a similar way you can add delete. It's just that we used this only to sync data, and we are only allowing users to fetch records, not use other HTTP methods.
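One hedged way to support deletes on the read side, as discussed in this thread (the event type names and fields are assumptions, not taken from the video), is to tag each event with a type and let the query-side consumer switch on it:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical event shape: a type tag plus the product data
class ProductEvent {
    final String type;      // "CreateProduct", "UpdateProduct", "DeleteProduct"
    final String productId;
    final String name;      // null for deletes
    ProductEvent(String type, String productId, String name) {
        this.type = type; this.productId = productId; this.name = name;
    }
}

class ProductQueryConsumer {
    final Map<String, String> readStore = new HashMap<>(); // id -> name

    // In the real service this would be a @KafkaListener method
    void consume(ProductEvent e) {
        if ("DeleteProduct".equals(e.type)) {
            readStore.remove(e.productId);       // remove from the read model
        } else {
            readStore.put(e.productId, e.name);  // create and update are upserts
        }
    }
}
```

The read service's REST layer still exposes only GET endpoints; deletes reach the read model exclusively through events.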
Good video. I have a question: how can I avoid code duplication when using CQRS?
Can you do a video where you teach how to handle communication problems? Thanks in advance.
@Java Techie The CQRS pattern looks good, but if we write Kafka code in each and every create/update method it looks coupled. Tomorrow, if you want to update or introduce any new service, it's hard to change. So my suggestion is to sync the databases independently: we could use trigger points on the database side and queues rather than the code side, and use a third-party tool that syncs the external databases.
Why will it have an impact, buddy, since Kafka plays its role asynchronously?
@@Javatechie Yeah, async only, but tomorrow if we want to use any other streaming platform we are tightly coupled with Kafka, right? We should use something generic like Spring Cloud Stream.
Yes, we can use Spring Cloud Stream to make it loosely coupled with the application.
When we implement a REST endpoint, do we always need to implement CQRS since we are segregating read operations from write, or is it used only in specific scenarios? Please suggest.
Not always, but it's a recommended design pattern for microservices.
@@Javatechie Both controllers are mapped to /products only; since we have 2 microservices, each should point to a different request mapping, like /productc and /productq.
How do you open that Kafka interface with the partition and consumer details?
Hi sir, can you please create a video on the distributed transaction management pattern in Spring microservices?
Check this playlist, it's already there:
Microservice design patterns: ua-cam.com/play/PLVz2XdJiJQxw1H3JVhclHc__WYDaiS1uL.html
@@Javatechie Thank you
Interview Question:
Do we have a way to exclude a dependency inherited from the parent pom???
No, I'm not getting you exactly. What I understood: in a multi-module project you want to exclude something from the parent pom?
can you pls cover event sourcing if possible
Here, you created two tables in the same database, right? Not each table in its own database?
My question is, why do we need the CQRS design pattern? We can do the same thing with replicas used for reading. Is there any specific problem where we can only use CQRS and not a replica?
Any videos about session and cookie in your channel?
Not yet buddy
But I believe there is something wrong with how it is done: the read DB is still getting write operations, just from a Kafka topic instead of REST calls.
When I create my own CQRS project with Spring Boot 3.2.1 the listener is unable to read the message (error: cannot convert from java.lang.String to com.myDto). Why is this happening? Is it an issue with the JSON deserializer? When I clone your code it works fine (your code uses version 3.1).
Let me upgrade and check
Why can't the problem of DB update be solved easily by having a view on the table?
Not getting you
@@Javatechie I think there was no need to create another table for the query service. You could have just created a view on the first table. Please correct me if I am wrong.
@@amitpadgaonkar8830 I am not aware of using a view; I will check and update you, buddy.
Hello, I had this error in a loop:
The class 'com.example.productcommandservice.dto.ProductEvent' is not in the trusted packages
Any idea how to fix it, please?
Check the application.properties file; I have defined a key to avoid this error.
@@Javatechie i did not find it ;-;
spring:
  kafka:
    consumer:
      properties:
        spring.json.trusted.packages: com.example.myapp.domain
Without this design pattern, for every product creation there is only one write operation.
With this pattern there are two db writes to two different dbs with additional complexity of messaging.
If the read service is making a save call then what is the benefit of additional complexity? 🤔
This is a valid question, as this example can't show the benefit of CQRS. The core idea behind CQRS is that different models are optimized for reads and writes, which can lead to better performance and scalability. I can give you three main reasons:
- Improved Read Performance: Since the read database is designed specifically for query optimization, it can provide faster responses to read requests, especially in scenarios where complex queries or aggregations are involved.
- Flexibility: With separate read and write models, you can choose different technologies and data storage solutions for each. For example, you can use a NoSQL database for the read model and a traditional relational database for the write model, depending on the data access patterns and query requirements.
- Event Sourcing: CQRS often goes hand in hand with event sourcing, where each change to the data is captured as an event. These events can be used for various purposes like auditing, maintaining a history of changes, or even for updating materialized views in the read database.
It's important to note that while CQRS can provide significant benefits, it is not a one-size-fits-all solution. Implementing CQRS introduces additional complexity and requires careful design and consideration.
@@Iliaspap80 completely agree with you
@12:58 Isn't using ProductRepository directly instead of Interface tightly coupled the code?
Yes it’s an interface that we have created , where did you find tight coupling ?
@@Javatechie my bad. I have misunderstood. The tutorial is really very helpful. Thank u a lot for making such video.
But here how to make sure the DB data is in sync?
That's why we are publishing Kafka events right
@@Javatechie yes thanks bro really your videos are very much helpful
What if an id in PRODUCT_QUERY differs from the id in PRODUCT_COMMAND?
Then that's completely incorrect; your two DBs are not synced correctly.
Can I use the H2 database?
Of course, it's your choice.
@@Javatechie Won't it be problematic while connecting, because H2 can't be up until Spring is running?
Anyway, you will test your API after starting the server, right? So H2 will automatically spin up at that time; no problem at all.
@@Javatechie Can I have 2 separate H2 instances running? H2 has a default port number, so even if we create different DBs we will have the same port.
No; since both applications run on different ports, 2 H2 instances will be created by Spring Boot. I would suggest you try once and then ask.
Kafka data publish failed with a ClassCastException: class com.product.command.dto.ProductEvent cannot be cast to class java.lang.String.
Did anyone face this?
I did... any workaround for this?
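This ClassCastException typically appears when the producer's value-serializer is still the String serializer while the code sends a ProductEvent object. A hedged sketch of producer settings that would avoid it, assuming spring-kafka's JsonSerializer is on the classpath (host/port are illustrative):

```yaml
spring:
  kafka:
    producer:
      bootstrap-servers: localhost:9092
      key-serializer: org.apache.kafka.common.serialization.StringSerializer
      value-serializer: org.springframework.kafka.support.serializer.JsonSerializer
```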
Tried it hands-on; I am able to successfully produce the message to the topic but am getting the exception
"Caused by: java.lang.IllegalArgumentException: The class 'com.sourav.command.dto.ProductEvent' is not in the trusted packages:"
though I mentioned the right package in the yml file. Are there any other changes needed for this?
application.yml
spring:
kafka:
consumer:
bootstrap-servers: localhost:9092
group-id: product-event-group
key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
value-deserializer: org.springframework.kafka.support.serializer.JsonDeserializer
properties:
spring:
json:
trusted:
packages: com.sourav.query.dto
You might not have followed the proper package structure; please check and fix that.
@@Javatechie Nice explanation buddy. It's working fine now.
Hi,
how did you resolve it? I have the same problem.