FINALLY a good thorough yet simple explanation of what the heck Kinesis is for. I have combed through YouTube and was about to give up when I watched this video. You're doing god's work! Thank you for breaking it down so well!
Great as always. AWS should pay you for advocating their cloud services.
hugely helpful!
wow AWS sure figured out how to turn EVERYTHING into a product.
“kinesis.. when you want to pay for 4 products rather than just the one you need”
The comparison between SNS, SQS, and Kinesis that you brought up at the end of the session is amazing. Your sessions are comprehensive and time bound.
Thanks a lot for sharing your knowledge.
This channel is way too underrated. Your videos are very helpful in preparing for the SAA exam, thanks!
The best explanation on this subject I've found so far. Hands down. Keep up the good work!
22:27 So far, you've delivered the knowledge like a pro.
One of the hardest parts of SAA exam for me to understand was Kinesis. Thank you for an in-depth explanation.
This is an excellent introductory video on AWS Kinesis! I am studying for the AWS SAA exam, and I tried many resources to understand the Kinesis service. None of the resources was able to explain this service as easily as this video did. Thanks a lot for putting this together! I am learning more about AWS through this channel than through any other resource.
I am currently preparing to take the SAA-C03 cert exam. First time watching your video and wish I found this a while ago. Very good explanations of Kinesis and it all makes sense.
very clear and concise explanation, covered almost...😀
This is a fantastic introduction to Kinesis. Thank you.
In this video you mention you have worked with Apache Flink. Can you please make a video on Flink?
A very good explanation. It helped me a lot.
Fantastic explanation! Subscribed
Same as always, high quality video, well presented, very informative.
Glad you enjoyed it!
you saved my life, thank you so much
Crisp and clear, structured delivery of content!
One suggestion: the voice is so bass-boosted that I couldn't listen for a longer period.
Awesome video, always on point!! 🚀 One question though: if Kinesis has all those features, what are the drawbacks of using it? (Like, why choose SNS or EventBridge if Kinesis can "do" what they do, plus extra features?) Thanks!
It costs a lot more to use Kinesis.
Costs in terms of technical knowledge, or costs in terms of money? (Or both haha) @@PTBKoo
It would be interesting to see in more detail how to do CDC from databases such as RDS, DynamoDB, and Aurora, then process and store this data in the data lake. Although solutions like zero-ETL integration already make this much easier (e.g. Aurora/Redshift). It would also be very interesting to see how to exploit the potential of Flink with Kinesis Analytics.
Thank you for this quality content. Your videos are always full of very interesting considerations. As a Data Engineer, I learned a lot about the AWS world thanks to you. Sometimes it's easy to get lost among the many services offered 🙂
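To make the CDC request concrete, here is a minimal sketch (assuming Python and boto3) of the DynamoDB flavour of that path: the table's item-level changes are streamed into a Kinesis Data Stream, which Firehose or Flink can then land in S3. The table name "orders" and stream name "orders-cdc" are hypothetical placeholders; RDS/Aurora CDC would go through DMS or zero-ETL instead and is not shown here.

```python
import boto3

kinesis = boto3.client("kinesis")
dynamodb = boto3.client("dynamodb")

# 1. Create a stream to receive the change records and wait until it is ACTIVE.
kinesis.create_stream(StreamName="orders-cdc", ShardCount=1)
kinesis.get_waiter("stream_exists").wait(StreamName="orders-cdc")
stream_arn = kinesis.describe_stream(StreamName="orders-cdc")["StreamDescription"]["StreamARN"]

# 2. Point the DynamoDB table's change feed at that stream (hypothetical table name).
dynamodb.enable_kinesis_streaming_destination(
    TableName="orders",
    StreamArn=stream_arn,
)
```

From there, a Firehose delivery stream or a Flink (Kinesis Analytics) application can read "orders-cdc" and write to the data lake bucket.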
What do you think about just having a Lambda and saving the information in a DynamoDB table?
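As a rough illustration of the alternative this comment proposes (no Kinesis at all, just a function writing each event into a table), here is a minimal sketch assuming Python/boto3 behind API Gateway. The table name, key schema, and payload fields are hypothetical, not from the video.

```python
import json
import boto3

# Hypothetical table with a simple pk/sk key schema.
table = boto3.resource("dynamodb").Table("events")

def handler(event, context):
    body = json.loads(event["body"])          # payload forwarded by API Gateway
    table.put_item(Item={
        "pk": body["deviceId"],               # partition key (assumed field)
        "sk": body["timestamp"],              # sort key (assumed field)
        "payload": json.dumps(body),          # store raw JSON to avoid float/Decimal issues
    })
    return {"statusCode": 200, "body": "ok"}
```

This works fine for modest write rates; the trade-off is that you give up the stream properties the video focuses on (replay, per-shard ordering, multiple independent consumers) unless you also enable DynamoDB Streams on the table.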
Can't we load the data into S3 directly from Kinesis Data Analytics, skipping Firehose, if we are not making use of the batching feature?
Thanks a lot for the video! I hope you will keep posting! :D
Maybe you could help me understand the below better:
09:20 - 10:01: I don't quite follow the formulation. Did you describe one option (proxy + API Gateway) or two different options, the second one being the part where you say "...that's one feasible option. If you have a more secure producer of your information, say a backend that you can control, you don't have to do it this way; you can use the Kinesis Producer Library to directly write into the Kinesis data stream"?
And I think you mention the same thing I didn't fully understand again at around the 16:00 mark: "and the same things kind of hold true, like I was speaking of before: you don't have to go through the API, you can go around this way as well, the same kind of concept still applies here".
Thanks again!
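From the quoted transcript it reads like two options: the API Gateway proxy for untrusted producers, and a direct write for a backend you control. A minimal sketch of the second option, assuming Python/boto3 (the Kinesis Producer Library the video mentions is a Java library; the plain API call below stands in for the same idea, and the stream name and fields are placeholders):

```python
import json
import boto3

kinesis = boto3.client("kinesis")

def publish(event: dict) -> None:
    # A trusted backend calls the Kinesis API itself, no API Gateway in front.
    kinesis.put_record(
        StreamName="clickstream",
        Data=json.dumps(event).encode("utf-8"),
        PartitionKey=event["userId"],   # same key goes to the same shard, so ordering holds per key
    )

publish({"userId": "u-123", "action": "page_view"})
```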
So if I'm not interested in replay or the timeline like in Kinesis, can I use SQS instead for parallel processing with multiple consumers?
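Roughly, yes, that is the trade-off the video closes on. A hedged sketch of the SQS side, assuming Python/boto3 (the queue URL and handler are placeholders): you can run many copies of this worker in parallel, but each message is delivered to and deleted by exactly one of them, and once deleted it cannot be replayed.

```python
import boto3

sqs = boto3.client("sqs")
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/events"  # placeholder

def process(body: str) -> None:
    print("handling", body)               # stand-in for real work

def worker_loop() -> None:
    # Start several copies of this loop for parallel processing;
    # SQS hands each message to only one of the competing consumers.
    while True:
        resp = sqs.receive_message(
            QueueUrl=QUEUE_URL,
            MaxNumberOfMessages=10,
            WaitTimeSeconds=20,            # long polling
        )
        for msg in resp.get("Messages", []):
            process(msg["Body"])
            # Deleting acknowledges the message; unlike a Kinesis stream,
            # there is no way to rewind and read it again later.
            sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])
```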
My humble request is to do an OpenShift course, please do that. Thanks in advance.
What about Kinesis and Kafka?
Make another detailed video on:
AWS Kinesis
AWS Glue
AWS Lambda
AWS SNS, SQS, EventBridge
AWS Athena
AWS DynamoDB
AWS Redshift
bro you talk to him like he's ChatGPT 😂😂
@@georgelemach4449 😅😅bruh
Great, thx!
Instead of using Lambda for transformation, there's AWS Glue.
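For context on that suggestion, here is a minimal sketch of what a Glue transformation job can look like, assuming the standard awsglue PySpark API; the bucket paths and field mappings are hypothetical placeholders, not anything from the video.

```python
from pyspark.context import SparkContext
from awsglue.context import GlueContext
from awsglue.transforms import ApplyMapping

glue_context = GlueContext(SparkContext.getOrCreate())

# Read the raw JSON that Firehose (or anything else) landed in S3.
raw = glue_context.create_dynamic_frame.from_options(
    connection_type="s3",
    connection_options={"paths": ["s3://my-raw-bucket/events/"]},
    format="json",
)

# The "transformation": rename and retype a couple of fields.
shaped = ApplyMapping.apply(
    frame=raw,
    mappings=[
        ("userId", "string", "user_id", "string"),
        ("ts", "string", "event_time", "timestamp"),
    ],
)

# Write curated Parquet back to S3 for the data lake.
glue_context.write_dynamic_frame.from_options(
    frame=shaped,
    connection_type="s3",
    connection_options={"path": "s3://my-curated-bucket/events/"},
    format="parquet",
)
```

Glue suits batch or scheduled transformations; Lambda (or a Firehose transformation Lambda) stays the simpler choice for lightweight per-record tweaks in flight.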