00:00 - What is Apache Flink
02:55 - Use Cases - Fraud Detection and Shopping offers
08:15 - Data Processing - Batch and Realtime
13:15 - Why is it popular?
21:14 - Architecture
35:50 - Execution Flow
38:45 - Monitoring
40:38 - Real Time Example
45:40 - Revisit
A very good presentation and knowledge, let down by the irritating whiteboard. Please use Excalidraw.
It is a very knowledgeable session. Now I understand the basics of Flink.
Holy moly, today YouTube is spamming me with adverts, lol.
Thanks for the content. It is very well explained imo.
What you said at the end is wrong. Spark also supports stateful streaming across micro-batches, so it will show "abc" as 2 if it comes twice.
Thank you...very good presentation
excellent video! thank you for sharing. Can you please provide more videos about Flink?
Thank you for making this video. Many of your content are so relevant with what I am doing, so helpful.
Great job !! Appreciate your support in providing the info.
About the word-count example in the last part of the video: you mentioned that Spark always counts the word "abc" as 1, but in your Spark video you also mentioned that using a sliding window we can do aggregation and state is maintained. Did you mean that Spark will just do the calculation per window, and in the next sliding window it will do a separate calculation, with no relationship between the first and second windows?
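The distinction this question is getting at can be sketched with a toy simulation in plain Python (not actual Spark or Flink API calls; the batches and words are made up, and tumbling windows are used instead of sliding ones for simplicity):

```python
from collections import Counter

# A stream of words split into micro-batches (hypothetical data).
batches = [["abc", "xyz"], ["abc"], ["abc", "xyz"]]

# 1) Stateless per-window counting: each window is counted on its own,
#    with no relationship between windows, so "abc" is 1 in every batch.
per_window = [Counter(batch) for batch in batches]

# 2) Stateful counting: running state is carried across batches (the
#    semantics stateful streaming APIs provide), so "abc" accumulates.
state = Counter()
running = []
for batch in batches:
    state.update(batch)       # merge this batch into the global state
    running.append(dict(state))

print(per_window[-1]["abc"])  # per-window count of "abc" in the last batch
print(running[-1]["abc"])     # running count of "abc" after all batches
```

Here the per-window count of "abc" stays 1 in each window, while the stateful count grows to 3 — which is the difference the video's closing example was pointing at.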
For the first 20 mins it's just real-time and batch data being explained, as part of the use cases.
Very good video...informative indeed
Great video. Very very useful for Beginners.
Nice video
Nice Overview
nice
In Spark Streaming, a global state variable will keep the count.
Excellent
good lecture, but subtitles would be nice
How does Flink even consider two alike events the "same"? Isn't that logically flawed? How can two different things happening be the same thing having happened?
Great
Good (y)
Well explained in simple words
That curry flavor really hits.
he could have finished in 15 mins
Don't waste your time.
Waste of your time