Very good, Robin, lots of material to share. You guys at Confluent are awesome, and thanks for the explanation. Very advanced; I'd like to see more videos like this.
Thank you for your kind words :)
Very useful for my use case... I am handling live AIS data, and capturing the original timestamp for processing is exactly what I wanted.
Nice - interesting that you are using AIS data. Did you see this talk too? ua-cam.com/video/7ZBxgjo52qk/v-deo.html
I'd be interested to hear how you're using Kafka with AIS.
@rmoff Yes, Robin, of course. I was able to implement it, integrate offline OpenStreetMap with Kibana, and build a lot of live dashboards.
Hi Robin, when you run the count(*) aggregation on the first "orders" stream, I see multiple count values for the same window start.
I faced a similar problem; how do I resolve it? E.g. for the "9 June 2019, 4 pm" window start, there are multiple count rows.
A "push query" emits a new result each time the aggregate changes, and since new data was still arriving, the aggregate kept changing; that's why you see each updated state written to the screen.
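For anyone hitting the same thing: in ksqlDB, whether you see every intermediate count or just the current one comes down to push vs. pull queries. A minimal sketch, assuming a hypothetical `orders` stream with an `item` column (names are illustrative, not from the video):

```sql
-- Hypothetical windowed aggregation over an assumed "orders" stream.
CREATE TABLE orders_per_hour AS
  SELECT item, COUNT(*) AS order_count
  FROM orders
  WINDOW TUMBLING (SIZE 1 HOUR)
  GROUP BY item
  EMIT CHANGES;

-- Push query: streams a new row every time the aggregate updates,
-- so the same window start can appear many times as counts grow.
SELECT * FROM orders_per_hour EMIT CHANGES;

-- Pull query: returns only the current state for a key
-- (one row per window) and then terminates.
SELECT * FROM orders_per_hour WHERE item = 'widget';
```

If you only want the final value per window, query the table with a pull query (or have downstream consumers treat the changelog as upserts keyed on window start) rather than reading the push query's intermediate emissions.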