Hi Soumil, perfect solution. In my case, I have some streaming tables similar to your demo. After they land in S3, how can I join them for further real-time analytics? Can Flink do it by selecting data from the sink tables and joining them for further analytics?
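For the join question above, the usual pattern in Flink SQL is to join the streaming tables before (or instead of) reading them back from the S3 sink. A minimal sketch, assuming two hypothetical tables `orders` and `customers` already registered in the Flink catalog (table and column names are illustrative, not from the demo):

```sql
-- Hypothetical streaming tables registered in the Flink catalog.
-- Flink SQL can join two unbounded streams; the result is itself
-- a stream that can be sent to another sink for analytics.
SELECT
    o.order_id,
    o.amount,
    c.customer_name
FROM orders AS o
JOIN customers AS c
    ON o.customer_id = c.customer_id;
```

Note that a regular join over unbounded streams forces Flink to keep state for both sides indefinitely; if the tables carry event-time timestamps, an interval join with a time bound is usually the more practical choice.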
Hi Soumil, your video is extremely helpful for someone looking to study streaming data like me. However, once the data arrives in S3, how can we provide data analytics on it and visualize a real-time dashboard? In my S3 bucket I only see some log files and the ".hoodie/.." folder. How can we query the data in S3?
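On querying the Hudi data in S3: the `.hoodie/` folder holds Hudi's table metadata, while the actual records are stored as Parquet files in the partition folders next to it. One common way to query them is Amazon Athena, with the Hudi table registered in the AWS Glue Data Catalog. A hedged sketch, assuming a hypothetical catalog table `hudi_db.orders` (the database/table names are assumptions, not from the video); `_hoodie_commit_time` is one of the metadata columns Hudi adds to every record:

```sql
-- Hypothetical: the Hudi table on S3 is registered in the
-- Glue Data Catalog as hudi_db.orders and queried via Athena.
SELECT
    order_id,
    amount,
    _hoodie_commit_time   -- Hudi metadata column: commit timestamp
FROM hudi_db.orders
ORDER BY _hoodie_commit_time DESC
LIMIT 10;
```

A BI tool such as Amazon QuickSight can then sit on top of the Athena table to drive the dashboard.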
That's really great. I want to apply the same use case, but instead of Aurora I have to use MySQL. What settings need to be applied on the MySQL instance? Please advise. Thanks in advance.
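For a self-managed MySQL source, DMS change data capture reads the binary log, so the binlog must be enabled and row-based. A minimal `my.cnf` sketch of the settings involved (values are illustrative; an RDS MySQL instance sets these through a parameter group instead, plus a binlog retention window via `mysql.rds_set_configuration`):

```ini
# my.cnf -- settings DMS CDC needs on a self-managed MySQL source
server_id        = 1          # any non-zero ID, unique in the replication topology
log_bin          = mysql-bin  # enable the binary log
binlog_format    = ROW        # DMS CDC requires row-based logging
binlog_row_image = FULL       # log full before/after row images
expire_logs_days = 1          # keep binlogs long enough for DMS to read them
```

The DMS user also needs `REPLICATION CLIENT` and `REPLICATION SLAVE` privileges in addition to `SELECT` on the replicated tables.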
Thank you so much for this lab; it helps a lot with learning the AWS platform. I followed the lab and set everything up as per the video, but I'm facing an issue with the Kinesis portion and need some help. The DMS migration task is capturing the changes fine, but I'm not receiving the data on the Kinesis Data Streams end. I'm also getting an error when running the Flink SQL code.