Hi Saravana, good video. Can you tell me how we handle it if the headers are stored in a config table or in another DataFrame as rows?
Get the columns from that DataFrame, assign them to a val, and use that val to set the column names on the current DataFrame.
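A minimal sketch of that idea, assuming the headers live as rows in a text file on HDFS (the paths and file names here are hypothetical); the collected names are held in a val and applied to the data DataFrame with toDF:

```scala
import org.apache.spark.sql.SparkSession

object ApplyHeadersFromConfig {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("ApplyHeadersFromConfig")
      .master("local[*]")
      .getOrCreate()

    // Hypothetical config source: one header name per row.
    // In practice this could also be a Hive/JDBC config table read as a DataFrame.
    val headerDf = spark.read.textFile("hdfs:///config/emp_headers.txt")

    // Collect the header rows into a local val.
    val headerCols: Seq[String] = headerDf.collect().toSeq

    // Read the data without a header row and apply the collected column names.
    // The number of names must match the number of columns in the CSV.
    val dataDf = spark.read
      .option("header", "false")
      .csv("hdfs:///data/emp.csv")
      .toDF(headerCols: _*)

    dataDf.printSchema()
    spark.stop()
  }
}
```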
Great, the explanation is good. Could you please help with how to handle dynamically changing headers in a CSV? Every day we are getting different columns in a particular file.
Thank you very much for watching the video and for your comments. You can change the columns in the file as mentioned in the video; only the file changes, not the code. You can then pass this file, with the latest columns, as input to the Spark job.
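A rough sketch of that approach, assuming the columns file (one column name per line) and the data file are passed as command-line arguments; the argument positions and paths are placeholders, and the columns file is read on the driver's local filesystem:

```scala
import org.apache.spark.sql.SparkSession
import scala.io.Source

object DynamicHeaderJob {
  def main(args: Array[String]): Unit = {
    // args(0): path to the columns file, args(1): path to the CSV data (hypothetical layout).
    val columnsFilePath = args(0)
    val dataFilePath    = args(1)

    val spark = SparkSession.builder()
      .appName("DynamicHeaderJob")
      .getOrCreate()

    // Read the latest column names; when the layout changes,
    // only this file is updated, not the code.
    val latestColumns = Source.fromFile(columnsFilePath).getLines()
      .map(_.trim)
      .filter(_.nonEmpty)
      .toSeq

    // Read the CSV without relying on its header row and apply the latest columns.
    val df = spark.read
      .option("header", "false")
      .csv(dataFilePath)
      .toDF(latestColumns: _*)

    df.show(5)
    spark.stop()
  }
}
```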
@@sravanalakshmipisupati6533 Thank you.
Could you please show how to store real-time Kafka messaging data in HDFS using Spark Streaming, in Scala, in IntelliJ IDEA? I would be grateful if you could provide this.
@@narayanareddy5837 Sure. I will work on it. Thank you.
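In the meantime, a minimal sketch of one way to do this with Structured Streaming (rather than the older DStream API); the bootstrap servers, topic name, and HDFS paths are placeholders, and the spark-sql-kafka connector dependency is assumed to be on the classpath:

```scala
import org.apache.spark.sql.SparkSession

object KafkaToHdfs {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("KafkaToHdfs")
      .getOrCreate()

    // Read messages from a Kafka topic (placeholder broker and topic).
    val kafkaDf = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "localhost:9092")
      .option("subscribe", "events")
      .load()

    // Kafka key/value arrive as binary; cast them to strings before writing.
    val messages = kafkaDf.selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")

    // Write the stream to HDFS as Parquet, with a checkpoint location for fault tolerance.
    val query = messages.writeStream
      .format("parquet")
      .option("path", "hdfs:///data/kafka_events")
      .option("checkpointLocation", "hdfs:///checkpoints/kafka_events")
      .start()

    query.awaitTermination()
  }
}
```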
@@sravanalakshmipisupati6533 Thank you.
Nice explanation, Sravana. Can you please explain how we can write this in PySpark?
How do we apply a schema dynamically?
Hi Joseph, thank you for watching the video. The option spark.read.option("inferSchema", "true") can be used to apply the schema dynamically from the file we are reading into a DataFrame.
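A short sketch of that option in Scala, assuming a CSV with a header row at a hypothetical HDFS path; Spark samples the data to infer the column types instead of requiring a schema in the code:

```scala
import org.apache.spark.sql.SparkSession

object InferSchemaExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("InferSchemaExample")
      .master("local[*]")
      .getOrCreate()

    // Take column names from the header row and infer column types from the data.
    val df = spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .csv("hdfs:///data/emp.csv") // hypothetical path

    df.printSchema()
    spark.stop()
  }
}
```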