Nice explanation 👌 👍 👏
Thank you 🙂
GREAT EXPLANATION SIR
Thanks, keep watching!
Hi Raja,
This particular video is great.
I have one question: will the size of the df increase by encrypting some of the columns? How do we take care of memory while designing?
Thanks in advance.
Sir, after creating the encrypted and decrypted columns, how do we hide the original SSN column? And when we run the command select ssn from dimeployee, how do we ensure that we get the encrypted format?
Good video. I have a question regarding the KEY: how is that value stored? Can we use the same function in another notebook with the same cluster?
Any drawbacks to using this method?
Thank you
You're welcome
Good job, but I have a question: when we encrypt information, aren't we supposed to not be able to decrypt it?
Thanks.
Yes, we need to decrypt it when we need to use it later.
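A minimal sketch of that "encrypt now, decrypt when needed" idea in PySpark, assuming Fernet symmetric encryption from the cryptography package (the key, the df dataframe, and the column names are illustrative, not necessarily the exact ones from the video):

```python
from cryptography.fernet import Fernet
from pyspark.sql import functions as F
from pyspark.sql.types import StringType

key = Fernet.generate_key()  # in practice, fetch this from a secret scope rather than generating it inline

def _encrypt(v):
    # build Fernet inside the function so only the key bytes are shipped to executors
    return Fernet(key).encrypt(v.encode()).decode() if v is not None else None

def _decrypt(v):
    return Fernet(key).decrypt(v.encode()).decode() if v is not None else None

encrypt_udf = F.udf(_encrypt, StringType())
decrypt_udf = F.udf(_decrypt, StringType())

df_enc = df.withColumn("ssn_encrypted", encrypt_udf("ssn")).drop("ssn")
# later, when the plain value is genuinely needed again:
df_dec = df_enc.withColumn("ssn_plain", decrypt_udf("ssn_encrypted"))
```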
Good one. But how do we filter the data and apply comparisons?
In order to filter and compare, we need to decrypt the data on the fly using the decrypt method.
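For reference, a sketch of what that on-the-fly decryption could look like, reusing the hypothetical decrypt_udf from the earlier sketch (the literal SSN value is made up):

```python
matches = (df_enc
           .withColumn("ssn_plain", decrypt_udf("ssn_encrypted"))
           .filter(F.col("ssn_plain") == "123-45-6789")
           .drop("ssn_plain"))
```

Because every row has to be decrypted before the comparison can run, the performance concern in the next comment is a fair one.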
@rajasdataengineering7585 That will impact performance too much.
Good explanation. Could you please explain Delta Live Tables? (Today I loaded a table with 20 columns, and the next day the same table arrives with 2 extra columns. How do we handle this in delta loads in ADB, and how do we manage the delta merge command in production?)
Sure, will create a video series on Delta Live Tables.
Can we apply those functions on nested JSON data also?
Yes, we can. At the dataframe level, we can apply encryption and then write in JSON format, which keeps the encrypted data within the JSON file.
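A hedged sketch of that idea, assuming a struct column named person with an ssn field and reusing the hypothetical encrypt_udf from the earlier sketch (withField needs Spark 3.1+; the paths are illustrative):

```python
df_json = spark.read.json("/mnt/raw/employees.json")  # illustrative source path
df_json_enc = df_json.withColumn(
    "person",
    F.col("person").withField("ssn", encrypt_udf(F.col("person.ssn")))
)
df_json_enc.write.mode("overwrite").json("/mnt/secure/employees_encrypted")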
I am getting the below error while encrypting the data:
'TypeError: encoding without a string argument'
Kindly help.
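That TypeError is typically what Python raises when bytes(value, 'utf-8') or a similar encode call receives something that is not a string (for example a null or an integer column value). One possible guard, sketched against the hypothetical Fernet setup above:

```python
def _safe_encrypt(v):
    if v is None:
        return None  # skip nulls instead of trying to encode them
    return Fernet(key).encrypt(str(v).encode("utf-8")).decode()

safe_encrypt_udf = F.udf(_safe_encrypt, StringType())
df_enc = df.withColumn("ssn_encrypted", safe_encrypt_udf(F.col("ssn").cast("string")))
```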
Can we apply the same thing to encrypt a CSV file?
Yes, we can.
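For illustration, the same approach applied to a CSV source, again reusing the hypothetical encrypt_udf (the paths and column names are assumptions):

```python
df_csv = spark.read.option("header", True).csv("/mnt/raw/employees.csv")
df_csv_enc = df_csv.withColumn("ssn", encrypt_udf("ssn"))
df_csv_enc.write.mode("overwrite").option("header", True).csv("/mnt/secure/employees_csv")
```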
What is MD5? Could you please explain this one?
MD5 is one of the hashing functions.
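For example, Spark exposes MD5 as a built-in function that turns a value into a fixed 32-character hex digest; unlike the encryption above, a hash cannot be reversed to recover the original value (df and the column name are illustrative):

```python
from pyspark.sql import functions as F

df_hashed = df.withColumn("ssn_md5", F.md5(F.col("ssn").cast("string")))
```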
Hi Raja, could you please share a notebook link or DBC file?
Raja sir, will they ask these kinds of questions in Spark interviews?
Yes, data security is one of the must-know topics in interviews.