this is better explained than what the inventor of LoRA himself explained in his video.
oh! thank you so much. such words really keep me going :-)
Underrated channel, keep making videos and it'll eventually blow up
Sure. Thanks for the encouraging words 👍
Thanks for the video!
I loved that you added some libraries we can use for this.
do you want me to do more hands-on videos? Or should I continue with the theory and papers? Your input will be quite valuable :)
@AIBites Hands-on videos will be great too
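(For readers who want a hands-on starting point in the meantime: a minimal sketch using the HuggingFace PEFT library, one common choice for LoRA. The model name and hyperparameters below are illustrative assumptions, not taken from the video.)

```python
# Minimal LoRA fine-tuning setup with HuggingFace PEFT (illustrative sketch;
# the base model and hyperparameters are assumptions, not from the video).
from transformers import AutoModelForSequenceClassification
from peft import LoraConfig, get_peft_model

base_model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

# r is the rank of the low-rank update; lora_alpha scales the update.
config = LoraConfig(
    task_type="SEQ_CLS",
    r=8,
    lora_alpha=16,
    target_modules=["query", "value"],  # inject LoRA into attention projections
    lora_dropout=0.1,
)

model = get_peft_model(base_model, config)
model.print_trainable_parameters()  # shows how few params are actually trained
```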
Super in depth and specific, thank you!!!
my pleasure! :)
Good job on the clear explanation of the method and simplification. At 3:40, when you showed the matrix decomposition, the result on the left side does not match the result on the right side. Is this a mistake in the video editing, or is there a point to this? [1 2 3]ᵀ x [2 20 30] should be [[2 20 30], [4 40 60], [6 60 90]]
ah yeah, great spot! I got that wrong while editing. Sorry... 🙂
@AIBites Yup, the matrix should be [1 2 3]ᵀ * [2 20 30]
Thanks again :)
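(For anyone double-checking the corrected arithmetic above, a quick NumPy sketch of the rank-1 outer product, using the values from the comment:)

```python
import numpy as np

# Column vector times row vector: a rank-1 outer product,
# the same kind of low-rank decomposition that LoRA exploits.
a = np.array([[1], [2], [3]])   # shape (3, 1)
b = np.array([[2, 20, 30]])     # shape (1, 3)

print(a @ b)
# [[ 2 20 30]
#  [ 4 40 60]
#  [ 6 60 90]]
```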
Amazing video
Glad you think so! 😊
wow u r great 😄
Thank you! I am chuffed :)
Very well explained! If ΔW's dimensions are 10 x 10, and A and B's dimensions are 10x2 and 2x10 respectively, then instead of training 100 params we only train 40 params (10x2 + 2x10). Am I correct?
yup, you got it right. And based on the compute available, we can adjust the rank, going as low as say 2.
@AIBites Thanks for the confirmation.
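(To make the parameter arithmetic in this thread concrete, here is a tiny sketch. The dimensions come from the comment; the helper function itself is just for illustration.)

```python
def lora_param_count(d_in: int, d_out: int, r: int) -> tuple[int, int]:
    """Return (full, lora) trainable parameter counts for a d_out x d_in weight.

    Full fine-tuning trains every entry of ΔW (d_out * d_in params),
    while LoRA trains B (d_out x r) and A (r x d_in) instead.
    """
    full = d_out * d_in
    lora = d_out * r + r * d_in
    return full, lora

# The example from the comment: ΔW is 10 x 10, rank r = 2.
full, lora = lora_param_count(10, 10, 2)
print(full, lora)  # 100 40 -> 40 trained params instead of 100
```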
I wish I was good at math to understand this stuff.
we all can get good at it by putting in the effort. It's just another language spoken by scientists :)