wonderful video
Amazing video with many important basic concepts all compressed into a short video. Very nice format.
Was not expecting you to explain all of LoRA, Quantization, RLHF, PPO, Distillation with a Teacher Committee, Structured Pruning, etc., all in this seemingly random video about Apple Intelligence.
Haha, thanks for recognizing that! This video was pretty insane to make, ngl. These topics all deserve their own 15-minute videos tbh.
love your content!
Glad you enjoy it! Thanks.
Can I attach multiple LoRA adapters to the base model at the same time? 😊
Generally, multiple independently trained LoRA adapters can be applied sequentially, one after another, or in parallel on the same input with their outputs summed together (a rough sketch of the parallel case is below). I don't think the paper mentions anything specific about stacking multiple LoRAs, but it should be possible to do properly with some fine-tuning.
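A minimal sketch of the parallel case, assuming PyTorch; the `MultiLoRALinear` wrapper, ranks, and alphas here are hypothetical choices, not anything from the paper:

```python
import torch
import torch.nn as nn

class MultiLoRALinear(nn.Module):
    """A frozen base linear layer with several LoRA adapters applied in parallel.

    Each adapter i contributes (x @ A_i^T @ B_i^T) * (alpha_i / r_i), and all
    contributions are added to the frozen base output.
    """
    def __init__(self, base: nn.Linear, ranks, alphas):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False  # base weights stay frozen
        in_f, out_f = base.in_features, base.out_features
        # Standard LoRA init: A small random, B zero, so each adapter starts as a no-op.
        self.A = nn.ParameterList(
            [nn.Parameter(torch.randn(r, in_f) * 0.01) for r in ranks]
        )
        self.B = nn.ParameterList(
            [nn.Parameter(torch.zeros(out_f, r)) for r in ranks]
        )
        self.scales = [a / r for a, r in zip(alphas, ranks)]

    def forward(self, x):
        out = self.base(x)
        # Parallel composition: every adapter sees the same input,
        # and their low-rank updates are summed with the base output.
        for A, B, s in zip(self.A, self.B, self.scales):
            out = out + (x @ A.t() @ B.t()) * s
        return out

# Usage: wrap one layer with two adapters of rank 4 and 8 (illustrative values).
layer = MultiLoRALinear(nn.Linear(768, 768), ranks=[4, 8], alphas=[8, 16])
y = layer(torch.randn(2, 768))
```

Sequential composition would instead merge or apply one adapter's update before the next; the parallel form above is the simpler one to reason about since the adapters don't interact except through the final sum.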
@avb_fj thanks for the explanation.
💙
Can we connect somewhere to have a chat?
You can reach out on the email address linked on my main channel page.