If you liked the first part of this video (something I've never done before) and would like me to make more similar content, please let me know in the comments what interests you in particular! Much love, and keep learning at your own pace.
I'd like to know about the journey behind finally getting a job at a FAANG company, especially in the AI/ML domain, and what the different hiring rounds were.
Hi sir, can you make a video on a text-to-SQL implementation using a Transformer or BERT?
I just loved hearing your personal story. It encourages me to keep learning throughout my career, since I really like the kind of things you do, but I still feel like a rookie.
Thank you so much for sharing your story, it's truly, truly helpful. It's really easy to panic and get lost when you see people on the internet making complicated stuff look super easy. I feel like a lot of the time it's your mentality that decides how much you succeed, rather than your absolute intelligence or productivity at that moment in time.
Great video! I loved hearing about how you manage your time. Thank you for open-sourcing :)
Thank you Simon! It wasn't as detailed as it could be but it does paint a rough picture. Glad you found it interesting
Great Work Aleksa, you are spot on. Baking good stuff takes time.
Oh definitely! Thanks Tahmid!
Subscribed! Your channel deserves more views and subscribers. Keep working!
Thank you for the kind words! Oh I'm not stopping any time soon!
You're a king, brother, seriously.
How does one even respond to that 😅 thanks? Haha, just kidding, thanks for the support!
This video gave me real motivation. Thank you.
Hi there! Great video! Thanks for sharing and regards from Brazil 🇧🇷🇧🇷🇧🇷
Thanks, friend! All the best to you!
@@TheAIEpiphany thanks a lot!!
@@o_felipe_reis Hahaha, I lived in Brazil, in Ouro Preto; I did an internship there!
@@TheAIEpiphany Oh, how cool! Great city! Where are you from, and how long did you spend here?
@@o_felipe_reis Yes, and the people are incredible; I really miss Brazil. I'm from Serbia! I was only there for 1.5 months because I received a job offer from Microsoft. But I lived in a student house (Badalação), and that's how I got to know much more of Brazil.
Great explanation!
Really great content, Aleksa, thank you so much for the information!
You're a legend, hats off!
Thanks!
Hats off, just keep it up!
Thanks a lot!
Btw, I obviously also have bad days when it's harder to get focused and get things done.
Flirting with burnout since '94, basically, haha.
The remedy is to just take an evening off and do something that'll relax your brain, though not just anything works. Going outside helps a lot; less information ingestion is what's important.
Also, I forgot to mention that every second day I do a workout in the evening, which helps a lot.
Thanks for the motivation!
Yay, I'm glad it doesn't come across as counterproductive the way some empty motivational speeches can. 😅🧠
What are input_embedding and output_embedding?
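[Editor's note] A minimal sketch of what those layers are, assuming the standard Transformer setup (the names and vocab sizes here are illustrative, not from the video): each embedding is just a learned lookup table that maps token indices to d_model-dimensional vectors, one table for the source-side (input) tokens and one for the target-side (output) tokens.

```python
import torch
import torch.nn as nn

d_model = 512  # embedding dimension used in the original Transformer paper

# Hypothetical vocab sizes; in practice these come from your tokenizer.
input_embedding = nn.Embedding(num_embeddings=10_000, embedding_dim=d_model)
output_embedding = nn.Embedding(num_embeddings=10_000, embedding_dim=d_model)

src_tokens = torch.tensor([3, 14, 159])    # source-sentence token indices
src_vectors = input_embedding(src_tokens)  # learned vectors, shape (3, 512)
print(src_vectors.shape)
```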
@The AI Epiphany
Could you link the story of that Stanford PhD, please?
It's really helpful.
Great video
Can you please do one for the detection transformer as well?
Great video! Thank you!
Your videos are interesting and in-depth, thank you for sharing the valuable info. I need your help, please:
I'm currently working on a translation Transformer from English to Arabic. spaCy does not support Arabic, so I had to do the tokenization with NLTK and build the vocab and indexing from scratch, with no help from torchtext. I did everything except batching, so I decided to feed sentences in one by one along a single dimension to an nn.Transformer model. Does PyTorch's built-in nn.Transformer take only batches, or can it take samples one at a time? If it doesn't take them sequence by sequence, should I write the Transformer from scratch and adjust the dimensions? Thank you so much.
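[Editor's note] A minimal sketch of one way to handle this, assuming PyTorch's nn.Transformer with its default (seq_len, batch, d_model) layout: the module does expect a batch dimension, but a batch of size one works fine, so a single sentence can be fed by unsqueezing that dimension rather than rewriting the Transformer from scratch. The vocab sizes and token indices below are made up, and positional encodings are omitted for brevity.

```python
import torch
import torch.nn as nn

d_model = 512
src_embed = nn.Embedding(10_000, d_model)  # English vocab size is illustrative
tgt_embed = nn.Embedding(10_000, d_model)  # Arabic vocab size is illustrative
model = nn.Transformer(d_model=d_model, nhead=8)

# One index-encoded sentence pair (batching skipped, as in the question).
src_ids = torch.tensor([5, 42, 7, 99])  # (src_len,)
tgt_ids = torch.tensor([1, 17, 23])     # (tgt_len,)

# unsqueeze(1) adds a batch dimension of size 1 -> (seq_len, 1, d_model).
# A real model would also add positional encodings at this point.
src = src_embed(src_ids).unsqueeze(1)
tgt = tgt_embed(tgt_ids).unsqueeze(1)

# Causal mask so each target position can't attend to future tokens.
tgt_mask = model.generate_square_subsequent_mask(tgt_ids.size(0))

out = model(src, tgt, tgt_mask=tgt_mask)  # (tgt_len, 1, d_model)
print(out.shape)
```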
Do you ever pause being fucking awesome or are you fucking awesome 24/7?
Hahahah nowhere as near as I'd like! So many amazing people out there I'm just striving for that
Poor Siraj Raval!
🤣