- Remembers password to his other channel
- Pays tribute to his mother
- Drops a banger of a tutorial
- Leaves with only +15 social credit score
We love you Jeff
Another Jeff!!
W
my name is jeff
you forgot the legendary Chinese mafia beaver thumbnail
Why only +15 sc
"because mom is in heaven" dang bro
I felt that
@@jorgerangel2390 me too bro, me too
When he said that, I felt it within my entire soul
Currently
❤❤❤ mom is a multiverse ❤❤❤
thank you Jeff's mom, your son is our hero
mom should be proud
Deeply proud
😭😭
she is I'm sure 😢
"because mom is currently in heaven" hits deep in the heart
yea
rip mom
jai hind ser
yeah actually
lmao, but also damn..
Hits deep in the seek.
Mom would be proud, Fireship. Mom would be proud
That "because mom is in heaven" got me in tears, even though I have known about it for a couple of months due to the community post about your mother passing away. Stay strong Jeff, we all Love you! ❤
holy sh*t, after knowing the story of your "hi mom" easter egg, the mom joke came out of nowhere and made me spit my tea. Love you
yea, it's actually pretty sad....
“My mom knows nothing about programming but watches every video I make. That’s why I say ‘hi mom!’ instead of ‘hello, world!’ in every video.”
And then one episode his mom wasn’t there anymore.
"spit my tee"
I like to imagine you had a golf tee in your mouth, waiting for someone to hit the golf ball atop it, and you Lucy'd him by spitting it out XD
@@kittymedusa3618
It also means he believes his mom is his world. ❤
@@xenalin1 oops 😂
2:11 brought me to tears
Same
This is the type of project that was on every developer's mind, but 99.99% are too lazy to actually try making it work 😂 So nice to see this!! Well executed 👏
"Mom is currently in heaven" goes deep. 🗿
you could say it was a DeepSeek
"Be proud Mom"
"Your son is a hero"
I was expecting DeepSeek to write the extension on its own.
It would become inbred, resulting in a monster that needs to be purged
THAT'S WHAT I EXPECT AS WELL 😭😭😭😭😭😭😭😭😭😭😭😭😭😭😭😭😭😭😭😭😭
Open source always leaves a space for others to contribute
I am pretty sure you can also do that after much trial and error
That's probably what he used to learn how to set this up.
Excellent as always and timely too. RIP Mom.
Word of caution - only the 671b model is the actual deepseek-r1, the smaller models are based on Qwen and Llama with "reasoning patterns of deepseek-r1 distilled into them".
And I say that as I download the model for archiving purposes and for using in the future if I get such a cluster lol
But yeah, the 38B model is good for programming, I heard. I wonder if it is as good as Claude 3.5 Sonnet via API, though.
Yeah, using R1 requires a hefty amount of electricity and a cluster of CPU+GPUs which would cost probably just under 100k dollars
@@averdadeeumaso4003 you can run it on a single EPYC server; the newest gen has ~600GB/s memory throughput. With 37B active params you can get upwards of 15 tokens per second (depends on quantization; this is a pessimistic estimate with q8). You just need enough RAM. Costs: $3k+ CPU, $4k RAM, $1k board, plus a GPU for prompt processing, so you can do it within $9-10k with imo reasonable performance.
The YT channel dreaming fairy has a similar setup and some benchmarks for LLMs if you want some hard data :) (hopefully he adds one for r1 soon)
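The back-of-envelope math behind that estimate can be sketched like this (a rough model using the figures above: ~600 GB/s memory bandwidth, 37B active parameters, q8 ≈ 1 byte per parameter; real throughput varies with runtime and batch size):

```python
def tokens_per_second(bandwidth_gbps: float, active_params_b: float,
                      bytes_per_param: float) -> float:
    """Rough upper bound for a memory-bandwidth-bound model:
    each generated token must stream every active parameter once."""
    bytes_per_token = active_params_b * 1e9 * bytes_per_param
    return bandwidth_gbps * 1e9 / bytes_per_token

# EPYC-class server, q8 quantization (~1 byte per parameter)
est = tokens_per_second(600, 37, 1.0)
print(f"{est:.1f} tok/s")  # roughly 16 tok/s, in line with the ~15 claimed above
```

Because only 37B of the 671B parameters are active per token (MoE), the bandwidth bound is far friendlier than a dense model of the same total size would be.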
@@averdadeeumaso4003 No the smaller models are not as good as Claude, only the 671b model is.
sorry to hear that man, I wish you all the best man!
Happy Chinese new year Jeff
I absolutely love this. Thanks, Fireship. You're an amazing human
Human? 😂😂
"mom is in heaven" hit me though my mom is with me
best VSCode extension tutorial out there
I was literally in the middle of doing this myself....... THANK YOU
I have a Chinese phone; I'm not afraid, they already know everything about me. But with DeepSeek I literally made a simple portfolio site for myself in 20 minutes, and before that I hardly coded at all, only in Python. It answered all my stupid questions perfectly, corrected mistakes, and kept the original code in memory. 10/10
which parameter are you using mate?
Sorry to learn about your mom. May she rest in peace. Thank you for sharing this.
Stopping that one guy from saying "bro fell off"
YESSSS I’VE BEEN WAITING FOR THIS VIDEO SINCE YESTERDAYS VIDEO!!!!! Love u
So nice to see Hi Mom after so long, May she continue resting in peace ❤
Dude, I was raised by my mom from when I was 4 years old until 8, because my dad died, and when I was 8 she remarried. But to tell you something: I come from Eastern Europe and was part of a generation that suffered from poverty, and it was even more painful to see my mom struggling to raise me while I became a half-decent IT worker. I am 24 years old now. Thank you Jeff, I know what a mom means to you. I am still lucky to have her alive and well, but the day she dies will be the day I die inside as well... For sure...
Keep it up dude... And thank you again...
You can only live now, so enjoy every moment you have with your mother. And don't suffer!
0:57 caught me off guard 💀
i still remember the video of you talking about your mom. how she doesn’t understand tech stuff well but is happy that you are happy. ❤
DeepSeek’s monthly plan for its AI is *40 times* cheaper than OpenAI’s standard price. This suggests that big U.S. tech companies may be grossly inefficient or overcharging consumers far more than what newer, better alternatives in the global market can now offer. Affordable options like DeepSeek and other emerging low-cost open source AI tools are great choices for Americans looking to save money on their AI usage, especially now that budgets are tight for many people.
This is legitimately game changing. I’m beginning to understand why DeepSeek is being called a “profound gift” to the entire world.
Call Jensen for a stack of the latest cards ✋️
Write something efficient 🫵
There is a VS Code extension named "Continue" that can already be used with a locally running Ollama
First
when they go to heaven, first 2 months are ... weird, but things start to return to normal after, and you realize they aren't gone, they are still there, in memory
Is that a joke about memory leaks 💀
@@Dom-zy1qyhahahaha
May she rest in peace bro. Sorry for your loss
"Deep Fucking Chat" - i see what you did there ;)
You really outdid yourself on this one! Keep it up!
fireship always makes me proud
Dude! You rule. That is all, carry on...
So I implemented this, adding markdown and subsequently sanitized HTML. Then I used that to have r1:32b add Chat context, Clear the input once the request is sent, change the button to say, "thinking...", auto scroll the response to keep the page size static, and added stop functionality if user chooses.
Rest in Power Mama Fireship 😊
+100 social credit comrade
and -$200 capitalism
@@nagorik24 I mean, doesn't open source go along with some of the principles of the free market? China is authoritarian, so I am not sure it really represents their government.
@@ordinaryrat the American government is far more authoritarian. The American government is very happy to leave people in the street starving. The Chinese government actually helps people and get them on their feet.
Social credits scores only affect companies. It’s yet to be implemented on people.
@@ordinaryratopen source is literal marxism.
It's just that libertarian marxists dominate the space
This guy is living on the edge. I've been messing around with DeepSeek since last Thursday, Jan 23 2025, and Fireship released something on Monday (which was just 4 days behind).
Then he drops this [VS extension template]. How is he getting on the ball this fast? He plays catch-up and then surpasses the wire. I actually run my business on knowing the newness. I might have to put Fireship in my n.e.w.s. feed. Crazy.
Bro that mom part stuck with me 😮
Happy Chinese New Year, Happy Spring Festival
Thank you, Happy New Year
I’m bowing overlord
"Unless youve just woken up from a coma" made me laugh out loud! 😅😂😊
0:19 Intel CEO be like:- ✋🤓🤚
There already is such an extension, called Continue. It can connect to basically any llm provider, local or remote, has code completion, chat window, context etc.
Kinda like aider: just give it an LLM API key (or even an OpenRouter API key) and it will commit/architect changes to your provided codebase.
yes that's true, though I think this video still has a lot of value because it shows how easy it can be to start writing extensions (not saying the comment implies that necessarily)
@@arronalt I definitely agree, it's a great tutorial.
Can you please explain how to do this? Im kinda new to this
The fact that a FOSS AI is taking proprietary AI's jobs is pure poetry.
Karma woke up from a nap and said "not today sunshine" to OpenAI.
Thanks for the tutorial learned loads, keep up the great content !!!
Wow that's a boss tutorial in record time 🤯
On M1 iMac, I can run 8B for fast responses, and 14B for slower responses.
Where should I start with an M1 MacBook Air?
@@MichaelH-w6e 7B
will Lenovo AMD work as well lol?
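For rough sizing, a quantized model's memory footprint can be estimated like this (a sketch: the q4 quantization level and the ~20% runtime overhead are assumptions, and real usage varies with runtime and context length):

```python
def ram_needed_gb(params_b: float, bits_per_param: float,
                  overhead: float = 1.2) -> float:
    """Approximate RAM/VRAM to load a quantized model:
    parameter bytes plus ~20% for KV cache and runtime overhead."""
    return params_b * bits_per_param / 8 * overhead

# Common distill sizes at 4-bit quantization
for size in (7, 8, 14, 32):
    print(f"{size}B @ q4 ≈ {ram_needed_gb(size, 4):.1f} GB")
```

By this estimate, 7B/8B models fit comfortably in 8 GB of unified memory, 14B wants ~8-9 GB, and 32B needs roughly 19-20 GB, which matches the 8B-fast / 14B-slow experience on an M1 reported above.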
0:43 Oh? They have my romantic story prompt and chat history? That's fine. I hope the person in charge of reading all those prompts will be shocked by my bad taste, lol
It's open weights, so you can run it locally and see exactly what it does
Thank you fireship's mom for this hero
I’m just dying that during the tutorial, dude has the exact same pace🤣
sir i adore your content in both channels... keep it up
Mom is proud of you ,may mom rest in peace
Friendship ended with o1
now
r1
is my
best friend
None of them is our friend; it's all just data collection in a cold world war between the US and China, that's it ;)
Hats off to your mom. That hit deep and stayed with me 💔🕯
Subbed on the basis of this spankingly good post, man
im extremely new to coding and stuff but at 1:18 what should i use to run that line?
Your terminal
@@mark5418 like the one already installed in Windows, right? Guess I'm too green to try this out
It should work:
1. Open the command palette with Ctrl+Shift+P, select the default debugger from the list, and open it in a new window.
2. In the new window, press Ctrl+Shift+P again; you should see "Debug: Start Debugging" as an option when your main window is open.
3. With the new window open, run the command "Hi mom" (or whatever you set as the command).
4. Make sure TypeScript works in your project directory by checking the version with "tsc --version". If it isn't installed, run "npm install typescript --save-dev".
5. You can also try reinstalling the node module packages: first "rm -rf node_modules package-lock.json", then "npm install", and finally "npm run compile". Start the watcher with "npm run watch".
If a new window titled "Extension Development Host" opens when you click "Run Extension [name of the extension]", you get no errors, and compilation is running in watch mode, then you're good to go.
Well, you are literally a godsend with this video. I mean, the AI stuff is cool and all, but syntax highlighting inside a string literal? When just today we discussed with the team the problem we have with a Progress project that forces us to write JS code inside strings? What are the odds?! Anyway, thank you as always for your awesome videos!
This may go down as your most important video ever.
Now We can copy paste code in Roller coaster ! YAAY
damn i had that idea, you are fast my man
the mom was deeply proud being seeked
Thanks for the good explanation! Very nice video!
rip fireship's mom, your son is the GOAT
You’re too good for me to have just found you
Use the 1.58 bit quantization of the big model if you want the real reinforcement learning-trained CoT with MoEs; the smaller models don't have that.
How did I accidentally stumble onto another fireship channel 😂😂😂
thanks again!! love your work
2:10 woooooooooo take it easy with the suckerpunch dude
Excellent video. more like this, please.
You can do it yourself like this; it's nice for the control. But I just want to say, if you don't have the time, you could use Roo Code, which is a really good free VS Code extension and autonomous.
It can be used with R1 through the custom endpoint
"Mom is currently in heaven" man that's deep
thank you hero, mom's son, the Jeff, the rememberer of the second channel's password, breaker of chains, king in the north, is our hi mom moment
Embed your code into your freshly created DeepSeek UI extension. Congratulations 🎉 you've achieved deeper seek.
nah the crazy pills section was actually insane
Happy Chinese New Years! Much respect to you and your mom🫡
You can use Zed to run local Ollama instances
From the transcript: "in fact you could have deep seek build you your own deep seek powered IDE from scratch"
It is SO not capable of that.
I would like to see one where it could analyze a large codebase and I could ask it questions about the code.
I think the full version has a maximum of 128k token context (including its very lengthy self-prompt reasoning).
But when I saw people talking about building their own full version server, they were talking about contexts of 4k and slowdowns so there might be a problem building one that's actually useful for examining a whole project.
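A quick back-of-envelope check of whether a codebase fits in a 128k-token context (the ~4 characters/token ratio is a common rough heuristic, and the project numbers here are made up for illustration, not from the video):

```python
def approx_tokens(num_files: int, avg_chars_per_file: int,
                  chars_per_token: float = 4.0) -> int:
    """Crude estimate: English prose and code average ~4 characters per token."""
    return int(num_files * avg_chars_per_file / chars_per_token)

# A hypothetical mid-sized project: 300 files averaging 3,000 characters each
needed = approx_tokens(300, 3000)
print(needed, "tokens; fits in 128k window:", needed <= 128_000)
```

Even this modest project overflows 128k tokens before the model's lengthy reasoning output is counted, which is why whole-codebase Q&A usually needs retrieval over a subset of files rather than stuffing everything into one context.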
Perfect content, was just thinking about how to do something like this - thanks. 🙏
Just when I thought I should uninstall codeium and get Deepseek extension, this vid popped up!
If this is integrated into smart home devices it would be so helpful
Surely there is an extension to use Llama models in VS Code by now? But I guess RAG is the harder part, to give it context?
cline does
Been stuck at 2:42 for the past 5 hours, and I've been unable to get that error message. Any ideas?
I put this up on my GitHub, Invincibear/local-deepseek-r1-in-vscode
Same
My issue was that TypeScript was not working. I had to download it and type tsc in the command line while in the project folder
@@ras906 ah great thx mate
Hope you are proud mom. Your boy is great !!
You sell courses? I would've never guessed. Honestly you should push it more at least once a week dude. I've been following you for at least a year.
What amazes me about all the recent AI advances is how much better software has become. Large companies have abandoned bug bounties because AI has now rewritten most backends in Erlang, fixing all the errors along the way and offering seamless upgrades with zero downtime. What a time to be alive!
Great tutorial, thanks for this! But why the zoom?? 😵
4:27 & 4:49 - part of the code shown for these lines are cut off. Will the full script be available elsewhere? Or would it be possible to share the #response line?
He did that on purpose to force people to pay for his courses.. Rotten
@feliceoggi I ended up getting it to run, but running debug had it also open then close immediately...
@@Quicksymphony I've just got Claude 3.5 to fill in the missing pieces and about to test it, will let you know how I go
3:13 It's a sin for a programmer to recommend a linear search instead of binary search (try the middle option, then go down or up depending on how your system handles it)
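The bisection that comment describes can be sketched generically: given candidate model sizes sorted ascending and a monotonic fits() check (the names and sizes here are illustrative, not from the video), binary-search for the largest one that works instead of trying each in turn:

```python
def largest_that_fits(sizes, fits):
    """Binary search over sizes (sorted ascending): fits(s) is True up to
    some cutoff and False after (monotonic). Returns the largest passing
    size, or None if none passes."""
    lo, hi, best = 0, len(sizes) - 1, None
    while lo <= hi:
        mid = (lo + hi) // 2
        if fits(sizes[mid]):
            best = sizes[mid]
            lo = mid + 1   # this one runs; try a bigger model
        else:
            hi = mid - 1   # too big; go smaller
    return best

# Hypothetical R1-family sizes (billions of params), machine handles up to 14B
sizes = [1.5, 7, 8, 14, 32, 70, 671]
print(largest_that_fits(sizes, lambda s: s <= 14))  # → 14
```

Seven candidate sizes take at most 3 trial runs this way instead of up to 7, which matters when each trial means downloading and loading a multi-gigabyte model.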
"because mom is in heaven"... I paused to take a walk
Mom IS proud.
Mom is proud in the cloud :)
None of the distilled models are DeepSeek's own architecture, btw. They are "student" models finetuned on R1's output, so only the 671B is actually DeepSeek's.
Thanks very much!
What laptop did you use for this, and what were the specs?
I want to gauge the system requirements before I replicate on mine.
FYI: my laptop is "technically a VINTAGE"
pls share your experience and expertise
That's deep, bro.