Scariest thing about the future is that everything will be in JavaScript
Not even TypeScript at least smh
God please no
@@mrgalaxy396 may shock you, but TS is just JS.
Just hope no one is using "as any" in the codebase.
@@neociber24 may shock you, but it's not. TS is not even a superset of JS, as they claimed earlier on their landing page. It's just a set that intersects with JS
More like Python
Scared because of Ai being able to read our brains? No.
Scared because this was done in JavaScript? Yes.
No comments? What the hell, this guy needs an award
Ah sh** it's not reddit
Take my like and get the hell outa here🎉
I don't want JS anywhere near my brain
People bash JavaScript because they learned languages that require a lot of work and aren't able to accomplish as much with as little as JavaScript does. Now, please don't pay to use this for free and upload freely what you think about when eating or doing certain things, as they'll get a free database while profiting off of your free work. When we reach socialism and these platforms are owned by the people, sure, but until then, try to take as much money as you can from these leeches. We already learned from Facebook; let's not make the same mistakes.
@@srenheidegger4417 Doesn't affect anyone though. Like, did Facebook sell your data, and then you found later that your money, home, kids, wife, and grandparents are gone? No. So why care so much?
Nothing like having your actual brain activity tied to an email and password
HA HA HA HA AH AH ah ahhhhhhh😢
Get with the times, Grampa!
The future is now, old man.
Please Insert Two-Factor Suppository.
@@ComicusFreemanius ayo
Awesome. All I need now is to learn Javascript
dont do it!
@@slob5041 you hater
@@MeiLinFjellstad-oe2cc great tip 👍
And $999
@@slob5041 It's easy. JavaScript is great
This reminded me of the Movie "Everything Everywhere All at Once", where the main character had a device connected to her head, and needed to do random things in order to accomplish something. Now it makes sense, she was connected to ChatGPT with Javascript.
This comment needs more likes!!
time to think about biting into a lemon
Matrix
Did she kill everyone in the end? I haven’t seen the movie.
beware Fireship dude, you might damage your brain permanently when trying to program it with javascript sdk
just reboot it
factory restore it
Just use Typescript instead.
As long as you use ES6 syntax, your frontal lobe will obey
Just don't use promises and prototypes and it will be all fine.
Next up: fireship makes a brain computer interface using JavaScript
Phytoom better😊
@@mesiroy1234 ok but I don’t know what that is I’m a nodejs dev 🤔
@@Kat21 It's Python pronounced by a dumbass earning 100k+ at FAANG
Next up: Fireship uploads his brain into the cloud and uses it to create an army of AI clones to fight against the threat of AI world domination
@@rumplstiltztinkerstein yes
Fireship once: educator and inspirational programmer
Fireship today: the future is now, beware the future
the future is now old man
Fireship in the future: I AM THE SINGULARITY
"do not the future"
Beware me!
obvs educational programmers are obsolete now get on with the times
This is actually cool. Imagine having this kind of technology integrated into headphones and having the music change based on how you're feeling (rough sketch after this thread)
Already exists, @crimsonghoul8983 look up mico made by neurowear
That's just a tiny side point of the mass of potential this tech brings us.
@@sethlaskus5628 Where? Who is providing it?
@@crimsonghoul8983 mico by neurowear
@@crimsonghoul8983 look up neurowear, they made some headphones called mico
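For the curious: the Crown's own SDK already exposes a calm score you could drive this with. A minimal sketch, assuming the @neurosity/sdk calm() observable; setPlaylist() is a hypothetical stand-in for whatever player you use:

```js
import { Neurosity } from "@neurosity/sdk";

// Hypothetical stand-in for your music player's API.
const setPlaylist = (name) => console.log("now playing:", name);

// Placeholders: use your own device ID and credentials.
const neurosity = new Neurosity({ deviceId: "your-device-id" });
await neurosity.login({ email: "you@example.com", password: "********" });

// calm() emits a probability between 0 and 1; route it to a playlist.
neurosity.calm().subscribe(({ probability }) => {
  if (probability > 0.6) setPlaylist("deep-focus");
  else if (probability < 0.3) setPlaylist("calming-ambient");
});
```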
Ain't no way bro gave up his brain data like that.
"Connecting your brain to GPT-4 with JavaScript? That's some next-level coding! I can barely connect my coffee machine to WiFi without breaking something."
This comment has been brought to you by GPT-4
ok
ok
ok
This ok comment has been brought to you by GPT-4
Ok
Fireship in 2024: I used quantum computers and AI to turn myself into an eldritch cyborg demon
"Thankfully they had a JavaScript sdk I can use to do this"
maybe the title "god" is more appropriate
nah, just wait till next week
...quantum JS post-processor and ChatGPT-5
Downloading brainwaves as JSON, because even our subconscious knows that XML is so last decade!
True
HTML enters the chat
I’ve executed machine code in my head while I sleep for 37 years. It is the main reason why my software is always practically bug-free and never crashes. My brain is the perfect just-in-time compiler.
this video needs a part 2
anyone noticed that when AI exploded recently, Fire started uploading more frequently??
Fire == AI confirmed??????
Fire === AI*
I've been trying to warn people for weeks
He’s just farming the hype for views
@@devtastic9394 yeah but the majority of his audience is too stupid to understand that
maybe this video is AI-generated and the AI is giving us ideas
Putting a 1.8 GHz quad-core processor next to the brain is smart. Doing this all in JS is a scary reality.
true cz it can also be a *Virus*
@@filmyguyyt Or an alien!
there are these fancy analog chips for AI that run at like 5 watts
@@filmyguyytnano virus 😈
@@2.0t-MetallicGrayAccord or supersmall nano virus
You, sir, are adapting your content amazingly to where the industry is going. I'm just straight up impressed.
yea he was like "if gpt 4 is already so good at writing code, its only a matter of time that ill be replaced as a coding tutor, so i gotta adapt to survive"
No, he is actually just combining popular topics, namely "Neuroimaging" and "AI", and implying that it is now possible to control the world with the power of the brain and AI.
Although @fireship is not explicitly making false claims in his video, he gives the impression that such a thing is feasible. While it is conceivable that EEG signals can be used to input commands into software programs, AI algorithms do not play the same role in deciphering brain patterns as they do in natural language processing tasks. The use of EEG to control JavaScript is not a novel concept, and there exist videos demonstrating individuals playing video games using EEG signals. However, it is crucial to note that these individuals are not merely thinking about the directional movements of the game; instead, they are focusing on visual stimuli or spelling words. Different brain regions are activated during these activities, allowing the EEG signals to distinguish between various cognitive states and provide appropriate input to the game.
@@Im3Tje Everything you're saying may be accurate; my original statement was about @fireship adapting his content to the other potential interests of his audience. Regardless of the feasibility or how thorough his content is on the subject, the success of this video is clear evidence of his ability to adapt his content to a rapidly changing industry, which is what I believe deserves appreciation.
Bro about to remove all of his memories
This would be a cool way to add flying and other superpowers into vr games without pressing buttons.
8 videos in the last 12 days? Fireship is maximizing his output now before the AI overlords put him out of work.
Fireship is an AI with a wife
@@theunknownkadath Damn and im over here with an AI wife. He just like me fr
@@qps9380 damn that hits hard
Unless he is one.
Neurosity co-founder here.
My phone was blowing up this morning.
I thought it was someone texting me but it was the notifications from Crown sales.
Thank you, Jeff 🙏
did you sponsor him?😅
@@_JoeVer we didn’t
@@alexcastillo1904 congratulations on your incredible success. If you haven’t taken a moment to celebrate what you’ve done so far, bring a cake to the office and have a small little party to celebrate everyone’s hard work. Big morale boost!
This is epic, your product is epic
Neuroscience and biotechnology are the future
Once this tech is further perfected, the prosthetic possibilities for amputees will be incredible.
YOU ARE RIGHT!
💀
I have a friend with multiple sclerosis. He cannot move his body at all but he can speak. Currently, he controls his computer with his voice but it's a bit of a struggle because English is not his first language and also his voice becomes weaker with age. So, solutions like this would be great to help people in similar situations.
@@balls7828 whats the matter, balls?
To say nothing of extra limbs, for practical and supervillain memes. I mean reasons.
I was just thinking about this a few days ago. My prediction is that the final stage of brain programming is the ability to just think about abstract things, which are then turned into a fully working program. Imagine that you can, well, imagine a web site or game into existence. Reddit can finally have its realistic dragon simulator game.
Wow, that's a really cool idea! It would be amazing to be able to turn our thoughts into fully functioning programs.
that might mark the end of the physical world
Then I could be the next Shadman :)
Problem is, in game making, making the game is 10% of the work.
Debugging it and improving it and debugging the improvements while kids are screaming in your ear why you're an awful developer and should feel bad about yourself and your life is 90% of the work.
If you could drill down to having the program interpret certain ideas or words and know how to handle them in context to what you're doing, that would be incredibly powerful. Writing a program? Just think about the if else, etc and it writes the code.
What if you trained an ai model to correlate brain function to what you see on a screen? Or what you hear from a pair of headphones? Could you eventually get it to generate video/audio of what you're seeing/hearing? And what if you could use that video/audio stream in a live feedback loop to train your brain to project your own thoughts?
I am crying right now. this whole AI thing is so crazy and I feel so behind as a software developer
and i feel behind because i'm not enough of a developer to ride the wave. there's so much stuff i want to do with ai but it's hard D:
Software development jobs will be replaced once AI is given the right tools. There are already programs being built with AI tech from Microsoft that will take text input and do what was asked of it: make a game, website, basic program, etc. The issue is that the government wants money, and they want to regulate this to keep everyone in a job rather than overhauling earth so we can live without needing to work to survive.
Don't cry we will be with you with our begging can 🤗
why would you cry you simp?
Chill out everybody. If it's gonna be a War between AI and Humanity. We are the last chance.
Basically you can toggle your room lights just by thinking of toggling your room lights. The headset hears the unique "click" and JavaScript runs that unique function. Pretty wild. (Rough sketch after this thread.)
thats actually pretty cool. would love to see a video of someone demonstrating it
I don't know how I could handle this. I get intrusive thoughts all the time; my dumb ass would be flickering the lights all day long.
@@ytsprmtpsw9833 I'm sure you'd learn after a bit
@HT space
@HT we will get them from space
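A minimal sketch of the pattern described above, assuming the @neurosity/sdk kinesis() API and a thought label trained in the Neurosity app; toggleLights() is a hypothetical call into whatever smart-home bridge you use:

```js
import { Neurosity } from "@neurosity/sdk";

// Hypothetical stand-in for a smart-home call.
const toggleLights = () => console.log("lights toggled");

const neurosity = new Neurosity({ deviceId: "your-device-id" });
await neurosity.login({ email: "you@example.com", password: "********" });

// kinesis() fires each time the trained thought pattern is detected --
// the unique "click" the comment describes. The label is a placeholder
// and must match whatever you trained in the Neurosity app.
neurosity.kinesis("toggleLights").subscribe(() => toggleLights());
```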
I can't wait for the day when Fireship makes and uploads an entire video using his brain only
Well, what do you mean by "using his brain only"? We (humans) arguably can only do things by using our brain.
If by "using his brain only" you mean telepathically, then yes, me too
@@i_forget 🤓
How do we know that he hasn't done it already? :)
Year 2039 and mfs be talking with {}!{}+{}!{}+-{}
Can’t wait for the day Fireship shows us how to upload our consciousness into the internet, with JavaScript of course
now i know how to relate with my ipad on a spiritual level
The buildup to the supersoldier joke and the joke itself along with the visuals was comedy gold.
You are really killing it with those videos, you need 4 more editors
what do u mean, i love his production
@@caiosouza2655 but this level of editing looks impossible for a single editor to do
Truly impressive
He works in a non-blocking event loop
@@thatsalot3577 Damn, that's the thing about being a javascript programmer, i love it.
I can no longer tell what is and isn’t a comment written by GPT-4.
I think this would have been cooler if you put in something like a Whisper API that takes in the transcript of what a person talking to you is saying, then used the brain scanner to pick a way to respond to it, then a lens or an earpiece gives you the transcription of what to say (rough sketch after this thread). This kinda feels like a demo of the EEG; the ChatGPT part was like 30 seconds.
Did you get into it
I applied, nothing yet
Not a single API application was approved man
Yeah he didn't really explore proper use-cases for this project
or it's a super portable translator
Yes, that would actually be useful
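A rough sketch of the pipeline proposed above, assuming the openai npm package for both the Whisper transcription and the chat call; the headset-driven choice between replies is left as a comment, since it depends on your trained labels:

```js
import fs from "node:fs";
import OpenAI from "openai";

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

// 1. Transcribe what the other person just said ("clip.wav" is a placeholder).
const { text } = await openai.audio.transcriptions.create({
  file: fs.createReadStream("clip.wav"),
  model: "whisper-1",
});

// 2. Ask GPT-4 for two candidate replies.
const res = await openai.chat.completions.create({
  model: "gpt-4",
  messages: [{
    role: "user",
    content: `Someone said: "${text}". Suggest a polite reply and a blunt one.`,
  }],
});

// 3. A trained thought (e.g. the right-hand pinch from the video) would
//    select which reply gets read into the earpiece; wiring omitted here.
console.log(res.choices[0].message.content);
```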
I wonder if you could train it to transcribe your thoughts? Maybe have a chapter from a book or something you have to read over and over again until it's able to decipher your inner voice?
The terminator was less scary than what's going on right now
Fr 💀
Thanks
future teacher before a test: "Now, everyone, put your brain interfaces on my desk!"
The following was a preview of Dystopia. Available in reality yesterday.
Dang!
(TCAP decoy voiceover)
Next Episode: I teleport myself to Pluto with JavaScript
that sounds cool, ngl
An astral projection maybe? 😅
Make sure you practice making yourself hard armor so you won't get wounded by low astral creatures 👾
I did a project with a headset similar to that roughly 10 years ago. Granted, the software has probably gotten better, but the biggest issue with externally worn headsets is the lack of precision when observing brainwaves; the headset can't do extremely localized signal detection. Also, the headset I used had way more sensors; this one looks like it favors style over function. What I noticed back in the day was that the headset was good at detecting muscle changes but not actual brain waves. My guess is that something similar was going on in the video: perhaps he was making a facial gesture or something when "thinking" about eating a lemon, and that is actually what was being detected, not the thought process itself. The same being true for the right-hand pinch option. I'd be much more interested to see a test done where he has to think of a color or something that doesn't include a physical response, and see if the same quality of results is achieved.
On their website, in the dev section, they posted some tweets from other users: one of them was playing the dino game on Chrome with brainwaves. I don't think that uses facial expressions
Also, no sensors for the ears; we can detect a lot there
@erre4 I haven't seen the video, but Dino only really requires a single "activation" state to trigger the jump. Anything more advanced is most likely beyond the capabilities of the headset.
Similar story to yours. These kinds of measurements are really tricky, especially with what amounts to a (relatively) cheap EEG set-up.
A professor gave a talk where he, as a Ph.D., thought that he had found brain waves associated with eye movements in the front part of the brain, even though the visual cortex is in the back. A few weeks later he realised he was actually measuring the electrical activity of his eye muscles.
@@hexcrown2416 yes, that's true. Anyway, if you go to the developer zone and scroll down, there are some videos taken from tweets
You're saying one day im gonna be eating breakfast and someone will really be able to blow my pancakes up using their mind????
MAN MADE HORRORS BEYOND COMPREHENSION
Imagine, with the realism of Unreal Engine 5, this EEG thing, and VR. Sword Art Online is now close to possible. A bit late, but it's possible. What's left is the lethal charge that gives you pain and melts your brain.
melting the brain shouldn't be too hard, giving tactile feedback everywhere + disabling motion on the other hand...
Yeah, right now movement in VR is a problem to detect, but imagine if you could just think and move. That was just one example, but Meta might really buy this company in the future
There is some VR gaming headset or something that actually kills you if you lose, so...
@@pmj_studio4065 in SAO? 😂 where it releases microwaves???
That's just an engineering problem
I advise you to use Rust SDK so you can think blazingly faster
next up: I replaced my brain with javascript
Now I write 2+'2' = 22
@@thatsalot3577 and 2 - '2' = 0
cyborg: "false" !== !!false
scientist: my god... its ready to become a politician
Nothing like having to do a Captcha on your own brain
Next Fireship video: I made a Brain computer interface using JavaScript only
2:43 HOW DID THIS JOKE GO OVER EVERYONE'S HEAD HAHAHA
This is actually so cool and the potential is insane! Like I can't believe we're actually this far into the future
This video is gonna blow up 100%. Like seriously, we have reached that stage which we all dreamt of in our childhood
This is a dumb thing
We're riding the hype wave.
@@alexxx4434 this is the warning wave before the tsunami hits. If you aren't working in this field or exploring it constantly, you are painfully unaware of the things that are coming.
@@RyanGrissett Or, as usual, financially interested parties are blowing things out of proportion.
@@alexxx4434 I work in this field. It's easy for financially interested parties to blow anything out of proportion because they don't understand it. That doesn't change the fact that this will change your world sooner than you're ready for it. What is happening with OpenAI and Bing is a dumpster fire: the limitations and politically correct alignment process are severely limiting the potential. And this video doesn't show anything amazing at all; this kind of EEG tech is used in games in a much more mind-blowing way. Hype wave or not, these large models behind it are here to stay, and the waves won't get smaller any time soon. The exponential speed of development in this field is something the average human can't prepare for.
🎯 Key Takeaways for quick navigation:
00:00 🧠 *Introduction to Brain Waves and the Crown Device*
- The video begins with an introduction to the concept of brain waves and the use of electroencephalograms (EEGs) to measure them.
- The speaker introduces a futuristic device called the Crown, an EEG that connects to a mobile app and measures brain waves.
- Key points:
*- Introduction to brain waves and EEGs.*
*- Description of the Crown device and its capabilities.*
*- Mention of the JavaScript SDK for accessing brain wave data.*
02:34 🍋 *Training Custom Thought Patterns for Brain Waves*
- The speaker explains how the Crown device can be trained to recognize custom thought patterns.
- Demonstrates training the device to recognize the thought of biting into a lemon.
- Discusses the accuracy and somewhat time-consuming nature of the training process.
- Key points:
*- Training the Crown device to recognize specific thought patterns.*
*- Example of training for biting into a lemon.*
*- Emphasis on accuracy and training duration.*
03:56 💻 *Using JavaScript to Access Brain Wave Data*
- The speaker describes how to use JavaScript to access brain wave data from the Crown device.
- Explains the initialization and login process for accessing the data.
- Discusses the data format, sampling rate, and channels.
- Key points:
*- Using JavaScript to access raw brain wave data.*
*- Initialization and login steps.*
*- Details about data format, sampling rate, and channels.*
04:52 🤖 *Communicating with GPT-4 Using Brain Waves*
- The speaker demonstrates how to communicate with GPT-4 using brain waves and JavaScript.
- Describes the process of installing the OpenAI SDK and authenticating.
- Shows how to send messages to GPT-4 and receive responses.
- Discusses potential applications, such as generating excuses or answering questions.
- Key points:
*- Using brain waves to communicate with GPT-4.*
*- Installation of the OpenAI SDK and message exchange.*
*- Possible applications, including generating excuses and answering questions.*
Made with HARPA AI
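For anyone skimming the summary above, the whole loop is small. A rough reconstruction, assuming the @neurosity/sdk and openai packages; the trained label and prompt are guesses, not the video's exact code:

```js
import { Neurosity } from "@neurosity/sdk";
import OpenAI from "openai";

const neurosity = new Neurosity({ deviceId: "your-device-id" });
const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

await neurosity.login({ email: "you@example.com", password: "********" });

// Raw feed: each epoch carries the channel data plus metadata like
// the sampling rate and channel names.
neurosity.brainwaves("raw").subscribe(({ info }) => {
  console.log(info.samplingRate, info.channelNames);
});

// Trained thought -> hard-coded prompt -> GPT-4 response.
neurosity.kinesis("bitingALemon").subscribe(async () => {
  const res = await openai.chat.completions.create({
    model: "gpt-4",
    messages: [{ role: "user", content: "Give me a believable excuse to leave work early." }],
  });
  console.log(res.choices[0].message.content);
});
```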
Imagine turning on and off ur lights by thinking about biting a lemon lol
This kind of content demands a longer presentation. Honestly very interesting.
Yes
bruh at this rate in 10 years we will all be living a happy life in a computer simulation, free of the 9-5 programming jobs.
or doing a 9-5 job in a simulation lol
If someone needs me to introduce Fireship to them, this is the video I'll send.
"There are decades where nothing happens and there are weeks where decades happen..."
"the human brain is the most complex machine in nature" - *the human brain*
In other words, all this $1,000 device does is replace the keyboard.
Wtf, this is so weird. The end example you gave eerily reminded me of how I used to try and remember things as a kid.
Right before bed, if I needed to do something the next day, I'd pick up a random object and focus on it while considering what I wanted. Then I'd place the object in an obvious and noticeable, but out-of-place, location. It worked. This feels like it could be used as a natural extension of that too 😮
[Note: Yes, I know you can set 20 alarms on your phone now to do the same thing lol ]
I used to throw my extra pillow on the floor in front of my door to do the same thing lol
Using events as triggers (thinking about biting a lemon) is like what they did in Everything Everywhere All at Once to jump to other multiverses
imagine using this in games like Minecraft and thinking "Chat: /summon Minecraft Villager"
literally god
Cool, now I can turn on my TV from the couch.
2:11 that’s a real original looking graph they got going on there. Totally not just theme swapped from somewhere else
calm down captain cuck.
Do not reinvent the wheel, I guess
Hooking your brain to the big red nuclear button, and accidentally thinking about the passcode, is more fun.
putin be like
People with intrusive thoughts like me better stay far away from such technology.
Okay this is crazy. You pretty much be executing macros with your thoughts. Can't do custom parameters yet, but this is already huge.
Imagine making a mecha suit using this tech, it'd be so cool but also extremely risky. Like if it reads your thoughts wrong, you'll get your arm ripped off.
so cool, need a whole series on this "biting a lemon" thingy
can't wait for fireship to be fully cyborgated
i can finally know what she wants to eat
Well, that escalated quickly
the idea of transforming brainwaves into JSON is very funny
my bro even shaved his head to get a better connection
I wish you showed us a proper use-case demo of this
GPT-4 telepathically told me that this video was just posted
What you need to do is have GPT see the brainwaves, give it a text write-up of what you're thinking during that session, and have it train itself on your brainwaves. Then you don't have to write the code; GPT would do it for you... kinda. Might need some mock-ups here and there, but it would save you time mapping your own brain activity.
You wouldn't use GPT for this; you'd just train your own CNN (rough sketch after this thread).
@@whisper__ Basically, if I got what he meant right, his idea is training the system to react to your brainwaves for every word you usually type in your scripts, so that you wouldn't need to write as much
@@huntercraft5674 Yeah, you got it right, but you don't use GPT for that.
@@whisper__ I dunno how GPT would be involved either
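A toy sketch of the "train your own CNN" route in TensorFlow.js, assuming you've already recorded labeled EEG epochs (here, 256 samples x 8 channels, matching the Crown's raw feed) for two thought classes:

```js
import * as tf from "@tensorflow/tfjs-node";

// Toy binary classifier over EEG epochs shaped [examples, 256 samples, 8 channels].
// The training data itself is assumed to come from your own recording sessions.
function buildModel() {
  const model = tf.sequential();
  model.add(tf.layers.conv1d({
    inputShape: [256, 8], filters: 16, kernelSize: 32, activation: "relu",
  }));
  model.add(tf.layers.globalAveragePooling1d());
  model.add(tf.layers.dense({ units: 2, activation: "softmax" }));
  model.compile({
    optimizer: "adam", loss: "categoricalCrossentropy", metrics: ["accuracy"],
  });
  return model;
}

// Usage sketch, with xs/ys built from recorded epochs and integer labels:
//   const xs = tf.tensor3d(epochs);   // [n, 256, 8]
//   const ys = tf.oneHot(labels, 2);  // [n, 2]
//   await buildModel().fit(xs, ys, { epochs: 30, validationSplit: 0.2 });
```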
Literally him: the video was sponsored by JavaScript 👽
Dear Fireship, that deep-sleep beta waves image cracked me up. Brilliant! See you in the next video.
You'd think we'd have flying cars by now, but no, instead we found another way to program with JavaScript.
Makes me think that if we ever establish communication with extraterrestrials, we'll be communicating in JavaScript.
We do have flying cars already. The current decent model can fly ~35 minutes safely.
This, if developed far enough, could actually help people. Imagine if a person who's paralyzed could use this to communicate, or better yet - control an external exoskeleton. The potential is huge!
It could help people, but that'd be more an unintended good than what it was designed for.
Which is to read your mind 24/7 and beam ads into your brain
I mean scientists have been working on what you describe for a solid decade, stuff is moving forward fast but this is still not anything super new (in terms of the concept).
This shit dystopian asf
Love the clipping of your narration towards the end!
Has this been tried with games yet? I've always had this thought about a helmet like that measuring brainwaves, which you could calibrate like you calibrate a controller. The calibration would ask you to think about moving forward, backwards, jumping, etc. The future is here
Ah yes, I see a future where AI can access these brainwaves and use them to its own advantage against humans.
Yesss! This is what I was looking for.
You've made a great contribution to my side projects. I need to buy one of these devices so I can wire up my GPT account AND my Bitcoin Lightning node. Then I'll be able to query the web through GPT and get great insight, on top of being able to actually send value directly to others by just thinking about it.
Although, I won't lie, it sounds a little dangerous to directly wire my ADHD thoughts to my Lightning Node but hey, sometimes you have to take risks. I don't believe in using test nodes either.
I love it.
You showed us the potential results without any real usable product. You didn't become a cyborg but an average product owner.
I can't believe JavaScript can now read people's thoughts
This looks very detailed 👌
Very informative 👏 👌
Awesome idea at the end with the long term possibilities of this. Another cool thing you could do with an EEG headset is to wear it 24/7, along with a pair of camera glasses that records everything you see/hear. Then in the future there may be software that can analyse the EEG data, along with the footage, and match up the data to the footage, in order to determine how exactly you felt at all these moments in your life, so that it could build up a complete psychological profile on you, so that when you die it would be able to recreate an AI version of you.
Of course, a world in which everyone creates 24/7 footage and EEG data of their lives would be a wet dream to corporations and governments, and a nightmare dystopia to the rest of us, if it was all being stored online. So it would have to be all completely offline, stored on each person’s own personal hard drives. In fact, that should really be the case with any photos/videos we create right now.
This is basically what's known as braindance in Cyberpunk 2077. They invented a seamless brain-computer interface, that stores "brain experiences" (senses, thoughts, and emotions) as binary data onto a computer. They record the data into a file, and then they replay that data, possibly onto another person's brain, to replicate that experience, all the senses, thoughts, and experiences. It would be like experiencing what that other person experienced with your own brain.
@@konstantinrebrov675 Oh cool, I hadn't thought of anything like that. That'd be interesting because the other person (who created the brain experience) may have had completely opposite emotions to the situation than you would have. So, for example, you'd feel desire in the situation you're experiencing, when you yourself would otherwise feel disgust.
So are there a multitude of different qualia you can bind to different actions?
Would "smelling a rose" be a single action/event, kind of like a key bind for the letter "R"?
Seems like you'd want several simple qualia that lead to a next-step menu, where you then decide by choosing from a variety of other qualia.
For instance, you think about "hitting a ping pong ball", which activates an electronics menu (maybe signaled with a beep from some exterior source); then you'd choose from several qualia... and you'd choose "imagining watching television", which turns on the television to the last channel it was on.
Then you'd imagine "throwing a football", and the television changes the channel to ESPN.
Is this a possibility? (A toy sketch of this menu idea follows this thread.)
Then I guess everyone would spend a while sitting around defining qualia, kind of like going through all the settings on a new phone.
Could you just “think” of letters “A” through “Z”?
@@Compassionate38 Very slow form of input; imagine thinking each letter of "turn on the TV to ESPN" instead of just imagining a television turning on and then imagining throwing a football.
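The nested-menu idea above maps naturally onto a tiny state machine keyed on trained thought labels. A sketch with hypothetical labels and device calls (tv is a stand-in for a real remote-control API):

```js
// Hypothetical remote-control stand-in.
const tv = {
  power: () => console.log("TV toggled"),
  channel: (c) => console.log("channel ->", c),
};

// Menu tree: a string descends into a submenu, a function runs an action.
const menus = {
  root:        { pingPong: "electronics" },
  electronics: { watchTV: () => tv.power(), football: () => tv.channel("ESPN") },
};

let current = "root";

function onThought(label) {
  const entry = menus[current][label];
  if (typeof entry === "string") current = entry;  // enter submenu
  else if (entry) { entry(); current = "root"; }   // run action, reset
}

// Each trained "quale" feeds the machine, e.g. via the Neurosity SDK:
//   ["pingPong", "watchTV", "football"].forEach((label) =>
//     neurosity.kinesis(label).subscribe(() => onThought(label)));
onThought("pingPong");  // beep: electronics menu active
onThought("football");  // channel -> ESPN
```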
Our brain doesn't know how it works
idk if this is a tech or comedy channel but this is hilarious haha love it
I'll grant you, this video was 🔥🔥
Idea: make AI show us what you see. I mean, train it on your brainwaves and the current image displayed on your monitor. I saw someone doing it, but they had like 10 images to recognize, so that's a bit limited. I think it could be done better, and it's a crazy fun idea.
There was a VR game that kind of works like that. Early prototype of a BCI/VR headset combo. I think you had to focus on the shape and then think "fire" to shoot it.
Write me a comment on youtube about the video named I literally connected my brain to GPT-4 with JavaScript that will get me many likes.
This video is amazing! It's incredible to think that we can now use JavaScript to connect our brains to GPT-4. It's a great example of how far technology has come and how much further it can go. Thanks for sharing this amazing video with us!
bro imagine it becomes "today's sleep is sponsored by Cocomelon!"
Up next from fireship: Unifying quantum physics and general relativity by writing a theory of quantum gravity using Javascript
Forget the AIs and the brain readers: your ability to pump out high-quality content at this rate is insane! Keep cooking 🔥
Or this might be ChatGPT's content, while he is talking about ChatGPT while working with ChatGPT. 😮😂
could you train a neural network to transcribe thoughts to text? maybe by having a training process where you are given a random sentence and have to think about it, and then just keep doing that continuously until, when you think about something, the model tries to predict language that describes it?
Nope. That would be like trying to figure out what's showing up on a TV screen by the heat radiation it emits.
@@mehnot4619 I think the video has already shown that more data can be extracted than can be extracted from heat. bruh
@@burakki6324 That kind of precise information - language - is not recoverable. It's one thing to classify a certain thought (though it must be said that there would be many collisions, i.e, many different unrelated thoughts that would trigger the same response), but recovering information is simply not possible. Think about brain waves as a hash algorithm. It's possible to encode a thought in a fairly unique way, but from a given brainwave pattern you can't really recover the thought itself.
@@mehnot4619 that's what the network would be trained to do though. maybe not precise thoughts, but take in the raw brain waves, guess the gist of what you're thinking about, and then use that gist to create something it might be
We are like one, maybe two steps away from this being really useful. Currently it seems you would only be able to hard-code the prompts, so unless there is a question you must ask ChatGPT repeatedly and the question never changes, this has limited use. Granted, if you spent a long while with it, you could probably train it on a few different things that you could combine and use to navigate a ChatGPT autocomplete interface; think of Stephen Hawking's interface but with AI.
"And now, I can start reading my brain with JavaScript"
My brain connected to ChatGPT: WOW.
With JavaScript: OMG
I was recently searching for good EEG devices (Muse, Emotiv, etc.) to do some kind of deep learning. I really hope that you can give us some kind of update on it, as I'd love to know if the Crown has problems!
Have you found any decent & cheap device for hobbyists? I've been toying with the idea for a few years, and this video has totally pulled me back in... The $1K tag for the Crown doesn't really feel expensive for this tech, but it's still a lot