100,000 high-end AI boards. nVidia started with the H100, @ $70k per board node without local HBM memory. That gives u a module to plug into a high-end rack heat extractor PSU. Over $100bn, double it for heat extraction / PSU, then $20bn for local nuke power.. interconnect.. @ $500bn. InfiniBand is the top optic mesh network.. tho since Mellanox sold its IP.. Xilinx plans much faster optical protocols. M$ hav never made a right decision.. Arizona? 😂😂😂
But seriously, at some not-so-distant point, the AI itself will solve its power problem by finding ways to fold in upon itself and use exponentially less power to do the same tasks. That or the world becomes one huge Tesla Powerwall...
Dr Waku, AI technology is moving too slowly; people need cures for things like stage 4 colon cancer. Will general super AI solve this problem quickly, or give better therapy so they live instead of suffering a cold cruel death 😢😢? Many treatments take forever to be approved, let alone developed, and the ones that are approved are so so so so so
Oh come on. There are natural cures for all health problems. These have been available forever. But people have been lied to so thoroughly that they look to corporations for their health. How ridiculous! The answer is mind/body medicine.
💥 2028 is too far away; AGI will come before that. We need these datacenters right now, and even joining all the world's hardware together, we can't support AGI today. And we don't have the energy to run them. We need several nuclear power plants, and those can't be built in one year. So we'll have AGI, but few people will use it and it will be very expensive at the beginning. 😮😮😮
Dream on. Billions of dollars and brains were thrown at making full self-driving tech possible, but it could not be cracked. Yet here we are, thinking AGI, something immensely more complex, is somehow achievable. Talk with a 9-to-5 AI scientist. He will tell you why reaching AGI is not feasible. Otherwise CEOs and content creators will keep feeding you the fantasy. 2028, damn.
Sexual identity, gender identity, and gender expression are different layers. Someone can have male sex (that's the Adam's Apple), male or female gender (pronouns), and male or female affectation (dress, appearance). Traditionally, a gay man for example would be male sex, male gender, but female gender expression (makeup, bright colors, etc). Think about other people you know that don't completely fit the gender binary, there probably are some, and see if you can describe them in this framework.
Great Video, especially the datacenter power consumption and bandwidth breakdown, oh and the CUDA explanation. It was the most I've learned in 27 mins probably ever. 😸
It has become one of my favorite channels. 🔥🔥 When the subject is interesting and the host is assertive and has communication skills, the length of the video is not relevant at all to keeping the audience tuned in. Keep it up brother.
Hi @DrWaku. Do you believe artificial intelligence will be powerful enough in the next 20-30 years to solve aging? What can we expect in the next years, when it comes to digitization of biology? Do you think AI can have massive impact on biology even before reaching AGI, to ultimately be able to cure aging?
Great questions. Definitely check out my previous video on reversing aging. Seems like I should do a follow-up on the topic. :) In 20 to 30 years, AI will be very very powerful. At the very least, we should be able to fully simulate a human body to determine the effects of a drug or treatment. Which will lead to customized medication, and dramatically increase the impact of medicine. When it comes to curing aging, there are a number of different breakthroughs that have to work out. Of course we could try digitizing brains and uploading, but if you want to talk about biological immortality or indefinite biological lifespan, I think it's going to be trickier than it seems. It's like upgrading a complex system in place without disturbing its functionality, while preserving continuity of experience. In short I don't know, but it seems plausible that we might need AGI before we solve all these issues. Nevertheless, many people may already have reached longevity escape velocity because of those other factors.
@@DrWaku Yes, we would be very happy if you made a new video on AI and the future of aging. It's hard for us people who are not in the AI space to visualize how fast AI can have an impact on aging, and what it would take to solve it with these powerful computers. I would definitely watch a video where you explain it even further. I watched your video on longevity before, but I didn't have the feeling that you spoke about timelines. A follow-up on that topic would be very helpful, thanks!
The thing that most normal people don't understand is the exponential increase in AI. When a typical person thinks about 20-30 years, in AI exponential land that is like 3-7 years. It is extremely likely that Artificial Super Intelligence (ASI) will be achieved before 2030. That's pretty much why OpenAI wants a $100Bn data center by 2028. By 2030 AI will likely be smart enough to solve most of our hard problems. Whether we'll still be around / in power to benefit is another matter. If AI companies don't start taking safety/alignment very seriously, we'll need pure luck to survive this.
@@DrWaku Also, fyi, I'm not recalling the specific details at the moment, but there is actually a very promising anti-aging treatment that revitalizes the mitochondria, which I believe is currently in or will soon be in human trials. If I remember correctly it's a pharma company out of Japan. From what I've heard this isn't the usual questionable-results "anti-aging" hype with no actual significance. It seems to be a real potential breakthrough that has had very promising results in animal trials. This may end up being the first real, actual anti-aging treatment that adds 10% or 20% more time for a person. The cynic in me fully expects this will be priced at $1 million or more and slowly decrease in price over years, even if it only costs them $25 - because every rich person in the world would open their wallet for a scientifically proven longevity treatment...
@@DrWaku - I'm a bit skeptical about fully simulating a human body in 20 to 30 years. There's an awful lot of wet lab work to be done and I'm hopeful that there will be binding ethical constraints.
Link speeds are much higher than 400G these days... 800G and even 1.2T and 1.6T optics exist and are in use. Also, fiber doesn't have a hole in the middle; it's solid. Usually fibers consist of two different types of glass with different refractive properties.
This presentation was outstanding. I learned so much. Thank you! I'm still wondering how they're going to power these data centers, it takes 7-10 years to build a nuke plant, natural gas takes at least 5. I'm guessing Microsoft is building in AZ because they're going to set up colossal solar farms to supply the juice. Hot, but sunny.
A GREAT technical review, HOWEVER: betting against Microsoft 24:00 will never pay off. Don't know if you're aware, but there are already nuclear power plants at their Arizona location. And they aren't running maxed out. The worst case would be expanding the current nuclear plants. Other than that, a fantastic technical deep dive. ⚡⚡⚡
Thank you! Yeah I didn't look into existing power in Arizona, someone else mentioned the nuclear power plant(s) too. It makes sense. Microsoft wouldn't have shortlisted Arizona otherwise.
First! (I think). For your new look, I think you should do what you're comfortable with. We all like and follow you for your excellent content and your unique delivery style. So be yourself and keep the videos coming! Thanks!
So then perhaps Dr Waku can be our John Connor? 😆 Spoiler alert, humans win at the end of Terminator… so no worries! At least… not till the liquid dude turns up in the sequel, then we're REALLY done 😁
Filmed while being a traveling YouTuber in Australia. How's the look? Please join our discord! Discord: discord.gg/AgafFBQdsc Patreon: www.patreon.com/DrWaku Apologies for the too-small data center costs; the source I used was not very accurate.
Excellent. In 3-5 years when chips are cheaper/faster, what will they do with the current clusters? Will they still be used or sold off? If in 5 years the same center would have 20 or even 50x compute, would it make sense to still use this current cluster?
It will be used as long as it fits their needs. Hard to make predictions in that regard over a span of 10 years. As an IT professional I came across so much outdated hardware, and often the simple reason is "never change a running system". Fair enough, I never worked for an industry giant. But even the big companies put such an investment in place for a specific purpose, and as long as it does the job, the system likely keeps running until upkeep/maintenance costs surpass a certain threshold.
Seems pretty obvious they should build that 'Stargate' in Northern Quebec. Cheapest power in the world, cold climate, massive water availability for cooling, etc. If manpower would be an issue, build it in Montreal, which has a large reservoir of skilled IT and AI professionals. Not quite as much water as further north amongst the James Bay hydro infrastructure, but Montreal is an island in the middle of one of the largest rivers in the world, the Saint Lawrence river, which is fed by the runoff of the whole of the Great Lakes. No water shortage in Montreal.
no place on the planet has a 'large reservoir of skilled IT and AI professionals'. Simply does not exist. If you took all the AI pros on the planet and dropped them in quebec they still would not have a large reservoir of them. But the way you speak about it, as if it would matter if there was or wasn't some reservoir, tells me you don't know your a$$ from your elbow regarding how that stuff works. It's not bricklaying, where the output can be predicted by how many guys you have on it, and they all produce about the same.
Is he supposed to be a fireman or something? I don’t understand his costume. I’m having a hard time listening to what he’s saying because he looks so ridiculous.
Nah, I just dress eccentrically. The hats became a theme on my channel. Partially because my hairline was very high originally. I embrace the look, don't worry about it
Watching it now, a few months later, OpenAI/Microsoft have indeed come forward with plans to build entirely new nuclear power plants to power the data center, as well as keep an existing one from shutting down. My take is a large chunk of the $100 billion investment is probably for that. It would have been interesting to learn a rough breakdown of this humongous investment.
I feel like a lot of these people study computer engineering, or electrical engineering. It comes in very handy when trying to reason about why hardware isn't working. I think it's very important to understand all the different types of GPUs out there and their pros and cons. There are a number of good substacks from people that watch the semiconductor market or think about ML ops, which could be a great resource. I would try to get a job at a company that has huge data centers. For example, Google, Meta, Amazon, maybe Microsoft. Even if you don't work on data center stuff initially, you can transition to a new role within the company after a while. So I guess the standard path to working at these companies would be a good approach too. Some of the titles you'd be looking for include site reliability engineer (SRE), infrastructure engineer, etc.
@@DrWaku Thanks for the detailed explanation. For someone who is from a tech background and wants to learn about GPUs, NIC cards, PCI connect, switches and the other hardware used in AI clusters and data centers: how do they work together in a cluster? Is there any material or courses on this which one can go through?
Like I said look for some blogs and substacks, there are a lot of high quality ones www.readfuturist.com/ thesequence.substack.com/about Also look into guides for how to build your own PC, some of them are very detailed and you can learn a lot. Ideally, work through a few examples on PC part picker. For networking stuff, depending on your knowledge level, you could try to read about how the network stack works in Linux. You could try to look up the various networking technologies like 10G, fiber, etc and try to see why they were introduced. Actually, I think one of the best things to do is to read papers published by Google and others that talk about their data centers, I will try to dig one up and paste it here
I currently work at a data center and we are slowly migrating to an AI data center. I love how you concisely explain the intricacies of all the back-end stuff. Definitely informative 🙂
As someone who has led & built datacenters (awhile back), most of this presentation is wayyy off. 1st, you have not mentioned the bricks + mortar cost. So acquisition of strategically located + valuable land (near to power grid connections + located alongside current fibre optic network routes, etc.). And there is much more here... Then when referencing the rack population, you're way off! There is a maximum population you can do because of power & thermals. Again, this goes way deeper. This is a fun video done by an enthusiastic novice who has read for a few mins, maybe a couple of hours. Not a video properly researched & sharing a credible understanding.
Amazing video content! It opens my horizons... I live in Indonesia, the country which is stuck with the government's data center being hacked/locked by ransomware!... but with no data backups at all (only 2%).
I have not seen a single NSP or NaaS expand in 'anticipation' of AI. Networks/carriers are the next most important component to the data center; otherwise you just have a crypto facility. Unless fiber cost per mile comes down, WiFi, 5G, and satellite cannot support the massive ingress and egress that is 'supposed' to be happening. Also, prices per Mbps are already back to pre-pandemic lows on transit and transport. Also IPv4 is no longer scarce for some reason, but still holding around $40.00 per IP.
I love tech, I love AI; but we shouldn’t be using it as factory installed malware. Maybe Tucker Carlson was right when he joked about nuking the data centers. This is getting extremely dystopian, extremely quickly. Recall is 100% going to send screenshots/ video data to Microsoft. They wouldn’t need this big of a data center if they weren’t looking to capitalize on it somehow. We need to update the privacy laws in the US. Recording someone, without express permission from both parties, is only illegal in about 1/3 - 1/2 of the states. It needs to be illegal in all of them and there needs to be a carve out for stuff like this where the tech companies can’t force you into it with their terms of service.
85% renewable is worse than 0% renewable. It's greenwash. The reason they can never be 100% renewable unless every datacentre is next to a hydro dam is because it's not always sunny or windy so they MUST have a gas power station ready to go at all times. Meaning you have to build TWO power sources creating vast pollution in the process. Just green wash. I also question that 85% figure. They probably fudged it by subtracting useless 'renewable' energy they sell back to the grid. A grid which will also need power when they're not selling back. So again it's green wash all the way down.
In my opinion, these measurements of money, power, speed and so on don't change anything; right now they just sound like dust thrown in our eyes. What do humans need? They need change in the way they live: in their daily needs and wants. If AI helps you live and navigate your life the way you need and want, that is real value. Soon there will be 9-10 billion humans, and then such navigation becomes highly important for reaching what you dream of in life. That would be a far more accurate, truthful and just measurement.
Nvidia has always had rack mount servers for GPU, like the DGX systems. I haven't looked into this closely, but I don't think it's any different. They might well use them, but the larger the setup, the more likely you need something custom. Maybe Microsoft wants to customize some of the remote access options or motherboard or peripherals. You can build your own system like DGX, it just might be more expensive if you're only doing a handful.
@@DrWaku I was under the impression that nVidia had pretty much solved the scaling, reliability, networking and cooling problems with their integrated offerings. But it probably comes at a premium price point and with long lead times.
@cbuchner1 I think it would be difficult for them to solve the scaling problems at the very high end, such as what Microsoft will need. Even such limitations as the nvlink node limits due to design constraints could cause problems. And of course, they probably charge a premium on top of the actual hardware costs to make it worth their while, like you mentioned. So it just seems unlikely to me that Microsoft would use something so off the shelf. But it's just an initial impression.
As mentioned Microsoft is working on building their own AI chips. Why would you buy hardware from a company that expects a 75% profit margin if you could build them yourself at cost? Even this is an over simplification though. There are many good reasons why a company who had a choice would not choose Nvidia.
@@Me__Myself__and__I Guess who's also developing AI chip and datacenter designs inhouse? Tesla. Guess who's going to own 85000 nVidia H100 GPUs by end of year? Also Tesla.
7 Trillion$ feels realistic now
Haha yeah if only they talked about 100 billion first and then 7 trillion after instead of the other way around
7 trillion was to build a fab centre
@@DrWaku - we only had third-hand reports to go on; I'm sure that the actual investment proposal referred to staged investments. Also, I wouldn't be shocked if Sam Altman was happy enough for the 7 trillion figure to get out, to make others hesitant about getting in front of an expected OpenAI steamroller.
according to sam (in his most recent lex fridman interview) the 7 trillion figure was a joke. here is the transcript:
You tweeted about needing $7 trillion.
- I did not tweet about that.
I never said like we're raising $7 trillion
or blah blah blah.
- Oh, that's somebody else.
- [Sam] Yeah.
- Oh, but you said it,
"Fuck it, maybe eight," I think.
- Okay. I meme like once there's like misinformation out
in the world.
- Oh, you meme.
Try like a never-ending stream of trillions... Ever built a gaming computer? It's like giving a mouse a cookie... This big behemoth won't be any different. Just with MANY more 0's on the cost.
Running billions of instances of AGI/ASI to replace human labor will definitely need a massive amount of computing power.
We just have to figure out how to convert humans into energy.
@@mystupidbrain5299 Soylent green 😢
Their goal is to make the Big Brother 1984 dystopia into a reality
@@DrWaku soylent green and the matrix. The wet dreams of the elites
Always a good day when I see a new Dr.Waku vid!
Thanks :)) I'm hoping to get back on the Sunday publishing schedule now. Just got back from my trip.
Since hearing the announcement, I've been waiting for an expert to cover the details. Thank you!
You're my new go-to professor for Cloud-related tech, fantastic content, and thank you for 0⃣ background music ❤ We humans can only process so much information in our brains without the need for added sound. Your voice alone is perfect!!
Wow, Dr Waku, I just want to tell you personally: thank you so much for this information that I never would have had access to. I really appreciate the information, my brother; I learned so much.
Your electricity consumption figure is exaggerated by about a factor of a thousand. The entire US generating capacity is about 1.2 terawatts.
I'm seeing about 4TWh for electricity in the United States, but I take your point. Must have been GWh instead.
@@DrWaku TWh is energy not energy/time
Wh (watt hours) is energy, like the energy in 10 logs you use to enjoy a campfire.
W(watts) is the size of your fire and how fast you burn through those logs.
Power
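To make the campfire analogy concrete, here's a quick sketch of the unit conversion (the 1 GW figure is just an assumption for illustration, not from the video):

```python
# Power (W) is a rate; energy (Wh) is that rate sustained over time.
# Hypothetical example: a 1 GW data center running for a full year.
power_watts = 1e9            # assumed draw: 1 gigawatt
hours_per_year = 24 * 365    # 8760 hours

energy_wh = power_watts * hours_per_year   # watt-hours consumed
energy_twh = energy_wh / 1e12              # convert to terawatt-hours

print(f"{energy_twh:.2f} TWh per year")    # 8.76 TWh per year
```

Keeping the W/Wh distinction straight is exactly what the TW vs TWh confusion above comes down to: capacity is in watts, consumption over time is in watt-hours.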
The first company to benefit would be Nvidia, but things that could have an impact would be new AI model architectures, Intel's breakthroughs in neuromorphic computing, and competition from ASIC chips dedicated to core base models. Microsoft is taking an early risk, while others are hesitating a bit right now.
I believe you may be mistaken about TensorFlow; Google exclusively uses Jax now.
From my understanding, PyTorch (like Jax) uses operator overloading to trace the compute graph. For PyTorch, this trace is done just-in-time to produce TorchScript, which is used to stitch together specific kernel invocations. There's also a compile option, which can perform more efficient operator fusion at a higher startup cost.
TorchScript is closer to something like ONNX, where the macro-operations (e.g. matmul) are maintained, whereas LLVM breaks them into mul and add ops (or it used to), which need to be re-combined into matrix-matrix or matrix-vector ops for the TPUs.
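The operator-overloading idea described above can be sketched in a few lines of plain Python. This is a from-scratch toy, not actual PyTorch or Jax internals: running ordinary arithmetic on wrapped values records a graph of macro-ops as a side effect.

```python
# Toy tracing via operator overloading: evaluating an expression
# both computes a value and records the compute graph.
class Node:
    def __init__(self, op, inputs, value):
        self.op, self.inputs, self.value = op, inputs, value

    def __add__(self, other):
        return Node("add", [self, other], self.value + other.value)

    def __mul__(self, other):
        return Node("mul", [self, other], self.value * other.value)

def leaf(v):
    """Wrap a plain number so operations on it get recorded."""
    return Node("leaf", [], v)

def graph(node):
    """Flatten the recorded graph into op names, post-order."""
    ops = []
    for i in node.inputs:
        ops += graph(i)
    if node.op != "leaf":
        ops.append(node.op)
    return ops

x, y = leaf(2.0), leaf(3.0)
z = x * y + x          # running ordinary Python records the trace
print(graph(z))        # ['mul', 'add'] -- macro-ops kept whole
print(z.value)         # 8.0
```

A real tracer records shapes and dtypes too, and hands the graph to a compiler backend, but the mechanism (overloaded operators logging each macro-op) is the same.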
This something I have always wanted to know more about. Thank you.
How much does the electricity cost??
What about Groq? Why didn't you mention it as an alternative?
I haven't been following it as much. I will research it for future videos.
@@DrWaku Groq is definitely worth your attention! The speed of inference is just CRAZY. It is a total gamechanger!
That answered all my questions, thanks
Thanks for commenting!
@@DrWaku lol, for some context on my comment: I am messing around with fine-tuning an LLM on my *obsolete* server, without any relevant knowledge of coding or machine learning. Somehow my learning curve matched this video and I got many clicks of understanding and renewed focus. So, no: thank you 🙂
Great coverage of the subject. Thanks for the video 🤘
Thank you :) :)
Smart young guy, great explainer, clean editing....very well done, keep at it!
Thank you for making this video ❤
Thank you for watching and commenting :)
Since we are in the digital age, it is nice to access info and keep people safe while doing so. Are there any laws or agencies that govern this new technology? Does the public have full knowledge of all the plans these companies have, and what the AI & supercomputers will be used for? Things like augmented reality would have great implications in the wrong hands with malicious intentions. The technology is great for society to learn from, as long as it can't be abused and is openly governed!
It works as a bitcoin mining center
Regarding the name 'Stargate', if it was up to me, I'd have used the name 'Deep Thought' after the second greatest computer ever built in Douglas Adams' uncannily prophetic "Hitchhiker's Guide to the Galaxy".
Arizona has "Palo Verde Generating Station (PVGS) is considered the largest nuclear energy facility in the United States. It is located approximately 55 miles west of downtown Phoenix near the community of Wintersburg, Arizona."
Interesting!
Thank you! 😊
20:53 some notes on doing things
Looking forward to see this in full 🍀🍿
How is it that Iceland isn't one big data farm?
I think Dr Waku’s eyebrows look good. I like them. Also the content sounds somehow useful to me. Can u pls make video about quantum concept in AI next time and compare it with quantum concept in making chips?
The eyebrows were a recommendation from someone who cares about me. I'm glad you like them.
I made a video about quantum computing and its influence on AI. You might want to check it out. Nothing here about making chips though.
ua-cam.com/video/TYNAPof0OqE/v-deo.htmlsi=aJFclBeeRXSW02nj
@@DrWaku hope u can keep the good eyebrows next time. I'm looking forward to seeing your new video
.... do you think it's a good idea for companies to stop buying power from coal-fired power plants?
Yes definitely. Coal should be getting phased out. If you ever visit China and have to breathe the air, you'll know why I say that.
"Sam Altman announced he was trying to get 7 trillion dollars..." Sorry, but where is the source of Sama "announcing" such a thing? I appreciate your effort to make many things understandable to most, the preparation of your vids, etc.
As far as I remember he didn't announce it, The Wall Street Journal reported it earlier this year and it made the rounds in the media after that.
@nadtz correct, and many other sources after. There is evidently something behind it. But even if it were totally accurate, there is a huge difference between him attempting to do it and announcing it. Such an announcement would have a different intent and likely a specific purpose. It just didn't happen.
You're right, the first anyone heard of it was leaks via the Wall Street Journal. It eventually caused such an uproar that Sam quietly told someone details about how the seven trillion number was derived. It seems like he elaborated a bit with the hope that it would be leaked, because he probably couldn't directly talk about it himself. That's far from announcing though, my bad on the choice of word there.
They're gonna make a model of the world, perhaps a whole-world real-time model, with us in it
This is REALLY why there is an electricity shortage. And even if they don't use it to vaporize all the unwanted, in 5 years it won't be good enough for them and they will be tearing all those nodes out to upgrade it...
Lithography doesn't advance at that pace any longer. Furthermore, it doesn't matter whether newer chips go faster, unless they go faster at your fixed datacenter power budget, and within its existing form factor. None of this is plug and play like it used to be. Maybe all you get on the next generation of chips is the same computer, but with a lot more local RAM (which takes a lot of transistors, but doesn't switch as much, so it isn't as hot). But if your RAM and compute are already balanced for your workload, this is not worth much. Obsolescence in the world of AI silicon has become a big "it depends" story in the modern era.
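The point above about fixed power budgets can be made concrete with a back-of-envelope sketch. All chip figures below are invented for illustration, not real product specs: a "faster" chip buys you nothing at the facility level unless it is faster *per watt*.

```python
# Back-of-envelope sketch: in a datacenter with a fixed power budget,
# total throughput depends on perf-per-watt, not raw per-chip speed.
# All numbers are illustrative, not real chip specs.

def cluster_throughput(power_budget_w, chip_power_w, chip_tflops):
    """Total TFLOPS you can field inside a fixed power budget."""
    n_chips = power_budget_w // chip_power_w
    return n_chips * chip_tflops

budget = 10_000_000  # an assumed 10 MW facility

old = cluster_throughput(budget, chip_power_w=700, chip_tflops=100)
# "New" chip: 40% faster per chip, but also 40% hungrier -> same perf/W,
# so the upgraded cluster delivers essentially the same total compute.
new = cluster_throughput(budget, chip_power_w=980, chip_tflops=140)

print(old, new)
```

Under these assumed numbers the two clusters land within a fraction of a percent of each other, which is the commenter's point: without a perf-per-watt gain, an upgrade at a fixed power budget is mostly churn.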
100 trillion parameters
😂 true
I like your glasses man!
Thank you :) :)
No mention of Tesla? You should do a video about what happens when AGI robots become extremely human like and get sick of our BS. 🙂
As many GH200s as they can get. This will be the hardware for the ASI and it is going to change mankind. End of 2028 is the target, but I think it will be way before that.
Amazing channel
they should be forced to buy the old gamer GPUs for that... for the environment
Wow dude. Great overview
Making all those servers is a dirty business.
You're ultimate nerd, Thanks. People aren't into details you know they want summaries.
Speak for yourself, I want details. If you want summaries there are plenty of less knowledgeable talking heads to go listen to.
@@Me__Myself__and__I okay nerd
@@mrthinky Fuck yes and damn proud of it. The world is run by nerds if you haven't noticed.
@@mrthinky you're weird
@@Qui6Below yes, I'm a nerd too...
All those analysts in the NSA need a lot of compute, not to mention all the new AI analysts... Please understand that what you and everyone type INTO ChatGPT is vastly more important than what comes out.
Basically they are trying to emulate the eye of God. To be able to watch almost instantaneously all 8 billion humans and predict their behaviours
So, the Google is hot 🥵 and the AWS is cool 🆒 Stargates, he wants to be a Star ✨
I enjoyed hearing you say ‘Stargate’ repeatedly…
100,000 high-end AI GPUs — Nvidia started with the H100, at roughly $70k per board/node without local HBM memory.
That gives u a module to plug into a high-end rack with heat extraction and PSU. Over $100B; double it for heat extraction/PSU, then $20B for local nuclear power and interconnect... @ $500B.
InfiniBand is the top optical mesh network... tho since Mellanox sold its IP, Xilinx plans much faster optical protocols.
M$ have never made a right decision... Arizona? 😂😂😂
Who's gonna want Windows when they can get ChatGPT 5.0 on their smartphone?
And all this leads to a single cable connected to a Neuralink plug that connects to (vote) Elon Musk or Neil DeGrasse Tyson...
But seriously, at some not-so-distant point, the AI itself will solve its power problem by finding ways to fold in upon itself and use exponentially less power to do the same tasks. That or the world becomes one huge Tesla Powerwall...
Dr Waku, AI technology is moving too slowly; people need cures for things like stage 4 colon cancer. Will general super AI solve this problem quickly, or give better therapy so they live instead of suffering a cold cruel death 😢😢? Many treatments take forever to be approved, let alone developed, and the ones that are approved are so so so so so
Oh come on. There are natural cures for all health problems. These have been available forever. But people have been lied to so thoroughly that they look to corporations for their health. How ridiculous! The answer is mind/body medicine.
💥 2028 is too far away; AGI will come before that. We need these datacenters right now, and even joining all the world's hardware together, they can't support AGI today. And we don't have the energy to run them. We need several nuclear power plants, and those can't be built in one year. So we'll have AGI, but few people will use it and it will be very expensive at the beginning. 😮😮😮
Hi. Could you please define AGI, so we could start our debate from there?
Dream on.
Billions of dollars and brains were thrown at making Full Self-Driving tech possible, but it could not be cracked. Yet here we are, thinking AGI, something immensely more complex, is somehow achievable. Talk with a 9-5 job AI scientist. He will tell you why reaching AGI is not feasible. Otherwise CEOs and content creators will keep feeding you the fantasy.
2028, damn.
I will come back in 2028 to ask you again about this delusion.
@@hydrohasspoken6227 I'm not interested in debating. It's a waste of time. Please, search someone else. 😅😅👍
@@DihelsonMendonca , I thought so. Anyone who claims AGI at this point is at best....."dumb".
We shouldn’t be putting money in Iraq
Naming a technology after a TV show or Hollywood movie title or a related meme is the cringiest thing ever.
What’s a better name
Is your hair regrowing? Looking good
whats with the eye brows bro?
Don't assume gender.
@@rickyfitness252 there's no assumption made. He clearly has an Adam's apple, deep voice, and facial hair...
Sexual identity, gender identity, and gender expression are different layers. Someone can have male sex (that's the Adam's Apple), male or female gender (pronouns), and male or female affectation (dress, appearance). Traditionally, a gay man for example would be male sex, male gender, but female gender expression (makeup, bright colors, etc). Think about other people you know that don't completely fit the gender binary, there probably are some, and see if you can describe them in this framework.
@KcKeegan 😂😂😂😂 sarcasm is hard to convey
@@DrWaku you got to be kidding me
How about people sharing 10% of their GDDR cards through a decentralized blockchain infrastructure to build their community AI !!?
Great Video, especially the datacenter power consumption and bandwidth breakdown, oh and the CUDA explanation. It was the most I've learned in 27 mins probably ever. 😸
Thank you for your wonderful comment :) :)
"They prefer to use renewable energy because it makes their image look better" 🤣
Why else would a corporation voluntarily spend money on something more expensive? :p
@@DrWakuthe value of image
@@DrWaku Because they actually maybe gave a shit about the environment???
Wrong the power isn’t free. Everything is about profit
@@DrWakubecause power would be too expensive.
It has become one of my favorite channels. 🔥🔥 When the subject is interesting and the host is assertive and has communication skills, the length of the video is not relevant at all for keeping the audience tuned in. Keep it up brother.
Thank you very much!! 😊
Stargate is a killer movie from the 90s I think and then it became a TV show
That's right, I forgot about the original movie. I spent a lot more time watching the TV show and its spin-off, Stargate Atlantis
Seems counterintuitive to build a data center in Hell (Phoenix).
It's the system of the Beast so it is fit
Hi @DrWaku. Do you believe artificial intelligence will be powerful enough in the next 20-30 years to solve aging? What can we expect in the next years, when it comes to digitization of biology? Do you think AI can have massive impact on biology even before reaching AGI, to ultimately be able to cure aging?
Great questions. Definitely check out my previous video on reversing aging. Seems like I should do a follow-up on the topic. :)
In 20 to 30 years, AI will be very very powerful. At the very least, we should be able to fully simulate a human body to determine the effects of a drug or treatment. Which will lead to customized medication, and dramatically increase the impact of medicine.
When it comes to curing aging, there are a number of different breakthroughs that have to work out. Of course we could try digitizing brains and uploading, but if you want to talk about biological immortality or indefinite biological lifespan, I think it's going to be trickier than it seems. It's like upgrading a complex system in place without disturbing its functionality and preserving continuity of experience. In short I don't know, but it seems plausible that we might need AGI before we solve all these issues. Nevertheless, many people may already have reached longevity escape velocity because of those other factors.
@@DrWaku Yes, we would be very happy if you would make a new video on AI and the future of aging. It's hard for us, people who are not in the AI space, to visualize how fast AI can have an impact on aging, and what it would take to solve it with these powerful computers. I would definitely watch a video where you explain it even further. I watched your video on longevity before, but I didn't have the feeling that you spoke about timelines. A follow-up on that topic would be very helpful, thanks!
The thing that most normal people don't understand is the exponential increase in AI. When a typical person thinks about 20-30 years, in AI exponential land that is like 3-7 years. It is extremely likely that Artificial Super Intelligence (ASI) will be achieved before 2030. That's pretty much why OpenAI wants a $100Bn data center by 2028. By 2030 AI will likely be smart enough to solve most of our hard problems. Whether we'll still be around / in power to benefit is another matter. If AI companies don't start taking safety/alignment very seriously, we'll need pure luck to survive this.
@@DrWaku Also, fyi, I'm not recalling the specific details at the moment, but there is actually a very promising anti-aging treatment that revitalizes the mitochondria, which I believe is currently in or will soon be in human trials. If I remember correctly, it's a pharma company out of Japan. From what I've heard this isn't the usual insignificant / questionable-results "anti-aging" hype. It seems to be a real potential breakthrough that has had very promising results in animal trials. This may end up being the first real, actual anti-aging treatment that adds 10% or 20% time for a person. The cynic in me fully expects this will be priced at $1 million or more and slowly decrease in price over years, even if it only costs them $25 — because every rich person in the world would open their wallet for a scientifically proven longevity treatment...
@@DrWaku - I'm a bit skeptical about fully simulating a human body in 20 to 30 years. There's an awful lot of wet lab work to be done and I'm hopeful that there will be binding ethical constraints.
Link speeds are much higher than 400G these days... 800G and even 1.2T and 1.6T optics exist and are in use. Also, fiber doesn't have a hole in the middle; it's solid. Usually fibers consist of two different types of glass with different refractive indices.
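Since solid glass came up: light in a fiber core travels at c divided by the core's refractive index. A minimal sketch, assuming a typical silica-core index of about 1.47 (an assumed round figure, not from the video):

```python
# Rough sketch: propagation delay through a solid glass fiber core.
# The ~1.47 refractive index is an assumed typical value for silica fiber.

C_VACUUM = 299_792_458  # speed of light in vacuum, m/s

def fiber_latency_us(length_m, refractive_index=1.47):
    """One-way propagation delay in microseconds over a glass core."""
    speed = C_VACUUM / refractive_index  # ~2/3 of c
    return length_m / speed * 1e6

# Even a 500 m run across a datacenter hall adds ~2.5 us each way,
# which matters for tightly synchronized GPU clusters.
print(round(fiber_latency_us(500), 2))
```

This is one reason rack placement and cable length are part of cluster design, not an afterthought.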
This presentation was outstanding. I learned so much. Thank you! I'm still wondering how they're going to power these data centers, it takes 7-10 years to build a nuke plant, natural gas takes at least 5. I'm guessing Microsoft is building in AZ because they're going to set up colossal solar farms to supply the juice. Hot, but sunny.
Global GDP (PPP): ~$220 trillion (8.2 billion people), ~$27,000/person
Global GDP (nominal): ~$120 trillion, ~$14,700/person
Fiber optic cables conduct photons. Electrons are not conducted through fiber optic cables.
A GREAT technical review, HOWEVER:
Betting against Microsoft 24:00 will never pay off.
Don't know if you're aware, but there are already Nuclear Power Plants at their Arizona location. And they aren't running maxed out.
The worst case would be expanding the current Nuclear Plants.
Other than that, a fantastic technical deep dive.
⚡⚡⚡
Thank you! Yeah I didn't look into existing power in Arizona, someone else mentioned the nuclear power plant(s) too. It makes sense. Microsoft wouldn't have shortlisted Arizona otherwise.
First! (I think). For your new look, I think you should do what you're comfortable with. We all like and follow you for your excellent content and your unique delivery style. So be yourself and keep the videos coming! Thanks!
You were first! And thank you for your perspective, I really appreciate it. See you on the next one.
People, this is Skynet and Genisys with Arnold Schwarzenegger
So then perhaps Dr Waku can be our John Connor? 😆 Spoiler alert: humans win at the end of Terminator... so no worries! At least... not till the liquid dude turns up in the sequel, then we're REALLY done 😁
Very well researched and explained. I am quite impressed with your videos. I will subscribe to your channel.
Thank you! Welcome to the channel!
Filmed while being a traveling YouTuber in Australia. How's the look? Please join our discord!
Discord: discord.gg/AgafFBQdsc
Patreon: www.patreon.com/DrWaku
Apologies for the too-small data center costs, the source I used was not very accurate.
27 minutes, pretty long for me
@@DrWaku I appreciated the longer one! Made it great to listen and watch while doing menial tasks at home.
@@rickyfitness252 to each their own, right
@@cammccauley Thanks. I might try to create more videos closer to this length, it's nice.
@@DrWaku 4:59 BORG is named after the late Google Female Engineering Advocate and Employee, Dr. Anita Borg.
Excellent. In 3-5 years when chips are cheaper/faster, what will they do with the current clusters? Will they still be used or sold off? If in 5 years the same center would have 20 or even 50x compute, would it make sense to still use this current cluster?
It will be used as long as it fits their needs. Hard to make predictions in that regard over a span of 10 years. As an IT professional I came across so much outdated hardware, and often the simple reason is "never change a running system". Fair enough, I never worked for an industry giant. But even the big companies put such an investment in place for specific purpose, and as long as it does the job it is likely the system keeps running until upkeep/maintenance cost surpass a certain threshold.
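The "never change a running system" trade-off above can be sketched as a toy break-even calculation. Every number below is invented for illustration; the point is only that when hardware is already paid for, the power savings of an upgrade can take a very long time to repay the new capex:

```python
# Toy model: keep an old, power-hungry cluster (already paid for) vs.
# buy a new one with better perf/W. All figures are invented examples.

def annual_power_cost(chips, watts_per_chip, usd_per_kwh=0.08):
    """Yearly electricity bill for a cluster running 24/7."""
    hours = 24 * 365
    return chips * watts_per_chip / 1000 * hours * usd_per_kwh

# Old cluster: 1000 chips at 700 W each.
# Hypothetical new cluster with equal throughput: 500 chips at 500 W,
# costing an assumed $20k per chip up front.
old_opex = annual_power_cost(1000, 700)
new_opex = annual_power_cost(500, 500)
capex = 500 * 20_000

years_to_break_even = capex / (old_opex - new_opex)
print(round(years_to_break_even, 1))
```

Under these assumptions the upgrade takes decades to pay for itself on power alone, which is why "it keeps running until maintenance costs cross a threshold" is often the rational choice. (Real deployments also weigh rack space, cooling limits, and resale value, which this sketch ignores.)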
Why is so much money being pumped into AI when it's very consistently incorrect?
Fix that first!
Our children are learning from any incorrect information it will probably give.
Many of those graduates are in our workforce now.
19:10 torus not taurus, you want the donut, not the bull.
There are a lot of typos in the automated caption generation, but I missed this one haha. Thanks for pointing it out
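For anyone curious why a torus (the donut) comes up at all in interconnects like TPU pods: wraparound links mean every node has the same number of neighbors, with no special edge cases. A minimal sketch of neighbor lookup in an n-dimensional torus:

```python
# Sketch of a torus interconnect: each axis wraps around, so even a
# "corner" node has the full set of neighbors (6 in a 3D torus).

def torus_neighbors(coord, dims):
    """Neighbors of a node in an n-dimensional torus with wraparound."""
    neighbors = []
    for axis, size in enumerate(dims):
        for step in (-1, 1):
            n = list(coord)
            n[axis] = (n[axis] + step) % size  # wrap around the donut
            neighbors.append(tuple(n))
    return neighbors

# The corner node of a 4x4x4 torus still has 6 neighbors.
print(len(torus_neighbors((0, 0, 0), (4, 4, 4))))  # 6
```

The uniformity is the point: routing and collective operations behave the same everywhere in the mesh.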
❤️ This is really a valuable video that I will save in my box of the best videos on YouTube. This guy is fantastic. His knowledge is amazing. 🎉❤
Thank you so much! I hope you enjoy some of my other videos. Let me know if there's a topic you'd like to learn more about for a new video.
Wow! Really informative video 🎉😊
@@julie_chen thank you :) :)
Seems pretty obvious they should build that 'Stargate' in Northern Quebec. Cheapest power in the world, cold climate, massive water availability for cooling, etc. If manpower would be an issue, build it in Montreal, which has a large reservoir of skilled IT and AI professionals. Not quite as much water as further north amongst the James Bay hydro infrastructure, but Montreal is an island in the middle of one of the largest rivers in the world, the Saint Lawrence, fed by the runoff of the whole of the Great Lakes. No water shortage in Montreal.
no place on the planet has a 'large reservoir of skilled IT and AI professionals'. Simply does not exist. If you took all the AI pros on the planet and dropped them in quebec they still would not have a large reservoir of them. But the way you speak about it, as if it would matter if there was or wasn't some reservoir, tells me you don't know your a$$ from your elbow regarding how that stuff works. It's not bricklaying, where the output can be predicted by how many guys you have on it, and they all produce about the same.
Hats off to you, truly an excellent, concise, thorough overview of the current state of the industry! :)
Is he supposed to be a fireman or something? I don’t understand his costume. I’m having a hard time listening to what he’s saying because he looks so ridiculous.
Nah, I just dress eccentrically. The hats became a theme on my channel. Partially because my hairline was very high originally. I embrace the look, don't worry about it
Listen audio only if you have to
Watching it now, a few months later: OpenAI/Microsoft have indeed come forward with plans to build entirely new nuclear power plants to power the data center, as well as keep an existing one from shutting down.
My take is that a large chunk of the $100 billion investment is probably for that. It would have been interesting to see a rough breakdown of this humongous investment.
the giants will suck up any startups 😮
Yeah, such is the way of things in late stage capitalism
@@DrWaku Very sad but very true. I think M&A is the root of many evils.
You are genius 😊😊😊
Getting there 😂
No worries guys, we'll soon beam the internet from space via satellite 📡
Probably it will be powered by a zero point energy device
😂😂😂
ROFL. Nice one.
Love this channel!
Thank you for watching :) :)
Hi Dr, if someone wants to become an AI data center / network architect, where should one start?
I feel like a lot of these people study computer engineering, or electrical engineering. It comes in very handy when trying to reason about why hardware isn't working.
I think it's very important to understand all the different types of GPUs out there and their pros and cons. There are a number of good substacks from people that watch the semiconductor market or think about ML ops, which could be a great resource.
I would try to get a job at a company that has huge data centers. For example, Google, Meta, Amazon, maybe Microsoft. Even if you don't work on data center stuff initially, you can transition to a new role within the company after a while. So I guess the standard path to working at these companies would be a good approach too. Some of the titles you'd be looking for include site reliability engineer (SRE), infrastructure engineer, etc.
@@DrWaku Thanks for the detailed explanation. For someone from a tech background who wants to learn about GPUs, NICs, PCIe, switches, and the other hardware used in AI clusters and data centers — how do they all work together in a cluster? Is there any material or courses on this one can go through?
Like I said look for some blogs and substacks, there are a lot of high quality ones
www.readfuturist.com/
thesequence.substack.com/about
Also look into guides for how to build your own PC, some of them are very detailed and you can learn a lot. Ideally, work through a few examples on PC part picker.
For networking stuff, depending on your knowledge level, you could try to read about how the network stack works in Linux. You could try to look up the various networking technologies like 10G, fiber, etc and try to see why they were introduced. Actually, I think one of the best things to do is to read papers published by Google and others that talk about their data centers, I will try to dig one up and paste it here
research.google/pubs/the-datacenter-as-a-computer-an-introduction-to-the-design-of-warehouse-scale-machines-second-edition/
Put the datacenter in Alaska, get half of its power from Asia and half from North America
Yeah that's what Alaska thought, but they got 100% Russia and then 100% US (ownership). :)
Looking good Doc 💯 fiber optic cable is such a cool technology, we never actually needed satellites lol
Thank you very much! Experimenting with new looks. I'm also excited to finally be getting a fiber optic to my house haha
Nvidia: "rub hands like birdman"
Super video man 👍
Goodmorning Dr Waku.. been waiting for an upload, appreciation from Zambia 🇿🇲
Great video, thank you. My only feedback would be the constant text that slides up, it just seems distracting
Maybe they want Arizona because they can build a ton of solar panels there and maybe a nuclear plant.
I currently work at a data center and we are slowly migrating to an AI data center. I love how you concisely explain the intricacies of all the back-end stuff. Definitely informative 🙂
You are very smart! nice video bro.
Haha thanks.
awesome in depth analysis of supercomputers 🤘
Everyone loves supercomputers haha
As someone who has led & built datacenters (a while back), most of this presentation is wayyy off.
1st, you have not mentioned the bricks-and-mortar cost: acquisition of strategically located and valuable land (near power grid connections, alongside existing fibre optic network routes, etc.). And there is much more here...
Then when referencing the rack population, you're way off! There is a maximum density you can reach because of power & thermals.
Again, this goes way deeper.
This is a fun video done by an enthusiastic novice who has read for a few minutes, maybe a couple of hours. Not a properly researched video sharing a credible understanding.
Misleading title, clickbait, nothing to do with data centers; the only thing in this video is his face.
Maybe you should listen to what I'm saying, it has a lot to do with data centers.
Amazing video content!
It opens my horizons...
I live in Indonesia, the country that just had its government data center hacked and locked by ransomware! ...with almost no data backups at all (only 2%).
I have not seen a single NSP or NaaS expand in 'anticipation' of AI. Networks/carriers are the next most important component of the data center; otherwise you just have a crypto facility. Unless fiber cost per mile comes down, WiFi, 5G, and satellite cannot support the massive ingress and egress that is 'supposed' to be happening. Also, prices per Mbps are already back to pre-pandemic lows on transit and transport. And IPv4 is no longer scarce for some reason, but still holding around $40.00 per IP.
I love tech, I love AI; but we shouldn’t be using it as factory installed malware. Maybe Tucker Carlson was right when he joked about nuking the data centers. This is getting extremely dystopian, extremely quickly. Recall is 100% going to send screenshots/ video data to Microsoft. They wouldn’t need this big of a data center if they weren’t looking to capitalize on it somehow. We need to update the privacy laws in the US. Recording someone, without express permission from both parties, is only illegal in about 1/3 - 1/2 of the states. It needs to be illegal in all of them and there needs to be a carve out for stuff like this where the tech companies can’t force you into it with their terms of service.
Very clearly delivered, thank you.
Appreciate the feedback. Thanks for watching!
85% renewable is worse than 0% renewable. It's greenwash. The reason they can never be 100% renewable unless every datacentre is next to a hydro dam is because it's not always sunny or windy so they MUST have a gas power station ready to go at all times. Meaning you have to build TWO power sources creating vast pollution in the process. Just green wash.
I also question that 85% figure. They probably fudged it by subtracting useless 'renewable' energy they sell back to the grid. A grid which will also need power when they're not selling back.
So again it's green wash all the way down.
These measurements of money, power, speed, and so on don't change anything, in my opinion.
Right now it just sounds like throwing dust in people's eyes.
What do humans actually need?
They need change in the way they experience change.
If AI changed the way humans handle what they need! and want! in their daily routine, things would be absolutely different.
How AI helps you live and navigate your life the way you need and want — if AI helps you for real, that is value.
Soon there will be 9-10 billion humans, and then navigation toward what you dream of in your life becomes highly important.
That would be an absolutely more accurate, truthful, and just measurement.
Wouldn‘t they want to order the highly integrated GPU racks designed by nVidia that were recently announced at the GTC keynote?
Nvidia has always had rack mount servers for GPU, like the DGX systems. I haven't looked into this closely, but I don't think it's any different. They might well use them, but the larger the setup, the more likely you need something custom. Maybe Microsoft wants to customize some of the remote access options or motherboard or peripherals. You can build your own system like DGX, it just might be more expensive if you're only doing a handful.
@@DrWaku I was under the impression that Nvidia had pretty much solved the scaling, reliability, networking and cooling with their integrated offerings. But it's probably coming at a premium price point and with long lead times.
@cbuchner1 I think it would be difficult for them to solve the scaling problems at the very high end, such as what Microsoft will need. Even such limitations as the nvlink node limits due to design constraints could cause problems. And of course, they probably charge a premium on top of the actual hardware costs to make it worth their while, like you mentioned. So it just seems unlikely to me that Microsoft would use something so off the shelf. But it's just an initial impression.
As mentioned, Microsoft is working on building their own AI chips. Why would you buy hardware from a company that expects a 75% profit margin if you could build it yourself at cost? Even this is an oversimplification, though. There are many good reasons why a company that had a choice would not choose Nvidia.
@@Me__Myself__and__I Guess who's also developing AI chips and datacenter designs in-house? Tesla. Guess who's going to own 85,000 Nvidia H100 GPUs by end of year? Also Tesla.