Check out the AMD Ryzen PRO laptops and support the channel:
tinyurl.com/222dzww9
Hi Anastasi, always lovely having something new from your channel. What do you think about Akash Systems' diamond-based cooling technology for semiconductors? ua-cam.com/video/xypuFnbudzI/v-deo.htmlsi=PccnazTWCsubKwA0
STEM 💕 IS SUFFICE 1:37
Imagine how fast satellites could run.
0:13 the subtitles say "emits almost zero HIT". I think it means HEAT.
Could the technology also be applied to silicon-based thermoelectrics and extend the operating temperatures of those devices to near absolute zero?
If you look at the paper mentioned here, it is clear that the remarkable low-power behavior and near-ideal switching demonstrated in these sub-kelvin experiments would not carry over to room temperature, nor would it be automatically attainable by further device scaling down to 2 nm. The gains are inherently linked to operating the device in a deeply cryogenic regime. The ultra-low power performance of cryogenic MOSFETs, including subthreshold swings below 1 mV/dec, is tied to reduced thermal excitations at millikelvin temperatures. At room temperature, the thermionic limit of roughly 60 mV/dec still applies, even for advanced 2 nm transistors, because of the thermal (Boltzmann) distribution of carriers. While scaling and material advances improve power and speed, the extreme efficiency seen in cryogenic conditions cannot be replicated at ambient temperatures. So while this is still good news for quantum, it isn't for everybody else.
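For anyone who wants to sanity-check the 60 mV/dec figure, here is a minimal back-of-envelope sketch of the ideal (Boltzmann-limited) subthreshold swing versus temperature; it ignores the body factor and any non-thermal leakage, so real devices are somewhat worse:

```python
# Ideal subthreshold swing: SS = ln(10) * k_B * T / q.
# This is the thermionic floor; real transistors have a body factor > 1.
from math import log

K_B = 1.380649e-23     # Boltzmann constant, J/K
Q   = 1.602176634e-19  # elementary charge, C

def subthreshold_swing_mv_per_dec(temp_kelvin: float) -> float:
    """Minimum possible subthreshold swing at a given temperature, in mV/decade."""
    return log(10) * K_B * temp_kelvin / Q * 1e3

for t in (300, 77, 4.2, 1, 0.1):
    print(f"{t:>6} K -> {subthreshold_swing_mv_per_dec(t):8.3f} mV/dec")
# ~59.5 mV/dec at room temperature, but well under 1 mV/dec below ~4 K,
# which is why the cryogenic results don't carry over to 300 K.
```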
Agreed, even if you could scale the cryogenic conditions down to laptop format, it would still require significant amounts of energy to maintain sub kelvin temperatures. No one wants to carry a science lab around, right?
everybody's a YT-milking guru on YT
so..... superconductors are superconducting near zero K. We know this. Why the fuss, then?
Actually stopped watching this channel. The Cerebras vids seemed like paid sponsorships too. Feels pretty clickbaity these days.
Yep, just listening to the quotes in the video, it suggests the tech is designed to withstand extreme cold; this doesn't relate to the un-cooled device operating temperature at all. It's not the first time this channel has misunderstood a technical paper.
Congrats on picking up AMD as a sponsor! As you summarize, the real question becomes - how quickly can this new process work its way into real world applications and what will mass adoption do for the costs of production?
well said ..did not see any killer app yet 😢
Copilot :) @@t.w.7065
Wow, big sponsor. Careful, potential big conflict of interest.
cringe
@dunravin Do you mind explaining that comment?
I might have missed it in the video, but does this new chip design require cryogenic cooling to work or was the cooling just required for the quantum applications?
It works at room temp, but performance is doubled when the temperature is dropped to 77 Kelvin.
AMD = Anastasi My Darling
Before diving into a commercial, you should make clear that AMD Ryzen is a sponsor and not part of the presentation.
"Efficiency is everything" -- I am happy to hear that. We should emphasize this more. But everything needs time.
Very interesting, but it's not clear if these transistors are also suitable for use at room temperatures? It doesn't seem likely that we will ever have liquid nitrogen cooled laptops, but I see you liked another comment relating this tech to being used in a laptop. So is this a general chip transistor replacement, that particularly excels at cryo temperatures, and if so, how does it compare to today's standard silicone substrate transistors in terms of speed, power, and waste heat, *when running at room temperature* ? Cheers.
Read the paper displayed in the video at 04:09 to find out. I think you can apply it in general, but at room temp you need the heat to get out, so having an insulation layer is counterproductive. At the far-below-sub-zero temperatures used for quantum-level cooling, you want the heat to stay put, in a sense (very simplistic view), for better-controlled dissipation that keeps the overall low temperature steady. That way transistors close together don't pass heat to each other, which can influence their operation (more resistance). This tech could possibly help in the quantum compute realm, I think, or make easier chip designs possible.
The cost and risk of cooling with liquid nitrogen, helium etc. for whatever X-times performance gain are at best equal to, but mostly surpass, the benefit, and then you have the risk and cost of the cooling failing. This was and still is the issue 50 years later. So this won't be a thing widely applied or adopted. Everything quantum computers have done that became a major news headline, normal computing clusters eventually did too, and they did it faster and better, but those don't become news headlines. I'm not saying there's no case for them, but I'm very skeptical and don't like simplistic buzzwords (for more funding).
The superconductivity needed for that type of quantum computing can only be achieved at very low temperatures. Unless some new superconductive material gets made/discovered with those properties at higher temperatures (-30 °C and up), I don't see anything changing in the future. Here they made transistors that operate better at very low temps; now make a functioning quantum computing chip with them that has an actual use case. Then demonstrate it, and I'll be slightly interested 😉.
@@mr.needmoremhz4148 are you trying to justify the moon landing💀
@@-tarificpromo-7196 😂
Would work OK at room temperature but would operate at lower voltage and at lower frequency.
I'm glad there are really smart people like you to break down this specialized information for the rest of us. I have a computer science degree but have a limited understanding of these very low level processes
Loving you today Anastasi ❤
❤ Big praise for the language translation,
How often do the tech revolutions from this channel see the light of day?
Great question. Why wouldn't any company want to use this technology? When they don't, it makes you pause and wonder why. Who stands to benefit from not using it?
She is just another Matt Ferrell-type grifter.
@@samsabruskongen What's wrong with you? It's not okay to throw around insults, especially insults based on speculation.
I would guess that in general, tech breakthroughs either take years to come into everyday use, or are either priced, marketed or bullied out of it.
Thank you Anastasi 🙏🏻
Good day. Fantastic evolution, I can't wait to see theses improvements in my laptop. Thank you (I started my studies in electronics in 1967... ).
Ty Ana this was so cool, pun intended. That insane cooler/cooling process, like you said, looks like something that would be very difficult for a retail market lol
Research comes first. Then comes development and adaptation to current production models.
Anyone old enough to remember IBM research on Josephson junction transistors?
Great that transistor performance can be much better at LN2 temperatures, but I wonder how many watts of heat-pump energy it takes to remove 1 watt at 77 K?
1.3 to 1.5W is typical
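For a rough sanity check, the thermodynamic floor is easy to compute. Below is a minimal sketch of the Carnot limit for a refrigerator; real cryocoolers only reach a fraction of this ideal, so practical figures are several times higher:

```python
# Carnot limit for a refrigerator: COP_max = T_cold / (T_hot - T_cold),
# so the minimum input work per watt of heat lifted is (T_hot - T_cold) / T_cold.
def min_watts_per_watt(t_cold: float, t_hot: float = 300.0) -> float:
    """Ideal (Carnot) input power needed to remove 1 W at t_cold, rejecting heat at t_hot."""
    return (t_hot - t_cold) / t_cold

print(min_watts_per_watt(77))    # ~2.9 W per watt at liquid-nitrogen temperature
print(min_watts_per_watt(4.2))   # ~70 W per watt at liquid-helium temperature
print(min_watts_per_watt(0.1))   # ~3000 W per watt at 100 mK
# Practical cryocoolers operate at a small fraction of Carnot efficiency,
# so real-world numbers are several times these ideal values.
```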
Thank you for very ineresting information.
Interesting & staying tuned. Thanks for the look forward.
"chip designer" badge reminded me how once they used to make games from UA-cam videos using those clickable links placed on top of the vid.
hey, thanks, I am thrilled to hear this !
Thanks for covering this.
My pleasure!
Ana have a blessed Christmas & a fun new year, 👍
Thank you, you too!
is helium cooling also good for limiting quantum effects
or .. doesn't it have any limiting effect
Wow, this looks significant. This should remove lots of limits on CPUs and GPUs. Plus power consumption will go down. So many things could benefit from this.
I just checked out the paper, but it appears that these so-called "new transistors" are at the 10 micrometer scale, which means that they take up 1 million times more area than current VLSI. That is a giant step back in terms of integration density.
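For anyone wanting the arithmetic behind that density comparison, here is a quick sketch; the "modern" feature size below is an assumed ~10 nm effective linear dimension used only for illustration (node names like "3 nm" are marketing labels, not physical sizes):

```python
# Rough area comparison between a ~10 µm research device and a modern logic transistor.
research_device_um = 10.0  # ~10 µm, as quoted from the paper
modern_feature_um  = 0.01  # assumed ~10 nm effective linear dimension (illustrative)

linear_ratio = research_device_um / modern_feature_um
area_ratio = linear_ratio ** 2
print(f"~{linear_ratio:.0f}x linear, ~{area_ratio:,.0f}x area")
# ~1000x linear, ~1,000,000x area -- the "million times more area" figure above.
```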
I thought I was going to complain, but I looked it up: the Z80 transistor was 4.7 µm (4 µm process), so 10 µm is… I should read the paper.
If they're new, there's no reason they can't improve. Our current transistors had to go through the same improvement process. Additionally, our circuits at the current moment in CPUs are incredibly bloated. It's possible to shave off a significant portion of circuitry to accommodate the larger transistors, if we shrink the transistors by a factor of, say, 1000x. Current chips, especially Intel chips, are not optimized for low transistor count.
Thank you. But we iterate! Beginnings get improved. We have the path now. You or others will move us forward.
Lab scale proof of concept is not the same as cutting edge mass produced ASML lithography. It would be ridiculous to expect this to be made on a modern process.
True, but it is not uncommon for new transistor tech to be demonstrated at a larger pitch size prior to being commercialised at a smaller scale.
But what about consumer hardware? Will that work well in room temperature?
What performance would this sort of transistor give if run at room temperature?
I've been hearing "microchip breakthrough" since 2006, each time promising anywhere from 10x to 50x computing performance; still waiting. Heck, I'm still waiting for those graphene CPUs that were supposed to reach 1000 gigahertz.
Breakthroughs don't change things overnight. And linear speed is no match for parallel processing. The physical size (not speed) of transistors was 100 nm in 2004; we are now at 2-6 nm nodes. That's not even mentioning efficiency. It's not ONLY about speed.
This is definitely what we are looking for
Love you Anastasi 💚😊🤗😍❤
beautifully explained ! thank you
Super interesting & super clear presentation of this much needed ultra efficient silicon on insulator transistor. Not only is the current trend of relentless increases in transistor heating unsustainable it is ruinous for quantum computation. This new idea is nice not only because it addresses both of these two critical areas, but it also does not require new fabrication plants. It is remarkable how humans find elegant solutions to what at first sight looks like impossible challenges. Thank you for sharing!
When do u think we gonna hit the physical limit of computing?
Like Wii. Near zero. How cool. And good for business/ processing. 💯☺️😜😎
Can these be used in PC and Laptops too?
From what I understand, the only limit that shifted the frequency race into core race is heat. So of course they will adapt to any technology that shows a cost effective heat handling solution.
Would it be cheaper to operate large data centers/quantum computers in space versus keeping them cool with liquid nitrogen?
Is the photonic computing breakthrough announced by MIT 14 days ago worth making a video on?
Let's hope, what's the next ten years going to look like?
Great Video. Engineering regards from Germany 🇩🇪 Where do you live? An nice warm island? 😎
Thank you for this exciting updates just before Xmas!
How do you think this device be used for something like a GPU?
Great work, thanks a lot!
The last “ciao” really melted me, I need a cooling system now
Is this something that is useful in consumer devices or only cryogenic applications?
A CMOS gate is built from two complementary transistors, a p-channel and an n-channel MOSFET, isn't it?
Both of them are printed on doped silicon.
Is it a good solution to turn one of them upside down to make it easier to manufacture?
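To illustrate the complementary-pair idea this question is about, here is a minimal switch-level sketch of a CMOS inverter; it is a logical abstraction only and says nothing about the physical fabrication or flipping of devices:

```python
# Switch-level sketch of a CMOS inverter: the p-channel device conducts when the
# input is low, the n-channel when it is high, so exactly one of the pair is on
# and (ideally) no static current flows through the stack.
def cmos_inverter(vin_high: bool) -> bool:
    pmos_on = not vin_high  # p-channel pulls the output up to VDD when input is low
    nmos_on = vin_high      # n-channel pulls the output down to GND when input is high
    assert pmos_on != nmos_on, "complementary pair: exactly one device conducts"
    return pmos_on          # output is high only when the PMOS is on

for vin in (False, True):
    print(f"in={int(vin)} -> out={int(cmos_inverter(vin))}")  # in=0->out=1, in=1->out=0
```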
This will be very useful in space applications where heating silicon to keep it running is a huge energy cost in system design!
Can you link the paper for the transistor?
@1:23 I did the numbers and it will cost about 40% less to power this with solar/battery than nuclear. That calc does not include the nuke waste storage for the next 10,000 years.
The problem with heat and cold sounds like a condition encountered in space. So a spaceship on one side is heated by the sun; on the other side, cooled by space.
The big problem, then, becomes shielding.
Very significant for data-centres, where centralised liquid cooling already has the edge on thousands of little fans, and cost of operation is significantly tied to undesired heat generation and its removal. Economics will drive this ahead, especially since existing fabrication can be leveraged.
There are so many breakthroughs on this channel I am broken myself now trying to follow it all..
The article only discusses sub 1K operation. Maintaining that temp would offset any energy saved operating the devices. Looking forward to progress toward room temp. Even 0C would really help.
Isn't most power in CPUs already spent on data movement rather than compute? Does this new technology also affect the cost of data movement?
Amazing!!!!!
Heat is a big factor, BUT the biggest problem today is SIZE. A transistor cannot get any smaller due to the size of an electron and CPU speed has flattened too
This will help make sure we can put something as powerful as an iBook on our wrist and build it into the edge of whatever task we are trying to accomplish. This will also make augmented reality glasses very possible. This may sound weird, but we need to build processors that are the physical glasses frame instead of in a square : ) Your head will be the heat sink
Instead of data, if we could build parallel processing, then we could just make a bunch of tiny parallel processors that we could build into all kinds of things. We also need to build them into thread so you can wear smart clothes
Anyone else getting tired of Qbit and think we need to move onto Rbits or Affectionately known as 4D frogs. LOL
That will be a real JUMP in technology : )
I wrote this on Bluesky to NIST: After quantum computing & qubits comes 4D computing, which has zero latency and can give you the answer before you even asked the question. RBits will be the data standard that defies time and will be a jump in technology, affectionately known as the 4D Frog
Look at all the technology that Gene Roddenberry and others caused to happen. I'm doing the same thing, but I'm not really a writer; I just put posts out on the Internet. The engineers will see this, and that will get them to start thinking about the next thing : )
I'd say, just like the lightbulb had the unfortunate property of emitting heat, and the LED breakthrough tackled that (converting energy to light without the heat part), chips need this kind of breakthrough too (at room temperature).
If this is for real and not too good to be true, this is something we desperately need.
We need that one-gigawatt data center to run on one megawatt; that'd be really fantastic!
frontier? chip? or google willow ?
That IS impressive, but how will you shield it from, say, a solar flare event? I agree that it's wondrous and very cool, but how do or did they overcome this limitation? Will they be used deep underground? Because I don't think we will have desktop computers using this tech.
I was waiting for this video 👍
What can this do for gaming? What would be the practical implications for something like a handheld console like the Steam Deck or Switch? 🤔
Happy Christmas and a great new year 2024-2025! .. Joyeux Noël et une excellente année 2024-2025 !😊😇
Great video, thanks for bringing us the latest. I'm curious if we could somehow mimic mother nature; I've heard about animals like birds using quantum mechanics-based functions to navigate. Perhaps we can copy this architectural base to allow quantum-phenomena at room temperature.
Great video. I hope smart-house tech might also become more energy efficient in heating and cooling a house, fridges/freezers, water heating etc., with e.g. new materials that use less energy, instead of depending on the development of new power sources, making us less dependent on energy providers. E.g. walls/floors in a house that charge with heat for 12 hours and emit it over 24 hours, or roofs that convert indoor heat to energy during the summer, or windows that can be configured to block heat in one direction, e.g. no heat leakage in winter and outside heat reflected in summer.
When?
🤔 won’t they work well at room temperature?
impressive, real good advancement
Can I have a link to the paper you mentioned?
we love you 😍🥰❤❤❤
70 Kelvin is in liquid-nitrogen territory. A data centre will need a cooling system that re-liquefies any gas that boils off. Liquid nitrogen is available by the hundreds of tonnes, as shown by SpaceX at Starbase when running cryogenic tests. The trade-off will be using it as a consumable or re-liquefying the boil-off. This is doable but requires R&D.
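To put numbers on the consumable option, here is a back-of-envelope sketch of how much liquid nitrogen a given heat load boils off if nothing is re-liquefied; the property values are approximate handbook figures:

```python
# Boil-off estimate for liquid nitrogen used as a consumable coolant.
LATENT_HEAT_J_PER_KG = 1.99e5   # latent heat of vaporization of N2 near 77 K (approx.)
LN2_DENSITY_KG_PER_L = 0.807    # density of liquid nitrogen (approx.)

def boiloff_liters_per_hour(heat_load_watts: float) -> float:
    kg_per_s = heat_load_watts / LATENT_HEAT_J_PER_KG
    return kg_per_s * 3600 / LN2_DENSITY_KG_PER_L

print(f"{boiloff_liters_per_hour(1_000):.0f} L/h for a 1 kW load")          # ~22 L/h
print(f"{boiloff_liters_per_hour(1_000_000):.0f} L/h for a 1 MW rack hall")  # ~22,000 L/h
# At data-centre scale, re-liquefying the boil-off clearly matters.
```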
wow and peace be upon you from me
Who knows with this advancement a quantum computer may be able to factor the number 21 or else factor 15 without cheating! Amazing!
I saw a thing about light being used in microchips to make them faster. Why do they want to build quantum computers if they require so much care? What does a quantum computer actually do for a company? Do they use it for the whole company office space?
really cool , literally
Everything new always has both pros and cons.
I want to learn as much about this as possible.
I could have some use for this new technology, so that I keep my cool better under stress.
Efficiency improvements challenge the thinking that AI growth will be limited and delayed by energy production.
AI will not take three times the world's energy production capacity if breakthroughs persist.
It all takes time. Space application growth is fast too. Anastasi, you are spot on.
Heat dissipation is essential, and there are examples: cooling, and making "black fire" by cancelling light with a similar light wave. Dissipating heat with heat is another approach, like refrigerators that use engine heat for cooling. I mean that this needs a kind of programming to achieve heat dissipation in smart, fractal-like materials.
What's the relationship between SemiQon and IBM?
Hardware for the win
So, without the more efficient tech, the zettaflop computers use as much power as a Doc Brown time machine?
How good is Willow? :)
This will be big if they manage to adapt the chip to one of the popular processor architectures. It's an interesting PoC... this will certainly help rocket HPC and QC forward.
Another reason they would thrive in space applications is that, since they generate less heat, there would be less heat to reject from a manned spacecraft. Heat is surprisingly difficult to get rid of in space: despite space being cold, it is a vacuum, so traditional cooling such as convection doesn't really apply.
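To see why every watt matters up there, here is a minimal Stefan-Boltzmann estimate of the radiator area a spacecraft needs; it assumes a deep-space sink and ignores absorbed sunlight, albedo and view-factor losses, so it is optimistic:

```python
# Radiative heat rejection: with no convection in vacuum, a spacecraft can only radiate.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiator_area_m2(heat_watts: float, temp_kelvin: float = 300.0, emissivity: float = 0.9) -> float:
    """Ideal one-sided radiator area needed to reject heat_watts at temp_kelvin."""
    return heat_watts / (emissivity * SIGMA * temp_kelvin ** 4)

print(f"{radiator_area_m2(1_000):.1f} m^2 to reject 1 kW at 300 K")  # ~2.4 m^2
print(f"{radiator_area_m2(100):.2f} m^2 to reject 100 W at 300 K")   # ~0.24 m^2
# Every watt the chips don't dissipate is radiator area (and launch mass) saved.
```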
I was only listening
Sounded Asian
Big surprise
Excellent video
All these new breakthroughs make me think we've attained AGI behind the scenes.
First ad on a youtuber's video that I didn't skip (and was even interested in learning more about). Very good.
Can we get a look into 4DS Memory’s Area Based Non-Filament Interface Switching ReRAM?
Argentina and Chile have an advantage: they could use the cold temperatures from Antarctica for supercomputers, the submarine cables, and their power plants on land.
So you say that when you HIT atoms it destroys quantum states?
The nanosheet technology and cryogenic applications are impressive steps forward. It’s exciting to see how these innovations could reshape quantum computing performance.
Thanks for the German audio track.
As a programmer, sometimes I wish the semiconductor progress stopped or slowed down significantly, so that most of my peers would start paying attention to program performance at least at some point in the future. Right now we have a ridiculous situation where even the most basic apps are written using web browser engines, eating up insane amounts of RAM and heating up CPUs for no good reason.
This seems to be a return of the SOI or SOS idea that was previously abandoned partly because they couldn't get the defect rate low enough. The parts had the advantage of not having the latchup and parasitic issues that the bulk parts had.
Where are the graphene chips?
Great
Vacuum tubes?
Can you review the new Apple M4/Pro/Max chips?
Where are spintronics when we need them? 😊
No heat? So where is the energy going?