The Race to Build a Perfect Computer Chip

  • Published Nov 9, 2022
  • Digital activity uses a huge amount of electricity with semiconductors near the limit of their efficiency. Now scientists are racing to perfect new chips that use much less power and handle much more data.

COMMENTS • 780

  • @firstnamelastname7941
    @firstnamelastname7941 1 year ago +340

    0:00 The importance of developing low-energy computer chips
    4:12 Carbon Nanotube Transistors
    11:07 Photonics Chips
    15:26 Neuromorphic Computing
    24:47 Conclusion

  • @freddoflintstono9321
    @freddoflintstono9321 1 year ago +109

    From my perspective this was one of the most interesting quicktakes I've seen. Well assembled and presented.

  • @adrielamadi8585
    @adrielamadi8585 1 year ago +58

    I'm a computer scientist specializing in software development, but this made me appreciate the hardware side of things. It's really inspiring.

    • @7eVen.si62
      @7eVen.si62 1 year ago

      Pr!ck

    • @dtrueg
      @dtrueg 1 month ago

      What do you develop? Just wondering. AI is taking over. I was the complete opposite: I would take my dad's old computers in the early '90s (from IBM at the time; I think they only ran DOS, back when 128 MB and floppy disks were awesome, haha), take them apart, and learn how to solder, what did what, and how it all worked. Now I can build and mod pretty much anything for optimal results: mining, model training, servers, etc. I just recently started learning coding languages; wish I'd started sooner, tbh.

    • @adrielamadi8585
      @adrielamadi8585 1 month ago +1

      @@dtrueg I develop web applications and desktop applications, but I want to explore AI development and cloud computing.

    • @dtrueg
      @dtrueg 1 month ago

      @@adrielamadi8585 Get the best GPU you can, preferably the Nvidia 40 series, and you may need to beef up cooling with extra fans if you get into training models. You basically need server hardware for any real results, but it can be done slowly. Making agents, LLMs, or assistants is fairly simple; you just need the right programs, such as Visual Studio / textwebui / LM Studio / AutoGen, then pick a model from Hugging Face and go from there.

  • @thinktoomuchb4028
    @thinktoomuchb4028 1 year ago +228

    I'd heard of these technologies to varying degrees, but this piece on the current progress of all of them was informative and fascinating. Thank you!

    • @rawallon
      @rawallon 1 year ago +5

      Yeah, also, personally I find it quite odd that I never thought about the carbon footprint of our reliance on computers/tech in general.

    • @Ottee2
      @Ottee2 1 year ago

      Fascinating, indeed. Not only do we need to consume energy more efficiently, but we also need to devise novel ways to generate more energy on the planet. Maybe one day, for example, we will have massive solar energy collectors in space that transmit that energy to the planetary surface.

    • @ko7305
      @ko7305 1 year ago

      Epyc.

    • @notevennelson
      @notevennelson 1 year ago +1

      No one asked

    • @alanhat5252
      @alanhat5252 1 year ago

      @@Ottee2 ...without chargrilling intervening birds.

  • @nayeb2222
    @nayeb2222 1 year ago +136

    This just renewed my interest in engineering, truly an inspiring documentary.

    • @atlantic_love
      @atlantic_love 1 year ago +1

      I'm sure tomorrow you'd see a kitty rescue and go "this renewed my faith in humanity", right? You renew your interest in something by doing something.

    • @nayeb2222
      @nayeb2222 1 year ago

      @@atlantic_love nothing renews my faith in humanity, I know it's doomed at this point

    • @HueghMungus
      @HueghMungus 1 year ago

      @@nayeb2222 Then be a prime example, and do not have kids, then dissolve your body and donate all your organs to charity. Thanks!

    • @nayeb2222
      @nayeb2222 1 year ago

      @@HueghMungus I do have faith in religion, so it won't be an option for me

    • @joshuathomas512
      @joshuathomas512 1 year ago +2

      @@nayeb2222 you have faith in fiction

  • @shivangsingh2463
    @shivangsingh2463 1 year ago +317

    Just want to thank the team at Bloomberg Quicktake for making this really high-quality content for us 🙏🏻♥️

    • @JonahNelson7
      @JonahNelson7 1 year ago +3

      It’s genuinely great

    • @ko7305
      @ko7305 1 year ago +1

      Epyc.

    • @JakeWitmer
      @JakeWitmer 1 year ago

      Yep. Too bad they're associated with the totalitarian name "Bloomberg." I've met Bloomberg employees before who were rightfully ashamed to be associated with the name...

    • @AndrewMellor-darkphoton
      @AndrewMellor-darkphoton 1 year ago +1

      They said something without saying anything.

    • @ROSUJACOB
      @ROSUJACOB 1 year ago

      You are welcome, Mr. Singhania.

  • @kayakMike1000
    @kayakMike1000 1 year ago +77

    The greatest enemy of a wonderful technological breakthrough is the advanced technology that works well enough.

    • @ko7305
      @ko7305 1 year ago +2

      Epyc.

    • @Typhonnyx
      @Typhonnyx 1 year ago +3

      Yup, consistency is the death of development.

    • @khatharrmalkavian3306
      @khatharrmalkavian3306 1 year ago +1

      Nonsense. We're using all of our existing technology and pouring hundreds of billions of dollars per year into researching new methods.

    • @mnomadvfx
      @mnomadvfx 1 year ago +1

      Well yes, but more than that: the tech that works well enough is the one already receiving most of the investment.
      If silicon had become untenable 10 years ago, we would have been forced to switch faster to something radically different.
      The same thing is true of NAND flash memory: it's truly terrible for power consumption, and the wear rate of each memory cell is anything but great for long-term storage.
      But because the investment runs so deep, even the most promising replacement tech has been left to rot while flash becomes an ever greater power drain on our mobile devices.

    • @mnomadvfx
      @mnomadvfx 1 year ago

      @@khatharrmalkavian3306 All the nope.
      Hundreds of billions of dollars are going into silicon semiconductor logic and all the other standard computer and information tech of the moment.
      Only a tiny fraction of that amount is going into alternative paths.
      Quantum dots were predicted to replace CMOS image sensors years ago, but nothing is forthcoming, simply because the industry investment in CMOS sensors is too high and QDs are not regarded as enough of a benefit to pull money away from CMOS sensor improvement research.
      You can create a revolutionary memory tech for not much money in a lab like Rice University's, but engineering a chip that scales to competitive bit density with a modern 3D NAND chip costs a shipping port full of money in staff and time, compounded by the industry's lesser experience and knowledge of the newer technologies.
      Some things are being pursued more vigorously, such as metalenses: they can be produced faster and more cheaply than conventional lenses and offer dramatically increased compactness and utility, since a single achromatic metalens element can replace many in an optical system by focusing all wavelengths onto a sensor in one super-thin piece, rather than needing one element per wavelength and others for extra adjustment.
      So they are basically winners across the board relative to the technology they aim to replace. Twenty years from now we will wonder why camera lenses ever used to be so heavy.

  • @CoreyChambersLA
    @CoreyChambersLA 1 year ago +4

    The atom is not the limit to size reduction. Subatomic particles can perform the same functions, better, cheaper and faster.

  • @djayjp
    @djayjp 1 year ago +52

    Still a lot better than delivering physical goods (e.g. Blockbuster vs streaming). I'm sure the former uses more than 100x as much energy.

    • @niveshproag3761
      @niveshproag3761 1 year ago +3

      Yes, delivering physical goods uses 100x more energy, but the convenience of digital means we use it 1000x more. Some people just leave Netflix/YouTube running in the background the whole day.

    • @djayjp
      @djayjp 1 year ago +6

      @@niveshproag3761 I highly doubt that figure.

    • @niveshproag3761
      @niveshproag3761 1 year ago +2

      I highly doubt the 100x too. I just mean our consumption outpaces the increases in efficiency, as shown by the fact that our electricity consumption increases every decade.

    • @djayjp
      @djayjp 1 year ago +4

      @@niveshproag3761 Certainly not proven as you're not isolating variables.

    • @florisr9
      @florisr9 1 year ago +1

      Exactly. We should focus on the way digital equipment makes our lives more productive and efficient, rather than on how it consumes 'much' energy (it doesn't). The ratio of energy consumption to value added is tremendously small compared to other sectors like transportation.

  • @edeneden97
    @edeneden97 1 year ago +5

    I really like the jumps between the founders and the skeptic guy

  • @madad0406
    @madad0406 1 year ago +10

    Literally just passed my computer architecture final and then this was recommended to me haha. Great video!

  • @oldgamer856
    @oldgamer856 1 year ago +4

    24:36
    Where's the mouse?
    * Points at camera *
    What a burn

    • @TheSano
      @TheSano 1 month ago

      Yeah, money wasted type moment 😂

  • @mdaverde
    @mdaverde 1 year ago +146

    What a time to be alive.
    This is obviously exciting work, but the skeptic in me assumes that once vendors see they can do more with less, they'll just do more, and so the cycle continues. What also makes this work brutally hard to take full effect is that the next layer above it needs to change too in order to take advantage of these performance/energy gains (the firmware, instruction sets, algorithms, programming languages, and so on). Over the long term, though, I'm optimistic about this shift.

    • @michaeledwards2251
      @michaeledwards2251 1 year ago +2

      Rust and other modern languages will become more significant. Hardware implementation of typing, inheritance, bounds, soft cells, networks, and flexible control assignment, will all be implemented.

    • @jackdoesengineering2309
      @jackdoesengineering2309 1 year ago +3

      I'm using an APU, not a CPU. It's 35 watts and extremely capable. Computational progress can come with reduced power; it's just that power and performance trade off against each other, and at some point people have to choose lower power over performance gains. As energy prices rise, this choice is reconsidered.

    • @michaeledwards2251
      @michaeledwards2251 1 year ago +1

      @@jackdoesengineering2309
      Underclocking tricks allow the computational rate to be reduced to the demand rate with lower power consumption. This still allows high computational rates, with high power consumption, whenever they are needed.
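      The saving behind underclocking follows from the standard CMOS dynamic-power model, P ≈ C·V²·f: lowering the clock usually also permits a lower supply voltage, so power falls superlinearly. A minimal sketch; the capacitance and voltage figures below are made-up illustrative values, not measurements of any real chip:

      ```python
      def dynamic_power(c_eff, volts, hertz):
          """Classic CMOS dynamic-power model: P = C_eff * V^2 * f."""
          return c_eff * volts * volts * hertz

      # Hypothetical chip: halving the clock also allows dropping the
      # voltage, so power falls by much more than half.
      p_full = dynamic_power(1e-9, 1.2, 3.0e9)  # ~4.32 W at 3.0 GHz, 1.2 V
      p_half = dynamic_power(1e-9, 0.9, 1.5e9)  # ~1.22 W at 1.5 GHz, 0.9 V
      print(p_full, p_half)
      ```

      Roughly a 3.5x power saving for a 2x slowdown, which is why OS governors race to lower frequencies as soon as demand drops.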

    • @amosbatto3051
      @amosbatto3051 1 year ago +9

      To some degree the energy consumption hasn't fallen, especially with desktop PCs, but the greater energy efficiency has made possible all sorts of new form factors which are much more energy efficient: laptops, netbooks, tablets, smart phones and smart watches. Eventually we will get to smart glasses, smart earbuds and smart clothes that are extremely energy efficient and can replace much of the functionality of traditional PCs. If you look at energy consumption in advanced economies, it is actually falling, which is an indication that we are doing more with less energy.
      As a computer programmer, I can tell you that energy efficiency is becoming increasingly important in programming. Not only are programmers focusing more on code that can deal with low energy systems running on a battery, but they are focusing more on compiled languages, such as Rust, Swift, Go and Julia, that use less memory and computing cycles than interpreted languages.

    • @boptillyouflop
      @boptillyouflop 1 year ago +4

      @@michaeledwards2251 Hardware implementation of typing, inheritance and bounds hasn't yet been able to make any of these things faster for code that uses them:
      - Inheritance is basically a fancy jump instruction. The main problem with this is that with inheritance, your jump address usually has to be loaded from memory, which can take many cycles, and the CPU has to basically guess the branch target and run a whole bunch of speculative instructions while the address loads for real. Having a special version of "jump to variable address" just for inheritance just doesn't gain much over the regular variable jump.
      - Bounds is likewise a fancy conditional branch. Conditional branches that rarely get taken are already quite cheap on modern CPUs - they do take up slots in the instruction decoder and micro-op execution but they don't compete for the really important slots (memory loading/storing). In fact, loading the bound is definitely slower than testing it (since it uses a memory load instruction). The speed gain from adding hardware bounding tests is likely to be rather small.
      - Typing is in a similar situation. Usually dynamic typed variables are either just dynamic versions of small fixed-size static types (double, float, bool, int32_t, int64_t) or larger dynamic-sized variable types (strings, objects, maps, etc). The larger dynamic-sized types have to be handled in software (too complex for hardware), so you'd still have to load the type and test for it. The small fixed-size types could conceivably be handled in hardware but you'd probably just be using the largest type all the time.
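      The "inheritance is a fancy jump" point above can be made concrete even in a high-level language: a virtual call is a table lookup followed by an indirect call, and the lookup (a memory load) is the expensive part. A toy sketch; the class names are invented for illustration:

      ```python
      class Animal:
          def speak(self):
              return "..."

      class Dog(Animal):
          def speak(self):
              return "woof"

      def dispatch(obj):
          # What a vtable does, in hardware terms: load the call target
          # from a per-type table (a memory load that may miss cache)...
          method = type(obj).speak
          # ...then make an indirect jump to whatever address was loaded.
          return method(obj)

      print(dispatch(Dog()), dispatch(Animal()))
      ```

      A special "jump for inheritance" instruction would still have to pay for that load, which is why it gains little over an ordinary indirect jump.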

  • @hgbugalou
    @hgbugalou 1 year ago +102

    Software development is also important and is rarely considered in these types of scenarios pertaining to compute efficiency and carbon output. Today's developers write bloated, inefficient code using high-level languages that add even more overhead. This comes out as wasted CPU/GPU/DPU cycles and thus wasted energy. To some degree the increase in hardware power has caused this; before, developers had to be much more diligent about writing lean code.

    • @zhinkunakur4751
      @zhinkunakur4751 1 year ago +8

      C'mon, are you really suggesting high-level languages are bad and inefficient? I believe high-level languages are really inevitable.

    • @zhinkunakur4751
      @zhinkunakur4751 1 year ago +4

      What we should look at more is highly efficient conversion from high-level instructions (like basic English) to machine language using machine learning, exploiting the energy-efficiency advantage of analog. You cannot stop the inevitable, but we can get more efficient code, and there isn't only one way to do it.

    • @hgbugalou
      @hgbugalou 1 year ago +7

      @@zhinkunakur4751 I am not suggesting that entirely. High-level languages are awesome; things like Python have enabled countless cool and invaluable solutions and gotten a lot of people into coding. Part of the benefit of more powerful hardware is the amount of abstraction that can be done while still getting the job done nicely for the end user. My point was only that I worry that as these things advance, the lower-level stuff will start to become lost, and appreciation for how efficient low-level languages can be will be more and more underappreciated, due to lack of understanding or thinking it's voodoo not worth getting into. There are still a lot of scenarios where efficient code matters, and the closer to hardware you are, the better. It is important we do not lose sight of that or let that knowledge become stale.

    • @zhinkunakur4751
      @zhinkunakur4751 1 year ago

      @@hgbugalou I see, agreed. I too am a little worried about the growing unpopularity of LLLs. Or maybe their share only seems to be going down because more people are getting into coding and the vast majority of them use HLLs; it's not that the growth rate of LLLs is falling, maybe HLLs just have a higher growth rate.

    • @mnomadvfx
      @mnomadvfx 1 year ago +3

      Hopefully generative AIs can do something about that.
      It's something I have often thought about when observing the painfully slow development process of new video codecs, from ISA-portable C code to fast, efficient ISA-specific SIMD assembly.

  • @omid_saedi
    @omid_saedi 1 year ago +36

    Such awesome content you produce. It has something to teach nearly anybody, at any level of knowledge of the problem.

  • @AlanTheBeast100
    @AlanTheBeast100 1 year ago +18

    I heat my house with electricity for nearly 6 months per year, so during that time 100% of the electricity used by electronics in the house costs nothing extra: their waste heat just offsets the heating.
    Data centres should be located where there is a need for heating water or other materials, so all of the heat can be dumped into the early part of the manufacturing process.

    • @paulspvk6049
      @paulspvk6049 1 year ago +1

      You're correct about the second part. But unless you're using a space heater instead of an HVAC system, it's not zero: heat pumps, which move thermal energy, are more efficient than pure electric-to-heat converters.

    • @AlanTheBeast100
      @AlanTheBeast100 1 year ago

      @@paulspvk6049 Regardless of how I heat, the heat from all things dumps into the house - so no extra charge ($). As to heat pumps: True enough, but it's cold here (-24°C presently, -30°C tomorrow - will be mild next week) and electricity is very, very cheap - whereas heat pumps are expensive to buy and install - and fail expensively. That said, Hydro Quebec will subsidize about $1000 if I throw a heat pump at it. Maybe some day.

    • @paulspvk6049
      @paulspvk6049 1 year ago +1

      @@AlanTheBeast100 Makes sense.

  • @siddhantjain243
    @siddhantjain243 1 year ago +20

    The lithography "nm" figure these days isn't an exact number, i.e. a "5 nm" process doesn't actually mean 5 nm features.

    • @djayjp
      @djayjp 1 year ago +5

      Yeah, TSMC states it's more of a marketing term than anything.

    • @siddhantjain243
      @siddhantjain243 1 year ago +3

      @@djayjp Same goes for Samsung and Intel.

    • @DementedPingu
      @DementedPingu 1 year ago +3

      @Cobo Ltger Isn't it referring to the size of the transistor gates?

    • @mmmmm49513
      @mmmmm49513 1 year ago +1

      It did at one point, but now it's just used to say it's 2x better than some older process, etc.

  • @trolly4233
    @trolly4233 1 year ago +45

    Can confirm we are running out of chips, the chip-to-air ratio changed from 50-50 to 35-65. Trying times indeed. The bag itself is worth more than the chips inside now.

  • @pterandon
    @pterandon 1 year ago +6

    Superb presentation. Both “pop culture” exposure, and real technical info for experts

  • @johnsavard7583
    @johnsavard7583 1 year ago +22

    We've already made amazing strides in the power efficiency of computers. An IBM 360/195, with cache and out-of-order execution like most modern computers, used much more power. And go back to the days when computers used vacuum tubes instead of transistors, and their power consumption relative to the work they could do was far higher.

    • @ko7305
      @ko7305 1 year ago

      Epyc.

    • @mnomadvfx
      @mnomadvfx 1 year ago

      That is true, but back then the worldwide use of computers was just a tiny fraction of what it is now.
      The increase in use means we need to push hardware efficiency ever further to keep up.

    • @0xD1CE
      @0xD1CE 1 year ago +3

      Wirth's law. We don't necessarily need better computers; we need software to be more efficient. Nowadays it's normal for programs to occasionally crash due to memory leaks or bugs in the code. I work at a data center, and I have to use an app on my phone for daily routine inspections that crashes when open for too long... It's crazy how tolerant we've become of unstable software.

  • @chi4829
    @chi4829 1 year ago +2

    15:11 The interconnect cables devised to mitigate energy consumption challenges in data centers are simply optical fiber interconnects plugged directly into the ASIC. Co-packaged optics technology bridges the gap between electronics and photonics by integrating them on a common platform, with the photonic chip serving as the first point of contact to the external world.

  • @hyperteleXii
    @hyperteleXii 1 year ago +42

    This new chip sounds like a pathfinding co-processor to my game-developer ears. Navigating an order of magnitude more agents in real time in a dynamic world would revolutionize game development. Everybody's stuck on pathfinding; we're still using algorithms from the 1960s.
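    The 1960s algorithm in question is presumably A* (published in 1968), still the workhorse of game pathfinding. A minimal grid version, for reference:

    ```python
    import heapq

    def astar(grid, start, goal):
        """A* search (Hart, Nilsson & Raphael, 1968) on a 4-connected grid.
        grid is a set of walkable (x, y) cells; returns a list of cells or None."""
        def h(p):  # Manhattan-distance heuristic (admissible on a grid)
            return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

        frontier = [(h(start), 0, start, [start])]  # (f, cost, node, path)
        seen = set()
        while frontier:
            _, cost, cur, path = heapq.heappop(frontier)
            if cur == goal:
                return path
            if cur in seen:
                continue
            seen.add(cur)
            x, y = cur
            for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
                if nxt in grid and nxt not in seen:
                    heapq.heappush(frontier,
                                   (cost + 1 + h(nxt), cost + 1, nxt, path + [nxt]))
        return None  # goal unreachable
    ```

    Neuromorphic or massively parallel hardware would not change this logic so much as let many such searches run simultaneously at low power.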

  • @aresmars2003
    @aresmars2003 1 year ago +5

    At least in Minnesota, during winter heating season, I figure waste heat from my PC all goes into heating my home. Of course if I had better home insulation, I'd probably save more in heating that way!

  • @stevesedio1656
    @stevesedio1656 1 year ago +2

    Most of the data that moves around a server farm goes over copper, even when computers are run in parallel.
    Light travels through fiber at about 65% of the speed of light; signals travel through copper at about 60%.
    The devices that convert data to light have the same limits as the devices that drive wire.
    Light can send more than one signal using color, but that only uses a small slice of the available bandwidth.
    Copper wire operates at a lower frequency (maybe 10 GHz vs 50,000 GHz), but uses the entire bandwidth of the wire.
    The big advantage fiber has is how far a signal can travel.
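    The 65% vs 60% figures above imply that raw propagation delay is nearly a wash over data-center distances. A quick back-of-the-envelope check, using the velocity factors from the comment rather than measured values:

    ```python
    C = 299_792_458  # speed of light in vacuum, m/s

    def one_way_delay_us(distance_m, velocity_factor):
        """One-way propagation delay in microseconds through a medium
        where signals travel at velocity_factor * c."""
        return distance_m / (velocity_factor * C) * 1e6

    # 500 m run across a large data hall:
    fiber = one_way_delay_us(500, 0.65)   # about 2.57 us
    copper = one_way_delay_us(500, 0.60)  # about 2.78 us
    print(fiber, copper)
    ```

    Over 500 m the gap is roughly 0.2 µs, so as the comment says, fiber's real wins are reach and bandwidth, not signal speed.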

  • @roopkaransingh1794
    @roopkaransingh1794 1 year ago +5

    Amazing content and such cool information; please keep making these kinds of videos.

  • @gusauriemo
    @gusauriemo 1 year ago +2

    Nengo is software that currently works with Loihi, with the intention of enabling software applications for neuromorphic chips. The research they do at Waterloo in general is quite interesting.

  • @vrajpatel3139
    @vrajpatel3139 1 year ago +7

    The neuromorphic computer looked fire 🔥

  • @RB747domme
    @RB747domme 1 year ago +8

    Fascinating, as Spock would say. It's an interesting decade or two ahead as we grapple with these different technologies, in the hope that one of them will become commercially mainstream and breathe new life into the industry for another 50 years or so, until something newer and more radical appears.
    Ica is also fascinating, especially its neuromorphic learning paradigm; it will definitely accelerate the rate at which robots can learn their surroundings and interact, as well as learn from the past and build on their intelligence.
    The future is definitely bright.

  • @ojhuk
    @ojhuk 1 year ago +49

    Photonics are the future. I've been blown away with the ways they are devising to build logic gates that function by altering photons on a quantum level. Light based computers have been a mainstay in Science Fiction for a long time now and it's amazing to see actual real-world advances with practical applications being made.

    • @koiyujo1543
      @koiyujo1543 1 year ago

      Well yeah, but maybe; it depends. We can always make our current electronic ones better beyond just adding more transistors. I mean, yeah, we will need better materials; something like graphene could make computers hundreds of thousands of times faster.

    • @ojhuk
      @ojhuk 1 year ago +4

      @@koiyujo1543 Yeah, I agree, there are still advancements to be made in electronics. I imagine hybrid photonic/electronic systems will become a thing before we get any fully photonic chips; from what I understand, the benefits of photonics for latency and efficiency go far beyond what is possible with electronics.

    • @katarn848
      @katarn848 1 year ago +4

      I have my doubts about carbon. High-NA EUV lithography will reach its limit with silicon wafers in maybe two decades, and I think limits will be found in how far you can go in complexity, layers, and materials.

    • @billfarley9015
      @billfarley9015 1 year ago

      Offhand I can't think of any examples of light-based computers being a mainstay of science fiction. Can you cite any?

    • @ojhuk
      @ojhuk 1 year ago +5

      @@billfarley9015 I can't offhand either. I thought about Data from Star Trek: TNG, but he's positronic. I also thought about Voyager's computer, but IIRC that's organic. There is Orac, the Liberator's supercomputer from Blake's 7; I always assumed that was photonic, but I may be wrong. I'm sure if I looked hard enough I'd find something; sci-fi writers have far greater imagination and scientific knowledge than I do. :)

  • @ryansumbele3552
    @ryansumbele3552 1 year ago +2

    Very informative, thank you from Cameroon ❤️

  • @dan_rad
    @dan_rad 1 year ago +20

    Could we solve a lot of the energy problem by writing more efficient code? It seems that as processing power has increased, developers have become less concerned with memory constraints. There is also a lot of pressure to ship new features at the expense of optimized code (and of course more and more abstraction layers in coding).
    It's like Parkinson's law, but with computer memory.

    • @zonegaming3498
      @zonegaming3498 1 year ago +4

      I could see AI generating new ways to solve computational problems that reduce the need to compute them.
      For example, DLSS or AI upscaling.

    • @Meleeman011
      @Meleeman011 1 year ago +1

      Nope, because in order to make money, code needs to be shipped fast. What we can do is encode more in less: instead of binary computers, use ternary or even quaternary computation, which could increase the number of possible calculations. The reason developers are less concerned with memory constraints is that efficient code is expensive to write; it takes longer, you need to understand more math and more about how computers work, and it's more prone to bugs and errors. What you need is something simple enough to write that still provides enough control for the task at hand, and most people don't even know until the product has shipped; optimizations happen after the product is built. A real solution would be using analog computers, a whole bunch of them, to do specific calculations and then translate the results into binary. This, in principle, is how and why ASIC mining exists: instead of abusing sand and making it think, we let conventional silicon read the electrical outputs from those analog counterparts, so it needs less power, only reading their results and doing a few calculations here and there.

    • @Meleeman011
      @Meleeman011 1 year ago +2

      The quickest thing you could do is learn how to use Linux and a terminal; you'd already be using less power than the majority of people. Use a window manager like i3wm, and use more terminal applications, including on your phone with Termux. It's not as convenient, but you can do quite a bit with a terminal; so much that I'm convinced real work is done in a terminal.

    • @velvetypotato711
      @velvetypotato711 1 year ago +2

      @@Meleeman011 Using C++ server-side can reduce energy usage.

    • @andyfreeze4072
      @andyfreeze4072 1 year ago

      @@Meleeman011 Mate, I don't want to learn any more than necessary when it comes to computers. They are supposedly made to adapt to us, not us to binary code. Yes, I have used Unix and Linux before, but I gave up; I don't wish to reinvent the wheel, I'll let other nerds like you do that. Yes, I could do this and that, but I choose not to. I can think of better things to do with my life.

  • @user-ue7wu2dh4o
    @user-ue7wu2dh4o 20 days ago

    brilliantly explained to the layman with such recondite acumen.

  • @Rnankn
    @Rnankn 1 year ago +2

    Progress is not tied to computers getting better; it is contingent on excising technology from our lives. Apparently these folks failed to learn the Jevons paradox: "an increase in efficiency in resource use will generate an increase in resource consumption rather than a decrease." Nor do they consider the exponential power of compound growth to exceed any linear reduction or transition. Even the Greeks knew Sisyphus would never get the boulder to the top of the hill, yet techno-utopians gleefully assume a smaller transistor is going to solve all problems.

  • @deiphosant
    @deiphosant 1 year ago +5

    People always talk about how more efficiency will lead to less energy consumption. But if I know anything about humans, it's that they will always push the limits (and power/thermals are the limiting factor right now), so I feel more efficient chips are just going to lead to even more computers and increased performance rather than decreased power draw.

    • @RobinOnTour
      @RobinOnTour 1 year ago +2

      Energy consumption would increase either way

  • @nicolasdujarrier
    @nicolasdujarrier 1 year ago +6

    I think a few other options have not been discussed, like spintronics (with MRAM already on the market) and maybe (flexible) organic electronics…

  • @michaelmccoubrey4211
    @michaelmccoubrey4211 1 year ago +6

    Photonic computers, neuromorphic computers, and CPUs that use carbon nanotubes are very interesting, but frankly, if we wanted to dramatically reduce computer power consumption, we could already do it today.
    We could:
    - use programming languages like C or Rust instead of popular languages like Python (which is something like 45 times less efficient)
    - use RISC-based CPUs such as ARM or RISC-V chips
    - underclock CPUs so that they maximize power efficiency rather than performance
    - use operating system drivers that aim for minimal power use
    If we did these things we could probably use < 1% of the power we currently use. We don't do them, largely because it would be slightly more inconvenient and would require social change rather than innovations in technology.
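    The Python-vs-C gap in the first bullet is easy to glimpse even inside Python itself: an explicit interpreted loop pays bytecode-dispatch and boxed-integer overhead on every iteration, while the C-implemented builtin does the same work in native code. A small self-contained demonstration (the 45x figure is the comment's, not measured here):

    ```python
    import timeit

    data = list(range(100_000))

    def py_sum(xs):
        # Interpreted loop: every iteration pays bytecode dispatch
        # and boxed-integer arithmetic.
        total = 0
        for x in xs:
            total += x
        return total

    # Same answer either way; only the execution model differs.
    assert py_sum(data) == sum(data)

    t_interp = timeit.timeit(lambda: py_sum(data), number=20)
    t_native = timeit.timeit(lambda: sum(data), number=20)
    print(f"interpreted: {t_interp:.4f}s  native: {t_native:.4f}s")
    ```

    On CPython the builtin is typically several times faster; compiling the whole program (C, Rust) extends that saving to everything, which is where language-level energy comparisons come from.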

    • @User9681e
      @User9681e 1 year ago +2

      There is benefit to higher-level languages too. C is only for low-level / high-performance stuff, and it's absolutely irreplaceable there; for other stuff you need higher-level languages like Rust, Python, or Java to actually get projects done in time without messing with optimization too much.
      There is always a use for both.
      About underclocking: the OS has a governor that decides clock speed per workload, so that's already done when you don't need max performance.
      Plus we have E-cores.

    • @rajuaditya1914
      @rajuaditya1914 1 year ago +5

      This is such a normie take that it is hilarious.

    • @User9681e
      @User9681e 1 year ago +3

      @@rajuaditya1914 We're all interested in learning, and yeah, we normies may not understand key concepts.

    • @ericmoulot9148
      @ericmoulot9148 1 year ago

      @@rajuaditya1914 Sounds like a reasonable argument to me. Maybe you have some insights to share that'd change my mind and his?

    • @bhuvaneshs.k638
@bhuvaneshs.k638 1 year ago

Bruhhh, your point number 1 doesn't make any sense. You're talking about the software domain. Plus, there's a reason everyone uses Python: it's faster to build and test

  • @vigneshs6232
@vigneshs6232 1 year ago

Wonderful... great task... enormous knowledge... thank you all!

  • @carl8790
@carl8790 7 months ago

@3:32 there should have been a huge asterisk on the figure of 5 nm (nanometers). The transistors aren't actually 5 nm in size; "5nm" is just marketing for the manufacturing company's next-gen transistors. Because transistors can be manufactured and packaged differently, there's no agreed industry-standard size. Some fabs (places where semiconductors are built) give a density figure of x transistors per mm² of their die, but even that is difficult to verify independently.

  • @PrinceKumar-hh6yn
@PrinceKumar-hh6yn 7 months ago

I am heavily impressed and amazed, at the same time, by the kind of presentation Bloomberg has put together here. Purely scientific...

  • @sergebillault730
@sergebillault730 1 year ago +6

The best alignment process I know of uses magnetic fields. Is there a way to make these nanotubes, or the environment in which they are stored, temporarily magnetic?

  • @rheung3
@rheung3 1 year ago +1

Thanks so much for this beautifully illustrated video about the modern economy, its current limits, and those with the courage and wisdom to go beyond them, while hopefully also starting to reverse the energy footprint driving the climate change disaster.

  • @qwertyali2943
@qwertyali2943 1 year ago

this really got my attention because of my deep interest in science and tech. Thanks, Bloomberg!!

  • @_Pickle_Rick_
@_Pickle_Rick_ 1 year ago +6

So maybe the AI bottleneck (i.e. Level 5 autonomy, "general AI", etc.) is due to the binary nature of the base-layer architecture; from this it sounds like the analogue stochasticity of neuromorphic architectures may be required for AI to meaningfully progress...

  • @johndawson6057
@johndawson6057 1 year ago +3

This was the best explanation I have heard of quantum tunneling. Thanks, guys.

  • @jaredspencer3304
@jaredspencer3304 1 year ago +7

It's got to be tough for these new technologies to compete with silicon, which has had 50 years of uninterrupted exponential growth. Even if a new technology could be better than silicon, it might never get there, because it can't immediately be small enough, fast enough, or cheap enough to compete with the cutting edge.

    • @latvialava6644
@latvialava6644 1 year ago +1

New challenges will create new opportunities. Maybe not with commercial applications at first, but these technological breakthroughs will begin their own journey with defence and space applications!!!

    • @femiairboy94
@femiairboy94 1 year ago

They will eventually get cheap enough; that's the beauty of the capitalist system. 50 years ago owning a computer was impossible. Today the average American has two computers.

  • @weirdsciencetv4999
@weirdsciencetv4999 1 year ago +4

Neuromorphic computers are the next key technology.
What would be interesting is if chips could become more three-dimensional, as opposed to the relatively two-dimensional chips afforded by conventional lithography techniques.

  • @TheRomanTimesNews
@TheRomanTimesNews 1 year ago

    13:00 talk to me boi
    you got me at photon

  • @AlexTrusk91
@AlexTrusk91 1 year ago +3

I'm glad to own some old-school household hardware, like my kettle and toaster, that doesn't rely on chips and has lasted not just 3 years but over 30 and counting (I got the stuff from my parents, and maybe I'll give it to the next generation as magical relics of another time)

  • @grumpeepoo
@grumpeepoo 8 months ago +1

There is already a neuromorphic chip company from Australia called BrainChip, with their 2nd-gen Akida chip already out. They have partnered with ARM, Intel, Prophesee, and MegaChips, to name a few

  • @drivenbyrage5710
@drivenbyrage5710 1 year ago +3

Imagine how much we could save if millions of people weren't living online every waking minute seeking validation, and simply put their phones down. It would save not only energy, but humanity itself.

    • @SumitPalTube
@SumitPalTube 1 year ago

Remember, it is these validation-seeking individuals who are pushing scientists and engineers to innovate and come up with fundamentally new solutions, ensuring that we progress as a species. If we didn't need to upgrade, no one would care to innovate, and we would still be happy with stone-age technology.

  • @alecs536
@alecs536 1 year ago +11

    I'm glad that Pam from "The Office" has finally found her calling

  • @FS-ft8ri
@FS-ft8ri 1 year ago +1

Maybe dielectrophoresis, in combination with flow fields in solution, is a way of tuning and improving the alignment over pre-treated (for instance by lithography) Si wafers with inhomogeneous surface energy.
It worked out pretty well for GaAs nanowires in a study we conducted at the university, aligning them parallel at the contacts.

    • @ivanlam1304
@ivanlam1304 11 months ago

      Am I correct in thinking that the aligned nanotubes would form a large scale matrix of potential MOSFET transistors?

    • @FS-ft8ri
@FS-ft8ri 11 months ago

@@ivanlam1304 In principle I think you could manage to make it that way; however, I have to admit that I am no expert in transistor technology.
My knowledge comes more from surface science / electrochemistry / interface science, especially solid-liquid interfaces

  • @BlackBirdNL
@BlackBirdNL 1 year ago

24:33, "Here is the mouse." Proceeds to point at the Jell-O.
Jump cut to him pointing at the mouse.

  • @BloodyMobile
@BloodyMobile 1 year ago +2

I'd like a study on what percentage of this power consumption goes to user profiling and the processing needed for it.
I wouldn't be surprised if it's around half of it, or more...

  • @duckmasterflex
@duckmasterflex 1 year ago +1

Reducing energy consumption is like adding more lanes to a highway: it won't reduce traffic, it will just add more cars

  • @BradenLehman
@BradenLehman 1 year ago +1

    24:19 "What is this called" *flips off the teacher* 🤣

  • @ankitroy3319
@ankitroy3319 1 year ago +1

This is really what YouTube should recommend

  • @PrivateSi
@PrivateSi 1 year ago

Nice succinct, informative, up-to-date vid and objective analysis. Photonic computing is definitely the way forward. Neuro-photonic and even bio-photonic computing will combine well in the future once the tech is worked out: 1000x more computing using 1000x less power within 20 years. Moore's law will be utterly broken, but in a productive way, via a large tech leap or two, rather than slowing to a standstill as many YouTube vids suggest.

  • @D-Z321
@D-Z321 1 year ago

    My dad works as an electrical engineer in the semi-conductor industry. Pretty crazy stuff.

  • @mousatat7392
@mousatat7392 1 year ago +22

Even though there are hundreds of companies racing in this field, all of them are pushing the world forward, even if they don't win the pie in the end.

  • @sergioespitia7847
@sergioespitia7847 1 year ago +7

    Definitely. This is a pretty important documentary to inspire engineers all over the world.

  • @zAlaska
@zAlaska 1 year ago +2

The Cerebras wafer-scale processor has 850,000 cores; each core is itself a supercomputer. All ultra-interconnected with no bus-speed bottleneck, it outperforms every supercomputer ever built, all on one chip. They believe they have found the pathway to the singularity. I gather the only supercomputer that's faster doesn't exist. At 44,000 watts, perhaps it could do two jobs at once: heating water for the room while predicting the future. It's that fast. You know how when you make a movie it takes forever for the processor to finish rendering it? Picture simulating a nuclear explosion, or fluid dynamics. Current supercomputers draw the event much more slowly than it happens in actuality; this chip can do more work faster and predict the event accurately, in great detail, faster than it can occur. Made by TSMC at the 5-nanometer scale. Strangely, Moore's law will continue: IBM has already produced chips at two nanometers, so surely there's lots of room for improvement yet to come for the Cerebras wafer supercomputer.

  • @blueguy5588
@blueguy5588 1 year ago

Some corrections: 1) the 5 nm process isn't actually 5 nm, it's a marketing term, so the graphic is inaccurate; 2) modern chips are already layered.

  • @martinblake2278
@martinblake2278 1 year ago +1

I was surprised that Mythic was not included in this video to represent analog. The neuromorphic computing part being developed in Mumbai in this video was already created by those guys years ago, and it currently delivers computing power equivalent to current digital standards while using only 3 watts of energy.

    • @I___Am
@I___Am 9 months ago

      Mythic chip?

  • @CrackDavidson1
@CrackDavidson1 4 days ago

Even though these chips would be more expensive to produce, the power savings in use make a huge dent in the lifetime cost of those chips. This is what big data centers live on. So if there is a 1000-times reduction in power usage, *theoretically* they can cost 1000 times more to produce and still be competitive.

  • @anushantony
@anushantony 1 year ago +1

    beautifully put together.

  • @JJs_playground
@JJs_playground 1 year ago

    This was a great little mini-documentary.

  • @nishantaadi
@nishantaadi 1 year ago +10

Semiconductors are the world's new gold and oil.

    • @anantsky
@anantsky 1 year ago

There's nothing new about semiconductors.

    • @organicfarm5524
@organicfarm5524 1 year ago

Semiconductors came about in the 1930s; they are not new

    • @zzmmz3789
@zzmmz3789 1 year ago

      But still begging for oil

  • @ChrisBrengel
@ChrisBrengel 8 months ago

The first minute does a great job of explaining how much electricity computers use.

  • @shadow-sea
@shadow-sea 1 year ago

    absolutely fascinating

  • @Wulfcry
@Wulfcry 1 year ago +1

Reducing environmental impact by choice is said to be used as a performance measure in advancing chip design. It could be worse if none of these developments were released: most designs end up on the shelf without ever being released, not even as small, partly functional designs. All the cost goes to the larger-scale processing of data, while the smallest amount of data processing works just as well to uncover much of a design. However, I applaud how they go about it.

  • @pedropimont6716
@pedropimont6716 1 year ago +1

People forget to consider how a smartphone saves energy by replacing old solutions, like physical maps, books, and so on, which also take energy to produce and pollute the planet

  • @shapelessed
@shapelessed 1 year ago +1

There are data centers that reuse the heat they generate to provide central heating to the towns around them. That's just one example of how much power is spent on computing: it's enough to heat the houses around you, and it's actually even profitable to do so.

  • @studiolezard
@studiolezard 1 year ago +1

    Wouldn't a high frequency vibration like ultrasound while in suspension help to align the nanotubes?

  • @JG_UK
@JG_UK 1 year ago +5

Amazed at how long these alternatives to silicon have been in development. Seems like we're stuck with silicon wafers for this generation

  • @MrChronicpayne
@MrChronicpayne 1 year ago +1

    The guy with the beard is a great commentator/middle man for this Quicktake. Hope to see him again.

  • @dekev7503
@dekev7503 1 year ago +1

As a microelectronics engineering grad student, I'm very well aware of the major challenges that power optimization can pose. There have been many attempts to "cheat" the physical boundaries of materials; some have been successful, some have led to entirely different technologies.

    • @Typhonnyx
@Typhonnyx 1 year ago

Really? Like what, for example?

    • @boptillyouflop
@boptillyouflop 1 year ago

Any technology that can build a >1 GHz 32-bit exact adder...

  • @celdur4635
@celdur4635 1 year ago +3

I think we will always pump in more power even if we have more efficient chips, because there is no limit to what we want to do with them.
So cool, more efficient chips, that's great, BUT we will still increase our energy consumption.

  • @robertpearson8546
@robertpearson8546 1 year ago

You could use q-nary logic like in flash memories. You could use adiabatic circuits. You could use self-timed circuits. You don't have to stay in the same rut, doing the same thing over and over.

  • @jamesjanse3731
@jamesjanse3731 1 year ago +3

    If the production process always creates metallic nanotubes as a by-product, could those be aligned magnetically before removing them from the semiconducting ones?

    • @gkess7106
@gkess7106 1 year ago

      Not when they are copper.

  • @xanokothe
@xanokothe 5 months ago +1

The thing is that while building data centers, these tech companies also build solar and wind

    • @phelan8385
@phelan8385 2 months ago

      They're gonna start building reactors in the data centers

  • @laughingvampire7555
@laughingvampire7555 23 days ago

The real benefit comes from using analog computers, which can optimize energy usage.

  • @MilesBellas
@MilesBellas 1 year ago

    Technology that creates more advanced technology should be the primary military focus.

  • @spaceprior
@spaceprior 1 year ago +1

    Hey, bloomberg, could you put links to the things you discuss in the video description? I'd expect your viewers to be pretty likely to want to look further into things and read stuff.

  • @TotallyRandomHandle
@TotallyRandomHandle 1 year ago +1

I'm a science nerd and fan. But I hope there are simultaneous efforts to develop safe disposal of the carbon nanotube solution at 5:19. The tubes are too small to filter conventionally, and they don't easily degrade. Waste has to be considered, particularly since 33% of it is already known to be unwanted byproducts (always-conducting metallic nanotubes).

  • @stevegunderson2392
@stevegunderson2392 1 year ago +4

    Putting a computer into a toaster is the dumbest use of computing power one can imagine. Are you so lonely that you need emails from your refrigerator? Security = control.

  • @bhuvaneshs.k638
@bhuvaneshs.k638 1 year ago +15

As an Indian, I have to compliment the US for pushing this bleeding-edge R&D. I work in the neuromorphic computing area in India, and I'm sure India will start competing with the USA soon

    • @lophilip
@lophilip 1 year ago +2

      I don't work in that specific field but I'm certain that India will have much to contribute in that technology!

    • @didzisskards7987
@didzisskards7987 1 year ago

India has a great future; however, the day the world starts to cooperate more instead of competing will be the best day for humanity. I'm sure that won't happen any time soon, though. People are just too self-centric and primal

    • @J_X999
@J_X999 1 year ago

      China's already much further ahead in all types of chips, whether carbon, photonic or RISC

    • @bhuvaneshs.k638
@bhuvaneshs.k638 1 year ago

@@J_X999 You mean compared to the USA?

    • @J_X999
@J_X999 1 year ago +1

      @@bhuvaneshs.k638 Compared to India. India has potential but it just isn't as big as many people think

  • @gkess7106
@gkess7106 1 year ago +2

    Light travels in “conductors” not “wires”.

  • @mlc4495
@mlc4495 1 year ago

The human mind is a funny thing. This video brought a long-dormant memory back to the surface and now I can't shake it. I recall reading an article in a games magazine in the mid-to-late 90s that talked about a coming revolution in gaming: light-based computing utilising diamonds! One company had created a games machine supposedly 1,000 times more powerful than the (at the time still speculative) PlayStation 2 and would be releasing it "soon". I'd love to read that article again. I don't even know which magazine I read it in. Perhaps CVG?

    • @luka3174
@luka3174 1 year ago

That said, technology has experienced major innovations in the last 20 years, but the news always likes to make it seem like we'll have futuristic technology in the coming decade

  • @jessty5179
@jessty5179 1 year ago +1

0:00 Misleading image: nuclear power plants don't produce carbon emissions; in fact, nuclear is one of the best alternatives for that specific problem.

  • @pinkliongaming8769
@pinkliongaming8769 1 year ago

    Wow I can't wait for my Quantum Carbon Photonic Neuromorphic Toaster

  • @deltadom33
@deltadom33 1 year ago

It is interesting that you can't combine some of these methods. With neuromorphic computing or neural networks, you could use the efficiency of, say, photonics; but the problem with all these chips is running things like an operating system on top of them. Photonics wouldn't struggle with quantum-mechanical problems, as photons are smaller than electrons, but you have to convert electricity to light using OLEDs.
Neuromorphic computing could solve the carbon nanotube problem, but you can't get smaller than the carbon atom, which is actually quite big, with its protons and neutrons. To separate the metallic tubes you would just need a magnet.
Arranging them on a chip would be the most fascinating part, since you could use a neural network to align the chips in the best configuration.
Light, to me, is the best option with a neuromorphic setup, though making light travel is tricky, since it has different wavelengths on the electromagnetic spectrum.
But even light, or light transistors, will still generate heat. And if you could go down to the width of a photon in electronics or photonics, how could you tell whether a light transistor is on or off? If light had mass, that would make a difference. You could control light if it was moving at a specific frequency, say red-shifted.
These chips are a long way from being used to run games on a computer.
Quantum computing wasn't even brought up.

  • @kayakMike1000
@kayakMike1000 1 year ago +2

I will have that researcher know: my brain uses at least 25 watts when I am sleeping. It's well over 32 watts when I am awake.

  • @wonkafactory936
@wonkafactory936 1 year ago

    Gaming at the speed of light. Trading stocks at the speed of light. Downloading at the speed of light. Money transactions at the speed of light. I love the thought already.

  • @nickvoutsas5144
@nickvoutsas5144 1 year ago

Light traveling through optical chips is the future.
Combining the calculations of a traditional binary computer with an integrated quantum computer makes sense

  • @malectric
@malectric 1 year ago

I'm in awe of the technology and ideas that have been developed to enable material manipulation at molecular and atomic scales. Just amazing. My choice application for new AI technology: recognizing and editing ads out of YouTube videos.

  • @chakrabortyraj
@chakrabortyraj 1 year ago +3

Didn't expect an analog solution to be listed here, but that sounds very promising, along with photonics, which is already a proven technology when it comes to consuming less energy.
My antenna can't catch the rest of the technologies (in fact, none of them, lol), but we have smart and dedicated fellows working on them.
Best of luck to all of us!!!

    • @AshishPatel-vy7mn
@AshishPatel-vy7mn 1 year ago

Computer chips work in the digital domain, as it is easier for us to handle digital data in hardware. Analog design is really hard compared to digital design. Being a digital design engineer myself, I know how much easier things get after analog-to-digital conversion. Real-time analog data is easier to understand in binary.

  • @VA-ie4qq
@VA-ie4qq 1 year ago +3

    I appreciate these new insights. Thank you Bloomberg. Truly exciting new developments.

  • @anodominate
@anodominate 1 year ago +5

My question: will quantum computer chips be made from silicon or something else?

  • @TaylorFalk21
@TaylorFalk21 1 year ago

    I’d imagine computers of the future having computer cubes instead of computer chips. Just layers on layers on layers