At least in Minnesota, during winter heating season, I figure waste heat from my PC all goes into heating my home. Of course if I had better home insulation, I'd probably save more in heating that way!
The Cerebras wafer-scale processor has 850,000 cores, each practically a supercomputer in itself, all ultra-interconnected without bus-speed bottlenecks; they say it outperforms every supercomputer ever built, all on one chip, and believe they have found the pathway to the singularity. I gather the only supercomputer that's faster doesn't exist. At 44,000 watts it could perhaps do two jobs at once, heating the water in the room while predicting the future, it's that fast. You know how when you render a movie it takes forever for the processor to finish? Picture simulating a nuclear explosion or fluid dynamics: current supercomputers draw the event much more slowly than it happens in actuality, while this chip can do more work, faster, and predict the event accurately and in great detail faster than it can occur. Made by TSMC at the 5 nm node. Strangely, Moore's law will continue: IBM has already produced chips at two nanometers, so surely there's lots of room for improvement yet to come for the Cerebras wafer supercomputer.
So maybe the AI bottleneck (i.e. level 5 autonomy, 'general AI', etc.) is due to the binary nature of the base-layer architecture - from this it sounds like the analogue stochasticity of neuromorphic architectures may be required for AI to meaningfully progress...
Imagine how much we could save if millions of people weren't living online every waking minute seeking validation, and simply put their phones down. It would save not only energy, but humanity itself.
Remember, it is these validation-seeking individuals who are pushing scientists and engineers to innovate and come up with fundamentally new solutions, ensuring that we progress as a species. If we didn't need to upgrade, no one would care to innovate, and we would still be happy with stone-age technology.
I was surprised that the Mythic chip was not included in this video to represent analog. The neuromorphic computing part being developed in Mumbai in this video was already created by those guys years ago, and it currently delivers computing power equivalent to current digital standards while using only 3 watts of energy.
The best alignment process I know of is using magnetic fields. Is there a way to make these nano tubes or the environment in which they are stored temporarily magnetic?
Neuromorphic computers are the next key technology. What would be interesting is if chips can be more three dimensional, as opposed to the relatively two dimensional chips afforded by conventional lithography techniques.
I'm glad to own some old-school household hardware like my kettle and toaster that don't rely on chips and haven't lasted just 3 years, but over 30 years and counting (got the stuff from my parents, and maybe I'll give it to the next generation as magical relics from another time).
15:04 "in theory you could have a processor in one room, memory in another, storage in another." Maybe, but interconnect latency is still a concern. Light travels 30 cm (1 foot) in 1 nanosecond, which is the duration of one clock cycle when the computer runs at 1 GHz. Latency to communicate with RAM in an adjacent room will limit the usability for many computing applications.
There are datacenters that reuse the heat generated to provide central heating to towns around them. That’s just one example of how much power is wasted on computing - if it’s enough to heat the houses around you and it’s actually even profitable to do that.
Photons move so much faster than electrical impulses - almost 2% faster (Gasp!). The major advantage is additional semantic elements. Using Josephson Junctions allows the use of interferometers as building units. Using tunnel diodes allows q-nary logic. What semantics are available using optical units?
No clue about any of the other stuff you said but wasn't the whole photons idea a bit more about saving energy than it was about the speed? They definitely did say something about it being "so much" faster but also went into detail about energy efficiency, not disagreeing with you though just idk lol
@@zFA113NNINJA The hype was all about speed. For efficiency, look at adiabatic circuits. They change state without consuming energy. In 1981 I designed a stepper motor driver using this concept. I was only able to get 9 times better performance than the motor's rating. Check out adiabatic tunnel diode threshold gate logic. Tunnel diodes are 100,000 times better than CMOS. A tunnel diode threshold gate CPU would run at 2000 GHz at 5 W.
I'm a science nerd and fan. But I hope there are simultaneous efforts to develop safe disposal of the carbon nanotube solution 5:19. The tubes are too small to filter conventionally, and they don't easily degrade. Waste has to be considered, particularly since 33% of it is already known to be an unwanted by-product (the fully conducting metallic nanotubes).
There is already a neuromorphic chip company from Australia called BrainChip with their Akida 2nd gen chip out. They have partnered with ARM, Intel, Prophesee, Megachips to name a few
@3:32 there should have been a huge asterisk at the figure of 5nm (nanometer). The transistors aren't actually 5nm in size and that '5nm' technology is just advertising for the manufacturing company's next gen transistors. Now due to how transistors can be manufactured and packaged differently, there's no agreed industry standardized size. Some fabs (places where semiconductors are being built), usually give a density figure of x amount of transistors per mm² of their die, but even that is difficult to verify independently.
As a microelectronics engineering grad student, I'm very well aware of the major challenges that power optimization can pose. There have been many attempts to "cheat" the physical boundaries of materials; some have been successful, some have led to entirely different technologies.
Hey, bloomberg, could you put links to the things you discuss in the video description? I'd expect your viewers to be pretty likely to want to look further into things and read stuff.
Reducing environmental impact by choice is said to be used as a performance measure advancing chip design. We could be worse off if no development released them: most designs end up on the shelf without ever being released, not even as partly functional small designs. All the cost goes to the larger-scale processing of data, while even modest data processing works well enough to uncover much of a design. However, I applaud how they go about it.
We obviously need more efficient chips, but at least they run on electricity. Everything that runs on electricity gets more eco-friendly over time, and the cheapest electricity right now is from renewables. In Europe, some data centers provide district heating and more want to switch to it. Yes, heat pumps are way more efficient for heat (300-400%); computer chips are "only" ~100% efficient at producing heat. But that heat would otherwise be discarded, which costs more energy, so by using it we reduce the electricity required.
Maybe dielectrophoresis in combination with flow fields in solution is a way of tuning and improving the alignment over pre-treated (for instance by lithography) Si wafers with inhomogeneous surface energy. It worked out pretty well for GaAs nanowires in a study we conducted at the university, aligning them in parallel at contacts.
@@foxbat888 In principle I think you could manage to make it work that way; however, I have to admit that I am no expert in transistor technology. My knowledge comes more from surface science/electrochemistry/interface science, especially solid-liquid interfaces.
Even though these chips would be more expensive to produce, the power savings in use make a huge dent in the lifetime costs of those chips. This is what big data centers live on. So if there is a 1000-fold reduction in power usage, *theoretically* they could cost 1000 times more to produce and still be competitive.
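As a toy illustration of that trade-off (every number below is an assumption: the electricity price, datacenter overhead, service life and the 700 W baseline part are made up, and only electricity is counted, so the exact break-even multiplier will differ in practice):

```python
# Toy break-even: how much more could a chip cost if it cuts power draw by
# some factor, counting only electricity savings? Every number is assumed.
KWH_PRICE = 0.12        # $/kWh, assumed electricity price
PUE = 1.5               # assumed datacenter overhead (cooling, distribution)
HOURS = 5 * 365 * 24    # assumed 5-year, always-on service life

def lifetime_energy_cost(watts):
    """Electricity cost of running one chip for its whole service life."""
    return watts / 1000 * PUE * HOURS * KWH_PRICE

baseline_w, power_reduction = 700, 100     # assumed accelerator, 100x power cut
savings = (lifetime_energy_cost(baseline_w)
           - lifetime_energy_cost(baseline_w / power_reduction))
print(f"Electricity saved per chip over 5 years: ${savings:,.0f}")
# That saving is roughly the price premium the efficient chip could carry
# and still break even on total cost.
```

Whether "1000 times more" holds depends on how much energy dominates the total cost of ownership; the headroom is whatever the electricity (and cooling) savings add up to.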
I think we will always pump in more power even if we have more efficient chips, because there is no limit to what we want to do with them. So cool, more efficient chips, that's great, BUT we will still increase our energy consumption.
16:27 "...that communicate with each other through electrical pulses known as spikes" this is incorrect, they are known as action potentials. You can have a brief depolarisation event, or spike in charge, that does not reach the threshold of an actual action potential and thus does not result in communication.
I'd like a study on what percentage of this power consumption falls on user profiling and the processing needed for it. I wouldn't be surprised if it's around or above half of it...
I'm in awe of the technology and ideas which have been developed to enable material manipulation at molecular and atomic scales. Just amazing. My choice application of new AI technology: recognizing and editing ads out of YouTube videos.
I think analog systems can perform image and motion processing, including calculations such as addition, subtraction, multiplication, division, logarithms, and sine and cosine via Fourier-series decomposition into different functions, faster than any digital system, however advanced - including matrix calculations and gradients, and providing realistic solutions to systems of multiple equations.
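One way to make that concrete: the workhorse operation behind most of those tasks is the multiply-accumulate (a dot product), which an analog crossbar array can perform in a single physical step rather than one multiplication at a time. A digital sketch of the same math, just to show what the analog array would be computing (the test signal is an arbitrary example):

```python
# A Fourier coefficient is just a dot product of the signal with a sine and a
# cosine template - exactly the multiply-accumulate an analog array does in
# one shot. Computed digitally here purely for illustration.
import math

def fourier_coefficient(signal, k):
    n = len(signal)
    cos_part = sum(x * math.cos(2 * math.pi * k * i / n) for i, x in enumerate(signal))
    sin_part = sum(x * math.sin(2 * math.pi * k * i / n) for i, x in enumerate(signal))
    return cos_part, sin_part

signal = [math.sin(2 * math.pi * 3 * i / 64) for i in range(64)]  # a 3-cycle sine
print(fourier_coefficient(signal, 3))   # large sine component at k = 3
```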
You could use q-nary logic like in flash memories. You could use adiabatic circuits. You could use self-time circuits. You don't have to stay in the same rut, doing the same thing over and over.
It is interesting whether you could combine some of these methods, as with neuromorphic computing or neural networks. You could use the efficiency of, say, photonics, but the problem with all these chips is running something like an operating system on top of them; photonics wouldn't struggle with quantum-mechanics problems as much, and photons are smaller than electrons, but the problem is that you have to convert electricity to light, for example using OLEDs. Neuromorphic computing could solve the carbon nanotube problem, but you can't get smaller than the carbon atom, which is essentially quite big with its 6 protons and 6 neutrons. To separate the metallic tubes you would just need a magnet. Arranging them on a chip would be the most fascinating part, as you could use a neural network to align them in the best configuration. Light, to me, is the best option with a neuromorphic setup, but light has different wavelengths across the electromagnetic spectrum, and even then it is still going to generate heat, as would light transistors. If you could go down to the width of a photon in electronics or photonics, how could you tell whether a light transistor is on or off? If light had mass, that would make a difference. You could control light if it was moving at a specific frequency, say red-shifted. These chips are a long way from being used to run games on a computer. Quantum computing wasn't even brought up.
Thanks so much for such a beautifully illustrated video about the modern economy and its current limits, and about those with the courage and wisdom to go beyond them - and hopefully also to start reversing the energy footprint that is driving the climate-change disaster.
Putting a computer into a toaster is the dumbest use of computing power one can imagine. Are you so lonely that you need emails from your refrigerator? Security = control.
"5nm" in chips is used as a marketing technique and it does not mean that the dimensions of the chip are near 5nm. Contacted gate pitch is 51 nm and metal pitch is 30 nm.
2:17 "semiconductors are made up of transistors". Yeah and concrete is made up of skyscrapers too :) Transistors are made up of semiconductor materials not the other way around lol.
If the production process always creates metallic nanotubes as a by-product, could those be aligned magnetically before removing them from the semiconducting ones?
So... how is it that a computer materials scientist expresses himself by saying "100 times smaller..."? It's become popular to use "times smaller" in the media, but it's very inaccurate.
Light traveling through optic chips is the future. Combining the calculations of a traditional binary computer with an integrated quantum computer makes sense.
People forget to consider how a smartphone saves energy by replacing old solutions like physical maps, books and so on, which also take energy to produce and pollute the planet.
In 1968 I worked with tunnel diodes at 0.5 GHz. TDs are 100,000 times more efficient than CMOS. Current TD speeds are 2000 GHz at 5 W. Yet no one is developing a resonant tunnel diode threshold logic front end. It would be smaller, cheaper, and more efficient than CMOS. Yet executives are ignoring it. Never underestimate the power of human stupidity.
Also, moving away from rare earth metals and minerals to more common metals and minerals would be a game-changer if somebody can make said technology powerful and energy efficient. It would allow third-world countries more ability to compete and thrive as a result. We just need to deal with antitrust and insider trading issues that often hamper the ability for people to compete with larger corporations that could potentially 'buy them out'.
There's a fundamental difference between Intel's Loihi and what IIT Bombay is doing. IIT Bombay uses memristors/analog circuits. Loihi uses a neuromorphic architecture but digital circuits.
0:00 The importance of developing low-energy computer chips
4:12 Carbon Nanotube Transistors
11:07 Photonics Chips
15:26 Neuromorphic Computing
24:47 Conclusion
and yet Michael P. Frank can't get funding for adiabatic chipsets that have reversible computing....
thnx mate
Thank you!
Thank you brother
thx
I'm a computer scientist specialized in software development but this made me appreciate the hardware side of things. It's really inspiring
Pr!ck
what do you develop? just wondering. AI is taking over.. I was the complete opposite.. I would take my dad's old computers in the early 90's (from IBM at the time), think they only ran Windows/DOS.. back when 128 MB and floppy disks were awesome haha.. and I would take 'em apart and learned how to solder and what did what and how etc.. now I can build and mod pretty much anything for optimal results: mining, model training, servers, etc.. just recently started learning coding languages.. wish I'd started sooner tbh
@@dtrueg i develop web applications and desktop applications but I want to explore AI development and cloud computing.
get the best GPU, preferably the NVIDIA 40 series; you may also need to up the cooling with a load of extra fans if you get into training models.. I mean you basically need server hardware for any real results, but it can be done slowly.. making agents or LLMs or assistants is fairly simple, you just need the right programs, such as Visual Studio/textwebui/LM Studio/AutoGen etc., then you can pick a model from Hugging Face and go from there.. @@adrielamadi8585
From my perspective this was one of the most interesting quicktakes I've seen. Well assembled and presented.
I'd heard of these technologies to varying degrees, but this piece on the current progress of all of them was informative and fascinating. Thank you!
Yeah, also, personally I find it quite odd that I never thought about the carbon footprint of our reliance on computers/tech in general.
Fascinating, indeed. Not only do we need to consume energy more efficiently, but also, we need to devise novel ways to create more energy on the planet. Maybe, one day, for example, we will have massive solar energy collectors in space, which then transmit that energy to the planetary surface.
Epyc.
No one asked
@@Ottee2 ...without chargrilling intervening birds.
This just renewed my interest in engineering, truly an inspiring documentary.
I'm sure tomorrow you'd see a kitty rescue and go "this renewed my faith in humanity", right? You renew your interest in something by doing something.
@@atlantic_love nothing renews my faith in humanity, I know it's doomed at this point
@@nayeb2222 Then be a prime example, and do not have kids, then dissolve your body and donate all your organs to charity. Thanks!
@@HueghMungus I do have faith in religion, so it won't be an option for me
@@nayeb2222 you have faith in fiction
I heat my house with electricity for nearly 6 months per year. Therefore, during that time, 100% of the electricity used by electronics in the house ends up as useful heat, so it effectively consumes zero extra energy.
Data centres should be located where there is a need for heating water or other materials. Thus all of the heat can be dumped in the early part of the manufacturing process.
You're correct about the second part. But unless you're using a space heater instead of an HVAC, it's not zero. Heat pumps which move thermal energy are actually more efficient than pure electric to heat converters.
@@Beyondarmonia Regardless of how I heat, the heat from all things dumps into the house - so no extra charge ($). As to heat pumps: True enough, but it's cold here (-24°C presently, -30°C tomorrow - will be mild next week) and electricity is very, very cheap - whereas heat pumps are expensive to buy and install - and fail expensively. That said, Hydro Quebec will subsidize about $1000 if I throw a heat pump at it. Maybe some day.
@@AlanTheBeast100 Makes sense.
Still a lot better than delivering physical goods (eg Blockbuster vs streaming). I'm sure the former uses more than 100x as much energy.
Yes, delivering physical goods uses 100x more energy, but the convenience of digital means that we use it 1000x more. Some people just leave Netflix/YouTube running in the background the whole day.
@@niveshproag3761 I highly doubt that figure.
I highly doubt the 100x too. I just mean our consumption outpaces the increases in efficiency. Proven by the fact that our electricity consumption increases every decade.
@@niveshproag3761 Certainly not proven as you're not isolating variables.
Exactly. We should focus on the way digital equipment makes our lives more productive and efficient, rather than on how it consumes 'much' energy (it doesn't). The ratio of energy consumption to value added is tremendously small compared to other sectors like transportation.
The greatest enemy of a wonderful technological breakthrough is the advanced technology that works well enough.
Epyc.
yup, consistency is the death of development
Nonsense. We're using all of our existing technology and pouring hundreds of billions of dollars per year into researching new methods.
Well ye, but more than that because the tech that works well enough is the one already receiving most of the investment.
If silicon had become untenable 10 years ago we would have been forced to switch faster to something radically different.
The same thing is true with NAND flash memory - it's truly terrible for power consumption, and the wear rate of each memory cell is anything but great for long term storage.
But because the investment was so deep even the most promising advanced replacement tech has constantly been left to rot even as flash becomes an ever greater power drain on our mobile devices.
@@khatharrmalkavian3306 All the nope.
Hundreds of $billions is going into silicon semiconductor logic and all the other standard computer and information tech of the moment.
Only a tiny fraction of that amount is going into alternative paths.
Quantum dots were predicted to replace CMOS image sensors years ago, but nothing is forthcoming simply because the industry investment in CMOS sensors is too high and QD's are not regarded as enough of a benefit to pull money away from CMOS sensor improvement research.
You can create a revolutionary memory tech for not a huge amount of money in a lab like Rice University's - but making a chip with it that scales to a bit density competitive with a modern 3D NAND chip costs a frickin' shipping port full of money in staff and engineering time, compounded by the industry's lesser experience and knowledge of the newer technologies.
There are some things that are being pursued more vigorously, such as metalenses - they can be produced faster, more cheaply than conventional lenses and offer dramatically increased compactness + utility as a single achromatic metalens element can replace many in an optical system by focusing all wavelengths onto a sensor in one super thin piece rather than needing one element per wavelength and others for extra adjustment.
So they are basically winners across the board relative to the technology they aim to replace. 20 years from now we will wonder why camera lenses ever used to be so heavy.
Lithography "nm" these days doesn't really means exact no. Ie 5nm doesn't actually mean 5nm manufacturing process
Yeah TSMC state it's more of a marketing term than anything.
@@djayjp same goes for Samsung & Intel
@Cobo Ltger Isn't it refering to the size of transistor gates?
It did at one point. But now it’s just used to say it’s 2x better than this old process etc.
Software development is also important and is rarely considered in these types of scenarios pertaining to compute efficiency and carbon output. Today's developers are writing bloated, inefficient code using high-level languages that add even more overhead. This comes out as wasted CPU/GPU/DPU cycles and thus wasted energy. To some degree the increase in hardware power has caused this, as developers previously had to be much more diligent about writing lean code.
cmon, are you really suggesting high-level languages are bad and inefficient? I believe high-level languages are really inevitable
What we should look at more is high-efficiency conversion from high-level languages - even plain-English instructions - down to machine language using machine learning, exploiting the analog energy-efficiency advantage we have. You cannot stop the inevitable, but we can get more efficient code, and there isn't only one way to do it.
@@zhinkunakur4751 I am not suggesting that entirely. High-level languages are awesome, and things like Python have enabled countless cool and invaluable solutions and gotten a lot of people into coding. Part of the benefit of this more powerful hardware is the amount of abstraction that can be done while still getting the job done nicely for the end user. My point was only to highlight my worry that as these things advance, the lower-level stuff will start to become lost, and how efficient low-level languages can be will become more and more underappreciated due to lack of understanding, or because people think it's voodoo not worth getting into. There are still a lot of scenarios where efficient code matters, and the closer to the hardware you are the better. It is important we do not lose sight of that or let that knowledge become stale.
@@hgbugalou I see, agreed. I too am a little worried about the increasing unpopularity of LLLs - or maybe their share only seems to be going down because more and more people are getting into coding and the vast majority of them will be using HLLs; it's not that the growth rate of LLLs is falling, maybe it's just that HLLs have a higher growth rate.
Hopefully generative AI's can do something about that.
It's something I have often thought about when observing the painfully slow development process of new video codecs from ISA portable C code to fast, efficient ISA specific SIMD assembly.
The atom is not the limit to size reduction. Subatomic particles can perform the same functions, better, cheaper and faster.
What a time to be alive.
This is obviously exciting work but the skepticism in me assumes that once vendors see that they can do more with less, they'll just do more and so the cycle continues. What is also brutally hard about this work to take full effect is the next layer above this needs to also change to take advantage of these performance/energy gains (the firmware, instruction sets, the algorithms, programming languages and so on). Over the long term though, I'm optimistic of this shift
Rust and other modern languages will become more significant. Hardware implementation of typing, inheritance, bounds, soft cells, networks, and flexible control assignment, will all be implemented.
I'm using an APU not a CPU. It's 35 watts and is extremely capable. With computational progress comes reduced power. It's just they are mutually exclusive and at some point people have to choose lower power over performance gains. As energy prices rise this choice is reconsidered.
@@jackdoesengineering2309
Underclocking tricks allow the computational rate to be reduced to the demand rate with lower power consumption. This still allows high computational rates, with high power consumption, whenever they are needed.
To some degree the energy consumption hasn't fallen, especially with desktop PCs, but the greater energy efficiency has made possible all sorts of new form factors which are much more energy efficient: laptops, netbooks, tablets, smart phones and smart watches. Eventually we will get to smart glasses, smart earbuds and smart clothes that are extremely energy efficient and can replace much of the functionality of traditional PCs. If you look at energy consumption in advanced economies, it is actually falling, which is an indication that we are doing more with less energy.
As a computer programmer, I can tell you that energy efficiency is becoming increasingly important in programming. Not only are programmers focusing more on code that can deal with low energy systems running on a battery, but they are focusing more on compiled languages, such as Rust, Swift, Go and Julia, that use less memory and computing cycles than interpreted languages.
@@michaeledwards2251 Hardware implementation of typing, inheritance and bounds hasn't, as of yet, been able to make any of these things faster for code that uses them:
- Inheritance is basically a fancy jump instruction. The main problem with this is that with inheritance, your jump address usually has to be loaded from memory, which can take many cycles, and the CPU has to basically guess the branch target and run a whole bunch of speculative instructions while the address loads for real. Having a special version of "jump to variable address" just for inheritance just doesn't gain much over the regular variable jump.
- Bounds is likewise a fancy conditional branch. Conditional branches that rarely get taken are already quite cheap on modern CPUs - they do take up slots in the instruction decoder and micro-op execution but they don't compete for the really important slots (memory loading/storing). In fact, loading the bound is definitely slower than testing it (since it uses a memory load instruction). The speed gain from adding hardware bounding tests is likely to be rather small.
- Typing is in a similar situation. Usually dynamic typed variables are either just dynamic versions of small fixed-size static types (double, float, bool, int32_t, int64_t) or larger dynamic-sized variable types (strings, objects, maps, etc). The larger dynamic-sized types have to be handled in software (too complex for hardware), so you'd still have to load the type and test for it. The small fixed-size types could conceivably be handled in hardware but you'd probably just be using the largest type all the time.
Can confirm we are running out of chips, the chip-to-air ratio changed from 50-50 to 35-65. Trying times indeed. The bag itself is worth more than the chips inside now.
15:11 The interconnect cables which are devised to mitigate energy consumption challenges in data centers are just simply optical fiber interconnects which are directly plugged to the ASIC. Co-packaged optics technology bridges the gap between electronics and photonics by integrating them on a common platform with photonic chip serving as the first point of contact to the external world.
We've already made amazing strides in the power efficiency of computers. An IBM 360/195, with cache and out-of-order execution, like most modern computers, used much more power. And go back to the days when computers used vacuum tubes instead of transistors, and their power consumption compared to the work they could do was much higher.
Epyc.
That is true, but back when that happened the worldwide use of computers was just a tiny fraction of what it is now.
The increase in use means we need to push the hardware efficiency ever further to keep up.
Wirth's Law. We don't necessarily need better computers; we need software to be more efficient. Nowadays it's normal for computer programs to occasionally crash due to memory leaks or bugs in the code. I work at a datacenter and I have to use this app on my phone to do daily routine inspections, and the app crashes when open for too long... It's crazy how tolerant we've become of unstable software.
Most of the data that moves around a server farm, goes over copper. Even when computers are paralleled.
Light travels through fiber at 65% speed of light, through copper at 60%.
The devices that convert data to light have the same limits as the devices that drive wire.
Light can send more than one signal using color, but that only uses a small slice of the available bandwidth.
Copper wire operates at a lower frequency (maybe 10 GHz vs 50,000 GHz), but uses the entire bandwidth of the wire.
The big advantage fiber has is how far a signal can travel.
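Plugging in the quoted propagation speeds shows how small the in-flight difference is over datacenter distances (the 100 m run length is an assumed example; only propagation delay is modelled, not serialization or switching):

```python
# Propagation delay over 100 m at the fractions of c quoted above:
# 65% for fiber, 60% for copper.
C = 3.0e8  # speed of light in vacuum, m/s

def delay_ns(distance_m, fraction_of_c):
    return distance_m / (C * fraction_of_c) * 1e9

for name, frac in (("fiber", 0.65), ("copper", 0.60)):
    print(f"{name}: {delay_ns(100, frac):.0f} ns over 100 m")
```

The gap works out to only a few tens of nanoseconds per 100 m, which supports the point that reach and bandwidth, not raw signal speed, are where fiber wins.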
Photonic computers, neuromorphic computers, and CPUs that use carbon nanotubes are very interesting, but frankly, if we wanted to dramatically reduce computer power consumption we could already do it today.
We could:
- use the programming languages C or Rust instead of popular languages like Python (which is something like 45 times less efficient - see the rough sketch after this list)
- use RISC-based CPUs such as ARM chips or RISC-V chips
- underclock CPUs so that they maximise power efficiency rather than trying to maximise performance
- use operating system drivers that aim to use minimal power
If we did these things we could probably use < 1% of the power we currently use. We don't do these things largely because it would be slightly more inconvenient and would require social change rather than innovations in technology.
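A rough way to see the interpreter overhead behind the first point, without leaving Python: time the same summation as interpreted bytecode versus the C-implemented built-in. The ratio is machine-dependent and only illustrates the existence of the overhead, not the exact 45x figure:

```python
# Rough illustration of interpreter overhead: a pure-Python loop vs the
# C-implemented built-in doing the same summation.
import timeit

N = 1_000_000

def python_loop():
    total = 0
    for i in range(N):
        total += i
    return total

t_loop = timeit.timeit(python_loop, number=10)
t_builtin = timeit.timeit(lambda: sum(range(N)), number=10)
print(f"pure-Python loop: {t_loop:.3f}s, built-in sum: {t_builtin:.3f}s, "
      f"ratio ~{t_loop / t_builtin:.1f}x")
```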
There is benefit to higher-level languages too. C is only for low-level / high-performance stuff and is absolutely irreplaceable there;
for other stuff you need higher-level languages like Rust, Python or Java to actually get projects done in time and not mess with optimization too much.
There is always a use for both.
About underclocking: the PC has a thing called a governor that decides clock speed per workload, so that's already done when you don't need max performance.
Plus we have E-cores.
This is such a normie take that it is hilarious.
@@rajuaditya1914 we're all interested in learning, and yeah, we normies may not understand key concepts
@@rajuaditya1914 Sounds like a reasonable argument to me. Maybe you have some insights to share that'd change my mind and his?
Bruhhhh ur point number 1 doesn't make any sense. U r talking in the software domain. Plus there's a reason why everyone uses Python: it's faster to build and test.
I really like the jumps between the founders and the skeptic guy
Nengo is software that currently works with Loihi, with the intention of enabling software applications for the neuromorphic chips. The research they do at Waterloo in general is quite interesting.
People always talk about how more efficiency will lead to less energy consumption. But if I know anything about humans, it's that they will always push to the limits (and power/thermals are the limiting factor right now), so I feel more efficient chips are just going to lead to even more computers and increased performance instead of decreased power draw.
Energy consumption would increase either way
Fascinating, as Spock would say. It's an interesting decade or two ahead as we grapple with these different technologies, in the hope that one of them will become commercially mainstream - and breathe new life into the industry for another 50 years or so until something newer and more radical appears.
Ica is also fascinating, especially its neuromorphic learning paradigm, and will definitely accelerate the rate at which robots can learn their surroundings and interact, as well as learn from their past and build on their intelligence.
The future is definitely bright.
Photonics are the future. I've been blown away with the ways they are devising to build logic gates that function by altering photons on a quantum level. Light based computers have been a mainstay in Science Fiction for a long time now and it's amazing to see actual real-world advances with practical applications being made.
Well yeah, but maybe - it depends. We can always make our current electronic ones better beyond just trying to add more transistors. I mean, yeah, we'll need better materials; something like graphene could make computers hundreds of thousands of times faster.
@@koiyujo1543 Yeah, I agree, there are still advancements to be made in electronics. I imagine hybrid photonic/electronic systems will become a thing before we get any fully photonic chips, though from what I understand the benefits of photonics for latency and efficiency go far beyond what is possible with electronics.
I have my doubts about carbon after High-NA EUV lithography reaches its limit with silicon wafers - that's like two decades away. I think limits will be found on how far you can go in complexity, layers and materials.
Offhand I can't think of any examples of light-based computers being a mainstay of science fiction. Can you cite any?
@@billfarley9015 I can't offhand either. I thought about Data from Star Trek: TNG, but he's positronic. I also thought about Voyager's computer, but iirc that's organic. There is Orac, the Liberator's supercomputer from Blake's 7; I always assumed that was photonic, but I may be wrong. I'm sure if I looked hard enough I'd find something soon enough - sci-fi writers have a far greater imagination and scientific knowledge than I do. :)
This new chip sounds like a pathfinding co-processor to my game developer ears. Navigating in real-time an order of magnitude more agents in a dynamic world would revolutionize game development. Everybody's stuck on pathfinding. We're still using algorithms from the 1960s.
Superb presentation. Both “pop culture” exposure, and real technical info for experts
Such awesome content you produce. It has something to teach nearly anybody, at any level of knowledge of the problem.
Could we solve a lot of the energy problem by writing more efficient code? It seems that as processing power has increased developers are less concerned with memory constraints. There is also a lot of pressure to push new features at the expense of optimised code (and of course more and more abstraction layers in coding).
It's like Parkinson's law, but with computer memory.
I could see AI generating new ways to solve computational problems that reduce the need to compute them.
For example DLSS or AI upscaling.
Nope, because in order to make money, code needs to be shipped fast. There are better options though: what we can do is encode more in less, so instead of binary computers we use ternary, or even quaternary, computation, and that could increase the amount of possible calculations. The reason developers are less concerned with memory constraints is that it's expensive to write efficient code: it takes longer, you need to understand more math and more about how computers work, and it's more prone to bugs and errors. What you need is something simple enough to write that still provides enough control for the task at hand - and most people don't even know what that is until the product is shipped; optimizations happen after the product is built. A real solution would be using a whole bunch of analog computers to do specific calculations and then translate the results into binary. This is, in principle, how and why ASIC mining exists: instead of abusing sand and making it think, we simply let the silicon read the electrical charge outputs from several analog computer counterparts and process those inputs, which needs less power since the silicon only has to read from those counterparts and maybe do a few calculations here and there.
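The "encode more in less" point can be quantified: a base-b digit carries log2(b) bits of information, which is where the density gain of ternary or quaternary logic comes from (and why it is a constant factor rather than a new scaling law). A quick sketch:

```python
# Information per digit in different radices, and how many digits are needed
# to hold a 64-bit value in each base.
import math

for base in (2, 3, 4):
    bits_per_digit = math.log2(base)
    digits_for_64_bits = math.ceil(64 / bits_per_digit)
    print(f"base {base}: {bits_per_digit:.2f} bits/digit, "
          f"{digits_for_64_bits} digits for a 64-bit value")
```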
The quickest thing you could do is learn how to use Linux and a terminal, and you would already be using less power than a majority of people. Use a window manager like i3wm, and use more terminal applications, including on your phone with Termux. It's not as convenient, but you can do quite a bit with a terminal; so much that I'm convinced real work is done in a terminal.
@@Meleeman011 Using C++ server-side can reduce energy usage.
@@Meleeman011 Mate, I don't want to learn any more than necessary when it comes to computers. They are supposedly made to adapt to us, not us to binary code. Yes, I have done Unix and Linux before, but I gave up; I don't wish to reinvent the wheel, I will let other nerds like you do that. Yes you can, yes I can do this and that, but I choose not to. I can think of better things to do with my life.
Progress is not tied to computers getting better; it is contingent on excising technology from our lives. Apparently these folks failed to learn the Jevons paradox: "an increase in efficiency in resource use will generate an increase in resource consumption rather than a decrease". Nor do they consider the exponential power of compound growth to exceed any linear reduction or transition. Even the Greeks knew Sisyphus would never get the boulder to the top of the hill, yet techno-utopians gleefully assume a smaller transistor is going to solve all problems.
It's got to be tough for these new technologies to compete with silicon, which has had 50 years of uninterrupted exponential growth. Even if a new technology could be better than silicon, it might never get there because it can't immediately be small enough or fast enough or cheap enough to compete with the cutting edge.
New challenges will create new opportunities. Maybe not with commercial applications, but these technological breakthroughs will start their own journey with defence and space applications!
They will eventually get cheap enough, the beauty of the capitalist system. 50 years ago owning a computer was impossible. Today the average American has two computers.
24:36
Where's the mouse?
* Points at camera *
What a burn
Yeah, money wasted type moment 😂
As an Indian, I have to compliment the US for pushing this bleeding-edge R&D. I work in the neuromorphic computing area in India, and I'm sure India will start competing with the USA soon.
I don't work in that specific field but I'm certain that India will have much to contribute in that technology!
India has a great future. However... the day the world starts to cooperate more instead of compete will be the best day for humanity. I am sure that won't happen any time soon, though; people are just too self-centered and primal.
China's already much further ahead in all types of chips, whether carbon, photonic or RISC
@@J_X999 u mean compared to USA?
@@bhuvaneshs.k638 Compared to India. India has potential but it just isn't as big as many people think
Even though there are hundreds of companies racing in this field, all of them are pushing the world forward, even if they don't win the pie in the end.
Just want to thank the team at Bloomberg Quicktake for making this really high-quality content for us 🙏🏻♥️
It’s genuinely great
Epyc.
Yep. Too bad they're associated with the totalitarian name "Bloomberg." I've met Bloomberg employees before who were rightfully ashamed to be associated with the name...
they said something without saying anything
You are welcome Mr.Singhania.
I am heavily impressed and amazed at the same time by the kind of presentation Bloomberg has put together here.. PURELY scientific...
At least in Minnesota, during winter heating season, I figure waste heat from my PC all goes into heating my home. Of course if I had better home insulation, I'd probably save more in heating that way!
The Cerebras 80-exa-scale wafer processor has 850,000 cores, and each core is itself a supercomputer. All ultra-interconnected without bus-speed limits, it outperforms every supercomputer ever built, all on one chip. They believe they have found the pathway to the singularity. I gather the only supercomputer that's faster doesn't exist. At 44,000 watts, perhaps it could do two jobs at once, heating water in the room while predicting the future; it's that fast. You know, like when you make a movie and it takes forever for the processor to get done with it. Picture simulating a nuclear explosion, or fluid dynamics: current supercomputers draw the event much more slowly than it happens in actuality. This chip can do more work faster and predict the event accurately, in great detail, faster than it can occur. Made by TSMC at the 5-nanometre scale. Strangely, Moore's law will continue; IBM has already produced chips at two nanometres, so surely there's lots of room for improvement yet to come for the Cerebras wafer supercomputer.
So maybe the AI bottleneck (i.e. level 5 autonomy, 'general AI', etc.) is due to the binary nature of the base-layer architecture - from this it sounds like the analogue stochasticity of neuromorphic architectures may be required for AI to meaningfully progress...
Imagine how much we could save if millions of people weren't living online every waking minute seeking validation and simply put their phones down. It would save not only energy, but humanity itself.
Remember, it is these validation seeking individuals which are pushing scientists and engineers to innovate and come with fundamentally new solutions, ensuring that we progress as a species. If we didn't need to upgrade, no one would care to innovate, and we would still be happy with stone-age technology.
Definitely. This is a pretty important documentary to inspire engineers all over the world.
Neuro computer looked fire 🔥
I was surprised that the Mythic chip was not included in this video to represent analog. The neuromorphic computing part being developed in Mumbai - in this video - has already been created by those guys years ago, and it currently has computing power equivalent to current digital standards while using only 3 watts of energy.
Mythic chip?
reducing energy consumption is like adding more lanes to a highway, it won't reduce traffic, it will just add more cars
The best alignment process I know of is using magnetic fields. Is there a way to make these nano tubes or the environment in which they are stored temporarily magnetic?
This is what YouTube should really be recommending.
Neuromorphic computers are the next key technology.
What would be interesting is if chips can be more three dimensional, as opposed to the relatively two dimensional chips afforded by conventional lithography techniques.
I'm glad to own some old-school household hardware, like my kettle and toaster, that doesn't rely on chips and hasn't lasted just 3 years but over 30 years and counting (got the stuff from my parents, and maybe I'll give it to the next generation as magical relics of another time).
I think a few other options have not been discussed, like spintronics (with MRAM already on the market) and maybe (flexible) organic electronics…
Very informative, thank you from Cameroon ❤️
Should've said "Better Chips AND software"
15:04 "in theory you could have a processor in one room, memory in another, storage in another."
Maybe, but interconnect latency is still a concern.
Light travels 30 cm (1 foot) in 1 nanosecond, which is the duration of one clock cycle when the computer runs at 1 GHz. Latency to communicate with RAM in an adjacent room will limit the usability for many computing applications.
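Rough numbers for that, assuming a 3 GHz clock and signals travelling at the vacuum speed of light (real fibre or copper links are slower, so these figures are optimistic):

```python
# Rough order-of-magnitude check on the "memory in another room" idea.
C = 3.0e8                 # speed of light in vacuum, m/s (real links are slower)
CLOCK_HZ = 3.0e9          # assume a 3 GHz core -> one cycle is ~0.33 ns

def round_trip_cycles(distance_m, clock_hz=CLOCK_HZ, signal_speed=C):
    """Clock cycles lost to pure propagation delay for a request plus reply."""
    seconds = 2 * distance_m / signal_speed
    return seconds * clock_hz

for d in (0.05, 10, 100):  # on-package, next room, across a datacentre hall
    print(f"{d:6.2f} m  ->  ~{round_trip_cycles(d):6.1f} cycles of propagation alone")
# 0.05 m costs about one cycle, 10 m about 200 cycles, 100 m about 2000 cycles,
# before any switching, serialisation or queueing delay is added.
```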
There are datacenters that reuse the heat generated to provide central heating to towns around them. That’s just one example of how much power is wasted on computing - if it’s enough to heat the houses around you and it’s actually even profitable to do that.
Photons move so much faster than electrical impulses - almost 2% faster (Gasp!). The major advantage is additional semantic elements. Using Josephson Junctions allows the use of interferometers as building units. Using tunnel diodes allows q-nary logic. What semantics are available using optical units?
No clue about any of the other stuff you said, but wasn't the whole photonics idea a bit more about saving energy than speed? They definitely did say something about it being "so much" faster, but they also went into detail about energy efficiency. Not disagreeing with you though, just idk lol.
@@zFA113NNINJA The hype was all about speed. For efficiency, look at adiabatic circuits. They change state without consuming energy. In 1981 I designed a stepper motor driver using this concept. I was only able to get 9 times better performance than the motor's rating. Check out adiabatic tunnel diode threshold gate logic. Tunnel diodes are 100,000 times better than CMOS. A tunnel diode threshold gate CPU would run at 2000 GHz at 5 W.
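For readers who haven't met "threshold gate logic": at the logic level it just means a gate fires when a weighted sum of its inputs reaches a threshold; the tunnel-diode device physics mentioned above is a separate question. A toy sketch of the logic level only, with made-up weights and thresholds:

```python
def threshold_gate(inputs, weights, threshold):
    """Fires (returns 1) when the weighted input sum reaches the threshold."""
    return int(sum(w * x for w, x in zip(weights, inputs)) >= threshold)

# Common gates expressed as special cases of the one primitive:
AND  = lambda a, b: threshold_gate((a, b), (1, 1), 2)
OR   = lambda a, b: threshold_gate((a, b), (1, 1), 1)
MAJ3 = lambda a, b, c: threshold_gate((a, b, c), (1, 1, 1), 2)   # carry-out of a full adder

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "AND:", AND(a, b), "OR:", OR(a, b), "carry(a,b,1):", MAJ3(a, b, 1))
```

The appeal is that one primitive covers several Boolean functions at once, which is part of why threshold-logic front ends keep being proposed for exotic devices.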
I'm a science nerd and fan. But I hope there are simultaneous efforts to develop safe disposal of the carbon nanotube solution 5:19. The tubes are too small to filter conventionally, and they don't easily degrade. Waste has to be considered, particularly since 33% of it is already known to be unwanted by-products (the always-conducting, metallic nanotubes).
There is already a neuromorphic chip company from Australia called BrainChip with their Akida 2nd gen chip out. They have partnered with ARM, Intel, Prophesee, Megachips to name a few
Some corrections: 1) 5 nm process isn't actually 5 nm, it's a marketing term, so the graphic is inaccurate, 2) modern chips are already layered.
@3:32 there should have been a huge asterisk at the figure of 5nm (nanometer). The transistors aren't actually 5nm in size and that '5nm' technology is just advertising for the manufacturing company's next gen transistors. Now due to how transistors can be manufactured and packaged differently, there's no agreed industry standardized size. Some fabs (places where semiconductors are being built), usually give a density figure of x amount of transistors per mm² of their die, but even that is difficult to verify independently.
As a microelectronics engineering grad student, I'm very well aware of the major challenges that power optimization can pose. There have been many attempts to "cheat" the physical boundaries of materials; some have been successful, some have led to entirely different technologies.
Really? Like what, for example?
Any technology that can build a >1 GHz 32-bit exact adder...
Literally just passed my computer architecture final and then this was recommended to me haha. Great video!
Hey, Bloomberg, could you put links to the things you discuss in the video description? I'd expect your viewers to be pretty likely to want to look further into things and read stuff.
Reducing environmental impact by choice is said to be used as a performance measure for advancing chip design. We could be worse off with no development releasing them; most designs end up on the shelf without ever being released, not even as small, partly functional designs. All the cost goes into the larger-scale processing of data, while even the smallest amount of data processing is enough to uncover much of a design. However, I applaud how they go about it.
We obviously need more efficient chips, but at least they run on electricity, and everything that runs on electricity gets more eco-friendly over time; the cheapest electricity right now is from renewables. In Europe, some data centers provide district heating and more want to switch to it. Yes, heat pumps are way more efficient for heating (300-400%), while computer chips are "only" ~100% efficient at producing heat. But that heat would otherwise be discarded, which costs extra energy, so by using it we reduce the electricity required.
Photonic chips would still run on electricity.
Yea obviously
@@djayjp Yes, but they're less prone to using a lot of electricity and they're more efficient.
The more energy efficient your devices are, the fewer solar panels, windmills and whatnot you will need to power them.
Maybe dielectrophoresis in combination with flow fields in solution is a way of tuning and improving the alignment over pre-treated (for instance by lithography) Si wafers with inhomogeneous surface energy.
Worked out pretty well for GaAs nanowires in a study we conducted at the university to align them parallel at contacts.
Am I correct in thinking that the aligned nanotubes would form a large scale matrix of potential MOSFET transistors?
@@foxbat888 In principle I think you could manage to make it that way; however, I have to admit that I am no expert in transistor technology.
My knowledge comes more from surface science/electrochemistry/interface science, especially solid-liquid interfaces.
My dad works as an electrical engineer in the semi-conductor industry. Pretty crazy stuff.
Wait... Recognize odors? Does that mean chips will be used to snag drug mules? That's awesome!
Even though these chips would be more expensive to produce, the power savings in use make a huge dent in the lifetime costs of those chips. This is what big data centers live on. So if there is a 1,000-times reduction in power usage, *theoretically* they could cost 1,000 times more to produce and still be competitive.
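One way to sanity-check that kind of claim is to compute the break-even premium, i.e. how much extra purchase price the electricity savings alone would pay for. Every number below is an assumption I picked for illustration, not a real data-centre figure.

```python
# Break-even premium: the extra purchase price that lifetime energy savings cover.
# Every number below is an assumed, illustrative figure.
HOURS = 24 * 365 * 5                 # 5 years of continuous operation
USD_PER_KWH = 0.10
PUE = 1.5                            # datacentre overhead (cooling, distribution)

def lifetime_energy_cost(watts):
    return watts / 1000 * HOURS * PUE * USD_PER_KWH

old_chip_w, new_chip_w = 700, 0.7    # a 1000x reduction in power draw
savings = lifetime_energy_cost(old_chip_w) - lifetime_energy_cost(new_chip_w)
print(f"electricity saved per device over 5 years: ${savings:,.0f}")
# Roughly $4,600 with these assumptions: a big premium is justified, but whether
# it stretches to "1000x the production cost" depends entirely on what the
# baseline chip costs in the first place.
```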
beautifully put together.
0:00 Misleading image: nuclear power plants don't produce carbon emissions; in fact, nuclear is one of the best alternatives for that specific problem.
I think we will always pump more power even if we have more efficient chips, because there is no limit to what we want to do with them.
So cool, more efficient chips, it's great, BUT we will still increase our energy consumption.
This was the best explanation I have heard for quantum tunneling. Thanks guys.
Semiconductors are the world's new gold and oil.
There's nothing new about semi-conductors.
Semiconductors came along in the 1930s; they are not new.
But still begging for oil
Technology that creates more advanced technology should be the primary military focus.
Amazed at how long these alternatives to silicon have been in development. Seems like we're stuck with silicon wafers for this generation.
Let's hope not.
The advertisement just made tinnitus an effing symptom.
16:27 "...that communicate with each other through electrical pulses known as spikes" this is incorrect, they are known as action potentials. You can have a brief depolarisation event, or spike in charge, that does not reach the threshold of an actual action potential and thus does not result in communication.
They also talk about "moving electrons through copper wires" so yeah this video is at a highschool level of understanding and quite far from reality.
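The spike-vs-subthreshold distinction in the correction above maps directly onto the leaky integrate-and-fire model that neuromorphic chips are loosely based on. A toy sketch; the threshold, leak and input values are arbitrary choices of mine, not any particular chip's parameters.

```python
def lif_neuron(input_current, threshold=1.0, leak=0.9, v_reset=0.0):
    """Toy leaky integrate-and-fire neuron: emit a spike only when the membrane
    potential reaches threshold; sub-threshold charge just decays away."""
    v, spikes = 0.0, []
    for t, i in enumerate(input_current):
        v = v * leak + i          # integrate the input while leaking a little
        if v >= threshold:
            spikes.append(t)      # the all-or-nothing event ("action potential")
            v = v_reset
    return spikes

weak   = [0.05] * 40              # never reaches threshold -> no communication
strong = [0.30] * 40              # charges up and fires repeatedly
print("weak input spikes at:  ", lif_neuron(weak))
print("strong input spikes at:", lif_neuron(strong))
```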
I'd like to see a study on what percentage of this power consumption goes to user profiling and the processing needed for it.
I wouldn't be surprised if it's around or above half of it...
I'm in awe of the technology and ideas which have been developed to enable material manipulation at molecular and atomic scales. Just amazing. My choice application of new AI technology: to recognize and edit ads from YouTube videos.
I think analog systems can perform image and motion processing, including calculations such as addition, subtraction, multiplication, division, logarithms, and sine and cosine via the decomposition of different functions into Fourier series, faster than any digital system, however advanced; including measurement calculations and gradients, and providing realistic solutions to systems of equations.
You could use q-nary logic like in flash memories. You could use adiabatic circuits. You could use self-timed circuits. You don't have to stay in the same rut, doing the same thing over and over.
It is interesting that you can't simply combine some of these methods, as with neuromorphic computing or neural networks; you could use the efficiency of, say, photonics, but the problem with all these chips is running something like an operating system on top of them. Photonics wouldn't struggle as much with quantum-mechanical problems, since photons are smaller than electrons. The problem is that you have to convert electricity to light, using OLEDs for example.
Neuromorphic computing could solve the carbon nanotube problem, but the catch is that you can't get smaller than the carbon atom, which is actually quite big, with 14 protons and neutrons. To separate the metallic ones you would just need a magnet.
The thing is, arranging them on a chip would be the most fascinating part, as you could use a neural network to find the best configuration.
Light, to me, is the best option with a neuromorphic setup, but making light behave is tricky, as it spans different wavelengths of the electromagnetic spectrum.
But even if you get light working, it is still going to generate heat, even with light transistors. If you could go down to the width of a photon in electronics or photonics, how would you tell whether a light transistor is on or off? If light had mass, that would make a difference. You could control light if it were moving at a specific frequency, say red-shifted.
These chips are a long way from being used to run games on a computer.
Quantum computing wasn't even brought up.
Thanks so much for such a beautifully illustrated video about the modern economy, its current limits, and those with the courage and wisdom to go beyond them, and hopefully, at the same time, to start reversing the energy footprint driving the climate change disaster.
Putting a computer into a toaster is the dumbest use of computing power one can imagine. Are you so lonely that you need emails from your refrigerator? Security = control.
"5nm" in chips is used as a marketing technique and it does not mean that the dimensions of the chip are near 5nm. Contacted gate pitch is 51 nm and metal pitch is 30 nm.
2:17 "semiconductors are made up of transistors". Yeah and concrete is made up of skyscrapers too :)
Transistors are made up of semiconductor materials not the other way around lol.
If the production process always creates metallic nanotubes as a by-product, could those be aligned magnetically before removing them from the semiconducting ones?
Not when they are copper.
Anyone know the name of the track that starts around 09:30 mark?
brilliantly explained to the layman with such recondite acumen.
Amazing content, such cool information. Please keep coming up with these kinds of videos.
It’s amazing that just a hundred years ago we barely had cars on the road. The speed at which technology is developing is something else.
Keeping the processor and memory apart, would it cause too much latency, even though signals travel at the speed of light?
Wouldn't a high frequency vibration like ultrasound while in suspension help to align the nanotubes?
So... how is it that a computer materials scientist expresses himself by saying "100 times smaller..."? It's become popular in the media to use "times smaller", but it's very inaccurate.
Easier to understand for a layman. How would you phrase it?
Light traveling through optic chips is the future.
Combining the calculations of a traditional binary computer with a quantum computer makes sense.
What about aligning the nanotubes with gravity? Would that be possible? Or centrifugally, or electromagnetically and centrifugally combined?
Yea, it's an interesting puzzle
People forget to consider how a smartphone saves energy by replacing old solutions like physical maps, books and so on, which also take energy to produce and pollute the planet.
24:19 "What is this called" *flips off the teacher* 🤣
In 1968 I worked with tunnel diodes at 0.5 GHz. TDs are 100,000 times more efficient than CMOS. Current TD speeds are 2000 GHz at 5 W. Yet no one is developing a resonant tunnel diode threshold logic front end. It would be smaller, cheaper and more efficient than CMOS, yet executives are ignoring it. Never underestimate the power of human stupidity.
The thing is that while building data centers, these tech companies also build solar and wind.
They're gonna start building reactors in the data centers
Which CPU design should be the first implemented in graphene? The venerable 6502? z80? pentium 4? cray 1?
Also, moving away from rare earth metals and minerals to more common metals and minerals would be a game-changer if somebody can make said technology powerful and energy efficient. It would allow third-world countries more ability to compete and thrive as a result. We just need to deal with antitrust and insider trading issues that often hamper the ability for people to compete with larger corporations that could potentially 'buy them out'.
The guy with the beard is a great commentator/middle man for this Quicktake. Hope to see him again.
There's a fundamental difference between Intel's Loihi and what IIT Bombay is doing. IIT Bombay uses memristors/analog circuits; Loihi uses a neuromorphic architecture but with digital circuits.
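For context on why the memristor/analog approach is interesting: a crossbar of programmable conductances computes a whole matrix-vector product in one physical step via Ohm's and Kirchhoff's laws. Here is a purely numerical toy model of that idea; the conductance and voltage values are invented, and this is not how the IIT Bombay or Intel hardware is actually built.

```python
# Toy model of an analog crossbar: output currents = conductance matrix x voltages.
# In hardware this "happens" in one physical step; here we only simulate the math.
def crossbar_mac(conductances, voltages):
    """I_j = sum_i G[i][j] * V[i] (Kirchhoff's current law summing Ohm's-law currents)."""
    rows, cols = len(conductances), len(conductances[0])
    return [sum(conductances[i][j] * voltages[i] for i in range(rows)) for j in range(cols)]

G = [[0.1, 0.5],        # each entry is a programmed memristor conductance (siemens)
     [0.2, 0.3],
     [0.4, 0.1]]
V = [1.0, 0.5, 0.25]    # input voltages applied to the rows
print(crossbar_mac(G, V))   # column currents = one matrix-vector product
```

That is also why analog approaches keep coming up for neural-network workloads, where matrix-vector products dominate the energy budget.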