Totally forgot that I asked a question... You can imagine my surprise when my channel name showed up right at the start. Thanks! Great interview.
Question for Harris: is there any additional hardware required, over the existing single-wavelength design, to make the proposed simultaneous multiple-wavelength processing work? What proportion of the existing product's volume and cost does that hardware account for, and thus roughly how much would the size and cost of the complete system increase, assuming production costs stayed the same? I'm after a proportion like "6 wavelengths costs 4x what 1 does and takes 3x the volume", not necessarily absolute values.
Congratulations on introducing this topic so well on UA-cam. This platform lets us spread all we know about the field, and at MEETOPTICS we are proud to be part of the photonics community and to help engineers and researchers in their search for optical lenses through our site. We celebrate every step forward.
Thank you for making this topic accessible! Great conversation!
Don't be fooled, it's an investment pitch, not an educational video.
Maybe this will be interesting:
Adiabatic Quantum Computation Applied to Deep Learning Networks, by Jeremy Liu et al., with authors from the University of Southern California and Oak Ridge National Laboratory:
"The quantum adiabatic computing approach allows deep learning network topologies to be much more complex"
Currently, finding the best AI model may require a supercomputer.
"The ability to train complex networks is a key advantage for a quantum annealing approach ... High performance computing clusters can use such complex networks as building blocks to compare thousands of models"
Everything on the table is achievable, but we seek efficient methods that also work well in symbiosis with all the other technologies used. Holistic thinking is key, with adaptive modulation as the framework.
Question for Harris: What wavelengths would the computer operate at? Would they actually involve ~440 nm for blue, 530 nm for green, and 650 nm for red?
Good question - but if Lightmatter's chips are meant to work with regular optical cables, transceivers & components, then they'd likely want to operate at commonly used wavelengths, which are around 1000-1600 nm (near-infrared light). However, if they decided to go fully custom and wanted a shorter wavelength, the shortest they could go would be ~190 nm (deep UV). ~190 nm is about the shortest possible wavelength for an optical semiconductor because a) it's roughly the shortest wavelength for which efficient LEDs can be produced (the widest-bandgap III-V semiconductor with reliably working materials & processes is crystalline AlN, whose ~6 eV bandgap corresponds to a ~200 nm photon) & b) it's the shortest wavelength for which practical waveguides exist - OH-free quartz glass (fused silica). The problem is that pretty much any material whatsoever absorbs radiation above that frequency, so transmitting any light at wavelengths below 190 nm is near impossible (except in vacuum).
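As a rough sanity check on those numbers, here is a minimal sketch (plain Python, nothing vendor-specific) that converts wavelength to photon energy via E = hc/λ: 190 nm does come out around 6.5 eV, and the 1000-1600 nm telecom band sits around 0.8-1.2 eV.

```python
# Photon energy E = h*c / wavelength, expressed in electron-volts.
# Quick sanity check for the wavelength figures mentioned above.
H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s
EV = 1.602e-19  # joules per electron-volt

def photon_energy_ev(wavelength_nm: float) -> float:
    """Photon energy in eV for a wavelength given in nanometres."""
    return H * C / (wavelength_nm * 1e-9 * EV)

for wl in (190, 1000, 1310, 1550):
    print(f"{wl:>5} nm -> {photon_energy_ev(wl):.2f} eV")
# 190 nm -> ~6.5 eV; the telecom range (1000-1600 nm) -> ~0.8-1.2 eV
```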
@@FrankHarwald I think he seemed to suggest the computer would run on "RGB" light, and I was trying to politely ask a question that would require him to admit that it probably wouldn't. Just another example of the guy either having no idea what he's talking about, or lying about it.
I can foresee having external multiplexer blocks that can be upgraded by the end user for increased color density.
Question for Harris: how does the photonic computer compare with existing conventional computers in terms of volume, cost, and power consumption per unit of benchmark performance? And how does the absolute benchmark score of your best current photonic product compare with the best conventional computer out there?
4:33 gives me joy
@22:00 That doesn't answer the question, though it does allude to an acknowledgement that there is one. A simple "I couldn't give a nm wavelength separation off the top of my head; it depends on the separation mechanism, but concurrent processing with 16 and 32 wavelengths is in the works, which is the logical progression of that question" would have been enough.
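For a ballpark of what such a wavelength separation might look like, here is a minimal sketch that assumes the channels are spread evenly across the telecom C-band (~1530-1565 nm) - purely illustrative, not Lightmatter's actual channel plan:

```python
# Rough DWDM-style arithmetic: how far apart would N wavelength channels sit
# if spread evenly across the C-band (~1530-1565 nm)? Illustrative only.
C_BAND_START_NM, C_BAND_END_NM = 1530.0, 1565.0

def channel_spacing_nm(num_channels: int) -> float:
    return (C_BAND_END_NM - C_BAND_START_NM) / num_channels

for n in (8, 16, 32):
    print(f"{n:>2} channels -> ~{channel_spacing_nm(n):.2f} nm apart")
# 16 channels -> ~2.2 nm spacing, 32 -> ~1.1 nm; both are coarser than the
# standard 100 GHz (~0.8 nm) DWDM grid that existing mux/demux parts resolve.
```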
I couldn't find the list he mentions at 17:20 - does anybody have a link to it?
Is Lightmatter using photonic in-memory computing, the strategy IBM proposed in 2019 that is claimed to greatly revolutionize compute?
Btw, a wire doesn't just have capacitance, it also has inductance. So in a wire at high frequencies, a capacitor looks like a short and an inductor looks like an open circuit.
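To make that concrete, here is a minimal sketch of the textbook impedance magnitudes |Z_C| = 1/(ωC) and |Z_L| = ωL, using made-up 1 pF / 1 nH parasitic values:

```python
# Impedance magnitude of a capacitor and an inductor versus frequency.
# |Z_C| = 1/(w*C) shrinks as frequency rises; |Z_L| = w*L grows with it.
import math

C = 1e-12  # 1 pF parasitic capacitance (illustrative value)
L = 1e-9   # 1 nH parasitic inductance (illustrative value)

for freq_hz in (1e6, 1e9, 1e12):
    w = 2 * math.pi * freq_hz
    z_cap = 1 / (w * C)   # ohms
    z_ind = w * L         # ohms
    print(f"{freq_hz:.0e} Hz: |Z_C| = {z_cap:10.3g} ohm, |Z_L| = {z_ind:10.3g} ohm")
# At high frequency the capacitor tends toward a short, the inductor toward an open.
```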
Am I right that if we could theoretically implement program flow control and memory entirely with optics, we wouldn't need the overhead of converting between the transistor and optical worlds, thus dramatically improving speed and reducing power consumption while creating a fully Turing-complete computer?
The brain is 4D: in addition to "in-volume interconnection" there are also differences in chemical signals. We are one dimension away from making an artificial brain. This interview is so nice, I can't wait for these things to happen.
Gamer CEO confirmed.
Awesome. When can we order?
Question:
What specific advantages would photonic computing present in the context of real-time ray tracing? In particular, wouldn't it be possible to directly define light-matter interaction primitives in a way that doesn't take the linear algebra route? Since those kinds of computations are probably here to stay, wouldn't it make sense to make something like "photonic RT cores" (except the ray tracing, or something "close enough", would be an actual primitive), then write some code to interpret these primitives as tensor transformations to recover the linear algebra?
Seems like that alone would make these chips even faster than the current approach could ever allow, at least in RT workloads.
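For context on what "recovering the linear algebra" would mean: the core ray-tracing primitive is already mostly dot and cross products. Below is a minimal sketch (plain NumPy, the standard Möller-Trumbore ray-triangle test; the function name is my own) of the kind of kernel that would either be mapped onto a photonic tensor engine or, as suggested above, replaced by a native optical primitive.

```python
# Ray-triangle intersection (Moller-Trumbore): the inner loop of ray tracing,
# expressed as the cross/dot products a tensor accelerator would chew through.
import numpy as np

def ray_hits_triangle(origin, direction, v0, v1, v2, eps=1e-8):
    """Return the hit distance t, or None if the ray misses the triangle."""
    edge1, edge2 = v1 - v0, v2 - v0
    pvec = np.cross(direction, edge2)
    det = np.dot(edge1, pvec)
    if abs(det) < eps:                 # ray is parallel to the triangle plane
        return None
    inv_det = 1.0 / det
    tvec = origin - v0
    u = np.dot(tvec, pvec) * inv_det
    if u < 0.0 or u > 1.0:
        return None
    qvec = np.cross(tvec, edge1)
    v = np.dot(direction, qvec) * inv_det
    if v < 0.0 or u + v > 1.0:
        return None
    t = np.dot(edge2, qvec) * inv_det
    return t if t > eps else None

# Example: a ray from the origin hitting a unit triangle lying in the z = 1 plane.
tri = [np.array(p, dtype=float) for p in ((0, 0, 1), (1, 0, 1), (0, 1, 1))]
print(ray_hits_triangle(np.zeros(3), np.array([0.25, 0.25, 1.0]), *tri))  # -> 1.0
```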
The future might be wavetracing using quantum computing
@@JorgetePanete I get what you mean, but high-performance, energy-efficient QPUs won't be in consumer PCs for a while, whereas "classical" photonic chips are getting there much sooner than I expected.
I'm emphasizing that in the meantime (maybe the next two to five decades), some applications like ray tracing in personal computers and mobile devices might benefit from being actual analog primitives within a photonic chip rather than taking the long-winded digital route (via tensor-multiplication-based models). We could see some latency increase, but the potential gains in accuracy (ironically enough) and throughput could be unfathomable!
@@maloxi1472 I hope so. Is there any estimate of when Lightmatter or anyone else will ship those chips in mainstream PCs?
@@JorgetePanete None so far. They're focusing on the cloud provider market because that's where the ROI is biggest, but the tech is there.
@@maloxi1472 Good to know, I hope we see its full potential with THz and mW
Can it outperform current market chips, and what chip would it currently be on par with? Obviously the premise is that it can vastly outperform silicon, but are we there yet?
No.
If it exists, and if it works, then it could most likely outcompete all market chips simply thanks to the colour (multiple wavelength) factor.
John, excellent job in avoiding clearing up any of the misinformation from the first interview. What do you get out of the arrangement? I thought sponsors had to be called out? Or are you flattered to talk to a CEO as a smaller creator? Or do you think it's got a good chance of boosting your channel?
Thanks, Leo, just seeing your comment right now. LightMatter is not a sponsor and I just followed up b/c there were so many questions. If I have a sponsor, it will be completely declared in the video and in the video description ... as I did with my creator coin in this video.
Use high-density output to signal some form of Morse code to a transceiver, which then sends out the transmission signal, or coded packets, at low density. That can then be unbundled back into a usable signal. End-user delay... 🤷
13:10 that's not Cyrillic, John, that's Greek for Vissarion Mema.
Okay, so we now have this device that can receive light and use that information to compute. This means we can perform chemistry based on outputs from this device. So if we attach a filter, we could basically manufacture photosynthesis, right? I mean, I'm sure it'd be expensive at first, but isn't all new technology? And if I'm not misunderstanding anything, this could mean we could make portable CO2 filters.
Questionish thing: Would photonics allow a security camera to use no energy? Seems like a traditional camera-to-monitor setup would be perfect for photonics, as there are no calculations required, just a transfer of data from one point to another.
Do you mean running fiber to every photosensor of the camera? To encode it into a video and send it across a network you'll need power no matter what.
@@RomanLeBg I don't see any reason why you would even need to encode it. A chain of mirrors can be set up in such a way that you can see something fairly far away, and it uses no energy - but once you chain too many mirrors the data loss starts to add up, making it harder and harder to see. Optical fibres use total internal reflection, which makes the data loss negligible. Adding energy to the system seems unnecessary, since images are themselves made of light and you don't intend to view them any brighter than they were to begin with.
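For a rough sense of how small that loss is in fibre, here is a minimal sketch assuming the commonly quoted ~0.2 dB/km attenuation of standard silica telecom fibre at 1550 nm (a chain of mirrors would behave differently):

```python
# Fraction of optical power surviving a fibre run: P_out/P_in = 10^(-a*L/10),
# assuming ~0.2 dB/km attenuation (typical for silica fibre at 1550 nm).
ATTENUATION_DB_PER_KM = 0.2

def surviving_fraction(length_km: float) -> float:
    loss_db = ATTENUATION_DB_PER_KM * length_km
    return 10 ** (-loss_db / 10)

for km in (0.1, 1, 10, 100):
    print(f"{km:>6} km -> {surviving_fraction(km):.1%} of the light remains")
# Over a building-scale run the loss really is negligible; over 100 km it is not.
```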
@@Development530 But a hair-thin fiber for each pixel seems like a lot of things that could break easily. I feel like we aren't talking about the same thing, though - I don't think I completely understood your idea lol
@@Development530
The same data loss occurs with optical fibres.
Can it play video games? Where are the questions "Can it run Doom?" and "Can it run Crysis?" answered?
Of course not, at least not at first. As he's said many times, it won't run Windows or any operating system that could support games.
In terms of the next 15 years, when fabricating in 3D, aren't small dies more efficient than large dies? 1 THz requires small dies, not big ones.
Where is it, though?
Can someone explain to me whether it is an actual CPU or a co-processing unit?
Both.
I want to be a photonics engineer because it sounds cool... and is cool!
I really want to know if you can do mining (Bitcoin, Ethereum, or any other coin) with a Lightmatter computer. Can you do it? Is it going to be efficient?
He doesn't know because it isn't real.
@@TheGrumbliestPuppy I guess we'll find out soon enough...
@@maloxi1472 How? If nothing comes from it, there won't be a big announcement saying "this tech doesn't work!" Because it could take decades to become viable, if it *is* viable, people will just wait and then eventually forget about it.
Huawei indicated they built a photonic chip as well - did they just copy your product?
RGB is racing stripes for puters, lol
I'm seeing Asimov's positronic brain here
Does Lightmatter have publicly traded stock?
"It's not gonna run Crysis. It will not be your CPU" and "It would be an awful waste of an incredible hardware" Screw you I will do with my photonic computer, (if I'll have one) do what I want.
Fkk this company. I hope there is another company out there trying to make it a consumer product.
I want to buy shares of that company.
good luck :-)
In that case, I have a bridge to sell you.
@@TheGrumbliestPuppy ?
@@quentinjorquera8892 It's an American proverb, referring to a historical scam that lots of wealthy people fell for. People would "sell" them a bridge that they didn't have any rights for, with a fake deed, and scam them out of their money. When people are gullible, you tell them "In that case, I have a bridge to sell you."
Nvidia is providing AI machines as well as graphics cards for crypto mining, but you are addressing graphics processors in the near future. Since miners are looking for machines that consume less power, I believe this tech could replace current cards. How and when could this tech take on the main role in the mining market?
I want a card from Lightmatter. I have some decentralized security to provide from my home.
Flux is looking to capitalize on this technology at a decentralized scale.
no "interesting questions" about mining bitcoin on it?
@Auracle It would be a photonic ASIC, not a general-purpose PC to "run Doom" - so no, it would not "be the same thing".
1) Do you fabricate your GPU using TSMC, SMIC, etc., or are you an independent fabricator?
2) How can you use light to perform logic - for example, what does a light-based AND gate or OR gate look like?
3) Can you use light to make RAM?
4) I believe you don't need gold or silver in your case. Also, is manufacturing light-based transistors at mass scale different from manufacturing today's electric transistors?
5) An electric computer may represent information as current flowing for one and no current for zero, and your computer may likewise treat light as one and no light as zero... but could we make a photonic computer that uses no light as zero, red as one, green as two and blue as three - a quaternary computer? (See the sketch after this list.)
6) So Moore's law will have its funeral soon, yet I want to know: what is the size of your transistors? Are they optical transistors, and can they go smaller than 5 nm?
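To make question 5 concrete, here is a minimal sketch of what a quaternary (base-4) encoding over four optical states would look like - the off/red/green/blue labels are purely illustrative, not anything Lightmatter has described:

```python
# Base-4 ("quaternary") encoding sketch: each symbol is one of four optical
# states - labelled off/red/green/blue here - so one symbol carries 2 bits
# and one byte needs 4 symbols. Purely illustrative, not a real device scheme.
SYMBOLS = ["off", "red", "green", "blue"]          # digit values 0, 1, 2, 3

def encode_byte(value: int) -> list[str]:
    """Split one byte (0-255) into four base-4 symbols, most significant first."""
    digits = [(value >> shift) & 0b11 for shift in (6, 4, 2, 0)]
    return [SYMBOLS[d] for d in digits]

def decode_byte(symbols: list[str]) -> int:
    """Inverse of encode_byte."""
    value = 0
    for name in symbols:
        value = (value << 2) | SYMBOLS.index(name)
    return value

encoded = encode_byte(ord("A"))   # 65 -> ['red', 'off', 'off', 'red']
print(encoded, decode_byte(encoded) == ord("A"))
```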
I can personally recommend this talk from the Photonics Research Group at the University of Gent, which deep-dives into how these things work, the theory behind them & how they are being made. I've watched it myself & approve: ua-cam.com/video/CBhdLTTbYoM/v-deo.html
@@FrankHarwald Thanks, it was quite a 101 video and I did learn quite a lot - quite excited 😊😊😊😊😊 A little unhappy that big tech undervalues optical computing compared to quantum computing.
But it didn't answer my questions about the optical transistor and the others.
Can it run a Bitcoin miner?