The Power of an RTX GPU in the Palm of Your Hand!!! (Pocket AI TBT-A500)
- Published Nov 26, 2023
- The Pocket AI TBT-A500 is a ridiculously tiny eGPU from ADLINK. They sent me over a review sample, and I was clear that I would be focusing on gaming performance even though the primary focus of the device seems to be productivity workloads (which I don't really test on my channel since, quite frankly, I'm not an expert in those applications). So let's find out if this thing can improve the gaming performance of an older laptop, as well as see how well it stacks up against a new mini PC like the Minisforum UM780 XTX with a Radeon 780M iGPU. I reviewed it using the drivers supplied on the ADLINK website, which seemed to be rather old. Company website (not a sponsored or affiliate link): www.adlinktech.com/en/pocket-...
What equipment do I use to make my videos?
Camera: Sony a6100 amzn.to/3wmDtR9
Camera Lens: Sigma 16mm f/1.4 amzn.to/36i0t9t
Camera Capture Card: Elgato CamLink 4K amzn.to/3AEAPcH
PC Capture Card: amzn.to/3jwBjxF
Mic: My actual mic (AT 3035) is out of production but this is a similar mic (AT 2020) amzn.to/3jS6LEB
Portable Mic attached to camera: Rode Video Micro amzn.to/3yrT0R4
Audio Interface: Focusrite Scarlett 2i2 3rd Gen: amzn.to/3wjhlad
Greenscreen: Emart Collapsable amzn.to/3AGjQXx
Lights: Neewar Dimmable USB LED amzn.to/3yw4frD
RGB Strip Backlight on desk: amzn.to/2ZceAwC
Sponsor my channel monthly by clicking the "Join" button:
/ @danielowentech
Donate directly to the channel via PayPal:
www.paypal.com/donate?hosted_...
Disclaimer: I may earn money on qualifying purchases through affiliate links above. - Science & Technology
Is that a Large Language Model in your pocket or are you just happy to see me??
Tiny language model
It's a rocket....
@@christophermullins7163 it's a Tiny Large Language Module 🤯
You probably can't even load a single LLM with that VRAM
@@racialmarvel9468 TLLM ™️
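For what it's worth, the "can't load an LLM in 4GB" claim above can be roughed out with a back-of-envelope sketch. The 20% activation/KV-cache overhead factor is just a rule-of-thumb assumption, not a measured figure:

```python
def llm_vram_gb(params_billion, bytes_per_param, overhead=1.2):
    """Rough VRAM estimate for LLM inference: weight size plus ~20%
    headroom for activations and KV cache (crude rule of thumb)."""
    return params_billion * bytes_per_param * overhead

# A 7B model in fp16 (2 bytes/param) blows far past 4 GB:
print(round(llm_vram_gb(7, 2), 1))    # ~16.8 GB
# Even 4-bit quantization (0.5 bytes/param) still slightly overflows 4 GB:
print(round(llm_vram_gb(7, 0.5), 1))  # ~4.2 GB
```

By this estimate, only very small models (roughly 3B and under, quantized) would fit comfortably in the Pocket AI's 4GB.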
The mini pc is a lot more impressive to me than a pocket size connectable gpu
Yup. Plus, it's a whole system ready to go. All you need is a monitor and peripherals.
@@Krenisphia Or TV and peripherals.
and that's a laptop minus the battery
The product makes sense for right now. There's not enough consumer level demand to integrate it into laptops or pcie.
I'm so glad you used it on a Handheld. Seems like a nice match for those.
@@OneZERROone this would improve something like a Win600 significantly.
We will need this in the future to run the AI in our heads.
Power requirements so low, our own body heat can run it.
That looks so similar to the rumored Switch 2! Would be interesting to crank the power back down to 7W
The problem is loopback to the internal display over Thunderbolt. It destroys performance. This device does not have video out, so…
This is really cool stuff!
If DLSS isn't working, did you try FSR with the Pocket AI? Also, if that doesn't make a difference, is the CPU the bottleneck?
I would love to see Lightroom tests. Would save me from having to shell out for a gaming GPU that I don't need.
Aha, with its $590 price and measly power it would be a perfect match for somebody who wants to buy overpriced garbage.
And even if it were powerful it would be worse than a 4070 because the 4070 can be resold.
@@marsovac oh shit I missed that part lol. I just bought a 6750 XT for 450 CAD and it feels like I've been ripped off
bro that GPU is $1050 in my country 💀. Could buy a 7800 XT or 4070 if I wanted to
I realize I'm late, but I'm really curious whether you could check if this little thing is capable of RTX Voice (specifically the noise suppression) and NVIDIA NVENC encoding for OBS streaming, and if so, what the performance would be. I'm really tempted to get this for a Windows VM with those two tasks in mind.
I REALLY wish you would have run FSR on the Pocket AI... with DLSS not doing ANYTHING, I don't think it turned on at all... but FSR bumped up everything you put it on, so I THINK it would help the Pocket AI
- ALSO you never mentioned the PRICE... I'm guessing $350... because, you know, Nvidia...
$480 according to the links in the description
@@ultraviolet2497 Holy shit, what? You could actually buy a full RTX 3050 (which is pretty much what this card is) laptop for that price!
Edit: sorry, I was wrong. This is literally worse than a laptop 3050, it only uses a 64 bit bus. It's a laptop 2050 with slightly higher boost clocks and slightly lower memory clocks.
It looks to be an RTX A500 embedded solution, based on GA107 die. I suppose I can think of it like a cut-down 3050.
Crap in other words
It's literally a laptop RTX 3050 in a box.
@@HunterTracks a heavily underclocked and under-powered 3050 mobile (25W vs 60W base and 35W "MaxQ")
@Steel0079 the 3050 Laptop GPU has sold better than any AMD GPU. If that's "crap" most vendors wish they could sell as well as "crap".
@@Wobbothe3rd First, sales is not an indication of quality. Second, RX 580 currently has more users on the Steam Hardware Survey than 3050 Laptop, and it came out back in 2017.
the final job is using the Pocket AI with a pocket Steam Deck? Where there's a will there's a way :D Digital Foundry would love this!
What is that software that you use to benchmark your fps and all the other stats?
I wonder how well it would work for an older home theater PC with RTX Video upscaling. You would be surprised how nice RTX upscaling can make a 500MB 4K Blu-ray x265 rip look on a 4K TV. (I'm using a 4090)
How do you use RTX to upscale video?
35: "This is the new OLED model, just so you know" lol
Devices like these are really gonna blow up in popularity when Thunderbolt 5 comes out.
thats what she said...
Will this work on an old 2017 Predator Helios? Not really savvy with tech stuff, so I've got zero idea on this.
congrats on getting the oled
-How many eGPUs can you run, besides being limited by I/O inputs-
Sorry, I thought this was LTT
why was CPU usage so much higher in Cyberpunk when using the iGPU??
You missed a few things. First of all, there's the USB/Thunderbolt connection limitation even just for transferring data to the GPU. And the second thing that definitely hits performance: this eGPU does not have any HDMI/DP output, so all frames come back to the PC through that tiny USB connection to be displayed through the iGPU of your PC/laptop. There is so much overhead in this scheme for a USB connection that I didn't expect much from it. But I'd like to see the discrete video card called the NVIDIA RTX A500, which has a one-slot design and doesn't need any extra power cables. From what I see in the TechPowerUp database, the A500 chip itself, when used in laptops, has relative performance around a Radeon RX 590 or 1650 Super, so it's powerful enough. But the connection limitation of USB (the chip has 8x PCIe 4.0 lanes) and maybe the power restrictions of this device (according to TechPowerUp the laptop variant is rated around 60W TDP) hit the performance of this tiny eGPU hard.
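The readback overhead described above can be roughed out with some quick arithmetic. The ~2800 MB/s figure for usable Thunderbolt 3 PCIe payload is an assumption (the 40Gbps link reserves bandwidth for other traffic, and real-world throughput varies):

```python
def readback_mb_s(width, height, fps, bytes_per_pixel=4):
    """Bandwidth needed to copy rendered frames back to the host,
    assuming uncompressed RGBA output (4 bytes per pixel)."""
    return width * height * bytes_per_pixel * fps / 1e6

TB3_USABLE_MB_S = 2800  # assumed usable PCIe payload over Thunderbolt 3

for w, h, fps in [(1280, 720, 60), (1920, 1080, 60)]:
    mb = readback_mb_s(w, h, fps)
    print(f"{w}x{h}@{fps}: {mb:.0f} MB/s "
          f"({100 * mb / TB3_USABLE_MB_S:.0f}% of assumed TB3 payload)")
```

So the readback itself eats a noticeable but not dominant slice of the link; the bigger cost is that every frame makes a round trip, adding latency and contending with the texture/command traffic going the other way.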
Could this be successfully used with a USB-C port at 40Gbps?
Seems the Pocket AI is probably limited by its external link bandwidth, as well as simply not being designed for gaming.
Can't complain about a wrench not functioning as a screwdriver.
Nice video, I didn't even know about this little thing. Would be good for my brother who's a programmer, but he already got a 30-series card recently for that purpose. Maybe for travel.
I am pretty sure DLSS wasn't working. I have no clue why you didn't try FSR after seeing no improvement, especially on the mini PC, which obviously wasn't CPU bottlenecked.
Would be interesting to compare this to something like a 4GB 750Ti
This is much more powerful than a 750 Ti. My old 970 is 4 TFLOPS, which is less powerful than this. This would be more in the 1070 range.
Is it gonna work with the gpd win 2 😊
Is it possible to use two Pocket AIs?
You use a Motorola Thinkphone??
No way Daniel, that's so cool!
Take a look at the GPD G1 and ONEXGPU, 6700M XT inside!
Is it compatible with a MacBook, and does it make sense? 🤔
Can you try this in other games and emulation? We know Cyberpunk has issues with it.
The power of the sun in the palm of my hands
😮😮😮 Is that a Red Magic 9!? 0:24
Holy wows
"Ah Rosie I love this boy"
Of course nvidia will give measly amount of VRAM and charge an arm & a leg for the thing
wtf does this have to do with nvidia
@@magicari8919 it is from Nvidia, so why not blame them. If your car comes with only one seat while being a minivan, who is to blame?
@@MrGermanletsplayerxD it's not Nvidia who sells it lol.
Even if I go for the vram argument, 4gb for such a tiny tiny GPU is completely normal, you are talking about something that is less powerful than an APU
well, VRAM is expensive, and it's not Nvidia; my GTX 660 only had 2GB of VRAM, the RTX 4060 has 4x as much, 8GB
@@daikon711 Yes, except at a comparable price AMD = 16GB.
I wish phones could use this
Is the title a reference to spiderman 2?
It looks like an Atari cartridge! 😂. That would be an awesome case for it!
There's a caveat about how Afterburner reports power draw on iGPUs -- it displays the power draw for the entire APU, not iGPU only. Same for the CPU -- you can frequently see the wattages reported being the same, even if the iGPU is not doing anything. So the real iGPU power draw would be the difference between running the game on a discrete GPU and the iGPU.
Also, yeah, that about figures for the eGPU. It's a pretty low power chip that is further limited by the Thunderbolt bandwidth, you're not gonna be able to play anything even mildly demanding on that thing. Which is a shame, because you'd kinda expect a product this expensive and one that's been advertised for gaming to, you know, game.
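A minimal sketch of the differencing approach described above, for estimating real iGPU draw from Afterburner's whole-package numbers. The wattage readings here are hypothetical, purely for illustration:

```python
def igpu_power_estimate(package_w_igpu_run, package_w_dgpu_run):
    """Afterburner reports whole-package power for an APU, so estimate
    the iGPU's own draw by differencing two runs of the same scene:
    one rendered on the iGPU, one offloaded to a discrete GPU
    (assumes the CPU-side load is roughly the same in both runs)."""
    return package_w_igpu_run - package_w_dgpu_run

# Hypothetical readings: 55 W package while gaming on the iGPU,
# 28 W package while the same scene renders on a dGPU.
print(igpu_power_estimate(55, 28))  # 27
```

It's a crude estimate (CPU load rarely matches exactly between runs), but it's far closer to the truth than reading the package number as "iGPU power."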
It would have been incredible if this were an add-on to your hardware to handle ray-tracing-specific workloads without directly impacting the primary GPU's performance, overlaying the ray-traced effects over the rasterization.
But can it run Chrome?
Anyone in the comments reading this on Cyber Monday who is looking for a CPU upgrade: the Ryzen 7 5800X is on sale for $174 brand new. Lowest I've ever seen it, and a lot of CPU for that price.
It is not available in India 😢
So he DOES have legs! 😂
Not much of an upgrade from my old 970 Mini. There's a reason I now have a full sized GPU.
Hardly possible for the 780M in the 7840HS to draw 60W if the whole APU has a TDP of 70W. Having the CPU and SoC logic run on 10W would show you what a real CPU bottleneck looks like in CP2077. The CPU was at 50+% usage; you can't do that on 10W.
Simply put, this pocket GPU thing sucks for gaming because most of the silicon budget is tensor and RT cores, and there is hardly anything left to actually render with.
Can we use it to transcode Plex streams ?? LOL
That's actually a decent idea
Nothing will stand in our way! NOTHING!
I suggested this product to evga over a year ago.
That's probably why they went out of the GPU business. Bad suggestion. lol😂😂💯💯
It can make a giant hole in your pockets
this might be the future of gaming
Dragon's Dogma requirements are out
lol $600 😮
I WAS SUPER HYPED for the first minute of your video.
Looked it up instantly.
Saw 4gb of Vram.
Interest lost because I can't actually use this to do ANY of the AI work I do.
I need AT MINIMUM 8gb.
Whoever designed this FUBAR'd it.
If they had put 8GB in it, I might have bought it. If they had put 16GB in it, I'd have already ordered one.
WTF nvidia. Don't release AI products people cannot use for current AI.
This isn't a product from Nvidia, and 4GB of Vram is fine for inference.
@@Wobbothe3rd It's based on RTX A500, which, as you might've guessed, was made by Nvidia.
@@Wobbothe3rd It may be fine for some people, but it is literally 1/4 of what I would find extremely useful and 1/2 of what I need to do anything at all with.
It is literally useless to me for the specific purpose of AI. And I am basically the target market. And if I am not, then they are losing out on their largest target market.
So it's an ultra-low-power 3050M, which is also choked by PCIe bandwidth... no thanks.
This GPU has a TDP of 25W. Nice. Nvidia should make a 25W GPU with better performance for handhelds, with the option to change the TDP from 10 to 25W. Then combining an AMD 15W U-series CPU without the iGPU and a 25W RTX GPU would be so much better. If Nvidia released a cheaper 25W mobile GPU with performance ranging from an RTX 3050 35W (at 10W) to an RTX 3050 80W (at 25W), it would be so impressive.
So Nvidia needs to invent a way to run a GPU at 25W and get the performance of a previous-gen 80W GPU.
How do you suppose they do that... you made it sound trivial, like they could just choose to do it.
@@Jasontvnd9 The RTX 4060 is 5% to 10% better than the RTX 3060. But the RTX 3060 is 170W and the RTX 4060 is 120W.
The GTX 1650M is 50W, but the RTX 3050M at 35W performs the same as the GTX 1650M.
The RTX 4050 at 45W performs the same as the RTX 3050 at 80W.
So Nvidia could make a 25W GPU that performs the same as the RTX 3050 85W when the RTX 50 series comes.
Imagine a Steam Deck 2 with an AMD U-series processor (5 to 15W) and an Nvidia mobile GPU with 4GB (10 to 25W), coupled with an 800p OLED display (that's enough for a handheld), a 256GB SSD and 16GB of 4800MHz RAM for $399.
At 15W we could get the performance of a laptop 1650 50W GPU, which the AMD 7840U achieves only at 50 to 60W. It would be awesome. 800p would keep the longevity of the device as high as possible. If we got DLSS frame generation with it, it would be huge.
But this is probably not possible, I guess, because Nvidia makes APUs for the Nintendo Switch too. So making a 25W GPU and selling it for the Steam Deck would create a problem with their own customer, Nintendo.
@@karthikeyan53 This is because your logic is completely flawed. Even if the GPU were 25W, there's more to a handheld than just the GPU. Add the processor, RAM, fans and screen, and suddenly your battery is dead in 20 minutes.
The APU AMD uses is an SoC; Nvidia doesn't make x86 processors, so they can't incorporate everything onto an SoC like AMD can.
@@Jasontvnd9 Bro, you can adjust the wattage of the GPU in gaming laptops. It's possible in a handheld too. If the GPU is 25W, you can set it to 10W to get the performance of a GTX 1650. Nvidia can make it possible if they want; they have the potential for it. The RTX 4050M at 45W can perform the same as the RTX 3050M at 80W.
The RTX 3050M at 35W performs the same as the GTX 1650M at 50W.
So the RTX 4050M at 20W could perform the same as the GTX 1650M at 50W, and the RTX 5050M could maybe provide GTX 1650M performance at 10W. It's possible.
So run the GPU at 10W and the CPU at 5W: CPU + GPU = 15W. All other components are 5 to 7W, so about 22W total. If the handheld has a 60Wh battery, it can last up to about 2 hours 40 minutes max.
No OCuLink port? No PCIe x16 connection? No NVMe connection?
Give it 16GB of VRAM and make it $300, and I could see a future for it
480 bucks for this thing?! wtf?!
>>Pool Pocket
"im shocked at how small it is"... thats what she said, to me 😒😒
It says over 6.5 TFLOPS of computing power. BULL SH!T
It does have it: 16 Ampere SMs at 1.6GHz. But due to integer arithmetic it's somewhere between 3.3 TFLOPS and 6.5, plus it's choked by bandwidth.
The one that's a LIE is the inference throughput... I don't get more than 52 peak TOPS by hand.
Seems about right tbh
@@YourPalHDee you do realize the Xbox One had 1 TFLOP of computing power, and it ran Cyberpunk at 30 fps, so something more than 6 times more powerful barely gets 20 fps with performance upscaling. Give me a break
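The SM arithmetic quoted in this thread can be checked with a quick sketch. The assumption behind the halved figure is Ampere's SM layout (128 FP32 lanes per SM, half of which are shared with INT32 work), applied to the 16 SMs at ~1.6GHz stated above:

```python
def fp32_tflops(sms, fp32_lanes_per_sm, clock_ghz):
    """Peak FP32 throughput: lanes x 2 FLOPs per FMA per clock x clock."""
    return sms * fp32_lanes_per_sm * 2 * clock_ghz / 1000

# 16 Ampere SMs x 128 FP32 lanes at ~1.6 GHz gives the quoted ~6.5 TFLOPS peak:
print(round(fp32_tflops(16, 128, 1.6), 2))  # 6.55
# If half the lanes are occupied by INT32 work, the effective figure halves:
print(round(fp32_tflops(16, 64, 1.6), 2))   # 3.28
```

That matches the "between 3.3 and 6.5 TFLOPS" range in the comment: the spec-sheet number is a theoretical peak, not what games see.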
It can't even outdo AMD integrated graphics. Overpriced ($600) gimmick.
Most of the AI workloads I'm aware of need a lot of vram. I was surprised that it only has 4gb. It must be designed for very specific and oddball use cases.
but you can't put that in your pocket; that's the main selling point you're paying a premium for.
It's focused on AI... Pocket "AI"... Gaming wasn't even a consideration.
@@alistairblaire6001 its probably for inference workloads, which is enough in object detection models
@@alistairblaire6001 Exactly. For something outwardly AI focused, bottom of the barrel VRAM specs don't make sense.
The new-generation GPUs rely heavily on upping the power usage, so 30 watts is the limiter. They may have reached some performance levels of the GT 1030, but with 4GB of VRAM. Smaller transistors help each generation but then have to be cooled more heavily. Does this have a heatsink and a fan? At least the GT 1030 does. And the older-tech 1030 draws 30 watts as well. The power draw will determine the performance.
Why do you keep using the Cyberpunk benchmark? It is wildly inaccurate compared to actual gameplay.
Simply because it's easily comparable not only on his end but to the viewers as well.
@@kiburi2903 Comparable to what? Cards that cannot reasonably handle Cyberpunk can handle the benchmark, but not the real game. It is indicative of neither real GPU power nor performance. So we are comparing BS to BS, I guess.
I thought it was $99
$584 for this shit 😂
Heck, even a used RX 580 for $50 can do way better
Pretty sure RDNA 3.5 iGPUs, or even a Phoenix 2 APU coming to desktop next year, will be better than this
lies
"BUT WITHOUT BEING SAID"?? ..bro, are you listening to yourself talking?
Come on Nvidia, make an RT(X) 4030 already
The 4050 is perfect as is.
It's meant for AI and it only has 4GB? Damn, that thing is going to become useless so fast.
Clueless pretend experts on the internet don't know what inference is. Idiots.
This is literally just the laptop RTX 3050 in a box, which should be a red flag to most people, because the laptop 3050 is not a very good card at all (and I say this as a person who owns one).
The laptop RTX 3050 is far better than this.
Too slow on the pocket AI. Too bad you couldn't provide real world examples of what it was meant for.
This is really cool. Too bad you only tested it for gaming. It doesn't have its own video out, so obviously it's not meant for video output, and thus this whole video ends up being a misleading advertisement for AMD, just like everything else on this channel. Oh well.
He talks about the focus of the product, and the thing is named "Pocket AI." If someone still thinks this is for gaming after that, then it's on them.
WOW, the Pocket AI EGX-TBT-A500 with the RTX A500 costs $479, which is just ABSURD for what you get. Sorry, this thing is really pointless, absolutely for people who want to game on their laptop/PC that does not have a graphics card.
Even for AI workloads it was super shit. You can get an RTX 3060 12GB for half the price of this thing.