Radxa Fogwise AI SBC - An amazingly affordable GPT and Stable Diffusion server for your home!

  • Published 24 Dec 2024

COMMENTS •

  • @Solkre82
    @Solkre82 6 months ago +6

    Regarding the opening. You really should be wearing your wireless livestrong bracelet for static safety.

    • @PlatimaTinkers
      @PlatimaTinkers  6 months ago +3

      Oooh my mate, I did not even think of that either. I've got a magnetic bracelet which can help with any pain from electrocution or cuts too!

  • @TheJmac82
    @TheJmac82 6 months ago +5

    I would have actually purchased one of these if they sold a 32GB model. Seeing as it runs Llama 3 7B that well, I would assume Mixtral would be about the same speed with much better outputs.

    • @PlatimaTinkers
      @PlatimaTinkers  6 months ago +1

      Yeah 32GB would definitely allow some bigger models to run without being quantized. Judging by the SKU (FW190-D16E64R31M0W0) there will be variations eventually!
      And yep I'll likely run laser-dolphin-mixtral-2x7b-dpo on it!

  • @fengbai9170
    @fengbai9170 6 months ago +3

    If you use the LCM scheduler, you need to switch to LCM models, which reach the same result in 4 steps (~1 s) as other schedulers do in 20 steps.
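    A minimal desktop-side sketch of that pairing, using Hugging Face diffusers (0.22+) rather than the Airbox's own TPU-compiled pipeline; the checkpoint name, prompt, and 4-step / low-guidance settings are illustrative assumptions:

```python
import torch
from diffusers import DiffusionPipeline, LCMScheduler

# "SimianLuo/LCM_Dreamshaper_v7" is an LCM-distilled SD 1.5 checkpoint, used
# here purely as an example; swap in whichever LCM model you prefer.
pipe = DiffusionPipeline.from_pretrained(
    "SimianLuo/LCM_Dreamshaper_v7", torch_dtype=torch.float16
).to("cuda")

# Pair the LCM scheduler with an LCM model, as the comment above suggests;
# using it with a non-LCM checkpoint is what gives the odd results.
pipe.scheduler = LCMScheduler.from_config(pipe.scheduler.config)

# LCM models are distilled for very few steps and little/no guidance:
# 4 steps lands roughly where a normal scheduler needs ~20.
image = pipe(
    "a watercolour painting of a lighthouse at sunset",
    num_inference_steps=4,
    guidance_scale=1.0,
).images[0]
image.save("lcm_4_steps.png")
```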

    • @PlatimaTinkers
      @PlatimaTinkers  6 months ago

      Excellent to know, thank you! I've not used LCM before, I was just clicking the presets. Cheers

  • @KrisDevelopment
    @KrisDevelopment 6 months ago +6

    Whiskey, honest review and tech, you sir win a subscriber!

    • @PlatimaTinkers
      @PlatimaTinkers  6 months ago

      Hah much appreciated, and you're welcome 😁 (back on the whiskey right now!)

  • @ElectroOverlord
    @ElectroOverlord 6 months ago +5

    That fan noise is a deal breaker for me.

    • @turanamo
      @turanamo 6 months ago +1

      No display ports either! Strange! And you get only 3GB of RAM for CasaOS.

    • @PlatimaTinkers
      @PlatimaTinkers  6 months ago +3

      @@turanamo You can re-map the memory at will with command line arguments. LLM needs ~7GB though. Ref: docs.radxa.com/en/sophon/airbox/local-ai-deploy/ai-tools/memory_allocate
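      As a rough illustration of that re-mapping, here is a small Python wrapper around the memory_edit tool the linked doc describes. The script path, DTB name, option names (-p / -c -npu -vpu -vpp) and the example sizes are assumptions taken from that doc - verify them against your firmware, and note the box needs a reboot for a new split to apply.

```python
"""Sketch: adjust the Airbox NPU/VPU/VPP vs. system RAM split via memory_edit."""
import subprocess

MEMORY_EDIT = "/data/memory_edit/memory_edit.sh"   # assumed install path
DTB = "bm1684x_sm7m_v1.0"                          # placeholder device-tree name

def show_current_split():
    # Print the current NPU/VPU/VPP allocation (whatever is left goes to Linux).
    subprocess.run([MEMORY_EDIT, "-p", DTB], check=True)

def set_split(npu_mib, vpu_mib, vpp_mib):
    # Request a new split in MiB; takes effect after a reboot.
    subprocess.run(
        [MEMORY_EDIT, "-c",
         "-npu", str(npu_mib),
         "-vpu", str(vpu_mib),
         "-vpp", str(vpp_mib),
         DTB],
        check=True,
    )

if __name__ == "__main__":
    show_current_split()
    # Example only: ~7 GiB for the NPU so a 7B LLM fits, smaller video buffers.
    set_split(npu_mib=7168, vpu_mib=2048, vpp_mib=3072)
    print("Reboot the Airbox for the new allocation to take effect.")
```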

    • @PlatimaTinkers
      @PlatimaTinkers  6 months ago +1

      Yeah, absolutely understandable. I'd have to have it far away - I hate noise. But the little bugger gets bloody warm, so it makes sense.

    • @ElectroOverlord
      @ElectroOverlord 6 months ago +1

      @@PlatimaTinkers My issue is that my home lab takes up 15 feet of wall and I am not running wire just to get it away from my workstation. Tons of these SBCs & SoCs have just horrid cooling. I'm just going to turn my old desktop into an Unraid box and run shit from that.

    • @PlatimaTinkers
      @PlatimaTinkers  6 months ago +1

      @@ElectroOverlord Yeah, I get that completely! Could always go Wi-Fi, but that isn't my preference either.
      The main win here is the TPU/NPU though. It outperforms my RTX 3080 at diffusion and inference models, so while you could do a Docker/Kubernetes cluster, an Unraid box, etc., unless you throw some serious cash and wattage at it, you're missing that one bit of functionality!

  • @charlesrg
    @charlesrg 2 months ago

    Please post more info on this box? Have you tested the power consumption at idle and under AI load?

    • @PlatimaTinkers
      @PlatimaTinkers  2 months ago

      Yo. radxa.com/products/fogwise/airbox/#techspec (just has 65W max listed, I have not measured it myself)
      Enjoy 🤘

    • @charlesrg
      @charlesrg 2 months ago

      @@PlatimaTinkers Yes, I'm wondering where it sits while at idle. Perhaps I can set the governor from performance to something more balanced; looking to build an always-on home AI box.
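      On the governor idea, here is a quick sketch using the generic Linux cpufreq sysfs interface - nothing Airbox-specific is assumed, and the governors actually offered (ondemand, schedutil, powersave, ...) depend on the kernel the board ships with; there may be no literal "balanced" option. Run as root.

```python
"""Sketch: list and switch the CPU frequency governor via sysfs."""
from pathlib import Path

CPUFREQ = Path("/sys/devices/system/cpu")

def available_governors():
    # Collect the governors offered by every CPU policy.
    govs = set()
    for f in CPUFREQ.glob("cpu[0-9]*/cpufreq/scaling_available_governors"):
        govs.update(f.read_text().split())
    return govs

def set_governor(governor):
    # Apply the governor to every CPU policy; needs root.
    for f in CPUFREQ.glob("cpu[0-9]*/cpufreq/scaling_governor"):
        f.write_text(governor)

if __name__ == "__main__":
    print("Available:", available_governors())
    # 'ondemand' (or 'schedutil') is the usual lower-idle-power choice.
    set_governor("ondemand")
    print("Now using:", {f.read_text().strip()
                         for f in CPUFREQ.glob("cpu[0-9]*/cpufreq/scaling_governor")})
```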

    • @PlatimaTinkers
      @PlatimaTinkers  2 months ago

      @@charlesrg In two weeks I'll be back out there and, if I remember, I can measure idle for you. I have basically wiped it though, so no AI models, but you can assume they max out at ~60-65W.
      I'd say this is one of the few single-box products that does what is usually needed. Just LOUD hah.

  • @AlwaysCensored-xp1be
    @AlwaysCensored-xp1be 6 months ago +5

    Cool that we get this AI grunt for home use. 30 TOPS compared to 13 on the Pi 5 with Hailo. SDXL on the Pi 5 takes 3 minutes per image.

    • @PlatimaTinkers
      @PlatimaTinkers  6 months ago +2

      Yeah it's insanely powerful. Running Llama on this seems to be as good as running with my RTX 3080!

  • @jyvben1520
    @jyvben1520 6 months ago +8

    Radxa needs a new quality inspector; the typo on the box is unforgivable: "sever" instead of "server".

    • @PlatimaTinkers
      @PlatimaTinkers  6 months ago +2

      HAH, I did not even notice this. Crap. I'll let Tom @ Radxa know!

  • @tylerjw702
    @tylerjw702 6 months ago +4

    Bro, do you sell those auto screwdrivers on your store or am I blind?

    • @PlatimaTinkers
      @PlatimaTinkers  6 months ago +1

      Hahaha nah sorry mate, here's an affiliate link for ya: amzn.to/3JruXZD
      Super handy for tightening FEP sheets if you do 3D SLA printing.

    • @tylerjw702
      @tylerjw702 6 months ago

      @@PlatimaTinkers unavailable. Lame

    • @PlatimaTinkers
      @PlatimaTinkers  6 months ago +2

      @@tylerjw702 Ah weak. Okay here's their Ali link instead: www.aliexpress.us/item/3256805050546819.html

  • @turanamo
    @turanamo 6 months ago +5

    4 shots of homemade whiskey and my man proceeds to saw off his M.2 SSD! Never drink and demo mate! 😂 Thanks for the entertainment though.

    • @PlatimaTinkers
      @PlatimaTinkers  6 months ago +2

      Never drink in a demo? Dafuq else am I supposed to do with my time? Bastard took 33 minutes first boot 😅
      And you're very welcome!

    • @turanamo
      @turanamo 6 months ago

      @@PlatimaTinkers 😂😂

    • @PlatimaTinkers
      @PlatimaTinkers  6 months ago

      @@turanamo 🤘

    • @turanamo
      @turanamo 6 months ago

      @@PlatimaTinkers YT deleted my previous reply. Just wanted to say thank you for the Platima Ruler that you added to my order. It's a splendid complement. Thank you!

    • @PlatimaTinkers
      @PlatimaTinkers  6 months ago +1

      @@turanamo YT is a pain like that 😑 And you're very welcome - thanks for the continued support!

  • @r0galik
    @r0galik 6 months ago +3

    It's not very good as a home server because the idle power is enormous (>30W). And it wasn't using the NPUs when predicting; they need some special software, so the 32 TOPS is locked. This seems to be just a repurposed industrial CCTV device (where they care about the number of streams and not idle power).

    • @PlatimaTinkers
      @PlatimaTinkers  6 months ago +2

      I am a tad confused by your comment, mate. It outperforms my RTX 3080 at diffusion models and inference, but whereas my RTX chews up 400W, this uses a tenth of that. Even my router at home, which doesn't have AI or any apps, consumes 18W on its own.
      Similarly, it was using the NPU for both the LLM text generation and the SD1.5 image generation - if that had been running on the Cortex-A53 cores, which I've done before, it would have taken days, if not weeks, to spit anything out!
      It'd definitely be excellent as an NVR - that's the purpose I'm strongly considering - but as a home server, and for the price, I think it's amazing. I mean, even as an 8-core / 16GB / 64GB SBC its price is bloody excellent!
      Crap, I forgot to run Geekbench on it!

    • @r0galik
      @r0galik 6 months ago +1

      @@PlatimaTinkers Huh... maybe it really was using the NPU, because it is quite fast... True, sorry. But an 8-core A53 at 2.3 GHz is nothing to sneeze at (an Orange Pi Zero 2W gets ~0.3 tokens/s with 4 cores at 1.5 GHz, or 1.2 GHz throttled).
      Learned a new acronym - NVR

    • @PlatimaTinkers
      @PlatimaTinkers  6 months ago +2

      @@r0galik Yeah, and if you look at the CPU utilisation while the inference or diffusion models are running, it's quite low (a quick way to check this is sketched below). Oh, plus those models consume gigs of RAM, e.g. I struggle to load them on my 12GB GPU. This SBC has 3GB allocated to the system (see docs.radxa.com/en/sophon/airbox/local-ai-deploy/ai-tools/memory_allocate#install-memory-edit-tool-pre-installed-on-the-system for re-allocating RAM).
      What I think it could really benefit from is something like `htop` for the NPU! That being said, I'm shit at reading the docs, so it might exist and I've missed it haha.
      An Orange Pi Zero 2W getting 0.3 T/s would be running a much smaller model, I would expect, not a 7B-parameter model, even quantized. If you've got any link I'd be very curious!
      With inference models, a huge amount of the bottleneck is also IO performance. I found I can back GPU RAM with system RAM, but it actually drops performance due to the throughput rate between the two buses.
      Yep, NVR is a good one - I deal with them every day in my day job, and having object/person/action recognition on real-time streams would be bloody amazing in some situations. E.g. holiday parks where shit gets stolen all the time!
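      The quick check mentioned above, sketched with the third-party psutil package (an assumption - any system monitor works): run it in a second SSH session while a generation job is going, and if the A53 cores stay mostly idle while output keeps streaming, the NPU is doing the work.

```python
"""Sketch: log per-core CPU utilisation while a model is generating."""
import time

import psutil  # pip install psutil

def log_cpu(duration_s=60, interval_s=1.0):
    # Sample per-core load once per interval and print an average plus per-core view.
    end = time.time() + duration_s
    while time.time() < end:
        per_core = psutil.cpu_percent(interval=interval_s, percpu=True)
        avg = sum(per_core) / len(per_core)
        print(f"avg {avg:5.1f}% | " + " ".join(f"{c:4.0f}" for c in per_core))

if __name__ == "__main__":
    log_cpu()
```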

    • @r0galik
      @r0galik 6 months ago

      @@PlatimaTinkers Yeah, not the full model (it crashes). I was just trying out the TinyLlama llamafile (YouTube does not like links, but you can find it via Mozilla-Ocho)

    • @r0galik
      @r0galik 6 months ago

      @@PlatimaTinkers Hi again - the model was a TinyLlama llamafile (the full Llama was indeed crashing).

  • @chandankp
    @chandankp 6 months ago +1

    Does it run Frigate NVR?

    • @PlatimaTinkers
      @PlatimaTinkers  6 months ago

      letmegooglethat.com/?q=Frigate+on+CasaOS

  • @UEGIIVIRUSIIXO
    @UEGIIVIRUSIIXO 6 months ago +2

    You are the Oppenheimer of M.2 SSD's, bringer of death.

  • @pah1of284
    @pah1of284 4 months ago +1

    Best channel ever!!!!!!!!!! Love this.

  • @Topy44
    @Topy44 6 months ago +2

    Radxa is pronounced "Rad-sha" by the way, according to them

    • @PlatimaTinkers
      @PlatimaTinkers  6 months ago

      Oooh okay that is awesome to know - will clarify in a future video. Makes sense too 😂 Cheers

  • @Psirust
    @Psirust 6 months ago +1

    lol
    Any plans on redoing this review when you get stuff up and running properly? Because so far none of the AI features seem to be working.
    I'm looking forward to seeing some YOLO / TensorFlow object detection to compare framerates and actual speed.

    • @PlatimaTinkers
      @PlatimaTinkers  6 months ago +1

      Tad confused by your response - the AI features work fine? Both in this product - as shown in the video - and in other videos where I've shown face or object recognition =/

    • @Psirust
      @Psirust 6 months ago

      @@PlatimaTinkers Maybe it's just me, but @31:50 the image generated does not match the prompt or even come remotely close to it.
      Also @32:27, you changed the prompts multiple times, yet the exact same image is generated - identical down to the pixel (I screen-snipped and overlaid each response in Photoshop).
      Correct me if I'm wrong, but if it's AI-generated then shouldn't the images have variations based on the prompts?
      I also meant object recognition for this particular product, the Radxa Fogwise AI.

    • @PlatimaTinkers
      @PlatimaTinkers  6 months ago

      @@Psirust Hey, so the first two images were right; the third was definitely busted, as you noted. I don't know how to use the LCM scheduler, and someone else indicated in a comment that it's supposed to have a different configuration from that, hence it not changing. I tried a few examples from civitai.com/models/81458/absolutereality and they rendered nicely. Should have probably shown them in the vid 😅

  • @DaxSudo
    @DaxSudo 6 months ago +2

    Wait mate I gotta destroy it for the Easter egg? I actually wanted to use the Ethernet mate.

    • @PlatimaTinkers
      @PlatimaTinkers  6 months ago +2

      Ah well if you're good enough you don't have to destroy it to figure it out 😋

  • @SomeMorganSomewhere
    @SomeMorganSomewhere 6 months ago +1

    Fuck me that boot/install process is broken.

  • @ted_van_loon
    @ted_van_loon 6 months ago +2

    Sorry, I already took a shit a while ago and flushed it away for the sewer mouses (they are often wireless).
    I was wondering what happened to your channel, then I finally saw a new video. Now I've noticed YouTube seems to have some kind of glitch auto-unsubscribing accounts from channels; I've seen the same happening to other channels that are more honest than average, so it seems like YouTube is again doing some form of shadow steering/manipulating, etc., similar to how many creators now have their own comments disappearing from their own videos despite saying nothing wrong.
    So I resubscribed, and should see it showing up again now even when not specifically searching for it.

    • @PlatimaTinkers
      @PlatimaTinkers  6 months ago +2

      Yeah, YouTube has been doing some really jank stuff recently! Had a few people complain, and I've noticed it myself. E.g. even when Cody'sLab releases another video it doesn't always show up in my subscriptions feed, but I can see it when I go to his channel.
      Odd. Not sure what is going on. I am guessing though that it's because I swear at the start of a lot of my videos now, and turned off monetisation for the majority of them.
      Thanks for letting me know!

    • @ted_van_loon
      @ted_van_loon 6 months ago +1

      @@PlatimaTinkers Yeah, YouTube is getting really shitty with such things, and way more direct in just suppressing anything they don't like, be it home chemistry like Cody's or something else. I assume YouTube couldn't handle you giving your own opinions on products instead of saying whatever sponsors want you to say.
      Next to that, YouTube also seems to have a dislike for tinkering and freedom of repair.
      I personally just use other platforms like Odysee for most things instead, as that doesn't manipulate shit as much - at least for now, as the future is harder to know.
      I also know of creators setting up platforms which essentially just link creators' content together from all networks, in case they get shadowbanned or banned on one, or their content gets removed.
      One was also created for exactly such problems - I think it was called FUTO; Louis Rossmann has also talked about that a lot, even though on YouTube it probably isn't available anymore. Apparently YouTube's main business model isn't advertising but, of course, as we already know, datamining/gathering - and literally manipulating content and content visibility to change the global average bias of people is now also one of their biggest business models.

    • @PlatimaTinkers
      @PlatimaTinkers  6 months ago +2

      @@ted_van_loon Yeah, I've been watching more content on Nebula recently, which is excellently priced and has a heap of good creators from here on it. Prob going to get demonetised further just by saying that service name too haha.
      Interesting re the other platform. I'll have to look into it. I hope one day there is something I can use other than here, especially where I have more control over the ads!

    • @ted_van_loon
      @ted_van_loon 6 months ago +1

      @@PlatimaTinkers Haha, yes indeed, they might try to shadowban or demonetize you even more just for mentioning that.
      Talking about more control over ads and such: I have designed something (note designed, but not yet made - while I could write such a piece of software, it also involves money and I am not good enough at making it secure against exploits or leaks; I am good at problem solving and the deep functional design of software, however).
      Essentially, when Louis talked about that platform for subscribing to creators instead of platforms, I also figured out one of the biggest problems surrounding monetization, as well as a relatively easy but very effective fix.
      While I won't describe it here publicly, to prevent some big corrupt corporation from seeing it and cloning an evil version of it (as it needs to be Free Open Source), I can explain it more directly, optimally encrypted and/or on platforms where not everything you say is directly sold to data brokers.
      But essentially, for getting money from videos and other content you make, it is great, especially for the people who actually make good stuff instead of those generic shitty low-effort things just designed to be pushed by the algorithm.
      That said, it still needs some form of developer/fame team, since many such things, especially if they are free open source or follow those principles (it is designed to also help free open source a lot), are easily cloned by big corporations which then control the market. So it needs to be known well enough that people will see it as the standard option; while not strictly required, that is optimal, since otherwise people might end up using corrupted, rigged alternatives which bring back the same problems, such as demonetization.
      Essentially it is meant to let users decide what they like and don't like, and how much they like, value or respect something, without burdening them with having to think about any of that or do the other difficult things.

    • @PlatimaTinkers
      @PlatimaTinkers  6 months ago +1

      @@ted_van_loon yeah it seems you can't really win in this world unless you DIY or use something bespoke. But so be it. I'm happy just plodding along as I am, and make no money from this yet anyway so am not too fussed.

  • @-iIIiiiiiIiiiiIIIiiIi-
    @-iIIiiiiiIiiiiIIIiiIi- 4 months ago

    23:57 Alcoholic at such a young age. Maybe a dingo stole his PC when he was a wee lad, I reckon.

    • @PlatimaTinkers
      @PlatimaTinkers  4 months ago

      Alcoholic is a strong word! I enjoy it, but do not depend on it, nor sacrifice anything for it 😊
      That being said, yes, a dingo stole my first ever Seagate 80GB IDE HDD that I saved for months for. It was crafty like a fox too

  • @SPROUTCAST
    @SPROUTCAST 3 months ago

    Text to speech?

    • @PlatimaTinkers
      @PlatimaTinkers  3 months ago +1

      Yeah mate easy. You barely need any TOPS for that sort of processing

  • @dieselphiend
    @dieselphiend 6 months ago +1

    Sever. ha.