I ONLY Used an Intel GPU for 30 DAYS.... Here's What I Learned

  • Published Sep 5, 2024

COMMENTS • 680

  • @IntelGraphics
    @IntelGraphics 1 year ago +590

    This was awesome! Great work and thanks for all the feedback and insights into what you experienced. +Subscribed!

    • @tsubasawolfy
      @tsubasawolfy 1 year ago +65

      Wow the official channel!

    • @IntelGraphics
      @IntelGraphics 1 year ago +89

      @@tsubasawolfy That's us!

    • @RedRandy
      @RedRandy 1 year ago +23

      Love you intel ❤

    • @notinterested8452
      @notinterested8452 1 year ago +9

      @@IntelGraphics Nice to see your comment.

    • @PumpyGT
      @PumpyGT 1 year ago +4

      Really? I was expecting your channel to have more subscribers and a checkmark

  • @nick-dogg
    @nick-dogg 1 year ago +280

    I can’t wait till they really refine their drivers. Their upcoming iGPUs seem promising.

    • @cyko5950
      @cyko5950 1 year ago +62

      Intel vs AMD competition in the iGPU space would be insane.
      They might accidentally end up making APUs so powerful that low-end GPUs become pointless.

    • @nick-dogg
      @nick-dogg 1 year ago +11

      @@cyko5950 It would be insane. All we need now is for intel to step up their drivers =D

    • @SPG25
      @SPG25 1 year ago +7

      The drivers have already come a long way, it's only going to get better.

    • @kotztotz3530
      @kotztotz3530 1 year ago

      Here's to hoping Meteor Lake is as good as the rumors.

    • @kuronoch.1441
      @kuronoch.1441 1 year ago +1

      @@kotztotz3530 Dunno about the CPU, but the GPU data was based on interpolation against current Arc GPUs, with the drivers as of that time. Given that the iGPU is an Alchemist refresh, it will only go up from there; it might even approach a desktop 3050.

  • @raresmacovei8382
    @raresmacovei8382 1 year ago +394

    Linus: We're doing the 30 day AMD GPU challenge.
    Also Linus: Yeah, we won't be doing a video on this. We forgot we had the GPUs installed. Everything ran fine. We don't need to tell anyone.

    • @Metamine0
      @Metamine0 1 year ago +45

      Oh yeah, they never released a part 2

    • @Michael-wo6ld
      @Michael-wo6ld 1 year ago +17

      Linus isn't having issues, I think Luke did and took it out though. They talked about it on the last WAN show

    • @jgvtc559
      @jgvtc559 1 year ago +16

      @@Michael-wo6ld 😂
      Who's the long-haired, heavyset guy from LTT?
      He's the only person from that page I even bother listening to about anything; even if I'm not interested, I hear what homie has to say about whatever he's presenting

    • @junoperberry
      @junoperberry 1 year ago +1

      @@jgvtc559 I think his name is Allen or something

    • @nathanlarson6535
      @nathanlarson6535 1 year ago +9

      @@jgvtc559 Anthony?

  • @theftking
    @theftking 1 year ago +65

    The Fortnite results make me think Intel might be optimizing hard for UE5, knowing nearly every game will use it going forward. Plus it's a DX12 game which Arc does best with.

    • @goldenhate6649
      @goldenhate6649 1 year ago +10

      I will say one thing about UE5, if done right, holy fuck does it run better than UE4

    • @FalseF4CZ
      @FalseF4CZ 1 year ago +6

      @@goldenhate6649 too bad the devs who made Remnant 2 didn’t do it right

    • @V3RAC1TY
      @V3RAC1TY 1 year ago

      @@FalseF4CZ the first game used UE4 and also didn't run well at launch, though still not as bad as Remnant 2 does now

    • @dreawmy2912
      @dreawmy2912 4 months ago

      @@FalseF4CZ lmao

  • @wjack4728
    @wjack4728 1 year ago +167

    I've got the Intel Arc a750, and really like it. I had big problems with the drivers when I first got it about 4 months ago, but with the latest drivers it's running like a champ now. I do very little gaming, and use the a750 mainly for normal daily use and encoding videos, and for that it's a great GPU for me.

    • @alderwield9636
      @alderwield9636 1 year ago +6

      Yea av1 is a champ

    • @johnbernhardtsen3008
      @johnbernhardtsen3008 1 year ago +1

      I got the A750 and it's crap with my R7 5700X CPU! No iGPU on the CPU, and I bought the A750 for Premiere Pro usage! Now I have to build a nice i5-13500 system and use the A750 on that PC!!

    • @vitordelima
      @vitordelima 1 year ago +5

      I wonder if wasting time supporting all those legacy APIs (DX9-12, OpenGL, Voodoo, OpenGL ES, ...) in every driver for every GPU is a good idea, instead of just investing the effort into Vulkan translation layers for everything (I think the Steam Deck already uses something similar).
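      (For anyone curious how that looks in practice: a minimal sketch, assuming a 64-bit DX11 game and a DXVK release unpacked to ./dxvk; both paths are hypothetical. Dropping the translation DLLs next to the game's executable is all it takes:

        cp dxvk/x64/d3d11.dll dxvk/x64/dxgi.dll "/path/to/Game/"

      The game's D3D11 calls then go through Vulkan on the next launch. Proton on the Steam Deck ships DXVK by default, which is the "something similar" mentioned above.)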

    • @jsteinman
      @jsteinman 1 year ago +4

      Same! I really like the A750. On install the card was horrible; everything was corrupted, even Edge. After a BIOS update and the latest drivers, it’s near perfect

    • @dnsjtoh
      @dnsjtoh 1 year ago

      @@johnbernhardtsen3008 Huh? The only difference between using an Intel CPU or an AMD CPU with Arc is in productivity such as video editing. Deep Link isn’t useful for gaming.

  • @__aceofspades
    @__aceofspades 1 year ago +121

    I bought an A750 and have loved the experience. $200 beats the more expensive 6600XT and 3060, and has more features like AV1, XeSS, XMX cores, better encoding, etc. Highly recommend Intel Arc

    • @parthjain5248
      @parthjain5248 1 year ago +8

      What are your thoughts now? With games and video rendering? Plus heavy multitasking, like gaming while streaming or rendering?

    • @HardWhereHero
      @HardWhereHero 9 months ago +10

      @John-PaulHunt-wy7lf I wanna see Nvidia burn. I use their GPUs but don't want to. I am getting a 16GB 770

    • @johndc2998
      @johndc2998 8 months ago

      The 6700 XT is a closer comparison

    • @82_930
      @82_930 8 months ago

      @@johndc2998 not anymore, Arc has come such a long way from launch and it’s comparable to a 3070/6750 XT or even a 6800 non-XT now

    • @thespianmask8451
      @thespianmask8451 8 months ago +1

      @@HardWhereHero Seriously. Screw Nvidia. I just upgraded from a 3060Ti to an RX 7800 XT. Totally worth it.

  • @QuentinStephens
    @QuentinStephens 1 year ago +114

    With regards to your game recording issues, it most certainly does happen on Nvidia GPUs. The Twitch streamer Nethesem experienced it when streaming Jedi Survivor with his 3080 Ti. He fixed it by adding a second GPU (a GTX 1060 IIRC) to handle the recording and streaming.
    I do hope you have yet to return the A770 because you're missing a treat if you don't try it at 4k medium settings.
    FWIW I have an A770 and a RTX 4090 (and a 3090).

    • @Micromation
      @Micromation 1 year ago +26

      If one just needs a 2nd GPU for recording and streaming, the A380 will be a cracking deal.

    • @QuentinStephens
      @QuentinStephens 1 year ago +8

      @@Micromation The trouble with the A380 is that the last time I checked Twitch did not support AV1 streaming.

    • @zackzeed
      @zackzeed 1 year ago

      Sorry for the dumb question, but can you use a second GPU for recording? I mean, I'm aware of capture cards and such, but I've never used one, so my knowledge is limited.

    • @Micromation
      @Micromation 1 year ago +6

      @@zackzeed it's not for capturing from an external device; it's for doing the processing on the same one.

    • @Micromation
      @Micromation 1 year ago +4

      @@QuentinStephens "will be" - I personally wouldn't pay more than $50 for it. Not when I've just grabbed a 3060 12GB for Blender rendering for $180 🤷🏻

  • @jforce321
    @jforce321 1 year ago +74

    You know Battlemage is going to be a big deal when there are rumors that Nvidia is pressuring their partners not to make cards for them

    • @PeterParker-status
      @PeterParker-status 1 year ago +19

      They did the same to AMD, but public outcry made them U-turn; people just don't remember. So it doesn't really mean anything

    • @theonelad3028
      @theonelad3028 1 year ago +6

      @@PeterParker-status they are pushing for a full block on partners working with Intel. From memory, they just stopped partners from using the same coolers on AMD and Nvidia cards. Still anti-consumer, but not quite the same, and it really does mean something. Nvidia is becoming even more blatantly anti-consumer than ever before, alongside AMD doing similar shit. It's just a crap time to be a PC enjoyer

    • @OG_Gamers_Live
      @OG_Gamers_Live 1 year ago +1

      and every AIB involved should tell Nvidia, "Well, hate to see you go, call us if you change your mind," and go on about their day. I mean, Nvidia screwed every one of them HARD when they started manufacturing their own FE cards, and that's just for starters.

    • @ChrisStoneinator
      @ChrisStoneinator 1 year ago

      I support Intel in this endeavour and have no love for nVidia, but that's not necessarily an indicator. As Mark Knopfler said, if they gon' drown, put a hose in their mouth.

  • @noelchristie7669
    @noelchristie7669 1 year ago +36

    Been using my A770 for just about everything since I bought it in December. Only game I've ever had a problem with is emulating totk, hard crashes after some time. Been using the card between my 4k tv and my 1080p monitor. Loving it fr, it is a great upgrade from my GTX 1060! Definitely agree that there are better gaming cards though, I do a lot more than gaming, so I can't complain!

  • @Kitsoonaye
    @Kitsoonaye 8 months ago +2

    I have an Intel GPU as well. Games always get random stutters (Example: 14:20) and eventually after a few seconds, the game you're playing will crash. It is really annoying.

  • @guilhermedecarvalhofernand1629
    @guilhermedecarvalhofernand1629 1 year ago +30

    My main issue with getting an Arc card is whether they are going to stay in this segment. We gotta remember this is not the first time Intel has made GPUs. If Battlemage actually comes, then we can have a talk. The worst would be buying it and having it get discontinued, especially if you live in third-world countries where the entry-level GPU is already expensive as fuck. I live in Brazil btw.

    • @Bajicoy
      @Bajicoy 1 year ago

      Intel quitting on NUCs has me pretty concerned for their future

    • @Konkov
      @Konkov 1 year ago +4

      they've already invested billions of dollars into these GPUs and have had great sales so far, it wouldn't make sense to back out now

    • @OG_Gamers_Live
      @OG_Gamers_Live 1 year ago

      Pretty sure the new Battlemage cards were confirmed as coming in 2024 at a recent Intel investors meeting.

  • @deneguil-1618
    @deneguil-1618 1 year ago +5

    Quick point about the shader compilation part in TLOU: having written shaders myself (OpenGL for a simple 3D renderer), I can guarantee you the CPU is the one doing the compilation; the compiled shaders are then run on the GPU when needed in games

  • @johnnytex1040
    @johnnytex1040 1 year ago +13

    Edit: it's been fixed. The Halo Infinite situation is genuinely upsetting, because before season 4, when Halo broke compatibility with Arc, it actually performed really, really well. On the biggest Big Team Battle map I was getting frame rates north of 110 FPS; that's more than what my friend can get with his 6700 XT.

    • @keiko6125
      @keiko6125 1 year ago +2

      I'm glad I'm not the only person having issues with halo infinite. I hope it gets fixed soon, I'm really liking this intel arc GPU

    • @johnnytex1040
      @johnnytex1040 1 year ago +4

      @@keiko6125 343 has told me they're aware of the bug, but their own website has now been updated to say Arc isn't currently supported, and I've not been given a time-frame as to when any fix will come. Rest assured, Intel has also been made aware, and they're seeing if there's anything that could be done on their end as a sort of ad hoc fix. Here's hoping that's possible.

  • @magnusnilsson9792
    @magnusnilsson9792 1 year ago +21

    A770 16GB vs 4060 Ti 16GB on PCIe 3 would be interesting; so would overclocking the A750 vs the 7600 vs the 4060.

  • @SudeeptoDutta
    @SudeeptoDutta 1 year ago +21

    After a month's research, I went with *AM4 5600X + 6700 XT* PC Build. This is my first PC build. I was very tempted by Intel's 1st gen GPUs given their _on paper_ price to performance ratio.
    Alongside new games, I also wanted to play a lot of DX9 games which I missed out on the last decade or so. So it was sad to see Intel's performance on DX9 or generally older games.
    I think Battlemage will be great, and it's a great thing that once I upgrade my GPU in the future, I'll have a lot of options, including Intel's offerings.

    • @Tainted79
      @Tainted79 1 year ago +5

      Good combo, I have the same. Would add that the 6700 XT outperforms the competition in emulators as well.
      I used the Adrenalin software to auto-overclock my 5600X to 4.85GHz. Can't believe how cool it runs. And after 15 years, AMD got me to replace MSI Afterburner.

    • @ARTEMEX
      @ARTEMEX 1 year ago +1

      ua-cam.com/video/5oVD6yziNDY/v-deo.html check out this test. It covers a huge number of games with DX9-11 APIs.
      P.S. Arc runs well in old games. Don't believe the myths. Three games crashing doesn't mean all the rest won't work

    • @82_930
      @82_930 1 year ago +2

      @@Tainted79 you can push a 5600X all the way up to 5.2GHz on all cores if you got decent silicon. Mine had amazing silicon and I achieved a stable 5.4GHz @ 1.34 volts until I upgraded to a 5800X3D about a year later; the 0.1% lows were like 30% better @ 5.4GHz, but it drew like 80% more power

    • @RobBCactive
      @RobBCactive 1 year ago +4

      That's a good choice! There have been performance problems with just 8GB, and no matter what this video says, the established benchmarkers covering a lot of games show the Intel card competing with the 3060, not really the 6700 XT, on performance and value. As for ray tracing, he says himself that it needs turning off, which is also true for the 4060 Nvidia cards.

    • @SudeeptoDutta
      @SudeeptoDutta 1 year ago +2

      @@Tainted79 Yes Emulation was also a big reason I went AMD. I also dual boot Linux ( haven't done so yet as my PC is just 2 weeks old :D ) & AMD drivers are generally more stable on Linux as well.

  • @AndersHass
    @AndersHass 1 year ago +10

    I would say both XeSS and FSR have taken an open approach; they are both open source and can work on hardware other than their own. It would be interesting if XeSS could use the dedicated AI cores in AMD and Nvidia GPUs instead of the lower-performing method it uses on them now.

  • @bhoustonjr
    @bhoustonjr 1 year ago +8

    If you use Arc with Intel onboard graphics, you can get a 40% increase in encoding through Deep Link Hyper Encode. That is one of the reasons I went with Intel.

  • @thr3ddy
    @thr3ddy 1 year ago +5

    I was a day-one adopter of the A770 (the same one you tested). Things were rough until a few months ago, but now it's smooth sailing on Windows and Linux. The main issue I've had is soft locks when really pushing the video card; I think it's overheating-related, since this card runs very hot to the touch. Also, I play CS:GO competitively, and when I run it without a frame cap I get over 1000 FPS, which seems to cause some hitching. I have to limit it to something more reasonable, like 500 FPS.
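    (A minimal sketch of that frame cap, for anyone who wants to replicate it; in CS:GO it is a single console cvar, assuming the developer console is enabled:

      fps_max 500

    Setting fps_max 0 removes the cap again.)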

  • @ElDaniWar
    @ElDaniWar 1 year ago +4

    Bro, keep making these great videos. I've seen several YouTubers back when they started with less than 10k subs, and now they're at 3M+ subs, and you give me that same feeling of greatness. Keep it up and you'll go far :)

  • @defaultdan7923
    @defaultdan7923 1 year ago +6

    been using an a750 (specifically for gaming) and it has been running like a champ. i don’t think i’ve had a game crash on me yet at all, and only a few run poorly.

  • @larrythehedgehog
    @larrythehedgehog 1 year ago +7

    Hey guys, it's that crazy guy with the Sonic icon who won't shut up about Arc again.
    The rumor going around is that Nvidia is preventing AIBs from working with Intel; they're blocking partners from making cards with both them and Intel. If you care about cheap gaming parts, you should reach out to Nvidia and tell them to shove it, so Battlemage will have more AIB partners than just Acer. Have a good day!

    • @J0ttaD
      @J0ttaD 1 year ago +2

      Why do they need AIBs? I like the default cooler better. They don't need 'em

    • @maddymankepovich1796
      @maddymankepovich1796 1 year ago +1

      ironic that intel is the victim now

    • @larrythehedgehog
      @larrythehedgehog 1 year ago +5

      @@J0ttaD they need AIBs because AIBs push cards further than Intel ever will. AIBs are what let Kingpin become a force to be reckoned with at EVGA. And AIBs help with competition overall simply by existing.

    • @J0ttaD
      @J0ttaD 1 year ago

      @@larrythehedgehog Hmm

    • @redslate
      @redslate 1 year ago +3

      @@J0ttaD Board partners allow chip manufacturers to meet volume demands. While not necessarily a breaking point this early on, eventually a suitable supply chain will be necessary for success.

  • @RPGWiZaRD
    @RPGWiZaRD 1 year ago +5

    Bought an A750 when it was on sale. I had been looking forever to upgrade my 1070 Ti to something newer, but was waiting for prices to come down, which never happened, and Intel was the only option I saw where price vs performance met my expectations. I'm glad I did: it's been working with surprisingly few issues, and I especially love the constant driver updates that tend to improve random games quite significantly in some cases. I'm looking forward to Battlemage now and will probably pick up the top SKU this time; the A750 was a bit more of an evaluation purchase, and now that Intel has won me over I'll happily pay a bit more next time.

    • @kolyagreen1566
      @kolyagreen1566 11 months ago

      Are you sure about your “upgrade”? I mean, the 1070 Ti and A750 are almost equal.

    • @EuphoricHardStyleZ
      @EuphoricHardStyleZ 11 months ago

      @@kolyagreen1566 definitely not in my experience. I did actual benchmarks; they showed between 25-75% improvements over the 1070 Ti depending on the game and what we compare (minimum, average or max FPS). In the 25% case I was more CPU-limited on max FPS in FC6, where minimum FPS again showed a 49-58% difference. Kingdom Come: Deliverance showed the biggest gain, a 70~75% difference.

    • @kolyagreen1566
      @kolyagreen1566 11 months ago

      @@EuphoricHardStyleZ I just saw tests and it's not inspiring. 25% is a really small difference (okay, 50%) for a GPU upgrade, considering that the 1070 Ti is from 2017, okay 2018. It's a marginal upgrade and not worth the money. But still, it can be an okay upgrade if you're using an old motherboard, CPU and so on.

    • @EuphoricHardStyleZ
      @EuphoricHardStyleZ 11 months ago +1

      @@kolyagreen1566 25% was a CPU-limited case scenario; I just mentioned it to not sound biased. The typical difference I saw in my tests was between 45~65%, which to me equals a great generational shift from one Nvidia card to another. I used to make those kinds of upgrades all the time in the past, before prices went crazy and I stopped upgrading every 1½~2 years. I guess what counts as a "worthy upgrade" is subjective, but to me it definitely was; not to mention, with the price the old 1070 Ti sold for, this was the cheapest GPU upgrade I've personally done: I bought the A750 for 269€ and sold the 1070 Ti, which was OLD, from Nov 2018, for 120€. For me it's all about performance vs price. I've been an Nvidia customer my whole life (and had lots of great bang-for-buck cards such as the GF4 Ti 4200, 7900 GTO and 8800 GT, to mention a few) until I got the Intel card, and I've probably had 20 or so Nvidia GPUs, but I'm done with them at current pricing; neither AMD nor Nvidia offers what I see as reasonable pricing atm.

    • @kolyagreen1566
      @kolyagreen1566 11 months ago

      @@EuphoricHardStyleZ It's like “upgrading” from a 4070 to a 4070 Ti :D Okay, I got you bro! Thanks for sharing your experience with the Intel card. Maybe I will consider an Intel card in the future) Sounds cool

  • @mrbabyhugh
    @mrbabyhugh 3 months ago +1

    5:30 the main thing is what you said at the beginning: these Arc cards are NOT made dedicated to gaming, they are productivity cards, and this is why, out of the gate, they performed better in production software (Adobe, Blender, not so much DaVinci) than they did in games. But as the drivers update, we see Arc catching up in games too. Battlemage is where the FUN will begin in the GPU industry!

  • @OG_Gamers_Live
    @OG_Gamers_Live 1 year ago +6

    I've actually had issues with DaVinci Resolve Studio and some recording/streaming with Nvidia this year on my RTX 3090 Ti FE card, but it usually happens when we get a new driver update for a new 40-series card. FYI, you should see a LOT better performance on a system that has Resizable BAR enabled. I'll let you know when my Arc A770 OC arrives next week; my current system is running an AMD Ryzen 5900X with Resizable BAR enabled. And your computer case looks an awful lot like a Cooler Master case I have an older build in :P

    • @perlichtman1562
      @perlichtman1562 1 year ago

      I’d be curious to hear about your experiences. I have been using an RTX 3060 12GB with the free version of DaVinci Resolve and Adobe Premiere Pro 23 and 24 Beta. I’m trying to decide between getting a license for Resolve Studio or moving to an A770 16GB (similar pricing), or possibly another video card, so your experiences comparing the A770 with the newest drivers to a 3090 Ti would be extremely helpful.
      Any chance of running the free PugetBench benchmark for Resolve Studio on each? It’s been tricky finding benchmark results using the same system and just swapping the GPU between those cards.

  • @johnpaulbacon8320
    @johnpaulbacon8320 1 year ago +2

    Great video. Thanks for taking the time to make this and do such extensive testing.

  • @VicViper2005
    @VicViper2005 8 months ago +1

    It’s phenomenal how far Intel graphics have come. Like 10 years ago you couldn’t even get 30 FPS on Intel graphics, but now on Arc you can get FPS in the 100s

  • @badmoose01
    @badmoose01 1 year ago +4

    It’s interesting that you chose the RX 6700 non-XT for this comparison, because that is the desktop GPU most analogous to the GPU in the PS5.

  • @drp111
    @drp111 1 year ago +2

    I own a bunch of these and use them in servers for industrial imaging. They perform well, and compatibility with serious applications is also good out of the box.

  • @nickritz
    @nickritz 1 year ago +6

    The A770 is now running pretty close to the 4060; the A770 is within 2% and occasionally outperforms the 4060 in some games. The only issue I’ve run into with my Predator BiFrost A770 is that it doesn’t let me change my system refresh rate to max out my monitor (making a custom refresh rate); it’s just not as clean a process as the Nvidia way of making a custom resolution. I have a 165Hz monitor and Windows only lets me go to 144Hz. I got a custom piece of software to make a custom one, but I was only able to push it up to 155Hz for some reason. I quit trying to get those 10Hz more because, tbh, I was satisfied with 155Hz for the most part. Overall I haven’t experienced any bugs or glitches with the card. On mostly medium settings with epic view distance in Fortnite, I’m getting crazy FPS, up into the 350s, usually resting around 290 FPS on average when looking into the distance. Overall a pretty stable experience. Also, this is the only GPU that costs $350 or less that offers decent RGB lighting (Predator BiFrost A770 OC by Acer)

    • @AshenTech
      @AshenTech 1 year ago +1

      HDMI afaik is limited to 144Hz; with DisplayPort you may need to install the monitor driver. Once I did that, mine defaults to 165Hz 10-bit rather than 60Hz 8-bit... same issue I have with Nvidia, so... probably Windows more than the GPU drivers, honestly...

    • @OG_Gamers_Live
      @OG_Gamers_Live 1 year ago +1

      and you can get the ASRock Phantom Gaming Arc A770 16GB with RGB on Newegg right now for $309.99 plus tax, with the current $10.00 discount applied

  • @grimwald
    @grimwald 1 year ago +6

    I don't see Nvidia keeping 76% of the market share honestly, especially with how poorly the 4000 series is doing. The real question is who will gain the most out of that - Intel or AMD?

  • @mr.guillotine3766
    @mr.guillotine3766 1 year ago +2

    I'm hoping they get those bugs sorted out. I do need to upgrade, especially since I'm disabled and the only way I have to make enough $ to buy groceries is through using my computer for certain things (not really GPU related most of the time, but I've done some freelance photography/editing here and there), so I need to keep my computer up and running, and it's currently running a 1650 Super. I've considered Intel, but I need whatever I buy to just work. Even though I can troubleshoot a bit, I often lack the energy to do so...my pain levels put me in a place where I don't have the patience to sit here and force my computer to do the job it should just do automatically, so consistent bugs are a dealbreaker for me. Still, I like how much they have improved and if they can get most of it sorted out by Battlemage, then maybe I can consider one then...maybe I'll just find a cheap 6650xt that will be "good enough" for 2 or 3 years and see if any of these companies will care about consumers by then and release a decent card at a reasonable price? idk..thanks for the video.

  • @Ben24-7
    @Ben24-7 1 year ago +1

    Nice video; nice calm and relaxed explanations of your experience with the A770, unlike a lot of other videos that are like watching the auctioneer on Storage Wars, waffling on at 100 miles an hour haha.

  • @anasevi9456
    @anasevi9456 1 year ago +2

    Great video. My secondary machine has an A770 16GB and is used as a typical single-display, gaming-only machine. One driver didn't like indie Unity engine games (crashes), but besides a few visual bugs last year, that's it for issues. Played countless dozens of games on it with zero issues; old games with DXVK also run great. XeSS is functionally the same as DLSS2 when used on Arc GPUs - I daily an RTX 3080, so I know. If considering Arc (performance has improved A LOT even in the last 6 months): 1. Motherboard under 6 years old. 2. BIOS under 2 years old. 3. Modern monitor too.
    Going by support threads, most people having issues are trying to use pro software, or have thrift-shop monitors and/or old DDR3-era computers.

  • @zedorda1337
    @zedorda1337 1 year ago +18

    I cannot express just how happy and excited I was when I heard Intel was getting into the gaming GPU space. It was a long time coming, and I had been expecting it for over a decade. They will finally become the third leg of a truly competitive market.

  • @marxmaiale9981
    @marxmaiale9981 1 year ago +3

    Have been using an A380 since launch. The November driver updates fixed the last of my issues in daily use, and a more recent update fixed the annoying driver console launch needing administrator permissions.
    What will make or break Arc is 3rd-party software support. Programs that exclusively use Nvidia/CUDA for speed enhancements will continue to perpetuate Nvidia dominance in the productivity arena. I expect more support for Arc features once the tGPU elements start reaching developers in Meteor Lake laptops. The abundance of availability is a great incentive to incorporate features.

  • @spindragonkiller
    @spindragonkiller 1 year ago +1

    thanks Vex for the insight, time, and dedication. Followed your vids on AV1 for a while, and now this moment, and subbed! As a proud owner of an A750 I salute you. 😊 And Brian for lending the A770 for testing!
    NB: I recorded & exported roughly 5 hours of 1080p 60Hz AV1 gameplay at 6K bitrate in around 2 hours. ❤ Not sure if that's too long, but pretty decent to me.

  • @igavinwood
    @igavinwood 1 year ago +7

    Good job working through an entire month and recording your experience, Vex.
    I'd argue that AMD is actually providing the greatest competitive push on upscaling with their open-source approach, as it can be used by every GPU, meaning that the proprietary software of Nvidia and Intel has to be maintained and developed to stay meaningful.
    So if you think the GPU market is bad now, imagine how much worse it would be without AMD pushing an upscaler that can be used by everyone. That's why I don't like all the "dumping" on FSR: not because of its performance, or lack thereof, but because of the pressure it puts on the proprietary software to stay "honest", and the option for it to be developed outside of AMD and across all platforms of games because it's open source. Of course, AMD could also go proprietary with FSR and tie it to their CPU development. I'm glad they haven't, but if people keep dumping on FSR without seeing the bigger picture, I can see a point where they say screw it, let's bring it in-house and tie it down as proprietary software, as we're not getting any marketing benefit from keeping it open source.
    As for the blocking of other upscalers in their sponsored game titles, I haven't seen any developer come out saying that's true, but I've seen reports where some say it isn't. Poor communication as always with AMD. Let's see how true or not it may be. At least they are not doing another Nvidia manipulation like the AIB partnership programme v2 shenanigans.
    What always baffled me with Arc was that Intel has been shipping DX drivers for its iGPUs for ages, so how could they screw it up so badly at the start?

    • @phoenixrising4995
      @phoenixrising4995 1 year ago +6

      XeSS is also open source; Ngreedia's DLSS isn't, but at least they're not locking out other tech like AMD is. Seems like both AMD and Nvidia are going the same crap direction in the value-priced GPU market (sub-$500 at this point). I hope Intel can solidify as a player in the GPU space; otherwise, get ready for more 4060s or 4060 Tis, but $150 more expensive at launch.

    • @vishnuviswanathan4763
      @vishnuviswanathan4763 1 year ago

      when people say "FSR looks bad" - yeah, of course it does. It doesn't have dedicated hardware and is nothing but black magic; sometimes it comes close to DLSS

    • @phoenixrising4995
      @phoenixrising4995 1 year ago +2

      @@vishnuviswanathan4763 FSR is just a slightly more advanced version of fractional upscaling. Nothing more, nothing less.

    • @Mcnooblet
      @Mcnooblet 1 year ago

      @@vishnuviswanathan4763 Which is why it can be open source. Ngreedia somehow gets crapped on for hardware acceleration, which somehow makes them "greedy" since it can't be open source, as it isn't purely software-based. Nvidia with tensor cores = bad to many people; make it work on AMD with no tensor cores!

  • @dracer35
    @dracer35 1 year ago +4

    I bought the A770 LE 16GB at launch just because I was curious and wanted to play around with it for fun, and personally I think it's the best-looking GPU on the market right now. Since then I've swapped it in and out a couple of times. The very first drivers were pretty bad, but I noticed it getting better and better with the newer drivers. I last pulled my 4090 (which I bought used for $200 under MSRP) out and dropped in the A770 about a month ago. Obviously there is a huge performance difference in the numbers, but in reality it has been doing what I need it to do, and I haven't felt like I needed to go back to my 4090 to have a good time. That being said, it's worthwhile to note that I don't use my GPU for anything other than gaming, so I can't attest to streaming or editing workloads. Running triple monitors: two 32" 1440p 144Hz monitors and a third 1080p monitor for watching casual videos while I'm gaming. I like my A770 and I plan to try Battlemage when it comes out too.

    • @johndc2998
      @johndc2998 8 months ago

      Should I buy an A770 or wait for Battlemage? Or just get a 6700 XT on sale for $100 less than the A770?

    • @dracer35
      @dracer35 8 months ago

      @@johndc2998 As much as I like the A770 as an enthusiast, if you just want something to play games right now, I'd probably recommend the 6700 XT if it's really $100 less than the A770. I know I will be buying Battlemage, but I buy them because I enjoy playing with different hardware for fun.

  • @Dangeruss
    @Dangeruss 1 year ago +7

    I can't wait to see what Battlemage brings to the table. I'm definitely switching to Intel. I want Intel to do well. I've always wanted an all Intel PC!

    • @VintageCR
      @VintageCR 1 year ago +1

      Battlemage's "entry" level card is going to have close to 4080 performance; entry level meaning their lower-end Battlemage card.
      The price is going to be around 450 to 550.

    • @Dangeruss
      @Dangeruss 1 year ago

      @@VintageCR Are you serious? I've been holding out on a new build for years! I'm still on my Intel 5960X and 1080 Ti.

    • @horus666tube
      @horus666tube 1 year ago +1

      @@VintageCR I haven't been big into tech news for quite some time already, but what you are stating there sounds like dreaming... :D especially for that price range

  • @RedRandy
    @RedRandy 1 year ago +2

    Thx for the 30-day experience, man. Intel has been doing great with their GPU drivers. Hopefully they can be the next big bang-for-the-buck GPUs

  • @zalin2000
    @zalin2000 1 year ago +1

    I built a new high-end gaming PC 6 months ago. I sank so much money into it that I got tired of spending, so I bought an A770 to use until I bought a 4090. I've got my 4090 coming today, but I used the Arc for about 6 or 7 months and I loved it. The card was great for the price point! And I would have used it for much longer if it were able to handle my 240Hz ultrawide screen. I love the design and I will keep it around because I love this little champ.

  • @flintfrommother3gaming
    @flintfrommother3gaming 1 year ago +3

    The Arc A770 has 16GB of VRAM, at least the Limited Edition one, which I think will still be made by third-party manufacturers (unless Nvidia forces them to stop creating more Arc GPUs, ALLEGEDLY). The Arc A750 has 8GB.
    The choice is not so black and white: that extra 8GB of VRAM in the Arc A770 is something to consider if you play AAA games nowadays.
    I personally lean more toward the Arc A770, since they basically give out the extra VRAM for free, considering what Nvidia did with the RTX 4060 Ti.

    • @gorrumKnight
      @gorrumKnight 1 year ago +2

      The extra VRAM is definitely nice. I bought the A770 & it's been solid af at 1440p. And that's with the Linux gaming tax.

  • @mRibbons
    @mRibbons 1 year ago +12

    Intel is gonna do what's called a "Ryzen move": solid in value, but lagging behind. Then close the distance and refine over time.

    • @redslate
      @redslate 1 year ago +1

      Nah. First Gen Ryzen was a flop. It wasn't until the Second Gen that Ryzen offered an alternative consideration. Intel's already offering a competitive product with their First Gen ARC GPUs.

    • @redslate
      @redslate 1 year ago

      @bon3d3th Objectively, it offered far less than Intel's offerings at comparable pricing. I agree that it helped lay the groundwork to eventually supersede Intel in later Gens.

    • @mirceastan4618
      @mirceastan4618 1 year ago +1

      @@redslate Not really, the Ryzen 1600 destroyed the i5 7400 at the same price, especially in terms of longevity.

    • @redslate
      @redslate 1 year ago

      @bon3d3th Short of early adopters and diehard AMD fans, once performance metrics were widely available, practically nobody wanted Ryzen Gen 1. Their products had to be deeply discounted to move units.
      It wasn't until Zen+ (Gen 2 and 3) that Ryzen began to become appealing. This really 'woke up' Intel, forcing them to resume R&D, as they were essentially just re-releasing Skylake year after year, with a stopgap of: add 2 more cores... okay, add two _more_ cores...
      -Skylake 6xxx
      -Skylake _II_ (Kaby Lake) 7xxx
      -Skylake with extra cores (Coffee Lake) 8xxx
      -Skylake with extra cores _II_ (Coffee Lake II) 9xxx
      -Skylake with _even_ _more_ cores (Comet Lake) 10xxx

    • @redslate
      @redslate 1 year ago

      @mirceastan4618 The 7400 ($182 MSRP) competed with the 1500X ($189 MSRP) at launch, not the 1600 ($220 MSRP), which was in a completely different product/price tier; again, lending itself to my point, "[AMD's Zen] products had to be deeply discounted to move units."

  • @DIvadxat
    @DIvadxat 1 year ago +2

    Good video. Intel definitely has an opportunity for gains in the GPU space, provided they continue the driver improvements and software updates.
    Specifically, I can confirm that both Halo MCC and Halo Infinite launch issues have been fixed with the latest beta driver update. Note that some games like Halo Infinite are poorly optimised by the developers, 343.

  • @anarchicnerd666
    @anarchicnerd666 1 year ago +5

    It's awesome to see Intel doing shockingly well on their first attempt at graphics cards! Unfortunately, it's not for me. I'm a games preservation enthusiast, and while DXVK is a lifesaver, it's a band-aid at best; it gets you more stability, but the 1% lows are still a problem. It's tempting tho, ESPECIALLY the encoder. I wonder if running an A380 just for the encoding alongside another GPU is worth it? Probably not, given the headaches with conflicting graphics drivers, but still cool to think about.
    Awesome video Vex :) Keep it up!

    • @brugj03
      @brugj03 1 year ago +1

      Shockingly well is something completely different in my book.
      Just acceptable, maybe, if you're not picky.

    • @anarchicnerd666
      @anarchicnerd666 1 year ago +6

      @@brugj03 AV1 encoding, superior ray tracing performance, machine learning based upscaling, a 256 bit bus and competitive value aren't enough for you? You're hard to please :P

    • @redslate
      @redslate 1 year ago +4

      @brugj03 As a competent first-gen product offering competitive price : performance with substantial regular improvements, "shockingly well" is a more than apt description.

    • @brugj03
      @brugj03 1 year ago +2

      @@anarchicnerd666 No.
      Unreliable in games, artifacts in games, different behaviour on different systems, terrible software support and unusable in productivity.
      And I'm not even being too picky.
      How about that.....

    • @brugj03
      @brugj03 1 year ago +1

      @@redslate If you say so......
      It seems you don't care about your gaming experience; cheap and good on paper is all that matters.
      I rest my case.

  • @MGTEKNS
    @MGTEKNS 1 year ago +2

    I was torn between getting the Intel Arc 770 and the 4060. I'm not using it to game, but rather to encode, decode, and transcode AV1 in programs like HandBrake as soon as it's supported (I know OBS already does). I want to transcode all my media, like movies and shows, to AV1 to save on hard drive space. I have a 3080 Ti for gaming, so the 8GB model of the 4060 doesn't bother me.
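    (A minimal sketch of that kind of AV1 transcode, assuming an FFmpeg build with Intel Quick Sync (QSV) support and hypothetical file names; HandBrake exposes the same hardware encoder in builds where it's supported:

      ffmpeg -i movie.mkv -c:v av1_qsv -global_quality 28 -c:a copy movie-av1.mkv

    Here -c:v av1_qsv selects Arc's hardware AV1 encoder, -global_quality sets the target quality (lower is better), and -c:a copy passes the audio through untouched.)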

  • @prince.1033
    @prince.1033 1 year ago +2

    5 min ago I watched your AMD GPU video & searched "vex Intel arc", & you had uploaded 10 min before 😂

  • @Daffmeister187
    @Daffmeister187 1 year ago

    15:15 if the screen goes black, use the shortcut 'Win + Ctrl + Shift + B' to reset the video driver instead of having to reset/power off.

  • @LewisConley
    @LewisConley 1 year ago

    15:38 This is totally because of Resizable BAR. I thought it was on because it had been for my RX 580, but the A770 required a BIOS update. Played games at 4K60 just fine without it, but literally could not record anything. Since updating the motherboard, it has been capturing the best 4K60 HDR I have ever seen

  • @maksiu11
    @maksiu11 1 year ago

    Glad somebody finally put together a video like this. Keeping my fingers crossed for the Intel GPU lineup

  • @FhtagnCthulhu
    @FhtagnCthulhu 1 year ago +1

    I've been running an A750 since shortly after launch, and it has been hit or miss for me. You see, I am running Linux (Fedora), and that has complicated things. When the cards were new, things were kind of tricky to get running, but that has been cleared up. Performance is good on Linux with the open-source drivers, and they are actively developed. My main issue has been that a key feature required to run some DX12 games (sparse residency) is not implemented. This makes an ever-growing list of modern games unrunnable. They are looking to sunset the current driver and move to a freshly written one that should support this feature, but that is still in development, and as far as I know that driver will not support HuC for DG2 cards.
    I got the card as a novelty, to tinker with, and because I cannot believe how much new AMD and Nvidia cards cost, so I am not really disappointed. However, it is not as clean a story as the improvements to the Windows drivers. Performance updates keep coming through.

  • @Nelthalin
    @Nelthalin 1 year ago +1

    That 980 Pro is also not stretching its legs on a B350 and will be limited in certain situations. My 990 Pro still performs better than the 970 Evo Plus in a 3.0 x4 slot, so it's useful, but the 4.0 slot makes the raw throughput 2x as fast. IOPS went up about 20-25%.
    Good to see Arc is doing quite ok. I wonder what Battlemage will do, and if they still dare to release a GPU the size of the A770.
    Because let's not forget the die size and power usage difference between the A770 and the RX 6700, or better yet its fully enabled RX 6700 XT brother:
    406mm2 vs only 335mm2! That's a major difference in production cost. And the RX 6700 XT and RX 6750 XT are quite a bit faster than the RX 6700.
    The RX 6700 uses about 15-20% less power than the A770, so in the long run it will be even cheaper because of the electricity bill.
    So yes, it's not bad, but there are some things that still need improvement.

  • @R3TR0J4N
    @R3TR0J4N 9 months ago

    i appreciate the journalism work documenting this, Vex! It means a lot for the PC community. This won't go unnoticed

  • @MrSmitheroons
    @MrSmitheroons 1 year ago +2

    Very interesting and thought provoking review!
    I think two things will matter the most for intel soon, and it's not this first gen... It'll be with Battlemage.
    - One, the performance has to be there. That is the majority of the beginning, middle and end of the conversation for GPUs. (Related: it has to be a decent price for the performance, but maybe that's obvious.) A new gen with new chips is a chance to change the performance numbers in a big way, and they should use it.
    (Sidebar: Maybe it's dumb since so few people buy top end, but I think they need a "halo" product to compete with 4070, 4080 for performance. Just to show they can. People naively or wishful-thinking transfer their feelings about the halo product onto the lower-end products. That's why they need it, IMO. It's irrational, but that's marketing for you.)
    - The other big thing is, they're still proving themselves for quality and eyes will be on them, whether fair or not (it's a new product line, I think it's justified). Quality needs to keep going up up up for the drivers and software ecosystem integration. If they can nail power efficiency, heat, low idle consumption, rock solid encoder smoothness, least crashes possible, all the software integrations like OBS, DaVinci, Premiere, etc, all the smaller things, that will cover more bases of "quality."
    Bonus for intel would be more XeSS adoption, since it seems to be keeping up okay with DLSS (not a trivial thing!!).
    Something to look out for: AMD has invested a lot in AI chips the past few years. Will they get an AI solution baked in and finally use it for a better upscaler??? Can they use it to enhance their encoders??? Catch up to NVidia????!?!?!? Time will tell for AMD's next (or next-next) gen...
    Lastly, on a bit of a tangent, something people aren't talking about much: the Arc (actually Xe) low-power variant is going to be the new iGPUs. Will Intel's iGPUs suddenly be competing with AMD's APUs? Outperforming them? Or does the lower wattage and die size (read: cut-down perf and features?) make them not exciting? The potential there is really, really huge IMO, but the potential for a letdown is equally big. It could just be a light bump over the UHD iGPUs. I'm personally very excited to see what happens with the iGPUs in Meteor Lake and then Arrow Lake (2024), Lunar Lake, etc...
    Lots to think about, and I hope it shakes up the GPU space for the better.

  • @Voltic_Playz
    @Voltic_Playz 11 months ago

    I bought my first PC with an i3-12100F + Arc A750 just 4 months ago, and the driver updates are making my editing flow so much faster!!! I was concerned about my decision, and whether I should've gone for the RTX 3060 since their drivers are more stable, but boy, I don't regret having this slick baby running with a stock cooler in this hot climate, rendering my 40GB project within 8 minutes!!! And btw, if you use AV1 it'll shrink the size to 10GB, but I love to go the casual OBS way.

  • @Alalpog
    @Alalpog 1 year ago +1

    Thank you for showing me what GPU to buy next.
    I know you're gonna be massive one day.
    Like really massive.

  • @narwhal4304
    @narwhal4304 1 year ago +4

    As a 6700 owner, I might look towards Battlemage as my next upgrade given RX 7000 and RTX 4000 don't offer good value. And honestly, I don't think RX 8000 and RTX 5000 will be any better given the GPU market right now.

    • @johndc2998
      @johndc2998 8 months ago

      Debating between a 6700 XT 12GB at $330, an Arc A770 16GB at $430, or waiting for Battlemage :/ What do you think? I've compared hardware specs but need real insight

  • @ElJosher
    @ElJosher 1 year ago +9

    I’d like to see emulation benchmarks compared to other GPUs.

    • @saricubra2867
      @saricubra2867 1 year ago

      Emulators are basically 100% CPU bottlenecked.

    • @Nicc93
      @Nicc93 1 year ago

      @@saricubra2867 check out SomeOrdinaryGamers' video where he turned a single 4090 into 32 GPUs. You can basically create multiple PCs in one, and each can have its own GPU

    • @Nicc93
      @Nicc93 1 year ago

      @@saricubra2867 there are some pretty heavy emulators out there, and virtualization takes it to the next level :)

    • @saricubra2867
      @saricubra2867 1 year ago

      @@Nicc93 I only have Intel UHD 770. Playing Smash Bros at 1080p with 8 level-9 CPU Ice Climbers, I have a 100% CPU bottleneck on one of my cores, but it's a perfect, locked 60fps with a flat 16.67ms frametime, not a single stutter; meanwhile a REAL Nintendo Switch tanks to 40fps.
      It's funny when techtubers talk crap about gaming fps and say nothing about frametimes.
      "4090 into 32 GPUs"
      That's nice, but if you use a 4090 for emulation, it's a huge waste of sand.
      Virtualization adds another massive layer of bottlenecks, or CPU overhead. A really BAD idea.
      A 4090 makes your mods and PC ports look better at absurd resolutions and settings, but for potato 2015 phone hardware like the Switch has, it's just dumb.

    • @saricubra2867
      @saricubra2867 Рік тому

      @@Nicc93 You can run every emulator with top notch perfomance with a Ryzen 9 7900X (finally, a proper Ryzen desktop gen with integrated graphics), Core i7-12700K as the minimum alone without dedicated graphics cards.
      Anything below will be worse specially for shader compilation (5900X kinda is the exception, but single core perfomance is worse by a significant margin like 25% or more).

  • @l.i.archer5379
    @l.i.archer5379 1 year ago

    I picked up the ASRock Phantom Gaming Arc A770 16GB graphics card for $299.99 (pre-tax) from MicroCenter a week ago. I played Halo MCC and Infinite on my 5800X PC, and both games ran flawlessly. The drivers are getting better and better, and I predict this 16GB card will be on par with an RTX 4070 in the near future, and at half the price. I can't wait to see what BattleMage brings.

  • @OlavAlexanderMjelde
    @OlavAlexanderMjelde 1 year ago +1

    The problem in Norway is that the 16GB 770 costs 595 USD.
    The 8GB version of the 770 costs like 350 USD; the 750 8GB also costs 350 USD.
    For the 16GB version's money you can get better cards from AMD and Nvidia in Norway.
    I was looking into building an Arc rig, but it's not priced right in Norway, I think.

    • @chev9632
      @chev9632 1 year ago

      It's the same problem I had in Brazil. I would jump from a GTX 980 to an Arc 770, but the price is not competitive.

  • @ScottJWaldron
    @ScottJWaldron 1 year ago +1

    Interesting, great job on the video! I'm probably not building a new system anytime soon, but it's nice to see there could be a third viable option by then, though any hard locks or maybe even blue screens can be concerning. With GPUs I'm especially interested in video encode and decode, which I do more than gaming. Still on a 1060 6GB, so I probably won't touch anything under 12GB whenever the time comes. 😄 We will also see if Battlemage has a good price-to-performance ratio like their first generation. Arc seems like a special case with its 256-bit memory bus and 16GB version.

  • @Felix00007
    @Felix00007 1 year ago +3

    4060 vs A770?

  • @PinkyTech
    @PinkyTech 11 months ago +1

    I recently switched to the A750 and I am really enjoying AV1 encoding for streaming and recording.

  • @Tyrelguitarist
    @Tyrelguitarist 1 year ago +3

    Thank you for your work. Your effort is paramount in getting the information to the people. Looking forward to the future of the GPU market; Battlemage is gonna change the game, and Nvidia is already allegedly mad about it. And thank you Brian for providing the product.

  • @snowyowlnugget
    @snowyowlnugget 9 months ago

    I believe that crash on 6/22 mentioned at 12:40 is a Windows thing. My Windows install used to do that a LOT before I wiped it, and what you seem to be describing is exactly what happens when explorer.exe crashes without being restarted: the taskbar goes away (I'm not sure what the background thing is about), but any windows you have open at the moment stay open, and you won't be able to do anything related to the taskbar until you restart.

  • @hw8991
    @hw8991 11 months ago

    Oddly enough, if you look at the upscaling comparisons in this video, FSR is closer to the source than XeSS here and there. Especially the vertical text on the building at 6:18 in the video, which is pretty much butchered by XeSS but not by FSR. Surprising to me. I have an A770, so obviously I want the hardware ML solution to be better, full stop. So there is work to be done here to improve XeSS, just like the rest of the software on top of the hardware. At the same time, though, DLSS also introduces artifacts here and there, even though they have had an enormous head start over the other teams. And the XeSS implementation, like DLSS, is subject to each individual game's implementation of it, so the results will land better or worse relative to a good-quality regular upscaler.
    Anyways, purchasing my A770 is still much more exciting for me than any other graphics card I've had, since it's the origin of a completely new product and the progression it has had and will have; I'm continuously looking forward to seeing the next advancement.

  • @TheDoomerBlox
    @TheDoomerBlox 1 year ago

    16:21 - holy smokes, you've had a decent experience with Arc on PCIe 3? That's crazy, man; that's the highlight of this video

  • @BluesFanUK_
    @BluesFanUK_ 1 year ago +3

    Intel could save their reputation by disrupting the GPU market after years of fleecing people in the CPU market.

  • @mikeclardy5689
    @mikeclardy5689 1 year ago +2

    Intel is starting to look good. It seems they targeted the low-end spectrum of GPUs to enter the market; if they fix their performance in some older games, they could easily take the king-of-the-low-end title. Because you are right: they packed a lot of good stuff into this card. It just needs better drivers to get it going at a steady pace.
    Battlemage is their actual serious GPU; I think Arc was a test run that they can still collect a lot of useful data from. I'm kinda excited to see it: 4080 performance at a price lower than the 7900 XTX... sounds amazing. It will crush me a bit considering I have a 4080... but hey, it means more PC gamers and upgraded veterans. Also, game devs can target higher specs if the baseline specs of users improve. Besides, Nvidia forced ray tracing to be a thing, and now everyone needs good performance in it. More competition will bring healthy innovation toward that, and probably more tricks for scaling, or simply cutting resource usage down (which would help lower-end cards keep up).

    • @Zxanonblade
      @Zxanonblade 1 year ago +1

      Their whole GPU brand is Arc; this first gen was called Alchemist. Next gen will be *Arc* Battlemage.

  • @tony001212
    @tony001212 9 months ago +1

    I don't understand why people are complaining. If you take into account that these cards are first-gen products, I think Intel did pretty well, including functionality like upscaling and ray tracing, things that took the competition years to implement. Looking forward to Battlemage.

  • @J0ttaD
    @J0ttaD 1 year ago +7

    Vex, could you try the game called "Control"? I swear to god AMD is broken in that game; I keep getting insane 1 FPS drops for like 5 seconds, randomly. Could you check if Intel does this? Also, the RTX and the details in that game are stunning; I would like to see an Intel vs AMD comparison on it. Cheers.

    • @daedalduo
      @daedalduo 1 year ago +3

      I second this idea. I want to see Control on an RX 6000 series GPU.

    • @maddymankepovich1796
      @maddymankepovich1796 1 year ago +3

      i tried it on an RX 6600 XT and a 6800 XT. In terms of performance it works fine, but the game was broken on DX12 when i was playing

    • @daedalduo
      @daedalduo 1 year ago +1

      @@maddymankepovich1796 ah, I have a 6700 XT and things seemed off even though I was getting 100+ FPS without RT and 70 FPS with RT on medium. I might try playing on DX11 instead of DX12. Thanks!

    • @raresmacovei8382
      @raresmacovei8382 1 year ago

      Don't use RIS + ReBar together on AMD GPUs in general. Use one or the other; it's a driver bug with leaking VRAM.
      Played Control earlier this year at 1080p120 with FSR2 Balanced on a 5700 XT, without ReBar. Game ran perfectly fine.
      Just use the HDR+FSR2 mod in Control, y'all

    • @NipahAllDay
      @NipahAllDay 1 year ago +2

      Worked fine when I used to own a 6900 xt and a 6950 xt. It also works fine with my 7900 xtx.

  • @iseeu-fp9po
    @iseeu-fp9po 1 year ago

    Good job, young man! You're 21 and already making videos this good. Keep it up!

  • @technicalfool
    @technicalfool 1 year ago +1

    Depends on your budget, but have you considered getting an external capture box? Slap a V30 or V60 SD card in it, plug it in like a second monitor, and you get 4K fluidly-recorded loveliness that you can then do whatever you want with, regardless of graphics card or drivers.
    I use an Elgato 4K60 S+, but there are other, possibly cheaper options.

  • @danielfernandezaguirre
    @danielfernandezaguirre 1 year ago +8

    I used only Nvidia GPUs for years, then I switched to AMD with the 6950 XT, and my experience has been great; stability is rock solid (I only update to recommended driver versions). Last year I built a second PC with an RTX 4080, and I would say Nvidia and AMD are on par in terms of experience. Of course Nvidia is better for ray tracing and media work, but I don't work in media; I'm a developer, and AMD is better for my work. I would give Intel a chance, but not now. I would wait at least 2 more generations to do that.

    • @WheresPoochie
      @WheresPoochie 1 year ago

      A developer in what regard? I do video and audio editing and am trying to pick between a 7900 XTX and an RTX 4080 for my 4K 60Hz monitor.

  • @CRBarchager
    @CRBarchager 1 year ago +1

    Very good presentation, laying it all out there for us to see. Looking forward to seeing what Intel can come up with in the future. It really looks like they are trying.

  • @Davidium84
    @Davidium84 1 year ago +1

    I really enjoy your content, my friend! You're spot on with basically everything you say, and I agree that quality AI upscaling is the future, as well as more advanced tech like hardware-based acceleration for ray tracing. I like what Intel is doing for sure!! Keep the great content coming and I will absolutely keep watching ^^

  • @jvidia
    @jvidia 1 year ago

    Vex, first time I write a comment to you... first of all... great channel, mate!!! A breath of fresh air among the IT YT channels. All the other channels have more "bling", but they are almost all biased in some way or form. Keep up the good work, mate, and please put your case on a table beside you ;)

  • @bighersh14
    @bighersh14 1 year ago +1

    I actually had the A750 until I upgraded to a 6950 XT, and Intel is getting so good so fast that I think it won't be long until they actually put some pressure on AMD to make good products.

  • @DanDanNoodls
    @DanDanNoodls 12 days ago

    I’m excited for the value that battlemage may bring to the market. I might go Intel GPU when those release. 👀

  • @mtaufiqn5040
    @mtaufiqn5040 1 year ago +2

    Based on your thumbnail, I can see you would marry your own GPU

  • @TonyC300
    @TonyC300 1 year ago

    Glad to see someone is thorough and unbiased when it comes to reviews. It's a breath of fresh air that you're not dull (Linus) either. Reminds me of Gamers Nexus. Great job, subscribed!

  • @theHardwareBench
    @theHardwareBench 1 year ago +1

    My A770 doesn’t really seem any worse than any other card so far. It does occasionally crash in MotoGP 19, but I haven’t played that with anything else, so it might not be the card’s fault. It’s rock solid in War Thunder, pinned at 120 FPS, max settings, 1080p.

  • @R3TR0J4N
    @R3TR0J4N 9 months ago

    Sounds like the right time to get one, as far as the concerns about its drivers go

  • @averagesnailgodpraiser3840
    @averagesnailgodpraiser3840 3 months ago

    The fact that those GPUs have dropped by $50-100 now is insane too

  • @R3TR0J4N
    @R3TR0J4N 9 months ago

    W Brian, what a GIGACHAD for loaning the 770; without Brian there wouldn't be this video

  • @akippnn
    @akippnn 5 months ago

    Note that DXVK is used by default on Linux (under Proton/Wine), which is why it's also gaining traction within the Linux communities.
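    (For anyone who wants to see the difference DXVK makes: in Steam, Proton's DXVK path can be bypassed per game via launch options, using a real Proton environment variable:

      PROTON_USE_WINED3D=1 %command%

    That forces the older OpenGL-based WineD3D instead of DXVK's Vulkan translation; the performance gap is usually obvious.)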

  • @hansolo5291
    @hansolo5291 1 year ago

    Thank you so much, this is so helpful. I'm a big AMD fanboy, been using them forever; I started with the old ATI, so over 30 years. But yippee, I just bought a new Arc A770 16GB. So again, thanks so much.

  • @rangermatthijs1740
    @rangermatthijs1740 1 year ago +1

    I bought the A770 because I wanted to support Intel to increase the competition in the market. I also wanted a cheap but decent 2nd GPU for work (trading) and multimedia (Netflix, HBO, etc.). I must say, after installing the GPU, I've never had a problem. But, that being said, I am not an intense user.

  • @Adept_Austin
    @Adept_Austin 1 year ago

    The Halo MCC problems are the singular reason I won't even consider an Intel graphics card at the moment. My group of friends don't have great hardware to game on, but that's a game everyone can run and everyone enjoys.

  • @TheReal_ist
    @TheReal_ist 1 year ago

    mmmmm that short military cut bro...
    does wonders indeed

  • @OpsMasterWoods
    @OpsMasterWoods 1 year ago +4

    You should also test Minecraft Java with RT (SEUS PTGI); it looks much better than MC Bedrock with RT

  • @AjrAlves
    @AjrAlves 1 year ago

    I have the A750 in a PCIe 4.0 system; the encoder issue is much less common, but it still happens sometimes.
    BUT, I have fewer (basically zero in the games I play) performance and stutter issues on PCIe 4.0 vs 3.0
    FYI: paid 199 USD for an ASRock A750

  • @nenadcaric5195
    @nenadcaric5195 1 year ago +1

    Dude! My mind was blown when I heard that you use it on Gen3... I remember that on Gen3 Intel was absolutely terrible, like much worse than with Gen4 and ReBAR turned on. Good job, Intel. Bring the competition!

  • @awesomstuff
    @awesomstuff 1 month ago

    I use an A310 in my dedicated stream PC, and I can confirm: don't use Arc GPUs if your PC doesn't support Resizable BAR. Really... don't. So far I'm also very happy with the work performance of this little precious ❤

  • @alexxbellisle1080
    @alexxbellisle1080 1 year ago

    When I upgraded my GPU... I had a choice between a 3060 Ti and an Intel 770, but I wasn't fast enough and all the Intel cards got sold out, so... For a couple of months I had driver update issues... A LOT. In AAA titles like MW2, Fortnite, and Escape from Tarkov I had DLSS issues like crashes, FREEZES and BLUE screens... for multiple months the 0.1% lows dropped to under 12 FPS in SOME games... like nothing you'd expect from an RTX 3060 Ti... at one point I put it away and got better performance on a 1050 Ti; not in top FPS, but the 1% lows are what I'm looking at. Nobody should care about 400 FPS if the 1% lows are like under 25; the game is still stuttering anyway. Thanks for your video, for covering the evolution of the card and of the company's mindset, AND for all the time you took to make all this work!!!

  • @obi-wankenobi1190
    @obi-wankenobi1190 1 year ago

    Hey, nicely done. Yeah, I too want to pick up an Arc 770 LE some day; a nice collector's item as well :) Many thanks for sharing this with us :)

  • @roasthunter
    @roasthunter 11 months ago

    I've been using the iGPU on my i7-10700K for a few months now, after selling all my GPUs, and I haven't missed gaming. I keep thinking about a new GPU, and an A770 8GB can be had for about $250 here.

  • @neoRen1
    @neoRen1 1 year ago

    I've played a few titles with the A770 8GB so far and have been satisfied with its performance; however, GTA Online is the only game that sucks on Arc, as it crashes very frequently. Two things I've found inconvenient with Arc: having a multi-monitor setup makes it even more unstable in some games, plus there's no ShadowPlay-like recording feature, so I've ended up using Xbox Game Bar, which makes the GPU doubly unstable. I'm genuinely looking forward to Intel competing in the gaming market with their upcoming Battlemage (& Celestial, and Druid).

  • @mistermystery819
    @mistermystery819 9 months ago +1

    I wonder why they don't just use the Intel HD drivers for better DX9 and DX10 performance, since I used to game on a UHD 630 and it was really good for DX9 and DX10.

  • @Shepard_Mango
    @Shepard_Mango 4 months ago

    I understand why they would target the medium-to-low price range in terms of performance, but I'd really like it if they made a couple of high-end cards in the future, as I would like to see what Battlemage has to offer