This is the budget card for me... For the H.265 10-bit 4:2:2 codec in DaVinci Resolve Studio, this card even beats the 4070 (ref: Puget benchmark). And yeah, driver updates will make this card even better 😅
From what I see, Intel should be able to beat the RTX 4060 in every game, but it's held back by their drivers. As you can see, games where Intel has more FPS consume more power, like Cyberpunk or The Witcher (120-130W), than games where Intel underperforms, like Silent Hill 2 (100W). There are also some games where Intel can't utilize 100% of the GPU. Since Nvidia drivers are considered the most stable of the three main GPU makers, Intel still has a lot of headroom to improve their drivers. Yeah, Intel clearly wins this one.
OK, I'm planning to build my first ever high-end PC (coming from an i5 2500 and a GT 520). Should I wait for the 5060 Ti, or should I buy the Arc B580? I'm also going to buy an i5 14500 and 32GB of DDR5 6000 MHz.
The 5060 Ti is gonna be almost twice the price of the B580, but with much better performance. And there's no such thing as a high-end PC with a B580, not even with a 5060 Ti.
The 5060 Ti will likely have more memory and be somewhere around 30-40 percent faster, with Nvidia's near-perfect software, but it will also likely be close to 2x the price. If the Intel one is overpriced in your region, get the ngreedia card; if not, buy the Intel one, it will be much cheaper and has decent performance.
@@venvox8008 bro, the 5060 will have 8 GB of VRAM 💀. Better buy the B580 while you still can. But if you're hoping to play at 1440p, I recommend buying a higher-end AMD card.
Currently, where I live, the 4060 is cheaper. And it's more reliable. In the future, if they decrease the cost of the B580 and bring drivers for all the games, then I would surely pick the B580. For now, my trust in Intel is the same as my trust in teachers saying they won't punish us if we tell the truth 😢
A 1060 6GB runs GTA 5 with the NVE mod (free) and a DLSS mod at 1080p within 6GB of VRAM. Search for "FitGirl GTA 5 platinum NVE". You can't even use Nvidia's VRAM-hungry advertised tech on the 4060 8GB: RT 2GB, FG 2GB, PT 4GB. The 4060 Ti 16GB at $450 is the cheapest 4K card.
Yeah, if only. The Intel card can't be found for under 420€ and a 4060 is 309€. If the extra VRAM is worth it for you, go for it, but it's in no way some super deal.
Just because of Intel's inconsistency, I can't fully go with it. I know the card is powerful, but in some games it's losing to the 4060 just because of drivers. Hope Intel patches it up and destroys its competitor in every single scenario.
Props to you for showing that you actually own these GPUs and that they're installed in the system before showing the benchmarks; so many fake benchmark channels out there only claim to own the actual hardware.
I agree, it’s more convincing to see something physically shown rather than just stats. I know many channels that don’t show GPU showcases or specs but do benchmarks, and I know they’re real, but to those who don’t, they might seem fake. I wish everyone would show a GPU showcase or specs in their videos so viewers know. Fake benchmarks personally annoy me too, not just because they’re fake but because they overshadow those with real GPUs. P.S. I follow your channel, I like your content, big regards!
@@edwardbenchmarks Being convincing isn't the same as being real. Instead of trying to "look" convincing, be competent. Leave no serious room for doubt instead of making these suspicious videos. Since you are going through the minor trouble of showing the GPUs, you might as well say something like "hello, today we are going to be testing the 4060 and the new Intel Arc... as you can see, I have them both right here, now let me plug them into our test system, which is an xyz machine, I'll show you". If you can afford these GPUs and a system to put them in, then you certainly can afford a basic microphone. You supposedly have a camera, and you supposedly spent hours recording the benchmarks (which is a lot longer than the few minutes it takes to prove that the hard work is actually meaningful).
I also don't understand why channels like yours always give us lazy static shots instead of putting effort into proving the programs are identifying the video card. It's very easy to paste a screenshot of GPU-Z from Google Images onto a still picture to "prove" the GPU is installed. The way it is here, it's a start, but look at channels like zWORMz Gaming and you will see how to leave no room for doubt while making sure your channel looks professional and grows.
@@dantemeriere5890 It’s clear that you haven’t been following me.
I never leave room for doubt, and I’ve always shown more than just GPU-Z. I rarely show the physical card, but I do show it. GPU-Z, Task Manager, GeForce Experience, etc. are almost always included. I have a mic, and 90% of my videos are recorded with it. This particular video is one where I didn’t record voice or any commentary because I was short on time. Plus, I already did that in the previous video, where I tested 1080p instead of the 1440p that's in this video: ua-cam.com/video/8AG9oIqpmbM/v-deo.html
This is how I record. I’m not Pedro, and I know kryzzp and his zWORMz Gaming channel.
I don’t do this professionally or for income; it’s just a hobby I work on in my free time. It’s simple. Just a benchmark, RAW benchmarks.
Posting a screenshot of GPU-Z from Google with the latest driver and everything else to make it look legit is very difficult, incredibly dumb, and practically impossible at 4K resolution; it would look fake. Additionally, I’ve shown the cards, and even in the games, the GPU name is displayed when I show the settings. So I don’t leave room for doubt.
I record everything in detail, and I’m not looking for advice from others. I appreciate reasonable and non-arrogant suggestions. You didn’t take a lot into account: you watched one video of mine and wrote a comment as if you’ve known the channel for 10 years and as if we’re best friends.
Ofc this is an open platform where you can comment on whatever you want, and so can I. Peace ✌
@@edwardbenchmarks I'm not talking like we're best friends, I'm talking like we're sworn enemies. I'm fairly certain I've come across your channel many times. Thing is, your presentation is fairly generic, and that makes you look like one of the one million fake benchmark channels out there. I'm not saying this to annoy you. I'm saying that the videos where you speak make you look like a completely different channel, to the point that I never associated the two and grouped the videos without your speaking with all the generic fake benchmark channels out there. I understand that this sounds unfair, but in a river of fake crap and clickbait you start to create these quick associations to save time.
I wrote the comment like I've known the channel for 10 years because I have. Not your channel in particular, but all the hundreds of similar-looking channels where the guy doesn't speak and just dumps some benchmarks with dubious numbers. You can't blame me for grouping you together when you are deliberately presenting your channel, or at least some of your videos, in the same vein. If you don't take care to not look like them then you can't complain when you are treated like them. Also, changing the name of the GPU inside the game isn't really anything hard to fake, you could just steal footage, which I'm convinced some do. If you mean Riva Tuner's overlay, that's even easier to fake.
But while I might be annoying and verbose, I'm not unjust. You've convinced me that you put more effort into your videos than they do and that your benchmarks are probably not fake. I stand corrected, you do have a microphone and you do speak in your videos. So I apologize if my tone offended you. Good luck on your endeavours.
Nvidia monopoly must end
yeah, but like, if you are into productivity + gaming, like editing/rendering/3D, then there is no choice but Nvidia because of their CUDA cores
Yeah, it's up to Intel and AMD now. They just have to make sure their products are good for ray tracing and multimedia; then they can advertise that so the casual market will switch to them.
@@HaveYouHeardOfTheHighElves. I don't think they will do that. AMD is mainly for gaming, and I thought Intel would be a productivity + gaming card. But nope. Intel's GPUs have been out there for a while, so no hope from me.
@@Duronto07 This is pretty outdated information. Cuda isn't nearly the moat that it used to be. Why do you think you can't do rendering on Intel or AMD GPUs? You can even do AI inference easily now on Intel and AMD GPUs. There are many backends that support Intel GPUs for inference and training. It isn't as widespread as Nvidia but it is very much viable.
I don't believe you have actually used these cards for these purposes and you are repeating outdated information.
@rgbplague7834 so you are saying I can use After Effects and Blender for editing and 3D, and Intel/AMD GPUs will still get their power utilized like Nvidia?
Hail to the new budget king B580, indiana jones black screen on 4060 🤣
It runs out of VRAM at 1440p, but at 1080p it runs really well.
Also, be careful with your purchase, because Nvidia is buying all those games, starting with Indiana Jones, where I had like 30 more FPS than the B580.
@@danielmachac4764 100%
Bro, I'm upgrading my GPU just because of the Indiana Jones game.
So I don't know: should I get the better card overall, or should I get an Nvidia card that's more expensive and worse, just for a few games like Indiana Jones?
@@Alosman_6 i just got a 6700xt, better than these 2 and cheaper 😂
@@alvarowab1274 actually my first option was the RX 6750 XT before the Intel B580 released, but due to my power supply I should go with Intel. Obviously the RX 6700 XT is better, but I wanna ask: do you have problems with it in Indiana Jones? Also, I have a 550W power supply; do you think it will cause problems if I buy the RX 6750 XT?
@@Alosman_6 the minimum is 650W and above. That's the reason I'm not buying the RX 6800: I'd have to upgrade my PSU and then it would be way costlier for me, and the B580 is out of stock.
What is crazy about this card truly is the power efficiency. Putting up these numbers at 100-120 watts. Seems like a fantastic card for budget gaming.
12:18 😂 wasn’t expecting that
Yeah, that's actually hilarious 😁
Nvidia supremacy they said. 🤣🥴
Out of VRAM. Even at 1080p the 4060 still isn't doing great; even the 3060 12GB does better than the 4060, while the 3060's raw power is weaker.
"Emotional Damage"
Strange that some games work better on RTX 4060 and some work better on Arc B580...
I think that's not strange; it's an absolutely normal occurrence between GPU brands. Even AMD and NVIDIA have some differences, but Intel has the most. The B580 is Intel's second series of graphics cards, and I think it's quite okay; in my opinion it's much better than the Arc A series, everything looks much more stable.
due to drivers, mate!
It's not weird, it's driver related. The B580 will destroy the 4060 when driver support for these games comes.
@@DarthVadercc yeah, I know that, but still, seeing a 10-15 fps difference between two cards of different brands just because of drivers and compatibility is just... I know, maybe that's normal...
It's also because nvidia is buying all those games to make them run better on their cards...
I probably will upgrade from amd rx 580 to intel b580 in a month or two
I hope till then optimized drivers are released and prices don't inflate
From red to blue
@geralt_silverhand crazy, right?
I went with an AMD build when Intel was at the top in CPUs, but I trusted Ryzen. 1st gen, then 3rd gen, and now I'm upgrading my GPU and going with Intel.
How the tables turn.
And I definitely think AMD abandoned budget gamers,
which wasn't the case back then.
@@qamararfin7534 yep, 4060 and 7600 both are piece of sh...
prices are already inflated right now
@@kanuh these tariffs will make you look back on PC prices today
People who said a lower memory bus doesn't affect performance are probably eating dust now. Intel gives more VRAM because it's going to use all of it. Still better than leaving it unused. I hope with the 5060 they implement some sort of extreme VRAM compression to justify that 8GB. 🤣🥴
b580 looks like the 4060 we wanted and were expecting.
I hope people buy it. intel needs the win.
Nope, in terms of gaming they're pretty similar. I'm not that impressed, especially when it comes to 3D rendering (Blender) or some video editing software; there it sucks.
Is there any video out testing the B580 with Blender? @@burnedmemory
no DLSS
@@burnedmemory who is editing on budget cards?
I bought it, I'm waiting for it 😊
Hate to say this... in certain countries, specifically in Southeast Asia, the price is around USD $310-330.
It's gonna go down in a few months. Give it time.
Would have liked to see Upscaling FPS comparisons as well, if possible.
Nice vid!
10:17 The game is stuttering on the B580 but the 1% low is fine???
It stutters the same way on my 3060, and the 1% is normal.
12:06 this is the one real reason to buy the B580, and the same goes for RDR2 and Cyberpunk, including ray tracing, which is also better on the B580.
Bruhh, see, there are so many stutters on the Intel side, we can see every frame lagging there. Now I feel like Intel has paid these youtubers for a good review and made bots to comment good things. If you need a good budget GPU, go for AMD, it's the safer choice.
@@youtubeforfun the RX 7600 sucks even compared to the 4060 and the B580. I bought my RTX 3070 for 210 and it even beats the 4060 Ti, which is just baffling.
Intel released this card too late; the 4060 has already sold millions and tops the Steam charts, and no one in their right mind will swap it for a B580. It will be interesting to compare it with the 5060, especially if they release an 8 GB version; then Nvidia has every chance to embarrass itself.
It's the 3060 that ranks first, not the 4060.
@@younesouldrabah5588 it will overtake it soon 😂
Is it me, or do you guys also notice some stuttering in Black Myth and Cyberpunk (Arc B580)? Is it a driver issue?
What time? I want to check it too, since I had some issues with a capture card. That’s the reason I added frametime. Tag me with the time when that stutter happened, if you can, ofc.
@@edwardbenchmarks 3:11, the 1% low is dropping like crazy. Also, it's not only Cyberpunk; I saw some testers try games like Control and it drops the 1% low like crazy... probably a driver issue.
@@amrulhaqqizulkarnain7373 I just checked, and the 1% is usually around 50, which is normal. The 0.1% is about the minimum, around 12 fps. I'm not sure if that's a big deal; I didn't notice anything special. It's usually around 20-30 for the 0.1%.
@@edwardbenchmarks okay, here are the timestamps I got:
Black Myth: Wukong -> 1:47 1:53 2:01 2:07 2:11
Cyberpunk -> 3:11 3:14
I made each timestamp a few seconds before the stuttering happened (edit: it's hard to timestamp the actual stuttering, sorry :")
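For reference, the 1% and 0.1% low figures debated in this thread are usually derived from raw frame times, roughly like this. This is a minimal sketch with made-up sample data; exact formulas vary between overlay tools (some report the percentile frame directly rather than the average of the slowest slice):

```python
# Sketch of how overlay tools typically derive "1% low" and "0.1% low"
# numbers: average FPS over the slowest N% of captured frames.

def percentile_low(frame_times_ms, percent):
    """Average FPS over the slowest `percent` of frames."""
    worst = sorted(frame_times_ms, reverse=True)   # slowest frames first
    n = max(1, int(len(worst) * percent / 100))    # at least one frame
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms                         # ms per frame -> FPS

# Hypothetical capture: mostly ~16.7 ms frames (60 FPS) plus a few stutters.
frames = [16.7] * 990 + [40.0] * 8 + [80.0] * 2

print(round(percentile_low(frames, 100), 1))  # average FPS -> 58.8
print(round(percentile_low(frames, 1), 1))    # 1% low     -> 20.8
print(round(percentile_low(frames, 0.1), 1))  # 0.1% low   -> 12.5
```

Note how ten brief stutters barely move the average but crater the 0.1% low, which is why a "normal" 1% low can coexist with visible hitching.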
How is the idle power consumption? 😅 Hope it's not 40 watts like the previous gen.
Is driver issue solved? Are drivers stable? Should I buy it over 4060?
1) Mostly. 2) Yes. 3) Yes.
100% buy this over 4060 and 7600 8gb cards
It is no brainer
Yes, easily B580 over 4060.
Get a 6700xt
24:13 why is the RTX 4060 image so trashy here? There is a huge difference.
Great review. Looks like it was tested with the older Intel driver version. A new driver was just released; it'll be interesting to see if there's any performance difference.
This video was recorded a couple of hours before release. Anyway, as planned, I will test it again.
Did the RTX 4060 at 11:23 in the Horizon Forbidden West test really run with FSR 3.1 on Quality and Frame Gen on, against the Arc B580 without the help of AI upscaling? It would be insane if the RTX 4060 really needed this to compete. I also noticed that in Red Dead Redemption at 16:00, your RTX 4060 results were only better with FSR 3 on Native AA, while the Arc B580 had no upscaling to help and fell behind. Was this setting on for both cards in all tests, or did you forget to remove it from another video of yours?
No, both use fsr3.1 Q + frame gen
Is The Last of Us stuttering on Intel? BTW, please do one with the 7600 and at 1080p.
Good power consumption on the Arc.
Edward, I am getting constant stuttering in FC 4 on my RX 6500 XT 8GB on PCIe 3.0. I tested a GTX 1650 in x4 mode in the BIOS and it did not stutter even a bit. Can you test that game on your 6500 XT 4GB on PCIe 3.0 and tell me the results?
Hmm, which CPU do you have exactly? You know, for the 6500 XT, you’ll need something overkill to handle that GPU because the GPU chip is much stronger than the 1650, but it has only x4 lanes. Bandwidth could be an issue if you have PCIe 3.0, especially with something like a Ryzen 3 or slower. So, which CPU do you have?
@edwardbenchmarks r3 3300X. The gpu clocks drop to 40MHz when the stutters happen.
@@flakmonkey6796 what is your power supply? It could also be a 3300X issue.
@@xfr0st585 CX 550. Chipset driver issue ? I am on 5.08 amd b450 chipset driver windows 10 22h2
@@edwardbenchmarks is the 6500 XT 8GB or 4GB VRAM?
You can buy the 6500 XT ITX 8GB.
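For context on the x4-lane bandwidth worry raised in this sub-thread, theoretical PCIe throughput can be estimated like this. A rough sketch: the numbers are encoding-limited ceilings from the published per-lane transfer rates, and real-world throughput is lower, but the ratios show why a gen3 board hurts an x4-only card:

```python
# Theoretical PCIe bandwidth: 8 GT/s per lane for gen3, 16 GT/s for gen4,
# both with 128b/130b encoding. Result in gigabytes per second.

def pcie_gbps(gt_per_s, lanes):
    # transfers/s * encoding efficiency * lanes, converted from bits to bytes
    return gt_per_s * (128 / 130) * lanes / 8

print(round(pcie_gbps(8, 16), 1))   # gen3 x16: what most GPUs get -> 15.8
print(round(pcie_gbps(8, 4), 1))    # gen3 x4: 6500 XT on an older board -> 3.9
print(round(pcie_gbps(16, 4), 1))   # gen4 x4: 6500 XT on a newer board -> 7.9
```

So on PCIe 3.0 the 6500 XT sees roughly a quarter of the usual x16 link, which matters most exactly when a 4GB card is spilling assets into system RAM.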
The 4060 still has its edge with DLSS 3 Frame Generation. It will take some time for games to adopt Intel's XeSS 2.
4060 is pure trash.
Nuh uh
You're the trash.
In the previous Arc B580 video, Ghost of Tsushima was not working; maybe they fixed it in a driver.
If you look closely, the RTX has better textures, but Intel is doing a good job here. Hope they can compete with AMD and bring graphics card prices down.
Intel has more contrast and sharpening, which is kinda bad but not very noticeable.
@@blanklife2025 that's a quirk of Intel's codec; the picture is the same everywhere.
@@blanklife2025 That, and Intel's still bad at UE5 titles. It looks like they are leaving performance on the table, and in some other games GPU utilisation isn't 100 percent.
what's the price
When does it come out officially?
Can you make a video overclocking the B580? I'm really curious how much performance it will gain.
how the hell is it that RDR1 runs WORSE than RDR2?! WUT
Different game engine. APIs too.
@@edwardbenchmarks Bruhh, see, there are so many stutters on the Intel side, we can see every frame lagging there. Now I feel like Intel has paid these youtubers for a good review and made bots to comment good things. If you need a good budget GPU, go for AMD, it's the safer choice.
In the first game at 1:11, look at the 0.1% lows of the RTX 4060: the VRAM is full and it had to spill into system RAM, and there's a huge spike in the frame time graph. 12GB is a must-have at 1440p, even though the averages are similar.
Overwatch 2 & marvel rivals benchmark 🙏pls
Do you guys think that with the upcoming drivers the B580's performance will improve even more? Seriously considering buying the Intel.
Can you test it with Lossless Scaling?
Looking to buy this exact gpu. Can't wait for restock.
Why does the B580 freeze?
Yeah why
Game developers haven't optimised their software for XeSS 2 yet.
Optimization and updated drivers will make this card even better.
Thanks bro for the benchmark
Can it run at 4K?
I've noticed the colors/contrast look better on the B580 too
Judging by the power consumption, the B580 has at least 25% performance headroom. With new driver releases, performance in many titles will increase further. The 4060 is an overpriced, pathetic scrap...
Because the 4060 is essentially a rebadged 4050 Ti with a $300 price tag
The 4060 is not a 1440p GPU
I tested 1080p first here ua-cam.com/video/8AG9oIqpmbM/v-deo.html
But people asked me for 1440p
The B580 will age so well with optimizations. On paper it has a lot more than the 4060, so come on Intel, optimize those drivers and take the GPU crown; it's there for the taking.
Intel's new card has a sexy design and a lower official price (in fact, the 4060 has been out for a while and the price should be the same by now). At the same time, the B580 more often shows better performance (if its technologies aren't inferior to DLSS 3). But what I noticed in this video is that the colors are a bit different in games, and I like the colors better on the 4060 (starting with the RDR tests, the colors got better on Intel). Also, in some games I noticed micro-stutters (freezes) on the B580; it's like it hitches in places. Other than those things, it does look more attractive.
I just finished building my kid's PC, and when it came to the graphics card I went with the Arc B580, only because it was neck and neck and the B580 has 12GB GDDR6, so plenty of VRAM.
I have a 3070 Ti 8GB, which is a lot faster than a 4060, but it still crashed out of a game because it ran out of VRAM while only playing at 1080p.
Many users argue that it's only a mid-range card and 8GB is enough, but I don't think bumping the VRAM to 10GB or 12GB would add $100 like their 16GB version does. So I have no idea why Nvidia still refuses to do it. Let's see what happens with the 5060 series...
I hope what Intel is doing will open Nvidia's eye, like their logo.
Bro, can you tell me how to install The Last of Us Part 2 in the shadPS4 emulator?
I mean, it runs cooler and draws fewer watts... so long term this card will pay itself off.
Has anyone tested the high-end card in engine-heavy games like Unreal??
This is the budget card for me...
For the H.265 10-bit 4:2:2 codec in DaVinci Resolve Studio, this card even beats the 4070.
Ref: Puget benchmark.
And yeah, driver updates will make this card even better 😅
From what I see, Intel should be able to beat the RTX 4060 in every game but is held back by their drivers. As you can see, games where Intel gets more FPS, like Cyberpunk or The Witcher, consume more power (120-130W) than games where Intel underperforms, like Silent Hill 2 (100W). There are also some games where Intel can't utilize 100% of the GPU. Since Nvidia's drivers are considered the most stable of the three main GPU makers, Intel still has a lot of headroom to improve theirs.
Yeah, Intel clearly wins this one.
Bro, the B580 is the new budget king.
I still love my RX 6600, but this new Intel GPU is the new king: power efficient, 12GB VRAM, at a price of what, $250?
RTX 5060 GPUs are also rumored to have 8 gigs of VRAM; guess they never learn.
Can you share the performance of Ghost Recon Wildlands, please?
B580 at 83% utilization in Ghost of Tsushima? Damn, still a lot left on the table there.
12:17 that's why RTX 4060 will always be a meme
The funny thing is the TDP for the B580 is 190W.
But it's using less power than the RTX 4060 and providing better performance 🎉
TechPowerUp says the Arc B580 is 1% better in performance than my RTX 2080 Super. I'm depressed now )))
OK, I'M PLANNING TO BUILD MY FIRST EVER HIGH-END PC (COMING FROM AN i5 2500 AND A GT 520).
Should I wait for the 5060 Ti or should I buy the Arc B580?
ALSO, I'M GONNA BUY AN i5 14500 AND 32GB OF DDR5 6000 MHz.
The 5060 Ti is gonna be almost twice the price of the B580, but also with much better performance. And there's no such thing as a high-end PC with a B580, not even with a 5060 Ti.
@@mudzibaba BRO, I THINK YOU DIDN'T UNDERSTAND MY COMMENT :D
I have a PC with an i5 2500 and a frikin GT 520
@@mudzibaba YEAH, I KNOW A HIGH-END PC CONSISTS OF 4070+ GPUs
BUT YOU KNOW, an RTX 5060 Ti or AMD's RX 8600 XT will be heaven for me
The 5060 Ti will likely have more memory and will be to the tune of at least 30-40 percent faster, with Nvidia's near-perfect software, but it will likely be close to 2x the price. If the Intel one is overpriced in your region, get the Ngreedia card; if not, buy the Intel one, which will be much cheaper and has decent performance.
@@venvox8008 Bro, the 5060 will have 8GB of VRAM 💀. Better buy a B580 while you still can. But if you're planning to play at 1440p, I recommend buying a higher-end AMD card.
Good luck finding it right now, though.
They are in stock in Europe.
Currently, where I live, the 4060 is cheaper. And it's more reliable. In the future, if they decrease the cost of the B580 and bring drivers for all the games, I would surely pick the B580. For now, my trust in Intel is the same as my trust in teachers that they won't punish us if we tell the truth 😢
Too bad there's no way to get this GPU here in Mexico. Practically no one sells it. 😂😂
Intel shouldn't have pulled out of graphics cards back in the 90s with their 9xx series.
nVIDIA monopoly must end.
Poor people do not buy 5090
You're wrong, RTX 4060 and ARC B580 are good.
What a legend, thank you
The B580 wins in FPS in 8/19 games. WTF, Intel did it, man.
wish we had a b590
The answer is obvious. Time to buy the RTX 4060🤣
It's the end of Nvidia cards with high cost and VRAM limitations; the 5060 with 8GB of VRAM will be the worst launch in Nvidia's history.
Damn man B580 is amazing
I'll wait for this B580 rush to end. Maybe in June 2025 I'll get one to replace my 5700 XT.
Wow I like the new Intel Card
Where is Devil May Cry 5? 😢
Next time, add PUBG also.
Intel is much more expensive than Nvidia in my country's market.
Indiana Jones at 0 FPS 😂 reminds me of a certain Ngreedia video.
Unsellable in Europe at between €350 and €400 🤣🤣🤣🤣
Soo close, but the RX 6700 XT is way better.
I play GTA 5, so I'd buy the RTX 4060.
A 1060 6GB runs GTA 5 with the free NVE mod and a DLSS mod at 1080p on 6GB of VRAM. Search for the FitGirl GTA 5 platinum NVE. You can't even use Nvidia's VRAM-hungry advertised tech (RT, FG, PT) within 4GB free on the 4060 8GB. There's the 4060 Ti 16GB at $450, the cheapest 4K card.
B580 is stuttery
Fortnite ??? 😢
Yah if only
The Intel card can't be found for under €420, and a 4060 is €309.
If the extra VRAM is worth it to you, go for it, but it's in no way some super deal.
Intel doesn't have volume; if the situation doesn't improve, the B580 will become a disaster, as it's only available for 400 💰.
Do PUBG please
Please, 4K DLSS FG 🙏 GTA 5 free NVE mod, DLSS mod, Space Marine 2, 🕷️ Remastered with the DLSS Enabler mod (FG auto). Show that this is the CHEAPEST 4K card 😃🙏🙏🙏😃
The 4060 is cheaper than the B580 here...
Also, can you test MSGBO2? It's a free game that really demands the GPU on some maps with a lot of water.
Just because of Intel's inconsistency, I can't fully go with it. I know the card is powerful, but in some games it's losing to the 4060 just because of drivers. Hope Intel patches it up and destroys its competitor in every single scenario.
Crazy, 63 FPS in Cyberpunk; on the other hand, the 4060 gives only 43 🫨😵