I’m excited
Leave anything you want me to specifically test on it and I’ll see if I can make that happen
No you're not, because you already tested this GPU, so you know what you know and what we don't xD
But seriously, I wish Intel the best because we need competition in this broken duopoly market more than ever.
I notice you're optimistic about battlemage and I am too. Intel noticed as well. 😁 I am glad you got a sample!
I'm super excited about this too. Well done Vex. I'm buying 2-3 of these things for my various PC's.
@@christroy8047thank you for your sacrifice
Man, congratulations!! Getting a review sample is a big deal.
I don't know if many are interested in this, but could you also mention idle power draw? Normally it's excluded from reviews, but for some ppl (like me, who keeps the GPU idle 95% of the time) it has some weight on purchasing decisions.
Cheers!
Intel is in a weird place where everyone is booing them for their CPUs and cheering them for their GPUs 😅
Reverse old AMD
I'm afraid Intel will soon become a GPU company...
@@aleksdeveloper698 could be a good thing if they do it right, big if.
Considering Intel has been launching the same CPU for like... 20 years... yeah, they are pretty behind on CPU technology compared to AMD currently.
Because of the way Intel treats their customers
Intel GPU naming: 😀
Intel CPU naming: 💀
Yes!!
Yes~~
Arc -> Battlemage -> Chewbacca? (would be hard to get it) Chivalry? Cataclysm?
But yes, those are really good. I get similar feeling as I had, as a kid, with Riva TNT or a Voodoo.
This is pretty important, because let's imagine a kid who asks his parents for a PC - fast forward, in a shop: "we have PCs with RTX... RX... and a Battlemage."
I know what I would ask for if I was 9-12 again and my parents gave me a limit of $700-900 for a new PC.
@@WłochateStopy
It's Alchemist -> Battlemage -> Celestial -> Druid if I remember correctly.
@@albertalmodal4331 E for Elephant, I hope
0:10 I added a timestamp so people would see this. He does not benchmark the card at all, he just yaps about the theoretical performance the entire time
Hilarious how you think he's the stupid one, while being completely ignorant that even if he tested it, you cannot bypass the embargo and just release info before launch without permission lol
The fact they sent it to him early means they are confident and expect to hear good results. Use your brain, TikTok kid. 10 seconds in and already understimulated. No wonder multiplayer games are so trash when they cater to TikTok losers, while long, fleshed out, in-depth single player games have been incredible. Bozo
Yo bruh....... I HOPE U LIVE A GOOD LIFE MAN!! 🙏 IF U EVER NEED A KIDNEY OR A LIVER OR WHATEVER JUST TELL ME I'LL HAPPILY GIVE IT TO YOU!! I HOPE YOU FIND A $100 ON THE GROUND BRUH THANK YOUUUUU🙏🙏🙏🙏🙏
because embargo, dummy
I hate these kind of reviewers who put in 0 effort. All he does is pull out some random graphs instead of firing up some games and showing us.
Nice thank you
I have an RX 580 8GB, and going to a 4060/7600 with 8GB doesn't make sense because I have a 1440p monitor now. Crazy that the 4060/7600 can do 1440p but the VRAM holds it back in the graphs here and at Hardware Unboxed. No doubt the RTX 5060 and 8600 will have 12GB and cost $329-350, making the B580 the only option. Having 12GB, decent performance, and Intel's XeSS means 1440p gaming at $250 is finally here. Also, 1440p high refresh monitors are below $250-300 now. Intel's claim of 1440p being the new 1080p in a few years is believable. XeSS also uses AI like DLSS, so it won't be hit or miss like AMD's FSR. This card is exactly what I've been waiting for. It's funny upgrading from an RX 580 to another 580 though
Haha, same here. My rx 580 gives black screen on demanding games. Looks like I'll upgrade to another 580 now😅
@@killingmachinelp Really, black screens? I have an RX 6600 and never experienced anything like that
I loved my old rx580 lol, I ended up finding a sale sometime last year for an rx 6700xt for like $270 and man was that a fantastic card. I hope this new B580 can give that level of performance without all the issues their first release had. I just pray both Intel and AMD start making headway in the GPU market because DLSS aside, Nvidia hasn't exactly been making quality products for the entry level market recently.
@@toututu2993 it's from 2018, I guess it's showing its age.
It's funny 😂
do it for the memes
I hope Intel succeed with their GPUs 🙏🏽
it would be great to have a third player on the gpu market
@@Necrofem fr we need more gpus on the market.
@@belzeb4352and cpu
i'll buy one just to celebrate this.
The hope lies in our wallets. I don't need one and I want one lol
You know what else has 12GB of VRAM? The $700 4070 Ti...
You really think you'll be able to get an Intel card at MSRP? That's funny
Easily, no scummy business is interested in the Intel cards
@@kaimojepaslt It's on pre-order for 300€ (including 25% VAT).
@@kaimojepaslt dude I have a not busy microcenter like 10 mins from my house. It still has the 9800x3d in stock at msrp
In Italy the RX 7600 costs like 20 euros less than the 4060
Looks very clean and streamlined, I like the look of Intel's graphics cards
Yes, the lack of cheap/corny plastic and RGB diodes is definitely a plus.
They look slick, they are the best looking cards IMO. I may pick one up later
Yep, their cards are the best looking by far.
Mostly I don't even watch the screen when a game is running, I just stare at my closed case and imagine the beauty taking place within. TAKE MY MONEY INTEL
Intel cards in my opinion look great
If Battlemage shows good improvement, then Celestialmage might actually be pretty damn competitive.
it's just Celestial. It's not like CPU naming where everything is a lake. Alchemist, Battlemage, Celestial, Druid.
@@bhume7535 It's obviously Alchemistmage, Battlemagemage, Celestialmage and Druidmage
@@glitterhoof420 yeah like... isn't it obvious?? some people rlly tryna act like a nerd when they're wrong with what they're saying
Waiting for Fromage, personally
@@therealwhite the cheese architecture is gonna be fire, can't wait
Ah, our boy Vex made it, guys. He's now a reviewer, getting free stuff. Now jokes aside, get back to work and start benchmarking that baby. We can't let the Aussies at Unboxed beat you to publishing benchmarks.
HE'S EARNED IT. That video comparing XeSS to DLSS to FSR was a masterpiece.
Yeah, receiving review samples from intel is deffo a "I've made it" milestone. Now you gotta resist the pressure to become a part of their marketing arm and start shilling!
I honestly paused when I saw the video like wait.... He has what? Oh snap the boy done made it!
@@BOZ_11 yeah man, I am happy for him, but I'll have to see whether he keeps doing what he did before or starts groveling at companies' feet and rolling us into buying a bad deal.
For now he earned it and I hope he earns even more.
@@ravijotsingh7831 he was already doing that with amd.
I appreciate your talking pace and tone. I like to listen to videos while I work, and so many tech youtubers talk so fast that I can't really take in what they're saying unless I'm specifically trying to pay attention. You talk at just the right pace; I can listen to what you're saying without getting distracted from my work tasks
real, most of them talk like they have a gun to their head + are overdosing on Adderall
Doki Doki Literature Club music is an interesting choice.
It's such good background music that it took me a bit to realize "wait, this is from that horror anime visual novel game"
AINTNOWAY
I mean, if you don't have any context it's kind of a vibe tbh
Peak bgm
at 2:40 I was like, wait, that's the music from the anime visual novel game, and I paused it
The fact that many of the official charts lean on XeSS is a bit worrying, to be honest. Nowadays, GPU makers only swear by upscalers and frame-gen technologies...
"XI" in "Draw XI" stands for "Execute Indirect" which allows the GPU to manage and execute a series of rendering commands without constant oversight from the CPU. In other words, it's a technique that reduces the CPU bottleneck used in modern games. The A series didn't have XI support in hardware, and had to be emulated in software, incurring a major performance penalty.
Call of Duty: Black Ops 6 is one game that I know of which makes heavy use of execute indirect. It would be a good title to put into your test suite IMO
Based. Man, Stalker 2 beats the shite out of my CPU/GPU. I hope it's great. I need an upgrade.
@@granityseis104 stalker 2 is so poorly optimized though. even fucking 4080s are being brought to their knees by that dumb game, which doesn't even have A-Life, the thing that made Stalker 1 such a timeless classic. now it's just generic spawn-in AI
Thx for explaining
Might be useful for ps3/360 emulation
Wow man, affordable GPUs, am I dreaming?
To be fair, it doesn't look like it's going to be any faster than what we already have, and we've had very affordable GPUs for a while now. We just haven't seen a brand-new launch be this affordable
@@gabrieli6008 based
We got a budget 12GB card from Intel, and the midrange-targeted 8th-gen Radeon cards are coming out in a month from AMD. Nvidia better watch out!
@@gabrieli6008 faster? … the point is cheaper
more like the last 5 years has been a nightmare
I'm really looking forward to this new Intel GPU generation. I just recently bought a secondhand A750 for about 130 USD for some light gaming.
Thank you for that detailed Arc journey.
I watched it all without rewinding.
Greetings from Poland
Let's GO INTEL (I can't believe I'm rooting for them... but at this point any fight against monopoly is a win for humankind)
Maybe you shouldn't be so poor. I have octo crossfire 4090 because I'm considerably richer than yoooouuuuu
Who asked you?@@johnsmith-i5j7i
@@johnsmith-i5j7i ah yes the just stop being poor meme. thanks for solving poverty worldwide through your manifestation!
@ModulerDrone whooossshhh
You are so poor lol I have 6 RTX 4090s connected to my local power plant using crossfire @@johnsmith-i5j7i
The design's simplicity and plainness appeal to me, and its entirely black color scheme is particularly impressive.
I love how clean they look. I'm really hoping they don't have any issues because it would look great in my PC.
Agreed, just a sleek black design without heaps of RGB puke. I'm all for it.
I like how this channel has this down-to-earth vibe and not being elitist
7:25 Still over 58% of people on steam use 1080p. That’s not changing till 1440p is as cheap or cheaper than 1080p monitors
And judging by how demanding UE5 games are getting I think 1080p is still gonna stay for a while longer
I'll take way higher framerates and overall smoother performance at native 1080p any day over the slightly higher-DPI image of 1440p. Doesn't matter what the monitor costs, the PC is the expensive part..
The proportion of what's in use now isn't the same as the proportion of what people want to buy now. People buying new GPUs at this moment would probably get 1440p if they're buying a display in the current market.
They're already cheap. You can get a 27-inch 2K 180Hz monitor from ViewSonic for around $150.
High refresh rate 1440p monitors are cheap; just a few years ago people were paying the same price for high refresh rate 1080p monitors. Building a PC to run 1440p at high frames is a different story.
I want Intel to build something to compete with the 5080! I reckon if this model does well they should compete! AMD stepped down on higher-end GPUs, so I think Intel needs to step up!
The one thing giving me pause right now is the fact that on a lot of these comparison charts, Intel is clearly leveraging XeSS a lot. Looking forward to the reviews with some real world data.
Here's hoping that they're actually as powerful in the real world.
Agreed, but also pretty hopeful, as XeSS is already way better than any FSR. Intel's main issue was driver support, which just gets ironed out with time and usage data. Think imma build a budget PC with an Intel card and give it away for Xmas
Yeah, and Intel gets better XeSS performance than non-Intel cards, so it's "technically" a fair comparison, but kinda sus. If the 4060 was on DLSS, it might be faster, although ever since XeSS 1.3, it typically gets similar perf to DLSS
XeSS looks like total trash compared to DLSS...
@@vextakes It sucks that we're the guinea pigs training the AI, but at this point it's unavoidable. Eventually it'll be the norm if it already isn't becoming it. My optimistic take is that hopefully this AI feature stuff gets exponentially better & better, and delivers the things they are promising. We are at the infancy of it, kinda scary lol
The first thing you showed in the video was the fact that you have the physical card in your possession, at no point did you mention that you will not be testing it in this video, and then proceeded to yap about theoretical performance for 25mins...
The embargo lifts on the 13th
@@LaserVelociraptor I know, that's the reason I'm upset. I thought he might have some kind of deal to give us a sneak peek before that. I patiently watched half of the video believing that, then realized he doesn't and got disappointed.
I agree, this is clickbait, and I think it's intentional. If he had said at the beginning that there would be no real tests in this video, 90% of viewers would have closed it.
Battlemage sounds like an Nvidia GTX-ready PC or something, like NVIDIA BATTLEMAGE 3D READY
dont insult intel like that man
3dfx Battlemage
@@mat_maxfr
Battlemage sounds like "HEY COOL KIDS BUY GRANDADS GPU AND PLAY FORNITE AND WATCH THOSE STRANGER THINGS IN 17K PIXEL RESOLUTIONS!!!!!1
Im glad you got a review sample, keep it up brother
Congrats on getting a sample, and congrats on 100k my man!
This card looks crazy. Excited for u guys being able to get one
I love this video it was soo good i finished all 27 minutes in about hmm 80 seconds
Speed run
best comment. I can relate
Try this GPU with:
- LLM models (AI)
- diffusion models (image generation)
Why would you want a budget gpu to do that?
@@Shahzad12357 there are budget-friendly models.
Bro just take a college art class at that point so you can make something thats not shit
I agree
As an AMD user I can't wait, and I hope this can compete with other mid-range GPUs in both performance and price, because we really need competition in the market
Competition in the GPU market doesn't exist; the 4060 is king for the budget because of frame gen. It gives 70-100% better performance than the 3060. That's why Nvidia has 90% market share in gaming GPUs. AMD doesn't even make most of their money from gamers, that's why they don't wanna compete with the 4090 or 5090.
@@WHAT._ISPAT4060 is an utter piece of shit and a disgrace to the gamers.
@@WHAT._ISPAT the 60 series is always way overpriced on launch and then comes down to reasonable prices 6 months after, which by then means the next generation is coming out. The 4060 is hardly a budget card.
@@ratnodeepbain7758 The RTX 4060 gives 75-90 fps with RT Ultra and frame gen in Cyberpunk, meanwhile the 3060 Ti with DLSS gives 28-55 fps. Similarly, a lot of games run fantastically better on a 4060, so why are you criticising it? Even Steam data says the 4060 is the new popular choice.
@@Lightning_aus utter stupid reason to become an AMD peasant. The RTX 4060 fucks the RX 7600 from every angle with frame gen. It's not software but a chip you pay for.
Tech Yes City did a really good job examining the benchmarks in detail, without doing his own for obvious reasons, and there were some inconsistencies in some games. Basically, Intel did fudge the numbers in certain games, giving them a clear win when in reality the numbers should have been lower. Y'all should go check that out. Great video as always my man!!!
Congrats on 100k subs! ❤❤
There are no real benchmarks or gaming tests, only some Intel promotional stuff, which of course looks heavenly.
First real benchmarks, then seeing is believing, not sooner.
For those that haven't seen the latest MLID vid on Battlemage: it is pretty much DOA. They are losing between $30-$60 a card, so they will only release enough to burn through the stock required by the TSMC contract and then pull the plug. They have already fired HALF the GPU division, so expect driver support to be poor and only last a couple of years, and after this, Celestial will be iGPU-only.
You can thank now-fired CEO Pat Gelsinger for this. TSMC had given them a sweetheart deal with 40% off the wafers, then he made a crack about Taiwan in an interview, which caused TSMC to revoke the discount, so they are having to pay $23,000 per wafer instead of $13,800, which means there is no way they can even sell these at break-even. That one crack was the final nail in the coffin for Arc, sadly. But what can you expect from the CEO who canceled their new CPU design and sold off divisions like Optane, all to chase an AI ship that had already sailed.
3:55 I love how those prices in € are not only higher due to conversion but are just higher 😂
25:11 Can you reveal more about this feature in the future? I'm using RTSS to limit FPS in games and remove screen tearing, since I'm on a 60Hz display (TV). Will this driver-level limiter be comparable (maybe easier/faster to use) to what we can get from RTSS?
If Intel comes back by taking the budget market it would be amazing
DG1, A380, A580, A750, A770 - all of these were designed for the budget market.
20:00
The graph says the generated frame is created before the UI is drawn on top
Does that mean no messed-up UI like with DLSS 3?
Congrats on 100k man, keep it up!
I love that intel gave you a review sample! Congrats :))
The price sure is right. If it beats a 4060 they've got a solid entry level GPU already.
Midrange*
A little bit late, but 250 bucks 😀
Congrats for the 100k
You know what's funny? The fact that literally since the announcement I've been picturing you doing a review on them dressed the way you are. I'm so glad you get to give a look at the card I want first lol
If RTX Remix software can also be used with Intel cards, then I will definitely buy it.
I really hope they finally open up the software and stop blocking it on other cards.
It isn't blocked on AMD cards to my knowledge. It shouldn't be blocked on Intel. But maybe the cards themselves are the problem, since they don't have DirectX 9 support, which is the main API used by Remix
@@spike-the-red-king it isn't? I looked up the official list of supported cards on Nvidia's site and they only listed their RTX cards. That's why I thought they gatekeep the program for their own cards.
@@spike-the-red-king yes, and the games created with RTX Remix (like Portal RTX and Half-Life 1) can be played on an AMD graphics card, but you can't use RTX Remix's functionality on AMD cards, to my knowledge
@@spike-the-red-king the game doesn't ever end up rendering in DX9; it's all translated by RTX Remix, probably to DX12. You would never be able to do path tracing like that in DX9.
@@EduardoSantiagoDev It translates to Vulkan. Doesn't change the fact that the games supported by Remix are DirectX 9, 8, and some 7. I'm just saying there could be some issues only Intel would have, because those games have to be translated to run even without Remix
This was way more thorough than I expected.
AMD should be scared. Intel already having more "anti lag" options lmao
Hey if all else fails, they still dominate the console market at this point and even won the deal with Sony for the PS6 against Intel so that's a plus I guess
"you can change the hue on your display if you really want to trip on acid or something"" made me chuckle. Thanks for making me start the day with a smile
Seems like Intel is really trying to earn budget minded gamers business. Nice to see the competition
Congrats, my man! Can't wait to see your review.
Honestly, one of the 3 companies needs to normalize 16GB minimum VRAM. It's easy enough to do: drop the speed of the VRAM to a gen prior, say GDDR6 to GDDR5, or maybe even create a mixed-memory balancing algo so you can half-and-half it and give players more bang for less money on low-end cards. You don't go ballistic ultra settings on your game as a result, but you also don't stutter due to every "AAA" studio sucking at optimization on a current-gen GPU, like with Nvidia's entry-level market
Games just need more optimization to run in 6GB
It's not really about the optimization, it's the sheer scale of games that needs to be processed. Games are just way too complex to run smoothly on a 2070.
I think they could easily fix this in two ways: make separate installs for quality settings (if you only want 1080p, you can remove the 4K and 1440p assets) and increase VRAM to 12GB. 16 is kinda nutty tbh, I don't think we need that much until you get to high-fps 1440p (or unoptimized UE5 slop)
@@kebaba.k.atortilak2718 It is about optimisation, and he was talking about the 4060, not the 2070. There's no excuse for a game to run like ass at 1080p on a current-gen card, especially when the result doesn't really look better and has artifacts all over the place.
Nice thumbnail - hope the card lives up to it!
The closer we get to the release date, the more I want this card.
Getting review samples now, happy for you 😁
Congrats on the review sample! You're the first content creator I follow who got one; keep up the good work!
Damn that starting camera transition 😂. Heck yeahhhh let's go Budget GPUs let's gooooo
$50 less for 7600 XT/4060 performance is just not exciting since new GPUs will hit the market in a few months; it should've been $200, or offered higher performance, imo.
There will be no GPUs dropping at the $250 price point for a hot minute. The market for this card is not the ones gunning for top of the line but the best performance per dollar, and assuming their marketing isn't complete cap, it's going to run pretty uncontested for a while in the $300-or-less range. At the very least the aggressive pricing will threaten AMD to lower their last-gen card prices to match.
Intel can always drop the price. Why run a charity?
@@LvcIvsLIcInIvsLvcvllvs It's not about running a charity, I'm simply not buying Intel over Nvidia at $50 less for the same performance, and neither is the majority of the market.
@@MetaMdad The 6750 XT is already $300 and matches the 4060 Ti in performance with 12GB VRAM.
New high-tier cards that are nowhere near the $250 bracket will hit the market in a few months. The real competition for these two cards won't release in the next two quarters
Very cool to see you getting such an influential card directly from Intel….
…you made it as a YouTuber! Very cool indeed 😊 you should feel very proud of yourself!
101k subs ❤
We haven't had this pricing since the RX 480 released 8 years ago, let that sink in. Both Nvidia and AMD have been upping the price of bottom-tier GPUs (RTX 4060/4060 Ti, RX 7600)
Where are you Vex. Is everything okay?
"I have the New Intel Arc Battlemage GPU" and no tests in video, but unbox and reviewed by intel's slides....
I think it is the law intel made. This guy "borrow" use it and feed back. He can say but no image
Dude he’s under embargo like everyone else
@@tadsnyder2756 If he's under embargo, why's he making a clickbait video?
@@tadsnyder2756 in that case, he needed to make 3 videos: "I have a video card", "I have a video card, but the overview is based on slides", and "video card review and tests". I like his tests and reviews because he does them later than everyone else and they're very informative, but here it turns out he did it like everyone else
@@red2 he is not creating clickbait, idiot, he is simply speculating and giving an overview of the card while making his audience aware that he has a unit he can review to publish results
I hope the benchmarks are accurate. Time will tell
3:04 Bro is collecting Chrome tabs 💀
Loved the livestream btw, hope to see more. Excited about the new competition in the GPU space ngl
When you test this, please include noise info. I have a very quiet environment and want to keep it that way. I was about to buy a GeForce 3080 when I saw this. My recently crashed PC was dead quiet; the current one is driving me nuts.
And one other thing! I mostly play Guild Wars 2. The current GPU's fans are louder than I would like. No one ever says "online MMOs will not tax this unit". I think that is true, but I'm not sure.
I like this new development. More competition is needed.
The only problem is the awkward timing; we know Nvidia and AMD will release a new gen in the next few months. If this had released a year earlier it would have been great. For now it's a good value card, but will it still be in a few months?
Yes, but the RTX 5060 and the RX 8600 will probably not launch before spring or late summer '25, so it still has quite a head start.
If anything, this is likely to be the most well-timed launch of any recent Intel consumer product - actually slipping in while the established players are busy intentionally neglecting this segment.
Though it's unlikely that supply is going to be particularly good for these cards, which is a bummer if it manages to catch a wave of interest.
Well, I doubt Nvidia would make the 5060 a good value card.
If anything, I am expecting it to be an 8gb card again, but 20% faster than the 4060, while selling for $300 or $350.
If this happens, then price to performance wise, the B580 will win.
It's AMD that I can't make a good guess on with their 8600.
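If Nvidia does price it like that, quick napkin math backs the B580 up (treating 4060 performance as 100, and taking Intel's claimed ~10% lead over the 4060 at face value, which is an assumption until reviews land):
B580: 110 / $250 ≈ 0.44 performance per dollar
5060: 120 / $300 = 0.40, or 120 / $350 ≈ 0.34
So even a 20% faster 5060 loses on price-to-performance unless it comes in under roughly $275 (120 / 0.44).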
You can't release a GPU before it's developed, so what should they have done? Wait until there's more competition in a few months, or release it now?
@@fedrecedriz6868 Nvidia's lowest-performance GPU is just that. Why do people hem and haw, complain, etc., when all you have to do is stick a pry-bar in your wallet and get the GPU you need, or buy a lesser AMD or perhaps Intel version and quit bellyaching
First of all, congrats on 100K... 2nd, I really do hope Intel gets better with their GPUs. I'm planning on buying one next year to run as an eGPU for my ALLY X
Looking forward to the B770 and B780
Won't happen.
Only B500 series cards.
Maybe B300 later on but not sure
@@Worgen4ik Won't upgrade then, I don't like downgrading GPU tiers.
First of all, there never was a 780; there was a 770 and a 750.
@@Worgen4ik I don't buy that. There's definitely a B750 or B770 in the works. It'd make no sense for Intel to not release a successor to the A770 and A750 especially when the RTX 5070 and RX 8800 XT will be on the playground.
@@laszlozsurka8991 Intel only made the B580 and B570 because they'd already bought the wafer capacity from TSMC. They've already confirmed that their hardware teams have more-or-less moved on to Celestial and Druid.
Congratulations on 100k man!
I'm interested in seeing what the B770 will end up looking like. (if one will exist)
It likely will. But with how AMD and Nvidia's product stacks are right now, Intel's best shot at getting some market share is launching the 500 series first. Wonder if they'll even bother with a discrete 300 series this generation.
Hell yeah Intel 🎉, support small YouTube channels.
I wish these new GPUs success.
It's the kick up the arse Nvidia & even AMD needed
Higher vram, well done Intel at the lower end
PC is back!
Vex, from the videos Intel showed of their card running various games at 1440p without any image upscaling, I noticed that it was floating around 30 to 50 FPS per game. Would you be able to test the card against its competitors at 1080p, so we can see if it's on par with the rest of them (or maybe even better), and then rerun the tests with the features enabled side by side (including RT enabled and disabled)? Would be greatly appreciated! 😊
One thing to remember is the use of Lumen and Nanite in modern UE5 titles. A lot of games have these active with no option to turn them off. Having more ray tracing cores could increase performance dramatically in these games
While Lumen can be hardware accelerated (though it's not a requirement), Nanite couldn't care less about the number of RT accelerators your card has.
Lumen and Nanite need proper developer oversight and optimization, or they'll tank performance. Too many developers rely on UE5's default toolset and a 'max fidelity at any cost' approach instead of fine-tuning it. UE5 wasn't meant to be used this way: it's flexible but demands skilled engineers to optimize features like Lumen and Nanite for specific projects. Look at Bodycam: impressive for being built by just two developers, but it highlights the limits of under-resourced development, which mirrors an industry-wide issue: studios skimping on talent and passing performance issues onto hardware rather than fixing them. Why invest in software engineers when you can just check boxes and rely on hardware to cover inefficiencies? This is why people wrongly blame Nvidia for DLSS and frame generation issues. Those technologies were designed as a value-add for users, not as crutches for lazy development, but we're already seeing irresponsible devs leaning on them in their minimum hardware requirements to mask their shortcomings.
@@xilix Mostly true, except that DLSS and later FG were developed to justify the big chunk of die they force you to buy, because they purposely designed those chips with AI in mind back then. Sooner or later normal consumers would start asking questions about that, so Nvidia had a choice: spend half a billion designing more SKUs, or throw 50M at software engineers to make some use of it for gamers.
Nice lil bend on that PCB below the PCIe slot, for some sort of riser adapter? @ 22:36
I could see the B580 being the 4060 8GB replacement, since the Ti version with 16GB costs over 450 dollars, and the B570 stomps the 3050 8GB on price and performance; the 3050 was a complete joke when it released
The B570 would most likely land at 4060 performance but with 10GB VRAM, probably even faster.
The B580 is a successor to the A580.
The A580 was supposed to outperform a 3060 but couldn't, due to the Alchemist microarchitecture being deficient (they showed the improvement in the Xe slide).
The A750 and A770 were targeting 3070 Ti performance.
So Intel's next cards in the B series should scale greatly.
For now this is great.
Although they're pretty far behind, I don't think it matters much, because that aspect depends on the support of consumers and buyers.
Who else thought he was rambling too much at the beginning of this video...
Show us some games already bro.... 🤣 😂 Too excited here!
How well is it currently supported on Linux?
This! Really hoping it will be better than the Alchemist cards (for gaming)
Alchemist works fine on Linux, same as their integrated GPUs.
Given that Intel's drivers are open source and updated constantly, as in 1-2 updates every week, I'd say pretty well supported
That would be a relief, I remember Alchemist had a lot of problems in Linux in the beginning, and I lost interest in them before they reached feature parity, if they ever did.
@@EduardoSantiagoDev can confirm this myself as an A750 owner (on an Arch Linux rig): the drivers are stable and improve pretty often... as a 1080p gamer I'm quite happy with it, since it runs my entire gaming catalog with no issues, usually at maxed-out (or close to) settings.
I got mine fairly soon after launch, when the drivers weren't up to snuff yet, so it was well under MSRP (around $40 less if I recall correctly), and paired it with an i5 on a Gigabyte H610M mobo, 64GB of RAM, a 500GB SSD (hosts only the OS and the games I'm currently playing) & a 1TB HDD (stores everything else)... it's been quite a worthwhile bang-for-the-buck budget build that will last me a few more decent years of 1080p gaming.
PS: about ray tracing... who even gives a damn?! I'm mostly into RTS games & good quality indie RPGs 😉
Interested to see how it performs in multimedia production in terms of video decoding, encoding, and supported codecs. There is life for video cards outside of gaming; even support for GPU-based VST sound effects is growing. I read they have dual encoding engines and can work with the Intel processors' built-in Quick Sync hardware encoder, versions 6-9. Could be a good value-for-money video card.
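For anyone who gets a card and wants a concrete sanity check: the Quick Sync engines show up through ffmpeg's QSV plugins, so (purely as an illustrative guess at a test, filenames made up and exact behavior depending on your ffmpeg build) something like

ffmpeg -hwaccel qsv -c:v h264_qsv -i input.mp4 -c:v av1_qsv -global_quality 25 output.mkv

would decode H.264 and encode AV1 entirely on the card's media engines, an area where Arc has been genuinely strong since Alchemist.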
I'm so hyped, I really hope Intel succeeds with GPUs and Nvidia finally gets some competition.
No
Congrats on getting into the big leagues! Speaking of names, I have a Sonos Arc soundbar. Yeah, companies can't come up with unique names.
Bro you could've just printed out a picture of the card for this video. Why do you make a 30 minute video telling us what Intel has said about the card, while you have the card?
With your "I guess we are going to see the benchmarks", you have the fucking card, you can do benchmarks.
You having the card means you can actually review it instead of reading their ads about it.
I think he was simply loaned that GPU, and according to the rules, he can't show it running in a computer yet.
Congrats on 100k man
Lets go!!!!!
Let's go
What's incredible is that 8GB is not enough, when my first Nvidia graphics card in the late 90s was the Riva TNT with 16MB of RAM.
You should have run some initial benchmarks. "I have the GPU & the same information the rest of the world does" is of little use. Cyberpunk's and Tomb Raider's built-in benchmarks would have been a great indicator
Yeah, and I'm sure he will when the embargo lifts, like everyone else
Looking forward to the review, Vex! I can see you're excited, so that gives me positive hope. ;)
I like the design of this card, no stupid leds or over the top chrome.
I love these for the mainstream pc gaming crowd. I hope intel one day hops into the midrange and really puts the pressure on
I'm excited about a $250 card, good job Intel. Need higher end tho
And what's that thing glowing inside the fan at 16:40? It looks fire 🔥
Let me know when the real review comes out.
Really enjoying the work of the GPU division. Excited for the future; I might look into getting and testing the Cleric or Druid generations... I hope those are the names
Entry-level price of $250... oh how times have changed. A long time ago you could get almost top of the range for that price.
Cool, it's not 2005 anymore. Nothing costs what it once did 20 years ago.....
What in the medieval period?
bro bought his first gpu in doubloons
Wow it's not 2005 anymore, crazy right?
HD 6950 MSRP was $300, whereas the HD 6970 MSRP was $370. The GTX 570 MSRP was $350, whereas the GTX 580 MSRP was $500. These were the costs of GPUs in 2010-2011.
This is what I plan on using in my budget build come Jan or Feb. Supporting competition.
The b580 makes me want to take fent
Given this is an entry level card and there isn't going to be a mid or high end one, I feel like it would have made a lot more sense to just forget about ray tracing altogether and spend that money on raster. If it didn't have RT but had the raster of a 4060Ti or even a 4070 it might actually have been interesting and it wouldn't be made immediately redundant in a month.
Given the rise of games that simply require RT hardware acceleration to hit playable framerates at all, I don't think that'd be a wise decision.
This is the first time in a long-ass time that I've gotten excited for new GPUs. I really hope Intel hits the jackpot with Battlemage :)