RTX 4060 Ti In a 13-Year Old Alienware PC - Some Interesting Results
- Published 20 Jul 2024
- After selling the GTX 1050 inside my old Alienware I decided to slap a Palit RTX 4060 Ti inside it instead.
How do games perform at native resolution, does this x8 card work inside this PCIe 2.0 machine and can DLSS3 Frame Generation help alleviate the CPU bottleneck? Let's find out!
0:00 Intro, Palit 4060 Ti Dual OC and i7 920 Overclocked CPU Performance
2:06 Alienware Aurora and Test Methodology
3:11 Cyberpunk 2077 1440p
4:14 Marvel's Spider-Man Remastered 1440p
5:06 The Witcher 3 Next-Gen 1440p
5:43 Apex Legends 1440p
6:34 Grand Theft Auto V 1440p
6:56 Fortnite 1440p
7:32 What About DLSS3 Frame Gen?
7:52 Cyberpunk 2077 1440p Frame Gen
8:43 Marvel's Spider-Man Remastered 1440p Frame Gen
9:35 The Witcher 3 Next-Gen 1440p Frame Gen
Graphics Card Used in This Video:
www.palit.com/palit/vgapro.ph...
Thanks for watching :) - Science & Technology
And they called me a madman for trying to pair a 2070 Super with a 3770k 3 years back.
😂
And the funny thing is the 4060 ti isn’t even much faster than the 2070 super
Driver overhead
i have 2080 paired with i7 2600 💀
7700k + 3070Ti here
It's crazy how some 1st gen Intel chips are still good for gaming under the right circumstances
Yeah definitely
Surely these results show you that it isn't good at all...
@@heylookthatboi Man, I've lived with a slow laptop for years. 50-60 fps on anything is buttery smooth lol
@@EpicTyphlosionTV Yeah okay, but slow laptops are terrible and not meant for gaming at all. Even in a laptop you're much better off with a budget gaming one, I think. I've seen a Lenovo one for 600€ at an electronics store in my city with a laptop GTX 1650, and it sure as hell beat anything with integrated graphics at that price point
Oh buddy, my PC until Monday was an Athlon II X4 651K + 8GB DDR3 1333MHz + R7 250X 1GB, and for a year I was looking for the PC I wanted to buy. I looked into a new option, the Ryzen 5600G, then a new Ryzen 5 4500 + used RX 580, and then I found a used Ryzen 5 3600 + 1070 for $340... and then I found an i5 2500 + 8GB RAM + GTX 760 3GB 192-bit OEM for $70 in a cool-looking case, so I bought it. It's a huge jump for me: GTA V looks awesome, Fallout 4 runs nice, Death Stranding runs well. The previous owner had OC'd the CPU, so it runs at 4.2GHz (temps are fine, it has a beefy cooler), and the good brand-name PSU works well for me. My inner geek is in 7th heaven. I already have a plan: I'll get an RX 570 4GB/GTX 970, then 16GB RAM and an i7 2600K, and I'll have fun building a new PC on the cheap. Platform $70 + RAM $30 + CPU $50 + GPU $60 = for $210 I'll have a PC that runs World of Warships well, plus all the games I have in my Epic library. Then I'll wait until RTX 5000 comes out, see how good it is, and buy a new PC with a 5060 Ti in like 2 years :)))
Never underestimate an old high end machine.
My daily driver for years was a 2006 dual Xeon IBM workstation that a neighbouring company were throwing away.
I’d upgraded it with dual 3GHz quad cores (despite the data sheet saying they were unsupported) and 32GB of dirt cheap server RAM.
I later dumped a GTX 970 in it so I could play Fallout 4 and Project Cars, an amazing beast of a machine.
Ahh, yes, the Frankenstein build.....
I'm thrilled at the overall low price of those old machines and have gathered up about 50 gigs of registered DDR2 dimms for such a build. I fully plan on pushing it to its limit
Yeah, until 2019 I was on an i7 2600 office PC that I added 8 more GB of RAM to and an EVGA GTX 970 blower card. It was completely fine until I upgraded to a 1440p 144Hz monitor.
And what's the total power draw?
Niiiiice, another great pc saved and upgraded
Good job, pal))
The fact that a CPU released 15 years ago even runs these games is mind-blowing. Still does better than some consoles, lol. This is exactly why something like an OC'd 4790K w/ DDR3 2400MHz is still a beast.
I still run a 4790K OC'd to 5GHz. It's been a solid unit for so many years, and for no reason other than because I can, I run CrossFire RX 580 8GB Strix cards. I originally only had one, but after the fall of GPU mining I picked another one up for cheap and whacked it in for shits and giggles.
Also shows how mature cpu technology already was. Back in the 90s, I remember how cpus even a few years old would struggle to run anything new.
@@mikeycrackson I had friends with the 4790k who could only get theirs to 4.8 and 4.9 I think I got lucky with my chip. Took a lot of playing around to get it stable and reliable. Delidded it and liquid metal for thermal and an aio for cooling. Peaks at 84degrees c max
@@mikeycrackson I replaced mine two years ago for a i9 10900 and then 11900k and the gains are there, (bf2042 runs like arse on one) however for *most* games a 4790k is overkill. Ran mine at 4.4ghz all core, 32gb ddr3 cas10 2400mhz. Still using it in my living room pc.
@@mikeycrackson Cities skylines runs bad even on a 13900K.
Nehalem has been around for almost 15 years and it still runs games. Sandy Bridge is nearing the 12-year mark and it still runs games pretty well; sold mine to a friend three years ago and it's still going strong.
Stuff like this did not happen in the 90s or early 00s. It was 3 years max and you replaced the whole computer (hard drives and RAM might've been recyclable). I do not long for those days.
It's true what you say.
I believe that there are 2 main causes for this
First of all, the 28nm stagnation for nearly half a decade, and most importantly the 4-core stagnation that lasted for about a decade, which held many things back.
I always want more advancement, but I adore dragging an old system kicking and screaming into modern workloads.
weeelll... the C64 computer was sold between 1982 to 1994, which was kind of good I think. (I also think it still is the most sold computer at 17 million units)
My parents have a Nehalem-based iMac. After upgrading it to an i7-880 and 16GB RAM, it handles your standard computer duties with absolutely no fuss whatsoever.
Sandy IS over 12 years, they were released on Jan 2011. :P
I remember watching your videos 5 years ago, when I was just starting to build budget gaming pcs, I'm so glad to see you're still making awesome videos and great content :D
When I watched the first video On the Alienware rig and you mentioned putting a newer card in to check it out, I was almost counting the moments waiting haha. You my friend, deliver!
I'm still using first gen! I replaced my i7-920 less than a year ago with a Xeon X5670. I went from 4 cores, 8 threads to 6 cores, 12 threads with the Xeon. The 920 actually has a faster single-core score, but the Xeon beats it on everything else. I would definitely use the 920 again if I had to, but I would LOVE to get hold of the fastest i7 (990?) for my system to try out. First gen systems were built like tanks and they are still a lot of fun to play around with IF you can find one at a decent price. For what some people want just for a first gen board, you can get a FAR faster and NEWER mid-range system instead, and you would be foolish not to in that case. 👍👍
I LOVE that Alienware system you have there, bro, and I would be more than happy to own one even without the graphics card included. Nice video, as it was definitely interesting to people like me who still have first gen on hand to play around with. 👍👍
Those 8 pcie lanes are a huge limiting factor. I expect a 3060ti to do better just because of the pcie bandwidth.
Interesting point. I bet it would. Man, 8 lanes and 128 bit bus for $400
💀💀💀
@@lashyndragon yeah bro but it has AI, it's better
Tests have been made. It's around 5% worse on PCIe 3.0.
@@Phenom98 That CPU is pcie 2.0 so I expect something like 15% and in some cases even more.
Mate, can you explain? I'm really interested in this comment. What is the 8 lanes thing you're talking about? I don't know much about this. Thx in advance.
Very nice idea for a video!
I believe it would also be interesting to test the 4060 Ti with a 990X or a Xeon equivalent, as you mentioned.
I can tell you that the equivalent Xeon is definitely a decent choice for budget gaming (X5650 or faster)
The Xeon equivalent to a 990X would be the X5690. And those Xeons are dirt cheap these days.
Yeah, maybe testing the 4060ti on an older socket X58 or X79 motherboard would be nice to see, X58 in particular as it allows for 6C/12T Xeons.
My man rocking a 4060 Ti, while the walls are bereft of paint or wallpaper! Gamer priorities right there 👍
😂
Yes. Please revisit this experiment with a Xeon X5690 and a full 24GB of RAM. It's the budget-friendly alternative to the 990X and a really impressive overclockable chip. The lackluster bandwidth from the LGA 1366 PCIe bus will surely hold that 4060 Ti back, but you can still achieve a reasonable gaming experience. I paired my X5690 with an R9 390X, and then later an RX 590. Both of those cards seemed to be a good match for that processor.
Thanks for sharing! Maybe you could run a similar test including a 4th or 6th gen processor?
Yeah at some point for sure
Well done, friend. Love your content.
Excellent video. Seems like my I5 8400 will have a second life with DLSS 3.
I haven't watched you in like a year, and is it me or are you a lot slimmer, my friend? lol you're looking good bro
It would be fun to see how many FPS an RX 6600 or RX 6650 XT would get in this system. I have a sneaking suspicion that the numbers would be VERY similar
Didn't AMD start with the x8 thing in the mainstream video card market? I know they started using x8 with RX 5500 XT. Same with the RX 6600/6650/XT family.
I took my 3070 Ti and put it in my old i7 3770K build and I was actually surprised at how well it handled it. It was a good CPU bottleneck, but it still performed decently for the age!
Same for me. 3060 12GB and my old trusty 3770K. In 1080p ultra I get 80~95 fps in almost every AAA games. Sure it's a bottleneck, but for now I see no reason to rush and buy a new CPU =)
this is actually a use case that makes sense for frame gen! on modern systems, even if your fps appears more fluid, it doesn't seem worthwhile for the trade-off in input latency, but on a super old build, where your framerate is hella inconsistent, it suddenly makes perfect sense.
I wanna thank you for this video.
There are so many people in the PC gaming space rushing to buy an X3D chip and trying to pair it with a 3070 because that's all they can afford. I've been trying to tell most people, frankly, unless you specifically need to play a CPU-intensive game, an i3 12100F is almost enough for even a 3090 Ti at 1440p. What this video shows is that in most games you play, the CPU matters very, very little, more so in the 1% lows.
hello, this is crazy, the old PC keeps up so well! 😀😀😀
Nice content! What we love you for! I would be interested in how you felt about the 1% lows: does DLSS 3 really help with stuttering? It can interpolate a frame, so I guess it helps visually, but not with responsiveness?
Obviously this system wouldn't be actually representative of the 4060ti in a modern system.
Very nice video!
Never thought you could improve your CPU's performance with a GPU
Recently I've put together my first pc
It rocks an i7 860, 16GB of RAM on a pretty decent motherboard, together with an RX 580 4GB, a 480GB A400 SSD and a 600W Cougar BXM PSU.
For now I just use this PC for studying, web browsing and light gaming, and it does really great (I mostly play Valorant and it runs at 120fps on max settings)
I wonder if i7 860's performance can be improved too
But i don't think I'll be able to buy something like rtx 4000 series card
Very enjoyable video, thank you very much
You're improving your content with each new episode
Hope to see you in the next one))) ❤❤❤
Given the nVidia driver overhead, would also be interesting to see how the RX 7600 holds up on these older systems.
lol, still gonna be bottlenecked
Finally a compelling use for DLSS3.
Bro, I love your vids ❤😊 you are a real one, your content is amazing
One of the engineers I work with gave me his old PC, that at the time was 5 years old but high end. Now 8 years later it is my sons and still works great, although looking to build a new one for him, but it has been great for quite a while.
So after ages you finally got sponsorships!
Would love to see a comparison by replacing the i7 920 with a six core higher clocked Xeon x5680 or similar, especially as these are now so cheap.
It's interesting to see how well the 4060 Ti got on with the older system. It proves that you can do it, although as you said, whether you should is another matter.... I'm actually running an HP Z620 Workstation as my home PC, got it reasonably cheaply last year, but the only graphics card I had to hand was an old GT710 2GB card, which is woefully inadequate. Sadly I lost my job a couple of months ago so I can't afford anything else....
I would love to see if a XEON X5650 or similar chip from that family would help leverage that 4060 Ti
It's not going to improve anything, it's bus limited. It's an absolute waste of silicon as a GPU..
Overall, surprisingly, that 4060 Ti and that 1st gen i7 920 seem to play nicely together. My first gaming system back in 2011 was an i7 980X, 24GB RipJaws RAM and a Gigabyte GTX 580 Fermi card. What a beast. Miss that old rig.
Amazing. I love that first gen i7/xeons.
old cases were just 👌 love them
I had an old HP workstation that I slapped an i7-870 and a GTX 980 in a while back. Used it for 4 years before upgrading to a GTX 1070 and later doing a new build. Still acceptable performance to this day
Don't forget the RTX 4060 Ti has only 8 PCIe lanes
Yeah that’s also why I wanted to use it here 😁
My first gaming PC was the Aurora R3. It had 2x GTS 460 in SLI and an i7, and it ran any game I threw at it. I upgraded to an EVGA GTX 680 FTW 4GB and it did all I needed it to do for years before I ended up selling it. This case always has a soft spot in my heart; it was stable and never had issues :)
I had an i7 920 paired with a 1080ti. Cpu still did alright, upgraded 1 year ago
I'm seriously considering chucking one of these into my i7-3770K rig as an upgrade over a GTX 980 and to extend its life by another 2-3 years. Crazy to think when I got it 10 years ago I thought I'd be able to just about eke out 5 years from it, and yet here we are.
I remember I had an i7 920 6 years ago. For how old it was, it was a hell of a CPU for gaming. But man, is that thing old.
While socket 1366 was considered a powerhouse back around 2010, the i7 920 needs help via OC'ing.
How’s the diy coming on? I recognise all too painfully the look of freshly skimmed walls in the background
great video, and can't wait for the 4060 Ti 1440p video.
but one thing is still unclear: did PCIe 2.0 affect the performance, and if so, how badly?
I think the 1060 was probably the last card that would see little difference between 2.0 and 3.0, however this chip is probably struggling to keep up with the 4060 ti, so I doubt it's making much difference in this particular case.
An overclocked X5690 would probably see this card run into some issues, or cranking the res on the current processor.
A comment above this got me wanting to put my RX 6600 into my 6 core 1366 build.
I had an i7 920 Gateway gaming system that carried me all the way till last year when I got a 5600X. I replaced the original GTX 260 with a 1080 Ti before the crypto boom and I had no problems at 4K.
I run an old i7 4770K on a Gigabyte Z87X-UD3H motherboard with 16GB DDR3 RAM paired with an RTX 3060, and I can say that the i7 is worth its weight in gold and does very well keeping up with the 3060. The only downside to this processor is that I will not be able to run Windows 11, which I don't really need or want at the moment.
I wonder how much the PCIe 2.0 slot played a part in the i7 system and its performance?
this GPU really only works as intended with PCIe gen 4, since it only supports x8 (of course if you aren't heavily CPU bottlenecked like here LMAO) but you probably would have a hard time finding a CPU like that on a modern motherboard
@@H4GRlD I know, but x8 PCIe 2.0 is only equivalent to x2 PCIe 4.0, which surely hampers it. I was just wondering by how much.
Yeah good question, I've only seen one video on gen3 where it was behaving technically as 3060Ti or even slightly worse, so probably it's quite a significant performance loss on even roughly half of that.
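For anyone wanting to sanity-check the bandwidth equivalences this thread is throwing around, the back-of-envelope math is straightforward. Here's a rough sketch (my own illustrative Python; it uses the theoretical per-lane transfer rates and line-code overheads, so real-world throughput will be somewhat lower due to protocol overhead):

```python
# Theoretical PCIe link bandwidth: per-lane transfer rate times
# line-code efficiency (8b/10b for Gen1/2, 128b/130b for Gen3/4).
GT_PER_S = {1: 2.5, 2: 5.0, 3: 8.0, 4: 16.0}
ENCODING = {1: 8 / 10, 2: 8 / 10, 3: 128 / 130, 4: 128 / 130}

def link_gbytes(gen: int, lanes: int) -> float:
    """Usable one-direction bandwidth in GB/s (1 transfer = 1 bit)."""
    return GT_PER_S[gen] * ENCODING[gen] * lanes / 8

for gen, lanes in [(2, 8), (3, 8), (4, 8), (4, 2)]:
    print(f"PCIe {gen}.0 x{lanes}: {link_gbytes(gen, lanes):.2f} GB/s")
```

This works out to about 4.0 GB/s for x8 Gen 2 versus about 3.9 GB/s for x2 Gen 4, which is where the "x8 PCIe 2.0 ≈ x2 PCIe 4.0" comparison comes from; an x8 Gen 4 link, which the card was designed for, has roughly four times that.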
Wow, this is an interesting video.
Surprised the 15-year-old i7 920 is still performing quite OK.
Do you think an i5 4690 (non-K) would perform similarly?
Yep, the earliest Core i7 batches will be 15 years old, later this year.
Ha! That was my first “gaming” pc back in 2009 before I wised up and discovered how much better (and more fun) it can be building my own!
True! Used to spend HOURS flipping through Computer Shopper putting together possible builds. First one I put together had a Cyrix 686 (I think) chip 😅
Until last year I ran this exact Alienware but with a 1060. It lasted way longer than I would ever have anticipated, and it still works for games like RDR2/Far Cry 5 etc
I wonder how the 6 core LGA 1366 Processors would stack up in this comparison.
Yes I wanna see this! The 990X could still prove to be a beast!
Although Dell would never admit it, Most of the boards in these do work with 1366 xeons. My buddy is using one of these Aurora boards with a x5680 and a 6600 non-xt and it's a very decent 1080p machine.
(Not the card I would have picked given these run pcie 2.0 and a 6600 only has 8 lanes but I was surprised how good it was.)
@@slimjim2321 Hehe that's unfortunate, but 1440P would probably even that out. I love my 6 core xeon build, and I have a 6600 to put in it. Thanks for the cracked idea, I bet it'll do pretty well!
@@slimjim2321 AMD seemed to start the "8-lane train" on mid-range cards, notably with the RX 5500 XT.
Still was using my X58 Asus prebuilt till last January. Swapped the GTX 260 for a 6870, a 660 Ti, then a 1050 Ti, and the i7 920 for a 6-core, 12-thread Xeon @ 4.20GHz. Still does the job.
I'm still running at 80~95 fps at 1080p ultra in all the recent AAA games with an i7 3770K OC'd to 4.5 GHz, 32 GB 1866 MHz DDR3 and an RTX 3060 12GB. (Horizon Zero Dawn, Cyberpunk 2077, SotTR, Forza, etc.) I was planning to change to a 5800X/5900X but I'm not in a hurry right now. And I might buy an RTX 4060 Ti instead. This frame gen is really nice =)
If you have a W3690 that could substitute a 990X for testing, I was running one until quite recently and it did me quite well for most games
This: it's arguably better, because the Xeons generally got better quality bins, so you'll get better OCs. I've heard the X series are actually better overclockers, but at the cost of upwardly locked multipliers: so getting the max out of both your RAM and your CPU will be trickier.
My partner is running a W3690 overclocked to 4Ghz in an old Z400 workstation with 24GB mem and a 1070ti, it is still an absolute powerhouse and she hasn't encountered a game yet it can't run @ 60fps and highish settings. Will have to see how AVX requirements in newer games go though, as these old chips don't support it.
0:05 please tell me the case is making Cylon noises while the front cover slides down
It's always those 1% and 0.1% lows that just make games so stuttery. Still love seeing old CPUs perform "well enough" to justify saving some cash though. I tend to just leap frog between my CPU and GPU. Recently I upgraded from a 3570K(OC) to a 7700X, but still using my 1080 Ti. In many cases, my FPS doubled or tripled. Based on the games I play, I think I can wait for cheaper used RDNA3/RTX4xxx before upgrading my GPU.
I always tell people to plan to do this. Get a good case, and go "all out" on one or the other - more of the budget spent towards either CPU OR GPU - then every 2-3 years just replace them back and forth. That way your oldest component is 4-5 years old and one of the two is always current.
I personally like a top of the line CPU + midrange/budget $400-500 ish GPU, which was historically the XX70 tier, then going "big" on GPU a couple of years later, then a couple of years after that a platform upgrade of CPU/mobo/RAM, repeat. This allows you to spend ~$900-$1000 every two or so years and always have a good rig. Long term it's the same money commitment as buying a new iPhone every 2 years.
In many games I play, the 1% and 0.1% mean nothing, because in Fortnite the fps can dip in the bus when it loads in, and then you get a 1% low of like 20fps... but that's the only part where you see that 20fps; the rest of the game is 120fps. Same with Horizon Zero Dawn, I can get those low numbers when a cutscene enters, but when I play I am nowhere near those figures.
@@AndrewTSq It should be obvious that we're only talking about gameplay performance, not loading screens and cut scenes. When you capture frame time data for benchmarking, you should always take those out anyway.
@@nathanddrews I don't think RGIHD trims those metrics when entering/exiting cutscenes, levels loads, etc. as I've seen him mention it specifically to explain some low .1% numbers in his data.
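Since this sub-thread is about what actually goes into the 1% and 0.1% low figures, here's a minimal sketch of one common way they are derived from a frame-time capture. The exact methodology differs between capture tools, so `one_percent_low` below is my own illustrative helper, not any specific tool's formula:

```python
def one_percent_low(frame_times_ms: list[float]) -> float:
    """Average FPS over the slowest 1% of captured frames."""
    worst = sorted(frame_times_ms, reverse=True)   # slowest frames first
    n = max(1, len(worst) // 100)                  # worst 1%, at least 1 frame
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms

# 99 smooth frames at ~60 fps plus a single 50 ms hitch:
times = [16.7] * 99 + [50.0]
avg_fps = 1000.0 * len(times) / sum(times)
print(f"average FPS: {avg_fps:.1f}")
print(f"1% low:      {one_percent_low(times):.1f}")
```

A single hitch barely moves the average but craters the 1% low, which is why the lows track perceived stutter much better than average FPS, and also why loading-screen and cutscene frames should be trimmed from the capture before computing them.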
There is no SHOT that a 1997 processor could play a 2010 game competently, yet here we are, a first gen i7 getting 60fps at 1440p in one of the most demanding (shit) games of our generation.
I'm not sure what the best angle to take on this is: that 2010s processors are beasts, or the less attractive reality that computing technology has plateaued somewhat.
I run a 4090 with my old i7 5820K. That CPU is from 2014 and I still use it at the moment in my gaming rig (all-core OC to 4.4GHz) with quad-channel DDR4, 32GB, 2666MHz CL14. Red Dead Redemption 2: 100+ FPS, Assetto Corsa Competizione: ~80-120 FPS, Dying Light 2: ~100-140 FPS with DLSS3, BF 2042: ~70-100 FPS, A Plague Tale: Requiem with DLSS3: 120 FPS (some stuttering sometimes, CPU load 90-100% xD). All games, even newer ones, are absolutely playable even with this old CPU.
BUT: you feel the age anyway. Long loading times in games, high power consumption (the highest was up to 160 watts in BF 2042), and frame rate instability due to slow uncore/bus or cache speed that shows up as minor lags or stuttering in game. Nevertheless, games are very playable after all, with just minor lags in some of the newer ones. It was good enough that I haven't felt the urge to upgrade immediately to a new platform. Some of my new PC parts arrived yesterday, but I used this CPU/GPU combination for 6 months.
You can't do this with a 9-year-old GPU; it's possible you can't even start new games because of missing driver support, and if they run, maybe only at the lowest settings with low framerates. Sorry for my bad English.
Interesting. The 4060 or a possible 4050 might be good options for older systems or weaker modern systems like a G7400.
Can you do 4060 Ti PCIe 3 vs 4 @ 1440p? PCIe 4 x8 should have a noticeable impact on performance on PCIe 3 boards. 🧐
do you have a sense of how much power it uses? Did you upgrade the PSU?
I am thinking about putting it in my old PC with an AMD FX 8250.
This makes the most impressive use case for DLSS3 frame gen to date, IMO. If only more games supported it
I love the mad science vibe some of your testing gives off... And that ugly old Alienware makes it even more mad sciency!
Put an X5690 (or a similar 6-core) in it!
I love the x58 platform.
I have a 990X on X58 and am curious if I should slap a 4060 into it while I plan my next PC build. My current RX 480 drops the frame rate below 60 FPS at 1080p
My PC is the same age, but a 3rd gen! It still plays today's titles at 50 frames and upwards with an RX 580 at 1080p med/high/ultra depending on the game. Now that I'm seeing the age of the CPU and board, I think I'll need to upgrade sometime this year
In Apex Legends, the lowest frames on the i5 are higher than the top frames on the old i7. It really shows how much progress CPUs have made.
You sitting there in the room with a camera kinda makes it look professional compared to your outdoor camera shoots.
well, I would love to see how a 4770K would hold up against the i5, because it's a very popular processor from the past, but still old
Can you run a similar test with a Haswell 4770 or 4790? (hell, even a 3770 would be a big jump over a 970!) I suspect that hardly anyone still uses Gulftown-era CPUs any more, as compared to Sandy Bridge to Haswell.
Honestly had no idea such an old CPU could be still so capable in 2023. I think I have one myself knocking about, but no motherboard. Might have to hunt one down and try it out for myself.
Brilliant performance considering the age of 1st gen i7...and can be purchased for just 10p @ CEX. Maximum bang for buck!?
You'll pay it back in the power bills ;)
I have the same rig same cpu. Wonder what gpu i can pair to still keep using it. Any suggestions?
Yeah wow - the 4060 Ti doesn't look too mismatched here indeed. The usage % weren't too far apart in most games. With 1440p or even some older titles in 4K and frame gen this looks pretty usable indeed.
Exactly what I was thinking after seing this video. I might buy a 4060Ti instead of changing my 3770k... 🤔
The fastest GPU back when this came out was a GTX 285, so it's wild to imagine how much power it was holding back to now nearly max out a 4060ti
I bin-dived a Lenovo M700 that still had its 6th gen i5. Stuck my 2TB SSD in and it's working great. It's small form factor though (not mini), so my obese 1060 can't fit, so I'm in the market for a sleek card of the same or more power.
Any thoughts?
Would love to see how this GPU pairs with an X5650 clocked at a basic 3.8-4GHz.
I think a good idea would be to test whether a GTX 1080 or 1080 Ti would give the same performance as the 4060 Ti here. It would be a good demonstration of why pairing an older CPU with a newer GPU isn't ideal. But that PC for $85 and a $100 GTX 1080 would be a great pair for 1080p gaming today for under $200
I am curious on how this RTX 4060 or even an RTX 3080 would behave with a socket 775 system and quad core CPU.
How is DLSS activated? I keep hearing about it. Is it something in the GPU drivers or is it a Windows 10 feature?
Will you get the 16gb and non-Ti variant as well to collect all 4060 Infinity Stones
Because the 4060 Ti has 8 PCIe lanes, der8auer tested it with a 10850K and showed PCIe 3 actually bottlenecks it. I've looked at Afterburner here and barely ever did either GPU or CPU get near 100%. I'm pretty sure you'd have some interesting results comparing it to a 3060 Ti. Too bad nVidia is locking frame generation to the 4000 series only; I really believe it would work on older cards, but hey, something has to make new cards seem better than the old ones.
frame gen is literally running via hardware acceleration on the new cards, which the old ones don't have
they're supposed to retrofit all their 30 series with the new acceleration cores?
@@gozutheDJ and Ivy bridge was running hot because smaller manufacturing process. I don't believe every piece of marketing bs and damage control manufacturers serve me.
@@youzernejm it’s not marketing bs lol, its how they do dlss as well. you could argue they should make these things at the software level and run them on shaders, but dedicated hardware acceleration will always be better so it’s silly to make that argument
@@gozutheDJ been in marketing for a decade, in it for longer, learned not to believe everything big tech tries to serve us as a reason. If you want to believe them, that's your choice. I don't.
@@youzernejm lmao
as I said, they don't have to hardware accelerate it, but it produces better results
same reason older AMD cards can't do ray tracing
You can put a xeon X5675 in that thing for only 15 quid and have a beastly 2011 six core
Yeah will try it
@ 4:20 Average FPS was doubled but the 0.1% low was tripled: 17 FPS on the i7 times 3 is 51, and the newer processor had 56 FPS, so a few frames above triple
2:48 Triple Channel Alienware POG
with something like frame gen you can make most systems really good in gaming
How about upgrading to an i7 990X/980x and retest again?
Did the BIOS start acting strange after the card was installed? Look for these symptoms: 1: BIOS menu laggy when in the BIOS config page. 2: The battery does not retain BIOS settings, only the standby power does, so if you turn off the PSU, the BIOS on next boot will prompt you to re-enter everything.
Maybe it's because I used to own an Alienware Aurora like this one that I have really enjoyed these past two videos, but it also upsets me that I sold mine. It also had an i7 920 and 12GB RAM. It had a Radeon HD 5870 1GB that I replaced with an RX 550 4GB. I played Far Cry 4 on that setup and it was a decent experience. Also, this kind of showcase is the only way that seeing 4060 Ti benchmarks is pleasing to me. Cuz... well, you know... Nvidia.
In 2020, I did a "during the pandemic project", which was a Corsair Vengeance 12 GB triple-channel upgrade. I was able to install and update Windows 7 without a "PAGE_FAULT_IN_NON_PAGED_AREA" BSOD or "BAD_POOL_HEADER" BSOD. That was an ongoing issue I had with the Asus P6T6 WS Revolution, which I bought from eBay in summer of 2019 and I was facepalming. Because the motherboard seems to be in excellent condition. I did the project on December 5, 2020.
I ran an OC'd Xeon 5670 until a year ago. Depending on the game it held back my 1060 quite considerably. With how cheap Ryzen 5000 is now, these old platforms make no sense at all. I'm also intrigued to know how much that PCI-E 2.0 x8 link is crippling things.
Ryzen 5000 costs like $300 with motherboard, CPU and RAM. You can get the old stuff for $40 with RAM and a Xeon.
my LGA 2011 system gives me a B2 POST code with a black screen on start. I bought the 4060 after using a 1050 Ti for a long time. Any idea how to fix this?
This makes me curious how old of a CPU you can get away with without losing much or any fps with the 4060 Ti or a similar GPU.
You should try one last attempt at the Alienware with those dirt cheap old xeons such as the x5675
The CPU holds the graphics card back less because it's an x8 PCIe card, I think; having fewer lanes means fewer instructions on the bus trying to get out to the controller, which helps a LOT with old CPUs.
Less than a 40% bottleneck across such a wide gap is pretty low
I would like to see an AMD 3200 tested with a 40-series card or an AMD one, just because there is such a big delta between low-end AMD chips and Intel chips (CPUs)...
That's actually a lot better than i thought
Im gonna pair a 3050 with an i5 4590
I remember in November 2020 when I paired an RTX 2060 with an i7-7700.
For everyone looking to build a new machine, every website and YouTuber is awesome. For the legion still running older hardware (or with kids running older kit), you sir are a godsend. Thank you for showing how newer GPUs in older systems will perform.
just bought this GPU yesterday. I'm going to be replacing my 3060 with it. IDK how much of an upgrade it is, but hopefully I can see the difference lol. I will pair it up with an i7 8700. Hopefully it won't bottleneck it that much.
4060 Ti and an old system pair wonderfully
I have been touting these 1st Gen Core i-series CPUS as my new favorite "retro" chips. Thank you for doing this video! I feel justified.
All these types of videos show a few things that I find very interesting. First the DLSS tech, which I basically never use because I just don't need to, but it is a fascinating technology. Something else that is very interesting is how backwards-optimized many games, in fact, are on PC. You could never run a new release like Spider-Man on, say, a PS2, right? The difference is just too great. But PCs are so easily configured that you can really roll with something like this. Thirdly, this is why optimization and bug fixing for PC titles is much more intensive than for consoles, since consoles are "set" in their hardware and features while PCs are all over the spectrum in terms of power. Maybe someone has a great CPU and GPU but 8 gigs of RAM. It'd obviously run better with 16 or 32, of course, but can it squeak by with 8? Just the sheer number of viable system configurations is very expansive.
Final Point: this reminds me of what I did about 20 years ago. You'd get a system and maybe you couldn't afford to upgrade EVERYTHING. So you'd slap a bit more RAM at a faster clock speed in. You'd get a faster drive. You'd maybe just save up for a new GPU. And sort of "stagger" your upgrades so you never really had the top of line for any part but there was also always an obvious bottleneck to the system that when it was upgraded you always went "WOW this is great!". Now I just save up as soon as I buy a system for about 4 or 5 years (change, a bit from every check, etc) and whatever amount that is ends up being my budget.
Really great tests and interesting results!
Rgb = more frames