Did Drivers Save The Radeon 8500?

  • Published 14 May 2024
  • In this video we take a look at the Radeon 8500, ATI's flagship from 2001 that was well known for being an underwhelming graphics card. Over time, however, driver development saw this card slowly but surely improve. In the end, was it enough to make this card a beast? Let's find out.
    I was going to launch this video on Patreon a day before the UA-cam release, but the only Patron that this is applicable to at the moment (Ferroset) requested that the video be uploaded straight to UA-cam instead. Thank him for the early upload!
    Links
    Discord: / discord
    Patreon: / spng
    Timestamps
    0:00 - Intro
    0:38 - Background
    1:18 - Release of the 8500 & its improvements
    2:03 - TruForm & Early Tessellation
    3:40 - A note on tech demos
    4:09 - How did it do against the competition?
    5:10 - Drivers & Testing Conditions
    6:03 - Benchmarks
    10:08 - 3DMark 2001 SE Results
    10:30 - Conclusion
  • Science & Technology

COMMENTS • 86

  • @SPNG
    @SPNG  18 days ago +19

    Turns out I had Vsync enabled in the Max Payne benchmark, which does interfere with the results slightly (you can see it running into the framerate cap in the footage). There isn't an option to disable it in the game so I took it for granted that there was no way to turn it off but a few people pointed out it's possible to disable it in the drivers. I retested the game like this and here are the results (like before the benchmark was completed 3 times and then averaged):
    Cat 5.8 Average: 64 fps
    Cat 5.8 1% Low: 30 fps
    Cat 5.8 .1% Low: 17 fps
    7.60 Average: 57 fps
    7.60 1% Low: 18 fps
    7.60 .1% Low: 10 fps
    With this, the actual results don't change too much (+12% instead of +10%) but I wanted to drop that breadcrumb for anyone who noticed. Here's the frametime graph as well, the results are pretty much the same here too: imgur.com/a/LDBImpf
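A note on how figures like these are typically computed: below is a minimal Python sketch of one common convention, where the 1% and 0.1% lows are the average fps over the slowest 1% and 0.1% of frames (tools differ; some report the percentile frametime directly, and the function name here is my own):

```python
def summarize(frametimes_ms):
    """Average fps plus 1% and 0.1% lows from a list of per-frame
    times in milliseconds. Lows here average the slowest slice."""
    frames = sorted(frametimes_ms, reverse=True)  # slowest frames first

    def low_fps(fraction):
        # Average frametime of the worst `fraction` of frames, as fps.
        worst = frames[:max(1, int(len(frames) * fraction))]
        return 1000.0 / (sum(worst) / len(worst))

    avg_fps = 1000.0 / (sum(frames) / len(frames))
    return avg_fps, low_fps(0.01), low_fps(0.001)

# 99 smooth 10 ms frames and one 100 ms hitch: the average barely
# moves, but the 1% low exposes the stutter.
avg, p1, p01 = summarize([10.0] * 99 + [100.0])
```

Run over each of the three passes and then averaged, this yields the kind of Average / 1% Low / 0.1% Low figures quoted above.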

  • @DrivingVertigo
    @DrivingVertigo 17 days ago +20

    Another amazing feature of the Radeon 8500 was that the anisotropic filtering was almost free. That alone gave it the best image quality in 3D games at the time. Nowadays I can't imagine running games without anisotropic filtering, it makes textures so clean and crisp.

    • @SPNG
      @SPNG  17 days ago +2

      I plan to test some more games on this card with AF when I do my early-2000s GPU comparison; in hindsight I should have tried AF given the card's proficiency at it. It's funny because this is a stark contrast to the slow supersampling AA this card has; it looks great, but the performance penalty is pretty extreme.

  • @PixelPipes
    @PixelPipes 14 days ago +1

    It's cool to see the driver evolution. This is definitely one case where ATI made some of their biggest strides.

    • @SPNG
      @SPNG  12 days ago +1

      Honestly I was a bit of a doubter of this card myself until I really tested it out... changed my mind about it very quickly, it's impressive!

  • @mrmcguru163
    @mrmcguru163 7 days ago +1

    Dude this is soooo fricken good, would love to see more videos like this! Subbed!

    • @SPNG
      @SPNG  6 days ago

      Much appreciated man, welcome to the channel 🙂

  • @JohnDoe-ip3oq
    @JohnDoe-ip3oq 18 days ago +10

    I had one; the issues were overblown. You didn't need 300 fps in 3DMark, you wanted to max out settings at your refresh rate, and this is exactly what the 8500 did over Nvidia. Especially with free anisotropic filtering, 3dfx-quality anti-aliasing, and actually usable 16-bit color (unlike Nvidia) for those old games. The analog signal and color were also cleaner than Nvidia's. A bonus to tessellation was having its own T&L besides shaders, while all other cards used shaders. PS: benchmark any period-correct Nvidia card vs ATI using anisotropic filtering. Nvidia chokes. There were also the multimedia capabilities via the MMC software: watch your videos transparent against your desktop. There's no question ATI was superior if you had one; Nvidia just had better marketing.
    Also, on the history of Nvidia TNT2 drivers: Nvidia disabled multitexturing in 16-bit and AGP 4x to sell GeForce; RivaTuner had registry hacks specifically to fix this. The XP drivers were also half the speed of the 9x drivers until years later. This, and the absolute scam pricing, was why I tried a Radeon, knowing there was more to the story than Nvidia marketing lies. People bought the lies so hard they bought the GeForce FX with a Pentium 4, while I was using an Athlon with a Radeon 9800, and it only got worse from there, so bad that people think Nvidia's 8 GB 128-bit cards are actually worth it for DLSS and ray tracing. Lol. History repeats.
    Speaking of which, none of Nvidia's DX9 cards were good. The 6000-7000 series had broken SM3 running at 30 fps, early cards had design flaws in video support, and there was a cache flaw that broke resolution scaling. You literally couldn't use the cards for what they were sold for. Remember Xbox 360 vs PS3? That was latest-gen Radeon vs GeForce, and that's considering console games were more optimized. It was a total slaughter on the PC.

    • @shadowopsairman1583
      @shadowopsairman1583 16 days ago +1

      I stopped using NV with the R9700 Pro and haven't looked back.
      I've seen too many people bash AMD for drivers when I had no problems, yet I see plenty of NV problems with drivers or hardware failing.

    • @JohnDoe-ip3oq
      @JohnDoe-ip3oq 16 days ago +1

      @@shadowopsairman1583 There were 2 instances of real driver problems: the DX8 cards on Windows 9x, and when AMD bought ATI and mishandled drivers until the drivers for Rage caused them to change the release model. Neither case was a big deal. The DX8 drivers got patched and you were better off using XP, and AMD did fix the release model. Keep in mind both times were on unpopular hardware: DX8 vs 9, DX10 vs DX11. Also, the whole VLIW vs GCN thing. If you went with the popular cards you didn't have problems. Nvidia shills like to pretend last-gen fake problems apply to perfectly fine current hardware. They also don't acknowledge their own problems, which often are unfixable hardware design flaws.

  • @ThePuuFa
    @ThePuuFa 14 days ago +1

    I had one of these with a few pretty cool mods back in the day. I cut up a large Slot 1 heatsink for cooling the RAM chips, mounted an AMD K6-2 CPU cooler on the core, and of course did a volt mod and maximum overclock! That thing was an absolute beast! Maybe some day I'll make another one just for the nostalgia😁

  • @squeeeb
    @squeeeb 17 days ago +3

    Nice review. I liked the Truform history lesson!

    • @SPNG
      @SPNG  17 days ago +2

      Thanks, glad you enjoyed! I wanted to put some focus into it as it's something a fair amount of people gloss over when looking at this card, despite how ahead of its time the feature was.

  • @JeordieEH
    @JeordieEH 17 days ago +2

    I honestly really wanted an 8500 and thought they were impressive; I had a 7000, 9200 and 9600. Yet I remember thinking of how awesome an 8500 AIW was. I really liked the AIW series, and I eventually bought a Radeon capture card to have a home theater PC. Sadly I hated seeing ATI and Nvidia moving away from that trend and HD video not really being an option, but the internet came up with a better solution. Then media eventually realized streaming and ease of use were something that had to catch up.

  • @shotgunl
    @shotgunl 17 days ago +1

    Another great video! That's some impressive performance uptick in the newer drivers. Truform was always cool at the least, or at least I thought so. I never had any experience with the Radeon 8500. In my desktops, I had been using Nvidia products since the TNT2 and did so until I got a Radeon 9800 non-Pro in '04 when my GF4 Ti 4400 started artifacting. However, after I purchased a 4:3 32" Sanyo HDTV CRT with an HDMI 1.0 input, I did put together a home theater + living-room emulation SFF PC (based on a Biostar IDEQ 210V barebones + a Thorton-based Athlon XP 2400+) in Aug '04 that I put a Radeon All-in-Wonder 9000 Pro in. It was fine for what the PC was intended for. We used it in some LAN and online gaming, mainly UT2004 but a few other games, including WoW when the roommates and I got into that in early '05. That CRT TV, which I still have and use on occasion, will accept up to 1280x960 in 4:3 or 720p in 16:9 in progressive scan, or 1920x1440 in 4:3 and 1080i in 16:9 in interlaced mode, and will not display anything below 640x480 through the HDMI input: it desyncs. I usually did 800x600 in contemporary games, if I recall; definitely WoW was at that because I have a few screenshots backed up from before that SFF PC got water damaged... Well, keep up the great work!

    • @SPNG
      @SPNG  17 days ago +1

      Much appreciated man. Sounds like it would have been an awesome HTPC, the AIW 9000 PRO is a very sensible choice for a system like that. I hear those IDEQ cases were excellent too, reminds me of the cases a lot of the old Shuttle PCs used! Shame the system ended up getting water damaged 😢Growing up my family used to have a similar Sanyo CRT, it was really nice. So many memories of playing the Original Xbox and N64 on that! I would do retro gaming on a CRT monitor myself but they're a little pricey and bulky... maybe I'll bite the bullet on one at some point though. Thanks for another one of your anecdotes, always fun to read about your experiences with old hardware 😀

  • @madchiller123
    @madchiller123 16 days ago +1

    Blast from the past. In 2002 my dad and I built our first DIY machines. I got the Radeon 7500 and he got the 8500. On top of that, an AMD Athlon XP 1800+ and a massive 1 GB of RAM. Good times.

    • @SPNG
      @SPNG  16 days ago +1

      Both would have been very nice systems at the time! I'd love to try this card in an Athlon XP system at some point

  • @podio_km4g532
    @podio_km4g532 18 days ago +11

    This must be the 5th video in which GT5 OST interrupts me listening to GT5 OST lol

    • @DesautelsComputer
      @DesautelsComputer 18 days ago +7

      SPNG has been using gran turismo and PGR2 music forever, so yeah that's a pretty spot on assessment

    • @podio_km4g532
      @podio_km4g532 18 days ago

      ​@@DesautelsComputer Makes sense then

  • @tamw
    @tamw 18 days ago +2

    Awesome card, great video man, such a trip down memory lane. A friend of mine was still running this card in 2006 with his P4 2.4GHz, 512MB RAM and big old 19" CRT, haha. I lent him a 2GB kit of RAM and he was so amazed about all the textures he could suddenly see in World of Warcraft. Imagine having 512 + 64MB to work with in 2006, what a trooper.

  • @Shmbler
    @Shmbler 17 days ago +4

    Drivers made such a big difference back in the day. Today when I test old ATi cards I quite often find myself thinking "wow, not as bad as I remembered it to be". Drivers were the thing that nVidia did right, beginning with the TNT.

    • @SPNG
      @SPNG  17 days ago +2

      Yeah, it was something NVIDIA more or less got right the first time. ATI could release their cards with stable and fast drivers too but the 8500 definitely wasn't one of them.

    • @shadowopsairman1583
      @shadowopsairman1583 16 days ago

      Cough, Detonator drivers burning up their GPUs...

    • @RyanAumiller
      @RyanAumiller 16 days ago

      @@SPNG Well, all nVidia did was buy 3dfx and build off the GLIDE API, which was essentially a reverse-engineered version of Silicon Graphics' hardware rendering API.
      If ya can't beat 'em, BUY 'EM has been the corporate modus operandi ever since!

    • @JohnDoe-ip3oq
      @JohnDoe-ip3oq 16 days ago

      Until GeForce came out and the TNT performance took a nose dive. Same with XP support. There were registry tweaks in RivaTuner to re-enable multitexturing and AGP. Very clear it was deliberate.

  • @awnordma
    @awnordma 18 days ago +6

    Back in late 2001 there was a monthly computer show that I'd go to when building a new PC. There was a seller that had new OEM 8500s for $200; it was a killer deal since retail was $300. I was building a friend a new system and was blown away by it. Went back the next show and got myself one too. Got about 3 years out of the system before upgrading to a new one with a GeForce 6600 GT.

    • @SPNG
      @SPNG  18 days ago +2

      Damn, that was closer to the going rate of an 8500 LE, would have been a steal for this card! With the driver improvements I can see how you got a good amount of longevity out of this thing!

    • @JohnDoe-ip3oq
      @JohnDoe-ip3oq 18 days ago +2

      The DX9 Nvidia cards were dog water. You couldn't get 30 fps in Far Cry. SM3 was broken. All Radeon cards were superior until DX10. The X800 was SM2, but Nvidia couldn't run SM3 over 30 fps, and the X1900 was basically Xbox 360 vs PS3. The X1900 destroyed Nvidia because they never fixed SM3 performance. The 6000-7000 series was only good at SM2 and marketing lies. Their mid-range cards were also typically super crippled, just like today, while Radeon didn't do this until the low end. Bonus tip: Nvidia also hit a resolution wall due to some cache design flaw, Radeon didn't, same with VRAM. You had to super cherry-pick benchmarks and market BS to sell Nvidia, which is what they did.

    • @JohnDoe-ip3oq
      @JohnDoe-ip3oq 18 days ago +2

      The real price steal was the Radeon 9000; I got one for around $50, faster than a GeForce 3, with full DX8 unlike the GeForce 4 MX. The only downside was bilinear when using anisotropic, but this was a true budget DX8 card and that was the catch.

    • @SPNG
      @SPNG  16 days ago +2

      @@JohnDoe-ip3oq I wouldn't say that was the reason they fell behind. SM3 wasn't broken on the 6000/7000 series cards, the reason they fell behind X1000 (at least in the high end) is because the GPUs and architecture were not well equipped for complex shader code used in later games. For one, X1900 had 48 pixel shaders compared to the 24 pixel shaders of the 7900 GTX, in games of the time they performed very closely to each other but as game demands changed they started to make use of the X1900's brute force pixel shading power. It also helped that R580 (and the rest of X1000) had dedicated scheduling hardware, which helped a lot with more complex branching code. In contrast G70/G71's lack of scheduling hardware meant it would run into missed branches more often and its huge thread size caused it to wait much longer to flush the pipeline when this happened.

    • @JohnDoe-ip3oq
      @JohnDoe-ip3oq 16 days ago

      @@SPNG Exactly, the entire point of SM3 was complex shader code, and Nvidia couldn't run it effectively. It was a tech demo, like DX9 support on the FX. They tricked people into thinking they had a good architecture, especially because it ran SM2 well, but SM3 did not run well. The X1900 wasn't better just because it had more shaders. The X1800 had WAY fewer shaders and could do SM3. Not only that, the X800 had SM2b or some in-between support that could run back-ported SM3 effects. If you want to argue semantics that SM3 wasn't "broken" in the purest definition, you can have it, but the point stands that it was subpar and poorly designed to run SM3. Nvidia's approach from the very beginning of GeForce was brute-force bad design over efficiency. So Radeons had HyperZ, better shader performance, and free anisotropic filtering. PowerVR had tiling. 3dfx had Glide. Nvidia had 256-bit, the dust buster, and tech demos. Plus they cheated in those tech demos, and had better OpenGL support. So Nvidia won by marketing and drivers, and lost on actual capabilities. They could have fixed SM3 in the 7000 series because they knew it was bad in the 6000 series, and they didn't. Hell, the biggest reason we didn't get high-quality shader games until the Xbox 360 was because Nvidia never had hardware capable of its marketing claims, and because they had a monopoly on PC; it took having a Radeon in the Xbox 360 to force PC games to look better. Before that we had what? Unreal Engine 2, which didn't support bump mapping. Half-Life 2 and Doom 3 were only a big shock because nobody was making games for Radeon capabilities. Which all Radeons could run, from the 8500, 9800, X800, etc. Nvidia could run what? Everything on low settings? 30 fps Far Cry? Nvidia coasted on marketing until they were forced to compete on hardware, which they then just dominated from DX10 forward, changing tactics to planned-obsolescence VRAM and overpriced low-spec cards sold as mid-range.

  • @PKmuffdiver
    @PKmuffdiver 15 days ago +1

    I still have mine. I mostly played Wolfenstein Enemy Territory. As I remember, I was very happy with the performance. I was paying, so it was a great choice at the time. Good video.

    • @SPNG
      @SPNG  15 days ago

      Much appreciated, glad you enjoyed. Nice that you hung onto your old card over the years!

  • @Stephen-OldBadGamer
    @Stephen-OldBadGamer 16 days ago +2

    Looking at these cards now is like looking at a Model T Ford.

    • @SPNG
      @SPNG  16 days ago

      Come on, they're not that old 😅

    • @Stephen-OldBadGamer
      @Stephen-OldBadGamer 15 days ago

      @@SPNG I mean, they are beautiful things. Single slot gets me going every time...... (Homer drool) ............... Lol.

  • @alexandreconfiant-latour2757
    @alexandreconfiant-latour2757 12 days ago +1

    Great video!
    Just wondering if this driver speed improvement is still as significant on period-correct hardware like an Athlon (Thunderbird / Palomino), Pentium 3 (Coppermine / Tualatin), or Pentium 4 (Willamette / Northwood).
    The Athlon 64 X2 is faster and supports more features than previous generations of CPUs. For instance, the Pentium 3 supports MMX and SSE, while the K8 Athlon 64 X2 supports MMX, SSE, SSE2 and SSE3, and has 2 cores.
    The 2005 driver build could be new enough to take advantage of some of these features and widen the performance gap. (Dual-core CPUs entered the consumer market in 2005, the same year as the driver you're using, and SSE2 and SSE3 had already been around for a year or two.)
    I really love this short early programmable shaders era, with a DirectX release every year. Things were moving fast. Games had multiple rendering paths because DX7-class cards were (and would remain) widespread for years. DX8's lifespan was short in comparison, with only the ATI R2xx, Nvidia GF3/4 Ti (and not to forget the Matrox Parhelia).

  • @maxeluy
    @maxeluy 18 days ago +1

    It's amazing how much performance a simple driver can unlock.

    • @SPNG
      @SPNG  17 days ago +1

      I was very impressed, I'll admit that I wasn't expecting the drivers to do that much, figured it would be a 10-15% boost at best.

  • @draganraxrax7497
    @draganraxrax7497 17 days ago +1

    I have this same card ;)

  •  15 days ago +1

    I had the 8500LE version back in the day (Gigabyte Maya 8500 Pro) - going from a noname GF2MX400 felt like such an upgrade...

    • @SPNG
      @SPNG  15 days ago

      Damn, that would have been a massive upgrade! The 8500 LE is not that much slower than the 8500 shown here, could overclock to 275MHz easily.

  • @kathleendelcourt8136
    @kathleendelcourt8136 18 days ago +1

    The Radeon 8500 was an absolute beast. At the beginning of its life it was struggling against the GeForce 3 Ti 500 because of immature drivers, and as they improved it ended up trading blows with the GeForce 4 Ti 4200. Oh, and it seems that you left v-sync on in your Max Payne benchmark (the framerate keeps bouncing from a locked 30 to a locked 60 fps), which could explain why the difference in average framerates is so small.

    • @SPNG
      @SPNG  18 days ago +1

      I want to try this card against my Ti 4200 at some point, would be interesting! And yeah I noticed that 🤦‍♂If I remember correctly though I didn't see an option to disable it, guess it slipped my mind. You can see it in the frametime graph too.
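The locked-30/locked-60 bouncing described above is the classic double-buffered vsync pattern: a frame that misses a 60 Hz refresh is held until the next one, so displayed frame times round up to whole refresh intervals. A toy Python sketch of that quantization (assuming plain double buffering; the function name is my own):

```python
import math

REFRESH = 60.0            # display refresh rate in Hz
INTERVAL = 1.0 / REFRESH  # one refresh, ~16.7 ms

def vsynced_frame_time(render_time_s):
    """With double-buffered vsync, a finished frame is held until the
    next refresh boundary, so its display time rounds up to a whole
    number of refresh intervals."""
    intervals = math.ceil(render_time_s / INTERVAL - 1e-9)
    return intervals * INTERVAL

# A 20 ms frame misses one refresh and is shown at the next one,
# while anything rendered within 16.7 ms still displays every refresh.
fps_slow = 1.0 / vsynced_frame_time(0.020)
fps_fast = 1.0 / vsynced_frame_time(0.010)
```

This is why the capped footage snaps between exactly 60 and exactly 30 instead of landing anywhere in between.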

  • @MrGoneja
    @MrGoneja 15 days ago

    Loved my aiw 8500 back in the day

    • @SPNG
      @SPNG  15 days ago +1

      Would have been a great one card solution for a media center, especially if you wanted faster graphics for the time!

  • @ditroia2777
    @ditroia2777 1 day ago

    I put a PowerColor 8500 into my Athlon 10000. Upgrading from an Nvidia card. Never had any issues with it.

  • @xXmobiusXx1
    @xXmobiusXx1 17 days ago

    Doom 3 was meant to be run on the 9800 Pro. I remember the day I got Doom 3 at release as a gift; I looked at the bottom of the box to find I needed the 9800 Pro, and I had the 9200 at the time. I asked my uncle if he could help get me one as an early Christmas gift; he said no, but then days later there it was, chillin' on the table. Got home, put it into my AMD Athlon XP 2800+ PC, and man oh man, it ran Doom 3 without issue at 1024x768 max settings. Made my friend jealous; he got the BFG FX 5800 to play it on his HP Pentium 4 HT, and it didn't run as smooth.

    • @SPNG
      @SPNG  16 days ago

      Nice, awesome that he got you the card 😀 Would have smoked the FX 5800 for sure!

  • @Pillusch
    @Pillusch 18 days ago +1

    Very good review of this impressive card. I miss this time of rapid innovation. Truform alone was a very impressive technology that I read a lot about but never got to see in person as an old Nvidia fanboy. I would have liked a few more comparisons to the Geforce 3 😅

    • @SPNG
      @SPNG  18 days ago +1

      Much appreciated man! TruForm definitely flew under the radar for a lot of people, I think they were overly focused on when it didn't work so well (like the inflated gun example I showed in the video). Even if it didn't make the most sense to implement I think it could have impressive results! And comparisons are definitely coming 😉

  • @YTKeepsDeletingAllMyComments
    @YTKeepsDeletingAllMyComments 18 days ago +6

    Not gonna lie, I was a blinded Nvidia fanboy at this time and instantly wrote off anything ATI. The R300 9700 piqued my interest, but the 9800 won me over. My shoddy GeForce 4, the pricing, and the awful GeForce FX cards didn't help Nvidia, so I went ATI for the next few generations.
    It really shows how blinded by fanboyism someone can be and how it skews perception. I would have told you no other card could compete with the GeForce 3, yet in retrospect this card can. At this point I hope AMD (or Intel for that matter) can release a card that really does what R300 did and just stick it to Nvidia.

    • @SPNG
      @SPNG  18 days ago +1

      It's good to have an open mind and suspend any brand preferences when looking at hardware! I used to pick sides as well but it's really not the best way to look at things, you can end up missing out on so much good stuff. Also it would go against my idea of trying to review this hardware from an objective standpoint!

    • @JohnDoe-ip3oq
      @JohnDoe-ip3oq 18 days ago

      Buy a 7900 GRE then. That's what you are looking for, if actually serious.

    • @YTKeepsDeletingAllMyComments
      @YTKeepsDeletingAllMyComments 17 days ago

      @@JohnDoe-ip3oq I'm not currently in the market for a GPU, my 3070 is holding me over fine currently although the 8gb VRAM is concerning and I will probably upgrade in the future.
      Just saying I wish a card would come along that would really disrupt Nvidia. The R300 (9700, 9800) back in the day really shook stuff up when it came out. It was priced cheaper and performed better than Nvidia's offerings in most cases. Competition is always good for the market. I know AMD has some good cards with good rasterized performance but I don't think they are quite the slap in the face to Nvidia that the 9800 was.
      Thanks for the suggestion though. I'm hoping AMD's or Intel's next offerings can make an even bigger dent in Nvidia.

    • @JohnDoe-ip3oq
      @JohnDoe-ip3oq 17 days ago

      @@YTKeepsDeletingAllMyComments If you're going Nvidia, the cheapest worthy upgrade is the 16 GB 4070 Ti Super, but still a terrible price. AMD disrupts everything Nvidia offers under that. Low end? 6600. Cheap mid-range? 6700. Upper mid/cheap high end? 6800 or 7800. Real high end for cheap? 7900 GRE. The 7900 series is a 4K segment. You DON'T need upscaling, and the ray tracing is 3080-3090 level. The only question you should have is whether you actually care about ray tracing, which at bare minimum needs the 4070 Ti Super for high settings and still requires DLSS and frame gen, and then you only get Cyberpunk. So if you don't care about marketing scams, the 7900 GRE is the best value because it does everything at the lowest price, and you can overclock it. Also, the 40-series power cables are dangerous, still causing failures, and will be redesigned for next gen. AMD is backwards cable compatible with no failures. I wouldn't say there's a clear winner, just the best card for the price and what you're willing to spend, and Nvidia has terrible pricing. Money doesn't matter? Buy Nvidia. Money matters? Stop being a fanboy, the marketing is a lie, buy AMD. Nvidia has scammed people since DX9: FX being fake, SM3 getting 30 fps, then the PhysX scam because they were late to DX11, holding back DX12 because they didn't have real DX12, then pushing RTX when the 20 series can't handle RTX. They are the masters of marketing scams. So it shouldn't even be a question when considering value, because they don't offer viable value, only high end.

  • @michaelwood9866
    @michaelwood9866 16 days ago +1

    I have the 8500DV All-in-Wonder card

  • @AJ-po6up
    @AJ-po6up 13 days ago

    Shoddy drivers have always been the bane of ATI/AMD, even to this day. I've been using their cards for over 16 years, and this is a pattern that always repeats and gives the win to Nvidia, in the beginning at least.
    For example, even my current RX 7900 XTX has gotten better with each driver update compared to the RTX 4080. I guess it's what some people call "FineWine", but I wouldn't call it that myself. I'm just amazed that even in 2001, or even before, this was happening.

  • @dyslectische
    @dyslectische 15 days ago

    You can flash it with a Radeon 9000 series BIOS.
    I think the Radeon 9100 or 9200 BIOS.
    I had that done with my 8500 Pro back then.

  • @BalancedSpirit79
    @BalancedSpirit79 16 days ago

    Back in the day, SharkyExtreme reviewed the Unitech Optimus 8500, which was enhanced to the point where it was able to challenge the GeForce 4 cards.

    • @SPNG
      @SPNG  16 days ago

      I had no idea that card was a thing, or that there were any factory-overclocked models of the 8500! Very interesting for sure, that must have been one of the best 8500s given the clock speed boost and 128MB of VRAM, kind of reminiscent of the FireGL 8800 in specs.

  • @shieldtablet942
    @shieldtablet942 15 days ago

    I had one of these in black with 128MB (8500 LE).
    But I noticed performance was hit-and-miss in some titles and ended up trading it for a Ti 4200.
    I also had a 9800, of which my only complaint is the reliability. Great card, that one.

    • @SPNG
      @SPNG  15 days ago

      I don't blame you, the Ti 4200 would have been a decidedly better card at the time! I'd imagine the 8500 LE would have closed the gap over the years though.

  • @GraveUypo
    @GraveUypo 16 days ago

    One of my all-time "wish I had this" GPUs. This thing was... just a ray of hope. It was cheaper than GeForces and faster, as well as more modern. It could run BF2, while my stupid GeForce 4, which came later, couldn't, because it didn't support SM1.4.

    • @SPNG
      @SPNG  16 days ago

      It's a capable card for sure. I want to pit it against my Ti 500 and Ti 4200 at some point.

  • @Pillokun
    @Pillokun 17 days ago

    Back then most people I knew had 17" CRT monitors, or even 19" ones running 1280x1024 or higher. Heck, I had a 15" until 2000 and still used 1280x1024. 1024x768 felt old even back then, when this card was the thing to have.
    But still, good retest.

    • @SPNG
      @SPNG  17 days ago

      Thanks, glad you enjoyed. It's hard to make everyone happy in this regard since they experienced one thing or another back then, but the comparison still stands. To be fair this is the 64MB card, I would be a little more willing to crank up the resolution if I was using the 128MB version.

  • @myne00
    @myne00 17 days ago

    "Fine wine"
    I always found ATI and AMD drivers to be annoying.
    I even isolated and reported a bug, down to the exact version where it started and how to use an older DLL to fix it, back in the RX 580 era. A few years later I got an email that someone had replied asking why it wasn't fixed yet.
    It's been 20 years, guys. Get it together.

    • @click4dylan2
      @click4dylan2 17 days ago

      Nvidia completely broke rendering in specific older DX8 games in the Game Ready drivers released for GTA 5 on PC, and didn't fix it for around 3-4 years despite numerous reports.

  • @GraveUypo
    @GraveUypo 16 days ago

    Hey, it was Nvidia that cheated benchmarks back then. I remember clearly that they switched their anisotropic to a bilinear filtering pass because they were getting slaughtered in Aquamark, and it looked ugly AF, with clear seams in the texture between mipmap levels. IIRC they also rendered "32-bit color" at 24 bits to save bandwidth. ATI had way better image quality, and in that particular benchmark it also smoked Nvidia. I would take this card over any GeForce 3 in a heartbeat. In fact I'd probably have traded my Ti 4400 for one of these when I realized I couldn't play BF2. (Instead I bought a 9800 Pro, which, oh boy. Legendary card. It was the 1080 Ti of the 2000s.)

  • @omfgbunder2008
    @omfgbunder2008 16 days ago

    I used to sell these cards in the late '90s and early '00s. Doom 3 and Half-Life 2 didn't even exist back then. I don't think that's really a fair test. 🤔

    • @SPNG
      @SPNG  16 days ago

      I included them to show that this card could handle future demands. It's not amazing in Doom 3 or anything, but the fact that it can run it is impressive given how much of a system killer that game can be. Half-Life 2 has become a bit of a mainstay in my testing as well. I wanted to show that anyone who did hang onto this card by that time could play those games to a decent ability.

  • @UCs6ktlulE5BEeb3vBBOu6DQ
    @UCs6ktlulE5BEeb3vBBOu6DQ 17 days ago

    The 9800 XT was the flagship of its year. It came with a Half-Life 2 voucher (the game was still unreleased). The card was unusable for over 8 months, causing multiple BSODs per day, until drivers improved. I have boycotted ATI ever since.

    • @Keullo-eFIN
      @Keullo-eFIN 17 days ago +1

      9600 XT also came with the voucher. Personally I boycott Ngreedia these days.

    • @UCs6ktlulE5BEeb3vBBOu6DQ
      @UCs6ktlulE5BEeb3vBBOu6DQ 17 days ago

      @@Keullo-eFIN I also had the 9600 XT at the same time. The strangest thing happened to me: I benchmarked both of them and sure enough the 9600 XT was much slower. I then swapped the 9800 XT for the 9600 XT without re-installing drivers or anything, and I got the benchmark score of the 9800 XT with the 9600 XT.

  • @asasom
    @asasom 17 days ago

    Honestly, TruForm sounds like ray tracing for its time.

    • @SPNG
      @SPNG  17 days ago +1

      Yeah, they have a lot of parallels in that a fair amount of people saw them as a gimmick, and they ended up becoming part of DirectX in one way or another. I think a lot more people saw TruForm as a gimmick compared to ray tracing though, which is funny considering unlike RT it had no performance hit.

  • @fullfunk
    @fullfunk 17 days ago

    I guess this GPU can output 15kHz for arcade ROMs?

  • @omidlara4838
    @omidlara4838 15 days ago

    ATI ALL-IN-WONDER RADEON 8500DV 64MB

    • @SPNG
      @SPNG  15 days ago

      💯

  • @dyslectische
    @dyslectische 15 days ago

    The stutter on this card is the reason I sold it
    and went for a GeForce 3 Ti 200 running at standard GeForce 3 clocks.
    Games ran smoothly with the GeForce.
    I played a lot of Warcraft 3 at that time,
    especially the online part of it.

  • @wowitsshit9734
    @wowitsshit9734 15 days ago

    Use the Omega drivers.