Did Shader Model 3.0 Help the 6800 Ultra?

  • Published 12 Sep 2024
  • In this video we put Shader Model 3.0 support on the GeForce 6800 Ultra to the test, and see if it really made NVIDIA's high-end 2004 offerings more "future-proof".
    Intro Animation By Ken Gruca Jr - Inquire at kjgruca@gmail.com!
    Music: www.purple-plan...
    Visit on Facebook! / pixelpipes
    And on Instagram! / pixelpipes

COMMENTS • 117

  • @loganiushere
    @loganiushere 4 years ago +12

    It reminds me of the Voodoo cards:
    One card had truly awe-inspiring performance but skimped on features, whereas its competitor(s) had less raw power but better-featured cards, and those cards were, in the end, more future-proof.

    • @wrmusic8736
      @wrmusic8736 7 months ago

      Voodoo GPUs got hopelessly outdated two years into their lifetime. Granted, by 1998 standards two years were a technological eternity, but nowadays replacing a video card every year seems insane.

  • @SteelSkin667
    @SteelSkin667 6 years ago +30

    I owned an X850XT at the time. It was infuriating to see all those games being purely and simply incompatible with an otherwise really fast card. The worst part was that I was unable to upgrade for a while, so I kept that useless thing until 2008, when I got a then brand-new 8800 GT.

    • @PixelPipes
      @PixelPipes  6 years ago +3

      SteelSkin667 Ouch! Thank you for sharing!

    • @HappyBeezerStudios
      @HappyBeezerStudios 6 years ago +1

      The X850XT was still a nice card, but going to an 8800 GT is a fine choice.
      Just hoping for you that it wasn't one of those awful single-slot reference designs, hot enough to cook on.

    • @SteelSkin667
      @SteelSkin667 6 years ago

      HappyBeezerStudios - by Lord_Mogul As a matter of fact it was one of the single-slot reference models. Honestly I didn't know better at the time, but fortunately it didn't give me any issues whatsoever. I have no idea how hot it got, but I do remember that it wasn't as noisy as the X850XT.

    • @HappyBeezerStudios
      @HappyBeezerStudios 6 years ago +1

      Same goes for Half-Life 2 and the other Source engine games. I even read that some of them offer a DX7 render path.
      Oh, the dreadful single-slot 8800 GT. I had one back in the day and it really started to shine after I modded the cooler with some case fans.

    • @GraveUypo
      @GraveUypo 6 years ago +1

      Huh, I just posted something from the opposite perspective. Seems the 6800 series really was the right choice. Here's a copy-paste of my post:
      Well, since I kept my 6800GS until early 2008, it paid off. When my first Xbox 360 3RL'd I was devastated, but since Gears of War and Test Drive Unlimited (my favorite games on the Xbox 360 at the time) had PC versions available, that kept me going for the year it took me to buy a new 360 (and an 8800GT). Even though the 6800GS ran those like crap, at least it did run them.
      That said, I did run into that same issue with the GeForce 4. I hated seeing people with the older, slower 8500 Pro playing Battlefield 2 when I couldn't with my shiny, super-fast Ti 4400. Then the 9500 Pro came along and I realized the GF4 was not only incompatible with the latest games, it was also MUCH slower than its direct competition. Ugh, I hated that card. And it was the most expensive card I've ever bought (corrected for inflation and the exchange rate between the real and the dollar).

  • @MFG9000
    @MFG9000 5 years ago +10

    Some ATi owner disliked this video.
    I love watching your content, sometimes over and over again, for nostalgia and for its accurate information. Like this one, which reminded me of that Far Cry console variable that enabled HDR lighting.

    • @PixelPipes
      @PixelPipes  5 years ago +3

      Makes me proud to hear that. Thank you!

    • @vladmihai306
      @vladmihai306 4 years ago +1

      Not some. Only one user. The others agree.

  • @hblankpc
    @hblankpc 6 years ago +17

    I've always wondered about the longevity SM3 gave the GF6/7; this tackles it perfectly. I've got a lot to learn from you :)

    • @PixelPipes
      @PixelPipes  6 years ago +2

      Hey Obsoletist! Welcome to the channel!

    • @raresmacovei8382
      @raresmacovei8382 5 years ago +5

      Basically, having a higher-end GeForce 6 or 7 allowed you to play almost all games released during the X360/PS3 era, while ATI's X800 series got axed past 2007, when games started requiring Pixel Shader 3.0 across the board.

  • @sinizzl
    @sinizzl 6 years ago +11

    I upgraded from a GeForce FX 5700 to an X1950XT in 2007... I did nothing but play Oblivion until the end of that year. Good times!

    • @justiny.1773
      @justiny.1773 5 years ago +2

      I had a 2900 XT 1 GB in 2007. Good times.

  • @synixx9286
    @synixx9286 6 years ago +6

    I remember being 12 years old in 2005 and buying Black & White 2, only to be confronted with the error "This game requires pixel shader 1.1 to run, please upgrade". The sales rep at PC World had told us our new Celeron D PC could play new games, but that was clearly a load of crap. So I went on eBay and bought the cheapest graphics card I could see, which was the Radeon 9250 256MB. The game now ran! I was playing Oblivion at the time, and even to my 12-year-old self it felt pretty choppy, so I eventually got a Radeon X1650 Pro 512MB and upgraded to 1GB of DDR RAM. Both games looked amazing with SM3 enabled, and I knew I could do better, so the next year, with Crysis on the horizon, I saved up my paper-round money over the entire year and built myself a PC with an AMD Athlon X2 4200, an X1950 Pro and 2GB of RAM, as well as a kick-ass 22" widescreen at 1680x1050!
    It was pretty awesome, but when I eventually got my hands on an 8800GT in late 2008 I was blown away. That card was phenomenal. I feel as though I missed out on the pre-2005 generation of graphics cards, but my early experiences with SM3 and the struggles of peasant gaming on a Celeron D continue to humble me to this day!

    • @PixelPipes
      @PixelPipes  6 years ago

      Great story! Thank you!

    • @serenameep8565
      @serenameep8565 5 years ago

      Near enough the same story here: I had to get the MSI 9250 as the integrated graphics on my Aldi Medion Sempron system couldn't even play C&C or even The Sims well enough!
      I went SLI 7600 GT to play Oblivion, and like you, when I got my first 8800GT I was blown away by how everything became playable at high framerates and details.

  • @fabiolorefice1895
    @fabiolorefice1895 6 years ago +4

    Cool throwback. I remember the discussion, although I didn't really care at that moment as I was still rocking a GeForce 4 Ti 4200. I did buy a 6600 GT (AGP) later in 2004, though. Keep your content coming, BTW! It is always nice to watch deep dives into this retro stuff.

  • @Shuttersound1
    @Shuttersound1 6 years ago +4

    Such nostalgia! I'm so glad I found your channel. It makes me miss the good old days of playing games like Far Cry, Doom and HL2 on my Athlon XP 2800+ PC with an AGP 6600 GT. PC gaming hasn't really felt as special to me since then, so being able to relive those times through your videos is ace. :)
    Also, I remember an issue with Halo where cards like my 6600 with SM3 wouldn't display pixel shader effects like bump mapping and specular, but an X800 I picked up for a later system did. Maybe Halo only supported up to SM2, since it was made around the SM1.1 GeForce 3 GPU in the Xbox? Or maybe I just had some weird driver issue at the time.
    Also, (if you're still reading this) maybe you should make a video about the arrival of the 8800GTX and stream processors? I remember how big a deal it was at the time, and when I bought my 8800GTX it was like a night-and-day difference compared to what I'd seen from traditional pipeline-based GPUs. One of the biggest differences was in Oblivion, where most cards before it had a massive disparity between indoor and outdoor framerates, but the 8800GTX had the raw power to stabilise this. :)

    • @PixelPipes
      @PixelPipes  6 years ago +1

      Glad I could help you relive the good ol' days! The 8800GTX is definitely a monumental moment in graphics card history and deserves some focus.

    • @HappyBeezerStudios
      @HappyBeezerStudios 6 years ago +1

      Those were the machines I dreamt of back then :D

    • @levimaynard2237
      @levimaynard2237 2 years ago

      Definitely the good old days...

  • @Carstuff111
    @Carstuff111 4 years ago +1

    When it comes to PC gaming, I am, I admit, a bit behind the curve. Most of the games I tend to play are not the latest and greatest. I had a Radeon X800 Pro card that my roommate had bought brand new, and on day one he installed his own cooler (an AMD Athlon 64 X2 cooler he modded to fit) because he saw the card hit 80 degrees C almost instantly in a game. Once he put the modded cooler on, at full load and overclocked, it barely got 5 degrees C over ambient temperature. And when I got that card, I ran it for a good, long time. Most of the games I played back then needed SM 2.0 or older, and the X800 was a huge upgrade over the Radeon 9800 Pro I had before it (also modded and HEAVILY overclocked) for a long while. After the X800 Pro paired with an Athlon 64 dual core, I upgraded power supplies and ran a factory-overclocked X1950 Pro for a little while, before jumping to a quad-core Athlon II with a Radeon 7770 (my first and only new video card ever); that same machine was later upgraded to a Phenom II X6 1045T and a Radeon 7850, both heavily overclocked. I have to say I have a love for ATi/AMD cards, to the point that my current rig has the first Nvidia card I have owned since a GeForce 4200 Ti that was BIOS-modded to a 4500 SE. At this time, the heavily overclocked GTX 1070 I am now running was the best bang-for-the-buck card to run with my new-to-me AMD Ryzen 5 1600X.

  • @soylentgreenb
    @soylentgreenb 3 years ago +3

    I played Oblivion on a 9800 non-Pro at 800x600. Pretty sure I disabled HDR to get better framerates. At least I didn't have to use the "Oldblivion" mod like GeForce FX 5900 owners :P

  • @tHeWasTeDYouTh
    @tHeWasTeDYouTh 4 years ago +2

    "Let's lay it out and not mince words. XFX's GeForce 6800 Ultra 512MB graphics card, priced at around £470 or so, doesn't represent decent value to the gamer right now."
    Back in the day, when I read that, I made an account on the forums just to troll the entire site... lol, first time I did that. I loved my 6800 Ultra.

  • @rijatru
    @rijatru 5 years ago +2

    Seeing the backlash against raytracing brought me to this video. It costs half your performance on current high-end GPUs, yet it will be standard in most games in a couple of years.

  • @SirDimpls
    @SirDimpls 2 years ago +2

    I have a cheap old phone I bought as a backup, a Sony E4g, and it's crappy. It has a MediaTek quad-core chip with a low-end Mali-T760 MP2 GPU that has a theoretical compute throughput of only 48 GFLOPS, yet supports DX11 and SM5.0, while the GeForce 6800 Ultra discussed in this video has 54 GFLOPS. That realisation blows my mind.

  • @HappyBeezerStudios
    @HappyBeezerStudios 6 years ago +8

    The GF6 cards were fine, and the same goes for the GF7 series. That was a phase when both companies had reasonable cards to choose from after the disaster that was the GF FX.
    The GF8, on the other hand, changed a lot, with an insane performance gap and the 8800 GT being the "minimum requirement" for many games even almost a decade later.

    • @PixelPipes
      @PixelPipes  6 years ago +5

      The 8800GT was a truly legendary card, almost as much as the 8800GTX itself.

    • @likeclockwork6473
      @likeclockwork6473 6 years ago

      I'd say the Radeon HD 7870 should take the 8800's legend status. I've never seen an old rebranded bargain GPU so thoroughly destroy a previous generation in sheer compatibility and general performance since the 8800. The HD 7870 was less than $300 when Nvidia's 6xx line released shortly after, and it remained well priced until now, actually. Sure, it's not ideal, but when you consider these GPUs can play everything the 8800s did and nearly everything today, it's hard not to build up their legend. Better than the HD 7970 in terms of market longevity.
      Look at the HD 7870 from this perspective. The R9 270 was selling for maybe $20 more than what the better versions of the GT 1030 are selling for now, and that was years ago. The 1050 is its equivalent and it's more expensive than the R9 270s were. What other GPU product have these manufacturers made that has remained competitive six solid years after launch?

    • @ChannelSho
      @ChannelSho 4 years ago

      @@likeclockwork6473 Little late to this party, but what made the 8800 GT so legendary was that it destroyed the then-current market very hard. You could basically get flagship performance for about half the cost, and it had fewer requirements overall (such as a single-slot cooler and lower power draw). I don't think anything has really come close to that.

  • @GraveUypo
    @GraveUypo 6 years ago +5

    Well, since I kept my 6800GS until early 2008, it paid off. When my first Xbox 360 3RL'd I was devastated, but since Gears of War and Test Drive Unlimited (my favorite games on the Xbox 360 at the time) had PC versions available, that kept me going for the year it took me to buy a new 360 (and an 8800GT). Even though the 6800GS ran those like crap, at least it did run them.

    • @PixelPipes
      @PixelPipes  6 years ago +2

      The 6800GS was a great card! I v-modded and overclocked the crap out of mine!

  • @retropcscotland4645
    @retropcscotland4645 6 years ago +1

    Reminds me of the old ATi Rage Pro chip. It had the CIF 3D API, which wasn't widely used. The only company that released a working CIF 3D patch was Eidos, for Tomb Raider Gold. That patch works very nicely with the old ATi Rage Pro and a proper period CIF-enabled driver.

  • @ofoosy
    @ofoosy 6 years ago +2

    You have to do a video on that 6800gs!

    • @foch3
      @foch3 3 years ago

      That's what I'm talking about.

  • @wrmusic8736
    @wrmusic8736 7 months ago +1

    Early-adopter hardware is seldom actually fit to handle the new feature set it introduces.
    For every exception like the Radeon 9700 or GeForce 8800 there are the GeForce 3, GeForce FX, Radeon HD 2000, Radeon HD 5000 and GeForce RTX 20, which couldn't handle their new stuff.
    In fact, since the GeForce 8800 we haven't had a single GPU feature pioneer that could reliably deal with its cool new thing - and that was 18 years ago.

  • @JamesSmith-sw3nk
    @JamesSmith-sw3nk 6 years ago +1

    There were games released in 2010-2013 that had DirectX 9 support, like Crysis 2, Far Cry 3 & Metro: Last Light. I suspect a lot of people used their 6800s for a long time. I remember having an AGP 6800GS that I unlocked to a 6800 Ultra.

    • @Knaeckebrotsaege
      @Knaeckebrotsaege 6 years ago

      I used a heavily modified and overclocked AGP 6800GT till I replaced the whole system in mid-2008 with a Core 2 Duo and an 8800GTS 320MB. I still kept the old rig as a second PC and occasionally tried games on it. I remember being surprised that the 6800GT could run games like Race Driver: GRID no problem on pretty high settings.

  • @trajanaugustus8783
    @trajanaugustus8783 6 years ago +1

    I had a PNY 6800GS with the pixel & vertex pipelines unlocked with RivaTuner; it ran on par with the 6800GT for less money. Great video btw.

  • @auroraattardcoleiro1455
    @auroraattardcoleiro1455 6 years ago +1

    Awesome channel :) Just subscribed. I hope that you grow as much as RetroGamingHD and Budget Builds!

  • @likeclockwork6473
    @likeclockwork6473 6 years ago +1

    By the end of 2007, ATI users wanting to upgrade could have picked up an HD 3850 for only $180, and that supported Shader Model 4.0. Kind of silly to treat compatibility with games from three years later as anything that special at the time.

  • @KinoKonformist
    @KinoKonformist 5 months ago

    I think back in 2004-2008 most people played at lower resolutions without problems, because most people didn't have LCD displays yet. So playing at 1024x768 or even 800x600 was OK.

  • @Silikone
    @Silikone 6 years ago +1

    ATI's SM3 inclusion was perfectly timed. I played Mass Effect 2, a game from 2010, on an X1950 Pro with satisfactory performance. Seems like the 6 series fell a tad short of the demands, though I'd love to be shown that this isn't the case.

    • @CompatibilityMadness
      @CompatibilityMadness 6 years ago +1

      The X1950 Pro has 36 pixel shaders, that's 12 more than the 7900 GTX :D
      Basically: any game that uses pixel shader code will fly on them.
      Also, they support Oblivion's AA + HDR mode without problems :)
      However, you forgot about the X1800 series, which was the actual first ATI SM3.0 line ;)

    • @GraveUypo
      @GraveUypo 6 years ago

      No, it wasn't perfectly timed, it was late.
      If I had gone with ATI at the time, I'd be as mad as I was with my GF4 Ti. Never get feature-lacking cards unless you have a short upgrade roadmap.

  • @Xbfg
    @Xbfg 6 years ago +2

    Keep up the fantastic content! I still have my Leadtek 6800GT lol, it's in my collection.

  • @vansyly9794
    @vansyly9794 4 years ago

    I upgraded from a GF4 Ti 4400 to a 6800GT and it was a great experience. A huge performance upgrade, with actual SM 3.0.
    Far Cry and SC: Chaos Theory were great on it.

  • @Synthematix
    @Synthematix 4 years ago +1

    I'd take the 6800GT any day. I'm using the Gainward Golden Sample 6800GT in my Win98 machine and it's bloody awesome. The 6800 supported DirectX 9.0c; the ATI cards did not.

  • @Mantis4
    @Mantis4 6 years ago +1

    yet another great quality video keep em coming :3

  • @playingwithtrainerspcmt6407
    @playingwithtrainerspcmt6407 4 years ago

    Brings me back... at that time I think I only had the ATI X700. Great vids.

  • @lucskyes2748
    @lucskyes2748 6 years ago

    Amazing work, bud. Hugs from Brazil!!

  • @inwork1
    @inwork1 5 years ago +1

    There was a community patch for BioShock that converted it to Shader Model 2.0b; it would be interesting to compare them.

    • @PixelPipes
      @PixelPipes  5 years ago +2

      I investigated that and it's extremely glitchy. It was never completed and many parts of the game are incorrectly rendered (or not rendered at all).

  • @hulkaman1a
    @hulkaman1a 6 years ago +1

    Great video, really enjoyed it. A++!! One observation I feel you left out (sorry if I missed it) is the fact that, in 2004, the 6800 could do SLI. This put Nvidia ahead of ATi in terms of performance, albeit at a heavy cost. An early adopter of 6800 Ultras in SLI would have benefited from more playable frame rates and higher resolutions than what was shown in your video. All of that aside, I remember buying a factory-overclocked BFG 6800 GT in 2004 for two games: Doom 3 and EverQuest 2.

    • @PixelPipes
      @PixelPipes  6 years ago +1

      Crossfire definitely started in 2004 as well, albeit requiring an external dongle and "Master/Slave" cards. But I did not mention this, so it's a good point to bring up.

    • @hulkaman1a
      @hulkaman1a 6 years ago

      Crossfire was September of 2005. By then Nvidia had regained the performance crown with the 7800 GTX.

    • @PixelPipes
      @PixelPipes  6 years ago

      Ah you're right! My memory fudged that one!

    • @hulkaman1a
      @hulkaman1a 6 years ago

      An honest mistake! I'm a retro builder too, and sometimes I get my facts mixed up. I have dedicated 2004 and 2005 machines that I've worked on lately, so it's all fresh.

  • @wishusknight3009
    @wishusknight3009 4 years ago +1

    I was still rocking my 9700 Pro AIW at that time. I held onto it until replacing it with a 7950GT, which is perhaps close to the longest I ever kept a video card. I only did that because I was able to use an ASRock 939Dual-SATA2 board, which could facilitate the upgrade path; the video card was the last piece. Though I was not exactly playing all the AAA titles as they came out, I have generally waited for sales. And since then I have stayed a bit behind the curve in graphics performance.

  • @Aranimda
    @Aranimda 5 years ago +1

    I still play DirectX 9.0c titles on a daily basis: Guild Wars and Guild Wars 2. Though the second one would not run at a proper framerate on a 6800 Ultra.

  • @evolucion888
    @evolucion888 3 years ago

    The issue was that using shaders smaller than 64 per pixel caused severe performance degradation on the 6000 series. On top of that, the reason HDR couldn't be combined with AA was that the ROPs did the FP filtering for HDR, so they could do one or the other. AMD had much better shading performance on the X1K series, where the shaders did the FP filtering/resolve for HDR and the ROPs did their magic for MSAA.

  • @robinenbernhard
    @robinenbernhard 5 years ago +1

    Loved my SLI 6800 Ultras in BF1942 Desert Combat back in the day, with two HDDs in RAID 0. I was the first one into the game so I could grab every vehicle.

  • @DanielGT_93
    @DanielGT_93 10 months ago

    I had an E2160 overclocked to 3 GHz with a PCI-e X800XT at the time of the 8800GT launch. I still played a lot of games on it until I got a used 8800GTS 512 in 2010.

  • @soylentgreenb
    @soylentgreenb 4 years ago

    Historically, the introduction of new features is not for the consumer initially, but for the developer. If the 6800 had not had SM3.0, consumers wouldn't have had games with good SM3.0 support by the time of the 8000 series. Not just because developers wouldn't have had cards to develop on, but because there needs to be wide support before developers even bother.
    The GeForce 256 and Radeon had per-pixel fixed-function lighting. With like six passes or something it could run Doom 3. Not playably, of course. T&L would eventually be assumed and polycounts could be made much higher, but not before cards without hardware T&L became uncommon. The DirectX 8 shaders on the Radeon 8500 and GeForce 3 could have been used to make the water in Half-Life 2 and other neat features, but instead they were used to make player models and levels look like they were smeared in vaseline and to make water look like liquid mercury. It wasn't until much faster DX9 cards that I saw them put to good use. This pattern has sort of repeated over and over, where early adopters don't get much benefit before the card is already nearing obsolescence.
    As an exercise to the reader: where will the 2080 Ti be when hardware raytracing is commonly used in games as a set-it-and-forget-it feature?
    (I never realized Oblivion looked so much less terrible with HDR off.)

  • @levimaynard2237
    @levimaynard2237 2 years ago +2

    Some Shader Model 2.0 games look cleaner in my opinion. It may be the engines that were used, but as an example, Source engine games like the Half-Life 2 series, or Unreal Tournament 2004, have "less detail" in some areas, but the overall presentation seems cleaner and subjectively prettier. Unreal Engine 3.0 games, by comparison, have that plastic look and it's a lot dirtier. It's hard to explain. As far as HDR is concerned, I think Day of Defeat: Source has some of the best implementations of it (also Shader Model 2.0).

  • @killmeh2
    @killmeh2 2 years ago

    Great videos, they bring back the nostalgia. :D

  • @DyoKasparov
    @DyoKasparov 3 years ago

    Man I remember when pixel shader 3 was new and bloom was hip

  • @cyphaborg6598
    @cyphaborg6598 7 months ago

    Oh yes, Splinter Cell Chaos Theory. They couldn't be bothered with shader support but added that POS Starforce.

  • @CamaroWarrior
    @CamaroWarrior 3 years ago

    Did you make a video on the Shader Model 4.0 features, or even 5.0/5.1? I remember running BioShock back when it was new on a Shader Model 2.0 Radeon X850XT 256MB card with a patch; it ran well and looked nice, but didn't look as good as on an SM3 card.

  • @tylermartin7245
    @tylermartin7245 2 years ago

    I miss this

  • @tHeWasTeDYouTh
    @tHeWasTeDYouTh 4 years ago +2

    5:32 No joke, even in 2006 when Oblivion came out I thought the game looked like garbage. The art style looked really lame compared to other games and the graphics really didn't sell me on the game. In my mind I figured that since the game was so big, you had to have mediocre graphics and visuals to make it work.

  • @BlackDragon-xn2ww
    @BlackDragon-xn2ww 6 years ago

    This goes in the opposite direction, but I like to take an old card and see what a game looks like in software mode, or any mode really. It's surprising; some of the interesting color schemes give an almost rustic look to some games lol

  • @sonicblast19
    @sonicblast19 6 years ago

    I remember there was a hack for Bioshock that made it run on Shader Model 2. I remember playing it on my Radeon 9600.

  • @jakedill1304
    @jakedill1304 11 months ago

    2:15 Got to rebind Q and E from the lean buttons LOL... C & V work way better, and that way you don't get stuck trying to lean your way into interactables LOL. I have a sneaking suspicion that, aside from the 20-year time warp to dual-analog controllers, Q and E being the default for lean is why we don't see leaning anywhere near as much as we should in modern games.

  • @jetbrucelee
    @jetbrucelee 4 years ago

    Omg yes, SM3.0 was a big deal. I bought a laptop in 2005 with an ATI X700 and my buddy got a laptop with a 6600GT. We played Age of Empires 3 and I was so pissed I couldn't play it with SM3.0 on. It really made it look so good.

  • @mr.hairyface8158
    @mr.hairyface8158 5 years ago

    Nice work! :)

  • @FatheredPuma81
    @FatheredPuma81 4 years ago

    "If you can stomach dropping down to 800x600"
    M8, I played Team Fortress 2 at 640x480. It wasn't actually that bad, but TF2 was built to be playable at stupidly low resolutions, so...

  • @mauriciochacon
    @mauriciochacon 4 years ago

    I had the ATI X800 Pro and I could play all the next-gen games at low res. It sucked, but I still could lol: Crysis 1, Tomb Raider: Legend (even with the forced DX9.0c stuff), CoD4, GoW, UT3. BTW, you missed Tomb Raider: Legend with its Shader Model 3 settings.

  • @christiangarcia2533
    @christiangarcia2533 6 years ago +1

    Hey, you should do a video on an AMIC A276308A. On the back of the graphics adapter was this: 13k8214698. I found it in my mom's old computer. Do you think you could do a video on it, please?

  • @Devilot91
    @Devilot91 6 years ago

    Oh, the good old times. I was so upset that my 9800XT and then my X800 XT could not run any game that required Shader Model 3.0. ATI was smarter to focus on performance over technology, making a really fast card but with the compromise of the older Shader Model 2.0b (a slightly upgraded version of "regular" 2.0). Vice versa, nVidia pushed future technology with Shader Model 3.0, but it was largely useless, as explained in this video. For that I hated nVidia for years over those damn 3.0 shaders. Good old times :D

  • @DarkDrakee
    @DarkDrakee 5 years ago

    But there was also software to make SM3 games run on SM2 hardware.

  • @ching-chenhuang8119
    @ching-chenhuang8119 4 years ago

    Well, it's sort of too late, back in 2007 I changed my display card from 6800GT to 8800GT, and I played Bioshock on my Xbox 360.......

  • @hate_mate7054
    @hate_mate7054 6 years ago +2

    Subscribed, because you talk about my favourite piece of hardware.

  • @jakedill1304
    @jakedill1304 11 months ago

    Weird... I was able to run BioShock way better on a 6800 Ultra, with nowhere near as many of the settings downgraded, for whatever reason. I had to do a lot of .ini tweaks and run a separate program to fix the FOV, but it wasn't that big a deal to get it playable. If only it was also a good game... why did they have to release SWAT 4 right before this? Just to let us know what we could have had if this were a PC game?

  • @Dysphoricsmile
    @Dysphoricsmile 6 years ago

    I upgraded FROM an ATI X850 Platinum TO a 9800 GT - because SM 3.0+! And from the 9800 to a GTX 460, to a 560 Ti that I got for free, to a GTX 770, and now to a Gigabyte G1 Gaming GTX 980 Ti!
    And BEFORE the X850 I had an FX 5600, and before that I owned THE GeForce 4 Ti 4600!
    That is the GPU history of my PERSONAL rigs!

  • @LastOneLeft99
    @LastOneLeft99 4 years ago

    I went from 6800 GT to 8800 GTS just for Bioshock.

  • @dabombinablemi6188
    @dabombinablemi6188 5 years ago +2

    SM 3.0 really didn't help out the lower-end models, such as the 6200A. It was simply too weak to see any benefit, and was really best suited to DirectX 8.1 and older titles, though it was still far better than the FX 5500/5200 Ultra despite its DDR2-266 being limited to a 64-bit bus.

    • @raresmacovei8382
      @raresmacovei8382 2 years ago

      A friend was happy with his GeForce 6200 due to being able to play Crysis and at least enable physics by going to the Medium preset (just for Physics and nothing else). It was a bit ridiculous that you needed a Pixel Shader 3.0 card for anything more than the Low preset (including actually getting destruction physics), but here we are.

  • @antwanarmstrong5987
    @antwanarmstrong5987 6 years ago +2

    Strange, my copy of Chaos Theory supports SM2.0.

    • @PixelPipes
      @PixelPipes  6 years ago

      Oh hey you're right! A later patch added support for Shader Model 2.0! Apparently ATI worked with them to get it implemented. Interesting!
      www.bit-tech.net/reviews/gaming/pc/scct_sm20_sm30/1/

  • @razvanfodor5653
    @razvanfodor5653 6 years ago

    ATI made a huge misstep, in my opinion, with the missing SM 3.0 support. They had held the performance edge since the 9700 and banked on that alone. Even at that time Nvidia made shady moves, so it was a shame ATI couldn't establish itself better.

  • @XzPERFECTIONzX
    @XzPERFECTIONzX 3 years ago

    I got my 6800 Ultra today for 75 Euros! :)

  • @gregoryberrycone
    @gregoryberrycone 4 years ago +1

    Chaos Theory wasn't a 360 title.

    • @cyphaborg6598
      @cyphaborg6598 4 years ago

      Huh!! That's odd. I know it was on the PS3 because I have all three (the trilogy).
      I figured it would also be on the 360; I can't find a reason why it wasn't.
      (I knew it was already out before the 7th generation of consoles started.)
      Perhaps due to backwards compatibility? At some point the PS3 didn't have it.
      You can buy the Xbox version for the 360 via the marketplace (since mid-2019).
      *shrugs* That last part doesn't make much sense lol, but...
      Oh right, Xbox One X enhanced.

  • @thelasthallow
    @thelasthallow 6 years ago +1

    So do a video on the 7800GT and compare it to the 6800 Ultra?

    • @CompatibilityMadness
      @CompatibilityMadness 6 years ago

      The 7800 GT will rape the 6800 Ultra (in everything).
      I would argue that the 6800 Ultra isn't that great of an SM3.0 GPU, because most games with native support for it are optimised for the GeForce 7000 series of cards.

    • @thelasthallow
      @thelasthallow 6 years ago

      Who gives a shit? That's literally the point of this channel, to bench old cards. I don't give two fucks if a 7800GT will "rape" a 6800, I just want to see what the performance increases would be.

    • @CompatibilityMadness
      @CompatibilityMadness 6 years ago +1

      OK. Check this: forum.pclab.pl/topic/1160166-AGP-Aftermath/page__st__20__p__14279091&#entry14279091
      Graphs with results are below the text. I hope you don't mind AGP instead of PCI-e versions. 7900 GS (20PS/7VS) vs. 6800 Ultra (16PS/6VS).

    • @thelasthallow
      @thelasthallow 1 year ago

      @@CompatibilityMadness Five years late to reply, I know. OK, thanks for the links, I just checked them out. I'm building a retro PC and got a 6800GT, but I also bought two 7950GT 512MB cards for like $25 apiece on eBay. I also got a couple of 8800GTXs as well for super cheap. Gonna do SLI with one set of them, not sure yet.

  • @ricardobarros1090
    @ricardobarros1090 1 year ago

    Good and thank you. Jesus bless you

  • @vdrand9893
    @vdrand9893 6 years ago

    You can unlock a 6800 Ultra into a Quadro with software.

  • @sniglom
    @sniglom 6 years ago

    Personally I always disabled HDR in Oblivion. I thought it looked bad and wasn't worth the performance hit.

  • @FatheredPuma81
    @FatheredPuma81 4 years ago

    Not sure why he didn't run Skyrim on this...

  • @dyslectische
    @dyslectische 1 year ago

    I had a 6800LE, one you could unlock to a 6800GS, but with the 6800LE memory chips.
    A good boost, but really you could just go for a 6600GT at that time.
    Funny thing is that there was a Gigabyte card with two 6600GTs in SLI on one board, but it only worked on Gigabyte motherboards.
    Completely stupid that no unlocked BIOS was ever released for that card so you could use it on other boards.

  • @TimTaylor99
    @TimTaylor99 3 years ago +1

    ... and don't forget AMD being the first with real DX10.1 support 🧐😅🙏

  • @Pidalin
    @Pidalin 4 years ago

    I am playing Far Cry at 1024x768 on my 6600 without problems, it runs well. :-D

  • @loganiushere
    @loganiushere 1 year ago

    40 FPS = Barely playable?!

  • @Protector1rk
    @Protector1rk 3 years ago

    When games started to require SM3, the 6800 Ultra became a garbage piece of shit, while ATI had the X1xxx series cards with this feature, which outperformed the GeForce 7xxx.

  • @PyromancerRift
    @PyromancerRift 11 months ago

    Typical NVIDIA: release a feature, but it's useless on the first gen.

  • @wakesake
    @wakesake 6 years ago +1

    I have the X850 PE and the 6800 Ultra; to be real, ATI killed it with ease.
    I never liked bloom or blur, I see them as a weakness; I respect detail only.
    BTW, let's get real, "Biosmuck" was/is a dumbed-down console rip-off of System Shock.

    • @CompatibilityMadness
      @CompatibilityMadness 6 years ago

      ATI killed them with fillrates; NV went for features and efficiency.
      Two different paths, but for an SM3.0-only GPU, ATI's X1900 series is THE king.

    • @GraveUypo
      @GraveUypo 6 years ago +1

      Ouch, such fanboyery. Get that flag down, we're not brand-worshipping on this channel.