I looked into the AF of the RV280 GPU recently, in some games where you can lean, and I found that you have to be at quite extreme angles, like 45 degrees, to see AF basically stop working. With slight angles it still looks razor sharp. I miss this kind of detail in a lot of discussions; most just repeat what's on Wikipedia but haven't actually tried it. The issue with the mipmaps can be seen with tools, like in Quake III where you can color them in, but in normal gameplay I struggle to see the issue. But that's just me :)
Basically, Matrox did not give a damn about OpenGL. It took a lot of programmer time to create good GL drivers while for D3D, part of the front end was programmed by Microsoft so it is easier to create D3D drivers. From what I heard, Matrox's GL driver was a GL to D3D wrapper, so it is normal that performance suffered.
I was an Nvidia/3dfx guy back in the late 90s and the start of the 00s, yet I dreamt about Matrox. It looked so different, and I had dreamt about multi-monitor setups since way back in the 80s. I sure remember when the Parhelia first came out. Other than that, I was never a fan of mainstream stuff. I wanted a Sound Galaxy and a GUS, yet ended up with the Value edition of the SB16. That said, I love my G400s and my TNT2s. Not a big fan of the GF4, as it feels too polished. I like Radeons though. Yet it can't be denied that GeForce cards are awesome in their own right. Performance is good, but niche and quirky tech is just pure awesomeness. Like MPEG-1 decoder cards in full-length ISA versions.
PhilsComputerLab True that. I think that anything faster than a 9600 is way overkill for Win98SE, though anything 9600 and faster is suitable for WinXP. Personally I am not ready for WinXP yet; it is still too new for vintage. Retro/nostalgia for me is Win98SE and down.
Very nice review, cheers! The Parhelia is the only Matrox card I've never played with. Looks cool though, I may need to hunt one down. Massive kudos for the comparison screens where you flip between the three cards so we can see the visual quality! That's impressive reviewing and most helpful. :)
Those screens of X2: The Threat just brought back memories. To me it is still the best game in that series. To be honest I would have thought that this card would struggle with that game but I guess I am just forgetting how long ago it was.
The Parhelia had the biggest silicon of the three cards in the 2002 generation. I'm pretty sure that Matrox lagged in driver know-how compared to Nvidia and ATI. The same happened with the G400; however, the G400 was able to compete in performance. Unfortunately, that was not the case with the Parhelia. Nonetheless, it was a strong GPU and had the best image quality output and the best shader version support of the three cards. I lost the opportunity to buy this card one week ago for 30€. I regret that I didn't buy it.
It was not just the drivers. The lack of any kind of Z-compression made it bandwidth starved even with the 256-bit bus, and they also threw too many TMUs on each pipeline, giving the card free trilinear at a time when anisotropic filtering was already the holy grail (though AF is not all equal; for example, the pre-R300 Radeons had ugly bilinear dithering on their AF, which only manifests in motion, i.e. you can't see it in Phil's static Wolfenstein scene). Still, AF was the future. In other words, they made a very big and beefy but technologically old GPU that focused on the wrong things. If they had spent all those transistors on pure fillrate and shader performance, and the card had Z-compression, it would have been a monster. I have a Parhelia, actually both the AGP and the late PCI-E version, and there is nothing to want there. It gets trashed by the GF4 Ti. The output quality is also half a hoax: yes, it's really good (the analog output, that is; its DVI is trash, as the GPU does not natively support it and uses a SIL A-D chip to do it, maxing out at 1280x1024), but good branded Radeon and GeForce cards are also good, so it's no real advantage. The 10-bit color is actually a valid advantage, but if you want that in 3D it will cost you performance, a thing you are already lacking. I don't know... I don't like the Parhelia that much.
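A rough back-of-the-envelope sketch of the bandwidth point above. The clock, bus width, overdraw and per-pixel traffic figures are assumptions for illustration, not measured values:

```python
# Why a wide bus still starves without Z-compression: every covered pixel
# costs a colour write plus a Z read and a Z write, and overdraw multiplies
# that. All figures below are rough assumptions for illustration.

def bandwidth_gb_s(bus_bits, mem_clock_mhz, ddr=True):
    """Peak memory bandwidth in GB/s for a given bus width and clock."""
    bytes_per_clock = bus_bits / 8 * (2 if ddr else 1)
    return bytes_per_clock * mem_clock_mhz * 1e6 / 1e9

def frame_traffic_mb(width, height, overdraw=3.0):
    """Rough MB moved per frame: 4 B colour write + 4 B Z read + 4 B Z
    write per covered pixel, times average overdraw (textures ignored)."""
    per_pixel = 4 + 4 + 4
    return width * height * overdraw * per_pixel / 1e6

peak = bandwidth_gb_s(256, 275)          # assumed 256-bit DDR at 275 MHz
traffic = frame_traffic_mb(1024, 768)    # MB per frame at 1024x768
ceiling = peak * 1000 / traffic          # theoretical fps ceiling, no texturing
print(f"{peak:.1f} GB/s peak, ~{traffic:.0f} MB/frame -> ~{ceiling:.0f} fps ceiling")
```

Real cards only reach a fraction of that theoretical peak, and texture fetches come on top of the raw colour/Z traffic, which is exactly where Z-compression (and trading TMUs for fillrate) would have paid off.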
I'll have to look at AF on the R200/RV again, I do know of the shortcomings on paper, but I struggle to notice it in games. I'll make sure to capture some gameplay the next time I do something with those GPUs :)
It's easy to see: take some game with high texture resolution (UT2004 for example) and you'll see that the distant textures are sharp, but in motion they will shimmer, with visible lines where the LOD changes; that's the bilinear dithering. But you can't show this on YouTube, because most of that shimmer will be lost in the low-bitrate video compression. Different people also see different things and details. I, for example, see differences between the AF filtering of modern AMD and NV GPUs (one of the reasons I have Nvidia), but most people are completely oblivious to it. The important thing is always how it looks in motion. On a static screen, having no filtering at all would look amazing, but the shimmer in motion would be unbearable.
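To illustrate the LOD-transition lines being described, here is a small sketch (the texels-per-pixel values are made up for illustration) of why picking only the nearest mip level produces a hard jump where trilinear filtering would blend smoothly:

```python
# Mip LOD is a continuous quantity, but bilinear-only mipmapping snaps it
# to the nearest integer level, so there is a visible seam where the LOD
# crosses the halfway point. Trilinear blends the two adjacent levels.
import math

def lod(texels_per_pixel):
    """Continuous mip LOD from the texel-to-pixel footprint ratio."""
    return max(0.0, math.log2(texels_per_pixel))

def bilinear_level(l):
    """Nearest-mip selection: hard transition between levels."""
    return round(l)

def trilinear_weight(l):
    """Blend factor between mip floor(l) and floor(l)+1."""
    return l - math.floor(l)

for tpp in (1.0, 1.4, 1.42, 2.0):
    l = lod(tpp)
    print(f"tpp={tpp}: lod={l:.2f}, bilinear picks mip {bilinear_level(l)}, "
          f"trilinear blend weight {trilinear_weight(l):.2f}")
```

Note the jump between tpp 1.4 and 1.42: an almost identical footprint flips the bilinear path to a different mip level, which is the visible "line" on the ground; the trilinear weight changes only slightly.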
Nice video, I didn't know older graphics cards could do triple-screen support (which modern cards do nowadays). I remember a graphics card that looked similar to the Radeon 9700 Pro (the red one), because at one time I had a stock of surplus graphics cards.
Matrox always made interesting, progressive graphics cards with innovative features, the problem was performance and support. Turned out after Parhelia they went the "professional graphics" route, i.e. multi-monitor cards.
The Zalman-clone HSF on your Ti4600 brings back some great memories. That design, particularly Zalman's full-copper version (vs your aluminium type), was truly exceptional and provided seriously excellent heat dissipation.
I always dreamed about a triple monitor setup with that matrox card back then but i couldn't afford it. Now that i have a triple monitor setup i cannot go back. It is just so damn convenient.
I know that my comment isn't related to this video, but it's about your channel. I'm a simple computer enthusiast and I like to play with hardware, especially older hardware, and your channel and the videos you make are awesome. I found a lot of useful information related to older hardware; the only difference compared to you is that I play with Linux operating systems. I have a few Socket A processors, DDR1 RAM at 400 MHz (4x 256 MB) and a GeForce4 MX440 GPU left over from my old hardware, which sits in a cardboard box I found a few weeks ago. I'm searching for a Socket A motherboard and an ATA hard drive to complete my configuration (actually found a few in my country), and want to build a machine that will run a Linux system. I also found my old SkyStar2 digital satellite card, and the main use of this computer will be to run that card, to listen to satellite radio stations and watch a few FTA channels.
My very first accelerator card was a Matrox Mystique back in '97 or so. Not a great card driver-wise, and the performance was also pretty poor, but that box art was amazing and the card will always have a special place in my heart. Also, I have to say, after hearing it enough times I honestly love how you pronounce Wolfenstein. Please use it more just so I can hear you say it.
No, there were more companies than just Matrox. S3 with their Chrome series of cards, SiS with the Xabre cards and Trident with their XP series (not to be confused with their Blade XP chip). In 2003 SiS' graphics division spun off and acquired Trident Microsystems to form XGI and released the Volari cards in 2003/2004.
Apparently, the AF in the Parhelia was only 2x, which explains why the textures were so much blurrier. Plus, of course, with the Radeon 8500, you still had the whole mipmapping scandal that was going on back then.
Right? I'm sure the Ti4600 and Radeon 8500 would look similar when set to 2x. I'm still trying to search online for a way to get the Parhelia to work at 8x or even 16x AF.
I had one and loved it at the time. But the anti aliasing technique it used gave me a white line bug in some games if I remember correctly. And again as you said the performance wasn't the best and I upgraded for that reason.
My dad got me a Mystique so I could play Moto Racer, and surprisingly the game came in the bundle (the one shown on Wikipedia for the retail release of the card was the same, apart from having Destruction Derby 2). A decade later I noticed my dad had an unused Parhelia and used it in a computer for my "stepbrothers" (the little one was a little "crap", so as punishment his PC is back to motherboard-integrated graphics), and while it did perform better in Battlefield 1942 than the integrated graphics... it wasn't by much. I still wonder if there's a way to OC the card.
This card is a real victory, a must-have for Halo and Far Cry; that type of game was among the most loved. This card was a rendezvous with the future. The Parhelia was just one model in this series; were customers even aware of that?
I'd love to have each and every device you've shown in this video. AMD/ATI rocks; the newest AMD graphics cards have ReLive support, making them an affordable option when capture software is ridiculously priced.
relive isn't that great though. it has twice the latency of nvidia's solution (15~16 ms vs 8ms). not that good for streaming games to other devices (like steam link).
GraveUypo Mine works flawlessly with the RX 460; ReLive has no noticeable lag at 1080p playing Overwatch at medium settings, 300 fps capped, 115 fps under load. Nvidia is also expensive and doesn't keep up driver support for their older chipsets, which forces the customer to purchase another, more expensive Nvidia card. I have not experienced this with AMD's products.
well, it's only 15 ms after all. there are screens laggier than that. if you're streaming to a tv most of the lag probably comes from it, not from the streaming. but it's still a little bit more laggy. but yeah, not enough for me to care at all. i won't be giving up freesync anytime soon so fuck nvidia
Wow. PCs were so far ahead of consoles at the time! Even the XBOX! I think the lower difference this day is a result of consoles becoming much closer to PCs in their fundamental being. They're basically all just x86 machines. The main difference is the OS and the form factor.
Not really; the Xbox was released that year and it basically had a GeForce 3 in it, which is more or less similar to a Radeon 8500 Pro in performance (without filters). PCs always have the largest gaps over consoles just before a new generation is launched; this was just after. The gap only really widened after the R300 was released and crushed all of these cards into oblivion. So it was a good almost two and a half years of PC superiority until the Xbox 360 came along and completely bridged the gap again. And that was the last time consoles were anywhere near as powerful as a high-end PC of their time.
I remember looking at this card or the ATI 9700 Pro back when the 9700 Pro launched, glad I went with the ATI. :) In all seriousness, I recall reading something about the Parhelia being slow not only due to maturity of drivers but a supposed lack of Z-Occlusion Culling. Something about not rendering parts of the screen that would be obscured and out of sight anyway. Perhaps Matrox thought enough bandwidth would solve the issue instead?
Let's get this over with, as I need to be somewhere else: try the AUSTECH EN9500GT, you will be amazed. You will need an SPDIF cable to connect the SPDIF header on the sound card to the SPDIF header on the EN9500GT to get sound through the HDMI port.
I remember when my dad got an AMD Athlon 600 with a Matrox Millennium G400 and Windows 2000 as his business PC. I was mesmerized by the beautiful G400 TechDemo with EMBM (basically, per-pixel shaded water before the GeForce 3 and pixel shaders). ua-cam.com/video/P_YiZzi9PkU/v-deo.html I never had any first-hand experience with the Parhelia, but Matrox was too far behind in the competition with Nvidia and ATi back then.
Pretty neat GPU for its time. Have you tried messing about with resolutions higher than a monitor supports natively, using CRU (Custom Resolution Utility) or the drivers, if the graphics card lets you set custom resolutions, like that R7 240 that has Windows XP drivers? Does it allow GPU scaling so you can then make custom resolutions? Perhaps that works better with the GT 1030. It might be worth looking into if you force the scaling on the GPU rather than the monitor itself. For example, with an older CRT or TN panel that doesn't do 1600x1200 but natively takes 1280x1024, if you can force it to run at 1366x1024 or 1440x1050, you'd have an in-between resolution that's better than the monitor's native one. With my current GTX 780 I can force an HP 1740 to 1440x1050 at 60 Hz; it gets a bit too blurry past that, though on a CRT it would probably still be sharp. And 1366x1024 looks better while still being a bit higher than 1280x1024. Of course, if it's a good CRT monitor that does 1600x1200 without problems, it might do even higher resolutions than 1080p at a decent Hz.
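For a quick feel of what such custom modes demand: the pixel clock scales with total (active plus blanking) pixels times refresh rate. The blanking fractions below are rough CRT-era ballpark assumptions, not real GTF/CVT timing maths:

```python
# Rough pixel-clock estimate for the custom modes discussed above.
# Blanking overheads (25% horizontal, 5% vertical) are ballpark guesses.

def pixel_clock_mhz(width, height, refresh_hz, h_blank=0.25, v_blank=0.05):
    """Approximate pixel clock in MHz for a mode with assumed blanking."""
    h_total = width * (1 + h_blank)
    v_total = height * (1 + v_blank)
    return h_total * v_total * refresh_hz / 1e6

for mode in ((1280, 1024, 60), (1366, 1024, 60), (1440, 1050, 60), (1600, 1200, 60)):
    print(mode, f"~{pixel_clock_mhz(*mode):.0f} MHz")
```

The 1280x1024@60 estimate lands close to the well-known 108 MHz VESA figure, which suggests the ballpark is usable for judging whether an in-between mode like 1440x1050 is within a monitor's reach.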
Awesome ! I was really waiting for this video as it was such an amazing thing back then ! the only drawback was the awful performance and price of the matrox cards in general ... I used to have a G400 and it was really really bad ... switched to GF2 that was way better but quality wasn't there ...
Hey Phil, what's the verdict on VESA compatibility for the Parhelias? I've managed to come into a PCIe Matrox M9120 Plus LP x16, and was thinking of tossing it into an Athlon64 I'm setting up to run DOS games. I can't imagine much has changed on the VESA front since the Parhelia debuted, but any insight you have would be appreciated.
I always thought there weren't many games that supported triple monitor gaming back in the day. I didn't upgrade to dual monitors until around 2008, and even then there were only a few RTS games that supported it (the main one I got them for was supcom forged alliance). Dual monitors was pointless in FPS games, so it was either single or triple. A mate did have a triple setup, but this was also much later than the matrox era, and I remember he had to tweak it a bunch since he was running a larger centre screen and two smaller side screens. It just seemed like more hassle than it was worth. Even today I only run a single screen at home, although that's mostly for desk space concerns (it's a bit difficult to fit _two_ 32" screens on one desk!), although I do find myself missing being able to put a movie on the second screen while working on a project on the main screen.
Yea the Quake 3 engine games all support it AFAIK. Also a few flight sims. But yes, many games require fiddling around with config files and whatnot. It was definitely way ahead of its time.
Back when they first came out, Matrox cards were at a higher price point than ATI cards, so I never had a chance to try them. I pretty much avoided Nvidia too, because their drivers were not all that good; ATI had issues as well, but not as bad. Never heard of that card. Interesting video!
I remember reading about this card in Maximum PC; they used Max Payne to show off the 16x anti-aliasing abilities. It wasn't faster than the other cards at the time, so your results aren't that surprising.
Oh, I remember this card. It was a bit of a hidden dream for most people to own one of these, but since money was tight, they'd go with the safer, more future-proof cards. The 8500 Pro was also one of the consumer dreams I never fulfilled. How I regret buying a GeForce4 Ti4400 instead of that beast. But back then I'd always go for the very fastest card I could possibly stretch my budget to buy. And boy, did I stretch it to get that damn GeForce. I regret it so much.
As far as I can remember Matrox was always the "professional" brand. If you were a computer graphics oriented builder, it was Matrox that you were supposed to buy. Not their gaming performance but image quality and advanced features. Looks like Parhelia (always thought it was Parnelia) was feature rich, but lacking in performance.
And you can still buy new Matrox cards like the M9140 for around 660 dollars. But what's the point? New GPUs have just as good image quality and they are way faster. What am i missing here? Why are the Matrox cards still around? And for such a high price...
There might still be a niche market for people like Traders or security systems that need something like an "all in one" solution for multiple monitors. And having like 3 ATI 5450/6450 for all the monitors will be a nightmare in terms of heat and possible driver issues. But that's just a wild guess.
That ain't too bad, man, but in Lithuania you can buy a new Matrox M9188 2GB for almost 2000 euros, which is a bit over 2000 dollars. Really, why would anyone need it? You could literally buy two Radeon Pro WX 7100s and have lots of cash left. If you need lots of outputs, then maybe a low-end CrossFire setup will work just fine. Also, the max resolution of that Matrox card is only 2560x1600. I guess that's per monitor, but still quite low in 2018.
Hey Phil, I have an Intel DQ67EP mobo lying around. I would love to give it to you for a review. Other than that, I want to donate a shit ton of AGP and PCI cards (graphics cards, sound cards, modems etc.) and maybe some RAM. My problem is that I live in Germany and shipping is too expensive to call it a donation. Maybe we can find a way to do that.
Ah, I assume it's a less common card, not worth risking. I'm thinking the overclocking could make up for the lower base clock, which causes the lower performance compared to the Nvidia and Radeon cards.
The Parhelia is one of my favourite cards ever :) I have quite a few Matrox cards from the original Mystique to the more recent ones ua-cam.com/video/Pa08lCtqtzc/v-deo.html
This card shows good frame rates with maximum detail settings; it is near 60 fps with everything on. The 256 MB version is maybe better. People had no real reason to beat this company to death. If they buy like they vote, don't be surprised that we live in dictatorial states.
Used to love Matrox during the 90s :) I remember how cool the triple monitor option sounded, and in 1996 my parents bought a PC with a Millennium, so I followed Matrox but never bought another one. I do, however, still have the Millennium my parents bought back in 1996 :)
It's awesome playing around with parts that, back in the day, cost an absolute fortune and were just totally unreasonable in terms of cost :D
PhilsComputerLab I have to agree there :)
BEXY'S PC Matrox and Intel Tualatin😍
Triple screen in 2002, madness man!
That's what I would like to have today. I have a single GTX 980 card but it has 2x DP 2x HDMI 1x DVI-I. I wasn't sure how to make a game cover all monitors so I went back to 1 monitor.
I have a Parhelia in my Tualatin 1.4GHz system. It's a really cool card, despite the somewhat lackluster performance. The big problem was that Matrox was primarily catering to the professional market segment, and because of this they often focused on developing clever new technology and delivering long-term product support rather than pure performance. It is very hard to deliver a professional product for broadcast, medical and control-system usage that also works well as a high-end gaming card, especially if you need to deliver it at consumer-level prices. In the end, Matrox decided to pull out of the consumer market and stopped producing their own GPUs altogether, but they are still creating products for many different professional market segments.
Until NVIDIA figured it out with Quadro's, heh.
Back then Microsoft was releasing too many versions of DirectX, and only ATI and Nvidia were able to keep up. If DirectX had stayed at 8.1 until at least 2004, it might have led to a three-way war. The flood of DirectX versions also didn't give game devs enough time to optimize their games, much less the driver teams, which was something ATI needed to fix.
So a 15 year old card that supports surround gaming, ultrawide resolutions and HDR.
Why did it take the other companies so long to get that out to the consumer mass market?
Was there really a market for a triple screen with bulky CRT monitors? And HDR wasn't really feasible for a few more years, because the computational power it needed simply wasn't there.
Snetmot Nosrorb - VGA monitors back in the day were driven with analog signals, so you could get the benefit of 10-bit on every monitor; no support was needed from the monitor at all. And I personally had a triple-screen setup for a while, using a 16:9 21" TFT as the center with two 17" CRT monitors at 1280x1024 as side monitors. It is quite bulky and you need a big desk to set it up, but it worked like a charm.
Now that I have an RX 580 with 5 monitor outputs on one single card, and have 5 TFTs around, I might set up a 5-screen surround setup some day.
Yeah, but they invented cars with diesel motors before oil was invented. There was absolutely no need for this stuff, especially when gaming with these options was almost impossible.
@@snetmotnosrorb3946 Yes. I wasn't even a well-off gamer, but you could get a CRT second-hand for a very reasonable price that could do 1200p at 85 Hz. Not many years after that I had a 3-monitor setup with three 23" CRTs that would do 2560x1600@85Hz. Yes, it was bulky, and... guess what... I still regret trading it off (the guy's still using them...).
Deep colour wasn't available in consumer cards until 2009. Quadro and FireGL got it in 2006. Without that HDR is pretty useless, as it will introduce banding. And computers needed to be able to feed the data to the cards, billions of colours instead of millions. Not feasible in a time when people still used 16-bit colours for performance reasons.
I've always wanted to cover this card! I'm glad you tested Serious Sam: SE as that's one of the few games that can take advantage of the quad-texturing architecture of the chip.
Nice, that explains the great performance in that game :)
Ana E Didn't help that the design of the chip was slightly broken. It was supposed to be the first DX9 card but the vertex shaders fell short of the spec.
I wonder how things would have turned out if Matrox had gotten it right.
Every time I watch your videos I wish I had a time machine.. So I can send these to myself in the past =)
The precursor to AMD Eyefinity and NVIDIA Surround.
I love watching this stuff. It's unfortunate I missed out on this stuff growing up. My interests were elsewhere at the time.
I have a Matrox card in my PIII 333mhz machine and the VGA output is incredible. It honestly blew me away. I've only ever known VGA to be a fuzzy mess on these modern machines.
Yup that's what they are known for :)
333 Mhz? Or 933mhz?
I had an overclocked Matrox G400 Millennium Dual-Head back in the day, and I always wanted to upgrade to a Parhelia for the triple monitors, but the card was very pricey, so I never did. Matrox is still around, though, and still making unique video cards and other things.
I was a huge Matrox fan back in those days. Thanks for the nostalgia trip.
Matrox made the typical (grave) mistake with the design of the Parhelia: overspending the transistor budget on vague, corner-case features at the expense of raw performance relative to contemporary market needs. This is the very reason they opted for the wide 256-bit interface: to compensate for the lack of HSR and depth-buffer compression logic in the GPU. Yet the vertex shader units were ridiculously over-engineered (fully DX9 compatible) and actually capable of primitive tessellation, completely overwhelming the following pixel shader stage with more geometry than it could process.
The choice of bolting four texture units onto each pixel pipe, at a time when game developers were clearly expecting and demanding a more flexible pixel shader model and moving away from simple texture combining, was more likely a gamble to compensate for the lack of next-gen pixel shader capabilities.
Triple monitor output and the 10-bit HDR color format were obviously way ahead of their time, and the GPU fill rate and memory bandwidth weren't up to providing usable frame rates with all those features enabled.
And triple monitor output wasn't really all that viable with CRTs, due in large part to their heaviness. Once crystal based solutions started taking off (LCD, LED, OLED, etc.), triple monitor surround setups became more viable, though, not everyone uses them, especially not streamers. And some, like Wide As Fuck (or WAF for short), prefer ultrawides to surround setups.
Digital monitors/displays is a better term for what you talk about, Link.
"LED-display" is just a misleading term for an LCD-screen with LEDs for backlight instead of CCFL.
OLED is something completely different from an LCD, the diodes actually make up the pixels.
It was a stupid mistake to use one 256-bit interface instead of 4x 64-bit like R300 did.
Brilliant review of such a rare and unique card, love the comparisons!
Although Matrox graphics cards are popular in older gaming computer configurations, Matrox is primarily for business and CAD applications. It would be interesting to run commercial application tests against a Quadro FX 2000.
Best era of "have your cake and eat it too": a Matrox Millennium with its insanely crisp VGA output for my "pro" 2D/3D app needs, and then a Voodoo2 for gaming. What a thing it was!
I gathered quite a crowd behind me at lanparty when I was playing Quake 3 with three CRT screens using a Parhelia. It was just a slideshow but still cool.
3 CRTs? Man that would have been a pain to cart around and setup :)
Hey Phil. First off, I really liked this video! Just about a month ago I got a few used Parhelia cards, as I'd always wanted to try one out. With the Parhelia I noticed there is a big difference in performance between driver versions. In MDK2, for example, there was a 36% difference between the fastest (105_107) and the slowest (111_000_114_HF). The newer drivers tend to be much slower in OpenGL. I also got somewhat different results: for me the Parhelia was faster than the Radeon 8500 in UT2004 and in Halo (even with 1.1 shaders), but that may be down to the drivers and settings used in those games. Another thing I'd like to point out is that the GeForce4 only seemed to give me a 16-bit z-buffer in a number of OpenGL games, unlike the Parhelia, Radeon 8500 and even the GeForce3. Serious Sam SE was one of those, although you can enable a 24-bit z-buffer with a command. Finally, the reason the Radeon doesn't take much of a performance hit with AF enabled is that it only does bilinear filtering with AF enabled, so the mipmaps don't really get blended together. The AF on the Radeon is also fairly angle-dependent, which isn't really noticeable in your example. You could maybe try something like Morrowind, which has more uneven terrain.
I looked into the AF of the RV280 GPU recently in some games, and I found that you have to be at quite extreme angles, like 45 degrees, to see AF basically stop working. But with slight angles, it still looks razor sharp. I miss this kind of detail in a lot of discussions; most just repeat what's on Wikipedia but haven't actually tried it. The issue with the mipmaps can be seen with tools (in Quake III you can color them in), but in normal gameplay I struggle to see the issue. But that's just me :)
Basically, Matrox did not give a damn about OpenGL. It took a lot of programmer time to create good GL drivers while for D3D, part of the front end was programmed by Microsoft so it is easier to create D3D drivers. From what I heard, Matrox's GL driver was a GL to D3D wrapper, so it is normal that performance suffered.
I was an Nvidia/3dfx guy back in the late 90s and the start of the 00s. Yet I dreamt about Matrox. It looked so different, and I had dreamt about a multi-monitor setup since way back in the 80s. I sure remember when the Parhelia first came out. Other than that, I was never a fan of mainstream stuff. I wanted a Sound Galaxy and a GUS, yet ended up with the Value edition of the SB16. That said, I love my G400s and my TNT2s. Not a big fan of the GF4, as it feels too polished. I like Radeons though. Yet it can't be denied that GeForce cards are awesome in their own right. Performance is good, yet niche tech and quirky tech are just pure awesomeness. Like MPEG-1 decoder cards in full-length ISA versions.
The 9000 series Radeons are great value, still overlooked somewhat :)
PhilsComputerLab True that. I think that anything faster than 9600, is way overkill for Win98se. Though anything 9600 and faster is suitable for WinXP. Personally I am not ready for WinXP yet. It is still too new for vintage. And retro/nostalgia for me is Win98se and down.
Very nice review, cheers!
The Parhelia is the only Matrox card I've never played with. Looks cool though, I may need to hunt one down.
Massive kudos for the comparison screens where you flip between the three cards so we can see the visual quality! That's impressive reviewing and most helpful. :)
Thank you! I do want to cover more image quality comparisons, it's easy to just focus on FPS :)
As i said. This channel is gold! Awesome reviews..
Those screens of X2: The Threat just brought back memories. To me it is still the best game in that series. To be honest I would have thought that this card would struggle with that game but I guess I am just forgetting how long ago it was.
Ah, the long awaited video, thanks Phil!
The Parhelia had the biggest silicon of the three cards in the 2002 generation. I'm pretty sure that Matrox lacked driver knowledge compared to Nvidia and ATI. The same happened with the G400; however, the G400 was able to compete in performance. Unfortunately, that was not the case with the Parhelia. Nonetheless, it was a strong GPU and had the best image quality output and best shader version support of the three cards. I missed the opportunity to buy this card one week ago for 30€. I regret that I didn't buy it.
It was not just the drivers; the lack of any kind of Z-compression made it bandwidth-starved even with the 256-bit bus, and they also threw too many TMUs onto each pipeline, giving the card free trilinear at a time when anisotropic filtering was already the holy grail (though AF is not all equal; for example, the pre-R300 Radeons had ugly bilinear dithering on their AF, which only manifests in motion, i.e. you can't see it in Phil's static Wolfenstein scene), but still, AF was the future. In other words, they made a very big and beefy but technologically old GPU that focused on the wrong things. If they had spent all those transistors on pure fillrate and shader performance, and the card had Z-compression, it would have been a monster. I have a Parhelia, actually both the AGP and the late PCI-E version, and there is nothing to want there. It gets trashed by the GF4 Ti. The output quality is also half a hoax: yes, it's really good (the analog output, that is; its DVI is trash, the GPU does not natively support it, it uses the SIL A-D chip to do it, and it maxes out at 1280x1024), but good branded Radeon and GeForce cards are also good, so it's no real advantage. The 10-bit color is actually a valid advantage, but if you want that in 3D it will cost you performance, a thing you are already lacking. I don't know... I don't like the Parhelia that much.
I'll have to look at AF on the R200/RV again, I do know of the shortcomings on paper, but I struggle to notice it in games. I'll make sure to capture some gameplay the next time I do something with those GPUs :)
I could get one for 30€ right Now. Should i do it?
It's easy to see: take some game with high texture resolution (UT2004 for example) and you will see that the distant textures are sharp, but in motion they will have shimmer on them, with visible lines where the LOD changes; that's the bilinear dithering. But you can't show this on YouTube, because most of that shimmer will be lost in the low-bitrate video compression. Different people also see different things and details. I, for example, see differences between modern AMD and NV GPU AF filtering (one of the reasons I have Nvidia), but most people are completely oblivious to it. The important thing is always how it looks in motion. On a static screen, having no filtering at all would look amazing, but the shimmer in motion would be unbearable.
I would.
Nice video, I didn't know older graphics cards could do triple screen support (which modern cards do nowadays). I remember a graphics card that looked similar to the Radeon 9700 Pro (the red one), because at one time I had a stock of surplus graphics cards.
Matrox always made interesting, progressive graphics cards with innovative features, the problem was performance and support. Turned out after Parhelia they went the "professional graphics" route, i.e. multi-monitor cards.
Phil, it looks like you have the refreshed Parhelia card that can overclock to 300 MHz. That was the speed Matrox wanted the card to operate at.
need mo card reviews like this man highlight of my week dat friday wideo
watchin the game. mikes harder purple drink.. uefi bugusnes ill deal with it later. go lakers
im having issues with a prototype system. and Rufus. nothin huge
The Zalman-clone HSF on your Ti4600 brings back some great memories. That design, particularly Zalman's full copper version (vs your aluminum type), was truly exceptional and provided seriously excellent heat dissipation.
I always dreamed about a triple monitor setup with that matrox card back then but i couldn't afford it. Now that i have a triple monitor setup i cannot go back. It is just so damn convenient.
Back in 2002 CRT monitors were still standard. I couldn't imagine 3 CRT monitors on a single desk back then.
I know that my comment doesn't have anything to do with this video, but I have to say I love your channel. I'm a simple computer enthusiast and I like to play with hardware, especially older hardware, and your channel and the videos you make are awesome. I found a lot of useful information related to older hardware; the only difference compared to you is that I play with Linux operating systems. I have a few Socket A processors, DDR1 RAM at 400 MHz (4x256MB) and a GeForce4 MX440 GPU left over from my old hardware, sitting in a cardboard box I found a few weeks ago. I'm in search of a Socket A motherboard and an ATA hard drive to complete my configuration (actually found a few in my country), and want to build a configuration which will run a Linux system. Also, I found my old SkyStar2 digital satellite card, and the main use of this computer will be to run that card, to listen to satellite radio stations and watch a few FTA channels.
Nice interesting reviews
Matrox always had excellent OS/2 drivers and performance.
Dude, I had forgotten how good things looked back then. We get wrapped up in modern hardware and forget there were those that came before...
My very first accelerator card was a matrox mystique back in 97 or so. Not a great card driver wise and the performance was also pretty poor but that box art was amazing and the card will always have a special place in my heart.
Also, I just have to say, after hearing it enough times I honestly love how you pronounce Wolfenstein. Please use it more just so I can hear you say it.
striderx777 that's how one should pronounce it, since it's German
Source: I speak German
I remember reading about those, I've never seen one in real life.
Actually, the one thing that I remember about the parhelia is triple monitor support. I was always amazed by that back in the day.
I would like to see overclocking on this card and how much performance you can get out of it! ;)
I always remember Matrox cards having such good image quality.
No, there were more companies than just Matrox. S3 with their Chrome series of cards, SiS with the Xabre cards and Trident with their XP series (not to be confused with their Blade XP chip). In 2003 SiS' graphics division spun off and acquired Trident Microsystems to form XGI and released the Volari cards in 2003/2004.
Edit; small mistake; S3's Chrome cards were from 2004 onward. My Bad. So just SiS and Trident in 2002.
And they all sucked... the only one able to compete was the Kyro II from 2001, and it was only capable of beating a GeForce2 MX...
they shoulda continued makin' cards for gaming on tha Matrox side
Apparently, the AF in the Parhelia was only 2x, which explains why the textures were so much blurrier. Plus, of course, with the Radeon 8500, you still had the whole mipmapping scandal that was going on back then.
Only 2x? What a letdown, all that great IQ...
Right? I'm sure the Ti4600 and Radeon 8500 would look similar when set to 2x. I'm still trying to search online for a way to get the Parhelia to work at 8x or even 16x AF.
I had one and loved it at the time. But the anti aliasing technique it used gave me a white line bug in some games if I remember correctly. And again as you said the performance wasn't the best and I upgraded for that reason.
I haven't had this issue, but then when I do these videos I benefit from being able to use newer drivers, so maybe it's something that got fixed.
Keep up the fantastic work! It would be really cool if you got some Hercules 3D Prophet graphics cards.
The merge of 3dfx with matrox, the lost rendez-vous, the forgotten alliance.
My dad got me a Mystique so I could play Moto Racer, and surprisingly the game came in the bundle (the one shown on Wikipedia for the retail release of the card was the same, apart from having Destruction Derby 2). A decade later I noticed my dad had an unused Parhelia, and used it in a computer for my "stepbrothers" (the little one was a bit of a "crap", so as punishment his PC is back to motherboard integrated graphics), and while it did perform better in Battlefield 1942 than the integrated graphics... it wasn't by much. I still wonder if there's a way to OC the card.
This card is a real victory, a must-have for Halo and Far Cry... those types of games were among the most loved. This card was a rendezvous with the future. The Parhelia was just one model in this series; were customers aware of this?
I want one. My modern machine with the 1080 Ti has triple screens. I love surround gaming.
Really cool, please drop me a link to download the Matrox Parhelia Reef tech demo.
Check my website under software > tech demos.
As I always say, you're the master :D
I'd love to have each and every device you've shown in this video. AMD/ATI rocks; the newest AMD graphics cards have ReLive support, which makes them affordable cards when software for capturing is ridiculously priced.
The newest cards? As far as I know, cards back from 2012 can use ReLive.
kjjustinXD that's great news
ReLive isn't that great though. It has twice the latency of Nvidia's solution (15-16 ms vs 8 ms). Not that good for streaming games to other devices (like a Steam Link).
GraveUypo mine works flawlessly with the RX 460. ReLive has no noticeable lag at 1080p playing Overwatch. At medium settings, 300 FPS capped, 115 FPS under load. Nvidia is also expensive and doesn't maintain driver support for their older chipsets, which forces the customer to purchase another, more expensive Nvidia card. I have not experienced this with AMD's products.
Well, it's only 15 ms after all. There are screens laggier than that. If you're streaming to a TV, most of the lag probably comes from it, not from the streaming.
But it's still a little bit laggier. But yeah, not enough for me to care at all. I won't be giving up FreeSync anytime soon, so fuck Nvidia.
With consoles connected via a LAN switch, Gran Turismo 4 allows playing on up to 5 monitors, if I remember correctly.
Are you seriously comparing LAN gaming on multiple devices with multi-screen on ONE device ?
Looks hella nice! I've only tried triple screen gaming when I had a GTX 980, and 5760x1080 was too much for it.. :)
Wow. PCs were so far ahead of consoles at the time! Even the Xbox! I think the smaller difference these days is a result of consoles becoming much closer to PCs at a fundamental level. They're basically all just x86 machines. The main difference is the OS and the form factor.
Yea this tech was so far ahead of its time. 2002! Unreal.
Not really, the Xbox was released that year and it basically had a GeForce 3 in it, which is more or less similar to a Radeon 8500 Pro in performance (without filters).
PCs always have the largest gaps over consoles just before a new generation is launched. This was just after.
The gap only actually widened a lot after the R300 was released and crushed all of these cards into oblivion. So it was a good almost two and a half years of PC superiority until the Xbox 360 came along and completely bridged the gap again. And that was the last time consoles were anywhere near as powerful as a high-end PC of their time.
I remember looking at this card or the ATI 9700 Pro back when the 9700 Pro launched, glad I went with the ATI. :) In all seriousness, I recall reading something about the Parhelia being slow not only due to maturity of drivers but a supposed lack of Z-Occlusion Culling. Something about not rendering parts of the screen that would be obscured and out of sight anyway. Perhaps Matrox thought enough bandwidth would solve the issue instead?
Let's get this over with, as I need to be somewhere else: try the AUSTECH EN9500GT, you will be amazed. You will need an SPDIF cable to connect the SPDIF connection on the sound card to the SPDIF connection on the EN9500GT to get sound through the HDMI port.
I remember when my dad got an AMD Athlon 600 with a Matrox Millennium G400 and Windows 2000 as his business PC. I was mesmerized by the beautiful G400 tech demo with EMBM. (Basically, per-pixel shaded water before the GeForce 3 and pixel shaders.)
ua-cam.com/video/P_YiZzi9PkU/v-deo.html
I never had any first-hand experience with the Parhelia, but Matrox was too far behind in the competition with nVidia and ATi back then.
Pretty neat gpu there for its time.
Have you tried messing about with higher resolutions than a monitor supports natively, either with CRU (Custom Resolution Utility) or with the drivers if the graphics card lets you set custom resolutions, like that R7 240 that has Windows XP drivers? Does it allow you to do GPU scaling and then make custom resolutions?
Perhaps that works better with the GT 1030.
Might be worth looking into if you force the screen scaling on the GPU rather than the monitor itself.
For example, on an older CRT or TN panel that doesn't do 1600x1200 but natively takes 1280x1024, if you can force it to run at 1366x1024 and 1440x1050, you'd have resolutions in between that are better than the native resolution of the monitor.
With my current GTX 780 I can force an HP 1740 to 1440x1050 at 60 Hz, as I mentioned; it gets a bit too blurry past that though, but on a CRT it will probably still be sharp.
But 1366x1024 looks better while still being a bit higher than 1280x1024.
Of course, if it's a good CRT monitor that does 1600x1200 without problems, it might do even higher resolutions than 1080p at a decent refresh rate.
Awesome! I was really waiting for this video, as it was such an amazing thing back then! The only drawback was the awful performance and price of the Matrox cards in general... I used to have a G400 and it was really, really bad... switched to a GF2 that was way better, but the quality wasn't there...
Nice Parhelia tech demo! Looks awesome.
No CRT. :( You really can't show off the Parhelia without a CRT because that's the only way to get the higher color depth.
Hey Phil, what's the verdict on VESA compatibility for the Parhelias? I've managed to come into a PCIe Matrox M9120 Plus LP x16, and was thinking of tossing it into an Athlon64 I'm setting up to run DOS games. I can't imagine much has changed on the VESA front since the Parhelia debuted, but any insight you have would be appreciated.
VESA compatibility? I didn't even consider using such a card for DOS :D
Are there any games from that era that supported triple screens?
I honestly thought the first triple screen gaming was from AMD with eyefinity
I always thought there weren't many games that supported triple monitor gaming back in the day. I didn't upgrade to dual monitors until around 2008, and even then there were only a few RTS games that supported it (the main one I got them for was SupCom: Forged Alliance). Dual monitors was pointless in FPS games, so it was either single or triple. A mate did have a triple setup, but this was also much later than the Matrox era, and I remember he had to tweak it a bunch since he was running a larger centre screen and two smaller side screens. It just seemed like more hassle than it was worth.
Even today I only run a single screen at home, although that's mostly for desk space concerns (it's a bit difficult to fit _two_ 32" screens on one desk!), although I do find myself missing being able to put a movie on the second screen while working on a project on the main screen.
Yea the Quake 3 engine games all support it AFAIK. Also a few flight sims. But yes, many games require fiddling around with config files and whatnot. It was definitely way ahead of its time.
Cool. Yeah, I remember it wasn't easy in 2008, so it must've been tough to run a triple screen rig in 2002 or 2003!
Back when they first came out, Matrox cards were at a higher price point compared to ATI cards, so I never had a chance to try them. I avoided Nvidia pretty much too, because their drivers were not all that good. ATI had issues as well, but not as bad. Never heard of that card. Interesting video.
I remember reading about this card in MaximumPC and they used Max Payne to show off the 16x anti aliasing abilities. It wasn't faster than the other cards at the time so your results aren't that surprising.
wow, this is basically like 4k res in 2002
Thanks! I want one!!
Did matrox make any card faster than the parhelia with their own chips?
Oh, I remember this card. It was a bit of a hidden dream to own one of these for most people, but since money was tight, they'd go with the safer, more future-proof cards.
Also, the 8500 Pro was one of the consumer dreams I never fulfilled. How I regret buying a GeForce4 Ti 4400 instead of this beast. But back then I'd always go for the very fastest card I could possibly stretch my budget to buy. And boy, did I stretch it to get that damn GeForce. I regret it so much.
As far as I can remember, Matrox was always the "professional" brand. If you were a computer-graphics-oriented builder, it was Matrox that you were supposed to buy. Not for their gaming performance, but for image quality and advanced features. Looks like the Parhelia (always thought it was Parnelia) was feature-rich but lacking in performance.
I have those exact heatsinks on my Raspberry Pi, lol
It's interesting, can the Parhelia run this demo on a 21:9 monitor?
Buy me one and I'll try it :)
Nowadays we can just buy an ultrawide 35-inch screen or bigger, but they didn't have that back then, I don't think?
Did you try overclocking to like 275 MHz? xD Or maybe even 300.
Cute
Do the same test with a Core 2 Duo or a quad-core, let's see the victory of this bandwidth.
Wonder if you could run the resolution that high on an ultrawide 35-inch screen too, on such an old card.
Send me that monitor and I'll find out :D
Does the Matrox Parhelia have Win9x drivers?
Yes.
And you can still buy new Matrox cards like the M9140 for around 660 dollars. But what's the point? New GPUs have just as good image quality and they are way faster. What am I missing here? Why are the Matrox cards still around? And for such a high price...
TriGGletyplay I know they are multi monitor focused cards, but is that it? You pay 600 dollars to hook up 9 displays? What a f*cking ripoff
There might still be a niche market, for people like traders or security systems that need something like an "all in one" solution for multiple monitors. And having like 3 ATI 5450/6450s for all the monitors would be a nightmare in terms of heat and possible driver issues. But that's just a wild guess.
r4ndom reviews True... I asked before thinking it through. Thanks for answering.
That ain't too bad man, but in Lithuania you can buy a new Matrox M9188 2GB for almost 2000 euros, which is a bit over 2000 dollars. Really, why would anyone need it? You could literally buy 2 Radeon Pro WX 7100s and have lots of cash left. If you need lots of outputs, then maybe low-end CrossFire will work just fine. Also, the max resolution of that Matrox card is only 2560x1600. I guess that's per monitor, but still quite low in 2018.
Many high quality outputs on a single low profile card mainly
Oh wait, I thought this was Budget-Builds Official.
Hey Phil,
I have an Intel DQ67EP mobo laying around. I would love to give it to you for a review. Other than that, I want to donate a shit ton of AGP and PCI cards (graphics cards, sound cards, modems etc.) and maybe some RAM. My problem is that I live in Germany and shipping is too expensive to call it a donation. Maybe we can find a way to do that.
How does that DVI to Dual VGA cable work?
It has one DVI connector on one end, then splits into 2 female VGA connectors.
I have the PC-X Version of it.
I need Win98 drivers...
Interesting video but uh, Parhelia sounds like some disease you'd get after vacationing somewhere they tell you not to drink the water.
How does it do overclocking?
I don't like overclocking retro parts, though I sometimes make an exception.
Ah, I assume it's a less common card, not worth risking. I'm thinking the overclocking could make up for the lower base clock, which causes the lower performance compared to the Nvidia and Radeon cards.
The Parhelia is one of my favourite cards ever :) I have quite a few Matrox cards from the original Mystique to the more recent ones ua-cam.com/video/Pa08lCtqtzc/v-deo.html
Nice. My favourite is the G400 MAX. All their cards are so well built, and best VGA signal quality hands down.
Probably Matrox at their consumer height with the G400MAX (once they sorted their OpenGL shenanigans!) mmm EMBM :)
Use GPU-Z v1.7.4.
This card shows good frame rates with maximum detail settings. It's near 60 FPS with everything on. The 256 MB version is maybe better. People had no real reason to beat this company to death. If they buy like they vote, don't be surprised that we live in dictatorial states.