Scoop 2: Solid G550 news

A Matrox rep at the AMD Tech Team dinner in Philadelphia tonight provided
solid info regarding Matrox’s current and future plans. To quote: “G800
is completely dead.” Apparently, the G800 was “based on G400 design” and
featured doubled pipelines, 128-bit DDR memory, hardware transform and
lighting, and some form of dual-chip technology. A slimmed-down version
will be hitting the market shortly, however. The G550 is a single-chip
solution featuring the same doubled rendering pipelines and 128-bit DDR
support, but only a limited form of T&L.

According to the rep, Matrox analyzed T&L and found only one feature that
needed immediate implementation: four-matrix vertex skinning. For those who
didn’t read up on the Radeon’s new tech, this skinning method allows two
polygon objects to be “glued” together, with the hardware providing an
interpolated join, resulting in smoother-looking joints in 3D models,
especially when moving.
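
In code terms, the technique amounts to something like the sketch below (a
minimal illustration with made-up names, not Matrox’s or ATI’s actual
interface): each vertex carries weights for up to four bone matrices, and
the hardware blends the four transformed results into one position.

    /* Four-matrix vertex skinning, illustrative only: transform the
     * vertex by each of four bone matrices and blend the results with
     * per-vertex weights that sum to 1. The weighted blend is what
     * smooths the joint where two rigid meshes meet. */
    typedef struct { float m[3][4]; } Mat34;   /* 3x4 affine transform */
    typedef struct { float x, y, z; } Vec3;

    static Vec3 xform(const Mat34 *b, Vec3 v)
    {
        Vec3 r;
        r.x = b->m[0][0]*v.x + b->m[0][1]*v.y + b->m[0][2]*v.z + b->m[0][3];
        r.y = b->m[1][0]*v.x + b->m[1][1]*v.y + b->m[1][2]*v.z + b->m[1][3];
        r.z = b->m[2][0]*v.x + b->m[2][1]*v.y + b->m[2][2]*v.z + b->m[2][3];
        return r;
    }

    Vec3 skin_vertex(Vec3 v, const Mat34 bones[4], const float w[4])
    {
        Vec3 out = { 0.0f, 0.0f, 0.0f };
        int i;
        for (i = 0; i < 4; i++) {      /* blend the four transforms */
            Vec3 t = xform(&bones[i], v);
            out.x += w[i] * t.x;
            out.y += w[i] * t.y;
            out.z += w[i] * t.z;
        }
        return out;
    }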

The G550 will,
of course, feature Matrox’s great eDualHead tech and their renowned 2D
clarity. The card is expected to fall between the GeForce 2 MX 400 and
the GeForce 2 Pro in speed, most often falling just behind a GeForce 2 GTS.
The G550 is expected to sell in the $150 to $200 price range and be
available within the next few weeks, “end of May at the latest.”

In a more nebulous statement, the rep said, “Remember when G400 MAX first
came out, and it was fastest thing and looked best? We are doing that
again in first half 2002. We will be kings of 3D again, until someone
jumps past us again. You know how 3D is.” Sounds like G1000 will be a
kicker next spring. Here’s hoping Matrox can field a six-month cycle and
get back in the game.

Comments closed
    • Anonymous
    • 18 years ago

    <flame>
    AG#61 (MR Amiga) : I disagree. I would say you have big problems. You’re probably around 30 years old if you’re into Amiga equipment. I hope you get laid before you’re 60. Obviously you wank too much especially when you hear the word “Commodore”. Get out more and try to improve your personal hygiene.
    </flame>

    • Anonymous
    • 18 years ago

    wtf is scrooling??

    • Anonymous
    • 18 years ago

    if those nvidia cards can only do scrooling instead of proper scrolling then who on earth needs one!!!

    You won’t see a Matrox scrool. Matrox is [H]ard!

    fishpowder

    • Anonymous
    • 18 years ago

    Yeah MR amiga : )

    just wait for the ‘one’

    • Anonymous
    • 18 years ago

    matrox make the bestest cdroms ever!!!!!!!!1

    don’t knock matroxxxx!!!!!!

    • Anonymous
    • 18 years ago

    Gene / lenzenm : Thankx for the voice of rationality!

    How fortuitous! I’m waiting for AMD’s Palomino desktop CPU before building a new system (unless they’ve left out thermal protection). I will probably put a Matrox G550 in it.

    I’m no lackey to any manufacturer, but I’ve had my G400MAX for over a year now and it has been simply fantastic. True, it’s not the fastest card these days, but it’s certainly no slowcoach either. As a professional software developer who routinely (no pun intended) programs the DirectX API, I can say that Matrox cards are exceptionally well designed and by far meet my needs the best. Don’t get me wrong, the GeForce & Radeon ranges are fine, fine products in their own right, but I wouldn’t swap the Matrox feature set for a 10-40 fps increase in games.

    To the gerbil that complained about driver support, WTF? Stop talking out of your rectum, kiddie. Matrox drivers piss from a great height over just about all other manufacturers. Regularly updated with performance improvements, fixes and always rock solid. (although to be fair I always found 3Dfx’s to be very stable as well).

    And to the game-lamers who fawn and bray about how much better 201fps on an XYZ video card is *so* much better than 200fps on an ABC video card, and how the world would be a better place if everyone owned the same card as them… Grow up. You lamers clearly have an inferiority complex and only want to believe that you have the most super-duper PC in the world. If it’s any consolation, any half decent computer pro who knows what they’re talking about will laugh at most of your BS.

    Regards, Maka

    • Anonymous
    • 18 years ago

    I’m feeling [H]ard today!

    • Anonymous
    • 18 years ago

    Being an AMIGA user I don’t have any problems… Sorry
    (har har). MR Amiga.

    • Anonymous
    • 18 years ago

    The G400 series is made in Canada..
    However, the G450 series is made in China…

    Does anybody know where the G550 series will be made? Canada or China?

    • Anonymous
    • 18 years ago

    Hi, all Matrox users..

    I have been using Matrox cards from the start and have never bought
    anything else. They have always been the nr. 1 graphics card,
    and still are.

    Now.. let’s talk about nVidia vs Matrox. I have used an
    nVidia GTS for some time now.. and the GPU is very fast..
    every game runs fast with lots of FPS etc. One thing I wonder about is when you take the nVidia card’s transform and lighting out of the equation.. you will see that the card is much slower than the G400 in Direct3D. The G400 works just fine with Direct3D without T&L, and every game I tried worked well on this “old” card.

    Is the GTS slower than the G400? No, I don…

    • Anonymous
    • 18 years ago

    I have an IBM A21p with a 15″ LCD. It runs 1600×1200 and it’s just beautiful.
    It’s amazing how much desktop space you get, but for people with poor eyes it’s definitely not the way to go.

    DVI = Great!!
    Analog = AnalGo, ASS

    • Gene
    • 18 years ago

    Oh yeah, I think I remember a company that did that a while back…what was their name, I’m sure it had something to do with computers…oh yeah, now I remember…it was AMD.

    • lenzenm
    • 18 years ago

    Gerbil #55:
    1) Matrox debuted the G400MAX at ~$300 US retail.. it didn’t sell well at that price.
    2) Matrox sells cards that cost over $1000 US; they are special-purpose, non-gaming cards, however. Their G200-cored, eight-monitor card comes to mind.
    3) Matrox is still primarily an OEM supplier for office vendors… some secretary doesn’t need T&L, they need crisp 2D, for <$100.
    4) Even in the gaming market, the $150 US cards make more money for the manufacturer just through sheer volume sold. The R&D money that Matrox would have to spend to develop a $300 US card would never be recovered if the product was late or flopped… they are playing it “safe”.
    5) This is a “placeholder” card, until they can justify the R&D cost for the $300 card.

    Would I like to see a $300 card from Matrox that would be worth $300? Hell yes. Will we soon? Dunno. Realistically, this is probably the best route for them. Build a reputation with a low-cost card, then launch the high-dollar one.

    • Anonymous
    • 18 years ago

    If a $150 card performs like a GeForce2, wouldn’t a $400 card outperform the GeForce3?

    Why don’t Matrox make expensive cards?

    I’m tired of the “budget price” marketing ploy. *cough*Kyro*cough*

    • Forge
    • 18 years ago

    Uh oh. Sounds like my praise of ClearType has got Twofer investigating WinXP…. Don’t do it, Twofer!! The ClearType is the only current redeeming value!

    • TwoFer
    • 18 years ago

    You’re welcome, Hallucinosis — I’m glad it went well for you. Yeah, you’ll see the biggest change at high rez/large color depth/high refresh rate, because the RFI choke’s cutoff is below the bandwidth you’re running the monitor at… In both my and Forge’s experience, you’ll get another nice increase if you short the inductors, too.
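
    To put a number on the bandwidth in question (back-of-envelope only;
    I’m assuming the typical ~25% CRT blanking overhead, so exact VESA
    timings will differ a little):

        /* Approximate pixel clock the card's RAMDAC and output filter
         * must pass cleanly at 1600x1200 @ 85 Hz, with an assumed 25%
         * blanking overhead. */
        #include <stdio.h>

        int main(void)
        {
            double clock = 1600.0 * 1200.0 * 85.0 * 1.25;  /* ~204 MHz */
            printf("pixel clock ~ %.0f MHz\n", clock / 1e6);
            return 0;
        }

    A consumer RFI filter whose cutoff sits anywhere below that figure is
    attenuating exactly the signal you paid for.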

    If that Toshiba screen’s running at 1600×1200, your horizontal pitch is under .18mm — pretty fine! If you’re only at 1280×1024, the horizontal pitch is .22mm, which is probably the same as your Hitachi (dunno which way they’re measuring pitch). To me, an LCD screen looks very crisp, because there’s no bleed or misfocus — the transistor’s either on or off, unlike the potential errors in focus/aim on an electron beam.

    But because of that, aliasing can be a real problem — it’s hard to cheat like the analog CRT can, so very small fonts get distorted and so on. Which is what Microsoft’s ClearType is all about — they address the RGB subpixels separately, to do antialiasing at a rez approximately three times better horizontally than the “native” max of the LCD. Works, too…
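
    In code terms, the core of the trick is roughly this (a sketch of the
    technique, not Microsoft’s actual implementation; the filtering that
    tames color fringing is left out):

        /* Subpixel text rendering: sample glyph coverage at 3x the
         * horizontal resolution, then map each group of three samples
         * onto the R, G and B stripes of one LCD pixel. Assumes black
         * text on a white background and an RGB-ordered panel. */
        typedef struct { unsigned char r, g, b; } Pixel;

        void subpixel_row(const float *cov3x,  /* 3*width samples, 0..1 */
                          Pixel *row, int width)
        {
            int x;
            for (x = 0; x < width; x++) {
                row[x].r = (unsigned char)(255.0f * (1.0f - cov3x[3*x + 0]));
                row[x].g = (unsigned char)(255.0f * (1.0f - cov3x[3*x + 1]));
                row[x].b = (unsigned char)(255.0f * (1.0f - cov3x[3*x + 2]));
            }
        }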

    • Hallucinosis
    • 18 years ago

    TwoFer: Thank you.

    So far I’ve only been able to do the first part of the modification: removing the capacitors. I broke the capacitors into tiny bits using a pair of needle-nose pliers, brushed away the remains, and used a blade to remove any bits that didn’t get removed by crushing the capacitors.

    It looks noticeably better at 1280×1024 @100Hz and 1600×1200 @85Hz… it’s not up to Matrox G200 quality, but the improvement is most certainly welcome.

    Right now I’m posting this on my laptop (Toshiba Satellite 2805-S402, GeForce2 Go, 15″ screen)… while it does display a crisp image and has very true color and great geometry, I don’t think it looks as good as my Hitachi CM813 .22dp 21″ CRT driven by a Matrox G200… Mainly I think the sharpness of the image is a lot better on the CRT. I think most of these LCDs have dot pitches above .26.

    • lenzenm
    • 18 years ago

    Ok, I will confess to already owning a G400MAX DH and being quite happy with it. My primary utilization is 2D, although I like to game (HL, UT, HG-2, SC, C’nC, etc. I don’t care for Q3, although I’m gonna pick up the Q3-engined ST: Elite Force… I only game when I have the time… I actually should game MORE, as I’m a beta tester for a software company you probably haven’t heard from yet…)
    Gerbil #35, 40, 47: I now have a 17″ Trinitron running @1152x864x32 @85Hz, and a 15″ Trinitron sitting next to it running @864x632x32 @85Hz, both running off the G400, in Win2K. Other than with a few other cards from Matrox, and a few >$1000 professional vid cards from Diamond & 3DLabs, I couldn’t do this, since nVidia’s offerings can’t support different resolutions in 2K, and their 2D sucks. Or, I could toss a couple ~$1000 LCDs into the mix, so I can get the same visual quality I get now?
    Let’s see… ~$220 for the G400, $279 for the 17″, and $189 for the 15″ is less than $600 total. Or, I could get the $120 GF2MX and then spend $2000 on monitors to get the same clean, crisp video that I need, since like TwoFer, I’m in front of a monitor at least 8hrs a day, and frequently far more. What’s more, I multi-task a great deal, so usable screen area is a must… hence the second monitor.
    Have you ever used Opera? It allows you to view a web page at any % of its given size. Try viewing a page designed for 1024×768 with an 8pt font, shrunk to 80%. I do this frequently to maximize the number of apps I can fit onto my screens at once. The only card I have access to that can render the fonts readably when compressed is the G400. I also own/use cards from S3, ATI, 3dfx, and even nVidia; these render the reduced text as unreadable black squiggles.
    My next vid card may very well be a G550 DH, so I can have crisp multi-monitor support with enhanced 3D. A decent sub-$200 upgrade.
    Am I a dedicated Matrox fanboy? No. If nVidia came out with a card that can do what I do now, with their enhanced 3D capabilities, I would probably buy it the same day. But until then, I’ll stick with what’s been proven. My needs may be different from yours.
    If all you really care about is gaming, no arguments here: go buy nVidia… they have the best product out there for you… just don’t expect me to buy a monitor that costs 3x what my current one does, to make up for their (admittedly few) shortcomings.

    On a side note, why is it that some of the most argumentative posters always post anonymously?

    • TwoFer
    • 18 years ago

    Thanks for that info, Gene — direct comparison can’t be beat.

    Gerbil #47 — I don’t disagree with you about DVI (my Inspiron notebook’s 15-inch 1400×1050 screen puts any 17″ monitor I’ve seen to shame, and is a fair competitor to my 21″ ViewSonic). But that’s no reason to slack off on analog quality, which you seem to be defending.

    Anal? LOL — I spend 14 hours a day in front of this monitor… it damned well better be.

    • Aphasia
    • 18 years ago

    Nice post Gene!!!

    cheers

    • Anonymous
    • 18 years ago

    “I haven’t seen the Leadtek part, but I in fact have a Hercules Prophet II MX — also with 5ns RAM — and it gave me headaches until I chopped off the RFI filter caps; I just today shorted the inductors, to make it the equal of an ancient NumberNine Revolution3D! Does that make a point? (And like Hallucinosis’ experience, I find the MX has a skewed image which I can’t fully compensate for — while my V5 was perfectly true out of the box.)”

    If you’re this anal about 2D Image quality, you should be creaming your shorts over DVI. Because it removes all the analog stuff from the equation, and puts it where it belongs: in the monitor.

    My wife’s computer has a 15″ DVI-only LCD and the image quality is fantabulous. Far better than any equivalent 17″ LCD, of any price.

    Mark my words. DVI is like USB in 1996. It’s the future. Even for CRTs..

    • Hallucinosis
    • 18 years ago

    Oops… I accidentally clicked “Post Anonymous”
    Damn… I hate it when I do that. =)

    • Anonymous
    • 18 years ago

    TwoFer: Could you please provide details on how I could reproduce the modifications you made to your video card on my own Geforce DDR card? My experience with electronics is somewhat limited (when I was in Middle School I took a summer electronics course at the local community college and my late Grandpa on my mother’s side was an electrical engineer and taught me some basics), but I do know how to use a soldering iron.

    What really irks me is that the image is messed up a little in the dimension my monitor has no adjustment for: the vertical (rather than the standard horizontal, left and right) trapezoid and pincushion. The rotation adjustment is not cutting it for the clumsy image this Geforce DDR puts out; it’ll always look skewed as long as I’m using this card, and it’s not my monitor’s fault.

    If this G550 performs as well as they’re claiming, I might be willing to give up some 3D performance for image quality. Perhaps I’ll go back to having a machine just for productivity and a separate gaming box. Before anyone goes out and buys a quality monitor (mine are pretty good), they ought to consider how good the image coming from their video card is. This Geforce DDR I’m running in my main machine doesn’t keep my monitor happy.

    Incidentally, my Leadtek Geforce DDR’s image quality is not nearly as good as my Diamond Viper V770 Ultra (TNT2 Ultra)…

    At least the nVidia cards look better than the very old S3 card I have in my firewall boxen (Linux, ipchains)… That S3 card looks miserable, even in text mode, on a good monitor. =)

    • Gene
    • 18 years ago

    Sorry folks, I had a little power failure fiasco. I have a UPS and it did a good job shutting down my computer safely…so I’ve written a new post, it’s a little shorter than I intended. If you have any questions feel free to post a follow up question.

    I have every intention of providing impressions only; benchmarks can be had from the ample websites catering to that sort of thing. I don’t notice any difference between 50-60 fps and 110-120 fps. So be it. I will, however, endeavour to provide you with the culmination of impressions from myself and people who have used these computers for a while. There is always a difference of opinion regarding what constitutes visual quality; however, there is a consensus regarding which cards end up being the most appealing. It’s a style and feel thing, not a scientific thing. That’s why I’m always upgrading and trying new hardware.

    2D. The Matrox card is crisp, clean, accurate, and distortion-free from edge to edge; text is easy to read. I spend four or five hours reading every day, and the Matrox card is easily my favourite 2D card for reading. The Radeon provides accurate colour and a crisp signal but leaves me with the impression that the text is slightly blurred; if you focus on it long enough it looks OK, but next to the Matrox it just doesn’t compare. In stark contrast, both the Asus cards have issues with colour rendering and clarity. Visually, both Asus cards appear more-or-less identical; I cannot feel comfortable reading text using either of these cards. The 5500 looks very good in 2D; I am still impressed by this card from time to time. If I had to characterize this card’s 2D performance I would say it’s slightly too rich: colours appear somewhat over-saturated, and text is easy to read, though not as easy as on the Matrox card. The biggest surprise is the Elsa card; although it shares a pedigree with the GF2 cards, it performs very well in 2D. I’m sure the 350 MHz RAMDAC contributes to this considerably. (The G400max’s is 360 MHz.)

    3D. I use 3D applications, 3DMax and AutoCAD. I play many different games; multiplayer games recently include NFS: PU, UT, Giants, and Serious Sam. I used to play a lot of Q2 and HL: TFC; I don’t play Q3 or CS. It really depends on what generation of 3D games we are talking about when we discuss 3D quality. I’m going to reflect on the quality of Giants; it is the game that simultaneously demands top-notch performance from a card and shows the differences between cards most vividly. I’m specifically comparing a scene about halfway through the Mecc story, but I believe that my observations are reasonably consistent with general 3D performance. The scene is mountainous, water in the distance on both sides, green hills in the valley below, a few buildings in the distance. Without a doubt in my mind the Elsa card looks best (it should, for the money!); whatever nVidia has done poorly with the GF2 line they have made up for in the Quadro2 line. I’m going to try that soldering trick and make the MX card a Quadro; I’m very curious whether the quality will change. Textures look right, colour is accurate, sky looks like sky; there is very little to argue with here… except when you compare it to the Matrox, and the Matrox looks pretty good too. There is a total lack of eye-candy with the Matrox, but it looks darn good. The land and water textures aren’t nearly as defined (realistic?) but the transitions between texture zones look smoother. The Matrox looks more like a video game than the Elsa, but I’m not sure I find it less appealing. The ATI and the 5500 both look very good. That said, the ATI seems to have an excellent balance between eye-candy and quality… as with the 2D, the 5500 looks sort of overdone. When I compare the ATI to the Matrox I’d have to say that the ATI is probably a more compelling image because of the advanced surface rendering, but the Matrox has a coherence of image that the ATI lacks. The 5500 looks slightly cartoonish beside all the other cards; I’m not sure that this is a bad thing, but I prefer a slightly more understated look myself. The Asus cards both look odd to me; the land and the water textures are rendered well… as well as the Elsa’s, but the colour is wrong. The beach areas are fogs of yellow and orange and look terrible compared to the other cards.

    I am very tweaked by the GeForce3’s programmability; I look forward to the optimized images and games that can be produced by the GF3. Given the graphics business’s history, though, I…

    • TwoFer
    • 18 years ago

    Gerbil #35/40 — my reading skills are just fine, thank you. Go back and read Forge’s comment, then yours, and see if it doesn’t sound like you’re saying he should get a digital monitor if the GF’s image isn’t good enough…

    I agree that digital end-to-end is great — but that’s not the majority of what’s fielded, and not an adequate excuse for crappy visual quality on Nvidia’s part. Other vendors have much higher 2D quality — Matrox being the outstanding example — and Nvidia’s just ignoring it because it doesn’t matter to their primary target, the gamer.

    Spend some time on sites which deal with serious graphics — 3D CAD and modeling — and you’ll find plenty of bitches about the Quadro’s quality… and that is the best Nvidia’s managed.

    I haven’t seen the Leadtek part, but I in fact have a Hercules Prophet II MX — also with 5ns RAM — and it gave me headaches until I chopped off the RFI filter caps; I just today shorted the inductors, to make it the equal of an ancient NumberNine Revolution3D! Does that make a point? (And like Hallucinosis’ experience, I find the MX has a skewed image which I can’t fully compensate for — while my V5 was perfectly true out of the box.)

    If Leadtek’s card has better quality, then they’ve probably dispensed with the RFI filter, too… but since it’s in the reference design, few vendors mess with it. As for graphs… please! Graphs mean less to me than the paper they’re printed on, unless there’s good science to back it up — something which advertising rarely provides.

    • Hallucinosis
    • 18 years ago

    AG #35: DACs in monitors would be completely preferable. I’d like to see more CRT monitors with digital interfaces.

    The quality of the DAC on my Leadtek Geforce DDR is garbage even compared to my extremely old 4MB Matrox Millennium 1 (I bought it right after the 4MB version came out). Looking a little more recently, at the Matrox G200 PCI in my good ole PPro 200 (oc’d to 233), I can honestly say I think the DAC quality on my nVidia-based cards sucks. I also have a Diamond Viper V770 Ultra, which was supposed to be one of the higher-image-quality TNT2-based cards; still not nearly as nice looking as my Matrox G200.

    I’m doing side-by-side comparisons on the two 21″ .22dp Hitachi CM813s in my bedroom, at 1600×1200 @85Hz, 32bpp and 1280×1024 @100Hz, 32bpp.
    Furthermore, the image coming from my Leadtek card isn’t as square, so I have to do a lot more monitor adjustment…

    • Anonymous
    • 18 years ago

    “All these apologists for Nvidia are amusing — radical 3D is nice, but it’s only a tiny part of what makes up a really good graphics card. And Nvidia isn’t cutting it, in the big picture. I expect other companies will do far better in the next few months than the GeFarce-fans foresee…”

    I’m not an apologist for anyone, except for perhaps your reading skills. Going end-to-end digital with DVI has tremendous benefits:

    1) LCDs don’t need RAMDACs period. No brainer here.

    2) I’ll wager the entire continent of Canada that you will end up with far higher quality RAMDACs and associated circuitry if you buy it with your monitor.

    3) Why buy a bunch of RAMDAC circuitry with every new video card? It’s just added cost. I buy one monitor I use for 3-4 years, yet I may buy three video cards in that timeframe, all with their own RAMDACs.. and associated RF circuitry? Stupid.

    4) The “final solution” for analog display issues is to move them out of the video card completely and go end-to-end digital.

    5) Analog sucks. Digital rules.

    DVI is the future, just like USB was back in 1996. If you can’t see it, then perhaps you’re blind? I can’t help it if you’re lacking skills as an industry analyst. I’m not.

    Also, there are puh-lenty of nVidia cards with excellent RF circuitry. Check out the Leadtek Geforce 2 MX DH Pro: 5ns RAM, plus they advertise the 2D quality – with [i]graphs[/i], no less – right on the box.

    • TwoFer
    • 18 years ago

    Gene, I’d love to hear your side-by-side comparison — 2D as well as 3D, in fact.

    Post away…

    • Gene
    • 18 years ago

    It always strikes me as funny that people get so worked up and strong in their opinion about graphics cards. Three years ago the 3DFx fans said that nVidia was a piece of crap, now nVidia fans call everyone else’s product a piece of crap. How many of these nVidia people were previously 3DFx people?

    Gaming is a huge market and is very important to consider; if gaming is the single biggest factor for you when purchasing a video card, you need to look for gaming performance. I can understand that. Speed is not everything, however; even in games, quality is just as important. I have the incredible luxury of having six computers at my disposal, all connected to a LAN and all used for gaming as well as graphics work. I’m incredibly lucky to be able to compare video cards side by side simply by turning my head. I have made a few conclusions that might be of interest to the readers of this thread.

    The current crop of cards are: Matrox G400max, ATI Radeon All-in-Wonder, Asus GF2 MX, Asus GF2 Pro, 3DFx 5500, and an Elsa Gloria III. If you’d like me to post my comments on this, post and let me know.

    The bottom line about all this is simple: what do you want to do with your computer? I think MS may be on to something with the X-box; if all you do with a computer is play games, why not create a dedicated system to harness the power that is now available to everyone? I’m sure the X-box will deliver awesome visual quality when it’s released, but I won’t be working on it. If on the other hand you want to work AND play games, you are going to have to make compromises. I think this is the space that Matrox wants to fill, and I’d bet that they will be successful with it. The G4xx series of cards provides an excellent platform for work and does an adequate job of gaming. I’m currently able to play Giants on any of the above systems, and you may be surprised which stations look best.

    When weighted to OVERALL usability, I’d say that Matrox cards and technology have scored near the top, and will continue to do so for some time. 2002 is a long time to wait for a killer gaming card, but put into the historical context of the graphics card business you can never be sure. Who would have predicted in 1998 that nVidia would be the kingpin of the business by 2001?

    Let’s all drop our superficial prejudices and move forward together to demand excellent quality in games, graphics, and graphics cards; we all win that way.

    • TwoFer
    • 18 years ago

    AG#35 — why should I have to buy a monitor based on my graphic card’s deficiencies? It’s much easier for Nvidia to pay attention and fix their problem, especially since it’s such a well-known issue.

    All these apologists for Nvidia are amusing — radical 3D is nice, but it’s only a tiny part of what makes up a really good graphics card. And Nvidia isn’t cutting it, in the big picture. I expect other companies will do far better in the next few months than the GeFarce-fans foresee…

    (Flame alert — I…

    • Anonymous
    • 18 years ago

    “You know NV is just getting 2D done to have it done when you can **radically** improve image quality by **removing** the RF filter. That sucker is supposed to minimize RFI, not garble the main signal!”

    So buy a monitor with a DVI interface, and go end to end digital.

    (Okay, the DAC is inside the monitor, but I’d wager you will get better DAC circuitry by purchasing it from your monitor maker.)

    Makes sense to me. Analog = ass.

    • Anonymous
    • 18 years ago

    If you really want to see Forge pissed, try this:

    The Radeon sucks.

    And here’s the kicker– it really does!

    • Anonymous
    • 18 years ago

    #30: Forge
    Lol…. I really should have put a smiley in that ‘thread Close’.
    Anyway, hope Matrox does well.
    BTW rumor has it twin-P4’s (Foster) now roaring into plain sight …. all hands, Man yer’ AA-guns !!

    • Anonymous
    • 18 years ago

    Trilinear filtering on the TNT2 was never really broken… the chip itself could only do true trilinear filtering OR multitexturing. Since all the Quake games defaulted to using multitexture extensions, trilinear was disabled. You can see this in Quake 3 if you disable OpenGL extensions (it’s very slow 🙂).
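
    To make that concrete, the two paths the driver chooses between look
    roughly like this in OpenGL (standard GL calls; the texture handles
    are made up for the example):

        #define GL_GLEXT_PROTOTYPES
        #include <GL/gl.h>
        #include <GL/glext.h>   /* ARB_multitexture */

        void setup_texturing(GLuint base_tex, GLuint lightmap_tex)
        {
            /* Path 1: true trilinear filtering on a single texture */
            glBindTexture(GL_TEXTURE_2D, base_tex);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,
                            GL_LINEAR_MIPMAP_LINEAR);   /* trilinear */

            /* Path 2: dual-texturing via ARB_multitexture, which the
             * Quake engines default to when the extension is present */
            glActiveTextureARB(GL_TEXTURE0_ARB);
            glBindTexture(GL_TEXTURE_2D, base_tex);
            glActiveTextureARB(GL_TEXTURE1_ARB);
            glBindTexture(GL_TEXTURE_2D, lightmap_tex); /* lightmap */
        }

    On a TNT2 the chip could service one path or the other in a single
    pass, but not trilinear filtering on both textures at once.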

    • Anonymous
    • 18 years ago

    I’m AG#26. It wasn’t even Quake 3, it was Half-Life TFC. GLQuake 1 didn’t look much better. The system had no obvious conflicts (aside from having such a god-awful card for 3D). I’m not bashing the G400 so much, since it sounds like it was decent, but whoever said that nVidia cards come with broken features shouldn’t talk; I think the G200 is permanently broken. I finally got him to give up waiting for Matrox, and he went out and bought a Voodoo 5 like two weeks before they got bought out. lol, he practically shot himself, but at least the V5 works properly.

    • Forge
    • 18 years ago

    /me finishes smashing AG #24/29. 🙂

    Really, though, did anyone ever say the G550 was for gaming? The guy at the Tech Tour seemed to be just mentioning it as an aside, like G550 is G400 MAX MAX or something. I don’t think G550 has been hyped as a gaming solution by anyone but overanxious end users. G1000 or G1200 or whichever (the spring 2002 part) is the one to watch, if you’re a gamer.

    Getting brutally honest, however, I don’t see Matrox reclaiming the 3D performance crown for quite some time. They really wedged themselves into a corner/niche by deemphasizing 3D, and now they have to dig back out of it before big bad NV comes in to get them. I wouldn’t be surprised to learn that NV is working on a GeForce 3 HIQ (high image quality) type card, with 3D turned down slightly and more than 5 minutes spent on 2D design. You know NV is just getting 2D done to have it done when you can **radically** improve image quality by **removing** the RF filter. That sucker is supposed to minimize RFI, not garble the main signal!

    • Anonymous
    • 18 years ago

    Uh, I was kinda kidding on ‘Closing’ the thread. So don’t anyone get heated and come lookin’ for me with a rocket launcher, OK? <g>

    • Anonymous
    • 18 years ago

    Hi people

    I am AG#1. I see two kinds of people here, those that think Matrox aren’t living up to the hype, and those that think the G400 is pretty good (and most of these seem to be professional types).

    But I think the point was the G550 was going to be a gaming card, and most (not all) of the gamers with Matrox experience posting here seem to think that Matrox’s drivers aren’t up to scratch.

    It’s been my experience that Matrox haven’t delivered on the G200, so why should I pay good money for the 550? When you get burnt this bad, it’s very hard for a company to win back the business. One of the “so-called” selling points of Matrox is their driver support. In my experience, that support just hasn’t been there.

    Matrox may have turned things around with the G400, but it doesn’t look like it and I’m very wary…

    LiamC
    http://users.interact.net.au/~pwcl/

    Don’t worry, I’m now on Western Digital’s case :)

    • Anonymous
    • 18 years ago

    Aside from the fact that this card will probably be obsolete by the time it ships… if Matrox driver support is so great today, how come they NEVER released a set of G200 drivers that provide a stable OpenGL ICD? I was just at a friend’s house a month ago and every OpenGL game he played was a mess of missing textures and very slow. Sure, it could be a bad card, but I’ve seen 5 different systems with G200s before and they were all pretty much similar. I don’t know what state the G400 drivers are in, but the way I see it, any company that releases a product and never bothers to release a stable driver set isn’t worth it, at least not for a gamer. If 2D is so important to you, get a Radeon; I think the best bargain on the market today is the Radeon LE… it’s under $100 and when overclocked it’s easily faster than an overpriced GTS.

    • Forge
    • 18 years ago

    AG #24 – Who died and made you God? Anybody who wants can keep yakking on this topic till Judgement Day; you have no say. You’re also quite welcome to have no say elsewhere, if people being hopeful for Matrox bothers you. I don’t think Matrox can suddenly take over the 3D world either, but I really would love to see them try. Writing them off at this point is not only premature, it’s also unnecessarily negative.

    • Anonymous
    • 18 years ago

    Why–this big time-wasting Thread, about that loser Matrox?
    FACTS…..

    (1) Their 2-D is superb.
    (2) They are real nice people, who give friendly, courteous, personal attention.
    (3) They deliver drivers that are faulty — then take forever to upgrade them to good ones (which admittedly are then real fine).
    (4) They deliver a superb G400 which is a leader when it appears, but has driver and other problems which call for a quick fix.
    (5) While you wait for the quick fix…and wait .. and wait… the card ages, then is surpassed by new ones from the competition.
    (6) They are a privately-held Company, thus not subject to stockholder pressure (“…Get off yer dead asses, and speed up your response-times to these Problems!”).
    (7) Thus, free of said stockholder pressures — they will never, never, ever, ever, Change.

    Matrox, as a “Leading-Edge, Killer, 3-D Card” Mftr ??
    Write ’em off.
    (I now pronounce this thread, Closed).

    • Forge
    • 18 years ago

    Magma – #22 – I’m inclined to agree with you. I picked up a G400 secondhand and have never had even one problem with it. I’ve run half a dozen drivers past it, and they all worked fine. I got this G400 fairly recently (about a year ago), but Matrox’s quality, both in 2D and in drivers, shines through. The problem is, the vast majority of the graphics market is interested in high 3D speed first and high 3D quality a distant second, with 2D quality getting minimal attention.

    • Forge
    • 18 years ago

    AG #20 – Wha? Try that again.

    • Anonymous
    • 18 years ago

    What’s happy?
    AMIGA will’t take g800?

    • Anonymous
    • 18 years ago

    I’ve had a G400 Max since 1999 with no driver problems. The reason the OpenGL drivers were so bad with the G200 is that Matrox had never done OpenGL before. It took them a while to get it right.

    As for why they would sell the 550 when it will perform below a GeForce 2: my guess is they want to protect their place in the business market. Some businesses might not need 3D, but they might want a decent 3D card just in case.

    • Anonymous
    • 18 years ago

    Ok, I bought a G400 MAX right when they came out, and I can personally say that, at that time, it was the fastest card on the market. I still love it. And another thing: look at their prices! The GeForce 3 just came out, and there are people spreading rumors about it being available for “under $400.” That’s insane! Matrox has some of the most inexpensive cards I’ve ever found. Granted, the G450 wasn’t the greatest 3D card to hit the market, but if you remember correctly, they specifically said that they didn’t design that card for gaming. The G450 kept their tradition of unreal 2D clarity alive.

    I’m proud to say I have a G400 MAX. I’ve never had any major problems with OpenGL or Direct3D. (Although I have found that D3D performs better under Win98 while OGL performs better under Win2K.) I’m sure Matrox will come through within the next couple of months. I’ll be ready to build a new box by then, and I’ll be looking forward to putting another Matrox card in it.

    • Anonymous
    • 18 years ago

    The Matrox Mystique was my first “real” video card. 3D hardware acceleration was at the time the new thing on the block (the biggest turning point in PC gaming history imo), and I was looking forward to some really fast gaming on my new $300 card complete with 4 whole MB of video memory. Disappointment set in when I discovered that what Matrox thought hardware acceleration was, was not what everyone else thought it was. I ended up buying a Voodoo1 card to make up for the serious lack of Mystique 3D features; since then I’ve never even considered buying another Matrox product.

    • Anonymous
    • 18 years ago

    I’ve been using Matrox cards in my office exclusively for a few years now, and I couldn’t be any more satisfied.

    Oh, wait, I did buy one TNT2 – geez, what a piece of shit that thing was. Two identical monitors, one using the TNT2 and one using a G400, and man, you couldn’t pay me enough to sit in front of that TNT2 machine all day. That’s my subjective opinion, anyway.

    Let me clarify, though: this is an office, 2D is king, we don’t use OpenGL, and we don’t play Quake. For us, the Matrox cards look awesome and the drivers are rock-solid.

    I use my Matrox G400 to play Counterstrike and occasionally try out a game, but I’m not hardcore and also would never trade the image quality for a few more FPS in some game. And my DVDs look amazing.

    My only beef with Matrox is the secrecy around their upcoming products.

    • Anonymous
    • 18 years ago

    My friend who works at a company that builds OEM systems says that even Matrox is recommending that they cut back on their orders. How screwed up is that?

    –k

    • Electric Amish
    • 18 years ago

    AG #8-

    When the G400 MAX came out, it was the fastest 3d card on the market, for a couple months until the first GeForce came out.

    Matrox will not be going anywhere soon. They have a HUGE chunk of the OEM & business markets.

    amish

    • Anonymous
    • 18 years ago

    I am waiting for Matrox to go belly up. My industry insider has stopped carrying Matrox because he thinks they are sliding downhill FAST. They have the most mediocre products out in the market, even worse than S3. The last good card they had was the Matrox Millennium. I don’t know what planet that rep guy is from but the G400 Max was never “the fastest”. The 2D quality is still remarkably good, but ATI is arguably its equal in that area, and the 3D features in R2 will blow away G550.

    –k

    • Electric Amish
    • 18 years ago

    Hmm… Loved every damn one of my Matrox products (ok, maybe not the M3d ;))

    I guess it may be because I don’t play anything that requires OGL. I had a little trouble with my G400 when I tried Q3a after I put together my new (at the time) Athlon/VIA system. Got that worked out, tho, and Q3 worked just fine.

    It’s GREAT to finally hear something from the tight-lipped ones at M. 😀

    amish

    • elmopuddy
    • 18 years ago

    Maybe I am biased since Matrox’s HQ is down the street from me, but all the work systems I put together have Matrox cards. Period.

    Currently I have mainly G450 16MB SDR dualheads, with a couple of DDR models. Drivers for 2000 are solid, and we can even play some counterstrike every once in a while…

    The 2D clarity on the G450 is way better than my GF2’s, IMHO…

    Hopefully they aren’t yanking our chain and will release a good 3D capable card soon.

    • Sofa King
    • 18 years ago

    How much anecdotal evidence are you going to need? I was the first kid on my block to have a G400, and the thing absolutely sucked in OGL for a full year before a barely acceptable OGL driver came out. By the time that driver was released, the card was approaching obsolescence.

    They…

    • Anonymous
    • 18 years ago

    Hey, at least you got a desktop in Windows 2000… That’s more than my Radeon could do 😉
    Matrox has the best support in the industry, but something is definitely wrong with the OpenGL part of their drivers. It began with the G200, and even the newest drivers for the G4x0 series are always making me sweat before I have to run an OpenGL app. Sometimes it works. Sometimes it doesn’t.
    But it would certainly be nice to get a new product from Matrox. Something with *noticeably* more 3D speed than the G450…

    • Anonymous
    • 18 years ago

    Hmm, anecdotal evidence. Could be a bum card you have there. And Matrox hasn’t exactly been slacking off too much in the drivers section; they’ve made numerous (albeit quiet) driver releases in the last few months. I will admit, however, that the drivers that initially shipped with their cards (e.g. G200, G400) were quite . . . raw.

    Their recent drivers seem relatively stable and fast, and since the G550 is more of a modified G400 than a completely new architecture, driver support shouldn’t be too much of an issue. Hopefully.

    At least their drivers aren’t at the level of ATI yet, though that may change (one way or another).

    • Anonymous
    • 18 years ago

    Matrox always seems to get the stigma of having very poor drivers… it’s really not as bad as people seem to think..
    ALL card makers [yes, even the beloved nVidia] put out drivers which are buggy… These drivers are complex and time-intensive to write..

    Oh well, everyone’s got an opinion….
    I’ve had the G200 and now have a G400DH; the G200 had no trouble with Q3a once the OGL ICD was released, albeit late…

    • Anonymous
    • 18 years ago

    Matrox occasionally have good hardware but shitty drivers/support.

    My first venture into planet Matrox was when the G200 came out. Man, did that card shit me. It took…
