Musing on the state of motherboards

Manufacturers routinely ask us what we think of their current products and what we’d like to see from future ones. Sometimes, I get the impression that they’re genuinely interested in our input. Other times, I feel like my girlfriend is asking me what she should wear. My suggestion of a particular tank top, short jean skirt, and knee-high American Apparel socks—you know the ones I’m talking about—inevitably falls on deaf ears. She’s only interested in my opinion if it aligns with an outfit already on her mind. And that’s never it.

We don’t hold our opinions back here at TR. So, when an email hit my inbox a couple of weeks ago asking for a wish list of features for a new enthusiast board based on Intel’s upcoming Z68 chipset, the wheels started turning. Before long, I began to feel an old-man rant brewing. As someone who’s been reviewing motherboards at TR for nearly a decade, I’d like to think that I have a few morsels of wisdom that might be useful to the powers that be in the motherboard business.

This has all happened before, of course. I penned a memo to motherboard makers more than four years ago, and I’m emboldened by the fact that most of my demands appear to have been met, if only by coincidence. There are, however, a handful of issues that have persisted despite my whining. So, let’s beat up on a couple of dead horses before moving on to fresh meat.

We begin with my favorite pet project: BIOS-level fan speed controls. The vast majority of even high-end motherboards are stuck with rudimentary fan-control intelligence and precious few configuration options, especially for auxiliary fan headers that often have zero speed control at all. Contrast that with an overflowing bounty of frankly excessive voltage and overclocking options, and one is left wondering if motherboard makers really think there are more enthusiasts engaged in sub-zero overclocking than are trying to strike the perfect balance between cooling performance and noise levels. Someone’s been huffing the liquid nitrogen at those overclocking contests.
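
To be concrete about what I’m asking for, here’s a minimal sketch of a per-header fan curve with a little hysteresis. It’s written in Python purely as an illustration; the temperature thresholds and duty cycles below are made-up numbers, not anything a particular BIOS actually ships with. Give me something like this for every header on the board, and let me edit the numbers.

    # Illustrative sketch of a per-header fan curve with hysteresis.
    # Thresholds and duty cycles are arbitrary examples, not vendor defaults.

    FAN_CURVE = [  # (temperature in C, PWM duty cycle in %)
        (30, 20),
        (45, 35),
        (60, 60),
        (75, 100),
    ]

    HYSTERESIS_C = 3  # don't ramp down until temps fall this far below a step

    def duty_for_temp(temp_c, last_duty=0):
        """Map a temperature reading to a PWM duty cycle, holding the previous
        speed until the temperature drops clearly below a step (hysteresis)."""
        duty = FAN_CURVE[0][1]
        for threshold, step_duty in FAN_CURVE:
            if temp_c >= threshold:
                duty = step_duty
        if duty < last_duty:
            for threshold, step_duty in FAN_CURVE:
                if step_duty == last_duty and temp_c > threshold - HYSTERESIS_C:
                    return last_duty  # hold speed; still too close to the old step
        return duty

    if __name__ == "__main__":
        last = 0
        for reading in (28, 40, 47, 61, 58, 55, 33):
            last = duty_for_temp(reading, last)
            print(f"{reading} C -> {last}% duty")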

If Zotac can fit Wi-Fi on a Mini-ITX board, why can’t everyone else with full-size ATX?

Another problem I had with motherboards of old was the prevalence of models with dual GigE jacks but no built-in wireless connectivity. Boards with a single Gigabit Ethernet port are more common these days. The second port hasn’t been replaced by Wi-Fi, though; it’s just gone. I’d wager that all but a handful of enthusiasts have a wireless router in their homes. Why should they need an add-in card or USB adapter to use it? Bluetooth might as well be included, too—I bet it gets more action than the seventh and eighth SATA ports tied to omnipresent auxiliary storage controllers.

The last bit of unfinished business from my original memo concerns how gracefully motherboards recover from a failed overclocking attempt. Most of the time, they’ll attempt a reboot or two before springing back to life with the last collection of settings that worked. Bravo! However, things don’t always go smoothly, and users are sometimes forced to reset the BIOS manually. Doing so usually involves digging around inside the case to flip an onboard jumper. Seriously? You’ll let me overclock remotely with my iPhone, but I have to dig around inside the musty confines of my overstuffed enclosure because your BIOS’s auto-recovery scheme failed? I see plenty of room in pretty much every port cluster for a BIOS reset switch. Look:

All the latest in cluster connectivity and still loads of room for a CMOS reset button
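
As for the auto-recovery logic itself, here’s a rough sketch of what I have in mind, with Python standing in for firmware pseudocode. The names, settings, and the fake POST check are all made up for illustration; no real BIOS works exactly like this. The idea: try the user’s settings, fall back to the last set that POSTed, and only then load safe defaults, with no jumper required at any step.

    # Rough simulation of "last known good" overclock recovery logic.
    # Settings, thresholds, and the fake POST check are illustrative only.

    MAX_FAILED_BOOTS = 2
    SAFE_DEFAULTS = {"bclk": 100, "multiplier": 34, "vcore": 1.20}

    def post_succeeds(settings):
        # Stand-in for the real power-on self-test: pretend anything past
        # a 47x multiplier is unstable.
        return settings["multiplier"] <= 47

    def boot(current, last_good):
        """Try the user's settings, then the last working set, then safe
        defaults, instead of demanding a trip to the clear-CMOS jumper."""
        for attempt in range(1, MAX_FAILED_BOOTS + 1):
            if post_succeeds(current):
                return current, current, f"booted on attempt {attempt}"

        if last_good is not None and post_succeeds(last_good):
            return last_good, last_good, "reverted to last-known-good settings"

        # This is the step that, today, too often means digging for a jumper.
        return SAFE_DEFAULTS, SAFE_DEFAULTS, "reverted to safe defaults"

    if __name__ == "__main__":
        stable = {"bclk": 100, "multiplier": 45, "vcore": 1.30}
        too_far = {"bclk": 100, "multiplier": 50, "vcore": 1.35}

        current, last_good, msg = boot(stable, None)
        print(msg)   # booted on attempt 1
        current, last_good, msg = boot(too_far, last_good)
        print(msg)   # reverted to last-known-good settings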

I also see a lot of USB 2.0 ports built into modern core-logic chipsets—probably more than most folks need with the advent of USB 3.0. We can afford to lose one of them to an onboard flash drive loaded with a Linux-based OS much like Asus’ original Express Gate implementation. Keep Express Gate’s web browser so that users can easily download the latest BIOS and drivers before a fresh build. Add a monitoring app and some stress-testing software, allowing tweakers to dial in the perfect overclock without the risk of corrupting their OS. Finally, throw in a file browser to aid system recovery in the event that the OS does get hosed. If all of that can be squeezed into a UEFI BIOS, even better.

Generally, I suggest that people with halfway-decent headphones or speakers (anything nicer than a set of iPod earbuds, pretty much) invest in a budget sound card like Asus’ excellent Xonar DG. There is merit to onboard audio in certain circumstances, though. If you happen to be running speakers or a receiver with a digital input, most motherboards can pass along a pristine digital bitstream via their S/PDIF outputs. Unless the motherboard is capable of encoding multichannel bitstreams on the fly, however, multichannel digital output is limited to the pre-encoded audio tracks that accompany movies.

Surround-sound audio is nice to have when you’re gaming, and you don’t necessarily want to lose it if you have to put on headphones when your significant other complains about the volume of your late-night gang bangs—in Bulletstorm, of course. Virtualization schemes do a reasonable job of simulating surround-sound environments with stereo headphones, and I can’t wrap my head around why this isn’t a more popular feature among the growing crop of supposedly gamer-oriented mobos. The Realtek audio codecs virtually guaranteed to appear on enthusiast boards have optional software support for real-time multichannel encoding and surround-sound headphone virtualization, yet few mobo makers take advantage. They should.

If these military-class components are ultra durable, why only three years of warranty coverage?

The quality of onboard components like capacitors, chokes, and MOSFETs has received more attention in recent years. Some of these parts have exotic unobtanium cores, while others meet stringent military standards for overall goodness. Motherboard makers love talking about ’em, whether it’s claiming to be first to use a particular MOSFET design, boasting wildly about hard-to-quantify improvements in overclocking stability, or waxing on about improved durability. Yet even with components we’re led to believe are capable of surviving a tour of duty in Afghanistan, enthusiast boards typically carry a measly three years of warranty coverage. Weak.

Premium hard drives come with five-year warranties. Ditto for PSUs, some of which can be found with even longer coverage. Graphics cards from a handful of manufacturers are sold with lifetime warranties, and some of those extend through the first resale. Sure, there are a million more things to go wrong on a motherboard, but Asus is confident enough to offer five years of coverage with its new Sabertooth line. If other motherboards are as reliable and durable as the manufacturers claim, longer warranties should follow.

All the cool kids are wearing black and blue this spring. Bask in the generic sameness!

One could argue that motherboards look a lot better now than they did a few years ago. Gone are the neon hues, multicolored port arrays, and general lack of taste that typified enthusiast models of old. However, I’m starting to notice an alarming trend toward virtually identical black-and-blue color palettes. When we rounded up Sandy Bridge mobos from Asus, Intel, MSI, and Gigabyte earlier this year, it was hard to tell them apart without looking at the manufacturer names silk-screened on the circuit boards.

These attempts at aesthetic sameness often fall short of the mark when viewed in a vacuum, too. We’ve seen chipset and VRM coolers whose colors don’t match, racing stripes that don’t line up, and ornate heatsinks seemingly machined to chew off the maximum amount of flesh when you’re tightening a CPU cooler’s retention screws. Chicks don’t dig those kinds of scars.

I’m hardly advocating a return to the days of day-glow designs. Designers should be able to come up with their own sense of style without looking over their shoulders to see what everyone else is wearing, though. To be fair, as someone who hasn’t had a case window in years, I really don’t care all that much about a motherboard’s appearance. However, if there’s going to be some visual flair, I’d favor distinctiveness over a failure of originality. Is that really asking too much?

Is any of this?

I know I’ve been going on for a while now, but I don’t think I’ve been unreasonable. The motherboards we have today may be marvels of engineering, but they could certainly be refined to better serve my needs, er, the needs of PC enthusiasts. So, how about it, mobo makers? Can you at least put on the socks?

Comments closed
    • Nutmeg
    • 9 years ago

    I need at least one PCI port to plug in my £5 Firewire card 😛

    • Rectal Prolapse
    • 9 years ago

    Hey – you forgot to whine about the lack of UEFI motherboards with 3-second BIOS boot times! 😛

    Actually, I am still shocked that we have to wait 30 seconds before the OS boots off of our super-fast SSDs. 🙁

    • Arbie
    • 9 years ago

    I heartily agree regarding fan control on the mobo headers. My last build (Yorkfield 3.8GHz) ended up with about ten fans here and there, mostly to get adequate cooling without too much noise. The Asus ROG Rampage Formula mobo was great for this because they made a real effort at fan control – even to the extent of accepting thermal probes on three of the headers. The fans all run at about 60-70% normally, and ramp up only under major load. I have now planned a Sandy Bridge build on a high-$ Gigabyte board, but realize it doesn’t have any of those features. That’s very disappointing, and gives me another reason to wait and see what Asus offers in Z68. Of course there are bigger issues, and of course I can rig up fans without all the built-in sensing, etc. – but taping rheostats inside a zillion-transistor, umpteen-gigaflop box is just silly. It is, however, more reliable and no more clunky than 95% of the “fan control panels” you can buy.

    I also want all heat sinks to be screwed down (no plastic pins) and mounted with real thermal paste, not pads. With all the air I’ve got going for CPU and GPU cooling, simply having good heatsinks would eliminate any concerns over chipset / VRM heating.

    Accurate voltage readouts / settings in the BIOS, please. I can’t imagine messing around with DVMs and microprobes when there’s so much to sort out. Again, there’s oceans of tech on board.

    I couldn’t care less about colors. In fact, I’m enough of a newbie that the crayola approach helps me avoid mistakes.

    • DeadOfKnight
    • 9 years ago

    How about making more space for air OCers to install one of their massive heatsinks and still be able to populate all the RAM slots without removing heat spreaders, and maybe even include a RAM cooler if they choose to do so? Here’s an example:

    [url<]http://www.newegg.com/Product/Product.aspx?Item=N82E16835106150[/url<]
    [url<]http://www.newegg.com/Product/Product.aspx?Item=N82E16820145311[/url<]

    As far as I know, there are no motherboards that can accommodate these things together, and if they can, I’m sure it’s a tight squeeze.

    • WaltC
    • 9 years ago

    Speaking of “legacy” PCI slots… On several mobos I’ve looked at recently, the placement of the PCIe x1 slot(s) (perfect for an X-Fi sound card) is such that if your 3D card has a double-slot cooler (which most do these days), the PCIe x1 slot is often completely obscured. It cannot be used to slot a PCIe x1 card of any description. On my current budget board, I have to use an Audigy 2 simply because the PCI slot is available whereas the PCIe x1 slot is not. If mobos are going to include one or two PCIe x1 slots, then it would be really nice to place them so that they’d be available for use in a populated board.

      • JustAnEngineer
      • 9 years ago

      With the PCIe X1 slot covered here, you’ve still got the PCIe X8 and X4 slots open:
      [url<]http://usa.asus.com/product.aspx?P_ID=iT2FJPCMOGBHClu4[/url<]

      However, if you put a second dual-slot graphics card into the X8 slot, it’s going to cover the X4 slot, too.

    • Voldenuit
    • 9 years ago

    Bravo, Geoff, bravo.

    • Kent_dieGo
    • 9 years ago

    My Asus MB has the graceful recovery if overclocking fails. The other day, the motherboard somehow detected a failed start-up and reset the BIOS to safe defaults even though I do not overclock. That disabled the RAID controller function for my RAID 1 hard drives. Windows did not boot, went into automatic recovery mode, and tried to “fix” the hard drives, screwing up a lot. I had to re-enable RAID in the BIOS and boot from the Win7 DVD to fix it. It took hours.

    I wish I could disable the graceful recovery feature! I am sure this will happen again.

    • south side sammy
    • 9 years ago

    Need to put threaded holes around the processor so Intel will be forced to get rid of that god awful heat sink retention mechanism.

    • Zoomer
    • 9 years ago

    No side-mounted connectors! These are just a huge pain to plug in, especially the bottom SATA port.

      • Mithent
      • 9 years ago

      I’d rather have side-mounted SATA connectors than ones you can’t use because they’re blocked by graphics cards, though. Ideally the SATA ports would not be in line with 16x PCI-Express slots at all, I guess.

        • flip-mode
        • 9 years ago

        Side mounters don’t bother me. I just fill all the channels with cables and tuck unused cables away for future use.

    • dashbarron
    • 9 years ago

    Your insistence and rantings don’t go unnoticed!

    Thanks in part to the deal of the week post, that sound article from November, and the podcast, I “splurged” and paid $20 for that Xonar DG. I have onboard audio with 7.1 channels, but I figured/hoped that the quality would be better on the DG, and I don’t ever plan to have more than 5.1 channels anyway. I’ve always wanted to try discrete sound, and I figured now was a good time to put up or shut up.

    • yehuda
    • 9 years ago

    I wish I could buy dust covers for unused motherboard slots, back ports, LAN ports (in a router) and power supply connectors.

    • glynor
    • 9 years ago

    [quote<]racing stripes that don't line up[/quote<]

    Ha! I have that board, and I know what you mean. What the heck were they thinking? Still a nice board, though.

    • NeelyCam
    • 9 years ago

    I like the power button on MSI boards. You can put things together, tweak, overclock, etc., AND test it without having to shove the mobo inside a tight-ass case. I don’t know how many times I’ve cut my hands on sharp edges inside a case while trying to remove memory sticks to figure out which one is dead.

      • flip-mode
      • 9 years ago

      Wait, what do power buttons on the mobo have to do with changing RAM?

      FWIW, if your mobo is sitting on the open bench, all you need to do is short the two power pins to get things started – now, I’m not saying that’s anywhere near as handy as a power button on the mobo. I do have something that is just as handy, though – I ripped the power and reset switches out of an old case before I threw it away – I just hook those up and, well, use them.

      Even better, they also work just fine for resetting the BIOS.

    • CheetoPet
    • 9 years ago

    I think it’d be handy if someone included a mini-PCIe slot. It’s a lot of real estate to dedicate, but it would still be a nice option.

      • Bauxite
      • 9 years ago

      That might kill the awesome cheap market on fleabay for desktop to minicard converters though. I think I’ve bought over a dozen so far.

      Maybe TR should review one for grins.

    • ssidbroadcast
    • 9 years ago

    My biggest gripe is how hard it is to connect the case power/reset switches and case USB to the motherboard. God damn! You basically need the hands of a small child to guide those 1-pin cables to their destination.

      • glynor
      • 9 years ago

      A thousand times yes.

      The power/LED/reset/speaker headers all have different pin-outs, and even the USB/1394 pin-outs don’t always follow the standard. I’ve had to manually re-wire power switches to fit motherboards before because the power-on switch pin-out is sometimes three pins wide and sometimes only two, along with all sorts of other absurdity.

      • paulWTAMU
      • 9 years ago

      This. There’s 0 reason for that to be as big a PITA as it is.

    • flip-mode
    • 9 years ago

    Fan controls would be nice, but at least there is a decent selection of quiet fans these days. The crummy part is that you can’t change the stock fan on the Intel cooler, and the AMD stock cooler uses a 70mm fan, which is just plain dumb.

    BIOS reset – yep, common sense and cheap to implement.

    Overclocking options – there are too damn many. RAM overclocking is dead – you can drop the whole stack of those options as far as I care, except for whatever is needed to support changing bus frequencies for CPU overclocking.

    Wireless – yes! Just yes already!

    Geoff, there’s nothing you said that I disagree with. I agree with it all.

    • phez
    • 9 years ago

    Dear motherboard makers,

    PLEASE STANDARDIZE CASE CONNECTIONS. There is no bigger pain in the ass than fumbling with those microscopic LED and PWR/RST button connectors. Asus’ Q-connector helps but really, it is just a riser of sorts.

    • ShadowEyez
    • 9 years ago

    Modern mobos are miles ahead of what was out even a few years ago, but things keep moving.
    How about ALL ports being USB 3.0 (once Intel and other chipsets support 10-12 USB 3.0 ports)?
    Built-in Wi-Fi is nice, though with most desktops you’re going to be plugging into Ethernet (though at LAN parties Wi-Fi makes sense).

    But the biggie is UEFI instead of BIOS. This should be standard; BIOS limitations will only look more glaring over time (2TB max bootable, limited troubleshooting/repair options, trouble with headless systems, etc…), and as long as UEFI is done in such a way that people can ALWAYS disable things like TPM and other such mechanisms, it’s really a no-brainer.

    • Dposcorp
    • 9 years ago

    How much do onboard power and reset buttons cost in money and board space? Those are great for when first working on the build.

    • Stargazer
    • 9 years ago

    I would *definitely* like to see better BIOS (well, UEFI now I guess…) fan controls.
    Reducing unnecessary fan noise is a good way to improve your computing environment, and I think more people would be able to do so if it were easier.

    • willmore
    • 9 years ago

    Stop sticking every extra function under the sun on the MB. I don’t need tons of SATA ports, nor a bunch of USB3 (unless it’s in the chipset), nor a half dozen Gig-E ports. Save the board space and just stick on a bunch of PCI-E 1x slots. If I need some extra I/O, I’ll drop in a dirt cheap card and add just what I need. There’s no reason that everyone has to pay for I/O that only I want–and I don’t want to pay for I/O that I don’t want, either.

    No Wi-Fi.
    No Bluetooth.
    No PATA.
    etc.

    • dpaus
    • 9 years ago

    If your girlfriend won’t wear the short jean skirt and the thigh-highs, she has no grounds for complaining about any late-night gang-bangs. I’m just sayin’…

    • JustAnEngineer
    • 9 years ago

    Stop hating on my SATA ports, Diss!

    Let’s dump some of the legacy garbage already. Floppy disk drives, IDE and PCI are all well beyond their use-by date. My [url=http://usa.asus.com/product.aspx?P_ID=iT2FJPCMOGBHClu4<]Asus P8P67-M Pro[/url<] doesn’t have any of them, which allows for more useful PCIe, SATA and USB connections.

    It would be nice if everyone could agree on a common heatsink mounting bracket.

    I agree with the condemnation of front panel connectors. Asus’ Q-connector is a slight improvement, but the push-on jumper connectors aren’t secure. Some motherboards have moved to more PWM connectors for case fans, but the BIOS speed controls are still lacking for all but the CPU fan.

      • mczak
      • 9 years ago

      I don’t quite agree on the “legacy garbage.” OK, nobody needs floppy disks. IDE? Maybe some still use IDE DVD burners (they haven’t really gotten any better lately, so old ones can be reused), though you could always get a new one for $20, so that isn’t really necessary either.
      But PCI? There are still quite a lot of PCI cards sold, and the problem is that for some, no PCIe replacement exists. For example, good luck trying to find a DVB-C PCIe card – it doesn’t exist (or rather, some quite specialized cards exist, and there is in fact a VERY new Hauppauge card now, but all the normal cheap DVB-C cards are still PCI).

        • bthylafh
        • 9 years ago

        The only legacy ports I care about personally are PS/2 and parallel. I have a Model M, and my wife and I share the printer, with her using the USB connector and me the LPT port. However, I’m thinking about buying a new printer, and ones with Ethernet are really cheap these days, so…

        HANDS OFF MY PS/2 PORT AND NOBODY GETS HURT.

          • Bauxite
          • 9 years ago

          [quote<]HANDS OFF MY PS/2 PORT AND NOBODY GETS HURT.[/quote<]

          This x 1000.

            • Lazier_Said
            • 9 years ago

            I had to troubleshoot a Dell Dimension 9100 circa 2005 last week and discovered it had no PS2 port. 2005? Really? Furthermore, the little green PS2/USB mouse adapters don’t work with a keyboard. A pain in the ass that didn’t need to be.

          • glynor
          • 9 years ago

          Seriously, [url=http://www.monoprice.com/products/product.asp?c_id=104&cp_id=10404&cs_id=1040401&p_id=173&seq=1&format=2<]$5[/url<]. I can agree with preserving PCI, a single IDE, and a few other legacy connectors, but PS2? Really?

          • BenBasson
          • 9 years ago

          [url<]http://www.digitus.info/en/products/accessories/adapter-and-converter/digitus-usb-ps2-adaptor/[/url<]

        • JustAnEngineer
        • 9 years ago

        I could have sworn that I found PCIe DVB tuner cards for someone in the forums last week. I’ve been intentionally buying only PCIe expansion cards for my own systems for the past several years.

        I recently acquired one of [url=http://www.newegg.com/Product/Product.aspx?Item=N82E16812206002<]these[/url<] IDE to SATA bridges for my ancient IDE DVD-ROM and it works a treat. Of course, it does require a SATA port now, if Dissonance hasn't gotten rid of all of them yet. The point is that for only $10, you can adapt any of your old legacy IDE devices to SATA.

          • mczak
          • 9 years ago

          Oh, DVB-S PCIe cards have existed for some time. But try DVB-C, and your options are either a very new Hauppauge card (good luck with Linux drivers for that), some quite odd dual-tuner cards, or even weirder stuff (some pro PCIe DVB-C gear seems to exist). But a simple (== cheap) card? Nope (the new Hauppauge, the HVR-5500, doesn’t quite qualify as cheap either, as it’s a multi-standard tuner card).

          And I’m quite sure there are some other areas where no one has bothered to do a PCIe card yet, though most of these might not quite be mainstream.

            • JustAnEngineer
            • 9 years ago

            I was looking for DVB-T PCIe cards for that gerbil. They seem to be widely available.

            The LinuxTV folks list a couple of DVB-C PCIe cards.

    • Deanjo
    • 9 years ago

    [quote<]Bluetooth might as well be included, too—I bet it gets more action than the seventh and eighth SATA ports tied to omnipresent auxiliary storage controllers.[/quote<]

    I have to disagree here. Even 8 SATA ports are not enough. With SSDs and SATA optical drives out there, those ports tend to get used up really fast. I’m getting tired of this “buy a 3TB to replace a 2TB” approach to gaining an additional 1TB of space. To me, every full-sized board should be coming with a minimum of 12 SATA ports. I already have 3 systems that have 6 ports fully occupied, each with an additional 4 ports via an add-in card that are also at full capacity. The number of Bluetooth devices I have is one, my phone, which I sync once in a blue moon on the desktop.

      • nanoflower
      • 9 years ago

      I have no doubt that you can use up 8 SATA ports but how many people need that many ports built into a motherboard? It has to be a very select group especially with the size of hard drives today. It doesn’t seem like it would make sense to build in more SATA ports for such a small market when getting a SATA card isn’t that expensive.

        • Deanjo
        • 9 years ago

        Here is an enthusiast scenario.

        Channels 1&2 SSD Raid
        Channels 3&4 two large storage drives (Max 6ish TB without any redundancy)
        Channels 5&6 Burner and Bluray drive

        All 6 ports used up with nothing really extravagant.

        • Lazier_Said
        • 9 years ago

        I’ve only run into a scenario where I wanted more ports once. And the usual solution of 4 more hanging off a 3rd-party controller wouldn’t have helped much.

        Have: full RAID5 array.

        Want: larger disks.

        Swapping the new drives one at a time and rebuilding each time is a PITA. Fault tolerance is lost while that’s in progress.

        Ideal solution would have enough ports to keep the original array intact while adding new disks, creating a new array, and copying my data.

        Of course that was years ago, now if I wanted more space I’d just buy a second QNAP.

      • Skrying
      • 9 years ago

      What you’re looking for is called a file server. You should build one.

        • Deanjo
        • 9 years ago

        That’s what the two other machines are for. All they do is act as file servers.

      • UberGerbil
      • 9 years ago

      Sure, a mfr could do a specialized “storage-oriented” mobo specifically for file servers and people for whom 8 SATA ports aren’t enough. But the only way they could do that is by adding more 3rd party SATA controllers on the motherboard, each hanging off some PCIe lanes. But that’s no different from just buying an ordinary board and putting SATA cards in the PCIe slots, except the specialized board would be more expensive and less flexible — which makes it a hard business case for the motherboard maker.

      As Skrying says, it sounds like you need to rethink your approach to storage if your needs are that great.

        • Deanjo
        • 9 years ago

          If they would increase the number of SATA ports available off the southbridge, no extra lanes (or external chipsets) would have to be used.

          • UberGerbil
          • 9 years ago

          But that’s a chipset request. The topic at hand is motherboard design. The motherboard makers can’t add those lanes to the southbridge themselves, so they’d have to use PCIe.

      • bimmerlovere39
      • 9 years ago

      I don’t think all full sized boards should, but I think there’s logic behind some high-end boards having more than 8.

      Run two double-wide cards in Crossfire/SLI and a sound card, and you start running out of usable slots really quickly.

        • Deanjo
        • 9 years ago

        Exactly, and in many of those cases, filling in the extra PCIe slots often drops the PCIe lanes on the other slots down to the next step.

      • flip-mode
      • 9 years ago

      Well, let’s just talk about enthusiast users – not average users. Even with that limitation, I bet only 1-5% of enthusiast users need more than 6 SATA ports. And I’d guess that only 0.1% (one in a thousand) are occupying 10 SATA ports in a single machine. It’s just ludicrous to expect a desktop motherboard to do that. And that’s what we’re talking about here – desktop motherboards. You’re running a server- or workstation-level configuration. And you say that you have 3 machines in which you’re using 10 SATA ports? Ha, that’s pretty funny.

      And anyone running RAID 5 on the motherboard chipset is kind of a boob IMO, and if that’s why you’re asking for more SATA then I’m going to have to tell you to go out and get a real RAID card.

      Anyway, the point is that the [i<]typical[/i<] enthusiast motherboard is probably quite adequate with 6 SATA ports.

        • Deanjo
        • 9 years ago

        As I responded to nanoflower:
        Here is an enthusiast scenario.

        Channels 1&2 SSD Raid
        Channels 3&4 two large storage drives (Max 6ish TB without any redundancy)
        Channels 5&6 Burner and Bluray drive

        All 6 ports used up with nothing really extravagant.

        [quote<]And anyone running RAID 5 on the motherboard chipset is kind of a boob IMO, and if that’s why you’re asking for more SATA then I’m going to have to tell you to go out and get a real RAID card.[/quote<]

        Actually, if you are using a hardware RAID card for storage, then you are kind of a boob. Here is why: hardware RAID is completely dependent on the RAID controller. If the RAID controller dies, you are hooped. You can pretty much kiss that data goodbye unless you have an identical card as a spare nearby. In an enterprise environment, they have their advantages due to the increased throughput on a high-demand network, but this is @ home, where such scenarios are pretty much never present.

        For the home enthusiast, a software RAID setup (not fake RAID, pure software) is the better bet. That, at least, is easily migrated from one machine to the next when it comes to replacing or upgrading motherboards. Also, with that amount of data, you probably want to run a minimum of RAID 6, and a filesystem such as ZFS is desirable to ensure data integrity.

          • flip-mode
          • 9 years ago

          The point is that your SATA needs are far beyond typical and it’d be silly for a motherboard manufacturer to design their boards on the basis of such an unusual usage scenario. You give an example of using up 6 ports that you say is “nothing extravagant” but I think that could be up for debate. I call SSD RAID extravagant, personally.

          As for a RAID card going bad – that’s what backups are for. Everyone should have backups. RAID card dies: get a new one and restore your backup.

          A more fundamental point, in my opinion, is the fact that RAID really has no business in a desktop computer anyway (though I’ll make an exception for RAID 0). What’s the point of RAID? Other than RAID 0, it’s all about continuous uptime even in the event of a drive failure. That doesn’t describe the desktop usage model at all – and even more so for enthusiasts, who are always shutting their machines down to tinker inside the case.

          All that said, I don’t have a problem with a mobo having bunches of SATA ports. But I think the larger point is that there are other features that motherboard makers are ignoring that would be much nicer to have than a 7th, 8th, 9th, and 10th SATA port. Like wireless. Every darn mobo should have wireless at this point.

            • Deanjo
            • 9 years ago

            “that’s what backups are for”

            Ever try “backing up” 10 TB of data on a regular basis? It isn’t going to happen.

            “What’s the point of RAID? Other than RAID 0, it’s all about continuous uptime even in the event of a drive failure.”

            It has very little to do with “continuous uptime”. It’s all about data redundancy in case of failure. That’s it, period.

            As far as “wireless” goes, a real enthusiast would never use it on a desktop because of things like latency, interference, speed, etc. We are talking about a system that sits in one place, usually for its entire lifespan, and not one that is hauled around the living room or coffee shop.

          • Mentawl
          • 9 years ago

          Your idea of “nothing really extravagant” is quite impressive! That’s a ~£500 storage setup (assuming only 64GB SSDs), when most of the computing world gets by with a single boot drive and a single optical drive @ <£100 quite happily :).

            • Deanjo
            • 9 years ago

            That may be true in the UK, but a pair of SSDs is easily had in NA for ~$300. Again, we are talking about an “enthusiast” machine, not a mom-and-pop machine. A pair of SSDs is not unusual for an enthusiast machine.

      • Bauxite
      • 9 years ago

      They already make a great solution for your problem: you want a sata/sas adapter in one of your pci-e slots. Probably an 8(2) port SAS HBA-only with breakout cables since you didn’t mention raid.

        • Deanjo
        • 9 years ago

        Sure, that would work, but if the chipsets supported CBS or FBS port multiplication, you wouldn’t even need an additional card, and one could still run a fairly redundant RAID setup with little effect for storage purposes.

      • NeelyCam
      • 9 years ago

      Yeah – especially when half of those ports will stop working after a year or two… you need to have spares.

        • Deanjo
        • 9 years ago

        Wouldn’t bother with an intel.

      • cynan
      • 9 years ago

      In my opinion, the more SATA ports, the better.

      My x48 motherboard only came with 6 SATA ports, which were not enough for me (though I clearly thought they were when I bought the board). I could use about 10.

      Currently I have 8 (extra 2 with a cheap PCIe silicon image controller) and they are all used up.

      4 internal HDDs,
      1 removable 2.5″ HDD
      1 Optical drive
      2 eSATA connections (one on the back which is always connected to an external enclosure and an open port on the case front)

      Port number aside, if you want to connect devices other than HDDs, the ability to take advantage of features like Intel’s Matrix Raid becomes impaired.

      Sure, you can buy a fancier RAID controller add-on with more ports and maybe hardware RAID, but these things cost as much as or more than the motherboard itself.

    • rika13
    • 9 years ago

    Ironically, another site did something similar with cases about a week or so ago, and it got me thinking about the state of the industry.

    I would like to see more “realistic” parts with things people actually want and use, instead of crap like XL-ATX mobos that take 4 cards and require special cases, Antec’s CP series PSUs which only fit in like 3 of their cases, gimmick cases (open bench cases asking for dust, Lanboy Air), cases which cost more than motherboards or procs.

      • FuturePastNow
      • 9 years ago

      On the subject of cases, I’d like to see more that are actually nice looking. Clean, simple design, with no windows, no side or top fans, and no unnecessary lights.

        • Veerappan
        • 9 years ago

        Something like the original Sonatas? I still have mine running my HTPC, and I love it.

        Much better than the Cooler Master Centurion 5 I have my desktop housed in.

    • stupido
    • 9 years ago

    [quote<]If Zotac can fit Wi-Fi on a Mini-ITX board, why can’t everyone else with full-size ATX?[/quote<]

    That was a very, very good comment!

    Also, the BIOS should become one of the distinctive features of a mobo - SW goodies like a pre-installed Linux distro, more 4-pin fan headers with different control schemes, etc...

    • eyke81
    • 9 years ago

    Debug LEDs would save a lot of guesswork and parts-swapping when hunting down a faulty component.

      • bthylafh
      • 9 years ago

      Complete BIOS documentation for all the beep codes, even. More than once I’ve had a motherboard with $POPULAR_BIOS make a beep code which turns out to be undocumented anywhere, even by the BIOS maker itself.

    • UberGerbil
    • 9 years ago

    Boxes with handles. We don’t have enough motherboards sold in [url=https://techreport.com/discussions.x/20495<]boxes with handles[/url<].

      • Palek
      • 9 years ago

      [url=http://www.epox.com/product.asp?ID=EP-8KHAplus<]This Epox EP-8KHA+ Socket A board[/url<] came in a plastic box with a rope handle way, way back. Ahead of their time? Their "visionary" packaging style didn't help the crappy VIA chipset, though.

      • DancinJack
      • 9 years ago

      Glad you brought this up again. MORE HANDLES PLZ

      • RickyTick
      • 9 years ago

      My MSI XPower Big Bang X58 came with a handle, so there.

    • Ronald
    • 9 years ago

    Gigabyte’s X58A-OC – designed by overclockers, for overclockers.

    [url<]http://www.gigabyte.us/MicroSite/265/x58a-oc.html[/url<]

    Maybe your next review?

      • kilkennycat
      • 9 years ago

      And you are the sheik of ……… ? Can you spell o..v..e..r..k..i..l..l.. ? A perfect fit for the PC ‘yuppie’.

        • Ronald
        • 9 years ago

        Glad my street cred is still intact.

    • DrDillyBar
    • 9 years ago

    unobtanium. 🙂

      • UberGerbil
      • 9 years ago

      Which you can buy from Oakley. 😉

    • ColdMist
    • 9 years ago

    After recently putting together a new Sandy Bridge setup, I have to ask: why don’t they have the case connectors in a nice sleeved strand, with a block like the HD Audio block? Include a short 1/2″ breakout cable for legacy compatibility, but move on already. And have it keyed so you don’t plug the wires in backwards.

    Second, add a little more space (just 1mm!) around the SATA ports. If you get a thick cable, sometimes you can’t plug two of them in next to each other, because the first one blocks the second.

    Finally: Support the mouse wheel in UEFI!

    • Skrying
    • 9 years ago

    I wish the industry would get together and finally standardize front panel connectors. I wouldn’t mind a redesign that was more solid, either.

      • LawrenceofArabia
      • 9 years ago

      Good lord this. We’ve come to a point where most hardware manufacturers are paying attention to cable management options but we still have those same shitty front panel connectors.

      • DancinJack
      • 9 years ago

      I get so excited every time I upgrade enough in my case to require a motherboard change, then I remember about these. Front panel connectors are the worst thing about putting together a computer, by far.

    • EndlessSnowfall
    • 9 years ago

    I wish motherboards would have an adapter for the HDD/power switches and LEDs, so that you could plug in the cables outside of the case where it’s easier, then plug the larger adapter onto the motherboard. An adapter this simple shouldn’t cost very much.

      • neon
      • 9 years ago

      some motherboards now have this:
      [quote<]The ASUS Q-Connector allows you to connect or disconnect chassis front panel cables in one easy step with one complete module. This unique adapter eliminates the trouble of plugging in one cable at a time, making connection quick and accurate.[/quote<]

        • travbrad
        • 9 years ago

        Yeah ASUS has been doing that for a few years at least now. My P45-based board (P5Q SE PLUS) has that connector. You still have to plug in the individual cables of course, but it’s a lot easier than doing it inside the case.

        That plug can’t cost more than a dollar (if that) to make, so it seems like something all motherboards should have. I’d much rather have that than a 9th and 10th USB port, a 2nd ethernet port, etc

    • ClickClick5
    • 9 years ago

    I really miss the DFI LanParty series. Yellow. Ah….how they would glow under the UV lights.

    EDIT: Here is a fun board. The days when YOU had to know how things plugged in. Plug in that ATA cable wrong? Ha.
    [url<]http://www.amiga-hardware.com/download_photos/a4000tmb_rev4_4.jpg[/url<]

      • cygnus1
      • 9 years ago

      you know what i like about that board? no heatsinks. at all.
