Survey: Linux still gaining traction in the enterprise

These days, I don’t hear many industry analysts talking about the “year of Linux on the desktop”—that’d be a bit hyperbolic, with Linux distributions commanding an aggregate usage share of less than 1%, according to Net Applications. In the enterprise, though, interest in Linux is booming. So reports PC World, citing a survey by the Linux Foundation and the Yeoman Technology Group.

The survey, which gathered responses from 1,900 organizations, reveals a whopping 76% have plans to add extra Linux servers within the next year. Only 41% have similar aspirations for Windows servers… and 44% actually intend to maintain or decrease the number of Windows servers they use. If you look at a five-year time frame for upgrades, the balance shifts further toward Linux: 79% of organizations say they’ll add Linux-powered servers within the next half decade, compared to only 21% for Windows servers.

As icing on the cake for Linux vendors, the survey found that 60% of respondents are “planning to use Linux for more mission-critical workloads than they have in the past.”

Why such an interest in the open-source platform? The survey asked respondents about that, and it found that their motivations were, in order of importance: no vendor lock-in, the openness of the source code, the platform’s long-term viability, and the greater choice of hardware and software.

Comments closed
    • Anonymous Coward
    • 10 years ago

    I changed jobs some years ago to a Windows shop, and man I miss the Linux boxes. Windows makes a lame server. FROWN.

    • Anomymous Gerbil
    • 10 years ago

    Reply fail.

    • ultima_trev
    • 10 years ago

    I work at an undisclosed Fortune 500 company, and our Linux processing is non-existent. We have several thousand Windows Server 2003 VMs and a few AIX or HP-UX clusters, but most of our processing is actually done on the mainframe. Client/Server FTL!!!!

    • herothezero
    • 10 years ago

    Nothing wrong with Linux, but Linux is only free if your time has no value. RH support is pretty good, their builds are reliable, and you can count on their stability over time. It’s hard to say that for other distros; that, and the fact that everyone and their brother has “the best” distro, whatever that means.

    Relying on “the community” for support is a death sentence for an enterprise IT support group. We run a handful of FreeBSD/RHEL systems in a 14K-employee company, all in an appliance capacity, and they’re great. But it’s the lack of certainty and concrete support options (to say nothing of applications) that keeps a lot of businesses married to M$.

      • ew
      • 10 years ago

      Yes, if you want someone else to do the support for you it will cost you money. If you want to do the support yourself then Linux can be a very good choice.

        • blastdoor
        • 10 years ago

        You forgot “if your time is cheap”

          • just brew it!
          • 10 years ago

          Once you’ve familiarized yourself with how Linux works, it shouldn’t take any more time than supporting Windows. In many cases, it will actually be less.

      • just brew it!
      • 10 years ago

      I run Linux (Ubuntu and Debian) as both a desktop and server OS at home and work. I can honestly say that I spend […]

      • PenGun
      • 10 years ago

      This is the hooker argument. It’s cheaper to pay for sex than have a real relationship.

      • SNM
      • 10 years ago

      Your time doesn’t have to be cheap for supporting Linux in-house to be preferable. I don’t know any web hosting providers who out-source their server management/troubleshooting. 😀

      • eloj
      • 10 years ago

      Well, your time doesn’t become free just because you buy Windows or OSX or Solaris or whatever, so it’s still “gratis + time” < “$mucho_dinero + time”.

      The idea that linux (especially on the server!) is ‘harder’ and takes more of your time is a complete fallacy, typically grounded in the idea that “yeah, but I want to hire only the cheapest most incompetent people”.

        • stdRaichu
        • 10 years ago

        +1. All the enterprise Linux distros produce verbose logs and are very good at telling you exactly what’s gone wrong. Error messages are, on the whole, much less cryptic than anything you’ll ever see in Windows, and the OS itself goes wrong a lot less.

        Heck, the first exam of my RHCT was being given a broken RHEL box and told to fix it inside an hour; I managed to do it in six minutes, would have been less if I’d not made a typo in the bootloader.

        Linux is only hard if you’re ignorant of how it works; likewise, I’d find OSX “hard” as I’ve very little experience with it. But any sysadmin worth their salt will have at least some familiarity with linux, and it’s practically mandatory for those of us in the square mile.

    • StuG
    • 10 years ago

    Where I work, Windows is in the minority. I would say upwards of 90% of our servers are Linux-based 🙂

    • PeterD
    • 10 years ago

    – no vendor lock-in -> ok, I understand
    – the openness of the source code -> ok, I understand
    – the platform’s long-term viability -> hmm, I don’t really understand, but I think I get the point
    – the greater choice of hardware and software. -> what?

      • Trymor
      • 10 years ago

      “– the greater choice of hardware and software. -> what?” That’s what I was thinking, but if you look only at 64-bit Win7, then I can maybe see the point.

      • maxxcool
      • 10 years ago

      I’m willing to bet they are referring to Solaris hardware, PPC hardware, HP-UX hardware, and the other “big iron” devices, which makes great sense.

      • just brew it!
      • 10 years ago

      Windows is x86/x86-64 only (well, OK, MIPS and ARM too if you count WinCE, but those are not particularly relevant for enterprise users). Linux runs on pretty much […]

        • Sahrin
        • 10 years ago

        Other than maybe one day ARM, all the other platforms you mentioned are irrelevant to the commodity server world. x86 is effectively (that is, by market share and purchase intent) the only server platform in existence, especially for new servers (though it remains to be seen whether ARM will be relevant in the future).

          • SNM
          • 10 years ago

          Uh, no. Many areas continue to migrate to x86, and certainly for *web* servers it’s the winner, but big iron continues to run on SPARC, Itanium, and PowerPC. It’s not just due to legacy costs, either; these architectures have their advantages.

            • reactorfuel
            • 10 years ago

            Look at Top500, the biggest iron of all. x86 derivatives make up the lion’s share of the list, and keep increasing. The biggest, baddest supercomputing site in the world – the Cray XT5 Jaguar at NCCS – runs on Opterons. Other architectures do have their own advantages, of course, but x86 is a huge player even at the very top end of the market.

            • Shining Arcanine
            • 10 years ago

            That is not “big iron”. “Big iron” systems are mainframes. Today’s supercomputers are cluster designs built around commodity hardware, which is why you see x86 used extensively in them today.

      • StashTheVampede
      • 10 years ago

      Greater choice of hardware/software is the real key for Linux. Not mentioned in that report (but still VERY relevant) is how well VM consolidation and migration is going.

      Install ESXi and you have no issues managing VMs between Linux, Windows, etc. Linux VMs play “nicer” than Windows VMs, so it’s very comforting rolling out/cloning more Linux machines for everyone to play with.

      Now that you have virtualized your data center, your setup is less vendor-specific. ESXi loves HP hardware, but it is just fine on Dell servers and hardware from a myriad of other vendors.

      Virtualization is a huge reason why Linux is gaining adoption in data centers.

        • Corrado
        • 10 years ago

        Only recently has it become easy enough to run mission-critical Linux VMs on ESX. The way Linux keeps time made it nearly impossible to keep valid time in a VM without restarting ntpd every hour. My RHEL 5.3 machines would drift over a minute every two to three days. 5.5 plus distro-specific VMware Tools have only recently made it usable (still a 2-second drift over a week). And since Linux ntpd does not have a provision to resync with the NTP server without restarting the whole service, you need a script to kick the service in cron.daily to keep it proper. I had to build a local NTP server and sync everything to it. Big pain in the balls.
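
        For anyone fighting the same drift, here’s a minimal sketch of that kind of cron.daily kicker, assuming a RHEL 5-era guest with the stock init scripts; the NTP server name below is just a placeholder:

        #!/usr/bin/env python
        # Hypothetical cron.daily helper for a drifting VM: stop ntpd, step the
        # clock once against the local NTP server, then start ntpd again.
        import subprocess
        import sys

        NTP_SERVER = "ntp.example.internal"  # placeholder for the local NTP server

        def run(cmd):
            """Run a command and report failures on stderr without aborting."""
            rc = subprocess.call(cmd)
            if rc != 0:
                sys.stderr.write("command failed (%d): %s\n" % (rc, " ".join(cmd)))
            return rc

        run(["service", "ntpd", "stop"])
        run(["ntpdate", "-u", NTP_SERVER])  # one-shot step of the system clock
        run(["service", "ntpd", "start"])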

    • Xenolith
    • 10 years ago

    The desktop is fading. Owning the desktop won’t be that great of a prize in a few years.

      • StuG
      • 10 years ago

      Pretty sure this was in reference to servers; why bring this up?

        • indeego
        • 10 years ago

        Because it isn’t in reference to servers. The cliché is “This will be the year of Linux on the desktop.” But in fact, Linux skipped “the desktop” and went straight to mobile.

        It appears Microsoft will own the desktop, and Android is on a path to own the mobile market, arguably a much more important market.

        If Microsoft cared about the mobile world, we would have seen Windows Phone 7 in 2006, when the threat to their mobile platform was very apparent.

          • blastdoor
          • 10 years ago

          I don’t think Android will “own” mobile in the same way that Windows owned the desktop. No way will Android have 95% of the mobile market. Also, don’t count MS out of mobile. The very fact that WP7 will end up with the best mobile MS Office & Sharepoint implementation means that WP7 will find a market.

          In five years I’ll bet the mobile market is fairly evenly split between Android (perhaps number 1, but not by any huge margin), iOS, and Windows. I suspect Nokia and RIM will have switched to Android or Windows as their OS by then.

        • Xenolith
        • 10 years ago

        The opening paragraph is about the desktop.

          • StuG
          • 10 years ago

          I felt like the opening paragraph was just that, an opening, and not really the subject. I still kinda feel that way; this article is really about server growth for Linux. Hence the confusion about your statement.

    • Corrado
    • 10 years ago

    My problem with Linux is still the cost. If I wanted to switch my servers to all RHEL, with proper support and patching etc., it’s $87 per year per server, even if you don’t upgrade said server version. Sure, a Win2k8 license costs more up front, but it’s a one-time fixed cost that you can write down. The RHEL subscription is an ongoing expense. Over a five-year life cycle, that “free” OS costs me about $435. Over the same five years, Windows only costs me $150 on MS’s Open License program.

    Now, that’s not to say I don’t have a place for some RHEL machines, cuz I do. We have about 20 RHEL boxes running Oracle, OpenSSO, and some custom applications. But we also have about 500 Windows 2k3 servers running LOTS of other stuff.

      • stdRaichu
      • 10 years ago

      If you don’t want to pay support on boxes, you don’t have to – our prod kit all runs fully supported RHEL, all the dev and test kit is binary-compatible CentOS. Same OS, same patchset, zero price tag – that’s the benefit that lack of vendor lock-in provides.

      Besides, if it’s a choice of running Linux, Solaris, AIX, HP-UX or another flavour of proprietary UNIX, IME linux is a mere fraction of the cost even if you go for the full RHEL smörgåsbord.

      And the support you get from Red Hat is awesome; phone support gets me a techie living within 50 miles who is capable of remotely diagnosing problems (along with related tickets, awareness of the quirks of particular patchsets and third-party applications, plus a complete report of exactly how the error arises) in a matter of minutes – they really put emphasis on having highly trained personnel even on tier 1. Contrast that with my experience of MS support, which seems to be “unless you’ve got another tier 1 vendor telling us this is an issue, we’ll brush this under the carpet, and we’re still going to have you on hold for an aeon”.

        • Corrado
        • 10 years ago

        I can’t just have “a random techie log in for remote support”. This is big business with other B2B clients. If anyone wants to touch/look at/think about our stuff, a background check plus an NDA need to be signed, and that takes HR 2-3 days to complete. If something breaks, I don’t have 2-3 days to wait for a fix.

        I’m also NOT pooh-poohing Linux. It has its place. There’s just this complete fallacy that it’s FREE, as in it costs nothing. That’s completely not true when you’re in a corporate environment that gets audited monthly for patches on everything from your switch OS to your printer firmware (I’m not kidding) in order to keep industry certifications. So we MUST keep ALL software on a machine up to date. We’re allowed to be up to one month behind on all patches. When you have 500+ machines, you NEED the automated approach, which, with RHEL, requires the RHN subscription, which costs money. With MS, we run a WSUS server, and the machines patch themselves and reboot during our SLA window. No interaction required on our part.

          • bigfootape
          • 10 years ago

          If you’re using all of the vendor’s upstream patches from RHN and not cherry-picking, a solution like automated yum updates would be a good fit. If not, perhaps something like Current would work. It’s been a long time since I used up2date, and I don’t know what controls they’ve added for mass management, so you’ll have to be the judge here.
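
          In case it helps, here’s a minimal sketch of the automated-yum-updates idea, assuming a RHEL/CentOS box where unattended full updates are acceptable policy; the log path is made up for the example:

          #!/usr/bin/env python
          # Hypothetical cron.daily job: apply all pending updates unattended
          # and append everything yum prints to a log file.
          import subprocess
          import time

          LOG = "/var/log/auto-yum-update.log"  # made-up log location

          log = open(LOG, "a")
          log.write("=== update run %s ===\n" % time.strftime("%Y-%m-%d %H:%M:%S"))
          log.flush()
          # Run the update and capture yum's output in the log.
          rc = subprocess.call(["yum", "-y", "update"], stdout=log, stderr=log)
          log.write("=== yum exited with status %d ===\n" % rc)
          log.close()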

      • bcronce
      • 10 years ago

      RHEL != Linux

      RHEL is a service for Linux.

      You can get numerous Linux distros 100% free, but you can’t get a support service for free.

      I’m hoping that by the time I pay off my student loans, I can get some nice sexacore FreeBSD/Linux machines.

        • just brew it!
        • 10 years ago

        […]

          • bcronce
          • 10 years ago

          I’m not a fan of AMD’s current chips. Bulldozer looks very promising, but my i7-920 quad can outperform a Phenom II X6.

          Plus, I’m more excited about having a pfSense 2.0 firewall on a low-power 22nm 6/8-core with DDR4. DDR4 is going to run at 1.0V-1.2V, and a 22nm CPU should only sip power.

          I’m aiming to have a FreeBSD 9 box running a clustered file system with a ZFS back-end, a 10Gb NIC, and 40Gb InfiniBand interconnects between the cluster nodes.

          I’m also looking at HP ProCurve managed switches with IPv6/VLAN/QoS.

          This is where I will store my Blu-ray and DVD rips, purchased of course. If I have money, I will buy movies.

          From there, I will have some Linux set-top boxes to stream movies to TVs in the house.

          I figure I’ll be able to start building this stuff in the next five years, which will give time for any new standards to come out. I know PCIe 3.0 is around the corner, multi-core CPUs and large amounts of memory are becoming dirt cheap, and SSDs will have stabilized; I’m guessing I’ll end up with an all-SSD 10+TB file share that could easily saturate a 10Gb link.

            • blubje
            • 10 years ago

            Please tell me this is a joke…

            • bcronce
            • 10 years ago

            I know people who spend $40+ per night drinking (for the first week or so, then they’re out of $$ until the next payday). I think my hobby will result in less liver/brain damage, and it will actually be semi-useful and a lot more fun.

            • not@home
            • 10 years ago

            Yeah, but… I have a file server that serves DVD rips uncompressed to other machines on my network, and it is nowhere near as powerful as the gear you are hoping to put together. You do not need that much. I do not have any HD videos, but I cannot imagine they need much more than what I use. I have a 433MHz Celeron CPU with an add-on disk controller and a 100Mbit network. I have yet to do anything that gets it higher than about 75% usage.

            • bcronce
            • 10 years ago

            Here’s why I’m looking at implementing this.

            Clustered: I don’t want one failed computer to take out anything.

            ZFS: If I have 20+TB of data archived, I don’t want it going bad. Three computers in the cluster; any one of them can fail and the cluster keeps working. Each node in the cluster runs RAID-Z ZFS, so even multiple HD failures will not affect the operation of the machine.

            10Gb (~1.2GB/sec each way): Consumer-grade SSDs coming out next year will do 500MB/sec. I don’t want my network limiting my speed.

            InfiniBand: Reading/writing at a full 10Gb is going to need some fast interconnects to cover the overhead.

            Back-ups: I want to have only my OS/apps installed locally, so SSDs. If I decide to set up a back-up job on all the computers in my network, I want the back-ups to run at damn near the full speed of the SSDs.

            ProCurve managed switch: With IPv6, all devices on my network are going to have public IPs, and I’m going to have 2^64 IPs to play with. I want to subnet my network between desktops/wireless/servers/consoles/set-top boxes/guests. Each subnet will be in a VLAN with a very strict ACL. The ACLs will reinforce my edge firewall rules. Minimal-access ACLs.

            PFSense Firewall: I want my edge firewall to be low latency and high bandwidth. It will also handle VPNs, IDS, Traffic shaping, QoS, and detailed logging.

            I absolutely hate it when my internet is slow, my WAN is flashing like mad, and I can’t tell who/what is causing all that activity or what kind of activity it is.

            This will also be a great learning experience.

            This kind of stuff is “fun” to me.

            • blubje
            • 10 years ago

            I don’t think you fully appreciate the purpose, or the expense, of setting up these components.
            1. ZFS has many problems. Read the mailing list. It has many features, more than any other file system currently, but because of that complexity it has stability issues. It is intended for enterprise use, but I don’t know how many enterprise companies actually use it.
            2. 10Gb is insanely fast. Unless you are running nuclear/weather simulations you don’t need this. It would only be useful if someone walked into your house and said “If you can’t transfer the entirety of Avatar HD to this thumb drive inside 5 minutes, I’m going to kill your fish.”
            3. InfiniBand – again, this is mainly for supercomputers/NUMA. It is primarily used not for its bandwidth but for its latency. You could have 10 different streams of HD content to 10 different TVs with Cat 5e.
            4. Backups, by their nature, don’t require super-fast connections. They run in the background at whatever time you set up in cron. Additionally, if you plan on using all this supercomputing equipment, why aren’t you taking advantage of LVM? ZFS has similar features. This simply doesn’t require a 10Gb connection.
            5. IPv6… it’s your funeral. Have fun!
            6. No argument about a firewall. Very useful, but a high-end computer/fast connection does you no good unless you plan on doing packet inspection for hundreds of desktops.

            As for the fun of it: yeah, I completely get it. I do similar sorts of things myself, but part of being tech-savvy is knowing what you really need. What you’ve described to me is something you would see at Brookhaven or the Naval Academy. It is so far beyond what you need that it is as if you bought a Veyron for a 5-mile drag race when you are competing against 20-year-old Geos.

      • StashTheVampede
      • 10 years ago

      What ties you to RHEL? Oracle?

      • stmok
      • 10 years ago

      http://en.wikipedia.org/wiki/CentOS => http://www.centos.org/

      We use it for VPS (commercial web hosting services), engineering (supercomputing/clusters), back-end servers for medium-sized businesses, etc.

      • SNM
      • 10 years ago

      You seem to have forgotten the support costs you’re paying to Microsoft. If for some reason you’re not paying Microsoft for support, what the hell are you doing paying for OSS support?

        • Krogoth
        • 10 years ago

        Support costs time and $$$$ (yes, even OSS). There’s no way around it. Corrado is trying to point that out.

        OSS solutions may or may not be cheaper depending on the client’s needs.

        • Corrado
        • 10 years ago

          I can run Windows Update for free and get all security and feature patches for the lifetime of that product. I can’t run RHEL updates for free. I have to do it all manually, for each installed package, unless each individual machine has a valid RHN account registered on their website.

          • BobbinThreadbare
          • 10 years ago

          Sounds like Red Hat is screwing you, not Linux.

          There are plenty of distros that don’t charge for updates. When you said you were paying for support I assumed that meant something above and beyond security patches and bug fixes.

          • just brew it!
          • 10 years ago

          If you’re not going to pay for the RHEL support, there’s really no point to using RHEL in the first place. You should be using CentOS (which is essentially a rebranded RHEL) instead…

            • Corrado
            • 10 years ago

            I don’t believe our clients (which include some REALLY big names, such as Walmart, VMware, PepsiCo, Bank of America, and Accenture) would like us to be using a community-supported patching model. Like I said above, we get audited every month in order to keep certain certifications. We would be stuck using RHEL, SUSE, or another “big name” with a reputation behind it.

            • StashTheVampede
            • 10 years ago

            I work for a medium-sized company (less than 2k people) that does all sorts of advertising and marketing online. We process credit cards, so we’re audited annually.

            Not once was there a question as to why we were using CentOS on our servers. The only time any question was asked about our servers, it was about licensing, and we had licenses for the various app servers we ran in production.

            • just brew it!
            • 10 years ago

            #77’s reply notwithstanding, I agree that’s a valid concern. In reality, I would argue that a solid community-supported distro like CentOS or Debian is still […]

      • MarkD
      • 10 years ago

      I’ll add my two cents as a UNIX/Linux admin working for an IT service provider.

      We use Red Hat for our backup servers, because the throughput is really good – finally something that can saturate the tape drives – which incidentally cost more than the servers. Depending on the size of your enterprise, you might pay less than that $87 – of course, you might be getting a better deal on Windows as well. Red Hat for applications is just starting to take off here. Oracle is trying to undercut Red Hat with their own supported Linux, so you are hardly tied to one distro, even if you need someone to pick up a phone and help you 24×7.

      Windows servers are generally good enough for applications, but the corporate databases are still Oracle on Linux or UNIX, despite the cost (Oracle is the big hit here, the hardware and OS support is a smaller part of the cost.) When downtime idles factories or hundreds of customers, a few extra bucks on IT is decent insurance.

      Vendor tech support, in general, is not all it is cracked up to be. Nine times out of ten, the issue is with an application, and we’re going to get an answer from Google faster than we would from a vendor. The tenth time is when the vendor earns his money.

      This reply is coming from an old laptop running Fedora 14 beta. Good enough for what I am doing, and I don’t miss XP a bit. My other machine is a desktop running W7 Pro. This one I built myself and bought the license just so I could run anything I might want or need to. I generally like Linux OS better, but I started in the punched card days, so the command line is not exactly daunting. To be honest, I have a harder time with the interface on my cell phone. My big beef about application interfaces isn’t the inconsistencies, it’s that they keep “improving” them. My time is worth something to me, so don’t make me keep re-learning something I already know how to do.

      • Shining Arcanine
      • 10 years ago

      If you insist on RHEL but do not want first-party support, you can install CentOS. It is the same thing, except you do not pay for it and you do not get first-party support.

    • sweatshopking
    • 10 years ago

    *nix will NEVER supplant Windows. It’s simply not possible. Too many issues with usability and GUI inconsistencies, and who wants to learn the command line?

      • [+Duracell-]
      • 10 years ago

      If you stick with one GUI package, then you don’t have to worry about these inconsistencies, and there are plenty of tasks that can be done better at the command line (and even scripted/scheduled).

      And the article talks about servers, where most people worth their salt would know how to use a command line, I would imagine.

      • stdRaichu
      • 10 years ago

      Anyone who’s involved in administering hundreds of servers wants to learn the command line. Which is what the people in the article are talking about, not desktops.

        • sweatshopking
        • 10 years ago

        Here’s the thing, though: the command line is boring. That’s the major issue here. Regardless of function, it’s boring. And boring is something I simply cannot get behind.

          • flip-mode
          • 10 years ago

          I don’t think you’re part of the target group. You may find the command line boring, but server administrators – at least a good number of them – probably find the command line more exciting than just about anything. And it seems that Microsoft validates these people, since they came out with PowerShell for Server 2008.

          You bitch about the command line, but I wonder if you read this part:
          […]

            • sweatshopking
            • 10 years ago

            I’m just horsing around. I love Linux. Used Ubuntu tons.

            • flip-mode
            • 10 years ago

            Ah, you twerp!

            • tay
            • 10 years ago

            You’re such a tard. You had me there for a bit too….

          • PeterD
          • 10 years ago

          “Command line is boring”?…
          Well, get back to your games, then.
          You’re supposed to make things work, not to enjoy a show.

          • just brew it!
          • 10 years ago

          For an enterprise server admin, if things are exciting it generally means something has crashed, or that there’s been a security breach. They […]

        • khands
        • 10 years ago

        Yeah, they’re talking beyond SMBs.

      • designerfx
      • 10 years ago

      Did you really intend to derail the entire conversation?

      The only place Windows is used is on the desktop, and that’s only because a lot of software isn’t compatible with any *nix flavor.

      However, that’s not a business model that’s going to last.

        • sweatshopking
        • 10 years ago

        I was hoping to.

      • eitje
      • 10 years ago

      You’re just upset about the Gnome desktop. Troll.

      http://en.wikipedia.org/wiki/The_World_of_David_the_Gnome

      • PenGun
      • 10 years ago

      The command line rules. It’s faster and way more useful than your clicky GUI nonsense.

      I can burn a whatever with a simple string that I can type faster than anyone can click through the burner GUI. I could go on.

        • cphite
        • 10 years ago

        Okay. Now select 50 specific files out of a folder containing thousands and move them to your burner, via the command line – faster than someone else using a GUI.

          • PenGun
          • 10 years ago

          If the files have something similar about them, no problem. If not, and we are just choosing names, then I’ll use Midnight Commander, which uses curses to run the best file manager ever made from a command line.
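
          As a toy illustration of the “something similar about them” case, here’s a sketch that stages everything matching a pattern into a directory you can then hand to whatever burner command you like; the pattern and paths are made up:

          #!/usr/bin/env python
          # Toy example: gather all files matching a pattern into a staging
          # directory, ready to be fed to whatever burner command you prefer.
          import glob
          import os
          import shutil

          PATTERN = "/home/user/photos/2010-09-*.jpg"  # hypothetical "something similar"
          STAGING = "/tmp/burn-staging"                # hypothetical staging directory

          if not os.path.isdir(STAGING):
              os.makedirs(STAGING)

          matches = glob.glob(PATTERN)
          for path in matches:
              shutil.copy(path, STAGING)               # copy each match into staging

          print("staged %d files in %s" % (len(matches), STAGING))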

        • shank15217
        • 10 years ago

        You can get locked into a vendor even with Linux; that’s a silly reason to use Linux. Vendor lock-ins exist for one good reason: it’s to the vendor’s benefit to convince their customers to stick with what they know. Then a few discounts for upgrades and a few corporate licensing deals later, they stay put for a very long time.

          • just brew it!
          • 10 years ago

          Sure, lock-in is still possible. But it is […]

      • sschaem
      • 10 years ago

      Never say never. Five to ten years from now, Linux might have evolved enough to give shops no reason to pay up front for an OS…

      • Umbragen
      • 10 years ago

      I agree with you, but not for any of the reasons you mention.

    • Anomymous Gerbil
    • 10 years ago

    You can’t have that discussion without considering the use of Linux to replace Solaris (and other OSes). To discuss only Linux and Windows shows a lack of understanding of the enterprise.

      • flip-mode
      • 10 years ago

      You can talk about almost any two things in isolation if you frame it properly.

        • sweatshopking
        • 10 years ago

        Your mom, and my bedroom. FRAME THAT!

        BANNED!

          • flip-mode
          • 10 years ago

          My mom is waiting for you to finish with your mom.

            • sweatshopking
            • 10 years ago

            Don’t worry, I won’t be long.

            • flip-mode
            • 10 years ago

            That’s what she said.

            • sweatshopking
            • 10 years ago

            damn skippy.

            “2 minutes in heaven is better than 1 minute in heaven” –Jermaine

      • stdRaichu
      • 10 years ago

      Damn straight; we’re supposedly a Windows-only shop, but we’ve actually got about twenty beefy Linux boxes running in the background*, and this is going to rise to fifty or so within the next 18 months.

      Mostly it’s because we’re transitioning a bunch of our Oracle kit (and some MSSQL databases) from AIX onto RHEL, but we’re also replacing half the extranet with Linux boxes because they’re more secure – not to mention the cost savings.

      * Not to mention all the devices that already run Linux, like our SANs, KVMs, firewalls…

      • flip-mode
      • 10 years ago

      Actually if you click through to the article, the survey does include Unix.

        • Anomymous Gerbil
        • 10 years ago

        Yep, but I was more referring to the Tech Report summary article.
