Bargain basement: a Dell G5 with a Core i9 and a 144 Hz G-Sync display for $1180, and more

Greetings, folks. There isn't much time for pleasantries today, other than to mention that it's really sunny out there and I wish I weren't swamped with work. Today's crop of deals is plentiful, however, and it's as hard as ever not to pull out my own credit card and acquire some of these fine sundries. Check them out.

  • Who wants a quality portable that can make short work of heavy-duty number crunching but also game with the best? The Dell G5 15 5590 (gnvca5cr1000smp) is one mean machine, fitted with a spankin' new Core i9-9750H processor sitting next to 8 GB of dual-channel RAM. For pixel-pushing prowess, you'll find an Nvidia GeForce GTX 1660 Ti graphics card with 6 GB of VRAM, while storage duties are taken care of by a combo setup with a 256 GB NVMe solid-state drive and a 1 TB hard drive. That's darn impressive already, but we saved the best for last: the display is a 1920×1080 IPS unit with a 144 Hz refresh rate and G-Sync support. The price tag reads just $1179.99 at Rakuten. Daaaang, right?

  • There's a ton of discounted storage today, starting with NAND flash. The Intel 660p 1 TB NVMe drive says "s'mee again" and brings its 1800 MB/s sequential read and write speeds to the table, as well as a good helping of random I/O at 150K read IOPS and 180K write IOPS. The price is now under a Benjamin: only $92.99 at Newegg with the cart code EMCTAUY22.

  • Nobody should be forced to employ a rodent of poor upbringing. Affordable quality choices are now plentiful, like the SteelSeries Rival 310. We liked it well enough back when we reviewed it, particularly its accurate 1:1 sensor and its sensible shape and button placement. While we weren't enamored with the price tag back then, that's a complete non-issue today, as you can obtain one for only $29.99 from Amazon. If you prefer a different take on an affordable mouse, the Logitech G402 Hyperion Fury has a distinct shape and sensor but a similar price: $28.70, also at Amazon.

  • The months-long drought of affordable mass storage seems to be coming to an end. The Western Digital EasyStore 8 TB spinner is a simple and straightforward external drive, and it'll set you back just $129.99 at Best Buy, or $16.25 a terabyte. The bigger model, the Western Digital EasyStore 10 TB, is comparatively even cheaper at $159.99 at the same e-tailer, or $16 per TB (quick per-TB math sketched just after this list). Finally you have somewhere to store your cheese picture collection.

  • Now here's something out of left field: a monitor that you can game and do color-critical work on, from a place you wouldn't expect. The Aorus AD27QD is a 27" display with a resolution of 2560×1440. So far so good, but it uses an IPS panel with a 144 Hz refresh rate and, get this, 10-bit color reproduction, alongside a color gamut that should cover 95% of the DCI-P3 color space. There's HDR support on tap and a 1 ms blur-reduction mode, too. If you were looking for a monitor that can do everything, this one is it. Get it for $539.99 from Newegg with the cart code EMCTAUY55.
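
For the number-crunchers, here's a minimal sketch of the per-terabyte arithmetic behind those EasyStore deals, using the prices and capacities quoted above:

```python
# Price-per-terabyte check for the two WD EasyStore deals above.
deals = {
    "WD EasyStore 8 TB": (129.99, 8),
    "WD EasyStore 10 TB": (159.99, 10),
}
for name, (price, capacity_tb) in deals.items():
    print(f"{name}: ${price / capacity_tb:.2f} per TB")
# -> $16.25 per TB and $16.00 per TB, respectively
```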

That's all for today, folks! There's a chance you're looking for something we haven't covered. If that's the case, you can help The Tech Report by using the following referral links when you're out shopping: not only do we have a partnership with Newegg and Amazon, but we also work with Best Buy, Adorama, Rakuten, Walmart, and Sam's Club. For more specific needs, you can also shop with our links at Das Keyboard's shop.

Comments closed
    • anotherengineer
    • 7 months ago

    sigh $56 for the mouse in Canada
    https://www.amazon.ca/dp/B073WGFLQY

    • malachy33
    • 7 months ago

    Problem is it is a Dell, so the speakers will blow the first time it makes a sound and the rest will fall apart in 3 months. The upside is if you get the next business day on-site warranty it only takes them 3 weeks to ship, repair, and ship it back to you each month when something breaks!

      • MOSFET
      • 7 months ago

      I know nothing about this Dell series, and I’m certainly not a Dell fanboy – I have no financial connection or outlay towards any Dell I use. I’m a fan of DIY, among other things. Having said that…

      I have a 2012 Latitude 5530 with Ivy Bridge i5-3210m and it’s still working well, and nothing is broken. Sure, since 2012, I’ve quadrupled the amount of RAM and the size of the SSD. Original Samsung 830 SSD still working great as a Win10/BlueIris boot drive. Maybe Dell quality or parts procurement quality has gone down since then; or maybe you had a bad experience, and I had a normal experience.

    • chµck
    • 7 months ago

    i like how the alienware design language is trickling into the dell G series

    • Chrispy_
    • 7 months ago

    Whichever muppet at Dell thought that a pair of 4GB RAM sticks was the correct option for an i9 needs their head checked.

    In saying that, it does at least minimise your losses when you throw out the generic 2400MHz Dell RAM bottlenecking the system and buy some proper stuff.

      • smilingcrow
      • 7 months ago

      They come with DDR4 2666 which is the highest speed supported by the CPU. The timings might be rubbish though!

        • Chrispy_
        • 7 months ago

        Fair enough, I was guessing because I’m in Europe and the Rakuten website isn’t available in Europe.

      • Usacomp2k3
      • 7 months ago

      16GB is plenty for most people and there are areas where the extra clockspeed over an i7 would benefit where extra RAM wouldn’t.

        • Chrispy_
        • 7 months ago

        But it comes with only 8GB – That’s my point!

        For a gaming laptop, 8GB is pretty limiting – and someone wanting the i9 for performance gaming doesn’t want to tank their framerate for swapping or background RAM management.

          • Usacomp2k3
          • 7 months ago

          Oh dang. You’re right. I read that as dual 8GB (8×2), not 8GB total (4×2)

      • synthtel2
      • 7 months ago

      I had the complete opposite reaction – they bothered going for dual-channel, good for them!

      It does make it more expensive to upgrade, but if you’re alright with single-channel RAM, why go for something like a 9750H in the first place? 8GB is a bit weak, but not overwhelmingly so (I still don’t think I’d notice a difference 99.5% of the time if I only had 8GB).

        • Chrispy_
        • 7 months ago

        Hmm. Perhaps my experience is sullied by my own laptop (a Ryzen 7 2700U with 2×4 GB RAM).

        I suspect I’m actually experiencing life on a 6GB machine instead of an 8GB machine, and of lesser bandwidth too – once the graphics core usage has taken its toll.

        I can’t remember the last time I used an 8GB machine with a dGPU – I’m still on DDR3 for my desktops and I stuffed all my machines to the max when DDR3 prices bottomed out in 2016 at barely $100 for 32GB kits.

          • synthtel2
          • 7 months ago

          To be fair, my own use is abnormally light on RAM (space, definitely not bandwidth). I do try to spec 16GB for other people’s machines at this point, but that’s mostly a matter of accounting for random bloatware and possibly having Chrome open with a bunch of tabs while trying to game.

          My brother actually just ran into almost exactly the 16G/single-channel versus 8G/dual-channel tradeoff (was running 8GB DDR3-2133, found 16GB of DDR3-1333 lying around, dual-channel either way) and decided to stick with the 8GB; 16 made Star Citizen much better at small expense to pretty much everything else, but Star Citizen was still pretty bad even with the 16GB (i5-4590 / GTX 960 / Win7 so no memory compression).

          I have 16 in my own machine, but if I look at conky and see anything that’d be iffy on 8 it’s a bit of a WTF. Last time that happened it was because some bug was causing the kernel to eat 3GB whenever video had been played since the last reboot.

    • Bauxite
    • 7 months ago

    The Acer non-gamer-bling version of that monitor (same panel) is $319 right now: the VG271U.
    https://www.newegg.com/Product/Product.aspx?Item=N82E16824011278

      • sweatshopking
      • 7 months ago

      only twice the price in canada!

        • K-L-Waster
        • 7 months ago

        Yeah, but we’re [s<]suckers[/s<] nice, so we can pay extra....

      • Bauxite
      • 7 months ago

      $309 now lol

    • Amien
    • 7 months ago

    That Logitech G402 is an incredible mouse for the money if you're not against the shape (or left-handed).

    • techguy
    • 7 months ago

    I don’t understand the abundance or persistence of high-end 1440p gaming monitors available today. 1440p is solidly in console territory. If you’re on PC, surely you can go Ultrawide or 4k by now? I don’t even bother checking framerates with my 2080 Ti at 3440×1440 100Hz Gsync, it’s just not an issue. Sorry if I sound like an elitist, but cereal. I’m like, comeon.

      • morphine
      • 7 months ago

      Absolutely not an elitist with the $1300 graphics card and $800 monitor.

        • cygnus1
        • 7 months ago

        I bet that monitor was more than that…

      • BobbinThreadbare
      • 7 months ago

      ultra wide is the opposite of what I want and I’m not spending more than $300 on a graphics card

        • Wirko
        • 7 months ago

        You’re not the only one here … [url<]https://techreport.com/news/32673/tuesday-deals-a-mechanical-keyboard-an-ultra-wide-monitor-and-more?post=1055502[/url<]

      • auxy
      • 7 months ago

      What a reprehensible comment. Even as someone with a $750 graphics card and a $550 monitor, I find your statements so gross it defies description.

      Let’s take one step back and realize that most people are still playing in 1920×1080. Common recent enthusiast graphics cards like the GTX 1060 or RX 570 struggle to maintain 60 FPS in AAA games at 1920×1080. (There is the argument that AAA games are unoptimized garbage that no one should play, but that’s neither here nor there.)

      But let’s take another step back and realize that there are LOTS of gamers out there who DON’T have an enthusiast graphics card, and are using a GTX 1050, GTX 950, RX 260, or even older/slower card. Or who are playing on integrated graphics. Even this 2560×1440 144Hz display is an absolute luxury and even an excess to the overwhelming majority of gamers.

      Now let’s take another couple steps away from your viewpoint and gain some perspective. When you step outside of the USA and Europe, there are millions upon millions of PC gamers who struggle to afford even an older gaming PC.

      One of my good online friends lives in the Philippines, where a gaming PC is far out of reach for most people. He just upgraded (with my help) to a used Haswell i7 machine with an R9 290. He just upgraded to that! And he loves it, of course. It’s amazing; he probably has the fastest computer around for miles. Of course, that machine could barely make use of the Aorus monitor above, much less a 4K display.

      You should take some time to familiarize yourself with the Steam hardware survey:
      https://store.steampowered.com/hwsurvey/Steam-Hardware-Software-Survey

        • Srsly_Bro
        • 7 months ago

        Auxy, I think he trolled you. I don't think he's dumb enough to believe what he typed.

          • Redocbew
          • 7 months ago

          Hanlon’s razor, bro.

          https://en.wikipedia.org/wiki/Hanlon%27s_razor

            • Srsly_Bro
            • 7 months ago

            Thanks for the post. I recently learned of that concept from an attorney during a lecture.

            • techguy
            • 7 months ago

            I don’t really care if anyone agrees with me. I see the value in spending money on products that offer a high level of quality and performance.

            Buy nice or buy twice.

            That doesn’t make me “stupid”.

            • Srsly_Bro
            • 7 months ago

            100fps makes you stupid. You had me until then.

            • techguy
            • 7 months ago

            I can’t help it there were no good alternative GSync monitors with higher refresh rate available, at a price I was willing to pay. Those new $2000 4k GSync monitors look nice, except they cost as much as a 4k TV that’s 4x the surface area, and that’s horrible value in my opinion.

            • ronch
            • 7 months ago

            Hey how are your Instagram posts lately?

            • techguy
            • 7 months ago

            No idea, don’t have a single social media account, it’s all nonsense.

            • ronch
            • 7 months ago

            Nah it ain’t nonsense, those kids love it. You know, yachts and golden spoons and 2080 Tis.

            • K-L-Waster
            • 7 months ago

            "Buy nice or buy twice."

            In order for the 2080 Ti to be a value purchase, you'd have to stick with it until the 5080 Ti gets released. Otherwise, it's a beast, but a way overpriced beast....

        • techguy
        • 7 months ago

        LOL. Clown world.

        I’m worse than Hitler! Reprehensible! Disgusting! Gross!

          • auxy
          • 7 months ago

          No, clown world is when people tell me, a woman, that I have to share bathrooms with dudes in skirts and makeup. Clown world is when I, an asian person, get called a white supremacist because I don’t think all races are necessarily created equally. Clown world is when the UK police are confiscating spoons as deadly weapons and arresting people for mean tweets. Clown world is when people insist that, because it’s a little warm today, we need to federalize 200 industries. Clown world is when Dear Leader tweets verifiably untrue things, even if the kind of people who use “clown world” unironically would never criticize him.

          You aren’t being put upon for the sake of political correctness.
          You’re just an inconsiderate jerk who has absolutely no self-awareness.
          This is the exact same as when the people you hate cry discrimination and play the victim.
          Grow up and take responsibility.

            • techguy
            • 7 months ago

            TRIGGERRED!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

            • Srsly_Bro
            • 7 months ago

            I love auxy. This is my official endorsement. Great post.

            • astrotech66
            • 7 months ago

            "Clown world is when I, an asian person, get called a white supremacist because I don't think all races are necessarily created equally"

            So ... you're an asian supremacist? I'm just wondering what your preferred race is, since you seem to think they're not all equal.

            • ronch
            • 7 months ago

            As a Klingon we don’t care which race anyone on your puny little planet is from. You’re all just puny earthlings to us.

            • steelcity_ballin
            • 7 months ago

            I mean, this is absolutely not the forum for it, but Charles Murray, author of “The Bell Curve” is basically ostracized in his field and branded as a racist for the scientific data collecting and reporting that demonstrates that intelligence as a metric varies with race, and that specifically African < Caucasian < Asian by about 10 IQ points between them on average. The entire book is mostly labeled as racist science by his peers, even when 99% of the book isn’t even about race.

            The issue is that even if all we are talking about is verifiable correct data without any charge of emotion, people understandably don’t like hearing this and especially in the current socio-political climate, are quick to brand any talk of such things as bigoted. That’s a real problem IMO because assuming good faith, we have to be capable as a species of quantifying, measuring, observing such things and talking intelligently about them.

            Data cannot be racist or bigoted. What you do with that data can be however, and therein I think lies the danger. But I’ve never been in favor of censoring data or burning books. Every person is an individual, and we are not equal. We should have an equal opportunity in life to achieve what we aspire to, but that doesn’t mean equality of outcome. This has probably crossed into RnP territory by now, where I’d be happy to continue this conversation if anyone would like to.

            To the OP’s point, I think the concept of supremacy is a joke. People are people. There’s a lot more that connects us than separates us, and if we’d focus on that aspect of our humanity and have genuine compassion for others even when we don’t agree with one another, I think we’d have a much happier society.

            • sweatshopking
            • 7 months ago

            "Data cannot be racist or bigoted."

            HAHAHAHAHAHHA WHAT

            • Spunjji
            • 7 months ago

            Here are the two biggest problems with The Bell Curve:

            1) There is no scientific basis for classifications such as “African”, “Caucasian” and “Asian”.

            2) There is still no useful, universal, truly scientific basis for measuring intelligence – let alone one that is reducible to a single figure. IQ measures particular kinds of intelligence that were valued by middle-class white Europeans at the beginning of the 20th century. If the subject being tested has not benefited from a standardized education and been immersed in a society that emphasizes those specific traits, they will not score well, no matter how “smart” they may or may not be.

            Those two problems are why it’s called out as being racist. I don’t think the authors themselves *meant* to be racist – they’re just displaying data they collected and analysed – but the frameworks they worked within to collect and analyse the data were inherently racist.

            • auxy
            • 7 months ago

            Why does it have to be about “supremacy”? That’s not a knee-jerk reaction. I want you to think about that question.

            We have to be able to recognize differences in each other because we are different in so many ways. Pretending everyone is the same and that there are no differences all in the sake of protecting someone’s feelings serves nothing in the long run.

            It’s not about being better than someone else. This kind of talk doesn’t have to be “my race is superior.” It only becomes that whenever you try to force the narrative that people who recognize differences are automatically bigoted.

            • caconym
            • 7 months ago

            You’re why we need the humanities.

            • caconym
            • 7 months ago

            (your science sucks too)

            • sweatshopking
            • 7 months ago

            TECH GUY IS NUTS, BUT LET’S NOT PRETEND YOUR LIST ISN’T FULL OF CRAZY TOO

            • Pancake
            • 7 months ago

            R u Pinoy?

            • Spunjji
            • 7 months ago

            This comment makes me sad.

            You will have been happily and unwittingly sharing bathrooms with trans women for years before the idiocy of the culture wars kicked off. Why choose to join in the nasty shit about “dudes in skirts and makeup”? Watch a couple of Contrapoints videos, educate yourself.

            Unfortunately, thinking that there’s anything remotely scientific to be said about people on the basis of race means you have bought into what originated as white supremacist concepts, whether or not you yourself are a white supremacist. This might help understand why people fling that accusation at you. Look up scientific racism, educate yourself – race has no basis in science.

            Nobody’s “arresting people for mean tweets” in the UK – if they’re being *arrested* and not just asked for an interview, there’s going to be a bloody good reason for it. Anyone telling you otherwise has a very specific agenda.

            Climate change is real. Tackling it doesn’t have to involve nationalizing anything, but the American right refuses to even acknowledge it, let alone propose mitigation. Of course you’re only getting left-wing solutions when more than half your government is bought and paid for by the fossil fuel industry. Educate yourself on the realities and get angry at the people who won’t do anything, not the people who want to fix a very real problem in ways you disagree with.

            Being a centrist doesn't mean giving your headspace to lies, bullshit and propaganda just because of "balance". Be better.

          • Goty
          • 7 months ago

          Don’t take it personally. auxy basically only comes here to rage post.

            • auxy
            • 7 months ago

            You’re not really wrong, but it’s because the quality of commenter here is generally so good. I really don’t have a lot to say whenever I read most posts anymore. Normally if I’m posting it’s because someone made me angry. “SOMEONE IS [b<]WRONG[/b<] ON THE INTERNET!" Hehe. Part of it also is that I'm usually talking to Zak when he's writing the articles and so generally he's covered anything I want to say in the post. (*Β΄βˆ€ο½€*)

      • Redocbew
      • 7 months ago

      "Sorry"

      Doubtful.

      • K-L-Waster
      • 7 months ago

      The only 4K monitors that exceed 60 Hz are ludicrously expensive. Some people prefer frame rate to pixel density.

      And sure, you might be able to run 100 FPS at 3440×1440 with a 2080 Ti, but a 1080 Ti or lower might not have such an easy time. (And when I got my 1080 Ti it took me a while to convince myself to actually click the order button — I could swing the expense, sure, but justifying spending it on a GPU took some doing…)

      • Litzner
      • 7 months ago

      If you are into high end, high refresh rate gaming (240hz) you still want 1080p, not even 1440p… 1440p+ is solidly in casual territory when it comes to competitive gaming, sorry if I sound like an elitist πŸ˜›

        • Srsly_Bro
        • 7 months ago

        But CS:GO matchmaking servers iirc are 64-tick, and you have to play community or Faceit servers to get the higher tick rate that takes advantage of a high refresh display. I'm only at 1440p 144 Hz, and 240 Hz at 1080p is superior for competitive gaming. It's not elitist; it's factual.

        Nice post, bro.

      • Zizy
      • 7 months ago

      Ultrawide? Yuck. I still use trusty old HP ZR24w because they stopped making sane aspect ratio screens.
      There are literally no monitors from 2017 on that have higher resolution and haven’t regressed to 16:9 or worse. The only strict upgrade that is possible (without considering AIOs) is some ancient 2560×1600 thingy.

      Sure, I have 3 of those screens, but that is beside the point. I dread the day when one of them gives up the ghost again and I have to replace them all. Yet I would love to get some of the new goodies like high and adaptive refresh rates.

        • Usacomp2k3
        • 7 months ago

        I would put my 2 @ 3440×1440 monitors against your 3 @ 2560×1600 from a usability standpoint any day.

          • cygnus1
          • 7 months ago

          Those HPs aren’t even 2560×1600, their 1920×1200 but aka 16×10 aspect ratio. That’s what the complaint was really about, saying 16×10 is better than 16×9.

          I recently got a pair of 3440×1440 monitors at work, and for once I count my work setup as better than my home setup. If I wasn’t planning for a kid, new 3440×1440 high refresh monitors for home would very much be on my shopping list.

      • derFunkenstein
      • 7 months ago

      If “I would never want something and therefore I don’t see why someone else would, either” isn’t code for “the universe revolves around me,” I don’t know what is.

      • Srsly_Bro
      • 7 months ago

      100fps, you in 2010 bro? Nice troll but the 100fps gave it away.

      • Krogoth
      • 7 months ago

      There aren’t many GPUs that have the proper bandwidth over DP for 4K@120FPS and beyond without compromises (color space). Never mind the fact that not even the mighty Titan RTX can consistently deliver 4K@120FPS and beyond without making compromises on visual fidelity on newer titles.

      Perhaps by the time Turing’s successor comes around that threshold will become obtainable and DP 1.4a capable GPUs are more commonplace.
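
      A minimal back-of-the-envelope sketch of that bandwidth math, assuming DP 1.4's HBR3 payload rate of 25.92 Gbps and ignoring blanking overhead (which only makes the squeeze worse in practice):

```python
# Back-of-the-envelope DisplayPort bandwidth check (ignores blanking overhead).
HBR3_PAYLOAD_GBPS = 25.92  # DP 1.4: 4 lanes x 8.1 Gbps x 0.8 (8b/10b encoding)

def video_gbps(width, height, refresh_hz, bits_per_pixel):
    """Raw pixel data rate in Gbps for an uncompressed video stream."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

for bpp, label in [(24, "8-bit RGB"), (30, "10-bit RGB")]:
    rate = video_gbps(3840, 2160, 120, bpp)
    verdict = "fits" if rate <= HBR3_PAYLOAD_GBPS else "does NOT fit"
    print(f"4K@120Hz {label}: {rate:.1f} Gbps -> {verdict} in DP 1.4")
```

      The 10-bit stream overshoots the link on its own (about 29.9 vs. 25.92 Gbps), which is why 4K high-refresh panels drop to 8-bit color or chroma subsampling unless DSC is available.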

      • superjawes
      • 7 months ago

      "I'm like, comeon."

      What's a "comeon"? Is that a new Eevee evolution?

        • techguy
        • 7 months ago

        Guess you guys have never watched South Park.

          • Srsly_Bro
          • 7 months ago

          +1 for reference

          • derFunkenstein
          • 7 months ago

          Proudly so

      • Bauxite
      • 7 months ago

      I dropped from 4k to 1440p UW due to better quality panels and better frames.

      • Chrispy_
      • 7 months ago

      Ultrawide in 2019 is still a mixed bag in terms of proper support. It’s certainly better than it used to be but as an ex-ultrawide owner who sold his due to patchy game compatibility with 21:9 in 2017, I’m not really feeling the love.

      I always have a 21:9 custom resolution on my machines so that I can run as an ultrawide on my 4K and 1440p displays, and very few games actually feel ‘better’ in ultrawide. My non-scientific, non-exhaustive list of gaming at ultrawide resolutions leads me to conclude that:

      • 60% of my games support ultrawide, and although there are no graphical errors, many aspects of ultrawide are afterthoughts: the UI is 16:9, secondary elements (maps, overlays, loading screens, etc.) are 16:9, and in several cases the horizontal FOV is locked to prevent unfair advantages/exploits, meaning that vertical FOV is actually cropped at an ultrawide resolution.

      • 20% of my games are incompatible with ultrawide. They may be playable, but there will be bugs such as things outside the 16:9 boundary being unselectable or untargetable, or object and enemy pop-in because the game engine is optimised to save rendering/compute power on stuff that would normally be offscreen on a 16:9 display.

      • 10% of my games truly benefit from ultrawide in every way. The developer understands how to fully exploit a 21:9 display, and the 16:9 experience is the cropped and reduced one.

      • 10% of my games simply don't run at ultrawide resolutions. If the game is forced into an ultrawide resolution, it will either be stretched to the wrong aspect ratio or display black borders at the sides.

      Wake me up in 2025 when 21:9 has flawless support from the media and gaming industry. Until then, I'm done and will stick to what the game devs are designing and testing on.

        • DeadOfKnight
        • 7 months ago

        It depends on the user, really. If someone plays a single game 99% of the time, and that game has flawless support and actually gives an advantage in field of view, they’re going to find it to be absolutely necessary. If you play roulette with your steam library and play for an hour or two a few nights a week and half the time it’s broken or requires some workaround, you’re not going to be as impressed. I’m kind of in between.

        I’ve been playing a lot of Diablo 3, and it’s a total game changer.
        You can actually see things that would be half a screen away at 16:9. If I play something else, I always expect to have to go to PCGW and find fixes. I think it’s a lot more worth it if you play a lot of modern games, but if you spend a lot more of your time playing old favorites, it can be more frustrating. Personally, I actually kind of enjoy “hacking” games to max out the visuals.

          • Bauxite
          • 7 months ago

          UW is really great for MMOs, RPGs and similar games where it is handy to have maps, inventory, etc up most of the time.

          • Chrispy_
          • 7 months ago

          Yeah, D3 was one of the games where it did provide a decent advantage in game but I would classify that in the 20% of “incompatible” with horrible bugs in the menu, and range issue bugs caused by being allowed to mouse-over, target, and interact with objects/enemies that are outside the normal 16:9 region.

          https://i.imgur.com/VaNSp3H.jpg

          I worked around and tolerated the problems of D3 in 21:9 because the extra view it provided was an advantage (maybe even a cheat?), but it's clearly not a game that was designed for aspects wider than 16:9.

        • Bauxite
        • 7 months ago

        Yeah, it's game-dependent, but it is getting better. I remember when 16:10 (now nerfed to 16:9, thanks TVs) was the "new" widescreen ratio and some FPS shooters would not let you change the FOV enough in multiplayer because it was "unfair" to 4:3 and 5:4 players. (Piss on you, Battlefield 2142.)

        Also a 34″ UW turns into a 27″ 16:9 if you want it to, same black border concept as running 21:9 on 4k monitors.

        The fast 4K panels still kinda suck, and they're still limited at higher framerates until new standards arrive (granted, anything 100 Hz+ is pretty damn smooth). I've put them side by side, and the best fast UW beats the best fast 4K in image quality, for half the price.

        The GPUs don’t always keep up very well at 4k either, not even the best.

        If you only care about media obviously the best panels are 4k, but they are also inside larger TVs and 60hz max for now πŸ˜‰

      • DeadOfKnight
      • 7 months ago

      Besides the ignorance about how much money people can spend, your comments are wrong. A great experience at 3440×1440 doesn't translate to a great experience at 4K: that's 4.9 vs. 8.3 million pixels, so you're pushing only 60% as many pixels as 4K. Granted, you can still see the benefits with a 100 Hz monitor, but that would be true for 1440p/144 Hz as well, invalidating that argument.

      99.99% of games are old games with low-resolution textures that don't look much better on a higher-resolution display. As many of them can exceed 144 FPS on a mid-range card, they would benefit a lot more from 144 Hz. Furthermore, you can get two of those monitors for what your one cost when it first came out. Two screens >> ultrawide, if you have the desk space.
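
      A minimal sketch of that pixel arithmetic, for anyone who wants to check it:

```python
# Pixel-count comparison: 3440x1440 ultrawide versus 4K UHD.
uw = 3440 * 1440    # 4,953,600 pixels (~4.9 million)
uhd = 3840 * 2160   # 8,294,400 pixels (~8.3 million)
print(f"Ultrawide pushes {uw / uhd:.0%} of 4K's pixels")  # -> 60%
```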

      • DavidC1
      • 7 months ago

      I definitely don’t think you are “reprehensible as auxy thinks, but you are nowhere near the average gamer.

        • DeadOfKnight
        • 7 months ago

        Yeah, flexing your disposable income is kind of lame, but really, anyone with a high-end system in their forum signature is low-key doing the same thing. This guy is taking it to a new level by shaming people who aren't doing the same thing. I've clearly got a better setup than even he has, but I'm not going to tell people that anything less is crap. If my financial situation were any worse than it is, I would have nothing to brag about. It's probably insecurity; by that I mean he's probably doing it to make himself feel better about spending that much money.

          • Srsly_Bro
          • 7 months ago

          Lol tech “shaming”

          Toughen up and reduce snowflake episodes. Many of your problems will disappear.

          He "shamed" my tech. Maybe he worked hard to earn what he purchased. He prioritizes things differently than you. Stop crying when someone decides to allocate money differently than you do. Now you want to shame him for being proud of what he has. Get a life, bro.

            • DeadOfKnight
            • 7 months ago

            Hah. As if there were any emotion in my response. Nice attempt at trolling me “bro”.

            • K-L-Waster
            • 7 months ago

            Incorrect.

            He’s calling Techguy out for implying that anyone who doesn’t have a 2080 Ti is “console” — aka “peasant.”

            But I’m not surprised the distinction is beyond you.

            • Srsly_Bro
            • 7 months ago

            Have a seat and tell me how the bad man made you feel.

            • Spunjji
            • 7 months ago

            You should try moving away from your reliance on strawmen. It’s like arguing on hard mode – people respect you more, and you don’t look so filthy casual.

            • sweatshopking
            • 7 months ago

            "Maybe he worked hard to earn what he purchased"

            haha, pretending that pay is in any way related to your level of "hard work". good one

            • Srsly_Bro
            • 7 months ago

            Fair point. Wasn’t to be taken in that context so much as consistently working and saving to purchase something.

            #stillmybro

            • steelcity_ballin
            • 7 months ago

            His alleged wealth was never the issue. His manner of flaunting it is not only tacky (bragging on a small tech forum? Come on…), but speaks to the lack of wealth he likely has. Money talks, wealth whispers. Further, it’s one thing to brag about your bankroll, it’s another to talk down to others who you assume aren’t in a similar position. It’s just trashy behavior honestly.

          • derFunkenstein
          • 7 months ago

          What if the high-end system in my forum signature is Ivy Bridge era? That guy’s Radeon 7870 was high-end in mid 2012. πŸ˜†

            • Srsly_Bro
            • 7 months ago

            Iirc the 7950, 7970, and 7990 were high-end. The 7870 was selling for around $200 then. I paid around $330 for my MSI 7950 TF3.

            • derFunkenstein
            • 7 months ago

            prolly so.

            • Srsly_Bro
            • 7 months ago

            TBF there was a big price drop and I think the 7870 was originally around $350 but that was closer to the launch date which was late 2011?? I got my card around November 2012 and that’s after the prices were reduced.

            • Mr Bill
            • 7 months ago

            "...7870 was selling for around $200 then"

            Yep, what I paid for my XFX.

            • Redocbew
            • 7 months ago

            That means they used to be one of the cool kids, but now they’re just sad and lonely.

      • floodo1
      • 7 months ago

      Big difference between 100 Hz and 144 Hz.

      • NovusBogus
      • 7 months ago

      Meh. If you ask me everything 16:9 is peasant grade. That it’s impossible to get a laptop in 16:10 or 5:4 anymore irritates me almost as much as Quadro’s weak gaming performance.

        • Krogoth
        • 7 months ago

        16:10 is professional-tier only because of simple economics, despite the fact that it was the first digital widescreen spec and predates 16:9 by over a decade.

        Panel manufacturers use the same tools and molds for both HDTVs and computer monitors. 16:9 came about because of over-the-air bandwidth limitations in the HDTV spec. That's why 16:9 monitors are commonplace and much cheaper than their 16:10 brethren.

      • albundy
      • 7 months ago

      i agree with you, but alas, i have a feeling that the big tv market will eventually have more to offer with hdmi 2.1 4k 120hz than these extremely expensive tiny pc monitors available today. i jumped on the large tv bandwagon last year when i got a samsung qled 82-inch freesync tv. the max i can do is 1440p 120hz, and it's incredible! i made a decision that day to never sit at my desk ever again.

      • f0d
      • 7 months ago

      good for you mate, but you are obviously bragging about why isn't everyone as rich as you.
      i don't think i have spent the cost of a 2080 Ti on my last 3 or so whole boxes.

      it's great that you can afford that stuff (i would too if a 2080 Ti wasn't almost a whole month's worth of pay), but the way you worded that post is serious elitist, i'm-better-than-everyone material.

        • Srsly_Bro
        • 7 months ago

        $2000 makes someone rich?

          • f0d
          • 7 months ago

          if you can afford that much disposable income for just a graphics card…… yes.
          nothing wrong with being well off, you just don't have to be a douche about it.
          it's like:

          I don’t understand the abundance or persistence of high-end fords available today. fords are solidly in horse drawn cart territory. If you have a license, surely you can go ferrari or lamborghini by now? I don’t even bother checking speed limits in my ferrari 812 superfast. it’s just not an issue. Sorry if I sound like an elitist, but cereal. I’m like, comeon.

            • techguy
            • 7 months ago

            People are still getting their panties in a twist over this?

            Let me ask you all something:

            if you heard the words “but cereal. I’m like, comeon.” in a conversation, would you think the person was:
            a) being serious
            b) being facetious

            • Redocbew
            • 7 months ago

            c) so hipster they don’t even care

            • Spunjji
            • 7 months ago

            This is 2019. If you're still assuming every obviously dumb thing people say is being said to be obviously dumb, well, all I can say is that's a nice world and I'd love to go back to it :/

      • chuckula
      • 7 months ago

      I don’t really care that much about this thread but there’s no way I’m letting a flame fest like this happen without at least a token post on my part.

        • Mr Bill
        • 7 months ago

        +3

        • Redocbew
        • 7 months ago

        I am cancelling your involvement in the thread. Pray you are not canceled any further.
