Friday Shortbread

Eight is Enough

  1. X-bit labs: Apple may become a viable chip designer – analysts
  2. Michael Dell: HP's PC strategy change is a bad move – Mashable
  3. Intel CPUs overclocked: Sandy easy, Ivy easier, Haswell easiest – VR-Zone
  4. TC Magazine: Kingmax going up to 1TB with new SSD line
  5. iFixit’s iPhone 4S teardown
  6. Building Windows 8: The Windows 8 Task Manager
  7. PC infected? Blame yourself, Microsoft report concludes – Network World
  8. AMD Catalyst Application Profile 11.9 CAP2


  1. Dennis Ritchie: The giant whose shoulders we stand on – Ars Technica
  2. Reuters: AOL CEO pitches investors on Yahoo! deal: sources
  3. Mashable: Hulu is no longer for sale
  4. Google announces third quarter 2011 financial results
  5. BlackBerry service update (video)
  6. TechRadar has 7 ways to save Microsoft
  7. X-bit labs: ORNL’s Titan supercomputer to deliver 10 – 20 PetaFLOPS performance

    and Corsair unveils high-speed 8GB DDR memory modules, 32GB kits

  8. TC Magazine: OCZ develops Deneva 2 and Intrepid SSDs with mSATA connectivity

    and new Buffalo external drive can switch capacities, from 1TB to 3TB

    and PQI unveils the H552V portable hard drive

  9. VR-Zone: Asus launches USB 3.0 speed booster, UASP support for ASMedia
  10. TC Magazine: Iiyama coming out with the ProLite X2377HDS IPS-based monitor

    and SteelSeries and Gunnar Optiks intro the Desmo series eyewear for gamers

    and Bose debuts the OE2 and OE2i headphones

  11. HP's new blog: Input | Output
  12. Dealzon’s deals: $50 coupon for 17.3” Alienware M17x i7-2670QM / GeForce GTX 560M, $32 coupon for 14” Lenovo ThinkPad Edge E420s i5-2430M, $150 coupon for 14” Lenovo G470 i3-2330M, and $105 off 256GB Samsung 2.5” SSD


  1. C|Net reports senators aim to cut through 4G baloney
  2. Ars Technica: Android Ice Cream Sandwich event moved to October 19 in Hong Kong
  3. The Inquirer reports Dell will launch Windows 8 devices
  4. ocaholic reviews Toshiba AT100 tablet
  5. This is my next: Siri says some weird things
  6. Engadget’s HTC Sensation XE with Beats Audio review
  7. Engadget: Duke Nukem 3D is coming to Android, old rope shares soar

Software and gaming

  1. MakeUseOf: Linux just got better with the Fedora 16 beta distribution
  2. Opera Desktop Team: Introducing Opera 12 alpha
  3. Forbes: U.S. retail video game sales fell 6% in September
  4. Techgage’s look at Beamdog’s PC game digital distribution service
  5. Gamasutra’s interview: Tripwire’s Alan Wilson takes Red Orchestra 2 back to square one
  6. Battleblog #13: Multiplayer map reveal, from the streets of Paris to the outskirts of Tehran
  7. Battlefield 3 – Destruction gameplay
  8. Syndicate SP gameplay video – “Executive Search”
  9. Joystiq: ‘Infinity Blade FX’ brings the iOS hit to arcades
  10. TR alum Joel Hruska on id’s Rage: Flawed, flat, but occasionally fun
  11. Cracked: 5 real skills video games have secretly been teaching us
  12. Steam’s THQ week – day 4

Systems and storage

  1. Benchmark Reviews on Battlefield 3: Desktop PC platform recharged
  2. ThinkComputers reviews 15.6″ Lenovo IdeaPad Y570
  3. TweakTown’s AMD FX-8150 vs. Core i7-2600k CrossFireX HD 6970 x3 head-to-head
  4. Guru3D’s AMD FX 8150 – 8120 – 6100 and 4100 performance review
  5. KitGuru Lite examines AMD’s Bulldozer
  6. Hardware Secrets on the HyperTransport bus
  7. PCPer reviews MSI Z68A-GD80 G3
  8. Neoseeker’s Intel Sandy Bridge 4-way motherboards roundup
  9. OCC’s Sapphire A75 Pure Platinum review
  10. KitGuru’s 1TB OCZ RevoDrive Hybrid HDD / SSD review
  11. Legit Reviews on 120GB Corsair Force 3 SSD
  12. Overclockers Online reviews 128GB Patriot Torqx 2 SSD

Multimedia and cooling

  1. VR-Zone’s BenQ EW2420 monitor review
  2. ProClockers review Cooler Master Storm Xornet gaming mouse
  3. VR-Zone’s Microsoft Touch mouse review
  4. Real World Labs on Mad Catz Cyborg R.A.T.7 Albino gaming mouse
  5. Hardware Canucks review Corsair Hydro Series H100
Comments closed
    • PeterD
    • 8 years ago

    Come on, TR, where is the Dennis Ritchie memorial article?

      • Deanjo
      • 8 years ago

      Only if it is done in C syntax.

      • mutarasector
      • 8 years ago

      “Dennis Ritchie: The giant whose shoulders we stand on”

      Agreed. If Jobs was a giant, it was because he, OS X and iOS stood on the shoulders of this REAL giant...

    • OneArmedScissor
    • 8 years ago

    “Intel CPUs overclocked: Sandy easy, Ivy easier, Haswell easiest” – VR-Zone

    “... knowing that the sources were confirming the likelihood of combined multiple CPU and multiple GPU cores on Haswell (i.e. you could have, say, 4 CPU cores and 2 GPU cores, or 2 CPU cores and 3 GPU cores chip), would create possibly the most flexible desktop - and mobile - platform Intel, or anyone else, ever had in one socket.”

    I'm baffled as to why no one is doing this right now, but with something more like a "0.5 core" GPU. Even the lowly Bobcat has its own video decoding unit, so there's no universal need for a powerful GPU. There's that completely GPU-less version of Llano, but why not just cut it down to 40 SPs so that it can run on very low power and still support switchable graphics?

    I think it's great that integrated GPUs are both standardized and rapidly becoming more powerful, but the "more powerful" part really should be an option. Already by Trinity, about half the chip will be the GPU! Now imagine what happens at the next shrink, where they'll just increase the GPU again. The "CPU" will be dwarfed, even though it's still the important part.

    It's like how cache made its way up to about 1/3 of many CPUs, but there were inexpensive alternatives, like Pentiums made with a fraction of the L2, or an Athlon II that just left the L3 off altogether.

    These may be dangerous words, but I don't even care if they don't charge less and just pocket the difference. Computers aren't that expensive, and being able to buy what you want is worth a few extra dollars. But they're not letting what I'd have to assume is the vast majority of people have a choice.

    • dpaus
    • 8 years ago

    Any announcements/coverage from JavaOne?

      • dpaus
      • 8 years ago

      Here’s the news on JavaFX 2.0:

      From the press release: “JavaFX 2.0 provides a web component based on WebKit…” and “Existing Java Swing applications can be updated easily with new JavaFX features such as rich graphics API, media playback and embedded Web content.”

      More on Java 8 and 9 as I hear it...

    • Voldenuit
    • 8 years ago

    “Cracked: 5 real skills video games have secretly been teaching us”

    This made my morning.

    • PeterD
    • 8 years ago

    “Dennis Ritchie: The giant whose shoulders we stand on – Ars Technica”

    Quite true.
    He did Unix.
    And Unix gave Linux.
    And Linux gave… iOS!

      • Hattig
      • 8 years ago

      Not quite.

      Unix gave OpenStep and the BSDs (as well as Linux).
      OpenStep and the BSDs gave us Mac OS X.
      And Mac OS X gave us iOS.

      • Geistbar
      • 8 years ago

      I’d say that relegating his influence to “indirectly gave us iOS” is a great disservice to his effect on modern computing. Creating C, by itself, is enormous. So many modern systems, including, to my knowledge, every noteworthy OS, are made with, or with a derivative of, C*.

      While I personally didn’t enjoy the times I had to use C/C++ (which is undoubtedly due at least in part to me never being good, or even passable, with it), its influence far exceeds allowing Apple to create a mobile-focused OS.

      *I had to make an edit to highlight how horribly comma-intensive that sentence was. Yuck.

        • PeterD
        • 8 years ago

        That’s actually why I mentioned iOS: there has been a lot of hubbub about Steve Jobs’ death, but Jobs wouldn’t have gotten far without Dennis Ritchie.

      • mutarasector
      • 8 years ago

        Actually, he created C, but UNIX was more of a collaborative effort among him, Brian Kernighan, Ken Thompson, Douglas McIlroy, and Joe Ossanna.

        The mark of true genius is that C and UNIX, now the foundation of the vast majority of the computing world, actually began as a >joke<…


        • PeterD
        • 8 years ago

        “but UNIX was more of a collaborative effort”

        That’s true.
        But so are Jobs’ achievements.
        Jobs himself even said openly that team effort is very important to get somewhere.

    • Squeazle
    • 8 years ago

    Cnet 4G link is broken.

      • Ronald
      • 8 years ago

      It works for me.

    • Arclight
    • 8 years ago

    “Benchmark Reviews on Battlefield 3: Desktop PC platform recharged”

    I just hope that next-gen mid-range graphics cards can handle this game at 1080p with 50 fps or more on high settings, because I have a GTX 560 Ti and it was seriously underpowered at that resolution unless I turned all the settings to Low, which defeats the purpose. Hell, even a GTX 580 has problems at 1080p on high settings… SLI/CrossFire will be the way to go this year if you want to play the game the way it’s meant to be played.

      • lilbuddhaman
      • 8 years ago

      SLI/CrossFire won’t help you when you hit those VRAM limits. My two 6870s (1GB) were chugging along at 100+ fps @ 1920×1200, but would drop to the sub-20s because the VRAM was at its limit. Turning on even just 2x MSAA turned it into a constant stutter-fest. This game has become the reason to get 2GB+ cards.

      (and FXAA and MLAA are trash, blur-o-vision techs, hate them both)

        • dashbarron
        • 8 years ago

        I’ve got a 570 SLI setup and I was struggling in the 20s at max settings. The game is brutal, and I can only hope drivers get a little better. I don’t even know if overclocking my CPU will do me any good.

      • Cyco-Dude
      • 8 years ago

      That’s a lot to demand of a mid-range card.

      I kinda scratch my head here… in a multiplayer FPS (I’m assuming you play online), why do you want high resolution or high graphics settings anyway? Back in the day, fps was king – you did whatever you could to get the highest fps possible (125 for Quake 3) to make the game as smooth and chop-free as possible. If that meant lowering the resolution or graphics settings (many people did that regardless, to make targets stand out better), then that’s what you did.

      Furthermore, it was pretty common (for me at least) to play new games at lower resolutions to get playable framerates. That’s one thing I dislike about LCD monitors; they tend to look like crap at non-native resolutions. I’ve got a death-grip on my 1440×900 LCD and Sony Trinitron, man… I’ll take fps over resolution every time (SP games excluded). When the hardware catches up, THEN I’ll go after that stuff.

        • Arclight
        • 8 years ago

        I agree with all you said, and I DID play the beta on Medium settings at a typical laptop resolution just to ensure that my frame rate would stay above 60, but I played on a native 1920×1080 monitor, and ideally I would play at the native resolution.

        I must say that, for first-person shooters, the 4:3 aspect ratio seems superior to 16:9, as are 120 Hz monitors compared to 59/60 Hz ones. But you buy what you can afford; I could only afford a 22″ widescreen monitor with a 60 Hz refresh rate and 2 ms response time (though I think that was a GtG measurement, not BtB).

          • Cyco-Dude
          • 8 years ago

          Depends on the game in question; StarCraft 2 and Enemy Territory: Quake Wars make use of the extra width: you lose no vertical space but gain a little on the sides over 4:3. I still use the LCD as my primary, but I would use the CRT if I had to. It’s mostly just for watching a second StarCraft 2 stream these days (MLG soon, woot!). I got the 1440×900 over the 1680×1050 just to have a lower resolution and make it easier to get playable framerates in newer games. Bleh @ 16:9; don’t get me started…

      • turkeysam
      • 8 years ago

      It was a beta and it wasn’t finished. The graphics engine was very clearly incomplete. All games in beta show similar issues.

        • Arclight
        • 8 years ago

        That applies to bugs, which they actually said they had already fixed before the beta. The build used for the beta was an old, buggy build; they gave it to the public more to test the servers under real-world load and maybe get some suggestions for last-minute changes to the HUD.

        I’m not worried about the bugs; I’m certain they won’t be that extensive in the finished product, BUT some will still exist, and I don’t expect performance to actually increase too much. I wouldn’t mind being wrong in this particular case, though…

    • CuttinHobo
    • 8 years ago

    Re: X-bit labs: Apple may become a viable chip designer – analysts

    Clearly they’ve got their eyes on some piece of this pie (possibly only the piece that they plan on eating), but it seems to me like they would be at a serious disadvantage, patent-wise. I don’t know what IP they gained in their acquisition of PA Semi and whoever else, but I’d expect they would need more acquisitions or big cross-licensing agreements.

    But I could definitely see them throwing cash at AMD for a cross-licensing agreement, and AMD jumping at the chance to enable Apple to dump Intel. That is unless they convince a judge that they invented semiconductors, and have everybody else’s imports blocked. 😉

      • Beelzebubba9
      • 8 years ago

      I think it’s pretty clear Apple won’t be making x86 CPUs – only ARM. The A5 has a whole bunch of custom IP, and I suspect the A6 will have even more. I’m not sure if Apple will design their own ARM cores like Qualcomm does, but there’s still plenty of room for Apple to use their massive R&D budget to advance mobile hardware as they see fit, just like they did with the A5’s GPU.

        • oldog
        • 8 years ago

        And I think Intel will eat their lunch. Apple is used to going after the likes of MS. Intel is not MS. For a huge company they are focused and nimble in this arena.

        I predict that Win8 and a very low-TDP Intel chip mean that, for anything other than a cell phone, Apple’s move into this arena will be quixotic.

        • CuttinHobo
        • 8 years ago

        Right. I don’t think they have any interest in making x86 CPUs – and I’m sure they would rather consolidate everything on ARM (that they design) to break down the compatibility walls between their devices. Besides, AMD couldn’t license that part of their arsenal to anyone no matter how much they wanted to.

        Anyway, designing a CPU has to be a minefield of patents – like the couple of patents VIA is suing Apple over right now, whether for reorganizing operations or for techniques for checking the caches for data. The more basic, non-x86-specific IP is what I’d expect them to potentially license from AMD.
