The TR Podcast 22: A product opportunity, speedy video encodes, and bringing SLI up to snuff

Date: Oct 25, 2008

Time: 1:17:22

Hosted by Jordan Drake

Co-Hosts: Scott Wasson, Geoff Gasior, Cyril Kowaliski

Listen now:
Download: MP3 (70.9MB) | M4A (57.4MB)

Subscribe with RSS | Subscribe with iTunes

Sponsored by TheJewelStore.com, your online source for unbeatable diamond, gemstone and wedding jewelry with expert personal service.

 

Show notes

We have a nice cross-section of topics for you this week. We kick off this episode of the podcast with a question from an inebriated reader, then we delve behind the scenes of our new, overhauled system guide. After a look at the “product opportunity” AMD filled with its Radeon HD 4830 graphics card, we go over Seagate’s new 1.5TB monster hard drive, and we chat about two new goodies awaiting GeForce users: the upcoming ForceWare 180 driver release and Elemental’s Badaboom video transcoder. The episode closes with a discussion of the latest games, where Geoff explains what makes Dead Space so scary and Scott gives his impressions of Far Cry 2.

As a reminder, you can still enter our contest for a chance to win an Acer Aspire One netbook. Head over to the giveaway post to enter. We’ll be taking names until October 30. (Once again, thanks to NCIX for providing the Aspire One that will end up in the hands of a lucky podcast listener.)

Send in listener mail, and we’ll answer on the podcast. – jdrake@techreport.com

Listener mail:

Plugging the hole (0:04:23) – Anonymous

“Scott,

Random thought.

Back in the C2D launch days, there was discussion about the fact that on 1066MHz FSB CPUs you only really needed DDR2 667MHz memory to plug the FSB bandwidth. Now, granted, 1333FSB and 1600FSB would theoretically up the size of the “hole to plug” with memory bandwidth, but this idea breaks down with i7 and its non-FSB-related communication path to the CPU. But the GPU is now the limited one… bandwidth-wise.

Bla.

Given the interface speed for the memory controller, and the dual/triple channel chipsets coming, how does that conceptualization change? I’d love for you to think on this as you do your/the i7 review.

All the best. *hic*”
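For anyone who wants to put rough numbers on the reader’s “hole to plug,” here’s a quick back-of-the-envelope sketch using theoretical peak bandwidth figures only (our own illustration, not anything discussed on the show):

```python
# Rough peak-bandwidth comparison: Core 2 front-side bus vs. dual-channel DDR2.
# All figures are theoretical maxima, for illustration only.

def fsb_bandwidth_gbps(transfer_rate_mts, bus_width_bits=64):
    """Peak FSB bandwidth in GB/s for a given transfer rate (MT/s) and bus width."""
    return transfer_rate_mts * 1e6 * (bus_width_bits / 8) / 1e9

def ddr2_dual_channel_gbps(data_rate_mts):
    """Peak dual-channel DDR2 bandwidth in GB/s (two 64-bit channels)."""
    return 2 * data_rate_mts * 1e6 * 8 / 1e9

for fsb in (1066, 1333, 1600):
    print(f"{fsb} MT/s FSB:            {fsb_bandwidth_gbps(fsb):.1f} GB/s")

for ddr2 in (667, 800):
    print(f"Dual-channel DDR2-{ddr2}:  {ddr2_dual_channel_gbps(ddr2):.1f} GB/s")

# A 1066 MT/s FSB tops out around 8.5 GB/s, already below dual-channel DDR2-667's
# ~10.7 GB/s, which is why faster memory bought little on Core 2 systems. With the
# i7's on-die memory controller, the FSB ceiling disappears entirely.
```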

Tech discussion:

    TR’s fall 2008 system guide (0:10:42)- Read more

    AMD’s Radeon HD 4830 graphics processor (0:23:06)- Read more | Update

    Seagate’s Barracuda 7200.11 1.5TB hard drive (0:32:57)- Read more

    Badaboom 1.0 uses Nvidia GPUs to transcode video (0:35:33)- Read more

    Nvidia spills the beans on ‘Big Bang II’ ForceWare release (0:47:13)- Read more

    Geoff talks about Dead Space and Scott shares his thoughts on Far Cry 2 (0:58:27)

That’s all, folks! Check back on November 1 for the next TR podcast.

Comments closed
    • Darkmage
    • 11 years ago

    Once again, great job on the podcast. What I particularly enjoyed from a purely selfish perspective, is how you followed discussions of a “suitable for HTPCs” 1.5 TB hard drive with a discussion of video encoding with Badaboom. This naturally leads me to visions of dropping that 1.5 TBs of goodness into my HTPC and then ripping my entire DVD collection to AVI and putting it onto that disk. A little work keeping it organized and a plug-in for MCE2005 and I’ve got video-on-demand for a kick-butt experience. Brought a smile to me face, it did.

    I get the impression that TR hasn’t quite decided what to do with Badaboom yet. Will it become part of your testing suite? Will TR do an in-depth review of Badaboom? Can you nail an interview with the head of Badaboom development?

    As an ATI/AMD fanboy (rah rah, drool drool, flame flame) I am kind of SOL when it comes to Badaboom. But I am curious if you guys have looked at the AVIVO video encoding engine that ships with the Catalyst suite. Does it use the video card hardware in a similar way? Does it perform as quickly on comparable hardware from the Red Team? Do different codecs affect the performance significantly? Will someone finally have a tool that will properly recode from a high definition DVR-MS file to an XVID AVI?

    • Darkmage
    • 11 years ago

    Regarding Radeon multi-monitor gaming: I’m using a single 4850 with two monitors and I would like to share a couple of observations regarding gaming on two monitors in an AMD environment.

    I’ve experienced the phenomenon of having the non-primary monitor do weird things (blank screens, big icons, etc.) when I fire up a full-screen game on the primary monitor. I’ve come to the conclusion that this happens when the game is set to a different resolution than the secondary monitor. I run both monitors at a fairly low resolution and set every game I play to a matching resolution. When I start the game, the secondary monitor doesn’t even flicker.

    Which, by the way, is incredibly useful for things like Grand Theft Auto, where you can place a JPG with a map of the hidden packages or walkthrough in the secondary monitor as a reference.

    Usually when I install and play a game for the first time, the default resolution is something different from my desktop. During that first game start, the secondary monitor will change resolutions to match the primary monitor, and I lose all of my icon placement. 🙁 Once I set the game to run at the desktop resolution, I’m good to go.

    Anyhow, given your profession of thrashing graphics cards to within an inch of their lives, I’m not sure how useful this information is. Hell, I don’t even know if you can duplicate my findings. But I thought you might like to know.

    • DrDillyBar
    • 11 years ago

    Good discussion.
    I enjoy hearing more about CPUs on the podcast, as there have been a lot of graphics card reviews lately and the topic is getting a little stale.
    Then there’s Apple. I think I share Damage’s general opinion when it comes to the effects of the RDF. Sure, Apple’s machines are well built at this point, but I just can’t forget the days of the PPC and heavily customized benchmarks, when Apple used non-Intel compilers in tests just to prove the Mac was faster at Photoshop. Even in daily shopping and discussions with friends, I frequently hear people talking up the power of their latest Mac, saying it’s all OS X and its ability to use 64 bits that makes it clobber Vista. Misinformation at its finest. Now they’re using Intel. I enjoy pointing out that that is the difference, not so much the OS (which itself is really a variant on Unix).
    Can’t wait for the i7 review!

    • SecretMaster
    • 11 years ago

    I really love how throughout the discussion about Core i7 Cyril (and I think Scott a few times) had to keep correcting themselves as to when the release date is. They’d say what their insider information is and then quickly switch to what the official spiel from Intel is.

    I’m eagerly awaiting Geoff’s secret review of the Nehalem chipset/motherboard.

    I think Geoff wins the coolest voice due to Skype distortion/problems for this podcast. At one point it almost reminded me of Darth Vader.

    • MaxTheLimit
    • 11 years ago

    You do get to shoot the limbs off trees in Far Cry 2, but I haven’t been able to knock over the larger trees… two RPGs and still standing, but you can knock a bunch of stuff off the big trees.

    Dead Space is the next game I’m buying. I’ve already pre-bought Fallout 3, so I look forward to getting scared by Dead Space.

    I actually don’t mind the Core i7 economy boards not coming out for a while. I still don’t see much reason to move beyond a basic quad-core or a decent dual-core CPU… there isn’t much I do that really makes it important.

      • eitje
      • 11 years ago

      Heck, I still use Dothan P-Ms in my primary systems!

    • Prototyped
    • 11 years ago

    Re the Core i7 graphics, it isn’t like the PCI Express 2.0 x16 links are completely used. The QuickPath link for Core i7 is wide enough to carry that bandwidth to the memory controller, so there’s really not much to worry about. A single PCI Express 2.0 x16 link provides 500 MB/s x 16 = 8 GB/s of bandwidth. If graphics cards actually transferred this much, current systems would be completely starved for bandwidth, given that two channels of DDR2-800 provide 12.8 GB/s of bandwidth as a theoretical maximum, shared by the processor and all peripheral DMA, including video cards. As it happens, textures are transferred in bulk at scene changes or level loads, and beyond this during actual gameplay, the system memory bandwidth utilization by graphics accelerators is lower as they work out of their own local on-card texture memory.

    For reference, Core i7 920 and 940 will have 4.8 GT/s QuickPath links with a 16-bit data path, which comes to 9.6 GB/s in each direction, which you might be able to tell can carry an entire PCI Express 2.0 x16 graphics card’s demand. Since the memory controller is on the processor itself, this 9.6 GB/s isn’t shared by the processor’s demand for memory bandwidth. (http://www.hardwaresecrets.com/fullimage.php?image=12403)

    Re Big Bang II: The lack of multi-monitor support with SLI is totally lame, and it’s good that it’s changing. The suckitude affects not just Windows, either. (http://us.download.nvidia.com/XFree86/Linux-x86/177.80/README/chapter-25.html)
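    For those who want to check the arithmetic above, here’s a quick sanity-check sketch using the same theoretical peak figures (our numbers laid out in code, not Prototyped’s):

    ```python
    # Peak-bandwidth figures referenced in the comment above (theoretical maxima).

    pcie2_per_lane_gbps = 0.5                    # PCI Express 2.0: 500 MB/s per lane, per direction
    pcie2_x16 = 16 * pcie2_per_lane_gbps         # 8.0 GB/s each direction for an x16 slot

    ddr2_800_dual_channel = 2 * 800e6 * 8 / 1e9  # 12.8 GB/s, shared by CPU and all peripheral DMA

    # QuickPath on Core i7 920/940: 4.8 GT/s over a 16-bit (2-byte) data path, per direction.
    qpi_per_direction = 4.8 * 2                  # 9.6 GB/s each direction

    print(f"PCIe 2.0 x16:           {pcie2_x16:.1f} GB/s per direction")
    print(f"Dual-channel DDR2-800:  {ddr2_800_dual_channel:.1f} GB/s (shared)")
    print(f"QPI 4.8 GT/s, 16-bit:   {qpi_per_direction:.1f} GB/s per direction")

    # The QPI link alone (9.6 GB/s) exceeds a fully loaded PCIe 2.0 x16 slot's 8 GB/s,
    # and it no longer competes with the CPU's own memory traffic.
    ```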

    • Hance
    • 11 years ago

    Don’t bother entering the contest for the laptop. It’s mine, all mine, I say!

    • ssidbroadcast
    • 11 years ago

    [1 hour 13 min] Scott, you think the PS3 *is* the baseline between the two consoles?? It might be a technically more difficult platform to develop for, but it certainly isn’t technically…

      • Damage
      • 11 years ago

      I said “PS3 baseline” very much in passing, but yes, it was intentional. IMO, the Xbox 360’s GPU is superior to the PS3’s, and I don’t think Cell has proven it’s able to make up for that fact. Cell may eventually have its uses, but right now, the PS3 does seem to be the lowest common denominator visually and in terms of traditional CPU power, too.

        • Nitrodist
        • 11 years ago

        Which one(s) do you own, Damage?

          • Damage
          • 11 years ago

          Wii. 🙂

        • ssidbroadcast
        • 11 years ago

        Fair enough. I guess since both consoles get ports of the same game, the differences are mostly academic, although I’ll add that I don’t think the PS3 is getting a fair shake from developers. While most AAA titles seem to have performance parity between the two consoles, there have been others with really poor optimization for the PS3 (compare: …)
