The TR Podcast 96: IDF and Inside the second

The Tech Report Podcast Date: September 18, 2011

Time: 1:42:12

Hosted by Jordan Drake

Co-Hosts: Scott Wasson, Geoff Gasior, Cyril Kowaliski

MP3 (73.7MB) | M4A (100.6MB)

RSS (MP3) | RSS (M4A)
iTunes (MP3) | iTunes (M4A)

Listen now:

Show notes

We’re a slightly beleaguered crew of tech reporters this week; Scott and Geoff have just returned on the red-eye from IDF, Cyril has been cranking away at his Windows 8 blog post, and Jordan is recording from home with tonsillitis (we apologize in advance for the lack of a pop filter and the over-abundant room noise). Nevertheless, we kick things off on this episode with three listener tweets, ranging from API overhead to the best sound card for classical music. Then, it’s straight into IDF as Scott and Geoff dissect the keynote speeches, the product demos, and the sneak peeks at everything from Ivy Bridge to the sexy new ultrabooks. After that, it’s a deep dive into Scott’s mammoth look at why frames per second isn’t necessarily the best measure of graphics performance, from his article, “Inside the second: A new look at game benchmarking.” Finally, Cyril has spent some quality time with Windows 8 and gives us the scoop.

Send in listener mail, and we’ll answer on the podcast.

Follow us on Twitter – Scott, Jordan, Geoff, Cyril, The Tech Report

Listener mail/tweets:

API overhead (0:01:52) – from Michael:

“Hi Scott. John Carmack recently discussed API overhead. Do you think AMD/Nvidia will address this? Or is the current PC software pipeline a necessary evil considering the huge amounts of possible hardware configurations?”

Sound cards for around $100 (0:07:09) – from Rahul:

“Question for podcast. Any sound card recommendations for listening to music (classical mostly) over headphones? Budget $100.”

Bulldozer (0:08:29) – from Daniel:

“What do you guys feel about the Bulldozer architecture with the paired core elements?”

Tech discussion:

    IDF 2011 Paul Otellini keynote (Ivy Bridge and Haswell) – (0:11:00) – Read more

    IDF 2011 Mooly Eden keynote (Ultrabooks) – (0:22:29) – Read more

    Ultra-slim Ultrabooks strut down the catwalk – (0:31:02) – Read more

    Geoff talks about X79 – (0:36:47) – Read more: MSI, MSI UEFI, Gigabyte UEFI

    Cherry MX switches take root in Corsair gaming keyboards – (0:43:23) – Read more

    Intel confirms 25-nm NAND for Cherryville, Hawley Creek SSDs – (0:53:34) – Read more

    AMD demos working 28-nm GPU, Trinity APU – (0:55:36) – Read more

    Inside the second: A new look at game benchmarking – (1:00:31) – Read more

    Windows 8 and the marginalization of geeks – (1:32:37) – Read more

That’s all, folks! We’ll see you on the next episode.

Comments closed
    • Arclight
    • 8 years ago

    So Intel’s intentions for the future are to reduce the power consumption for desktop PCs? Really? That’s why they keep releasing 130W CPUs since what? 2007/2008?

    • Aphasia
    • 8 years ago

    Just wanted to chime in to say that it was a great podcast. Especially with you talking about a new way of looking at benchmarks, and the article earlier in the week.

    I have been listening from the beginning and can only say you are doing better and better work and have really hit a good format for the casts. While I may not have the time to comment and send in questions anymore, I still follow and have listened to every podcast.

      • jdrake
      • 8 years ago

      Thanks for the note!

    • glynor
    • 8 years ago

    Awesome podcast this time around, guys. Keep up the good work!!

    • aggies11
    • 8 years ago

    Fantastic Podcast, a real great listen. The podcast is actually what got me looking at the site (so it works!, if that is your goal).

    With regards to the micro-stutter discussion: any thought to just calculating a “sub-second” framerate? Treat each frame time as an FPS value (1000/frametime), and then use those values to measure how much the sub-second “framerate” varies from the standard FPS value (the average over the second). I’d imagine you can present/organize it almost like you do with min, max, and avg (where max frame time might be an interesting value). It would probably graph really nicely too (a low, flat curve is desirable; any spikes should indicate something the user would experience/notice with their eyes).
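    The calculation the comment describes is simple enough to sketch in a few lines. The frame-time numbers below are made up purely for illustration; the idea is just that two spiky frames can hide inside a healthy-looking one-second average:

    ```python
    # Hypothetical per-frame render times for roughly one second of gameplay, in ms.
    frame_times_ms = [16.7, 16.5, 45.0, 16.6, 16.8, 17.0, 16.4, 44.0, 16.6, 16.7]

    # "Sub-second framerate": each frame time converted to an instantaneous FPS value.
    instant_fps = [1000.0 / t for t in frame_times_ms]

    # Conventional average FPS over the sample: frames rendered / total time taken.
    avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

    worst_fps = min(instant_fps)  # the spikes that the one-second average hides
    best_fps = max(instant_fps)

    print(f"avg: {avg_fps:.1f} fps, min: {worst_fps:.1f} fps, max: {best_fps:.1f} fps")
    ```

    With those numbers the average works out to about 45 fps, yet two individual frames dip to roughly 22 fps, which is exactly the kind of hitching an average-FPS graph would smooth over.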

    For me, I first noticed this playing Oblivion on my 7600 GT. The best-performing driver version had this issue of non-uniform motion (hitching, or micro-stuttering), even though there were no variations in the average frame rate. I needed all the performance I could get with that game, so I just sucked it up. But with all the running in straight lines you do in that game (walking from place to place), it becomes very obvious (a static shot with very slow but consistent motion) if the movement isn’t smooth.

    A motion equivalent (psychologically?) to screen tearing, maybe?

    Keep these podcasts coming; each time I notice one is out, it’s a real treat to listen to 🙂

    • Aussienerd
    • 8 years ago

    Another good one, and nice and long. Love how Cyril asked Scott the questions on GPU measurement just to get him involved.

    Keep up the good work, guys. Almost to the 100 mark.

    • codedivine
    • 8 years ago

    Wow, that was a giant podcast! But it was packed with lots of good info. And thanks for answering my question!

    • odizzido
    • 8 years ago

    I have finally cracked; I have to say it. Scott, you say “measure” really strangely.

    • Pancake
    • 8 years ago

    John Carmack’s complaints have largely been about moving memory around, which is a bit of an issue with his megatextures. Basically, there’s a big cost going from:

    hard disk -> system RAM -> graphics card

    But if you are getting good reuse of what’s in the graphics card’s RAM – textures and geometry – then this isn’t that big a deal. You’re using all that insane on-card processing and memory bandwidth most of the time in the most efficient manner (as NVIDIA and ATI could make it). So, the PC model doesn’t suit him well, and he’s grumbling about it. I’d suggest he look at incorporating more of the strengths of the PC in his games – shader processing, dynamic tessellation of geometry, etc. It’s all very exciting what’s happening with DX11, and I’m looking forward to the next generation of games.

      • Meadows
      • 8 years ago

      I disagree. I finally see why console games do so well in comparison, despite hardware that’s ten times weaker. Software layers are a big obstacle to consider with PC games/software. Incorrect drivers can do a host of things: make sound/video skip while P2P (or even your browser!) is using the NIC, increase videogame input lag to bizarre levels despite apparently normal framerates, and more. You can't just jedi-handwave away the issues on the PC; they're very real. You just don't sense them normally, because the platform has grown so powerful. But have you ever wondered why computer *responsiveness* has been stagnating for almost 10 years now?
