Imagination Technologies: real-time ray tracing ‘feasible in several years’

I’m still sorting through the vast amounts of interesting new things I saw at CES this year. One item that has so far slipped through the cracks in our coverage is an intriguing new technology the folks at Imagination Technologies were showing.

In its expansive CES suite, alongside demos of a bunch of conventional technologies that you might find in a tablet or a smartphone, Imagination had a little display for "PowerVR Ray Tracing" that said beneath: "The future of realistic graphics."

I immediately found this claim interesting because Intel used to talk pretty bullishly about ray tracing—a complex rendering technique that literally traces individual light rays as they bounce around inside a simulated scene—as the future of real-time graphics and gaming. AMD and Nvidia have generally been more cautious on this topic, talking about ray tracing as one technique among many, not an obvious future destination. Since the cancellation of Larrabee and a shift in focus at Intel, the "ray tracing is the future of real-time graphics" crowd hasn’t had an obvious champion among companies that produce GPUs. Now, it appears a major graphics IP provider, the one whose graphics processors power every single iPhone and iPad, is pushing ray tracing as the future.
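
For those unfamiliar with the technique, the core idea is easy to sketch even if doing it quickly is not. Here's a minimal, purely illustrative Python example of the basic step: cast a ray, test it against a sphere, and shade the hit point against a light. This is textbook toy code, not a representation of Imagination's hardware or its OpenRL API.

```python
# Toy illustration of the basic ray tracing step: intersect one ray with a
# sphere, then shade the hit point against a single light. Pure Python,
# unrelated to Imagination's actual hardware or the OpenRL API.
import math

def sub(a, b): return (a[0]-b[0], a[1]-b[1], a[2]-b[2])
def dot(a, b): return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]
def normalize(v):
    l = math.sqrt(dot(v, v))
    return (v[0]/l, v[1]/l, v[2]/l)

def intersect_sphere(origin, direction, center, radius):
    """Return distance along the ray to the nearest hit, or None."""
    oc = sub(origin, center)
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c          # direction is unit length, so a == 1
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

# One "camera" ray fired into a scene containing a single sphere.
ray_origin = (0.0, 0.0, 0.0)
ray_dir = normalize((0.0, 0.0, -1.0))
sphere_center, sphere_radius = (0.0, 0.0, -5.0), 1.0
light_pos = (5.0, 5.0, 0.0)

t = intersect_sphere(ray_origin, ray_dir, sphere_center, sphere_radius)
if t is not None:
    hit = (ray_origin[0] + t*ray_dir[0],
           ray_origin[1] + t*ray_dir[1],
           ray_origin[2] + t*ray_dir[2])
    normal = normalize(sub(hit, sphere_center))
    to_light = normalize(sub(light_pos, hit))
    brightness = max(0.0, dot(normal, to_light))   # simple Lambert term
    print("hit at t=%.2f, brightness %.2f" % (t, brightness))
else:
    print("ray missed the scene")
```

Real renderers repeat this millions of times per frame, with bounce rays, shadow rays, and acceleration structures on top, which is where the brute-force cost everyone argues about comes from.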

Not only that, but Imagination is actively working on making real-time ray tracing a practical reality.

Above is a shot from the demo the firm had on display at CES. This scene with the airplane surrounded by incredibly complex vegetation was being rendered in near real time by a test chip sitting in a PC under the counter. When the camera’s position was adjusted, the demo system would draw a new frame in maybe a second or less. That’s pretty quick for ray tracing of this quality.

Ever skeptical, I was able to persuade the Imagination rep to show me the guts of the system running the demo.

What you see there beneath the graphics card (which is presumably just being used as a display output) is an engineering board sporting an Imagination test chip. The firm has developed some silicon IP to enable fast ray tracing, and it already has a test chip up and running. As one might expect, the test chip’s performance isn’t anywhere near that of a final product. Imagination claims ray tracing at 1080p and 60 FPS will be "feasible in several years."

In fact, this ray tracing logic appears to be an add-on to the company’s existing PowerVR graphics cores. It’s slated to become available "in a future version of PowerVR," and the firm believes this IP has a "surmountable" cost in terms of silicon area. In other words, Imagination thinks it can persuade its customers to adopt a version of PowerVR with real-time ray tracing logic included. If the company is right about that, we could potentially have tablets capable of producing near-photorealistic real-time visuals within the next few years.

Imagination has even created the cross-platform OpenRL API and released a downloadable SDK to enable the creation of applications that use ray tracing.

I wasn’t able to extract much more information about this new ray-tracing IP or how it works, but I expect to be hearing more from Imagination as this technology makes it further down the development path.  We’ll let you know when we find out more.

Comments closed
    • GrimDanfango
    • 6 years ago

    This story comes up every year or two, and it’s no more “feasible” than it ever has been.

    It’s a question of how you use your resources. Raytracing is only “feasible” in fields where you can justify a 10-100x increase in render times for a modest increase in visual fidelity – i.e., movie visual effects.

    There is no magic-new-algorithm, there is no magic-new-accelerator card. Raytracing requires massive amounts of brute-force calculations. Whatever you can do with raytracing, you can do far quicker using a “trick” method like rasterizing textured triangles to the screen.

    Raytracing will never, ever attain parity with current real-time graphics techniques, because the whole point of those techniques is to mimic the look of true raytracing as closely as possible, while avoiding as much of the massive computational overhead as possible by cutting corners in clever ways.

    Raytracing is the “right” way, rasterized triangles is the “fast” way… the right way will never be as fast as the fast way. However good a raytraced game could look with the technology available at any particular time, I guarantee a real-time graphics engine built on conventional techniques will be able to look exponentially better at a given framerate, given the same level of technology.

    The only place hardware-accelerated raytracing will be a benefit is movie effects, where we already accept the overhead and will gladly welcome a speed boost.

      • BlackStar
      • 6 years ago

      There comes a point where the programming simplicity and increased flexibility that comes with raytracing will offset its performance disadvantage. We are already using shaders that perform localized raytracing for specific effects and this trend will only increase as GPUs become faster.

      What you said is correct: rasterization is a fast first-order approximation of raytracing. The challenge is that effects that are essentially “free” with raytracing cost a lot of development effort *and* visual quality to approximate (e.g. shadows, order-independent transparency, first-order reflections). Even worse, the more realistic the graphics, the more difficult they become to approximate (second-order reflections are all but impossible with rasterization).

      At some point, we won’t be able to create better graphics unless we ditch the approximation and go for the real thing.
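
To make BlackStar's point about shadows concrete: once you are already casting rays, a hard shadow is just one more visibility query, whereas a rasterizer needs shadow maps or stencil volumes to fake the same thing. Below is a rough, self-contained Python sketch of a shadow-ray test; it's toy code, not tied to any real engine or API.

```python
# Illustrative shadow-ray test: from a shaded point, fire one ray toward the
# light; if any object is hit before the light, the point is in shadow.
# Toy code only -- not tied to any particular engine or API.
import math

def sub(a, b): return (a[0]-b[0], a[1]-b[1], a[2]-b[2])
def dot(a, b): return sum(x*y for x, y in zip(a, b))
def length(v): return math.sqrt(dot(v, v))
def normalize(v):
    l = length(v)
    return (v[0]/l, v[1]/l, v[2]/l)

def hit_sphere(origin, direction, center, radius):
    """Distance to the nearest intersection with a sphere, or None."""
    oc = sub(origin, center)
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-4 else None    # small epsilon avoids self-shadowing

def in_shadow(point, light_pos, occluders):
    to_light = sub(light_pos, point)
    dist_to_light = length(to_light)
    direction = normalize(to_light)
    for center, radius in occluders:
        t = hit_sphere(point, direction, center, radius)
        if t is not None and t < dist_to_light:
            return True               # something blocks the light
    return False

occluders = [((0.0, 2.0, 0.0), 1.0)]          # one sphere hanging overhead
print(in_shadow((0.0, 0.0, 0.0), (0.0, 5.0, 0.0), occluders))   # True
print(in_shadow((3.0, 0.0, 0.0), (0.0, 5.0, 0.0), occluders))   # False
```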

    • Airmantharp
    • 6 years ago

    Maybe in five years GOG will be selling globally-illuminated versions of today’s greatest hits πŸ™‚

    (and just maybe they’ll play on your smart TV through a SteamOS plugin!)

    • Bensam123
    • 6 years ago

    Interesting… I wonder if that isn’t just one GPU on the underside of the board, but rather an array of mobile GPUs working in tandem.

    • sschaem
    • 6 years ago

    Ray tracing != photorealistic rendering by default

    What ray tracing brings to the table is a more elegant system, but at the cost of efficiency.
    And in the end, what makes rendering realistic is mainly the lighting and shading model,
    something both rasterizing and raytracing have in common.

    So raytracing simplifies many problems faced by rasterizing, but so far no one seems to have solved the cost associated with this. So no one sees a pure ray-tracing method making sense.

    And this very simple scene running at 1 fps seems to show that not much progress has been made.
    When they show a scene with 20+ of these objects flying around, blowing up, at 30+ fps
    on a <200W chip, they will have something to talk about.

    But AMD and Nvidia are right about mixing technologies. Ray tracing, sphere tracing / distance fields, classic rasterizing, etc. are not to be used in isolation but combined.

    The R9 reminds us that we are still power limited…

    For gaming, this was just an interesting PR stunt.

    At this time, the state of the art is pretty much the old ATI 4870 demo, updated:

    http://raytracey.blogspot.co.nz/2013/10/brigade-3.html

    • UnfriendlyFire
    • 6 years ago

    If ray tracing was available in a game…

    What would the FPS be at 1080p for a quad SLI or CF setup of super-OCed GPUs, assuming other graphics settings are set to high?

      • Erebos
      • 6 years ago

      One frame every few minutes.

        • Klimax
        • 6 years ago

        Or maybe Knights Corner. (IIRC it was shown back then to do real-time ray tracing)

    • shaurz
    • 6 years ago

    This was technology we acquired from Caustic.

      • the
      • 6 years ago

      Indeed. Looks like a variant of the Caustic R2500 board: http://www.extremetech.com/extreme/161074-the-future-of-ray-tracing-reviewed-caustics-r2500-accelerator-finally-moves-us-towards-real-time-ray-tracing

        • Airmantharp
        • 6 years ago

        Thanks for the link – very impressive that they’re getting that much performance out of antique technology. A 90nm processor using DDR2-800 on PCIe 2.0 means that they could get far closer to real-time just using TSMC’s bulk 28nm process and some modern DDR3 or GDDR5.

    • Deanjo
    • 6 years ago

    “real-time ray tracing ‘feasible in several years’”

    Like that hasn’t been muttered over and over by various factions for the last decade. Hell, it was pumped in damn near every Larrabee PR release.

    • R2P2
    • 6 years ago

    “Intel used to talk pretty bullishly about ray tracing”

    I first read that as "bullsh*ttishly".

    • fellix
    • 6 years ago

    Ray tracing has two major problems: compute speed and memory performance.
    The first one is mostly solved by the ever-increasing APU density, but the memory part remains ever so elusive. The moment someone manages to integrate several gigabytes of RAM on-chip with a kilobit-wide parallel access interface, then we can talk about a real next-gen revolution. Think of vertical die stacking.

      • chΒ΅ck
      • 6 years ago

      That’s probably what they’re working on, and it will be ready in a few years.

      • spugm1r3
      • 6 years ago

      http://hybridmemorycube.org/technology.html

      That’s probably a good start.

    • Hattig
    • 6 years ago

    Each time this tech gets closer, the goalposts move.

    720p realtime raytracing in 5 years … but games are 1080p already.
    1080p realtime raytracing in 5 years … but games are 1600p already.
    So when this can do 60fps 1080p raytracing, traditional non-RT methods will be 4K at 120Hz.

    However it has been mentioned that RT could be used for selected objects within a scene, rather than the entire scene itself.

      • Timezone
      • 6 years ago

      I was wondering if they could do realtime ray tracing at, say, 640×480. What would that look like?

    • Corion
    • 6 years ago

    1080p at 60 FPS in “several years”? This doesn’t sound even remotely timely. “In several years” we should all be expecting Ultra HD to be the standard, with even higher resolutions on the horizon. And what about 120Hz+ and 3D visuals? Last time I checked, most companies don’t aim for merely achieving the current level of graphics fidelity “in several years”.

    People care about what the end result looks like, not whether the image was achieved using ray tracing. I’d love to see ray tracing take off, but these projections seem to go against their claims.

      • albundy
      • 6 years ago

      in several years, i expect to have a holographic 100″ tv in 1,000,000p coming from a credit card size device.

        • TAViX
        • 6 years ago

        I also expect they will invent a pill that give an instant erection in 10 seconds, after the 3rd time πŸ˜‰

          • albundy
          • 6 years ago

          giggitty giggitty! alright!

    • kamikaziechameleon
    • 6 years ago

    I think that as display resolutions increase, ray tracing and current rasterization techniques will eventually cross over in terms of performance efficiency.

    Additionally, I think there are abbreviated forms of ray tracing that are more effective or efficient and might leapfrog us to using it sooner. I have a poor background in rendering, but let me recall: the #1 issue with ray tracing is the amount of information generated in the process of bouncing light particles from sources and waiting to see what hits the screen. Reverse ray tracing uses pixels or particles (depending) sent from the screen into the environment and bounces them until they hit a light source, right? Registering colors, etc. along the way. The limited number of particles in reverse leads to the higher efficiency. As we move to 4K, we are seeing that things are getting difficult to render this way with current tech. Perhaps we might see some limitations applied to traditional ray tracing to bridge this gap. Think draw distance or something to that effect.

    Rendering physics in modern tech is so amazing. Then you bring in a little post-processing and BAM, you’ve got Battlefield 4. 🙂
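
What the comment above calls "reverse" ray tracing is, in fact, what most renderers actually do: shoot one ray per pixel from the eye into the scene, rather than tracing photons out of the lights and hoping a few reach the camera. Here's a rough, illustrative Python sketch of just the eye-ray generation step; it's toy code, not any particular renderer.

```python
# Rough sketch of "backward" (eye-ray) tracing: one primary ray per pixel is
# generated at the camera and sent into the scene, instead of tracing photons
# outward from the lights. Illustrative only.
import math

WIDTH, HEIGHT = 8, 6          # tiny "image" to keep the output readable
FOV = math.radians(60.0)      # horizontal field of view

def camera_ray(x, y):
    """Build a unit direction for pixel (x, y) on a simple pinhole camera."""
    aspect = WIDTH / HEIGHT
    # Map pixel centers to [-1, 1] on the image plane.
    px = (2.0 * (x + 0.5) / WIDTH - 1.0) * math.tan(FOV / 2.0) * aspect
    py = (1.0 - 2.0 * (y + 0.5) / HEIGHT) * math.tan(FOV / 2.0)
    norm = math.sqrt(px * px + py * py + 1.0)
    return (px / norm, py / norm, -1.0 / norm)   # camera looks down -z

rays = [camera_ray(x, y) for y in range(HEIGHT) for x in range(WIDTH)]
print(len(rays), "primary rays; middle pixel direction:",
      camera_ray(WIDTH // 2, HEIGHT // 2))
# Each of these rays would then be intersected against the scene and bounced
# toward the lights, with some cutoff (ray depth, distance, etc.) to bound cost.
```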

      • Waco
      • 6 years ago

      Ray tracing scales more poorly with resolution than rasterizing does.

      Going from 1920×1080 to 3840×2160 with ray tracing means you need exactly 4x the speed to produce the same framerate.

      Going from 1920×1080 to 3840×2160 with rasterization means you’ll need a variable amount more speed to maintain the same framerate… but that variable amount is usually far less than 4x.

      This is massively over-simplified but the point is that ray-tracing at higher resolutions directly leads to a proportional increase in the speed required.
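
A back-of-the-envelope way to see Waco's point (illustrative numbers, primary rays only): the ray count is exactly pixels times samples, so quadrupling the pixel count quadruples the work.

```python
# Back-of-the-envelope ray counts: primary rays scale exactly with pixel count.
# Numbers are illustrative; real renderers add bounce rays, shadow rays, etc.,
# which raise the totals but keep the same proportionality.
def primary_rays(width, height, samples_per_pixel=1):
    return width * height * samples_per_pixel

r1080 = primary_rays(1920, 1080)      # 2,073,600 rays per frame
r2160 = primary_rays(3840, 2160)      # 8,294,400 rays per frame
print(r1080, r2160, r2160 / r1080)    # ratio is exactly 4.0
```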

    • hubick
    • 6 years ago

    Oh man, I’ve been waiting for my flying car and jet-pack, but since those don’t really seem to be happening, can we at least make running around in a fully destructible ray-traced environment using a high quality VR headset happen?

    • ssidbroadcast
    • 6 years ago

    That’s no video card! That’s just an ISA soundblaster 16!

      • Concupiscence
      • 6 years ago

      “Why, the fax machine is just a waffle iron with a phone attached!”

    • DragonDaddyBear
    • 6 years ago

    Isn’t ray-tracing used for some CG movies, like Cars? Games are nice and all, but what kind of impact would this have on rendering movies that currently take a server farm, if the work could be done by fewer, comparatively low-powered GPUs in less time?

      • chuckula
      • 6 years ago

      “Isn’t ray-tracing used for some CG movies, like Cars?”

      Correction: it’s used in practically every pre-rendered CG movie you’ve ever seen and has been in use since at least the 1980s (probably even earlier*). This Apollo CG movie is one example (not the first) from the ’80s: https://www.youtube.com/watch?v=b_UqzLBFz4Y

      See also Pixar’s very first short film: https://www.youtube.com/watch?v=Hrnz2pg3YPg

      * Edit: According to Wikipedia: "The first ray tracing algorithm used for rendering was presented by Arthur Appel in 1968."

        • rootbear
        • 6 years ago

        Not quite. Pixar’s Renderman renderer for a long time used the REYES algorithm, which uses local illumination only and no ray tracing. Reflections are faked with reflection maps. It wasn’t until A Bug’s Life that any Pixar film or short used ray tracing and on Bugs it was only used in one or two scenes, using the BMRT renderer, which was ray tracing based, but Renderman compatible. Eventually, Renderman got ray tracing, but it wasn’t until Cars that ray tracing was used extensively. Compared to REYES, ray tracing is really expensive to get the same image quality, but with all the shiny surfaces in Cars, it just wasn’t possible to do it all with reflection maps. Today, global illumination (the general term for techniques like ray tracing) is used all the time, now that computers have gotten fast enough to make it practical.

          • chuckula
          • 6 years ago

          That’s interesting. Of course, even with ray-tracing the next question becomes how many bounces does each ray get in a tradeoff between quality (if it’s even perceptible) and computational power.

            • internetsandman
            • 6 years ago

            For that, I think ideally you would have to look at real-life photons and how much of their energy is either absorbed or reflected based on the material they hit. I don’t know if it would be possible, or if it’s already part of the technology, to reduce the energy or intensity of each beam of light as it bounces off a surface in accordance with how much is absorbed. I imagine that kind of computational complexity would at least double what is currently required for the same time span.

            • rootbear
            • 6 years ago

            There are some rendering algorithms that take energy conservation into account when bouncing photons around. It does cost more and your surface shaders are more complex. But sometimes perfection isn’t the goal. Film making is all about making compelling imagery and if that means faking it a bit and not trying too hard to copy reality, that’s just fine.
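
A crude way to picture the bounce-count tradeoff discussed in this sub-thread (purely illustrative Python with made-up reflectance values): each bounce attenuates the ray's contribution, and recursion stops at a fixed depth or once the contribution becomes negligible, which is why extra bounces show diminishing returns.

```python
# Crude illustration of the bounce-count / energy tradeoff: each bounce
# attenuates the ray's contribution by the surface reflectance, and recursion
# stops at a max depth or once the contribution is negligible.
# The "scene" and numbers are made up purely for illustration.
MAX_DEPTH = 4
MIN_CONTRIBUTION = 0.01

def trace(depth, throughput, reflectance=0.5, emitted=1.0):
    """Return light gathered along one ray path, bouncing up to MAX_DEPTH times."""
    if depth >= MAX_DEPTH or throughput < MIN_CONTRIBUTION:
        return 0.0
    direct = throughput * emitted                 # light picked up at this hit
    # Follow one reflected bounce with attenuated throughput.
    return direct + trace(depth + 1, throughput * reflectance, reflectance, emitted)

for max_depth in (1, 2, 4, 8):
    MAX_DEPTH = max_depth
    print(max_depth, "bounces ->", round(trace(0, 1.0), 4))
# Output shows diminishing returns: extra bounces add less and less light,
# which is why renderers cap depth instead of chasing physical perfection.
```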

          • Erebos
          • 6 years ago

          Actually, Monsters University was the first Pixar film that used global illumination exclusively. There was a related interview in 3D World magazine.

            • rootbear
            • 6 years ago

            Oh, interesting, I hadn’t heard that. That’s pretty impressive that they can now use it everywhere. The important metric for non-realtime rendering seems to be time per frame, which is about two to four hours these days. So basically, once a certain technique is fast enough that it can churn out frames at that rate, then there is less reason not to use it.

      • kamikaziechameleon
      • 6 years ago

      Ray tracing is the de facto standard for pre-rendered video.

      FYI, when working in an environment like Maya, the viewport usually uses a game-style rendering technique until you tell it to render a scene; then it ray traces.

        • Andrew Lauritzen
        • 6 years ago

        Not exactly – only recently have movies started using ray tracing extensively (Arnold, etc). Indeed in Max (and Maya I’m pretty sure) the default renderer is raster-based unless you switch to a ray tracing engine (mental ray, etc).

          • kamikaziechameleon
          • 6 years ago

          You are right! Though I never used the built-in renderer back when I was a teenager and worked with this stuff. I’m a little out of touch, though.

          • Liron
          • 6 years ago

          Not too recently. Even in the 90s, ray tracing was used extensively. Jurassic Park was rendered in Softimage, which uses Mental Ray, Titanic CG scenes were rendered mostly in SI and Lightwave, which had a hybrid but mostly ray tracing engine back then, Jimmy Neutron was also LW.

            • just brew it!
            • 6 years ago

            Yes, it has been around for a while. But back in the day render time was measured in minutes per frame, not frames per second! Early ray-traced movies had entire datacenters dedicated to rendering the frames, and it was still nowhere near real-time!

            • Liron
            • 6 years ago

            I remember that Toy Story was something like 16 hours/frame.

    • willg
    • 6 years ago

    Least realistic looking plane on grass ever

      • Corion
      • 6 years ago

      Challenge accepted

    • Duct Tape Dude
    • 6 years ago

    I can’t help but think how this could very well end up like PhysX… solid concept, proprietary execution, licensed as an exclusive, and ultimately killed by its limited use.

    I do like me some realtime traced rays though. Godspeed, PowerVR.

      • windwalker
      • 6 years ago

      Just like that proprietary, very limitedly licensed x86 ISA.

        • Duct Tape Dude
        • 6 years ago

        Proprietary yes, but there are a gorillian x86 licensees now. ARM even more so.

        PhysX was special because it could have easily been licensed, except that one of the big three manufacturers (nvidia, vs AMD/ATi and Intel) bought it and kept it for itself.

          • Ringofett
          • 6 years ago

          If by “gorillian” you mean currently 3 active players according to Wikipedia: http://en.wikipedia.org/wiki/List_of_x86_manufacturers

          Not that I’ve heard anything about VIA in a while, but perhaps they’re still kicking.

          PhysX’s problem wasn’t licensing, as much as people with an ideological bent might wish it so. Its problem was that the market it hoped existed did not, at least not enough to be worth pursuing, like whatever that company is called that sells those ultra-high-end NICs and whatnot to gamers. Licensed or not, when it ended up being available to everyone with an Nvidia chip (which, without looking at a Steam survey, is presumably about half the gamer market), the potential pool of users was there, if anyone cared.

      • blastdoor
      • 6 years ago

      If a company like Apple is the exclusive licensee, then it will be a big deal.

      If a company like LG is the exclusive licensee, not so much.

      • kamikaziechameleon
      • 6 years ago

      Yeah, well put. Nvidia’s handling of PhysX basically killed it.

      First they limited it to their cards. Then they eliminated the ability to use a dedicated Nvidia card while an AMD card is in the system. Then they started charging 50 percent more for their products.

      • NovusBogus
      • 6 years ago

      I have to agree, raytracing would be huge but nobody’s going to do it if it fragments their user base.

    • chuckula
    • 6 years ago

    Ray-Tracing is the graphics technology OF THE FUTURE!

    (and always will be)

      • windwalker
      • 6 years ago

      That’s what we used to say about VR and biometrics.
      Often it’s not the idea that’s bad but the implementer who is incompetent.

        • Pwnstar
        • 6 years ago

        Used to? Still is.

          • Ringofett
          • 6 years ago

          Current employer uses biometrics for security; very picky facial recognition at the entrance, and fingerprints in the cafe to automatically debit an account we preload. And I almost bought a condo in a new building that used biometrics — several years ago. (Just fingerprints, I believe, for access to the building and again to the gym and pool)

          VR is close, and at least for military purposes it’s very well established in training. And, if it’s ever bloody finished, the F-35’s helmet will have capabilities that make a lot of older sci-fi look primitive — with a side dish of nausea.* Airlines in some cases use VR as well, but I’m not sure how much it’s caught on vs. traditional simulators yet.

          *DoD could probably save itself several billion by just waiting for Oculus Rift to get perfected, or paying them to speed up. But of course, if it doesn’t cost 100x what it should, it’s not good enough for government.

      • just brew it!
      • 6 years ago

      Yeah, kind of like cost-effective nuclear fusion for electric power generation. They’ve been going on about that since I was a kid. And I’m an old fart (let’s just say I’m somewhere north of half a century).

      Now get off my lawn!
