We first heard about Euclideon back in 2011, when the company posted a video of a voxel-based rendering engine designed to enable environments with unlimited detail. This month, the firm made headlines again with a new video—and even bolder promises. Take a look:
Euclideon’s core technology is a scene renderer that uses voxels—basically 3D pixels—in place of the polygons that are fundamental to virtually all conventional 3D graphics. Voxels can be a useful way to represent 3D space, as evidenced by the fact that Nvidia uses voxels in the VXGI global-illumination scheme introduced with its Maxwell GPUs.
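For readers unfamiliar with the term, here’s a minimal sketch of the voxel idea in NumPy. This is our own illustration, not Euclideon’s data format: where a polygon mesh describes surfaces as triangles, a voxel grid samples 3D space into discrete cells, each of which can hold occupancy (and, in a real renderer, color and other attributes).

```python
import numpy as np

# Toy voxel grid: a 64x64x64 boolean occupancy volume. The sphere test
# below is purely illustrative; real engines use sparse structures and
# store far more per cell than a single bit.
N = 64
grid = np.zeros((N, N, N), dtype=bool)

# "Voxelize" a sphere of radius 20 centered in the grid by marking every
# cell whose center lies within the radius.
idx = np.indices((N, N, N))
center = N // 2
dist2 = ((idx - center) ** 2).sum(axis=0)
grid[dist2 <= 20 ** 2] = True

print(grid.sum(), "occupied voxels out of", N ** 3)
```

Note how quickly dense grids grow: even this tiny 64-cell cube has 262,144 cells, which is why practical voxel systems lean on sparse, hierarchical storage.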
The folks at Euclideon have figured out how to use 3D scanners to capture real-world environments as point-cloud data. They can then feed the results into their voxel renderer and create a real-time rendered facsimile of that environment. As you can see in the video above, some of the results are quite striking.
The Australian start-up already offers this technology to businesses for things like geospatial analysis, but next year, it plans to make its entry into gaming. We spoke to Euclideon CEO Bruce Dell to find out more about those plans—and about the software that underpins them.
In a nutshell, Euclideon’s Unlimited Detail technology relies on a search algorithm that “efficiently grabs only one point for every screen pixel.” Scene data is stored in point-cloud form, and it’s streamed dynamically from mechanical storage with apparently negligible load times. Dell demoed this aspect of the technology at the SPAR laser-scanning conference earlier this year. In his demo, he told us a $600 laptop was able to load a 3TB model of the city of Vienna in just 0.8 seconds.
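Euclideon hasn’t published its algorithm, but one way to picture the “one point for every screen pixel” output is a depth-buffered point projection: each screen pixel ends up holding at most one point, the nearest one. The brute-force sketch below (our own, with made-up toy data) visits every point in the cloud—exactly the work Euclideon’s search reportedly avoids—so it illustrates only the result, not the method.

```python
import numpy as np

# Brute-force "one point per pixel": project a random point cloud onto a
# tiny screen and keep only the nearest point at each pixel (a point
# z-buffer). Euclideon's search is claimed to find these points without
# touching the whole cloud; this sketch just shows the per-pixel output.
W, H = 8, 6                              # toy screen resolution
rng = np.random.default_rng(0)
pts = rng.uniform(0, 1, (500, 3))        # x, y in [0,1); z is depth
colors = rng.integers(0, 256, (500, 3))

depth = np.full((H, W), np.inf)
image = np.zeros((H, W, 3), dtype=np.uint8)

for (x, y, z), c in zip(pts, colors):
    px, py = int(x * W), int(y * H)      # orthographic projection
    if z < depth[py, px]:                # keep the closest point per pixel
        depth[py, px] = z
        image[py, px] = c

print(np.isfinite(depth).sum(), "of", W * H, "pixels covered")
```

The appeal of the claimed approach is that the work scales with screen resolution rather than with scene size, which is consistent with Dell’s multi-terabyte-model demos.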
To create such expansive models, Euclideon uses laser-scanned data along with some special sauce that fills in the gaps between scanned points. The special sauce involves “basically reblending and remodulating all the colors with a little bit of artificial intelligence,” and it also handles other missing information. The result is a claimed thousand-fold increase in detail over the raw data from the laser scanner.
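Euclideon’s “special sauce” is proprietary, but a naive stand-in for gap-filling is interpolating colors from nearby scanned samples onto a denser grid. The inverse-distance-weighted sketch below is our own, heavily simplified illustration of the shape of the problem—Dell’s description of “reblending and remodulating all the colors with a little bit of artificial intelligence” implies something far more sophisticated.

```python
import numpy as np

# Naive gap-filling: densify a sparse 1D "scan" of one color channel by
# inverse-distance-weighted interpolation from the k nearest samples.
# Positions and colors are made-up toy data.
scan_pos = np.array([0.0, 0.3, 0.7, 1.0])        # sparse sample positions
scan_col = np.array([10.0, 200.0, 50.0, 120.0])  # one color channel

def fill(positions, k=2, eps=1e-6):
    """Interpolate a color at each dense position from the k nearest scans."""
    out = np.empty_like(positions)
    for i, p in enumerate(positions):
        d = np.abs(scan_pos - p)
        nearest = np.argsort(d)[:k]
        w = 1.0 / (d[nearest] + eps)             # closer samples weigh more
        out[i] = (w * scan_col[nearest]).sum() / w.sum()
    return out

dense = fill(np.linspace(0.0, 1.0, 101))         # ~25x denser than the scan
```

Interpolation like this can only smooth between measurements; a genuine thousand-fold increase in apparent detail would require synthesizing plausible structure the scanner never captured, which is presumably where the “artificial intelligence” comes in.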
This scene from a church includes a lot of complex shapes. Source: Euclideon.
Perhaps the most surprising aspect of Euclideon’s technology is that, for now, it runs only on the CPU. Dell said the current implementation produces 2000×1000 frames at around 32 FPS on a six-core processor. He claims there’s “no reason” the technology can’t be sped up using, for instance, OpenCL on a GPU, but he says there are still “lots of software ways” to improve performance first. Jumping straight to GPU optimization would be “admitting defeat,” in his view.
Two games based on Euclideon’s technology are currently in development. Dell said he hopes they will be out in May of next year, though he conceded that the schedule could slip. Interestingly, he also said that “people will be very surprised when they find out which hardware platform [the games] are on.” Then again, when we asked about hardware requirements, Dell suggested that a six-core CPU will be the “median” requirement, with detail scaling down on lower-end processors. Hmm.
Both of the upcoming Euclideon-powered games will feature “directly imported graphics from the real world,” and they’ll be entirely voxel-based, with no polygons even for animated models. Dell told us that animating voxels is “not the hardest thing in the world,” but Euclideon’s implementation is only about 80% done. That’s why we haven’t seen it demoed yet. “If I were to put that up today, I think people would look at the 20% that was missing,” Dell explained.
Speaking of models, Euclideon offers an interesting alternative to traditional modeling apps for asset creation. While artists can still use 3ds Max and Maya, they also have the option to make physical models using clay or putty. They can then scan in those models with laser scanners. Dell suggested that creating a physical model could be faster in some cases than modeling one in software.
Euclideon’s technology can also reproduce natural environments. Source: Euclideon.
Delving a little deeper, we asked Dell how Euclideon’s technology handles dynamic lighting and antialiasing—two important features in real-time 3D rendering today.
Euclideon’s technology does support dynamic lighting, and Dell claimed the results are better than those from polygon-based games. However, he added that he prefers to preserve original photographic lighting from real-world scans whenever possible, since real-world lighting is “so much higher [quality] than what computers can generate.” The same goes for pre-baked lighting from offline renderers. “We’ve been . . . setting [3ds Max] and Maya to really high lighting settings and then running that through our converter to turn it back into XYZ voxels,” Dell noted. The decision to forgo dynamic lighting apparently isn’t tied to performance constraints, though. “If we had any performance limitations,” Dell told us, “we’d probably go to something like CUDA and actually start using the graphics card.”
What about antialiasing? Euclideon has been “experimenting” with a new AA technique, but that technique is being kept under wraps because of “patent issues.” All Dell would say is that the “one voxel per pixel” formula doesn’t mean pixels have to fall “exactly on the pixel grid,” and the AA scheme may “make some decisions about where it does want to grab just a few voxels extra in an extremely efficient way and blend them together.” Dell also suggested that antialiasing will play a part in improving image quality on lower-end systems that may render scenes at lower resolutions. In any case, high-res screenshots from Euclideon’s latest demo still show hard, jagged edges between objects in some places. (See the gallery at the bottom of this page.)
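Dell’s description of grabbing “just a few voxels extra” and blending them sounds, at least superficially, like sparse supersampling. The generic sketch below is our own illustration of that standard technique—not Euclideon’s unpublished, patent-pending scheme—using a few jittered sub-pixel samples that, as Dell put it, need not fall “exactly on the pixel grid.”

```python
import numpy as np

# Generic sparse supersampling: sample a toy scene at a few jittered
# sub-pixel offsets and average them. This is ordinary stochastic AA,
# shown only to illustrate the "grab a few extra samples and blend" idea.
def scene(x, y):
    """Toy scene in [0,1)^2: a hard diagonal edge, white above, black below."""
    return 255.0 if y > x else 0.0

def render(w, h, samples_per_pixel=4, seed=0):
    rng = np.random.default_rng(seed)
    img = np.zeros((h, w))
    for py in range(h):
        for px in range(w):
            acc = 0.0
            for _ in range(samples_per_pixel):
                # Jittered offsets land off the regular pixel grid.
                sx = px + rng.uniform()
                sy = py + rng.uniform()
                acc += scene(sx / w, sy / h)
            img[py, px] = acc / samples_per_pixel
    return img

img = render(16, 16)
# Pixels straddling the edge average to gray instead of snapping to 0 or 255.
```

Whatever Euclideon’s actual scheme does, the key trade-off is the same: each extra sample per pixel costs another point lookup, so the efficiency of the underlying search determines how cheap the blending can be.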
Finally, we asked if Euclideon plans to license its technology à la Crytek and Epic. Here’s how Dell replied:
We’re moving forward with our own game engine. We’re moving forward doing a lot of very interesting things in a lot of different industries with our engine. We might—I’m not saying it’s a total no, as in there’s a few big names who we talked to about this, and I’m not allowed to talk about what’s going on in some of those areas—but in general, are we suddenly going to come out with this game engine and say, “Everyone please license our game engine?” I’m not sure that that’s an area we want to be competing in, when I’m not sure that there’s enough money in it even to survive with the way that the price of these things is dropping.
We do have a game engine. I suppose we call it a game engine. It extends also to simulation and training and other areas. . . . We’re trying to get to the point where it’s an engine that does not require any programming at all yet still can do anything anyone can think of.
We tried to do a little more prodding to find out if Euclideon is working with major studios or middleware vendors, but all Dell would say is this: “If we were, we’re probably under an NDA that says we’re not allowed to say that.”
There were many doubts about Euclideon’s credibility when the company first showed its technology a few years ago. Today, as a player in the geospatial analysis market with nascent partnerships in the game industry, Euclideon has more clout.
Still, one can’t help but be dubious about the near-term prospects for voxel-based rendering when real-time polygonal 3D graphics are so close to photorealism as it is. Shifting to an entirely new fundamental representation of world data would be a wrenching change for game developers, graphics APIs, and chipmakers—essentially, the entire industry. Euclideon’s data sets appear to be rather large, too, which could become a constraint if entire game worlds reach into the multi-terabyte range.
That said, Euclideon’s technology has tremendous promise, especially over the longer term. The fact that the company can produce images with such fidelity using only a multi-core CPU, without a single flop contributed by fixed-function graphics hardware, is remarkable. The approach of capturing light and other environmental information from the real world makes a lot of sense, too. Already, game developers are scanning the real world to create highly realistic polygonal graphics. And the hottest thing in real-time graphics is “physically-based” lighting and other simulations, for which polygonal models are often poorly suited. Having a point cloud or voxel grid of the world space on hand could allow much better lighting, shadowing, physics interactions, and more.
Despite its potential, Euclideon may initially struggle to get its foot in the door in gaming. Then again, if two games based on this technology are in the works, then at least some studios must share Euclideon’s enthusiasm. This technology could be intriguing in the gaming space if it can enable new levels of realism on hardware that lacks the GPU horsepower to produce comparable images via polygon-based rendering. We’ll be very curious to see how these first Euclideon-based games look.
Update 9/26: Euclideon has posted a new video of its technology in action. Here it is:
As I said in the corresponding news post, the church scene is particularly impressive—though Dell says it uses 3.8GB of compressed point cloud data, which is relatively hefty.