Technology in BlackSpace: Quad-Core Utilization, 3D Vision, More
Posted on October 3, 2012
Given the opportunity, we always explore the technology behind upcoming games; understanding GPU specifications is great, but it's important to put that knowledge to practical use. We were able to talk with the developers of BlackSpace (originally covered here) and ask them how, exactly, the visual effects and underlying code work at a development level. Topics covered included their engine's effects capabilities, 3D Vision optimization, how the code divides physics calculations between the GPU and CPU, engine technology, and more.
If you like our hardware guides, then this is -- personally, I think -- an excellent way to learn about how those components actually interact with games. If nothing else, it's certainly cool from a technological perspective.
This interview was conducted with Volga Aksoy, Co-Founder of PixelFoundry and Tech Lead on BlackSpace. The game is currently on Kickstarter and is looking for backers. You'll find the unedited interview on page two (certainly useful if you're a developer), but I've broken down our conversation into a more consumable form here:
What is BlackSpace?
First of all, a quick note on what this game actually is: BlackSpace is an asteroid mining space RTS that boasts deformable terrain and "destructible everything," both of which are computation-intensive on the CPU. Beyond the game's interesting technologies, its gameplay and mechanical direction look quite fun -- I'll leave this video here to explain it more succinctly, but in short, your goal is to mine minerals from asteroids, build and expand your base, and establish turrets to defend against invaders.
That's my kind of game.
On BlackSpace's HDR Lighting and Advanced Graphics Capabilities
BlackSpace's implementation of stereoscopic 3D (discussed in depth below) and high-dynamic-range lighting should, hopefully, make for a more realistic depiction of space. High-dynamic-range rendering (HDRR), as defined by nVidia, means that the darks can be really dark, bright things can be really bright, and we can see detailed objects within both; theoretically, no detail is lost just because something is dark, which, in a space game, tends to be important. Here's why:
To prevent asteroids from appearing "pure black" in dark scenarios, the team applies "a slight ambient contribution to the dark side of the asteroid," which is partly reflected by the above example.
"Unlike many other games -- where their gameplay takes place on a fairly flat terrain, and sky ambient and the sun light direction is constant for a given time -- in Blackspace, the user can see both the daylight and the night side of the asteroid simultaneously. Compound that with the fact that the terrain is deformable over time in various ways, maintaining a pleasing look across the whole surface becomes fairly tricky. With any new lighting feature we add or change in the game, we double and triple check that it looks acceptable under all changing conditions."
Now, some of you who saw a few of our initial Skyrim graphics overhauls will remember that despite lighting advancements made by modders, the game often looked incorrectly exposed (over-exposed in light areas, under-exposed in dark areas) and never looked quite realistic. These were, of course, mods - and they've improved since - but they make this point easy to understand: BlackSpace has tone mapping post-FX in place, which will help create more visually accurate HDR lighting than what is seen in some of those Skyrim mods.
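To make that concrete: tone mapping is the step that compresses the engine's unbounded HDR values down into the range a monitor can actually display. PixelFoundry hasn't published their exact operator, so the sketch below uses the well-known Reinhard curve purely as an illustration -- the exposure parameter is our own stand-in for the tunable "knobs" Aksoy mentions on page two:

using System;

static class ToneMapSketch
{
    // Reinhard-style tone mapping: compress unbounded HDR luminance
    // into [0, 1) so both dark and bright detail survive display.
    static float Reinhard(float hdr, float exposure)
    {
        float v = hdr * exposure;   // exposure is the artist-tunable knob
        return v / (1.0f + v);      // asymptotically approaches 1.0
    }

    static void Main()
    {
        // A dim crater floor (0.02) and blinding sunlit regolith (50.0)
        // both land inside the displayable range instead of clipping.
        foreach (float hdr in new[] { 0.02f, 1.0f, 50.0f })
            Console.WriteLine("{0,6} -> {1:F3}", hdr, Reinhard(hdr, 1.0f));
    }
}

Real pipelines typically tone map full RGB (often weighted by scene luminance), but the one-channel version above shows why a blinding value of 50 no longer clips straight to pure white.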
Here's another example of the lighting effects:
In terms of anti-aliasing technology, BlackSpace will feature both FXAA and MSAA, making for cleaner image quality on systems with GPUs that can handle them - which most of you should have.
If you find yourself particularly interested in learning more about BlackSpace's lighting technology, you can read more at this dev-blog post.
On 3D Vision Support and 3D Technology Implementation
In our review of nVidia's 3D Vision, we found nVidia's proprietary version of stereoscopic 3D gaming to be incredibly immersive for specific games, but entirely too much work for others. Some games - specifically those which were not developed to 3D spec - would exhibit shadow flickering or lighting conflicts in 3D (similar to what is seen in z-fighting, shown at 5:57 in this video). The flickering seen in the video normally occurs when a game uses deferred rendering, where the objects are first drawn and the shading and lighting are applied afterward. This meant that disabling settings and making .ini tweaks were often necessary to fully enjoy the 3D Vision tech, which uses active shutter lenses that operate at 120Hz (as does the monitor).
BlackSpace hopes to remedy that problem by partnering with nVidia, who have told us before that they're happy to support almost any developer in a stereoscopic 3D overhaul. Even though games that were not developed with 3D in mind can be retrofitted (with special drivers that nVidia releases), you'll never get the full desired experience; settings will almost always have to be tweaked when playing a retrofitted 3D game. When we asked Aksoy about these problems, he told us that 3D Vision should be much more appealing in their game because they're supporting it from the get-go. You'll find his full description of why 3D tech will work better with BlackSpace on the following page, but here's what we found most promising:
"The thing I love the most about our engine is the stereo 3D rendering capability. There is something to be said about roaming around on a desolate asteroid where you can peer into the depths of space while also looking at the little cracks on the ground and see a vast distance between the two without the need to move the camera.
Even though Blackspace also uses a deferred rendering pipeline, we are making sure that all of the visual features, including the 3D UI elements, follow NVidia’s specs and use the correct depth when rendering for each eye. If I am given the choice to have the driver automatically apply stereo 3D to a game versus the game rendering the view for each eye, I’d take the latter, because it ensures that you, as the developer, have control over what the end-user will see, including applying special adjustments to parallax settings."
The idea here is that visual artifacts will be mostly nullified by the game's stock support of 3D settings, hopefully making for a less work-intensive tweaking experience. So if you've been wanting to really enjoy 3D technology in space, this is your chance.
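For the technically curious, "rendering the view for each eye" boils down to building two camera matrices per frame, offset along the camera's own right vector. Below is a minimal sketch of that idea in XNA-flavored C# (BlackSpace is built on XNA, per page two); it is not PixelFoundry's code, and a shipping renderer would also shear each projection off-axis to control convergence:

using Microsoft.Xna.Framework;

static class StereoCameraSketch
{
    // Build one view matrix per eye by sliding the camera along its own
    // right vector: half of eyeSeparation per eye, sign picking the eye.
    public static Matrix EyeView(Vector3 camPos, Vector3 target, Vector3 up,
                                 float eyeSeparation,
                                 int eye) // -1 = left, +1 = right
    {
        Vector3 forward = Vector3.Normalize(target - camPos);
        Vector3 right = Vector3.Normalize(Vector3.Cross(forward, up));
        Vector3 offset = right * (eye * eyeSeparation * 0.5f);

        // Offsetting position and target together keeps the eyes parallel,
        // so depth separation comes purely from parallax.
        return Matrix.CreateLookAt(camPos + offset, target + offset, up);
    }
}

The engine then runs its full deferred pipeline twice -- once per view matrix -- which is exactly the per-eye control Aksoy says a driver-level retrofit can't guarantee.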
Making Use of Quad-Core CPUs
BlackSpace isn't making use of nVidia's PhysX capabilities, but the team is using another physics processing engine (called BEPU), which does, in fact, utilize multiple threads. We've noted in several of our PC Builds that the gains from Intel's hyper-threading technology are often negligible for most games, making the 3570K a more reasonable choice for gaming. With that said, BEPU stands as an exception to this: according to testing conducted by the engine's own developers (see the graphic below), BEPU exhibited a 41% performance boost (on a 4.5GHz turbo-boosted i7-3770K) when making use of 8 logical threads rather than 4.
The gains after the initial 4 threads (or 4 physical cores without hyper-threading) will taper off gradually, but 41% will be more than noticeable in high-end games that can afford the threads. Although I doubt many of our regular readers will have problems running BlackSpace -- and while BlackSpace itself may not perfectly reflect BEPU's own testing, since it has other things going on that need threads -- it's important to know that quad-core CPUs and hyper-threading are being taken advantage of in newer engines, which will be reflected in games like BlackSpace.
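To show the shape of that scaling, here's a small, self-contained .NET test in the same vein -- a data-parallel loop over many "bodies," capped at different thread counts. This is our own illustrative benchmark, not BEPU's solver; IntegrateBody is just filler math to keep a core busy:

using System;
using System.Diagnostics;
using System.Threading.Tasks;

static class ThreadScalingSketch
{
    // Stand-in for one body's update; enough math to occupy a core.
    static double IntegrateBody(int i)
    {
        double acc = 0;
        for (int s = 0; s < 20000; s++)
            acc += Math.Sqrt(i + s + 1.0);
        return acc;
    }

    static void Main()
    {
        const int bodies = 4096;
        foreach (int threads in new[] { 1, 2, 4, 8 })
        {
            var opts = new ParallelOptions { MaxDegreeOfParallelism = threads };
            var sw = Stopwatch.StartNew();
            Parallel.For(0, bodies, opts, i => IntegrateBody(i));
            Console.WriteLine("{0} threads: {1} ms", threads, sw.ElapsedMilliseconds);
        }
    }
}

On a hyper-threaded quad-core, any jump from 4 to 8 threads in a loop like this typically comes from the extra logical threads hiding memory and pipeline stalls rather than from extra execution hardware.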
BlackSpace itself features deformable asteroids, which require additional calculations from the hardware; collision is one of these, as the collision geometry must change with the terrain to properly mesh with the new environment. The game currently pushes these physics calculations to a separate thread on the CPU, though the team hopes to look into potentially using the GPU for this.
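Structurally, "a separate thread on the CPU" often looks something like this: a long-running worker steps the simulation at a fixed rate and publishes complete snapshots for the render thread to read without locking. The sketch below is our own illustration with placeholder types -- not BlackSpace's internals:

using System.Threading;
using System.Threading.Tasks;

class PhysicsThreadSketch
{
    public sealed class Snapshot { public float[] Positions; }

    // Volatile reference swap: the render thread always sees a complete,
    // consistent snapshot, never a half-written one.
    volatile Snapshot _latest = new Snapshot { Positions = new float[0] };
    public Snapshot Latest { get { return _latest; } }

    public Task Run(CancellationToken ct)
    {
        return Task.Factory.StartNew(() =>
        {
            var positions = new float[1024];
            while (!ct.IsCancellationRequested)
            {
                Step(positions, 1f / 60f);   // fixed 60Hz timestep
                _latest = new Snapshot { Positions = (float[])positions.Clone() };
            }
        }, ct, TaskCreationOptions.LongRunning, TaskScheduler.Default);
    }

    static void Step(float[] positions, float dt)
    {
        for (int i = 0; i < positions.Length; i++)
            positions[i] += dt;   // placeholder integration, not BEPU's solver
    }
}

The appeal of keeping this on the CPU, as Aksoy explains on page two, is that the physics engine never stalls waiting for deformation results to be read back from the GPU.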
On destructible environments, Aksoy told us the following:
"Other than the asteroid itself, when any of these objects are destroyed, we break apart the object by randomized spawning of smaller parts that make up the original object, and hand them over to the physics system to deal with it as a rigid body with physical characteristics that make sense. This can mean that the scrap metal part could still have an articulated assembly of parts such as hydraulic piston links."
Cool.
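For a rough sense of what that hand-off to the physics system looks like in code, here's an illustrative sketch of "randomized spawning of smaller parts" -- each fragment inherits a share of the original object's mass plus randomized velocities. The Debris type is a hypothetical stand-in; BEPU's actual rigid-body API differs:

using System;
using System.Collections.Generic;
using Microsoft.Xna.Framework;

static class DestructionSketch
{
    // Hypothetical rigid-body description handed to the physics system.
    public class Debris
    {
        public Vector3 Position, LinearVelocity, AngularVelocity;
        public float Mass;
    }

    static readonly Random Rng = new Random();

    public static List<Debris> Shatter(Vector3 origin, float totalMass, int parts)
    {
        var pieces = new List<Debris>(parts);
        for (int i = 0; i < parts; i++)
        {
            Vector3 v = new Vector3(
                (float)Rng.NextDouble() - 0.5f,
                (float)Rng.NextDouble() - 0.5f,
                (float)Rng.NextDouble() - 0.5f);
            if (v.LengthSquared() < 1e-6f) v = Vector3.Up;  // avoid normalizing ~zero
            Vector3 dir = Vector3.Normalize(v);

            pieces.Add(new Debris
            {
                Position = origin + dir * 0.5f,            // scatter around the hull
                LinearVelocity = dir * (2f + 4f * (float)Rng.NextDouble()),
                AngularVelocity = dir * (float)Rng.NextDouble(),
                Mass = totalMass / parts,                  // conserve total mass
            });
        }
        return pieces;
    }
}

The articulated pieces Aksoy mentions -- hydraulic piston links and the like -- would be jointed assemblies of such bodies rather than single fragments.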
A Similar Note: Intel IGP Integration (HD 4000, HD 3000, etc.)
We asked if the team had taken advantage of, or tested with, Intel's much-praised HD 4000 graphics -- the response wasn't disappointing. Owners of the HD 4000 or HD 3000 can expect to play BlackSpace on reasonable settings without visual artifacts, and the team is planning to optimize the game for various hardware configurations, which will include Intel's HD graphics. While I didn't ask about the Trinity series, I'd imagine AMD's new Trinity line could also run BlackSpace quite reasonably, especially if the HD 3000 can.
Lander Controls, Gravity, and Six-Axis Movement
With any space game, I always have to ask the developers about the presence (or lack thereof) of six-axis movement. It doesn't make sense for every game, but after playing Shattered Horizon on Steam, I've always found it intriguing as a gameplay mechanic.
It won't be available in BlackSpace, but for good reason:
"When we tested six-axis controls, it was quickly apparent that with constant gravity being applied to the lander, the users would have to dedicate almost all of their attention to simply keep the lander upright instead of performing other more complex tasks like tethering, mining, or fighting the enemy."
No loss there, then. More importantly, the team has focused heavily on the mechanics behind lander movement, and the result is something quite fluid, as demonstrated in this video:
To this end, Aksoy told us that they've achieved a healthy balance between stability and agility:
"We believe that we have what is a good balance between agility vs. stability for the lander and it keeps the controls familiar for more traditional action shooter fans while keeping the sense of controlling a vehicle with weight."
What Is the BlackSpace Team Excited About?
This question was quite simple: I asked what the team gets most excited about with the game's technology and design, to which Aksoy answered:
"With Blackspace, it’s tough to point at one aspect and claim that that is what sets it apart. While the spherical and deformable play surface is definitely something that we feel is a really exciting feature, you could almost talk about every aspect of Blackspace that we are excited about and show another game that has already done some version of it.
What sets Blackspace apart and excites us the most is the visceral gameplay experience as a whole. The hands-on dynamic physics, innovative strategy elements, and high-end visuals all come together, teasing the player to interact more with different aspects of the game. It is this design and art working with the technology behind Blackspace that makes it unique."
Final Thoughts
From a mechanical and gameplay perspective, it's fair to say that I'm excited about BlackSpace -- the lander's mechanics alone look excellent and fun to control, and the base-building and mining features tie everything together. Technologically, while the game isn't claiming to break any world records, the team has certainly taken advantage of everything available to them. The game should scale well across various hardware configurations, as evidenced by its success on the HD 3000, and stereoscopic 3D will be a nice addition for those who have access to it.
We'll be posting more information about BlackSpace as it becomes available!
Continue on for the full, uncut interview, if that interests you.
- Steve "Lelldorianx" Burke.
BlackSpace Interview with Volga Aksoy: Hardware Interaction with the Game
Exploring the Graphics Technology behind Black Space
GN: What engine is Black Space built on? If you built it (or chose an existing one), what were some considerations when determining requirements?
Volga: Blackspace uses a custom in-house built framework that utilizes a heavily modified version of Xen for its graphics backend and BEPU for its physics engine, both of which sit on top of XNA.
XNA might seem like an odd choice for a game that uses fairly high-end graphics and physics features, not to mention that we come from a background at EA where C++ was the standard. However, we settled on XNA for a few important reasons. We wanted a framework that took care of the hardware-specific differences while providing a unified API out of the box. In that regard, XNA definitely fits the bill, where it sits nicely at a level slightly above where DirectX or OpenGL would sit in a C++ project. It is still generic enough that it allows us to implement out-of-the-ordinary graphics features while handling the boilerplate code one would have to write otherwise. Combine this with the fact that for Blackspace we effectively started from scratch, and the need to get up to speed and maintain that momentum of development is paramount. This is where the convenience of .NET, Xen, and BEPU helps immensely. And even though one normally cannot modify another engine's code unless one has paid to acquire the rights to the source, with Xen and BEPU we have free rein over all of the features implemented on that level.
GN: What are some of the engine's finer graphics capabilities? What gets you really excited about working with Black Space's engine?
Volga: Blackspace contains almost all of the high-end features one would expect from a modern game engine. From simple things like gamma correct lighting, to a full deferred lighting pipeline that is almost a must for a dynamic game such as Blackspace, we have it all. If it wasn’t for the deferred lighting pipeline, implementing any kind of dynamic lighting for the night side of the asteroid would have been a painful experience.
Back when we were at EA working on sports games, we could never justify the overhead of using a deferred rendering engine and there were good reasons for it. The main reason was that most of the sports games we worked on took place in a fairly confined area with little to no dynamic environment lighting where the only moving entities are the athletes. So naturally, moving from a fairly static rendering framework to one where there is almost no baked lighting is quite an exciting leap for us.
The thing I love the most about our engine is the stereo 3D rendering capability. There is something to be said about roaming around on a desolate asteroid where you can peer into the depths of space while also looking at the little cracks on the ground and see a vast distance between the two without the need to move the camera.
GN: When I reviewed 3D Vision, I found that the limiting factor was often the game's graphics execution. For instance, I often had to disable shadows, shading, and lighting in order to reduce visual artifacts that would be caused by 3D Vision incorrectly interpreting graphics elements. What efforts are you/nVidia making to optimize 3D Vision for Black Space? As a game developer, do you find it any more difficult to develop with 3D technology in mind, or is it pretty much the same as normal?
Volga: The problem with many games’ stereo 3D visual glitches is usually caused by the fact that the developer originally did not work closely with the GPU vendors to make sure the game supports their respective stereo 3D technology. In such a case, after the game has shipped, NVidia would analyze the game’s real-time rendering pipeline and add the support via their driver tweaks. In many cases, one can only do so much by applying driver tweaks, because some aspects of the rendering pipeline might need special attention that only the developer working on the game can address. In particular, the biggest problem with applying stereo 3D to modern games is the use of deferred rendering. For these kinds of games, unless the game developer explicitly implements stereo 3D support into their game, adding proprietary stereo 3D tech such as 3D Vision as an afterthought becomes a major problem, because the standard method of doubling up a draw call for each eye proves to be problematic.
Even though Blackspace also uses a deferred rendering pipeline, we are making sure that all of the visual features, including the 3D UI elements, follow NVidia’s specs and use the correct depth when rendering for each eye. If I am given the choice to have the driver automatically apply stereo 3D to a game versus the game rendering the view for each eye, I’d take the latter, because it ensures that you, as the developer, have control over what the end-user will see, including applying special adjustments to parallax settings.
One thing I would like to see soon is a common PC-specific 3D technology that can be invoked directly by applying the HDMI 1.4 standard stereo 3D protocol. PCs are just catching up to support this standard, with each major vendor offering their own proprietary layer. Even then, it is not as straightforward as it is when developing for consoles, where you can pretty much render a frame buffer side-by-side or top-and-bottom and tell the 3DTV to interpret the info as a stereo 3D buffer.
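GN note: The side-by-side frame packing Volga describes is simple enough to sketch. Below is an illustrative XNA-style helper -- not Blackspace code -- that renders each eye into one half of the frame buffer via viewports; a 3DTV told to expect side-by-side input splits the result back out per eye. The drawScene callback is a hypothetical stand-in for an engine's render entry point:

using Microsoft.Xna.Framework;
using Microsoft.Xna.Framework.Graphics;

static class SideBySideSketch
{
    // Pack left/right eyes into one back buffer, side by side.
    public static void DrawStereoFrame(GraphicsDevice device, int width, int height,
                                       System.Action<Matrix> drawScene, // hypothetical
                                       Matrix leftView, Matrix rightView)
    {
        Viewport full = device.Viewport;

        device.Viewport = new Viewport(0, 0, width / 2, height);          // left half
        drawScene(leftView);

        device.Viewport = new Viewport(width / 2, 0, width / 2, height);  // right half
        drawScene(rightView);

        device.Viewport = full;   // restore for UI / post-FX
    }
}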
GN: What sort of focus is placed on lighting, shader technology, anti-aliasing methodologies, ray-tracing, smoke and fog effects, etc.? Will you be making use of any proprietary technologies, like nVidia's PhysX? Are these things minimized for your targeted specification requirements?
Volga: Blackspace is a high-dynamic-range deferred-rendering engine. Our goal from the start was to make sure that we can keep the lighting and shading on all objects dynamic at all times. Unlike many other games, where gameplay takes place on fairly flat terrain and the sky ambience and sun light direction are constant for a given time, in Blackspace the user can see both the daylight and the night side of the asteroid simultaneously. Compound that with the fact that the terrain is deformable over time in various ways, and maintaining a pleasing look across the whole surface becomes fairly tricky. With any new lighting feature we add or change in the game, we double and triple check that it looks acceptable under all changing conditions.
The lighting system is based on energy-conserving lighting algorithms, where we make sure that our diffuse and specular contributions do not reflect more than the light source emits. Starting from a fairly physically based lighting scenario allows us to have lighting knobs that require less tuning for each lighting situation. From there, we have our modifiers to create a look that is more pleasing both in terms of gameplay and art direction. A good example is the fact that we apply a slight ambient contribution to the dark side of the asteroid to keep it from looking pure black, which is what the proper physically based model would normally dictate.
Once the deferred rendering portion is handled, we apply a healthy dose of post-FX, which also includes tone mapping to work out the proper exposure for our HDR-lit environment. We also apply FXAA as a means of anti-aliasing, which has now become a pseudo-standard in games using deferred rendering engines. That said, for those that can spare the cycles, MSAA will be there to help as well.
As for physics, while we don’t use PhysX, our physics engine BEPU scales very nicely across multiple cores where a four-core CPU is a definite plus.
For those interested in more detail about our rendering and lighting pipeline, I recommend our dev-blog post at http://www.pixelfoundrygames.com/index.php/en/dev-blog/42-lighting-overview-for-blackspace where they can also see a detailed breakdown of the G-Buffer layout we utilize in Blackspace.
GN: I read that your game boasts a "destructible everything" approach! Sounds fun! How are the destructible events calculated within the game? Are they sent to the CPU (or GPU, if using PhysX) in real-time, or are these animations and events scripted/pre-baked?
Volga: Asteroids in Blackspace are deformable-spheroid terrains. Since this can be regarded as a fairly specific “collision primitive”, very few physics engines support it out of the box, let alone allow it to be dynamic. So it wasn’t surprising that we also had to add these capabilities to our physics engine, BEPU.
When deforming the asteroid, Blackspace currently pushes the deformation calculations onto a separate CPU thread, including the physics and the rendering data simultaneously. I say ‘currently’ because down the line there is a chance that we might change this to be handled by the GPU. At the moment, there is a definite convenience factor for doing the deformations on the CPU, because it means that the physics engine will not have to wait for the data to be fetched back from the GPU. There is also a chance that we might support both methods in the event that the end-user’s GPU is not powerful enough to deal with the overhead of terrain deformation.
At a high-level, the texturing of the asteroid is handled via tri-planar mapping, with an additional custom technique that also allows us to nicely texture striated sections of steep cliffs to show the layers of dirt. The benefit of this approach is that given any polygonal shape, we can generate decent looking textured results that will maintain these characteristics even when deformed over time.
When it comes to working with other objects in Blackspace, we rely on almost no authored animations. For example, in the event that an object needs to move from point A to point B, we move the object via physics retargeting. This means that even if we tell an object to go somewhere, or part of the object to rotate to a certain angle, if there is an object blocking its path, then physics will have the final word and will not allow it to move. Similarly, you can see in our Kickstarter video that the lander’s legs clasp onto the rock once the tether is reeled in. This is also driven by physical constraints and motors to make sure that the canned or scripted movements do not fight the physics.
Other than the asteroid itself, when any of these objects are destroyed, we break apart the object by randomized spawning of smaller parts that make up the original object, and hand them over to the physics system to deal with as rigid bodies with physical characteristics that make sense. This can mean that a scrap metal part could still have an articulated assembly of parts, such as hydraulic piston links.
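GN note: For the unfamiliar, the tri-planar mapping Volga mentions above samples a texture along each world axis and blends the three projections by how strongly the surface normal faces each axis -- which is why steep cliffs don't end up with stretched top-down texturing. A minimal sketch of the weighting, our illustration only (the striation layering he describes sits on top of this):

using System;
using Microsoft.Xna.Framework;

static class TriPlanarSketch
{
    // Weight the YZ, XZ, and XY projections by |normal| along each axis.
    // A higher sharpness tightens the transition between projections.
    public static Vector3 BlendWeights(Vector3 normal, float sharpness)
    {
        Vector3 w = new Vector3(
            (float)Math.Pow(Math.Abs(normal.X), sharpness),
            (float)Math.Pow(Math.Abs(normal.Y), sharpness),
            (float)Math.Pow(Math.Abs(normal.Z), sharpness));
        return w / (w.X + w.Y + w.Z);   // normalize so the weights sum to 1
    }
}

The final surface color is then weights.X * sampleYZ + weights.Y * sampleXZ + weights.Z * sampleXY, with each sample projected along the corresponding axis.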
GN: Does Black Space support true six-axis movement (in the style of, for example, Shattered Horizon)? If so, does six-axis movement cause any unforeseen difficulties in development?
Volga: When initially getting Blackspace off the ground, we heavily prototyped and experimented with different control mechanics for the lander. That said, we always knew that, at its core, the lander’s movement would be physics-based. The major decisions were made around how much fly-by-wire control we would apply on top of the lander’s main and auxiliary thrusters.
When we tested six-axis controls, it was quickly apparent that with constant gravity being applied to the lander, the users would have to dedicate almost all of their attention to simply keep the lander upright instead of performing other more complex tasks like tethering, mining or fighting the enemy.
To that end, the lander can be controlled using a dual analog or classic keyboard-mouse combo style, where the left hand inputs control the tilt and pitch while the right hand inputs control the yaw as well as the “mouse look” style camera movement. The fly-by-wire system will automatically apply self-righting forces to align the lander with the gravitational pull direction when there is no tilt or pitch input.
We believe that we have a good balance between agility and stability for the lander, and it keeps the controls familiar for more traditional action-shooter fans while keeping the sense of controlling a vehicle with weight.
For those who want an in-depth look at the flight mechanics of the lander, they can view our dev-blog video at http://www.youtube.com/watch?v=1UUUb44LNVQ
GN: Were there any polygon budgets or limits defined per level/scene/model? Could you expand and detail some of the polygon limitations for different objects or models within the game? How did your team go about determining these limitations? Additionally, if you have a wireframe screenshot of an object/item/environment in the game that I could share with our readers, that would be amazing!
Volga: Since Blackspace is not a console title, it is tough to apply any strict budgets on the assets in it. GPUs tend to get faster each day and can eat up just about anything thrown at them. My workstation does not have a particularly high-end GPU, and we are making sure that, when it comes to performance, the game runs smoothly on my workstation as well as a few other testing stations we have at our disposal, ranging from legacy DirectX 9 hardware to an SLI system provided to us by NVidia.
Instead of defining budgets, we are pushing our asset quality as high as it makes sense to utilize the power and visual fidelity found in today’s or near future high-end GPUs. If a given user cannot run the game at the intended quality, we also give them the ability to scale back the visuals.
As it stands, the expensive parts of the visuals are not the polygons but the shaders that run on those polygons. For example, the asteroid’s surface shader is one of the most expensive shaders due to the amount of texture layering and tri-planar mapping we employ on the surface. As well as utilizing geometry LODs, we will be applying shader LODs to such expensive surfaces to allow more mainstream hardware to cope with the cost of rendering, and we will expose these as quality settings in the game.
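GN note: A shader LOD is just a cheaper variant of the same material, swapped in by distance or by the user's quality setting. A trivial illustration of the selection logic -- the variant names and thresholds here are hypothetical, not Blackspace's:

static class ShaderLodSketch
{
    public enum Quality { Low, Medium, High }

    // Pick a cheaper surface-shader variant as distance grows or quality drops.
    public static string PickSurfaceShader(float distanceToCamera, Quality quality)
    {
        if (quality == Quality.Low || distanceToCamera > 400f)
            return "AsteroidSurface_Cheap";    // fewer texture layers, no tri-planar
        if (quality == Quality.Medium || distanceToCamera > 150f)
            return "AsteroidSurface_Reduced";  // tri-planar, fewer striation layers
        return "AsteroidSurface_Full";         // full layering + striation detail
    }
}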
GN: Is your team interested in taking advantage of Intel's 3rd Gen i7 IGPs (i.e. the HD 4000) for lower-spec'd users?
Volga: Intel’s 3rd Gen IGPs are fully functional DirectX 11 engines, and having recently tested Blackspace in its current state on an HD 3000 series IGP, we have verified that it runs without any visual artifacts. We already have a good deal of graphics options, but once we start to optimize and tweak the game for certain hardware configurations, we will be adding more of these options to make sure that a wide range of users will be able to experience Blackspace, including those who use one of the Intel IGPs.
GN: What is being done to allow for performance scalability across multiple platforms?
Volga: We are initially targeting a Windows PC release. That said, depending on how things go with our Kickstarter campaign, we are considering porting Blackspace to Linux and Mac OS X by moving from .NET and XNA over to Mono and MonoGame. Originally, when we started working on Blackspace, MonoGame was lacking some core 3D functionality, but it seems like they have been consistently improving it, and I wouldn’t be surprised if we become a major contributor to that API in the near future.
That said, even if Blackspace does not ship on any platform other than Windows PC, it still means that we need to have a scalable architecture and feature set. Along those lines, as the action gets more intense, Blackspace utilizes all available cores at its disposal to keep the gameplay smooth and responsive. Similarly, on the visuals side, we have scalable graphics options that will give users varying degrees of freedom when tuning graphics performance. Moreover, for the enthusiast who has multiple GPUs, Blackspace works on SLI and CrossFireX systems, so they will be able to crank all visual settings to the max.
GN: If the game has in-game cut-scenes, will they be pre-rendered or rendered live?
Volga: Since we are an indie game company, we would like to leverage our in-game technology as much as possible. To that end, we plan on either rendering our cut-scenes in real time, especially in cases where there is dynamic content involved, or, in the event that we might like to hide load times, pre-rendering them using the game engine to be played back over the loading screen.
GN: What were some of your biggest hurdles in the development of the game's underlying infrastructure and architectural technology?
Volga: Since we are using XNA as the API to communicate with the GPU, there are some tradeoffs one has to account for. Much of it has to do with the fact that XNA was mainly created to support a feature set similar to the capabilities of the Xbox 360. This means that certain bleeding-edge features found on PCs require a bit more effort to implement with XNA. That said, there are workarounds for almost all of the issues we have encountered thus far. This is a cost vs. benefit tradeoff we were willing to accept for well-defined APIs like XNA and .NET.
GN: Finally, what is the team most proud of or most excited about with regard to the game's technology, architecture, or design?
Volga: With Blackspace, it’s tough to point at one aspect and claim that that is what sets it apart. While the spherical and deformable play surface is definitely something that we feel is a really exciting feature, you could almost talk about every aspect of Blackspace that we are excited about and show another game that has already done some version of it.
What sets Blackspace apart and excites us the most is the visceral gameplay experience as a whole. The hands-on dynamic physics, innovative strategy elements, and high-end visuals all come together, teasing the player to interact more with different aspects of the game. It is this design and art working with the technology behind Blackspace that makes it unique.
Thanks for your time!