
Sean Tracy on CitizenCon Tech Demo: Parallax Occlusion Mapping & Physics

Posted on October 10, 2016

Immediately following our already-published interview with Star Citizen's Chris Roberts, we encountered Technical Director Sean Tracy, recently responsible for educating us on the game's 64-bit engine. The Technical Director took a few moments after CitizenCon to share details about the lower-level technology driving the night's demonstration, including real-time physics, poly-per-pixel budgets, parallax occlusion mapping, and more.

Tracy's role for the CitizenCon presentation primarily had him demonstrating the production tools utilized by CIG for planet development. This includes Planet Ed, already somewhat detailed here and here, which is the team's creation kit for planet customization and design. The toolkit takes the approach of getting artists “90% of the way there,” allowing them to fine-tune the final 10% for faster production pipelines and a hands-on polish. The tech demo showed a walk-through of CIG's team using large brushes to paint the surface with biomes, hand-placing bodies of water and buildings, and blending everything together.

Biomes in Action

The in-engine and in-game demonstrations at CitizenCon 2016 showed a Homestead demo, an Earth-like planet with various biomes edge-blended together. Those biomes included acrid swamps, deserts, oceans, and forests, with individual rule sets applied to each biome and mixed at the edges. Procedural generation for Planets V2 also applies a unique spin to each placed object, like palm trees, so that minute rotations and small tweaks reduce the chance of players noticing repeated assets and tiles.
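CIG hasn't published how those per-object tweaks are computed, but the general technique is common in procedural placement: derive a deterministic pseudo-random rotation and scale from each instance's world position, so the same object always renders identically in the same spot without storing any per-object data. A toy Python sketch of the idea (the hash constants and names are our own, not CIG's):

```python
import math

def hash01(x: float, z: float) -> float:
    """Map a world-space position to a stable pseudo-random value in [0, 1)."""
    h = math.sin(x * 12.9898 + z * 78.233) * 43758.5453
    return h - math.floor(h)

def instance_transform(x: float, z: float):
    """Derive a rotation and scale tweak from position alone, so the same
    object at the same spot always renders identically, while neighboring
    copies of the same asset never line up."""
    rotation_deg = hash01(x, z) * 360.0   # random yaw, full circle
    scale = 0.9 + hash01(z, x) * 0.2      # roughly +/-10% size jitter
    return rotation_deg, scale

# Two palm trees a few meters apart get visibly different transforms:
a = instance_transform(10.0, 20.0)
b = instance_transform(13.0, 20.0)
```

Because the variation is a pure function of position, every client computes the same forest with zero network or storage cost.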

Asked about how the demo went, Tracy said:

"I'm excited! I was super nervous for the demo. We're always excited to show it off for the community. It's almost like an investor meeting – I mean, we're goin' in, we're showing them what all their support has really allowed us to do. It was a really exciting demo, we were super happy to show planets V2."

We then asked if the demo properly showed biomes, curious if the planet had been hand-crafted for the demo or if the Planet Ed tools were deployed in time to build out the planet. During the demo, Chris Roberts noted that all the content shown on stage had been created just between Gamescom and CitizenCon – months apart – aside from the tools used to make the planet. Tracy informed us:

"That's exactly the biomes in action. The cool part about it is there's a referencing system within it, so as we tweak the biomes, we can update them, we can apply the update across the whole planet – those were a couple of [the biomes], and the whole idea of the long fly-in was to show you the huge amount of ecosystems that we're hitting already. Even later on in that other tech demo, we wanted to show some exotic ecosystems; we're not just making earth, we're not just making mars, we're making all kinds of crazy planets.

"When objects are actually placed into the terrain, they look like they are part of the terrain. There's no hard edge on it. This parallax occlusion mapping that we use, we have access to it in the depth buffer, and usually with parallax occlusion you don't. [The German team has] gotten into the depth buffer so that they can do this per-pixel edge blending on every single object placed in that terrain. I've never seen anybody do something like this before. We as gamers have just gotten used to this stuff, and it's all cut very obviously. I hope we fix that."

Parallax occlusion mapping is a rendering technique for creating greater apparent depth in surfaces without adding geometry. A height map with definition on depth gets applied to the surface, and the pixel shader steps along the view ray through that map, offsetting which texel is sampled so that bumps and crevices shift convincingly with the viewing angle; no vertices actually move, which is what separates POM from true displacement mapping. In some ways, this can be thought of as a distant cousin to tessellation, which genuinely subdivides geometry, and which we explain in-depth with Unreal Engine developers over here.
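The shader work happens on the GPU, but the core idea fits in a few lines. The following Python sketch (our own illustration, not CIG's shader) performs the classic linear-search POM loop: march the view ray down through a depth map and return the offset texture coordinate where the ray crosses the surface:

```python
def parallax_offset(u, v, view_dir, depth_at, depth_scale=0.05, steps=16):
    """Linear-search parallax occlusion mapping over a depth field.

    u, v       : texture coordinate where the view ray hits the surface
    view_dir   : (x, y, z) view vector in tangent space, z pointing toward the viewer
    depth_at   : function (u, v) -> depth in [0, 1] (0 = top surface, 1 = deepest)
    Returns the offset (u, v) to actually sample, faking depth without
    moving a single vertex.
    """
    vx, vy, vz = view_dir
    # Total uv shift if the ray marched through the full depth range.
    total_u = -vx / vz * depth_scale
    total_v = -vy / vz * depth_scale
    step = 1.0 / steps
    ray_depth = 0.0
    cu, cv = u, v
    # Step the ray downward until it passes below the stored depth.
    while ray_depth < depth_at(cu, cv) and ray_depth < 1.0:
        cu += total_u * step
        cv += total_v * step
        ray_depth += step
    return cu, cv
```

Viewing the surface head-on produces no offset; the more grazing the angle, the further the sampled texel shifts, which is exactly the cue our eyes read as depth.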



POM adds depth to the planets in Star Citizen, and does so efficiently with the maps applied at a more global scale. Per-pixel edge blending also means that the developers, from an optimization side, can ensure that the detail scaling is as friendly with graphics hardware as possible while retaining high fidelity. We've previously talked about screen-space aware visual FX and graphics tech that interacts with the z-buffer, which may also help in understanding this concept. As a brief example, screen space ambient occlusion (you've likely seen this as SSAO in menus) is only aware of what exists within the z-buffer, which contains the distance of each pixel from the point-of-view of the camera. SSAO only traces shadows for what's currently on the screen, leaving out-of-view objects unaffected (like the underside of a car and its realistic interaction with the underlying surface). Voxel-based AO resolves this, but that's not important for today's discussion.
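CIG hasn't detailed its exact blend math, but depth-aware blending in general works like the familiar "soft particle" trick: compare a fragment's depth against what's already in the z-buffer and fade the contribution as the two converge, so intersections never cut a hard edge. A minimal sketch (our own, with assumed parameter names):

```python
def soft_blend_factor(fragment_depth, scene_depth, fade_range=0.5):
    """Depth-aware blend weight, as used in 'soft particle' style effects.

    fragment_depth : linear view-space depth of the pixel being drawn
    scene_depth    : linear depth already stored in the z-buffer at that pixel
    fade_range     : distance over which the object fades into the surface

    Returns 0.0 where the fragment touches (or sits behind) the existing
    surface, ramping up to 1.0 once it is fade_range in front of it.
    """
    diff = scene_depth - fragment_depth   # > 0 means fragment is in front
    t = diff / fade_range
    return max(0.0, min(1.0, t))          # clamp to a usable alpha weight
```

Applied per pixel where a placed object meets the terrain, a weight like this is what makes a rock look buried in the dirt rather than resting on a seam.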

Tracy continued:

"Our graphics programmer out in the UK was really the chief architect of rebuilding and refactoring the CryEngine LOD system, so it worked really well for the Crysis teams because the ranges and things. But like you saw, ranges here are way beyond what Crysis or whatever do. Some of the refactors he's made are making it based less on distance and more on how many polys per pixel you end up drawing at different distances. This has taken a lot of iteration and a lot of work on the asset side and on the programming side to just get that ratio perfect.

"Even still, there's a little extra popping on the trees that we're not happy with and things like this [...] but we take that super seriously, because it's jarring when you're flying into an area and everything comes in. We've got the dissolves between LODs, we've got per-pixel – how many polys are per pixel, anyway? You can't draw anything more than 4 – that's what nVidia is recommending. There are some pretty exciting things that had to change to make that possible."
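To illustrate the polys-per-pixel budget Tracy describes, here's a toy LOD selector (our own sketch, not CIG's refactor) that estimates an object's on-screen pixel coverage from its distance and picks the densest LOD that stays under a triangles-per-pixel cap, rather than switching on raw distance alone:

```python
import math

def pick_lod(tri_counts, object_radius, distance, fov_y=math.radians(60),
             screen_height=1080, max_polys_per_pixel=4.0):
    """Choose the densest LOD whose triangles-per-covered-pixel fits the budget.

    tri_counts : triangle counts per LOD, densest first, e.g. [80000, 20000, 5000]
    Returns the index into tri_counts to render.
    """
    # Approximate the object's on-screen height in pixels from its bounding radius.
    pixels = (2 * object_radius / distance) / (2 * math.tan(fov_y / 2)) * screen_height
    covered = max(1.0, pixels * pixels)   # rough pixel coverage of the object
    for i, tris in enumerate(tri_counts):
        if tris / covered <= max_polys_per_pixel:
            return i
    return len(tri_counts) - 1            # fall back to the coarsest LOD
```

The appeal of this metric is that it self-corrects for resolution and field of view: the same asset holds the same per-pixel density whether it fills the screen or sits near the horizon.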


Spring Tension, Inverse Kinematics, and the Rover

Viewers of the demo might have noticed that the rover exploring the planet's surface had wheels responding to the terrain. This response is created in real time by the physics engine within the game, which uses spring variables and physics processing to determine how the wheels react. The rover in particular is driven by two wheels (mid-wheel drive, effectively), whose contact with the ground determines the handling and traction for the rover.

On this point, Tracy expanded:

"That is completely real time physics. We use a spring system for it, so we have a spring and IK [Inverse Kinematics] system built into it. What's super interesting about it is we're applying it to the landing gear now. That one's a little more complex – it's conceptually the same idea. You've got a set of springs, you've got a set of IK joints, and basically you're controlling how much they're springing, how much tension is on them, how far they can extend. There was a shot of the Connie coming in and landing, and we worked on this for the Connie in this particular demo, but one of the hardest things with the ships is when they're landing – they bounce around. It's giving you this nice soft landing so when you land you have that compression in the landing gear. [...] You saw that as you go over terrain, they were bumping around and everything. What's nice is it gives us better traction, also.

"We set which wheel is the drivable wheel, and I think on that rover only the two center ones are driving wheels. Basically, they apply all the torque."

Landing gear is still giving the team some trouble because of the complexity of planet surfaces, but it sounds like the technology is all in the pipe and just needs fine-tuning.
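The spring-and-damper setup Tracy describes is a standard vehicle-physics pattern. A minimal Python sketch (our own, with assumed stiffness and damping values, not CIG's code) shows why tuned damping produces the "nice soft landing" rather than a bounce:

```python
def step_suspension(compression, velocity, target, dt,
                    stiffness=200.0, damping=20.0):
    """Advance a damped suspension spring one step with semi-implicit Euler.

    compression : current spring compression (m)
    velocity    : compression rate (m/s)
    target      : compression the terrain contact is forcing this frame (m)

    Acceleration follows Hooke's law plus a damper term; the damping is
    what stops a wheel or landing strut from oscillating after touchdown.
    """
    accel = stiffness * (target - compression) - damping * velocity
    velocity += accel * dt
    compression += velocity * dt
    return compression, velocity

def settle(target=0.1, steps=200, dt=0.01):
    """Drive the spring toward a terrain bump and report where it settles."""
    c, v = 0.0, 0.0
    for _ in range(steps):
        c, v = step_suspension(c, v, target, dt)
    return c
```

With too little damping the compression overshoots and rings (the bouncing ships Tracy mentions); with the values above it converges smoothly onto the terrain-driven target, and the equal-and-opposite spring force is what feeds traction back to the vehicle body.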

Previous Coverage

View our other CitizenCon coverage here.

If you like our style of objective reporting, please consider supporting us directly on Patreon or following us on YouTube and Twitter.

(Footnote: We know that the Star Citizen community is very eager to share content. We kindly remind readers that copying and pasting entire articles is damaging to a publication's ability to measure the success of content and remain in business, and thus damaging to the ability to fund future flights to make developer tours.)

Editorial: Steve “Lelldorianx” Burke
Video: Keegan “HornetSting” Gallick