Cloud Imperium Games has been talking about its 64-bit engine conversion for at least two years now, but we've never had a chance to properly explain the benefit of this move. Following last week's interviews with Chris Roberts (Part 1: Procedural Planets V2, Alpha 3.0 & Part 2: Weather System), we sat down with CIG Technical Director Sean Tracy to learn about CryEngine, the technical inner-workings of procedural planet generation V2, and more.
Tracy sat in on our first meeting with Roberts, and was able to prepare additional in-depth talking points from notes taken during that meeting. The entire discussion with Tracy ran about forty minutes. We've split that into two parts:
Part 1, today, is on 64-bit engine technology, world space coordinates, edge blending, and meshes and layers.
Part 2, Wednesday (10/5), is on CPU threading, system resource and load management, character technology, and more CitizenCon info.
Note: You may find our previous discussion on DirectX 12 & Vulkan of interest.
From “CryEngine” to “Star Engine”
The name isn't final, but we already talked about CIG's internal usage of the name “Star Engine” in place of the original “CryEngine” branding on internal splash screens. This name change, according to Roberts, was to more accurately reflect how Crytek's engine has been refactored in several aspects for integration with Star Citizen. We posed the question of overhauls made to Star Engine to Sean Tracy, who affirmed that 64-bit world space is one of the driving elements behind making Star Citizen possible.
CIG's engine is now primarily supported and built internally. Traditionally, a game developer might approach an engine provider like Epic Games or Crytek and negotiate terms on a licensing deal for game development. Things have changed in recent years with the race to the bottom on free engines, but larger development houses still generally strike private arrangements – usually direct technical support from the engine developer, or specific engine changes, in exchange for some percentage of sales. We obviously don't know the finer details of CIG's agreement with Crytek.
But the point isn't to look at the business side of CIG & Crytek, it's to look at the technical relationship between Cloud Imperium and Crytek's product, the CryEngine.
CIG has largely completed a “refactoring” of the CryEngine, and is performing ongoing engineering to advance the technology to further support the demands of Star Citizen. This includes net code optimization, which we've already detailed, graphics and performance optimization, 64-bit conversion of relevant modules, and plenty more.
Clarifying: What is a “64-bit Engine” As It Relates to Star Citizen?
Of the Star Engine, Tracy told us:
"So, 'Star Engine' has been bounced around. I don't know how absolutely official it is or anything like that. It's a pretty different version than what the CryEngine is; we branched quite a while ago. We haven't taken a new CryEngine version for quite some time – the 3.7 [or] 3.8 version of CryEngine, this was early last year – [is] where we branched off. We branched off entirely because it was getting really difficult to take the integrations. At some point, when you're developing your game on middleware, you're going to get to the point that pulling integrations is hard because you've customized it so much for your game. So, whatever changes you make to your underlying engine systems, when there's a fundamental change that comes in from your middleware provider, it's pretty difficult to consume that all the time.
“Then you start being selective – 'OK, well, we'll take this feature but we won't take this feature.' What you don't know right out of the gate is if there's any interdependency on it. What's going to happen? You find that out usually the hard way later on.
“We do cherry pick on occasion from our particular code base that we have, up to 3.7 or 3.8, but we've made some pretty major changes. It's been ongoing for a while for CIG; I only came on about two-and-a-half years ago from Crytek with their engine drop versions.”
Tracy joked that it was “time to get over there” (to CIG) at this juncture, and joined the team to assist in engineering tasks at the company.
We asked next about 64-bit engine conversions, primarily seeking confirmation on what, exactly, has been overhauled to accommodate 64-bit calculations. Tracy informed us of the following:
“One of the big, fundamental changes was the support for 64-bit positioning. What a lot of people maybe misunderstand is that it wasn't an entire conversion for the whole engine [to 64-bit]. The engine is split up between very independent and – not as much as they should be, but – isolated modules. They do talk to each other, but things like physics, render, AI – what are the purposes of changing AI to 64-bit? Well, all the positioning that it will use will be 64-bit, but the AI module itself doesn't care. There were a lot of changes to support these large world coordinates. […] The actual maximum is 18 zeroes that we can support, in terms of space.”
And at this point, it becomes a question of 64-bit limits. 64-bit computing allows increased memory capacity support and a larger virtual address space, both of which make Star Citizen's large universe a possibility. A 64-bit address space tops out at 16EiB, or 18,446,744,073,709,551,616 bytes, which is where the “18 zeroes” number in Tracy's comment is derived. You may also recall this number from No Man's Sky and its “18 quintillion” claims which, regardless of how that particular game played out, is a number so specific because of those same 64-bit limits. As more games operate under 64-bit memory architecture and support larger memory capacities, it's likely that this number will come up again in discussion.
Also, just for the sake of satisfying pedantry, the above numbers are prior to accounting for overhead. These are theoretical maximums, not practical values; some bits are lost to architectural overhead on the hardware.
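The precision side of 64-bit positioning is easy to demonstrate. A 32-bit float carries a 24-bit significand, so once a world coordinate passes about 8,388,608 (2^23) meters, it can no longer resolve sub-meter positions – exactly the jitter problem large world coordinates run into. A minimal Python sketch (the round-trip through `struct` emulates 32-bit storage; this is illustrative, not CIG's code):

```python
import struct

def to_float32(x: float) -> float:
    """Round a Python double to the nearest 32-bit float and back."""
    return struct.unpack('f', struct.pack('f', x))[0]

pos = 10_000_000.25           # 10,000 km from origin, plus a 25cm offset
print(to_float32(pos))        # 32-bit storage: the 25cm offset is lost entirely
print(pos)                    # 64-bit double keeps it exactly
```

At 10,000km from the origin – trivial at Star Citizen's scales – single precision can't even represent a 25cm offset, while a 64-bit double still has nanometer-level spacing at that distance.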
“At that point, you just can't have any more within the memory. It's pretty weird for people that have come over from Crytek that have worked on just CryEngine normally. You're working on a 4, maximum 8km of level. Already at that scale, I felt like that was pretty big. But it's such a – it just messes with your mind in the large world coordinates. You'll be working on something and you zoom out, and you zoom out a little more, and it just keeps going! It really messes with your sense of scale, though. It can be pretty confusing. If you're looking at a planet, you're looking at thousands of kilometers of area, so it can get very easy to underestimate how large the undertaking of just populating something [is].
“That's one of the bigger changes we made with the implementation of planets, but more just with the positioning of it. I think with the first system that we're going to do – this is just right now, maybe quantum travel speeds will change or so – but it takes about 45 minutes at our 'jump' or 'quantum jump' speed to cross an entire system. And that's just one system. If you want to think of the systems as levels themselves, that's how to think about it. There would be basically no loading for 45 minutes of quantum travel, which is the actually realistic value of 0.2c [c=speed of light]. It's crazy how large that area is.”
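Tracy's figures hold up as back-of-envelope math: 45 minutes at 0.2c covers roughly 162 million kilometers, a bit over one astronomical unit. A quick sketch (the physical constants are standard values, not from the interview):

```python
C_KM_S = 299_792.458      # speed of light, km/s
AU_KM = 149_597_870.7     # one astronomical unit, km

speed_km_s = 0.2 * C_KM_S            # stated quantum travel speed
travel_s = 45 * 60                   # 45 minutes to cross a system
distance_km = speed_km_s * travel_s

print(f"{distance_km:,.0f} km, ~{distance_km / AU_KM:.2f} AU")
# → 161,887,927 km, ~1.08 AU
```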
The CIG Technical Director and Crytek veteran emphasized that, of all the points to be made, one of the most critical is the distinction of what exactly is meant by “64-bit engine.” CryEngine's multiple, disparate modules do not each have to be rewritten from the ground up to support 64 bits. The AI module is one example: there is no immediate benefit to executing the AI in 64 bits, so it is left alone. The AI can operate and function in 64-bit world space, but the module controlling that AI's behavior can be left to its 32-bit roots.
Physics, however, needed to be refactored alongside positioning to support more bits in memory. Speaking about the refactoring, Tracy told us, “[it] doesn't need to be the entire engine. There was stuff in the engine that was already 64-bit. What we really needed was physics and positioning to be changed.”
And there's not a direct performance advantage in the case of Star Citizen, either. Just as with double-precision compute, doubling the bits can actually have a negative impact on time to complete the task. A double-precision operation takes about as long as a single-precision one, but each float requires twice the memory. This impacts cache, which now holds half as many floats, creating a memory bandwidth issue. Unlike with hardware, however, game engine developers are able to make optimizations within the engine to reduce the performance hit to imperceptible levels. In the case of Star Citizen, the team has been able to optimize performance for newer hardware in a way that results in a net positive over the original engine, effectively negating the (already insignificant) 64-bit performance impact altogether.
This isn't news in the world of 64-bit computing, and has been discussed since the first emergent FP64 use cases on Cray supercomputers (learn about the history of FP64 in our Computer History Museum tour). For anyone who'd like a bit of a science experiment, you could take a double-precision graphics card – like one of the original Titans (pre-Maxwell) – and attempt gaming in forced double-precision versus native single-precision. In most cases, though it's been years since we've run these tests, games will actually slow down as a result of memory bandwidth limitations, cache limitations, and the doubled-up memory usage of the floats. That extra crunching isn't free but, in the case of the Titan, was necessary for scientific and simulation applications. Things are different when talking about software development and engines, but the performance scaling stems from the same core concept: more bits take more time to process.
But again, to emphasize, Tracy has informed us that the CIG team has made optimizations that result in a net positive performance gain in 64-bit over the original 32-bit physics and world space modules.
“It [64-bit engine] is not better performance, or anything. If anything, it's a bit worse – but we're talking marginal differences. With newer CPUs, we did make a change to how the positioning works, so it ends up being faster. Normally, if you were going to switch over to 64-bit positioning, you would be a bit slower in your math.”
Procedural Planet Generation V2
CIG CEO Chris Roberts covered the content side of procedural generation V2 in our recent interview, and we convened with Sean Tracy to cover the technical aspect.
The first question was simple: How does procedural generation work in Star Citizen?
"The difference between V1 and V2 planets [is], on V1, what you saw in our demos is actually just a single terrain layer. It's one material. Yes, OK, we've got different textures that are blending at different distances or so, but all the planets we had shown literally only had a single material across the entire planet. [...] In a lot of cases where you see procedural generation tech, they never go beyond that. They're happy with just this one rocky material across the surface of the planet.
“What [V2] does, is this brings us in line with how we were making Crysis levels or levels down at Crytek with terrain, because designers and artists always had access to up to 16 layers for different terrain textures, so what we wanted to do with it was give them back that same ability, just on a ridiculous scale. I thought it was a lot of time to paint 8km of space, but to paint thousands and thousands of kilometers is crazy. But yeah, we want to give them that amount of layers because that's the only way you're going to get the quality you want out of a first-person shooter. You're never going to get that with just a single terrain there.
"Another big improvement on V2 planets is the biomes. On V1, they had no biomes, it was a single terrain layer, it's got a heightmap. On V2, we have biomes, so the biomes themselves have layers within them for object placement like for weather, as well as vegetation and other objects. This was one of the good reasons [for] coming from CryEngine – we reuse and try to reuse as much as possible, a lot of the really powerful tools that CryEngine already had, just at a bigger scale. [...] Terrain layers were the same way in CryEngine 4, now let's just apply that to the planets and you've got that same sort of ability, just at a larger scale. Same with the vegetation – it's more advanced rulesets with vegetation.”
We next discussed diversity with procedural generation. This was a topic with Roberts, too, but Tracy provided additional insight regarding asset variation to reduce the feeling of copy-paste environments:
"To give you some history [into Crysis], we really only had like 16 palm trees. [...] What was really good about it was the system we used to place these. We had variable scale, we would set rulesets on a layer or a group of these vegetation objects. We'll take this really powerful tool and apply it to the procedural stuff. You'll do things like density – how close can other palm trees be to this? Are they randomly rotated? That's a really easy win for a lot of things. You might have a tree that's sort of rotated like this [gestures a bend], if that's placed the same rotation everywhere, you'll be like, 'same asset!' As soon as you randomly rotate it, a lot of people will be like, 'oh wow! there's that tree, that tree,' it's all actually the same asset though. Then you've got randomized scale that'll happen between them, the density, random scale, whether it aligns to heightmaps.
“There's a lot of things we learned in the Crysis projects – and just working on CryEngine generally – that we're applying to [the procedural generation] toolset, so it's really powerful. The mentality at Crytek was always, 'use the tools to get you 90% of the way there, then have the artists come in and do that last 10%.' That's the exact same idea with these planets. We've got to have powerful tools, because again, we've got a thousand planets or whatever it's going to end up being."
"A lot of people say it's a crazy idea to go with CryEngine. Well, not really, because there's a lot of stuff in the engine that's very, very powerful. And we try to use that stuff. It would be a shame to take all the old modules and delete. There's no reason to do that; we want to try to save our time and build off of good toolsets that are already there."
Our initial interview with Roberts indicated that “edge blending” would be used within Planet Ed (the editor) to ensure biomes blend accurately and without hard edges. More importantly, according to the team, edge blending will be leveraged to fuse surrounding biome rules and create more unique environments. Tracy elaborated:
"We're talking about [edge blending] a lot right now, so whatever I answer right now, guaranteed it's going to change. The thinking right now – just like Chris said, there's a distribution map. You could either use that to inform all your placement, but then you don't really get perfected blends. What you can do is – on the edges of the distribution map – you know where that ends, so you can actually do a certain distance that you're calling a blend distance. This is where one [biome] is blending into the other. The problem is [that] if you have a lot of biomes, the rules to blend those two are probably going to end up being pretty different.
"For terrain layer blending, it's pretty straight forward; we did that in the Crysis games. That's sort of a high-pass technique. We have a detail map that's grayscale then a low-detail that's more color. It's really easy to blend two grayscales together on top of a base color. Terrain layers are probably going to be really easy, it's the vegetation and the actual assets themselves. If you're blending from desert to jungle, it's kind of easy – just a little bit of grass at the edges, then you start doing trees. But when you start doing a mountain, versus jungle, versus city – what do you do? So, a lot of the rules are based on that distribution map. We're going to have a distance, that distance is variable based on whatever biome is blending to it. Right now, we're trying to make it so that the rules are robust enough that we get nice blends out of it, but not so complex that nobody will understand what's going on."
And, finally, Tracy confirmed for us that CitizenCon (Oct 9) will contain some of the V2 planets tech, including basic edge blending without support for large objects (yet).
Additional Coverage and a Reminder on Pre-Release Games
Check back on Wednesday, October 5 for part 2 of our interview with Sean Tracy. The next part will focus on thread and resource management, character technology, and a bit more. Note also that GamersNexus will be present at CitizenCon to cover the event.
As a reminder, this is a crowd-funded, incomplete game. GamersNexus takes the same stance with all pre-release games and pre-release hardware, including items which we preview ahead of launch: We recommend waiting to make purchases, especially larger purchases, until after a product has shipped and reviews or user reports are online. GamersNexus encourages that readers particularly interested in supporting a crowd-funded effort take the time to research the project beforehand, and that readers make an informed decision on any purchases. Remember that early access purchasing is a support system for developers to further fund a game, not a sale of a complete, finished product.
If you like our style of objective reporting, please consider supporting us directly on Patreon or following us on YouTube and Twitter.
(Footnote: We know that the Star Citizen community is very eager to share content. We kindly remind readers that copying and pasting entire articles is damaging to a publication's ability to measure the success of content and remain in business, and thus damaging to the ability to fund future flights to make developer tours.)
Editorial: Steve “Lelldorianx” Burke
Video: Andrew “ColossalCake” Coleman