The game editor used for demos and screenshots is in-house, but the model (Sponza) is not tied to a specific title [CEDEC2019] Basic design of game editor and stable and fast asset management in console video game development
Ace Combat 6: Fires of Liberation (2007), Ace Combat: Assault Horizon (2011), Tales of Card Evolve (2012), THE IDOLM@STER MILLION LIVE! (2013), Pokkén Tournament (2015), Pokkén Tournament DX (2017) • Past talks: Continuous integration of ACE COMBAT ASSAULT HORIZON: CEDEC2011 / In-game camera production example in "ACE COMBAT ASSAULT HORIZON": CEDEC2011 / Game editor design and implementation in Pokkén: CEDEC2015 • On the current game team, I'm in charge of graphics, tools, systems, and asset programming
Light bake time has been a bottleneck in creating background assets in console game development • Inspired by Frostbite's GDC 2018 talk, I decided to develop a ray-traced light baker to improve light bake iteration [GDC2018] Real-time Raytracing for Interactive Global Illumination Workflows in Frostbite
Use physically correct lighting and perform lighting calculations in physical units → Enter brightness for shaders; treat a brightness of 1000 as 1 on the GPU • When the artist inputs the time of day in the game editor, the direction and brightness of sunlight are determined from the preset latitude and longitude, then baking and lighting are performed → Golden hour can be reproduced because time can be controlled per minute • Only sunlight is supported as a directional light Settings menu in Game editor https://en.wikipedia.org/wiki/Golden_hour_(photography)
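To illustrate the time-to-sunlight mapping, here is a deliberately simplified C++ sketch. It is hypothetical: the talk's actual formula (using the preset latitude and longitude) is not shown, and this version ignores seasonal declination, sweeping the sun along a daily arc tilted by latitude. The fractional `hour` parameter gives the per-minute control mentioned above.

```cpp
#include <cmath>

struct Float3 { float x, y, z; };

// Simplified sun direction for a Y-up world (east = +X, north = +Z).
// hour: 0..24 (fractional for minutes), 12 = solar noon.
Float3 SunDirectionFromTime(float hour, float latitudeDeg)
{
    const float pi = 3.14159265f;
    // angle along the sun's daily arc; 0 at noon, +/-pi at midnight
    float hourAngle = (hour - 12.0f) / 24.0f * 2.0f * pi;
    float lat = latitudeDeg * pi / 180.0f;
    // direction toward the sun; the arc is tilted by latitude
    Float3 d;
    d.x = std::sin(hourAngle);
    d.y = std::cos(hourAngle) * std::cos(lat);
    d.z = std::cos(hourAngle) * std::sin(lat);
    float len = std::sqrt(d.x * d.x + d.y * d.y + d.z * d.z);
    d.x /= len; d.y /= len; d.z /= len;
    return d;
}
```

At latitude 0 the sketch puts the sun directly overhead at noon and below the horizon at midnight, which is the qualitative behavior the slide describes.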
Bake flow: 1. Scene setup: model and texture loading, Acceleration Structure creation, etc. 2. Render ray generation texture: record the ray generation coordinates, normal, UV, etc. to a texture using the pixel shader. 3. Bake: emit rays from the ray generation texture and store the result in a UAV; repeat for the set number of iterations (Monte Carlo ray tracing). 4. Scene rendering: render the scene using the bake results.
Bake types and shaders:
- SH light map bake / SH vertex bake / Diffuse light map bake / Diffuse vertex bake / Light probe bake (Ray Generation): emit a ray from the baking point and store the lighting result to a UAV
- Shadow bake (Ray Generation): store to a UAV whether the bake point is shielded from the sun
- Alpha test (Any Hit)
- Path trace (Closest Hit): calculate lighting
- Get the color of the sky (Miss): refer to the sky texture
- Render ray generation texture (Vertex, Pixel): create ray generation points for lightmap bake
- Scene rendering (Vertex, Pixel): general game screen rendering
SH is limited to L0/L1 to reduce the processing load → The output texture is BC6H x 4 layers: SH L0, SH L1_-1, SH L1_0, SH L1_1 (storing to BC1 is not supported yet)
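The four layers correspond to the standard band-0/band-1 SH basis. As a host-side sketch (one color channel; `SH4` and `ProjectSample` are illustrative names, not from the talk), projecting a single sample onto that basis uses the well-known constants Y0 = 0.282095 and Y1 = 0.488603:

```cpp
// L0/L1 SH coefficients for one color channel,
// matching the four layers: L0, L1_-1, L1_0, L1_1.
struct SH4 { float c[4]; };

// Project one radiance sample arriving from direction (dirX, dirY, dirZ)
// (unit vector) onto the band-0/1 SH basis.
SH4 ProjectSample(float radiance, float dirX, float dirY, float dirZ)
{
    SH4 sh;
    sh.c[0] = 0.282095f * radiance;        // L0   (constant term)
    sh.c[1] = 0.488603f * dirY * radiance; // L1_-1 (y)
    sh.c[2] = 0.488603f * dirZ * radiance; // L1_0  (z)
    sh.c[3] = 0.488603f * dirX * radiance; // L1_1  (x)
    return sh;
}
```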
Prepare the data to enter into the Light Baker:
- Model data: vertex data, vertex index; unique-UV deployed vertex data (automatically generated by UVAtlas); textures (diffuse, normal)
- Level data (model entity): model placement coordinates, rotation, scale; bake types (SH light map, diffuse light map, ...); light map and shadow texture resolution
- Level data: time of day (input by artists); direction, intensity, and color of the directional light (sunlight), calculated from the time
- Texture of the sky: BC6H cubemap (512 pixels x 6) x 2 layers; select 2 of the 27 textures pre-generated with Terragen for interpolation
Unique UVs are deployed in advance using UVAtlas • Manual UV deployment is also supported in case deployment is not possible with UVAtlas • UV deployment is per model, not per level → Because of the schedule, a texture atlas tool has not been created. UVAtlas Model Maya Unique UV deployed models https://github.com/microsoft/UVAtlas
A BLAS is prepared for each model • If an identical model is placed in multiple locations in the scene, register an instance using D3D12_BUILD_RAYTRACING_ACCELERATION_STRUCTURE_INPUTS::InstanceDescs • Create a BLAS for each model; there is one TLAS • Store the model's matrix as a row-major 3x4 in D3D12_RAYTRACING_INSTANCE_DESC::Transform (12 floats) PIX Acceleration Structure preview
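The instance transform is a row-major 3x4 matrix with translation in the fourth column, while a typical D3D row-vector world matrix keeps translation in the fourth row. A minimal sketch of the conversion, assuming a hypothetical `Matrix4x4` type (the `Transform` layout is from the DXR API; the helper itself is illustrative):

```cpp
// Row-vector-convention 4x4 world matrix (translation in m[3][0..2]).
struct Matrix4x4 { float m[4][4]; };

// Fill the 3x4 instance transform: transpose the 4x4 and drop its last row,
// which moves the translation from the fourth row into the fourth column.
void ToInstanceTransform(const Matrix4x4& world, float out[3][4])
{
    for (int r = 0; r < 3; ++r)
        for (int c = 0; c < 4; ++c)
            out[r][c] = world.m[c][r];
}
```

`out` can then be copied into `D3D12_RAYTRACING_INSTANCE_DESC::Transform`.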
- UAV: textures for the lightmap bake and vertex data for the vertex bake are stored here
- Ray generation coordinate texture: used for lightmap baking
- Model vertices for ray generation: used for vertex baking
- Light probe coordinates for ray generation: used for light probe baking
- UAV for random number state storage: uses 6 layers to store the XORWOW random state
- Sampler: point sampler used for baking
- ConstantBuffer: parameters referenced in all shader stages, such as the sun direction
- Sky color texture: generated with Terragen; 2 layers used for interpolation
Set for all models → Not only textures but also vertex data is required for UV calculation in the shader (described later)
- Vertex index, vertex data: used for UV calculation from Attribute in the Hit Group
- Diffuse texture: used for the alpha test and lighting calculation
- Normal texture: used to calculate the ray reflection direction in the Closest Hit Shader
- ConstantBuffer: model type (such as emissive object) and UV offset at baking
Shader Table layout: Ray Generation Shader, Miss Shader, and one Hit Group per model (Model A, Model B, Model C, ...), each Hit Group record holding the Hit Group shader plus its vertex buffer, texture, and constant buffer • Since the Shader Table is just a buffer, it is built by memcpy-ing the Shader Records etc. while handling the alignment myself.
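The manual packing described above can be sketched as follows. The alignment constants mirror the D3D12 spec (a shader identifier is 32 bytes and each record starts on a 32-byte boundary); the helper names are illustrative, not from the talk:

```cpp
#include <cstdint>
#include <cstring>
#include <vector>

constexpr size_t kShaderIdentifierSize = 32; // D3D12_SHADER_IDENTIFIER_SIZE_IN_BYTES
constexpr size_t kRecordAlignment = 32;      // D3D12_RAYTRACING_SHADER_RECORD_BYTE_ALIGNMENT

size_t AlignUp(size_t value, size_t alignment)
{
    return (value + alignment - 1) & ~(alignment - 1);
}

// Append one shader record (identifier + local root arguments) to the table,
// zero-padding the record up to the required alignment.
void AppendShaderRecord(std::vector<uint8_t>& table,
                        const uint8_t* identifier,
                        const uint8_t* localRootArgs, size_t localRootArgsSize)
{
    size_t recordSize = AlignUp(kShaderIdentifierSize + localRootArgsSize, kRecordAlignment);
    size_t offset = table.size();
    table.resize(offset + recordSize, 0);
    std::memcpy(table.data() + offset, identifier, kShaderIdentifierSize);
    if (localRootArgs)
        std::memcpy(table.data() + offset + kShaderIdentifierSize,
                    localRootArgs, localRootArgsSize);
}
```

A real build would get `identifier` from `ID3D12StateObjectProperties::GetShaderIdentifier` and also align the start of each table region (ray generation, miss, hit group) to 64 bytes.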
XORWOW is used for the random number state → The random number state must be saved in the Payload or a UAV, and we wanted to keep it small from the viewpoint of processing load • Though it is a simple algorithm, there was no problem with random number quality

struct RandomState
{
    uint d;
    uint v[5];
};
RWStructuredBuffer<RandomState> RandomStateUAV : register(u9);

float Random(inout RandomState state)
{
    uint t = state.v[0] ^ (state.v[0] >> 2);
    state.v[0] = state.v[1];
    state.v[1] = state.v[2];
    state.v[2] = state.v[3];
    state.v[3] = state.v[4];
    state.v[4] = (state.v[4] ^ (state.v[4] << 4)) ^ (t ^ (t << 1));
    state.d += 362437;
    uint randomValue = state.d + state.v[4];
    return saturate(randomValue * 1.0f / 4294967296.0f);
}
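The XORWOW routine can be ported to C++ for host-side verification of the HLSL version; this direct port mirrors the update rule above (the clamp stands in for HLSL `saturate`):

```cpp
#include <algorithm>
#include <cstdint>

struct RandomState { uint32_t d; uint32_t v[5]; };

// C++ port of the XORWOW generator used on the GPU.
float Random(RandomState& state)
{
    uint32_t t = state.v[0] ^ (state.v[0] >> 2);
    state.v[0] = state.v[1];
    state.v[1] = state.v[2];
    state.v[2] = state.v[3];
    state.v[3] = state.v[4];
    state.v[4] = (state.v[4] ^ (state.v[4] << 4)) ^ (t ^ (t << 1));
    state.d += 362437; // Weyl sequence counter
    uint32_t randomValue = state.d + state.v[4];
    // mirror HLSL saturate at the top end (float rounding can reach 1.0)
    return std::min(randomValue * (1.0f / 4294967296.0f), 1.0f);
}
```

Running both versions from the same seed state should produce matching sequences, which makes GPU-vs-CPU comparison straightforward.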
Baking is deterministic, and the random number seed is fixed → The same bake result is output from the same input data • The random number state is initialized by the CPU, and a random value is set for each pixel of the UAV (at light map resolution)

std::mt19937 mtRandom(0); // random number seed is a fixed value
// prepare a buffer at the lightmap resolution
std::vector<RandomState> buffer(lightmapWidth * lightmapHeight);
for (auto& state : buffer) // set random values into the buffer
{
    state.d = mtRandom();
    for (auto& v : state.v) { v = mtRandom(); }
}
// copy buffer to the UAV for random number state storage
Get the ray origin from the ray generation texture and shift the direction with a random number, then call TraceRay • Then get the radiance from the Payload, calculate SH, and store it into the UAV

[shader("raygeneration")]
void SHLightmapRayGenerationShader()
{
    uint2 bufferPosition = DispatchRaysIndex().xy;
    Payload payload;
    payload.randomState = RandomStateUAV[bufferPosition]; // get the previous random number state
    float3 rayOrigin = // get the ray generation point from the ray generation texture
    float3 rayDirection = // get the ray direction from the ray generation texture
    rayDirection = // shift the ray direction using payload.randomState
    TraceRay();
    SH currentSH = GetSH(rayDirection, payload.radiance); // calculate SH from the TraceRay result
    SH previousSH = SH_UAV[bufferPosition]; // get the previous SH
    // store the SH average value to the UAV based on the number of bake iterations
    float lerpFactor = (float)iterationCount / ((float)iterationCount + 1.0f);
    SH_UAV[bufferPosition] = LerpSH4(currentSH, previousSH, lerpFactor);
    RandomStateUAV[bufferPosition] = payload.randomState; // update the random number state
}
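The running-average update at the end of the shader can be sanity-checked on the CPU: with lerpFactor = n / (n + 1), lerping the new sample toward the previous value keeps the stored value equal to the mean of all samples so far. A small sketch (scalar stand-in for `LerpSH4`):

```cpp
// Reproduce the shader's incremental average:
// average = lerp(sample, average, n / (n + 1)) at iteration n.
float IncrementalAverage(const float* samples, int count)
{
    float average = 0.0f;
    for (int n = 0; n < count; ++n) {
        float factor = (float)n / ((float)n + 1.0f);
        average = samples[n] * (1.0f - factor) + average * factor;
    }
    return average;
}
```

For samples {1, 2, 3, 4} this yields 2.5, the arithmetic mean, confirming the lerp-factor formula.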
For alpha testing, TraceRay() does not use RAY_FLAG_ACCEPT_FIRST_HIT_AND_END_SEARCH • Set the vertices and vertex indices in a ByteAddressBuffer for UV calculation • Configure the texture with the Local Root Signature

[shader("anyhit")]
void AnyHitShader(inout Payload payload, in MyAttributes attribute)
{
    float2 uv = // calculate the UV of the color texture from the vertex index and attribute
    // get the alpha value from the texture
    float alpha = DiffuseTexture.SampleLevel(pointSampler, uv, 0).w;
    if (0 < alpha)
    {
        AcceptHitAndEndSearch(); // opaque: go to the Closest Hit Shader
    }
    else
    {
        IgnoreHit(); // transparent: do not run the Closest Hit Shader
    }
}
The Hit Shaders only receive barycentrics where the ray hit the model → Set the vertex index and vertex data in ByteAddressBuffers in advance, and calculate the UV value of the hit point yourself

// set with the Local Root Signature
ByteAddressBuffer VertexArray : register(t15);
ByteAddressBuffer IndexArray : register(t16);

float2 GetUV(MyAttributes attribute)
{
    int indexSize = 12; // 32-bit index * 3
    // get the vertex indices
    uint3 index = IndexArray.Load3(PrimitiveIndex() * indexSize);
    int vertexSize = 64; // position, normal, tangent...
    int uvOffset = 48; // offset where the UV is stored in the vertex data
    // get the UVs of the triangle vertices
    float2 uv0 = asfloat(VertexArray.Load2(index.x * vertexSize + uvOffset));
    float2 uv1 = asfloat(VertexArray.Load2(index.y * vertexSize + uvOffset));
    float2 uv2 = asfloat(VertexArray.Load2(index.z * vertexSize + uvOffset));
    // interpolate the UV with the barycentrics
    float2 uv = uv0 * (1.0f - attribute.barycentrics.x - attribute.barycentrics.y)
              + uv1 * attribute.barycentrics.x
              + uv2 * attribute.barycentrics.y;
    float2 transformedUV = // apply the UV offset and repeat to the UV value
    return transformedUV;
}
Emit the ray in the normal direction, shifted by random numbers • Stop if the number of reflections exceeds the set value → MaxTraceRecursionDepth can be set up to 31, but considering the light attenuation by the DiffuseTexture, around 4 reflections is enough

[shader("closesthit")]
void ClosestHitShader(inout Payload payload, in MyAttributes attribute)
{
    // end if the number of reflections exceeds the setting
    if (maxReflectionCount <= payload.reflectionCount) return;
    payload.reflectionCount += 1;
    // calculate the vertex UV, normal, tangent, and binormal from the vertex index and attribute
    // fetch from DiffuseTexture and NormalTexture
    // calculate and add the radiance from the lighting calculation
    payload.radiance += radiance * payload.throughput;
    float3 hitPosition = WorldRayOrigin() + WorldRayDirection() * RayTCurrent();
    float3 rayVector = normalize(WorldRayOrigin() - hitPosition);
    float3 sampleDirection = // determine the BRDF sample direction with a random number
    // calculate worldNormal
    float NoL = saturate(dot(sampleDirection, worldNormal));
    if (0 < NoL)
    {
        float pdf = NoL * InversePi;
        float3 brdf = DiffuseTextureColor * InversePi;
        payload.throughput *= brdf * NoL / pdf;
        TraceRay(); // emit a ray again in the normal direction
    }
}
Generate atmospheric images for different times of day with Terragen • Atmospheric images for all 24 hours are obtained by linearly interpolating two atmospheric images based on the time set in the game editor • The atmospheric image is filled with the sky color below the horizon, so a sky color is returned even if the ray flies below the horizon

// configure with the Global Root Signature
TextureCube SkyTextureA : register(t10);
TextureCube SkyTextureB : register(t11);

[shader("miss")]
void MissShader(inout Payload payload)
{
    float3 radianceA = SkyTextureA.SampleLevel(pointSampler, WorldRayDirection(), 0).xyz;
    float3 radianceB = SkyTextureB.SampleLevel(pointSampler, WorldRayDirection(), 0).xyz;
    float3 radiance = lerp(radianceA, radianceB, SkyTextureLerpRatio);
    payload.radiance += radiance * payload.throughput;
    payload.missed = 1.0f;
}
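Picking the two cubemaps and the lerp ratio from the time of day could look like the sketch below. This is hypothetical: the talk does not state how the 27 Terragen presets are spaced, so this assumes an even distribution over 24 hours; `SkySelection` and `SelectSkyTextures` are illustrative names.

```cpp
struct SkySelection { int indexA; int indexB; float lerpRatio; };

// Choose the two neighboring sky presets for `hour` (0..24) and the
// interpolation ratio between them, assuming evenly spaced presets.
SkySelection SelectSkyTextures(float hour, int presetCount /* = 27 */)
{
    float t = hour / 24.0f * (presetCount - 1); // fractional preset position
    int indexA = (int)t;
    int indexB = (indexA + 1 < presetCount) ? indexA + 1 : indexA;
    return { indexA, indexB, t - (float)indexA };
}
```

The ratio would be uploaded as `SkyTextureLerpRatio` and the two selected cubemaps bound as `SkyTextureA` and `SkyTextureB`.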
By writing the bake result to the light map every frame, the bake result can be previewed in real time • From the game editor, set the lightmap resolution, the number of bake iterations, etc. (Bake and scene rendering within 1 frame) Sponza test level [CEDEC2019] Basic design of game editor and stable and fast asset management in console video game development
Debug rendering in the game editor for bake data debugging: indirect light, general rendering, ray generation texture (position), checkerboard rendering (for lightmap resolution check), baked shadow, base color, normal, roughness, metallic, direct light
When a parameter of the directional light is changed, reset the bake state and re-bake • Write 0 to the light map in the Ray Generation Shader, with 0 bake iterations

cbuffer ConstantBuffer : register(b0)
{
    // directional light etc.
    // ...
    float clearOutput;
}

[shader("raygeneration")]
void SHLightmapRayGenerationShader()
{
    // bake processing
    // TraceRay ...
    if (0 < clearOutput)
    {
        previousSH = 0; // reset SH
    }
    else
    {
        previousSH = SH_UAV[bufferPosition]; // get the previous SH
    }
    // SH output processing
    // ...
}
Re-bake when models are added to the bake or coordinates are updated → Can be re-baked without stopping the game editor
1. Stop baking and rendering in the game editor
2. Wait 4 frames until baking stops on the GPU
3. Release the TLAS and BLAS, then rebuild them
4. Reset the bake state
5. Restart baking and resume rendering in the game editor
By limiting writes to the UAV per bake iteration, keep one Dispatch on the GPU under 2 seconds (the Windows default TDR timeout) • The TDR timeout can be changed in the Registry Editor; however, for ease of maintenance, TDR is left at the default because we didn't want to change the registry on team members' machines
If the number of baked texels is large, GPU load reaches 100%, Windows becomes unresponsive, and the game editor becomes hard to control → Prevent the GPU from being overloaded • Instead of baking all lightmaps in one frame, determine the number of texels baked per frame and bake all lightmaps over several frames
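Spreading the bake over frames amounts to handing out a bounded texel range each frame. A minimal sketch, assuming a hypothetical `BakeScheduler` (not from the talk) that wraps around once a full pass over the lightmap completes:

```cpp
#include <algorithm>
#include <cstdint>
#include <utility>

// Hands out at most texelsPerFrame texels each frame; when the end of the
// lightmap is reached, the cursor wraps to start the next bake iteration.
struct BakeScheduler {
    uint32_t totalTexels;
    uint32_t texelsPerFrame;
    uint32_t cursor = 0;

    // Returns the [begin, end) texel range to dispatch this frame.
    std::pair<uint32_t, uint32_t> NextRange()
    {
        uint32_t begin = cursor;
        uint32_t end = std::min(cursor + texelsPerFrame, totalTexels);
        cursor = (end == totalTexels) ? 0 : end; // wrap: next iteration
        return { begin, end };
    }
}; 
```

Each frame, the range would drive the DispatchRays dimensions (or an offset constant), keeping any single dispatch well under the TDR limit.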
Emit rays from the light probe positions and compute SH • Set the light probe coordinates in a StructuredBuffer and call DispatchRays • In the Ray Generation Shader, shoot a ray downward, and if it doesn't hit the ground, raise the Y coordinate of the ray generation point → Avoids picking up the color below the horizon • Apart from the Payload and the Ray Generation Shader, the shaders are the same as for the light map

StructuredBuffer<float3> InputLightProbePositionArray : register(t12);
RWStructuredBuffer<SH> OutputSHArray : register(u0);

[shader("raygeneration")]
void LightProbeRayGenerationShader()
{
    uint index = DispatchRaysIndex().x;
    float3 rayOrigin = InputLightProbePositionArray[index];
    // emit a ray downward, and raise the Y coordinate of the ray origin if it doesn't hit the ground
    float3 rayDirection = // set the direction with a random number
    // the rest is the same processing as the lightmap
}
Set the vertex data in a StructuredBuffer and the WorldMatrix in a ConstantBuffer, and calculate the ray generation point

struct Vertex
{
    // define the vertex structure: position, normal, binormal ...
};
StructuredBuffer<Vertex> InputVertexArray : register(t12);
RWStructuredBuffer<SH> OutputSHArray : register(u0);

[shader("raygeneration")]
void SHVertexRayGenerationShader()
{
    uint vertexNumber = DispatchRaysIndex().x;
    Vertex vertex = InputVertexArray[vertexNumber]; // fetch the vertex
    // calculate the ray generation point
    float4 position = mul(float4(vertex.position, 1.0f), worldMatrix);
    float3 rayOrigin = position.xyz / position.w;
    float3 rayDirection = // get the ray direction from the normal, binormal, and tangent of the vertex
    rayDirection = // shift the ray direction using random numbers
    // the rest is the same processing as the lightmap
}
Check whether the Miss Shader is executed when emitting a ray toward the sun • Since the Closest Hit Shader is not required, set RAY_FLAG_SKIP_CLOSEST_HIT_SHADER in the ray flags • The shadow decision is performed only once per texel

RWTexture2D<float4> ShadowTexture : register(u0);

[shader("raygeneration")]
void ShadowRayGenerationShader()
{
    uint2 bufferPosition = DispatchRaysIndex().xy;
    RayPayload payload;
    payload.rayOrigin = // get the ray generation point from the ray generation texture
    payload.rayDirection = // set the sun direction
    TraceRay();
    // if the Miss Shader was not executed, the texel is in shadow
    ShadowTexture[bufferPosition] = payload.missed;
}
Since emissive objects are not rendered during gameplay, they can be placed in large quantities • Use the InstanceInclusionMask of TraceRay() for the shadow bake Red emissive object (debug display): the surrounding area is shown in red

cbuffer HitGroupConstantBuffer : register(b2)
{
    float3 emissiveColor;
    float isEmissiveObject;
};

[shader("closesthit")]
void ClosestHitShader(inout Payload payload, in MyAttributes attribute)
{
    if (0 < isEmissiveObject) // hit an emissive object
    {
        payload.radiance += payload.throughput * emissiveColor;
        payload.throughput = float3(0, 0, 0); // do not emit a ray again after hitting an emissive object
    }
    else
    {
        // lighting calculation
        TraceRay();
    }
}
Prepare a brighter light probe in addition to the normal light probe • Brighten the character by switching light probes when the camera comes close to the character Normal light probe Brighter light probe Red emissive object
Automated baking with a GeForce RTX 2080 in the build PC • Take advantage of the game editor's automatic test function • Bake when bake input data such as level assets or model assets are submitted to Perforce • In the automatic build, the number of bake iterations is increased and the quality is improved compared to the preview. Bake properties
In some cases, it can be hard to tell what is wrong, for example when the estimation is mistaken and an unintended Closest Hit Shader runs • We also used printf-style debugging, such as rendering intermediate results from the Ray Generation Shader to a UAV • Since DirectX Raytracing is an early technology, there were problems that seemed to be caused by the driver, such as Windows resetting when a Compute Shader was started in a separate process during baking → Solved by changing the workflow
Number of reflections: 6 • Number of bake iterations: 1000 • Number of light probes: 1125 • Baking time: 188.472 seconds → Artists adjust the lighting while watching the baking process, so there is no need to wait until baking is finished
Output texture sizes:
Type           | Resolution | Layers x Number of models
SH Light map   | 128x128    | 4 x 7
SH Light map   | 2048x2048  | 4 x 2
Shadow texture | 512x512    | 1 x 7
Shadow texture | 2048x2048  | 1 x 2
The lighting workflow has been improved → Since the baking process can be previewed, lighting can be adjusted without waiting for the bake to finish • The range of expression has expanded, including emissive objects • Compared with a CPU-based light baker there are VRAM limitations, defects unique to an early technology, and a lack of tools, but the advantages more than make up for them
Shaders for the baker should also be output when the rasterizer shader is output from the node editor; at the moment, multiple DiffuseTextures are not supported • Denoising: NVIDIA OptiX can be one of the options
• From Monte Carlo Ray Tracing Basics to OpenCL Implementation [CEDEC 2013]
• From Monte Carlo Ray Tracing Basics to OpenCL Implementation (Implementation) [CEDEC 2014]
• From Basics of Bidirectional Path Tracing (BDPT) to Implementation with OpenCL [CEDEC 2015]
• Introduction to OptiX [GTC 2017]
• An Introduction to NVIDIA OptiX [GTC 2018]
• DirectX Raytracing [GDC 2018]
• Precomputed Global Illumination in Frostbite [GDC 2018]
• Real-time Raytracing for Interactive Global Illumination Workflows in Frostbite [GDC 2018]
• Shiny Pixels and Beyond: Real-Time Raytracing at SEED [GDC 2018]
• Introduction to DirectX RayTracing [SIGGRAPH 2018]
• DirectX Raytracing: The Life of a Ray Tracing Kernel [CEDEC 2018]
• Ray Tracing for Game Development [CEDEC 2018]
• Basic design of game editor and stable and fast asset management in console video game development [CEDEC 2019]