Question about descriptor ranges in root descriptor tables
Hey again! This week I've been working on a system that, given the number of resources in use plus their types and slots, creates the matching root signature. I'm not sure I've understood correctly how descriptor tables are supposed to be used, though.

A descriptor table is created with the D3D12_ROOT_DESCRIPTOR_TABLE struct:

```cpp
// ranges is some std::vector<D3D12_DESCRIPTOR_RANGE> built beforehand
D3D12_ROOT_DESCRIPTOR_TABLE pdes = {};
pdes.NumDescriptorRanges = ranges.size();
pdes.pDescriptorRanges = &ranges[0];
```

Basically, it holds "x" descriptor ranges, and a descriptor range is just a start point and an offset within a descriptor heap. As I understand it, this means I could define multiple ranges that don't need to be contiguous, and each range could hold just one descriptor. Is this correct, or am I missing something? Thanks!

**Background:** The reader may ask why I want to do such a thing. Let's say I want to use descriptor tables to set many sampler states for my shaders (I don't want to inline the descriptors in the root signature, since that would take up too much space), and each shader could use "random" samplers (for example, sampler(2) and sampler(8)).

**More info:** I think I need to give more information about what I'm doing, because I'm pretty sure it's not the right way...

First, on engine load, I create the required samplers for each of the effects/passes/materials/shaders/(insert name here). I start by creating a descriptor heap (*D3D12_DESCRIPTOR_HEAP_TYPE_SAMPLER*) with 64 or so *NumDescriptors*, which is enough to hold all the samplers (a rough sketch of this is at the end of the post). Then, when I create the root signature, I do the following:

```cpp
// Build a list of the sampler slots in use
std::vector<int> sSlots;
if (UsesSamplers())
{
    for (int i = 0; i < 16; i++)
    {
        if (UseSampler(i))
        {
            sSlots.push_back(i);
        }
    }
}

// Build a list with all the root parameters
// Add CB
// Add "x"
// Add samplers
if (sSlots.size())
{
    // One single-descriptor range per sampler slot in use
    std::vector<D3D12_DESCRIPTOR_RANGE> samplerRanges;
    for (auto s : sSlots)
    {
        D3D12_DESCRIPTOR_RANGE range = {};
        range.RangeType = D3D12_DESCRIPTOR_RANGE_TYPE_SAMPLER;
        range.BaseShaderRegister = s;
        range.NumDescriptors = 1;
        range.RegisterSpace = 0;
        range.OffsetInDescriptorsFromTableStart = D3D12_DESCRIPTOR_RANGE_OFFSET_APPEND;
        samplerRanges.push_back(range);
    }

    D3D12_ROOT_DESCRIPTOR_TABLE pdes = {};
    pdes.NumDescriptorRanges = samplerRanges.size();
    pdes.pDescriptorRanges = &samplerRanges[0];

    D3D12_ROOT_PARAMETER p = {};
    p.ParameterType = D3D12_ROOT_PARAMETER_TYPE_DESCRIPTOR_TABLE;
    p.DescriptorTable = pdes;
    p.ShaderVisibility = D3D12_SHADER_VISIBILITY_ALL;
}
```

I don't think that's how tables are supposed to be used... Maybe the parameter *OffsetInDescriptorsFromTableStart* should also change in this case.
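In case it helps clarify what I'm asking: the alternative I can imagine is a single contiguous range covering all the sampler registers a shader can use, and then offsetting the table itself into the heap at bind time. This is only a sketch of what I think that would look like, I haven't verified it's correct; the register count (9, for s0..s8) and samplerTableIndex are just placeholders for my setup:

```cpp
// One contiguous range covering s0..s8, instead of one single-descriptor range per sampler
D3D12_DESCRIPTOR_RANGE range = {};
range.RangeType = D3D12_DESCRIPTOR_RANGE_TYPE_SAMPLER;
range.BaseShaderRegister = 0;                 // first sampler register, s0
range.NumDescriptors = 9;                     // covers s0..s8, even if some slots are unused
range.RegisterSpace = 0;
range.OffsetInDescriptorsFromTableStart = 0;  // table start == first descriptor of the range

D3D12_ROOT_DESCRIPTOR_TABLE table = {};
table.NumDescriptorRanges = 1;
table.pDescriptorRanges = &range;

D3D12_ROOT_PARAMETER param = {};
param.ParameterType = D3D12_ROOT_PARAMETER_TYPE_DESCRIPTOR_TABLE;
param.DescriptorTable = table;
param.ShaderVisibility = D3D12_SHADER_VISIBILITY_ALL;

// At draw time, the whole table would point at a contiguous run of samplers in the heap:
// cmdList->SetDescriptorHeaps(1, &samplerHeap);
// cmdList->SetGraphicsRootDescriptorTable(samplerTableIndex, gpuHandleOfFirstSampler);
```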
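And for completeness, this is roughly how I create and fill the sampler heap mentioned above (a trimmed-down sketch; device is my ID3D12Device*, and the single linear/wrap sampler is just an example of the ones I create):

```cpp
// Shader-visible sampler heap, big enough for all my samplers
D3D12_DESCRIPTOR_HEAP_DESC heapDesc = {};
heapDesc.Type = D3D12_DESCRIPTOR_HEAP_TYPE_SAMPLER;
heapDesc.NumDescriptors = 64;
heapDesc.Flags = D3D12_DESCRIPTOR_HEAP_FLAG_SHADER_VISIBLE;

ID3D12DescriptorHeap* samplerHeap = nullptr;
device->CreateDescriptorHeap(&heapDesc, IID_PPV_ARGS(&samplerHeap));

// Samplers are then created one after another into the heap
UINT increment = device->GetDescriptorHandleIncrementSize(D3D12_DESCRIPTOR_HEAP_TYPE_SAMPLER);
D3D12_CPU_DESCRIPTOR_HANDLE handle = samplerHeap->GetCPUDescriptorHandleForHeapStart();

D3D12_SAMPLER_DESC samplerDesc = {};
samplerDesc.Filter = D3D12_FILTER_MIN_MAG_MIP_LINEAR;
samplerDesc.AddressU = D3D12_TEXTURE_ADDRESS_MODE_WRAP;
samplerDesc.AddressV = D3D12_TEXTURE_ADDRESS_MODE_WRAP;
samplerDesc.AddressW = D3D12_TEXTURE_ADDRESS_MODE_WRAP;
samplerDesc.MaxLOD = D3D12_FLOAT32_MAX;

device->CreateSampler(&samplerDesc, handle);
handle.ptr += increment; // advance to the next heap slot for the next sampler
```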