Making shader object as generic as possible


    • Making shader object as generic as possible

      Hi,

      I am trying to make a 2D game engine using OpenGL ES 2.0 (iOS for now). I've written the application layer in Objective-C and a separate, self-contained RendererGLES20 in C++. No GL-specific call is made outside the renderer. It is working perfectly.

      But I have some design issues when using shaders. Each shader has its own unique attributes and uniforms that need to be set just before the main draw call (glDrawArrays in this case). For instance, in order to draw some geometry I would do:

      Source Code

      void RendererGLES20::render(Model * model)
      {
          // Set a bunch of uniforms
          glUniformMatrix4fv(.......);
          // Enable specific attributes; there can be many
          glEnableVertexAttribArray(......);
          // Set a bunch of vertex attribute pointers
          glVertexAttribPointer(positionSlot, 2, GL_FLOAT, GL_FALSE, stride, model->pCoords);
          // Now actually draw the geometry
          glDrawArrays(GL_TRIANGLES, 0, model->vertexCount);
          // After drawing, disable any vertex attributes
          glDisableVertexAttribArray(.......);
      }


      As you can see, this code is extremely rigid. If I were to use another shader, say a ripple effect, I would need to pass extra uniforms, vertex attributes, etc. In other words, I would have to change the RendererGLES20 render source code just to incorporate the new shader.

      Is there any way to make the shader object totally generic? What if I just want to change the shader object without recompiling the game source? Is there a way to make the renderer agnostic of uniforms, attributes, etc.? And even though we need to pass data to uniforms, what is the best place to do that? The Model class? Should the model class be aware of shader-specific uniforms and attributes?

      The following shows the Actor class:

      Source Code

      class Actor : public ISceneNode
      {
          ModelController * model;
          AIController * AI;
      };


      The model controller class:

      Source Code

      class ModelController
      {
          class IShader * shader;
          int textureId;
          vec4 tint;
          float alpha;
          struct Vertex * vertexArray;
      };


      The shader class just contains the shader object, the compiling and linking subroutines, etc.

      In the game logic class I actually render the object:

      Source Code

      void GameLogic::update(float dt)
      {
          IRenderer * renderer = g_application->GetRenderer();
          Actor * a = GetActor(id);
          renderer->render(a->model);
      }


      Please note that even though Actor extends ISceneNode, I haven't started implementing the scene graph yet. I will do that as soon as I resolve this issue.

      Any ideas on how to improve this? Related design patterns, etc.?

      Thank you in advance!
    • Rendering itself is not a huge task; building a rendering abstraction is. On my last game, even with a non-abstracted renderer, rendering probably ate a majority of the project's dev time.

      I am actually working on a renderer as a low-priority side project, simply for learning; I am interested in rendering, but not as much as game logic, which is why we decided to use Ogre3D in place of our own solution. That may be an option for you as well, but since it isn't an answer to your question, I will try to give whatever insight I can. I am still a beginner at abstracting rendering, however.

      - Use a concrete naming convention for vertex attributes in your shaders and rendering abstraction; try to match these with the built-in GLSL names
      - Have a vertex format specifier, such as 'V3T2...' etc.; your GPU program abstraction will then know exactly what the shader expects, and will expect it from the given sources
      - Depending on how common your uniforms are, you may also want a concrete convention for certain things, especially MV matrix arrays for transforming instances in a buffer
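      The 'V3T2'-style format specifier mentioned above can be decoded in a few lines. This is only a sketch, not code from this thread: the `VertexAttrib` and `parseVertexFormat` names are made up, and it assumes every component is a float.

```cpp
#include <cassert>
#include <string>
#include <vector>

// One attribute slot decoded from a format string such as "V3N3T2":
// a semantic tag ('V' = position, 'N' = normal, 'T' = texcoord, ...)
// plus its float component count and byte offset in an interleaved vertex.
struct VertexAttrib {
    char semantic;
    int  components;
    int  byteOffset;
};

// Parse a format specifier like "V3T2" into attribute descriptions and
// compute the interleaved stride, assuming float components throughout.
std::vector<VertexAttrib> parseVertexFormat(const std::string& fmt, int& strideOut)
{
    std::vector<VertexAttrib> attribs;
    int offset = 0;
    for (size_t i = 0; i + 1 < fmt.size(); i += 2) {
        VertexAttrib a;
        a.semantic   = fmt[i];
        a.components = fmt[i + 1] - '0';
        a.byteOffset = offset;
        offset += a.components * (int)sizeof(float);
        attribs.push_back(a);
    }
    strideOut = offset;
    return attribs;
}
```

      The renderer can then map each semantic tag to its conventional attribute name and feed `byteOffset`/`strideOut` straight into glVertexAttribPointer, without ever hard-coding a particular shader's layout.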

      For custom uniforms that pertain to a specific shader, there really is no good way I know of aside from manually specifying that uniform. However, that doesn't mean hard-coded; consider these examples:

      - Ogre3D uses a high-level material file system, which lets you specify the data that goes into a shader
      - UDK has its 'Kismet' system, which lets you visually plug modules together to create a material. I actually attempted to recreate this for fun and came out with a nice Qt-based system that could produce diffuse/normal-mapped materials
      - Unity3D has pre-defined material types with slots that you can attach textures to; you can also make these yourself

      When it comes down to it, I think you have to draw a line between what happens behind the scenes in your shader and what you want to be able to customize:

      - Vertex transformations are behind the scenes; this data shouldn't be exposed to the user
      - The same goes for things like animation skinning
      - Texture mapping (or normal, displacement, etc.) should be exposed, whether through a direct C++ interface to the program object or through a scripting language / visual editor

      I am reasonably sure that even the visual editors (at least my implementation) end up converting their shader module tree into a DAG, which is then parsed into a shader and script pair; together these would be called a material. So in the end, maybe a nice scripting system would be best for you, and then, once it is solid (and if it is beneficial), you could even build an editor yourself. A good data format for something like that could be XML (which I used), or you could even use Lua by exposing texture and program objects to it.

      I hope I answered your question and didn't stray too far. I would also suggest looking at Ogre3D's source code for how they abstracted rendering; it is HUGE, which is why I decided to focus on gameplay rather than rendering.
      PC - Custom Built
      CPU: 3rd Gen. Intel i7 3770 3.4Ghz
      GPU: ATI Radeon HD 7959 3GB
      RAM: 16GB

      Laptop - Alienware M17x
      CPU: 3rd Gen. Intel i7 - Ivy Bridge
      GPU: NVIDIA GeForce GTX 660M - 2GB GDDR5
      RAM: 8GB Dual Channel DDR3 @ 1600mhz
    • Interestingly enough, I'm stuck with the very same problem at the moment. Generic processing of vertex layouts is a pain in the a**. I'm thinking about offering the user some predefined abstract vertex formats for use in vertex buffers; then each vertex format would know about its attributes, although that limits the possible combinations.

      Of course, we can make some high-level abstraction, like type-name pairs passed inside a VBO wrapper, and then pass the vertex data for each pair as a separate array (the VBO would be constructed as side-by-side arrays). However, this idea breaks under DirectX, where the only option is to feed interleaved data to the vertex buffer (if using a single vertex buffer). So one would either have to copy the data to an interleaved array first and then copy it again to the VBO, or use multiple buffers. Either way it's an additional cost which I don't want to add. And AFAIK, interleaved data is more cache-friendly (in most cases) than non-interleaved, which makes separate arrays unattractive for vertex buffers even in OpenGL.
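      One possible compromise, sketched here as an illustration rather than a recommendation from this thread: accept per-attribute arrays from the loader, but interleave them once at load time into a single buffer that both a GL VBO and a D3D vertex buffer can consume. The `AttribStream`/`interleave` names are made up, and all attributes are assumed to be floats.

```cpp
#include <cassert>
#include <vector>

// One source attribute stream: its per-vertex float count and its data.
struct AttribStream {
    int components;              // floats per vertex
    std::vector<float> data;     // components * vertexCount floats
};

// Interleave separate attribute arrays into one vertex buffer at load
// time, so the same memory layout can back a D3D vertex buffer or a GL VBO.
std::vector<float> interleave(const std::vector<AttribStream>& streams, int vertexCount)
{
    int floatsPerVertex = 0;
    for (const AttribStream& s : streams)
        floatsPerVertex += s.components;

    std::vector<float> out(floatsPerVertex * vertexCount);
    int dst = 0;
    for (int v = 0; v < vertexCount; ++v)           // one vertex at a time...
        for (const AttribStream& s : streams)       // ...all attributes of that vertex
            for (int c = 0; c < s.components; ++c)
                out[dst++] = s.data[v * s.components + c];
    return out;
}
```

      The copy happens once, at load time, so it avoids the per-frame cost the post above worries about, and the resulting buffer is the cache-friendly interleaved layout on both APIs.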

      If you guys have any ideas on how to reconcile DirectX and OpenGL vertex buffers in a nice, generic way, shoot!
      Looking for a job!
      My LinkedIn Profile
    • OpenGL also supports interleaved data through glVertexAttribPointer; it is fairly straightforward to use.
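      For reference, a common pattern is to describe the interleaved layout with a struct and `offsetof`. The GL calls themselves are shown only in comments, since they need a live context; the `Vertex` layout here is a made-up example matching a 'V3T2' format.

```cpp
#include <cassert>
#include <cstddef>   // offsetof

// An interleaved vertex matching 'V3T2': position, then texcoord.
struct Vertex {
    float position[3];
    float texcoord[2];
};

// With interleaved data every attribute shares one stride (sizeof(Vertex))
// and gets its own byte offset. In GL the calls would look like:
//
//   glVertexAttribPointer(posSlot, 3, GL_FLOAT, GL_FALSE,
//                         sizeof(Vertex), (void*)offsetof(Vertex, position));
//   glVertexAttribPointer(texSlot, 2, GL_FLOAT, GL_FALSE,
//                         sizeof(Vertex), (void*)offsetof(Vertex, texcoord));
```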
    • I am still a bit confused. Suppose I have an actor who grabs a power-up. He starts to glow using a bloom shader, and after 10 seconds goes back to normal by attaching the default shader again. The question basically boils down to:

      How to use different shaders on the same model at runtime (if possible, without changing the renderer code)?

      Consider the following very simple example:

      Default shader:

      Source Code

      attribute vec4 Position;
      uniform mat4 ModelViewProjMatrix;
      void main(void)
      {
          gl_Position = ModelViewProjMatrix * Position;
      }


      The render code inside RendererGLES20 would be:

      Source Code

      void RendererGLES20::render(Model * model)
      {
          glUniformMatrix4fv(mvpUniform, 1, 0, &mvpMatrix);
          GLuint positionSlot = glGetAttribLocation(_program, "Position");
          glEnableVertexAttribArray(positionSlot);
          // Interleaved data, but for now we are ONLY using the positions, ignoring texture, normals and colours.
          const GLvoid* pCoords = &(model->vertexArray[0].Position[0]);
          glVertexAttribPointer(positionSlot, 2, GL_FLOAT, GL_FALSE, stride, pCoords);
          glDrawArrays(GL_TRIANGLES, 0, model->vertexCount);
          glDisableVertexAttribArray(positionSlot);
      }


      Simple enough! Now imagine that the actor got some power-up and the following crazy shader is applied:

      Crazy Shader:

      Source Code

      attribute vec4 Position;
      attribute vec4 SourceColor;
      attribute vec2 Texture;
      attribute vec4 Normal;
      attribute vec2 tempAttrib0;
      attribute vec2 tempAttrib1;
      // A bunch of varyings, but we don't need to worry about these for now
      varying vec4 .........;
      varying .........;
      uniform mat4 MVPMatrix;
      uniform vec2 BloomAmount;
      uniform vec2 BloomQuality;
      uniform vec2 BloomSize;
      uniform vec2 RippleSize;
      uniform vec2 RippleAmount;
      uniform vec2 RippleLocation;
      uniform vec2 deltaTime;
      uniform vec2 RippleMaxIterations;
      void main(void)
      {
          // Some crazy voodoo source code here...
          // .........
          gl_Position = ..............;
      }


      As you can clearly see, in order to attach this shader to the model I would need to modify the actual renderer source code to the following:

      Source Code

      void RendererGLES20::render(Model * model)
      {
          glUniformMatrix4fv(mvpUniform, 1, 0, ....);
          glUniform2fv(bloomAmountUniform, 1, ....);
          glUniform2fv(bloomQualityUniform, 1, ....);
          glUniform2fv(bloomSizeUniform, 1, ....);
          glUniform2fv(rippleSizeUniform, 1, ....);
          glUniform2fv(rippleAmountUniform, 1, ....);
          glUniform2fv(rippleLocationUniform, 1, ....);
          glUniform2fv(rippleMaxIterationsUniform, 1, ....);
          glUniform2fv(deltaTimeUniform, 1, ....);
          GLuint positionSlot = glGetAttribLocation(_program, "Position");
          GLuint sourceColorSlot = glGetAttribLocation(_program, "SourceColor");
          GLuint textureSlot = glGetAttribLocation(_program, "Texture");
          GLuint normalSlot = glGetAttribLocation(_program, "Normal");
          GLuint tempAttrib0Slot = glGetAttribLocation(_program, "tempAttrib0");
          GLuint tempAttrib1Slot = glGetAttribLocation(_program, "tempAttrib1");
          glEnableVertexAttribArray(positionSlot);
          glEnableVertexAttribArray(sourceColorSlot);
          glEnableVertexAttribArray(textureSlot);
          glEnableVertexAttribArray(normalSlot);
          glEnableVertexAttribArray(tempAttrib0Slot);
          glEnableVertexAttribArray(tempAttrib1Slot);
          // Interleaved data
          const GLvoid* pCoords = &(model->vertexArray[0].Position[0]);
          const GLvoid* sCoords = &(model->vertexArray[0].SourceColor[0]);
          const GLvoid* tCoords = &(model->vertexArray[0].Texture[0]);
          const GLvoid* nCoords = &(model->vertexArray[0].Normal[0]);
          const GLvoid* t0Coords = &(model->vertexArray[0].TempAttrib0[0]);
          const GLvoid* t1Coords = &(model->vertexArray[0].TempAttrib1[0]);
          glVertexAttribPointer(positionSlot, 3, GL_FLOAT, GL_FALSE, stride, pCoords);
          glVertexAttribPointer(sourceColorSlot, 4, GL_FLOAT, GL_FALSE, stride, sCoords);
          glVertexAttribPointer(textureSlot, 2, GL_FLOAT, GL_FALSE, stride, tCoords);
          glVertexAttribPointer(normalSlot, 4, GL_FLOAT, GL_FALSE, stride, nCoords);
          glVertexAttribPointer(tempAttrib0Slot, 2, GL_FLOAT, GL_FALSE, stride, t0Coords);
          glVertexAttribPointer(tempAttrib1Slot, 2, GL_FLOAT, GL_FALSE, stride, t1Coords);
          glDrawArrays(GL_TRIANGLES, 0, model->vertexCount);
          glDisableVertexAttribArray(positionSlot);
          glDisableVertexAttribArray(sourceColorSlot);
          glDisableVertexAttribArray(textureSlot);
          glDisableVertexAttribArray(normalSlot);
          glDisableVertexAttribArray(tempAttrib0Slot);
          glDisableVertexAttribArray(tempAttrib1Slot);
      }


      You can see how vastly different the code is that you need to write in order to attach a different shader. Now what if I want to re-attach the default shader? (This attaching and detaching of shaders has to happen at run-time, e.g. when the actor collects a power-up.)

      Any ideas how I can efficiently and easily implement this, allowing a model to change shaders at run-time? I am just looking for a nice implementation/idea. How would you handle the above problem?
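      One common technique (not something proposed by the posters above, just a sketch of the usual answer) is to make the renderer data-driven: walk the shader's active uniform list, which GLES 2.0 exposes via glGetActiveUniform after linking, and look each name up in a per-model parameter table. Below, the actual glUniform* dispatch is injected as a callback so the logic can run without a GL context; `UniformValue`, `ParamTable`, and `applyUniforms` are all made-up names.

```cpp
#include <cassert>
#include <functional>
#include <map>
#include <string>
#include <vector>

// A tagged value for the handful of uniform types a GLES 2.0 shader can
// declare; only float/vec2/mat4 are shown for brevity.
struct UniformValue {
    enum Type { Float, Vec2, Mat4 } type;
    float data[16];
};

// Per-model parameter table: the model (or its material) fills this in,
// and the renderer never needs to know which shader will consume it.
using ParamTable = std::map<std::string, UniformValue>;

// Generic upload loop. 'setUniform' stands in for the glUniform* dispatch
// (glUniform1f / glUniform2fv / glUniformMatrix4fv chosen by type).
// Returns how many parameters the shader actually consumed.
int applyUniforms(const std::vector<std::string>& activeUniformNames,  // from glGetActiveUniform
                  const ParamTable& params,
                  const std::function<void(const std::string&, const UniformValue&)>& setUniform)
{
    int applied = 0;
    for (const std::string& name : activeUniformNames) {
        auto it = params.find(name);
        if (it == params.end())
            continue;   // shader wants it, model didn't supply it: skip (or warn)
        setUniform(name, it->second);
        ++applied;
    }
    return applied;
}
```

      With this in place, switching from the default shader to the "crazy" one only changes which parameter table the model carries; RendererGLES20::render never mentions BloomAmount or RippleSize by name.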


    • Oh boy... you're getting into some dangerous territory... haha, nah, I'm just kidding. Designing a flexible shader/effect system is not hard... just very, very, insanely tedious...

      I won't go into much detail, but this may require a major rewrite of your graphics system. More specifically, adding more abstractions to your effects (optional), shaders, and models/materials/geometry.

      You may also need another system for handling generic parameters (attributes). This is the key to your "handling generic attributes" problem. In my pet project I have a parameter system through which I can pass various data types from one system to another. When I compile my shader, I grab all the data that the shader needs. In Direct3D I use ShaderReflection; you'll just have to look up the OpenGL equivalent. Or you can add another data/info layer on top of your shader that explicitly specifies the data it needs (vectors, matrices, sampler states, blending states, etc.).

      When I compile my shader, I grab all the required data and then get a parameter handle for each piece of data from my parameter system. Various components in turn supply the required data:

      Actor transform component: passes the transform matrix to the parameter system.
      Material component: passes shading info like diffuse, normal, and specular textures/values, and any other data you can think of that fits in the material system.
      Geometry: passes all the vertex attributes, and the vertex and index buffers.

      Then my shader/effect system just grabs the data that it really needs from the parameter system, through the parameter handles it got earlier after compiling the shader.
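      The handle-sharing idea described above might look roughly like this. `ParameterSystem` and `MatrixHandle` are made-up names, only matrix parameters are shown, and a real system would cover more types:

```cpp
#include <cassert>
#include <map>
#include <memory>
#include <string>

// A shared slot for one matrix parameter. The transform component writes
// through its handle; the shader later reads through the very same handle.
struct MatrixParam {
    float m[16];
};
using MatrixHandle = std::shared_ptr<MatrixParam>;

// Minimal registry in the spirit of the parameter system described above:
// producers and consumers both ask for a handle by name and end up
// pointing at the same storage.
class ParameterSystem {
public:
    MatrixHandle getMatrixHandle(const std::string& name) {
        MatrixHandle& h = matrices_[name];
        if (!h)
            h = std::make_shared<MatrixParam>();   // created on first request
        return h;
    }
private:
    std::map<std::string, MatrixHandle> matrices_;
};
```

      The transform component would do `auto h = params.getMatrixHandle("WorldMatrix"); h->m[0] = ...;` each frame, and the shader's update pass reads the same slot without either side knowing about the other.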

      For your problem: let's say your character has a normal, vanilla diffuse shader effect, and then it needs to switch to a glow effect after getting a power-up. You can create two effect/material pairs: the first for the vanilla shader and the second for the glow shader. The major difference here, besides the different shaders, is that the vanilla material only supplies the diffuse texture, while the glow material carries some additional data that the glow shader needs. In my implementation I actually store the effect in my material system.

      So in code it would simply look like this:

      pActor->GetMaterialComponent()->SetMaterial( pMaterialDiffuse );

      // Render model with diffuse material...

      // Power-up! Switch to amazing glow shader!
      pActor->GetMaterialComponent()->SetMaterial( pMaterialGlow );

      I will post some additional resources for you to read a bit later; I have to go for now.
    • My material system is just an empty container that can hold generic parameters, so I can store different types of attributes that I can pass to the shader, or to any other system that may need something from the material system.

      Here's the link to the shader system that I used:
      gamedev.net/topic/169710-materialshader-implmentation/

      Pay attention to the posts by Yann L.

      The parameter system that I used is an adaptation of the one in Hieroglyph 3 by Jason Zink: hieroglyph3.codeplex.com/
    • I agree with BrentChua -

      Separate your geometry from your material. The central component of a material is of course the shader, which will require geometry with the right data (like multiple UV coordinates for texturing) and other data as the shader requires, such as textures or normal maps.

      This allows you to swap different shaders onto the same model for different effects, as you asked about, but also to use the same shader on many different geometries.

      One caution: going "hog wild", as they say in Austin, is a good way to kill the performance of your game. Try to strike a careful balance between the complexity of your shaders (i.e., don't write an ubershader that can do everything) and applying a different shader to every object just so it can draw a little differently. LOL, can you imagine someone writing a "Red" shader or a "Blue" shader! Definitely not the right approach, especially on mobile platforms, where switching shaders can be expensive.
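      The shader-switching cost mentioned here is usually attacked by sorting the draw list by shader before submitting, so each program is bound at most once per frame. A small sketch of the idea; `DrawItem` and the function names are illustrative, and the glUseProgram call is only indicated in a comment:

```cpp
#include <algorithm>
#include <cassert>
#include <vector>

// A draw request tagged with the shader program it needs.
struct DrawItem {
    unsigned shaderId;
    int      modelId;
};

// Count how many program switches a draw list would cost if submitted in order.
int countShaderSwitches(const std::vector<DrawItem>& items) {
    int switches = 0;
    unsigned current = 0;          // 0 = nothing bound yet
    for (const DrawItem& d : items) {
        if (d.shaderId != current) {
            ++switches;            // here the renderer would call glUseProgram
            current = d.shaderId;
        }
    }
    return switches;
}

// Sorting by shader first reduces switches to one per distinct shader.
// stable_sort preserves submission order within each shader group.
void sortByShader(std::vector<DrawItem>& items) {
    std::stable_sort(items.begin(), items.end(),
                     [](const DrawItem& a, const DrawItem& b) {
                         return a.shaderId < b.shaderId;
                     });
}
```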
      Mr.Mike
      Author, Programmer, Brewer, Patriot
    • Originally posted by mrmike
      I agree with BrentChua -

      Separate your geometry from your material. The central component of a material is of course the shader, which will require geometry with the right data (like multiple UV coordinates for texturing) and other data as the shader requires, such as textures or normal maps.

      This allows you to swap different shaders onto the same model for different effects, as you asked about, but also to use the same shader on many different geometries.

      One caution: going "hog wild", as they say in Austin, is a good way to kill the performance of your game. Try to strike a careful balance between the complexity of your shaders (i.e., don't write an ubershader that can do everything) and applying a different shader to every object just so it can draw a little differently. LOL, can you imagine someone writing a "Red" shader or a "Blue" shader! Definitely not the right approach, especially on mobile platforms, where switching shaders can be expensive.


      As I understand it, each material is responsible for setting its shader data (which can be hugely different for different shaders). Consider the following actor:

      [IMG:http://farm4.staticflickr.com/3740/9242463844_bd412864f0_o.png]

      If I understood correctly, the glue code between material and shader is responsible for making all the DX/GL calls to set up uniforms and attributes?


    • Forgive me if I have misunderstood you, but I will try to answer your question as best as I can. I will answer your second statement first, as it makes more sense before the first one:

      "If I understood correctly, the glue code between material and shader is responsible for making all the DX/GL calls to set up uniforms and attributes?"

      The glue code, which is the parameter system, is not responsible for either system. You can think of the parameter system as a glorified global variable manager, or something like a registry: it just stores the stuff that you tell it to. I mentioned in my previous post asking for a handle from the parameter system. Let's say an actor's transform component wants to store the transform matrix in my parameter system; it would get something like a ParameterMatrixHandle *. That handle is just a pointer to the variable being stored in the parameter system. Through that handle, I can check its current value or change it. Now, my shader also requires the transform matrix, based on its requirements. When it gets the transform matrix handle from the parameter system, that handle is the same handle that the actor's transform component got. In DX, through the ShaderReflection API, the shader only grabs the parameter handles that it needs. So if the actor's material contains a bunch of other unnecessary data, it doesn't matter, because the shader will only check for the data attributes it actually uses.

      As for the GL/DX calls that set the attributes, the shader object is still responsible for them. In my shader class, I have an "update" function that checks each parameter handle, grabs the value through the handle, and passes it on via whatever API-specific calls are needed to get the data to the actual shader object.

      "As I understand it, each material is responsible for setting its shader data (which can be hugely different for different shaders). Consider the following actor:"
      Developing a generic parameter system may take you a good number of days or weeks to get up and running. So if that is not currently an option for you, you could instead make the material responsible for passing the data directly to the shader. You can use class inheritance: define a base material class, then derive specific materials, containing specific data attributes, for each shader that you create. As a safeguard, when your material passes data to your shader, I would first check that the attribute the material is passing actually exists in the shader. This can work as a quick fix, but as you create more shaders and shader permutations, your derived material classes can quickly grow and get ugly. With the parameter system, I was able to design a single material class and a single shader class that fit whatever material types and shaders I need. My materials and shaders vary through an intermediary data language: XML.
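      The inheritance-based quick fix described above might look like this. Everything here is illustrative: the `Shader` stand-in just records uniform writes instead of calling glUniform*, and its `hasUniform` always returning true is a placeholder for a real glGetUniformLocation check.

```cpp
#include <cassert>
#include <map>
#include <string>

// Stand-in for the shader wrapper: records which uniforms were set so the
// sketch can run without a GL context (real code would call glUniform*).
struct Shader {
    std::map<std::string, float> uniforms;
    // Placeholder for: glGetUniformLocation(program, name) != -1
    bool hasUniform(const std::string&) const { return true; }
    void setFloat(const std::string& name, float v) { uniforms[name] = v; }
};

// Base material: each concrete material knows how to push its own data.
class Material {
public:
    virtual ~Material() {}
    virtual void apply(Shader& s) const = 0;
};

class DiffuseMaterial : public Material {
public:
    void apply(Shader& s) const override {
        if (s.hasUniform("Alpha")) s.setFloat("Alpha", 1.0f);
    }
};

class GlowMaterial : public Material {
public:
    explicit GlowMaterial(float amount) : amount_(amount) {}
    void apply(Shader& s) const override {
        if (s.hasUniform("BloomAmount")) s.setFloat("BloomAmount", amount_);
    }
private:
    float amount_;
};
```

      Swapping materials at run-time, as in the earlier `SetMaterial(pMaterialGlow)` example, is then just repointing one `Material*`; the renderer only ever calls `material->apply(shader)`.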

      So whenever I experiment with a new material and shader, I just create a new material file and a new shader/effect file. In my project, my material file looks like this:

      <!-- Material elements are stored in Material at (material-)global scope
           (i.e. before render passes) and also in sub-materials at
           local scope (i.e. per render pass). -->
      <Material>
          <Effect>
              <File>Diffuse.efx</File>
          </Effect>
          <Parameters>
              <Element>
                  <Type>Texture2D</Type>
                  <Name>gfx_gDiffuseTex</Name>
                  <Value>woodCrate.jpg</Value>
              </Element>
          </Parameters>
          <SubMaterials>
              <SubMaterial>
                  <Parameters>
                      <!--...-->
                  </Parameters>
              </SubMaterial>
          </SubMaterials>
      </Material>


      Then I would define my shader file like this (actually, this is my technique file, which in turn just loads the actual HLSL files):

      <Technique>
          <EffectRegisters>
              <Register>
                  <EffectID>3</EffectID>
                  <Priority>50</Priority>
              </Register>
          </EffectRegisters>
          <RenderPasses>
              <Pass>
                  <RenderViewStage>GBuffer</RenderViewStage>
                  <Requirements>
                      <Geometry>
                          <Element>Position</Element>
                          <Element>Normal</Element>
                      </Geometry>
                      <Material>
                      </Material>
                  </Requirements>
                  <RenderStates>
                      <DepthStencilState>default.dss</DepthStencilState>
                      <BlendState>NoBlending.bs</BlendState>
                      <RasterizerState>BackFaceCulling.ras</RasterizerState>
                  </RenderStates>
                  <Shaders>
                      <VertexShader>
                          <File>Geometry.shx</File>
                          <FunctionName>VS</FunctionName>
                      </VertexShader>
                      <PixelShader>
                          <File>Geometry.shx</File>
                          <FunctionName>PS</FunctionName>
                      </PixelShader>
                  </Shaders>
              </Pass>
              <Pass>
                  <RenderViewStage>Composite</RenderViewStage>
                  <Requirements>
                      <Geometry>
                          <Element>Position</Element>
                          <Element>TextureDiffuse</Element>
                      </Geometry>
                      <Material>
                      </Material>
                  </Requirements>
                  <RenderStates>
                      <DepthStencilState>default.dss</DepthStencilState>
                      <BlendState>NoBlending.bs</BlendState>
                      <RasterizerState>BackFaceCulling.ras</RasterizerState>
                  </RenderStates>
                  <Shaders>
                      <VertexShader>
                          <File>GeoComposite.shx</File>
                          <FunctionName>VS</FunctionName>
                          <Defines>
                              <Define>
                                  <Name>DIFFUSE</Name>
                                  <Value>1</Value>
                              </Define>
                          </Defines>
                      </VertexShader>
                      <PixelShader>
                          <File>GeoComposite.shx</File>
                          <FunctionName>PS</FunctionName>
                          <Defines>
                              <Define>
                                  <Name>DIFFUSE</Name>
                                  <Value>1</Value>
                              </Define>
                          </Defines>
                      </PixelShader>
                  </Shaders>
              </Pass>
          </RenderPasses>
      </Technique>
    • @BrentChua
      Thank you for such a detailed response. I guess I needed to hear the following:

      Originally posted by BrentChua
      As for the GL/DX calls that set the attributes, the shader object is still responsible for them. In my shader class, I have an "update" function that checks each parameter handle, grabs the value through the handle, and passes it on via whatever API-specific calls are needed to get the data to the actual shader object.

      Developing a generic parameter system may take you a good number of days or weeks to get up and running. So if that is not currently an option for you, you could instead make the material responsible for passing the data directly to the shader. You can use class inheritance: define a base material class, then derive specific materials, containing specific data attributes, for each shader that you create.



      PS: Why don't you join the GCC community game? If you don't have time to actively participate then maybe you can review our designs and give feedback. It would be nice to have a programmer chat :)


    • PS: Why don't you join the GCC community game? If you don't have time to actively participate then maybe you can review our designs and give feedback. It would be nice to have a programmer chat :)


      Haha, I really appreciate the offer, but sadly I'll have to pass for now. I'm in a bit of a tough spot in life right now. Things haven't been going well for me and my family; we've had one tragic event after another. Plus, with the recent closing down of the company I used to work for, I've basically been out of a job in the industry for the past 12 months.

      But I still hang out here from time to time anyway. And I don't really think I'm qualified enough to give code/architecture reviews, but I'll try to share what I know when I can. :)