Virtual Functions

    • Virtual Functions

      I'm not sure if this question will be any good, but it's something I've been wondering about a lot. I read in an AI programming text that virtual functions should be avoided whenever possible due to the overhead of looking up the function in the virtual function table. If this is the case, why is the engine described in GCC4 so dependent on virtual functions? Is that overhead minuscule, or is it impossible to structure a great engine without leaning so heavily on virtual functions?
      Macoy Madson-http://www.augames.f11.us/
    • Someone correct me if I'm wrong here, but it's my understanding that virtual methods and virtual tables are really only an issue when you abuse inheritance.

      Here's an interesting article on virtual tables
      coldattic.info/shvedsky/s/blog…-walks-into-a-bar/posts/3

      One of the things that they point out in the GCC4 book (and if I remember correctly, the GCC3 book as well) is to keep your inheritance hierarchy as flat as possible. In fact, that's the whole reason for going with the Actor/Component pattern; otherwise you'd end up with some sort of tree structure needed to represent your Actor hierarchy lol.

      Making smart/flat inheritance structures essentially throws out any argument against virtual methods, imo. Something like the sketch below is the idea.
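
      Not from the book's source, just a bare-bones illustration of composition over deep inheritance; the component class names are made up:

      Source Code

      #include <memory>
      #include <vector>

      // Flat hierarchy: every component derives directly from one base.
      class ActorComponent
      {
      public:
          virtual ~ActorComponent() = default;
          virtual void Update(float deltaTime) = 0;
      };

      class RenderComponent : public ActorComponent
      {
      public:
          void Update(float deltaTime) override { /* draw */ }
      };

      class PhysicsComponent : public ActorComponent
      {
      public:
          void Update(float deltaTime) override { /* integrate */ }
      };

      // An Actor is just a bag of components, not the root of a deep class tree.
      class Actor
      {
      public:
          void AddComponent(std::unique_ptr<ActorComponent> pComponent)
          {
              m_components.push_back(std::move(pComponent));
          }

          void Update(float deltaTime)
          {
              for (auto& pComponent : m_components)
                  pComponent->Update(deltaTime);
          }

      private:
          std::vector<std::unique_ptr<ActorComponent>> m_components;
      };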
    • Virtual functions aren't actually that much slower and they should not be avoided. The article Kl1X linked is decent and talks about what's actually going on under the hood. Really, it's the cache miss that will kill you, not the vtable jumping. With regular functions, the CPU can cache that code. This is especially important in deep inner loops where the function is called thousands of times:

      Source Code

      for (int i = 0; i < 100000; ++i)
      {
          DoStuff(); // this probably shouldn't be virtual
      }


      The CPU will attempt to cache the code in DoStuff() so that it doesn't have to keep fetching it from memory. The i variable will likely also be in a register, so the CPU can execute the entire loop without having to touch system memory at all. If DoStuff() were a virtual function, the CPU could probably only cache the vtable and would have to go out to system memory every iteration.
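
      If a call really is virtual, really is in a loop like that, and actually shows up in a profile, one option is to hoist the type decision out of the loop. This is just a sketch; the Widget/FastWidget names are made up and not from the book:

      Source Code

      class Widget
      {
      public:
          virtual ~Widget() = default;
          virtual void DoStuff() = 0; // dispatched through the vtable
      };

      class FastWidget : public Widget
      {
      public:
          void DoStuff() override { ++m_counter; }
      private:
          int m_counter = 0;
      };

      void UpdateNaive(Widget* pWidget)
      {
          // Virtual dispatch on every single iteration.
          for (int i = 0; i < 100000; ++i)
              pWidget->DoStuff();
      }

      void UpdateHoisted(Widget* pWidget)
      {
          // Check the concrete type once; the qualified call below is non-virtual,
          // so the compiler is free to inline it.
          if (auto* pFast = dynamic_cast<FastWidget*>(pWidget))
          {
              for (int i = 0; i < 100000; ++i)
                  pFast->FastWidget::DoStuff();
          }
          else
          {
              for (int i = 0; i < 100000; ++i)
                  pWidget->DoStuff();
          }
      }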

      You should use virtual functions whenever it makes sense and not worry about the performance costs. Honestly, if your game is bottlenecking on virtual function look-ups, something is very wrong.

      -Rez
    • Another thing to keep in mind is that most higher level programming languages treat all functions as virtual by default. For example, any function that's not private or static in Java is considered virtual. There might be optimizations that Java uses to reduce this overhead though. Generally the way I think about it is that anything I can make non-virtual is just a bonus that I get for using C++ and that there's nothing wrong with virtual functions when they are needed.
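
      For what it's worth, the C++ side of that looks like this; Enemy is a made-up example, not anything from the book:

      Source Code

      class Enemy
      {
      public:
          virtual ~Enemy() = default;

          // Opt-in dynamic dispatch: only this call goes through the vtable.
          virtual void Attack() { --m_health; }

          // Non-virtual by default: resolved at compile time and freely inlinable.
          // The equivalent public instance method in Java would be virtual unless
          // it were declared final.
          int GetHealth() const { return m_health; }

      private:
          int m_health = 100;
      };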

      James
      Thanks for all the replies! So basically keep hierarchies flat and there will be relatively smooth sailing. Rez, in your example, would that be a significant problem (if DoStuff were a virtual function), or would it not affect the performance too much?
      Macoy Madson-http://www.augames.f11.us/
    • I wrote a quick program and did some profiling. The difference was very small, though it was certainly the best-case scenario. If it were me and there was a reason I wanted to make DoStuff() a virtual function, I would do it without a second thought. I talk about some of this stuff in Game Coding Complete in chapter 23, starting on page 846.
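
      A quick-and-dirty harness along these lines is one way to do that kind of comparison. The class names and iteration count are made up, and a modern optimizer may devirtualize or inline the direct call entirely, so treat any numbers you get as rough at best:

      Source Code

      #include <chrono>
      #include <cstdio>

      class Base
      {
      public:
          virtual ~Base() = default;
          virtual int DoStuffVirtual(int x) { return x + 1; }
          int DoStuffDirect(int x) { return x + 1; }
      };

      class Derived : public Base
      {
      public:
          int DoStuffVirtual(int x) override { return x + 2; }
      };

      int main(int argc, char**)
      {
          // Pick the concrete type at run time so the compiler can't trivially
          // devirtualize the calls below.
          Base* pObj = (argc > 1) ? static_cast<Base*>(new Derived) : new Base;
          constexpr int kIterations = 100000000;
          int sink = 0;

          auto start = std::chrono::steady_clock::now();
          for (int i = 0; i < kIterations; ++i)
              sink = pObj->DoStuffVirtual(sink); // dispatched through the vtable
          auto mid = std::chrono::steady_clock::now();

          for (int i = 0; i < kIterations; ++i)
              sink = pObj->DoStuffDirect(sink); // direct, inlinable call
          auto end = std::chrono::steady_clock::now();

          using ms = std::chrono::duration<double, std::milli>;
          std::printf("virtual: %.2f ms\n", ms(mid - start).count());
          std::printf("direct:  %.2f ms\n", ms(end - mid).count());
          std::printf("sink: %d\n", sink); // keep the result observable
          delete pObj;
          return 0;
      }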

      Here's the dirty secret about software performance analysis: it's never where you think it is. There are lots of little tricks to increase performance that actually make no real difference in a final product. For example, consider the following code:

      Source Code

      // version 1
      void DoSomething(GameObject* pObj)
      {
          if (!pObj)
              return;
          if (!pObj->IsValidObject())
              return;
          if (!pObj->HasSomethingToDo())
              return;
          pObj->DoSomething();
      }

      // version 2
      void DoSomething(GameObject* pObj)
      {
          if (pObj)
          {
              if (pObj->IsValidObject())
              {
                  if (pObj->HasSomethingToDo())
                  {
                      pObj->DoSomething();
                  }
              }
          }
      }


      Which function is faster, version 1 or version 2? The answer is that version 2 is slightly faster, because most compilers (including VS) generate code that assumes you're going to take the if branch. If I were writing this for real, I would probably write version 1 because it's cleaner and easier to read.

      But what about the performance costs? Well, in a real game, there's so much going on that this tiny little performance optimization is like a drop of water in the ocean. No one will notice it. That's why premature optimizations like this are never a good idea. You want to get the system working the way it's supposed to work, then worry about optimization. You always, always, always want to measure your performance before you make any changes. Run your game through a profiler and actually measure where your game is slowing down. It's almost never where you think it is.
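
      Even if you don't have a full profiler handy, a crude scoped timer like this will tell you more than guessing. This is just a sketch, not code from the book or from The Sims:

      Source Code

      #include <chrono>
      #include <cstdio>

      // Prints how long the enclosing scope took when it goes out of scope.
      class ScopedTimer
      {
      public:
          explicit ScopedTimer(const char* label)
              : m_label(label), m_start(std::chrono::steady_clock::now()) {}

          ~ScopedTimer()
          {
              auto elapsed = std::chrono::steady_clock::now() - m_start;
              double ms = std::chrono::duration<double, std::milli>(elapsed).count();
              std::printf("%s: %.3f ms\n", m_label, ms);
          }

      private:
          const char* m_label;
          std::chrono::steady_clock::time_point m_start;
      };

      void UpdateAI()
      {
          ScopedTimer timer("UpdateAI");
          // ... the code you suspect is slow ...
      }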

      For example, let's look at AI on The Sims. Where do you think the slow part is? When I first got there, I assumed it would be in scoring all of those interactions to determine which one was better. That certainly can be slow, but I was surprised to learn that the Test() functions were sometimes even slower! A Test() function on The Sims is like an early-out for an interaction. If you're already using the toilet, I can't use the toilet at the same time, so it tests out. If you're lying in bed, I can't lie in bed with you because our relationship isn't high enough. If you're not my girlfriend, I can't kiss you because that relationship bit isn't set. Running all these tests can be REALLY slow.

      So it's important to actually measure your game and figure out what's wrong. You can spend your time fixing the worst performance issues and your game will probably be fine, even if you abuse virtual functions and compiler branch prediction. The only time this stuff really matters is if it's truly in an inner loop somewhere. There are places in The Sims AI where we call into functions over a million times a frame on a densely populated lot. These are functions where things like branch prediction and virtual function calls might matter. But even in those cases, they probably don't. I was able to solve many of those performance issues with clever caching so that we didn't have to call it that many times. One issue was basically one O(n^2) algorithm calling into another O(n^2) algorithm. Again, clever caching can eliminate the need for that kind of algorithmic complexity.
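
      Not the actual Sims code, obviously, but the caching idea looks something like this: compute the expensive answer once per frame and reuse it inside the inner loops instead of recomputing it. SimId, ExpensiveTest(), and TestCache are all made-up names:

      Source Code

      #include <unordered_map>

      using SimId = int;

      // Stand-in for the slow early-out test we don't want inside an inner loop.
      bool ExpensiveTest(SimId sim) { return (sim % 2) == 0; }

      class TestCache
      {
      public:
          // Call once at the start of each frame so stale results don't leak across frames.
          void BeginFrame() { m_results.clear(); }

          bool Test(SimId sim)
          {
              auto it = m_results.find(sim);
              if (it != m_results.end())
                  return it->second;       // cache hit: no expensive call
              bool result = ExpensiveTest(sim);
              m_results[sim] = result;     // cache miss: compute once, remember it
              return result;
          }

      private:
          std::unordered_map<SimId, bool> m_results;
      };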

      -Rez