Thursday, April 7, 2011

Spring Unit Animation: Why in-sim animation should be optional even though it's awesome

For background, you need to know what Spring is:

Back in "the day" there was a game called "Total Annihilation[+]" which could rightly be described as a "Revolutionary Real-Time Strategy Game" for a lot of reasons... but the two that are important to the discussion here are that [1] the game was set up, and the developers provided tools, to make it exceptionally modifiable and [2] units were 3D models, and the scripts that animated them also directly controlled their gameplay behaviour.

This second point is what I mean by "in-sim animation".  Essentially, in order for the unit to aim its weapon or deploy its nanolathe mechanism or load or unload another unit... the script actually, like programming a robot, sent commands to adjust the position and rotation of pieces and then sent fire commands to weapons etc.  This was great because it meant that these animation scripts could take into account every possible angle or situation by having the script control the position of every bit and bob of the unit... allowing a programmer with some understanding of spatial geometry to create fluid and realistic animations.
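To make the "programming a robot" idea concrete, here is a minimal Python sketch of what an in-sim aiming script does. The `Piece`/`Unit` classes and method names are hypothetical stand-ins, not the actual COB/BOS or Lua unit-script API; the point is just that the script itself computes the geometry and poses the model pieces:

```python
import math

class Piece:
    """A named sub-part of the unit model; the script rotates it directly."""
    def __init__(self, name):
        self.name = name
        self.heading = 0.0   # rotation around the vertical axis (radians)
        self.pitch = 0.0     # rotation around the lateral axis (radians)

    def turn_to(self, heading, pitch):
        self.heading = heading
        self.pitch = pitch

class Unit:
    """Hypothetical unit whose aim script runs inside the simulation."""
    def __init__(self, x, y, z):
        self.pos = (x, y, z)
        self.turret = Piece("turret")
        self.barrel = Piece("barrel")

    def aim_weapon(self, tx, ty, tz):
        # The script works out the geometry itself, like programming a robot:
        dx, dy, dz = tx - self.pos[0], ty - self.pos[1], tz - self.pos[2]
        heading = math.atan2(dx, dz)                # yaw toward the target
        pitch = math.atan2(dy, math.hypot(dx, dz))  # elevation toward the target
        self.turret.turn_to(heading, 0.0)
        self.barrel.turn_to(heading, pitch)
        # The engine would only fire once the pieces are actually in place.
        return heading, pitch

u = Unit(0.0, 0.0, 0.0)
heading, pitch = u.aim_weapon(10.0, 10.0, 10.0)
```

Because the fire point follows from the posed pieces, the animation is the gameplay: every frame of turret movement matters to the simulation, which is exactly the property discussed below.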

Anyways, fast forward a bit and "Total Annihilation" is starting to show its years... still an awesome game, but with modern computers better able to handle complex graphics and simulation, a group of programmers and TA modders started a project which would eventually become what is now known as the "Spring RTS Engine[+]".

Spring has been growing away from its TA roots, but it still maintains that great "in-sim animation" feature as the only way to animate units.  The problem is that I believe it's holding back some potential features.

Because gameplay is affected by these unit animation scripts, they need to be run for every unit, regardless of whether the unit is being drawn or even in the player's line of sight, on every computer, including the server.  This also significantly increases the number of variables that can be rounded differently and result in desynchronization.  Another issue is that simulating all this stuff is CPU intensive, which results in an unnecessarily high "frame rate bottleneck": CPU load is being generated for these animations even when they are never rendered.
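To see why every extra simulated variable is a desync risk, here is a tiny Python illustration (a generic IEEE-754 fact, not anything Spring-specific): floating-point results depend on evaluation order, so two clients that compute an animation update even slightly differently can diverge.

```python
# Floating-point addition is not associative, so two machines that happen
# to evaluate the same animation update in a different order can end up
# one ulp apart:
left  = (0.1 + 0.2) + 0.3
right = 0.1 + (0.2 + 0.3)
print(left == right)   # False on IEEE-754 doubles

# In a lockstep RTS every client checksums its simulation state each frame;
# if that one-ulp difference lives in a *synced* animation variable, it is
# enough to flag a desync and drop the game.
```

The more animation state that lives on the synced side, the more places such a divergence can creep in.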

So, what's my solution to this problem?

Unit scripts would have the "option" of triggering "non-sim animations"... these animation scripts would have no effect on gameplay.  For example, a unit would have two "synced" vectors for its emit points; the script would move these vectors to the correct point in space and then send this change to the "unsynced" animation script to deal with.  This "unsynced" code could be as simple as the existing animation script just running in parallel with the "synced" code, or it could be as complicated as an inverse kinematics solver that lets the unit point in the new direction, or an interpreter for 3D Studio or Ogre3D style animation scripting.

The cool thing (in my mind) is that this could probably be done already with the existing engine... units would just be emit points and their Lua scripts would send a message to an unsynced gadget that handles the unit's animation.
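A minimal Python sketch of that split, with hypothetical class and message names (the real mechanism would be Spring's Lua synced/unsynced gadget messaging, but the shape is the same): the synced side owns only the emit-point vectors and pushes pose messages one way; the unsynced side is free to animate however it likes, or not at all.

```python
import math

class SyncedUnit:
    """Synced side: only the emit-point vectors matter for gameplay."""
    def __init__(self):
        self.emit_points = [(0.0, 0.0, 1.0)]
        self.outbox = []  # stands in for a synced -> unsynced message channel

    def aim(self, heading, pitch):
        # Move the synced emit point to the correct place in space...
        self.emit_points[0] = (math.sin(heading), 0.0, math.cos(heading))
        # ...then hand the pose off; the unsynced side never writes back,
        # so nothing it does can affect gameplay or cause a desync.
        self.outbox.append(("AimPose", heading, pitch))

class UnsyncedAnimator:
    """Unsynced side: free to do IK, easing, or skip work for off-screen
    units entirely, because gameplay cannot depend on it."""
    def __init__(self):
        self.current_pose = (0.0, 0.0)

    def handle(self, msg):
        kind, heading, pitch = msg
        if kind == "AimPose":
            self.current_pose = (heading, pitch)  # could ease/IK toward this

unit = SyncedUnit()
animator = UnsyncedAnimator()
unit.aim(math.pi / 2, 0.1)
for msg in unit.outbox:       # one-way message pump, synced -> unsynced
    animator.handle(msg)
```

The one-way channel is the whole trick: the simulation stays deterministic and cheap, while each client's renderer can spend (or save) as much effort on animation as it wants.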

Sooo... crazy smart idea or just plain crazy?
