What is the plan for making animation work in Armory? I’ve had a thought that you could call a specific action set of animation to “open a door” or “walk”, but how would you tell Armory to blend two actions together, or how to move the hand to the actual door handle when the player initiates the “open door” command from an angle? Gaming animation in general is not something I’m used to, but it would be useful if Armory had these basic skills early on.
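(For what it’s worth, reaching the hand to a door handle from an arbitrary angle is usually done with inverse kinematics layered on top of the authored action. As a rough illustration of the idea only, here is a generic planar two-bone solver using the law of cosines; the function name and parameters are made up for this sketch and are not part of any actual Armory API.)

```python
import math

def two_bone_ik(l1, l2, target_x, target_y):
    """Solve a planar two-bone chain (e.g. upper arm + forearm) reaching
    for a target point; returns (shoulder_angle, elbow_angle) in radians.
    Angles are measured so that (0, 0) means the arm lies straight
    along the +x axis."""
    dist = math.hypot(target_x, target_y)
    # Clamp: an out-of-reach target gives a fully extended arm,
    # and a tiny epsilon avoids dividing by zero at the shoulder.
    dist = max(min(dist, l1 + l2), 1e-6)
    # Law of cosines for the elbow bend (0 = straight).
    cos_elbow = (l1 ** 2 + l2 ** 2 - dist ** 2) / (2.0 * l1 * l2)
    elbow = math.pi - math.acos(max(-1.0, min(1.0, cos_elbow)))
    # Shoulder: direction to the target, minus the offset the bent
    # elbow introduces.
    cos_shoulder = (l1 ** 2 + dist ** 2 - l2 ** 2) / (2.0 * l1 * dist)
    shoulder = math.atan2(target_y, target_x) - math.acos(max(-1.0, min(1.0, cos_shoulder)))
    return shoulder, elbow
```

An engine would run something like this per frame on the arm chain and blend the result on top of the playing “open door” action, so the hand lands on the handle regardless of the angle the player approached from.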
Here is a suggestion: let people animate at whatever frame rate they like, and have the engine map that user-set frame rate number to a second of real-life time. That way, an animation made on the assumption it would play at 24 frames per second can easily be played at 60+ without committing the cardinal sin of timing gameplay to frames, which NO developer should EVER do if they can avoid it. You would just import the animation and tell Armory what frame rate it was intended to be played at (per animation, since people will import items animated at different frame rates into the same project), and Armory would treat that frame rate as equal to one second of real-life time. It would then know that, for example, frame twelve of a 24 fps animation equals half a second, play that pose at the half-second mark, and simply interpolate the “missing animation” in the frames in between as it plays the animation at 60 fps or whatever other frame rate. I’m simplifying this greatly, but you get the general idea, and I’ve already gone on for a long while.
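The mapping described above fits in a few lines. This is only an illustrative sketch (the function name and the flat keyframe list are made up, not Armory’s API): the animation stores its authored frame rate, playback is driven by elapsed wall-clock time, and the pose between two authored frames is linearly interpolated.

```python
def sample(keyframes, authored_fps, elapsed_seconds):
    """Sample one animation channel (one value per authored frame)
    at an arbitrary real-time position, independent of render fps."""
    frame_pos = elapsed_seconds * authored_fps  # fractional frame index
    i = int(frame_pos)
    if i >= len(keyframes) - 1:
        return keyframes[-1]  # clamp at the last authored frame
    t = frame_pos - i  # how far we are between frame i and frame i+1
    return keyframes[i] * (1.0 - t) + keyframes[i + 1] * t
```

With a 24 fps animation, `sample(keys, 24, 0.5)` lands exactly on frame twelve, while a 60 fps game loop calling it every 1/60 s gets smooth in-between values for free; the render rate never appears in the animation data itself.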
I think that in most games there isn’t much true blending of animations; it looks more like two or more actions playing after the initial one. For example, we press A to duck: if the character is standing still, it ducks immediately, but if the player is running, a transition action is inserted to blend in before the actual ducking one plays… I don’t have a clue whether that’s really how it works, though. Regarding your frames-per-second idea, I like it!
EDIT: I’ve just written nonsense; even the Blender Game Engine has action blending.
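For what it’s worth, the run-to-duck transition described above is typically a crossfade: over a short window, the weight of the old action ramps down while the new one ramps up, and the two poses are mixed channel by channel. A minimal sketch with hypothetical names (this is not Armory’s API, and real engines blend quaternions rather than flat channel lists):

```python
def transition_weight(time_since_start, duration):
    """Linear 0 -> 1 ramp over the transition window, clamped."""
    return min(max(time_since_start / duration, 0.0), 1.0)

def crossfade(pose_a, pose_b, weight):
    """Mix two poses channel by channel; weight 0 -> pose_a, 1 -> pose_b."""
    return [a * (1.0 - weight) + b * weight for a, b in zip(pose_a, pose_b)]
```

Blending from a running pose into a ducking pose over, say, 0.2 seconds would then be: each frame, compute `w = transition_weight(t, 0.2)` and apply `crossfade(run_pose, duck_pose, w)`.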
I’m using my knowledge of design to come up with ideas on how to create the best user experience for what will hopefully be the most powerful game engine Blender will have. I would recommend https://en.wikipedia.org/wiki/The_Humane_Interface as a guide for how to move forward in terms of design. Humane design minimizes the learning curve by focusing on what human beings intuitively want to do, which makes things easier for anyone taking on one of the steepest learning curves in modern software (Blender’s). It’s the same kind of design that the Blender Foundation’s “bring things back to the viewport” effort screams of. It’s an old book, but it still holds the knowledge one would need for this. I think that blending would be a perfect contender. For context, think of something like today’s NLA editor, but better designed for organizing and using animations. It would also have to lend itself to programming, both through the node system and in general. You would need a clear web (I mean that literally) in the interface showing the connections between animations and their influences: which keyed bones affect what, by how much, and with what curve. Again, this is up for debate, but it’s a good picture to build from and change. That said, adopting the industry-standard approach may also be a good way to go; whatever does the job better and whatever people like better.
I would like to see something like this in Armory: https://www.youtube.com/watch?v=on7wAz0fsGg
This does look awesome. Once node setups and programming are in with Build 9, animation should be the next item on the list. Of course, support for Cycles nodes should be added gradually over time.