View Issue Details

ID: 0002247
Project: The Dark Mod
Category: Animation
View Status: public
Last Update: 01.09.2019 22:25
Reporter: Springheel
Assigned To:
Priority: normal
Severity: normal
Reproducibility: have not tried
Status: new
Resolution: open
Product Version: SVN
Target Version:
Fixed in Version:
Summary: 0002247: Facial Animations
Description: Try making some head animations for non-idle states, like combat or pain.
Tags: No tags attached.

Relationships

related to 0003605 new Eyelid bones jitter when AI are talking 

Activities

Springheel

19.05.2011 22:37

administrator   ~0003847

models_src/proguard_mb has the following, which might be for the new head rig:

head_idle.mb
head_blink.mb
head_death.mb
nbohr1more

19.05.2011 23:36

developer   ~0003848

I am sure this is probably too much work but I thought I would post it for your consideration.

Since all vertex animations require significant CPU overhead, I was wondering if it would be cheaper to use an animated normal map to change facial expressions.
This would likely be light enough to allow very detailed animation changes akin to Source engine facial tech.

You would create several template facial positions for categories of expressions then create scrolling normal maps for the actual animation.

You'd probably need to bake the whole head for each animation frame unless you could break off just the "face portion".

(As I said, probably too much hassle.)
Springheel

19.05.2011 23:54

administrator   ~0003849

Setting aside the fact that you'd have to do that individually for every head (since they don't share normal maps), it would also require code support, and I can't imagine how it would look convincing if the actual geometry of the face didn't move.
Springheel

19.05.2011 23:56

administrator   ~0003850

A note for later: I suspect that the lipsynching code might be overriding the head animations.
nbohr1more

20.05.2011 00:07

developer   ~0003851

Last edited: 20.05.2011 00:19

The only thing that would look wrong is the silhouette. You could mitigate that a little by animating BOTH the low-poly mesh and the normal map, but you'd lose most of the performance benefit.

Not sure you'd need code support? The animation editor allows you to apply different materials at different frames, no? Or different "skins"?

"All" you would need to do is specify animated normal map A at frames 8 through 20 (etc.), but it would also require careful timing between the two animation types.

(and a hell of a lot of normal map baking...)

Even if not for the bulk of the animation, this could possibly be used for small touches like an eyebrow lift or a small smirk when a conversation turns to something interesting (etc.)
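The frame-range idea above could be sketched like this. All names here are hypothetical, not engine API; it just illustrates mapping animation frames to baked normal-map variants:

```cpp
#include <string>

// Hypothetical sketch: pick which baked normal-map variant applies at a
// given animation frame. Names are illustrative, not real engine assets.
std::string normalMapForFrame(int frame) {
    // say the "expression A" bake covers frames 8 through 20
    if (frame >= 8 && frame <= 20) {
        return "head_normal_expression_a";
    }
    return "head_normal_neutral";
}
```

The careful-timing problem mentioned above is exactly the hard part: this lookup has to stay in lockstep with the mesh animation's frame counter.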

Springheel

20.05.2011 01:06

administrator   ~0003852

Last edited: 20.05.2011 01:07

I don't know how you would make the mouth or brows look like they're moving when none of the geometry around the mouth moves. As for code support, what happens when the animation is interrupted before the frame command that restores the skin is called?

It just doesn't seem practical on any level, and since the head mesh is constantly animated anyway, I can't see why there would be any performance saving.

You might be able to do very tiny modifications, but it hardly seems worth all the effort for that.

nbohr1more

20.05.2011 01:31

developer   ~0003854

Last edited: 23.05.2011 23:53

Ah, so the head mesh is always regenerated per frame even when no features change.

Oh well... I suppose you could lower the frame rate to save a little...

The normal map would be a bake of a full head animation frame, so you wouldn't just be moving the mouth or eye per se. Of course, adding to the impracticality, you would need to turn each MD5 anim frame into a high-poly static model for the normal map bake.

Yes, state interruption would be a nightmare to coordinate.


Edit:

Nope, when the vertex positions do not change between frames, a number of CPU-intensive operations are conserved, per this Intel document on id Tech 4:

http://software.intel.com/en-us/articles/optimizing-the-rendering-pipeline-of-animated-models-using-the-intel-streaming-simd-extensions/

So animated normal maps would improve performance (just for the record).

Springheel

31.05.2011 22:15

administrator   ~0003866

Someone else is welcome to explore animated maps if they wish. All the other complications aside, I can't believe it would have any noticeable performance impact. A single AI has thousands of animated verts. Facial animations might use two dozen verts at a time...you'd need hundreds of AI on screen to equal the impact of a single AI.
nbohr1more

01.06.2011 01:01

developer   ~0003867

I was imagining giving the facial animation the appearance of hundreds of animated bones versus the current numbers. Nonetheless, your point is valid. Without significantly boosting facial animation quality, animated normal maps would not offer any noticeable optimization.

Yes, the verdict is not looking good on the animated normalmap concept.

If videomap could be used for diffuse and normal maps, then it would be a more realistic concept to approach. Until then (if ever), it would require too much video RAM, or wouldn't offer much of a quality boost.

After reading more about all the technical pitfalls of LA Noire's version, I've come to the conclusion that it's also not a good approach in general... Not sure how that translates to artist-generated frames compared to "captured" frames, though...
Springheel

03.05.2012 22:46

administrator   ~0004524

Here's something from tdm_ai_base.script; I don't know if it's relevant:

void ai_darkmod_base::Head_Idle()
{
    if (hasAnim(ANIMCHANNEL_HEAD, "idle")) {
        // play the animation
        playCycle(ANIMCHANNEL_HEAD, "idle");
        
        eachFrame
        {
            if (getMoveType() == MOVETYPE_SLEEP)
            {
                animState(ANIMCHANNEL_HEAD, "Head_Idle_Sleep", 4);
            }
        }
    }
}
Springheel

11.07.2013 00:21

administrator   ~0005696

This would be useful:

tdm_ai_showanimstate: shows the current anim states of the anim channels torso, legs and head, as well as their wait states.

At least I could find out what animation the head is actually playing.
Springheel

27.07.2013 23:45

administrator   ~0005894

Last edited: 28.07.2013 00:00

Hmm, testing with that on gave confusing results. The head channel is listed as playing head_idle constantly, even when the AI is talking (even though it is actually playing head_talk.md5anim at this point). It must also be playing head_blink regularly.

When the AI is killed, the head channel shows head_death.

Also noticed that there is a channel eyelids on the head mesh.

Here are the defined animations:

anim idle   models/md5/chars/heads/npcs/head_idle.md5anim
anim blink  models/md5/chars/heads/npcs/head_blink.md5anim
anim talk1  models/md5/chars/heads/npcs/head_talk.md5anim

anim death  models/md5/chars/heads/npcs/head_death.md5anim
{
    prevent_idle_override
}

anim sleep


Trying to create a "walk" animation had no effect; the AI still played head_idle while walking.

Springheel

28.07.2013 00:04

administrator   ~0005895

Going back to this:

void ai_darkmod_base::Head_Idle()
{
    if (hasAnim(ANIMCHANNEL_HEAD, "idle")) {
        // play the animation
        playCycle(ANIMCHANNEL_HEAD, "idle");
        
        eachFrame
        {
            if (getMoveType() == MOVETYPE_SLEEP)
            {
                animState(ANIMCHANNEL_HEAD, "Head_Idle_Sleep", 4);
            }
        }
    }
}

it sounds like the head is supposed to play head_idle constantly unless "movetype_sleep" is true. Does that mean every animation would need its own "if" statement in order to play a different head animation?
Springheel

28.07.2013 20:55

administrator   ~0005897

Last edited: 28.07.2013 20:59

In the same script at the beginning, there is a list of animations, but only one has a Head channel listed:

    void Torso_Idle_Sleep();
    void Legs_Idle_Sleep();
    void Head_Idle_Sleep();

Also, it seems like each animation lists actions for the torso and legs separately. Sleeping also includes one for the head:

void ai_darkmod_base::Head_Idle_Sleep()
{
    if (hasAnim(ANIMCHANNEL_HEAD, "sleep")) {
        // play the animation
        playCycle(ANIMCHANNEL_HEAD, "sleep");
        
        eachFrame
        {
            if (getMoveType() != MOVETYPE_SLEEP && !AI_KNOCKEDOUT && !AI_DEAD)
            {
                animState(ANIMCHANNEL_HEAD, "Head_Idle", 4);
            }
        }
    }
}

So does death:

void ai_darkmod_base::Head_Death()
{
    if (hasAnim(ANIMCHANNEL_HEAD, "death")) {
        // play the animation
        playAnim(ANIMCHANNEL_HEAD, "death");
        
        while( !animDone(ANIMCHANNEL_HEAD, 0)) {
            waitFrame();
        }
    }
    
    // finish up and disable this animchannel
    finishChannelAction(ANIMCHANNEL_HEAD, "death");
    finishAction("death");
    wait(1);
    disableAnimchannel(ANIMCHANNEL_HEAD);
}

And so does KOing.

SteveL

01.09.2014 13:25

developer   ~0006921

Last edited: 01.09.2014 13:29

The reason those scripts are hard to interpret is that under the hood each AI is 2 animated entities: a body, which is the AI entity, and an attached head, which is an AF Attachment. I suspect you couldn't see blinking and talking using tdm_ai_showanimstate because it was showing you the head channel of the AI entity, instead of an animchannel on the head entity.

Every animated entity, even something as simple as a waving flag, has the same 5 anim channels:
ANIMCHANNEL_ALL
ANIMCHANNEL_TORSO
ANIMCHANNEL_LEGS
ANIMCHANNEL_HEAD
ANIMCHANNEL_EYELIDS

Entities that have legs and torsos etc. apparently map a set of joints to each channel. I haven't found that mapping yet, but I read it on the wiki and it makes sense. Entities that don't have arms and legs, like flags, simply use ANIMCHANNEL_ALL for their animations, which affects all joints.

When the scripts request an animation on ANIMCHANNEL_HEAD, *some* of the game functions treat that as a special case and they find the head that's attached to the AI and play the animation on that entity instead. But not necessarily all. Some might simply play it on the head channel of the body mesh, which would affect either nothing (if no joints are mapped to HEAD), or only the angle and position of the head. Either way they wouldn't affect the joints *in* the head.
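The special-casing described above might be pictured like this. A hedged toy model, not the engine's actual code, with all names illustrative:

```cpp
#include <string>

enum AnimChannel { ANIMCHANNEL_ALL, ANIMCHANNEL_TORSO, ANIMCHANNEL_LEGS,
                   ANIMCHANNEL_HEAD, ANIMCHANNEL_EYELIDS };

struct Entity {
    std::string name;
    Entity* attachedHead = nullptr;  // the AF-attached head entity, if any
};

// Hypothetical sketch: a play request on ANIMCHANNEL_HEAD is redirected to
// the attached head entity when one exists; otherwise it stays on the body
// mesh itself (where it may affect nothing, or only head angle/position).
Entity* resolvePlayTarget(Entity& ai, AnimChannel channel) {
    if (channel == ANIMCHANNEL_HEAD && ai.attachedHead != nullptr) {
        return ai.attachedHead;
    }
    return &ai;
}
```

The "not necessarily all" caveat above is the point: functions that skip this redirection play on the body's head channel instead, and never touch the joints inside the head.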

How did you want to start head animations? By implementing a rule that says "if the head has an animation matching what gets played on the torso, play it on the head too"? Or something more flexible?

In the meantime, I'll carry on mapping out the current set-up of any script that specifies ANIMCHANNEL_HEAD.

Springheel

01.09.2014 13:34

administrator   ~0006922

Last edited: 01.09.2014 13:41

Thinking in ideal terms, there are four different kinds of animations we might want to play on the head:

1. Random blinking (limited to eyelid channel)
2. Lip-syncing when talking (already covered in a primitive way)
3. Animations linked to body animations
4. Facial animations to indicate mood

Numbers 1 and 2 already work. Number 3 would be covered by a "matching animation" system. Number 4 might not be necessary, but I could imagine mappers wanting to call certain facial animations when using the conversation system--for example, an AI plays an "angry" animation on the head while hearing bad news.

//Not found that mapping yet//

That's done in the modelDef--you can see it in any ai .def file.

SteveL

02.09.2014 00:16

developer   ~0006924

Last edited: 02.09.2014 00:20

I've taken a read through the game code, and this all *ought* to work already, but the scripts might not be set up to make use of it. I've not read through those thoroughly yet, but you quoted the relevant stuff above.

3) Should work by adding a call to SyncAnimChannels to sync the head after starting an animation on the torso. SyncAnimChannels is already set up to look for an anim on the head that has the same name as the torso anim, and to match up the frame (by number). I'm not sure whether any of the scripts do that though, so we'll need to think about when we want it to happen.
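A minimal sketch of that name-matching sync, under the behaviour described here (not the engine's actual SyncAnimChannels signature; all names are illustrative):

```cpp
#include <map>
#include <string>

struct ChannelState { std::string animName; int frame; };

// Hedged sketch: if the head has an anim with the same name as the torso's
// current anim, start it and match up the frame number. Returns false when
// no same-named head anim exists, leaving the head untouched.
bool syncHeadToTorso(const ChannelState& torso,
                     const std::map<std::string, int>& headAnims,  // name -> frame count
                     ChannelState& head) {
    if (headAnims.find(torso.animName) == headAnims.end()) {
        return false;
    }
    head.animName = torso.animName;
    head.frame = torso.frame;  // synced by frame number
    return true;
}
```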

4) Is supportable through the existing script events PlayAnim and PlayCycle, which already forward the play request to the head entity when called with ANIMCHANNEL_HEAD. But it'll need a new custom AnimState script created for the head, so that the custom anims don't get interrupted by idle anims. The game scripts currently provide custom animstates only for torso and legs. Those are used by the path tasks and conversations as well as the AI state system.

Right now the game has only a few AnimStates for the head: the scripts you quoted above, for sleeping, death and idle. There's no reason why there can't be a custom animstate for the head too, one that allows a mapper-specified animation to play without interruption. We'll need to tweak the scripts you quoted above too, so that the head can drop out of its idle state without having to be asleep.

To test it all I'll need to find (1) a head animation to play in isolation, and (2) two compatible anims for head and body with the same name and number of frames. Any ideas?

Springheel

02.09.2014 00:43

administrator   ~0006925

Hmm, 2 is something I hadn't considered before. When I was running tests I was just taking an existing head animation, like "death" and renaming it. If it didn't have the same number of frames as the body animation, would that cause it not to work?
SteveL

02.09.2014 08:29

developer   ~0006929

It should still work. It just won't be synchronised in any meaningful sense. I'll check but I'm pretty sure that if the sync code requests frame 45 on the head, and its anim has only 30 frames, it'll wrap and start playing at frame 15.
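That wrapping rule is just modular arithmetic. A sketch, assuming 1-based frame numbers (an assumption about the engine, not verified behaviour):

```cpp
// Wrap a requested frame into the range [1, numFrames], so a sync request
// past the end of a shorter anim starts over from the beginning.
int wrapFrame(int requestedFrame, int numFrames) {
    return ((requestedFrame - 1) % numFrames) + 1;
}
```

Frame 45 against a 30-frame anim wraps to frame 15, matching the example above.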
SteveL

02.09.2014 18:50

developer   ~0006930

Heads only have 2 effective anim channels: ALL and EYELIDS. No others are defined. Lip sync plays on animchannel_all, and it does something unique: it plays the frames of the animation out of sequence, by selecting a frame based on the sound. So we'll need to avoid mixing it with custom anims. Then again custom anims will play on animchannel_all too, so lip syncs can be allowed to override custom anims if a bark kicks in.
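The out-of-sequence frame selection could be pictured as a simple amplitude-to-frame mapping. Purely illustrative; the engine's real lip-sync heuristic is more involved than this:

```cpp
#include <algorithm>

// Toy model of "selecting a frame based on the sound": map an amplitude
// sample in [0, 1] onto a 1-based mouth frame index, where higher amplitude
// picks a wider mouth-open frame.
int lipSyncFrame(double amplitude, int numFrames) {
    int frame = static_cast<int>(amplitude * (numFrames - 1)) + 1;
    return std::clamp(frame, 1, numFrames);
}
```

Because frames are chosen by sound rather than by time, blending this with a time-sequenced custom anim on the same channel would fight over the same joints, which is why mixing them needs to be avoided.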

That's almost certainly what's causing a clash with blinking 0003605, even if I haven't been able to observe it yet. The lip sync will be interpolating its own positions for the eyelids with the blink animation.

I reckon we should consider mapping all non-eyelid joints to ANIMCHANNEL_HEAD, so that lip sync and custom anims can be independent of blinking.
Springheel

02.09.2014 23:56

administrator   ~0006932

Last edited: 02.09.2014 23:57

"lip syncs can be allowed to override custom anims if a bark kicks in."

Hmm. That's going to be limiting, but I'm not sure we can do anything about it. For example, it would be nice if while in combat, AI had an "angry" expression. But they also make barks while attacking. Does that mean the head would snap from angry to neutral to angry? Not sure how that would look.

How difficult is it to establish new animation channels?

"I reckon we should consider mapping all non-eyelid joints to ANIMCHANNEL_HEAD"

How would this affect the KO and sleep animations, which control the eyelids?

SteveL

03.09.2014 00:32

developer   ~0006933

Last edited: 03.09.2014 00:33

//
How difficult is it to establish new animation channels?
//
It should be straightforward, as long as you don't want to exceed the standard 4 sets of custom mappings per animated entity. They can re-use the existing names, with whatever joints mapped, or even use new names for the existing channel slots.

//
How would this affect the KO and sleep animations, that control the eyelids?
//
I guess those would still play on animchannel_all, so they'd still affect all joints. I forgot to mention: animchannel_head on the head is unused right now. The scripts above that play those anims on animchannel_head are on the AI, not the head, and the code passes them on to animchannel_all on the head. Sleep and death already have their own animstate scripts, so it's easy to make special cases of them.

Lip syncing and custom anims could play on the non-eyelid joints. I guess we could split up the head further to try to isolate lip movement, but I'm not sure how it would look if the face frowns but the lips don't. Maybe not bad, and we can experiment with blending. Lip syncing is already another special case, so its settings can be tweaked.

Springheel

03.09.2014 19:54

administrator   ~0006936

My thought was, if it's not too difficult to make new channels, we could create a separate channel for the mouth. Then lipsynching would run only on that channel. If I have this right, that means an angry combat animation could run on the _all channel, but animations on the "mouth" channel would override them--this would result in the AI keeping his angry face when yelling a combat bark--only his mouth would move.
SteveL

03.10.2014 18:46

developer   ~0007041

That's a good alternative. I'm looking through the existing setup so I can work out what will affect what.
SteveL

03.10.2014 19:10

developer   ~0007042

Random notes

An Actor's "headAnim" AnimState controller is hard-wired to animchannel_all on the head. Can be changed but a lot of stuff uses it, and nothing has to specify a channel.

Opened a new tracker for some animation-related script events that look broken for heads. TODO: Check whether any of these are actually used.
SteveL

03.10.2014 19:47

developer   ~0007044

Attached a diagram laying out the various paths by which anims get played or controlled on the head.

Most stuff goes through the Actor's headAnim controller which currently always specifies animchannel_all (i.e. all joints) on the head. Lip syncing is a special case so could be unhitched from that method and played on a new lip channel without disrupting the rest of the code. That would fix the other issue (eyelids jittering when speaking) but it probably won't play well with potential new head anims like frowning, if those play on animchannel_all.

Making all anims default to a new channel that misses out lips and eyelids wouldn't work too well either when the lips *aren't* talking. And some of those anims need to play on the eyelids as you said: sleep, death etc. I'll continue to mull over my diagram.
SteveL

03.10.2014 19:48

developer  

animchannel_head.gif (59,918 bytes)
Springheel

03.10.2014 21:38

administrator   ~0007045

I won't pretend to understand the diagram... but do we have control over which channels take precedence if there's a conflict?
SteveL

04.10.2014 00:03

developer   ~0007046

We do; there are a few tools for controlling the precedence: syncing one channel to another, starting an overriding anim later so it takes over, and tweaking animation weights manually if necessary.
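The third tool, manual weight tweaking, amounts to a weighted blend of joint poses. A minimal sketch under that assumption, with a one-axis pose for illustration:

```cpp
struct JointPose { double pitch; };  // one rotation axis, for illustration

// Blend a base pose with an overriding anim's pose. Weight 1.0 means the
// overriding anim fully wins; intermediate weights mix the two.
JointPose blendPose(const JointPose& base, const JointPose& overriding,
                    double weight) {
    return { base.pitch * (1.0 - weight) + overriding.pitch * weight };
}
```

This is also why "starting an overriding anim later" works as a precedence tool: the engine ramps the newer anim's weight up, so it takes over the blend.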

The tricky bit is how to get it to work without making the picture above even more complicated, or duplicating the more complex setup used for the main Actor's anim channels which would be overkill. Heads have one anim channel and one anim state right now. I can't fiddle with that anim state to build in special rules because it's the same anim state code that's used for the main Actor's anim channels. The same goes for over half the code in the diagram. But there's usually a way.

How do I find out exactly which joints are used by the talk anims? Looking at models\md5\chars\heads\npcs\head_talk.md5anim, the only joints with flags are "LipUpperMiddle", "MouthRight", and "jaw". Not mouth left. Does that sound right?
Springheel

04.10.2014 00:28

administrator   ~0007047

Last edited: 04.10.2014 00:41

At the moment, as far as I know, "lip" synching only uses a single joint--the jaw. Although the head skeleton has several bones around the mouth, none of them are currently used by existing animations. It would be cool to have more accurate lip-synching that used more bones, but I imagine that's a complex task.

edit: I went back and looked at the animation, and it looks like there are two other bones being moved--- mouth3 and mouth9. I'm not sure exactly what those bones do, however, as the vast majority of the heads (if not all) have no verts weighted to those bones.

VanishedOne

01.09.2019 22:25

developer   ~0011831

I just came across this, and thought it might be worth posting the link: http://www.lunaran.com/page.php?id=181

Issue History

Date Modified Username Field Change
08.06.2010 14:23 Springheel New Issue
11.06.2010 17:21 Springheel Status new => assigned
11.06.2010 17:21 Springheel Assigned To => Springheel
19.05.2011 22:37 Springheel Note Added: 0003847
19.05.2011 23:36 nbohr1more Note Added: 0003848
19.05.2011 23:54 Springheel Note Added: 0003849
19.05.2011 23:56 Springheel Note Added: 0003850
20.05.2011 00:07 nbohr1more Note Added: 0003851
20.05.2011 00:08 nbohr1more Note Edited: 0003851 View Revisions
20.05.2011 00:18 nbohr1more Note Edited: 0003851 View Revisions
20.05.2011 00:19 nbohr1more Note Edited: 0003851 View Revisions
20.05.2011 01:06 Springheel Note Added: 0003852
20.05.2011 01:07 Springheel Note Edited: 0003852 View Revisions
20.05.2011 01:31 nbohr1more Note Added: 0003854
23.05.2011 23:53 nbohr1more Note Edited: 0003854 View Revisions
31.05.2011 22:15 Springheel Note Added: 0003866
01.06.2011 01:01 nbohr1more Note Added: 0003867
03.05.2012 22:46 Springheel Note Added: 0004524
11.07.2013 00:21 Springheel Note Added: 0005696
27.07.2013 23:45 Springheel Note Added: 0005894
27.07.2013 23:48 Springheel Note Edited: 0005894 View Revisions
27.07.2013 23:51 Springheel Note Edited: 0005894 View Revisions
28.07.2013 00:00 Springheel Note Edited: 0005894 View Revisions
28.07.2013 00:04 Springheel Note Added: 0005895
28.07.2013 20:55 Springheel Note Added: 0005897
28.07.2013 20:59 Springheel Note Edited: 0005897 View Revisions
01.09.2014 01:33 Springheel Assigned To Springheel => SteveL
01.09.2014 13:25 SteveL Note Added: 0006921
01.09.2014 13:27 SteveL Status assigned => feedback
01.09.2014 13:29 SteveL Note Edited: 0006921 View Revisions
01.09.2014 13:34 Springheel Note Added: 0006922
01.09.2014 13:34 Springheel Status feedback => assigned
01.09.2014 13:41 Springheel Note Edited: 0006922 View Revisions
01.09.2014 14:03 SteveL Target Version => TDM 2.03
02.09.2014 00:16 SteveL Note Added: 0006924
02.09.2014 00:20 SteveL Note Edited: 0006924 View Revisions
02.09.2014 00:43 Springheel Note Added: 0006925
02.09.2014 08:29 SteveL Note Added: 0006929
02.09.2014 18:50 SteveL Note Added: 0006930
02.09.2014 18:50 SteveL Relationship added related to 0003605
02.09.2014 23:56 Springheel Note Added: 0006932
02.09.2014 23:57 Springheel Note Edited: 0006932 View Revisions
03.09.2014 00:32 SteveL Note Added: 0006933
03.09.2014 00:33 SteveL Note Edited: 0006933 View Revisions
03.09.2014 19:54 Springheel Note Added: 0006936
03.10.2014 18:46 SteveL Note Added: 0007041
03.10.2014 19:10 SteveL Note Added: 0007042
03.10.2014 19:47 SteveL Note Added: 0007044
03.10.2014 19:48 SteveL File Added: animchannel_head.gif
03.10.2014 21:38 Springheel Note Added: 0007045
04.10.2014 00:03 SteveL Note Added: 0007046
04.10.2014 00:28 Springheel Note Added: 0007047
04.10.2014 00:41 Springheel Note Edited: 0007047 View Revisions
23.11.2014 11:40 SteveL Target Version TDM 2.03 => TDM 2.04
30.12.2015 15:57 SteveL Target Version TDM 2.04 => TDM 2.05
22.11.2016 20:25 nbohr1more Product Version => SVN
22.11.2016 20:25 nbohr1more Target Version TDM 2.05 => TDM 2.06
15.02.2017 04:36 grayman Assigned To SteveL =>
15.02.2017 04:36 grayman Status assigned => new
17.09.2017 20:50 nbohr1more Target Version TDM 2.06 => TDM 2.07
09.06.2018 18:40 nbohr1more Target Version TDM 2.07 =>
01.09.2019 22:25 VanishedOne Note Added: 0011831