
A method for remapping blendShapes between characters.

My facial rig design depends on only four things:


  1. The character model

  2. The guide placement

  3. The weight maps

  4. The sculpts/blendShapes


Models:

I'll ignore the character model for this post, but there are important considerations there for speeding up workflow.


Guides:

I have a method of transferring guides that's basically a simpler version of this.


Weights:

Most of the setups I use on my facial rigs are parameterized, so the weighting from one character transfers well to another. That gives me a good starting point for weighting rather than starting from scratch.


Sculpts:

Sculpts (blendShape deltas) are much harder to reuse between characters. Unlike weights, each delta has three components, which makes them more complex, and they're usually used to create very precise deformations that weighting and regular deformers can't. But worst of all, they're stored in world space.


Take this sphere for example. The red wireframe represents the blendShape target and the blue/black curves represent the deltas for each vertex.

But if I apply those deltas to a sphere that has been rotated 45 degrees, they don't give the desired result.

That issue rears its head if your character has a differently angled jaw or forehead, or if the plane that the mouthCorner moves in is different. Basically, even the simplest design differences in the model make the deltas nearly unusable, which is a big problem because doing sculpts is one of the most costly parts of facial rigging (in terms of time).
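The rotated-sphere problem can be shown in a few lines of plain Python (a standalone sketch, not Maya code). A world-space delta that pushed a vertex straight outward keeps its old direction after the mesh rotates, so it no longer lands where the sculpt intended:

```python
import math

def rotate_y(point, degrees):
    """Rotate a point about the world Y axis."""
    a = math.radians(degrees)
    x, y, z = point
    return (x * math.cos(a) + z * math.sin(a),
            y,
            -x * math.sin(a) + z * math.cos(a))

# A vertex on the +X side of the sphere, with a sculpt delta
# pushing it straight outward along +X.
vertex = (1.0, 0.0, 0.0)
delta = (0.5, 0.0, 0.0)

# Rotate the mesh 45 degrees. The stored world-space delta doesn't
# rotate with it, so the sculpt now pushes sideways across the
# surface instead of outward along the new normal.
rotated_vertex = rotate_y(vertex, 45)
outward = rotate_y(delta, 45)  # what we actually wanted
moved = tuple(v + d for v, d in zip(rotated_vertex, delta))
wanted = tuple(v + d for v, d in zip(rotated_vertex, outward))
```

`moved` and `wanted` end up noticeably far apart even for this one vertex, and the error changes direction from vertex to vertex across a real face.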


Solutions:

So instead of just directly copying the world-space deltas, you'll want to do some sort of conversion on them. One of the simplest things you can implement is a uniform scaler, which scales all the deltas by a single value. This helps when copying deltas between characters of different scales. You could try to get fancy by scaling each individual delta based on the length of the edges surrounding the vertex it modifies, but in my early tests that proved too "noisy", meaning adjacent vertices could have very different scales applied to them. I'm sure there are more complex methods that would work better, but scaling everything by one value is great and requires no dev time.
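A minimal sketch of the uniform scaler, deriving the single factor from overall mesh size (the function names and the bounding-box heuristic are my assumptions for the example, not a specific implementation):

```python
def bounds_size(points):
    """Diagonal length of the axis-aligned bounding box of a point list."""
    xs, ys, zs = zip(*points)
    dx = max(xs) - min(xs)
    dy = max(ys) - min(ys)
    dz = max(zs) - min(zs)
    return (dx * dx + dy * dy + dz * dz) ** 0.5

def scale_deltas(deltas, source_points, target_points):
    """Scale every delta by a single factor: target size / source size."""
    factor = bounds_size(target_points) / bounds_size(source_points)
    return [(dx * factor, dy * factor, dz * factor)
            for dx, dy, dz in deltas]
```

Any single representative measurement works as the factor; a head-height or eye-distance ratio would do the same job as the bounding box.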


So that's the delta length handled, but what about the orientation? The first thing I tested was using the tangent space of the mesh. For each vertex we get the normal and two tangents and use them to create a "parent space" for the delta, and we calculate the delta relative to that parent space. Then we take that relative delta and put it in the "parent space" of the matching vert on the other mesh to get a world-space delta that has (sort of) rotated with the mesh.
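The tangent-space idea can be sketched like this (assuming you can pull a normal and a tangent per vertex from the mesh; the helper names are mine). We build an orthonormal frame at the vertex, express the delta in that frame on the source mesh, then rebuild a world-space delta from the matching frame on the target mesh:

```python
def normalize(v):
    l = (v[0] ** 2 + v[1] ** 2 + v[2] ** 2) ** 0.5
    return (v[0] / l, v[1] / l, v[2] / l)

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def frame(normal, tangent):
    """Orthonormal (tangent, bitangent, normal) frame for one vertex."""
    n = normalize(normal)
    t = normalize(tangent)
    b = normalize(cross(n, t))
    t = cross(b, n)  # re-orthogonalize the tangent against the normal
    return (t, b, n)

def transfer_delta(delta, source_frame, target_frame):
    """Express the delta in the source vertex frame, rebuild in the target's."""
    st, sb, sn = source_frame
    local = (dot(delta, st), dot(delta, sb), dot(delta, sn))
    tt, tb, tn = target_frame
    return tuple(local[0] * tt[i] + local[1] * tb[i] + local[2] * tn[i]
                 for i in range(3))
```

If the target vertex's frame is the source's rotated 90 degrees, a delta along the source tangent comes out along the target tangent, which is exactly the "rotates with the mesh" behavior we want.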

That works alright on very simple meshes, but on a face, where you've got tight areas like the lips, eyelids, and wrinkles, you'll often get a result that doesn't match your expectations. The problem is that the conversion is, again, too "noisy". The parent space doesn't change gradually from vert to vert across the face, and small changes to vertex positions can change the normal and tangents a lot. There are also many tangents to choose from (but that's getting into the weeds a bit). Here are the normals and tangents around an eye corner; the verts on the top and bottom lid look very different. Transfers using this method sometimes move points in seemingly random directions. No good.


What we're looking for is a relative space for each vertex that's smooth instead of noisy but still reflects the proportions of the character. In a UVN-centric system like the one I've talked about, you can just use that same surface. You have a NURBS surface in the shape of the mouth region that's smooth, and the same for the brow. So you can calculate the deltas relative to that surface and use that to improve the transfer between characters.
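Here's a sketch of what the surface-relative conversion could look like (an assumption of how it might be structured, not the author's actual implementation). Each region's smooth surface is modeled as any callable `(u, v) -> (x, y, z)`; a real setup would evaluate the NURBS patch instead. A delta is expressed in the surface frame at the vertex's `(u, v)` on the source character, then rebuilt from the frame at the same `(u, v)` on the target character:

```python
def sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def normalize(v):
    l = dot(v, v) ** 0.5
    return (v[0] / l, v[1] / l, v[2] / l)

def surface_frame(surface, u, v, eps=1e-4):
    """Orthonormal (U tangent, bitangent, normal) frame at (u, v),
    built from numeric partial derivatives of the surface."""
    p = surface(u, v)
    du = sub(surface(u + eps, v), p)
    dv = sub(surface(u, v + eps), p)
    n = normalize(cross(du, dv))
    t = normalize(du)
    b = cross(n, t)
    return (t, b, n)

def convert_delta(delta, src_surface, tgt_surface, u, v):
    """Re-express a world-space delta relative to the source surface,
    then rebuild it in world space on the target surface."""
    st, sb, sn = surface_frame(src_surface, u, v)
    local = (dot(delta, st), dot(delta, sb), dot(delta, sn))
    tt, tb, tn = surface_frame(tgt_surface, u, v)
    return tuple(local[0] * tt[i] + local[1] * tb[i] + local[2] * tn[i]
                 for i in range(3))
```

Because the surface is smooth by construction, neighboring vertices get nearly identical frames, which is exactly what the per-vertex tangent-space version couldn't guarantee.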


An example:


Here we have hippydrome and his twin who was exposed to radiation, zippydrome. Each of them has a surface that would be used for the UVN deformer on their mouth.



Now if hippydrome wants to give you a kiss, it's easy, but for zippydrome, with standard blendShape transferring, he's a bad kisser. But once we use the surface to convert the deltas, he's looking good.

This is just a quick writeup about this idea, because it seems quite promising.

