Kaydara Knowledge Base S0118
General MOTIONBUILDER Workflow
Typical Kaydara Pipeline for Character Animation
1. Create the character in any package supporting the FBX file format, either through the plugins available from
Kaydara or through native FBX import/export support (such as in 3ds max 5.0).
2. Animate the character in MOTIONBUILDER by keyframing, applying motion capture data, or
through real-time input devices and constraints.
3. Bake the animation onto the skeleton and shape channels of the character.
4. Output:
a. Directly to video using the Kaydara ONLINE delivery system
b. Render images, AVIs, or Flash movies using OpenGL
c. Export back to the original software package in which the characters and sets were created for final rendering
d. Export to game engine*
* When you export to a game engine, you must either have a custom converter from FBX to the game
engine, or the engine itself must support the FBX file format. Such a converter can be written using the
MOTIONBUILDER SDK.
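A converter of this kind typically loads the FBX file, walks the node hierarchy to find the skeleton, and samples the plotted animation at the engine's frame rate. The following is a minimal read-side sketch written against the later Autodesk FBX SDK; the class names (FbxManager, FbxImporter, FbxNode, and so on) are an assumption here, since the 2003-era Kaydara FBX SDK and the MOTIONBUILDER SDK expose equivalent functionality under different names.

```cpp
// Minimal FBX-to-engine read sketch (assumes the Autodesk FBX SDK headers/libs).
#include <fbxsdk.h>
#include <cstdio>

// Recursively print skeleton nodes and sample their global transform at time t.
static void DumpSkeleton(FbxNode* node, const FbxTime& t)
{
    FbxNodeAttribute* attr = node->GetNodeAttribute();
    if (attr && attr->GetAttributeType() == FbxNodeAttribute::eSkeleton)
    {
        FbxAMatrix global = node->EvaluateGlobalTransform(t);
        FbxVector4 pos = global.GetT();
        std::printf("%s  T=(%.2f, %.2f, %.2f)\n",
                    node->GetName(), pos[0], pos[1], pos[2]);
    }
    for (int i = 0; i < node->GetChildCount(); ++i)
        DumpSkeleton(node->GetChild(i), t);
}

int main(int argc, char** argv)
{
    const char* fileName = (argc > 1) ? argv[1] : "character.fbx"; // hypothetical file name
    FbxManager* manager = FbxManager::Create();
    manager->SetIOSettings(FbxIOSettings::Create(manager, IOSROOT));

    FbxImporter* importer = FbxImporter::Create(manager, "");
    if (!importer->Initialize(fileName, -1, manager->GetIOSettings()))
    {
        std::printf("Could not open %s\n", fileName);
        return 1;
    }

    FbxScene* scene = FbxScene::Create(manager, "scene");
    importer->Import(scene);
    importer->Destroy();

    // Sample the baked (plotted) animation at time 0; a real converter would
    // loop over the take's time span at the engine's frame rate.
    FbxTime t;
    t.SetSecondDouble(0.0);
    DumpSkeleton(scene->GetRootNode(), t);

    manager->Destroy();
    return 0;
}
```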
Note:
When not specified, this document refers to file formats, scenes and elements in software packages OTHER than
FBX and MOTIONBUILDER.
Author: Tom Wilczynski
Revision: January 2003, by Yang-hai Eakes
Software: Kaydara MOTIONBUILDER 4.0
Description: This document outlines some of the basic concepts and procedures in the MOTIONBUILDER pipeline. It is designed as a beginner's overview of the process and should be used with the Plugin documentation, the Reference Guide, and the Tutorials developed for the software.
I. Content Acquisition
FBX content can be acquired in several ways, such as from Kaydara’s FBX partners through online
content catalogues, or by exporting models or animation from specialized applications like scanners
or camera tracking software. However, the majority of users create the content themselves in a 3D
software package such as 3ds max, LightWave, Maya or Softimage|3D/XSI.
You can model characters and sets in any of the software packages supporting the FBX file format,
either through the free plugins available from Kaydara, or those that support native import and
export of the FBX format. Please refer to each plugin's documentation for what is supported. Since
not all FBX plugins support the creation of elements in a blank scene on Import, you should consider
what you want to animate in MOTIONBUILDER (characters, cameras, lights, and so on), and create
all the required elements in your software package of choice prior to exporting the file to FBX.
General Workflow
The following describes the general workflow:
1. Model the Geometry
If using subdivisions, certain modifiers, or custom tools in the modeling process, you must collapse
or convert the final model into a polygonal mesh. NURBS support also differs from one software package
to another, so verify whether NURBS or weighted NURBS are supported for export to FBX from
your modeling package.
2. Apply the Textures
All textures must be applied to models through UV mapping. Procedural shaders and textures
can usually be converted to texture maps and applied using UV coordinates in your software or in
third-party tools. Multiple textures are generally supported when exporting to FBX, but verify the
plugin documentation for your specific package.
3. Model the Shapes
Facial or other shapes (bulging muscles, special deformations) should be modeled prior to applying
skin deformations. Most custom deformers are not supported, so stay with the basic shape types
offered by your software (morph targets, blend shapes, endomorphs, and so on) and with clusters.
4. Build a Skeleton
The skeleton should contain at least the following nodes:
Hips>Spine>Head
LeftLeg>LeftKnee>LeftAnkle
RightLeg>RightKnee>RightAnkle
LeftArm>LeftElbow>LeftHand
RightArm>RightElbow>RightHand
These nodes are required in the hierarchy even if some serve no apparent purpose or have no
geometry attached, such as the legs of a character that always stays behind a desk.
All the bones should have their transforms (especially scale) reset to 100/1.00 prior to skinning, since
negative scaling, which is sometimes produced by mirroring bones, can give unpredictable results
and is not supported. A quick automated check of these naming and scaling requirements is sketched after step 7.
5. Apply Skin Deformations (Weighting)
Check which weighting tools are supported in each software package.
6. Create Cameras, Lights and Other Set Elements to animate and/or preview in
MOTIONBUILDER.
7. Export to FBX
Do not create any IK rigs or constraints at this point, since this will be done in MOTIONBUILDER.
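Because a missing required joint or a mirrored (negatively scaled) bone usually only shows up later as a failed characterization or a deformed mesh, it can be worth checking the exported file before it goes any further. The sketch below, referenced in step 4, again assumes the later Autodesk FBX SDK, and the list of required joint names is only illustrative; substitute the names your own rig and characterization mapping actually use.

```cpp
// Pre-export sanity check: required joints present, bone scales reset to 1.0.
// Assumes an FbxScene already imported as in the earlier converter sketch.
#include <fbxsdk.h>
#include <cmath>
#include <cstdio>
#include <set>
#include <string>

static void CollectNames(FbxNode* node, std::set<std::string>& names)
{
    names.insert(node->GetName());
    // Flag scales that are not reset to 1.0 or are negative (mirrored bones).
    FbxDouble3 s = node->LclScaling.Get();
    for (int axis = 0; axis < 3; ++axis)
    {
        if (s[axis] < 0.0 || std::fabs(s[axis] - 1.0) > 1e-4)
            std::printf("Warning: %s has scale %.3f on axis %d\n",
                        node->GetName(), (double)s[axis], axis);
    }
    for (int i = 0; i < node->GetChildCount(); ++i)
        CollectNames(node->GetChild(i), names);
}

// Returns true when every required joint name exists somewhere in the scene.
static bool CheckSkeleton(FbxScene* scene)
{
    // Illustrative list only: replace with the names used by your own skeleton.
    const char* required[] = { "Hips", "Spine", "Head",
                               "LeftLeg", "LeftKnee", "LeftAnkle",
                               "RightLeg", "RightKnee", "RightAnkle",
                               "LeftArm", "LeftElbow", "LeftHand",
                               "RightArm", "RightElbow", "RightHand" };
    std::set<std::string> names;
    CollectNames(scene->GetRootNode(), names);

    bool ok = true;
    for (const char* name : required)
    {
        if (names.find(name) == names.end())
        {
            std::printf("Missing required joint: %s\n", name);
            ok = false;
        }
    }
    return ok;
}
```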
II. Animating in MOTIONBUILDER
While there are several ways to animate characters and objects in MOTIONBUILDER, they all involve
three major types of objects:
• Principal skeleton of the character
• Secondary animation on the character
• Other scene elements (such as lights, cameras, sets)
CHARACTER (main)
You must first define the character’s structure to animate it. To do this, make sure the character is
in a T-stance and standing on the ground (bottom of the foot at Y=0).
1. Create a new character (definition) in MOTIONBUILDER by dragging and dropping “Character” from the Asset
browser onto the skeleton.
2. If you do not follow the naming convention used in the Character settings, or some basic bones
are missing, the characterization will not work. In this case, Alt-drag the appropriate bones into
the mapping list of the Character settings to define the basic structure of the skeleton and any
additional joints.
3. Characterize the structure and create a Control rig.
4. Set up the white foot definition markers for the floor contact.
Note:
Any bends in the limbs/joints at the time of the characterization will determine the maximum extension of the limbs
when animating the character.
Keyframing
Once the character is defined, you can activate Control Rig Input in the Character Controls window
and start keyframing the rig using FK and/or IK.
Motion from Other Sources
If you have animation stored on other characters (for example, exported from your 3D package or
skeletons obtained from content providers or certain motion files), you can easily retarget it to your
new character. Make sure that the animated characters are also defined as MOTIONBUILDER
characters (don't forget the T-stance), then simply select them from the Input menu for your main
character in the Character Controls window and activate the input. Direct input works for any defined
character.
If you are working from certain motion files such as .c3d or .trc optical motion capture files, you
first have to map the source markers to a MOTIONBUILDER actor, as follows:
1. Create a new actor in MOTIONBUILDER (drag and drop “Actor” from the Asset browser).
2. Position the actor to match the markers. Use TRS transforms so that the actor matches, as closely
as possible, the pose of the real actor at the time of the capture (preferably a T-stance or a
“motorcycle riding” position).
3. Drop the appropriate source markers onto the corresponding body parts of the actor.
4. Activate the actor.
You can assign the actor to your character using the Input menu of the Character Controls.
Mapping from both Actor and Character Input sources should be adjusted using the different
mapping filters in the Character Settings (Navigator window) or Properties window (when the
character is selected in the Navigator window). You can use FK and IK as solving methods in
combination with various offsets to get the results you want.
Offsetting Motion
You may want to adjust or extend existing motion beyond the simple adjustments obtained with
mapping filters. At this point, you can use the Control rig to offset the motion from its original path,
as follows:
1. From the Character Controls window, select the Plot Character option (on the destination character).
2. Select Control Rig as the destination.
3. Adjust the plotting options (such as takes, rate, and filters).
4. Plot.
This will bake the animation to the Base Layer of the Control rig and allow you to offset or modify
on additional layers (Layer 1, Layer 2, and so on). Any time you want to return to the original Base
Layer, use the Zero button in the Key Controls window to set a zero offset keyframe.
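Since the layers above the Base Layer hold offsets, the evaluated result of a channel is, conceptually, the plotted base value plus the sum of the layer offsets, which is why a zero key brings the motion back to the base. A minimal sketch of that idea, in plain C++ and assuming this simple additive model rather than any MOTIONBUILDER SDK call:

```cpp
// Additive animation layers: the plotted Base Layer plus per-layer offsets.
#include <vector>

// The evaluated value of a channel is the base value plus every layer's offset;
// keying 0.0 on a layer (the Zero button) returns the result to the base motion.
double EvaluateChannel(double baseValue, const std::vector<double>& layerOffsets)
{
    double result = baseValue;
    for (double offset : layerOffsets)
        result += offset;
    return result;
}
```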
In general, we recommend that you always use Flat keyframes when keyframing over existing
motion data. When you set a normal keyframe over existing motion, there is no guarantee that the
keyframe will sit at the maximum offset, because the interpolation before or after it may rise above
or dip below the keyframed value. A Flat keyframe forces the interpolation at that keyframe to be
flat (horizontal), ensuring that you do not get an unwanted higher or lower offset value.
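The overshoot described above is easy to see with a generic cubic (Hermite-style) interpolation of the kind most animation curves use; the sketch below is only an illustration of the principle, not of MOTIONBUILDER's actual curve evaluation. With non-zero tangents carried through the key, the curve rises above the keyed value before settling; with flat (zero) tangents, the key stays the local extreme.

```cpp
// Generic cubic Hermite segment between two keys, illustrating tangent overshoot.
#include <algorithm>
#include <cstdio>

// Evaluate a Hermite segment: p0/p1 are key values, m0/m1 tangents, u in [0,1].
static double Hermite(double p0, double m0, double p1, double m1, double u)
{
    double u2 = u * u, u3 = u2 * u;
    return (2*u3 - 3*u2 + 1) * p0 + (u3 - 2*u2 + u) * m0
         + (-2*u3 + 3*u2)    * p1 + (u3 - u2)       * m1;
}

int main()
{
    // A key at value 10 (the intended maximum offset) followed by a key at 0.
    const double p0 = 10.0, p1 = 0.0;

    double peakSmooth = p0, peakFlat = p0;
    for (int i = 0; i <= 100; ++i)
    {
        double u = i / 100.0;
        // Smooth, non-zero tangents let the curve overshoot the keyed value.
        peakSmooth = std::max(peakSmooth, Hermite(p0, 15.0, p1, -15.0, u));
        // Flat key: zero tangents, so the curve never rises above the key value.
        peakFlat = std::max(peakFlat, Hermite(p0, 0.0, p1, 0.0, u));
    }
    std::printf("peak with smooth tangents: %.2f\n", peakSmooth); // greater than 10.0
    std::printf("peak with flat tangents:   %.2f\n", peakFlat);   // exactly 10.0
    return 0;
}
```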
Regardless of whether you keyframe from scratch, retarget motion, or keyframe over retargeted
motion, when you are satisfied with the result on your Character, plot the animation to
the skeleton. This allows you to export the animation back to your 3D software package for
rendering, or to continue using certain settings inside MOTIONBUILDER to further alter your motion,
such as TimeWarp or Motion Blend.
CHARACTER (secondary)
Any type of animation on a character’s nodes outside the Character definition (what is controlled by
the Control rig) constitutes secondary animation. For example:
• Facial animation
• Tail, wing, and prop animation
• Other shape animation (breathing, muscles, and so on)
• Driving certain animations in real time using external devices (Keyboard, Joystick, and so on)
While the methods used to do these vary, they generally consist of the following:
a. Keyframing
b. Constraint animation
c. Keyframing over constraint animation
Keyframing
The simplest way to animate non-characterized elements is to select a joint or model and keyframe
it in TRS. Other animation channels, such as shapes or colors, can also be animated in the FCurves,
the Navigator, or the Properties windows.
For facial animation, you can keyframe the shape channels directly on the model using the FCurves
window. To map motion capture, mix complex expressions from simple shapes, or set up clusters
for “shape” animation, you use the Facial settings.
The general workflow for using the Facial settings follows:
1. Drag and drop a Character Face from the Asset browser.
2. If using clusters, Alt-drag the desired cluster nodes into the Cluster Shapes Creation pane, snap
the Rest pose, and then snap virtual shapes from different cluster positions.
3. If using shapes, Alt-drag the appropriate models into the Shapes Mapping pane.
4. Define Generic and User expressions from the basic shapes or virtual cluster shapes by mixing
them in desired percentages.
5. Drag and drop an Actor Face from the Asset browser.
6. Select the Actor Face as an input Source for the Character Face.
You can now keyframe all of the channels defined for the Character Face on the Actor Face, map
optical facial motion capture to your shapes or clusters, or constrain any channel in a Relation
constraint.
Note:
The Facial settings can be used on any model set up with shapes or clusters, not only face models.
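The expression mixing described above is, mathematically, the standard morph-target blend: each shape stores per-vertex offsets from the rest pose, and an expression is a weighted sum of those offsets. The sketch below is a generic illustration of that formula in plain C++; the vertex and weight data are hypothetical and not tied to any MOTIONBUILDER API.

```cpp
// Generic morph-target (blend shape) mixing: rest pose plus weighted shape offsets.
#include <cstddef>
#include <vector>

struct Vec3 { double x, y, z; };

struct Shape
{
    std::vector<Vec3> offsets; // per-vertex delta from the rest pose
    double weight;             // 0.0 - 1.0, the "percentage" of the shape
};

// result[i] = rest[i] + sum over all shapes of (weight * offsets[i])
std::vector<Vec3> MixShapes(const std::vector<Vec3>& rest,
                            const std::vector<Shape>& shapes)
{
    std::vector<Vec3> result = rest;
    for (const Shape& s : shapes)
    {
        for (std::size_t i = 0; i < result.size() && i < s.offsets.size(); ++i)
        {
            result[i].x += s.weight * s.offsets[i].x;
            result[i].y += s.weight * s.offsets[i].y;
            result[i].z += s.weight * s.offsets[i].z;
        }
    }
    return result;
}

// For example, a user expression such as a smile could be defined as 70% of a
// mouth-corner shape plus 30% of a cheek-raise shape; the Facial settings let
// you build and name exactly this kind of percentage mix.
```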
Constraint Animation
Constraint animation is an extremely powerful feature in MOTIONBUILDER and lends itself well
to secondary animation. Different channels (TRS, shape, color, weight) can be constrained to
devices (Voice, Keyboard, Mouse, MIDI, and so on) or to expressions inside Relations constraints. You
can also use a combination of constraints (for example, IK and Relations to set up damping effects
on the wings or tail of a character), allowing you to achieve more realistic movement without doing all
the work from scratch. The general procedure for working with Relations constraints follows:
1. If you plan on animating channels or values other than TRS or shapes, activate the property's Animate (A) button.
2. Drag and drop a Relations constraint from the Asset browser.
3. Drop the element containing the properties you want to constrain into the Relation constraint as
a Receiver.
4. Drop a source. This could be a device, a Source operator, another constraint as a Macro, or
another element in the scene from which you want to drive the constraint.
5. Drop and connect the appropriate operators between the Sender and the Receiver.
Note:
When activating constraints other than a Relation, you can snap an offset between the source and the receiver
elements, preserving the desired distances between the objects. However, Relations constraints take the data “as
is” in the flow between the source and the receiver. Any desired offsets have to be added through the appropriate
operators.
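As a concrete picture of what such a network does to the data, the sketch below shows in plain C++ the two behaviors mentioned above: a simple damping (lag) step of the kind used on wings or tails, and an offset added explicitly in the data flow, since a Relations constraint takes the incoming values as is. This is only a conceptual illustration of the operators' behavior, not MOTIONBUILDER SDK code, and the smoothing factor and channel names are assumptions.

```cpp
// Conceptual data flow of a damping Relation: the receiver lags behind the
// sender, and any offset must be added explicitly in the operator network.
#include <cstdio>

struct DampedFollower
{
    double value;     // current (receiver) value
    double smoothing; // 0..1, fraction of the remaining gap closed per evaluation

    // One evaluation: move part of the way toward the target, then add the offset.
    double Update(double target, double offset)
    {
        value += smoothing * (target - value); // exponential damping / lag
        return value + offset;                 // explicit offset operator
    }
};

int main()
{
    DampedFollower tailTip{0.0, 0.25}; // hypothetical tail-tip rotation channel
    const double offset = 5.0;         // constant offset added in the relation

    // The sender jumps to 90 degrees; the receiver eases toward it frame by frame.
    for (int frame = 0; frame < 10; ++frame)
        std::printf("frame %d: %.2f\n", frame, tailTip.Update(90.0, offset));
    return 0;
}
```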
Once you achieve your desired results, you must plot the animation to all the appropriate
objects:
1. Select all the objects affected by the constraints.
2. Select Plot All Properties from the Animation menu in the Key Controls window.
To verify that the animation transferred correctly, disable all the constraints used and play the scene.
If the result is not what you expected, you may not have selected all the objects affected by the
constraints before plotting.
Note:
If a constraint is driven by a live input device, that device must first be recorded before the animation can be plotted
to the receiving elements. To record a take from a live device, the Record, Live, and Online options for this device
must be activated. Once the take is recorded, the Live option must be disabled to play back and plot the recorded
take. Once the animation is plotted on the model or bone, the Online option must be disabled to play back the
plotted animation.
Keyframing over Constraints
Once the animation has been plotted to any element or animation channel, you can keyframe over
it using layers. Remember, all plotting is always done on the Base Layer.
OTHER ELEMENTS (cameras, lights)
Other elements in the scene, such as cameras and lights, can be animated in the same ways as
secondary character animation. You can keyframe, constrain, or keyframe over constraints any
animatable property on any object. To see which fields are animatable, select the object in the
Viewer or the Navigator window (depending on whether it is a physical object in the scene or a property like
a shader or material), and look in the Properties window.
One important aspect to consider is the Take>Layer options for any object in the scene. An object
or parameter can be Mono or Multi Take and Mono or Multi Layer within each take. These options
can be changed in the Options menu of the Key Controls window. You should always be aware of
what you are keyframing on which take and layer.
III. Plotting Animation
All animation on all objects should be plotted once the scene is finished. Character animation on
the principal skeleton can be plotted from the Character Controls window. However, if your character
has additional parts (such as wings, tails, or props) that are not driven by the character engine
itself, the safest method is to select all of the skeleton, geometry, cameras, lights, and set
elements that have animated properties, and select Plot All Properties from the
Animation menu.
IV. Output
Export back to compatible 3D software for final rendering.
Once all the animation has been plotted to the objects in the scene and the FBX file is saved, you
can use the FBX plugins to import that file back into your favorite 3D package for embellishment
(dynamics, particles, shaders, and so on) and final rendering.
This step is best done by merging the animation back into the original scene, as opposed to
creating all the elements in an empty scene. Some elements, mainly geometry weighting, cannot
currently be created in certain packages, and certain FBX plugins do not currently support the
creation of any elements in an empty scene. Merging back only imports animation for objects already in
the scene, leaving all the original elements (geometry, weighting, and so on) unchanged. This is
the safest method of importing the animation into compatible 3D software packages.
Export to Game Engine
If your game engine supports the FBX file format, you can save the FBX file in various ways that let
you import it directly into your engine. For example, you can:
1. Select only the skeleton of the character you just animated.
2. Select Save Selection from the File menu.
3. Select which animation takes to export.
4. Activate the One take per file option.
MOTIONBUILDER will save only the skeleton with its animation into separate FBX files for each
selected take of animation in the scene. This is a quick way to split animation clips from a master
scene into a library of motion files.
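On the engine side, each of those files can then be loaded and its take(s) enumerated. The short sketch below again assumes the later Autodesk FBX SDK, where a take is exposed as an FbxAnimStack; the 2003-era SDK uses different class names for the same idea.

```cpp
// Engine-side counterpart: list the animation takes stored in an imported scene.
// Assumes an FbxScene imported as in the earlier converter sketch (Autodesk FBX SDK).
#include <fbxsdk.h>
#include <cstdio>

void ListTakes(FbxScene* scene)
{
    int count = scene->GetSrcObjectCount<FbxAnimStack>();
    for (int i = 0; i < count; ++i)
    {
        FbxAnimStack* stack = scene->GetSrcObject<FbxAnimStack>(i);
        std::printf("take %d: %s\n", i, stack->GetName());
    }
    // With the One take per file option, each exported file will typically
    // report a single take here, which keeps a motion library simple to load.
}
```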
OpenGL Render
You can create AVI, TIF, TIFF, TGA and MOV renders that use all the available rendering effects to
produce broadcast quality images without having to go through a lengthy ray-tracing process.
You can also render vector-based Flash movies, which are fully editable inside Macromedia's Flash
application.
Direct to Video
You can use Kaydara ONLINE as a superior method to deliver live, interactive content directly to
video or live broadcast and skip the render process entirely. You simply play the scene in real time
while recording it on an external VTR. The system can even automatically control or be controlled
by external decks through integrated timecode support.
Note:
When exporting MOTIONBUILDER 4.x content to ONLINE 3.5.1, certain constraints and shader types are not
supported. Make sure you verify the scene once in ONLINE and repair any broken connections, if necessary. Any
plotted animation will come into ONLINE without problems. This will be resolved when ONLINE 4.0 is released.