If, on the other hand, one or more other tags are currently active, the reactive animation engine determines the priority of each of the other active tags (step 1510) to determine whether the current tag has a higher priority relative to each of the other currently active tags (step 1512). See also U.S. patent application Ser. No. 09/382,819 of Comair et al., filed 25 Aug. 1999, entitled "Object Modeling For Computer Simulation And Animation," incorporated by reference herein. The dynamic animation is preferably generated using a combination of inbetweening and inverse kinematics to provide a smooth and realistic animation showing a reaction to the tag. The invention enables a character to appear as if it has "come to life" in the game environment. When an animated character walks near the "tagged" item, the animation engine can cause the character's head to turn toward the item, mathematically computing what needs to be done in order to make the action look real and natural. For more information about timelines and clocks, see Animation and Timing System Overview. Once the character begins to move past the painting, the character's head then begins to turn naturally back (see FIG.). Following is the general procedure for producing an animation sequence. The examples in this section use the preceding objects to demonstrate several cases where the FillBehavior property doesn't behave as you might expect it to. These handles offer you more control over animation changes than simply choosing a keyframe interpolation method. Tags can also be defined such that factors other than proximity (such as timing, as in the candle/torch example above) can be used alone or in addition to proximity to cause activation of the tag. In particular, the invention provides a reactive animation system that enables game characters or other graphical characters to appear much more realistic as they interact with the virtual world in which they are displayed.
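The arbitration described in steps 1510 and 1512 can be sketched in a few lines. This is a hypothetical illustration, not code from the patent; the names `Tag` and `select_reaction_tag` are invented for the example.

```python
from dataclasses import dataclass


@dataclass
class Tag:
    name: str
    priority: int  # larger number = higher priority


def select_reaction_tag(current, other_active):
    """Return the tag the character should react to.

    If no other tags are active, react to the current tag (no
    arbitration needed).  Otherwise compare priorities (step 1510)
    and react to the current tag only if it outranks every other
    currently active tag (step 1512); if not, the highest-priority
    active tag keeps the character's attention.
    """
    if not other_active:
        return current
    if all(current.priority > t.priority for t in other_active):
        return current
    return max(other_active, key=lambda t: t.priority)
```

With the candle/torch example, a flaring torch tag given a higher priority than the painting tag would win the arbitration, and the character would turn toward it.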
The Completed event is processed first because it was triggered by the root timeline (the first Storyboard). Keyframing is the process of setting values at selected key frames; the animation view is taken from the nominated camera. Upon determining that the noise is not a problem, a human would then typically resume looking at the piece of art. The TranslateTransform will be animated to move the Rectangle around the Canvas. For further details relating to system 50, see for example the above-referenced U.S. patent application Ser. No. 09/382,819. Such animations are made possible by computer graphics. This tag prioritization feature further helps to make the character appear more realistic by enabling the character to prioritize its reactions in the same or a similar way to a human. In this way, the character has much more human-like reactions to its environment while moving through the virtual world, and the character can be made to appear as if it has "come to life." Main processor 110 interactively responds to user inputs, and executes a video game or other program supplied, for example, by external storage media 62 via a mass storage access device 106 such as an optical disk drive. This will remove all animation clocks from the property. Instead, the rectangle does not jump back; it continues moving to the right. Tag-based animation engine E may first initialize a 3D world and animation game play (block 1002), and may then accept user inputs supplied, for example, via handheld controller(s) 52 (block 1004). As with a single image, lighting is integral to producing animation sequences. In fact, the tag can be defined to cause any type of response that corresponds to any variable or role-playing element that the character may have, as well as to cause emotional and/or physical reactions.
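A tag whose activation combines proximity with an additional condition, such as the timing-based candle/torch example, can be modeled as follows. This is a hypothetical sketch; `WorldTag` and its fields are invented names, not part of the patent or any engine API.

```python
import math


class WorldTag:
    """Model of a tag placed in the 3D world: active when the
    character is within a radius of it AND an optional extra
    condition (e.g. timing, or "the candle is flaring") holds."""

    def __init__(self, position, radius, condition=None):
        self.position = position          # (x, y) location of the tag
        self.radius = radius              # activation distance
        # With no extra condition, proximity alone activates the tag.
        self.condition = condition or (lambda world: True)

    def is_active(self, character_position, world):
        near = math.dist(character_position, self.position) <= self.radius
        return near and self.condition(world)
```

A candle tag defined with `condition=lambda w: w["flaring"]` would, like the second tag in the example above, elicit a reaction only while the candle is animated to flare up.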
If a character 10 is in proximity to a tag T, the animation engine E reads the tag and computes (e.g., through mathematical computation and associated modeling, such as by using inbetweening and inverse kinematics) a dynamic animation sequence for the character 10 to make the character realistically turn toward or otherwise react to the tag (block 1010). The next technique may be used regardless of how the animation was started. As seen in FIG. 10B, example system 50 includes a video encoder 120 that receives image signals from graphics and audio processor 114 and converts the image signals into analog and/or digital video signals suitable for display on a standard display device such as a computer monitor or home color television set 56. When an object is garbage collected, its clocks will also be disconnected and garbage collected. To play a video game or other application using system 50, the user first connects a main unit 54 to his or her color television set 56 or other display device by connecting a cable 58 between the two. Main processor 110 and graphics and audio processor 114 also perform functions to support and implement the preferred embodiment tag-based animation engine E based on instructions and data E′ relating to the engine that are stored in DRAM main memory 112 and mass storage device 62. As the character moves out of proximity to the tagged object 12 (see FIG.), the character's head begins to turn naturally back. Such animation is comparable with that used in the production of movies. Any of these can then be used in the animation sequence by scripting them. This makes the character's animation unpredictable and greatly enhances the visual effect of the display. (Related application: Ser. No. 10/078,526, filed Feb. 21, 2002, now allowed, now U.S. Pat.)
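The inbetweening and inverse-kinematics computation of block 1010 can be illustrated with a deliberately simplified sketch: the "inverse kinematics" is reduced to a single head joint (compute the yaw needed to face the tag), and the inbetweening is plain linear interpolation between the current pose and that target. All names here are invented for illustration.

```python
import math


def head_yaw_toward(character_pos, body_heading, target_pos):
    """Yaw in radians, relative to the body heading, that makes
    the head face the target: a one-joint stand-in for the
    inverse-kinematics step."""
    dx = target_pos[0] - character_pos[0]
    dy = target_pos[1] - character_pos[1]
    return math.atan2(dy, dx) - body_heading


def inbetween(start, end, frames):
    """Linear inbetweening: the sequence of interpolated values
    between two key poses, one per frame."""
    return [start + (end - start) * i / (frames - 1) for i in range(frames)]
```

Generating the head-turn on the fly from the character's actual position, rather than playing a canned clip, is what lets the reaction look correct from any approach angle.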
When you apply a Storyboard, AnimationTimeline, or AnimationClock to a property using the Compose HandoffBehavior, any Clock objects previously associated with that property continue to consume system resources; the timing system will not remove these clocks automatically. The second Storyboard takes effect and animates from the current position, which is now 0, to 500. As explained above, the reactive animation engine E dynamically generates the character's animation to make the character react in a priority-based manner to the various tags that are defined in the environment. When working with animations in WPF, there are a number of tips and tricks that can make your animations perform better and save you frustration. When the animated character moves into proximity with an object (e.g., in response to user control), the system checks whether the object is tagged. Thus, the invention enables animation to be generated on-the-fly and in an unpredictable and realistic manner. FIG. 13 is a more detailed example flow chart of steps performed by the tag-based animation engine of the instant invention. As the character moves through the virtual world (see FIG. 1) and into proximity to tagged object 12, the character's animation is dynamically adapted so that the character appears to be paying attention to the tagged object by, for example, facing the tagged object 12 (see FIG.). Varying Lighting and Material Characteristics: (Optional) script light sources with scripts. This second tag is different from the first tag in that it is defined to cause a reaction from the character only when the candle is animated to flare up like a powerful torch (see FIG. 9). While such techniques have been highly successful, animators have searched for ways to make animations more realistic without the need to control or map out each and every movement of an animated character beforehand.
For instance, a human would typically stop looking at a piece of art when a loud noise comes from another object, and then quickly turn in the direction of the loud noise. Any physical, emotional or combined reaction can be defined by the tag, such as facial expressions or posture changes, as well as changes in any body part of the character (e.g., position of head, shoulders, feet, arms, etc.). That's because, when a Timeline is begun, the timing system makes a copy of the Timeline and uses it to create a Clock object. When you run the Storyboard, you might expect the X property of the TranslateTransform to animate from 0 to 350, then revert to 0 after it completes (because it has a FillBehavior setting of Stop), and then animate from 0 to 500. Values are interpolated in between these keyframes to produce a sequence of frames. Once the torch stops flaring and returns to a normal candle, the second tag T2 is no longer active, and the reactive animation engine then causes the character to again turn its attention to the painting (i.e., the first tag again becomes the highest-priority active tag). In one illustrative example, the tagged object 12 elicits an emotion or other reaction (e.g., fear, happiness, belligerence, submission, etc.) from the character. FIG. 5A is an example conceptual drawing showing the theory of operation of the preferred embodiment. The second Storyboard, B2, also animates the X property of the same TranslateTransform. For instance, it isn't possible to do an animation from within a for loop.
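The FillBehavior handoff described above can be reduced to arithmetic. This is a toy numeric model, not WPF code, and the names are invented: it only shows that after a FillBehavior="Stop" animation completes, the property is back at its base value, so a second animation with no explicit From starts from 0, as described above.

```python
def animated_value(from_value, to_value, progress):
    """Linear animation value at progress in [0, 1]."""
    return from_value + (to_value - from_value) * progress


base_value = 0.0  # base value of the animated X property

# B1 animates 0 -> 350 with FillBehavior="Stop": once it
# completes, it stops affecting the property, which reverts
# to its base value rather than holding 350.
value_after_b1 = base_value

# B2 has no explicit From, so it animates from the property's
# current effective value at handoff (now 0) to 500.
start_of_b2 = animated_value(value_after_b1, 500.0, 0.0)
end_of_b2 = animated_value(value_after_b1, 500.0, 1.0)
```

Had B1 instead used the default FillBehavior="HoldEnd", `value_after_b1` would stay at 350 and B2 would animate from 350 to 500.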