We present a novel approach for interactively synthesizing motions for characters navigating in complex environments. We focus on the runtime efficiency of motion generation, thereby enabling the interactive animation of a large number of characters simultaneously. The key idea is to precompute search trees of motion clips that can be applied to arbitrary environments. Given a navigation goal relative to the character's current body position, the best available solution paths and motion sequences can be efficiently extracted at runtime through a series of table lookups. For distant start and goal positions, we first use a fast coarse-level planner to generate a rough path of intermediate sub-goals that guide each iteration of the runtime lookup phase.
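To make the pipeline concrete, here is a minimal Python sketch of the idea: a precomputed search tree of motion clips is flattened into a lookup table mapping reachable end positions to clip sequences, and a coarse planner's sub-goals are resolved one lookup at a time. All names (TreeNode, coarse_plan, the clip labels) and the grid-based collision test are illustrative assumptions, not the paper's actual data structures or implementation.

```python
import math

class TreeNode:
    """One node of the precomputed search tree: a motion clip plus the
    body displacement it produces, and child nodes for follow-up clips.
    (Hypothetical structure; displacements are simplified to 2D offsets.)"""
    def __init__(self, clip_id, dx, dy, children=None):
        self.clip_id = clip_id
        self.dx, self.dy = dx, dy          # displacement relative to parent
        self.children = children or []

def collect_leaves(node, x=0.0, y=0.0, path=()):
    """Precompute: flatten the tree into (end position -> clip sequence)
    entries so runtime queries become simple table lookups."""
    x, y = x + node.dx, y + node.dy
    path = path + (node.clip_id,)
    yield (x, y), path
    for child in node.children:
        yield from collect_leaves(child, x, y, path)

def build_lookup_table(root):
    return dict(collect_leaves(root))

def lookup_best(table, goal, obstacles):
    """Runtime: pick the precomputed motion sequence whose end position is
    closest to the goal. Collision checking is reduced here to a crude
    grid-cell test; the real method validates clips against the environment."""
    best, best_dist = None, math.inf
    for (x, y), clips in table.items():
        if (round(x), round(y)) in obstacles:
            continue
        dist = math.hypot(goal[0] - x, goal[1] - y)
        if dist < best_dist:
            best, best_dist = clips, dist
    return best

def navigate(start, goal, table, obstacles, coarse_plan):
    """For distant goals, a coarse planner (assumed, passed in as a callback)
    supplies intermediate sub-goals; each is resolved with a fast lookup."""
    pos, motion = start, []
    for sub_goal in coarse_plan(start, goal, obstacles):
        rel = (sub_goal[0] - pos[0], sub_goal[1] - pos[1])
        motion += lookup_best(table, rel, obstacles) or []
        pos = sub_goal
    return motion
```

A toy query against a three-clip tree shows the lookup in action:

```python
root = TreeNode("idle", 0, 0, [
    TreeNode("step_fwd", 0, 1, [TreeNode("step_fwd", 0, 1)]),
    TreeNode("turn_left_step", -1, 1),
])
table = build_lookup_table(root)
print(lookup_best(table, goal=(0, 2), obstacles=set()))
# -> ('idle', 'step_fwd', 'step_fwd')
```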
We demonstrate the efficiency of our technique across a range of examples in an interactive application with multiple autonomous characters navigating in dynamic environments. Each character responds in real time to arbitrary user changes to the environment obstacles or navigation goals. The runtime phase is more than two orders of magnitude faster than existing planning methods and traditional motion synthesis techniques. Our technique is useful not only for autonomous motion generation in games, virtual reality, and interactive simulations, but also for animating massive crowds of characters offline for special effects in movies.