A framework for video-driven crowd synthesis.
Abstract
We present a framework for video-driven crowd synthesis. The proposed framework
employs motion analysis techniques to extract inter-frame motion vectors from the exemplar
crowd videos. Motion vectors collected over the duration of the video are processed
to compute global motion paths, which encode the dominant motions observed in the
video. These paths are then fed into a behavior-based crowd simulation
framework, which is responsible for synthesizing crowd animations that respect
the motion patterns observed in the video. Our system synthesizes 3D virtual crowds
by animating virtual humans along the trajectories returned by the crowd simulation
framework. We also propose a new metric for comparing the ``visual similarity'' between
the synthesized crowd and exemplar crowd. We demonstrate the proposed approach on
crowd videos collected under different settings, and the initial results appear promising.