Show simple item record

dc.contributor.advisor: Qureshi, Faisal Z.
dc.contributor.author: Stadler, Jordan J.
dc.date.accessioned: 2014-10-27T16:09:04Z
dc.date.accessioned: 2022-03-30T17:05:48Z
dc.date.available: 2014-10-27T16:09:04Z
dc.date.available: 2022-03-30T17:05:48Z
dc.date.issued: 2014-09-01
dc.identifier.uri: https://hdl.handle.net/10155/469
dc.description.abstract: We present a framework for video-driven crowd synthesis. The proposed framework employs motion analysis techniques to extract inter-frame motion vectors from the exemplar crowd videos. Motion vectors collected over the duration of the video are processed to compute global motion paths, which encode the dominant motions observed during the course of the video. These paths are then fed into a behavior-based crowd simulation framework, which is responsible for synthesizing crowd animations that respect the motion patterns observed in the video. Our system synthesizes 3D virtual crowds by animating virtual humans along the trajectories returned by the crowd simulation framework. We also propose a new metric for comparing the "visual similarity" between the synthesized crowd and the exemplar crowd. We demonstrate the proposed approach on crowd videos collected under different settings, and the initial results appear promising. (en)
dc.description.sponsorship: University of Ontario Institute of Technology (en)
dc.language.iso: en (en)
dc.subject: Framework (en)
dc.subject: Data-driven (en)
dc.subject: Crowd analysis (en)
dc.subject: Crowd synthesis (en)
dc.title: A framework for video-driven crowd synthesis. (en)
dc.type: Thesis (en)
dc.degree.level: Master of Science (MSc) (en)
dc.degree.discipline: Computer Science (en)
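The abstract describes aggregating per-frame motion vectors into global motion paths. As a minimal sketch of that idea (not the thesis's actual method), assuming per-frame flow fields are already available from a motion analysis step such as dense optical flow, one could time-average the vectors and trace paths through the averaged field; the function names here are hypothetical:

```python
import numpy as np

def accumulate_flow(flows):
    """Average a sequence of (H, W, 2) per-frame motion vector
    fields over time to estimate the dominant motion per cell."""
    return np.mean(np.stack(flows), axis=0)

def trace_path(mean_flow, start, steps=20, step_scale=1.0):
    """Follow the averaged flow field from a seed point to build
    one global motion path (simplified illustrative tracing)."""
    h, w, _ = mean_flow.shape
    path = [np.asarray(start, dtype=float)]
    for _ in range(steps):
        y, x = path[-1]
        iy, ix = int(round(y)), int(round(x))
        if not (0 <= iy < h and 0 <= ix < w):
            break  # path left the frame
        dy, dx = mean_flow[iy, ix]
        if np.hypot(dy, dx) < 1e-6:
            break  # stagnant cell: no dominant motion here
        path.append(path[-1] + step_scale * np.array([dy, dx]))
    return np.array(path)
```

With a uniform rightward flow field, the traced path advances one column per step; in the real system the paths would instead follow the dominant motions extracted from the exemplar video.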

