Graduation date: 2007
This thesis presents a model for simulating individual pedestrian motion based on empirical data. The model tracks each pedestrian’s position, orientation, and body configuration, and leverages motion capture data to generate plausible motion. Because movement decisions take the character’s current configuration into account, the model automatically incorporates a pedestrian’s physical limitations. Heterogeneous crowds can also be modeled by collecting motion capture data from children, the elderly, pedestrians in wheelchairs, and people on crutches. In this thesis, we present a 2D model of an able-bodied male and demonstrate the realism of our approach with several small-scale simulations and a larger crowd evacuation scenario. We also compare the speed and density of pedestrians walking in single file against existing empirical results. The thesis concludes with a discussion of our model and suggestions for further research.
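The state the abstract describes (a position, an orientation, and a body configuration, advanced by applying motion-capture clips) can be sketched roughly as below. This is a minimal illustrative assumption of what such a state and update might look like, not the thesis's actual implementation; every name and the clip format here are hypothetical.

```python
import math
from dataclasses import dataclass

@dataclass
class PedestrianState:
    x: float        # world-frame position (meters); hypothetical units
    y: float
    heading: float  # orientation in radians
    pose: str       # label for the current body configuration

def apply_clip(state: PedestrianState, dx: float, dy: float,
               dtheta: float, end_pose: str) -> PedestrianState:
    """Advance the state by one motion-capture clip's root displacement.

    (dx, dy) is the clip's displacement in the pedestrian's local frame,
    dtheta its change in heading, and end_pose the configuration the clip
    ends in (which would constrain which clips may plausibly follow).
    """
    c, s = math.cos(state.heading), math.sin(state.heading)
    return PedestrianState(
        x=state.x + c * dx - s * dy,  # rotate the local step into world frame
        y=state.y + s * dx + c * dy,
        heading=state.heading + dtheta,
        pose=end_pose,
    )

# One forward step of 0.7 m while facing along +x:
start = PedestrianState(0.0, 0.0, 0.0, "right-foot-forward")
step = apply_clip(start, 0.7, 0.0, 0.0, "left-foot-forward")
```

Keeping the end pose as part of the state is what lets such a model respect physical limitations: a character mid-stride can only continue with clips that begin from that configuration.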