UBC Theses and Dissertations


Pedestrian intent estimation through visual attention, and time- and memory-conscious U-shaped networks for training neural radiance fields
Kuganesan, Abiramy

Abstract

When people cross the street, they make a series of movements that are indicative of their attention, intention, and comprehension of the roadside environment. These patterns in attention are linked to individual characteristics that are often neglected by autonomous-vehicle prediction algorithms. We make two strides towards more personalized pedestrian modelling. First, we design an outdoor data-collection study to gather behavioural signals such as pupil, gaze, head, and body orientation from an egocentric human point of view. We collect this data across a range of diverse variables, including age, gender, geographical context, crossing type, time of day, and the presence of a companion.

For simulation engines to leverage such dense data, efficient 3D human and scene reconstruction algorithms must be available. The increased resolution and model-free nature of Neural Radiance Fields for large-scene reconstruction and human motion synthesis come at the cost of long training times and excessive memory requirements, and little has been done to reduce the resources required at training time in a manner that supports both dynamic and static tasks. Our second contribution is an efficient method that reduces the memory footprint, improves accuracy, and lowers amortized processing time during both training and inference. We demonstrate that deliberately separating view-dependent appearance from view-independent density estimation improves novel view synthesis of static scenes as well as dynamic human shape and motion. Further, we show that our method, UNeRF, can augment other state-of-the-art reconstruction techniques, further accelerating and enhancing the improvements they present.
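The separation the abstract describes follows the standard radiance-field factorization: volume density depends only on 3D position, while emitted colour also depends on the viewing direction. The sketch below illustrates that split with a toy two-headed network; all dimensions, weights, and function names are hypothetical placeholders, not the thesis's actual UNeRF architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes for illustration only.
D_POS, D_DIR, D_FEAT = 3, 3, 16
W1 = rng.normal(size=(D_POS, 32), scale=0.5)
W2 = rng.normal(size=(32, D_FEAT), scale=0.5)
W_sigma = rng.normal(size=(D_FEAT, 1), scale=0.5)        # density head: position features only
W_rgb = rng.normal(size=(D_FEAT + D_DIR, 3), scale=0.5)  # colour head: features + view direction

def trunk(position):
    # Shared trunk: map a 3D position to latent features.
    return np.tanh(np.tanh(position @ W1) @ W2)

def query(position, view_dir):
    feat = trunk(position)
    # View-independent density: never sees the viewing direction,
    # so it can in principle be computed once and reused across views.
    sigma = np.log1p(np.exp(feat @ W_sigma))  # softplus keeps density non-negative
    # View-dependent appearance: conditioned on the viewing direction.
    rgb = 1.0 / (1.0 + np.exp(-(np.concatenate([feat, view_dir], axis=-1) @ W_rgb)))
    return sigma, rgb

# Same point, two different viewing directions:
p = np.array([[0.1, 0.2, 0.3]])
s1, c1 = query(p, np.array([[1.0, 0.0, 0.0]]))
s2, c2 = query(p, np.array([[0.0, 1.0, 0.0]]))
# Density is identical across views; colour varies with the view direction.
```

Because the density branch is view-independent, its output for a given sample point can be cached rather than recomputed for every camera, which is one route to the training-time savings the abstract claims.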


Rights

Attribution-NonCommercial-ShareAlike 4.0 International