Camera Control for Learning Nonlinear Target Dynamics via Bayesian Non-Parametric Dirichlet-Process Gaussian-Process (DPGP) Models

Mobile sensors are often deployed to cooperatively perform surveillance of an environment and track objects of interest: for example, intrusion detection throughout a secure facility, or anomaly detection in a manufacturing plant. In many such cases it is infeasible to provide accurate sensor coverage of the entire environment, so an environment model and an associated controller that account for sensor field-of-view (FoV) geometry are needed to determine sensor configurations that minimize uncertainty. Additionally, when detailed environment structure is unavailable a priori, the model must be flexible enough to incorporate structure as it is learned.
In collaboration with Prof. Silvia Ferrari’s group in the Laboratory for Intelligent Systems and Controls at Duke University, we have developed a new information value function for the DPGP model that can be used to guide sensor actions. The feasibility of real-time multi-sensor control with this method was demonstrated using fixed-position pan/tilt/zoom cameras to track the behaviors of iRobot Create® ground vehicles. Ongoing work aims to incorporate a notion of risk, so that high uncertainty is deemed acceptable when it is not mission-relevant.
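The summary above does not specify the form of the information value function, so the sketch below is only an illustrative surrogate: it scores a candidate camera FoV by the expected entropy reduction of a single zero-mean Gaussian process over the environment (the actual DPGP method maintains a mixture of GPs over target behaviors). All function names, kernel hyperparameters, and the entropy-reduction formula here are assumptions for illustration, not the published method.

```python
import numpy as np

def rbf_kernel(A, B, length_scale=1.0, signal_var=1.0):
    """Squared-exponential kernel between point sets A (n x d) and B (m x d)."""
    d2 = (np.sum(A**2, axis=1)[:, None]
          + np.sum(B**2, axis=1)[None, :]
          - 2.0 * A @ B.T)
    return signal_var * np.exp(-0.5 * d2 / length_scale**2)

def gp_posterior_variance(X_obs, X_query, noise_var=1e-2):
    """Marginal posterior variance of a zero-mean GP at X_query,
    conditioned on (noisy) observations taken at X_obs."""
    K = rbf_kernel(X_obs, X_obs) + noise_var * np.eye(len(X_obs))
    Ks = rbf_kernel(X_query, X_obs)
    Kss_diag = np.diag(rbf_kernel(X_query, X_query))
    L = np.linalg.cholesky(K)
    v = np.linalg.solve(L, Ks.T)          # L v = Ks^T
    return Kss_diag - np.sum(v**2, axis=0)

def information_value(fov_points, X_obs, noise_var=1e-2):
    """Illustrative value of pointing a camera at fov_points: the entropy
    reduction 0.5 * sum log(1 + sigma^2 / noise_var) from observing them."""
    var = gp_posterior_variance(X_obs, fov_points, noise_var)
    return 0.5 * np.sum(np.log1p(var / noise_var))
```

A controller could then evaluate `information_value` for each reachable pan/tilt/zoom configuration and select the FoV with the highest score, which in this simplified setting steers cameras toward regions of the workspace that past target observations have not yet covered.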