Abstract

In this talk we present work on sensor-based motion planning in initially unknown dynamic environments. Motion detection and probabilistic motion modeling are combined with a smooth navigation function to perform on-line path planning and replanning in cluttered dynamic environments such as public exhibitions. Human behavior is unforeseeable in most situations involving human-robot interaction, e.g. service robots or robotic companions. This makes motion prediction problematic (humans rarely move, for instance, with constant velocity along straight lines), especially in settings with large numbers of people. Additionally, the robot is usually required to react swiftly rather than optimally; in other words, the time required to compute the plan becomes part of the optimality criterion. The "Probabilistic Navigation Function" (PNF) is an approach to planning in such cluttered dynamic environments. It relies on probabilistic worst-case computations of the collision risk and weighs regions based on that estimate. The PNF is intended for gradient-descent control of a vehicle, where the gradient indicates the best trade-off between risk and detour. An underlying reactive collision avoidance provides the tight perception-action loop needed to cope with the remaining collision probability. As this is work in progress, we present the approach, describe the finished components, and give an outlook on remaining implementation issues.

Two algorithmic building blocks have been developed and tested. First, on-line motion detection from a mobile platform is performed by the SLIP scan alignment method, which separates static from dynamic objects (and also aids pose estimation). Second, the interface between motion detection and path planning is a probabilistic co-occurrence estimation that measures the risk of future collisions given environment constraints and worst-case scenarios, unifying dynamic and static elements.
The risk is translated into traversal costs for an E* path planner, which produces smooth navigation functions that can incorporate new environmental information in near real-time.
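The worst-case flavor of the co-occurrence estimate can be illustrated with a toy reachability check. This is a hypothetical simplification, not the actual PNF computation (which yields a probability rather than a binary answer and accounts for environment constraints): a point is considered at risk if a dynamic object moving at its maximum speed could reach it no later than the robot.

```python
import math

def worst_case_co_occurrence(robot, obj, point, v_robot, v_obj):
    """Toy worst-case check: can the object, moving at its maximum
    speed v_obj, reach `point` no later than the robot moving at
    v_robot can? Positions are (x, y) tuples; speeds are positive
    scalars. Hypothetical illustration, not the PNF formulation."""
    t_robot = math.dist(robot, point) / v_robot  # robot's earliest arrival
    t_obj = math.dist(obj, point) / v_obj        # object's earliest arrival
    return t_obj <= t_robot
```

Cells for which such a check holds would receive a nonzero collision risk; the actual estimation additionally exploits environment constraints, e.g. walls that limit where an object can move.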
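The translation of risk into traversal costs and the resulting gradient-descent behavior can be sketched as follows. This is a minimal stand-in under stated assumptions: E* computes interpolated navigation functions and supports incremental replanning, whereas this sketch runs a plain Dijkstra wavefront on a 4-connected grid, and the `risk_to_cost` mapping is a hypothetical choice, not the one used in the PNF.

```python
import heapq

def risk_to_cost(risk):
    # Hypothetical mapping: traversal cost grows rapidly as the
    # estimated collision risk approaches certainty.
    return 1.0 / (1.0 - min(risk, 0.99))

def navigation_function(costmap, goal):
    # Dijkstra wavefront from the goal: value[r][c] is the accumulated
    # traversal cost to the goal (a discrete navigation function).
    rows, cols = len(costmap), len(costmap[0])
    value = [[float("inf")] * cols for _ in range(rows)]
    gr, gc = goal
    value[gr][gc] = 0.0
    pq = [(0.0, gr, gc)]
    while pq:
        v, r, c = heapq.heappop(pq)
        if v > value[r][c]:
            continue  # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nv = v + costmap[nr][nc]  # cost of entering the neighbor
                if nv < value[nr][nc]:
                    value[nr][nc] = nv
                    heapq.heappush(pq, (nv, nr, nc))
    return value

def descend(value, start):
    # Greedy descent on the navigation function: at each step, move to
    # the neighboring cell with the lowest value until the goal (value 0).
    path = [start]
    r, c = start
    while value[r][c] > 0.0:
        nbrs = [(r + dr, c + dc) for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))
                if 0 <= r + dr < len(value) and 0 <= c + dc < len(value[0])]
        nr, nc = min(nbrs, key=lambda p: value[p[0]][p[1]])
        if value[nr][nc] >= value[r][c]:
            break  # local minimum; cannot occur in a true navigation function
        r, c = nr, nc
        path.append((r, c))
    return path
```

Following the steepest decrease of the value function trades off risk against detour length: a high-risk cell inflates the wavefront cost around it, so the descent bends around the risky region instead of crossing it.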
