In this paper, we present a stereovision algorithm for real-time 6DoF ego-motion estimation that integrates normalized cross-correlation (NCC) based feature tracking and 3D stereo information within the well-known Iterative Closest Point (ICP) scheme. The proposed method addresses a basic limitation of standard ICP, namely its inability to segment the data points and to handle large displacements. Neither a priori knowledge of the motion nor input from other sensors is required; the only assumption is that the scene always contains visually distinctive features that can be tracked over pairs of successive stereo images. The result is what is usually called Visual Odometry. The paper details the various steps of the algorithm and presents the results of experimental tests performed with an all-terrain mobile robot, showing the method to be both accurate and effective for autonomous navigation purposes.
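As a concrete illustration of the core computation involved (a sketch, not the authors' implementation), the inner step of an ICP-style scheme estimates the rigid motion between two sets of corresponding 3D points, such as stereo-triangulated features tracked across successive frames. The closed-form SVD solution below (the function name `rigid_transform_3d` is our own) recovers the rotation and translation in the least-squares sense:

```python
import numpy as np

def rigid_transform_3d(P, Q):
    """Least-squares rigid motion (R, t) mapping point set P onto Q.

    P, Q: (N, 3) arrays of corresponding 3D points, e.g. stereo-
    triangulated features tracked between two successive frames.
    This is the standard closed-form SVD (Kabsch) solution used
    in the inner loop of ICP-style alignment.
    """
    # Centre both point sets on their centroids.
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    X, Y = P - cP, Q - cQ
    # SVD of the cross-covariance yields the optimal rotation.
    U, _, Vt = np.linalg.svd(X.T @ Y)
    # Guard against a reflection (det = -1) in degenerate cases.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cQ - R @ cP
    return R, t
```

In a full visual-odometry pipeline, the feature tracker supplies the correspondences, so this alignment step can be solved once per frame pair rather than iterated over nearest-neighbour matches as in classical ICP.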