This paper discusses mobile robot localization by means of geometric features from a laser range finder and a CCD camera. The features are line segments extracted from the laser scans and vertical edges from the camera images. Emphasis is placed on sensor models with a strong physical basis. For both sensors, uncertainties in the calibration and measurement process are adequately modeled and propagated through the feature extractors. This yields observations with their first-order covariance estimates, which are passed to an extended Kalman filter for fusion and position estimation. Experiments on a real platform show that, in contrast to using the laser range finder alone, the multisensor setup keeps the uncertainty bounded in difficult localization situations such as long corridors and significantly reduces uncertainty, particularly in the orientation. The experiments further demonstrate the applicability of such a multisensor localization system in real time on a fully autonomous robot.
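The fusion step described above can be sketched as a standard extended Kalman filter measurement update, where each extracted feature arrives with its first-order covariance estimate. The sketch below is illustrative only: the (alpha, r) line parameterization, the map entry, the measurement function, and all numeric values are assumptions for the example, not the paper's actual implementation.

```python
import numpy as np

def ekf_update(x, P, z, R, h, H):
    """One EKF measurement update.

    x : (3,) robot pose estimate [x, y, theta]
    P : (3,3) pose covariance
    z : (m,) observed feature parameters (e.g. a line as [alpha, r])
    R : (m,m) first-order feature covariance from the extractor
    h : function mapping pose -> predicted measurement
    H : (m,3) Jacobian of h at x
    """
    v = z - h(x)                        # innovation
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x + K @ v                   # updated pose
    P_new = (np.eye(len(x)) - K @ H) @ P  # updated covariance
    return x_new, P_new

# Hypothetical example: fuse one wall line stored in the map in
# world-frame (alpha, r) form, observed by the laser feature extractor.
alpha_w, r_w = 0.0, 5.0                 # assumed map entry

def h(x):
    # Predicted line parameters expressed in the robot frame.
    return np.array([alpha_w - x[2],
                     r_w - (x[0] * np.cos(alpha_w) + x[1] * np.sin(alpha_w))])

H = np.array([[0.0, 0.0, -1.0],
              [-np.cos(alpha_w), -np.sin(alpha_w), 0.0]])

x0 = np.array([1.0, 2.0, 0.1])          # prior pose (hypothetical)
P0 = np.diag([0.04, 0.04, 0.02])        # prior pose covariance
z = np.array([-0.12, 3.95])             # observed line parameters
R = np.diag([1e-3, 1e-3])               # extractor covariance for z

x1, P1 = ekf_update(x0, P0, z, R, h, H)
```

In this toy update the orientation variance `P1[2, 2]` shrinks well below its prior value, mirroring the paper's observation that feature observations are particularly effective at reducing orientation uncertainty.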