Accurate registration between real and virtual objects is critical for Augmented Reality (AR) applications. The state of the art shows that no single tracking device is adequate on its own. We present a data fusion framework that combines orientation measurements from different tracker devices. It has been designed to work with a video-based tracker subsystem and an inertial tracker. Thanks to its flexibility, the system can use orientation measurements produced by any kind and number of trackers, regardless of their update rate or physical configuration. The core of this fusion system is a Kalman filter with a single process and measurement model shared by all the trackers. The system weights each tracker according to the quality of its measurements. We have tested the system with synthetic and real orientation data to evaluate its fusion capabilities and to find its limitations. This analysis directs our future work toward the development of a drift corrector and toward extending the filter to make it dynamically adaptive.
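The idea of one shared process and measurement model with per-tracker quality weighting can be illustrated by a minimal sketch. The following is an assumption-laden toy example, not the paper's implementation: it reduces orientation to a single scalar angle with a random-walk process model, and lets each tracker contribute the same measurement model but its own measurement-noise variance, so noisier trackers receive less weight through the Kalman gain.

```python
class ScalarKalmanFusion:
    """Toy 1-D Kalman filter fusing angle readings from several trackers.

    Assumptions (illustrative only): the state is one scalar orientation
    angle, the process model is a random walk with variance q, and every
    tracker shares the measurement model z = x + v, differing only in its
    measurement-noise variance r (its "quality").
    """

    def __init__(self, x0=0.0, p0=1.0, q=0.01):
        self.x = x0  # state estimate (orientation angle, radians)
        self.p = p0  # estimate variance
        self.q = q   # process-noise variance

    def predict(self):
        # Random-walk process: estimate unchanged, uncertainty grows.
        self.p += self.q

    def update(self, z, r):
        # Shared measurement model; r weights this tracker's reading.
        k = self.p / (self.p + r)   # Kalman gain: small r -> large gain
        self.x += k * (z - self.x)
        self.p *= (1.0 - k)


f = ScalarKalmanFusion(x0=0.0)
# Fuse a precise video-tracker reading (small r) and a noisier inertial
# reading (large r) of the same true angle, about 1.0 rad.
for z, r in [(0.98, 0.01), (1.15, 0.2)]:
    f.predict()
    f.update(z, r)
```

After both updates the estimate sits close to the precise tracker's reading, because the gain computed from each tracker's variance down-weights the noisy one; measurements can arrive at any rate, since `predict` simply runs once before each incoming reading.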