Choosing the best among an increasing number of options requires reliable and accurate information. As our time and resources are limited, we commonly rely on the experience of others when making decisions. Reputation mechanisms formally aggregate the feedback collected from peers and compute the ``reputation'' of products, services, or providers. They enjoy huge success and are believed to be key to the agent-mediated commerce of tomorrow. Obtaining honest feedback from self-interested agents is, however, not a trivial problem. Mechanisms based on side-payments can be designed such that honest reporting becomes rational (i.e., a Nash equilibrium). Unfortunately, alongside every incentive-compatible Nash equilibrium there also seems to exist a dishonest Nash equilibrium that is sometimes more attractive. In this paper we analyze two incentive-compatible reputation mechanisms and investigate how such undesired equilibria can be eliminated by using trusted (i.e., true) reports.