Impact of Trust Management and Information Sharing to Adversarial Cost in Ranking Systems

Ranking systems, such as those in product review sites and recommender systems, typically use ratings to rank favored items by both their quality and popularity. Since higher-ranked items are more likely to be selected and yield more revenue for their owners, providers of unpopular, low-quality items have strong incentives to manipulate their rankings strategically. This paper analyzes the adversarial cost of manipulating these rankings in a variety of scenarios. In particular, we analyze and compare the adversarial cost of attacking ranking systems that use various trust measures to detect and eliminate malicious ratings with that of systems using no such trust mechanism. We provide theoretical results relating the capability of the trust mechanism in detecting malicious ratings to the minimum adversarial cost of successfully changing the ranking. Furthermore, we study the impact of sharing trust information between ranking systems on the adversarial cost. We prove that, under certain assumptions, sharing information between two ranking systems about common user identities and detected malicious behavior can significantly increase the minimum adversarial cost of successfully attacking either of them. Our numerical evaluation shows that the estimated adversarial cost of manipulating the item ranking can be made significant when proper trust mechanisms are employed or combined.
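The attack-cost trade-off the abstract describes can be sketched in a toy model: items are ranked by mean rating, the adversary injects maximum-value fake ratings to lift a low-quality item above a rival, and a trust mechanism that detects and discards a fraction of the fakes inflates the expected cost. All names, the mean-rating rule, and the detection model below are illustrative assumptions, not the paper's actual model.

```python
import math

def min_fake_ratings(target_ratings, rival_ratings,
                     max_rating=5.0, detect_rate=0.0):
    """Smallest expected number of max-value fake ratings the adversary
    must inject so the target item's mean rating exceeds the rival's,
    when the trust mechanism detects and removes a fraction
    `detect_rate` of the injected fakes (hypothetical model)."""
    rival_mean = sum(rival_ratings) / len(rival_ratings)
    if max_rating <= rival_mean:
        return None  # fakes can never raise the target's mean above the rival
    s, n = sum(target_ratings), len(target_ratings)
    k = 0  # surviving (undetected) fake ratings needed
    while (s + k * max_rating) / (n + k) <= rival_mean:
        k += 1
    # each surviving fake costs 1 / (1 - detect_rate) injected fakes on average
    return math.ceil(k / (1.0 - detect_rate))

# Example: without detection the adversary needs 7 fake 5-star ratings;
# a mechanism that catches half of them doubles the expected cost to 14.
print(min_fake_ratings([2, 2, 2], [4, 4, 4]))                   # 7
print(min_fake_ratings([2, 2, 2], [4, 4, 4], detect_rate=0.5))  # 14
```

In this toy setting the cost scales as 1/(1 - detect_rate), which mirrors the abstract's qualitative claim that stronger detection (or pooled detection across systems) drives up the minimum adversarial cost.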

Published in:
Proc. of the 4th IFIP International Conference on Trust Management (IFIPTM 2010)
Presented at:
4th IFIP WG 11.11 International Conference on Trust Management (IFIPTM 2010), Morioka, Iwate, Japan, June 14-18, 2010

Record created 2010-02-25, last modified 2018-03-17
