Title
Tackling Peer-to-Peer Discrimination in the Sharing Economy
Abstract
Sharing economy platforms such as Airbnb and Uber face a major challenge in the form of peer-to-peer discrimination based on sensitive personal attributes such as race and gender. As a recent study under controlled settings has shown, reputation systems can eliminate social biases on these platforms by building trust between users. However, for this to work in practice, the reputation systems must themselves be non-discriminatory. In fact, a biased reputation system will further reinforce the bias and create a vicious feedback loop. Given that reputation scores are generally aggregates of ratings that human users provide to one another, it is not surprising that the scores often inherit human bias. In this paper, we address the problem of making reputation systems on sharing economy platforms fairer and less biased. We show that a game-theoretic incentive mechanism can be used to encourage users to go against the common bias and provide truthful ratings of others, obtained through a more careful and deeper evaluation. In situations where an incentive mechanism cannot be implemented, we show that a simple post-processing approach can also be used to correct bias in the reputation scores while minimizing the loss of the useful information that the scores provide. We evaluate the proposed solutions on synthetic and real datasets from Airbnb.
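The post-processing idea mentioned in the abstract can be illustrated with a minimal sketch. This is a generic illustration of debiasing reputation scores after the fact, not the specific algorithm proposed in the paper: per-group quantile mapping aligns each sensitive group's score distribution with the pooled distribution while preserving the ranking within each group (so most of the useful ordering information in the scores is retained). The function name debias_scores and the toy data are illustrative assumptions.

```python
import numpy as np

def debias_scores(scores, groups):
    """Map each group's reputation scores onto the pooled score
    distribution via quantile matching, so that no group ends up with
    systematically lower scores while within-group rankings are kept.

    Generic illustration only; not the method proposed in the paper.
    """
    scores = np.asarray(scores, dtype=float)
    groups = np.asarray(groups)
    pooled = np.sort(scores)                 # reference (pooled) distribution
    adjusted = np.empty_like(scores)
    for g in np.unique(groups):
        idx = np.where(groups == g)[0]
        # within-group rank of each member, mapped to a quantile in (0, 1)
        ranks = scores[idx].argsort().argsort()
        q = (ranks + 0.5) / len(idx)
        # read the pooled distribution at those quantile positions
        adjusted[idx] = np.quantile(pooled, q)
    return adjusted

# toy example: group "b" is rated systematically lower than group "a"
scores = [4.9, 4.8, 4.7, 4.2, 4.1, 4.0]
groups = ["a", "a", "a", "b", "b", "b"]
print(debias_scores(scores, groups))
```

After the mapping, both groups occupy the same range of scores, and only each user's relative standing within their own group determines their adjusted score.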
Year
2020
DOI
10.1145/3394231.3397926
Venue
WebSci '20: 12th ACM Conference on Web Science, Southampton, United Kingdom, July 2020
DocType
Conference
ISBN
978-1-4503-7989-2
Citations
0
PageRank
0.34
References
0
Authors
3
Name | Order | Citations | PageRank
Naman Goel | 1 | 11 | 3.60
Maxime Rutagarama | 2 | 0 | 0.34
Boi Faltings | 3 | 3586 | 331.33