Title
MetaMove: On Improving Human Mobility Classification and Prediction via Metalearning
Abstract
Despite offering efficient solutions to a plethora of novel challenges, existing approaches to mobility modeling require large amounts of labeled data to train effective, application-specific models. This renders them inapplicable in scenarios where only a few samples are observed or where data types are unseen during training. To address these issues, we present a novel mobility learning method, MetaMove, the first metalearning-based model that generalizes mobility prediction and classification in a unified framework. MetaMove tackles the problem of training for unseen mobility patterns by generalizing from known patterns: it trains the model over a variety of patterns sampled from different users and optimizes it over their distribution. To update and fine-tune the individual pattern learners, we employ a fast-adapting, model-agnostic method that works with very few available trajectory samples. MetaMove exploits unlabeled trajectory data at both the metatraining and adaptation levels, thereby alleviating data sparsity while reducing sensitivity to negative samples. We conducted extensive experiments on two practical applications, motion trace discrimination and next check-in prediction, to demonstrate its effectiveness and efficiency. The results show significant improvements of MetaMove over state-of-the-art benchmarks.
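The abstract's "fast-adapting, model-agnostic method" points to a MAML-style episodic update over per-user tasks. Below is a minimal sketch of such an inner/outer adaptation loop, assuming a PyTorch setting; the TrajectoryClassifier, feature dimensions, learning rates, and task format are hypothetical illustrations rather than details from the paper, and the sketch omits MetaMove's use of unlabeled trajectories and its handling of negative samples.

import torch
import torch.nn as nn

# Illustrative trajectory classifier; architecture and sizes are assumptions
# for this sketch, not the model described in the paper.
class TrajectoryClassifier(nn.Module):
    def __init__(self, feat_dim=32, hidden=64, num_classes=2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(feat_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, num_classes),
        )

    def forward(self, x):
        return self.net(x)

def maml_meta_step(model, meta_opt, tasks, inner_lr=0.01, inner_steps=1):
    """One meta-update over a batch of per-user tasks.

    Each task is a tuple (support_x, support_y, query_x, query_y): a handful
    of labeled trajectories used for fast adaptation, plus a held-out split
    used for the meta-objective.
    """
    criterion = nn.CrossEntropyLoss()
    meta_loss = torch.zeros(())
    for sx, sy, qx, qy in tasks:
        # Inner loop: adapt a functional copy of the parameters on the support set.
        fast = dict(model.named_parameters())
        for _ in range(inner_steps):
            loss = criterion(torch.func.functional_call(model, fast, (sx,)), sy)
            grads = torch.autograd.grad(loss, list(fast.values()), create_graph=True)
            fast = {name: p - inner_lr * g
                    for (name, p), g in zip(fast.items(), grads)}
        # Outer objective: evaluate the adapted parameters on the query set.
        meta_loss = meta_loss + criterion(
            torch.func.functional_call(model, fast, (qx,)), qy)
    avg_loss = meta_loss / len(tasks)
    meta_opt.zero_grad()
    avg_loss.backward()
    meta_opt.step()
    return avg_loss.item()

# Example usage with random data (purely illustrative):
model = TrajectoryClassifier()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
tasks = [(torch.randn(5, 32), torch.randint(0, 2, (5,)),
          torch.randn(5, 32), torch.randint(0, 2, (5,)))
         for _ in range(4)]
print(maml_meta_step(model, opt, tasks))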
Year
2022
DOI
10.1109/TCYB.2021.3049533
Venue
IEEE Transactions on Cybernetics
Keywords
Algorithms, Humans
DocType
Journal
Volume
52
Issue
8
ISSN
2168-2267
Citations
0
PageRank
0.34
References
26
Authors
4
Name, Order, Citations, PageRank
Fan Zhou, 1, 101, 23.20
Xin Liu, 2, 0, 0.34
Ting Zhong, 3, 0, 0.34
Goce Trajcevski, 4, 1732, 141.26