Title
Mercator: uncovering faithful hyperbolic embeddings of complex networks.
Abstract
We introduce Mercator, a reliable embedding method to map real complex networks into their hyperbolic latent geometry. The method assumes that the structure of networks is well described by the popularity × similarity S1/H2 static geometric network model, which can accommodate arbitrary degree distributions and reproduces many pivotal properties of real networks, including self-similarity patterns. The algorithm mixes machine learning and maximum likelihood (ML) approaches to infer the coordinates of the nodes in the underlying hyperbolic disk with the best matching between the observed network topology and the geometric model. In its fast mode, Mercator uses a model-adjusted machine learning technique performing dimensional reduction to produce a fast and accurate map, whose quality already outperforms other embedding algorithms in the literature. In the refined Mercator mode, the fast-mode embedding result is taken as an initial condition in an ML estimation, which significantly improves the quality of the final embedding. Apart from its accuracy as an embedding tool, Mercator has the clear advantage of systematically inferring not only node orderings, or angular positions, but also the hidden degrees and global model parameters, and has the ability to embed networks with arbitrary degree distributions. Overall, our results suggest that mixing machine learning and ML techniques in a model-dependent framework can boost the meaningful mapping of complex networks.
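For intuition only, beyond what this record states: in the network-geometry literature the S1 model mentioned in the abstract assigns each node a hidden degree kappa and an angular position theta, and connects pairs with a probability that decays with their effective distance; the ML step then searches for coordinates that maximize the likelihood of the observed adjacency matrix. The sketch below assumes that standard S1 formulation; the function names (s1_connection_prob, log_likelihood) and the parameter values in the usage snippet are illustrative, not Mercator's actual implementation.

import numpy as np

def s1_connection_prob(theta_i, theta_j, kappa_i, kappa_j, beta, mu, R):
    """Standard S1 connection probability:
    p_ij = 1 / (1 + (R * dtheta_ij / (mu * kappa_i * kappa_j))**beta),
    where dtheta_ij is the angular separation on the circle of radius R."""
    dtheta = np.pi - abs(np.pi - abs(theta_i - theta_j) % (2 * np.pi))
    chi = (R * dtheta) / (mu * kappa_i * kappa_j)
    return 1.0 / (1.0 + chi ** beta)

def log_likelihood(adj, theta, kappa, beta, mu, R):
    """Bernoulli log-likelihood of an observed adjacency matrix given node
    coordinates (theta, kappa); an ML embedding seeks coordinates that
    maximize this quantity."""
    n = len(theta)
    ll = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            p = s1_connection_prob(theta[i], theta[j], kappa[i], kappa[j],
                                   beta, mu, R)
            ll += adj[i, j] * np.log(p) + (1 - adj[i, j]) * np.log(1.0 - p)
    return ll

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 5
    theta = rng.uniform(0, 2 * np.pi, n)   # angular (similarity) coordinates
    kappa = rng.uniform(2, 10, n)          # hidden degrees (popularity)
    adj = np.zeros((n, n), dtype=int)
    adj[0, 1] = adj[1, 0] = 1              # toy adjacency matrix
    print(log_likelihood(adj, theta, kappa, beta=2.5, mu=0.05,
                         R=n / (2 * np.pi)))

Under this reading, the fast mode described in the abstract would supply initial angular positions, which the subsequent likelihood maximization then refines.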
Year
2019
DOI
10.1088/1367-2630/ab57d2
Venue
NEW JOURNAL OF PHYSICS
Keywords
complex networks, network geometry, statistical inference
DocType
Journal
Volume
21
Issue
12
ISSN
1367-2630
Citations
1
PageRank
0.35
References
0
Authors
4
Name                     Order  Citations  PageRank
Guillermo García-Pérez   1      1          0.35
Antoine Allard           2      29         5.05
M Ángeles Serrano        3      257        17.84
Marián Boguñá            4      543        35.14