Title
Simulated annealing for optimization of graphs and sequences
Abstract
Optimization of discrete structures aims to generate a new structure with a better property from an existing one, a fundamental problem in machine learning. Unlike continuous optimization, realistic applications of discrete optimization (e.g., text generation) are challenging due to the complex, long-range constraints, covering both syntax and semantics, in discrete structures. In this work, we present SAGS, a novel Simulated Annealing framework for Graph and Sequence optimization. The key idea is to integrate powerful neural networks into metaheuristics (e.g., simulated annealing, SA) to restrict the search space of discrete optimization. We start by defining a sophisticated objective function, involving the property of interest and pre-defined constraints (e.g., grammatical validity). SAGS searches the discrete space towards this objective by performing a sequence of local edits, where deep generative neural networks propose the editing content and thereby control the quality of the edits. We evaluate SAGS on paraphrase generation and molecule generation, corresponding to sequence optimization and graph optimization, respectively. Extensive results show that our approach achieves state-of-the-art performance compared with existing paraphrase generation methods in terms of both automatic and human evaluation. Further, SAGS also significantly outperforms all previous methods in molecule generation.
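The abstract describes simulated annealing over discrete structures: score a candidate with an objective, propose a local edit, and accept it via the Metropolis criterion under a decreasing temperature. The sketch below illustrates that generic loop on token sequences; all names and the toy objective are hypothetical, and a random sampler stands in for the neural proposal networks used in the paper.

```python
import math
import random

random.seed(0)

VOCAB = list("abcde")

def objective(seq):
    # Toy stand-in for "property of interest + constraints":
    # reward occurrences of "ab", penalize deviation from length 6.
    score = sum(1 for i in range(len(seq) - 1) if seq[i:i + 2] == ["a", "b"])
    return score - 0.1 * abs(len(seq) - 6)

def propose(seq):
    # One local edit: replace, insert, or delete at a random position.
    # (SAGS instead lets a generative network propose the edit content.)
    new = list(seq)
    op = random.choice(["replace", "insert", "delete"])
    pos = random.randrange(max(len(new), 1))
    if op == "replace" and new:
        new[pos] = random.choice(VOCAB)
    elif op == "insert":
        new.insert(pos, random.choice(VOCAB))
    elif op == "delete" and len(new) > 1:
        del new[pos]
    return new

def anneal(seq, steps=2000, t0=1.0, cooling=0.995):
    temp = t0
    cur, cur_score = list(seq), objective(seq)
    best, best_score = list(cur), cur_score
    for _ in range(steps):
        cand = propose(cur)
        delta = objective(cand) - cur_score
        # Metropolis criterion: always accept improvements; accept
        # worse candidates with probability exp(delta / temp).
        if delta >= 0 or random.random() < math.exp(delta / temp):
            cur, cur_score = cand, cur_score + delta
            if cur_score > best_score:
                best, best_score = list(cur), cur_score
        temp *= cooling  # geometric cooling schedule
    return best, best_score

best, score = anneal(list("ccccc"))
print("".join(best), round(score, 2))
```

Starting from an arbitrary sequence, the loop gradually concentrates on high-objective structures as the temperature drops; swapping the random `propose` for a learned editor is the step that restricts the search to fluent, valid candidates.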
Year
2021
DOI
10.1016/j.neucom.2021.09.003
Venue
Neurocomputing
Keywords
Sequence optimization, Simulated annealing, Graph optimization, Paraphrase generation
DocType
Journal
Volume
465
ISSN
0925-2312
Citations
0
PageRank
0.34
References
0
Authors
8
Name            Order  Citations  PageRank
Xianggen Liu    1      3          2.07
Pengyong Li     2      1          1.36
Fandong Meng    3      31         19.11
Hao Zhou        4      0          0.34
Huasong Zhong   5      0          0.34
Jie Zhou        6      2103       190.17
Lili Mou        7      520        33.31
Sen Song        8      299        22.35