Title
Convolution by Evolution: Differentiable Pattern Producing Networks.
Abstract
In this work we introduce a differentiable version of the Compositional Pattern Producing Network, called the DPPN. Unlike a standard CPPN, the topology of a DPPN is evolved but the weights are learned. A Lamarckian algorithm that combines evolution and learning produces DPPNs to reconstruct an image. Our main result is that DPPNs can be evolved/trained to compress the weights of a denoising autoencoder from 157684 to roughly 200 parameters, while achieving a reconstruction accuracy comparable to a fully connected network with more than two orders of magnitude more parameters. The regularization ability of the DPPN allows it to rediscover (approximate) convolutional network architectures embedded within a fully connected architecture. Such convolutional architectures are the current state of the art for many computer vision applications, so it is satisfying that DPPNs are capable of discovering this structure rather than having to build it in by design. DPPNs exhibit better generalization than directly encoded fully connected autoencoders when tested on the Omniglot dataset after being trained on MNIST. DPPNs are therefore a new framework for integrating learning and evolution.
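The following is a minimal sketch (in PyTorch, not the authors' code) of the indirect-encoding idea the abstract describes: a small coordinate-to-weight network whose handful of parameters generates the full weight matrix of a denoising autoencoder, and whose weights are learned by backpropagating the reconstruction loss through the generated network. The topology evolution and Lamarckian inheritance are omitted; the layer sizes, CPPN-style inputs (pixel coordinates, hidden-unit coordinate, radial distance), tied decoder weights, and corruption level are illustrative assumptions, not details from the paper.

# Sketch of a DPPN-style indirect encoding (assumed implementation, not the paper's code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class DPPN(nn.Module):
    """Small coordinate -> weight network. The topology is fixed here for
    simplicity; in the paper it is evolved while the weights are learned."""
    def __init__(self, hidden=16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(4, hidden), nn.Tanh(),   # inputs: (x_in, y_in, x_hidden, radial distance)
            nn.Linear(hidden, 1),
        )

    def forward(self, coords):
        return self.net(coords).squeeze(-1)

def make_coords(img_side=28, n_hidden=32):
    """Coordinates for every (input pixel, hidden unit) weight of a 784 -> n_hidden encoder."""
    pos = torch.linspace(-1, 1, img_side)
    ys, xs = torch.meshgrid(pos, pos, indexing="ij")
    pix = torch.stack([xs.reshape(-1), ys.reshape(-1)], dim=1)      # (784, 2)
    hid = torch.linspace(-1, 1, n_hidden).unsqueeze(1)              # (n_hidden, 1)
    p = pix.unsqueeze(1).expand(-1, n_hidden, -1)                   # (784, n_hidden, 2)
    h = hid.unsqueeze(0).expand(pix.size(0), -1, -1)                # (784, n_hidden, 1)
    dist = (p ** 2).sum(-1, keepdim=True).sqrt()                    # radial input, as in CPPNs
    return torch.cat([p, h, dist], dim=-1)                          # (784, n_hidden, 4)

# Indirect encoding: roughly 100 DPPN parameters produce the full 784 x n_hidden weight matrix.
dppn, coords = DPPN(), make_coords()
opt = torch.optim.Adam(dppn.parameters(), lr=1e-3)
x = torch.rand(64, 784)                        # stand-in batch; the paper uses MNIST images
for _ in range(100):
    W = dppn(coords)                           # (784, n_hidden), generated rather than stored
    noisy = x + 0.3 * torch.randn_like(x)      # denoising-autoencoder corruption
    h = torch.sigmoid(noisy @ W)               # encode
    recon = torch.sigmoid(h @ W.t())           # decode with tied weights (an assumption)
    loss = F.binary_cross_entropy(recon, x)
    opt.zero_grad()
    loss.backward()
    opt.step()

In this sketch roughly 100 DPPN parameters generate about 25,000 autoencoder weights, mirroring at a smaller scale the roughly 200-parameter compression of the 157684-weight autoencoder reported in the abstract.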
Year
2016
DOI
10.1145/2908812.2908890
Venue
GECCO
Keywords
CPPNs, Compositional Pattern Producing Networks, denoising autoencoder, MNIST
DocType
Conference
Volume
abs/1606.02580
Citations
19
PageRank
0.74
References
19
Authors
8
Name                  Order  Citations  PageRank
Chrisantha Fernando   1      314        24.46
Dylan Banarse         2      19         0.74
Malcolm Reynolds      3      100        4.30
Frederic Besse        4      100        5.17
David Pfau            5      80         6.76
Max Jaderberg         6      1614       54.60
Marc Lanctot          7      2121       97.97
Daan Wierstra         8      5412       255.92