Title
Scalable Learned Image Compression With A Recurrent Neural Networks-Based Hyperprior
Abstract
Recently, learned image compression has made great progress, as exemplified by the hyperprior model and its variants based on convolutional neural networks (CNNs). However, CNN-based models are not suited to scalable coding, and multiple models must be trained separately to achieve variable rates. In this paper, we incorporate differentiable quantization and accurate entropy models into recurrent neural network (RNN) architectures to achieve scalable learned image compression. First, we present an RNN architecture with quantization and entropy coding. To realize scalable coding, we allocate bits to multiple layers by adjusting the layer-wise lambda values in the Lagrangian-multiplier-based rate-distortion optimization function. Second, we add an RNN-based hyperprior to improve the accuracy of the entropy models for multiple-layer residual representations. Experimental results demonstrate that our performance is comparable with recent CNN-based hyperprior methods on the Kodak dataset. Moreover, our method is a scalable and flexible coding approach that achieves multiple rates with a single model, which is very appealing.
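The layer-wise Lagrangian rate-distortion objective mentioned in the abstract can be sketched as follows. This is a minimal illustration, not the authors' code: the function name, the per-layer structure, and the example numbers are assumptions; it only shows how assigning each layer its own lambda lets one model trade off rate and distortion at multiple operating points.

```python
# Hedged sketch of a layer-wise Lagrangian rate-distortion loss for
# scalable coding (illustrative only; not the paper's implementation).
def scalable_rd_loss(rates, distortions, lambdas):
    """rates[l]: estimated bits for layer l; distortions[l]: distortion of
    the reconstruction using layers 0..l; lambdas[l]: layer-wise tradeoff.
    Returns the summed per-layer loss  sum_l (R_l + lambda_l * D_l)."""
    assert len(rates) == len(distortions) == len(lambdas)
    return sum(r + lam * d for r, d, lam in zip(rates, distortions, lambdas))

# Hypothetical example: three refinement layers, with larger lambdas on
# later layers so the enhancement layers emphasize reconstruction quality.
loss = scalable_rd_loss([0.2, 0.15, 0.1], [0.04, 0.02, 0.01], [64, 128, 256])
```

Training with a single summed objective of this form is what allows decoding to stop after any layer, yielding multiple rates from one model.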
Year: 2020
DOI: 10.1109/ICIP40778.2020.9190704
Venue: 2020 IEEE International Conference on Image Processing (ICIP)
Keywords: RNN-based image compression, quantization, entropy coding, RNN-based hyperprior
DocType: Conference
ISSN: 1522-4880
Citations: 0
PageRank: 0.34
References: 0
Authors: 4

Name            Order  Citations  PageRank
Rige Su         1      0          0.34
Zhengxue Cheng  2      281        0.45
Heming Sun      3      922        2.50
Jiro Katto      4      2626       6.14