Title
Compressive-Sensing-Based Video Codec by Autoregressive Prediction and Adaptive Residual Recovery
Abstract
This paper presents a compressive-sensing- (CS-) based video codec suitable for wireless video systems that require simple encoders but tolerate more complex decoders. At the encoder side, each video frame is independently measured by a block-based random matrix, and the resulting measurements are encoded into a compressed bitstream by entropy coding. To reduce the quantization error of the measurements, nonuniform quantization is integrated into the DPCM-based quantizer. At the decoder side, a novel joint reconstruction algorithm is proposed to improve the quality of the reconstructed video frames. The algorithm first uses a temporal autoregressive (AR) model to generate the side information (SI) of each video frame and then recovers the residual between the original frame and the corresponding SI. To exploit the sparsity of the residual, whose statistics vary locally, principal component analysis (PCA) is used to learn online a transform matrix adapted to the residual structure. Extensive experiments validate that the joint reconstruction algorithm of the proposed codec achieves much better results than many existing methods in terms of both reconstruction quality and computational complexity. The rate-distortion performance of the proposed codec is superior to that of state-of-the-art CS-based video codecs, although a considerable gap remains with respect to traditional video codecs.
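The two steps named in the abstract can be illustrated concretely. Below is a minimal sketch, not taken from the paper, assuming a 16x16 block size, a 0.25 measurement subrate, a Gaussian measurement matrix, and toy data; the function names (block_measure, learn_pca_basis) and all parameter choices are illustrative. It shows block-based random measurement of a frame at the encoder and online PCA learning of a transform matrix from residual blocks at the decoder.

```python
# Minimal sketch (not the authors' implementation) of two ideas from the abstract:
# (1) block-based random CS measurement of a frame, and
# (2) learning a PCA transform online from residual blocks so that the
#     residual is sparse in the learned basis.
import numpy as np

BLOCK = 16       # assumed block size
SUBRATE = 0.25   # assumed measurement subrate (M/N)

def block_measure(frame, phi):
    """Measure each non-overlapping block of `frame` with the same matrix `phi`."""
    h, w = frame.shape
    meas = []
    for i in range(0, h, BLOCK):
        for j in range(0, w, BLOCK):
            x = frame[i:i+BLOCK, j:j+BLOCK].reshape(-1)  # vectorize the block
            meas.append(phi @ x)                         # y = Phi * x
    return np.stack(meas)                                # one measurement vector per block

def learn_pca_basis(residual_blocks):
    """Learn an orthonormal transform from vectorized residual blocks via PCA
    (eigenvectors of the sample covariance, ordered by decreasing variance)."""
    X = residual_blocks - residual_blocks.mean(axis=0, keepdims=True)
    cov = X.T @ X / max(len(X) - 1, 1)
    eigvals, eigvecs = np.linalg.eigh(cov)
    return eigvecs[:, ::-1]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = BLOCK * BLOCK
    m = int(SUBRATE * n)
    phi = rng.standard_normal((m, n)) / np.sqrt(m)  # random measurement matrix
    frame = rng.random((64, 64))                    # toy "video frame"
    y = block_measure(frame, phi)
    print("measurements per block:", y.shape)       # (16, 64)

    # Toy residual blocks; in the codec these would be frame minus the AR-predicted SI.
    residuals = rng.standard_normal((100, n)) * 0.1
    basis = learn_pca_basis(residuals)
    print("learned transform shape:", basis.shape)  # (256, 256)
```

In the actual codec the residual blocks would come from the difference between the current frame and its AR-generated side information; random data stands in for them here only to make the sketch self-contained.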
Year
2015
DOI
10.1155/2015/562840
Venue
INTERNATIONAL JOURNAL OF DISTRIBUTED SENSOR NETWORKS
Field
Entropy encoding, Computer science, Adaptive Multi-Rate audio codec, Algorithm, Real-time computing, Residual frame, Encoder, Intra-frame, Quantization (signal processing), Deblocking filter, Codec, Distributed computing
DocType
Journal
Volume
11
ISSN
1550-1477
Citations
2
PageRank
0.41
References
31
Authors
4
Name          Order  Citations  PageRank
Ran Li        1      2          0.75
Hongbing Liu  2      59         8.74
Rui Xue       3      2          0.41
Yan-ling Li   4      21         4.28