Measurement coding for compressive imaging using a structural measurement matrix

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

41 Scopus citations

Abstract

Compressive imaging acquires an image signal in an under-sampled (i.e., below the Nyquist rate) representation called a measurement. However, measurement compression still poses an essential problem for overall rate-distortion performance. In this paper, we propose a measurement prediction method in which the best predictor is selected directionally in order to reduce the entropy of the measurements to be sent. In general, measurement prediction works well with small blocks, while recovery quality is known to be better with large blocks. To overcome this dilemma, we propose a structural measurement matrix with which compressive sensing is performed on small blocks but recovery is performed on large blocks. In this way, both prediction and recovery are expected to improve at the same time. Experimental results show the superiority of the proposed measurement coding, with bitrate reductions of up to 39%.
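The key structural idea in the abstract can be illustrated with a minimal NumPy sketch: a block-diagonal measurement matrix senses each small block independently (so neighboring blocks' measurements line up for prediction), yet the stacked measurements of several small blocks are exactly the measurements of one large block, which can then be recovered jointly. All names, block sizes, and the Gaussian sensing matrix below are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

B = 4            # small block side (sensing unit); illustrative choice
n = B * B        # pixels per small block
m = 8            # measurements per small block (subrate m/n = 0.5)
k = 4            # number of small blocks grouped into one large block

# Per-block sensing matrix (random Gaussian, a common generic choice).
phi_small = rng.standard_normal((m, n)) / np.sqrt(m)

# Structural measurement matrix: block-diagonal repetition of phi_small.
phi_large = np.kron(np.eye(k), phi_small)            # shape (k*m, k*n)

x_large = rng.standard_normal(k * n)                 # one "large block" signal

# Sensing the large block with the structural matrix equals sensing each
# small block independently and concatenating the measurements.
y_joint = phi_large @ x_large
y_blockwise = np.concatenate(
    [phi_small @ x_large[i * n:(i + 1) * n] for i in range(k)]
)
assert np.allclose(y_joint, y_blockwise)
print(phi_large.shape)
```

Because the two measurement vectors coincide, an encoder can predict and entropy-code measurements block by block at the small size, while a decoder treats the concatenated measurements as those of the large block for recovery.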

Original language: English
Title of host publication: 2013 IEEE International Conference on Image Processing, ICIP 2013 - Proceedings
Publisher: IEEE Computer Society
Pages: 10-13
Number of pages: 4
ISBN (Print): 9781479923410
DOIs
State: Published - 2013
Event: 2013 20th IEEE International Conference on Image Processing, ICIP 2013 - Melbourne, VIC, Australia
Duration: 15 Sep 2013 – 18 Sep 2013

Publication series

Name: 2013 IEEE International Conference on Image Processing, ICIP 2013 - Proceedings

Conference

Conference: 2013 20th IEEE International Conference on Image Processing, ICIP 2013
Country/Territory: Australia
City: Melbourne, VIC
Period: 15/09/13 – 18/09/13

Keywords

  • compressive imaging
  • measurement prediction
  • structural measurement matrix

