Entropy-constrained overcomplete-based coding of natural images
Author: mitsue-stanley | Published Date: 2025-05-17
Description: Entropy-constrained overcomplete-based coding of natural images. André F. de Araujo, Maryam Daneshi, Ryan Peng, Stanford University. Outline: Motivation; Overcomplete-based coding: overview; Entropy-constrained overcomplete-based coding.
Transcript: Entropy-constrained overcomplete-based coding of natural images
Entropy-constrained overcomplete-based coding of natural images
André F. de Araujo, Maryam Daneshi, Ryan Peng
Stanford University

Outline
- Motivation
- Overcomplete-based coding: overview
- Entropy-constrained overcomplete-based coding
- Experimental results
- Conclusion
- Future work

Motivation (1)
- Study of new (and unusual) schemes for image compression
- New methods using the overcomplete approach have recently been developed
- These methods addressed restricted compression scenarios and did not fully exploit the approach's characteristics for compression

Motivation (2)
- Why overcomplete? Sparsity in the coefficients yields a better overall rate-distortion (RD) trade-off

Overcomplete coding: overview (1)
- K > N implies the basis functions are not linearly independent
- Example: for 8x8 blocks, N = 64 basis functions are needed to span the space of all possible signals; an overcomplete basis could have K = 128
- Two main tasks: sparse coding and dictionary learning

Overcomplete coding: overview (2)
- Sparse coding ("atom decomposition"): compute the representation coefficients x from the given signal y and the given dictionary D
- Because D is overcomplete, there are infinitely many solutions, so only an approximation is sought
- Commonly used algorithms: Matching Pursuit (MP) and Orthogonal Matching Pursuit (OMP)

Overcomplete coding: overview (3)
- Sparse coding with OMP (see the OMP sketch after the transcript)

Overcomplete coding: overview (4)
- Dictionary learning has two basic stages (analogous to K-means):
  - Sparse coding stage: use a pursuit algorithm to compute x (OMP is usually employed)
  - Dictionary update stage: adopt a particular strategy for updating the dictionary
- Convergence issues: since the first stage does not guarantee the best match, the cost can increase and convergence cannot be assured

Overcomplete coding: overview (5)
- Dictionary learning: the most relevant algorithms in the literature are K-SVD and MOD
- Their sparse coding stages are identical; the codebook update stages differ:
  - MOD: update the entire dictionary at once, using the optimal adjustment for a given coefficient matrix (see the MOD sketch after the transcript)
  - K-SVD: update each basis one at a time using an SVD formulation, which changes both the dictionary and the coefficients

Entropy-const. OC-based coding (1)
Entropy-const. OC-based coding (2)
Entropy-const. OC-based coding (3)
- RD-OMP (see the RD-OMP sketch after the transcript)

Entropy-const. OC-based coding (4)
- EC dictionary learning – key ideas
- Dictionary update strategy: K-SVD modifies both the dictionary and the coefficients, so a reduction in the Lagrangian cost is not assured; we use MOD, which provides the optimal adjustment assuming fixed coefficients
- Introduction of a "rate cost update" stage, analogous to the ECVQ algorithm applied to the training data: two pmfs must be updated, one for the indexes and one for the coefficients (see the pmf-update sketch after the transcript)

Entropy-const. OC-based coding (5)
- EC dictionary learning

Experiments (Setup)
- Rate calculation: an optimal (entropy) codebook for each subband
- Test images: Lena, Boats, Harbour, Peppers
- Dictionary training: 18 Kodak images, downsampled to 128x128, none of which are among the images being coded; downsampling is used because of the very high computational complexity
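Code sketches

The slides reference OMP (overview slide (3)) without its steps surviving in this transcript. Below is a minimal NumPy sketch of standard Orthogonal Matching Pursuit, assuming a column-normalized dictionary D (N x K) and a signal y of length N; the function name and the fixed-sparsity stopping rule are illustrative choices, not taken from the slides.

```python
import numpy as np

def omp(y, D, n_atoms):
    """Greedy OMP: approximate y with n_atoms columns of D."""
    residual = y.copy()
    support = []                          # indexes of selected atoms
    x = np.zeros(D.shape[1])
    coeffs = np.zeros(0)
    for _ in range(n_atoms):
        # Select the atom most correlated with the current residual
        k = int(np.argmax(np.abs(D.T @ residual)))
        support.append(k)
        # Jointly re-fit all selected coefficients (the "orthogonal" step)
        coeffs, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coeffs
    x[support] = coeffs
    return x
```

The joint least-squares re-fit is what distinguishes OMP from plain Matching Pursuit, which only subtracts the single newly selected atom's contribution.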
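For the MOD update on overview slide (5), here is a minimal sketch of the closed-form step, assuming training signals Y (N x M) and fixed sparse coefficients X (K x M): MOD minimizes ||Y - D X||_F^2 over D via the pseudoinverse of X, after which the atoms are conventionally renormalized to unit length. The function name is an assumption.

```python
import numpy as np

def mod_update(Y, X):
    """MOD dictionary update: D = Y X^+ minimizes ||Y - D X||_F^2 for fixed X."""
    D = Y @ np.linalg.pinv(X)
    norms = np.linalg.norm(D, axis=0)
    norms[norms == 0] = 1.0               # avoid dividing an unused atom by zero
    return D / norms
```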
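The transcript names RD-OMP (entropy-constrained slide (3)) but its definition did not survive. The sketch below shows one plausible entropy-constrained atom-selection step consistent with the Lagrangian framing on slide (4): distortion plus lambda times the entropy-coded index cost. The index pmf p_idx, the parameter lam, and the helper name are all assumptions.

```python
import numpy as np

def rd_select_atom(residual, D, p_idx, lam):
    """Pick the atom minimizing distortion + lam * index rate (in bits)."""
    corr = D.T @ residual                 # optimal coefficient per unit-norm atom
    dist = residual @ residual - corr**2  # residual energy if atom k is added
    rate = -np.log2(np.maximum(p_idx, 1e-12))
    return int(np.argmin(dist + lam * rate))
```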
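Finally, a sketch of the "rate cost update" stage described by analogy with ECVQ: re-estimate the index pmf from how often each atom was selected over the training data, so rarely used atoms become more expensive in the next RD-OMP pass. The smoothing constant eps is an assumption; the slides also call for updating a coefficient pmf, which would be handled analogously (e.g., by histogramming quantized coefficient values).

```python
import numpy as np

def update_index_pmf(selected, K, eps=1e-6):
    """Re-estimate the atom-index pmf from usage counts over the training set."""
    counts = np.bincount(np.asarray(selected), minlength=K).astype(float)
    pmf = counts + eps                    # smoothing keeps unused atoms codable
    return pmf / pmf.sum()
```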