SOURCE CODING PROF. A.M.ALLAM
Huffman Code Application
Lossless Image Compression
A simple application of Huffman coding to lossless image compression: generate a Huffman code for the set of values that any pixel may take. For monochrome images, this set usually consists of the integers from 0 to 255.
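As a sketch of the code-generation step, the standard Huffman procedure repeatedly merges the two least probable entries. The four-symbol alphabet and its probabilities below are illustrative only (a real monochrome image would use all 256 pixel values with probabilities estimated from the image):

```python
# Sketch: generating a Huffman code for pixel values
# (hypothetical 4-symbol alphabet; a real monochrome image uses 0..255)
import heapq
from itertools import count

def huffman_code(probs):
    """Return {symbol: codeword} for a dict of symbol probabilities."""
    tiebreak = count()  # unique ints avoid comparing dicts on probability ties
    heap = [(p, next(tiebreak), {s: ""}) for s, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)  # two least probable subtrees
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, next(tiebreak), merged))
    return heap[0][2]

code = huffman_code({0: 0.5, 1: 0.25, 2: 0.15, 3: 0.10})
print(code)  # more probable pixel values receive shorter codewords
```

Note that more probable symbols end up with shorter codewords, which is exactly the property the slide describes.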
Lecture7: Huffman Code
11/8/2016 2 LECTURES
Huffman Code Application
Steps to achieve lossless image compression:
1. Generate a Huffman code for the pixel values
2. Encode the image using the Huffman code
3. Store the compressed image in a file
4. Determine the compression ratio = number of bytes (uncompressed) / number of bytes (compressed)
The original (uncompressed) image representation uses 8 bits/pixel. The image consists of 256 rows of 256 pixels, so the uncompressed representation uses 65,536 bytes.
Notes:
1. The number of bytes in the compressed representation includes the number of bytes needed to store the Huffman code
2. The compression ratio is different for different images
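The compression-ratio calculation for the 256 x 256 image can be sketched as follows. The average codeword length and the size of the stored code table below are made-up illustration values, not the lecture's measurements:

```python
# Sketch: compression ratio for the 256x256, 8 bits/pixel image above
ROWS, COLS, BITS_PER_PIXEL = 256, 256, 8

uncompressed_bytes = ROWS * COLS * BITS_PER_PIXEL // 8
print(uncompressed_bytes)  # 65536, as stated in the slide

avg_code_len = 7.0       # hypothetical average Huffman codeword length (bits)
code_table_bytes = 512   # hypothetical cost of storing the Huffman code itself

compressed_bytes = ROWS * COLS * avg_code_len / 8 + code_table_bytes
ratio = uncompressed_bytes / compressed_bytes
print(round(ratio, 2))
```

This also illustrates the first note: the code table is counted as part of the compressed representation, which lowers the achievable ratio.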
Huffman Code Application
Image Name Bits/Pixel Total Size (B) Compression Ratio
Sena 7.01 57,504 1.14
Sensin 7.49 61,430 1.07
Earth 4.94 40,534 1.62
Omaha 7.12 58,374 1.12
Huffman (lossless JPEG) compression based on pixel values
Image Name Bits/Pixel Total Size (B) Compression Ratio
Sena 4.02 32,968 1.99
Sensin 4.7 38,541 1.70
Earth 4.13 33,880 1.93
Omaha 6.42 52,643 1.24
Huffman compression based on the difference between each pixel value and its neighbor
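The improved ratios in this second table come from coding differences rather than raw pixel values: neighboring pixels are highly correlated, so the differences cluster near zero and give a more skewed distribution for the Huffman coder to exploit. A minimal 1-D sketch (the sample row is made up; real codecs difference in two dimensions):

```python
# Sketch: replacing each pixel by its difference from the previous pixel
row = [100, 101, 101, 103, 102, 100, 100, 99]

# keep the first pixel as-is, then store successive differences
diffs = [row[0]] + [b - a for a, b in zip(row, row[1:])]
print(diffs)  # [100, 1, 0, 2, -1, -2, 0, -1]

# the transform is invertible, so the compression stays lossless
recon = [diffs[0]]
for d in diffs[1:]:
    recon.append(recon[-1] + d)
assert recon == row
```

The reconstruction loop shows why this is still lossless compression: the original pixels are recovered exactly from the differences.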
Huffman Code Application
Image Name Bits/Pixel Total Size (B) Compression Ratio
Sena 3.93 32,261 2.03
Sensin 4.63 37,896 1.73
Earth 4.82 39,504 1.66
Omaha 6.39 52,321 1.25
Huffman compression based on pixel difference values and an adaptive model
In the end, the particular application will determine which approach is more suitable.
Notice that there is little difference between the performance of the adaptive Huffman coder and the static Huffman coder.
The fact that the adaptive Huffman coder can be used as an on-line or real-time coder makes it a more attractive option in many applications.
However, the adaptive Huffman coder is more vulnerable to errors and may also be more difficult to implement.
Huffman Code Application
Text Compression
- The probabilities in the left table are the probabilities of the 26 letters obtained from the U.S. Constitution and are representative of English text
- The probabilities in the right table were obtained by counting the frequency of occurrences of letters in an earlier version of some chapter
- While the two documents are substantially different, the two sets of probabilities are very much alike
- Text compression seems natural for Huffman coding: in text, we have a discrete alphabet that, in a given class, has relatively stationary probabilities
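Probability tables like those described above can be built by simple frequency counting. A minimal sketch (the sample sentence is made up, so the resulting probabilities are illustrative only):

```python
# Sketch: estimating letter probabilities from a text sample
from collections import Counter

text = "Text compression seems natural for Huffman coding"
letters = [c for c in text.lower() if c.isalpha()]  # keep the 26 letters only

counts = Counter(letters)
total = len(letters)
probs = {c: n / total for c, n in counts.most_common()}

for letter, p in list(probs.items())[:3]:  # the few most frequent letters
    print(letter, round(p, 3))
```

With a large, representative corpus, the estimates stabilize, which is what makes the two tables from very different documents look so similar.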
Huffman Code Application
Audio Compression
Another class of data that is very suitable for compression is CD-quality audio data. The audio signal for each stereo channel is sampled at 44.1 kHz, and each sample is represented by 16 bits.
The three segments used in this example represent a wide variety of audio material, starting from symphonic pieces, as listed in the table.
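The sampling parameters above fix the raw data rate of uncompressed CD audio, which is worth working out once to see why compression matters:

```python
# Sketch: raw data rate of CD-quality stereo audio as described above
SAMPLE_RATE = 44_100   # samples per second, per channel
BITS_PER_SAMPLE = 16
CHANNELS = 2           # stereo

bits_per_second = SAMPLE_RATE * BITS_PER_SAMPLE * CHANNELS
print(bits_per_second)       # 1411200 bits/s

bytes_per_minute = bits_per_second * 60 // 8
print(bytes_per_minute)      # 10584000 bytes, about 10 MB per minute
```

At roughly 10 MB per minute of uncompressed audio, even the modest ratios in the table below translate into substantial savings.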
File Name Original File Size (bytes) Entropy (bits) Estimated Compressed File Size (bytes) Compression Ratio
Mozart 939,862 12.8 725,420 1.30
Cohn 402,442 13.8 349,300 1.15
Mir 884,020 13.7 759,540 1.16
Tunstall Code
It is clear that the Huffman code encodes letters from the source alphabet using codewords with varying numbers of bits: codewords with fewer bits for letters that occur more frequently, and codewords with more bits for letters that occur less frequently. It is a fixed-to-variable mapping.
On the other hand, errors in codewords propagate: an error in one codeword will cause a series of errors to occur.
Tunstall Code
It encodes letters such that each variable-length group of letters from the source is encoded into a codeword of fixed length. It is a variable-to-fixed mapping.
Lecture7: Tunstall Code
Tunstall Code
- Start with the N letters of the source alphabet
- Remove the entry with the highest probability
- Add the N strings obtained by concatenating this letter with every letter in the alphabet (including itself); this increases the size of the codebook from N to N + (N - 1)
- Calculate the probabilities of the new entries
- Select the entry with the highest probability and repeat until the size reaches 2^L, i.e., the procedure is repeated K times until N + K(N - 1) <= 2^L
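The steps above can be sketched directly. The three-letter source and its probabilities below are illustrative (not from the lecture); with N = 3 and L = 3 the loop runs K = 2 times, giving N + K(N - 1) = 7 <= 2^3 entries:

```python
# Sketch: building a Tunstall codebook for a 3-letter source, L-bit codewords
probs = {"A": 0.6, "B": 0.3, "C": 0.1}  # illustrative source probabilities

L = 3
N = len(probs)
entries = dict(probs)  # codebook: source string -> probability

# expand while one more expansion still fits within 2**L entries
while len(entries) + (N - 1) <= 2 ** L:
    best = max(entries, key=entries.get)   # entry with highest probability
    p = entries.pop(best)                  # remove it...
    for letter, q in probs.items():        # ...and add its N extensions
        entries[best + letter] = p * q     # probabilities multiply

# assign a fixed L-bit codeword to every entry
codewords = {s: format(i, f"0{L}b") for i, s in enumerate(sorted(entries))}
print(len(entries))  # 7 entries: N + K*(N-1) with K = 2 expansions
print(codewords)
```

Note how the most probable strings ("A", then "AA") get expanded first, so frequent source sequences are mapped to a single fixed-length codeword.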