Image Compression Signals and image processing by computer Winter 2012-13 Yael Erez
Feb 16, 2016
Binary Images
• Bit map
• Very simple
• Representation size = image size
• Large memory
1001010001000110
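The "representation size = image size" point can be made concrete with a trivial sketch (the 1024 x 1024 size below is just an illustration, not from the slides):

```python
# Bit-map storage: one bit per pixel, so the representation size
# equals the image size (illustrative sketch; sizes are arbitrary).
def bitmap_bits(width, height):
    """Bits needed to store a width x height binary image as a bit map."""
    return width * height

# A 1024 x 1024 binary image costs 1,048,576 bits = 131,072 bytes.
print(bitmap_bits(1024, 1024) // 8)
```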
Coding Scheme
image → Encode → code → Decode → image
Binary Images Encoding
• Run length
• Very efficient for some images, less efficient for others.
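A minimal run-length sketch for a binary row (the encoding convention here, first value plus run lengths, is one common choice, not necessarily the one the slides assume):

```python
# Run-length encoding of a binary row: store the value of the first run
# and then only the lengths of the alternating runs.
def rle_encode(bits):
    """Encode a sequence of 0/1 values as (first_value, run_lengths)."""
    runs = []
    count = 1
    for prev, cur in zip(bits, bits[1:]):
        if cur == prev:
            count += 1
        else:
            runs.append(count)
            count = 1
    runs.append(count)
    return bits[0], runs

def rle_decode(first, runs):
    """Invert rle_encode: expand the runs back into a bit sequence."""
    out, value = [], first
    for n in runs:
        out.extend([value] * n)
        value ^= 1  # runs alternate between 0 and 1
    return out

row = [1, 0, 0, 1, 0, 1, 0, 0, 0, 1]
first, runs = rle_encode(row)
assert rle_decode(first, runs) == row  # lossless round trip
```

A row with long constant runs compresses well; a row that alternates every pixel produces as many run lengths as pixels, which is why the method is efficient for some images and not others.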
Binary Images Encoding
• Chain code
• Begin from some pixel on the contour and encode the directions along it
(clockwise); the interior is full.
• Small code, but complicated to encode and decode.
• Same image – several codes (the code depends on the starting pixel).
[Figure: direction numbering for 4-connectivity (0–3) and 8-connectivity (0–7)]
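A chain-code decoding sketch, assuming the common Freeman numbering for 8-connectivity (0 = right, then counter-clockwise, with y growing downward in image coordinates; the slides may number the directions differently):

```python
# Freeman chain code (assumed numbering): direction d -> (dx, dy) step,
# 0 = east, counted counter-clockwise, in y-down image coordinates.
DIRS = [(1, 0), (1, -1), (0, -1), (-1, -1),
        (-1, 0), (-1, 1), (0, 1), (1, 1)]

def chain_decode(start, code):
    """Recover the contour pixels from a start point and a chain code."""
    points = [start]
    x, y = start
    for d in code:
        dx, dy = DIRS[d]
        x, y = x + dx, y + dy
        points.append((x, y))
    return points

# A unit square traced clockwise from (0, 0): right, down, left, up.
square = chain_decode((0, 0), [0, 6, 4, 2])
assert square[-1] == (0, 0)  # a closed contour returns to its start
```

The same square started from a different corner gives a cyclic shift of the code, which is the "same image, several codes" ambiguity from the slide.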
Entropy
• Image x with L gray levels and normalized histogram values h(k).
• Measure of uncertainty (surprise): −log₂(h(k))
• Entropy: H{x} = −Σ_{k=0}^{L−1} h(k) log₂(h(k))   [bits/symbol]
• Example image: Entropy = 7.4451
Entropy
• Uniform distribution, h(k) = 1/L for all k:
  H{x} = −Σ_{k=0}^{L−1} (1/L) log₂(1/L) = −L · (1/L) · log₂(1/L) = log₂(L)
• A single gray level, P(1) = 1:
  H{x} = −1 · log₂(1) = 0
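The entropy formula and both special cases above can be checked directly (a small sketch of the slide's formula):

```python
# Entropy of a normalized histogram: H{x} = -sum_k h(k) * log2(h(k)).
from math import log2

def entropy(hist):
    """Entropy in bits/symbol; zero-probability bins contribute nothing."""
    return sum(-h * log2(h) for h in hist if h > 0)

L = 256
uniform = [1.0 / L] * L
print(entropy(uniform))               # uniform: log2(256) = 8 bits/symbol
print(entropy([1.0]))                 # a constant image: 0 bits/symbol
print(entropy([0.5, 0.3, 0.1, 0.1]))  # ~1.6855 bits/symbol
```

The last histogram is the one used on the next slides, where its entropy of about 1.6855 bits/sample is compared against fixed-length and Huffman codes.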
Entropy Encoding
  Symbol             0     1     2     3
  Probability h(k)   0.5   0.3   0.1   0.1
  Fixed code         00    01    10    11
• Mean code length: 2 bits/sample
• Entropy: 1.6855 bits/sample
• How can we reduce the mean code length?
Huffman Coding
  Symbol             0     1     2     3
  Probability h(k)   0.5   0.3   0.1   0.1
  Huffman code       0     10    110   111
• Entropy: 1.6855 bits/sample
• Huffman mean code length: 0.5·1 + 0.3·2 + 0.1·3 + 0.1·3 = 1.7 bits/sample
• Prefix code: no codeword is a prefix of another.
• Create dictionary: Huffman binary tree
  Repeatedly merge the two least probable nodes:
  0.1 + 0.1 → 0.2,   0.2 + 0.3 → 0.5,   0.5 + 0.5 → 1
  Label each branch 0/1; reading from the root to a leaf gives the code:
  Symbol   prob   code
  0        0.5    0
  1        0.3    10
  2        0.1    110
  3        0.1    111
• How can we decode? Thanks to the prefix property, read the bit stream one bit at a time, walking the tree from the root until a leaf (a symbol) is reached, then start again from the root.
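The tree construction above can be sketched with a priority queue (a minimal version; tie-breaking may assign different but equally optimal codewords than the slide's, with the same lengths):

```python
# Huffman coding sketch: repeatedly merge the two least probable nodes,
# prepending one bit to every symbol inside each merged group.
import heapq

def huffman_code(probs):
    """Map each symbol to a Huffman codeword; probs: {symbol: probability}."""
    heap = [(p, [sym]) for sym, p in probs.items()]
    heapq.heapify(heap)
    codes = {sym: "" for sym in probs}
    while len(heap) > 1:
        p0, syms0 = heapq.heappop(heap)   # least probable node
        p1, syms1 = heapq.heappop(heap)   # second least probable
        for s in syms0:
            codes[s] = "0" + codes[s]
        for s in syms1:
            codes[s] = "1" + codes[s]
        heapq.heappush(heap, (p0 + p1, syms0 + syms1))
    return codes

probs = {0: 0.5, 1: 0.3, 2: 0.1, 3: 0.1}
codes = huffman_code(probs)
mean_len = sum(p * len(codes[s]) for s, p in probs.items())
print(codes)  # code lengths 1, 2, 3, 3 -> mean length 1.7 bits/sample
```

The mean length of 1.7 bits/sample sits between the entropy bound (1.6855) and the fixed-length code (2), matching the slide's comparison.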
Prediction
• Pixels are not independent!
• Entropy = 2.6276
• Huffman encoding yields 2.6466 bits/sample
Differential Encoding
Encoder: image → (−) prediction → Encoding → compressed image
Decoder: compressed image → Decoding → (+) prediction → image
• Very sensitive to errors!
Summary
Code length (decreasing):
• WSWG
• Entropy encoding
• Prediction, transforms + entropy encoding
• Lossy compression
DPCM
Simplified JPEG
image → DCT on 8×8 blocks → Quantizer (Q factor) → Entropy encoding:
  1. RLE
  2. Entropy encoding
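The first two stages of the pipeline can be sketched on a single block (a pure-Python DCT-II for clarity, with a uniform quantizer standing in for JPEG's quantization table; only the quantizer loses information):

```python
# Simplified JPEG sketch: 2-D DCT on an 8x8 block, then uniform
# quantization controlled by a Q factor.
from math import cos, pi, sqrt

N = 8

def dct2(block):
    """2-D DCT-II of an N x N block with orthonormal scaling."""
    def alpha(u):
        return sqrt(1.0 / N) if u == 0 else sqrt(2.0 / N)
    out = [[0.0] * N for _ in range(N)]
    for u in range(N):
        for v in range(N):
            s = sum(block[x][y]
                    * cos((2 * x + 1) * u * pi / (2 * N))
                    * cos((2 * y + 1) * v * pi / (2 * N))
                    for x in range(N) for y in range(N))
            out[u][v] = alpha(u) * alpha(v) * s
    return out

def quantize(coeffs, q):
    """Uniform quantizer: a larger q means coarser steps, shorter code."""
    return [[round(c / q) for c in row] for row in coeffs]

flat = [[128] * N for _ in range(N)]   # a constant 8x8 block
coeffs = quantize(dct2(flat), q=16)
# A flat block yields one nonzero DC coefficient (8 * 128 / 16 = 64) and
# all-zero AC coefficients, so the long zero runs compress to almost
# nothing under RLE followed by entropy encoding.
```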