Transcript
Page 1: Compression: Why? Shannon!

H = ²log S^n = n ²log S

H: information
S: number of symbols
n: message length

But what if we know what to expect?
S = 2, n = 4 has the same information content as S = 16, n = 1,
because 4 ²log 2 = 1 ²log 16.
So we can, for example, code 4 bits as one hexadecimal digit.
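A minimal numeric check of this equivalence, as a plain-Python sketch (the function name info_content is mine, not from the slides):

    import math

    def info_content(num_symbols: int, msg_length: int) -> float:
        # Shannon information H = n * ²log S, in bits
        return msg_length * math.log2(num_symbols)

    print(info_content(2, 4))   # 4 binary symbols      -> 4.0 bits
    print(info_content(16, 1))  # 1 hexadecimal symbol  -> 4.0 bits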

Entropy:
m: a possible message
p(m): probability of message m

H = -Σ p(m) ²log p(m)

4 random bits (16 equally likely messages):
H = -16 * 1/16 * ²log(1/16) = 4 bit
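The same kind of sketch for the entropy formula (again plain Python, my own naming): sixteen equally likely messages give 4 bits, while a skewed distribution, i.e. one where we "know what to expect", carries less information and is therefore compressible.

    import math

    def entropy(probs):
        # H = -sum over m of p(m) * ²log p(m); zero-probability messages contribute nothing
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(entropy([1/16] * 16))              # 16 equally likely messages -> 4.0 bits
    print(entropy([0.9, 0.05, 0.03, 0.02]))  # skewed distribution        -> about 0.62 bits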

Page 4: Blockwise Fourier: blocking artefact
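The slide itself only shows an image; as a rough illustration of why independent blockwise transforms produce this artefact, here is a small numpy sketch (function name, parameters, and test image are my own, not from the slides). Each 8x8 tile is transformed and low-pass "quantised" on its own, so the reconstruction errors do not line up across tile borders.

    import numpy as np

    def blockwise_lowpass(img: np.ndarray, block: int = 8, cutoff: float = 0.2) -> np.ndarray:
        # Transform each block x block tile with a 2-D FFT, discard high-frequency
        # coefficients, and transform back. Because every tile is processed
        # independently, the errors are discontinuous at tile borders (blocking).
        f = np.abs(np.fft.fftfreq(block))                     # per-axis normalised frequencies
        keep = (f[:, None] <= cutoff) & (f[None, :] <= cutoff)
        out = np.empty_like(img, dtype=float)
        for y in range(0, img.shape[0], block):
            for x in range(0, img.shape[1], block):
                tile = img[y:y + block, x:x + block]
                coeffs = np.fft.fft2(tile) * keep             # crude "set to zero" quantisation
                out[y:y + block, x:x + block] = np.fft.ifft2(coeffs).real
        return out

    # A smooth diagonal gradient has no visible structure, yet after blockwise
    # low-pass filtering the 8x8 tile boundaries show up as steps.
    img = np.add.outer(np.arange(64, dtype=float), np.arange(64, dtype=float))
    rec = blockwise_lowpass(img)
    print(np.abs(img - rec).max())  # error is concentrated at tile borders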

Page 5: Quantisation artefact

Page 8: Inter-frame compression artefacts