Linux Pseudo-random Number Generator (LPRNG)
Real-life cryptography
Pfeiffer Alain
Feb 25, 2016
Index
- Types of PRNGs
- History
- General Structure
- User space
- Entropy types
- Initialization process
- Building Blocks
- Security requirements
- Conclusion
Types
- Non-cryptographic deterministic: should NOT be used for security (e.g. Mersenne Twister)
- Cryptographically secure: an algorithm with properties that make it suitable for use in cryptography (e.g. Fortuna)
- Entropy-based: produces bits non-deterministically, as the internal state is frequently refreshed with unpredictable data from one or several external entropy sources (LPRNG)
History
Part of the Linux kernel since 1994
Written by Theodore Ts'o, later modified by Matt Mackall
Roughly 1700 lines of C code
General Structure
Internal states:
- Input pool (128 32-bit words = 4096 bits)
- Blocking pool (32 32-bit words = 1024 bits)
- Nonblocking pool (32 32-bit words = 1024 bits)
Output function: SHA-1
Mixing function: a linear mixing function (not a hash)
Entropy counter:
- Decremented when bits are extracted
- Incremented when new bits are collected
User space
/dev/random:
- Reads from the blocking pool
- Limits the number of generated bits
- Blocks when there is not enough entropy
- Resumes when new entropy arrives in the input pool
/dev/urandom:
- Reads from the nonblocking pool
- Generates random bits WITHOUT blocking
- Writing data to the devices does NOT change the entropy counter!
Kernel space
get_random_bytes():
- Reads random bytes from the nonblocking pool
Entropy inputs
The backbone of security
Injected:
- into the generator at initialization
- through the updating mechanism
Usable independently: the LPRNG does NOT rely on physical non-deterministic phenomena
Hardware RNGs:
▪ available to user space
▪ NOT mixed into the LPRNG directly
Entropy gathering daemon:
▪ collects the hardware RNG outputs
▪ feeds them into the LPRNG
Entropy sources
Reliable entropy:
- User inputs (keyboard, mouse)
- Disk timings
Interrupt timings are NOT reliable:
- Regular interrupts
- Misuse of the "IRQF_SAMPLE_RANDOM" flag
Entropy events
"num" value (type of event, 32 bits):
- Mouse (12 bits)
- Keyboard (8 bits)
- Interrupts (4 bits)
- Hard drive (3 bits)
CPU "cycle" count:
- Max: 32 bits
- Avg: 15 bits
"jiffies" count (32 bits):
- Kernel counter of timer interrupts (avg. 3-4 bits)
- Frequency: 100-1000 ticks/sec
The generator never assumes maximum entropy.
Entropy Estimation Conditions
1. Unknown distribution: inputs vary a lot
2. Unknown correlation: correlations between inputs are likely
3. Large sample space: hard to keep track of 2^32 jiffies values
4. Limited time: estimation happens after interrupts, so it must be fast
5. Estimation at runtime: estimation for every input!
6. Unknown knowledge of the attacker
Initialization
Not much entropy in the Linux boot process!
At shutdown:
- Generate data from /dev/urandom
- Save it into a file
At startup:
- Write the saved data to /dev/random
- The data is mixed into:
▪ the blocking pool
▪ the nonblocking pool
without changing the entropy counter!
Building Blocks
1. Mixing function
2. Entropy estimator
3. Output function
4. Entropy extraction
Based on a linear feedback shift register (LFSR)
1. Mixing Function
1. Mixes one byte at a time
2. Extends it to a 32-bit word
3. Rotates it by 0-31 bits
4. Feeds it into the pool by linear feedback shifting (LFSR)
No entropy gets lost
1. Mixing WITHOUT Input
Linear feedback shift register (LFSR) over the Galois field GF(2^32), with feedback polynomial:
Q(X) = α^3 (P(X) - 1) + 1
where α is a primitive element and P(X) corresponds to the size of the pool.
Input pool: P(X) = X^128 + X^103 + X^76 + X^51 + X^25 + X + 1
Output pools: P(X) = X^32 + X^26 + X^20 + X^14 + X^7 + X + 1
Input pool period: 2^(92·32) - 1 ≠ 2^(128·32) - 1
Output pool period: 2^(26·32) - 1 ≠ 2^(32·32) - 1
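Framed in formulas (our summary of the period claims above): an LFSR of degree n over GF(2^32) with a primitive feedback polynomial attains the maximal period, and since the original polynomials are not primitive, the actual periods fall short:

```latex
% Maximal period of a degree-n LFSR over GF(2^{32}):
T_{\max} = 2^{32n} - 1
% Input pool (n = 128), non-primitive feedback:
T_{\text{input}} = 2^{92 \cdot 32} - 1 < 2^{128 \cdot 32} - 1 = T_{\max}
% Output pools (n = 32), non-primitive feedback:
T_{\text{output}} = 2^{26 \cdot 32} - 1 < 2^{32 \cdot 32} - 1 = T_{\max}
```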
1. Mixing WITHOUT Input (cont.)
P(X) is NOT irreducible! But by changing one feedback position:
Input pool: P(X) = X^128 + X^104 + X^76 + X^51 + X^25 + X + 1
Output pools: P(X) = X^32 + X^26 + X^19 + X^14 + X^7 + X + 1
P(X) becomes irreducible, but NOT primitive!
However, by changing α to α^2, α^4, α^7, … (e.g. α^2 with X^32 + X^26 + X^23 + X^14 + X^7 + X + 1),
P(X) is irreducible AND primitive!
Periods: 2^(128·32) - 1 and 2^(32·32) - 1
1. Mixing WITH Input
Input function L1: {0,1}^8 → {0,1}^32
▪ rotation
▪ multiplication in GF(2^32)
Feedback function L2: ({0,1}^32)^5 → {0,1}^32
2. Entropy Estimator (1)
Random variables:
- Identically distributed
- One per (single) source
Sample space D, where |D| >> 2
Jiffies count δ_i[1] at time t_i
Estimator with input t_i, built from first-, second- and third-order time differences:
Δ_i = t_i - t_{i-1},  Δ²_i = Δ_i - Δ_{i-1},  Δ³_i = Δ²_i - Δ²_{i-1}
Logarithm function:
Ĥ_i = min(log2(min(|Δ_i|, |Δ²_i|, |Δ³_i|)), 11)
Outcome: regular events (constant differences) are credited 0 bits; at most 11 bits per event
2. Entropy Estimator (2)
To compute the estimate we must know:
- the time t_{i-1}
- the jiffies count δ_{i-1}[1], where [1] = event 1
- the jiffies count δ_{i-1}[2], where [2] = event 2
Property: Shannon entropy is invariant under a permutation of the sample space; if distribution q is a permutation of distribution p, then H(p) = H(q).
The LPRNG estimate, however, differs on p and q, since it uses the value of a given element and not its probability!
3. Output Function
- Transfers data from the input pool to an output pool
- Generates data from the output pool
- Uses the SHA-1 hash
Two phases: a feedback phase and an extraction phase
3. Output – Feedback phase
SHA-1:
- Hashes all pool bytes (as 32-bit words)
- Produces a 5-word hash
- Sends it to:
▪ the mixing function
▪ the extraction phase
Mixing function:
- Takes the 5-word hash
- Mixes it back into the pool
- Shifts 20 times (20 words = 640 bits)
3. Output – Extraction phase
SHA-1:
- Initial value: the hash from the feedback phase
- Hashes 16 pool words:
▪ overlapping the last word changed by the feedback phase
▪ overlapping the first 3 words of the output pool
- Produces a 5-word hash
Folding:
- Fold the hash in half (w0 xor w1 xor w2 xor w3 xor w4)
- Produce a 10-byte output
4. Entropy Extraction
Random variable: X
Rényi entropy: H₂(X)
Hash function: chosen at random from a family G
IF H₂(X) ≥ r, then for a randomly chosen hash the output is close to uniformly distributed:
its entropy is close to r bits
4. Entropy Extraction - LPRNG
The LPRNG uses a FIXED hash function (SHA-1)
Assumptions:
- each element has a fixed size
- the attacker knows all permutations
Universal hash function:
If the pool contains k bits of Rényi entropy and m ≤ k bits are extracted,
the output entropy is close to m bits
Security requirements
Sound entropy estimation:
- Estimate the amount of entropy correctly
- Guarantee that an attacker who knows the input can NOT guess the output!
Pseudorandomness:
- Impossible to compute:
▪ the internal state
▪ future outputs
- Unable to recover, with partial knowledge of the entropy:
▪ the internal state
▪ future outputs
Sound entropy estimation
Samples: N = 7M
Empirical frequency compared against the estimators:
- LPRNG entropy
- Shannon entropy
- Min-entropy
- Rényi entropy
Results:
Pseudorandomness
SHA-1 is a one-way function:
- an adversary who only knows the outputs can NOT recover the content of:
▪ the output pool
▪ the input pool
Folding:
- avoids recognizable patterns
- the output of the hash is NOT directly exposed
Secure as long as the internal state is NOT compromised!
Security resilience
Backtracking resistance: an attacker with knowledge of the current state should NOT be able to recover previous outputs!
Prediction resistance: an attacker who learns the current state should NOT be able to predict future outputs, once enough fresh entropy has been mixed in!
Security resilience - LPRNG
Forward security: knowledge of the current state does NOT provide information about previous states, even if the state was not refreshed by new entropy inputs.
- Backtracking resistance provided by: the one-way output function
Backward security: an adversary who knows the internal state CAN predict future outputs, because the output function is deterministic… (Bad!)
- Prediction resistance provided by: reseeding the internal state between requests!
Forward Security
Attacker knows:
- the input pool
- the output pool
The attacker knows the previous states EXCEPT the 160 bits that were fed back.
BUT without additional knowledge, a generic attack would face:
▪ 2^160 overhead
▪ 2^80 candidate solutions
Backward Security
Transferring k bits of entropy means that after:
- generating data from the UNKNOWN S1
- mixing S1 into the KNOWN S2
guessing the NEW S2 costs the attacker on average 2^(k-1) trials!
Collecting k bits of entropy means that after:
- processing unknown data into the KNOWN S1
guessing the NEW S1 costs the observer on average 2^(k-1) trials!
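The 2^(k-1) figure is simply the expected number of guesses for a uniformly distributed k-bit unknown, trying candidates one at a time:

```latex
E[\text{trials}] = \sum_{j=1}^{2^k} j \cdot 2^{-k}
                 = \frac{1}{2^k} \cdot \frac{2^k (2^k + 1)}{2}
                 = \frac{2^k + 1}{2} \approx 2^{k-1}
```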
Backward Security – Attacks
Attacker 1:
- knows the output pool
- does NOT know the input pool
Attacker 2:
- knows the input pool
- knows the output pool
Backward Security – Attack 1
Is there enough entropy (k >= 64 bits)?
Yes:
▪ k bits are transferred from the input pool
▪ the attacker loses k bits of knowledge
▪ NO output before the k bits are mixed
Generic attack (2^(k-1)): k bits of resistance!
No:
▪ NO bits are transferred
▪ the attacker keeps his knowledge
▪ NO output before k bits are sent from the input pool
Generic attack (2^(k-1)): k bits of resistance!
Backward Security – Attack 2
// k = 64 bits
Collect k bits of entropy (2^(k-1) guesses)
If (counter >= k bits) then
    counter--
Else
    counter++
    transfer k bits from input
64 bits of resistance
Conclusion
Good level of security
The mixing function could be improved!
A newer hash function could be used (SHA-3)