Physical Layer Security in Wireless Networks: An Information Theoretic Perspective
Vince Poor ([email protected])
Thanks to: Arsenia Chorti (ENSEA), Mahdi Shakiba Herfeh (ENSEA) and Miroslav G. Mitev (Essex)
M. S. Herfeh, A. Chorti & H. V. Poor, “Physical Layer Security: Authentication, Integrity and Confidentiality.” In Physical Layer Security, Khoa N. Le, Ed. (Springer Nature, 2021), to appear.
IoT Security – A Major Concern
• IoT vulnerabilities to cyber attacks → mostly concern personal privacy and security
An Example of What Can Go Wrong [Soltan, Mittal, Poor, USENIX’18]
• Manipulation of demand via IoT: Botnets controlling high-wattage IoT devices (air conditioners, refrigerators, etc.) can disrupt the power grid.
• A Mirai-sized (600,000 bots) botnet of water heaters can change the demand instantly by 3 GW – similar to having access to the largest currently deployed nuclear plant!
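The 3 GW figure can be sanity-checked with back-of-the-envelope arithmetic; the per-device wattage below is an assumed typical rating for an electric water heater, not a number from the slide.

```python
# Back-of-the-envelope check of the botnet demand-attack figure.
# The ~5 kW per water heater is an assumed typical rating (not from the slide).
bots = 600_000              # Mirai-sized botnet
watts_per_heater = 5_000    # assumed electric water heater rating, in watts
total_gw = bots * watts_per_heater / 1e9
print(total_gw)  # 3.0
```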
4078 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 65, NO. 7, JULY 2019
Fig. 2. Secrecy rate for the Gaussian wiretap channel with P/N1 = 3 dB, P/N2 = −3 dB, ε = δ = 10^−3.
where C_S is given in (114) and

V_i = ((log₂ e)²/2) · (P² + 2P N_i)/(P + N_i)², i ∈ {1, 2} (117)

V_c = V_1 + V_2 − (P N_1/(P + N_1)) (1/N_2 + 1/(P + N_2)) log₂ e. (118)

Proof: See Appendix VI.
C. Numerical Results and Discussions
In this section, we compare the bounds proposed in this paper with existing bounds in [25] and [35], and with the approximations provided in Theorem 14 for a Gaussian wiretap channel (with the O(·) terms omitted). The results are shown in Fig. 2.
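For orientation, the asymptotic secrecy capacity of the degraded Gaussian wiretap channel, which the finite-blocklength curves in Fig. 2 approach, can be evaluated directly at the plotted SNRs; this is a sketch using the standard secrecy-capacity formula, not the paper's numerical routine.

```python
import math

# Asymptotic secrecy capacity of the degraded Gaussian wiretap channel,
# C_S = 1/2 log2(1 + P/N1) - 1/2 log2(1 + P/N2), in bits per channel use,
# evaluated at the SNRs used in Fig. 2 (P/N1 = 3 dB, P/N2 = -3 dB).
snr_bob = 10 ** (3.0 / 10)    # P/N1
snr_eve = 10 ** (-3.0 / 10)   # P/N2
C_S = 0.5 * math.log2(1 + snr_bob) - 0.5 * math.log2(1 + snr_eve)
print(round(C_S, 3))  # 0.498
```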
Let us first explain how each bound was computed. Note that every achievability bound for the wiretap channel consists of two parts: a reliability bound on the decoding performance of the legitimate receiver and a secrecy bound on the information leakage to the eavesdropper (see, e.g., Theorem 7). For the reliability part, we have used Shannon’s channel coding achievability bound [49], which is the tightest achievability bound for Gaussian channels [7, Sec. III.J-4]. In particular, let M_Shannon(ε, n) be the number of codewords for a given probability of error ε and blocklength n implied by Shannon’s achievability bound. The quantity M_Shannon(ε, n) can be computed via the numerical routine in [50]. The maximal secrecy rate R*_avg(n, ε, δ) can be lower-bounded by

R*_avg(n, ε, δ) ≥ (1/n) log (M_Shannon(ε, n)/L) (119)

where L may be interpreted as the number of “dummy” messages that must be conveyed through the wiretap channel to achieve the desired secrecy. To compute an achievability bound on R*_avg(n, ε, δ), it suffices to determine an upper bound on L. Note that each of the achievability bounds plotted in Fig. 2 is computed using (119) and corresponds to a respective method for upper-bounding L.
1) Achievability (Theorem 7): The result is an application of the secrecy bound (45). To compute (45), we have chosen Q_Z to be a multivariate Gaussian distribution with i.i.d. N(0, P + N_2) elements, and have chosen A to be the power sphere S_n = {x^n : ‖x^n‖² = nP}. By the rotational symmetry of the set A and of the distributions P_{Z|X=x} and Q_Z, the two suprema on the right-hand side of (45) are both achieved by an arbitrary x ∈ A. In particular, we have set

x = x̄^n ≜ [√P, …, √P] (120)

in the computation. With these choices, (45) becomes

δ ≤ 1 − E[exp(−|B_n − log γ|⁺)] + (1/2) √(γ/L) E[exp(−|B_n − log γ|)] (121)
where

B_n ≜ (n/2) log(1 + P/N_2) + (log e/2) Σ_{i=1}^{n} (1 − (√P Z_i − √N_2)²/(P + N_2)) (122)

with {Z_i} i.i.d. N(0, 1) distributed. The inequality (121) implies the following upper bound on L:
√L ≤ min_γ √γ E[exp(−|B_n − log γ|)] / (2(δ + E[exp(−|B_n − log γ|⁺)] − 1)) (123)

where the minimization is over all γ > 0 such that the denominator in (123) is positive. Substituting (123) into (119) we obtain the desired result.
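The evaluation of (123) can be sketched numerically. The following Monte Carlo estimate is a minimal sketch, not the paper's routine: it assumes natural logarithms throughout, a small blocklength, and a coarse grid over γ.

```python
import math
import random

def sample_Bn(n, P, N2, rng):
    # One sample of the information-density statistic B_n in (122),
    # written with natural logarithms (an assumption of this sketch).
    s = sum(1.0 - (math.sqrt(P) * rng.gauss(0.0, 1.0) - math.sqrt(N2)) ** 2 / (P + N2)
            for _ in range(n))
    return 0.5 * n * math.log(1.0 + P / N2) + 0.5 * s

def sqrt_L_upper_bound(n, P, N2, delta, num_samples=2000, seed=1):
    # Monte Carlo evaluation of (123): sqrt(L) bounded by the minimum over gamma of
    # sqrt(gamma) E[exp(-|B_n - log gamma|)] / (2 (delta + E[exp(-|B_n - log gamma|^+)] - 1)),
    # restricted to gamma > 0 with a positive denominator.
    rng = random.Random(seed)
    samples = [sample_Bn(n, P, N2, rng) for _ in range(num_samples)]
    best = float("inf")
    for log_gamma in [k * 0.5 for k in range(-10, 61)]:  # coarse grid over log(gamma)
        e_plus = sum(math.exp(-max(b - log_gamma, 0.0)) for b in samples) / num_samples
        e_full = sum(math.exp(-abs(b - log_gamma)) for b in samples) / num_samples
        denom = 2.0 * (delta + e_plus - 1.0)
        if denom > 0.0:
            best = min(best, math.exp(0.5 * log_gamma) * e_full / denom)
    return best

# P = 1, N2 = 2 gives P/N2 close to the -3 dB of Fig. 2; n and delta are illustrative.
bound = sqrt_L_upper_bound(n=20, P=1.0, N2=2.0, delta=0.1)
print(bound > 0.0 and math.isfinite(bound))  # True
```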
Lai, Liang, Du, Poor (2015) Key Generation from Random Channels, in Physical Layer Security in Wireless Communications (CRC)
Privacy-Utility Tradeoffs in Sensing Systems
Privacy vs. Secrecy
• Privacy is not secrecy:
• Denial of access (secrecy) makes a data source useless.
Privacy vs. Secrecy
• Sensing systems generate considerable electronic data:
• Data’s utility depends on its accessibility.
• Accessibility endangers privacy.
• This fundamental tradeoff can becharacterized via information theory.
[Figure: utility-privacy tradeoff region, with privacy measured by equivocation E and utility by distortion D]
Privacy-Utility Tradeoff
• Smart meter data is useful for price-aware usage, load balancing
• But, it leaks information about in-home activity
Example: Smart Meter Privacy
Giaconi, Gunduz, Poor (2021) Smart Meter Privacy, in Advanced Data Analytics for Power Systems (Cambridge)
P-U tradeoff leads to a spectral ‘reverse water-filling’ solution
The following theorem captures our main result.
Theorem 3: The utility-privacy tradeoff for smart meter measurements modeled as a Gaussian source with memory is given by the leakage function λ(D) which results from choosing the distribution p(x̂^n|x^n) as the rate-distortion (without privacy) optimal distribution.
Proof: The proof follows directly from noting that, for a given jointly Gaussian distribution of the source and correlated hidden sequence, p_{X^n Y^n}, the infimum in (8) and (9) is strictly over the space of conditional distributions of the revealed sequence given the original source sequence, as a result of the Markov chain relationship Y^n − X^n − X̂^n. Expanding the leakage as I(Y^n; X̂^n) = h(Y^n) − h(Y^n|X̂^n), and using the fact that, for correlated Gaussian processes, Y_k = α_k X_k + Z_k for all k, where {Z_k} is a sequence independent of {X_k} and α_k is a constant for each k, one can show that the jointly Gaussian distribution of X^n and X̂^n which minimizes (8) also minimizes (9).
Remark 2: Theorem 3 simplifies the development of the RDL region for Gaussian sources with memory, for which the rate-distortion function is known. This lends itself to a straightforward practical implementation that we discuss in the following section.
F. Rate-Distortion for Gaussian Sources with Memory
In general, the rate-distortion functions for sources with memory are not straightforward to compute. However, for Gaussian sources, the rate-distortion function R(D) (without the additional privacy constraint) is known and can be obtained via a transformation of the correlated source sequence X^n to its eigen-space, in which the resulting sequence is uncorrelated (and hence independent for jointly Gaussian sources). Let S_X(ω), S_Y(ω), and S_XY(ω) denote the two-sided power spectral densities (PSDs) of the {X_k}, {Y_k}, and {X_k Y_k} processes, respectively [16]. Let φ denote the Lagrangian parameter for the distortion constraint (4) in the rate minimization problem. Explicitly denoting the dependence on the water-level φ, the rate-distortion function R_φ(D) and the average distortion function D(φ) are given by
R_φ(D) = ∫_{−π}^{π} max(0, (1/2) log(S_X(ω)/φ)) dω/2π (10)

D(φ) = ∫_{−π}^{π} min(S_X(ω), φ) dω/2π. (11)
Note that the water-level φ is determined by the desired average distortion D(φ) = D. Thus, R(D) for a Gaussian source with memory can be expressed as an infinite sum of the rate-distortion functions for independent Gaussian variables, one for each angular frequency ω ∈ [−π, π]. The “water-level” φ captures the average time-domain distortion constraint across the spectrum such that the distortion for any ω is the minimum of the water-level and the PSD. The privacy leakage λ(D(φ)) is then the infinite sum of the information leakage
Fig. 1. The PSD of {X_k}. The area below the curve and the horizontal line is equal to D.
about {Yk} for each ω, and is given by
λ(D(φ)) = ∫_{−π}^{π} (1/2) log(S_Y(ω)/(S_XY(ω) g(ω) + S_Y(ω))) dω/2π (12)
where g(ω) ≡ (min(S_X(ω), φ) − 1).

Remark 3: The transform domain “water-filling” solution suggests that in practice the time-series data can be filtered for a desired level of fidelity (distortion) and privacy (leakage) using Fourier transforms. The privacy-preserving rate-distortion optimal scheme thus reveals only those frequency components with power above the water-level φ. Furthermore, at every frequency only the portion of the signal energy which is above the water level φ is preserved by the minimum-rate sequence from which the source can be generated with an average distortion D.
IV. ILLUSTRATION
The following example illustrates our results. We assume that the private information to be hidden is the measurement sequence itself, i.e., Y_k = X_k for all k. For the meter measurements modeled as a stationary Gaussian time series {X_k}, we choose X_k ∼ N(0, 1) for all k ∈ I, and an autocorrelation function
c_m = E[X_k X_{k+m}] =
  1, m = 0
  0.3, m = ±1
  0.4, m = ±2
  0, otherwise.
The power spectral density (PSD, the frequency-domain representation of the autocorrelation function) of this process is given by

S(ω) = Σ_{m=−∞}^{∞} c_m exp(imω) = 1 + 0.6 cos(ω) + 0.8 cos(2ω), −π ≤ ω ≤ π. (13)
In order to obtain the rate-distortion function R_φ(D) for this source, for a given D we have to find the water-level φ satisfying (11).
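This water-level search can be sketched numerically for the example PSD in (13): bisection on (11) recovers φ for a target average distortion D, and (10) then gives the rate. The discretization of the integrals and the target D = 0.5 are illustrative choices, not values from the text.

```python
import math

def psd(w):
    # Example PSD from (13): S(w) = 1 + 0.6 cos(w) + 0.8 cos(2w)
    return 1.0 + 0.6 * math.cos(w) + 0.8 * math.cos(2.0 * w)

def avg_distortion(phi, num=4000):
    # D(phi) from (11): (1/2pi) * integral over [-pi, pi] of min(S(w), phi) dw
    return sum(min(psd(-math.pi + (k + 0.5) * 2.0 * math.pi / num), phi)
               for k in range(num)) / num

def rate(phi, num=4000):
    # R_phi(D) from (10): (1/2pi) * integral of max(0, 0.5*log(S(w)/phi)) dw (nats)
    return sum(max(0.0, 0.5 * math.log(psd(-math.pi + (k + 0.5) * 2.0 * math.pi / num) / phi))
               for k in range(num)) / num

def water_level(D, lo=1e-6, hi=10.0, iters=60):
    # D(phi) is nondecreasing in phi, so bisection recovers the water level
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if avg_distortion(mid) < D else (lo, mid)
    return 0.5 * (lo + hi)

phi = water_level(0.5)                # water level for target distortion D = 0.5
print(round(avg_distortion(phi), 3))  # 0.5
```

Only the frequency components with power above the recovered φ contribute to the rate integral, matching the reverse water-filling picture in Remark 3.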
Source Coding Solution: Hidden Gauss-Markov Model (protection of the hidden intermittency state)
A Control Approach: Energy Harvesting and Storage
Tradeoff: wasted energy versus information leakage
[Diagram: an energy harvesting (EH) device and a rechargeable battery (RB), managed by an energy management unit (EMU), sit between the utility provider (UP)/smart meter (SM) and the appliances; input load X_i, output load Y_i, with battery charging/discharging and harvested energy Z_i.]
Privacy-Utility Tradeoff: Binary Variables
[Plot: information leakage rate I, with p_X = p_Z = ½]
Competitive Privacy: Privacy-Utility Tradeoffs for Interacting Agents
• Multiple interacting, but competing, agents (or groups of agents) with coupled measurements.
• Each wants to estimate its own parameters, or state.
• They can help each other by sharing data, but wish to preserve privacy.
• Each has a privacy-utility tradeoff, but they are competitive ones.
• How should they interact?
Motivating Examples
Electricity Grids: grid management
Radar: untrustworthy allies
Sensor Networks: resource localization
Linear Measurement Model
• Noisy measurements at agent k:

Y_k = Σ_{m=1}^{M} H_{k,m} X_m + Z_k, k = 1, 2, …, M

where X_m is the mth system state
• Utility for agent k: mean-square error for its own state X_k
• Privacy for agent k: leakage of information about X_k to other agents
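A toy instance of this measurement model can be generated directly; M, the coupling strength, and the noise level below are illustrative assumptions, not values from the slide.

```python
import random

# Toy linear measurement model: Y_k = sum_m H[k][m] * X[m] + Z_k, k = 1..M.
# Diagonal-dominant H models agents that see mostly their own state,
# weakly coupled to the others (0.2 is an assumed coupling, not from the slide).
random.seed(0)
M = 3
H = [[1.0 if k == m else 0.2 for m in range(M)] for k in range(M)]
X = [random.gauss(0.0, 1.0) for _ in range(M)]   # per-agent system states
Z = [random.gauss(0.0, 0.1) for _ in range(M)]   # measurement noise
Y = [sum(H[k][m] * X[m] for m in range(M)) + Z[k] for k in range(M)]
print(len(Y))  # 3
```

Each agent k observes Y_k, which is informative about its own X_k but also leaks information about the other agents' states through the off-diagonal terms of H.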
How Should Agents Exchange Data?
• This is a classical problem in information theory – the Wyner-Ziv problem (optimal distributed source coding) – which tells how to exchange information.
• But it doesn’t say how much information to exchange.
• Because of the competitive nature, game theory or prospect theory can illuminate this.
• Leads to a number of interesting solutions:
– a basic problem is a prisoners’ dilemma
– with pricing, cooperation or multi-play games, more meaningful solutions arise
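The prisoners'-dilemma structure of the basic data-exchange problem can be checked with a minimal payoff table; the numeric payoffs are illustrative assumptions (C = share data, D = withhold), not values from the slide.

```python
# Minimal prisoners' dilemma over data sharing between two agents.
# Payoffs are illustrative: (row player's payoff, column player's payoff).
payoff = {
    ("C", "C"): (3, 3),  # both share: mutual estimation gain
    ("C", "D"): (0, 5),  # sharer leaks privacy, withholder free-rides
    ("D", "C"): (5, 0),
    ("D", "D"): (1, 1),  # no exchange: poor estimates for both
}
# Withholding (D) strictly dominates sharing (C) for the row player...
dominates = (payoff[("D", "C")][0] > payoff[("C", "C")][0]
             and payoff[("D", "D")][0] > payoff[("C", "D")][0])
# ...yet mutual sharing beats the (D, D) equilibrium for both players:
inefficient = payoff[("C", "C")][0] > payoff[("D", "D")][0]
print(dominates and inefficient)  # True
```

This is why pricing, cooperation, or multi-play games are needed to reach the more meaningful solutions mentioned above.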
Poor (2018) Privacy in Networks of Interacting Agents, in Emerging Applications of Control and System Theory (Springer)
Other Issues
• Authentication
– Information theoretic bounds on the probabilities of successful impersonation and
substitution attacks [Lai, et al. IT’09]
• Attacks on MANETs
– Information theoretic guidance on how many malicious nodes can be tolerated
[Liang, et al. IT’11]
• Man-in-the-Middle and Spoofing Attacks on Sensor Nets
– Effects on CRLB in parameter estimation [Zhang, et al. SPM’18]
Authentication with Correlated Sequences
Impersonation attack: O transmits a message before S
Substitution attack: O replaces S’s message with its own
[Diagram: S transmits message M to R while opponent O can intercept; S and R hold correlated sequences s_1^L and s_2^L.]
Theorem [Lai, et al. IT’09]: If the S-O channel is not less noisy than the S-R channel, then
MANETs with Malicious Nodes
Secrecy Capacity Scaling[Liang, et al. IT’11]
Attacks on Sensor Nets[Zhang, et al. SPM’18]
studied in [3], [4], [7]–[9] and references therein. Investigations on cyberattacks on estimation systems have been studied in [6], [7], [10]–[15], [17], [23] and references therein. The early work in [3], [4], [6]–[9], [13]–[15], [23] set the tone for many later investigations and influenced most of the discussions in this paper. In particular, the impact and mitigation of cyberattacks on systems solving hypothesis testing problems was studied in [3], [4], [7], [9] and references therein. Distributed detection in tree topologies in the presence of cyberattacks was considered in [8]. The problem of distributed spectrum sensing in a cognitive radio network under cyberattacks was studied in [4], [5], [25]. Several cyberattack detection techniques were proposed for IoT localization systems in [6], [12], [18], [19]. More recently, the data-injection attack in smart grids was considered in [13]–[15], [26] and the references therein.
Fig. 1: Cyberattacks in IoT systems. [Diagram: N sensors, each with a quantizer, observe a physical phenomenon and report to a fusion center; spoofing attacks act before quantization, man-in-the-middle attacks after.]
According to the place where they occur, cyberattacks in IoT systems can be categorized into two
classes, as illustrated in Fig. 1. We call any attack modifying a signal in the IoT system prior to
quantization a spoofing attack. It has been shown [12] that the same changes in the signals in the
IoT system produced by any spoofing attack can also be produced by changing the data going into
the sensors to be different from that coming from the physical phenomenon being monitored. We call
any attack modifying a signal in the IoT system after quantization a man-in-the-middle attack (MiMA).
The same changes in the signals in the IoT system produced by any MiMA [12] can also be produced
Man-in-the-middle attack:
– TQA uses attacked data
– SEA ignores attacked data
Closing Thoughts
• Information theory can help understand some fundamental limits of security and privacy in wireless networks
• These are theoretical constructs; although they sometimes point to potential practical solutions, much work is needed to connect this kind of analysis to real networks, e.g.
- more finite-blocklength analysis
- scaling laws for large networks
- practical coding schemes to achieve fundamental limits
- other security primitives (signatures, certificates, etc.)
• Not all security has to be iron-clad: Quality of Security can be a parameter, just like Quality of Service
Some References
Schaefer, Boche, Khisti, Poor (2017) Information Theoretic Security and Privacy of Information Systems (Cambridge)