Network-based Intrusion Detection and Prevention in Challenging and Emerging Environments: High-speed Data Center, Web 2.0, and Social Networks

Yan Chen
Lab for Internet and Security Technology (LIST)
Department of Electrical Engineering and Computer Science, Northwestern University
Chicago
Statistics
• Chicago: 3rd largest city in the US
• NU: ranked #12 by US News & World Report
  – Established in 1851
  – ~8,000 undergrads
• McCormick School of Engineering: ranked #20
  – 180 faculty members
  – ~1,400 undergrads and a similar number of grad students
Statistics of McCormick
• National academy memberships:
  – National Academy of Engineering (NAE): 12 active, 7 emeriti
  – National Academy of Sciences (NAS): 3 active
  – Institute of Medicine (IoM): 1 emeritus
  – American Academy of Arts and Sciences (AAAS): 5 active, 3 emeriti
  – National Medal of Technology: 1 active
NetShield: Massive Semantics-Based Vulnerability Signature Matching
for High-Speed Networks
Zhichun Li, Gao Xia, Hongyu Gao, Yi Tang, Yan Chen, Bin Liu, Junchen Jiang, and Yuezhou Lv
NEC Laboratories America, Inc.
Northwestern University
Tsinghua University
Keeping networks safe is a grand challenge
Worms and botnets are still prevalent
e.g., the Conficker worm outbreak in 2008 infected an estimated 9 to 15 million hosts.
NIDS/NIPS Overview
NIDS/NIPS (Network Intrusion Detection/Prevention System)
[Diagram: packets enter the NIDS/NIPS, which matches them against a signature DB and emits security alerts]
Two key requirements:
• Accuracy
• Speed
State of the Art: Regular Expression (Regex) Based Approaches
Used by: Cisco IPS, Juniper IPS, open-source Bro
Pros:
• Can efficiently match multiple sigs simultaneously, through a combined DFA
• Can describe the syntactic context
Cons:
• Limited expressive power
• Cannot describe the semantic context
• Inaccurate
Example: .*Abc.*\x90+de[^\r\n]{30}
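As an illustration, regex IPSes fold many signatures into one combined pattern so a single pass matches them all. This sketch uses Python's backtracking `re` engine (not a true DFA, as production systems use) and hypothetical signature names; the first pattern is the example above.

```python
import re

# Illustrative only: production regex IPSes compile all signatures into a
# single combined DFA; Python's backtracking `re` engine just demonstrates
# the alternation idea. Signature names are hypothetical.
signatures = {
    "sig_nop_sled": r"Abc.*\x90+de[^\r\n]{30}",  # the example pattern above
    "sig_awstats":  r"GET /awstats\.pl",          # hypothetical second rule
}
combined = re.compile(
    "|".join(f"(?P<{name}>{pat})" for name, pat in signatures.items()),
    re.DOTALL,
)

def match_payload(payload: str) -> list:
    """Return the names of the signatures that fire on the payload."""
    m = combined.search(payload)
    return [k for k, v in m.groupdict().items() if v is not None] if m else []
```

A true DFA keeps this single-pass behavior even with thousands of alternatives, which is the speed advantage the slide refers to.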
State of the Art: Vulnerability Signatures [Wang et al. 04]
Pros:
• Directly describe the semantic context
• Very expressive; can express the vulnerability condition exactly
• Accurate
Cons:
• Slow! Existing approaches all use sequential matching
• Require protocol parsing
Blaster Worm (WINRPC) example:
BIND: rpc_vers==5 && rpc_vers_minor==1 && packed_drep==\x10\x00\x00\x00 && context[0].abstract_syntax.uuid==UUID_RemoteActivation
BIND-ACK: rpc_vers==5 && rpc_vers_minor==1
CALL: rpc_vers==5 && rpc_vers_minor==1 && packed_drep==\x10\x00\x00\x00 && opnum==0x00 && stub.RemoteActivationBody.actual_length>=40 && matchRE(stub.buffer, /^\x5c\x00\x5c\x00/)
[Diagram: a bad input drives the program from a good state to a bad state; the vulnerability signature characterizes such inputs]
Vulnerability: design flaws enable bad inputs to lead the program to a bad state
Regex vs. Vulnerability Sigs
• Theoretical perspective: regex < context-free < context-sensitive; protocol grammars are context-sensitive
• Practical perspective, e.g.:
  – HTTP chunked encoding
  – DNS label pointers
• Vulnerability signature matching requires both parsing and matching
Regex cannot substitute for parsing
Regex vs. Vulnerability Sigs (cont.)
• Vulnerability signature matching also needs a combining phase
• Regex assumes a single input
• Regex cannot help with the combining phase
Regex + parsing cannot solve the problem: we cannot simply extend regex approaches for vulnerability signatures
Motivation of NetShield
[Diagram: accuracy vs. speed. State-of-the-art regex-sig IDSes: high speed but low accuracy (bounded by the theoretical accuracy limitation of regex). Existing vulnerability-sig IDSes: high accuracy but low speed. NetShield targets both high accuracy and high speed.]
Research Challenges and Solutions
• Challenges
  – Matching thousands of vulnerability signatures simultaneously: sequential matching → matching multiple sigs simultaneously
  – High-speed protocol parsing
• Solutions (achieving tens of Gbps throughput)
  – An efficient algorithm which matches multiple sigs simultaneously
  – A parsing design tailored for high-speed signature matching
  – Code & ruleset released at www.nshield.org
NetShield System Architecture
Outline
• Motivation
• High Speed Matching for Large Rulesets
• High Speed Parsing
• Evaluation
• Research Contributions
Background
• Vulnerability signature basics
  – Use protocol semantics to express vulnerabilities
  – Defined on a sequence of PDUs, with one predicate per PDU
  – Example: ver==1 && method=="put" && len(buf)>300
• Data representations
  – Basic data types used in predicates: numbers and strings
  – Number operators: ==, >, <, >=, <=
  – String operators: ==, match_re(., .), len(.)
Blaster Worm (WINRPC) example:
BIND: rpc_vers==5 && rpc_vers_minor==1 && packed_drep==\x10\x00\x00\x00 && context[0].abstract_syntax.uuid==UUID_RemoteActivation
BIND-ACK: rpc_vers==5 && rpc_vers_minor==1
CALL: rpc_vers==5 && rpc_vers_minor==1 && packed_drep==\x10\x00\x00\x00 && opnum==0x00 && stub.RemoteActivationBody.actual_length>=40 && matchRE(stub.buffer, /^\x5c\x00\x5c\x00/)
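A minimal sketch of the predicate representation above, assuming a dict-based PDU with the slide's example field names (ver, method, buf); this is an illustration, not NetShield's data layout.

```python
# Hedged sketch of the example signature: ver==1 && method=="put" && len(buf)>300.
# The dict-based PDU layout is an assumption for illustration.
predicates = [
    lambda pdu: pdu["ver"] == 1,          # number operator ==
    lambda pdu: pdu["method"] == "put",   # string operator ==
    lambda pdu: len(pdu["buf"]) > 300,    # string operator len(.) with >
]

def signature_matches(pdu: dict) -> bool:
    """A single-PDU signature fires only if every predicate holds."""
    return all(p(pdu) for p in predicates)
```

A multi-PDU signature (like the Blaster example) would chain one such predicate set per PDU in the session.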
Matching Problem Formulation
• Suppose we have n signatures, defined on k matching dimensions (matchers)
  – A matcher is a two-tuple (field, operation), or a four-tuple for associative array elements
  – Translate the n signatures into an n-by-k table
  – This translation unlocks the potential of matching multiple signatures simultaneously
Rule 4: URI.Filename=="fp40reg.dll" && len(Headers["host"])>300

RuleID  Method==  Filename==    Header name & len(value)
1       DELETE    *             *
2       POST      Header.php    *
3       *         awstats.pl    *
4       *         fp40reg.dll   name=="host"; len(value)>300
5       *         *             name=="User-Agent"; len(value)>544
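A hedged sketch of the matcher decomposition: each (field, operation) two-tuple is evaluated once against the PDU, and every rule that tests that field registers its argument with the shared matcher. Rules 2 and 3 are taken from the table above; the dict layout and function names are illustrative, not NetShield's implementation.

```python
# A matcher is a (field, operation) two-tuple shared by every rule that
# tests that field; it runs once per PDU regardless of the rule count.
matchers = {
    ("method", "=="):   lambda pdu, arg: pdu["method"] == arg,
    ("filename", "=="): lambda pdu, arg: pdu["filename"] == arg,
}

# rule id -> list of (matcher key, argument) conditions, per the table above
rules = {
    2: [(("method", "=="), "POST"), (("filename", "=="), "Header.php")],
    3: [(("filename", "=="), "awstats.pl")],
}

def decompose(pdu: dict) -> dict:
    """For each matcher, the set of rule ids whose argument the PDU satisfies."""
    hits = {key: set() for key in matchers}
    for rid, conds in rules.items():
        for key, arg in conds:
            if matchers[key](pdu, arg):
                hits[key].add(rid)
    return hits
```

These per-matcher hit sets are exactly what the candidate selection step later combines.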
Signature Matching
• Basic scheme for the single-PDU case
• Refinements
  – Allow negative conditions
  – Handle array cases
  – Handle associative array cases
  – Handle mutually exclusive cases
• Extend to Multiple-PDU Matching (MPM)
  – Allow checkpoints
Difficulty of Single-PDU Matching
Bad news:
– A well-known computational geometry problem can be reduced to this problem,
– and that problem has bad worst-case bounds: O((log n)^(k-1)) time or O(n^k) space for worst-case rulesets.
Good news:
– A measurement study on the Snort and Cisco rulesets shows real-world rulesets are well-behaved: the matchers are selective.
– With our design, matching takes O(k).
Matching Algorithms
Candidate Selection Algorithm
1. Pre-computation: decide the rule order and matcher order
2. Runtime: decomposition. Match each matcher separately and iteratively combine the results efficiently
Step 1: Pre-Computation
• Optimize the matcher order based on the buffering constraint & field arrival order
• Rule reorder: sort the n rules so that, for each matcher, the rules that require it precede the don't-care rules
[Diagram: rules 1..n reordered so "require matcher 1" rows come before "don't care matcher 1" rows, then subdivided by "require matcher 2" vs. "don't care matchers 1 & 2"]
Step 2: Iterative Matching
RuleID  Method==  Filename==    Header name & len(value)
1       DELETE    *             *
2       POST      Header.php    *
3       *         awstats.pl    *
4       *         fp40reg.dll   name=="host"; len(value)>300
5       *         *             name=="User-Agent"; len(value)>544

PDU = {Method=POST, Filename=fp40reg.dll, Header: name="host", len(value)=450}

S1 = {2}                                  (candidates after matcher 1, Method==)
S2 = (S1 ∩ A2) ∪ B2 = ({2} ∩ {}) ∪ {4} = {4}
S3 = (S2 ∩ A3) ∪ B3 = ({4} ∩ {4}) ∪ {} = {4}

Iteration: S(i+1) = (S(i) ∩ A(i+1)) ∪ B(i+1), where A(i+1) holds the matched rules that require matcher i+1, and B(i+1) holds the matched rules that don't care about all earlier matchers.
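The iterative matching above can be sketched as follows. The table encoding, field names, and helper functions are illustrative assumptions, not NetShield's actual code; the rule table and PDU are the ones on this slide.

```python
# Illustrative sketch of Candidate Selection (CS) iterative merging.
# A cell holds a required value/check, or None for a don't-care ("*").
RULES = {
    1: {"method": "DELETE", "filename": None,          "header": None},
    2: {"method": "POST",   "filename": "Header.php",  "header": None},
    3: {"method": None,     "filename": "awstats.pl",  "header": None},
    4: {"method": None,     "filename": "fp40reg.dll", "header": ("host", 300)},
    5: {"method": None,     "filename": None,          "header": ("User-Agent", 544)},
}
ORDER = ["method", "filename", "header"]  # matcher order from pre-computation

def cell_matches(cell, field, pdu):
    if field == "header":                 # header matcher checks name and length
        name, minlen = cell
        return pdu["header_name"] == name and pdu["header_len"] > minlen
    return pdu[field] == cell

def candidate_selection(pdu):
    S, seen_required = set(), set()
    for field in ORDER:
        req = {r for r, row in RULES.items() if row[field] is not None}
        M = {r for r in req if cell_matches(RULES[r][field], field, pdu)}
        # Rules whose FIRST required matcher is this one enter via B;
        # existing candidates survive if they don't care or matched.
        B = M - seen_required
        S = {r for r in S if r not in req or r in M} | B
        seen_required |= req
    return S

pdu = {"method": "POST", "filename": "fp40reg.dll",
       "header_name": "host", "header_len": 450}
# Produces S1={2}, S2={4}, S3={4} as on the slide.
```

Because real-world matchers are selective, the intermediate sets S(i) stay tiny, which is what makes the merge effectively O(1) per iteration.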
Complexity Analysis
• Merging complexity
  – Needs k-1 merging iterations
  – Each iteration merges in O(n) in the worst case, since S(i) can have O(n) candidates for worst-case rulesets
  – For real-world rulesets, the number of candidates is a small constant, so each merge is O(1)
  – For real-world rulesets: O(k) overall, which is the optimum we can get
• Measured: three HTTP traces: avg(|S(i)|) < 0.04; two WINRPC traces: avg(|S(i)|) < 1.5
Outline
• Motivation
• High Speed Matching for Large Rulesets.
• High Speed Parsing
• Evaluation
• Research Contributions
High Speed Parsing
• Design a parsing state machine
Tree-based vs. Stream Parsers
Tree-based: keep the whole parse tree in memory; parse all the nodes in the tree.
vs.
Stream: parse and match on the fly; parse only signature-related fields (leaf nodes).
High Speed Parsing
• Build an automated parser generator, UltraPAC
Parsing state machine (generated code sketch):
  field_1: length = 5;  goto field_5;
  field_2: length = 10; goto field_6;
  …
[Diagram: protocol spec. + signature set → UltraPAC → protocol parser]
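A hypothetical table-driven version of such a parsing state machine. Each state records how many bytes to consume, whether the field is signature-relevant, and the next state; field names and lengths follow the WINRPC header used elsewhere in this deck, but the (length, keep, next) layout is an assumption for illustration, not UltraPAC's generated code.

```python
# Hypothetical parsing state machine: consume fixed-length fields in order,
# extracting only the signature-related ones (no per-node function calls).
STATES = {
    "rpc_vers":       (1, True,  "rpc_vers_minor"),
    "rpc_vers_minor": (1, True,  "pfc_flags"),
    "pfc_flags":      (1, False, "ptype"),        # not signature-related: skip
    "ptype":          (1, False, "frag_length"),
    "frag_length":    (2, False, "packed_drep"),
    "packed_drep":    (4, True,  None),           # stop here for this sketch
}

def parse(buf: bytes) -> dict:
    """Walk the state machine, keeping only signature-related fields."""
    fields, state, pos = {}, "rpc_vers", 0
    while state is not None:
        length, keep, nxt = STATES[state]
        if keep:
            fields[state] = buf[pos:pos + length]
        pos += length                  # always advance past the field
        state = nxt
    return fields
```

Skipped fields cost only a pointer increment, which is why this beats recursive descent's one-call-per-node overhead.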
Observations
[Diagram: a PDU and its parse tree; leaf nodes are numbers or strings; arrays appear as repeated subtrees]
• Observation 1: Only the fields related to signatures (mostly leaf nodes) need to be parsed
• Observation 2: Traditional recursive descent parsers, which need one function call per node, are too expensive
Efficient Parsing with State Machines
• Studied eight protocols (HTTP, FTP, SMTP, eMule, BitTorrent, WINRPC, SNMP and DNS) as well as their vulnerability signatures
• Found common relationships among leaf nodes: sequential, branch, loop, and derive
• Pre-construct parsing state machines based on parse trees and vulnerability signatures
Outline
• Motivation
• High Speed Matching for Large Rulesets.
• High Speed Parsing
• Evaluation
• Research Contributions
Evaluation Methodology
• 26GB+ traces from Tsinghua Univ. (TH), Northwestern (NU) and DARPA
• Run on a P4 3.8GHz single-core PC with 4GB memory
• After TCP reassembly; PDUs preloaded in memory
• For HTTP: 794 vulnerability signatures covering 973 Snort rules
• For WINRPC: 45 vulnerability signatures covering 3,519 Snort rules
• Fully implemented prototype: 10,000 lines of C++ and 3,000 lines of Python
• Deployed at a data center in Tsinghua Univ. with up to 106Mbps traffic
Parsing Results
Trace                      TH DNS  TH WINRPC  NU WINRPC  TH HTTP  NU HTTP  DARPA HTTP
Avg flow len (B)           77      879        596        6.6K     55K      2.1K
BinPAC throughput (Gbps)   0.31    1.41       1.11       2.10     14.2     1.69
Our parser (Gbps)          3.43    16.2       12.9       7.46     44.4     6.67
Speedup ratio              11.2    11.5       11.6       3.6      3.1      3.9
Max memory/conn (bytes)    16      15         15         14       14       14
Parsing+Matching Results
Trace                        TH WINRPC  NU WINRPC  TH HTTP  NU HTTP  DARPA HTTP
Avg flow length (B)          879        596        6.6K     55K      2.1K
Sequential (Gbps)            10.68      9.23       0.34     2.37     0.28
CS matching (Gbps)           14.37      10.61      2.63     17.63    1.85
Matching-only speedup ratio  4          1.8        11.3     11.7     8.8
Avg # of candidates          1.16       1.48       0.033    0.038    0.0023
Avg memory/conn (bytes)      32         32         28       28       28

8-core: 11.0
Scalability Results
[Figure: throughput (Gbps, 0-4) vs. # of rules used (0-800); performance decreases gracefully as rules are added]
Accuracy Results
• Created two polymorphic WINRPC exploits which bypass the original Snort rules but are detected accurately by our scheme
• For a 10-minute "clean" HTTP trace, Snort reported 42 alerts; NetShield reported 0. We manually verified that the 42 alerts are false positives
Research Contributions

          Regular Expression  Existing Vul. IDS  NetShield
Accuracy  Poor                Good               Good
Speed     Good                Poor               Good
Memory    Good                ??                 Good

• Multiple-signature matching: candidate selection algorithm
• Parsing: parsing state machine

Tools at www.nshield.org

Make vulnerability signatures a practical solution for NIDS/NIPS
Q&A
Example for WINRPC
• Rectangles are states
• Parsing variables: R0 .. R4
• 0.61 instructions/byte for the BIND PDU
[Diagram: parsing state machine for WINRPC. Fixed-length fields consumed in order: rpc_vers (1B), rpc_vers_minor (1B), pfc_flags (1B), ptype (1B), frag_length (2B), packed_drep (4B), merge1 (6B); the header branches on Bind vs. Bind-ACK. The Bind path reads ncontext (1B), merge2 (8B), then loops over context elements (ID (2B), n_tran_syn (1B), padding (1B), UUID (16B), UUID_ver (4B), tran_syn, padding (3B), merge3) using R2 ← 0, R3 ← ncontext, R2++ while R2 ≤ R3; registers R0, R1, R4 track offsets (e.g., R1-16, 20*R4)]
Parser generator
• We reuse the front end of BinPAC (a yacc-like tool for protocol parsing)
• We redesigned the back end to generate a parsing-state-machine-based parser