Knock, Knock, Knocking on (Network) Doors: Penn State's
Intrusion Detection Architecture
Copyright Penn State Information Technology Services, 2004. This work is the intellectual property of the author. Permission is granted for this material to be shared for non-commercial, educational purposes, provided that this copyright statement appears on the reproduced materials and notice is given that the copying is by permission of the author. To disseminate otherwise or to republish requires written permission from the author.
Knock, Knock, Knocking on (Network) Doors: Penn State's
Intrusion Detection Architecture
The Security Professionals Workshop, May 18, 2004
Security Operations and Services
● A division of Information and Technology Services (ITS) at Penn State
● 8 full-time staff members
  – Director: Kathy Kimball
  – Intrusion detection: 2 staff members
● 18 installed units
  – 2002: 6 units (5 commercial, 1 SOS build)
  – 2003: 12 units (3 commercial, 9 SOS builds)
● Locations
  – 8 units at 6 non-UP campus locations
  – 6 units at 5 UP colleges
  – 2 units at 2 ITS locations
  – 1 unit at other UP department
  – 1 unit at UP residence hall*
● 8,912 addresses covered (~35 class Cs)
Initial Experiences
● Overwhelming amount of data
  – Initial average of 60,000 alerts daily on each sensor
● What does this alert mean?
  – Initial tendency to analyze false positives
  – Initial tendency to question/ignore alerts
● How do I write this rule?
● Constant attention needed
  – No benefit without continuous monitoring
  – Rule sets/software updates
  – Mirrors go down
● The insight provided into networks
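The "How do I write this rule?" question can be illustrated with a minimal Snort rule sketch; the port, message text, and sid below are purely illustrative, not an actual Penn State signature:

```text
alert tcp any any -> $HOME_NET 1926 (msg:"Possible backdoor listening on 1926/tcp"; flags:S; sid:1000001; rev:1;)
```

This alerts on inbound TCP SYNs to a suspect port on monitored hosts; by Snort convention, locally written rules use sid values of 1,000,000 and above to avoid colliding with distributed rule sets.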
IDS and ID Tool Utilization
● Iterative process using Snort, RID, nmap, flow data, (ex/in)ternal reports, and well-known information; for example:
  – Scanning activity from internal host detected ((ex/in)ternal report/Snort)
    ● Nmap of host/connection to open ports for signature detection
    ● Signature for detected port(s) input into RID
  or
  – Compromise (with signature) detected on Snort
    ● Signature for detected port(s) input into RID
  or
  – Backdoor without signature identified on specific port
    ● Nmap scans
Interesting ports on (128.118.xx.xx):
Port  State  Protocol  Service
135   open   tcp       loc-srv
139   open   tcp       netbios-ssn
206   open   tcp       at-zis
...
1926  open   tcp       unknown
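The iterative workflow above (scan a suspect host, then feed the open ports into a signature tool such as RID) can be partly automated. A hypothetical helper, sketched below, parses nmap-style output like the listing above and collects the open TCP port numbers; the function name and the exact RID input format are assumptions, not part of the original process description:

```python
# Hypothetical helper: extract open TCP ports from nmap-style output
# (as in the listing above) so they can be handed to a signature tool.
def open_ports(nmap_output: str) -> list[int]:
    ports = []
    for line in nmap_output.splitlines():
        fields = line.split()
        # Expect data rows shaped like: "135 open tcp loc-srv"
        if len(fields) >= 3 and fields[0].isdigit() and fields[1] == "open":
            ports.append(int(fields[0]))
    return ports

sample = """\
Interesting ports on (128.118.xx.xx):
Port State Protocol Service
135 open tcp loc-srv
139 open tcp netbios-ssn
206 open tcp at-zis
1926 open tcp unknown
"""
print(open_ports(sample))  # → [135, 139, 206, 1926]
```

Header and banner lines are skipped because their first field is not numeric, so the same parser works whether or not the "Interesting ports" preamble is present.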
● Effectiveness? Can't say with certainty
  – Circumstances often limit monitoring (e.g. crisis management, other tasks, time off)
  – Things are missed/ignored
  – Signatures don't exist or are not on devices
● What we can say with certainty
  – Improvement over commercial 24x7 managed service trial
  – Central detection contributes to effectiveness during crisis
  – July 2003: border filters applied for vulnerable Microsoft ports (and a few more)
    ● More internal damage is detected/limited
    ● July 30/August 7, 2003 experiences
  – Self-monitoring is important; with border filters there is less external reporting, and some attacks remain inside
The Need for Automation
● New attacks/worms require IDS signature development (though portscan detection may catch them)
● Human analysis/response (even 24x7) is insufficient to minimize attack/worm damage
  – "Triage" experience against recent rapidly propagating