About the aVTC experience
Dr. Klaus Brunnstein, Professor emeritus
Department for Informatics, University of Hamburg
President, International Federation for Information Processing (IFIP)
AV-workshop Reykjavik (F-Prot) May 16-17, 2007
1. Background: Uni Hamburg's IT Security Curricula
2. Development of aVTC @ Uni-Hamburg
3. Methods used in aVTC tests, lessons learned
4. Demand for inherently secure systems
Abstract
Title: The VTC experience
Author: Klaus Brunnstein, University of Hamburg, Germany
Abstract: Established in 1987, the Virus Test Center at Hamburg University was the first lab where students learned how to analyse security threats, especially those related to malicious software, and to prepare software solutions to counter them (later, other labs worked on chipcard security, biometrics and incident response methods). After initial projects (including Morton Swimmer's ANTIJERU), Vesselin Bontchev (coming from the virus lab of the Bulgarian Academy, Sofia) joined VTC in 1992 and started his AntiVirus test suite; Vesselin was probably the first ever to systematically organise AV tests, and his experiences taught several AV experts and their companies how to improve their products. When Vesselin left (for Iceland), a series of student projects was started in which students could learn to organise and maintain a malware database, prepare testbeds, develop criteria for testing, and perform AV/AM tests with special emphasis on the detection quality of AntiVirus and AntiMalware products. VTC results were sometimes received controversially, especially when the author announced that product tests would also address detection of non-replicating malware (aka trojans); at that time, some AV producers withdrew their products from the test (some of them rejoined later, after having been convinced that AntiVirus-only tests are too restrictive). The paper describes the methods VTC used to maintain testbeds and to perform tests, also addressing problems found in testing. After the principal investigator finished his teaching career (in fall 2004), VTC was closed for lack of students devoting time to test procedures.
Agenda: Chapter 1
1. Background: Uni Hamburg's IT Security Curricula
2. Development of aVTC @ Uni-Hamburg
3. Methods used in aVTC tests
4. Demand for inherently secure systems
1.1 Background: Hamburg's IT Security Curricula
Working Group AGN (Applications in Science: K.Bru.) responsible for education and research in IT Security
WS 1987/88 (winter semester): first lecture „IT Security and Safety"; pre-cycle: winter 1987/88 to summer 1989
• Self-protection of Malware against Detection:
– Hiding interactions: e.g. replacement of interrupts
– Self-encrypting malware:
• Many viruses self-encrypt, with the decryption routine often the only constant, scannable part of the code
– Oligo- and Polymorphic (Obfuscated) Code:
• Change the static layout of code by changing the sequence of instructions (e.g. the sequence of loading registers before a procedure invocation) where the semantics are not affected
• Oligomorphic code: few different variations of the code (same effect)
• Polymorphic code: many different instantiations of the code (same effect)
• Problem: malware „signatures" are combinations of static code patterns (combined with AND, OR, NOT and wildcards) that help to identify viruses and to distinguish different „variants" (see the matching sketch below)
• Such code requires specific detection routines (the scanning process is slowed)
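To make the signature idea concrete, the following minimal Python sketch matches a byte pattern with one-byte wildcards against a buffer. It is an illustration only, not VTC's or any product's actual scanner logic; the signature format ("??" as a one-byte wildcard) is an assumption.

# Minimal sketch of wildcard signature matching (illustration only).
# A signature is a list of byte values where None is a one-byte wildcard.

def parse_signature(text: str):
    """Parse e.g. 'B8 ?? 4A CD 21' into [0xB8, None, 0x4A, 0xCD, 0x21]."""
    return [None if tok == "??" else int(tok, 16) for tok in text.split()]

def matches_at(data: bytes, offset: int, sig) -> bool:
    if offset + len(sig) > len(data):
        return False
    return all(s is None or data[offset + i] == s for i, s in enumerate(sig))

def scan(data: bytes, sig) -> bool:
    """True if the signature occurs anywhere in the data. Oligo- and
    polymorphic code defeats this approach: the static byte layout changes
    between instantiations, so no fixed pattern matches all samples."""
    return any(matches_at(data, off, sig) for off in range(len(data)))

sig = parse_signature("B8 ?? 4A CD 21")   # hypothetical example pattern
print(scan(bytes([0x90, 0xB8, 0x01, 0x4A, 0xCD, 0x21]), sig))   # True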
1.6 Background: Hamburg's IT Security Curricula
Additional lectures on: Mathematical Cryptography (Prof. Kudlek), Data Protection etc.
Seminar on „Actual Problems of IT Security and Safety" (every semester)
Practice in Reverse Engineering
Virus Test Center: practical student work with malware/tests
Other labs: biometric systems, secure chipcards
Examination work: about 100 diploma/master theses
Dissertations: e.g. Vesselin Bontchev on viruses, Klaus-Peter Kossakowski on principles of incident response systems, Morton Swimmer (2005) on new AV methods
1.7 Background: Hamburg's IT Security Curricula
• Hamburg Bachelor Curriculum 2001-2006:
– Lecture (4 hours/week) for ALL students in 3rd year
– „Foundations of Secure/Safe Systems" (GBI)
– 250 students per semester (mandatory)
– Essential elements:
2.1 Development of aVTC @ Uni-Hamburg
Phase 2: 1990-1995: Vesselin Bontchev: 1st professional AV tests
Vesselin is „Best Teller of this Saga“
Phase 3: 1994-2004: VTC established with student testers
1997/98: 1st malware test (against the protest of some AV companies)
October 2004: aVTC closed (Prof. emeritus – no more students)
2.2 Survey of tests at aVTC @ Uni-Hamburg
Scanner test July 2004
Scanner test April 2003
Scanner test December 2002 "Heureka-2"
Scanner test March 2002
Scanner test October 2001 "Heureka(-1)"
Scanner test July 2001
Scanner test April 2001
AntiVirus Repair Test (ART 2000-11)
Comment on Sophos' reaction to VTC test report August 2000
Scanner test August 2000
Scanner test April 2000
Pre-released Scanner test February 2000
Scanner test September 1999
Scanner test March 1999
Scanner test October 1998
Scanner test Computer Bild (June 1998)
Scanner test February 1998
Scanner test July 1997
Scanner test February 1997
Scanner test July 1994
Scanner test July 2005: Detection of mobile viruses (diploma thesis)
Agenda: Chapter 3
1. Background: Uni Hamburg's IT Security Curricula
2. Development of aVTC @ Uni-HH
3. Methods used in aVTC tests
3A Survey of methods
3B Survey of test results
3C Lessons learned
4. Demand for inherently secure Systems
3A.1 Test System: Lab Network
[Diagram: lab network - a server and clients (Client 1-3) running DOS, Win 95, Win NT and Win XP, connected via 100 Mbit Ethernet using Microsoft NetBEUI]
3A.2a Test server:
Win-NT Server (1)
Hardware: Pentium 200 MHz, 64 MB RAM, 2 GB hard disk (boot), 2*4.3 GB data/reports, 2*9.1 GB virus database (mirror)
3 network cards: 2*100 MBit/sec, 1*10 MBit/sec
Protected against electrical faults (UPS: APC 420 VA)
Operating system: Windows NT Server 4.0 SP 6
Network:
1*10 MBit/sec BNC for 20 DOS clients
1*100 MBit/sec via 2 cascaded switches for all other clients with 10 MBit/sec cards
1*100 MBit/sec via a 100 MBit/sec hub for other clients
Boot virus database:
– Saved as images of boot sectors and master boot records
– File extensions: boo, img, mbr
File virus database:
– File extensions: COM, EXE, CMD, SYS, BAT
– The directory structure is created out of the virus names
– The files are in their original structure
3A.4 Test System: Directory structure
Main directories:
CARO: the three main scanners identify the virus identically (implies that CARO naming conventions are valid)
NYETCARO: one or two scanners identified the virus
UNKNOWN: none of the three scanners identified the virus, but the files replicate
• In early tests:
– OS/2: viruses natively working under OS/2
– WINDOWS 95: viruses natively working under Windows 95
3A.5 Early Test System Size (1997)
Boot virus database: 3,910 images, 1,004 viruses
File virus database: 84,236 files, 13,014 viruses
Macro virus database: 2,676 files, 1,017 viruses
Macro malware database: 61 files, 89 malware
File malware database: 213 files, 163 malware
3A.6b Test System: Size April 2003
"Full Zoo":
21,790 File Viruses in 158,747 infected files
8,001 different File Malware in 18,277 files
664 clean file objects for False Positive test
7,306 Macro Viruses in 25,231 infected documents
450 different Macro Malware in 747 macro objects
329 clean macro objects for False Positive test
823 different Script Viruses in 1,574 infected objects
117 different Script Malware in 202 script objects
"ITW Zoo":
11 Boot Viruses in 149 infected images/sectors
50 File Viruses in 443 infected files
124 Macro Viruses in 1,337 infected documents
20 Script Viruses in 122 infected objects
3A.7a Preprocessing of new objects (#1/4)
Unzip the archives
Reset all file attributes
Sort all files into main categories (boot, file, macro)
Restore the normal file extensions (e.g. .EX_ ==> .EXE); see the sketch below
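The extension-restoration step can be sketched in a few lines of Python; the mapping table and the trailing-underscore convention are assumptions for illustration, not the documented VTC tooling.

import os

# Illustrative sketch: restore normal file extensions, e.g. .EX_ ==> .EXE.
# The mapping is an assumption; the actual VTC rules may have differed.
EXT_MAP = {".ex_": ".exe", ".co_": ".com", ".sy_": ".sys", ".ba_": ".bat"}

def restore_extension(path: str) -> str:
    """Rename a file whose extension was mangled, e.g. SAMPLE.EX_ -> SAMPLE.EXE."""
    root, ext = os.path.splitext(path)
    new_ext = EXT_MAP.get(ext.lower())
    if new_ext is None:
        return path                    # extension already normal
    if ext.isupper():
        new_ext = new_ext.upper()      # preserve upper-case DOS style
    os.rename(path, root + new_ext)
    return root + new_ext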
3A.7b Preprocessing of new objects (#2/4)
Remove all known non-viruses with the Dustbin tool
Search for duplicate files (binary identical); see the sketch below:
First step: only the new files
Second step: new files and old database
Third step: delete all duplicate files
Replication of all new files to test if they are „alive“ (partially applied in test 1997-07)
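The three-step duplicate search can be sketched as follows. The historical tooling is not documented here; this version uses SHA-256 content hashes as a stand-in for byte-wise comparison of binary-identical files.

import hashlib, os

def file_digest(path: str) -> str:
    """Content hash; binary-identical files yield identical digests."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def find_duplicates(new_dir: str, database_dir: str):
    """Return new files that duplicate another new file or a database file."""
    seen = {}
    # second step's baseline: hashes of the existing database
    for root, _, names in os.walk(database_dir):
        for n in names:
            p = os.path.join(root, n)
            seen.setdefault(file_digest(p), p)
    duplicates = []
    # new files are checked against each other (first step) and the database
    for root, _, names in os.walk(new_dir):
        for n in names:
            p = os.path.join(root, n)
            d = file_digest(p)
            if d in seen:
                duplicates.append(p)   # candidates for deletion (third step)
            else:
                seen[d] = p
    return duplicates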
3A.7c Preprocessing of new objects (#3/4)
Scan new files and previous databases with F-Prot, Dr. Solomon and AVP to create report files
Move the non-viruses (trojans, droppers, germs) into a special directory
Preprocess the reports using CARO.BAT
If a virus is operating-system specific, it is sorted into the corresponding subdirectory below the specific OS-Directory (Win95, WinNT, OS/2)
3A.7d How CARO.BAT works (#4/4):
The subdirectory name is created out of the virus name.
The dots between the family name, sub-family, main variant and sub-variant are substituted with backslashes.
All characters except a-z, 0-9, „-“ and „_“ are substituted with „_“.
If a file with the same name already exists, the new file in this directory is renamed.
If F-Prot identifies a virus by name, the file is moved into the corresponding subdirectory below the NYETCARO directory
If Dr. Solomon identifies a virus by name, the file is moved into the corresponding subdirectory below the NYETCARO directory
If AVP identifies a virus by name, the file is moved into the corresponding subdirectory below the NYETCARO directory
If all three scanners identify a virus by the same name, the file is moved into the corresponding subdirectory below the CARO directory (a re-creation of these rules follows below)
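CARO.BAT itself was a DOS batch script and is not reproduced here; the following Python sketch merely mirrors the rules just listed (lowercasing of name parts is an added assumption).

import os, re, shutil

def caro_path(base_dir: str, virus_name: str) -> str:
    """Map a CARO virus name to a directory path: dots between family,
    sub-family, main variant and sub-variant become path separators, and
    all characters except a-z, 0-9, '-' and '_' are substituted with '_'."""
    parts = [re.sub(r"[^a-z0-9_-]", "_", p.lower()) for p in virus_name.split(".")]
    return os.path.join(base_dir, *parts)

def file_into(base_dir: str, virus_name: str, sample: str) -> str:
    """Move a sample below base_dir (e.g. CARO or NYETCARO), renaming the
    new file if one with the same name already exists in the directory."""
    target_dir = caro_path(base_dir, virus_name)   # e.g. CARO/cascade/1701/a
    os.makedirs(target_dir, exist_ok=True)
    name, ext = os.path.splitext(os.path.basename(sample))
    target = os.path.join(target_dir, name + ext)
    n = 1
    while os.path.exists(target):
        target = os.path.join(target_dir, f"{name}_{n}{ext}")
        n += 1
    shutil.move(sample, target)
    return target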
3A.8 Test Procedures: Testing boot viruses
For practical reasons, no infected floppy disks are tested (a method for such tests is available but not practiced).
1.) Using SIMBOOT:
• Used to scan the boot images
• Simulates changing of infected floppy disks
• Simulates the user input to scan the next floppy disk
2.) If SIMBOOT fails, direct test:
• Scan the images directly
Remark: several AV products crash under SIMBOOT.
3A.9 Test Procedures: Testing file/macro viruses
Heuristic mode
Reports only (no repair)
Experience: some scanners crash upon detecting viruses improperly
Scan a small number of files at a time (it's easier to restart the scanner):
• CARO
• NYETCARO\A
• NYETCARO\B ...
• NYETCARO\Z
• Unknown
• OS/2 (early tests)
• WINDOWS 95
• Windows NT
• Windows XP
3A.10 Test Procedures for file/macro viruses
1. Start the test version of the OS
2. Install the scanner
3. Scan and save the report to the network (see the sketch below)
4. Reboot with the master system
5. Delete the test version and restore from backup
6. Start from the beginning
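Purely as illustration, steps 2 and 3 for a single product might look like the sketch below; the scanner command line and the report share are invented, and the reboot/restore steps were performed outside any script, from a master image.

import pathlib, shutil, subprocess

SCANNER_CMD = ["scanner.exe", "/report=report.txt", r"D:\TESTBED"]   # hypothetical CLI
REPORT_SHARE = pathlib.Path(r"\\SERVER\reports")                     # hypothetical share

def scan_and_save(product: str) -> None:
    # check=False: some scanners crash on detection; the report written
    # so far is still collected for evaluation
    subprocess.run(SCANNER_CMD, check=False)
    shutil.copy("report.txt", REPORT_SHARE / f"{product}.txt")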
3A.11 Test Results, Evaluation
1) UNIX tools and AWK scripts are used to evaluate the reports; in cases of changed scanner diagnostics, the scripts must be adapted.
2) Create an alphabetical list that contains, for each directory, the directory name and the number of files in it.
3) Analyse, for each scanner report, how many files were scanned and recognized.
4) Sort and join the reports (directory listing vs. preprocessed scanner report); a sketch of this join follows below.
5) Evaluate the joined report.
6) Quality assurance.
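The AWK pipeline itself is not preserved; a Python sketch of the join in steps 2) to 5) could look like this, with the listing and report formats assumed (relative paths, one entry per line):

from collections import Counter

def directory_counts(listing_lines):
    """Step 2: from a directory listing (one file path per line), count
    the number of files per directory."""
    counts = Counter()
    for line in listing_lines:
        directory, _, _ = line.strip().rpartition("\\")
        counts[directory] += 1
    return counts

def detection_rate(listing_lines, report_lines):
    """Steps 3-5: join the directory listing with a preprocessed scanner
    report (assumed format: 'path\\to\\sample: virus-name' per detection)
    and compute the overall detection rate in percent."""
    total = directory_counts(listing_lines)
    detected = Counter()
    for line in report_lines:
        path = line.strip().rsplit(":", 1)[0]
        directory, _, _ = path.rpartition("\\")
        detected[directory] += 1
    hits = sum(min(detected[d], n) for d, n in total.items())
    return 100.0 * hits / sum(total.values())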
3B.1a Test results (e.g. 2003-04: 1st WXP test)
• 0README.1ST - Latest notes
• 0XECSUM.TXT - Executive summary of test report 2003-04
• 1CONTENT.TXT - This file
• 2PROLOG.TXT - Background, aims of this test
• 3INTRO.TXT - Introduction: background, aims, development of VTC tests
• 4TESTCON.TXT - Conditions which a scanner must fulfil in order to be tested
• 5PROTOCO.TXT - Detailed description of VTC test protocols
• 6jWXP.TXT - Detailed results: Windows-XP file, macro and script virus and malware results
• 6mCMP32.TXT - Detailed results: comparison of 32-bit results from tests 2002-12 and 2003-04 (Win-XP, Win-98, Win-2k)
• 7EVAL-WXP.TXT - Windows-XP results: evaluation, grading of WXP products
• 7EVAL-CMP.TXT - W32 platforms: comparison, evaluation, grading of W32 products
• 8PROBLMS.TXT - Problems and bugs experienced during tests
• 9EPILOG.TXT - Summary, future test plans, and final comment
• DISCLAIM.TXT - Disclaimer: about usage of this document
3B.1b Test report structure (cont)
Evidence for reproducibility of test results:
• A1ITW00b.TXT - "In-The-Wild" list of PC viruses (October 2002: Wildlist.org)
• A2SCANLS.TXT - List of scanners/versions and parameters, including information on producer
• A4TSTDIR.TXT - Directory of A3TSTBEDs (content of A3TSTBED.zip)
• A5CODNAM.TXT - Code names of AV products in VTC tests

Separate appendix:
• A3TSTBED.ZIP - Index of file, macro, script virus & infected object databases, both full and "In-The-Wild"; index of macro and script malware databases; and index of non-viral and non-malicious objects used in the False-Positive test (all pkZIPped).
3B.3 Example of test result: File/Macro/Script Zoo Virus Detection Rates
Scanner  | File Virus Detection | Macro Virus Detection | Script Virus Detection
---------+----------------------+-----------------------+-----------------------
AVP      |        100.~         |         100.~         |          98.9
BDF      |         82.9         |          99.0         |          72.4
CMD      |         98.5         |          99.9         |          89.1
DRW      |         98.3         |          99.4         |          94.7
FSE      |        100.~         |         100.~         |          99.5
INO      |         98.7         |          99.9         |          94.7
NAV      |         98.3         |          99.6         |          96.8
NVC      |         97.8         |          99.8         |          87.6
RAV      |         96.7         |          99.9         |          96.1
SCN      |         99.8         |         100.0         |          99.6
---------+----------------------+-----------------------+-----------------------
Mean     |         97.1%        |          99.8%        |          92.9%
Mean>10% |         97.1%        |          99.8%        |          92.9%

Student testers preferred/developed graphical representations (see the next folios).
[Three chart folios: Antiviren-Test 2002-12 (antivirus test 2002-12), detection rates under Windows 2000]
3B.4a Grading of AV/AM products:
Definition (1): A "Perfect AntiVirus (AV) product"
1) Will detect ALL viral samples "In-The-Wild" AND at least 99.9% of zoo samples, in ALL categories (file, boot, macro and script-based viruses), always with the same high precision of identification and in every infected sample,
2) Will detect ALL ITW viral samples in compressed objects for all (now: 5) popular packers, and
3) Will NEVER issue a False Positive alarm on any sample which is not viral.
Remark: detection of "exotic viruses" is presently NOT rated.
3B.4b Grading of AV/AM products:
Definition (2): A "Perfect AntiMalware (AM) product"
1) Will be a "Perfect AntiVirus product", that is: 100% ITW detection AND >99% zoo detection AND high precision of identification AND high precision of detection AND 100% detection of ITW viruses in compressed objects AND 0% False-Positive rate,
2) AND it will also detect essential forms of malicious software, at least in unpacked form, reliably at high rates (>90%).
Remark: detection of "exotic malware" is presently NOT rated.
(Both definitions are restated as a checkable predicate in the sketch below.)
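As a sketch, the two definitions can be expressed as a checkable predicate. The field names are invented, and per-category (file/boot/macro/script) and per-packer details are collapsed into single aggregates.

from dataclasses import dataclass

@dataclass
class TestResult:
    itw_detection: float          # % of "In-The-Wild" samples detected
    zoo_detection: float          # % of zoo samples detected
    packed_itw_detection: float   # % of ITW samples detected in compressed objects
    false_positives: int          # alarms raised on clean objects
    malware_detection: float      # % of non-replicating malware detected

def perfect_antivirus(r: TestResult) -> bool:
    # Definition (1): all ITW samples, at least 99.9% zoo, all packed ITW, no FPs
    return (r.itw_detection == 100.0 and r.zoo_detection >= 99.9
            and r.packed_itw_detection == 100.0 and r.false_positives == 0)

def perfect_antimalware(r: TestResult) -> bool:
    # Definition (2): a perfect AV product that also detects essential
    # forms of malicious software reliably at high rates (>90%)
    return perfect_antivirus(r) and r.malware_detection > 90.0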
3B.4c Example of product grading
Test category: "Perfect" "Excellent"
3C Lessons learned
1) Continuous improvement of knowledge & skills of AV test personnel (courses, events)
2) Publish test methods in detail, as basis for analysis of test methods & discourse about improvement
3) Work with trusted AV companies but avoid dependency
4) A sound test database requires uniform naming as well as proper quality assurance measures (including publication of problems, even one's own failures)
5) Send test results (incl. missed samples) to AV companies for analysis & verification of results.
6) Publish all details of tests (methods, problems, findings) to allow for expert analysis (NOT samples!)
Agenda: Chapter 4
1. Background: Hamburg's IT Security Curriculum
2. Development of aVTC @ Uni-HH
3. Methods used in aVTC tests
4. Demand for inherently secure systems
4.1 Contemporary Solution: „Tower of IT“
[Diagram: „Tower of IT" - WAN and protected LAN; components: KryptoBox, firewall, intrusion detection, AntiMalware; zones: red = NO PROTECTION, yellow = partial protection, blue = high protection; label: malicious information]
4.2 Requirements for Inherently Safe & Secure Systems
Basic requirement: for all IT systems in a ubiquitous network (including devices in personal contact), manufacturers specify and guarantee essential functions and features.
Requirement #1: „SafeComputing“ (SC): SC architecture guarantees: functionality of processes, persistence & integrity of objects, encapsulation of processes, graceful degradation (!), benign recovery (!)
Requirement #2: „SecureNetworking“ (SN): SN protocol guarantees: confidentiality, integrity, authenticity of sender/receiver, reliability of transfer, non-repudiation (!), non-deniability (!)
Requirement #3: Assurance of functional adequacy: all functions and features must be specified and implemented in a way that permits adequate assurance of the specifications.
4.3 Residual Risks in Ubiquitous Computing
Future Secure and InSecure Networlds:
[Diagram - labels: stand-alone: local anomalies; no anomalies; FreeNetwork: no protection against attacks in FreeNetwork; SecureNetwork: locally strong („lokal stark"), inherently secure against attacks; protection from the import of anomalies, attacks, flooding]
4.4 Enforcement of Inherent Security
Path #1: IT manufacturers establish and enforce adequate quality and standards.
Example: steam boiler quality enforced through the „Dampfkessel-Überwachungs-Verein" (now: TÜV).
Presently, no such self-organisation of the ICT industry is available.
Path #2: Directives (EU, presidential) and laws enforce protection of customers (persons AND enterprises), including damage compensation and preventive actions.
Example: customer protection legislation in the USA etc. following Nader's book „Unsafe at Any Speed"