The VTC experience

Presented at the International Antivirus Testing Workshop 2007 by Prof. Dr. Klaus Brunnstein, University of Hamburg, Germany
Transcript
Page 1: The VTC experience

About the aVTC experience

Dr. Klaus Brunnstein, Professor emeritus
Department for Informatics, University of Hamburg
President, International Federation for Information Processing (IFIP)

AV workshop Reykjavik (F-Prot), May 16-17, 2007

1. Background: Uni Hamburg's IT Security Curricula

2. Development of aVTC @ Uni-Hamburg

3. Methods used in aVTC tests, lessons learned

4. Demand for inherently secure systems

Page 2: The VTC experience

Abstract

Title: The VTC experience
Author: Klaus Brunnstein, University of Hamburg, Germany

Abstract: Established in 1987, the Virus Test Center (VTC) at Hamburg University was the first lab in which students learned to analyse security threats, especially those related to malicious software, and to prepare software solutions to counter them (later, other labs worked on chipcard security, biometrics and incident response methods). After initial projects (including Morton Swimmer's ANTIJERU), Vesselin Bontchev (coming from the virus lab of the Bulgarian Academy of Sciences, Sofia) joined VTC in 1992 and started his antivirus test suite; Vesselin was probably the first ever to organise AV tests systematically, and his experience taught several AV experts and their companies how to improve their products. When Vesselin left (for Iceland), a series of student projects was started in which students could learn to organise and maintain a malware database, prepare testbeds, develop criteria for testing, and perform AV/AM tests with special emphasis on the detection quality of antivirus and antimalware products. VTC results were sometimes received controversially, especially when the author announced that product tests would also address detection of non-replicating malware (aka trojans); at that time, some AV producers withdrew their products from the test (some of which rejoined later, after having been convinced that antivirus-only tests are too restrictive). The paper describes the methods VTC used to maintain testbeds and perform tests, also addressing problems found in testing. After the principal investigator ended his teaching career (in fall 2004), VTC was closed for lack of students devoting time to test procedures.

Page 3: The VTC experience

Agenda: Chapter 1

1. Background: Uni Hamburg´s IT Security Curricula

2. Development of aVTC @ Uni-Hamburg

3. Methods used in aVTC tests

4. Demand for inherently secure systems

Page 4: The VTC experience

1.1 Background: Hamburg´s IT Security Curricula

Working Group AGN (Applications of Informatics in the Humanities and Natural Sciences; head: K. Brunnstein), responsible for education and research in IT Security

WS 1987/88: first lecture „IT Security and Safety“; pre-cycle: winter 1987/88 - summer 1989

Curriculum IT-Sicherheit (IT Security):
1st cycle: winter 1989/90 - summer 1991
2nd cycle: winter 1991/92 - summer 1993
3rd cycle: winter 1993/94 - summer 1995
4th cycle: winter 1995/96 - summer 1997
5th cycle: winter 1997/98 - summer 1999
6th cycle: winter 1999/00 - summer 2001

Average: 50 students per cycle (optional in the diploma programme)

Page 5: The VTC experience

1.2 Background: Hamburg´s IT Security Curricula

Lecture 1: Introduction into IT Security and Safety

Survey of dependability/vulnerability studies

Survey of IT misuse: Hackers, Crackers, Viruses, Worms

Basic IT paradigms and selected IT-induced risks

Case studies of IT relevant incidents in organisations and enterprises, security and safety issues and policies

Legal aspects:
- Data Protection
- Computer Crime Legislation
- Copyright, Intellectual Property Rights

Page 6: The VTC experience

1.3 Background: Hamburg´s IT Security Curricula

Lecture 2: Concepts of Secure & Safe Systems I

Problems of "Quality", ISO 9000 etc.

IT Security and Safety Models & IT Security Criteria:

TCSEC/TNI, ITSEC, CTCPEC, US FC, MSFR,

JCSEC, R-ITSEC, Common Criteria

Reference Monitor Concepts

Implementations of Virtual Systems

Intrusion Detection (IDES) / Avoidance (IDA)

Page 7: The VTC experience

1.4 Background: Hamburg´s IT Security Curricula

Lecture 3: Concepts of Secure & Safe IT Systems II
Encryption methods (general, DES, RSA, Clipper)
Data Base/Information Systems Security: Problems and Solutions (DBMS, RDBMS)
Communication and Network Security

Lecture 4: Risk and Incident Analysis
Case studies: Incidents of IT-based Systems
- Network, Mainframe, PC Attacks
- Bank networks/accidents
- Flight Management (EFCS) and other accidents
Methods of Risk Analysis
Large Systems Backup Solutions
Methods of Reverse Engineering
Methods of Computer Emergency Response

Page 8: The VTC experience

AGN: Anwendungen der Informatik in Geistes- und Naturwissenschaften
(Applications of Informatics in the Humanities and Natural Sciences)

Page 9: The VTC experience

1.5a Background: Reverse Engineering Course

(anti) Virus Test Center = antiMalware Laboratory:

- local network, clients with flexible hub switching concept

- Intel-based Workstations

- VMware as basic platform, to „contain“ malicious events

- DOS (boot viruses&trojans)

- W32 systems (file, macro&script viruses/worms; trojans)

Reverse-Engineering Courses (1/year):

- 10 days (2 weeks): survey of malware, methods of analysis

- practice of reverse-engineering

- Certificate (examination + analysis of unknown malware)

Page 10: The VTC experience

1.5b Background: Reverse Engineering Course: Generating Replicated Code

„Goat (=victim) object“:

executable „pure“ content of different types and lengths (a generator sketch follows below)

„Infection process“:

viruses or worms are executed in a protected environment to avoid uncontrolled spreading

different goat objects to assure that infection works under all relevant circumstances (also for proper detection)

generations of infection:
original virus infects 1st generation
1st generation infection generates 2nd generation
2nd generation infection generates 3rd generation
assurance: 1st & 2nd generations are infectious (viral/wormy)
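To make the goat concept concrete, here is a minimal sketch (not the VTC's actual tooling; file names and sizes are illustrative assumptions) that generates benign DOS-style .COM goat files of different lengths. A do-nothing COM program is just INT 20h ("terminate program"), so each goat is that two-byte prologue plus NOP padding:

```python
# Minimal sketch (not VTC's actual tooling): generate benign "goat" files.
# A do-nothing DOS .COM program is just INT 20h ("terminate program"),
# so each goat is that two-byte prologue plus NOP padding to a known length.
import os

GOAT_SIZES = [512, 1024, 4096, 16384]   # illustrative lengths

def make_goat(path: str, size: int) -> None:
    body = b"\xCD\x20" + b"\x90" * (size - 2)   # INT 20h + NOP padding
    with open(path, "wb") as f:
        f.write(body)

if __name__ == "__main__":
    os.makedirs("goats", exist_ok=True)
    for size in GOAT_SIZES:
        make_goat(os.path.join("goats", f"goat_{size}.com"), size)
```

Infecting such goats of several types and lengths, generation by generation, gives the assurance described above that a sample really replicates.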

Page 11: The VTC experience

1.5c Background: Reverse Engineering Course: Dynamic Analysis: Observing Replication

• Hardware-based analysis: logic analyser; system tracing using special hardware (PERISCOPE): observe and control dynamic behaviour, performance monitoring

• Software-based analysis:
– Event tracing
– Interrupt observation: INTSPY
– HelpPC makes essential details (interrupts, BIOS and DOS structures, DOS commands, hardware specs) available

• Code tracing:
– Practising debugging (breakpoints, triggers etc.)
– Tool: SoftICE
– Problem: analysis of malware using anti-debugging methods

Page 12: The VTC experience

1.5d Background: Reverse Engineering Course: Basics of Code Analysis

• General: differential analysis: comparing infected vs. uninfected code

• 16-bit/W32 code: Disassembly (mostly: Sourcer) separation of code/data, library functions, documentation of code, ...

• Macro code: specialised decompiler (or „manual“ work)
– Macro viruses/worms exist in source (VBA) AND p-code; analysis must address BOTH VBA and p-code

– VBA: „high-level language“, easy to understand (and reprogram)

– p-code: generating source code with an editor

Remark: several viruses may reconstruct deleted source code from p-code

• Script code: specialised decompiler, mostly „manual“ work
Problem: VBS may deeply influence MS-W32 system structures; therefore, good system knowledge is required

Page 13: The VTC experience

1.5e Background: Reverse Engineering Course: Dynamic and Static Analysis: Understanding Camouflage

• Self-protection of malware against detection:
– Hiding interactions: e.g. replacement of interrupts

– Self-encrypting malware:
• Many viruses self-encrypt, with the decryption routine often the only constant code left for a scanner to recognise

– Oligo- and polymorphic (obfuscated) code:
• Change the static layout of code by changing its sequence (e.g. the order in which registers are loaded before a procedure invocation) where the semantics are not affected

• Oligomorphic code: few different variations of code (same effect)

• Polymorphic code: many different instantiations of code (same effect)

• Problem: malware „signatures“ are combinations of static code patterns (combined with AND, OR, NOT and wildcards) that help identify viruses and distinguish different „variants“ (see the sketch below)

• Such code requires specific detection routines (which slow the scanning process)
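To illustrate why wildcard signatures slow scanning down, here is a minimal sketch of byte-signature matching; the signature representation (None as a one-byte wildcard) is an assumption for illustration only, not any product's actual engine:

```python
# Minimal sketch of wildcard signature scanning (illustrative, not a real engine).
# A signature is a sequence of byte values, with None acting as a one-byte wildcard.
from typing import Optional, Sequence

def matches_at(data: bytes, pos: int, sig: Sequence[Optional[int]]) -> bool:
    if pos + len(sig) > len(data):
        return False
    return all(s is None or data[pos + i] == s for i, s in enumerate(sig))

def scan(data: bytes, sig: Sequence[Optional[int]]) -> Optional[int]:
    # Wildcards defeat fast substring search, so we fall back to a
    # position-by-position scan -- this is the slowdown noted above.
    for pos in range(len(data) - len(sig) + 1):
        if matches_at(data, pos, sig):
            return pos
    return None

# Example: match B8 ?? ?? CD 21 (MOV AX,imm16 followed by INT 21h)
SIG = [0xB8, None, None, 0xCD, 0x21]
print(scan(b"\x90\xB8\x00\x4C\xCD\x21", SIG))   # -> 1
```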

Page 14: The VTC experience

1.6 Background: Hamburg´s IT Security Curricula

Additional lectures on: Mathematical Cryptography (Prof. Kudlek), Data Protection etc.
Seminar on „Actual Problems of IT Security and Safety“ (every semester)
Practice in Reverse Engineering
Virus Test Center: practical student work with malware/tests
Other labs: biometric systems, secure chipcards
Examination work: about 100 diploma/master theses
Dissertations: e.g. Vesselin Bontchev on viruses; Klaus-Peter Kossakowski on principles of incident response systems; Morton Swimmer (2005) on new AV methods

Page 15: The VTC experience

1.7 Background: Hamburg´s IT Security Curricula

• Hamburg Bachelor Curriculum 2001-2006:
– Lecture (4 hours/week) for ALL students in 3rd year
– „Foundations of Secure/Safe Systems“ (GBI)
– 250 students per semester (mandatory)
– Essential elements:

• Legal requirements (data protection, crime law, SOX etc)

• Definition Security & Safety, technical requirements

• Survey of hacking intrusion techniques, malware

• Sources of InSecurity: paradigms, protocols, weaknesses, …

• Concepts of security: cryptography, secure OS & DB, Firewalls, Kerberos, AntiMalware, Intrusion Detection

• Risk Analysis / Risk Management

Page 16: The VTC experience

Agenda: Chapter 2

1. Background: Uni Hamburg´s IT Security Curriculum

2. Development of aVTC @ Uni-HH

3. Methods used in aVTC tests

4. Demand for inherently secure Systems

Page 17: The VTC experience

2.1 Development of aVTC @ Uni-Hamburg

Phase 0: 1978: engagement in technical aspects of data protection; seminars, lectures and diploma theses

Phase 1: 1987: analysis of the Jerusalem virus (received from Hebrew University)

1st antivirus: Morton Swimmer: „antiJeru.exe“

KGB hack/K-P Kossakowski: foundation of 1st CERT

1991: Michelangelo virus: antiMich.exe distributed ~30k times

Phase 2: 1990-1995: Vesselin Bontchev: 1st professional AV tests

Vesselin is „Best Teller of this Saga“

Phase 3: 1994-2004: VTC established with student testers

1997/98: 1st malware test (against the protest of some AV companies)

October 2004: aVTC closed (Prof. emeritus – no more students)

Page 18: The VTC experience

2.2 Survey of tests at aVTC @ Uni-Hamburg

Scanner test July 2004
Scanner test April 2003
Scanner test December 2002 "Heureka-2"
Scanner test March 2002
Scanner test October 2001 "Heureka(-1)"
Scanner test July 2001
Scanner test April 2001
AntiVirus Repair Test (ART 2000-11)
Comment on Sophos' reaction to VTC test report August 2000
Scanner test August 2000
Scanner test April 2000
Pre-released Scanner test February 2000
Scanner test September 1999
Scanner test March 1999
Scanner test October 1998
Scanner test Computer Bild (June 1998)
Scanner test February 1998
Scanner test July 1997
Scanner test February 1997
Scanner test July 1994

Scanner test July 2005: detection of mobile viruses (diploma thesis)

Page 19: The VTC experience

Agenda: Chapter 3

1. Background: Uni Hamburg´s IT Security Curricula

2. Development of aVTC @ Uni-HH

3. Methods used in aVTC tests
3A Survey of methods
3B Survey of test results
3C Lessons learned

4. Demand for inherently secure Systems

Page 20: The VTC experience

3A.1 Test System: Lab Network

[Diagram: lab network with a Windows NT server and clients 1-3 running DOS, Win 95, Win NT and WXP, connected via 100 Mbit Ethernet using Microsoft NetBEUI]

Page 21: The VTC experience

3A.2a Test server:

Win-NT Server (1) hardware:
Pentium 200 MHz, 64 MB RAM, 2 GB hard disk (boot)
2*4.3 GB data/reports, 2*9.1 GB virus database (mirror)
3 network cards: 2*100 MBit/sec, 1*10 MBit/sec
Protected against electrical faults (UPS: APC 420 VA)
Operating system: Windows NT Server 4.0 SP 6

Network:
1*10 MBit/sec BNC for 20 DOS clients
1*100 MBit/sec via 2 cascaded switches for all other clients with 10 MBit/sec cards
1*100 MBit/sec via a 100 MBit/sec hub for the remaining clients

Page 22: The VTC experience

3A.2b Test clients:

Windows clients (9) have the following hardware:
2* Pentium 133 MHz, 64 MB RAM, 2 GB hard disk, 10 MBit/sec
Pentium 90 MHz, 32 MB RAM, 1 GB hard disk, 100 MBit/sec
Pentium-II 350 MHz, 64 MB RAM, 2 GB hard disk, 100 MBit/sec
Pentium 233 MMX MHz, 64 MB RAM, 2 GB hard disk, 100 MBit/s
Pentium-II 233 MHz, 64 MB RAM, 4 GB hard disk, 100 MBit/s
Pentium-II 350 MHz, 64 MB RAM, 4 GB hard disk, 100 MBit/s
Pentium MMX 233 MHz, 196 MB RAM, 4 GB hard disk, 100 MB/s
Pentium III, 128 MB RAM, 4 GB hard disk, 100 MBit/sec
2* Pentium IV 1.7 GHz, 512 MB RAM, 40 GB hard disk, 100 MBit/s

Page 23: The VTC experience

3A.3 Test System: Databases

Boot virus database:
- saved as images of boot sectors and master boot records
- file extensions: boo, img, mbr

File virus database:
- file extensions: COM, EXE, CMD, SYS, BAT
- the directory structure is created out of the virus names
- the files are in their original structure

Page 24: The VTC experience

3A.4 Test System: Directory structure

Main directories:

CARO: all three main scanners identify the virus identically; this implies that the CARO naming conventions are valid

NYETCARO: one or two scanners identified the virus

UNKNOWN: none of the three scanners identified the virus, but the files replicate

• In early tests:
– OS/2: viruses natively working under OS/2
– WINDOWS 95: viruses natively working under Windows 95

Page 25: The VTC experience

3A.5 Early Test System Size (1997)

Boot virus database: images: 3,910; viruses: 1,004

File virus database: files: 84,236; viruses: 13,014

Macro virus database: files: 2,676; viruses: 1,017

Macro malware database: files: 61; malware: 89

File malware database: files: 213; malware: 163

Page 26: The VTC experience

3A.6b Test System: Size April 2003

"Full Zoo": 21,790 File Viruses in 158,747 infected files 8,001 different File Malware in 18,277 files 664 Clean file objects for False Positive test 7,306 Macro Viruses in 25,231 infected docs

450 different Macro Malware in 747 macro objects 329 Clean macro objects for False Positive test 823 different script viruses in 1,574 infected objects 117 different script malware in 202 macro objects

"ITW Zoo": 11 Boot Viruses in 149 infected images/sectors 50 File Viruses in 443 infected files 124 Macro Viruses in 1,337 infected documents

20 Script Viruses in 122 infected objects

Page 27: The VTC experience

3A.7a Preprocessing of new objects (#1/4)

Unzip the archives

Reset all file attributes

Sort all files into main categories (boot, file, macro)

Restore the normal file extensions (e.g. .EX_ ==> .EXE)

Page 28: The VTC experience

3A.7b Preprocessing of new objects (#2/4)

Remove all known non-viruses with the Dustbin tool

Search for duplicate files (binary identical); a sketch follows below:
First step: only the new files
Second step: new files and old database
Third step: delete all duplicate files
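The original duplicate-search tooling is not described in detail; as a hedged illustration, a content-hash approach in Python might look like this (directory names and the use of SHA-256 are assumptions):

```python
# Sketch of the three-step binary-duplicate search (directory names and the
# use of SHA-256 are assumptions; the original tooling is not documented).
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def dedup(new_dir: str, database_dir: str) -> None:
    seen: set[str] = set()
    # Everything already in the old database counts as "seen" (second step).
    for p in Path(database_dir).rglob("*"):
        if p.is_file():
            seen.add(sha256_of(p))
    # Hash the new files (first step) and delete binary-identical
    # duplicates (third step).
    for p in sorted(Path(new_dir).rglob("*")):
        if p.is_file():
            digest = sha256_of(p)
            if digest in seen:
                p.unlink()          # duplicate: remove
            else:
                seen.add(digest)    # first copy: keep

# dedup("incoming", "virusdb")
```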

Replication of all new files to test if they are „alive“ (partially applied in test 1997-07)

Page 29: The VTC experience

3A.7c Preprocessing of new objects (#3/4)

Scan new files and previous databases with F-Prot, Dr. Solomon and AVP to create report files

Move the non-viruses (trojans, droppers, germs) into a special directory

Preprocess the reports using CARO.BAT

If a virus is operating-system specific, it is sorted into the corresponding subdirectory below the specific OS-Directory (Win95, WinNT, OS/2)

Page 30: The VTC experience

3A.7d How CARO.BAT works (#4/4):

The subdirectory name is created out of the virus name.

The dots between the family name, sub-family, main variant and sub-variant are substituted with backslashes.

All characters except a-z, 0-9, „-“ and „_“ are substituted with „_“.

If a file with the same name already exists, the new file in this directory is renamed.

If F-Prot identifies a virus by name, the file is moved into the corresponding subdirectory below the NYETCARO directory

If Dr. Solomon identifies a virus by name, the file is moved into the corresponding subdirectory below the NYETCARO directory

If AVP identifies a virus by name, the file is moved into the corresponding subdirectory below the NYETCARO directory

If all three scanners identify a virus by the same name, the file is moved into the corresponding subdirectory below the CARO directory (a sketch of this logic follows below)
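CARO.BAT itself is not reproduced in the slides; the following Python sketch re-creates the described logic (the scanner-report representation, a mapping from scanner to reported name, is a hypothetical simplification, and the NYETCARO subdirectory choice is likewise simplified):

```python
# Sketch of CARO.BAT's sorting logic (hypothetical re-creation in Python).
import re

def name_to_subdir(virus_name: str) -> str:
    # Dots separating family / sub-family / variant become path separators...
    parts = virus_name.split(".")
    # ...and anything outside a-z, 0-9, "-", "_" is replaced by "_".
    return "\\".join(re.sub(r"[^a-z0-9_-]", "_", p.lower()) for p in parts)

def classify(names: dict[str, str]) -> str:
    """names maps scanner -> reported virus name (only scanners that detected)."""
    reported = set(names.values())
    if len(names) == 3 and len(reported) == 1:
        target = "CARO"            # all three scanners agree on one name
    elif names:
        target = "NYETCARO"        # only one or two scanners identified it
    else:
        target = "UNKNOWN"         # none detected, but the file replicates
    sample = next(iter(reported), "")
    return target + ("\\" + name_to_subdir(sample) if sample else "")

print(classify({"F-Prot": "W97M.Class.A",
                "DrSolomon": "W97M.Class.A",
                "AVP": "W97M.Class.A"}))
# -> CARO\w97m\class\a
```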

Page 31: The VTC experience

3A.8 Test Procedures: Testing boot viruses

For practical reasons, no infected floppy disks are tested (method for such tests available but not practiced).

1.) Using SIMBOOT:
• used to scan the boot images
• simulates changing of infected floppy disks
• simulates the user input to scan the next floppy disk

2.) If SIMBOOT fails, direct test:
• scan the images directly

Remark: several AV products crash under SIMBOOT.

Page 32: The VTC experience

3A.9 Test Procedures: Testing file/macro viruses

Heuristic mode

Reports only (no repair)

Experience: some scanners crash upon detecting viruses improperly

Scan small amounts of files (it's easier to restart the scanner):
• CARO
• NYETCARO\A
• NYETCARO\B
• ...
• NYETCARO\Z
• UNKNOWN
• OS/2 (early tests)
• WINDOWS 95
• Windows NT
• Windows XP

Page 33: The VTC experience

3A.10 Test Procedures for file/macro viruses

Start Test-Version of the OS

Install scanner

Scan and save report to the network

Reboot with Master System

Delete Test-Version and restore from backup

Start again from the beginning (a sketch of this cycle follows below)
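The VTC ran this cycle with manual steps and batch tooling; purely as an illustration of the loop structure, a hypothetical orchestration could look like this (every command name below is a placeholder, not a real tool):

```python
# Hypothetical orchestration of the per-scanner test cycle.
# Every command invoked here is a placeholder, not a real tool.
import subprocess

SCANNERS = ["scanner_a", "scanner_b"]            # placeholder product list

def run(cmd: list[str]) -> None:
    subprocess.run(cmd, check=True)              # stop the cycle on failure

for scanner in SCANNERS:
    run(["restore-test-os", "--from-backup"])    # fresh Test-Version of the OS
    run(["install-scanner", scanner])
    run(["scan-testbed", "--report", f"//server/reports/{scanner}.txt"])
    run(["reboot-into-master-system"])           # leave the possibly infected OS
    run(["delete-test-version"])                 # then restore from backup again
```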

Page 34: The VTC experience

3A.11 Test Results, Evaluation

1) UNIX tools and AWK scripts are used to evaluate the reports; when scanner diagnostics change, the scripts must be adapted.

2) Create an alphabetical list containing, for each directory, the directory name and the number of files in it

3) Analyse how many files are scanned and recognized for each scanner report.

4) Sort and join the reports (directory listing vs. preprocessed scanner report; see the sketch below)

5) Evaluate the joined report

6) Quality assurance
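The original AWK scripts are not preserved in the slides; as a rough Python equivalent of steps 2) to 5), under the assumption that a scanner report contains one "path: status" line per scanned file, the evaluation might look like this:

```python
# Illustrative re-creation of the evaluation steps (formats are assumptions;
# the original used UNIX tools and AWK scripts).
from collections import defaultdict
from pathlib import Path

def count_files_per_dir(testbed: str) -> dict[str, int]:
    # Step 2: directory name -> number of files in it.
    counts: dict[str, int] = defaultdict(int)
    for p in Path(testbed).rglob("*"):
        if p.is_file():
            counts[str(p.parent)] += 1
    return dict(counts)

def count_detections(report: str) -> dict[str, int]:
    # Step 3: parse a scanner report with assumed "path: status" lines.
    detected: dict[str, int] = defaultdict(int)
    for line in Path(report).read_text(errors="replace").splitlines():
        path, _, status = line.partition(": ")
        if "infect" in status.lower():      # assumed detection keyword
            detected[str(Path(path).parent)] += 1
    return dict(detected)

def evaluate(testbed: str, report: str) -> None:
    # Steps 4-5: join per-directory totals with per-directory detections.
    totals, hits = count_files_per_dir(testbed), count_detections(report)
    for d in sorted(totals):
        n, k = totals[d], hits.get(d, 0)
        print(f"{d}: {k}/{n} detected ({100.0 * k / n:.1f}%)")
```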

Page 35: The VTC experience

3B.1a Test results (e.g. 2003-04: 1st WXP test)

0README.1ST   - Latest notes
0XECSUM.TXT   - Executive Summary of Test Report 2003-04
1CONTENT.TXT  - This file
2PROLOG.TXT   - Background, aims of this test
3INTRO.TXT    - Introduction: background, aims, development of VTC tests
4TESTCON.TXT  - Conditions which a scanner must fulfil in order to be tested
5PROTOCO.TXT  - Detailed description of VTC test protocols
6jWXP.TXT     - Detailed results: Windows-XP file, macro and script virus and malware results
6mCMP32.TXT   - Detailed results: comparison of 32-bit results from tests 2002-12 and 2003-04 (Win-XP, Win-98, Win-2k)
7EVAL-WXP.TXT - Windows-XP results: evaluation, grading of WXP products
7EVAL-CMP.TXT - W32 platforms: comparison, evaluation, grading of W32 products
8PROBLMS.TXT  - Problems and bugs experienced during tests
9EPILOG.TXT   - Summary, future test plans, and final comment
DISCLAIM.TXT  - Disclaimer: about usage of this document

Page 36: The VTC experience

3B.1b Test report structure (cont)

Evidence for reproducibility of test results:

A1ITW00b.TXT - "In-The-Wild" list of PC viruses (October 2002: Wildlist.org)
A2SCANLS.TXT - List of scanners/versions and parameters, including information on producer
A4TSTDIR.TXT - Directory of A3TSTBEDs (content of A3TSTBED.zip)
A5CODNAM.TXT - Code names of AV products in VTC tests

Separate appendix:
A3TSTBED.ZIP - Index of file, macro, script virus & infected object databases, both full and "In-The-Wild"; index of macro and script malware databases; and index of non-viral and non-malicious objects used in the False-Positive test (all pkZIPped)

Page 37: The VTC experience

3B.2 Development of testbeds:

Test    | File Viruses/Malware        | Boot Viruses    | Macro Viruses/Malware      | Script Viruses
date    | viruses  objects  malware   | viruses objects | viruses objects  malware   | viruses objects
--------+-----------------------------+-----------------+----------------------------+-----------------
1997-07 |  12,826   83,910      213   |    938   3,387  |    617    2,036      72    |
1998-03 |  14,596  106,470      323   |  1,071   4,464  |  1,548    4,436     459    |
1998-10 |  13,993  112,038    3,300   |    881   4,804  |  2,159    9,033     191    |
1999-09 |  17,561  132,576    6,217   |  1,237   5,286  |  3,546    9,731     329    |
2000-04 |  18,359  135,907    6,639   |  1,237   5,379  |  4,525   12,918     260    |
2000-08 |                             |                 |  5,418   15,720     500    |    306
2001-04 |  20,564  140,703   12,160   |  1,311   5,723  |  6,233   19,387     627    |    477
2002-12 |  21,790  158,747   18,277   |                 |  7,306   25,231            |    823   1,574
2003-04 |  21,790  158,747   18,277   |                 |  7,306   25,231            |    823   1,574

Page 38: The VTC experience

3B.3 Example of test result: File/Macro/Script Zoo Virus Detection Rates

Scanner |  File Virus  |  Macro Virus  |  Script Virus
        |  Detection   |  Detection    |  Detection
--------+--------------+---------------+---------------
AVP     |    100.~     |    100.~      |     98.9
BDF     |     82.9     |     99.0      |     72.4
CMD     |     98.5     |     99.9      |     89.1
DRW     |     98.3     |     99.4      |     94.7
FSE     |    100.~     |    100.~      |     99.5
INO     |     98.7     |     99.9      |     94.7
NAV     |     98.3     |     99.6      |     96.8
NVC     |     97.8     |     99.8      |     87.6
RAV     |     96.7     |     99.9      |     96.1
SCN     |     99.8     |    100.0      |     99.6
--------+--------------+---------------+---------------
Mean    |     97.1%    |     99.8%     |     92.9%
Mean>10%|     97.1%    |     99.8%     |     92.9%

Student testers preferred and developed graphical representations (see the following slides).

Page 39: The VTC experience

[Chart: Antivirus test 2002-12 - detection rates under Windows 2000]

Page 40: The VTC experience

[Chart: Antivirus test 2002-12 - detection rates under Windows 2000]

Page 41: The VTC experience

[Chart: Antivirus test 2002-12 - detection rates under Windows 2000]

Page 42: The VTC experience

3B.4a Grading of AV/AM products:

Definition (1): A "Perfect AntiVirus (AV) product"
--------------------------------------------------
1) Will detect ALL viral samples "In-The-Wild" AND at least 99.9% of zoo samples, in ALL categories (file, boot, macro and script-based viruses), always with the same high precision of identification and in every infected sample,
2) Will detect ALL ITW viral samples in compressed objects for all (now: 5) popular packers, and
3) Will NEVER issue a False Positive alarm on any sample which is not viral.

Remark: detection of "exotic viruses" is presently NOT rated.

Page 43: The VTC experience

3B.4b Grading of AV/AM products:

Definition (2): A "Perfect AntiMalware (AM) product"
----------------------------------------------------
1) Will be a "Perfect AntiVirus product", that is:
   100% ITW detection
   AND >99% zoo detection
   AND high precision of identification
   AND high precision of detection
   AND 100% detection of ITW viruses in compressed objects,
   AND 0% False-Positive rate,
2) AND it will also detect essential forms of malicious software, at least in unpacked forms, reliably at high rates (>90%).

Remark: detection of "exotic malware" is presently NOT rated.

Page 44: The VTC experience

3B.4c Example of product grading

Test category:            "Perfect"                              "Excellent"
----------------------------------------------------------------------------------------------
WXP file ITW test:        AVP,DRW,FSE,NAV,SCN                    INO,RAV
WXP macro ITW test:       AVP,DRW,FSE,INO,NAV,SCN                BDF,CMD,NVC,RAV
WXP script ITW test:      AVP,CMD,DRW,INO                        FSE,NAV,NVC,RAV,SCN
----------------------------------------------------------------------------------------------
WXP file zoo test:        ---                                    AVP,FSE,SCN
WXP macro zoo test:       SCN                                    AVP,FSE,CMD,INO,RAV,NAV,NVC,DRW,BDF
WXP script zoo test:      ---                                    SCN,FSE
----------------------------------------------------------------------------------------------
WXP file pack test:       AVP,FSE,SCN                            DRW
WXP macro pack test:      SCN                                    AVP,DRW
----------------------------------------------------------------------------------------------
WXP file FP avoidance:    AVP,BDF,CMD,FSE,INO,NAV,NVC,RAV,SCN    DRW
WXP macro FP avoidance:   BDF,INO,NAV,SCN                        RAV
----------------------------------------------------------------------------------------------
WXP file malware test:    ---                                    FSE,AVP,SCN
WXP macro malware test:   AVP,FSE,SCN                            CMD,RAV,NVC,INO,NAV,BDF,DRW
WXP script malware test:  ---                                    SCN,FSE,AVP,NAV
----------------------------------------------------------------------------------------------

Page 45: The VTC experience

3B.4d Example of product grading

************************************************************
"Perfect" Windows-XP AntiVirus product: =NONE= (20 points)
"Excellent" Windows-XP AntiVirus products:
1st place: SCN (18 points)
2nd place: AVP, FSE (13 points)
4th place: NAV (11 points)
5th place: DRW (10 points)
6th place: INO (9 points)
7th place: RAV (8 points)
8th place: BDF, CMD, NVC (6 points)
************************************************************
"Perfect" Windows-XP AntiMalware product: =NONE= (26 points)
"Excellent" Windows-XP AntiMalware products:
1st place: SCN (22 points)
2nd place: AVP, FSE (17 points)
4th place: NAV (13 points)
5th place: DRW (11 points)
6th place: INO (10 points)
7th place: RAV (9 points)
8th place: BDF, CMD, NVC (7 points)
************************************************************
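The point maxima above are consistent with a simple scheme: the ten AV categories (ITW, zoo, packer and false-positive tests) at 2 points per "perfect" rating give the 20-point maximum, and the three additional malware categories give the 26-point AM maximum. The sketch below computes scores under the assumption of 2 points per "perfect" and 1 per "excellent" rating; this is an inference from the tables, not documented VTC code:

```python
# Hypothetical reconstruction of the grading arithmetic: 2 points per "perfect"
# category, 1 per "excellent". Ten AV categories give the 20-point maximum;
# adding the three malware categories gives the 26-point AM maximum.
AV_CATEGORIES = ["file ITW", "macro ITW", "script ITW",
                 "file zoo", "macro zoo", "script zoo",
                 "file pack", "macro pack",
                 "file FP", "macro FP"]
AM_CATEGORIES = AV_CATEGORIES + ["file malware", "macro malware", "script malware"]

def score(ratings: dict[str, str], categories: list[str]) -> int:
    points = {"perfect": 2, "excellent": 1}
    return sum(points.get(ratings.get(c, ""), 0) for c in categories)

# A product rated "perfect" everywhere reaches both maxima:
perfect = {c: "perfect" for c in AM_CATEGORIES}
print(score(perfect, AV_CATEGORIES), score(perfect, AM_CATEGORIES))   # -> 20 26
```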

Page 46: The VTC experience

3B.5a Symbian MobilePhone Malware: Threats

Advent of Mobile Malware:
- Platforms (Symbian, EPOC, ...) conceived to support easy implementation of applications

- Programming in script languages, no exclusion of potentially harmful functions

- Example platform: Symbian OS
- Presently known: 12 different strains (families) of self-replicating (=viral) or non-self-replicating (=trojanic) malware, with 100 variants or modifications

- Malicious functions: most specimens are „proof-of-concept“ malware (viruses/trojans), but some have a dangerous payload

- Examples of dangerous payloads:
- reorganise the dictionary of telephone numbers
- send an MMS to every entry in the telephone dictionary (the real „payload“)

Page 47: The VTC experience

3B.5b Symbian MobilePhone Malware Test: Products

14 products in aVTC test (versions: May 2005):

ANT AntiVir (H&B EDV) (Germany)

AVA AVAST (32) (Czech Republic)

AVG Grisoft Antivirus (Czech Republic)

AVK AntiVirus Kit (GData) (Germany/Russia)

AVP AVP (Platinum) (Russia)

BDF BitDefender (AntiVirus eXpert) (Romania)

FPW FProt FP-WIN (Iceland)

FSE F-Secure AntiVirus (Finland)

IKA Ikarus Antivirus (Austria)

MKS MKS_vir 2005 (Poland)

NAV Norton AntiVirus/Symantec (USA)

NVC Norman Virus Control (Norway)

SCN NAI VirusScan/McAfee (USA)

SWP Sophos AntiVirus (Sweep) (UK)

Page 48: The VTC experience

3B.5c Symbian MobilePhone Malware Test: Testbed

Testbed (all specimens known May 12, 2005):

Cabir: 22 variants (a ... v), 1 dropper (installing variants .b, .c, .d)
Commwarrior: 2 variants (a-b)
Dampig: 1 variant (a)
Drever: 3 variants (a-c)
Fontal: 1 variant (a)
Hobbes: 1 variant (a)
Lasco: 1 variant (a)
Locknut: 2 variants (a, b)
Mabir: 1 variant (a)
MGDropper (Metal Gear trojan): 1 variant (a)
Mosquitos: 1 variant (a)
Skulls: 11 variants (a-k); 52 modifications of Skulls.D

12 strains (=„families“) with 100 variants/modifications.

Page 49: The VTC experience

3B.5d Symbian MobilePhone Malware Test: Results

Rank  Product  Detected (of 135 samples)  Detection rate (%)  Grade
( 6)  ANT        92                        68.15               Risky
( 6)  AVA        53                        39.26               Risky
( 6)  AVG       119                        88.15               Risky
( 2)  AVK       131                        97.04               Very Good
( 1)  AVP       134                        99.26               Excellent
( 4)  BDF       126                        93.33               Good
(13)  FPW        13                         9.63               Unacceptable
( 2)  FSE       132                        97.78               Very Good
( 6)  IKA        57                        42.22               Risky
( 6)  MKS        55                        40.74               Risky
( 6)  NAV        81                        60.00               Risky
(13)  NVC         5                         3.70               Unacceptable
( 4)  SCN       123                        91.11               Good
( 6)  SWP        60                        44.44               Risky

Page 50: The VTC experience

3C Lessons learned for AV-Test Centers

1) Continuous improvement of knowledge & skills of AV test personnel (courses, events)

2) Publish test methods in detail, as a basis for analysis of test methods & discourse about improvements

3) Work with trusted AV companies but avoid dependency

4) A sound test database requires uniform naming as well as proper quality assurance measures (including publication of problems, even one's own failures)

5) Send test results (incl. missed samples) to AV companies for analysis & verification of results.

6) Publish all details of tests (methods, problems, findings) to allow for expert analysis (NOT samples!)

Page 51: The VTC experience

Agenda: Chapter 4

1. Background: Hamburg´s IT Security Curriculum

2. Development of aVTC @ Uni-HH

3. Methods used in aVTC tests

4. Demand for inherently secure systems

Page 52: The VTC experience

4.1 Contemporary Solution: „Tower of IT“

[Diagram: „Tower of IT“ protection zones: an unprotected WAN (Zone Red: NO PROTECTION), from which malicious information arrives, is separated from the protected LAN (Zone Blue: Hi-Protection) by a perimeter of KryptoBox, Firewall, Intrusion Detection and AntiMalware components (Zone Yellow: Partial Protection)]

Page 53: The VTC experience

4.2 Requirements for Inherently Safe&Secure Systems

Basic requirement: for all IT systems in a ubiquitous network (including devices in personal contact), manufacturers must specify and guarantee essential functions and features.

Requirement #1: „SafeComputing“ (SC): SC architecture guarantees: functionality of processes, persistence & integrity of objects, encapsulation of processes, graceful degradation (!), benign recovery (!)

Requirement #2: „SecureNetworking“ (SN): SN protocol guarantees: confidentiality, integrity, authenticity of sender/receiver, reliability of transfer, non-repudiation (!), non-deniability (!)

Requirement #3: Assurance of functional adequacy: all functions and features must be specified and implemented in a way that permits adequate assurance of the specifications.

Page 54: The VTC experience

4.3 Residual Risks in Ubiquitous Computing

Future Secure and InSecure Networlds:

[Diagram: a stand-alone system shows only local anomalies; in the FreeNetwork there is no protection against attacks; the SecureNetwork is inherently secure against attacks, protected against the import of anomalies, attacks and flooding, and shows no anomalies]

Page 55: The VTC experience

4.4 Enforcement of Inherent Security

Path #1: IT manufacturers establish and enforce adequate quality and standards.

Example: steam boiler quality enforced through the „Dampfkessel-Überwachungs-Verein“ (steam boiler inspection association; now: TÜV).

Presently, no such self-organisation of the ICT industry is available.

Path #2: Directives (EU, presidential) and laws enforce protection of customers (persons AND enterprises), including damage compensation and preventive actions.

Example: consumer protection legislation in the USA etc. following Nader's book „Unsafe at Any Speed“.