CMSC 414 Computer and Network Security, Lecture 25
Jonathan Katz

Transcript
Page 1:

CMSC 414 Computer and Network Security

Lecture 25

Jonathan Katz

Page 2:

Secure programming techniques
(Based on: “Programming Secure Applications for Unix-Like Systems,” David Wheeler)

Page 3:

Overview

Validate all input

Avoid buffer overflows

Program internals…

Careful calls to other resources

Send info back intelligently

Page 4:

Validating input

Determine acceptable input, check for match --- don’t just check against list of “non-matches”
– Limit maximum length
– Watch out for special characters, escape chars.

Check bounds on integer values
– E.g., sendmail bug…
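The first bullet in concrete form: a minimal C sketch (the field name, character set, and bounds are illustrative assumptions, not from the slides) that accepts a username only if it matches a whitelist and a length limit, and bounds-checks an integer value before use.

    #include <ctype.h>
    #include <stdio.h>
    #include <string.h>

    /* Whitelist check: accept only lowercase letters and digits, 1..16 chars.
       Accepting known-good input is safer than rejecting known-bad input. */
    int valid_username(const char *s) {
        size_t len = strlen(s);
        if (len == 0 || len > 16)          /* limit maximum length */
            return 0;
        for (size_t i = 0; i < len; i++)
            if (!islower((unsigned char)s[i]) && !isdigit((unsigned char)s[i]))
                return 0;                  /* rejects '.', '/', '*', shell metacharacters, ... */
        return 1;
    }

    /* Bounds check on an integer value before it is used (cf. the sendmail bug). */
    int valid_count(long n) {
        return n >= 0 && n <= 1000;
    }

    int main(void) {
        printf("%d %d\n", valid_username("alice42"), valid_username("../etc/passwd"));
        printf("%d\n", valid_count(-1));
        return 0;
    }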

Page 5:

Validating input

Filenames
– Disallow *, .., etc.

HTML, URLs, cookies
– Cf. cross-site scripting attacks

Command-line arguments
– Even argv[0]…

Config files

Page 6:

Avoiding buffer overflows

Use arrays instead of pointers (cf. Java)

Avoid strcpy(), strcat(), etc.
– Use strncpy(), strncat(), instead (sketch below)
– Even these are not perfect… (e.g., no null termination)

Make buffers (slightly) longer than necessary to avoid “off-by-one” errors
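A small C sketch of the points above (buffer sizes and the input string are made up): strncpy() is followed by an explicit null terminator, and snprintf() is shown as an alternative that always terminates and makes truncation easy to detect.

    #include <stdio.h>
    #include <string.h>

    int main(void) {
        const char *input = "a possibly attacker-controlled string";
        char buf[16];
        char buf2[16];

        /* strncpy() does not null-terminate when the source fills the buffer,
           so terminate explicitly. */
        strncpy(buf, input, sizeof(buf) - 1);
        buf[sizeof(buf) - 1] = '\0';

        /* snprintf() always null-terminates and returns the length it would
           have written, which makes truncation easy to detect. */
        int needed = snprintf(buf2, sizeof(buf2), "%s", input);
        if (needed >= (int)sizeof(buf2))
            fprintf(stderr, "input was truncated\n");

        printf("%s\n%s\n", buf, buf2);
        return 0;
    }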

Page 7:

Program internals…

Avoid race conditions (sketch below)

– E.g., authorizing file access, then opening file

Watch out for temporary files in shared directories (e.g., /tmp)

Watch out for “spoofed” IP addresses/email addresses

Simple, open design; fail-safe defaults; complete mediation; etc.

Don’t write your own crypto algorithms
– Use crypto appropriately
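A sketch of the race-condition and temporary-file points (the filename and template are illustrative): open the file first and inspect the already-open descriptor with fstat() instead of checking and then opening, and let mkstemp() create the temporary file in /tmp atomically.

    #define _GNU_SOURCE            /* for O_NOFOLLOW/mkstemp on some systems */
    #include <stdio.h>
    #include <stdlib.h>
    #include <unistd.h>
    #include <fcntl.h>
    #include <sys/stat.h>

    int main(void) {
        /* Race-prone pattern: access("data.txt", W_OK) followed by
           open("data.txt", ...) lets an attacker swap in a symlink between
           the two calls. Instead, open first and check the descriptor. */
        int fd = open("data.txt", O_WRONLY | O_CREAT | O_NOFOLLOW, 0600);
        if (fd >= 0) {
            struct stat st;
            if (fstat(fd, &st) == 0 && S_ISREG(st.st_mode)) {
                /* safe to write through fd here */
            }
            close(fd);
        }

        /* Temporary files in shared directories: mkstemp() picks a unique
           name and creates the file atomically with mode 0600. */
        char tmpl[] = "/tmp/myapp-XXXXXX";
        int tfd = mkstemp(tmpl);
        if (tfd >= 0) {
            unlink(tmpl);          /* keep the descriptor, discard the name */
            close(tfd);
        }
        return 0;
    }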

Page 8:

Calling other resources

Use only “safe” library routines

Limit call parameters to valid values
– Avoid metacharacters

Avoid calling the shell
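A sketch of the last point, with a made-up helper name: instead of building a command string for system(), which would let shell metacharacters in the argument run arbitrary commands, the argument is passed directly to the program via fork()/execvp().

    #include <stdio.h>
    #include <stdlib.h>
    #include <unistd.h>
    #include <sys/types.h>
    #include <sys/wait.h>

    /* Instead of system("ls -l <userdir>"), which hands user input to the
       shell (where ;, |, $() and other metacharacters are interpreted),
       pass the argument directly to the program. */
    int list_directory(const char *userdir) {
        pid_t pid = fork();
        if (pid < 0)
            return -1;
        if (pid == 0) {
            char *const argv[] = { "ls", "-l", "--", (char *)userdir, NULL };
            execvp("ls", argv);
            _exit(127);            /* only reached if exec failed */
        }
        int status;
        if (waitpid(pid, &status, 0) < 0)
            return -1;
        return WIFEXITED(status) ? WEXITSTATUS(status) : -1;
    }

    int main(void) {
        return list_directory("/tmp") != 0;
    }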

Page 9:

User output

Minimize feedback
– Don’t explain failures to untrusted users
– Don’t release version numbers…
– Don’t offer “too much” help (suggested filenames, etc.)

Don’t use printf(userInput)
– Use printf(“%s”, userInput) instead…
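The last bullet as a runnable fragment (the hostile-looking default input is only for illustration):

    #include <stdio.h>

    int main(int argc, char **argv) {
        const char *userInput = (argc > 1) ? argv[1] : "hello %x %x %n";

        /* printf(userInput) would interpret %x, %n, etc. inside the input,
           letting an attacker read the stack or even write to memory. */
        printf("%s\n", userInput);     /* input is treated purely as data */
        fputs(userInput, stdout);      /* an equally safe alternative */
        putchar('\n');
        return 0;
    }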

Page 10:

Source code scanners

Used to check source code
– E.g., flawfinder, cqual

“Static” analysis vs. “dynamic” analysis
– Not perfect
– Dynamic analysis can slow down execution, lead to bloated code
– Will see examples of dynamic analysis later…

Page 11:

“Higher-level” techniques

Page 12:

Addressing buffer overflows

Basic stack exploit can be prevented by marking stack segment as non-executable, or randomizing stack location.
– Code patches exist for Linux and Solaris.
– Some complications on x86.

Problems:
– Does not defend against `return-to-libc’ exploit.
• Overflow sets ret-addr to address of libc function.
– Some apps need executable stack (e.g., LISP interpreters).
– Does not block more general overflow exploits:
• Overflow on heap: overflow buffer next to func pointer.

Patch not shipped by default for Linux and Solaris

Page 13:

Run-time checking: StackGuard

Embed “canaries” in stack frames and verify their integrity prior to function return

[Figure: two stack frames (Frame 1, Frame 2), each containing a string buffer (str), return address (ret), saved frame pointer (sfp), local variables, and a canary; top of stack at the left.]

Page 14:

Canary types

Random canary: (used in Visual Studio 2003)

– Choose random string at program startup.
– Insert canary string into every stack frame.
– Verify canary before returning from function.
– To corrupt random canary, attacker must learn current random string.

Terminator canary: Canary = 0, newline, linefeed, EOF

– String functions will not copy beyond terminator.
– Attacker cannot use string functions to corrupt stack
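A hand-written C approximation of the canary check. In real StackGuard (and in GCC's later -fstack-protector) the compiler emits this logic, C makes no guarantee about where `canary` sits relative to `buf`, and the fixed value below stands in for the random string chosen at startup; treat this purely as a sketch of the idea.

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    /* Stand-in for the random value StackGuard would pick at program startup. */
    static unsigned long stack_canary = 0x2b4d2b4dUL;

    void handle(const char *input) {
        unsigned long canary = stack_canary;   /* copy placed among the locals */
        char buf[64];

        strcpy(buf, input);                    /* deliberately unsafe, for illustration */

        /* Verify the canary before returning: an overflow of buf that reached
           the saved frame pointer/return address would have changed it. */
        if (canary != stack_canary) {
            fprintf(stderr, "stack smashing detected\n");
            abort();
        }
    }

    int main(void) {
        handle("short, harmless input");
        puts("ok");
        return 0;
    }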

Page 15:

Canaries, continued…

StackGuard implemented as a GCC patch
– Program must be recompiled

Minimal performance effects:

Not foolproof…

Page 16:

Run-time checking: Libsafe

Intercepts calls to strcpy(dest, src)
– Validates sufficient space in current stack frame: |frame-pointer – dest| > strlen(src)
– If so, does strcpy. Otherwise, terminates application (sketch below)

[Figure: stack layout with libsafe's frame (dest, ret-addr, sfp) above main's frame (src buf, ret-addr, sfp); top of stack at the left.]
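A rough sketch of the check described above. The wrapper name is invented, __builtin_frame_address() is a GCC/Clang extension, and the real Libsafe is a preloaded library that walks saved frame pointers rather than source code; compile without inlining or frame-pointer omission for the sketch to behave as described.

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    /* If dest lies in the caller's stack frame, the copy must fit below the
       saved frame pointer, or the return address could be overwritten. */
    char *checked_strcpy(char *dest, const char *src) {
        char *fp = (char *)__builtin_frame_address(1);   /* caller's frame pointer */
        if (dest < fp && strlen(src) >= (size_t)(fp - dest)) {
            fprintf(stderr, "checked_strcpy: copy would smash the stack\n");
            abort();
        }
        return strcpy(dest, src);
    }

    int main(void) {
        char buf[32];
        checked_strcpy(buf, "fits comfortably");
        puts(buf);
        return 0;
    }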

Page 17:

More methods …

Address obfuscation
– Encrypt return address on stack by XORing with random string. Decrypt just before returning from function.
– Attacker needs decryption key to set return address to desired value (sketch below)

PaX ASLR: Randomize location of libc
– Attacker cannot jump directly to exec function
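The XOR idea from the first bullet, applied by hand to a stored function pointer rather than a return address (the return-address version is done by the compiler/runtime, and the casts between function pointers and integers are a common but non-standard idiom):

    #include <stdio.h>
    #include <stdint.h>
    #include <stdlib.h>
    #include <time.h>

    static uintptr_t key;     /* secret chosen at startup */

    static void greet(void) { puts("called through a protected pointer"); }

    int main(void) {
        srand((unsigned)time(NULL));
        key = ((uintptr_t)rand() << 16) ^ (uintptr_t)rand();

        /* Store the pointer XORed with the key: an attacker who overwrites
           the stored value without knowing the key produces a garbage address. */
        uintptr_t stored = (uintptr_t)&greet ^ key;

        /* "Decrypt" just before the call. */
        void (*f)(void) = (void (*)(void))(stored ^ key);
        f();
        return 0;
    }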

Page 18:

Software fault isolation

Partition process memory into data and code segments

Code inserted before each load/store/jump (sketch below)
– Verify that target address is safe

Can be done at compiler, link, or run time
– Increases program size, slows down execution
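A toy illustration of the inserted check, done by address masking rather than an explicit comparison; real SFI rewrites the compiled code, and the segment size and function names here are assumptions for the sketch.

    #define _POSIX_C_SOURCE 200112L
    #include <stdio.h>
    #include <stdlib.h>
    #include <stdint.h>

    #define SEG_SIZE (1u << 16)            /* 64 KB data segment, power of two */

    static unsigned char *data_seg;        /* sandbox's data segment */
    static uintptr_t seg_base;             /* its high-order bits */

    /* The code "inserted before each store": force the target's high bits to
       the segment base, so even a corrupted address stays inside the sandbox. */
    static void sandboxed_store(uintptr_t addr, unsigned char value) {
        uintptr_t safe = seg_base | (addr & (SEG_SIZE - 1));
        *(unsigned char *)safe = value;
    }

    int main(void) {
        if (posix_memalign((void **)&data_seg, SEG_SIZE, SEG_SIZE) != 0)
            return 1;
        seg_base = (uintptr_t)data_seg;

        sandboxed_store((uintptr_t)&data_seg[10], 42);   /* legitimate store */
        sandboxed_store(0xdeadbeefu, 7);                 /* wild address is masked into the segment */
        printf("%d\n", data_seg[10]);
        return 0;
    }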

Page 19:

Security for mobile code

Mobile code is particularly dangerous!

Sandboxing
– Limit the ability of code to do harmful things

Code-signing
– Mechanism to decide whether code should be trusted or not

ActiveX uses code-signing, Java uses sandboxing (plus code-signing)

Page 20:

Code signing

Code producer signs code

Binary notion of trust

What if code producer compromised?

Lack of PKI => non-scalable approach

Page 21:

“Proof-carrying code”

Input: code, safety policy of client

Output: safety proof for code

Proof generation expensive
– Proof verification cheaper
– Prove once, use everywhere (with same policy)

Prover/compiler need not be trusted
– Only need to trust the verifier

Page 22:

Sandboxing in Java

Focus on preventing system modification and violations of user privacy
– Denial-of-service attacks much harder to prevent, and not handled all that well

We will discuss some of the basics, but not all the most up-to-date variants of the Java security model

Page 23:

Sandboxing

A default sandbox applied to untrusted code

Users can change the defaults…
– Can also define “larger” sandboxes for “partially trusted” code
– Trust in code determined using code-signing…

Page 24:

Some examples…

Default sandbox prevents:
– Reading/writing/deleting files on client system
– Listing directory contents
– Creating new network connections to other hosts (other than originating host)
– Etc.

Page 25:

Sandbox components

Verifier, Class loader, and Security Manager

If any of these fail, security may be compromised

Page 26:

Verifier

Java program is compiled to platform-independent Java byte code

This code is verified before it is run
– Prevents, for example, malicious “hand-written” byte code

Efficiency gains by checking code before it is run, rather than constantly checking it while running

Page 27:

Verifier…

Checks:
– Byte code is well-formatted
– No forged pointers
– No violation of access restrictions
– No incorrect typing

Of course, cannot be perfect…

Page 28:

Class loader

Helps prevent “spoofed” classes from being loaded
– E.g., external class claiming to be the security manager

Whenever a class needs to be loaded, this is done by a class loader
– The class loader decides where to obtain the code for the class

Page 29:

Security manager

Restricts the way an applet uses Java API calls
– All calls to the OS are mediated by the security manager

Security managers are browser-dependent!

Page 30:

System call monitoring

Monitor all system calls
– Enforce particular policy
– Policy may be loaded in kernel (example below)

Hand-tune policy for individual applications

Similar to Java security manager
– Difference in where implemented
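One concrete, Linux-specific example of kernel-enforced system-call policy is seccomp; the slides do not name a particular mechanism, so this is only an illustration. In strict mode the kernel allows nothing beyond read(), write(), _exit(), and sigreturn():

    #include <stdio.h>
    #include <unistd.h>
    #include <sys/prctl.h>
    #include <linux/seccomp.h>

    int main(void) {
        /* Ask the kernel to enforce the strict seccomp policy for this process. */
        if (prctl(PR_SET_SECCOMP, SECCOMP_MODE_STRICT) != 0) {
            perror("prctl");
            return 1;
        }
        write(STDOUT_FILENO, "sandboxed\n", 10);
        /* Any other system call from here on (open(), socket(), ...) would be
           killed by the kernel. */
        _exit(0);
    }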

Page 31:

Viruses/malicious code

Page 32:

Viruses/malicious code

Virus – passes malicious code to other non-malicious programs
– Or documents with “executable” components

Trojan horse – software with unintended side effects

Worm – propagates via network
– Typically stand-alone software, in contrast to viruses which are attached to other programs

Page 33:

Viruses

Can insert themselves before program, can surround program, or can be interspersed throughout program
– In the last case, virus writer needs to know about the specifics of the other program

Two ways to “insert” virus:
– Insert virus in memory at (old) location of original program
– Change pointer structure…

Page 34:

Viruses…

Boot sector viruses
– If a virus is loaded early in the boot process, can be very difficult (impossible?) to detect

Memory-resident viruses
– Note that virus might complicate its own detection
– E.g., removing virus name from list of active programs, or list of files on disk

Page 35:

Some examples

BRAIN virus
– Locates itself in upper memory; resets the upper memory bound below itself
– Traps “disk reads” so that it can handle any requests to read from the boot sector
– Not inherently malicious, although some variants were

Page 36:

Morris worm (1988)

Resource exhaustion (unintended)
– Was supposed to have only one copy running, but did not work correctly…

Spread in three ways
– Exploited buffer overflow flaw in fingerd

– Exploited flaw in sendmail debug mode

– Guessing user passwords(!) on current network

Bootstrap loader would be used to obtain the rest of the worm

Page 37:

Chernobyl virus (1998)

When infected program is run, virus becomes resident in memory of machine
– Rebooting does not help

Virus writes random garbage to hard drive

Attempts to trash FLASH BIOS
– Physically destroys the hardware…

Page 38:

Melissa virus/worm (1999)

Word macro…
– When file opened, would create and send infected document to names in user’s Outlook Express mailbox

– Recipient would be asked whether to disable macros(!)

• If macros enabled, virus would launch

Page 39:

Code red (2001)

Propagated itself on web servers running Microsoft’s Internet Information Server (IIS)
– Infection using buffer overflow…
– Propagation by checking IP addresses on port 80 of the PC to see if they are vulnerable

Page 40:

Detecting viruses

Can try to look for “signatures” (toy example below)
– Unreliable unless up-to-date

– Encrypted viruses

– Polymorphic viruses

Examine storage
– Sizes of files, “jump” instruction at beginning of code

– Can be hard to distinguish from normal software

Check for (unusual) execution patterns
– Hard to distinguish from normal software…
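To make the signature idea concrete, a toy scanner that searches a file for one fixed byte pattern (the pattern is a fragment of the EICAR test string; real scanners use large signature databases, wildcards, and heuristics, which is exactly what encrypted and polymorphic viruses are designed to evade):

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    /* One fixed "signature" to look for. */
    static const unsigned char sig[] = "EICAR-STANDARD-ANTIVIRUS-TEST-FILE";

    /* Returns 1 if the signature occurs in the file, 0 if not, -1 on error. */
    static int scan(const char *path) {
        FILE *f = fopen(path, "rb");
        if (!f) return -1;
        if (fseek(f, 0, SEEK_END) != 0) { fclose(f); return -1; }
        long n = ftell(f);
        if (n < 0) { fclose(f); return -1; }
        rewind(f);

        unsigned char *buf = malloc(n > 0 ? (size_t)n : 1);
        if (!buf || fread(buf, 1, (size_t)n, f) != (size_t)n) {
            free(buf); fclose(f); return -1;
        }
        fclose(f);

        size_t siglen = sizeof(sig) - 1;   /* drop the trailing '\0' */
        int found = 0;
        for (long i = 0; !found && i + (long)siglen <= n; i++)
            if (memcmp(buf + i, sig, siglen) == 0)
                found = 1;

        free(buf);
        return found;
    }

    int main(int argc, char **argv) {
        if (argc != 2) { fprintf(stderr, "usage: %s file\n", argv[0]); return 2; }
        int r = scan(argv[1]);
        puts(r == 1 ? "signature found" : r == 0 ? "no signature" : "error");
        return 0;
    }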