Learning Conditional Abstractions (CAL)
Bryan A. Brady 1*, Randal E. Bryant 2, Sanjit A. Seshia 3
1 IBM, Poughkeepsie, NY
2 CS Department, Carnegie Mellon University
3 EECS Department, UC Berkeley
* Work performed at UC Berkeley
FMCAD 2011, Austin, TX, 1 November 2011

Transcript
Page 1: Learning Conditional Abstractions (CAL)

Learning Conditional Abstractions (CAL)

Bryan A. Brady 1*
Randal E. Bryant 2
Sanjit A. Seshia 3

1 IBM, Poughkeepsie, NY
2 CS Department, Carnegie Mellon University
3 EECS Department, UC Berkeley
* Work performed at UC Berkeley

FMCAD 2011, Austin, TX
1 November 2011

Page 2: Learning Conditional Abstractions (CAL)

Learning Conditional Abstractions

Philosophy:

Create abstractions by generalizing simulation data.

Learning Conditional Abstractions (CAL):

Use machine learning from traces to compute abstraction conditions.

Page 3: Learning Conditional Abstractions (CAL)

Abstraction Levels in FV

Bit Level
• Most tools operate at this level: model checkers, equivalence checkers
• Capacity limited by state bits and details of bit-manipulations

Bit Vector Level
• Designs are typically at this level
• Bit-blasted down to the bit level

Term Level
• Term-level verifiers: SMT-based verifiers (e.g., UCLID)
• Able to scale to much more complex systems
• How to decide what to abstract?

Page 4: Learning Conditional Abstractions (CAL)

Motivating Example

(Diagram: two implementations fA and fB over inputs x1, x2, ..., xn are checked for equality; a shared sub-block is replaced by an uninterpreted function f(...).)

• Equivalence/Refinement Checking
• Term-level abstraction
  • Replace bit-vector operators with uninterpreted functions
  • Represent data with arbitrary encoding
• Difficult to reason about some operators
  • Multiply, Divide
  • Modulo, Power

Page 5: Learning Conditional Abstractions (CAL)

Term-Level Abstraction

(Diagram: the same 20-bit instruction instr feeds two models. The precise, word-level model uses a mux to select between the 16-bit ALU result and the 16-bit jump target, depending on whether instr is a JMP. The fully uninterpreted model replaces the whole unit with a single UF applied to instr.)

Example:
instr := JMP 1234
out1 := 1234 (precise, word-level model)
out2 := 0 (fully uninterpreted model)

Need to partially abstract.

Page 6: Learning Conditional Abstractions (CAL)

Term-Level Abstraction

(Diagram: three models of the same unit. Precise, word-level: a mux selects between the ALU result and the jump target when instr is a JMP. Fully uninterpreted: the whole unit is a single UF of instr. Partially-interpreted: the ALU is replaced by a UF, but the JMP path and the mux that selects it are kept precise.)
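The partially-interpreted model above can be sketched in code. This is a toy Python sketch, not the talk's UCLID model: the JMP opcode value, the 16-bit add semantics, and all names (alu_precise, uf_alu, alu_partial) are illustrative assumptions.

```python
JMP = 0x7  # illustrative opcode value, not from the talk

_uf_table = {}

def uf_alu(op, a, b):
    """Uninterpreted function: guarantees only functional consistency,
    i.e. equal arguments always map to the same (arbitrary) result."""
    key = (op, a, b)
    if key not in _uf_table:
        _uf_table[key] = len(_uf_table)  # arbitrary fresh value
    return _uf_table[key]

def alu_precise(op, a, b):
    # Assumed word-level semantics; a jump passes the target through.
    if op == JMP:
        return b
    return (a + b) & 0xFFFF

def alu_partial(op, a, b):
    # Interpretation condition c: model precisely iff op == JMP,
    # otherwise fall back to the uninterpreted function.
    return alu_precise(op, a, b) if op == JMP else uf_alu(op, a, b)
```

On the slide's example, `alu_partial(JMP, 0, 0x1234)` yields the jump target 0x1234, while non-jump operations keep only functional consistency.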

Page 7: Learning Conditional Abstractions (CAL)

Term-Level AbstractionTerm-Level Abstraction

RTL → Verification Model

Manual Abstraction
• Requires intimate knowledge of design
• Multiple models of same design
• Spurious counter-examples

Automatic Abstraction
• How to choose the right level of abstraction
• Some blocks require conditional abstraction
• Often requires many iterations of abstraction refinement

Page 8: Learning Conditional Abstractions (CAL)

Outline

• Motivation
• Related work
• Background
• The CAL Approach
• Illustrative Example
• Results
• Conclusion

Page 9: Learning Conditional Abstractions (CAL)

Related Work

Author/Technique | Abstraction Type | Abstraction Granularity | Method
R. E. Bryant, et al., TACAS 2007 | Data | Datapath reduction via successive approximation | CEGAR
P. Bjesse, CAV'08 | Data | Reduces datapaths without BV ops | Selective bit-blasting
Z. Andraus, et al., DAC'04, LPAR'08 | Data, Function | Fully abstracts all operators | CEGAR
ATLAS | Function | Partially abstracts some modules | Hybrid static-dynamic
CAL | Function | Partially abstracts some modules | Machine learning/CEGAR

Page 10: Learning Conditional Abstractions (CAL)

Outline

• Motivation
• Related work
• Background
  • ATLAS
  • Conditional Abstraction
• The CAL Approach
• Illustrative Example
• Results
• Conclusion

Page 11: Learning Conditional Abstractions (CAL)

Background: The ATLAS Approach

Hybrid approach:

• Phase 1: Identify abstraction candidates with random simulation
• Phase 2: Use dataflow analysis to compute conditions under which it is precise to abstract
• Phase 3: Generate abstracted model

Page 12: Learning Conditional Abstractions (CAL)

Identify Abstraction Candidates

1. Find isomorphic sub-circuits (fblocks)
   • Modules, functions
2. Replace each fblock with a random function over the inputs of the fblock
3. Verify via simulation:
   • Check original property for N different random functions

(Diagram: two isomorphic datapaths fA and fB over inputs x1, x2, ..., xn are checked for equality; each candidate sub-block over inputs a, b, c is replaced, in turn, by a random function RF.)
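Steps 2 and 3 above can be sketched as follows. This is a minimal Python sketch, assuming the design under test is exposed as a callable property check (`check_property` is an invented interface, not a real tool's API); a memoized table stands in for one concrete random-function instance.

```python
import random

def make_random_function(rng, width=16):
    """A concrete random function: an arbitrary but fixed output per
    input tuple, consistent within one simulation run."""
    table = {}
    def rf(*args):
        if args not in table:
            table[args] = rng.randrange(1 << width)
        return table[args]
    return rf

def survives_random_functions(check_property, n_functions=20, n_vectors=100):
    """Check the original property for N different random functions.
    check_property(rf, x) is an assumed interface: True iff the property
    holds on input vector x with the candidate fblock replaced by rf."""
    for seed in range(n_functions):
        rf = make_random_function(random.Random(seed))
        vec_rng = random.Random(1000 + seed)
        for _ in range(n_vectors):
            x = tuple(vec_rng.randrange(1 << 16) for _ in range(3))
            if not check_property(rf, x):
                return False  # semantics of the fblock matter: do not abstract
    return True  # survived all runs: candidate for abstraction
```

The intuition from the talk carries over: a property that is insensitive to the fblock's concrete semantics (e.g. equality of two copies that receive the same RF) survives, while one that depends on the real function fails under some random replacement.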

Page 13: Learning Conditional Abstractions (CAL)

Identify Abstraction Candidates

4. Do not abstract fblocks that fail in some fraction of simulations
5. Replace remaining fblocks with partially-abstract functions and compute conditions under which the fblock is modeled precisely

Intuition: fblocks that cannot be abstracted will fail when replaced with random functions.

Intuition: fblocks can contain a corner case that random simulation didn't explore.

(Diagram: the surviving sub-blocks in fA and fB are replaced by uninterpreted functions UF.)

Page 14: Learning Conditional Abstractions (CAL)

Modeling with Uninterpreted Functions

(Diagram: in each copy of the design, the abstracted sub-block over inputs a, b, c is replaced by an uninterpreted function UF; a 0/1 mux driven by a condition g over y1, y2, ..., yn selects between the UF result and the precise implementation.)

Page 15: Learning Conditional Abstractions (CAL)

Interpretation Conditions

D1, D2: word-level designs
T1, T2: term-level models
x: input signals
c: interpretation condition

(Diagram: D1 and D2 over inputs x are compared for equality, yielding f1; T1 and T2, parameterized by c, are compared for equality, yielding f2.)

Problem: compute an interpretation condition c(x) such that ∀x. f1 ⇔ f2

Trivial case, model precisely: c = true

Ideal case, fully abstract: c = false

Realistic case, we need to solve: ∃c ≠ true s.t. ∀x. f1 ⇔ f2

This problem is NP-hard, so we use heuristics to compute c.

Page 16: Learning Conditional Abstractions (CAL)

Outline

• Motivation
• Related work
• Background
• The CAL Approach
• Illustrative Example
• Results
• Conclusion

Page 17: Learning Conditional Abstractions (CAL)

Related Work

Previous work related to Learning and Abstraction:

• Learning Abstractions for Model Checking
  • Anubhav Gupta, Ph.D. thesis, CMU, 2006
  • Localization abstraction: learn the variables to make visible
• Our approach:
  • Learn when to apply function abstraction

Page 18: Learning Conditional Abstractions (CAL)

The CAL Approach

CAL = Machine Learning + CEGAR

1. Identify abstraction candidates with random simulation
2. Perform unconditional abstraction
3. If spurious counterexamples arise, use machine learning to refine the abstraction by computing abstraction conditions
4. Repeat Step 3 until Valid or a real counterexample
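The four steps can be sketched as a refinement loop. This is a schematic Python sketch; all four callables (generate_model, verify, is_spurious, learn_conditions) are assumed interfaces standing in for the term-level model generator, the verifier (UCLID in the talk), the spuriousness check, and the trace-based learner.

```python
def cal_loop(generate_model, verify, is_spurious, learn_conditions,
             max_iters=10):
    """Sketch of the CAL flow: abstract unconditionally first, then
    refine with learned interpretation conditions on each spurious
    counterexample."""
    conditions = {}  # fblock -> interpretation condition; empty = fully abstract
    for _ in range(max_iters):
        model = generate_model(conditions)
        cex = verify(model)
        if cex is None:
            return ("valid", conditions)
        if not is_spurious(cex):
            return ("real_counterexample", cex)
        # Spurious: learn refined abstraction conditions from traces
        conditions = learn_conditions(conditions, cex)
    return ("gave_up", conditions)
```

A single iteration of refinement suffices in the talk's Y86 example: the first spurious counterexample leads to a learned condition under which verification succeeds.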

Page 19: Learning Conditional Abstractions (CAL)

The CAL Approach

(Flowchart: the RTL and the modules to abstract feed random simulation, which produces simulation traces. Abstraction conditions are learned from the traces, a term-level model is generated, and the verifier is invoked. If the result is Valid: done. Otherwise, if the counterexample is not spurious: done, a real bug. If it is spurious, similar traces are generated and the abstraction conditions are re-learned.)

Page 20: Learning Conditional Abstractions (CAL)

Use of Machine Learning

Examples (positive/negative) → Learning Algorithm → Concept (classifier)

In our setting:

Simulation traces (correct / failing) → Learning Algorithm → Interpretation condition

Page 21: Learning Conditional Abstractions (CAL)

Important Considerations in Learning

• How to generate traces for learning?
  • Random simulations: using random functions in place of UFs
  • Counterexamples
• What are the relevant features?
  • Inputs to the functional block being abstracted
  • Signals corresponding to the "unit of work" being processed

Page 22: Learning Conditional Abstractions (CAL)

Generating Traces: Witnesses

Modified version of random simulation:

1. Replace all modules that are being abstracted with RFs at the same time
2. Verify via simulation for N iterations
3. Log signals for each passing simulation run

* Important note: initial state selected randomly or based on a testbench

(Diagram: in both copies of the design, every abstraction candidate is replaced by a random function RF simultaneously.)
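Step 3 (logging signals for each run) can be sketched as follows, using the value abstraction the talk's learning example applies to ALU arguments (values collapsed to their sign). The row format matches the sample data set on the learning-example slide (label,instr,aluOp,argA,argB); the helper names are illustrative.

```python
def sign(x):
    # Value abstraction from the talk's learning example:
    # x < 0 -> -1, x = 0 -> 0, x > 0 -> 1.
    return (x > 0) - (x < 0)

def log_row(label, instr, alu_op, arg_a, arg_b):
    """One labeled training example in the CSV-like format of the
    sample data set: label,instr,aluOp,sign(argA),sign(argB)."""
    return f"{label},{instr},{alu_op},{sign(arg_a)},{sign(arg_b)}"
```

For example, `log_row("good", 11, 0, 5, -3)` produces the row `good,11,0,1,-1`.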

Page 23: Learning Conditional Abstractions (CAL)

Generating Traces: Similar Counterexamples

1. Replace modules that are being abstracted with RF, one by one
2. Verify via simulation for N iterations
3. Log signals for each failing simulation run
4. Repeat this process for each fblock that is being abstracted

* Important note: initial state set to be consistent with the original counterexample for each verification run

(Diagram: each candidate sub-block in fA and fB is replaced by a random function RF, one at a time.)

Page 24: Learning Conditional Abstractions (CAL)

Feature Selection Heuristics

1. Include inputs to the fblock being abstracted
   • Advantage: automatic, direct relevance
   • Disadvantage: might not be enough
2. Include signals encoding the "unit-of-work" being processed by the design
   • Example: an instruction, a packet, etc.
   • Advantage: oftentimes the "unit-of-work" has direct impact on whether or not to abstract
   • Disadvantage: might require limited human guidance

Page 25: Learning Conditional Abstractions (CAL)

Outline

• Motivation
• Related work
• Background
• The CAL Approach
• Illustrative Example
• Results
• Conclusion

Page 26: Learning Conditional Abstractions (CAL)

Learning Example

Example: Y86 processor design
Abstraction: ALU module

Unconditional abstraction → Counterexample

Sample data set (Attribute, instr, aluOp, argA, argB):
bad,7,0,1,0
bad,7,0,1,0
bad,7,0,1,0
bad,7,0,1,0
bad,7,0,1,0
good,11,0,1,-1
good,11,0,1,1
good,6,3,-1,-1
good,6,6,-1,1
good,9,0,1,1

Attribute ∈ {Good, Bad}; instr ∈ {0,1,...,15}; argA, argB ∈ {-1,0,1}

Abstract interpretation:
x < 0 → -1
x = 0 → 0
x > 0 → 1
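A minimal sketch of the learning step on this data set. The talk uses a decision-tree learner; as a much simpler stand-in, the sketch below just collects single-feature equality tests that perfectly separate the bad rows (interpret precisely) from the good rows (safe to abstract). On the sample data it recovers tests on instr and argB, mirroring the learned condition InstrE = JXX ∧ b = 0, assuming JXX corresponds to the instruction value 7 in the data.

```python
def learn_stumps(rows):
    """Find single-feature equality tests that perfectly separate 'bad'
    from 'good' examples. rows are (label, f0, f1, ...) tuples; returns
    a list of (feature_index, value) tests whose conjunction covers all
    bad rows and no good row."""
    n_features = len(rows[0]) - 1
    stumps = []
    for i in range(n_features):
        bad_vals = {r[1 + i] for r in rows if r[0] == "bad"}
        good_vals = {r[1 + i] for r in rows if r[0] == "good"}
        # Keep the test only if all bad rows share one value for this
        # feature and no good row ever takes that value.
        if len(bad_vals) == 1 and not (bad_vals & good_vals):
            stumps.append((i, bad_vals.pop()))
    return stumps

rows = [("bad", 7, 0, 1, 0)] * 5 + [
    ("good", 11, 0, 1, -1), ("good", 11, 0, 1, 1),
    ("good", 6, 3, -1, -1), ("good", 6, 6, -1, 1),
    ("good", 9, 0, 1, 1),
]
learn_stumps(rows)  # -> [(0, 7), (3, 0)]: instr == 7 and argB == 0
```

A real decision-tree learner generalizes this idea to nested tests; the point here is only that the bad rows are separable on exactly the two features the talk's learned condition mentions.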

Page 27: Learning Conditional Abstractions (CAL)

Learning Example

Example: Y86 processor design
Abstraction: ALU module

Unconditional abstraction → Counterexample

Sample data set: same as on the previous slide.

Feature selection based on "unit-of-work".

Interpretation condition learned:

InstrE = JXX ∧ b = 0

Verification succeeds when the above interpretation condition is used!

Page 28: Learning Conditional Abstractions (CAL)

Learning Example

Example: Y86 processor design
Abstraction: ALU module

Unconditional abstraction → Counterexample

Sample data set (fblock inputs only: aluOp, argA, argB):
bad,0,1,0
bad,0,1,0
bad,0,1,0
bad,0,1,0
bad,0,1,0
good,0,1,-1
good,0,1,1
good,3,-1,-1
good,6,-1,1
good,0,1,1

If feature selection is based on fblock inputs only...

Interpretation condition learned: true

Recall that this means we always interpret!

A poor decision tree results from a reasonable design decision. More information is needed.

Page 29: Learning Conditional Abstractions (CAL)

Outline

• Motivation
• Related work
• Background
• The CAL Approach
• Illustrative Example
• Results
• Conclusion

Page 30: Learning Conditional Abstractions (CAL)

Experiments/Benchmarks

Pipeline fragment:
• Abstract ALU
• JUMP must be modeled precisely
• ATLAS: Automatic Term-Level Abstraction of RTL Designs. B. A. Brady, R. E. Bryant, S. A. Seshia, J. W. O'Leary. MEMOCODE 2010

Y86:
• Correspondence checking of 5-stage microprocessor
• Multiple design variations
• Computer Systems: A Programmer's Perspective. R. E. Bryant and D. R. O'Hallaron. Prentice-Hall, 2002

Low-power Multiplier:
• Performs equivalence checking between two versions of a multiplier
• One is a typical multiplier
• The "low-power" version shuts down the multiplier and uses a shifter when one of the operands is a power of 2
• Low-Power Verification with Term-Level Abstraction. B. A. Brady. TECHCON '10

Page 31: Learning Conditional Abstractions (CAL)

Experiments/Benchmarks

Pipeline fragment

Interpretation Condition | ABC (sec) | UCLID SAT (sec) | UCLID SMT (sec)
true | 0.02 | 28.51 | 27.01
op = JMP | -- | 0.31 | 0.01

Low-Power Multiplier

BMC Depth | SAT No Abs | SAT Abs | SMT No Abs | SMT Abs
1 | 2.81 | 2.55 | 1.27 | 1.38
2 | 12.56 | 14.79 | 2.80 | 2.63
5 | 67.43 | 22.45 | 8.23 | 8.16
10 | 216.75 | 202.25 | 21.18 | 22.00

(UCLID runtimes in seconds.)

Page 32: Learning Conditional Abstractions (CAL)

Experiments/Benchmarks

Y86: BTFNT

Interpretation Condition | ABC (sec) | UCLID SAT (sec) | UCLID SMT (sec)
true | > 1200 | > 1200 | > 1200
op = ADD ∧ aluB = 0 | -- | 133.03 | 105.34
InstrE = JXX ∧ aluB = 0 | -- | 101.10 | 65.52

Y86: NT

Interpretation Condition | ABC (sec) | UCLID SAT (sec) | UCLID SMT (sec)
true | > 1200 | > 1200 | > 1200
op = ADD ∧ aluB = 0 | -- | 154.95 | 89.02
InstrE = JXX | -- | 191.34 | 187.64
BTFNT | -- | 94.00 | 52.76

Page 33: Learning Conditional Abstractions (CAL)

Outline

• Motivation
• Related work
• Background
• The CAL Approach
• Illustrative Example
• Results
• Conclusion

Page 34: Learning Conditional Abstractions (CAL)

Summary / Future Work

Summary
• Use machine learning + CEGAR to compute conditional function abstractions
• Outperforms purely bit-level techniques

Future Work
• Better feature selection: picking "unit-of-work" signals
• Investigate using different abstraction conditions for different instantiations of the same fblock
• Apply to software
• Investigate interactions between abstractions

Page 35: Learning Conditional Abstractions (CAL)

Thanks!

Page 36: Learning Conditional Abstractions (CAL)

NP-Hard

(Diagram: a mux controlled by f(x1, x2, ..., xn) selects between MULT applied to x2, ..., xn and the constant 0; the result feeds an equality check.)

Need to interpret MULT when f(x1, x2, ..., xn) = true.

Checking satisfiability of f(x1, x2, ..., xn) is NP-hard.

Page 37: Learning Conditional Abstractions (CAL)

Related Work

Author | Abstraction Type | Abstraction Granularity | Method
Z. Andraus, et al., DAC'04, LPAR'08 | Data, Function | Fully abstracts all operators | CEGAR
H. Jain, et al., DAC'05 | Data | Maintains predicates over data signals | Predicate abstraction
P. Bjesse, CAV'08 | Data | Reduces datapaths without BV ops | Selective bit-blasting
R. E. Bryant, et al., TACAS 2007 | Data | Datapath reduction via successive approximation | CEGAR
v2ucl | Data | Reduces datapaths without BV ops | Type qualifiers and inference
ATLAS | Function | Partially abstracts some modules | Hybrid static-dynamic
CAL | Function | Partially abstracts some modules | Machine learning/CEGAR

Page 38: Learning Conditional Abstractions (CAL)

Term-Level Abstraction

Data Abstraction:
• Represent data with arbitrary integer values
• No specific encoding
(Diagram: a bit vector x0, x1, ..., xn-1 becomes a single term x.)

Function Abstraction:
• Represent functional units with uninterpreted functions
(Diagram: the concrete ALU block is replaced by an uninterpreted ALU symbol.)