Learning Conditional Abstractions (CAL). Bryan A. Brady 1*, Randal E. Bryant 2, Sanjit A. Seshia 3. 1 IBM, Poughkeepsie, NY. 2 CS Department, Carnegie Mellon University. 3 EECS Department, UC Berkeley. * Work performed at UC Berkeley. FMCAD 2011, Austin, TX, 1 November 2011.
Transcript
Learning Conditional Abstractions (CAL)

Bryan A. Brady 1*
Randal E. Bryant 2
Sanjit A. Seshia 3

1 IBM, Poughkeepsie, NY
2 CS Department, Carnegie Mellon University
3 EECS Department, UC Berkeley
* Work performed at UC Berkeley
Represent functional units with uninterpreted functions
• Represent data with arbitrary encoding
• Difficult to reason about some operators
  • Multiply, Divide
  • Modulo, Power

[Figure: a functional unit replaced by an uninterpreted function f(...).]
Term-Level Abstraction

[Figure: two models of the same design fragment. Fully uninterpreted: the ALU is replaced by a UF driving out2. Precise, word-level: the ALU output and the jump target feed a mux whose select compares the opcode field of instr against JMP, driving out1.]

Example
instr := JMP 1234
out1 := 1234
out2 := 0

Need to partially abstract
Term-Level Abstraction

[Figure: three models of the same design fragment. Fully uninterpreted: the ALU is replaced by a UF driving out. Precise, word-level: the full ALU with a mux selecting the jump target when instr encodes a JMP. Partially-interpreted: a UF for most operations, with the JMP case still modeled precisely via the same mux.]
Term-Level Abstraction

RTL → Verification Model

Manual Abstraction
• Requires intimate knowledge of design
• Multiple models of same design
• Spurious counter-examples

Automatic Abstraction
• How to choose the right level of abstraction
• Some blocks require conditional abstraction
• Often requires many iterations of ...
Background: The ATLAS Approach

Hybrid approach
• Phase 1: Identify abstraction candidates with random simulation
• Phase 2: Use dataflow analysis to compute conditions under which it is precise to abstract
• Phase 3: Generate abstracted model

2. Replace each fblock with a random function over the inputs of the fblock
3. Verify via simulation: check original property for N different random functions
4. Do not abstract fblocks that fail in some fraction of simulations
5. Replace remaining fblocks with partially-abstract functions and compute conditions under which the fblock is modeled precisely

Intuition: fblocks that cannot be abstracted will fail when replaced with random functions.
Intuition: fblocks can contain a corner case that random simulation didn't explore.
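The random-function test above can be sketched in a few lines. This is a toy illustration, not the ATLAS tool itself: the opcode encodings, the 16-bit design, and both model functions are hypothetical stand-ins for the slide's JMP example, where one model routes the jump target around the ALU and the other relies on the ALU's pass-through behavior.

```python
import random

JMP, ADD = 0, 1   # hypothetical opcode encodings
MASK = 0xFFFF     # toy 16-bit datapath

def make_rf():
    # A random but functionally consistent replacement for the ALU.
    table = {}
    def rf(op, a, b):
        if (op, a, b) not in table:
            table[(op, a, b)] = random.randrange(1 << 16)
        return table[(op, a, b)]
    return rf

def model_bypass(alu_fn, op, target, a, b):
    # Model 1: a mux routes the jump target around the ALU.
    return target & MASK if op == JMP else alu_fn(op, a, b)

def model_through(alu_fn, op, target, a, b):
    # Model 2: relies on the ALU passing the jump target through.
    return alu_fn(op, target, b) if op == JMP else alu_fn(op, a, b)

def can_fully_abstract(ops, n_trials=50):
    # Phase 1 check: keep the ALU as a full-abstraction candidate only
    # if the equivalence property survives every random-function trial.
    for _ in range(n_trials):
        rf = make_rf()
        op = random.choice(ops)
        t, a, b = (random.randrange(1 << 16) for _ in range(3))
        if model_bypass(rf, op, t, a, b) != model_through(rf, op, t, a, b):
            return False
    return True
```

With only ADD in play the two models agree for any ALU function, so full abstraction survives; once JMP appears, a random function almost surely breaks the pass-through case, matching the intuition above.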
Modeling with Uninterpreted Functions

Realistic case, we need to solve:
Outline
• Motivation
• Related work
• Background
• The CAL Approach
• Illustrative Example
• Results
• Conclusion
Related Work

Previous work related to Learning and Abstraction
• Learning Abstractions for Model Checking
  • Anubhav Gupta, Ph.D. thesis, CMU, 2006
  • Localization abstraction: learn the variables to make visible
• Our approach:
  • Learn when to apply function abstraction
The CAL Approach

3. If spurious counterexamples arise, use machine learning to refine abstraction by computing abstraction conditions
4. Repeat Step 3 until Valid or real counterexample
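The refine-until-done loop in the steps above can be sketched as follows. All four callables (`verify`, `is_spurious`, `learn_conditions`, and the model's `refine` method) are assumed interfaces for illustration, not the paper's actual tool API.

```python
def cal_loop(model, verify, is_spurious, learn_conditions, max_iters=100):
    """Sketch of the CEGAR-with-learning loop: verify, and on a
    spurious counterexample learn abstraction conditions and refine."""
    for _ in range(max_iters):
        result, cex = verify(model)      # check the abstract model
        if result == "valid":
            return "valid", None         # property proven
        if not is_spurious(cex):
            return "bug", cex            # real counterexample
        # Spurious: learn abstraction conditions and refine the model.
        model = model.refine(learn_conditions(cex))
    raise RuntimeError("no verdict within iteration budget")
```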
Important Considerations in Learning

• How to generate traces for learning?
  • Random simulations: using random functions in place of UFs
  • Counterexamples
• What are the relevant features?
  • Inputs to functional block being abstracted
  • Signals corresponding to the "unit of work" being processed
Modified version of random simulation

[Figure: a design with inputs x1, x2, ..., xn and fblocks fA and fB feeding an equality check, with the abstracted fblocks replaced by random functions (RF).]

1. Replace all modules that are being abstracted with RF at the same time
2. Verify via simulation for N iterations
3. Log signals for each passing simulation run

* Important note: initial state selected randomly or based on a testbench
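The passing-run logger from the steps above can be sketched as a small driver. Both callables are assumed interfaces: `run_simulation` is taken to replace every abstracted fblock with a fresh random function, start from a random (or testbench) initial state, and return a pass/fail flag plus the logged signals; `sample_features` turns those signals into a feature vector.

```python
def generate_good_traces(run_simulation, sample_features, n_iters=100):
    """Collect 'good'-labeled training rows from passing random
    simulations (sketch; the two callables are assumed interfaces)."""
    traces = []
    for _ in range(n_iters):
        passed, signals = run_simulation()
        if passed:                        # step 3: log only passing runs
            traces.append(("good", sample_features(signals)))
    return traces
```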
Generating Traces: Similar Counterexamples

[Figure: the same design, with the abstracted fblocks replaced by random functions (RF) one at a time.]

1. Replace modules that are being abstracted with RF, one by one
2. Verify via simulation for N iterations
3. Log signals for each failing simulation run
4. Repeat this process for each fblock that is being abstracted

* Important note: initial state set to be consistent with the original counterexample for each verification run
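The failing-run side mirrors the passing-run logger. `replay_from_cex` is an assumed interface that runs one simulation with only the given fblock replaced by a random function, starting from a state consistent with the original counterexample, and returns a pass/fail flag plus the logged signals.

```python
def generate_bad_traces(fblocks, replay_from_cex, sample_features,
                        n_iters=100):
    """Collect 'bad'-labeled training rows from failing replays of the
    counterexample (sketch; the callables are assumed interfaces)."""
    traces = []
    for fblock in fblocks:                # step 4: one fblock at a time
        for _ in range(n_iters):
            passed, signals = replay_from_cex(fblock)
            if not passed:                # step 3: log only failing runs
                traces.append(("bad", sample_features(signals)))
    return traces
```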
1. Include inputs to the fblock being abstracted
  • Advantage: automatic, direct relevance
  • Disadvantage: might not be enough
2. Include signals encoding the "unit-of-work" being processed by the design
  • Example: an instruction, a packet, etc.
  • Advantage: oftentimes the "unit-of-work" has direct impact on whether or not to abstract
  • Disadvantage: might require limited human guidance
Learning Example

Example: Y86 processor design
Abstraction: ALU module

Feature selection based on "unit-of-work". Sample data set:
bad,7,0,1,0
bad,7,0,1,0
bad,7,0,1,0
bad,7,0,1,0
bad,7,0,1,0
good,11,0,1,-1
good,11,0,1,1
good,6,3,-1,-1
good,6,6,-1,1
good,9,0,1,1

If feature selection is based on fblock inputs only... Sample data set:
bad,0,1,0
bad,0,1,0
bad,0,1,0
bad,0,1,0
bad,0,1,0
good,0,1,-1
good,0,1,1
good,3,-1,-1
good,6,-1,1
good,0,1,1

Recall that this means we always interpret!

Poor decision tree results from reasonable design decision. More information needed.
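The contrast between the two feature sets can be reproduced with a deliberately minimal learner. The paper uses a real decision-tree algorithm; the function below is only a stand-in that returns the (feature index, value) pairs occurring exclusively in "bad" rows, i.e. candidate conditions under which the ALU must stay interpreted. The data literals are copied from the sample sets above.

```python
def learn_conditions(rows):
    """Minimal stand-in for the decision-tree step (sketch): return
    (feature_index, value) pairs seen only in 'bad'-labeled rows."""
    bad, good = set(), set()
    for label, *features in rows:
        target = bad if label == "bad" else good
        target.update(enumerate(features))
    return sorted(bad - good)

# Data set with the "unit-of-work" feature (the instruction) included,
# taken from the sample above.
with_uow = [("bad", 7, 0, 1, 0)] * 5 + [
    ("good", 11, 0, 1, -1), ("good", 11, 0, 1, 1),
    ("good", 6, 3, -1, -1), ("good", 6, 6, -1, 1),
    ("good", 9, 0, 1, 1),
]

# The same runs with only the fblock inputs as features.
inputs_only = [(label, *feats[1:]) for label, *feats in with_uow]
```

With the instruction feature present, the learner recovers "interpret when feature 0 equals 7", a condition on the unit-of-work. With fblock inputs only, the sole separating condition is an incidental value of the last input, which matches the slide's point that a reasonable feature choice can still yield a poor tree.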
Experiments/Benchmarks

Pipeline fragment:
• Abstract ALU
• JUMP must be modeled precisely
• ATLAS: Automatic Term-Level Abstraction of RTL Designs. B. A. Brady, R. E. Bryant, S. A. Seshia, J. W. O'Leary. MEMOCODE 2010.

Y86:
• Correspondence checking of 5-stage microprocessor
• Multiple design variations
• Computer Systems: A Programmer's Perspective. R. E. Bryant and D. R. O'Hallaron. Prentice-Hall, 2002.

Low-power Multiplier:
• Performs equivalence checking between two versions of a multiplier
• One is a typical multiplier
• The "low-power" version shuts down the multiplier and uses a shifter when one of the operands is a power of 2
• Low-Power Verification with Term-Level Abstraction. B. A. Brady.
... signals
• Investigate using different abstraction conditions for different instantiations of the same fblock
• Apply to software
• Investigate interactions between abstractions

Thanks!
NP-Hard

Need to interpret MULT when f(x1, x2, ..., xn) = true

[Figure: a mux selects between MULT and an uninterpreted function over operands x1, x2, ..., xn, controlled by the condition f(x1, x2, ..., xn).]

Checking satisfiability of f(x1, x2, ..., xn) is NP-Hard
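The mux on this slide can be sketched directly. The 4-bit datapath and all names here are hypothetical, and brute-force enumeration stands in for the SAT query; for a real condition over n-bit operands that check is exactly the NP-hard satisfiability problem.

```python
from itertools import product

MASK4 = 0xF  # toy 4-bit datapath for illustration

def mult(a, b):
    # Precise multiplier, truncated to the toy width.
    return (a * b) & MASK4

def make_uf():
    # Uninterpreted stand-in: arbitrary but functionally consistent.
    table = {}
    def uf(a, b):
        return table.setdefault((a, b), hash((a, b)) & MASK4)
    return uf

def partially_interpreted(cond, uf, a, b):
    # The mux from the slide: interpret MULT only when the
    # abstraction condition f demands precision.
    return mult(a, b) if cond(a, b) else uf(a, b)

def condition_satisfiable(cond, bits=4):
    # Deciding whether the condition can ever hold is a SAT query;
    # brute force over tiny operands stands in for a SAT solver here.
    return any(cond(a, b) for a, b in product(range(1 << bits), repeat=2))
```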
Related Work

Author                               | Abstraction Type | Abstraction Granularity                         | Method
Z. Andraus, et al. (DAC'04, LPAR'08) | Data, Function   | Fully abstracts all operators                   | CEGAR
H. Jain, et al. (DAC'05)             | Data             | Maintains predicates over data signals          | Predicate abstraction
P. Bjesse (CAV'08)                   | Data             | Reduces datapaths without BV ops                | Selective bit-blasting
R. E. Bryant, et al. (TACAS 2007)    | Data             | Datapath reduction via successive approximation | CEGAR
v2ucl                                | Data             | Reduces datapaths without BV ops                | Type qualifiers and inference
ATLAS                                | Function         | Partially abstracts some modules                | Hybrid static-dynamic
CAL                                  | Function         | Partially abstracts some modules                | Machine learning/CEGAR
Term-Level Abstraction

Data Abstraction:
• Represent data with arbitrary integer values
• No specific encoding

[Figure: bit-vector signals x0, x1, ..., xn-1 collapsed into a single term x.]

Function Abstraction: Represent functional units with uninterpreted functions
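Function abstraction can be illustrated with a memoized table: the only property an uninterpreted function keeps is functional consistency (equal arguments always give equal results), while the unit's word-level semantics are deliberately dropped. The class below is a hypothetical helper sketching that behavior, not the paper's modeling language.

```python
import random

class UninterpretedFunction:
    """Sketch of an uninterpreted function: returns an arbitrary value
    per argument tuple, but the same value on every repeated call."""
    def __init__(self, width=16):
        self.table = {}
        self.width = width

    def __call__(self, *args):
        # Memoize so that equal arguments yield equal results
        # (functional consistency), regardless of the chosen values.
        if args not in self.table:
            self.table[args] = random.randrange(1 << self.width)
        return self.table[args]
```

For example, `alu = UninterpretedFunction()` gives `alu(op, a, b) == alu(op, a, b)` on every call, which is exactly the property that equivalence proofs over UFs rely on.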