
DUSD(Labs) GSRC bX update, March 2003
Aaron Ng, Marius Eriksen and Igor Markov, University of Michigan

Transcript
Page 1

DUSD(Labs)
GSRC
bX update
March 2003
Aaron Ng, Marius Eriksen and Igor Markov
University of Michigan

Page 2

Outline

Motivation, issues in benchmarking
bX in the picture
Sample application: Evaluation of tools
Future focus
Contact info, links

Page 3

Motivation, issues in benchmarking

1. Evaluation
• independent reproduction of results and experiments
• explicit methods required
  - minimum room for misinterpretation of results
• evaluation of algorithms across the entire problem space
  - conflicting and correlating optimization objectives
  - separation of placement and routing tasks

Page 4

Motivation, issues in benchmarking (cont’d)

2. Availability of results
• raw experimental results
• availability allows verification
• results provide insight into the performance of a tool

Page 5

Motivation, issues in benchmarking (cont’d)

3. Standard formats
• meaningful comparison of results
• compatibility between tools and benchmarks
• correct interpretation of benchmarks

Page 6

bX in the picture

1. Automation
• ‘live’ repository
  - support for execution of tools on benchmarks
  - distributed network of computational hosts
• online reporting of results
  - automatic updates when changes in dependencies occur
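As a rough illustration of the ‘live repository’ idea above (results that are re-computed automatically when something they depend on changes), here is a small dependency-tracking sketch in Python. The Repository and Job classes, their fields, and the version strings are hypothetical, invented for this example; they are not bX’s actual implementation.

```python
# Hypothetical sketch of a 'live' repository: each result depends on a tool
# version, and uploading a new version re-queues every affected job.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Job:
    tool: str                      # e.g. "Capo"
    benchmark: str                 # e.g. "PEKO"
    tool_version: str
    result: Optional[str] = None   # filled in after execution on some host

class Repository:
    def __init__(self):
        self.tool_versions = {}    # tool name -> current version
        self.jobs = []

    def submit(self, tool, benchmark):
        job = Job(tool, benchmark, self.tool_versions[tool])
        self.jobs.append(job)
        return job

    def upload_tool(self, tool, version):
        """Register a new tool version and mark dependent jobs for re-execution."""
        self.tool_versions[tool] = version
        stale = [j for j in self.jobs if j.tool == tool and j.tool_version != version]
        for j in stale:
            j.tool_version, j.result = version, None   # re-run, results re-posted online
        return stale

repo = Repository()
repo.upload_tool("Capo", "v1")
repo.submit("Capo", "PEKO")
print(len(repo.upload_tool("Capo", "v2")), "job(s) re-queued")   # -> 1
```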

Page 7

bX in the picture (cont’d)

2. Scripts and flows
• reproduction of results
  - scripts and flows describe experiments
  - scripts can be saved, shared and reused
• representation of the entire problem space
  - relationship between optimization objectives, e.g. the effect of placement results on routing

Page 8

bX in the picture (cont’d)

3. Standard formats
• interoperability between tools and benchmarks
• meaningful comparison of results

Page 9

Sample application: Evaluation of tools

1. Placers
• Capo
  - randomized, fixed-die placer
  - emphasis on routability
  - tuned on proprietary Cadence benchmarks

Page 10

Sample application: Evaluation of tools (cont’d)

1. Placers (cont’d)
• Dragon
  - randomized, variable-die placer
  - tuned on IBM-Place benchmarks

Page 11

Sample application: Evaluation of tools (cont’d)

1. Placers (cont’d)
• KraftWerk
  - deterministic, fixed-die placer
  - results typically have cell overlaps
  - additional legalization step by DOMINO

Page 12

Sample application: Evaluation of tools (cont’d)

2. Benchmarks
• PEKO
  - artificial netlists designed to match statistical parameters of IBM netlists
  - known optimal wirelength
  - concern that they are not representative of industry circuits

Page 13

Sample application: Evaluation of tools (cont’d)

2. Benchmarks (cont’d)
• grids
  - 4 fixed vertices, n² 1x1 movables
  - tests placers on datapath-like circuits
  - known optimal placement
  - results are easily visualized for debugging
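Since the grid benchmarks are fully described by the bullets above (4 fixed corner vertices, n² unit-size movable cells, a known optimal placement), a small generator is easy to sketch. This is a hypothetical reconstruction for illustration, not the actual Bookshelf generator; in particular the neighbour-net structure is an assumption.

```python
# Hypothetical reconstruction of a 'grid' benchmark: n*n unit cells whose
# known optimal placement is the n x n lattice pinned by 4 fixed corners.
def make_grid_benchmark(n):
    fixed = {                                   # 4 fixed vertices at the corners
        "pad_ll": (0, 0), "pad_lr": (n - 1, 0),
        "pad_ul": (0, n - 1), "pad_ur": (n - 1, n - 1),
    }
    movables = [f"cell_{x}_{y}" for x in range(n) for y in range(n)]
    # Assumed net model: each cell connects to its right and upper neighbour,
    # which is what makes the regular lattice the wirelength-optimal placement.
    nets = [(f"cell_{x}_{y}", f"cell_{x + 1}_{y}")
            for x in range(n - 1) for y in range(n)]
    nets += [(f"cell_{x}_{y}", f"cell_{x}_{y + 1}")
             for x in range(n) for y in range(n - 1)]
    # Known optimal placement: cell_x_y sits exactly at lattice point (x, y),
    # so a placer's output can be checked (and visualized) against it directly.
    optimal = {f"cell_{x}_{y}": (x, y) for x in range(n) for y in range(n)}
    return fixed, movables, nets, optimal

fixed, movables, nets, optimal = make_grid_benchmark(8)
print(len(movables), "movable 1x1 cells,", len(nets), "two-pin nets")
```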

Page 14

Sample application: Evaluation of tools (cont’d)

3. Example flow
[Flow diagram: placer (with benchmark and parameters) → placement → evaluator → evaluation → post-processor → post-processing]

A script in bX serves as a template describing an experiment, and can be saved and shared.
Scripts are instantiated by defining the individual components of the script.
Flows are instantiated scripts.
Flows can be re-executed to reproduce results.
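A minimal sketch of the script/flow relationship this slide describes: the script is a template with named component slots, and a flow is the script with every slot bound to a concrete choice. The dictionary layout and the instantiate helper are assumptions for illustration, not bX’s real script format.

```python
# Minimal sketch under assumed names (not bX's actual format): a script is a
# template with open component slots; a flow is the script fully bound.
PLACEMENT_SCRIPT = {
    "pipeline": ["placer", "evaluator", "post_processor"],
    "slots": ["placer", "benchmark", "parameters", "evaluator", "post_processor"],
}

def instantiate(script, **components):
    """Bind concrete components to every slot of the script, yielding a flow."""
    missing = [s for s in script["slots"] if s not in components]
    if missing:
        raise ValueError(f"unbound slots: {missing}")
    return {"script": script, "components": dict(components)}

# The flow shown on the following slides, as one such instantiation.
flow = instantiate(
    PLACEMENT_SCRIPT,
    placer="Capo",
    benchmark="PEKO",
    parameters="(default)",
    evaluator="overlap/legality & wirelength",
    post_processor="placement map",
)
# Because the flow is just data, it can be saved, shared, and re-executed
# later to reproduce the same experiment.
```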

Page 15

Sample application: Evaluation of tools (cont’d)

3. Example flow (cont’d)
Flow parameters:
  placer: Capo
  benchmark: PEKO
  parameters: (default)
  evaluator: overlap/legality & wirelength
  post-processor: placement map

After completion, the results of the jobs will be automatically posted online.
In the case of the placement job, the results include wirelength and runtime.

Page 16

Sample application: Evaluation of tools (cont’d)

3. Example flow (cont’d)
[Flow parameters unchanged: placer Capo, benchmark PEKO, parameters (default), evaluator overlap/legality & wirelength, post-processor placement map]

Page 17

Sample application: Evaluation of tools (cont’d)

3. Example flow (cont’d)
If we swapped Capo with Dragon:
[Flow before the swap: placer Capo, benchmark PEKO, parameters (default), evaluator overlap/legality & wirelength, post-processor placement map]

Page 18

Sample application: Evaluation of tools (cont’d)

3. Example flow (cont’d)
If we swapped Capo with Dragon:
[Flow after the swap: placer Dragon, benchmark PEKO, parameters (default), evaluator overlap/legality & wirelength, post-processor placement map]

Page 19

Sample application: Evaluation of tools (cont’d)

3. Example flow (cont’d)
If we swapped Capo with Dragon:
[Flow after the swap, as on the previous slide: placer Dragon, benchmark PEKO, parameters (default), evaluator overlap/legality & wirelength, post-processor placement map]
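The swap shown on the last three slides amounts to rebinding a single slot of the flow while everything else is reused. A sketch, with the flow represented as a plain dictionary purely for illustration:

```python
# Illustration only: swapping placers is a one-field change to the flow.
capo_flow = {
    "placer": "Capo", "benchmark": "PEKO", "parameters": "(default)",
    "evaluator": "overlap/legality & wirelength", "post_processor": "placement map",
}
dragon_flow = {**capo_flow, "placer": "Dragon"}   # all other bindings reused as-is
print(dragon_flow)
```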

Page 20

Sample application: Evaluation of tools (cont’d)

3. Example flow (cont’d)
[Full flow diagram: placer Capo, benchmark PEKO, parameters (default) → placement → evaluator (overlap/legality & wirelength) → evaluation → post-processor (placement map) → post-processing]

Page 21

Sample application: Evaluation of tools (cont’d)

3. Example flow (cont’d)
Modify the flow:
[Flow: placer Capo, benchmark PEKO, parameters (default), evaluator overlap/legality & wirelength, post-processor placement map]

Page 22

Sample application: Evaluation of tools (cont’d)

3. Example flow (cont’d)
Modify the flow:
[Flow: placer Capo, benchmark grid, parameters (default), evaluator overlap/legality & wirelength, post-processor placement map]

Page 23

Sample application: Evaluation of tools (cont’d)

3. Example flow (cont’d)
Modify the flow:
[Flow: placer Capo, benchmark grid, parameters (default), evaluator overlap/legality & wirelength, post-processor grid graph]

Page 24

Sample application: Evaluation of tools (cont’d)

3. Example flow (cont’d)
Modify the flow:
[Flow, as on the previous slide: placer Capo, benchmark grid, parameters (default), evaluator overlap/legality & wirelength, post-processor grid graph]

Page 25

Sample application: Evaluation of tools (cont’d)

3. Example flow (cont’d)
Swap Capo with Dragon:
[Flow before the swap: placer Capo, benchmark grid, parameters (default), evaluator overlap/legality & wirelength, post-processor grid graph]

Page 26

Sample application: Evaluation of tools (cont’d)

3. Example flow (cont’d)
Swap Capo with Dragon:
[Flow after the swap: placer Dragon, benchmark grid, parameters (default), evaluator overlap/legality & wirelength, post-processor grid graph]

Page 27

Sample application: Evaluation of tools (cont’d)

3. Example flow (cont’d)
Swap Capo with Dragon:
[Flow after the swap, as on the previous slide: placer Dragon, benchmark grid, parameters (default), evaluator overlap/legality & wirelength, post-processor grid graph]

Page 28

Sample application: Evaluation of tools (cont’d)

3. Example flow (cont’d)
[Full flow diagram: placer Capo, benchmark PEKO, parameters (default), evaluator overlap/legality & wirelength, post-processor congestion map]

Page 29

Sample application: Evaluation of tools (cont’d)

3. Example flow (cont’d)
[Full flow diagram: placer KraftWerk, benchmark PEKO, parameters (default), evaluator overlap/legality & wirelength, post-processor congestion map]

Page 30

Sample application: Evaluation of tools (cont’d)

3. Example flow (cont’d)
[Full flow diagram, as on the previous slide: placer KraftWerk, benchmark PEKO, parameters (default), evaluator overlap/legality & wirelength, post-processor congestion map]

Page 31

Sample application: Evaluation of tools (cont’d)

3. Example flow (cont’d)
[Flow diagram: a legalizer stage (legalization) is inserted after the KraftWerk placement; benchmark PEKO, parameters (default), evaluator overlap/legality & wirelength, post-processor congestion map]

Page 32

Sample application: Evaluation of tools (cont’d)

3. Example flow (cont’d)
[Flow diagram: DOMINO is bound as the legalizer; placer KraftWerk, benchmark PEKO, parameters (default), evaluator overlap/legality & wirelength, post-processor congestion map]

Page 33

Sample application: Evaluation of tools (cont’d)

3. Example flow (cont’d)
[Flow diagram, as on the previous slide: placer KraftWerk, benchmark PEKO, parameters (default), legalizer DOMINO, evaluator overlap/legality & wirelength, post-processor congestion map]

Page 34

Sample application: Evaluation of tools (cont’d)

3. Example flow (cont’d)
[Flow diagram, as on the previous slide: placer KraftWerk, benchmark PEKO, parameters (default), legalizer DOMINO, evaluator overlap/legality & wirelength, post-processor congestion map]
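Because KraftWerk’s placements typically contain overlaps (slide 11), these slides insert a DOMINO legalization stage between placement and evaluation. Below is a sketch of that edit, with the pipeline modeled as an ordered list of (stage, tool) pairs; the representation and the insert_after helper are assumptions, not bX’s actual flow format.

```python
# Sketch under an assumed representation: the flow's pipeline is an ordered
# list of (stage, tool) pairs, and adding legalization is a positional insert.
pipeline = [
    ("placer", "KraftWerk"),
    ("evaluator", "overlap/legality & wirelength"),
    ("post_processor", "congestion map"),
]

def insert_after(stages, after_stage, new_stage):
    """Return a new pipeline with new_stage placed right after after_stage."""
    i = next(i for i, (name, _) in enumerate(stages) if name == after_stage)
    return stages[:i + 1] + [new_stage] + stages[i + 1:]

# Legalize the overlapping KraftWerk placement before evaluating it.
pipeline = insert_after(pipeline, "placer", ("legalizer", "DOMINO"))
for stage, tool in pipeline:
    print(f"{stage:>15}  {tool}")
```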

Page 35

Sample application: Evaluation of tools (cont’d)

3. Example flow (cont’d)
[Flow diagram: a router stage (routing) is added after legalization; placer KraftWerk, benchmark PEKO, parameters (default), legalizer DOMINO, evaluator overlap/legality & wirelength, post-processor congestion map]

Page 36

Sample application: Evaluation of tools (cont’d)

3. Example flow (cont’d)
[Flow diagram, as on the previous slide: placer KraftWerk, benchmark PEKO, parameters (default), legalizer DOMINO, router, evaluator overlap/legality & wirelength, post-processor congestion map]

Page 37

Sample application: Evaluation of tools (cont’d)

3. Example flow (cont’d)
[Flow diagram rearranged: placer KraftWerk, benchmark PEKO, parameters (default), legalizer DOMINO, router, evaluator overlap/legality & wirelength, post-processor]

Page 38

Sample application: Evaluation of tools (cont’d)

3. Example flow (cont’d)
[Flow diagram: the evaluator is now labelled evaluator1 (overlap/legality & wirelength); placer KraftWerk, benchmark PEKO, parameters (default), legalizer DOMINO]

Page 39

Sample application: Evaluation of tools (cont’d)

3. Example flow (cont’d)
[Flow diagram: a second evaluator (evaluator2) is added, producing its own evaluation; placer KraftWerk, benchmark PEKO, parameters (default), legalizer DOMINO, evaluator1 overlap/legality & wirelength]

Page 40

Sample application: Evaluation of tools (cont’d)

3. Example flow (cont’d)
[Flow diagram: evaluator2 is bound to routability; placer KraftWerk, benchmark PEKO, parameters (default), legalizer DOMINO, evaluator1 overlap/legality & wirelength]

Page 41

Sample application: Evaluation of tools (cont’d)

3. Example flow (cont’d)
[Flow diagram: a third evaluator (evaluator3) is added, producing its own evaluation; placer KraftWerk, benchmark PEKO, parameters (default), legalizer DOMINO, evaluator1 overlap/legality & wirelength, evaluator2 routability]

Page 42

Sample application: Evaluation of tools (cont’d)

3. Example flow (cont’d)
[Flow diagram: evaluator3 is bound to timing analysis; placer KraftWerk, benchmark PEKO, parameters (default), legalizer DOMINO, evaluator1 overlap/legality & wirelength, evaluator2 routability]
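Slides 38 through 42 fan several evaluators out of the same flow: wirelength/legality, routability, and timing analysis, each producing its own evaluation. Here is a sketch of that fan-out with stand-in evaluator functions; the function names and the empty report fields are placeholders, not the real tools.

```python
# Stand-in evaluators only: one placement, several independent evaluations.
def eval_wirelength_legality(placement):   # evaluator1
    return {"wirelength": None, "overlaps": None}

def eval_routability(placement):           # evaluator2
    return {"congestion": None}

def eval_timing(placement):                # evaluator3
    return {"worst_slack": None}

evaluators = {
    "overlap/legality & wirelength": eval_wirelength_legality,
    "routability": eval_routability,
    "timing analysis": eval_timing,
}

placement = {}   # would come from the KraftWerk + DOMINO stages of the flow
reports = {name: evaluate(placement) for name, evaluate in evaluators.items()}
print(sorted(reports))
```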

Page 43

Future focus

1. Easy deployment
• downloadable bX distribution
  - in the form of a binary or installation package

Page 44

Future focus (cont’d)

2. Interpretation of results
• multiple views and query support
• for example:
  - ‘show all results for solver S’
  - ‘show the hardest benchmarks for solver S’
  - ‘has the solution quality decreased for benchmark B since the upload of the new version of solver S?’
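The three example queries above map naturally onto filters over a table of result records. Below is a sketch that assumes a flat list of records with solver, version, benchmark, and quality fields; the schema, the helper names, and the toy records are made up for illustration (lower quality is treated as better, as with wirelength), and this is not a bX query language.

```python
# Illustration only: the slide's example queries as filters over a flat
# list of result records (the schema and toy records are placeholders).
results = [
    {"solver": "S", "version": "1.0", "benchmark": "B", "quality": 100.0},
    {"solver": "S", "version": "1.1", "benchmark": "B", "quality": 105.0},
]

def results_for(solver):
    """'Show all results for solver S.'"""
    return [r for r in results if r["solver"] == solver]

def hardest_benchmarks(solver, n=5):
    """'Show the hardest benchmarks for solver S' (worst quality first)."""
    return sorted(results_for(solver), key=lambda r: r["quality"], reverse=True)[:n]

def quality_decreased(solver, benchmark, old_version, new_version):
    """'Has the solution quality decreased for benchmark B since the upload
    of the new version of solver S?' (lower quality value = better here)."""
    def best(version):
        values = [r["quality"] for r in results_for(solver)
                  if r["benchmark"] == benchmark and r["version"] == version]
        return min(values) if values else None
    old, new = best(old_version), best(new_version)
    return old is not None and new is not None and new > old

print(quality_decreased("S", "B", "1.0", "1.1"))   # -> True for the toy records
```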

Page 45

Future focus (cont’d)

3. Type checking
• MIME-like affinity between solvers and benchmarks
  - compatibility checks
  - useful for performing queries on different ‘families’
• ‘learning’ of new file types
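The MIME-like affinity could look roughly like the registry below: each solver declares the benchmark types it accepts, so compatibility checks and ‘family’ queries become simple lookups, and ‘learning’ a new file type is just adding an entry. All type strings and registry names here are invented for illustration.

```python
# Hypothetical MIME-like typing: solvers declare the benchmark types they
# accept, so compatibility and 'family' queries are plain lookups.
ACCEPTS = {
    "Capo":      {"placement/bookshelf"},
    "Dragon":    {"placement/bookshelf"},
    "KraftWerk": {"placement/bookshelf"},
}
BENCHMARK_TYPE = {"PEKO": "placement/bookshelf", "grid": "placement/bookshelf"}

def compatible(solver, benchmark):
    """Check solver/benchmark affinity before queuing a job."""
    return BENCHMARK_TYPE.get(benchmark) in ACCEPTS.get(solver, set())

def family(benchmark_type):
    """Query one 'family': every solver that accepts this benchmark type."""
    return sorted(s for s, types in ACCEPTS.items() if benchmark_type in types)

def learn_type(name, mime_type):
    """'Learn' a new file type by registering it at run time."""
    BENCHMARK_TYPE[name] = mime_type

print(compatible("Capo", "PEKO"))            # True
print(family("placement/bookshelf"))         # ['Capo', 'Dragon', 'KraftWerk']
```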

Page 46

Future focus (cont’d)

4. GSRC Bookshelf
• populate bX with implementations from the Bookshelf
  - still the same ‘one-stop shop’, except that it will be a live repository

Page 47

Future focus (cont’d)

5. OpenAccess
• method of communicating data between jobs
  - provides interoperability between tools
• single ‘design-through-manufacturing’ data model

Page 48

Contact info, links

For more info or source code: [email protected]
Feedback and comments are appreciated.

OpenAccess: www.openeda.org, www.cadence.com/feature/open_access.html
GSRC Bookshelf: www.gigascale.org/bookshelf

Thanks!