Model Driven Techniques for Evaluating QoS of Middleware Configurations

Transcript
Page 1: Model Driven Techniques for Evaluating  QoS of Middleware Configurations

Model Driven Techniques for Evaluating QoS of Middleware Configurations

Arvind S. Krishna, Emre Turkay, Andy Gokhale, & Douglas C. Schmidt

Institute for Software Integrated Systems (ISIS)

Vanderbilt University, Nashville, TN 37203

Real-time Application Symposium (RTAS 2005)

San Francisco, California

Page 2: Model Driven Techniques for Evaluating  QoS of Middleware Configurations

Presentation Summary

Component middleware technologies
• Focus on business logic
• Automate the plumbing code needed to configure & deploy middleware
• Components encapsulate business logic
• Difficulty in provisioning & deploying
• Error-prone task of handcrafting XML

Model Driven Development (MDD) generative technologies
• Focus is on:
  • Modeling – system composition technique
  • Validating – correct by construction
  • Generating – deployment & configuration info for multiple layers of middleware
• Supports configuring, provisioning, & deploying Quality of Service (QoS)-enabled middleware

This presentation addresses key configuration & QoS evaluation challenges of middleware for DRE applications

Page 3: Model Driven Techniques for Evaluating  QoS of Middleware Configurations

Motivating DRE Application

Robot Assembly Application
• Human Machine Interface (HMI) Component – human accepts/rejects the watch
• Management Work Instructions (MWI) Component – decides what action to perform on the watch, e.g., set the appropriate time
• Watch Setting Manager (WSM) Component – executes the action on every watch
• Pallet Conveyor Manager (PCM) Component – watch assembly line that moves watches from source to destination
• Robot Manager Component – robotic arm that moves the watches

Goal
• Increase the number of items processed by minimizing end-to-end latency

Page 4: Model Driven Techniques for Evaluating  QoS of Middleware Configurations

Robot Assembly Challenges (1/2)

Configuration Challenges
• Map component-level features & requirements to middleware configurations
• WSM component interacts with HMI & Pallet Conveyor Manager components
• Configuring component properties
• Configuring package properties
• Configuring underlying middleware

[Figure: middleware configuration hooks, including hooks for the concurrency strategy, request demuxing strategy, marshaling strategy, connection management strategy, underlying transport strategy, & event demuxing strategy]

Page 5: Model Driven Techniques for Evaluating  QoS of Middleware Configurations

Robot Assembly Challenges (2/2)

Configuration Evaluation Challenges
• How do we ensure the chosen middleware configurations serve the overall goal of the system, i.e., minimizing its end-to-end latency?
• What configuration of the middleware hosting the HMI & WSM components leads to the best end-to-end latency?

[Figure: critical flow path through the Human Machine Interface, Management Work Instructions, Watch Setting Manager, Robot Manager, & Pallet Conveyor Manager components]

Page 6: Model Driven Techniques for Evaluating  QoS of Middleware Configurations

Research Challenges

[Figure: the CoSMIC tool chain (www.dre.vanderbilt.edu/cosmic) spans specification, assembly, packaging, configuration, planning, & feedback: (1) the component developer develops component implementations with their properties & resource requirements; (2) the component assembler assembles components into component assemblies; (3) the component packager packages assemblies into component packages; (4) the component configurator configures them; (5) the component deployer performs deployment planning; (6) the system analyzer performs analysis & benchmarking; & (7) results feed back into configuration & planning]

Key challenges:
• Ensuring syntactically & semantically valid middleware configurations
• Understanding the consequences of deployment decisions on overall QoS
• Alleviating accidental complexities in evaluating/benchmarking QoS

Page 7: Model Driven Techniques for Evaluating  QoS of Middleware Configurations

Resolving Configuration Challenges (1/2)

Context
• Different middleware implementations provide different mechanisms to configure the middleware
• CIAO provides service configuration options to tune middleware performance
• www.dre.vanderbilt.edu/CIAO.html

Problem
Handcrafting these options is error prone (see the sketch after this list) since developers:
• Need to know the syntax
• Need to remember the names of strategies
• Need to know which strategies are compatible
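To make the problem concrete, the sketch below shows the kind of svc.conf entries a developer would otherwise write by hand. The option names are genuine TAO service configurator options, but this particular combination is an illustrative assumption on our part, not a configuration prescribed by the paper:

  static Advanced_Resource_Factory "-ORBReactorType select_mt"
  static Server_Strategy_Factory "-ORBConcurrency thread-per-connection"
  static Client_Strategy_Factory "-ORBTransportMuxStrategy EXCLUSIVE -ORBClientConnectionHandler RW"

A typo in any of these strings, or an incompatible pairing of strategies, surfaces at ORB initialization time at the earliest, which is precisely the class of error OCML moves to model construction time.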

Page 8: Model Driven Techniques for Evaluating  QoS of Middleware Configurations

Resolving Configuration Challenges (2/2)

Solution
• Developed a domain-specific modeling language for TAO/CIAO called the Options Configuration Modeling Language (OCML)
• OCML ensures the syntactic & semantic validity of middleware configurations
  • Detects errors at model construction time
• OCML is used by
  • Middleware developers to design the configuration model
  • Application developers to configure the middleware for a specific application
• The OCML metamodel is platform-independent
• OCML models are platform-specific
• Generates a wizard to set configuration options & provides documentation for each option

Page 9: Model Driven Techniques for Evaluating  QoS of Middleware Configurations

Resolving Evaluation Challenges (1/3)

Context
• Component integrators must make appropriate deployment decisions, including identifying the entities (e.g., CPUs) of the target environment where the packages will be deployed

[Figure: candidate deployment of the Human Machine Interface, Watch Setting Manager, Pallet Conveyor Manager, & Robot Manager components onto target nodes]

Problem
• How do we ensure a particular deployment configuration meets QoS requirements?
• How do we simulate load & background load for benchmarking?
• How do we measure & monitor QoS for a given deployment?

Page 10: Model Driven Techniques for Evaluating  QoS of Middleware Configurations

Resolving Evaluation Challenges (2/3)

Solution
• Provide a model-driven tool suite to empirically evaluate & refine configurations to maximize application QoS

BGML Workflow
1. End-user composes the scenario in the BGML modeling paradigm
2. Associates QoS properties with this scenario, such as latency, throughput, or jitter
3. Synthesizes the appropriate test code to run the experiment & measure the QoS
4. Feeds the metrics back into the models to verify whether the system meets the appropriate QoS at design time

[Figure: the experimenter (1) models the experiment's component interaction in BGML, (2) associates QoS characteristics, (3) synthesizes & executes the experiment on the testbed, producing IDL, .cpp, & script files, & (4) feeds the results back into the model]

• The tool enables synthesis of all the scaffolding code required to set up, run, & tear down the experiment
• Using BGML it is possible to synthesize:
  • Benchmarking code
  • Component implementation code
  • Build & component IDL files

Page 11: Model Driven Techniques for Evaluating  QoS of Middleware Configurations

Resolving Evaluation Challenges (3/3)

• Each configuration option can then be tested to identify the configuration that maximizes the QoS for the scenario

• These empirically refined configurations can be reused across applications in similar or identical domains

• These configurations can be viewed as Configuration & Customization (C&C) patterns

template <typename T>
void
Benchmark_AcceptWorkOrderResponse<T>::svc (void)
{
  // Record up to 5,000 latency samples.
  ACE_Sample_History history (5000);
  ACE_hrtime_t test_start = ACE_OS::gethrtime ();

  // Scale factor for converting raw high-resolution ticks to microseconds.
  ACE_UINT32 gsf = ACE_High_Res_Timer::global_scale_factor ();

  for (int i = 0; i < 5000; i++)
    {
      // Time-stamp each two-way AcceptWorkOrderResponse invocation.
      ACE_hrtime_t start = ACE_OS::gethrtime ();
      (void) this->remote_ref_->AcceptWorkOrderResponse (arg0, arg1);
      ACE_CHECK;
      ACE_hrtime_t now = ACE_OS::gethrtime ();
      history.sample (now - start);
    }
}
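Once the loop completes, the recorded samples can be reduced to the tabulated latency statistics. The following continuation is our sketch, using the standard ACE_Sample_History & ACE_Basic_Stats calls rather than code shown in the paper:

  // Reduce the recorded samples to min/max/mean statistics.
  ACE_Basic_Stats stats;
  history.collect_basic_stats (stats);

  // Report the results scaled from raw ticks to microseconds via gsf.
  stats.dump_results (ACE_TEXT ("AcceptWorkOrderResponse latency"), gsf);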

• BGML allows actual composition of the target interaction scenario & auto-generates the benchmarking code

Page 12: Model Driven Techniques for Evaluating  QoS of Middleware Configurations

Need for Tool Integration (MDD Process) (1/2)

Context
• The OCML tool resolves accidental complexity in configuring components
• The BGML tool resolves accidental complexity in evaluating QoS

Problem
• Using each tool in isolation does not provide complete information
  • OCML does not know about performance
  • BGML does not know what the configuration is

[Figure: OCML provides a correct configuration for the middleware hosting the Human Machine Interface, while BGML measures the critical flow path latency]

Page 13: Model Driven Techniques for Evaluating  QoS of Middleware Configurations

Need for Tool Integration (MDD Process) (2/2)

Solution: an MDD process leveraging PICML, OCML, & BGML
• PICML – model the interaction scenario, deployment, & component configuration
• OCML – model the middleware hosting the individual components
• BGML – capture the evaluation concerns

Apply the MDD process to a DRE application scenario to answer:
• How does middleware configuration affect QoS?
• How do deployment decisions affect QoS?

[Figure: PICML, OCML, & BGML models feed the MDD process, which yields the candidate configuration(s) with the least latency]

Page 14: Model Driven Techniques for Evaluating  QoS of Middleware Configurations

MDD Process (1/3)

Step 1: PICML Tool
• PICML is used to generate the deployment plan information

Step 2: Middleware Configuration
• OCML models are associated with the implementation artifacts
• OCML provides a wizard with documentation to configure the artifacts
• Configures the middleware that hosts the "executors", a.k.a. servants in CORBA 2.x

[Figures: the PICML deployment model shows the mapping of components to virtual nodes & their process collocation; the OCML wizard shows the artifact, option selection, & documentation pane]

Page 15: Model Driven Techniques for Evaluating  QoS of Middleware Configurations

MDD Process (2/3)

Step 2: Choosing Configurations
• How best to configure the middleware hosting the HMI & WSM components to minimize end-to-end latency?
• Component roles:
  • Display (HMI) component – pure client
  • Watch Setting Manager component – "peer" role, does not need concurrency
• For each component (e.g., the Display), narrow down the selected configurations:
  • Fixed part – determined a priori
  • Dynamic part – cannot be determined without testing

[Figure: the configuration space spanned by the HMI & WSM components]

Step 3: Capturing QoS Concerns
• Profile & generate multiple work orders exchanged between the Watch Setting Manager component & the human for acceptance/rejection
• Use timers to measure the end-to-end critical path latency in the scenario
• The same code can be used to evaluate different combinations of configurations, as sketched below
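A minimal sketch of such a sweep (our illustration, not the generated scripts, which the paper notes were not yet auto-generated): regenerate svc.conf for each combination of option values, then re-run the unchanged benchmark binary. The option strings are real TAO options, while the two-option space & the ./run_benchmark driver name are assumptions:

  #include <cstdio>
  #include <cstdlib>

  int main ()
  {
    // Two illustrative option dimensions; the paper's experiments
    // sweep a larger space (64 combinations in total).
    const char *concurrency[] = { "reactive", "thread-per-connection" };
    const char *handler[] = { "ST", "RW" };

    for (int g = 0; g < 2; ++g)
      for (int h = 0; h < 2; ++h)
        {
          // Regenerate the svc.conf the benchmark reads at start-up.
          std::FILE *f = std::fopen ("svc.conf", "w");
          std::fprintf (f,
                        "static Server_Strategy_Factory \"-ORBConcurrency %s\"\n"
                        "static Client_Strategy_Factory \"-ORBClientConnectionHandler %s\"\n",
                        concurrency[g], handler[h]);
          std::fclose (f);

          // Re-run the same benchmark binary with the new configuration.
          std::system ("./run_benchmark");
        }
    return 0;
  }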

Page 16: Model Driven Techniques for Evaluating  QoS of Middleware Configurations

MDD Process (3/3)

Solution: Workspace & Glue Generation
• Create a workspace & projects to generate the build files for the scenario

To enact a scenario, this process automates generation of:
• Deployment plan – XML deployment information
• svc.conf – configuration for each component implementation
• Benchmark code – source code for executing the benchmarks, including the time-stamping of sends & receives & the load generator for the accept operation
• IDL & CIDL files
• Build files – MPC files (www.ociweb.com)

The generated MPC workspace lists the projects holding these artifacts (a sketch of one such project follows):

  workspace {
    RobotManager
    WatchSettingManager
    PalletteConveyorManager
    HumanMachineInterface
    ManagementWorkInstructions
  }
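For reference, each project named in the workspace is itself described by an MPC file. A minimal hypothetical sketch follows; the taoexe base project & the file names are our assumptions about what the generated files resemble:

  project(HumanMachineInterface) : taoexe {
    exename = HumanMachineInterface
    Source_Files {
      HumanMachineInterface.cpp
    }
  }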

Page 17: Model Driven Techniques for Evaluating  QoS of Middleware Configurations

Experimental Results / Highlights (1/3)

Experiment Execution
• In total, we conducted 64 experiments covering different combinations of the Human Machine Interface & Watch Setting Manager component configurations
• The latency measures were tabulated to look for the configuration that minimized latency (illustrated below)
• The corresponding end-to-end measures were also checked
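As a minimal illustration of that tabulation step (our sketch, not the paper's tooling), selecting the latency-minimizing configuration is a single scan over the measured results:

  #include <limits>
  #include <map>
  #include <string>

  // Hypothetical table: configuration label -> measured mean latency (usec).
  std::string
  best_configuration (const std::map<std::string, double> &latency_usec)
  {
    std::string best;
    double min_latency = std::numeric_limits<double>::max ();
    for (std::map<std::string, double>::const_iterator it = latency_usec.begin ();
         it != latency_usec.end (); ++it)
      if (it->second < min_latency)
        {
          min_latency = it->second;
          best = it->first;  // e.g. the "(G1,H1,I2,J2)" entry
        }
    return best;
  }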

Automation / Code Generation

  DRE Experimental Scenario   Total Files/Lines of Code Required   Automated by MDD Process
  Robot Assembly              65 files (includes IDL/CIDL)         60 files (script files not generated yet)
  BasicSP                     54 files (includes IDL/CIDL)         49 files auto-generated

• Automated execution of experiments: scripts are used to set up & tear down the experiments

Page 18: Model Driven Techniques for Evaluating  QoS of Middleware Configurations

Experimental Results / Highlights (2/3)

Observations
• Similar configurations affected QoS similarly
• For both cases we observed that (G1,H1,I2,J2) minimized latency the most
• Both cases showed that G is the most important configuration option
  • The penalty for not setting G to G1 is ~4 µsecs in BasicSP & ~60 µsecs in Robot Assembly
• The other options are not important, i.e., setting them or leaving them at their defaults leads to the same behavior
• The figure shows a visualization of the configuration space
  • Circles represent points in the configuration space
  • An edge represents the performance degradation incurred by moving from one point to another
• Defining operating regions enables setting the more important configuration options while allowing flexibility in the others

Page 19: Model Driven Techniques for Evaluating  QoS of Middleware Configurations

Experimental Results / Highlights (3/3)

How does the platform affect QoS?
• Provides feedback on the deployment plan, i.e., the component-to-node mappings
• BasicSP scenario: tried the two combinations shown below
• Process:
  • No changes required from the earlier experiment: capture the same end-to-end latency
  • Change the component-to-node mapping to regenerate the deployment plan
  • Observe & tabulate the latency changes
• In real-time systems, component placement is decided a priori, i.e., the software is tied to the hardware
• During failures it is important to decide where to place components to ensure QoS
  • This process aids in making that decision

[Figure: two BasicSP deployments across the ACEDOC & TANGO nodes connected by Ethernet — (a) Display & Airframe on one node with GPS on the other, & (b) GPS & Airframe on one node with Display on the other]

Page 20: Model Driven Techniques for Evaluating  QoS of Middleware Configurations

Concluding Remarks

• The MDD process provides a flexible model-based approach for evaluating the QoS of middleware configurations
• It auto-generates most of the code required to run the experiments
  • OCML does not automatically generate the configuration space
  • The scripts for automatically evaluating different configurations were not generated
• Feedback to the "planner" allows refinement of configurations during the testing phase
• Identifying patterns in configurations allows mapping features directly onto middleware configurations

Our future work:
• EMULab ns-style script generation for easy simulation
• Strategies for interfacing with higher-level performance monitoring tools

[Figure: a patterns database & scoreboard support identifying configuration patterns & mapping features to configurations]

Page 21: Model Driven Techniques for Evaluating  QoS of Middleware Configurations

Downloading the Middleware & Tools

OCML & BGML are part of the CoSMIC MDD tool suite
• http://www.dre.vanderbilt.edu/cosmic
• Beta & stable releases can be accessed from http://www.dre.vanderbilt.edu/Download.html