Module 1 DS324EE – DataStage Enterprise Edition Concept Review

Transcript
Page 1: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Module 1

DS324EE – DataStage Enterprise Edition

Concept Review

Page 2: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Ascential’s Enterprise Data Integration Platform

[Diagram: data flows from ANY SOURCE (CRM, ERP, SCM, RDBMS, legacy, real-time client-server, Web services) to ANY TARGET (data warehouse, BI/analytics, RDBMS, other apps.), over shared Command & Control, Meta Data Management, and Parallel Execution services.]

DISCOVER – Data Profiling: gather relevant information for target enterprise applications

PREPARE – Data Quality: cleanse, correct, and match input data

TRANSFORM – Extract, Transform, Load: standardize and enrich data and load to targets

Page 3: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Course Objectives

You will learn to:
– Build DataStage EE jobs using complex logic
– Utilize parallel processing techniques to increase job performance
– Build custom stages based on application needs

Course emphasis is:
– Advanced usage of DataStage EE
– Application job development
– Best practices techniques

Page 4: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Course Agenda

Day 1
– Review of EE Concepts
– Sequential Access
– Standards
– DBMS Access

Day 2
– EE Architecture
– Transforming Data
– Sorting Data

Day 3
– Combining Data
– Configuration Files

Day 4
– Extending EE
– Meta Data Usage
– Job Control
– Testing

Page 5: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Module Objectives

Provide a background for completing work in the DSEE advanced course

Ensure all students will have a successful advanced class

Tasks
– Review parallel processing concepts

Page 6: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Review Topics

DataStage architecture

DataStage client review
– Administrator
– Manager
– Designer
– Director

Parallel processing paradigm

DataStage Enterprise Edition

Page 7: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

[Diagram: the DataStage clients (Designer, Director, Manager, Administrator) run on Microsoft® Windows NT/2000/XP and connect to the server and repository, which run on Microsoft® Windows NT or UNIX. The server covers Discover, Prepare, Transform, Extend (extract, cleanse, transform, integrate) between ANY SOURCE and ANY TARGET (CRM, ERP, SCM, BI/Analytics, RDBMS, real-time client-server, Web services, data warehouse, other apps.), on top of Parallel Execution, Meta Data Management, and Command & Control.]

Client-Server Architecture

Page 8: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Process Flow

Administrator – add/delete projects, set defaults

Manager – import meta data, backup projects

Designer – assemble jobs, compile, and execute

Director – execute jobs, examine job run logs

Page 9: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Administrator – Licensing and Timeout

Page 10: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Administrator – Project Creation/Removal

Functions specific to a project.

Page 11: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Administrator – Project Properties

RCP for parallel jobs should be enabled

Variables for parallel processing

Page 12: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Administrator – Environment Variables

Variables are category specific

Page 13: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

OSH is what is run by the EE Framework

Page 14: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

DataStage Manager

Page 15: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Export Objects to MetaStage

Push meta data to MetaStage

Page 16: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Designer Workspace

Can execute the job from Designer

Page 17: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

DataStage Generated OSH

The EE Framework runs OSH

Page 18: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Director – Executing Jobs

Messages from previous run in different color

Page 19: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Stages

Can now customize the Designer’s palette

Page 20: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Popular Stages

Row generator

Peek

Page 21: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Row Generator

Can build test data

Repeatable property

Edit row in column tab

Page 22: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Peek

Displays field values
– Will be displayed in job log or sent to a file
– Skip records option
– Can control number of records to be displayed

Can be used as stub stage for iterative development (more later)
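To make the stub idea concrete, here is a minimal sketch of an OSH fragment that feeds imported rows into a peek; the file name, schema, and -nrecs option are illustrative assumptions rather than course material:

   $ osh "import -file input.txt -schema record(name:string;) | peek -nrecs 10"

The peeked values then appear in the job log, as described above.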

Page 23: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Why EE is so Effective

Parallel processing paradigm
– More hardware, faster processing
– Level of parallelization is determined by a configuration file read at runtime

Emphasis on memory
– Data read into memory and lookups performed like a hash table

Page 24: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

• Parallel processing = executing your application on multiple CPUs
– Scalable processing = add more resources (CPUs, RAM, and disks) to increase system performance

• Example: a system containing 6 CPUs (or processing nodes) and disks

Scalable Systems

Page 25: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Three main types of scalable systems

Symmetric Multiprocessors (SMP): shared memory

Clusters: UNIX systems connected via networks

MPP (Massively Parallel Processing)

Scalable Systems: Examples

Page 26: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

• Multiple CPUs with a single operating system
• Programs communicate using shared memory
• All CPUs share system resources (OS, memory with single linear address space, disks, I/O)

When used with enterprise edition:
• Data transport uses shared memory
• Simplified startup

enterprise edition treats NUMA (Non-Uniform Memory Access) as SMP

SMP: Shared Everything

Page 27: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

[Diagram: Source (operational data, archived data) is written to disk, transformed, written to disk again, cleaned, written to disk again, then loaded into the data warehouse.]

Traditional approach to batch processing:
• Write to disk and read from disk before each processing operation
• Sub-optimal utilization of resources
– a 10 GB stream leads to 70 GB of I/O
– processing resources can sit idle during I/O
• Very complex to manage (lots and lots of small jobs)
• Becomes impractical with big data volumes
– disk I/O consumes the processing
– terabytes of disk required for temporary staging

Traditional Batch Processing

Page 28: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Data Pipelining
• Transform, clean, and load processes are executing simultaneously on the same processor
• Rows are moving forward through the flow

[Diagram: Source (operational data, archived data) flows directly through Transform, Clean, and Load into the data warehouse, with no intermediate disk.]

• Start a downstream process while an upstream process is still running.
• This eliminates intermediate storing to disk, which is critical for big data.
• This also keeps the processors busy.
• Still has limits on scalability

Think of a conveyor belt moving the rows from process to process!

Pipeline Multiprocessing

Page 29: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Data Partitioning

[Diagram: source data is split by key range (A-F, G-M, N-T, U-Z) across Node 1 through Node 4, and the same Transform runs on each partition.]

• Break up big data into partitions
• Run one partition on each processor
• 4X faster on 4 processors; with data big enough, 100X faster on 100 processors
• This is exactly how parallel databases work!
• Data partitioning requires the same transform on all partitions: Aaron Abbott and Zygmund Zorn undergo the same transform

Partition Parallelism

Page 30: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Putting It All Together: Parallel Dataflow

[Diagram: source data is partitioned across nodes, then flows through Transform, Clean, and Load simultaneously (pipelining within each partition) into the data warehouse.]

Combining Parallelism Types

Page 31: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

• Dataset: uniform set of rows in the Framework's internal representation
– Three flavors:
1. file sets (*.fs): stored on multiple Unix files as flat files
2. persistent (*.ds): stored on multiple Unix files in Framework format; read and written using the DataSet stage
3. virtual (*.v): links, in Framework format, NOT stored on disk
– The Framework processes only datasets; hence the possible need for Import
– Different datasets typically have different schemas
– Convention: "dataset" = Framework data set

• Partition: subset of rows in a dataset earmarked for processing by the same node (virtual CPU, declared in a configuration file)
– All the partitions of a dataset follow the same schema: that of the dataset

EE Program Elements

Page 32: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Putting It All Together: Parallel Dataflow with Repartitioning on-the-fly

Without landing to disk!

[Diagram: source data is partitioned (A-F, G-M, N-T, U-Z) into the Transform stage, repartitioned on the fly between Transform and Clean (e.g., switching the key from customer last name to customer zip code), repartitioned again before Load (e.g., on credit card number), and written to the data warehouse, with pipelining throughout.]

Repartitioning

Page 33: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

[Diagram: an Orchestrate program expresses a sequential data flow (Import, Clean 1, Clean 2, Merge, Analyze) over flat files and relational data. Given a configuration file, the Orchestrate Application Framework and Runtime System parallelizes it automatically, providing: centralized error handling and event logging, parallel access to data in files, parallel access to data in RDBMS, inter-node communications, parallel pipelining, parallelization of operations, and performance visualization.]

DataStage Engine: provides data integration platform

Orchestrate Framework: provides parallel processing

DataStage Enterprise Edition: best-of-breed scalable data integration platform; no limitations on data volumes or throughput

DataStage EE Architecture

Page 34: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

DSEE:
– Automatically scales to fit the machine
– Handles data flow among multiple CPUs and disks

With DSEE you can:
– Create applications for SMPs, clusters, and MPPs… enterprise edition is architecture-neutral
– Access relational databases in parallel
– Execute external applications in parallel
– Store data across multiple disks and nodes

Introduction to DataStage EE

Page 35: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

User assembles the flow using the DataStage Designer…

…and gets: parallel access, propagation, transformation, and load. The design is good for 1 node, 4 nodes, or N nodes. To change the number of nodes, just swap the configuration file. No need to modify or recompile your design!

Job Design vs. Execution

Page 36: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Partitioners distribute rows into partitions
– implement data-partition parallelism

Collectors = inverse partitioners

Live on input links of stages running
– in parallel (partitioners)
– sequentially (collectors)

Use a choice of methods

Partitioners and Collectors

Page 37: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Example Partitioning Icons

partitioner

Page 38: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Exercise

Complete exercises 1-1, 1-2, and 1-3

Page 39: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Module 2

DSEE Sequential Access

Page 40: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Module Objectives

You will learn to:
– Import sequential files into the EE Framework
– Utilize parallel processing techniques to speed up sequential file access
– Understand usage of the Sequential, DataSet, FileSet, and LookupFileSet stages
– Manage partitioned data stored by the Framework

Page 41: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Types of Sequential Data Stages

Sequential
– Fixed or variable length

File Set

Lookup File Set

Data Set

Page 42: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

The EE Framework processes only datasets

For files other than datasets, such as flat files, enterprise edition must perform import and export operations; this is done by the import and export OSH operators (generated by Sequential or FileSet stages)

During import or export, DataStage performs format translations into, or out of, the EE internal format

Data is described to the Framework in a schema

Sequential Stage Introduction
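As an illustration, a schema describing a comma-delimited flat file might look roughly like this sketch (the field names and property values are assumptions; the record/field notation is the Orchestrate schema syntax):

   record {final_delim=end, delim=','} (
     custid:  int32;
     name:    string[max=30];
     balance: nullable decimal[10,2];
   )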

Page 43: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

How the Sequential Stage Works

Generates Import/Export operators

Types of transport– Performs direct C++ file I/O streams– Source programs which feed stdout (gunzip) send

stdout into EE via sequential pipe

Page 44: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Using the Sequential File Stage

Importing/Exporting Data

Both import and export of general files (text, binary) are performed by the Sequential File stage.

– Data import: file is translated into the EE internal format
– Data export: EE internal format is translated back into the file format

Page 45: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Working With Flat Files

Sequential File stage
– Normally will execute in sequential mode
– Can execute in parallel if reading multiple files (file pattern option)
– Can use multiple readers within a node on a fixed-width file
– DSEE needs to know:
   How the file is divided into rows
   How a row is divided into columns

Page 46: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Processes Needed to Import Data

Recordization
– Divides input stream into records
– Set on the format tab

Columnization
– Divides the record into columns
– Default set on the format tab but can be overridden on the columns tab
– Can be “incomplete” if using a schema, or not even specified in the stage if using RCP

Page 47: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

File Format Example

[Diagram: a sample record layout. Fields are separated by a field delimiter (comma); the final delimiter can be a comma or end-of-record; records are separated by a record delimiter (newline, "nl").]

Page 48: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Sequential File Stage

To set the properties, use the stage editor
– Pages (general, input/output)
– Tabs (format, columns)

Sequential stage link rules
– One input link
– One output link (except for reject link definition)
– One reject link
   Will reject any records not matching the meta data in the column definitions

Page 49: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Job Design Using Sequential Stages

Stage categories

Page 50: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

General Tab – Sequential Source

Multiple output links

Show records

Page 51: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Properties – Multiple Files

Click to add more files having the same meta data.

Page 52: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Properties - Multiple Readers

Multiple readers option allows you to set number of readers

Page 53: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Format Tab

File into records

Record into columns

Page 54: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Read Methods

Page 55: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Reject Link

Reject mode = output

Source
– All records not matching the meta data (the column definitions)

Target
– All records that are rejected for any reason

Meta data: one column, data type = raw

Page 56: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

File Set Stage

Can read or write file sets

Files suffixed by .fs

File set consists of:
1. Descriptor file: contains location of raw data files + meta data
2. Individual raw data files

Can be processed in parallel

Page 57: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

File Set Stage Example

Descriptor file

Page 58: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

File Set Usage

Why use a file set?
– 2 GB limit on some file systems
– Need to distribute data among nodes to prevent overruns
– If used in parallel, runs faster than a sequential file

Page 59: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Lookup File Set Stage

Can create file sets

Usually used in conjunction with Lookup stages

Page 60: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Lookup File Set > Properties

Key column specified

Key column dropped in descriptor file

Page 61: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Data Set

Operating system (Framework) file

Suffixed by .ds

Referred to by a control file

Managed by Data Set Management utility from GUI (Manager, Designer, Director)

Represents persistent data

Key to good performance in set of linked jobs

Page 62: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Persistent Datasets

Accessed from/to disk with the DataSet stage.

Two parts:
– Descriptor file (e.g., input.ds): contains metadata and data location, but NOT the data itself, e.g.
   record ( partno: int32; description: string; )
– Data file(s): contain the data; multiple Unix files (one per node, e.g., node1:/local/disk1/…, node2:/local/disk2/…), accessible in parallel

Page 63: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Quiz!

• True or False? Everything that has been data-partitioned must be collected in the same job.

Page 64: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Data Set Stage

Is the data partitioned?

Page 65: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Engine Data Translation

Occurs on import
– From sequential files or file sets
– From RDBMS

Occurs on export
– From datasets to file sets or sequential files
– From datasets to RDBMS

Engine is most efficient when processing internally formatted records (i.e., data contained in datasets)

Page 66: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Managing DataSets

GUI (Manager, Designer, Director): Tools > Data Set Management

Alternative methods
– orchadmin
   Unix command-line utility
   List records
   Remove data sets (will remove all components)
– dsrecords
   Lists the number of records in a dataset

Page 67: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Data Set Management

Display data

Schema

Page 68: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Data Set Management From Unix

Alternative method of managing file sets and data sets

– dsrecords
   Gives record count
   Unix command-line utility
   $ dsrecords ds_name
   e.g., $ dsrecords myDS.ds
   156999 records

– orchadmin
   Manages EE persistent data sets
   Unix command-line utility
   e.g., $ orchadmin rm myDataSet.ds
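A typical cleanup session built only from the commands above (the dataset name is an illustrative assumption):

   $ dsrecords customers.ds      # check the record count first
   $ orchadmin rm customers.ds   # removes the descriptor and all data files on all nodes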

Page 69: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Exercise

Complete exercises 2-1, 2-2, 2-3, and 2-4.

Page 70: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Blank

Page 71: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Module 3

Standards and Techniques

Page 72: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Objectives

Establish standard techniques for DSEE development

Will cover:
– Job documentation
– Naming conventions for jobs, links, and stages
– Iterative job design
– Useful stages for job development
– Using configuration files for development
– Using environment variables
– Job parameters

Page 73: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Job Presentation

Document using the annotation stage

Page 74: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Job Properties Documentation

Description shows in DS Manager and MetaStage

Organize jobs into categories

Page 75: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Naming conventions

Stages named after the
– Data they access
– Function they perform
– DO NOT leave defaulted stage names like Sequential_File_0

Links named for the data they carry
– DO NOT leave defaulted link names like DSLink3

Page 76: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Stage and Link Names

Stages and links renamed for the data they handle

Page 77: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Create Reusable Job Components

Use enterprise edition shared containers when feasible

Container

Page 78: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Use Iterative Job Design

Use copy or peek stage as stub

Test job in phases – small first, then increasing in complexity

Use Peek stage to examine records

Page 79: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Copy or Peek Stage Stub

Copy stage

Page 80: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Transformer Stage Techniques

Suggestions:
– Always include a reject link.
– Always test for null value before using a column in a function.
– Try to use RCP and only map columns that have a derivation other than a copy. More on RCP later.
– Be aware of column and stage variable data types. Often the user does not pay attention to the stage variable type.
– Avoid type conversions. Try to maintain the data type as imported.

Page 81: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

The Copy Stage

With 1 link in and 1 link out, the Copy stage is the ultimate "no-op" (place-holder). The following can be inserted on its links:
– input link (Partitioning tab): partitioners, Sort, Remove Duplicates
– output link (Mapping page): Rename, Drop column

Sometimes it replaces the Transformer:
– Rename
– Drop
– Implicit type conversions
– Link constraint: break up schema

Page 82: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Developing Jobs

1. Keep it simple
• Jobs with many stages are hard to debug and maintain.

2. Start small and build to the final solution
• Use view data, Copy, and Peek.
• Start from the source and work outward.
• Develop with a 1-node configuration file.

3. Solve the business problem before the performance problem.
• Don’t worry too much about partitioning until the sequential flow works as expected.

4. If you have to write to disk, use a persistent data set.

Page 83: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Final Result

Page 84: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Good Things to Have in each Job

Use job parameters

Some helpful environment variables to add to job parameters:
– $APT_DUMP_SCORE
   Reports OSH to the message log
– $APT_CONFIG_FILE
   Establishes runtime parameters to the EE engine, i.e., degree of parallelization
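For example, a job might expose both variables as parameters with defaults along these lines (the path is an illustrative assumption):

   $APT_DUMP_SCORE  = False
   $APT_CONFIG_FILE = /opt/Ascential/DataStage/Configurations/default.apt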

Page 85: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Setting Job Parameters

Click to add environment variables

Page 86: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

DUMP SCORE Output

Setting APT_DUMP_SCORE yields the score report in the job log; double-click the message to view it. The score shows the mapping of nodes to partitions and the partitioners and collectors inserted into the flow.

Page 87: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Use Multiple Configuration Files

Make a set for 1X, 2X,….

Use different ones for test versus production

Include as a parameter in each job

Page 88: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Exercise

Complete exercise 3-1

Page 89: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Module 4

DBMS Access

Page 90: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Objectives

Understand how DSEE reads and writes records to an RDBMS

Understand how to handle nulls on DBMS lookup

Utilize this knowledge to:
– Read and write database tables
– Use database tables to look up data
– Use null handling options to clean data

Page 91: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Parallel Database Connectivity

[Diagram contrasts traditional client-server access with enterprise edition.]

Traditional client-server:
– A single client process (e.g., sort or load) has one connection to the parallel RDBMS
– Only the RDBMS is running in parallel
– Each application has only one connection
– Suitable only for small data volumes

enterprise edition:
– The parallel server runs the APPLICATIONS
– The application has parallel connections to the RDBMS
– Suitable for large data volumes
– Higher levels of integration possible

Page 92: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

RDBMS Access: Supported Databases

enterprise edition provides high performance / scalable interfaces for:

DB2

Informix

Oracle

Teradata

Users must be granted specific privileges, depending on RDBMS.

Page 93: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Automatically convert RDBMS table layouts to/from enterprise edition table definitions

RDBMS nulls converted to/from nullable field values

Support for standard SQL syntax for specifying:
– field list for SELECT statement
– filter for WHERE clause
– open command, close command

Can write an explicit SQL query to access the RDBMS; EE supplies additional information in the SQL query

RDBMS Access: Supported Databases

Page 94: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

RDBMS Stages

DB2/UDB Enterprise

Informix Enterprise

Oracle Enterprise

Teradata Enterprise

ODBC

Page 95: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

RDBMS Usage

As a source
– Extract data from a table (stream link)
   Extract as table, generated SQL, or user-defined SQL
   User-defined SQL can perform joins, access views
– Lookup (reference link)
   Normal lookup is memory-based (all table data read into memory)
   Can perform one lookup at a time in the DBMS (sparse option)
   Continue/drop/fail options

As a target
– Inserts
– Upserts (inserts and updates)
– Loader

Page 96: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

RDBMS Source – Stream Link

Stream link

Page 97: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

DBMS Source - User-defined SQL

Columns in the SQL statement must match the meta data in the columns tab
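For example, a user-defined statement like the following (table and column names are illustrative) requires exactly three matching columns, custid, name, and balance, on the columns tab:

   SELECT custid, name, balance
   FROM customers
   WHERE balance > 0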

Page 98: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Exercise

User-defined SQL
– Exercise 4-1

Page 99: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

DBMS Source – Reference Link

Reject link

Page 100: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Lookup Reject Link

“Output” option automatically creates the reject link

Page 101: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Null Handling

Must handle null condition if lookup record is not found and “continue” option is chosen

Can be done in a transformer stage
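A sketch of such a derivation (the column and link names are illustrative assumptions; IsNull comes from the Null Handling function category):

   If IsNull(lkpCustomer.description) Then 'UNKNOWN' Else lkpCustomer.description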

Page 102: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Lookup Stage Mapping

Link name

Page 103: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Lookup Stage Properties

Reference link

Must have the same column name in the input and reference links. You will get the results of the lookup in the output column.

Page 104: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

DBMS as a Target

Page 105: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

DBMS As Target

Write methods
– Delete
– Load
– Upsert
– Write (DB2)

Write mode for load method
– Truncate
– Create
– Replace
– Append

Page 106: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Target Properties

Upsert mode determines options

Generated code can be copied

Page 107: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Checking for Nulls

Use Transformer stage to test for fields with null values (Use IsNull functions)

In Transformer, can reject or load default value

Page 108: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Exercise

Complete exercise 4-2

Page 109: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Module 5

Platform Architecture

Page 110: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Objectives

Understand how the enterprise edition Framework processes data

You will be able to:
– Read and understand OSH
– Perform troubleshooting

Page 111: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Concepts

The EE Platform

OSH (generated by the DataStage parallel canvas, and run by DataStage Director)

Conductor, section leaders, players

Configuration files (only one active at a time; describes the hardware)

Schemas/tables

Schema propagation/RCP

Buildop, Wrapper

Datasets (data in the Framework's internal representation)

Page 112: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

EE stages involve a series of processing steps: input interface, partitioner, business logic, output interface. Each stage reads an input data set and writes an output data set with a declared schema, e.g.

   schema: prov_num:int16; member_num:int8; custid:int32;

• A stage is a piece of application logic running against individual records
• Parallel or sequential
• Three sources:
– Ascential-supplied
– Commercial tools/applications
– Custom/existing programs

DS-EE Program Elements

Page 113: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

• EE delivers parallelism in two ways:
– Pipeline (producer and consumer stages run concurrently)
– Partition

• Block buffering between components
– Eliminates need for program load balancing
– Maintains orderly data flow

Dual parallelism eliminates bottlenecks!

DS-EE Program Elements: Stage Execution

Page 114: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Stages Control Partition Parallelism

Execution mode (sequential/parallel) is controlled by the stage
– Default = parallel for most Ascential-supplied stages
– User can override the default mode
– A parallel stage inserts the default partitioner (Auto) on its input links
– A sequential stage inserts the default collector (Auto) on its input links
– User can override the defaults:
   execution mode (parallel/sequential) of the stage (Advanced tab)
   choice of partitioner/collector (Input > Partitioning tab)

Page 115: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

How Parallel Is It?

Degree of parallelism is determined by the configuration file
– Total number of logical nodes in the default pool, or a subset if using "constraints"
– Constraints are assigned to specific pools as defined in the configuration file and can be referenced in the stage

Page 116: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

OSH

DataStage EE GUI generates OSH scripts
– Ability to view OSH is turned on in Administrator
– OSH can be viewed in Designer using job properties

The Framework executes OSH

What is OSH?
– Orchestrate shell
– Has a UNIX command-line interface

Page 117: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

OSH Script

An osh script is a quoted string which specifies:
– The operators and connections of a single Orchestrate step
– In its simplest form, it is:

   osh "op < in.ds > out.ds"

Where:
– op is an Orchestrate operator
– in.ds is the input data set
– out.ds is the output data set
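Operators can also be chained with Unix-style pipes, where the pipe is a virtual dataset (*.v) rather than a file. A minimal sketch using two operators mentioned in this course:

   $ osh "copy < in.ds | peek > out.ds"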

Page 118: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

OSH Operators

Operator is an instance of a C++ class inheriting from APT_Operator

Developers can create new operators

Examples of existing operators:
– Import
– Export
– RemoveDups

Page 119: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Enable Visible OSH in Administrator

Will be enabled for all projects

Page 120: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

View OSH in Designer

Schema

Operator

Page 121: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

OSH Practice

Exercise 5-1

Page 122: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Orchestrate May Add Operators to Your Command

Let’s revisit the following OSH command:

   $ osh " echo 'Hello world!' [par] > outfile "

The Framework silently inserts operators:
1) a "wrapper" (turning a Unix command into a DS/EE operator)
2) a partitioner
3) a collector
4) an export (from dataset to Unix flat file)

Page 123: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

• Step: unit of an OSH program
– one OSH command = one step
– at the end of a step: synchronization, storage to disk

• Datasets: sets of rows processed by the Framework
– Orchestrate data sets:
   persistent (terminal) *.ds, and
   virtual (internal) *.v
– Also: flat "file sets" *.fs

• Schema: data description (metadata) for datasets and links

Steps, with internal and terminal datasets and links, described by schemas

Elements of a Framework Program

Page 124: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

• Consist of partitioned data and schema
• Can be persistent (*.ds) or virtual (*.v, link)
• Overcome the 2 GB file limit

What you program (GUI or OSH):

   $ osh "operator_A > x.ds"

What gets processed: Operator A runs once per node (e.g., Node 1 through Node 4), and the data files of x.ds are spread across the nodes, multiple files per partition, each file up to 2 GB (or larger).

Orchestrate Datasets

Page 125: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Computing Architectures: Definition

Uniprocessor, dedicated disk
• PC, workstation, single-processor server
• One CPU with its own memory and disk

SMP system (Symmetric Multiprocessor), shared memory
• Multiple CPUs sharing one memory and disk pool
• IBM, Sun, HP, Compaq
• 2 to 64 processors
• Majority of installations

Clusters and MPP systems, shared disk or shared nothing
• Each node has its own CPU(s), memory, and disk
• 2 to hundreds of processors
• MPP: IBM and NCR Teradata
• Each node is a uniprocessor or SMP

Page 126: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Job Execution: Orchestrate

[Diagram: the Conductor process (C) on the conductor node, one Section Leader (SL) per processing node, and Player processes (P) under each section leader.]

• Conductor: the initial DS/EE process
– Step composer
– Creates section leader processes (one per node)
– Consolidates messages, outputs them
– Manages orderly shutdown

• Section leader
– Forks player processes (one per stage)
– Manages up/down communication

• Players
– The actual processes associated with stages
– Combined players: one process only
– Send stderr to the section leader
– Establish connections to other players for data flow
– Clean up upon completion

• Communication:
– SMP: shared memory
– MPP: TCP

Page 127: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Working with Configuration Files

You can easily switch between configuration files:
– '1-node' file: for sequential execution, lighter reports; handy for testing
– 'MedN-nodes' file: aims at a mix of pipeline and data-partitioned parallelism
– 'BigN-nodes' file: aims at full data-partitioned parallelism

Only one file is active while a step is running. The Framework queries (first) the environment variable:

   $APT_CONFIG_FILE

The number of nodes declared in the configuration file need not match the number of CPUs. The same configuration file can be used on development and target machines.
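A sketch of switching files from the shell before a run (the file names and paths are illustrative assumptions):

   $ export APT_CONFIG_FILE=/home/dsadm/configs/1node.apt   # sequential, for testing
   $ export APT_CONFIG_FILE=/home/dsadm/configs/4node.apt   # 4 logical nodes, data-partitioned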

Page 128: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Scheduling Nodes, Processes, and CPUs

DS/EE does not:
– know how many CPUs are available
– schedule

Who does what?
– DS/EE creates (Nodes * Ops) Unix processes
– The O/S schedules these processes on the CPUs

Where:
– Nodes = # logical nodes declared in the config file
– Ops = # operators (approx. # blue boxes in V.O.)
– Processes = # Unix processes
– CPUs = # available CPUs

Who knows what?

                Nodes   Ops   Processes     CPUs
User            Y       –     –             N
Orchestrate     Y       Y     Nodes * Ops   N
O/S             –       –     Nodes * Ops   Y
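As a worked example with assumed numbers: a 4-node configuration file and a score containing 5 operators yield roughly 4 * 5 = 20 Unix processes; the operating system alone decides how those 20 processes are scheduled across however many CPUs exist.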

Page 129: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

{
  node "n1" {
    fastname "s1"
    pool "" "n1" "s1" "app2" "sort"
    resource disk "/orch/n1/d1" {}
    resource disk "/orch/n1/d2" {}
    resource scratchdisk "/temp" {"sort"}
  }
  node "n2" {
    fastname "s2"
    pool "" "n2" "s2" "app1"
    resource disk "/orch/n2/d1" {}
    resource disk "/orch/n2/d2" {}
    resource scratchdisk "/temp" {}
  }
  node "n3" {
    fastname "s3"
    pool "" "n3" "s3" "app1"
    resource disk "/orch/n3/d1" {}
    resource scratchdisk "/temp" {}
  }
  node "n4" {
    fastname "s4"
    pool "" "n4" "s4" "app1"
    resource disk "/orch/n4/d1" {}
    resource scratchdisk "/temp" {}
  }
}

Configuring DSEE – Node Pools

Page 130: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

{
  node "n1" {
    fastname "s1"
    pool "" "n1" "s1" "app2" "sort"
    resource disk "/orch/n1/d1" {}
    resource disk "/orch/n1/d2" {"bigdata"}
    resource scratchdisk "/temp" {"sort"}
  }
  node "n2" {
    fastname "s2"
    pool "" "n2" "s2" "app1"
    resource disk "/orch/n2/d1" {}
    resource disk "/orch/n2/d2" {"bigdata"}
    resource scratchdisk "/temp" {}
  }
  node "n3" {
    fastname "s3"
    pool "" "n3" "s3" "app1"
    resource disk "/orch/n3/d1" {}
    resource scratchdisk "/temp" {}
  }
  node "n4" {
    fastname "s4"
    pool "" "n4" "s4" "app1"
    resource disk "/orch/n4/d1" {}
    resource scratchdisk "/temp" {}
  }
}

Configuring DSEE – Disk Pools

Page 131: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

[Diagram: a partitioner placed between two parallel operators spanning node 1 and node 2.]

Parallel-to-parallel flow may incur reshuffling: records may jump between nodes.

Re-Partitioning

Page 132: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

When a partitioner receives:

• sequential input (1 partition), it creates N partitions

• parallel input (N partitions), it outputs N partitions*; this may result in re-partitioning

* Assuming no “constraints”

Re-Partitioning X-ray

Page 133: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

If Stage 2 runs in parallel, DS/EE silently inserts a partitioner upstream of it. If Stage 1 also runs in parallel, re-partitioning occurs.

In most cases, automatic re-partitioning is benign (no reshuffling), preserving the same partitioning as upstream.

Re-partitioning can be forced to be benign, using either:
– Same (partitioning method)
– Preserve partitioning (flag)

Automatic Re-Partitioning

Page 134: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Partitioning Methods

Auto

Hash

Entire

Range

Range Map

Page 135: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

• Collectors combine the partitions of a dataset into a single input stream to a sequential stage

• Collectors do NOT synchronize data

Collectors

Page 136: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Partitioning and Repartitioning Are Visible On Job Design

Page 137: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Partitioning and Collecting Icons

Partitioner Collector

Page 138: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Setting a Node Constraint in the GUI

Page 139: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Reading Messages in Director

Set APT_DUMP_SCORE to true
– Can be specified as a job parameter
– Messages are sent to the Director log

If set, a parallel job will produce a report showing the operators, processes, and datasets in the running job
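Outside the GUI, the same report can be produced for a hand-run OSH step; a minimal sketch, assuming a shell session on the server:

   $ export APT_DUMP_SCORE=1
   $ osh "copy < in.ds > out.ds"   # the score report appears with the step's log messages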

Page 140: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Messages With APT_DUMP_SCORE = True

Page 141: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Exercise

Complete exercise 5-2

Page 142: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Blank

Page 143: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Module 6

Transforming Data

Page 144: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Module Objectives

Understand ways DataStage allows you to transform data

Use this understanding to:– Create column derivations using user-defined code or

system functions– Filter records based on business criteria– Control data flow based on data conditions

Page 145: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Transformed Data

Transformed data is:
– An outgoing column is a derivation that may, or may not, include incoming fields or parts of incoming fields
– May be comprised of system variables

Frequently uses functions performed on something (i.e., incoming columns)
– Divided into categories, e.g.:
   Date and time
   Mathematical
   Logical
   Null handling
   More

Page 146: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Stages Review

Stages that can transform data
– Transformer
   Parallel
   Basic (from parallel palette)
– Aggregator (discussed in a later module)

Sample stages that do not transform data
– Sequential
– FileSet
– DataSet
– DBMS

Page 147: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Transformer Stage Functions

Control data flow

Create derivations

Page 148: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Flow Control

Separate record flow down links based on data conditions, specified in Transformer stage constraints

The Transformer stage can filter records

Other stages can filter records but do not exhibit advanced flow control
– Sequential
– Lookup
– Filter

Page 149: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Rejecting Data

Reject option on the Sequential stage
– Data does not agree with the meta data
– Output consists of one column with binary data type

Reject links (from the Lookup stage) result from the drop option of the property “If Not Found”
– Lookup “failed”
– All columns on the reject link (no column mapping option)

Reject constraints are controlled from the constraint editor of the Transformer
– Can control column mapping
– Use the “Other/Log” checkbox

Page 150: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Rejecting Data Example

“If Not Found” property

Constraint – Other/log option

Property Reject Mode = Output

Page 151: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Transformer Stage Properties

Page 152: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Transformer Stage Variables

First of the Transformer stage entities to execute

Execute in order from top to bottom
– Can write a program by using one stage variable to point to the results of a previous stage variable

Multi-purpose
– Counters
– Hold values from previous rows to make comparisons
– Hold derivations to be used in multiple field derivations
– Can be used to control execution of constraints

Page 153: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Stage Variables

Show/Hide button

Page 154: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Transforming Data

Derivations
– Using expressions
– Using functions
   Date/time

Transformer stage issues
– Sometimes require sorting before the Transformer stage, e.g., using a stage variable as an accumulator and needing to break on a change of column value

Checking for nulls

Page 155: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Checking for Nulls

Nulls can get introduced into the dataflow because of failed lookups and the way in which you choose to handle this condition

Can be handled in constraints, derivations, stage variables, or a combination of these

Page 156: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Nullability

Can set the value of null, e.g., if the value of a column is null, put “NULL” in the outgoing column

Source field    Destination field   Result
not_nullable    not_nullable        Source value propagates to destination.
not_nullable    nullable            Source value propagates; destination value is never null.
nullable        not_nullable        WARNING messages in log. If source value is null, a fatal error occurs. Must handle in Transformer.
nullable        nullable            Source value or null propagates.

Page 157: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Transformer Stage- Handling Rejects

1. Constraint rejects
– All expressions are false and the reject row option is checked

2. Expression error rejects
– Improperly handled null

Page 158: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Transformer: Execution Order

• Derivations in stage variables are executed first

• Constraints are executed before derivations

• Column derivations in earlier links are executed before later links

• Derivations in higher columns are executed before lower columns

Page 159: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Two Transformers for the Parallel Palette

Transformer (All > Processing > Transformer)
– Is the non-Universe transformer
– Has a specific set of functions
– No DS routines available

Basic Transformer (Parallel > Processing)
– Makes server-style transforms available on the parallel palette
– Can use DS routines
– No need for a shared container to get Universe functionality on the parallel palette

• Program in Basic for both transformers

Page 160: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Transformer Functions From Derivation Editor

Date & Time

Logical

Mathematical

Null Handling

Number

Raw

String

Type Conversion

Utility

Page 161: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Timestamps and Dates

Date & Time

Also some in Type Conversion

Page 162: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Exercise

Complete exercises 6-1, 6-2, and 6-3

Page 163: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Module 7

Sorting Data

Page 164: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Objectives

Understand DataStage EE sorting options

Use this understanding to create sorted lists of data to enable functionality within a Transformer stage

Page 165: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Sorting Data

Important because
– The Transformer may be using stage variables for accumulators or control breaks, and order is important
– Other stages may run faster, e.g., Aggregator
– Facilitates the RemoveDups stage, where order is important
– The job has partitioning requirements

Can be performed
– As an option within stages (use the input > partitioning tab and set partitioning to anything other than Auto)
– As a separate stage (more complex sorts)

Page 166: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Sorting Alternatives

• Alternative representation of same flow:

Page 167: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Sort Option on Stage Link

Page 168: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Sort Stage

Page 169: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Sort Utility

DataStage – the default

SyncSort

UNIX

Page 170: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Sort Stage - Outputs

Specifies how the output is derived

Page 171: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Sort Specification Options

Input link property
– Limited functionality
– Max memory per partition is 20 MB, then spills to scratch

Sort stage
– Tunable to use more memory before spilling to scratch

Note: spread I/O by adding more scratch file systems to each node of the APT_CONFIG_FILE
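For reference, a standalone sort expressed in OSH looks roughly like this (tsort is the Orchestrate sort operator; the key names are illustrative assumptions):

   $ osh "tsort -key custid -key orderdate < orders.ds > sorted.ds"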

Page 172: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Removing Duplicates

Can be done by the Sort stage
– Use the unique option

OR

Remove Duplicates stage
– Has more sophisticated ways to remove duplicates

Page 173: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Exercise

Complete exercise 7-1

Page 174: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Blank

Page 175: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Module 8

Combining Data

Page 176: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Objectives

Understand how DataStage can combine data using the Join, Lookup, Merge, and Aggregator stages

Use this understanding to create jobs that will
– Combine data from separate input streams
– Aggregate data to form summary totals

Page 177: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Combining Data

There are two ways to combine data:

– Horizontally: several input links; one output link (+ optional rejects) made of columns from different input links, e.g.:
   Joins
   Lookup
   Merge

– Vertically: one input link; output with columns combining values from all input rows, e.g.:
   Aggregator

Page 178: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Recall the Join, Lookup & Merge Stages

These three stages combine two or more input links according to the values of user-designated "key" column(s).

They differ mainly in:
– Memory usage
– Treatment of rows with unmatched key values
– Input requirements (sorted, de-duplicated)

Page 179: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Joins - Lookup - Merge: Not All Links Are Created Equal!

• enterprise edition distinguishes between:
– the primary input (Framework port 0)
– secondary inputs, in some cases "reference" (other ports)

• Naming convention:

                                  Joins   Lookup        Merge
Primary input (port 0):           Left    Source        Master
Secondary input(s) (ports 1,…):   Right   LU Table(s)   Update(s)

Tip: check the "Input Ordering" tab to make sure the intended primary is listed first

Page 180: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Join Stage Editor

One of four variants:
– Inner
– Left Outer
– Right Outer
– Full Outer

Several key columns allowed

Link order is immaterial for inner and full outer joins (but VERY important for left/right outer joins, and for Lookup and Merge)

Page 181: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

1. The Join Stage

Four types:
• Inner
• Left Outer
• Right Outer
• Full Outer

2 sorted input links, 1 output link
– "left" on primary input, "right" on secondary input
– Pre-sort makes joins "lightweight": few rows need to be in RAM

Page 182: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

2. The Lookup Stage

Combines:
– one source link (port 0) with
– one or more duplicate-free table links (LUTs, ports 1, 2, …)

– No pre-sort necessary
– Allows multiple-key LUTs
– Flexible exception handling for source input rows with no match

Outputs: one output link plus an optional reject link

Page 183: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

The Lookup Stage

Lookup tables should be small enough to fit into physical memory (otherwise, a performance hit due to paging)
– Space/time trade-off: pre-sort vs. in-RAM table

On an MPP, you should partition the lookup tables using the Entire partitioning method, or partition them the same way you partition the source link

On an SMP, no physical duplication of a lookup table occurs

Page 184: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

The Lookup Stage

Lookup File Set
– Like a persistent data set, only it contains metadata about the key
– Useful for staging lookup tables

RDBMS lookup
– SPARSE: select for each row; might become a performance bottleneck
– NORMAL: loads into an in-memory hash table first

Page 185: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

3. The Merge Stage

Combines
– one sorted, duplicate-free master (primary) link with
– one or more sorted update (secondary) links
– Pre-sort makes merge "lightweight": few rows need to be in RAM (as with joins, but opposite to lookup)

Follows the master-update model:
– A master row and one or more update rows are merged if they have the same value in user-specified key column(s)
– If a non-key column occurs in several inputs, the lowest input port number prevails (e.g., master over update; update values are ignored)
– Unmatched ("bad") master rows can be either kept or dropped
– Unmatched ("bad") update rows in an input link can be captured in a "reject" link
– Matched update rows are consumed

Page 186: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

The Merge Stage

– Allows composite keys
– Multiple update links
– Matched update rows are consumed
– Unmatched updates can be captured
– Lightweight
– Space/time trade-off: pre-sorts vs. in-RAM table

[Diagram: one master link (port 0) and one or more update links (ports 1, 2) enter the Merge stage; one output link (port 0) and per-update reject links (ports 1, 2) leave it.]

Page 187: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Synopsis: Joins, Lookup, & Merge

In this table, the comma separates primary and secondary input links (and out and reject links).

                                    Joins                         Lookup                               Merge
Model                               RDBMS-style relational        Source, in-RAM LU Table              Master, Update(s)
Memory usage                        light                         heavy                                light
# and names of inputs               exactly 2: 1 left, 1 right    1 Source, N LU Tables                1 Master, N Update(s)
Mandatory input sort                both inputs                   no                                   all inputs
Duplicates in primary input         OK (x-product)                OK                                   Warning!
Duplicates in secondary input(s)    OK (x-product)                Warning!                             OK only when N = 1
Options on unmatched primary        NONE                          [fail] | continue | drop | reject    [keep] | drop
Options on unmatched secondary      NONE                          NONE                                 capture in reject set(s)
On match, secondary entries are     reusable                      reusable                             consumed
# Outputs                           1                             1 out, (1 reject)                    1 out, (N rejects)
Captured in reject set(s)           Nothing (N/A)                 unmatched primary entries            unmatched secondary entries

Page 188: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

The Aggregator Stage

Purpose: perform data aggregations

Specify:
– Zero or more key columns that define the aggregation units (or groups)
– Columns to be aggregated
– Aggregation functions, including:
   count (nulls/non-nulls)          sum
   max/min/range                    standard error
   % coefficient of variation       sum of weights
   un/corrected sum of squares      variance
   mean                             standard deviation

The grouping method (hash table or pre-sort) is a performance issue

Page 189: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Grouping Methods

Hash: results for each aggregation group are stored in a hash table, and the table is written out after all input has been processed
– Doesn’t require sorted data
– Good when the number of unique groups is small: the running tally for each group’s aggregate calculations needs to fit easily into memory, requiring about 1 KB of RAM per group
– Example: average family income by state requires about 50 x 1 KB = 0.05 MB of RAM

Sort: results for only a single aggregation group are kept in memory; when a new group is seen (the key value changes), the current group is written out
– Requires input sorted by the grouping keys
– Can handle unlimited numbers of groups
– Example: average daily balance by credit card

Page 190: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Aggregator Functions

Sum

Min, max

Mean

Missing value count

Non-missing value count

Percent coefficient of variation

Page 191: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Aggregator Properties

Page 192: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Aggregation Types

Page 193: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Containers

Two varieties
– Local
– Shared

Local
– Simplifies a large, complex diagram

Shared
– Creates a reusable object that many jobs can include

Page 194: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Creating a Container

Create a job

Select (loop around) the portions to containerize

Edit > Construct container > local or shared

Page 195: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Using a Container

Select as though it were a stage

Page 196: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Exercise

Complete exercise 8-1

Page 197: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Module 9

Configuration Files

Page 198: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Objectives

Understand how DataStage EE uses configuration files to determine parallel behavior

Use this understanding to
– Build an EE configuration file for a computer system
– Change node configurations to support adding resources to processes that need them
– Create a job that will change resource allocations at the stage level

Page 199: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Configuration File Concepts

Determines the processing nodes and disk space connected to each node

When the system changes, you need only change the configuration file; there is no need to recompile jobs

When a DataStage job runs, the platform reads the configuration file
– The platform automatically scales the application to fit the system

Page 200: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Processing Nodes Are

Locations on which the framework runs applications

Logical rather than physical constructs

Do not necessarily correspond to the number of CPUs in your system
– Typically one node for two CPUs

Can define one processing node for multiple physical nodes, or multiple processing nodes for one physical node

Page 201: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Optimizing Parallelism

Degree of parallelism determined by number of nodes defined

Parallelism should be optimized, not maximized
– Increasing parallelism distributes the work load but also increases Framework overhead

Hardware influences degree of parallelism possible

System hardware partially determines configuration

Page 202: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

More Factors to Consider

Communication amongst operators
– Should be optimized by your configuration
– Operators exchanging large amounts of data should be assigned to nodes communicating by shared memory or a high-speed link

SMP – leave some processors for the operating system

It is desirable to equalize the partitioning of data

Use an experimental approach
– Start with small data sets
– Try different degrees of parallelism while scaling up data set sizes

Page 203: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Factors Affecting Optimal Degree of Parallelism

CPU-intensive applications
– Benefit from the greatest possible parallelism

Disk-intensive applications
– The number of logical nodes should equal the number of disk spindles being accessed

Page 204: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

EE Configuration File

A text file containing string data that is passed to the Framework
– Sits on the server side
– Can be displayed and edited

Name and location are found in the environment variable APT_CONFIG_FILE (see the sketch below)

Components:
– Node
– Fastname
– Pools
– Resource
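A minimal sketch of switching configurations by pointing APT_CONFIG_FILE at a different file before a run; the path shown is a hypothetical placeholder, not a required location:

# Sketch: select the configuration file the next job run will use
export APT_CONFIG_FILE=/usr/dsadm/Ascential/DataStage/Configurations/4node.apt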

Page 205: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Sample Configuration File

{
	node "Node1"
	{
		fastname "BlackHole"
		pools "" "node1"
		resource disk "/usr/dsadm/Ascential/DataStage/Datasets" {pools ""}
		resource scratchdisk "/usr/dsadm/Ascential/DataStage/Scratch" {pools ""}
	}
}

Page 206: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Node Options

• Node name – name of a processing node used by EE
– Typically the network name
– Use the command uname -n to obtain the network name

• Fastname
– Name of the node as referred to by the fastest network in the system
– Operators use this physical node name to open connections
– NOTE: for an SMP, all CPUs share a single connection to the network

• Pools
– Names of the pools to which this node is assigned
– Used to logically group nodes
– Can also be used to group resources

• Resource
– Disk
– Scratchdisk

Page 207: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Node Pools

node "node1" {
	fastname "server_name"
	pools "pool_name"
}

"pool_name" is the name of the node pool, e.g. "extra".

Node pools group processing nodes based on usage.
– Example: memory capacity, or high-speed I/O.

One node can be assigned to multiple pools.

The default node pool ("") is made up of every node defined in the configuration file, unless a node is qualified as belonging to a different pool and is not also designated as belonging to the default pool (see the following example, and the sketch below).
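A minimal sketch of a node excluded from the default pool (all names are placeholders): the pools list names "extra" but not "", so this node is used only when a stage or job is constrained to the "extra" pool:

node "node3" {
	fastname "server_name"
	pools "extra"
	resource disk "/data/n3/d1" {pools ""}
	resource scratchdisk "/scratch" {pools ""}
}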

Page 208: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Resource Disk and Scratchdisk

node "node_0" {
	fastname "server_name"
	pools "pool_name"
	resource disk "path" {pools "pool_1"}
	resource scratchdisk "path" {pools "pool_1"}
	...
}

The resource type can be disk or scratchdisk.

"pool_1" is the disk or scratchdisk pool, allowing you to group disks and/or scratchdisks.

Page 209: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Disk Pools

Disk pools allocate storage.

Pooling applies to both disk types (disk and scratchdisk).

By default, EE uses the default pool, specified by "".

A named pool, e.g. pool "bigdata", groups specific disks so stages can be directed to them.

Page 210: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Sorting Requirements

Resource pools can also be specified for sorting:

The Sort stage looks first for scratch disk resources in a "sort" pool, and then in the default disk pool.

Sort uses as many scratch disks as are defined in the first pool it finds.

Page 211: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

{ node "n1" { fastname “s1" pool "" "n1" "s1" "sort" resource disk "/data/n1/d1" {} resource disk "/data/n1/d2" {} resource scratchdisk "/scratch" {"sort"} } node "n2" { fastname "s2" pool "" "n2" "s2" "app1" resource disk "/data/n2/d1" {} resource scratchdisk "/scratch" {} } node "n3" { fastname "s3" pool "" "n3" "s3" "app1" resource disk "/data/n3/d1" {} resource scratchdisk "/scratch" {} } node "n4" { fastname "s4" pool "" "n4" "s4" "app1" resource disk "/data/n4/d1" {} resource scratchdisk "/scratch" {} } ...}

{ node "n1" { fastname “s1" pool "" "n1" "s1" "sort" resource disk "/data/n1/d1" {} resource disk "/data/n1/d2" {} resource scratchdisk "/scratch" {"sort"} } node "n2" { fastname "s2" pool "" "n2" "s2" "app1" resource disk "/data/n2/d1" {} resource scratchdisk "/scratch" {} } node "n3" { fastname "s3" pool "" "n3" "s3" "app1" resource disk "/data/n3/d1" {} resource scratchdisk "/scratch" {} } node "n4" { fastname "s4" pool "" "n4" "s4" "app1" resource disk "/data/n4/d1" {} resource scratchdisk "/scratch" {} } ...}

4 5

1

6

2 3

Configuration File: Example

Page 212: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Resource Types

Disk

Scratchdisk

DB2

Oracle

Saswork

Sortwork

Can exist in a pool
– Groups resources together

Page 213: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Using Different Configurations

Example: a Lookup stage where the DBMS is using a sparse lookup type

Page 214: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Building a Configuration File

Scoping the hardware:
– Is the hardware configuration SMP, cluster, or MPP?
– Define each node structure (an SMP would be a single node):
	Number of CPUs
	CPU speed
	Available memory
	Available page/swap space
	Connectivity (network/back-panel speed)
– Is the machine dedicated to EE? If not, what other applications are running on it?
– Get a breakdown of the resource usage (vmstat, mpstat, iostat); see the sketch below
– Are there other configuration restrictions? E.g. the database only runs on certain nodes, and ETL cannot run on them
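A minimal sketch of Unix commands for scoping a candidate node (the interval/count arguments are illustrative):

# Gather per-node facts before writing the configuration file
uname -n      # network name, used for the node's "fastname"
vmstat 5 5    # memory, paging, and CPU summary
iostat 5 5    # disk I/O activity per device
mpstat 5 5    # per-processor utilization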

Page 215: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

To Create Hardware Specifications

Hardware Type ___________

Node Name _____________

# CPUs _____________

CPU Speed _____________

Memory _____________

Page Space _____________

Swap Space _____________

TCP Addr _____________	Switch Addr _____________

Complete one of the above per node. Complete one of the following per disk subsystem:

Hardware Type ________________

Shared between nodes	Y or N

Storage Type ________________

Storage Size ________________

Read Cache size _______________

Write Cache size _______________

Read hit ratio _______________

Write hit ratio _______________

I/O rate (R/W) _______________

# channels/ controllers____

Throughput ____

In addition, record all coexistence usage (other applications or subsystems sharing this disk subsystem).

Page 216: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

“Ballpark” Tuning

1. Calculate the theoretical I/O bandwidth available (don't forget to reduce this by the amount that other applications, on this machine or others, are impacting the I/O subsystem). See DS Performance and Tuning for calculation methods.

2. Determine the I/O bandwidth being achieved by the DS application (rows/sec × bytes/row); see the worked example below.

3. If the achieved I/O rate isn't approximately equal to the theoretical rate, there is probably a bottleneck elsewhere (CPU, memory, etc.).

4. Attempt to tune to the I/O bandwidth.

5. Pay particular attention to I/O-intensive competing workloads such as database logging and paging/swapping.

As a rule of thumb, generally expect the application to be I/O constrained. First try to parallelize (spread out) the I/O as much as possible. To validate that you've achieved this, some useful commands are:
– iostat (I/O activity)
– vmstat (system activity)
– mpstat (processor utilization)
– sar (resource usage)
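A worked example of step 2, with hypothetical numbers: if the job is processing 50,000 rows/sec and each record is 200 bytes wide, then

	50,000 rows/sec × 200 bytes/row = 10,000,000 bytes/sec ≈ 10 MB/sec

If the disk subsystem can theoretically sustain much more than this, suspect a CPU or memory bottleneck rather than I/O.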

Page 217: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Exercise

Complete exercise 9-1 and 9-2

Page 218: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Blank

Page 219: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Module 10

Extending DataStage EE

Page 220: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Objectives

Understand the methods by which you can add functionality to EE

Use this understanding to:
– Build a DataStage EE stage that handles special processing needs not supplied by the vanilla stages
– Build a DataStage EE job that uses the new stage

Page 221: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

EE Extensibility Overview

Sometimes it will be to your advantage to leverage EE’s extensibility. This extensibility includes:

Wrappers

Buildops

Custom Stages

Page 222: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

When To Leverage EE Extensibility

Types of situations:
– Complex business logic, not easily accomplished using standard EE stages
– Reuse of existing C, C++, Java, COBOL, etc. code

Page 223: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Wrappers vs. Buildop vs. Custom

Wrappers are good if you cannot or do not want to modify the application and performance is not critical.

Buildops: good if you need custom coding but do not need dynamic (runtime-based) input and output interfaces.

Custom (C++ coding using framework API): good if you need custom coding and need dynamic input and output interfaces.

Page 224: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Building “Wrapped” Stages

You can "wrapper" a legacy executable – a binary, a Unix command, or a shell script –

… and turn it into an enterprise edition stage capable, among other things, of parallel execution…

… as long as the legacy executable is (see the sketch below):
– Amenable to data-partition parallelism: no dependencies between rows
– Pipe-safe: can read rows sequentially, with no random access to data
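A sketch of the distinction, using standard Unix commands chosen here as illustrations (they are not from the course material):

# Good wrapper candidate: filters each row independently via stdin/stdout
grep -v "^#"

# Poor candidate: must read all rows before writing any output,
# so it cannot be pipelined row by row
sort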

Page 225: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Wrappers (Cont’d)

Wrappers are treated as a black box:
– EE has no knowledge of the contents
– EE has no means of managing anything that occurs inside the wrapper
– EE only knows how to export data to, and import data from, the wrapper

The user must know at design time the intended behavior of the wrapper and its schema interface.

If the wrapped application needs to see all records prior to processing, it cannot run in parallel.

Page 226: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

LS Example

Can this command be wrappered?

Page 227: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Creating a Wrapper

Diagram: the wrapped "ls" stage, and the job in which it is used.

Page 228: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Creating Wrapped Stages

From Manager: right-click on Stage Types
> New Parallel Stage > Wrapped

We will "wrapper" an existing Unix executable – the ls command

Wrapper Starting Point

Page 229: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Wrapper - General Page

Unix command to be wrapped

Name of stage

Page 230: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Conscientiously maintaining the Creator page for all your wrapped stages will eventually earn you the thanks of others.

The "Creator" Page

Page 231: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Wrapper – Properties Page

If your stage will have properties, complete the Properties page.

The Property name entered here will be the name of the property as it appears in your stage.

Page 232: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Wrapper - Wrapped Page

Interfaces – input and output columns. These should first be entered into the table definitions meta data (DS Manager); let's do that now.

Page 233: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

• Layout interfaces describe the columns the stage:
– Needs for its inputs (if any)
– Creates for its outputs (if any)
• Interfaces should be created as table definitions with columns in Manager

Interface schemas

Page 234: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Column Definition for Wrapper Interface

Page 235: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

How Does the Wrapping Work?

– Define the schema for export and import; the schemas become the interface schemas of the operator and allow for by-name column access
– Define the multiple inputs/outputs required by the UNIX executable

Diagram: rows matching the input schema are exported to the executable's stdin (or a named pipe); the UNIX executable writes to stdout (or a named pipe), which is imported back into the output schema.

QUIZ: Why does export precede import?

Page 236: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Update the Wrapper Interfaces

This wrapper will have no input interface – i.e. no input link. The location will come as a job parameter that will be passed to the appropriate stage property.

Page 237: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Resulting Job

Wrapped stage

Page 238: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Job Run

Show file from Designer palette

Page 239: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Wrapper Story: Cobol Application

Hardware environment:
– IBM SP2, 2 nodes with 4 CPUs per node

Software:
– DB2/EEE, COBOL, EE

Original COBOL application:
– Extracted a source table, performed a lookup against a table in DB2, and loaded the results to a target table
– 4 hours 20 minutes, sequential execution

Enterprise edition solution:
– Used EE to perform parallel DB2 extracts and loads
– Used EE to execute the COBOL application in parallel
– The EE Framework handled data transfer between DB2/EEE and the COBOL application
– 30 minutes, 8-way parallel execution

Page 240: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Buildops

Buildop provides a simple means of extending beyond the functionality provided by EE, but does not use an existing executable (as the wrapper does).

Reasons to use buildop include:
– Speed/performance
– Complex business logic that cannot be easily represented using existing stages:
	– Lookups across a range of values
	– Surrogate key generation
	– Rolling aggregates
– Build once, reusable everywhere within the project; no shared container necessary
– Can combine functionality from different stages into one

Page 241: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

BuildOps

– The user performs the fun task: encapsulating the business logic in a custom operator
– The enterprise edition interface called "buildop" automatically performs the tedious, error-prone tasks: invoking the needed header files and building the necessary "plumbing" for correct and efficient parallel execution
– Exploits the extensibility of the EE Framework

Page 242: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

From Manager (or Designer), in the Repository pane:
Right-click on Stage Types > New Parallel Stage > {Custom | Build | Wrapped}

• "Build" stages are coded from within enterprise edition
• "Wrapped" stages wrap existing Unix executables

BuildOp Process Overview

Page 243: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

General Page

Identical to Wrappers, except: under the Build tab, your program!

Page 244: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Logic Tab for Business Logic

Enter business C/C++ logic and arithmetic in the four pages under the Logic tab.

The main code section goes in the Per-Record page – it is applied to every row.

NOTE: the code must be ANSI C/C++ compliant. If the code does not compile outside of EE, it won't compile within EE either!

Page 245: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Code Sections under Logic Tab

Definitions: temporary variables are declared [and initialized] here

Pre-Loop: logic here is executed once BEFORE processing the FIRST row

Post-Loop: logic here is executed once AFTER processing the LAST row
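A minimal sketch of how the sections divide up, using a hypothetical row counter (plain C, as the Logic pages expect):

// Definitions (or Pre-Loop): declare and initialize a temporary variable
int totalRows = 0;

// Per-Record: executed once for every input row
totalRows++;

// Post-Loop: executed once, after the last row has been processed
// (e.g. use totalRows in a final calculation or message)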

Page 246: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

I/O and Transfer

Under the Interface tab are the Input, Output & Transfer pages.

Callouts from the screen:
– Input page: 'Auto Read' reads the next row automatically; columns come from an in-repository Table Definition
– Output page: optional renaming of the output port from the default "out0"; the write-row setting can be 'False' so as not to interfere with the Transfer page
– The first line corresponds to output 0

Page 247: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

I/O and Transfer

• Transfer from input in0 to output out0.
• If the page is left blank, or Auto Transfer = "False" (and RCP = "False"), only the columns in the output Table Definition are written.

Callout: the first line specifies the transfer of index 0.

Page 248: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Building Stages: Simple Example

Example – sumNoTransfer:
– Adds input columns "a" and "b"; ignores other columns that might be present in the input
– Produces a new "sum" column
– Does not transfer the input columns

sumNoTransfer
input:	a:int32; b:int32
output:	sum:int32
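A minimal sketch of the Per-Record logic for this stage, assuming the buildop convention that columns from the interface Table Definitions are addressable by name in the code:

// Per-Record page: compute the output column from the two input columns
sum = a + b;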

Page 249: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

NO TRANSFER

• Causes:
– RCP set to "False" in the stage definition, and
– Transfer page left blank, or Auto Transfer = "False"

• Effects:
– Input columns "a" and "b" are not transferred
– Only the new column "sum" is transferred

Compare with transfer ON… (the Peek output for the no-transfer case shows only the "sum" column)

Page 250: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

TRANSFER

• Causes:
– RCP set to "True" in the stage definition, or
– Auto Transfer set to "True"

• Effects:
– The new column "sum" is transferred, as well as
– input columns "a" and "b", and
– input column "ignored" (present in the input, but not mentioned in the stage)

Page 251: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Adding a Column With Row ID

Page 252: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Columns vs. Temporary C++ Variables

Columns:
– DS-EE type
– Defined in Table Definitions
– Value refreshed from row to row

Temporary C++ variables:
– C/C++ type
– Need declaration (in the Definitions or Pre-Loop page)
– Value persistent throughout the "loop" over rows, unless modified in code

Page 253: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Exercise

Complete exercise 10-1 and 10-2

Page 254: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Exercise

Complete exercises 10-3 and 10-4

Page 255: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Custom Stage

Reasons for a custom stage:
– Add an EE operator not already in DataStage EE
– Build your own operator, using the EE API, and add it to DataStage EE

Use the Custom stage type to add the new operator to the EE canvas.

Page 256: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Custom Stage

DataStage Manager > select Stage Types branch > right click

Page 257: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Custom Stage

Name of Orchestrate operator to be used

Number of input and output links allowed

Page 258: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Custom Stage – Properties Tab

Page 259: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

The Result

Page 260: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Blank

Page 261: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Module 11

Meta Data in DataStage EE

Page 262: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Objectives

Understand how EE uses meta data, particularly schemas and runtime column propagation

Use this understanding to:
– Build schema definition files to be invoked in DataStage jobs
– Use RCP to manage meta data usage in EE jobs

Page 263: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Establishing Meta Data

Data definitions
– Recordization and columnization
– Fields have properties that can be set at the individual field level
– Data types in the GUI are translated to the types used by EE
– Described as properties on the Format/Columns tabs (Outputs or Inputs pages), OR
– Described using a schema file (which can be full or partial)

Schemas
– Can be imported into Manager
– Can be pointed to by some job stages (e.g. Sequential)

Page 264: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Data Formatting – Record Level

Format tab

Meta data described on a record basis

Record level properties

Page 265: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Data Formatting – Column Level

Defaults for all columns

Page 266: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Column Overrides

Edit row from within the columns tab

Set individual column properties

Page 267: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Extended Column Properties

Field and string settings

Page 268: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Extended Properties – String Type

Note the ability to convert ASCII to EBCDIC

Page 269: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Editing Columns

Properties depend on the data type

Page 270: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Schema

Alternative way to specify column definitions for data used in EE jobs

Written in a plain text file

Can be written as a partial record definition

Can be imported into the DataStage repository

record {final_delim=end, delim=",", quote=double}
(
	first_name: string;
	last_name: string;
	gender: string;
	birth_date: date;
	income: decimal[9,2];
	state: string;
)

Page 271: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Creating a Schema

Using a text editor
– Follow the correct syntax for definitions

OR

Import from an existing data set or file set
– In DataStage Manager: Import > Table Definitions > Orchestrate Schema Definitions
– Select the checkbox for a file with a .fs or .ds extension

Page 272: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Importing a Schema

The schema location can be on the server or on the local workstation

Page 273: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Data Types

Date

Decimal

Floating point

Integer

String

Time

Timestamp

Vector

Subrecord

Raw

Tagged

Page 274: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Partial Schemas

Only need to define column definitions that you are actually going to operate on

Allowed by stages with format tab– Sequential file stage– File set stage– External source stage– External target stage– Column import stage
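A minimal sketch of a partial schema, assuming the intact record property and a byte-position field property; the field name, position, and length are hypothetical, and the exact property names should be checked against the schema syntax reference:

record { intact=rec, record_delim="\n" }
(
	state: string[2] { position=20 };
)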

Page 275: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Runtime Column Propagation

DataStage EE is flexible about meta data. It can cope with the situation where meta data isn’t fully defined. You can define part of your schema and specify that, if your job encounters extra columns that are not defined in the meta data when it actually runs, it will adopt these extra columns and propagate them through the rest of the job. This is known as runtime column propagation (RCP).

RCP is always on at runtime.

Design-time and compile-time column mapping enforcement:
– RCP is off by default
– Enable it first at the project level (Administrator, project properties)
– Then enable it at the job level (Job Properties, General tab)
– Then enable it at the stage level (link Output Columns tab)

Page 276: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Enabling RCP at Project Level

Page 277: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Enabling RCP at Job Level

Page 278: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Enabling RCP at Stage Level

Go to output link’s columns tab

For the Transformer, you can find the output link's Columns tab by first going to stage properties

Page 279: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Using RCP with Sequential Stages

To utilize runtime column propagation in the Sequential stage, you must use the "use schema" option.

Stages with this restriction:
– Sequential
– File Set
– External Source
– External Target

Page 280: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Runtime Column Propagation

When RCP is disabled:
– DataStage Designer will enforce Stage Input Column to Output Column mappings
– At job compile time, Modify operators are inserted on the output links in the generated osh

Page 281: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Runtime Column Propagation

When RCP is enabled:
– DataStage Designer will not enforce mapping rules
– No Modify operator is inserted at compile time
– Danger of a runtime error if incoming column names do not match the outgoing link's column names (names are case sensitive)

Page 282: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Exercise

Complete exercises 11-1 and 11-2

Page 283: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Module 12

Job Control Using the Job Sequencer

Page 284: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Objectives

Understand how the DataStage job sequencer works

Use this understanding to build a control job to run a sequence of DataStage jobs

Page 285: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Job Control Options

Manually write job control
– Code is generated in BASIC
– Use the Job Control tab on the Job Properties page
– Generates BASIC code which you can modify

Job Sequencer
– Build a controlling job much the same way you build other jobs
– Comprised of stages and links
– No BASIC coding

Page 286: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Job Sequencer

Built like a regular job
– Type "Job Sequence"
– Has stages and links
– A Job Activity stage represents a DataStage job
– Links represent the passing of control

Page 287: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Example

Job Activity stage – contains conditional triggers

Page 288: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Job Activity Properties

Job parameters to be passed

Job to be executed – select from dropdown

Page 289: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Job Activity Trigger

Trigger appears as a link in the diagram

Custom options let you define the code

Page 290: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Options

Use the custom option for conditionals
– e.g. execute only if the job run was successful or had warnings only

Can add a "wait for file" activity to the sequence

Add an "execute command" activity to drop the real tables and rename the new tables to the current table names

Page 291: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Job Activity With Multiple Links

Different links can have different triggers

Page 292: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Sequencer Stage

Can be set to All or Any

Build a job sequencer to control the jobs for the collections application

Page 293: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Notification

Notification Stage

Page 294: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Notification Activity

Page 295: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Sample DataStage log from Mail Notification

Page 296: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

E-Mail Message

Notification Activity Message

Page 297: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Exercise

Complete exercise 12-1

Page 298: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Blank

Page 299: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Module 13

Testing and Debugging

Page 300: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Objectives

Understand the spectrum of tools available for testing and debugging

Use this understanding to troubleshoot a DataStage job

Page 301: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Environment Variables

Page 302: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Parallel Environment Variables

Page 303: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Environment Variables – Stage Specific

Page 304: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Environment Variables

Page 305: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Environment Variables – Compiler

Page 306: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Typical job log messages:

Environment variables

Configuration file information

Framework info/warning/error messages

Output from the Peek stage

Additional info with the "Reporting" environment variables

Tracing/debug output
– Must compile the job in trace mode
– Adds overhead

The Director

Page 307: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

• Set from Job Properties, on the menu bar of Designer
• Director will prompt you before each run

Job Level Environment Variables

Page 308: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Troubleshooting

If you get an error during compile, check the following:

Compilation problems
– If a Transformer is used, check the C++ compiler and LD_LIBRARY_PATH
– If there are buildop errors, try buildop from the command line
– Some stages may not support RCP – this can cause a column mismatch
– Use the Show Error and More buttons
– Examine the generated OSH
– Check the environment variable settings

Very little integrity checking is done during compile; you should run Validate from Director.

The compiler highlights the source of the error.

Page 309: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Generating Test Data

The Row Generator stage can be used
– Driven by the column definitions
– Generated values are data-type dependent

Row Generator plus Lookup stages provides a good way to create robust test data from pattern files

Page 310: Module 1 DS324EE – DataStage Enterprise Edition Concept Review.

Blank