
INFORMATICA POWER CENTER 8.6.0

Informatica PowerCenter is a single, unified data integration platform which allows companies to access data from multiple source systems, transform the data into a homogeneous format, and deliver the data throughout the enterprise at any speed.

Informatica PowerCenter is a client-server technology which allows you to design, run, monitor and administer data acquisition applications known as Mappings.

A mapping is a graphical representation of data flow from source to destination. A mapping logically defines extraction, transformation and loading. A mapping is an ETL plan that is created with the following types of metadata:

I. Source Definition (E)

II. Target Definition (L)

III. Transformation Rule (T)

INFORMATICA PRODUCTS: -

The following are the various products from Informatica Corporation.

I. Informatica Power Center

II. Informatica Power Mart

III. Informatica Power Exchange

IV. Informatica Power Analyzer

V. Informatica Metadata Reporter

VI. Informatica Cloud

VII. Informatica Data Quality

VIII. Business to Business (B2B)

Note: - Informatica Corporation was founded in 1993 in Redwood City, California. It became a public company in 1999.

POWER CENTER 8.6 COMPONENTS: -

When we install Informatica Power Center the following components get installed:

I. Informatica Power Center Clients

II. Power Center Repository

III. Power Center Domain

IV. Repository Service

V. Integration Service

VI. Power Center Administration Console

Power Center Clients: -

There are four power center client components that get installed:

I. Power Center Designer

II. Power Center Work Flow Manager

III. Power Center Work Flow Monitor

IV. Power Center Repository Manager

Designer: -

The Designer is a GUI based client component which allows you to design the plan of the ETL process, called a Mapping.

A mapping is made up of following metadata definitions

I. Source

II. Transformation Rule

III. Target

Work-Flow Manager: -

It is a GUI based client component which allows you to perform the following tasks:

I. Create session for each mapping

II. Create work-flow to execute one (or) more sessions

III. Start work-flow

Session: -

A session is a task object that runs a mapping on the Integration Service.

Work-flow: -

A work-flow is the top object in the power center development hierarchy; it starts one (or) more sessions in sequence (or) parallel (or) both.

Work-flow Monitor: -

It is a GUI based client component which monitors the work-flows and sessions running on the Integration Service.

The work-flow and session status is displayed as succeeded (or) failed.

It allows you to fetch the session log from the repository.

Steps involved in building data acquisition:

i. Create Source definition

ii. Create Target definition

iii. Design a mapping with (or) without Transformation Rule

iv. Create session for each mapping

v. Create work-flow

vi. Start work-flow

Repository Manager: -

It is a GUI based administrative client component which allows you to perform the following tasks:

i. Create, edit and delete folders

ii. Assign permissions and privileges to users and user groups

Folder: -

A folder is a repository object which allows you to organize the metadata stored in the repository.

POWER CENTER REPOSITORY: -

The power center repository is a relational database that contains the metadata which is required to perform the ETL process. There are two types of repositories:

I. Local Repository

II. Global Repository

Local Repository: -

A repository that supports sharing the metadata within the repository across multiple users.

Global Repository: -

The metadata can be shared across multiple repositories.

The repository is the brain of the ETL system; it stores the ETL code (metadata).

REPOSITORY SERVICE: -

The repository service manages connections to the power center repository from client applications.

The repository service is a multi-threaded process that inserts, retrieves, updates and deletes metadata from the repository.

The repository service ensures the consistency of the metadata stored in the repository.

INTEGRATION SERVICE: -

An integration service is an ETL engine that performs extraction, transformation and loading.

The integration service reads sessions and mappings from the repository through the repository service.

The integration service also stores metadata such as session and work-flow status and the session log in the repository through the repository service.

An integration service is created with the following components:

i. Reader

ii. DTM (Data Transformation Manager)

iii. Writer

READER: - It performs extraction from mapping sources.

DATA TRANSFORMATION MANAGER (DTM): - It allocates buffers to process the data according to the transformation logic that you configure in the mapping.

WRITER: - It loads the data into the mapping targets.

POWER CENTER DOMAIN: -

The power center domain is the primary unit for managing and administering power center services.

The power center has a Service Oriented Architecture (SOA) that provides the ability to scale the services and share resources across multiple machines.

The power center domain is a collection of one (or) more nodes. A node is a processing unit. The node which hosts the domain is known as the master gateway node (or) primary node.

A master gate-way node receives the request from the client and distributes the request to the other nodes known as worker nodes.

If the master gateway node fails, user requests cannot be processed.

In real configurations we configure the domain with more than one node that can act as master gateway node.

POWER CENTER ADMINISTRATION CONSOLE: -

It is a web application which allows you to manage the power center domain.

The following administrative tasks can be performed using the administration console:

I. Configure existing nodes

II. Enable (or) Disable nodes

III. Add (or) Delete nodes

IV. Create users and user groups

V. Create power center services for each node

The following are the prerequisites to build an ETL process:

I. Setup source and target database

II. Create ODBC connections

III. Start power center services

IV. Create folder

Setup Source and Target Database: -

Start -> Programs -> Oracle Application Development -> SQL Plus

Log on to Oracle with the following details:

User Name: - SYSTEM

Password: - NIPUNA

Host String: - ORCL

SQL> CREATE USER BATCH10AM IDENTIFIED BY TARGET;

User created.

SQL> GRANT DBA TO BATCH10AM;

Grant succeeded.

SQL> CONN BATCH10AM/TARGET;

Connected.

SQL> CREATE TABLE DIM_EMPLOYEE (EMPNO NUMBER(5) PRIMARY KEY, ENAME VARCHAR2(10), JOB VARCHAR2(10), SAL NUMBER(7,2), COMM NUMBER(7,2), DEPTNO NUMBER(2));

Table created.

SQL > Select * from DIM_EMPLOYEE;

No rows selected.

EX: - SQL > CONN SCOTT/TIGER;

Connected

SQL > Select COUNT (*) from EMP;

COUNT (*)

SQL > CONN BATCH10AM/TARGET;

Connected

SQL > Select COUNT (*) from DIM_EMPLOYEE;

COUNT (*)

CREATION OF ODBC CONNECTIONS: -

Source ODBC Connection: -

Start -> Settings -> Control Panel -> Administrative Tools -> Data Sources (ODBC)

Select the System DSN Tab

Click on ADD

Select the driver Oracle in OraDb10g_home1

Click on FINISH

Enter the following details

Data Source Name: - SCOTT_ODBC

TNS Service Name: - ORCL

User ID: - SCOTT

Click on Test Connection

Enter the PASSWORD TIGER

Click OK

Again Click OK

Target ODBC Connection: -

Enter the following details

Data Source Name: - BATCH10AM_ODBC

TNS Service Name: - ORCL

User ID: - BATCH10AM

Click on Test Connection

Enter the PASSWORD TARGET

Click OK

Power Center Services: -

Start -> Settings -> Control Panel -> Administrative Tools -> Services

Start the following Services

I. Informatica Orchestration Server

II. Informatica Services 8.6.0

Creation of Folder: -

Start -> Programs -> Informatica Power Center 8.6.0 Clients -> Power Center Repository Manager

From the repository navigator window select the Repository Name NIPUNA_REP -> Right Click -> Click on CONNECT

Enter User Name: - ADMINISTRATOR

Password: - ADMINISTRATOR

Click on CONNECT

From the FOLDER Menu select CREATE

Enter the folder name: - BATCH10AM (our wish)

Click OK

POWER CENTER 8.6.0 INSTALLATION: -

Creation of Repository User Account: -

Start -> Programs -> Oracle Application Development -> SQL Plus

Log ON to the Oracle with the following details

User Name: -SYSTEM

Password: - NIPUNA

SQL> CREATE USER NIPUNA IDENTIFIED BY REP;

SQL> GRANT DBA TO NIPUNA;

SQL> CONN NIPUNA/REP;

Connected.

Installing Server Software: -

Browse to the location of the server folder

Click on OK

Click on NEXT

Click on NEXT

Click on browse to select the License Key

Click on NEXT

Click on NEXT

Click on NEXT

Click on NEXT

Click on INSTALL

Select CREATE NEW DOMAIN

Click on NEXT

Enter the account information to store power center domain configure metadata

Database Type: - ORACLE

Database URL: - NIPUNA:1521 (Computer Name:Port No)

Database User ID: - NIPUNA

Database Password: - REP

Database Service Name: - ORCL

Test connection passed -> Click on NEXT

Enter the DOMAIN PASSWORD and CONFIRM PASSWORD to create the domain

Domain User Name: - Admin (Default)

Domain Password: - Admin

Confirm Password: - Admin

Click on NEXT

Uncheck Run Informatica Services under Different User Account

Click on NEXT

Click on DONE

CREATION of POWER CENTER SERVICES: -

Start -> Programs -> Informatica Power Center 8.6.0 -> Services -> Informatica Power Center Administration Console

Enter User Name: - Admin

Password: - Admin

Click on OK

Select Administration Console

CREATION of REPOSITORY SERVICE: -

From the Create Menu -> Select Repository Service

Enter the following details to create Repository Service

Service Name: - NIPUNA_REP (Our Wish)

Node: NODE01_NIPUNA

Database Type: - ORACLE

Connect String: - ORCL

Database User: - NIPUNA

Database Password: - REP

Select create new repository content

Click on CREATE

Click on CLOSE

CREATION of INTEGRATION SERVICE: -

From the Create Menu -> Select Integration Service

Enter the following details to create the Integration Service

Service Name: - NIPUNA (Our Wish)

Assign: - NODE

Node: - NODE01_NIPUNA

Associated Repository Service: - NIPUNA_REP

Repository User Name: - Administrator

Repository Password: - Administrator

Data movement Mode: - UNICODE

Click on CREATE

Select Enable the Integration Service after creation

Click on OK

Click on CLOSE

From the Toolbar Click on SIGNOFF

Start -> Settings -> Control Panel -> Administrative Tools -> Services -> Informatica Services 8.6.0

Set the startup type to MANUAL for the following services:

i. Informatica Orchestration Server

ii. Informatica Services 8.6.0

Right Click and Select MANUAL for each of the above services

Restart the above Services

Log ON to Power Center Administration Console

Installing Power Center Client Software: -

Browse to the location of the Client Folder

Click on INSTALL

Click on OK

Click on NEXT

Click on NEXT

Click on NEXT

Click on NEXT

Click on INSTALL

Click on NEXT

Select the application Power Center Designer

Click on DONE

From Repository Navigator Window Select the REPOSITORIES and from REPOSITORY Menu click on ADD

Enter the following details

Repository: - NIPUNA_REP

User Name: - Administrator

Select the Repository Service NIPUNA_REP -> Right Click and Connect

From the connection settings select the DOMAIN: - DOMAIN_NIPUNA

Next -> Enter the Password: - Administrator

Click on CONNECT

STEPS INVOLVED IN IMPLEMENTING DATA ACQUISITION: -

Step 1: - Creation of Source Definition

A Source definition is created using source analyzer in the designer client component.

Procedure: -

Open the Client Power Center Designer

Connect to the Repository Service with a valid User Name and Password.

From the repository navigator window select the folder; from the Tools Menu select Source Analyzer.

From Sources Menu, click on Import from Database.

Connect to database with the following details

ODBC Data Source: - SCOTT_ODBC

User Name: - SCOTT

Owner Name: - SCOTT

Password: - TIGER

Click on Connect

Select the table (EMP)

Click on OK

From the Repository Menu click on SAVE

Step 2: - Creation of Target Definition

A Target definition is created using the Target Designer Tool

Procedure: -

From the Tools Menu select Target Designer

From Target Menu select Import from Database

Connect to database with the following details

ODBC Data Source: - BATCH10AM_ODBC

User Name: - BATCH10AM

Owner Name: - BATCH10AM

Password: - TARGET

Click on Connect

Select the table (DIM_EMPLOYEE)

Click on OK

From Repository Menu click on SAVE

Step 3: - Design Mapping Without Transformation Rule

A pass through mapping is designed with source and target definitions without transformation rule.

A mapping is designed using the Mapping Designer Tool

Procedure: -

From Tools Menu select Mapping Designer

From Mapping Menu select Create

Enter the Mapping Name: - M_Pass_Through

Click on OK

From the repository navigator window drop the Source and Target definitions

From Source Qualifier map the columns to the Target Definition with the simple drag and drop operations.

From Repository Menu click on SAVE

Note: - Every Source definition is by default associated with a Source Qualifier to define extraction.

Step 4: - Creation of Session

A session is created using the Task Developer Tool in the Work-Flow Manager client component.

A session is required to run the mapping on the Integration Service.

Procedure: -

Open the client Power Center Work-Flow Manager

Connect to Repository with a valid User Name and Password.

Select the Folder from the Repository navigator window.

From the Tools Menu select Task Developer

From the Tasks Menu select Create

Select the task type Session

Enter the Name: - S_M_Pass_Through

Click on Create

Select the Mapping

Click on OK

Click on Done

Creation of Reader Connection: -

From the Work-Flow Manager client select Connections

Click on Relational

Select the type Oracle

Click on New

Enter the following details

Name: - SCOTT_READER (or) READER_SCOTT (Our Wish)

User Name: - SCOTT

Password: - TIGER

Connect String: - ORCL

Click on OK

Creation of Writer Connection: -

Click on New

Enter the following details

Name: - BATCH10AM_WRITER (Our Wish)

User Name: - BATCH10AM

Password: - TARGET

Connect String: - ORCL

Click on OK

Double click the Session -> Select the Mapping Tab

From the left window select the Source: - SQ_EMP

From connections set the Reader connection value:

Type: Relational
Value: SCOTT_READER

From the left window select the Target DIM_EMPLOYEE

From connections set the Writer connection value:

Type: Relational
Value: BATCH10AM_WRITER

From properties set the Target load type to NORMAL

Click on Apply

Click on OK

From the Repository Menu click on SAVE

Step 5: - Creation of Work-Flow

A work-flow is created using the Work-Flow Designer Tool.

Procedure: -

From Tools Menu select Work-Flow Designer

From the Work-Flow Menu select Create

Enter the Work-Flow Name: - W_S_M_Pass_Through

Click on OK

From the Sessions sub folder drop the Session beside the Start task.

From the Tasks Menu select Link Task

Drag the Link from the Start task and drop it on the Session

From the Repository Menu click on SAVE

Step 6: - Start Work-Flow

From the Work-Flows Menu click Start Work-Flow

CREATION OF TARGET DEFINITION

Procedure: -

From Tools select Target Designer

From the Sources sub folder drop the Source Definition (EMP) onto the Target Designer Work Space.

Double click on the Target Definition and click on RENAME

Select the Columns Tab

Add a new column (or) delete an existing column

Click on Apply

Click on OK

From the Targets Menu click on Generate/Execute SQL

Click on CONNECT to the Target database using the ODBC connection

Select Create Table

Click on Generate/Execute

Click on CLOSE

TRANSFORMATIONS AND TYPES OF TRANSFORMATIONS: -

A transformation is a power center object which allows you to build the business logic to process the data. There are two types of transformations

I. Active Transformation

II. Passive Transformation

ACTIVE TRANSFORMATION: -

A transformation which can affect the number of rows (or) change the number of rows when the data is moving from source to target is known as an Active Transformation.

The following are the list of Active Transformations used for processing the data

I. Filter Transformation

II. Aggregator Transformation

III. Source Qualifier Transformation

IV. Joiner Transformation

V. Union Transformation

VI. Router Transformation

VII. Rank Transformation

VIII. Sorter Transformation

IX. Update Strategy Transformation

X. Transaction Control Transformation

XI. Normalizer Transformation

XII. SQL Transformation

PASSIVE TRANSFORMATION: -

A transformation which doesn't affect (or) change the number of rows is known as a Passive Transformation.

The following are the list of Passive Transformations used for processing the data

I. Expression Transformation

II. Stored Procedure Transformation

III. Sequence Generator Transformation

IV. Look-up Transformation

V. XML Source Qualifier Transformation

VI. SQL Transformation

PORTS AND TYPES OF PORTS: -

A port represents a column of the table (or) file. There are two types of ports

Input Port: -

A port which can receive the data is known as an Input Port, which is designated as I.

Output Port: -

A port which can provide the data is known as an Output Port, which is designated as O.

CONNECTED AND UNCONNECTED TRANSFORMATIONS: -

CONNECTED TRANSFORMATION: -

A transformation which is a part of the mapping data-flow is known as a Connected Transformation.

It is connected to the source and connected to the target.

A connected transformation can receive multiple input ports and can return multiple output ports.

All Active and Passive transformations can be defined as Connected Transformations.

UNCONNECTED TRANSFORMATION: -

A transformation which is not part of the mapping data-flow is known as an Unconnected Transformation. It is neither connected to the source nor connected to the target.

An unconnected transformation can receive multiple input ports but returns a single output port.

The following Transformations can be defined as Unconnected

I. Lookup Transformation

II. Stored Procedure Transformation

POWER CENTER TRANSFORMATION LANGUAGE: -

The power center transformation language is a set of built-in functions used to build transformation logic to process the data.

The Informatica function set is similar to SQL functions.

The built-in functions are categorized as follows

I. String Function

II. Numeric Function

III. Date Function

IV. Aggregate Function

V. Conversion Function

VI. Cleansing Function

VII. Variable Function

VIII. Scientific Function

IX. Test Function

X. Miscellaneous Function

FILTER TRANSFORMATION: -

This is of type Active Transformation which filters the data records based on a given condition.

The integration service evaluates the condition in the filter transformation and returns True (or) False. The filter transformation returns True when the input record satisfies the given condition; those records are allowed for further processing (or) loading. False indicates that the record is rejected by the filter transformation.

The rejected records cannot be captured (or) retrieved.

The filter transformation supports developing a single condition and allows you to pass the data to a single target.

The filter transformation functions as the WHERE clause in SQL.

Define the filter transformation to perform data cleansing.
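As a sketch of the WHERE-clause analogy, a filter condition of DEPTNO = 30 on the EMP source behaves like the following SQL (assuming the standard SCOTT.EMP table used throughout this material):

SELECT EMPNO, ENAME, JOB, SAL, COMM, DEPTNO
FROM EMP
WHERE DEPTNO = 30;

The difference is that rows failing the filter condition are discarded by the Integration Service rather than simply not returned.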

Business Rule: -

Calculate the Tax (Sal*0.17) for the top three employees based on salary who belong to the sales department; the sales department ID is 30.

RANK TRANSFORMATION: -

This is of type Active Transformation which allows you to calculate ranks to identify the top and bottom performers.

The rank transformation is created with following types of ports.

I. Input Port (I)

II. Output Port (O)

III. Rank Port (R)

IV. Variable Port (V)

Rank Port: -

A port which participates in determining the ranks is designated as the Rank Port.

Variable Port: -

A port which allows you to store the data temporarily and develop expressions is known as a Variable Port.

The following properties can be set to calculate the ranks

I. Top/Bottom

II. Number of Ranks

The Rank Transformation by default is created with Rank Index output port.

The integration service uses the cache memory to process the Rank Transformation.

Note: - ***

The following are the Cache based Transformations (or) Costly Transformation

a. Rank Transformation

b. Sorter Transformation

c. Joiner Transformation

d. Lookup Transformation

e. Aggregator Transformation

EXPRESSION TRANSFORMATION: -

This is of type Passive Transformation which allows you to calculate expressions for each record. The expression transformation is created with the following types of ports

I. Input Port (I)

II. Output Port (O)

III. Variable Port (V)

The expression transformation supports developing expressions only in output ports (or) variable ports.

Define the expression transformation to perform data scrubbing.

Procedure: -

Create a source definition with the name EMP

Create a target definition with the name EMP_TAX (columns EMPNO, ENAME, JOB, SAL, TAX, DEPTNO)

Create a mapping with the name M_EMP_TAX_CALCULATION and drop the source and target definitions.

From Transformation Menu select CREATE

Select the Transformation type FILTER

Enter the name FILTER_EMPLOYEES (Our Wish)

Click on CREATE

Click on DONE

From source qualifier copy the required ports to the filter transformation.

Double click on the filter transformation and select the Properties Tab

Transformation Attribute    Value
Filter Condition            DEPTNO = 30

Click on APPLY

Click on OK

From Transformation Menu select CREATE

Select the Transformation type RANK

Enter the name RANK_EMPLOYEES (Our Wish)

Click on CREATE

Click on DONE

From Filter Transformation copy the ports to the Rank Transformation

Double click on the Rank Transformation and select the Ports Tab

For the Port name SAL select Rank Port (R)

Select the Properties Tab

Transformation Attribute    Value
Top/Bottom                  Top
Number of Ranks             3

Click on APPLY

Click on OK

Create the Transformation type Expression

From the Rank Transformation copy the Ports to the Expression Transformation (except Rank Index)

Double click on the Expression Transformation and select the Ports Tab

From the Tool Bar click on ADD A NEW PORT

Port Name  Data Type  Precision  Scale  I/O/V  Expression
TAX        Decimal    7          2      O      SAL*0.17

Click on OK

From the Expression Transformation connect the Ports to the Target.

Mapping Rule: -

i) Filter the records when they contain Nulls

Solution: -

Create a Filter Transformation with the following condition

Transformation Attribute    Value
Filter Condition            IIF(ISNULL(COMM), FALSE, TRUE)

ii) Filter the records if anyone column contains Nulls

Solution: -

Create a Filter Transformation with the following condition

IIF(ISNULL(EMPNO) OR
    ISNULL(ENAME) OR
    ISNULL(JOB) OR
    ISNULL(MGR) OR
    ISNULL(HIREDATE) OR
    ISNULL(SAL) OR
    ISNULL(COMM) OR
    ISNULL(DEPTNO), FALSE, TRUE)

iii) Migrate all records from Source to Target using the Filter Transformation

Solution: -

Create a Filter Transformation with the following condition

TRUE

Note: - The default Filter Condition is TRUE

iv) Loading Employees whose name starts with S

Solution: -

Create the Filter Transformation with the following condition

SUBSTR(ENAME, 1, 1) = 'S'

v) Loading Employees whose number is Even

Solution: -

Create the Filter Transformation with the following condition

MOD (EMPNO, 2) = 0

vi) Loading Employees whose Employee name is having more (or) equal to 6 characters

Solution: -

Create the Filter Transformation with the following condition

LEN(ENAME) >= 6

vii) Double the count in the target when the source has N records

Solution: -

Create the Target definition as multiple instances

EXPRESSION TRANSFORMATION

Variable Port: -

A port which can store the data temporarily is known as Variable Port (V).

Variable Ports are LOCAL to the transformation.

A Variable Port is required to simplify complex expressions and improve the efficiency of the calculation. Variable Ports are not visible in normal view, but are visible in edit view.

Mapping Rules: -

i) Calculate the TAX based on total SALARY.

Total salary is calculated as the sum of salary and commission, where the commission column may have NULL values.

If the total salary is greater than 2500 then calculate the tax as (total sal*0.15), else calculate it as (total sal*0.12).

Solution: -

Create the transformation type Expression

Double click on the Expression Transformation and select the Ports Tab

Port Name    Data Type  Precision  Scale  I/O/V  Expression
V_TOTAL_SAL  Decimal    7          2      V      IIF(ISNULL(COMM), SAL, SAL+COMM)
TAX          Decimal    7          2      O      IIF(V_TOTAL_SAL > 2500, V_TOTAL_SAL*0.15, V_TOTAL_SAL*0.12)

Click on APPLY

Click on OK

ii) Reject the records which contain NULL values

Create the Transformation type Expression

Double click on the Expression Transformation and select the Ports Tab

Port Name       Data Type  Precision  Scale  I/O/V  Expression
Exception_Flag  String     10         0      O      IIF(ISNULL(EMPNO) OR ISNULL(ENAME) OR ISNULL(JOB) OR ISNULL(SAL) OR ISNULL(COMM) OR ISNULL(DEPTNO), 'E', 'C')

Click on APPLY

Click on OK

Create the Transformation type Filter and develop the following condition

Transformation Attribute    Value
Filter Condition            Exception_Flag = 'C'

Click on APPLY

Click on OK

DATA ATTRIBUTES: -

Mapping Rules: -

i) Derive the calendar attributes such as Year, Quarter, Month, Week, Day, Day number in month, Day number in year.

Solution: -

Create the Transformation type Expression

Double click on the Expression Transformation and select the Ports Tab

From the Toolbar click on Add a New Port

Port Name          Data Type  Precision  Scale  Port  Expression
YOJ                Decimal    6          0      O     TO_DECIMAL(TO_CHAR(HIREDATE, 'YYYY'))
Quarter            Decimal    6          0      O     TO_DECIMAL(TO_CHAR(HIREDATE, 'Q'))
Month              Decimal    6          0      O     TO_DECIMAL(TO_CHAR(HIREDATE, 'MM'))
Month Name         String     10         0      O     TO_CHAR(HIREDATE, 'MON')
Week no. in month  Decimal    6          0      O     TO_DECIMAL(TO_CHAR(HIREDATE, 'W'))
Week no. in year   Decimal    6          0      O     TO_DECIMAL(TO_CHAR(HIREDATE, 'WW'))
Day no. in week    Decimal    6          0      O     TO_DECIMAL(TO_CHAR(HIREDATE, 'D'))
Day no. in month   Decimal    6          0      O     TO_DECIMAL(TO_CHAR(HIREDATE, 'DD'))
Day no. in year    Decimal    6          0      O     TO_DECIMAL(TO_CHAR(HIREDATE, 'DDD'))

ii) Calculate the total salary for each employee: Total SAL = SAL + COMM, where the COMM column may have NULL values

Solution: -

Port Name  Data Type  Precision  Scale  Port  Expression
TOTAL_SAL  Decimal    6          0      V     SAL + IIF(ISNULL(COMM), 0, COMM)

iii) Decode the gender 0 as M, 1 as F and unknown as UNK

Solution: -

Create the Transformation type Expression

Develop the following conditions

IIF(GENDER=0, 'M', IIF(GENDER=1, 'F', 'UNK'))

DECODE(GENDER, 0, 'M',
               1, 'F',
               'UNK')

Note: - The power center supports the following comment markers to ignore expressions (or) any text while executing on the Integration Service:

--, //

iv) Concatenate two string fields such as first name and last name

Solution: -

Create the Transformation type Expression

Develop the following conditions

LTRIM(RTRIM(EFNAME)) || ' ' || LTRIM(RTRIM(ELNAME))

CONCAT(EFNAME, CONCAT(' ', ELNAME))

v) Calculate the employee experience in number of years

Solution: -

Create the Transformation type Expression

Develop the following conditions

1. DATE_DIFF(SYSDATE, HIREDATE, 'YYYY')

vi) Calculate the Tax (SAL*0.17) for the top three employees of each department

Solution: -
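The solution steps are not written out here; a plausible approach using the components already introduced is a Rank Transformation with SAL as the Rank Port, DEPTNO as a Group By port, Top selected and Number of Ranks set to 3, followed by an Expression Transformation that adds TAX = SAL*0.17. For reference, a SQL sketch of the intended result (assuming Oracle analytic functions):

SELECT EMPNO, ENAME, SAL, DEPTNO, SAL*0.17 AS TAX
FROM (SELECT E.EMPNO, E.ENAME, E.SAL, E.DEPTNO,
             RANK() OVER (PARTITION BY DEPTNO ORDER BY SAL DESC) AS RNK
      FROM EMP E)
WHERE RNK <= 3;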

FILTER TRANSFORMATION PERFORMANCE OPTIMIZATION: -

Keep the Filter Transformation as close to the Source Qualifier as possible to filter the data early in the data-flow so that the number of records is reduced for further processing.

EXPRESSION TRANSFORMATION PERFORMANCE OPTIMIZATION: -

Create Variable ports to simplify complex expressions.

Use the DECODE function rather than using multiple IIF functions.

Use the string operator || to concatenate two string fields rather than using the CONCAT function.

ROUTER TRANSFORMATION: -

This is of type Active Transformation which allows you to develop multiple conditions and allows you to pass the records to multiple targets. The Router Transformation is created with two types of groups

I. Input group

II. Output group

INPUT GROUP: -

An Input group receives the data from the source pipeline.

There should be only one input group to receive the data.

OUTPUT GROUP: -

An Output group provides the data for further processing (or) loading.

There are two types of Output groups

User Defined Output group: -

It allows you to develop a condition.

Each group has one condition; all group conditions are evaluated for each row.

One row can pass multiple conditions.

Unlinked group Outputs are ignored.

Default group: -

The default group captures a row that fails all group conditions.

PERFORMANCE CONSIDERATIONS: -

The Router Transformation has a performance advantage over multiple Filter Transformations because a row is read once into the Input group but evaluated multiple times based on the number of groups, whereas using multiple Filter Transformations requires the same row data to be duplicated for each Filter Transformation.

DIFFERENCES BETWEEN FILTER & ROUTER TRANSFORMATIONS: -

FILTER                                        ROUTER
i) Single condition                           i) Multiple conditions
ii) Single target                             ii) Multiple targets
iii) Does not capture the rejected records    iii) The default group can capture rejected
    that fail to meet the filter condition        records that fail all group conditions

MAPPING RULES: -

i) Correct data passes through one target and exception data passes through another target; a record that contains a NULL value is defined as an Exception.

Solution: -

Create the Transformation type Expression and develop the following Expression

IIF(ISNULL(EMPNO) OR
    ISNULL(ENAME) OR
    ISNULL(JOB) OR
    ISNULL(MGR) OR
    ISNULL(HIREDATE) OR
    ISNULL(SAL) OR
    ISNULL(COMM) OR
    ISNULL(DEPTNO), 'E', 'C')

Create the Transformation type Router

Double click on the Router Transformation and select the Groups Tab

From the Toolbar click on ADD a new group

Group Name      Group Filter Condition
Correct_Data    Exception_Flag = 'C'
Exception_Data  Exception_Flag = 'E'
Default         (captures rows that fail all group conditions)

Click on APPLY

Click on OK

SELECT * FROM EMP_EVEN;

i) Even number Employees to one target and odd number Employees to another table

Solution: -

Create the Transformation type Router

Double click on the Router Transformation and select the Groups Tab

Group Name  Group Filter Condition
EVEN        MOD(EMPNO, 2) = 0

ii) Employee names starting with A pass through one target and Employee names starting with S pass through another target

Solution: -

Create transformation type Router

Double click on the Router Transformation and select the Groups Tab

From the Toolbar click on ADD a new group

Group Name  Group Filter Condition
ENAME_A     SUBSTR(ENAME, 1, 1) = 'A'
ENAME_S     SUBSTR(ENAME, 1, 1) = 'S'

SORTER TRANSFORMATION: -

This is of type Active Transformation which allows you to sort the data either in ascending order (or) descending order.

The port(s) which participate in sorting the data are designated as Key Port(s).

The Sorter Transformation can also be used to eliminate duplicates; hence it is known as an Active Transformation.

AGGREGATOR TRANSFORMATION: -

This is of type Active Transformation which allows you to calculate summaries for groups of records. An Aggregator Transformation is used to perform aggregate calculations.

An Aggregator Transformation is created with following components

I. Group By

II. Aggregate Expressions

III. Sorted Input

IV. Aggregate Cache

Group By: -

It defines a group on a port(s) for which we calculate Aggregate Expressions.

Aggregate Expressions: -

Aggregate expressions can be developed only in output ports (or) variable ports. The following Aggregate functions can be used to define the standard Aggregation

I. Sum()

II. Avg()

III. Max()

IV. Min()

V. Count()

VI. First()

VII. Last()

VIII. Median()

IX. Variance()

X. Percentile()

The Aggregate functions can be used only in Aggregator Transformation.

It calculates the single value for all records in a group.

**Only one Aggregate function can be nested within an Aggregate function.

Conditional statements can be used with Aggregate functions.
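For example, a conditional aggregate such as SUM(IIF(SAL > 1500, SAL, 0)) adds up only the qualifying rows of each group. As a sketch, the SQL equivalent of that output port would be:

SELECT DEPTNO, SUM(CASE WHEN SAL > 1500 THEN SAL ELSE 0 END) AS SUMSAL
FROM EMP
GROUP BY DEPTNO;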

Sorted Input: -

It instructs the Aggregator to expect the data to be sorted.

The Aggregator Transformation can handle sorted (or) unsorted data.

The sorted data can be aggregated more efficiently decreasing total processing time.

**The Integration Service will cache the data for each group and releases the cached data upon reaching the first record of the next group.

The cache size is minimized, so the load on the machine is reduced.

**The data must be sorted according to the order of the Aggregator group by ports.

The Group By ports are sorted using a Sorter Transformation; keep the Sorter Transformation prior to the Aggregator Transformation.

Unsorted Aggregator: -

No rows are released from cache until all rows are aggregated.

The cache size requirement increases, thereby increasing the load on the machine.

The Aggregator efficiency decreases.

Aggregator Cache: -

When the mapping contains an Aggregator Transformation the Integration Service uses the cache memory to process the aggregator.

When the session completes, the cache is erased.

Note: -

Nested Aggregation: -

SUM (AVG (SAL)) -- One level nesting

SUM (AVG (MAX (SAL))) -- Two level nesting

Mapping Rule: -

Calculate the total salary for each group for employees whose salary is greater than 1500
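No solution is spelled out here; a minimal sketch using the components already covered: a Filter Transformation (or a source filter) with the condition SAL > 1500, followed by an Aggregator with DEPTNO as the Group By port and an output port SUMSAL = SUM(SAL). The SQL equivalent:

SELECT DEPTNO, SUM(SAL) AS SUMSAL
FROM EMP
WHERE SAL > 1500
GROUP BY DEPTNO;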

LOOKUP TRANSFORMATION: -

This is of type Active Transformation which allows you to look up Relational tables, Flat Files, Synonyms and Views.

When the mapping contains a Lookup Transformation the Integration Service queries the lookup data and compares it with the transformation port values (Source data).

The Lookup Transformation is created with the following type of ports

I. Input Port (I)

II. Output Port (O)

III. Lookup Port (L)

IV. Return Port (R)

The Lookup Transformation is used to perform the following tasks

I. Get a related value

II. In implementing Slowly Changing Dimensions

There are two types of Lookups.

I. Connected Lookup

II. Unconnected Lookup

Connected Lookup: -

It is a part of the mapping data-flow.

It can receive multiple Input ports and can provide multiple Output ports (Single record)

Unconnected Lookup: -

It is not a part of mapping data-flow.

It is neither connected to the Source nor connected to the Target.

An Unconnected Lookup can receive multiple Input ports but it returns a Single Output Port, which is designated as Return Port (R).

Lookup Transformation Cache: -

Caching can significantly impact the performance.

Cached Lookup: -

Lookup data is cached locally on server.

Source rows (or) records are looked-up against cache.

Only one SQL select is needed.
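As a sketch, for the DEPT lookup built later in this section, the cache build would issue a single query of roughly this shape:

SELECT DEPTNO, DNAME, LOC
FROM DEPT;

Every source row is then compared against this in-memory result instead of hitting the database again.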

Un-Cached Lookup: -

For each Source row one SQL select is needed.

Source rows are looked-up against database (DB lookup).

Rule of Thumb: -

Cache the Lookup if the number of records in the lookup table is small relative to number of records in the Source.

Un-Cached Lookup Performance Consideration: -

If you have ten rows in the Source and one million records in the Lookup table, then Power Center builds the cache for the lookup table and checks the ten Source rows against the cache.

It takes more time to build a cache of one million rows than to go to the database ten times and look up against the table directly. Use the un-cached Lookup instead of building the cache.

Note: -

By default, caching is enabled for the Lookup.

Procedure: -

Create Source and Target definitions (DEPTNO, DNAME, LOC, and SUMSAL)

Create a Mapping with the name M_Aggregation_Lookup

Drop the Source and Target definitions

Create the Transformation type Sorter and Aggregator

From Source Qualifier copy the following ports to the Sorter Transformation

DEPTNO & SAL

Double click on the Sorter Transformation and select the Ports Tab

For the Port Name DEPTNO select Key

Click on APPLY & OK

From Sorter Transformation copy the ports to Aggregator

Double click on Aggregator Transformation and select Ports Tab

For the Port Name DEPTNO select Group By

Uncheck the Output Port for the Port Name SAL

From the Toolbar click on ADD a new port

Port Name  Data Type  Precision  Scale  Port  Expression
SUMSAL     Decimal    7          2      O     SUM(SAL)

Select Properties Tab and select Sorted Input

Click on APPLY & OK

From Aggregation Transformation connect the Ports to Target

From Transformation Menu select CREATE

Select the Transformation type LOOKUP

Enter the Name

Click on CREATE

Select the SOURCE

Select the table DEPT

Click on OK

From Aggregator Transformation copy the port DEPTNO to the LOOKUP

Double click the LOOKUP Transformation select the Condition Tab

From the Toolbar click on ADD a new Condition

Lookup Table Column  Operator  Transformation Port
DEPTNO               =         DEPTNO1

Click on APPLY

Click on OK

From LOOKUP Transformation connect the Ports to Target

Note: -

Lookup Transformation supports Joins (Horizontal Merging).

The Lookup Transformation also supports inequality comparisons (!=, >, <, >=, <=).

The Lookup Transformation supports multiple Lookup conditions.

The Lookup Transformation supports only the AND operator between multiple Lookup conditions.

It does not support the OR operator.

JOINER TRANSFORMATION: -

This is of type Active Transformation which allows you to combine data records horizontally from multiple sources based on a join condition.

The Joiner Transformation supports only two input streams per Joiner.

The Joiner Transformation supports multiple join conditions; the conditions are combined using the AND operator (it does not support OR).

The inputs to the Joiner Transformation are designated as the Master and Detail Sources. The source which has the lesser number of records is designated as the Master Source, since it occupies the least amount of space in the cache.

The Integration Service creates the joiner cache only for the Master Source.

The Joiner Transformation is created with the following types of ports

I. Input Port (I)

II. Output Port (O)

III. Master Port (M)

A Master Source is defined with the master ports.

The Joiner Transformation supports Homogenous Data Sources and Heterogeneous Data Sources to combine the records horizontally.

A join which is made on the same type of data sources is known as a Homogeneous Join.

Ex: -

ORACLE TABLE + ORACLE TABLE

SQL SERVER TABLE + SQL SERVER TABLE

A join which is made on two different data sources is known as a Heterogeneous Join.

Ex: -

SQL SERVER TABLE + ORACLE TABLE

The Joiner Transformation also supports non-relational sources such as Flat Files, XML Files etc.

The Joiner Transformation supports the following types of Joins

I) Normal Join (Equi-Join (or) Inner Join)

II) Master Outer Join

III) Detail Outer Join

IV) Full Outer Join

The default join type is Normal Join.

The Joiner Transformation does not support non-equi joins (<, >, !=).

NORMAL JOIN: -

It combines the records from Master and Detail Sources based on equality match.

MASTER OUTER JOIN: -

It keeps all the records from the Detail Source and matching records from Master Source.

DETAIL OUTER JOIN: -

It keeps all the records from the Master Source and matching records from Detail Source.

FULL OUTER JOIN: -

It keeps matching and non-matching records from both Master and Detail Sources.
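As a sketch of the four join types in SQL terms, assuming DEPT is the Master source and EMP is the Detail source:

-- Normal Join: only matching rows
SELECT E.EMPNO, E.ENAME, D.DNAME
FROM EMP E INNER JOIN DEPT D ON E.DEPTNO = D.DEPTNO;

-- Master Outer Join: all Detail (EMP) rows plus matching Master rows
SELECT E.EMPNO, E.ENAME, D.DNAME
FROM EMP E LEFT OUTER JOIN DEPT D ON E.DEPTNO = D.DEPTNO;

-- Detail Outer Join: all Master (DEPT) rows plus matching Detail rows
SELECT E.EMPNO, E.ENAME, D.DNAME
FROM EMP E RIGHT OUTER JOIN DEPT D ON E.DEPTNO = D.DEPTNO;

-- Full Outer Join: matching and non-matching rows from both sides
SELECT E.EMPNO, E.ENAME, D.DNAME
FROM EMP E FULL OUTER JOIN DEPT D ON E.DEPTNO = D.DEPTNO;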

Performance Considerations: -

Define the Master Source which can occupy the least amount of space in cache.

An Inner Join can improve the performance over an Outer Join because an Inner Join results in fewer records than an Outer Join.

Use Sorted Input: - Keep the Sorter Transformation prior to the Joiner and sort the data on a port which participates in the Join condition

Ex: - DEPTNO

JOINER TRANSFORMATION - ADVANTAGES: -

Can join heterogeneous sources

Can join non-relational sources

Can join partially transformed data

DISADVANTAGES: -

Can only join two input data streams per Joiner

Only supports Equi-Join

Does not support an OR condition

Procedure: -

Create two Source definitions with the names EMP and DEPT

Create a Target definition with the name EMP_DEPT (EMPNO, ENAME, JOB, SAL, DEPTNO, DNAME, and LOC)

Create a Mapping with a name M_Homogeneous_Join

Drag both Source and Target into Mapping Work Space

Create Transformation type JOINER with a name JOIN_EMP_DEPT

Connect the required ports from SQ_EMP to the Joiner Transformation

Connect the ports from SQ_DEPT to the Joiner Transformation

Double click on the Joiner Transformation and click on the Condition Tab

From the Toolbar click on ADD a new condition

MASTER   OPERATOR  DETAIL
DEPTNO1  =         DEPTNO

Click on Properties and unselect the Sorted Input

Connect the ports from the Joiner Transformation to the corresponding ports in the Target

Create Session

Create Work-Flow

Start Work-Flow

UNION TRANSFORMATION: -

It is of type Active Transformation used to combine multiple sources into a single output.

It supports Homogeneous sources as well as Heterogeneous sources.

All the inputs of a Union Transformation should have the same structure (number of columns and datatypes should be same)

The Union Transformation works like UNION ALL in Oracle.
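A minimal SQL sketch of the analogy, assuming the identically structured EMP and EMP1 tables used in the procedure below:

SELECT EMPNO, ENAME, SAL, DEPTNO FROM EMP
UNION ALL
SELECT EMPNO, ENAME, SAL, DEPTNO FROM EMP1;

Like UNION ALL, the Union Transformation does not remove duplicate rows.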

Union Transformation is created with two types of groups

i) Input group: - receives the data

ii) Output group: - sends the data to either the target (or) any other downstream transformation for further processing.

Procedure: -

Create two Source definitions with the names EMP and EMP1 with the same structure

Create a Target definition with the name EMP_UNION with the same structure as EMP (or) EMP1

Create a Mapping with the name M_HOMOGENEOUS_UNION

Drag the Sources EMP and EMP1, drop on the work space

Drag the Target EMP_UNION, drop on the work space

Create a Transformation of type UNION with the name UNION_EMP_EMP1

Double click on the header of UNION Transformation

Click on Groups Tab and click on ADD a new group (name the Group as EMP & EMP1)

Click on Group Ports Tab and ADD the following Ports

PORT NAME  DATATYPE   PRECISION  SCALE
EMPNO      Integer    10         0
ENAME      String     10         0
JOB        String     10         0
MGR        Integer    10         0
HIREDATE   Date/Time  29         9
SAL        Decimal    10         0
COMM       Decimal    10         0
DEPTNO     Integer    10         0

Click on APPLY

Click on OK

Connect the Ports from SQ_EMP to EMP group of Union Transformation

Connect the Ports from SQ_EMP1 to EMP1 group of Union Transformation

Connect the Ports from the Output group of the Union Transformation to the Target

From Repository click on SAVE

Data-flow Diagram: -

Create a Session with name S_M_UNION

Create a Work-flow with name W_S_M_UNION

Start Work-flow

NOTE: - Union Transformation supports vertical merging.

HETEROGENEOUS JOINS: -

A join which is made on two different data sources is known as a Heterogeneous Join.

Creation of SQL Server Database: -

Procedure: -

START -> PROGRAMS -> MICROSOFT SQL SERVER -> ENTERPRISE MANAGER

From the left window expand CONSOLE ROOT

Select the DATABASES folder, Right click and click on NEW DATABASE

Enter the DATABASE NAME (Ex: - BATCH10AM)

Click on OK

Creation of Table in SQL Server Database: -

Procedure: -

START -> PROGRAMS -> MICROSOFT SQL SERVER -> QUERY ANALYZER

Connect to SQL Server with the following details

SQL SERVER: - NIPUNA (computer name)

SQL Server Authentication

LOG NAME: - SA

PASSWORD: - SA

Click on OK

Commands: -

USE BATCH10AM

CREATE TABLE DEPT (DEPTNO INTEGER, DNAME VARCHAR(10), LOC VARCHAR(10))

SP_HELP DEPT

INSERT INTO DEPT VALUES (10, 'SALES', 'HYD')

INSERT INTO DEPT VALUES (20, 'OPERATIONS', 'CHE')

INSERT INTO DEPT VALUES (30, 'ACCOUNTS', 'DEL')

SELECT * FROM DEPT

Creation of ODBC connections: -

START -> SETTINGS -> CONTROL PANEL -> ADMINISTRATIVE TOOLS -> DATA SOURCES (ODBC)

Select the System DSN Tab and click on ADD

Select the driver SQL Server

Click on FINISH

Enter the following details

Name: - BATCH10AM_SQL_SERVER (our wish)

Server: - NIPUNA (computer name)

Click on NEXT

Select with SQL Server Authentication

Log ID: - SA

Password: - SA

Click on NEXT

Select change the default database to BATCH10AM

Click on NEXT

Click on FINISH

Click on Test Data Sources

Click on OK

Creation of Reader connection to Microsoft SQL Server: -

Procedure: -

Open the Client Power Center Work-flow Manager

From Connections Menu, select Relational

Select the type Microsoft SQL Server

Click on NEW

Enter the following details to create connection object

Name: - SQL_SERVER_READER (our wish)

User Name: - SA

Password: - SA

Attribute      Value
Database Name  SCOTTDB
Server Name    NIPUNA (computer name)

Click on OK

Creation of Writer connection to Microsoft SQL Server: -

Procedure: -

Open the Client Power Center Work-flow Manager

From Connections Menu, select Relational

Select the type Microsoft SQL Server

Click on NEW

Enter the following details to create connection object

Name: - SQL_SERVER_WRITER (our wish)

User Name: - SA

Password: - SA

Attribute      Value
Database Name  BATCH10AMDB
Server Name    NIPUNA (computer name)

Click on OK

Creation of Source definition from Microsoft SQL Server: -

From the Tools Menu select Source Analyzer

From Sources Menu click on Import from Database

Connect to the database with the following details

ODBC data Source: - BATCH10AM_SQL_SERVER

User Name: - SA

Owner Name: -SA

Password: - SA

Click on CONNECT

From Show Owners select ALL

Select the Table

Click on OK

From Repository Menu click on SAVE

Create a Target definition with a name EMP_DEPT (oracle)

Columns list (EMPNO, ENAME, JOB, SAL, DEPTNO, DNAME, LOC)

Create a Mapping with a name M_HETEROGENEOUS_JOIN

Drop the Sources definition on work space

Create the Transformation type JOINER

From SQ_EMP copy the required Ports to the Joiner Transformation

From SQ_DEPT copy the required Ports to the Joiner Transformation

Change the Datatype for a Port Name: - DEPTNO from Integer to Decimal

Double click on the Joiner Transformation and select the Condition Tab

From the Toolbar click on ADD a new condition

MASTER   OPERATOR  DETAIL
DEPTNO1  =         DEPTNO

Click on APPLY

Click on OK

From Joiner Transformation connect the ports to the Target definition

From Repository Menu click on SAVE

Create a Session with a name S_M_HETEROGENEOUS_JOIN

Create a Work-flow with a name W_S_M_HETEROGENEOUS_JOIN

Start Work-flow

SOURCE QUALIFIER TRANSFORMATION: -

This is of type Active Transformation which supports users writing their own SQL queries, known as SQL Override.

The Source Qualifier Transformation supports SQL override when the source is a database.

The Source Qualifier Transformation supports source filters, user defined joins, sorting input data, eliminating duplicates using distinct, etc.

The Source Qualifier Transformation supports reading the data from tables and Flat Files (Text files).

The Source Qualifier Transformation functions as a SQL SELECT statement.

Key Points: -

The following Transformations support SQL Override

I. Source Qualifier Transformation

II. Lookup Transformation

SQL OVERRIDE - VERTICAL MERGING: -

Procedure: -

Create Source definition EMP (Oracle) and Target definition EMP_COUNT

Create a Mapping with the name M_SQLOVERWRITE_UNION

Drop the Source and Target definitions on Work Space

From Source Qualifier connect the ports to the Target and select the Properties Tab

TRANSFORMATION ATTRIBUTE  VALUE
SQL Query                 SELECT * FROM EMP
                          UNION ALL
                          SELECT * FROM EMP WHERE ROWNUM < 7

Click on APPLY

Click on OK

SQL OVERRIDE - SQL JOINS: -

The Source Qualifier Transformation supports only homogeneous data sources to perform horizontal merging. The Source Qualifier supports SQL joins such as Inner Join (Equi Join), Left Outer Join, Right Outer Join and Full Outer Join.

Advantages of Source Qualifier Join: -

It can join any number of tables.

The full functionality of standard SQL is available.

May reduce the volume of data on the network.

Disadvantages of Source Qualifier Join: -

It can only join homogeneous relational tables.

It can affect performance on the source database because source database servers may not be tuned with the required buffer sizes.

Procedure: -

Create Sources with the names EMP (Oracle) and DEPT (Oracle)

Create a Target with the name EMP_SQL_JOIN (Oracle)

Column list (EMPNO, ENAME, JOB, SAL, DEPTNO, DNAME, LOC)

Create a Mapping with the name M_SQL_JOIN

Drop the Sources and Target definitions on work space

Select the SQ_DEPT click on DELETE

From DEPT Source definition copy the ports to SQ_EMP

Double click on Source qualifier Transformation and select the Properties Tab

TRANSFORMATION ATTRIBUTE  VALUE
SQL Query                 SELECT EMP.EMPNO, EMP.ENAME, EMP.JOB, EMP.SAL, EMP.DEPTNO, DEPT.DNAME, DEPT.LOC FROM EMP INNER JOIN DEPT ON EMP.DEPTNO=DEPT.DEPTNO;

Click on APPLY

Click on OK

From Source qualifier connect the Ports to the Target

PROPERTIES OF THE SOURCE QUALIFIER: -

I) Source Filter: - supports writing conditions to filter the data.

Filter the data early in the data-flow by defining source filter to reduce the number of records for further processing.

It improves the performance of data extraction.

II) Keep the Filter Transformation as close to the Source qualifier as possible to filter the data early in the data-flow.

If possible move the same condition to Source Qualifier Transformation.

III) User Defined Joins: - It defines the join condition in the WHERE clause

Syntax: - EMP.DEPTNO=DEPT.DEPTNO

IV) Number of Sorted Ports: - The number of input ports used for sorting the data

It defines an ORDER BY clause in the SQL SELECT statement.
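As a sketch, with the user-defined join above and (as an assumption) the first two connected ports counted as sorted ports, the Source Qualifier would generate a SELECT of roughly this shape:

SELECT EMP.EMPNO, EMP.ENAME, EMP.SAL, DEPT.DNAME
FROM EMP, DEPT
WHERE EMP.DEPTNO = DEPT.DEPTNO
ORDER BY EMP.EMPNO, EMP.ENAME;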

Performance Considerations: -

Use a Sorter Transformation to perform the sort rather than using an ORDER BY clause in the SQL override, because the source database may not be tuned with the required buffer sizes.

V) Select Distinct: - It eliminates duplicates from the sources

Pre SQL: -

The Integration Service executes SQL statements using the source database connection before it starts extraction.

Post SQL: -

The Integration Service executes SQL statements using the source database connection after it completes extraction.

Note: - We can write multiple SQL statements which are separated by Semi-colon.

VI) Session Override: - It is the process of changing the business logic at the session level. The Integration Service executes the session level logic; it has higher priority than the mapping override.

Procedure: -

Double click on the Session and select the Mapping Tab

From left window select SQ_EMP

From Properties Tab select the Attribute SQL query

Click on BROWSE to open SQL editor

Change the Business Logic

Click on OK

Click on APPLY

Click on OK

Design a Mapping without Importing the Source definition from database

STORED PROCEDURE TRANSFORMATION: -

It is of type Passive Transformation used to call procedures written at the database level.

Stored Procedures are reusable.

A Stored Procedure is nothing but a set of SQL statements.

Properties of Stored Procedure Transformation: -

Normal Property: - (default property)

Use the Normal property to perform row-by-row calculations.

Source Pre-Load: -

The Integration Service executes a stored procedure before extracting the data from the source.

Source Post-Load: -

Integration Service executes a stored procedure after extracting the data from source.

Target Pre-Load: -

Integration Service executes a stored procedure before loading the data into target.

Target Post-Load: -

Integration Service executes a stored procedure after loading the data into target.

Connect to the Target and create the following procedure: -

SQL> SHOW USER
USER is "BATCH10AM"

SQL> CREATE OR REPLACE PROCEDURE TAXCAL_PROC (SAL IN NUMBER, TAX OUT NUMBER)
IS
BEGIN
  TAX := SAL*0.2;
END;
/

Procedure created.

SQL>
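To sanity-check the procedure before wiring it into a mapping, it can be called from SQL*Plus; a sketch, where the bind variable T is introduced purely for illustration:

SQL> VARIABLE T NUMBER
SQL> EXEC TAXCAL_PROC(5000, :T)
SQL> PRINT T

Since TAX := SAL*0.2, a salary of 5000 should print 1000.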

Procedure: -

Create a Target definition with the name EMP_TAX_SP

(EMPNO, ENAME, JOB, SAL, DEPTNO, TAX, DNAME, LOC)

Create a Mapping with the name M_TAXCAL_SP

Drag the Source EMP and Target EMP_TAX_SP definitions on the work space

Connect the Ports from Source qualifier to the corresponding Ports in the Target

Create Transformation type Stored Procedure with name SP_TAXCAL

Click on CREATE

Enter the following database information

ODBC Source: - BATCH10AM_ODBC (Oracle in OraDb10g_home1)

Username: - BATCH10AM

Owner name: - BATCH10AM

Password: - TARGET

Click on CONNECT

Expand the Procedures

Select the procedure TAXCAL_PROC

Click on DONE

Connect the Port SAL from the Source Qualifier to the Port SAL in the Stored Procedure Transformation

Connect the Port TAX from the Stored Procedure Transformation to the Port TAX in the Target

From the Repository Menu click on SAVE

Create a Session with the name S_M_TAXCAL_SP

Double click on Session and select Mapping Tab

Specify the Source and Target connections

From Transformations select Stored Procedure (SP_TAXCAL)

Set the following Property

ATTRIBUTE               VALUE
Connection Information  BATCH10AM (the default is Target)

Click on APPLY

Click on OK

Create a Work-flow with the name W_S_M_TAXCAL_SP

Start Work-flow

FLAT FILES: -

A flat file is an ASCII text file which is saved with an extension .txt, .csv (comma separated values), or .dat.

There are two types of flat files

I. Delimited Flat File

II. Fixed Width Flat File

Delimited Flat File: -

Each column (or) field is separated by some special character such as Comma, Tab, Space, Semi-colon, Pipe etc.

Ex: - Customer_East.txt

Step 1: - Creation of Source definition

Procedure: -

From the Tools Menu select Source Analyzer

From Sources Menu click on Import from File

Select the location of the File (C:\Flat File)

Select the File

Click on OK

Select the Flat File type Delimited

Select Import Field names from first line

Click on NEXT

Select the Delimiter Comma

Click on NEXT

Click on FINISH

From the Repository Menu click on SAVE

Step 2: - Creation of Target definition

Create a Target definition with the name CUSTOMER (Oracle)

Step 3: - Creation of Mapping

Create a Pass Through Mapping with the name M_FLAT_FILE

Drop the Source and Target definitions on the Work Space

Step 4: - Creation of Session

Create a Session with the name S_M_FLAT_FILE

Double click on Session and select the Mapping Tab

From left window select SQ_CUSTOMER_EAST

From Properties set the following attributes

Attribute              Value
Source File type       Direct
Source File directory  C:\Flat File
Source File name       Customer_East.txt

From the left window select the Target (Customer)

Set the Writer connection with the load type Normal

Click on APPLY

Click on OK

From Repository Menu click on SAVE

Note: -

The Source Qualifier does not support SQL statements for files; it supports them only for databases.

Power Center supports Flat Files, but by definition a DWH does not support files.


**FILE LIST: -A File list is a collection of multiple text files with the same Delimiter, Metadata that can be merged with the Source File type as Indirect.

Creation of File List: -

Open the text editor Notepad and provide the path of each source file

C:\Flat File\CUSTOMER_EAST.txt

C:\Flat File\CUSTOMER_SOUTH.txt

C:\Flat File\CUSTOMER_WEST.txt

SAVE the File with the name List.txt.

Source Files: -

CUSTOMER_EAST.txt       CUSTOMER_SOUTH.txt      CUSTOMER_WEST.txt
CNO, CNAME, AMOUNT      CNO, CNAME, AMOUNT      CNO, CNAME, AMOUNT
100, Arun, 2000         102, Chandu, 2000       104, Sandy, 2000
101, Anil, 1000         103, chitti, 2000       105, Dusty, 2000

Procedure: -

Create a Target definition with the name CUSTOMER_LIST (oracle)

Create a Pass through Mapping with the name M_File_List

Drop the Source and Target definitions on work Space

Create a session with the name S_M_File_List

Double click the Session and select the Mapping tab

From Left window select the Source qualifier (SQ_CUSTOMER_EAST)

From Properties set the following attributes

Attribute                 Value
Source File type          Indirect
Source File directory     C:\List
Source File name          LIST.txt

From Left window select the Target

Set the Writer connection with the load type Normal

Click on APPLY

Click on OK

From Repository Menu click on SAVE

**Reject Truncated/Overflow Rows: -

From Mapping double click on Target definition and select Properties Tab

Transformation Attribute          Value
Reject Truncated/Overflow Rows    Enabled (checked)

Click on APPLY

Click on OK

When you run the Session the Integration Service rejects the truncated and overflow records; they can be viewed in the following directory.

C:\Informatica\PowerCenter 8.6.0\Server\Infa_Shared\BadFiles

Fixed Width Flat Files: -

Every record has the same length.

Each record has to be split at given break points using break lines.

EX: - EMPLOYEE.txt

7001VSSNAAYANA450020
7002SIVA      500010
7003SURYA     550030

Note: -

Fixed Width files improve performance over Delimited files.

Comma, Tab, Space and Pipe are column delimiter characters.

\n is the record delimiter character (the break-point sketch below shows how a record splits).
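To make the break points concrete: each 20-byte EMPLOYEE record above splits as columns 1-4 (EMPNO), 5-14 (ENAME), 15-18 (SAL) and 19-20 (DEPTNO); a hedged SQL sketch of the same split, assuming the raw line is staged in a one-column table STG(LINE):

SELECT SUBSTR(LINE, 1, 4)  AS EMPNO,   -- bytes 1-4
       SUBSTR(LINE, 5, 10) AS ENAME,   -- bytes 5-14
       SUBSTR(LINE, 15, 4) AS SAL,     -- bytes 15-18
       SUBSTR(LINE, 19, 2) AS DEPTNO   -- bytes 19-20
FROM STG;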

Step: -1) Creation of Source definition
Procedure: -

From Tools Menu click on source Analyzer

From sources Menu click on Import from File

Select the location of the File

Select the File

Click on OK

Select the Flat File type Fixed Width

Click on NEXT

Provide the column names for each field

Click on FINISH

Right click on the Source definition and click on Edit

Click on Advance

Set number of bytes to skip between records: 2

Click on OK

Click on APPLY

Click on OK

From Repository Menu click on SAVE

From creating the Target through starting the Work Flow, the procedure is the same as for the Delimited Flat File.

XML Source Qualifier Transformation: -

The XML Source qualifier transformation reads the data from XML files, which are saved with an extension .xml.

Every XML source definition by default associates with XML source qualifier transformation.

EX: - EMP.xml

<EMPLOYEES>
   <EMP>
      <EMPNO>7001</EMPNO>
      <ENAME>SMITH</ENAME>
      <JOB>MANAGER</JOB>
      <SAL>5000</SAL>
      <DEPTNO>10</DEPTNO>
   </EMP>
   <EMP>
      <EMPNO>7002</EMPNO>
      <ENAME>WARD</ENAME>
      <JOB>CLERK</JOB>
      <SAL>3000</SAL>
      <DEPTNO>20</DEPTNO>
   </EMP>
</EMPLOYEES>

Step: -1) Creation of Source definition

From Tools Menu click on Source Analyzer

From Sources Menu click on Import XML definition

Select the location of an XML file with the files of type .xml

Select the file EMP.xml

Click on OPEN

Click on YES

Click on OK

Click on NEXT

Select Hierarchy relationships

Select Denormalized XML views

Click on FINISH

From Repository Menu click on SAVE

Step: -2) Creation of Target definition

By default we get the key XPK_EMP; delete it by double clicking on the Target definition.

From Mapping to start Work Flow the procedure is the same as above.

Transaction Control Transformation: -

This is of type an Active transformation which controls the transactions bounded by Commit and Rollback.

The Transaction Control transformation functions like the TCL commands (COMMIT, ROLLBACK) in SQL.

Power Center supports controlling the transactions at two different levels

i) At mapping level using the Transaction Control transformation

ii) At session level using the commit interval

The conditional transaction control expressions can be developed using the Transaction Control transformation at mapping level.

EX: - IIF (SAL > 3000, Commit, Rollback)

The following constants can be used to write condition-based commits

i) TC_COMMIT_AFTER

ii) TC_COMMIT_BEFORE

iii) TC_ROLLBACK_AFTER

iv) TC_ROLLBACK_BEFORE

The Transaction Control transformation is used to perform condition-based commits (or) user-defined commits.

The transactions can also be controlled using the commit interval property, which is defined at session level.

A commit interval is a number of rows at which the Integration Service applies a commit to the Target.

The default commit interval is 10,000 rows.

The following are the commits defined at session level

i) Target based commit

ii) User defined commit

iii) Source based commit

The default commit type is Target based commit.

Target Based Commit: -

During a Target Based Commit session the Integration Service commits to the Target based on the following factors

i) Commit Interval: - It defines the number of rows at which the Integration Service applies the commit.

ii) Writer Wait Timeout: -

The amount of time writer waits before it issues a commit.

Configure the writer wait timeout in the Integration Service.

iii) Buffer Blocks: -

Blocks of memory that hold rows of data during the Session.

When you run a Target Based Commit session the Integration Service may issue a commit BEFORE, ON or AFTER the given commit interval. The Integration Service uses the following process to issue the commit

i) When the Integration Service reaches a commit interval it continues to fill the writer buffer block. When the writer buffer block fills, the Integration Service issues a commit.

ii) If the writer buffer fills before the commit interval, the Integration Service writes the data to the Target but waits to issue a commit.

The Integration Service issues the commit using writer wait timeout.

Source Based Commit: -
The Integration Service commits the data to the Target based on the number of rows from an Active Source (i.e., Source Qualifier).

Case: -1)

You have a Source Based commit session that passes 10,000 records from active source.

When the 10,000 records reach the Target the Integration Service issues a commit.

If the Session completes successfully the Integration Service issues commits at 10,000, 20,000, 30,000 and so on.

Case: -2)

You have a Source Based commit session that passes 10,000 records from active source but 3,000 rows are dropped due to Transformation Logic.

The Integration Service issues a commit to the Target when 7,000 remaining rows reach the Target.

Procedure: -

Create Source and Target definitions

Create a Mapping with the name M_USERDEFINED_COMMIT

Drop the source and Target definitions on work Space

Create the Transformation type Transaction Control

From Source qualifier copy the ports to Transaction controlled transformation.

Transformation Attribute         Value
Transaction Control Condition    IIF (SAL > 1500, TC_COMMIT_AFTER, TC_ROLLBACK_AFTER)

Click on APPLY

Click on OK

From Transaction Control transformation connect the ports to the Target

Create a Session with the name S_M_USERDEFINED_COMMIT

Set Writer and Reader connections

Create the Work Flow with the name W_S_M_USERDEFINED_COMMIT

Start Work Flow

Note: -

A Target Based Commit session gives better performance than a Source Based Commit session.

Certification level Question: -

Here we have two commit points, since T1 is based on the Active Source SQ_EMP and T2 is based on the Aggregator transformation.

When you run the Source Based Commit session the Integration Service identifies the two Active Sources to issue the commit.

For T1 Active Source is SQ_EMP and for T2 Active Source is Aggregator transformation.

Normalizer transformation: -

This is of type an Active transformation which allows you to read the data from COBOL sources.

The Cobol Source definition associates with Normalizer Transformation like Source Qualifier.

Use the Normalizer Transformation to perform Horizontal data pivot.

The data pivot is a process of converting a single input record into multiple output records.

EX: - Pivoting data (splitting a single record into multiple records)

Procedure: -

Creation of Source definition: -

Accounts.txt

Creation of Target definition: -

Create a Target definition with the name T_ACCOUNT

(YEAR, ACCOUNT, MONTH, AMOUNT)

Create a Mapping with the name M_DATA_PIVOT

Drop the Source and Target definitions on Work Space

Create the Transformation type Normalizer transformation

Double click on the Normalizer transformation and select Normalizer Tab

COLUMN NAME    OCCURS    DATA TYPE    PRECISION    SCALE
Year           0         Number       5            0
Account        0         String       10           0
Amount         3         Number       10           0

Click on APPLY

Click on OK

From Source qualifier connect the Ports to Normalizer transformation

From Normalizer transformation connect the Ports to the Target (GCID_AMOUNT → MONTH)

GCID → Generated Column ID
GK → Generated Key
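The horizontal pivot performed above can be pictured in SQL terms; a hedged sketch, assuming the source row carries three monthly amount columns (AMOUNT1, AMOUNT2, AMOUNT3, matching OCCURS = 3) and that GCID_AMOUNT supplies the month number:

SELECT YEAR, ACCOUNT, 1 AS MONTH, AMOUNT1 AS AMOUNT FROM ACCOUNTS
UNION ALL
SELECT YEAR, ACCOUNT, 2 AS MONTH, AMOUNT2 AS AMOUNT FROM ACCOUNTS
UNION ALL
SELECT YEAR, ACCOUNT, 3 AS MONTH, AMOUNT3 AS AMOUNT FROM ACCOUNTS;

-- one input row (2010, Sales, 1000, 2000, 3000) becomes three output rows:
-- (2010, Sales, 1, 1000), (2010, Sales, 2, 2000), (2010, Sales, 3, 3000)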

SEQUENCE GENERATOR TRANSFORMATION: -

This is of type a Passive transformation which generates Numeric values.

The Numeric values are defined as Primary keys.

The Sequence generator transformation is created with the default output ports

i) NEXTVAL
ii) CURRVAL

This transformation does not support editing the existing output ports and does not support creating new ports.

Use the Sequence generator transformation to generate unique primary keys that are used as Surrogate keys.

The Sequence Generator transformation is used in implementing Slowly Changing Dimensions.

** Use the NEXTVAL port to generate sequence numbers by connecting it to a downstream transformation (or) Target.

You connect the NEXTVAL port to generate the sequence based on the Current Value and Increment By properties.

** CURRVAL is NEXTVAL + Increment By value.

The following properties can be set to define the Sequence Generator

Current Value: -

It defines the current value of the sequence.

The Integration Service uses the current value as the basis to generate the sequence values for each Session.

At the end of each session the Integration Service updates the current value to the last value generated for the session + 1.

End Value: -

The end value is the maximum value you want the Integration Service to generate.

If the Integration Service reaches the end value and the Sequence Generator is not configured to cycle, the session fails with an error message (overflow error).

Increment By: -

Difference between two consecutive values from Nextval port.

The Integration Service generates the sequence (Nextval) based on current value and Increment by properties.

Cycle: -

Use Cycle to generate a repeating sequence, such as numbers 1 through 12 to correspond to the months in a year.

If selected, the Sequence Generator returns to the start value when the end value is reached; otherwise the session stops.

Start Value: -

The start value of the generated sequence when you select Cycle.

Reset: -

If Reset is enabled the Integration Service generates the values based on the original current value for each session.

Otherwise the Integration Service updates the current value to the last value generated + 1.

Note: - The default Number of Cached Values for a standard Sequence Generator is zero.

Cyclic Loading (Round Robin Loading): -

Exercise: -

First record to first Target, second record to second Target, third record to third Target, fourth record again to first Target and so on.

Procedure: -

Create Source definition and Target definitions

Create Transformation type Sequence generator

Double click on Sequence generator and select the Properties Tab

Transformation Attribute    Value
Start Value                 1
Increment By                1
End Value                   3
Cycle                       Enabled (checked)

Create the Transformation type Router

From Source qualifier copy the ports to the Router

From Sequence generator copy the Nextval port to the Router Transformation

Double click on the Router Transformation and select the Groups Tab

Group Name    Group Filter Condition
T1            NEXTVAL = 1
T2            NEXTVAL = 2
T3            NEXTVAL = 3

Click on APPLY

Click on OK

From Repository Menu click on SAVE
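The cyclic NEXTVAL above behaves like modular arithmetic; a rough SQL analogy of the round-robin routing, assuming a CUSTOMER source on Oracle and ROWNUM as the record counter:

SELECT CNO, CNAME, AMOUNT,
       MOD(ROWNUM - 1, 3) + 1 AS NEXTVAL   -- cycles 1, 2, 3, 1, 2, 3, ...
FROM CUSTOMER;

-- rows with NEXTVAL = 1 route to T1, NEXTVAL = 2 to T2, NEXTVAL = 3 to T3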

Sequence Generation Blocks: -

i) The Integration Service generates numbers in blocks (number of links from the Sequence Generator = number of blocks)

ii) The Integration Service generates a single block which is copied to the multiple targets using an Expression transformation

iii) Design a Mapping which can pass even number rows to one Target and odd number rows to another Target

iv) Data flow diagram for generating sequence numbers without using a Sequence Generator

Sequence Generator Transformation Performance Optimization: -

It is best to configure the Sequence Generator transformation as close to the Target as possible in a mapping; otherwise the mapping carries sequence numbers through transformations that do not use them.

UPDATE STRATEGY TRANSFORMATION: -

This is of type an Active transformation which flags the source records for INSERT, UPDATE, DELETE (or) REJECT to define data driven operations.

An Update Strategy transformation functions like DML commands.

The Informatica Power Center supports two different ways to implement an Update Strategy.

i) Using an Update Strategy transformation at mapping level.

ii) Using the Treat Source Rows As property at Session level.

An Update Strategy transformation operates on the Target table.

The Target table definition requires a primary key to Update and Delete records from the Target.

An Update Strategy transformation is used in updating Slowly Changing Dimensions.

Update Strategy in Mapping Level: -

When you want the Integration Service to perform multiple database operations on Target then we use an Update Strategy transformation at mapping level.

When we use an Update Strategy transformation in a mapping the Integration Service follows the instructions coded in the mapping.

When we use an Update Strategy transformation in a mapping, set the Treat Source Rows As property to Data Driven at session level.

The conditional Update Strategy expressions can be developed using an Update Strategy transformation with the following constants.

i. DD_INSERT → 0 [DD = Data Driven]
ii. DD_UPDATE → 1
iii. DD_DELETE → 2
iv. DD_REJECT → 3

The default Update Strategy expression is DD_INSERT
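To relate the flags to SQL, each flagged row ultimately drives one DML statement against the Target; a hedged sketch, with <target> and the key column standing in for whatever table the mapping loads:

-- DD_INSERT (0):  INSERT INTO <target> (...) VALUES (...)
-- DD_UPDATE (1):  UPDATE <target> SET <non-key columns> WHERE <primary key> = ...
-- DD_DELETE (2):  DELETE FROM <target> WHERE <primary key> = ...
-- DD_REJECT (3):  the row is not written to the Target; it goes to the session reject file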

Update Strategy in Session Level: -

When you want the Integration Service to perform a single database operation only, use the Treat Source Rows As property to define the Update Strategy at Session level.

Treat Source Rows As: INSERT: -

The Integration Service performs only insert operation in the Target table.

Treat Source Rows As: UPDATE: -

The Integration Service performs only updates to the Target based on the primary key constraint.

Treat Source Rows As: DELETE: -

The Integration Service performs only delete operations on the Target based on the primary key constraint.

Treat Source Rows As: DATA DRIVEN: -

The Integration Service follows the instructions coded in update strategy transformation in mapping.

SLOWLY CHANGING DIMENSION TYPE-1 IMPLEMENTATION: -

A type-1 dimension captures only current changes.

It does not store any historical data.

Procedure: -

Create a Source definition with the name EMP

Create a Target definition with the name DIM_EMPLOYEE_TYPE1

(EMPKEY, EMPNO, ENAME, JOB, SAL, DEPTNO)

EMPKEY is the primary key (Surrogate Key)

Create a Mapping with the name M_EMPLOYEE_TYPE1_DIM

Drop the Source definition on Work Space

Drop the Target definition with the two instances on Work Space

(DIM_EMPLOYEE_TYPE1 for Insert and DIM_EMPLOYEE_TYPE11 for Update)

Create Transformation type Lookup with the name LKP_Trgt

From Target select the table DIM_EMPLOYEE_TYPE1

Click on OK

From Source qualifier copy the port EMPNO to the Lookup transformation

Double click on Lookup transformation and select the Conditions Tab
From Toolbar click on ADD a new condition

Lookup Table Column    Operator    Transformation Port
EMPNO                  =           EMPNO1

Select Properties Tab

Transformation Attribute    Value
Connection Information      BATCH10AMDB

Click on APPLY

Click on OK

Create Transformation type Expression

From Source qualifier copy the required ports to the Expression transformation

From Lookup transformation copy the port EMPKEY to Expression transformation

Double click on Expression transformation and select the Ports Tab

From Toolbar click on ADD a new port

Port Name      Datatype    P     S    O    Expression
Insert_Flag    String      10    0    O    IIF (ISNULL (EMPKEY), TRUE, FALSE)

Create the Transformation type Router

From Expression transformation copy all the ports to the Router transformation

Double click on Router transformation and select the Groups Tab

From Toolbar click on ADD a new group

Group Name     Group Filter Condition
Insert_Flag    Insert_Flag = TRUE
Update_Flag    Insert_Flag = FALSE

Click on APPLY

Click on OK

Insert Flow: -

Create Transformation type Sequence generator and Update Strategy

From Router transformation, from Insert group copy the following to Update Strategy (EMPNO1, ENAME1, JOB1, SAL1, and DEPTNO1)

Double click on Update Strategy transformation and select the Properties Tab

Transformation Attribute      Value
Update Strategy Expression    DD_INSERT (or) 0

From Update Strategy transformation connect the ports to the Target; from Sequence Generator connect the NEXTVAL port to the Target (EMPKEY)

Update Flow: -

Create Transformation type Update Strategy

From Router transformation, from update group copy the following to Update Strategy (EMPKEY3, EMPNO3, ENAME3, JOB3, SAL3, DEPTNO3)

Double click on Update Strategy transformation and select the Properties Tab

Transformation Attribute      Value
Update Strategy Expression    DD_UPDATE (or) 1

From Update Strategy transformation connect the ports to the Target (EMPKEY3 to EMPKEY)

From Repository Menu click on SAVE

Create Session with the name S_M_EMPLOYEE_TYPE1_DIM

Create Work Flow with the name W_ S_M_EMPLOYEE_TYPE1_DIM

Start Work Flow
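The combined effect of the insert and update flows above resembles a single Oracle MERGE; a hedged sketch, assuming a database sequence EMP_SEQ stands in for the Sequence Generator:

MERGE INTO DIM_EMPLOYEE_TYPE1 d
USING (SELECT EMPNO, ENAME, JOB, SAL, DEPTNO FROM EMP) s
ON (d.EMPNO = s.EMPNO)
WHEN MATCHED THEN
  UPDATE SET d.ENAME = s.ENAME, d.JOB = s.JOB,
             d.SAL = s.SAL, d.DEPTNO = s.DEPTNO   -- Type-1: overwrite, no history
WHEN NOT MATCHED THEN
  INSERT (EMPKEY, EMPNO, ENAME, JOB, SAL, DEPTNO)
  VALUES (EMP_SEQ.NEXTVAL, s.EMPNO, s.ENAME, s.JOB, s.SAL, s.DEPTNO);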

Mapping Logic: -

IMPLEMENTATION OF TYPE2: -
A Type2 dimension stores historical changes in the Target.
For each update in the OLTP source system it inserts a new record in the Target.
The Informatica Power Center supports three different methods to maintain history

i) Keeping a version number in a separate column

ii) Marking the current dimension record with a flag

iii) Using effective dates (start date and end date)

Version Based History: -

Source definition: EMP
Target definition: DIM_EMPLOYEE_TYPE2 (EMPKEY, EMPNO, ENAME, JOB, SAL, DEPTNO, VERSION)
Create a Mapping with the name M_EMPLOYEE_TYPE2_DATA
Drop the Source definition on Work Space
Drop the Target definition with the two instances
Create Transformation type Lookup with the name LKP_TRG
Select the Target table DIM_EMPLOYEE_TYPE2
Click on OK
From Source Qualifier copy the Port EMPNO to the Lookup transformation
Double click on Lookup transformation and select the Condition Tab
From Toolbar click on ADD a new Condition

Lookup Table Column    Operator    Transformation Port
EMPNO                  =           EMPNO1

From Properties Tab set the following attributes

Transformation Attribute    Value
Connection Information      BATCH10AMDB
Lookup SQL Override         (the query below)

SELECT OUT.EMPKEY AS EMPKEY,
       OUT.EMPNO AS EMPNO,
       OUT.ENAME AS ENAME,
       OUT.JOB AS JOB,
       OUT.SAL AS SAL,
       OUT.DEPTNO AS DEPTNO,
       OUT.VERSION AS VERSION
FROM DIM_EMPLOYEE_TYPE2 OUT
WHERE OUT.EMPKEY = (SELECT MAX(INN.EMPKEY)
                    FROM DIM_EMPLOYEE_TYPE2 INN
                    WHERE INN.EMPNO = OUT.EMPNO)

Click on APPLY
Click on OK

Create the Transformation type Expression

From Source Qualifier copy the required Ports to Expression transformation

From Lookup transformation copy the following Ports to the Expression transformation [EMPKEY, SAL, and VERSION]

Double click on Expression transformation

Select Ports Tab click on ADD a new Port

Port Name      Datatype    P     S    O    Expression
INSERT_FLAG    String      10    0    O    IIF (ISNULL (EMPKEY), TRUE, FALSE)
UPDATE_FLAG    String      10    0    O    IIF (NOT ISNULL (EMPKEY) AND (SAL != SAL1), TRUE, FALSE)

Create the Transformation type Router

From Expression transformation copy the following Ports to the Router transformation [EMPNO, ENAME, JOB, SAL, DEPTNO, VERSION, INSERT_FLAG, UPDATE_FLAG]

Double click on Router transformation and select the Groups Tab

From Toolbar click on ADD a new Group

Group Name     Group Filter Condition
INSERT_FLAG    INSERT_FLAG = TRUE
UPDATE_FLAG    UPDATE_FLAG = TRUE

Insert Flow: -

Create the Transformation type Expression, Update Strategy and Sequence Generator

From Router transformation, from Insert group copy the following Ports to Expression transformation [EMPNO1, ENAME1, JOB1, SAL1, and DEPTNO1]

Double click on Expression transformation select Ports Tab

From Toolbar click on ADD a new Port

Port Name    Datatype    P    S    O    Expression
Version      Decimal     5    0    O    0

Click on APPLY

Click on OK

From Expression transformation copy the Ports to Update Strategy transformation and develop the following expression DD_INSERT

From Update Strategy transformation connect the Ports to the Target definition

From Sequence Generator transformation connect the Nextval Port to the Target definition (EMPKEY)

Update as Insert Flow: -

Create the Transformation type Expression, Update Strategy

From Router transformation, from Update group copy the following Ports to Expression transformation [EMPNO3, ENAME3, JOB3, SAL3, DEPTNO3 and VERSION3]

Double click on Expression transformation select Ports Tab

Uncheck the Output Port for the port named VERSION3

From Toolbar click on ADD a new Port

Port Name    Datatype    P    S    O    Expression
Version      Decimal     5    0    O    VERSION3 + 1

Click on APPLY

Click on OK

From Expression transformation copy the Ports to Update Strategy transformation and develop the following expression DD_INSERT

From Update Strategy transformation connect the Ports to the Target definition

From Sequence Generator transformation connect the Nextval Port to the Target definition (EMPKEY)
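To see why the Lookup SQL Override selects MAX(EMPKEY): after two loads in which SMITH's salary changed once, the dimension might hold rows like the hedged sketch below (values assumed for illustration); MAX(EMPKEY) per EMPNO makes the Lookup compare incoming rows against only the latest version:

SELECT * FROM DIM_EMPLOYEE_TYPE2;

-- EMPKEY  EMPNO  ENAME  JOB    SAL  DEPTNO  VERSION
--      1   7369  SMITH  CLERK  800      20        0   (first load)
--     15   7369  SMITH  CLERK  950      20        1   (salary changed; new row inserted, old row kept)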

Type2 SCD Version Based at Mapping Level: -

Type2 SCD Version Based at Session Level: -

SLOWLY CHANGING DIMENSION TYPE2 IMPLEMENTATION USING START DATE and END DATE: -
Procedure: -

Create a Source definition with the name EMP

Create a Target definition with the name DIM_EMPLOYEE_TYPE2 with the Ports

[EMPKEY, EMPNO, ENAME, JOB, SAL, DEPTNO, START_DATE, END_DATE]
Create a Mapping with the name M_Type2_Data

Drop the Source definition on Work Space

Drop the Target definition with three instances on Work Space

Employee_Type2 (New record to Insert)

Employee_Type2 (1) (Updated record to Insert)

Employee_Type2 (2) (Update End_Date)

Create the Transformation type Lookup
Enter the name LKP_TRG and click on CREATE

Select the Target table EMPLOYEE_TYPE2

Click on OK
Click on DONE

From Source Qualifier copy the Port EMPNO to the Lookup transformation

Double click on Lookup transformation and select Conditions Tab

Lookup Table Column    Operator    Transformation Port
EMPNO                  =           EMPNO1

Unwanted Ports can be deleted for best performance

Select the Properties Tab

Transformation Attribute    Value
Connection Information      BATCH10AMDB
Lookup SQL Override         (the query below)

SELECT OUT.EMPKEY AS EMPKEY,
       OUT.EMPNO AS EMPNO,
       OUT.ENAME AS ENAME,
       OUT.JOB AS JOB,
       OUT.SAL AS SAL,
       OUT.DEPTNO AS DEPTNO,
       OUT.START_DATE AS START_DATE,
       OUT.END_DATE AS END_DATE
FROM EMPLOYEE_TYPE2 OUT
WHERE OUT.EMPKEY = (SELECT MAX(INN.EMPKEY)
                    FROM EMPLOYEE_TYPE2 INN
                    WHERE INN.EMPNO = OUT.EMPNO)

Click on APPLY

Click on OK

Create the Transformation type Expression

From Source Qualifier copy the following Ports to the Expression transformation

[EMPNO, ENAME, JOB, SAL, DEPTNO]

From Lookup transformation copy the following Ports to the Expression transformation [EMPKEY and SAL]

Double click on Expression transformation and select the Ports Tab

From Toolbar click on ADD a new Port

Port Name      Datatype    P     S    O    Expression
INSERT_FLAG    String      10    0    O    IIF (ISNULL (EMPKEY), TRUE, FALSE)
UPDATE_FLAG    String      10    0    O    IIF (NOT ISNULL (EMPKEY) AND (SAL != SAL1), TRUE, FALSE)

Click on APPLY

Click on OK

Create the Transformation type Router

From Expression transformation copy the following Ports to the Router transformation [EMPNO, ENAME, JOB, SAL, DEPTNO, INSERT_FLAG, UPDATE_FLAG]

Double click on Router transformation and select the Groups Tab

From Toolbar click on ADD a new Group

Group Name     Group Filter Condition
INSERT_FLAG    INSERT_FLAG = TRUE
UPDATE_FLAG    UPDATE_FLAG = TRUE

Insert Flow for New Record: -

Create the Transformation type Expression, Update Strategy and Sequence Generator

From Router transformation, from Insert group copy the following Ports to Expression transformation [EMPNO1, ENAME1, JOB1, SAL1, and DEPTNO1]

Double click on Expression transformation select Ports Tab

From Toolbar click on ADD a new Port

Port Name     Datatype     P     S    O    Expression
Start_Date    Date/Time    29    9    O    SYSDATE

Click on APPLY

Click on OK

From Expression transformation copy the Ports to Update Strategy transformation

From Update Strategy transformation connect the Ports to the Target definition

From Sequence Generator transformation connect the Nextval Port to the Target definition (EMPKEY)

Update as Insert Flow: -

Create the Transformation type Expression, Update Strategy

From Router transformation, from Update group copy the following Ports to Expression transformation [EMPNO3, ENAME3, JOB3, SAL3 and DEPTNO3]

Double click on Expression transformation select Ports Tab

From Toolbar click on ADD a new Port

Port Name     Datatype     P     S    O    Expression
Start_Date    Date/Time    29    9    O    SYSDATE

Click on APPLY

Click on OK

From Expression transformation copy the Ports to Update Strategy transformation

From Update Strategy transformation connect the Ports to the Target definition

From Sequence Generator transformation connect the Nextval Port to the Target definition (EMPKEY)
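The third Target instance, Employee_Type2 (2) (Update End_Date), closes out the previous version of a changed record. The walkthrough above stops before that flow; its net effect would resemble the following, a hedged sketch assuming the old EMPKEY is passed from the Lookup through an Update Strategy flagged DD_UPDATE (:OLD_EMPKEY is a hypothetical placeholder):

UPDATE EMPLOYEE_TYPE2
SET END_DATE = SYSDATE          -- expire the superseded version
WHERE EMPKEY = :OLD_EMPKEY;     -- surrogate key of the old record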