
Data Transfer Process

Table of Contents

1. Data Transfer Process

1.1. Use of Data Transfer Process

1.2. Integration

1.3. Features

2. Types of DTP

3. Processing Modes of Data Transfer Processes  

3.1 Background Processing Modes for Standard Data Transfer Processes

3.1.1 Parallel extraction and processing (transformation and update)

3.1.2 Serial extraction, immediate parallel processing

3.1.3 Serial extraction and processing of the source packages

3.2 Decision Tree for Defining the Processing Mode

4. Creating Standard Data Transfer Processes  

4.1 Prerequisites

4.2 Creating Data Transfer Processes Using Process Chains

4.3 Creating Data Transfer Processes from the Object Tree in the Data Warehousing Workbench

5. Direct Access Data Transfer Processes  

5.1 DataSources & Direct Access Property

5.2 Direct Access Scenarios: Affected Layer Components

5.3 Data Flow Diagram

5.4 Creating Data Transfer Processes for Direct Access  

6. Error Stack & Error DTP

6.1 Steps in Handling Erroneous Records

6.2 Step 1: Analyze Erroneous Records

6.3 Step 2: Enable Error Stack

6.4 Step 3: Define Semantic Groups

6.5 Step 4: Settings for Error Stack

6.6 Step 5: Execute the DTP load

6.7 Step 6: Validate Erroneous records in Error Stack

6.8 Step 7: Create and execute the Error DTP

7. Real-Time Data Acquisition

7.1 Creating Data Transfer Processes for Real-Time Data Acquisition


1. Data Transfer Process

Definition: An object that determines how data is transferred between two persistent objects.

Use: The data transfer process (DTP) loads data within BI from one object to another, applying transformations and filters. In short, the DTP determines how data is transferred between two persistent objects. As of SAP NetWeaver 7.0, the InfoPackage only loads data into the entry layer of BI (the PSA).

The DTP is used to load data from the PSA into a data target (an InfoCube, DataStore object, or InfoObject); it thus replaces the data mart interface and the InfoPackage for transfers within BI.

The data transfer process makes the transfer processes in the data warehousing layer more transparent. Optimized parallel processing improves the performance of the transfer process (the data transfer process determines the processing mode). You can use the data transfer process to separate delta processes for different targets and you can use filter options between the persistent objects on various levels. For example, you can use filters between a DataStore object and an InfoCube.

Data transfer processes are used for standard data transfer, for real-time data acquisition, and for accessing data directly.

Integration: The following graphic illustrates how the data transfer process is positioned in relation to the objects in the dataflow of BI and the other objects in BI process control:

The InfoPackage controls the transfer of data from the source to the entry layer of BI. The data transfer process controls the distribution of data within BI. The graphic illustrates an example of a data update from the DataSource to an InfoProvider. The data can be updated from an InfoProvider to another InfoProvider using a data transfer process. The data transfer process can also be used to control data distribution from a BI system into any target outside of the BI system. For this purpose, a data transfer process with an open hub destination is used as the target.


Features: You use a process chain to define a data transfer process. Alternatively, you can define a data transfer process for an InfoProvider in an object tree in the Data Warehousing Workbench. It is recommended that you use process chains.

The request is processed in the steps that have been defined for the data transfer process (extraction, transformation, filter and so on). The monitor for the data transfer process request shows the header information, request status, and the status and messages for the individual processing steps.

With a data transfer process, you can transfer data either in full extraction mode or in delta mode. In full mode, the entire dataset of the source is transferred to the target; in delta mode, only the data that was posted to the source since the last data transfer is transferred.

The data transfer process supports you in handling data records with errors. When you define the data transfer process, you can determine how the system responds to errors. At runtime, the incorrect data records are sorted and written to an error stack (request-based database table). A special error DTP further updates the data records from the error stack into the target.

2. Types of Data Transfer Processes  

Standard DTP - A standard DTP is used to update data from the PSA to data targets (InfoCube, DSO, etc.).

Direct Access DTP - DTP for Direct Access is the only available option for VirtualProviders.

Error DTP - An error DTP is used to update error records from the error stack to the corresponding data targets.

RDA DTP - Real-time data acquisition (RDA) supports operational reporting. Data is transferred into BI at short, regular intervals as it is updated in the source system, and is directly available for reporting and analysis.

3. Processing Modes of Data Transfer Processes

There are various processing modes for processing a data transfer process request (DTP request), each covering the substeps extraction and processing (transformation and update).

3.1 Background Processing Modes for Standard Data Transfer Processes

The request of a standard DTP should always be processed in as many parallel processes as possible. There are three processing modes for background processing of standard DTPs. Each processing mode represents a different degree of parallelization:

Parallel extraction and processing (transformation and update)

The data packages are extracted and processed in parallel processes, meaning that a parallel process is derived from the main process for each data package. This parallel process extracts and processes the data.

Serial extraction, immediate parallel processing

The data packages are extracted sequentially in a process. The packages are processed in parallel processes, meaning that the main process extracts the data packages sequentially and derives a process that processes the data for each data package.

Serial extraction and processing of the source packages

The data packages are extracted and processed sequentially in a process, the main process.
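The three degrees of parallelization are easiest to compare side by side. Below is a minimal Python sketch (an illustration only, not SAP code; the package list, the extract/process functions, and the use of a thread pool are invented stand-ins for BI work processes):

```python
# Conceptual sketch of the three background processing modes of a standard DTP.
# Each "data package" passes through extraction, then processing
# (transformation and update).
from concurrent.futures import ThreadPoolExecutor

PACKAGES = [f"package_{i}" for i in range(1, 5)]  # hypothetical data packages

def extract(pkg):
    return f"{pkg}:extracted"

def process(data):
    # stands for transformation and update
    return f"{data}:processed"

# Mode 1 - parallel extraction and processing: a parallel worker is derived
# from the main process for each package and does both steps.
def parallel_extraction_and_processing():
    with ThreadPoolExecutor() as pool:
        return list(pool.map(lambda p: process(extract(p)), PACKAGES))

# Mode 2 - serial extraction, immediate parallel processing: the main process
# extracts sequentially and hands each extracted package to a worker.
def serial_extraction_parallel_processing():
    with ThreadPoolExecutor() as pool:
        futures = [pool.submit(process, extract(p)) for p in PACKAGES]
        return [f.result() for f in futures]

# Mode 3 - serial extraction and processing: everything in the main process.
def serial_extraction_and_processing():
    return [process(extract(p)) for p in PACKAGES]

if __name__ == "__main__":
    for mode in (parallel_extraction_and_processing,
                 serial_extraction_parallel_processing,
                 serial_extraction_and_processing):
        print(mode.__name__, mode())
```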


3.2 Decision Tree for Defining the Processing Mode

The figure below illustrates how the system defines one of the processing modes described above based on the system properties:

 

Other Processing Modes

Serial in the dialog process (for debugging)

With this processing mode you execute the data transfer process in debugging mode. The request is processed synchronously in a dialog process and the update of the data is simulated.

No data transfer; delta status in source: fetched

With this processing mode you execute a delta without transferring data. This is analogous to simulating the delta initialization with the InfoPackage. In this case you execute the DTP directly in the dialog.

Processing mode for real-time data packages

With this processing mode you execute data transfer processes for real-time data acquisition.

Processing mode for direct access

With this processing mode you execute data transfer processes for direct access.

 


4. Creating Standard Data Transfer Processes

You use the data transfer process (DTP) to transfer data from source objects to target objects in BI. You can also use the data transfer process to access InfoProvider data directly.

4.1 Prerequisites

You have used transformations to define the data flow between the source and target object.

Procedure:

4.2 Creating Data Transfer Processes Using Process Chains

You are in the plan view of the process chain that you want to use for the data transfer process.

The process type Data Transfer Process is available in the Loading Process and Post-Processing process category.

       1.      Use drag and drop or double-click to insert the process into the process chain.

       2.      To create a data transfer process as a new process variant, enter a technical name and choose Create.

The dialog box for creating a data transfer process appears.

       3.      Select Standard (Can Be Scheduled) as the type of data transfer process.

You can only use the type DTP for Direct Access as the target of the data transfer process for a VirtualProvider. More information: Creating Data Transfer Processes for Direct Access.

If you use the data transfer process in a process chain with a DataStore object as the target, you can only use the standard data transfer process. More information about data transfer processes for real-time data acquisition: Creating Data Transfer Processes for Real-Time Data Acquisition.

       4.      Select the target and source object.

First select the object type.

Two input helps are available when you select the source and target objects:


       5.      Choose Continue.

       6.      The data transfer process maintenance screen appears.

The header data for the data transfer process shows the description, ID, version and status of the data transfer process, along with the delta status.

       7.      On the Extraction tab page, specify the parameters:

                            a.      Choose Extraction Mode.

You can choose Delta or Full mode.

Unlike delta transfer using an InfoPackage, an explicit initialization of the delta process is not necessary for delta transfer with a DTP. When the data transfer process is executed in delta mode for the first time, all existing requests are retrieved from the source, and the delta status is initialized.
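As a rough illustration of the two extraction modes (plain Python, not SAP code; the request table and record names are invented), a delta DTP remembers which source requests it has already transferred, so its first run retrieves everything and thereby initializes the delta status:

```python
# Source modeled as a table of numbered requests (request id -> records).
source_requests = {1: ["rec_a"], 2: ["rec_b"], 3: ["rec_c"]}

class DeltaDTP:
    def __init__(self):
        self.fetched = set()  # delta status: request ids already transferred

    def execute(self):
        # transfer only requests that were posted since the last transfer
        new = {rid: recs for rid, recs in source_requests.items()
               if rid not in self.fetched}
        self.fetched.update(new)
        return new

def full_dtp():
    # full mode: the entire dataset of the source, on every run
    return dict(source_requests)

dtp = DeltaDTP()
print(dtp.execute())           # first delta run: all requests 1, 2, 3
source_requests[4] = ["rec_d"]
print(dtp.execute())           # next delta run: only the new request 4
print(full_dtp())              # full mode: always everything
```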

Only the extraction mode Full is available for the following sources:

■       InfoObjects

■       InfoSets

■       DataStore objects for direct update

If you have selected transfer mode Delta, you can define further parameters:


With Only Get Delta Once, you define whether the source requests should be transferred only once.

Setting this flag ensures that the content of the InfoProvider is an exact representation of the source data.

A scenario of this type may be required if you always want an InfoProvider to contain the most recent data for a query, but technical reasons prevent the DataSource on which it is based from delivering a delta (new, changed or deleted data records). For this type of DataSource, the current data set for the required selection can only be transferred using a full update.

In this case, a DataStore object cannot normally be used to determine the missing delta information (overwrite and create delta).

Get All New Data in Source Request by Request

Since a DTP bundles all transfer-relevant requests from the source, it sometimes generates large requests. If you do not want to use a single DTP request to transfer the dataset from the source because the dataset is too large, you can set the Get All New Data in Source Request by Request flag. This specifies that you want the DTP to read only one request from the source at a time. Once processing is completed, the DTP request checks for further new requests in the source. If it finds any, it automatically creates an additional DTP request.
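The effect of this flag can be sketched as a simple loop (illustrative Python, not SAP code; request names are invented): one source request per DTP request, repeated as long as new requests are found:

```python
# With "Get All New Data in Source Request by Request", the DTP reads one
# source request at a time; after processing it, it checks for further new
# requests and creates an additional DTP request for each one found.
def transfer_request_by_request(pending_source_requests):
    dtp_requests = []
    while pending_source_requests:                       # further new data?
        source_request = pending_source_requests.pop(0)  # read one request
        dtp_requests.append(f"DTP request for {source_request}")
    return dtp_requests

print(transfer_request_by_request(["src_req_10", "src_req_11", "src_req_12"]))
# -> three small DTP requests instead of one large bundled request
```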

Filter: If you need to define filter criteria for the delta transfer, choose Filter.

This means that you can use multiple data transfer processes with disjunctive selection conditions to efficiently transfer small sets of data from a source into one or more targets, instead of transferring large volumes of data. The filter thus restricts the amount of data to be copied and works like the selections in the InfoPackage. You can specify single values, multiple selections, intervals, selections based on variables, or routines. Choose Change Selection to change the list of InfoObjects that can be selected.
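For illustration (plain Python with invented field names, not SAP code), two DTPs with disjoint filter selections could split one source between targets like this:

```python
records = [{"region": "EMEA", "amount": 10},
           {"region": "APAC", "amount": 20},
           {"region": "EMEA", "amount": 30}]

def run_dtp(source, selection):
    # transfer only records matching the filter selection
    return [r for r in source if all(r[k] == v for k, v in selection.items())]

target_emea = run_dtp(records, {"region": "EMEA"})  # DTP 1
target_apac = run_dtp(records, {"region": "APAC"})  # DTP 2, disjoint selection
print(target_emea, target_apac)
```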


The icon next to pushbutton Filter indicates that predefined selections exist for the data transfer process. The quick info text for this icon displays the selections as a character string.

 Semantic Groups:

Semantic groups specify how you want to build the data packages that are read from the source (DataSource or InfoProvider). To do this, you define key fields. Data records that have the same key are combined in a single data package.

This setting is only relevant for DataStore objects with data fields that are overwritten. This setting also defines the key fields for the error stack. By defining the key for the error stack, you ensure that the data can be updated in the target in the correct order once the incorrect data records have been corrected.
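A minimal sketch of the idea (plain Python with an invented key field, not SAP code): records are sorted and grouped by the semantic key, so all records sharing a key always land in the same data package:

```python
from itertools import groupby
from operator import itemgetter

key_fields = ("customer",)  # hypothetical semantic group definition
records = [{"customer": "C1", "value": 1},
           {"customer": "C2", "value": 2},
           {"customer": "C1", "value": 3}]

# sort by the key fields, then cut packages at key boundaries
records.sort(key=itemgetter(*key_fields))
packages = [list(grp) for _, grp in groupby(records, key=itemgetter(*key_fields))]
print(packages)  # both "C1" records share one package, "C2" gets its own
```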


During parallel processing of time-dependent master data, the semantic key of the DTP may not contain the field of the data source.

                            

 Define any further settings that depend on the source object and data type.

       8.      On the Update tab page, specify the parameters:

                            a.      Make the settings for error handling. Define the following:

■       How you want to update valid records when errors occur.

■       How many errors can occur before the load process terminates.

b.      Apply any further settings that are relevant for the target object.


       9.      On the Execute tab page, define the parameters:

On this tab page, the process flow of the program for the data transfer process is displayed in a tree structure.

If you want to execute a delta without transferring data, like when simulating the delta initialization with the InfoPackage, select No data transfer; delta status in source: fetched as processing mode. This processing mode is available when the data transfer process extracts in delta mode. In this case you execute the DTP directly in the dialog. A request started like this marks the data that is found in the source as fetched, without actually transferring it to the target. 

If delta requests have already been transferred for this data transfer process, you can still choose this mode.

If you want to execute the data transfer process in debugging mode, choose processing mode Serially in the Dialog Process (for Debugging). In this case, you can define breakpoints in the tree structure for the process flow of the program. The request is processed synchronously in a dialog process and the update of the data is simulated. If you select expert mode, you can also define selections for the simulation and activate or deactivate intermediate storage in addition to setting breakpoints. More information: Simulating and Debugging DTP Requests.

Specify the status that you want the system to adopt for the request if warnings are displayed.

Specify how you want the system to define the overall status of the request.


Normally the system automatically defines the processing mode for the background processing of the respective data transfer process.

   10.      Check the data transfer process, then save and activate it.

   11.      Start process chain maintenance.

The data transfer process is displayed in the plan view and can be linked into your process chain. When you activate and schedule the chain, the system executes the data transfer process as soon as it is triggered by an event in the predecessor process in the chain.

4.3 Creating Data Transfer Processes from the Object Tree in the Data Warehousing Workbench

The starting point when creating a data transfer process is the target that you want to transfer data to. In the Data Warehousing Workbench, an object tree is displayed and you have highlighted the target object.


1. In the context menu, choose Create Data Transfer Process.

The dialog box for creating a data transfer process appears.

2. Proceed as described in steps 3 to 10 in the procedure for creating a data transfer process using a process chain. In step 4, you specify the source object only.

You can now execute the data transfer process directly.

Additional Functions: Choose Goto -> Overview of DTP to display information about the source and target objects, the transformations, and the last changes to the data transfer process.


Choose Goto -> Batch Manager Settings to make settings for parallel processing with the data transfer process.

By choosing Goto -> Settings for DTP Temporary Storage, you can define the settings for the temporary storage.

You can define the DB storage parameters with Extras -> Settings for Error Stack.


5. Direct Access Data Transfer Processes

A VirtualProvider (virtual InfoCube) does not physically contain any data; it contains only the basic structure. Using this structure, we can get data from a different source system into our BI system. In this article, a virtual InfoCube is loaded through a direct access DTP by using a view-based generic DataSource.

Conceptual layer of direct Access

5.1 DataSources & Direct Access Property

To support a VirtualProvider, a DataSource must be enabled for direct access. Direct access essentially means that there is no persistent storage of the data on the BI side. Generally, all generic DataSources support direct access, but Business Content DataSources need to be activated for it. The most important use of direct access DataSources can be seen in "data reconciliation" business scenarios.

If a DataSource supports direct access, the source system data can be accessed directly through that DataSource, so it can be used with a virtual InfoCube. There are two categories of such DataSources:

I. Preaggregation-Supporting DataSource: Such DataSources are capable of extracting aggregated data (key figures) when data is extracted from the fields of the corresponding tables. This DataSource can therefore also be used with a DSO in overwrite mode.

II. Preaggregation-Not-Supporting DataSource: Such DataSources are not capable of aggregating data when it is extracted from the fields of the corresponding tables, so a DSO in overwrite mode cannot be used with this DataSource.


5.2 Direct Access Scenarios: Affected Layer Components

5.3 Data Flow Diagram

The following diagram shows the complete data flow for loading a virtual InfoCube.


5.4 Creating Data Transfer Processes for Direct Access  

Use: You use a data transfer process for direct access to access the data in an InfoProvider directly.

Prerequisites: To load data into a virtual InfoCube using a data transfer process, we first have to create a direct access DataSource. Almost all generic DataSources support this feature, and a flat file DataSource can also be created with it. Here, loading data into a virtual InfoCube is shown using a view-based generic DataSource.

1. Create a view based on your table on the ECC side using transaction SE11.

2. Create a generic DataSource based on this view using transaction RSO2.

3. While saving the DataSource, enable "Direct Access".

4. On the BI side, go to RSA1 -> Source Systems, right-click your source system, and choose Replicate DataSources. After replication is done, activate the DataSource once.


Note: Since our DataSource is enabled for direct access, there is no need to create an InfoPackage for it. It accesses the data directly from the source system tables.

5. Create a VirtualProvider and choose 'Based on Data Transfer Process for Direct Access'.

6. Create a transformation to define the data flow between the source and target object.

Creating Data Transfer Processes for Direct Access:

You use a direct access data transfer process (DTP) to access the data in a source system (a flat file, or a generic DataSource in SAP R/3) directly.

Procedure: The starting point when creating a data transfer process is the target into which you want to transfer data.

In the Data Warehousing Workbench, an object tree is displayed and you have highlighted the target object, a VirtualProvider.


       1.   In the context menu, choose Create Data Transfer Process.

DTP for Direct Access is displayed as the type of the data transfer process.

       2.   Select the type of source object.

Supported object types are DataSources, InfoCubes, DataStore objects and InfoObjects (texts and attributes, if they are released as InfoProviders).

       3.   Select the object from which you want to transfer data into the target.

When you select the source object, input help is available. It shows you the selection of objects that already exist in the data flow for the target object. If only one object exists in the data flow, it is selected by default.

An additional List pushbutton is available. This allows you to select a source object from the complete list of objects that exist for this object type.

       4. Choose Continue.


On the Extraction tab page, the system displays information about the adapter, the format of the data and additional source-specific settings.

On the Update tab page, the system displays information about the target.

On the Execute tab page, the system displays the processing mode for direct access and the process flow of the program for the data transfer process.

You do not need to make any settings in the data transfer process.

       5.      Check the data transfer process, save and activate it.

Choose Goto -> Overview of DTP to display information about the source and target objects, the transformations, and the last changes to the data transfer process.

Once the data transfer process is activated, go to your InfoCube, right-click it, and select 'Activate Direct Access'.

After this, save the data transfer process.


Once this is done, go to the InfoCube, right-click it, and choose 'Display Data'. Here you can choose which data you want to view in the InfoCube.

Result: You can use the data transfer process to access data directly.

 


6. Error Stack & Error DTP

Definition: A request-based table (PSA table) into which erroneous data records from a data transfer process are written. The error stack is based on the data source (PSA, DSO, or InfoCube); that is, records from the source are written to the error stack.

Use: At runtime, erroneous data records are written to an error stack if error handling for the data transfer process is activated. You use the error stack to update the data to the target once the error is resolved.

Integration: In the monitor for the data transfer process, you can navigate to the PSA maintenance by choosing Error Stack in the toolbar, and display and edit erroneous records in the error stack.

With an error DTP, you can update the data records to the target manually or by means of a process chain. Once the data records have been successfully updated, they are deleted from the error stack. If there are any erroneous data records, they are written to the error stack again in a new error DTP request.

When a DTP request is deleted, the corresponding data records are also deleted from the error stack.


6.1 Settings for Error Handling

For a data transfer process (DTP), you can specify how you want the system to respond when data records contain errors. If you activate error handling, the records with errors are written to a request-based database table (PSA table): the error stack. You can use a special data transfer process, the error DTP, to update the records to the target.

Temporary storage is available after each processing step of the DTP request. This allows you to find out which processing step the error occurred in.

Step 1: Analyze Erroneous Records

Go to the Update tab of the DTP of the target where there are erroneous records. Temporary storage is available after each processing step of the DTP request, which allows you to find out in which processing step the error occurred. Examples of incorrect data records:

■ A field contains invalid characters or lowercase characters

■ An error occurs during conversion

■ A routine returns a return code <> 0

■ A characteristic value is not found for master data

■ Duplicate data records exist in relation to the keys

■ No SID exists for the characteristic value

■ No SID exists for the value of the navigation attribute

Step 2: Enable Error Stack

Change the DTP to edit mode and, on the Update tab of the DTP, select the error handling mechanism based on how you want the system to respond to data records with errors (see the sketch after the note below). The various error handling mechanisms are explained below:

■ No update, no reporting (default) - If errors occur, the system terminates the update of the entire data package. The request is not released for reporting. However, the system continues checking the records.

■ Update valid records, no reporting (request red) - This option allows you to update valid data. This data is only released for reporting after the administrator checks the incorrect records that have not been updated and manually releases the request by setting the overall status on the Status tab page in the monitor (QM action).

■ Update valid records, reporting possible - Valid records can be reported immediately. Automatic follow-up actions, such as adjusting the aggregates, are also carried out.


Note: Specify the maximum number of incorrect data records allowed before the system terminates the transfer process. If you leave this blank, handling for incorrect data records is not activated, and the update is terminated as soon as the first error occurs.
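Putting the error-handling options and the error threshold together, a rough Python sketch (illustration only, not SAP code; the mode strings are shorthand for the options listed above) might look like this:

```python
def load_with_error_handling(records, is_valid, mode="valid_no_reporting",
                             max_errors=None):
    """max_errors=None mimics the blank setting: terminate on the first error."""
    target, error_stack = [], []
    for rec in records:
        if is_valid(rec):
            if mode != "no_update":   # "no update, no reporting" keeps
                target.append(rec)    # checking but never updates the package
        else:
            if max_errors is None:
                raise RuntimeError("update terminated at the first error")
            error_stack.append(rec)   # record goes to the error stack
            if len(error_stack) > max_errors:
                raise RuntimeError("load terminated: too many erroneous records")
    # with "valid_no_reporting" the request stays red until released manually;
    # with "valid_reporting" it is released for reporting immediately
    return target, error_stack
```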

Step 3: Define Semantic Groups

Go to the Extraction tab of the DTP and click Semantic Groups to define the key fields of the error stack. This setting is only relevant if you are transferring data to DataStore objects (DSOs) with data fields that are overwritten. If errors occur, all subsequent data records with the same key are written to the error stack along with the incorrect data record. This guarantees the serialization of the data records and consistent data processing. The serialization of the data records, and thus the explicit definition of key fields for the error stack, is not relevant for targets that are not updated by overwriting, such as InfoCubes. The fields in the source can be enabled in the semantic group definition according to the following rules (a sketch of the resulting serialization follows the list):

By default, all fields in the source that are uniquely assigned to a key field of the target DSO in the transformation are checked in the semantic group screen.

The fields whose assignment is not unique are not checked by default but can be enabled. The fields in the source that are assigned to a data field in the target DSO cannot be checked.
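The serialization behavior described above can be sketched as follows (illustrative Python with invented structures, not SAP code): once a record with a given semantic key fails, every later record with that key is diverted to the error stack as well, so the error DTP can later replay them in the original order:

```python
def load_serialized(records, key_fields, is_valid):
    target, error_stack, poisoned_keys = [], [], set()
    for rec in records:
        key = tuple(rec[f] for f in key_fields)
        if key in poisoned_keys or not is_valid(rec):
            poisoned_keys.add(key)
            error_stack.append(rec)  # same-key followers are diverted too
        else:
            target.append(rec)       # overwrite order per key stays intact
    return target, error_stack
```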


Step 4: Settings for Error Stack

The DTP error stack settings can also be modified by choosing Goto -> Settings for DTP Temporary Storage in the DTP. The options available in the settings for temporary storage are given below:

1. Delete Temporary Storage - deletes the records under the following conditions: with request status 'Green', with request status 'Red', or after 'x' days.

2. Level of Detail - you specify how you want to track the transformation.

3. Fill Temporary Storage - you specify the processing steps after which you want the system to temporarily store the DTP request (such as extraction, filtering, removing new records with the same key, and transformation).


Step 5: Execute the DTP Load

Save and activate the DTP after the settings are completed. Execute the DTP to load the data from the PSA to the data target. If there are erroneous records in the data load, they are collected in the error stack. You can access the error stack via the Error Stack button at the top of the DTP.

Step 6: Validate Erroneous Records in the Error Stack

In the monitor for the data transfer process, you can navigate to the PSA maintenance by choosing Error Stack in the toolbar, and display and edit erroneous records in the error stack. Validate and correct the erroneous records as you would when editing the PSA, and save the records.


Step 7: Create and Execute the Error DTP

Click Create Error DTP on the Update tab of the DTP to move data from the error stack to the target. With an error DTP, you can update the data records to the target manually or by means of a process chain. The error DTP uses the full update mode to extract data from the error stack. Once the data records have been successfully updated, they are deleted from the error stack. If any data records are still erroneous, they are written to the error stack again in a new error DTP request. When a DTP request is deleted, the corresponding data records are also deleted from the error stack.
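A compact sketch of what an error DTP run does (plain Python, not SAP code): it extracts the whole error stack (full update mode), deletes the successfully updated records from the stack, and writes the still-erroneous ones back:

```python
def run_error_dtp(error_stack, target, is_valid):
    still_erroneous = []
    for rec in error_stack:          # full update mode: the entire stack
        if is_valid(rec):
            target.append(rec)       # successfully updated -> leaves the stack
        else:
            still_erroneous.append(rec)
    error_stack[:] = still_erroneous  # contents of the new error DTP request
```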

Execute the Normal DTP

From the above screenshot, you can see that 4 records with incorrect data went into the error stack and 2 records were updated to the DSO.


Check the Manage screen

Check the ERROR STACK data

Below are the 4 erroneous records.

Correct the ERROR STACK data.


Execute the error DTP: load the corrected data from the error stack to the data target by executing the error DTP.

Activate the request. Two requests are now available in the Manage screen of the DSO: one from the normal DTP and the other from the error DTP.

Error DTP in the Administrator Workbench

Result: You can use the data transfer process to update the data to the target once the error is resolved.


7. Real-Time Data Acquisition

Sometimes the business needs to make decisions at short intervals, and a single change in the transactional system data can drive the decision. It is therefore important that each record of transaction data can be transferred to the BI system at short intervals.

Use of Real-Time Data

Complex reports in the BI system help in making decisions on the basis of transactional system data, and sometimes a single change in that data can make the decision. Hence, for operational reporting it is important that each record of transaction data is transferred to the BI system at short intervals. Remote access of the data is not feasible due to its resource consumption.

Real-Time Data Acquisition: Definition

Real-time data warehousing is a framework for deriving information from data as the data becomes available. It is characterized by:

■ A lower time scale than scheduled/batch data acquisition

■ A stream-oriented approach

■ Near-immediate availability of the data for reporting

In general, real-time data warehousing supports tactical decision-making.

Push vs. Pull Mechanism

Data acquisition for embedded and external BI works on a PULL mechanism, whereas real-time data acquisition is based on a PUSH mechanism.

PULL: strategic decision making (long-term planning); processes normally run at night; request oriented; resource consumption is lower at night.

PUSH: tactical decision making (daily decisions); processes run continuously, every minute to every hour; data-availability oriented; resource consumption comes from a permanently active background job.
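The contrast can be sketched in a few lines of Python (conceptual only; the queue and the functions are stand-ins for the source system's delta queue, the daemon job, and a scheduled extraction):

```python
import queue

delta_queue: "queue.Queue[str]" = queue.Queue()

def source_posts_document(doc):
    # PUSH: the source application writes each change to the delta queue
    delta_queue.put(doc)

def daemon_cycle():
    # permanently running background job, e.g. every minute to every hour
    batch = []
    while not delta_queue.empty():
        batch.append(delta_queue.get())
    return batch  # transferred to BI as soon as it is available

def pull_extraction(source_table):
    # PULL: request-oriented, e.g. a nightly scheduled load
    return list(source_table)
```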

Architecture of Real Time Data Acquisition


Data Transfer Process for RDA

Constraints: You use real-time data acquisition to fill DataStore objects:

Data is first transferred into the PSA and then into the DataStore object. It is not possible to use the DataStore object as the source of a further real-time data transfer into another DataStore object or InfoCube.

Master data cannot be transferred to BI: navigation attributes of the characteristic could no longer be used in aggregates, because aggregates cannot react to real-time updates.

DataSources enabled for RDA cannot be used for standard data transfer.

The delta queue contains one entry for each DataSource and target system at any given time.

RDA Scenarios

RDA can be used in two primary scenarios:

1. Via the Service API (SAPI): an InfoPackage for real-time data acquisition transfers the data from the source to the PSA, and a DTP for real-time data acquisition transfers it from the PSA to the DataStore object.

Within this scenario, two variants are possible: the source system application writes the data to the delta queue itself, or the application does not write to the delta queue automatically and the extractor writes the data to the delta queue at the request of BI.

2. Via a web service: web services populate the PSA, and the real-time DTP transfers the data to the DataStore object.

Daemon Definition

A daemon is a system process (comprising an InfoPackage and a DTP) that fulfils a specific task at regular intervals. The daemon works on the basis of the list of DataSources assigned to it via the InfoPackage. It receives information from the InfoPackage as to when and how often the data is to be extracted, which data targets are to be supplied, and when a request is to be closed and a new one opened. Real-time requests for the current and previous day that have supplied the DataStore objects with data are displayed.


How the Daemon Triggers RDA Processing

1. Call the source system for new records.

2. Update the status of the transferred records in the confirmation table.

3. Update the PSA.

4. Check for records in the confirmation table; if records are available there, the corresponding records also exist in the PSA.

5. The daemon flags the records in the source system, a reply confirmation is sent to BI, and the entries are flagged as processed in the confirmation table.

6. If the PSA request has completed successfully and a new delta request has been opened, the updated data is deleted from the delta queue of the source system.

7. Initiate the DTP after the records are confirmed in the confirmation table, then commit.

- If the update in BI is not successful, the records stay in the confirmation table.

- This guarantees a restart if records are still available in the confirmation table, even when the update in the source system was successful.
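As a conceptual model of one daemon cycle (plain Python with invented structures, not SAP code), the handshake over the confirmation table looks roughly like this:

```python
class Source:
    def __init__(self, delta_queue):
        self.delta_queue = delta_queue

    def read_delta_queue(self):
        return list(self.delta_queue)

    def confirm(self, records):
        # the source flags the records as processed and removes them
        self.delta_queue = [r for r in self.delta_queue if r not in records]

def daemon_cycle(source, psa, confirmation_table, data_target):
    records = source.read_delta_queue()      # 1. call source for new records
    confirmation_table.extend(records)       # 2. note them in the confirmation table
    psa.extend(records)                      # 3. update the PSA
    source.confirm(confirmation_table)       # 4.-6. reply, delete from delta queue
    data_target.extend(confirmation_table)   # 7. initiate the DTP, then commit
    # a crash before this point leaves the records in the confirmation table,
    # which is what guarantees the restart
    confirmation_table.clear()

src = Source(["doc_1", "doc_2"])
psa, confirmation, dso = [], [], []
daemon_cycle(src, psa, confirmation, dso)
print(psa, dso, src.delta_queue)  # ['doc_1', 'doc_2'] ['doc_1', 'doc_2'] []
```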

Operating Mode

It runs in a permanent background job and only switches to a "sleep" mode after performing a data transfer if there is currently no other data in the delta queue.

A permanent RFC connection is required between every source system and BI for real-time data transfer.

So that not too much main memory is used and the RFC connection does not have to exist for too long, the daemon terminates itself every hour and reschedules itself. This happens without the real-time requests having to be closed.

Daemon Closes Request


7.1 Creating Data Transfer Processes for Real-Time Data Acquisition

Use: You use the data transfer process (DTP) for real-time data acquisition to transfer data from the PSA to the DataStore object. In the DataStore object, the data is available for use in reporting.

Prerequisites: You have used transformations to define the data flow between the DataSource and the DataStore object.

The selections for the data transfer process do not overlap with selections in other data transfer processes.

Procedure: The starting point when creating a data transfer process is the DataStore object into which you want to transfer data. In the Data Warehousing Workbench, an object tree is displayed and you select the DataStore object.


       1.      In the context menu, choose Create Data Transfer Process.

The dialog box for creating a data transfer process appears.

       2.      Select DTP for Real-Time Data Acquisition as the DTP Type.

       3.      As the source object, select the DataSource from which you want to transfer data to the DataStore object.

The input help for the source object shows the selection of DataSources that already exist in the data flow for the DataStore object. An additional List pushbutton is available. This allows you to select a DataSource from the complete list of BI DataSources.

       4.      Choose Continue.

The data transfer process maintenance screen appears.

The header data for the data transfer process shows the description, ID, version, and status of the data transfer process, along with the delta status.


       5.      On the Extraction tab page, specify the parameters:

                         a.     Delta is chosen as the extraction mode for real-time data acquisition.

b.      If necessary, determine filter criteria for the delta transfer. To do this, choose Filter.

       6.      On the Update tab page, specify the parameters:

Make the settings for error handling. You define:

How you want to update valid records when errors occur.

How many errors can occur before the load process terminates.

       7.      On the Execute tab page, determine the parameters:

On this tab page, the process flow of the program for the data transfer process is displayed in a tree structure.

a. Specify the status that you want the system to adopt for the request if warnings are displayed in the log.

b. Specify how you want the system to define the overall status of the request.

       8.      Check, save, and activate the data transfer process.

       9.      With Assign Daemon you go to the monitor for real-time data acquisition if there is already an InfoPackage for RDA for the DataSource that is the source for this DTP.

If there is not yet an InfoPackage for RDA for the DataSource that is the source for this DTP, the system informs you that you must first create an InfoPackage for RDA before you can assign the DTP.


To create a real-time InfoPackage, first make sure that the delta initialization was successful.

After the delta initialization, create another InfoPackage and enable the "Real Time" option as shown below.

This InfoPackage works as a real-time InfoPackage, which picks up live data from the source.


Assign a new daemon for the DataSource from the Unassigned Objects node.

Assign the DTP to the newly created daemon.


Alternatively you can go to the monitor for real-time data acquisition from the context menu entry Assign RDA Daemon of the data transfer process if you are in the Data Warehousing Workbench.

Result

The data transfer process is assigned to the DataSource.

If the DataSource is already assigned to a daemon, the data transfer process appears in the monitor for real-time data acquisition under this daemon and the DataSource. It is now available for data processing by the daemon.

If the DataSource has not yet been assigned to a daemon, the data transfer process appears in the monitor for real-time data acquisition under the DataSource in the Unassigned Objects area. The data transfer process, the corresponding DataSource, the InfoPackage, and possibly further associated data transfer processes are assigned to the specified daemon using Assign Daemon in the context menu of the data transfer process.