Copyright

No part of this publication may be reproduced or transmitted in any form or for any purpose without the express permission of SAP AG. The information contained herein may be changed without prior notice.

All rights reserved.

Trademarks: Some software products marketed by SAP AG and its distributors contain proprietary software components of other software vendors. Microsoft®, WINDOWS®, NT®, EXCEL®, Word®, PowerPoint® and SQL Server® are registered trademarks of Microsoft Corporation. IBM®, DB2®, DB2 Universal Database, OS/2®, Parallel Sysplex®, MVS/ESA, AIX®, S/390®, AS/400®, OS/390®, OS/400®, iSeries, pSeries, xSeries, zSeries, z/OS, AFP, Intelligent Miner, WebSphere®, Netfinity®, Tivoli®, Informix and Informix® Dynamic ServerTM are trademarks of IBM Corporation in the USA and/or other countries. ORACLE® is a registered trademark of ORACLE Corporation. UNIX®, X/Open®, OSF/1®, and Motif® are registered trademarks of the Open Group. Citrix®, the Citrix logo, ICA®, Program Neighborhood®, MetaFrame®, WinFrame®, VideoFrame®, MultiWin® and other Citrix product names referenced herein are trademarks of Citrix Systems, Inc. HTML, DHTML, XML, XHTML are trademarks or registered trademarks of W3C®, World Wide Web Consortium, Massachusetts Institute of Technology. JAVA® is a registered trademark of Sun Microsystems, Inc. JAVASCRIPT® is a registered trademark of Sun Microsystems, Inc., used under license for technology invented and implemented by Netscape. MarketSet and Enterprise Buyer are jointly owned trademarks of SAP AG and Commerce One. SAP, the SAP logo, R/2, R/3, mySAP, mySAP.com, and other SAP products and services mentioned herein as well as their respective logos are trademarks or registered trademarks of SAP AG in Germany and in several other countries all over the world. All other product and service names mentioned are the trademarks of their respective companies.
Participants: Business Warehouse Consultants and Developers
Duration: 2 days
User notes
These training materials are not a teach-yourself program. They complement the explanations provided by your course instructor. Space is provided on each page for you to note down additional information.
There may not be sufficient time during the course to complete all the exercises. The exercises provide additional examples that are covered during the course. You can also work through these examples in your own time to increase your understanding of the topics.
The data type of an internal table is completely specified by the following attributes:
Line type: This is the source of the attributes of the individual columns. You normally specify a structure type, but any data type is possible.
Key definition: The key columns and their order define the criteria by which table lines are identified. Depending on the access type, the key can be defined as unique or non-unique. With a unique key, there are no multiple entries with identical values in the key.
Data access type
With key access – as with database tables – you access lines using the field contents. Example: A read access using the search term 'UA 0007' on an internal table with the unique key CARRID CONNID and the data pictured above returns exactly one data record.
Index access: Unlike database tables, internal tables may be numbered line by line by the system, giving each line an index. You can use this index to access a specific table line. Example: A read access to the data record with index 5 returns the fifth data record of the internal table.
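The two access types can be sketched as follows (the table SFLIGHT and the key values are assumptions for illustration, taken from the SAP flight data model):

```abap
* Assumption for illustration: an internal table over the flight table SFLIGHT
DATA: itab TYPE STANDARD TABLE OF sflight,
      wa   TYPE sflight.

* Key access: read the line identified by the key field contents
READ TABLE itab INTO wa WITH KEY carrid = 'UA' connid = '0007'.

* Index access: read the fifth line of the internal table
READ TABLE itab INTO wa INDEX 5.
```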
Another internal table attribute is the table type. Internal tables can be divided into three table types according to the respective access types:
• With standard tables the line numbering is maintained internally. Both index and key accesses are possible.
• With sorted tables the data records are always sorted according to key and saved. Here too, the index is maintained internally. Both index and key accesses are possible.
• With hashed tables the data records are managed in a runtime-optimized way. A unique key is required. With hashed tables only key accesses are possible.
Which table type you use in each case depends on how that table's entries are generally going to be accessed:
• For index accesses you should normally use standard tables.
• Sorted tables are best for unique keys and fixed sorting.
• With hashed tables the runtime optimization is noticeable only if the accesses are of the read type with a unique key.
This course deals with standard tables only. Apart from a few special cases, the syntax is identical for all three table types.
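As a sketch, the three table kinds could be declared as follows for the flight data line type (the key definitions are assumptions for illustration):

```abap
* Sketch: one declaration per table kind
DATA: it_standard TYPE STANDARD TABLE OF sflight
                       WITH NON-UNIQUE KEY carrid connid,
      it_sorted   TYPE SORTED TABLE OF sflight
                       WITH UNIQUE KEY carrid connid fldate,
      it_hashed   TYPE HASHED TABLE OF sflight
                       WITH UNIQUE KEY carrid connid fldate.
```

Note that the sorted and hashed declarations require a unique key here, matching the requirement stated above for hashed tables.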
The table type SBC400_T_SBC400FOCC can be used to declare an internal table:

DATA itab_flightinfo TYPE sbc400_t_sbc400focc.

The line type of itab_flightinfo has the columns carrid, connid, fldate, seatsmax, seatsocc, and percentage.
Table types can be defined locally in a program or globally in the ABAP Dictionary. For DATA itab_name TYPE itab_type, you can use a local or global type itab_type.
For detailed information on the definition of global table types in the ABAP Dictionary, refer to the SAP Library under Basis → ABAP Workbench → BC-ABAP Dictionary → Types → Table types.
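For example, a table type equivalent to a Dictionary table type could be defined locally (the type name ty_t_focc is an assumption for illustration, and the global type is assumed to be a standard table):

```abap
* Local table type over the Dictionary structure SBC400FOCC
TYPES ty_t_focc TYPE STANDARD TABLE OF sbc400focc
                     WITH NON-UNIQUE DEFAULT KEY.

* Both declarations can then be used the same way:
DATA: itab_local  TYPE ty_t_focc,             " local table type
      itab_global TYPE sbc400_t_sbc400focc.   " global table type
```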
* Fill selection table
wa_carrid-sign   = 'I'.
wa_carrid-option = 'EQ'.
wa_carrid-low    = 'AA'.
APPEND wa_carrid TO rg_carrid.
: : :
* Use selection table
SELECT ...
  WHERE carrid IN rg_carrid.

The selection table rg_carrid and its work area wa_carrid have the line structure:
sign(1)   TYPE c
option(2) TYPE c
low       LIKE my_carrid
high      LIKE my_carrid
Use the SELECT-OPTIONS statement to create an internal table with a header line, for which the runtime system automatically creates an input dialog for value sets on a selection screen. The system automatically inserts the appropriate entries into the internal table from the user input. For more details, refer to the keyword documentation for the SELECT-OPTIONS statement.
As of SAP R/3 Basis Release 4.6A, you can use the TYPE RANGE OF addition to the TYPES and DATA statements to define a corresponding (internal) range table without a header line.
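A sketch of such a range table, matching the selection-table example above:

```abap
* Range table without header line, typed with TYPE RANGE OF
DATA: rg_carrid TYPE RANGE OF scarr-carrid,
      wa_carrid LIKE LINE OF rg_carrid.

* Fill the range table: include carriers equal to 'AA'
wa_carrid-sign   = 'I'.
wa_carrid-option = 'EQ'.
wa_carrid-low    = 'AA'.
APPEND wa_carrid TO rg_carrid.
```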
READ TABLE itab INTO wa WITH KEY key.
Runtime behavior of READ TABLE by table kind and key qualification:

Table kind | Complete/part key, left-aligned without gaps      | Any component condition
STANDARD   | Table scan                                        | Table scan
SORTED     | Binary search                                     | Table scan
HASHED     | Hash algorithm (key must be completely qualified) | Table scan
Whenever you want to read individual table lines by specifying a complete key, use the READ TABLE ... WITH TABLE KEY statement (the fastest single-record access by key). The runtime system supports this syntax variant especially for SORTED and HASHED tables. If the table is a STANDARD table, the runtime system performs a table scan. The same applies if you have copied the values of all key fields of the entry to be read into the work area wa and then use READ TABLE itab FROM wa.
The runtime system carries out the syntax variant READ TABLE ... WITH KEY (read an entry after applying any condition) using a table scan. The only exception to this rule applies to SORTED tables, if you fill the first n key fields with "=" (no gaps), where n <= number of key fields. With standard tables, however, you can also sort accordingly using SORT and then use the BINARY SEARCH addition.
Summary: Whenever possible, use READ TABLE ... WITH TABLE KEY or the variant with a correspondingly filled work area. If you need to use READ TABLE ... WITH KEY, make your internal table a SORTED table.
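These recommendations can be sketched as follows (the table SPFLI and the key values are assumptions for illustration):

```abap
DATA: itab TYPE STANDARD TABLE OF spfli,
      wa   TYPE spfli.

* Fastest single-record access: complete key with WITH TABLE KEY
READ TABLE itab INTO wa
     WITH TABLE KEY carrid = 'LH' connid = '0400'.

* STANDARD table alternative: sort first, then use BINARY SEARCH
SORT itab BY carrid connid.
READ TABLE itab INTO wa
     WITH KEY carrid = 'LH' connid = '0400' BINARY SEARCH.
```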
Runtime behavior of LOOP AT ... WHERE log_expr by table kind:

Table kind | First n key fields filled with "=" without gaps                 | Any logical expression for columns
STANDARD   | Table scan                                                      | Table scan
SORTED     | Binary search for starting point, then loop only through group  | Table scan
HASHED     | Table scan                                                      | Table scan
The runtime system generally processes loops with a WHERE clause by performing a table scan - that is, determining whether the condition in the WHERE clause is true for each line in the table.
SORTED tables are the only exception to this rule. For these, the runtime system optimizes the runtime under the following condition: in the WHERE clause, the first n key fields are filled with "=" (no gaps), where n is less than or equal to the number of key fields. As a result, the loop runs only over the lines that match the condition in the WHERE clause. Since the table is sorted, the first matching line can be located efficiently at runtime (using a binary search).
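A sketch of such an optimizable loop over a SORTED table (table and key fields are assumptions for illustration):

```abap
* SORTED table keyed on CARRID CONNID
DATA: it_spfli TYPE SORTED TABLE OF spfli
               WITH NON-UNIQUE KEY carrid connid,
      wa_spfli TYPE spfli.

* The first key field is filled with "=" without gaps, so the runtime
* finds the starting point with a binary search and loops only through
* the matching group
LOOP AT it_spfli INTO wa_spfli WHERE carrid = 'LH'.
  WRITE / wa_spfli-connid.
ENDLOOP.
```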
Defining a field symbol:

DATA: var_a TYPE i VALUE 4.
FIELD-SYMBOLS: <fs> TYPE i.

Assigning a data object to a field symbol:

ASSIGN var_a TO <fs>.

Assigning a value to a data object using a field symbol:

<fs> = 77.

Before the ASSIGN, var_a contains 4; after the assignment through <fs>, it contains 77.
You can create a pointer to a data object in ABAP using a field symbol. First, declare the field symbol using the FIELD-SYMBOLS statement. At runtime, it can then point to another data object. Where possible, you should give the field symbol the same type as the data object it is to point to (TYPE i in this example).
Note that the angle brackets (<>) are part of the name of the field symbol: In this example, the name is <fs>.
To point a field symbol at a data object, you must assign it to the object data_object using the ASSIGN data_object TO <fs> statement.
You can use the field symbol to access the content of the data object to which it points - either to read or to change this content.You can "redirect" a field symbol to a different data object at runtime using the ASSIGN statement.
To access individual fields of a structure through a field symbol, you can use the following statement: ASSIGN COMPONENT comp_no OF STRUCTURE struct TO <fs>.
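A sketch using the carrier table SCARR (assuming the standard structure layout, in which component 2 is CARRID):

```abap
DATA wa_scarr TYPE scarr.
FIELD-SYMBOLS <fs> TYPE any.

* Assign the second component of the structure (here: CARRID) to <fs>
ASSIGN COMPONENT 2 OF STRUCTURE wa_scarr TO <fs>.
IF sy-subrc = 0.
  <fs> = 'LH'.   " changes wa_scarr-carrid through the field symbol
ENDIF.
```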
Defining the field symbol and assigning a line of the internal table it_sflight:

LOOP AT it_sflight ASSIGNING <fs>.

Accessing the line content using the field symbol:

  WRITE <fs>-connid.
  <fs>-carrid = 'LH'.

ENDLOOP.
In the above example, a field symbol is assigned the line type of an internal table. This makes it possible to assign a table line to this field symbol. The syntax required for this is discussed later in this unit.
After a field symbol has been assigned a line, it is also possible to access the individual column values of the assigned line.
As well as reading the data contents, you can also change the contents of the individual components.
Instead of READ TABLE ... INTO, you can use the READ TABLE ... ASSIGNING variant. This offers better runtime performance for pure read accesses with a line width greater than or equal to 1000 bytes. If you then change the read line using MODIFY, READ ... ASSIGNING already improves the runtime with a line width of 100 bytes. The same applies to LOOP ... INTO in comparison with LOOP ... ASSIGNING. The LOOP ... ASSIGNING variant offers better runtime performance for any loop of five loop passes or more.
Both field symbol variants are much faster than work area variants, in particular when you use nested internal tables. This is because, if you use work areas instead, the whole inner internal table is copied (unless you prevent this by using a TRANSPORTING addition).
Always assign a type to field symbols if you know their static type (again, for performance reasons). Note: If you use READ TABLE ... ASSIGNING, the field symbol points to the originally assigned table line, even after the internal table has been sorted. Note also that when using field symbols, you cannot change key fields in SORTED or HASHED tables; trying to do so causes a runtime error. The following restrictions apply to LOOP ... ASSIGNING <fs>:
• You cannot use the SUM statement in control level processing.
• You cannot reassign the field symbol within the loop. The statements ASSIGN dobj TO <fs> and UNASSIGN <fs> cause runtime errors.
A subroutine is an internal module within a program. In a subroutine, you lift parts of a program out of the main programming block and put them somewhere else. This makes your program easier to read and allows you to use these code segments more than once.
You can pass data to the subroutine and back through its interface. This allows you to call the same function for different data objects. The example in the graphic shows a subroutine that calculates a percentage. This subroutine is called several times, even though different data objects are passed to the interface in each case.
Using subroutines makes your program more function oriented: it splits the program's task into subfunctions so that each subroutine is responsible for one subfunction.
This generally makes programs easier to maintain. In the Debugger, you can execute the subroutines in the background so that you see only the result. This usually makes it easier to find the source of an error.
The structure of a subroutine includes the following:
• Each subroutine starts with FORM and ends with ENDFORM.
• The name of the subroutine is followed by the interface definition.
The statements that the subroutine executes come between FORM and ENDFORM.
FORM fill_itab CHANGING f_itab TYPE sbc400_t_sbc400focc.

  DATA l_wa LIKE LINE OF f_itab.

* LOOP AT it_flightinfo INTO wa_flightinfo
* would also be possible
  LOOP AT f_itab INTO l_wa.
    ...
  ENDLOOP.

ENDFORM.

Visible globally (main program):

DATA: it_flightinfo TYPE sbc400_t_sbc400focc,
      wa_flightinfo TYPE sbc400focc,
      ... .

The formal parameter f_itab and the local data object l_wa are visible only locally, within the subroutine.
You can define local data within a subroutine.
Both the formal parameters and the local data objects are active only at the run time of the subroutine. This means that memory is allocated only when the subroutine is called and is released as soon as the subroutine has been executed. Thus these parameters and data objects can be addressed only from within the subroutine.
The global data objects from the main program can also be addressed from the subroutine. However, you should avoid doing this wherever possible. Otherwise, you bypass the interface, which makes the program more prone to errors.
If a local data object or formal parameter has the same name as a global data object, the ABAP runtime system addresses the local data object in the subroutine and the global one outside it. These objects are then known as locally obscured objects.
Summary of hints about global and local data objects:
Address the global data objects in the main program and pass them to the subroutine using the interface.
Address only formal parameters and local data objects in the subroutine.
For clarity, avoid using identically named global and local variables. Instead, use a simple prefix, such as f_ for formal parameters and l_ for local data objects.
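The naming convention can be sketched like this (the subroutine and variable names are assumptions for illustration):

```abap
* Global data object in the main program, prefixed gd_
DATA gd_total TYPE i.

* Pass the global object through the interface
PERFORM double_value CHANGING gd_total.

* Formal parameters prefixed f_, local data objects prefixed l_
FORM double_value CHANGING f_value TYPE i.
  DATA l_old TYPE i.
  l_old   = f_value.
  f_value = l_old * 2.
ENDFORM.
```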
Open SQL statements are a subset of Standard SQL that is fully integrated into the ABAP language. (SQL stands for Structured Query Language.)
Open SQL statements allow you to access the database in a uniform way from your programs, regardless of the database system installed. The database interface converts Open SQL statements to database-specific SQL statements.
To program database read access, use the Open SQL statement SELECT.
The SELECT statement contains a series of clauses, each of which has a different task:
• The SELECT clause describes, among other things, whether the result of the selection will be several lines or a single data record, and which fields of the table are to be read.
• The FROM clause names the source (database table or view) from which the data is to be selected.
• The INTO clause determines the internal data objects into which the selected data is to be placed.
• The WHERE clause specifies the conditions that the selection results must fulfill. It thus determines the lines to be selected from the table.
For information about other clauses, refer to the keyword documentation for SELECT.
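The four clauses in combination (the table SPFLI and the field list are assumptions for illustration):

```abap
DATA wa_spfli TYPE spfli.

SELECT carrid connid                       " SELECT clause: fields to read
  FROM spfli                               " FROM clause: source table
  INTO CORRESPONDING FIELDS OF wa_spfli    " INTO clause: target data object
  WHERE carrid = 'LH'.                     " WHERE clause: selection condition
  WRITE / wa_spfli-connid.
ENDSELECT.
```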
SELECT SINGLE *
  FROM scarr
  INTO wa_scarr
  WHERE carrid = pa_car.

IF sy-subrc = 0. ...

The database interface reads the requested line from the database table SCARR into the target structure wa_scarr.
The SELECT SINGLE * statement allows you to read a single record from the database table. To ensure that you read a unique entry, all of the key fields must be filled by the WHERE clause. The asterisk (*) after SINGLE tells the database interface to read all columns of that line of the database table. If you want only a specific selection of columns, you can list the required fields instead.
In the INTO clause, enter the destination into which the database interface is to copy the data. The target area should be structured left-justified exactly like the required columns of the database table.
If you use the CORRESPONDING FIELDS OF addition in the INTO clause, you can fill the target area component by component. The system fills only those components that have names identical to columns of the database table. Note: If you do not use this addition, the system fills the target area left-justified, irrespective of its structure.
If the system finds a table entry matching your conditions, SY-SUBRC has the value zero (0).
The SINGLE addition tells the database that only one line needs to be read. The database can then terminate the search as soon as it has found that line. Therefore, SELECT SINGLE produces better performance for single-record access than a SELECT loop if you supply values for all key fields.
If you do not use the SINGLE addition with the SELECT statement, the system reads multiple records from the database. The field list determines the columns whose data is to be read from the database.
The number of requested lines to be read can be restricted using the WHERE clause.
In the WHERE clause you may enter only the field names of the database table. The name of the database table you want to access is given in the FROM clause.
Multiple logical conditions can be added to the WHERE clause using AND, NOT or OR statements.
The database delivers data to the database interface in packages. The ABAP runtime system copies the data records to the target area line by line using a loop. It also enables sequential processing of all the statements between SELECT and ENDSELECT.
After the ENDSELECT statement, you can check the return code for the SELECT loop. sy-subrc will have the value zero (0) if the database interface has found at least one record.
After the ENDSELECT statement, sy-dbcnt contains the total number of lines read.
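A sketch of such a SELECT loop with the return-code checks described above (table and condition are assumptions for illustration):

```abap
DATA wa_flight TYPE sflight.

SELECT * FROM sflight
  INTO wa_flight
  WHERE carrid = 'LH'.
* Each loop pass processes one data record of the delivered package
  WRITE / wa_flight-fldate.
ENDSELECT.

IF sy-subrc = 0.
* At least one record was found; sy-dbcnt holds the number of lines read
  WRITE: / 'Lines read:', sy-dbcnt.
ENDIF.
```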
The addition INTO TABLE itab causes the ABAP runtime system to copy the contents of the database interface directly to the internal table itab. This is called an array fetch.
Since the array fetch is not executed as a loop, do not program any ENDSELECT statement.
If you want to add lines to the end of an internal table that is already filled, instead of overwriting it, use the APPENDING TABLE itab addition.
sy-subrc has the value 0 if the system was able to read at least one record.
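A sketch of both array-fetch variants (table and conditions are assumptions for illustration):

```abap
DATA it_flights TYPE STANDARD TABLE OF sflight.

* Array fetch: the result set is copied to the internal table in one
* step - no ENDSELECT
SELECT * FROM sflight
  INTO TABLE it_flights
  WHERE carrid = 'LH'.

* Append further lines to the already-filled internal table
SELECT * FROM sflight
  APPENDING TABLE it_flights
  WHERE carrid = 'AA'.
```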
The program must contain a data object with a suitable type for each column that is required from a database table. For program maintenance reasons, you must use the corresponding Dictionary objects to assign types to the data objects. The INTO clause specifies the data object into which you want to place the data from the database table. You can use the INTO clause in two different ways:
Using a flat structure – You define a structure in your program that has the same fields in the same sequence as the field list in the SELECT clause. You can enter the name of the structure in the INTO clause. The contents are copied left-justified. The field names of the structure are disregarded here.
Using individual data objects – You can enter a set of data objects in the INTO clause. For example:

DATA: gd_carrid   TYPE sflight-carrid,
      gd_connid   TYPE sflight-connid,
      gd_fldate   TYPE sflight-fldate,
      gd_seatsmax TYPE sflight-seatsmax,
      gd_seatsocc TYPE sflight-seatsocc.
INTO Clause: Same-Name Fields of the Field List in the Target Structure

DATA wa_sdyn_conn TYPE sdyn_conn.

SELECT SINGLE carrid connid deptime
  FROM spfli
  INTO CORRESPONDING FIELDS OF wa_sdyn_conn
  WHERE carrid = pa_car
    AND connid = pa_con.

Of the target structure wa_sdyn_conn (fields mandt, carrid, connid, ..., deptime), only the fields carrid, connid, and deptime named in the field list are filled, each with the same type as the column read.
If you use the INTO CORRESPONDING FIELDS clause, the data is placed in the fields with the same names in the target structure.
Advantages of this construction:
• The target structure does not have to match the field list left-justified and in identical order.
• This construction is easy to maintain, because extending the field list does not require other changes to the program, as long as there is a field in the target structure that has the same name and type.
To place data in internal table columns of the same name using an array fetch, use the INTO CORRESPONDING FIELDS OF TABLE itab statement.
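A sketch combining the array fetch with same-name copying (table and field list are assumptions for illustration):

```abap
DATA it_conn TYPE STANDARD TABLE OF sdyn_conn.

* Each selected line fills the same-named columns of the internal table
SELECT carrid connid deptime
  FROM spfli
  INTO CORRESPONDING FIELDS OF TABLE it_conn.
```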
When you go to the definition of a database table in the ABAP Dictionary, you will see information on all the technical attributes of the database table.
The following information is useful for improving the performance of database accesses:
Key fields – If the lines requested from the database are retrieved according to key fields, the Database Optimizer can perform access using a primary index.
Secondary index – If the lines requested from the database are retrieved according to fields of a secondary index, the Database Optimizer can perform the access using that secondary index. Secondary indexes are displayed in a dialog box whenever you choose Indexes. You choose an index from the dialog box by double-clicking it. The system then displays a screen with additional information about that index.
*** SAP program ************************************
PROGRAM <name of SAP program>.
  ...
  <Call enhancement>  →  <Object in customer namespace>

Each enhancement technique calls a different kind of object in the customer namespace:
• Customer exit → exit function module
• Business add-in → method
The purpose of a program enhancement is always to call an object in the customer namespace. You can use the following techniques here:
• Customer Exits: A special exit function module is called by the SAP application program. The function module is part of a function group that is handled in a special manner by the system.
• Business Transaction Events: The SAP application program dynamically calls a function module in the customer namespace.
• Business Add-Ins: BAdIs are the new object-oriented enhancement concept, which in the future will replace all other enhancement concepts. The advantages of this concept lie in the facts that you can have several implementations for one enhancement and that you can use filters. The application program calls a method of a class, or of an instance of a class, that lies in the customer namespace.
CALL CUSTOMER-FUNCTION → exit function module → include in customer namespace
This graphic shows the flow of a program providing an enhancement in the form of a function module exit.
The exit function module is called in the PAI logic of a screen at a position determined by the SAP application developer. Within the function module, the user can add functions in the customer namespace using an include.
To call such a function module, use the ABAP statement CALL CUSTOMER-FUNCTION 'nnn', where nnn is a three-digit number. The programmer must also create the function module to be called and its related function group.
These function modules always belong to function groups whose names begin with X (X function group).
The following naming convention applies to function modules:Prefix: EXIT
Name: Name of the program that calls the function module
Suffix: Three-digit number
The three parts of the name are separated by underscores. The CALL CUSTOMER-FUNCTION statement is only executed once the enhancement project has been activated. Multiple calls of the same function module are all activated at the same time.
Business Add-Ins: Implementation Maintenance – initial screen (transaction SE19)
  Implementation <impl>: name of the implementation
  Functions: Create, Change, Display

Business Add-Ins: Definition selection (transaction SE18)
  Definition <badi>: name of the Business Add-In
To implement Business Add-Ins, use transaction SE19 (Tools -> ABAP Workbench -> Utilities -> Business Add-Ins ->Implementation).
Enter a name for the implementation and choose Create. A dialog box appears. Enter the name of the Business Add-In. The maintenance screen for the Business Add-In then appears.
Alternatively, you can use the Business Add-In definition transaction (SE18) to reach its implementations.The menu contains an entry, Implementation, which you can use to get an overview of the existing implementations. You can also create new implementations from here.
You can assign any name to the implementing class. However, it is a good idea to observe the proposed naming convention. The suggested name is constructed as follows:
• Namespace prefix, Y or Z
• CL_ (for class)
• IM_ (for implementation)
• Name of the implementation
To implement the method, double-click its name. The system starts the Class Builder editor.
When you have finished, you must activate your objects.
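As a sketch, a classic BAdI call on the application side might look as follows (the interface name IF_EX_MY_BADI and the method CHECK_FLIGHT are assumptions for illustration; cl_exithandler is the class the application uses to obtain the active implementation):

```abap
DATA: badi_ref  TYPE REF TO if_ex_my_badi,   " hypothetical BAdI interface
      wa_flight TYPE sflight.

* The application obtains a reference to the active implementation ...
CALL METHOD cl_exithandler=>get_instance
  CHANGING
    instance = badi_ref.

* ... and calls the BAdI method; the implementing class lies in the
* customer namespace (for example ZCL_IM_<implementation>)
CALL METHOD badi_ref->check_flight
  EXPORTING
    i_flight = wa_flight.
```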
Your company wants to analyze data from SAP systems in a BW system. Since no DataSources suitable for this purpose are delivered with Business Content, you have been given the task of using the generic data extraction tools to generate a DataSource that can also extract data from several DB tables and enrich it.
Furthermore, your company wants to include the old material number as an additional characteristic in certain queries in BW. You decide to modify/enhance a Business Content extractor to obtain this piece of information.
When should you use the generic data extraction tools to create a DataSource?

• Business Content does not include a DataSource for your application.
• You want to implement a delta method for your generic DataSource that cannot be implemented using the generic delta functionality (timestamp, date, ...).
• The application does not allow you to create additional application-specific generic extractors (CO-PA, FI-SL, LIS).
• You use your own programs in the SAP system to populate your own tables.
• You have to extract data from several DB tables:
  - using a view is not possible because of insufficient JOIN support
  - only some fields of these tables are relevant
  - the data has to be enriched with information not available in the BW system

Make sure first that the functionality is not available within Business Content!
When you create a generic DataSource, you first have to specify the application component and the texts that describe the DataSource. The application component determines where the DataSource is stored in the DataSources section of the source system tab in the Administrator Workbench after the DataSources have been replicated.
You are then able to select the source of data for the generic DataSource. Besides using function modules as sources, you can also use transparent tables, database views and the SAP Query / InfoSet Query to extract data.
Click the 'Extraction by FM' button if you want to extract data from a function module, and enter the names of the function module and of the extraction structure that you want to use. When you generate the DataSource, its extract structure corresponds to that of the view or table that you used. You reach the screens for maintaining a function module and for maintaining an existing extraction structure by double-clicking the relevant field.
Several delivered templates help you use function modules to extract data:
The function module is called up several times during an extraction process:
1. Initialization call: Only the request parameters are transferred to the module here. It cannot transfer data at this point.
2. First read call: The extractor delivers the data, typed with the extraction structure, to an interface table. The number of rows expected is specified in a request parameter (I_MAXSIZE).
3. Further read calls: The extractor delivers the data following the last package, again in a package with I_MAXSIZE rows.
4. Last call: The function module is called until it raises the exception NO_MORE_DATA. No more data can be transferred in the call in which the exception is raised.
Example
An example of a function module that meets these requirements is RSAX_BIW_GET_DATA_SIMPLE. A simple procedure for creating a syntactically correct module is to copy it into a function group of your own, and to copy the rows of the top include of function group RSAX (LRSAXTOP) into the top include of the function group for your module. You then have to adjust the copied function module to your requirements.
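A much-abbreviated sketch of this call pattern, modeled on RSAX_BIW_GET_DATA_SIMPLE (the module name, the source table SPFLI, and the simplification to a single data package are assumptions for illustration; the parameters are defined in the function module interface, not in the source code):

```abap
FUNCTION z_biw_get_data_sketch.
* Interface (defined in the Function Builder):
*   IMPORTING  i_initflag i_maxsize ...
*   TABLES     i_t_select i_t_fields e_t_data
*   EXCEPTIONS no_more_data

  STATICS s_call_count TYPE i.

  IF i_initflag = 'X'.
*   1. Initialization call: store the request parameters only, no data yet
    s_call_count = 0.
  ELSE.
    IF s_call_count > 0.
*     4. Last call: all data delivered, raise the exception
      RAISE no_more_data.
    ENDIF.
*   2./3. Read calls: deliver at most I_MAXSIZE rows per package
    SELECT * FROM spfli
      INTO TABLE e_t_data
      UP TO i_maxsize ROWS.
    s_call_count = s_call_count + 1.
  ENDIF.
ENDFUNCTION.
```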
Exit function modules (X function group):

FUNCTION EXIT_USERDEF_001.
  INCLUDE ZXnnnU01.   " user-defined include
ENDFUNCTION.

FUNCTION EXIT_USERDEF_002.
  INCLUDE ZXnnnU02.   " user-defined include
ENDFUNCTION.
  .
  .
  .
FUNCTION EXIT_USERDEF_00x.
  INCLUDE ZXnnnU0x.   " user-defined include
ENDFUNCTION.

Calling SAP program with pre-defined exit calls:

  PERFORM <subroutine>.

* user enhancement no. 1
  CALL CUSTOMER-FUNCTION '001'
    EXPORTING  <export parameter> = <value>
    IMPORTING  <import parameter> = <value>
    TABLES     <table structure>
    EXCEPTIONS <result> ...

* user enhancement no. 2
  CALL CUSTOMER-FUNCTION '002'
    EXPORTING ...

  PERFORM intermediary_process.

* user enhancement no. X
  CALL CUSTOMER-FUNCTION '00X'
    EXPORTING ...

  PERFORM final_process.
You use Customer/User Exits to branch at pre-set points from the SAP standard program run into user-defined subprograms. This adds a few features of your own to complement the standard functions available.
This also gives SAP software the highest possible degree of flexibility.The user-defined sections of code are managed in isolation as objects in the customer namespace, so there is no need to worry about complications when the next Release comes or when correction packages are implemented.
In the example above, the section between the dotted lines describes the data that the user code can use for its logic and the input that the SAP code can expect to receive back from the customer-generated logic.
Append structure ZABIW_MARA_S
Short description: Append for BIW_MARA_S
Component: ZZDISMM   Component type: DISMM
. . .
In the BW IMG (transaction SBIW) choose the path: Business Information Warehouse → Postprocessing of DataSources → Edit DataSources.
In the first step, you have to generate the customer append for the extract structure of your DataSource. The system proposes a name that begins with ZA followed by the name of the extract structure.
Next, you maintain the fields that you want to enhance later. The names of all of these fields must start with ZZ so that they can be identified easily.
Finally, you have to generate your append structure.
In the second step, you must define the function enhancement for filling the new fields in the extract structure.
You have to create a project before you can use an enhancement like this. You can then integrate one or more enhancements into this project. For our particular task, SAP provides the enhancement RSAP0001. Remember that an enhancement cannot be used in two projects simultaneously.
There are four different enhancement components available for the RSAP0001 enhancement. The code for filling the extract structure is stored in their INCLUDEs.
EXIT_SAPLRSAP_001: Transaction data supply
EXIT_SAPLRSAP_002: Master data attributes and text supply
EXIT_SAPLRSAP_004: Hierarchy supply
Documentation is available for every enhancement project. The documentation also explains the individual parameters of the function module.
Before you are able to use it, the enhancement must be activated.
Documentation for the whole enhancement is available under Help → Display Documentation in the project management transaction for SAP enhancements (CMOD).
If your requirements are not met entirely by the DataSources supplied with Business Content, you can bring additional information into the extraction routine by developing your own program and integrating it in a function enhancement.
Note the following four steps:
1. Define the required fields in an append structure that is attached to the extract structure of your DataSource.
2. Write/Enhance your function exit to call up the relevant sources of data for your DataSource.
3. Replicate the DataSource in the BW system.
4. Extract the data for your enhanced Business Content.
Use the EXIT_SAPLRSAP_001 function exit to fill the extra fields with data. The original fields are filled by the extractor.
Method A: Enhancing the extract structure with a customer append and using a function exit to fill the fields. This method can be used with the 'view', 'ABAP Query', and 'function module' extraction types.
Procedure:
Do not change the standard view.
Create an append structure and define your new fields (attributes).
Use the EXIT_SAPLRSAP_002 function exit to fill the extra fields with data. The original fields are filled by the extractor.
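A minimal sketch of such a master-data exit follows, reusing the ZZDISMM append field from the earlier example. Everything else (the DataSource name, the parameter name, the simplified MARC read without plant logic) is an illustrative assumption, not the definitive implementation; the real include is assigned via CMOD and must match your own append.

```abap
* Sketch for the include of EXIT_SAPLRSAP_002 - all names not taken
* from the append example above (ZZDISMM, BIW_MARA_S) are assumptions.
DATA: l_s_mara LIKE biw_mara_s.

* i_datasource / i_isource: the parameter name depends on the release
CASE i_datasource.
  WHEN '0MATERIAL_ATTR'.
    LOOP AT i_t_data INTO l_s_mara.
*     fill the append field from MARC (simplified: no plant logic)
      SELECT SINGLE dismm FROM marc INTO l_s_mara-zzdismm
             WHERE matnr = l_s_mara-matnr.
      MODIFY i_t_data FROM l_s_mara.
    ENDLOOP.
ENDCASE.
```

Filling the field per record with SELECT SINGLE keeps the sketch short; for large data packages a single SELECT into a sorted internal table before the loop performs better.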
(as requested in the BW Scheduler; usually not required)
TABLES parameters: I_T_SELECT (selection criteria), I_T_FIELDS (transfer structure fields: only these are actually filled in the data table and can be addressed in the program), C_T_DATA/I_T_DATA (table with the data obtained from the application API), C_T_MESSAGES.
The higher the complexity of the SAP BW data flow during the staging process (e.g. consolidation requirements), the more flexibility via ABAP routines is needed to cover all demands.
The reason for using ABAP routines during data staging can simply be the flexibility of coding: routines cover both the combination of SAP BW Business Content with customer demands and highly specific project demands.
None of these requirements involves application programming, and no GUI programming is necessary to get the project requirements of a data flow into the system.
4. Transfer rule – dynamic routines that map field by field from the transfer structure to the communication structure.
5. Update rule – start routine.
6. Update rule – dynamic routines that map field by field from the communication structure to the InfoProvider key figures or to the ODS object data fields.
7. Update rule – dynamic routines that map field by field from the communication structure to the InfoProvider characteristics or the ODS object key fields.
8. InfoPackage – deletion or removal of requests after the loading process. Only for non-collapsed requests; otherwise the request is cancelled if there is an exception aggregation. Deletion conditions in process chains are valid for all data targets of the InfoPackage; within the scheduler the conditions are only valid for the selected data target.
1. Choose type "6"
2. Give the routine a name
3. Code it within the ABAP editor
Working principle: all selection criteria are listed in an internal table.
Structure of the selection table (LIKE RSSDLRANGE):
InfoObject – name of the InfoObject assigned in the transfer rule
Field name – name of the field from the transfer structure
Selection criterion SIGN – assignment type, e.g. "I" for 'include'
Selection criterion OPTION – operator, e.g. "EQ" for 'equals'
Selection criterion FROM VALUE – lower value of the assignment
Selection criterion TO VALUE – upper value of the assignment (optional)
4. The interface of the form routine "compute_<field name of the transfer structure>" delivers: the selection table l_t_range and the return code p_subrc (LIKE sy-subrc).
5. Within the form routine, the index l_idx (LIKE sy-tabix) returns the current record of the selection field.
Example coding to append several selections dynamically:
form compute_VKORG
tables l_t_range structure rssdlrange
changing p_subrc like sy-subrc.
* Insert source code to current selection field
*$*$ begin of routine - insert your code only below this line *-*
data: l_idx like sy-tabix.
data: z_l_t_range like l_t_range.
read table l_t_range with key
fieldname = 'VKORG'.
l_idx = sy-tabix.
* delete the existing selection entry for this field at the current index:
delete l_t_range index l_idx.
* append many VKORGs to the selection table
* (the internal table itab_vkorg was filled beforehand)
loop at itab_vkorg.
z_l_t_range-fieldname ='VKORG'.
z_l_t_range-IOBJNM = l_t_range-IOBJNM.
z_l_t_range-low = itab_vkorg-low.
z_l_t_range-sign = 'I'.
z_l_t_range-option = 'EQ'.
append z_l_t_range to l_t_range.
endloop.
modify l_t_range index l_idx.
p_subrc = 0.
*$*$ end of routine - insert your code only before this line *-*
Example: InfoObject 0SALESORG
Field VKORG
Assignment of multiple sales organisations from a table which has already been filled
Additional declaration of a helper structure equal to the selection structure
Deletion of the typed static selection of field VKORG – reason: only the routine should read the values (although both are possible: static values and values from the routine)
Loop through the sales organisations for all selection parameters
Assignment to the helper structure
Append of the structure to the selection table
Modify the selection table at the index of the sales organisation
1. Possible scenario: the last calendar day '0L_DATE' to get the necessary booking day (for the definition of OLAP variables, please refer to course BW305)
2. Define the variable with the necessary fiscal variant (in this case) and view the actual values by clicking the "glasses" button (the day before February 5, 2004 equals 20040204)
3. An OLAP variable can also be filled via an (ABAP) exit ...
In some cases it is necessary to load data on each 15th day of the month and additionally at the end of the month again.
There is no delta mechanism for this, so duplication of the data would be the effect (no deletion!).
The deletion functions or conditions are able to identify duplicates of the actual request; these duplicates can be deleted with the settings in the dialog or via an ABAP routine.
Upload from the client workstation or the application server? Preferably use the server path. Advantages:
Data load in batch mode is possible; the path is physically the same for all administrators.
Data file or control file? The data file loads master, text, hierarchy or transaction data. The control file loads ONLY the control data for the scheduler (as it can be set on these tabs) – see the box on the right side. When you use a control file you need two InfoPackages: first the control file and then the data file.
* Name of the data file to load
FILENAME = c:\temp\uvwxyz.abc
* Type of the data file to load (binary, CSV or ASCII)
FILETYPE = BIN or CSV or TXT
* Where the data file resides (appl. server or client workstation)
The interface of the form routine "compute_flat_file_filename" delivers:
The file name p_filename.
The return code p_subrc.
Scenario for using it: transaction data is delivered monthly by flat file; the naming convention includes the name of the month within the filename; directory paths are maintained for every year.
Coding: concatenate the directory path, the actual year and the actual month.
Usually the coding uses only string operations.
form compute_flat_file_filename
changing p_filename like rsldpsel-filename
p_subrc like sy-subrc.
* Insert source code to current selection field
*$*$ begin of routine - insert your code only below this line *-*
data: l_filename like rsldpsel-filename.
concatenate '\\sapcom01\bw\Data\' "path
            sy-datum(4)           "year
            '\bwupload'           "file name with
            sy-datum+4(2)         "month
            '.CSV'                "save as CSV file
into l_filename.
p_filename = l_filename.
p_subrc = 0.
*$*$ end of routine - insert your code only before this line *-*
All fields of the transfer structure are listed in the header of the coding.
Global declarations, either for the start routine or for the transfer rule routines, are defined in the header of the coding (remove the comment signs of the TABLES and DATA declarations); the local declarations are defined within the form routine "STARTROUTINE".
The data package is delivered with the internal table "DATAPAK".
The transfer structure and all its fields are accessible via the internal form routine declaration l_s_datapak-<fieldname> – just remove the comment.
PROGRAM CONVERSION_ROUTINE.
* Type pools used by conversion program
TYPE-POOLS: RS, RSARC, RSARR, SBIWA, RSSM.
* Declaration of transfer structure (selected fields only)
TYPES: BEGIN OF TRANSFER_STRUCTURE,
* InfoObject 0CO_AREA: CHAR - 000004
         KOKRS(000004) TYPE C,
* InfoObject 0CO_DOC_NO: CHAR - 000010
         BELNR(000010) TYPE C,
* ... further InfoObjects ...
* InfoObject ZAUTYP: CHAR - 000004
         AUART(000004) TYPE C,
       END OF TRANSFER_STRUCTURE.
* Declaration of data package
TYPES: TAB_TRANSTRU TYPE TABLE OF TRANSFER_STRUCTURE.
* Global code used by conversion rules
*$*$ begin of global - insert your declaration only below this line *-*
* TABLES: ...
* DATA: ...
*$*$ end of global - insert your declaration only before this line *-*
FORM STARTROUTINE
  USING    G_S_MINFO    TYPE RSSM_S_MINFO
  CHANGING DATAPAK      TYPE TAB_TRANSTRU
           G_T_ERRORLOG TYPE rssm_t_errorlog_int
           ABORT        LIKE SY-SUBRC. "set ABORT <> 0 to cancel datapackage
*$*$ begin of routine - insert your code only below this line *-*
* DATA: l_s_datapak_line TYPE TRANSFER_STRUCTURE,
*       l_s_errorlog     TYPE rssm_s_errorlog_int.
* abort <> 0 means skip whole data package !!!
  ABORT = 0.
*$*$ end of routine - insert your code only before this line *-*
ENDFORM.
The interface delivers the following information: the global structure G_S_MINFO delivers a handful of usable pieces of information from the scheduler/monitor type pool – e.g.:
REQUNR LIKE RSMONVIEW-RNR, "Request
DATAPAKID LIKE RSMONVIEW-DATAPAKID, "Data package number
ISOURCE TYPE RSA_ISOURCE, "InfoSource Name
TYPE TYPE RSA_ISTYPE, "Type InfoSource = {D,M,T,H}
LOGSYS TYPE RSA_LOGSYS, "source system
And some more ...
The data package DATAPAK is stored in an internal table of type TAB_TRANSTRU, and all its records are of the type of the transfer structure TRANSFER_STRUCTURE.
An error log of the global structure G_T_ERRORLOG allows you to set monitor entries (please refer to the chapter on monitoring).
ABORT is set as a return code for the whole data package call (ABORT <> 0 means that the whole work process of the complete data package will be cancelled!).
The start routine of the transfer rules in particular offers "classic" usage scenarios:
Selective deletion of records delivered with the data package. Coding example: DELETE DATAPAK WHERE VKORG = '1000'.
Advantage: a smaller data package, so loading performance improves. Example reasons:
The InfoPackage from the DataSource does not offer selection criteria; the processing time with a selection in the InfoPackage sent to the source system takes considerably more time than without selection; the deletion criterion is easy to code within the start routine.
Filling an internal table from a DDIC select (please refer to the performance aspects); complex data cleansing or data consolidation from various source systems, possibly with a look-up as an ETL process.
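Both classic scenarios can be combined in one start routine. The sketch below deletes unwanted records and buffers a DDIC table once per data package via the global part; the buffer table g_t_tvko and the assumption that the transfer structure contains VKORG are illustrative, not part of the generated interface.

```abap
* Global part (header of the coding) - hypothetical buffer table:
* DATA: g_t_tvko TYPE TABLE OF tvko.

* Inside FORM STARTROUTINE:
* Read the DDIC table only once per data package, not per record
  IF g_t_tvko IS INITIAL.
    SELECT * FROM tvko INTO TABLE g_t_tvko.
  ENDIF.

* Selective deletion: drop records of sales organisation 1000
* (assumption: the transfer structure contains the field VKORG)
  DELETE datapak WHERE vkorg = '1000'.

  ABORT = 0.
```

Because the global declaration survives across form calls, the SELECT runs only for the first data package of the load, which is the performance point the text above makes.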
Start routine – example: comparison to transfer rule
* Global code used by conversion rules
*$*$ begin of global - insert your declaration only below this line *-*
* TABLES: ...
* DATA: ...
*$*$ end of global - insert your declaration only before this line *-*
FORM STARTROUTINE
  USING    G_S_MINFO    TYPE RSSM_S_MINFO
  CHANGING DATAPAK      TYPE TAB_TRANSTRU
           G_T_ERRORLOG TYPE rssm_t_errorlog_int
           ABORT        LIKE SY-SUBRC. "set ABORT <> 0 to cancel datapackage
*$*$ begin of routine - insert your code only below this line *-*
  DATA: l_s_datapak_line TYPE TRANSFER_STRUCTURE,
        l_s_errorlog     TYPE rssm_s_errorlog_int,
        l_tabix          LIKE sy-tabix.
  loop at datapak into l_s_datapak_line.
    l_tabix = sy-tabix.
    if not l_s_datapak_line-FISCPER+5(2) = sy-datum+4(2).
      delete datapak index l_tabix.
    endif.
  endloop.
* abort <> 0 means skip whole data package !!!
  ABORT = 0.
*$*$ end of routine - insert your code only before this line *-*
ENDFORM.
Scenario: assign all fields of the transfer structure – the whole record – only when the delivered record is from the same period as the actual month. If you use the start routine for this scenario, all transfer rules have to be set to 1:1 (field TS → InfoObject CS). Within this scenario a part of the data package is effectively not read – this is equivalent to the corresponding coding in a transfer rule routine.
Structure of the start routine
PROGRAM UPDATE_ROUTINE.
*$*$ begin of global - insert your declaration only below this line *-*
* TABLES: ...
* DATA: ...
*$*$ end of global - insert your declaration only before this line *-*
* The following definition is new in BW 3.x
TYPES:
BEGIN OF DATA_PACKAGE_STRUCTURE.
INCLUDE STRUCTURE /BIC/CS80FIAR_O03.
TYPES:
RECNO LIKE sy-tabix,
END OF DATA_PACKAGE_STRUCTURE.
DATA:
DATA_PACKAGE TYPE STANDARD TABLE OF DATA_PACKAGE_STRUCTURE
WITH HEADER LINE
WITH NON-UNIQUE DEFAULT KEY INITIAL SIZE 0.
FORM startup
TABLES MONITOR STRUCTURE RSMONITOR "user defined monitoring
MONITOR_RECNO STRUCTURE RSMONITORS " monitoring with record number
DATA_PACKAGE STRUCTURE DATA_PACKAGE
USING RECORD_ALL LIKE SY-TABIX
SOURCE_SYSTEM LIKE RSUPDSIMULH-LOGSYS
CHANGING ABORT LIKE SY-SUBRC. "set ABORT <> 0 to cancel update
*$*$ begin of routine - insert your code only below this line *-*
* fill the internal tables "MONITOR" and/or "MONITOR_RECNO",
* to make monitor entries
* if abort is not equal zero, the update process will be canceled
ABORT = 0.
*$*$ end of routine - insert your code only before this line *-*
The global declaration, either for the start routine or for all update rule routines, can be maintained within the header section; the local declaration is done within the form routine "STARTUP".
The communication structure and all its fields are accessible via the structure of the DATA_PACKAGE.
The data package is delivered with the internal table "DATA_PACKAGE".
The interface of the form routine "startup" delivers:
The table MONITOR for all monitor entries (please see the monitoring slide)
The table MONITOR_RECNO for all monitor entries with record number
The data package DATA_PACKAGE as a filled internal table of the type DATA_PACKAGE_STRUCTURE, with all records of the type of the communication structure of its corresponding source
The value of the variable RECORD_ALL, of the type of a table index, to obtain the total number of records
The name of the data origin within the field SOURCE_SYSTEM
An ABORT set as a return code to influence the full data package: ABORT <> 0 means that the processing is cancelled with a "red light" for the whole data upload; the monitor shows that the loading process was cancelled "user-defined"!
Fill an internal table from a database table (DDIC) (please refer also to the performance aspects); look-up scenarios while loading into data targets; adding or appending of information from other data providers or data stores.
Consolidation scenarios: consolidate characteristic and key figure combinations; key figure calculation to store the result on the database in order to relieve the OLAP processor – e.g. when multiple basis cubes are consolidated into one.
Segmentation of characteristic values based on key figure numbers (e.g. ABC classification of characteristic values).
Organisationally, an enterprise conducts a customer segmentation: customers are divided by percentage into several segments – altogether 100%. All existing customer information now has to be split according to the segmentation. The master data adjustment is relatively easy – only attributes pointing to the dedicated segments. All transactional data, or rather the key figures, have to be stored again – exported from the existing InfoProvider. One actual customer record has to be split according to the new customer segmentation – "make more records out of one".
Implementation – model:
The splitting rules are loaded into one ODS object and into two master data InfoObjects. A new cube, using the former one as a template, is created and all segmentation characteristics are added to the model. An export DataSource delivers the data from the former cube.
Implementation within the start routine:
1. Master data information and ODS object values are read into internal tables and sorted.
2. Loop over the data package.
3. Read additional information from the internal tables and check the communication structure; when OK:
4. Loop over the ODS object values.
5. All key figures from the first to the n-th segment are calculated with the segmentation information from the ODS object and finally appended to an internal helper table – all delivered records are multiplied.
6. End loop over the ODS object.
7. Data records which do not belong to a segmentation split are appended directly (otherwise the overall result is not correct).
8. The original record from the data package is deleted.
9. End loop over the data package.
10. Append the internal helper table to the (now empty) data package table.
All update rules for key figures and characteristics are set to "1:1" (no further adaptation).
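The record multiplication in the start routine might look like the following sketch. The split table g_t_split (fields customer, segment, percent) and the record fields customer, segment and amount are hypothetical names standing in for the ODS object look-up and the concrete data package structure.

```abap
* Hypothetical split table filled in step 1:
* DATA: BEGIN OF g_t_split OCCURS 0,
*         customer LIKE data_package-customer,
*         segment(2)  TYPE c,
*         percent(5)  TYPE p DECIMALS 2,
*       END OF g_t_split.
  DATA: l_t_new LIKE data_package OCCURS 0 WITH HEADER LINE.

  LOOP AT data_package.
    READ TABLE g_t_split WITH KEY customer = data_package-customer.
    IF sy-subrc <> 0.
*     no split rule: append the record unchanged (step 7)
      APPEND data_package TO l_t_new.
      CONTINUE.
    ENDIF.
*   one new record per segment (steps 4 - 6)
    LOOP AT g_t_split WHERE customer = data_package-customer.
      l_t_new = data_package.
      l_t_new-segment = g_t_split-segment.             "assumed fields
      l_t_new-amount  = data_package-amount * g_t_split-percent / 100.
      APPEND l_t_new.
    ENDLOOP.
  ENDLOOP.
* replace the data package by the multiplied records (steps 8 - 10)
  data_package[] = l_t_new[].
```

Assigning the helper table back to DATA_PACKAGE at the end replaces the original records in one step, which covers the deletion of the originals and the final append together.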
Secure your work: even if there is a backup on the server, it is easy to click the delete button by mistake – secure your work by downloading the coding.
Documentation and readability: data staging coding is usually not the kind of modular software intended for frequent reuse. But this is no reason not to comment the coding, or to let it become unreadable!
Processing via start routines vs. assignment within transfer or update rules:
The combination of a start routine with a loop over the data package plus additional routines in transfer or update rules may arise historically, but it is not useful: it means looping twice (2x) over the whole data package! Whenever the start routine is "touched" anyway, the data package is edited there and you know how to handle ABAP – stay in the start routine!
Debugging (see chapter): all ABAP routines can be debugged via the PSA data package simulation (or via the monitor). It may be useful to code a BREAK-POINT for development reasons ...
Monitoring – when are monitor settings useful?
Monitor settings during data load: assertions.
Report errors for technical reasons, e.g. when the SELECT return code indicates an error. Motto: "if what you expect does not occur", set useful monitor entries! Be careful with "ERROR" messages (type "E"): they cancel the data load (it is better to set an "I" to get a 'green' information message, or a "W" to display a 'yellow' warning).
Application/process information: important information which helps to understand the data being loaded. Motto: "give the administrator information that mirrors the professional quality of the data." Be economical with it, because an entry can be written to the monitor for every record of the data package (huge log files).
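A monitor entry from inside an update routine could be set like the sketch below. MONITOR is the RSMONITOR table from the generated routine interface; the message class ZBW, message number 001 and the field used as message variable are assumptions for illustration.

```abap
* Write a 'yellow' warning to the monitor for the current record
* (assumption: customer message class ZBW with message 001 exists)
  MONITOR-msgid = 'ZBW'.
  MONITOR-msgty = 'W'.                    "warning - does not cancel load
  MONITOR-msgno = '001'.
  MONITOR-msgv1 = comm_structure-kokrs.   "hypothetical message variable
  APPEND MONITOR.
```

Using type 'W' instead of 'E' follows the advice above: the load finishes with a yellow status and the administrator still sees the message.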
The transformation library offers the possibility to create complex formulas without ABAP knowledge. At runtime, ABAP code is generated in the background. The transformation library covers functionality in the following categories:
Strings; date functions; aggregates; basic functions such as IF, AND, OR; mathematical functions; package functions.
Customer functions can be developed via BAdIs. The ABAP code is not precalculated or stored on the database – it is generated for every run.
Structure of a transfer rule routine – part 1
PROGRAM CONVERSION_ROUTINE.
* Type pools used by conversion program
TYPE-POOLS: RS, RSARC, RSARR, SBIWA, RSSM.
* Declaration of transfer structure (selected fields only)
TYPES: BEGIN OF TRANSFER_STRUCTURE ,
* InfoObject 0CO_AREA: CHAR - 000004
KOKRS(000004) TYPE C,
* InfoObject 0CO_DOC_NO: CHAR - 000010
* BELNR(000010) TYPE C,
* InfoObject 0CO_ITEM_NO: NUMC - 000003
* BUZEI(000003) TYPE N,
* InfoObject 0FISCVARNT: CHAR - 000002
* FISCVAR(000002) TYPE C,
* InfoObject 0FISCPER: NUMC - 000007
FISCPER(000007) TYPE N,
* InfoObject 0COORDER: CHAR - 000012
* AUFNR(000012) TYPE C,
* InfoObject 0VTYPE: NUMC - 000003
* VTYPE(000003) TYPE N,
* InfoObject 0VTDETAIL: NUMC - 000002
* VTDETAIL(000002) TYPE N,
* InfoObject 0VTSTAT: NUMC - 000001
* VTSTAT(000001) TYPE N,
* ... all fields of the transfer structure ...
END OF TRANSFER_STRUCTURE .
* Global code used by conversion rules
*$*$ begin of global - insert your declaration only below this line *-*
* TABLES: ...
* DATA: ...
*$*$ end of global - insert your declaration only before this line *-*
The header structure equals the definitions in a start routine. The fields selected for processing are no longer commented out. Global declarations for TABLES and DATA can be made by removing the comment signs.
Interface definition:
The actual record number is delivered by RECORD_NO.
The transfer structure and the chosen fields are accessible via TRAN_STRUCTURE.
The parameter RESULT gives the value back to the transfer rule and its InfoObject.
With the global structure G_T_ERRORLOG it is possible to create monitor entries (see Monitoring).
The parameter RETURNCODE of type sy-subrc tells the caller whether the actual record can be processed (= 0) or has to be skipped (<> 0).
The parameter ABORT of type sy-subrc tells the caller whether the processing of the whole data package has to be terminated (<> 0) or not (= 0).
ABORT LIKE SY-SUBRC. "set ABORT <> 0 to cancel datapackage
*$*$ begin of routine - insert your code only below this line *-*
* DATA: l_s_errorlog TYPE rssm_s_errorlog_int.
* Only actual year:
if TRAN_STRUCTURE-FISCPER+5(2) = sy-datum+4(2).
RESULT = TRAN_STRUCTURE-KOKRS.
RETURNCODE = 0.
else.
* returncode <> 0 means skip this record
RETURNCODE = 4.
endif.
* abort <> 0 means skip whole data package !!!
ABORT = 0.
*$*$ end of routine - insert your code only before this line *-*
ENDFORM.
Within the form routine the field RESULT receives the output. Example, comparable to the chapter on the start routine:
The transfer structure delivers a periodic value. The actual transfer routine maps field KOKRS to InfoObject 0CO_AREA. This field, and the whole record, is assigned only when the delivered record belongs to the current month (periods 13 – 16 or 17 are not booked). If the month matches, the result is written and the other assignments are allowed because RETURNCODE = 0; if not, the actual record is skipped.
*$*$ begin of inverse routine - insert your code only below this line*-*
DATA:
L_S_SELECTION LIKE LINE OF C_T_SELECTION.
* An empty selection means all values
CLEAR C_T_SELECTION.
L_S_SELECTION-FIELDNM = 'FISCPER'.
* ...
* Selection of all values may not be exact
E_EXACT = RS_C_FALSE.
*$*$ end of inverse routine - insert your code only before this line *-*
Inversion routines are used if the data flow runs in the opposite direction – retraction, e.g. to the CRM system. Inversion routines only need to be implemented when:
SAP RemoteCubes are used to read data directly from the source system, or
a report-report interface with a combined drill-through to the source system is used.
The coding reflects the selection criteria for the characteristic values that were allowed into BW.
See the example before: CO_AREA for one period and the actual month.
Conversion routines belong to InfoObjects. Within the transfer structure you can see which conversion routine is defined for the objects. Within the InfoPackage you can allow these conversions or switch them off. Conversion routines are function modules that:
bring differing field definitions of source systems together, enhance data quality, and avoid data inconsistency.
You can call up the function modules by double-clicking the routine within the InfoObject. Examples:
ALPHA conversion (leading zeros), material number conversion, date formats, and a lot more ...
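The effect of the ALPHA routine can be sketched with a direct call of its input conversion function module; the module name is the SAP standard one, and the 18-character field mirrors a material number.

```abap
DATA: l_matnr(18) TYPE c VALUE '4711'.

* Pad the external value with leading zeros (internal format)
CALL FUNCTION 'CONVERSION_EXIT_ALPHA_INPUT'
  EXPORTING
    input  = l_matnr
  IMPORTING
    output = l_matnr.
* l_matnr is now '000000000000004711'
```

The matching CONVERSION_EXIT_ALPHA_OUTPUT strips the zeros again for display, which is why consistent use of the routine avoids the data inconsistencies mentioned above.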
InfoCube characteristic update rule / ODS object key field update rule
Characteristic assignment – change of source (1); give the update rule a name (2). Interface:
As in the start routine, global declarations go in the header. The form routine "compute_key_field" delivers the following parameters:
The table MONITOR for monitoring; the communication structure COMM_STRUCTURE with all fields from the DataSource; the current record number of the loop over the data package; the number of total records; RESULT delivers the result value to the marked characteristic; RETURNCODE says whether the current record will be processed or not; ABORT clarifies whether the whole data package will be processed or not.
PROGRAM UPDATE_ROUTINE.
*$*$ begin of global - insert your declaration only below this line *-*
* TABLES: ...
* DATA: ...
*$*$ end of global - insert your declaration only before this line *-*
FORM compute_key_field
  TABLES   MONITOR STRUCTURE RSMONITOR "user defined monitoring
  USING    COMM_STRUCTURE LIKE /BIC/CS80FIAR_O03
           RECORD_NO      LIKE SY-TABIX
           RECORD_ALL     LIKE SY-TABIX
           SOURCE_SYSTEM  LIKE RSUPDSIMULH-LOGSYS
  CHANGING RESULT     LIKE /BI0/V0FIAR_C03T-ACCT_TYPE
           RETURNCODE LIKE SY-SUBRC
           ABORT      LIKE SY-SUBRC. "set ABORT <> 0 to cancel update
*$*$ begin of routine - insert your code only below this line *-*
* fill the internal table "MONITOR", to make monitor entries
* result value of the routine
  RESULT = .
* if the returncode is not equal zero, the result will not be updated
  RETURNCODE = 0.
* if abort is not equal zero, the update process will be canceled
  ABORT = 0.
*$*$ end of routine - insert your code only before this line *-*
ENDFORM.
Within update rules there are various possibilities to reference time dependencies:
Characteristic update rule with time reference.
It is possible to update data target characteristics of type "DATE" via an automatic time distribution. The field is fed from a data field of the communication structure. For example, the data target characteristic calendar day is fed from the source characteristic calendar month. The effect is that every summarized key figure has to be distributed across the calendar days of the month. In addition, it can be defined whether the company calendar has to be used.
Time characteristics
Within the "time reference" frame, various data target time characteristics can be filled automatically from only one time characteristic of the DataSource.
Interface: as in the start routine, global declarations go in the header.
The form routine "compute_data_field" delivers the following parameters:
The table MONITOR for monitoring.
The communication structure COMM_STRUCTURE with all fields from the DataSource.
The current record number of the loop over the data package.
The number of total records.
RESULT delivers the result value to the marked field.
RETURNCODE says whether the current record will be processed or not.
ABORT clarifies whether the whole data package will be processed or not.
PROGRAM UPDATE_ROUTINE.
*$*$ begin of global - insert your declaration only below this line *-*
* TABLES: ...
* DATA: ...
*$*$ end of global - insert your declaration only before this line *-*
FORM compute_data_field
  TABLES   MONITOR STRUCTURE RSMONITOR "user defined monitoring
  USING    COMM_STRUCTURE LIKE /BIC/CS80FIAR_O03
           RECORD_NO      LIKE SY-TABIX
           RECORD_ALL     LIKE SY-TABIX
           SOURCE_SYSTEM  LIKE RSUPDSIMULH-LOGSYS
  CHANGING RESULT     LIKE /BI0/V0FIAR_C03T-DEB_CRE_LC
           RETURNCODE LIKE SY-SUBRC
           ABORT      LIKE SY-SUBRC. "set ABORT <> 0 to cancel update
*$*$ begin of routine - insert your code only below this line *-*
* fill the internal table "MONITOR", to make monitor entries
* result value of the routine
  RESULT = .
* if the returncode is not equal zero, the result will not be updated
  RETURNCODE = 0.
* if abort is not equal zero, the update process will be canceled
  ABORT = 0.
*$*$ end of routine - insert your code only before this line *-*
ENDFORM.
If it is necessary to use a return table, the generated update rule differs marginally ... (1) ... (2) ... Interface:
Global data declarations go in the header, in the same way as in the start routine and for characteristics. The form routine "compute_data_field" delivers the following parameters:
The table MONITOR for monitoring; the table RESULT_TABLE with the structure of the data target; the communication structure COMM_STRUCTURE with all fields from the DataSource; the current record number of the loop over the data package; the number of total records; RETURNCODE says whether the current record will be processed or not; ABORT clarifies whether the whole data package will be processed or not. The structure of the data target is delivered in the parameter ICUBE_VALUES.
PROGRAM UPDATE_ROUTINE.
*$*$ begin of global - insert your declaration only below this line *-*
* TABLES: ...
* DATA: ...
*$*$ end of global - insert your declaration only before this line *-*
FORM compute_data_field
  TABLES   MONITOR      STRUCTURE RSMONITOR "user defined monitoring
           RESULT_TABLE STRUCTURE /BI0/V0FIAR_C03T
  USING    COMM_STRUCTURE LIKE /BIC/CS80FIAR_O03
           RECORD_NO      LIKE SY-TABIX
           RECORD_ALL     LIKE SY-TABIX
           SOURCE_SYSTEM  LIKE RSUPDSIMULH-LOGSYS
           ICUBE_VALUES   LIKE /BI0/V0FIAR_C03T
  CHANGING RETURNCODE LIKE SY-SUBRC
           ABORT      LIKE SY-SUBRC. "set ABORT <> 0 to cancel update
*$*$ begin of routine - insert your code only below this line *-*
* fill the internal table "MONITOR", to make monitor entries
* if the returncode is not equal zero, the result will not be updated
  RETURNCODE = 0.
* if abort is not equal zero, the update process will be canceled
  ABORT = 0.
*$*$ end of routine - insert your code only before this line *-*
ENDFORM.
Please refer to "How To ... Disaggregate on upload" in the appendix or in SAPNet.
Application scenario: the amounts of a cost center have to be distributed to the employees of the cost center because of the demand to report on an employee basis.
One record from the communication structure has to be split according to the number of employees of one cost center.
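Inside a return-table routine, this cost-center split could be sketched as follows. The helper table g_t_emp and the fields costcenter, employee and amount are hypothetical names for illustration; only MONITOR, RESULT_TABLE, COMM_STRUCTURE, ICUBE_VALUES and RETURNCODE come from the generated interface.

```abap
* Hypothetical helper table filled once in the global part:
* DATA: BEGIN OF g_t_emp OCCURS 0,
*         costcenter LIKE comm_structure-costcenter,
*         employee(8) TYPE c,
*       END OF g_t_emp.
  DATA: l_count TYPE i.

* Count the employees of the current cost center
  LOOP AT g_t_emp WHERE costcenter = comm_structure-costcenter.
    ADD 1 TO l_count.
  ENDLOOP.
  CHECK l_count > 0.

* One result record per employee, with the amount distributed evenly
  LOOP AT g_t_emp WHERE costcenter = comm_structure-costcenter.
    RESULT_TABLE = icube_values.               "take over all other fields
    RESULT_TABLE-employee = g_t_emp-employee.  "assumed target field
    RESULT_TABLE-amount   = comm_structure-amount / l_count.
    APPEND RESULT_TABLE.
  ENDLOOP.
  RETURNCODE = 0.
```

Starting each result record from ICUBE_VALUES keeps all characteristics and key figures that are not part of the split unchanged.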
If it is necessary to calculate the unit, the generated routine differs marginally ... (1) ... (2) ... Interface:
Global data declarations go in the header, in the same way as in the start routine and for characteristics. The form routine "compute_data_field" delivers the following parameters:
The table MONITOR for monitoring.
The communication structure COMM_STRUCTURE with all fields from the DataSource.
The current record number of the loop over the data package.
The field SOURCE_SYSTEM delivers the logical system name of the DataSource.
RESULT delivers the result value to the marked key figure.
UNIT delivers the unit value corresponding to the key figure.
RETURNCODE says whether the current record will be processed or not.
ABORT clarifies whether the whole data package will be processed or not.
PROGRAM UPDATE_ROUTINE.
*$*$ begin of global - insert your declaration only below this line *-*
* TABLES: ...
* DATA: ...
*$*$ end of global - insert your declaration only before this line *-*
FORM compute_data_field
  TABLES   MONITOR STRUCTURE RSMONITOR "user defined monitoring
  USING    COMM_STRUCTURE LIKE /BIC/CS80FIAR_O03
           RECORD_NO      LIKE SY-TABIX
           RECORD_ALL     LIKE SY-TABIX
           SOURCE_SYSTEM  LIKE RSUPDSIMULH-LOGSYS
  CHANGING RESULT LIKE /BI0/V0FIAR_C03T-DEB_CRE_LC
           UNIT   LIKE /BI0/V0FIAR_C03T-LOC_CURRCY
           RETURNCODE LIKE SY-SUBRC
           ABORT      LIKE SY-SUBRC. "set ABORT <> 0 to cancel update
*$*$ begin of routine - insert your code only below this line *-*
* fill the internal table "MONITOR", to make monitor entries
* result value of the routine
  RESULT = .
* result value of the unit
  UNIT = .
* if the returncode is not equal zero, the result will not be updated
  RETURNCODE = 0.
* if abort is not equal zero, the update process will be canceled
  ABORT = 0.
*$*$ end of routine - insert your code only before this line *-*
ENDFORM.
Copy update rulesWhen creating new update rules you can work with the template of a different Info Provider:
the start routine if existing and all other assignments are copied
Copy single update rulesYou can copy (duplicate) single update rules:
1. Mark the line
2. Menu 'Edit' -> Copy rule, or F6, or right mouse click
3. Another line is added with exactly the same update rules
Important: when the update rule runs, one more record is added to the data target!
This is an alternative to the return table feature ...
Data security, documentation and readability of the program
Here too, it is useful to save the programs by downloading them, even though there is a backup on the application server.
As with the possibly more complex start routines, comments help to keep the coding readable; the readability of the program improves a lot.
Debugging (see chapter)
All ABAP routines can be debugged via the PSA data package simulation (or via the monitor).
It may be useful to code a "break-point" for development reasons.
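For example, a breakpoint guarded by a user-name check so that it only fires for one developer (the user name is a placeholder):

```abap
* Placeholder user name - remove before transport!
IF sy-uname = 'DEVELOPER'.
  BREAK-POINT.
ENDIF.
```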
Configure flexible InfoPackage selection with ABAP
Delete equal or similar requests in an InfoProvider using ABAP
Define with ABAP file and directory names for a flexible flat file upload
Use start routines flexibly
Model transfer and update rules flexibly with ABAP
Unit Summary
For internal use by CSC only
Exercises
Unit: Data Staging Topic: Flexible Selection
1-1 You want to fill an InfoCube with data from a flat file.
1-1-1 Create an InfoCube ‘T_BC##’ with the description ‘Cube Group ##’ under InfoArea BW Training -> BW/RIG Workshops -> ABAP Topics – Backend -> Group ##. Copy the InfoCube ‘T_BC00’. Activate the InfoCube.
1-1-2 Create an InfoSource ‘T_EXTERNAL_GR##’ with the description ‘External Data GR##’ under the application component BW Training -> ABAP Topics – Backend. Use the InfoSource ‘T_EXTERNAL_GR00’ as a template.
Select ‘I_EXTERN – IDES External Data’ in the Source System field and make the InfoObject 0COSTELMNT available for data selection in an InfoPackage. Activate the InfoSource.
1-1-3 Create update rules for your InfoCube ‘T_BC##’ from your InfoSource ‘T_EXTERNAL_GR##’. Activate the update rules.
1-1-4 Save the flat file ‘T_COSTCENTER_TRANS_2004002’ to your local hard disk. You will find the file in the SAP menu under Office -> Folders -> Shared folders (TA SO04) -> TRAINING -> ABAP Backend Workshop.
1-2 The flat file contains data records with different cost elements. You only want to load certain cost elements into your InfoCube. In order to constrain the upload, you use an ABAP routine for the data selection.
1-2-1 Create an InfoPackage for your InfoSource ‘T_EXTERNAL_GR##’ with the description ‘Flexible Selection GR##’.
Create an ABAP Routine for the selection of 0COSTELMNT with the description ‘Cost element selection’. Define an internal table that you fill with the following values:
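As an illustration, the body of such a selection routine might look like the following sketch. The range table l_t_range and return code p_subrc follow the system-generated template, and the cost element values here are examples only:

```abap
* Sketch: restrict 0COSTELMNT to an interval (values are examples).
  l_t_range-sign   = 'I'.
  l_t_range-option = 'BT'.
  l_t_range-low    = '0000400000'.
  l_t_range-high   = '0000417000'.
  APPEND l_t_range.
  p_subrc = 0.   " selection filled successfully
```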
Insert the path of the flat file under the External data tab and make the following adjustments:
Field Name or Data Type               Values
CSV file                              X
Data Separator                        ;
Escape Sign                           "
Separator for Thousands               .
Character Used for Decimal Point      ,
Number of Header Rows to be Ignored   1
Save the InfoPackage and start the data load into your InfoCube immediately.
1-2-2 Check in the administration of your InfoCube if the upload was successful.
How many data records were inserted into the InfoCube?
Which values have the key figures?
Unit: Data Staging Topic: Flexible File Name
2-1 You want to fill an InfoCube every month with transaction data from a flat file. The corresponding flat file will be available every month in the same folder. The file name contains year and period.
Therefore you want to use a flexible file path that automatically adjusts the file name of the flat file.
2-1-1 Create an InfoPackage for your InfoSource ‘T_EXTERNAL_GR##’ with the description ‘Flexible File Name GR##’.
Create an ABAP Routine for your file path under the External data tab with the description ‘Flexible File Name’. The routine should return the current file name with the correct suffix concerning year and period.
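A sketch of such a file name routine; the directory, the file prefix and the parameter name p_filename are assumptions based on the generated template:

```abap
* Sketch: derive year and period from the system date and build the file name.
  DATA: l_year(4)   TYPE n,
        l_period(3) TYPE n.
  l_year   = sy-datum(4).
  l_period = sy-datum+4(2).
  CONCATENATE 'C:\upload\T_COSTCENTER_TRANS_' l_year l_period '.csv'
         INTO p_filename.
```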
Make the following adjustments:
Field Name or Data Type               Values
CSV file                              X
Data Separator                        ;
Escape Sign                           "
Separator for Thousands               .
Character Used for Decimal Point      ,
Number of Header Rows to be Ignored   1
Mark the radio button for deletion of the entire data target content. Save the InfoPackage and start the data load into your InfoCube immediately.
2-1-2 Check in the administration of your InfoCube if the upload was successful.
How many data records were inserted into the InfoCube?
Which values have the key figures?
Unit: Data Staging Topic: Start Routine
3-1 You want to fill an InfoCube with data from a flat file. The flat file contains data records with the value type ‘actual’ (VTYPE = 10) and ‘plan’ (VTYPE = 20). The InfoCube should be filled only with actual data records.
Therefore you create a start routine for the transfer rules.
3-1-1 Edit the transfer rules of your InfoSource ‘T_EXTERNAL_GR##’.
Create a start routine for the transfer rules. The routine should delete all data records that have the value type ‘020’ (Plan). Activate the InfoSource.
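The core of such a start routine can be a single statement. DATAPAK is the name of the data package table in transfer rule start routines; the field name vtype depends on the transfer structure:

```abap
* Sketch: drop all plan records (value type 020) from the data package.
  DELETE DATAPAK WHERE vtype = '020'.
```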
Create an InfoPackage for your InfoSource ‘T_EXTERNAL_GR##’ with the description ‘Start Routine GR##’. Insert the path of the flat file under the External data tab and make the following adjustments:
Field Name or Data Type               Values
CSV file                              X
Data Separator                        ;
Escape Sign                           "
Separator for Thousands               .
Character Used for Decimal Point      ,
Number of Header Rows to be Ignored   1
Mark the radio button for deletion of the entire data target content. Save the InfoPackage and start the data load into your InfoCube immediately.
3-1-2 Check in the administration of your InfoCube if the upload was successful.
How many data records were inserted into the InfoCube?
Step 1 (I_STEP = 1) is called before the processing of the variable pop-up and gets called for every variable of the processing type “customer exit”. You can use this step to fill your variable with a default or proposal value.
Step 2 (I_STEP = 2) is called after processing of the variable pop-up. This step is called only for those variables that are not marked as “ready for input” and are set to “mandatory variable entry”.
Step 3 (I_STEP = 3) is called after all variable processing and gets called only once and not per variable. Here you can validate the user entries.
Step 0 (I_STEP = 0) is called for variables which are used in authorizations objects.
Depending on the variable type, the fields LOW, HIGH, SIGN and OPT of the internal table E_T_RANGE must be filled in the same way as in a selection option:
LOW: Single value or lower interval limit (characteristic variable); for a hierarchy node variable, the selected hierarchy node.
HIGH: Upper interval limit (characteristic variable with interval). For a hierarchy node variable, this field holds the InfoObject of the hierarchy node:
- when a leaf is selected in the LOW field, this field must remain empty;
- when a node that cannot be posted is selected in the LOW field, the InfoObject it refers to has to be filled in here (usually the InfoObject '0HIER_NODE'). Only if the node refers to an InfoObject other than the variable's InfoObject does the name of that other InfoObject appear here.
HIGH remains empty for all other variable types.
Please note that you cannot overwrite the user's input values for a variable with this customer exit. You can only derive values for other variables or validate the user entries.
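A minimal sketch of such a customer exit (include ZXRSRU01 of EXIT_SAPLRRS0_001); the variable name ZMYVAR is hypothetical:

```abap
* Sketch: propose the current month as a default value in step 1.
  DATA: l_s_range TYPE rsr_s_rangesid.
  CASE i_vnam.
    WHEN 'ZMYVAR'.                       " hypothetical variable name
      IF i_step = 1.                     " before the variable pop-up
        l_s_range-sign = 'I'.
        l_s_range-opt  = 'EQ'.
        l_s_range-low  = sy-datum(6).    " current month as proposal
        APPEND l_s_range TO e_t_range.
      ENDIF.
  ENDCASE.
```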
Business Scenario: The business department wants to check whether the orders from the customers for a certain week (e.g. 6 weeks from now) have already been transmitted into their system. Therefore they run a report with a variable that is calculated automatically.
Note: Of course this is not the only suitable solution, but it will show you how the exit works.
Step by Step Solution:
• Create a variable on the InfoObject 0CALWEEK like this:
  • Technical Name: WS_FWEEK
  • Description: Workshop Future Week
  • Interval
  • Mandatory variable entry
  • Processing: Customer Exit
  • NOT ready for input
• Define a query using the above created variable.
• For displaying the variable values in the executed query you may also define two text variables. If necessary, these should be able to show the lower and the upper value calculated by the customer exit.
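The exit coding for WS_FWEEK could look like the following sketch (I_STEP = 2, since the variable is not ready for input; here the interval runs from the current week to the week six weeks ahead, adjust to the business rule actually needed):

```abap
* Sketch: fill WS_FWEEK with an interval up to the week six weeks from now.
  DATA: l_date    TYPE sy-datum,
        l_week_lo TYPE scal-week,
        l_week_hi TYPE scal-week,
        l_s_range TYPE rsr_s_rangesid.
  IF i_vnam = 'WS_FWEEK' AND i_step = 2.
    CALL FUNCTION 'DATE_GET_WEEK'
      EXPORTING date = sy-datum
      IMPORTING week = l_week_lo.
    l_date = sy-datum + 42.                 " today + 6 weeks
    CALL FUNCTION 'DATE_GET_WEEK'
      EXPORTING date = l_date
      IMPORTING week = l_week_hi.
    l_s_range-sign = 'I'.
    l_s_range-opt  = 'BT'.
    l_s_range-low  = l_week_lo.
    l_s_range-high = l_week_hi.
    APPEND l_s_range TO e_t_range.
  ENDIF.
```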
Example for filling variable values depending on another variable / deriving one variable value from another variable
Business scenario: A query should show in one column the value for one period. The period should be entered by the user. In the second column, the accumulated value from the beginning of the year up to the period of the first column should be displayed. The InfoCube contains only the InfoObject 0CALMONTH (Month/Year) and no separate InfoObjects for period and year.
Solution: For this scenario we need four variables: two variables for the column texts and two variables for the period values. One of these period variables is defined as a variable with a customer exit.
Step by Step Solution:
• Create an input variable “MONTH” (you need this specific name for the coding example below). The variable is based on the InfoObject 0CALMONTH. Set the following attributes:
  • Single Value
  • Ready for input
  • Mandatory variable entry
• Create a variable “CUMMONTH” (you need this specific name for the coding example below) with a customer exit as processing type. The variable is based on the InfoObject 0CALMONTH. Set the following attributes:
  • Interval
  • NOT ready for input!
  • Mandatory variable entry
• Create two text variables. Both variables use “Replacement Path” as processing type. The first text variable is filled with the text from the “from value”, the second with the text from the “to value”.
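The exit coding for CUMMONTH could then look like this sketch (I_STEP = 2; it reads the user entry for MONTH from I_T_VAR_RANGE and returns the interval from January of that year up to the entered month):

```abap
* Sketch: derive the cumulated interval from the entry variable MONTH.
  DATA: l_s_range     TYPE rsr_s_rangesid,
        l_s_var_range TYPE rrrangeexit.
  IF i_vnam = 'CUMMONTH' AND i_step = 2.
    READ TABLE i_t_var_range INTO l_s_var_range
      WITH KEY vnam = 'MONTH'.
    IF sy-subrc = 0.
      l_s_range-low      = l_s_var_range-low.   " e.g. 200407
      l_s_range-low+4(2) = '01'.                " January of the same year
      l_s_range-high     = l_s_var_range-low.   " up to the entered month
      l_s_range-sign     = 'I'.
      l_s_range-opt      = 'BT'.
      APPEND l_s_range TO e_t_range.
    ENDIF.
  ENDIF.
```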
1-1 You want to create a report for your InfoCube that should display ‘cost center’, ‘cost element’ and ‘amount’. For ‘amount’, values should be displayed for the ‘year’ chosen by the user and for the year before.
For the determination of ‘year before’ use a User Exit.
1-1-1 Open the BEx Analyzer and create a new query ‘BEXVARGR##’ with the description ‘BEx-Variable GR##’ for the InfoCube ‘T_BC00’.
Insert ‘T_05C00’ and ‘Cost Element’ in Rows.
Create a restricted key figure at the query level for ‘Amount’ restricted to ‘Fiscal year’. In order to restrict ‘Fiscal year’ create a variable ‘YEARGR##’ with the description ‘Year Input GR##’ for user entry.
Create another restricted key figure at the query level for ‘Amount’ restricted to ‘Fiscal year’. In order to restrict ‘Fiscal year’ create a further variable ‘YEXIT##’ with the description ‘Year with Exit GR##’ for a User Exit.
For creation of the User Exit use transaction ‘CMOD’ and project ‘RSR00001’. Implement your coding in the function module ZEXIT_VAR_GR##. The exit should return the year before the year chosen by user entry. Save and activate the function module.
1-1-7 Save and execute the query. Check if the values and variables are filled correctly.
What is a virtual characteristic or a virtual key figure?
A virtual characteristic or a virtual key figure is an InfoObject which is defined within the InfoProvider as metadata without having any data stored physically.
You can also enter multiple InfoProviders or use wildcards to select the InfoProviders.
It is highly recommended to copy the sample code via Goto -> Sample Code -> Copy. The sample code can only be copied before the implementation is activated!
FLT_VAL: Filter values for the InfoProvider as defined in the BAdI.
I_S_RKB1D: General information about InfoProvider and query; most useful are the fields InfoCube and COMPUID or GENUNIID for identifying the query.
I_TH_CHANM/KYFNM_USED: List of the characteristics / key figures used in the query.
C_T_CHANM/KYFNM: List of characteristics / key figures used for calculation in the BAdI. Must be filled in this method.
You can select the InfoCube in case you did not restrict the BAdI Implementation to only one cube
You only need to provide the mode for characteristics. There are two modes:
- no_selection: This mode is used for virtual characteristics. The value is not read from the database.
- read: The value is always read from the cube, even if it is not used in the query. Example: you need to get prices for materials, but the query runs on material group level.
You should add a public instance attribute P_CHA_<IOBJECT> for each InfoObject you want to access in the COMPUTE method (real or virtual) and a public instance attribute P_KYF_<IOBJECT> for each key figure you want to use. These attributes should be of type I.
In the COMPUTE method these attributes are filled the following way:
Value = 0: the InfoObject is not used in the query.
Value > 0: position of the InfoObject in the structure C_S_DATA.
I_PARTCUBE: For MultiProviders, the part cube for which the values are calculated
I_NCUM: Non-cumulative Value
C_S_DATA: Data structure of the InfoProvider. Should be filled in the method. You can best access the structure by using the attributes P_CHA_<characteristic> and P_KYF_<key figure>. They are defined as integer values and are filled at the moment the method is called. If the value is larger than 0, the number is the position of the field in structure C_S_DATA.
The method is called once per record that is read from the cube. This can lead to huge performance problems during query execution. To avoid this, keep the following things in mind:
Avoid Selects if possible
The mode READ may add additional InfoObjects to the query so that old aggregates do not work anymore
l_s_chanm-mode READ lowers the level of detail for characteristics invisible to the user; as a result, existing aggregates may no longer be usable, which can reduce performance.
At the conclusion of this exercise, you will be able to:
• Implement the BAdI for ‘Virtual Characteristics and Key Figures’
In this exercise you will define a virtual characteristic to filter values depending on your user name.
1-1 Implementing the BAdI
1-1-1 Create the BAdI ZT_VT_GR## with the definition RSR_OLAP_BADI. Give it the short text: Virtual Characteristics Group ##
1-1-2 Restrict it to the InfoCube T_VT_GR##.
1-1-3 In the DEFINE method enter the following coding:
* characteristic
l_s_chanm-chanm = 'T_FLAG'.
l_s_chanm-mode = rrke_c_mode-no_selection.
APPEND l_s_chanm TO c_t_chanm.
l_s_chanm-chanm = 'T_COUNT'.
l_s_chanm-mode = rrke_c_mode-read.
APPEND l_s_chanm TO c_t_chanm.
1-1-4 Create two attributes P_CHA_T_COUNT and P_CHA_T_FLAG in the class ZCL_IM_T_VT_PR## as static instance attributes of type I. Change the implementation of the COMPUTE method. Enter the following coding:
field-symbols: <l_count> type /bic/oit_count,
               <l_flag>  type /bic/oit_flag.

if P_CHA_T_COUNT > 0.
  assign component P_CHA_T_COUNT of structure C_S_DATA to <l_count>.
endif.
if P_CHA_T_FLAG > 0.
  assign component P_CHA_T_FLAG of structure C_S_DATA to <l_flag>.
endif.

if <l_count> <= sy-uname+6(2).
  <l_flag> = 'X'.
else.
  <l_flag> = ' '.
endif.
1-1-5 Activate all objects.
1-2 Test the BAdI
1-2-1 Go to the Query designer and build some reports. List the fields T_COUNT, T_FLAG and T_NUMBER
1-2-2 Filter on the value of T_FLAG. Try the values 'X' and ' '.
1-2-3 Remove T_COUNT. The sums don't change even though there are fewer rows.
Note that the values are different for each user. You could use this technique to implement complex user-specific filters.
Simulation/What-If
Forecast
Top-down distribution
Bottom-up aggregation
...
Planning sequences
Event-based execution of planning functions
Automatic Planning
Planning Functions
User Interfaces
Manual Planning
Excel, ALV and Web
Document management
Application design
Control and Distribution of Data
Process Control
Status and tracking monitor
Distribution to operational systems
Use planning functions for Background Processing
Planning in BW-BPS comprises four major areas:
In Modeling the data model of a planning process is defined. That includes how the planning process will be set up in regards to business structure (slice and dice) and planning strategy (e.g. bottom-up, top-down, counter-current).
Planning functions are operations that support the automatic processing of plan data (e.g. copying, forecasting, ...) in a simulative fashion or embedded in batch processing.
Planning layouts in manual planning make it possible to set up the context for entering plan data for different users. Planning interfaces consist of one or many planning layouts together with planning functions to support the planner with the planning task.
In order to coordinate the sequence of the planning process, planning tasks may be assigned to the planners, and the status of the planning accomplishments can be tracked over the cycle of a planning session.
Planning is usually executed along organizational structures. Different planners are viewing and changing data on different levels of granularity. These levels of granularity correspond to the levels in the hierarchy of the organizational structures.
When modeling the planning process this hierarchy must be materialized in the planning system.
The BW-BPS uses InfoCubes as a data storage. So-called planning levels define the fixed levels of aggregation in the InfoCube. Thus the organizational structures correspond to a set of planning levels.
On the slide we see an example of an organizational structure in the sales organization of a company that is selling food products. The planning in the company is done on the different levels defined by the organizational structure – there are planners for the different products (like “Orange Juice” and “Apple Juice”) for different product groups (“Juice” and “Water”), for different business units (“Non-Alcohol” and “Alcohol”) etc. We will use this hierarchy for showing how such a planning process can be modeled in the BW-BPS and by doing so will learn about the different components of the BW-BPS.
Database for planning models
Corresponds to one or more InfoCubes
Determines the characteristics and key figures for planning
The planning area is the base for planning models.
In the planning area all organizational structures and planning tasks related to a specific planning process can be set up.
It corresponds to one InfoCube (Basic Planning Area) or multiple InfoCubes (Multi-Planning Area) and may be seen as an enhancement of the InfoCube in order to be able to plan. The InfoCube includes all characteristics and key figures that are relevant in the planning process.
The InfoCube we use in our example contains the characteristics country, version, year and period (and fiscal year variant), product, and product line. The key figures are quantity, price, and revenue.
The planning area comprises:
Attributes (of planning area) – Connection to BW via InfoCubes
Data Slices – Locking of specific characteristics against changes
Variables
Master data – all characteristics and key figures of the corresponding InfoCube
Characteristic relationship – procedures to check, derive or propose valid combinations of characteristic values.
Slice of the InfoCube
Includes selection of characteristics and characteristic values
Carrier of planning methods
Planning levels determine the granularity on which data is planned.
They also may be used to define the hierarchical orders of the planning model.
In our example a planner who is responsible for planning on products uses a planning level that contains the characteristics country, year (and fiscal year variant), version, product and product group. A planner who is responsible for planning on the product group uses a different planning level that contains country, year (and fiscal year variant), version, and product group.
Planning functions (“automatic planning”) and planning layouts (manual data entry) depend on exactly one planning level.
In addition planning levels may include selections of characteristic values (e.g. product group: water, juice)
The flag "Selection in package" determines whether characteristic values are selected in the planning level or in the planning package (e.g. for personalization)
Selection of characteristic values in the planning level is used when the same selection applies to several planning packages. (e.g. fiscal year: 2010 is to be planned by all planners)
Selection of characteristic values in the planning packages is used when different selections are needed on one planning level (e.g. one planning package for product “Orange Juice” another planning package for product “Apple Juice”)
Subset of the planning level
Selection of characteristic values
Work package for one user or a group of users
The planning package is a subset of the planning level (requires setting "selection in package" in planning level)
Planning packages are always assigned to exactly one planning level. One planning level can contain several planning packages.
Individual restrictions can be made using a planning package (e.g. single product according to responsibility) for personalization
If no planning package specific selection is necessary (full selection on the planning level, personalization using variables), at least an Ad-hoc-package must be created. In order to execute a planning method, the selection of a planning package is mandatory.
Techniques for changing plan data
Manual planning
Planning functions
Parameter groups
Planning methods cover the different ways how data may be generated or manipulated using BW-BPS
Manual planning comprises the planning layouts that are used for manual data entry.
Planning functions are methods that automatically change data.
BW-BPS delivers generic planning functions that range from copying, distributing, deleting and others to a formula builder which can be used to create calculations according to a company‘s individual business rules without needing any ABAP programming.
Planning functions are two-fold: the planning function and parameter group(s).
The planning function defines the basic settings of the function (which characteristics and key figures are the basis for the function?). E.g. if data sets shall be copied from version 1 to version 2, the planning function gets the information that version is the basis for the copy.
The parameter group defines, which values should be used to execute the planning function and which conditions are applicable. For the copy that would mean from version 1 to version 2.
The separation of planning function and parameter groups makes it possible to reuse planning functions for multiple purposes (e.g. one parameter group to copy from version 1 to version 2, another parameter group to copy from version 2 to version 3).
• All techniques for changing plan data automatically
• E.g.: Copy, Top-Down, Revaluate, Exit
• Defines all the parameters of a function
• E.g.: Copy version 100 to version 200
Planning functions change the transaction data in a planning package. Each planning function is assigned to just one planning level. There are various types of planning function (for example: exit, copy, distribute, delete, formula). Parameter groups are created for a planning function.
All planning functions are generic, which means that the business logic has to be supplied by the person who customizes them.
Generally speaking, you create an input mask when defining the planning function by assigning fields to the "fields to be changed". Then, within the parameter group, you fill in the concrete rule in the input mask you created within the planning function itself.
After that, the planning function is executed by double-clicking the planning package and the parameter group.
If you use a planning function of type Exit, you do not have predefined masks for the parameter group. You can define parameters for your Exit yourself, but you do not have to. Nevertheless you always need a (potentially empty) parameter group to execute the planning function.
Entries in the level (containing Product Group):

  Product Group  Fiscal Year  Amount
  Water          2010         100
  Juice          2010         200

Entries in the level (not containing Product Group):

  Fiscal Year  Amount
  2010         300   -> changed to 400 (change / new record)

Entries in the InfoCube (after aggregation):

  Product Group  Fiscal Year  Amount
  Water          2010         100
  Juice          2010         200
  #              2010         100
In BW-BPS data can be created on any aggregation level of an InfoCube
e.g. the InfoCube includes the characteristics product group and product – a user can enter data on the aggregated level (product group) or on a detailed level (product group, product).
If the data is entered on a detailed level it can always automatically be aggregated. If the data is entered on an aggregated level it can either be distributed to a more detailed level or just stay on the aggregated level.
If data remains on an aggregated level, it is "not assigned" to some details (e.g. product)
Not assigned values may be distributed to detailed planning levels using distribution functions or manual assignment.
The example illustrates these effects:
The InfoCube in the example includes the characteristics product group, fiscal year and the key figure amount.
First table: The planning level contains product group, fiscal year and amount, the planning package is restricting the characteristic values for product group to water and juice and the fiscal year to 2010
Second table: The planning level contains fiscal year and amount only and shows the aggregated amount entered in the first table. The amount is changed.
Third table: The planning level contains product group, fiscal year and amount, the planning package is restricting the characteristic values for product group to water and juice and the fiscal year to 2010. As a change was made on an aggregated level, there is an "unassigned value" (#) remaining regarding the product group.
The possible next step would be to decide whether the “unassigned value” should be distributed to the different product groups or remain on “unassigned”.
Planning methods in BW-BPS work on data which is temporarily stored in a buffer called planning buffer
Each user who works in BW-BPS writes data into a personal buffer
All data in the buffer is locked against changes by other users.
Whenever a planning function is called, subsets of data are used from the buffer. If they are not already in the buffer, the data is called from the transactional InfoCube.
The subsets that are called into the buffer depend on the data selection of the planning package and the planning function (e.g. the package includes the countries Germany and Spain but the planning function only changes or creates data for Spain; then only a subset including Spain is called into the buffer).
Whenever a user chooses to save, the data is written into the InfoCube. In order to do so, deltas are calculated.
Even when a “save” of the data has been carried out, the data remains in the planning buffer. Only upon leaving the planning application is the data released from the buffer and available for changes by other users.
Whenever data records of the InfoCube are manipulated (planning function, derivation) or the characteristics in the data records are processed (combination check, create combination), dynamic structures are used. Thus any access has to be done using field symbols.
A variable of type exit is defined via a function module that returns a selection table. At runtime the variable will be replaced by this selection. Variables in BW-BPS can be used in the data selection, the definition of planning functions, and the definition of planning layouts.
Characteristic relationships are used to model the relationships between characteristics. They can be used to ensure that only valid characteristic combinations are written to the InfoCube. E.g. a characteristic relationship between product groups and products ensures that every record that contains a juice product (e.g. Orange Juice) is booked to the product group “Juice”, and any water product to the product group “Water”. Characteristic relationships are defined in the planning area and thus hold for all planning levels. Depending on which characteristics are selected in the planning level, the system fills in missing characteristic values automatically (so-called “derivation”), creates all possible entries as a template for manual planning, or checks all new records created by the user.
Planning functions of type exit can be used to create new records, change existing records, or delete existing records. We will concentrate on planning functions in this workshop.
Purpose: Used for changing key figures (and potentially updating characteristic values).
Function exits contain a main function module (always executed when the planning function is started) and optionally an initialization function module.
The fields which you want to change through the exit must be declared in the fields to be changed.
Definition of fields to be changed: characteristics whose values you wish to change in the planning function .
Notes:
In a planning function type Exit you can change any key figure.
You can only change the characteristics that are in the fields to be changed
You can only use characteristic values that are in the selection of the package
All characteristics that are not in the fields to be changed are used to partition the data upon execution of the planning function (subsets – see later).
Performance: the more fields are changed, the slower the function becomes.
Before a planning function is executed the selected data is cut into smaller sets of records called subsets.
The planning function will be executed several times, once for each subset.
The data from the package is grouped by the characteristics that are contained in the planning level but not in the fields to be changed: within a subset all records have the same characteristic values for those characteristics. Characteristics that are not in the planning level can be ignored.
Reason for building subsets: facilitates the coding of planning functions and Fox formulas.
Note: You cannot rely on the sort order of the subsets! Do not implement coding that depends on the sequential order in a formula or exit function.
When an exit function is executed, the system retrieves the data defined by the planning package. The data is read either from the InfoCube or from the buffer, but the developer does not have to worry about this.
In the first phase, the system builds subsets of the selected data. The subset creation process depends on the "fields to be changed" setting in the definition of the exit function. How many subsets are created depends on the "fields to be changed" and the selected data. Note: subsets are created only for existing data!
Now the system calls the INIT function. Preliminary work like reading reference data or selecting from a database table should be implemented here. Optionally, you can return additional subsets. This way you can make sure that the main EXIT function is also called for subsets that do not contain data yet.
In the second phase, the main EXIT function is called once for each subset.
You create a planning function type exit by creating a function module (the exit module) that has a predefined interface (a template can be found in the function group UPFX).
The function module receives a data table. You can create, delete or change existing records.
The planning function is called with several subsets. You have to make sure that you only create records that are contained in the current subset.
In order to use a planning function type exit you have to create it in the planning framework (BPS0), specify the fields to be changed and enter the name of your function module.
The most important parameters of the interface are:
Area, level, method, package, parameter group
The selection of the current subset – ito_charsel
The actual data table (containing the existing data) - xth_data
Example: Multiply all values for the keyfigure 0AMOUNT by 2.
FIELD-SYMBOLS: <ls_data>  TYPE any,
               <l_amount> TYPE any.

LOOP AT xth_data ASSIGNING <ls_data>.
  ASSIGN COMPONENT 'S_KYFS-0AMOUNT' OF STRUCTURE <ls_data> TO <l_amount>.
  <l_amount> = <l_amount> * 2.
ENDLOOP.
There is a second way to access the data in the xth_data table. You can define the type of the field symbols before the loop. Thus the assigns are executed only ones instead of in each loop. This is faster if you are accessing many fields in the structure of xth_data:
DATA: lr_data TYPE REF TO data.
FIELD-SYMBOLS: <ls_data>   TYPE ANY,
               <0FISCYEAR> TYPE ANY,
               <0VERSION>  TYPE ANY.

CREATE DATA lr_data LIKE LINE OF xth_data.
ASSIGN lr_data->* TO <ls_data>.
[...]
ASSIGN: COMPONENT 'S_CHAS-0FISCYEAR' OF STRUCTURE <ls_data> TO <0FISCYEAR>,
        COMPONENT 'S_CHAS-0VERSION'  OF STRUCTURE <ls_data> TO <0VERSION>.
[...]
LOOP AT xth_data INTO <ls_data>.
  [..do something with <0VERSION> and <0FISCYEAR>..]
ENDLOOP.
Create simple planning function type Exit that deletes data
1. Copy template function module
2. Implement desired functionality.
3. Create a planning function
i. Locate planning level
ii. Add function module as planning function of type ‘Exit’
iii. Create a parameter group
4. Test planning function
Create the function module:
Call transaction se80. Navigate to function group UPFX.
Copy the function module “TEMPLATE_EXIT” to the function module “Z_PF_DELETE” in the function group “ZTRAINING”.
Open the new function module. Enter the coding “clear xth_data.” Save and activate the function module.
Create the planning function:
Enter the Planning Workbench (transaction BPS0).
Expand the planning area "Workshop PDESE1 (TRAINER)" (PDESE1) and double-click the planning level "Copy, vol*price Exit" (LEV14).
In the lower tree, right-click the name of the planning level and choose "Create Planning Function". Enter a name and a technical name, and choose the type "Exit Function" from the tree.
Enter the name of the function module (Z_PF_DELETE) into the field “Function module”.
Create a parameter group by using the context menu on the name of your exit function in the lower tree on the left. Enter a name and a technical name.
In order to view the existing data in the InfoCube double click on the “Ad-hoc” package, expand the node “Manual Planning” and double click on the layout “Enter volume” (LAY1).
In order to run the planning function double click on the parameter group. You will see that the data is deleted.
Save your changes in Customizing, but not the changes to the data:
In the menu choose "Planning" and then "Save model".
Leave the planning workbench by choosing "Cancel".
Problems: The planning function (the exit function module) is called several times, but preliminary work such as reading reference data or selecting from the database should only be done once.
If there are no records in the selection for a subset the planning function is not called with that subset. If there is no data at all in the selection then there are no subsets to be formed and the exit module is not called at all.
Solution: the "Init" module. The Init module is optional.
If a function module is entered as the init module, it is called exactly once when the planning function is executed. Any preliminary work, such as reading additional data from the database, should be done here.
The Init module can be used for "generating" subsets if they do not exist yet.
The Init module has NO access to the data table.
A template can be found in function group UPFX.
The Init module receives information about the environment it is called from such as the planning area, level, package, function, and parameter group.
New subsets can be generated by filling the table xth_chas. The underlying structure of the table consists of all characteristics in the InfoCube. It is the same structure used as “S_CHAS” in the table xth_data. If a subset should be generated then it is enough to fill in one characteristic combination that would be contained in the subset.
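As a sketch of how the Init module could request such a subset: the component name '0VERSION' and the target value 'P03' are assumptions taken from this workshop's scenario, and the structure is accessed dynamically because the line type of xth_chas is generated per planning area (check the typing in the UPFX template).

```abap
* Hedged sketch: request an additional subset for the (possibly empty)
* target version 'P03' from inside the Init module.
DATA: lr_chas TYPE REF TO data.
FIELD-SYMBOLS: <ls_chas>   TYPE ANY,
               <l_version> TYPE ANY.

* Create a work area with the line type of xth_chas dynamically
CREATE DATA lr_chas LIKE LINE OF xth_chas.
ASSIGN lr_chas->* TO <ls_chas>.

* One characteristic combination per subset is enough
ASSIGN COMPONENT '0VERSION' OF STRUCTURE <ls_chas> TO <l_version>.
IF sy-subrc = 0.
  <l_version> = 'P03'.
  INSERT <ls_chas> INTO TABLE xth_chas.
ENDIF.
```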
Parameter groups are used to make the definition of planning functions more flexible. In predefined planning functions the type of planning function defines which fields can be filled in the parameter group.
In exit functions, the user can define which parameters can be filled by creating "Exit Parameters":
Exit parameters are created in the screen of the planning function.
The types are defined by using an appropriate data dictionary data element.
The parameters are available in the parameter groups and can be filled there.
The exit (and the init) module receives a table of the created parameters and their values (it_exitp).
Example: In our exercise we had a copy function that copied from the actual version (A00) to a plan version (P03). The plan version is hard-coded in the ABAP code. To make the planning function more flexible, the target version could be specified in an exit parameter in the parameter group. In the code, the value of the exit parameter can be found in the table it_exitp and then used as the copy target.
Example 2: The standard copy function can either add to existing values or overwrite them. There is an example in the system of how this behavior can be realized in an exit using exit parameters:
Area PDESE1
Planning level: LEV14
Planning Function EXIT2: “Exit Copy with Parameters.”
Understand the prerequisites of planning functions in BW-BPS
Create a planning function type Exit in BW-BPS and use it
Planning Functions type Exit: Unit Summary
For internal use by CSC only
Exercises
Unit: Virtual Planning Functions Type Exit
At the conclusion of this exercise, you will be able to:
• Implement a planning function of type ‘Exit’ to copy your plan data
In this exercise you will create a function module that provides the functionality of the planning function. Subsequently, you will assign this function module to a planning level and test your planning function.
Optionally, you will create a second exit function that calculates revenues.
1 Create a simple planning function type Exit that copies data from the actual version “A00” to the plan version “P03”.
1-1 Create a new function group and copy the function module “TEMPLATE_EXIT“ to the function module “Z_BPS_COPY_xx“ where xx is the number of your group.
1-2 In your coding create the necessary field symbols, read the entry with the version “A00“, change the version and update the data table.
Note: Be aware that some records already exist for the plan version (the planning function should overwrite them).
1-3 How many records will you find in the table xth_data (keep in mind which fields will be chosen as fields to be changed and how the subsets will be formed)?
1-4 Enter the planning workbench (TCODE BPS0) and choose your planning area (“Workshop PDESE1 Group xx”/PDESE1xx)
1-5 Create a planning function type Exit using your function module. Use the planning level “vol*price” (LEV7).
1-6 Which characteristic(s) have to be chosen as “field(s) to be changed”?
1-7 Check that the selection contains all necessary data, i.e. the versions “A00” and “P03”.
1-8 Start the layout “Copy version” (LAY3) to check/change the existing data.
1-9 Run your planning function and check the results.
Solution for 1-3: You will find either one or two records in each subset. As you are only changing the version, this is the only characteristic that should be among the fields to be changed. The data is grouped by all other characteristics (subsets!). There is some data in the actual version for each combination of country, year, product, and product group in the selection, so you will see at least one record. A second record is contained if data for the plan version also exists.
Hint Find a solution for the exercise in planning area PDESE1, LEV14, EXIT3. The name of the function module is “PDESE1_COPY_EXIT”. The exact menu paths are described in the notes of the slide “Planning Functions Type Exit – Example”.
2 [Optional Exercise] Create a planning function type exit that calculates the revenue for each product from the price and the sold volume.
2-1 The prices are actual prices that are already stored in the system. As the price for a product is independent of country the prices are stored in country “#”. The prices (key figure Z_PRICE) are stored in version “A00”.
2-2 In the planning level LEV7 the sales volumes (key figure T_QUANT) are entered by the users in the version “P03”. The revenue (key figure ZBPS_REV) should also be stored in version “P03”.
2-3 Create your planning function in planning level LEV7. When deciding which characteristics should be in the fields to be changed remember that the prices and the volumes must be contained in the same subset.
Note: The prices and the volumes must be in the same subset, and the records differ only in the characteristics version and country. Thus, these two characteristics must be among the fields to be changed. All other characteristics need not be chosen as fields to be changed.
2-4 Run your planning function and check the results. Use the "Check calculation" layout of the Manual Planning to check your results (it includes revenue as a column).
Hint Find a solution for the exercise in planning area PDESE1, LEV14, EXIT3. The name of the function module is “PDESE1_PRICE_CALC”. The sequence of steps is the same as for exercise 1. The exact menu paths are described in the notes of the slide “Planning Functions Type Exit – Example”.
You want to use a simple, independent program in a process chain, or a program scheduled by a user or by another program in the background -> use the process type "ABAP Program".
You write your own application log to be displayed in the Process tab strip for the logs, you have implemented your own monitor, you have implemented a Customizing interface that goes beyond the parameters of a program, or you want to access preceding chain processes in your program -> implement your own process type.
Maintain process chains in RSPC
Maintain process types under Settings -> Maintain Process Types in RSPC, and create the class under Tools -> ABAP Workbench -> Development -> Class Builder.
Interfaces: IF_RSPC_*
• EXECUTE: execution of the process
• GET_INFO: pass information to successors
• GET_LOG: return messages
• MAINTAIN: maintenance of variants
• GET_VARIANT: F4 help for variants
• GET_DEFAULT_CHAIN: return a default chain for a process
• CHECK: check consistency
• GET_STATUS: return the status of an instance
• CALL_MONITOR: call your own monitoring tool
• TRANSPORT: return the TLOGO entry
• CONTEXT_MENU: enhance the context menu
There are several services in package RSPC that help you implement a process type:
• Generic variant storage class: CL_RSPC_VARIANT
• Generic dynpro service for variant maintenance: function module RSPC_VARIANT_MAINTAIN
• Generic instance storage class: CL_RSPC_INSTANCE
• Wrapper class for the application log, for unambiguous assignment of logs to an instance: CL_RSPC_APPL_LOG
• Service function group: RSPC_SERVICES
Switch the data source to BW:

REPORT Z_SWITCH_BATCH_TO_TRANS.

* Close open request and switch data source
CALL FUNCTION 'RSAPO_SWITCH_BATCH_TO_TRANS'
  EXPORTING
    I_INFOCUBE = 'T_ABAP01'
  EXCEPTIONS
    CUBE_NOT_TRANSACTIONAL = 1
    INHERITED_ERROR        = 2
    OTHERS                 = 3.
IF SY-SUBRC <> 0.
* MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
*   WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
ENDIF.
REPORT Z_SWITCH_TRANS_TO_BATCH.

* Switch the data source back
CALL FUNCTION 'RSAPO_SWITCH_TRANS_TO_BATCH'
  EXPORTING
    I_INFOCUBE = 'T_ABAP01'.
Unit: Process Chains – customer defined process type
At the conclusion of this exercise, you will be able to:
• Implement your own process type
• Implement an ABAP OO class, which contains the functionality of your own process type
• Make the settings for a new process type
• Include the new process type in a process chain
A file is posted from a subsidiary to a specified folder on your application server. You know that it will be posted on a certain day, but you do not know the time it will arrive. Your customer-defined process type looks for the file itself and, if the file is found, returns a 'Green' status. If the file is not found, it returns a 'Red' status. The filename and path are hard-coded.
Note: The source code for the process type can also be found as attachment of a mail in the SAP Office (TCODE SO04 Shared folders Training ABAP Backend Workshop Process Chains)
1-1 Implement the class of the process type
1-1-1 Create a new class named ZCL_PC_EX_<group> (TCODE SE24)
1-1-2 Implement the following methods of the interface IF_RSPC_MAINTAIN
1-1-2-1 MAINTAIN: Copy and paste the code below. This method is quite empty because we do not have persistence in this exercise, but a process variant still has to be determined, so a trivial fixed name is returned. Save it.
IF i_variant IS INITIAL. " Create a new one
* ---- No maintenance, thus hardcoded variant name ----
  e_variant = 'HARDCODED'.
  e_variant_text = 'Filename hardcoded'.
ELSE. " Maintain an existing one
* ---- No maintenance, thus according message ----
  MESSAGE i023(rspc).
ENDIF.
1-1-2-2 GET_HEADER: Copy and paste the code below. This method is quite empty because we do not have persistence in this exercise. Save it.
e_variant_text = 'Filename hardcoded'.
1-1-3 Implement the following methods of the interface IF_RSPC_EXECUTE
1-1-3-1 GIVE_CHAIN: Copy and paste the code below. Save it.
* ---- We do not need to know about the chain during runtime, ----
* ---- our method works by itself                             ----
return = ' '.
1-1-3-2 EXECUTE: Copy and paste the code below. This method performs the check for the existence of the file at runtime. Activate it. Select everything in the following popup and press OK.
* ---- These are the parameters of your process ----
DATA:
* ---- If necessary, change the line below and enter the correct
* ---- path and filename
  p_file TYPE char90 VALUE 'N:\T_COSTCENTER_TRANS_2004002.CSV'.
DATA: l_status    TYPE btcxpgstat,
      l_unique_id TYPE sysuuid_25.

* ==== Get instance ====
CALL FUNCTION 'RSSM_UNIQUE_ID'
  IMPORTING
    e_uni_idc25 = l_unique_id.
e_instance = l_unique_id.
e_state = 'G'.

* ==== Go for it ====
DATA: l_file_exists TYPE c.
CALL METHOD cl_gui_frontend_services=>file_exist
  EXPORTING
    file   = p_file
  RECEIVING
    result = l_file_exists.
IF l_file_exists IS INITIAL.
  e_state = 'R'.
ELSE.
  e_state = 'G'.
ENDIF.
1-2 Your process type is finished. You now need to declare it for the process chain framework. Go to transaction RSPC.
1-2-1 Create a chain named ZPC_EX_TYPE_<group> with a start process for immediate start.
1-2-2 Choose from the menu “Settings” “Process Types”. Choose “New Entries”.
• Enter a process type name ZPCTYPE<group>.
• Enter meaningful long and short descriptions (include your group number in the description).
• As ObjectTypeName enter the name of your class (ZCL_PC_EX_<group>). Choose “Abap OO” as the object type.
• As Possible Events choose “Process ends "successful" or "incorrect"”.
• Process is “Repeatable” but not “Repairable”.
• As ID choose a cool icon with F4. Select “internal name”.
• As process category choose ‘98’.
• Save and leave.
1-2-3 Your process type is ready to be inserted in the chain: Choose the tree “Process Types”. Find your process type under node “Others”. Add it to the process chain.
1-2-4 Connect your process type with the trigger process.
1-2-5 Add any process type after your new process type (e.g. the InfoPackage you created in Chapter 3 for the Data Staging).
1-2-6 Now you can test your solution: Run the chain. Your process should turn green if the file is there, and red if it is not.
RSDBC_SQL_STATEMENT: DB Connect: BAdIs for the DB SQL statements
RSR_OLAP_AUTH_GEN: Generating authority: creating users (new in release 3.10)
RSR_OLAP_BADI: Virtual characteristics and key figures in reporting
RSU5_SAPI_BADI: BW Service APIs (in BW and in R/3 as of 4.6, respectively)
SAP Enhancements and BAdIs (1)
The list of the SAP enhancements and BAdIs is based on the release 3.0B.
The complete transition from SAP enhancements to BAdIs will occur in the next releases. Please check your system for the existence of SAP enhancements or BAdIs, respectively (all documented SAP enhancements and BAdIs will be included in the implementation guide [IMG]).
RSAR_CONNECTOR: Formula Builder - customer-defined functions
BW_SCHEDULER: BAdI for subsequent processing
RSOD_DOC_BADI: BAdI for documents
SAP Enhancements and BAdIs (2)
RSOD_ITEM_DOC: BAdI for the Web Item "Single Document"
RSOD_ITEM_DOC_LIST: BAdI for the Web Item "List of Documents"
RSOD_WWW_DOC_MAINT: BAdI for the maintenance of text documents on the web
SAP Enhancements and BAdIs (3)
RSR00001 (see Note 492445): enhancement for reporting variables
RSR00002: exists in parallel with RSR_OLAP_BADI
RSR00004: automatically migrated to BAdI SMOD_RSR00004 (Report-Report Interface)
RSSBR001: obsolete (deleted with release 3.0B SP8)
SAP Enhancements and BAdIs (4)
The statement MOVE-CORRESPONDING source_struc TO target_struc copies the contents of the structure source_struc into the structure target_struc component by component. Values are assigned only between components with identical names.
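A minimal sketch of this behavior (the structure and field names are invented for illustration):

```abap
DATA: BEGIN OF source_struc,
        carrid TYPE c LENGTH 3,
        price  TYPE i,
      END OF source_struc,
      BEGIN OF target_struc,
        price  TYPE i,
        note   TYPE c LENGTH 10,
      END OF target_struc.

source_struc-carrid = 'LH'.
source_struc-price  = 100.

* Only PRICE exists in both structures, so only PRICE is copied;
* NOTE keeps its previous value, CARRID is ignored.
MOVE-CORRESPONDING source_struc TO target_struc.
```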
Overwrite : Spaces are overwritten by characters from the second character string
Concatenate several character strings
Split a character string
[Slide graphic: examples of the string statements MOVE, REMOVE, OVERWRITE, CONCATENATE, SPLIT, FIND (search in a character string), and REPLACE (first and all occurrences), each applied to the character string 'ABAP'. A successful search sets sy-subrc = 0; the position of the found string is returned using the MATCH OFFSET off addition.]
Note for FIND statement (search in a character string): There are special comparison operators for strings, which you can use in logical expressions in a query (IF) to search more flexibly for character sequences in a character string. For more information, see the keyword documentation for IF.
For every statement, the operands are treated like type c fields, regardless of their actual field type. No internal type conversions take place.
All of the statements apart from TRANSLATE and CONDENSE set the system field sy-subrc. (SEARCH also sets the system field sy-fdpos with the offset of the character string found.)
All of the statements apart from SEARCH are case-sensitive.
To find out the occupied length of a string, use the standard function STRLEN().
For the SPLIT statement there is the variant SPLIT ... INTO TABLE <itab>, which you can use to split the character string dynamically. You do not need to specify the number of parts into which the string should be split.
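A short sketch of this dynamic variant (the string value and separator are invented for illustration):

```abap
DATA: l_string TYPE string VALUE 'BW,BPS,ABAP',
      lt_parts TYPE STANDARD TABLE OF string.

* The number of resulting parts does not have to be known in advance;
* each part becomes one line of the internal table.
SPLIT l_string AT ',' INTO TABLE lt_parts.
* lt_parts now holds the three lines 'BW', 'BPS' and 'ABAP'
```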
The following single record operations are available for internal tables. In each case wa represents a structure that must have the same type as the line type of the internal table itab.
APPEND: Appends the contents of a structure to an internal table. This operation should be used with standard tables only.
INSERT: Inserts the contents of a structure into an internal table. In a standard table it is appended, in a sorted table it is inserted in the right place, and in a hashed table it is inserted according to the hash algorithm.
READ: Copies the contents of a line in an internal table to a structure.
MODIFY: Overwrites a line in an internal table with the contents of a structure.
DELETE: Deletes a line of an internal table.
COLLECT: Accumulates the contents of a structure into an internal table. This statement may be used only for tables whose non-key fields are all numeric. The numeric values are summed for identical keys.
For detailed information about the ABAP statements described here, refer to the relevant keyworddocumentation.
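The summation behavior of COLLECT can be sketched as follows (the type and values are invented for illustration):

```abap
TYPES: BEGIN OF ty_sales,
         country TYPE c LENGTH 2,   " character-like field acts as key
         amount  TYPE i,            " numeric field is accumulated
       END OF ty_sales.

DATA: lt_sales TYPE STANDARD TABLE OF ty_sales,
      ls_sales TYPE ty_sales.

ls_sales-country = 'DE'.
ls_sales-amount  = 10.
COLLECT ls_sales INTO lt_sales.

ls_sales-country = 'DE'.
ls_sales-amount  = 5.
COLLECT ls_sales INTO lt_sales.
* lt_sales contains a single line for 'DE' with amount 15
```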
[Slide graphic: set operations on internal tables - LOOP ... ENDLOOP, DELETE, INSERT LINES OF itab2 ... TO itab1 (inserting several lines from another internal table), and APPEND (appending several lines from another internal table).]
The following set operations are available for internal tables. In each case wa represents a structure that must have the same type as the line type of the internal table itab.
LOOP ... ENDLOOP: The LOOP places the lines of an internal table one by one into the structure specified in the INTO clause. All single record operations can be executed within the loop. In this case, for the single record operations, the system identifies the line to be processed.
DELETE: Deletes the lines of the internal table that satisfy the condition <condition>.
INSERT: Copies the contents of several lines of an internal table to another internal table.
APPEND: Appends the contents of several lines of an internal table to another standard table.
For detailed information about the ABAP statements described here, refer to the relevant keyword documentation.
The following operations affect the whole internal table.
SORT: You can use this to sort any number of columns in a standard or hashed table in ascending or descending order. You may want to take culture-specific sort rules into account.
REFRESH: This deletes the entire contents of an internal table. A part of the previously used working memory remains available for future insertions.
FREE: This deletes the entire contents of the internal table and releases the previously used working memory.
CLEAR: On internal tables with no header line this statement, unlike its effect on all other data objects, has the same effect as the REFRESH statement.
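The whole-table operations can be sketched as follows (the table and the sort fields are invented for illustration; SFLIGHT is used as a convenient line type):

```abap
DATA: lt_flights TYPE STANDARD TABLE OF sflight.

* Sort by several columns, mixing sort directions
SORT lt_flights BY carrid ASCENDING fldate DESCENDING.

* Delete all lines but keep part of the memory for later insertions
REFRESH lt_flights.

* Delete all lines and release the previously used memory
FREE lt_flights.
```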
In ABAP you have two ways to execute different sequences of statements, depending on certain conditions.
With the CASE-ENDCASE construction the criterion for the execution of a statement block is the equality of data objects. If no comparison is successful, the system executes the OTHERS branch, if it is available. Except for the first WHEN branch, all further additions are optional.
With the IF-ENDIF construction, you can use any logical expressions. If the condition is met, the system executes the relevant statement sequence. If no condition is met, the system executes the ELSE branch, if it is available. Except for the first query, all further branches are optional.
For both constructions, the system executes only one statement sequence, and always for the first valid case.
Recommendation: If, in every condition, you check that a variable is equal to a given value, use the CASE-ENDCASE construction. It is clearer and less runtime-intensive.
Outside of loops you can also use CHECK instead of IF. This query makes the execution of all statements up to the end of the current processing block dependent on one condition. If the condition is not met, the system continues with the first statement of the next processing block.
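The recommendation above can be illustrated as follows (the status variable and its values are invented for illustration):

```abap
DATA: l_status TYPE c LENGTH 1 VALUE 'G'.

* Equality checks against one variable: CASE is clearer and faster
CASE l_status.
  WHEN 'G'.
    " ... success handling ...
  WHEN 'R'.
    " ... error handling ...
  WHEN OTHERS.
    " ... fallback ...
ENDCASE.

* Arbitrary logical expressions require IF
IF l_status = 'G' AND sy-subrc = 0.
  " ...
ENDIF.
```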
In ABAP there are four loop constructions, whereby LOOP-ENDLOOP and SELECT-ENDSELECT represent special cases. In the DO and WHILE loops, the system stores the number of the current loop pass in the sy-index field. If these loops are nested, sy-index contains the number of the current (that is, inner) loop.
Unconditional/index-controlled loops: The statements between DO and ENDDO are executed until the loop is left by other statements. You also have the option of specifying the maximum number of loop passes; otherwise, you may get an endless loop.
Header-controlled loops: The statements between WHILE and ENDWHILE are executed only as long as the condition <logical_expression> is met.
You can use the statements CHECK and EXIT for different effects on the way the loop is processed. For example, you can construct a footer-controlled loop.
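A sketch of the two basic loop kinds; note how sy-index counts the passes (the variable is invented for illustration):

```abap
DATA: l_total TYPE i VALUE 0.

* Index-controlled loop with an explicit maximum number of passes
DO 3 TIMES.
  l_total = l_total + sy-index.  " sy-index is 1, 2, 3
ENDDO.
* l_total is now 6

* Header-controlled loop: the condition is checked before each pass
WHILE l_total > 0.
  l_total = l_total - 2.
ENDWHILE.
```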
A function group represents the main program for function modules. Several function modules that operate on the same data content are combined to form a function group.
The function group remains active for as long as the calling program is active. For example, if an executable program calls a function module, its entire function group is loaded as well. It remains active until the executable program is completed.
A function group can contain the same components as an executable program. These include:
Data objects: These are global in relation to the function group, that is, they are visible to and changeable by all function modules within the group. The validity period is the same as for the function group.
Subroutines: These can be called from all function modules in the group.
Screens: These can be called from all function modules in the group.
The global data of a function module is retained until the program that contained the first call of a function module in the function group is finished.
Thus, if a function module that writes values to the global data is called, other function modules in the same function group can access this data when the program calls them.
Seen from the outside, the global data is encapsulated, that is, it is not possible to access it directly. Therefore, you must have function modules that allow orderly access from the outside.
The same applies to all the other components of the function group (screens, subroutines).
WHEN 'CANCEL'.
  CALL FUNCTION 'POPUP_TO_CONFIRM_LOSS_OF_DATA'
    EXPORTING
      textline = text-001
      title    = text-002
    IMPORTING
      answer   = pop_answer.
  CASE pop_answer.
    WHEN 'J'.
      ...
    WHEN 'N'.
      ...
  ENDCASE.
ENDCASE.
You call function modules using the ABAP statement CALL FUNCTION. The name of the function module follows in capital letters, enclosed in single quotation marks.
After EXPORTING, the system passes values to the import parameters of the function module.
After IMPORTING, the function module returns the result through its export parameters.
Looking at this from the calling program's side, we see that the parameters passed to the function module are exported, and those passed from the function module to the program are imported.
On the parameter assignment screen, the system displays the names of the interface parameters of the function module (formal parameters) to the left of the equal sign. The system displays the calling program's data objects (actual parameters) to the right of the equal sign.
With most function modules, exceptions are triggered in error situations. In such cases, the exceptions are assigned to the number values after EXCEPTIONS.
If the function module triggered such an exception, the respective value is placed into the system field sy-subrc.
If you do not want to list the triggered exception under EXCEPTIONS, or cannot do so, the exception is assigned to the OTHERS case, provided this addition is listed.
By evaluating this field, you can then react accordingly. In the example shown in the graphic, suitable messages are issued.
If no exception was triggered, or neither the exception nor OTHERS was listed under EXCEPTIONS, the system sets the sy-subrc field to 0.
The structure of the application components is shown in the application hierarchy. From the SAP Easy Access menu, you access the application hierarchy by choosing Tools → ABAP Workbench → Overview → Application Hierarchy. The application components are displayed in a tree structure in the application hierarchy. Expanding a component displays all the development classes that are assigned to that component.
You can select a subtree or branch and navigate from the application hierarchy to the R/3 Repository Information System. The system then collects all development classes for the selected branch and passes them to the Repository Information System.
You can use the Repository Information System to search for specific Repository objects. Suitable search criteria are available for the various Repository objects.
Which objects can be found using the Repository Information System depends on how you get there:
From within the application hierarchy, select the Information System (double-click the selected application component or development class). This filters the respective Repository objects.
From the SAP Easy Access menu, choose Tools → ABAP Workbench → Overview → Information System. The system lists all the Repository objects available for searching.
When you go to the definition of a database table in the ABAP Dictionary, you will see informationon all the technical attributes of the database table.
The following information is useful for improving the performance of database accesses:
Key fields – If the lines requested from the database are retrieved according to key fields, the Database Optimizer can perform the access using a primary index.
Secondary index – If the lines requested from the database are retrieved according to other fields, the Database Optimizer can perform the access using a secondary index. Secondary indexes are displayed in a dialog box whenever you choose Indexes. You choose an index from the dialog box by double-clicking it. The system then displays a screen with additional information about that index.
Performance and comments by table kind:
- STANDARD: search time increases linearly with the number of table entries. If you use only non-index access, sorting is wasted time -> use a standard table.
- SORTED: search time increases only logarithmically with the number of table entries. Index access is possible.
- HASHED: search time is independent of the number of table entries. The key needs to be fully qualified; otherwise access is linear.
READ TABLE itab INTO wa WITH KEY key.

Access type vs. table kind:
- By table key (WITH TABLE KEY; must be completely qualified): STANDARD: table scan; SORTED: binary search; HASHED: hash algorithm.
- By qualified key (WITH KEY; complete/part key, left-aligned without gaps): STANDARD: table scan; SORTED: binary search; HASHED: table scan.
- By any component condition: STANDARD, SORTED, and HASHED: table scan.
Whenever you want to read individual table lines by declaring a complete key, use the READ TABLE ... WITH TABLE KEY statement (the fastest single record access by key). The runtime system supports this syntax variant especially for SORTED and HASHED tables. If the table is a STANDARD table, the runtime system performs a table scan. The same applies if you have copied the values of all key fields of the entry to be read into the work area wa and then use READ TABLE itab FROM wa.
The runtime system carries out the syntax variant READ TABLE ... WITH KEY (read an entry after applying any condition) using a table scan. The only exception to this rule applies to SORTED tables, if you fill the first n key fields with "=" (no gaps), where n <= number of key fields. With standard tables, however, you can also sort correspondingly using SORT and then use the BINARY SEARCH addition.
Summary: Whenever possible, use READ TABLE ... WITH TABLE KEY or the variant with a correspondingly filled work area. If you need to use READ TABLE ... WITH KEY, make your internal table a SORTED table.
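A sketch of the recommended access (the line type, key, and values are invented for illustration):

```abap
TYPES: BEGIN OF ty_conn,
         carrid TYPE c LENGTH 3,
         connid TYPE n LENGTH 4,
         cityto TYPE c LENGTH 20,
       END OF ty_conn.

DATA: lt_conn TYPE HASHED TABLE OF ty_conn
              WITH UNIQUE KEY carrid connid,
      ls_conn TYPE ty_conn.

* Fastest single record access: complete key -> hash algorithm
READ TABLE lt_conn INTO ls_conn
     WITH TABLE KEY carrid = 'LH' connid = '0400'.
```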
LOOP AT itab ... WHERE log_expr - performance by table kind:
- First n key fields filled with "=" without gaps: STANDARD: table scan; SORTED: binary search for the starting point, then loop only through the group; HASHED: table scan.
- Any logical expression for columns: STANDARD, SORTED, and HASHED: table scan.
The runtime system generally processes loops with a WHERE clause by performing a table scan - that is, it determines for each line in the table whether the condition in the WHERE clause is true.
SORTED tables are the only exception to this rule. For these, the runtime system optimizes the runtime under the following condition: In the WHERE clause, the first n key fields are filled with "=" (no gaps), where n is less than or equal to the number of all key fields. As a result, the loop is only performed on the lines that match the condition in the WHERE clause. Since the table is sorted, the first matching line can be determined at runtime using a binary search, which optimizes performance.
Instead of READ TABLE ... INTO, you can use the READ TABLE ... ASSIGNING variant. This offers better runtime performance for pure read accesses with a line width of 1000 bytes or more. If you then change the read line using MODIFY, READ ... ASSIGNING already improves the runtime at a line width of 100 bytes. The same applies to LOOP ... INTO compared with LOOP ... ASSIGNING: the LOOP ... ASSIGNING variant offers better runtime performance for any loop with five or more loop passes.
Both field-symbol variants are much faster than the work-area variants, in particular when you use nested internal tables. This is because, if you use work areas instead, the whole inner internal table is copied (unless you prevent this with a TRANSPORTING addition).
Always assign a type to field symbols if you know their static type (again, for performance reasons). Note: If you use READ TABLE ... ASSIGNING, the field symbol points to the originally assigned table line, even after the internal table has been sorted. Note also that when using field symbols, you cannot change key fields of SORTED or HASHED tables; trying to do so causes a runtime error. The following restrictions apply to LOOP ... ASSIGNING <fs>:
You cannot use the SUM statement in control level processing.
You cannot reassign the field symbol within the loop. The statements ASSIGN dobj TO <fs> and UNASSIGN <fs> will cause runtime errors.
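The points above can be illustrated with a minimal sketch; the structure and table are hypothetical, not from the original text:

```abap
* Sketch: typed field symbol and in-place change via LOOP ... ASSIGNING
* (illustrative structure and table).
TYPES: BEGIN OF ty_line,
         id    TYPE i,
         count TYPE i,
       END OF ty_line.
DATA itab TYPE SORTED TABLE OF ty_line WITH UNIQUE KEY id.
FIELD-SYMBOLS <line> TYPE ty_line.     "typed for performance

LOOP AT itab ASSIGNING <line>.
  <line>-count = <line>-count + 1.     "changes the table line directly,
                                       "no MODIFY statement needed
* <line>-id = 99.                      "id is a key field of the SORTED
*                                      "table: this would cause a
*                                      "runtime error
ENDLOOP.
```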
FUNCTION RSAX_BIW_GET_DATA_SIMPLE.
*"----------------------------------------------------------------------
*"*"Local interface:
*"  IMPORTING
*"     VALUE(I_REQUNR) TYPE SRSC_S_IF_SIMPLE-REQUNR
*"     VALUE(I_DSOURCE) TYPE SRSC_S_IF_SIMPLE-DSOURCE OPTIONAL
*"     VALUE(I_MAXSIZE) TYPE SRSC_S_IF_SIMPLE-MAXSIZE OPTIONAL
*"     VALUE(I_INITFLAG) TYPE SRSC_S_IF_SIMPLE-INITFLAG OPTIONAL
*"     VALUE(I_READ_ONLY) TYPE SRSC_S_IF_SIMPLE-READONLY OPTIONAL
*"  TABLES
*"      I_T_SELECT TYPE SRSC_S_IF_SIMPLE-T_SELECT OPTIONAL
*"      I_T_FIELDS TYPE SRSC_S_IF_SIMPLE-T_FIELDS OPTIONAL
*"      E_T_DATA STRUCTURE SFLIGHT OPTIONAL
*"  EXCEPTIONS
*"      NO_MORE_DATA
*"      ERROR_PASSED_TO_MESS_HANDLER
*"----------------------------------------------------------------------

* Example: DataSource for table SFLIGHT
  TABLES: SFLIGHT.

* Auxiliary selection criteria structure
  DATA: L_S_SELECT TYPE SRSC_S_SELECT.

* Maximum number of lines for DB table
  STATICS: S_S_IF TYPE SRSC_S_IF_SIMPLE,
* counter
           S_COUNTER_DATAPAKID LIKE SY-TABIX,
* cursor
           S_CURSOR TYPE CURSOR.

* Select ranges
  RANGES: L_R_CARRID FOR SFLIGHT-CARRID,
          L_R_CONNID FOR SFLIGHT-CONNID.

* Initialization mode (first call by SAPI) or data transfer mode
* (following calls)?
  IF I_INITFLAG = SBIWA_C_FLAG_ON.

************************************************************************
* Initialization: check input parameters
*                 buffer input parameters
*                 prepare data selection
************************************************************************
* Check DataSource validity
    CASE I_DSOURCE.
      WHEN '0SAPI_SFLIGHT_SIMPLE'.
      WHEN OTHERS.
        IF 1 = 2. MESSAGE E009(R3). ENDIF.
* this is a typical log call; please write every error message like this
        LOG_WRITE 'E'                    "message type
                  'R3'                   "message class
                  '009'                  "message number
                  I_DSOURCE              "message variable 1
                  ' '.                   "message variable 2
        RAISE ERROR_PASSED_TO_MESS_HANDLER.
    ENDCASE.

    APPEND LINES OF I_T_SELECT TO S_S_IF-T_SELECT.

* Fill parameter buffer for data extraction calls
    S_S_IF-REQUNR  = I_REQUNR.
    S_S_IF-DSOURCE = I_DSOURCE.
    S_S_IF-MAXSIZE = I_MAXSIZE.

* Fill field list table for an optimized select statement
* (in case there is no 1:1 relation between InfoSource fields
* and database table fields, this may be far from being trivial)
    APPEND LINES OF I_T_FIELDS TO S_S_IF-T_FIELDS.

  ELSE.                 "Initialization mode or data extraction ?

************************************************************************
* Data transfer: First call      OPEN CURSOR + FETCH
*                Following calls FETCH only
************************************************************************

* First data package -> OPEN CURSOR
    IF S_COUNTER_DATAPAKID = 0.

* Fill range tables. BW will only pass down simple selection criteria
* of the type SIGN = 'I' and OPTION = 'EQ' or OPTION = 'BT'.
      LOOP AT S_S_IF-T_SELECT INTO L_S_SELECT WHERE FIELDNM = 'CARRID'.
        MOVE-CORRESPONDING L_S_SELECT TO L_R_CARRID.
        APPEND L_R_CARRID.
      ENDLOOP.

      LOOP AT S_S_IF-T_SELECT INTO L_S_SELECT WHERE FIELDNM = 'CONNID'.
        MOVE-CORRESPONDING L_S_SELECT TO L_R_CONNID.
        APPEND L_R_CONNID.
      ENDLOOP.

* Determine number of database records to be read per FETCH statement
* from input parameter I_MAXSIZE. If there is a one-to-one relation
* between DataSource table lines and database entries, this is trivial.
* In other cases, it may be impossible and some estimated value has to
* be determined.
      OPEN CURSOR WITH HOLD S_CURSOR FOR
        SELECT (S_S_IF-T_FIELDS) FROM SFLIGHT
                                 WHERE CARRID IN L_R_CARRID
                                   AND CONNID IN L_R_CONNID.
    ENDIF.              "First data package ?

* Fetch records into interface table,
* named E_T_'Name of extract structure'.
    FETCH NEXT CURSOR S_CURSOR
               APPENDING CORRESPONDING FIELDS
               OF TABLE E_T_DATA
               PACKAGE SIZE S_S_IF-MAXSIZE.
    IF SY-SUBRC <> 0.
      CLOSE CURSOR S_CURSOR.
      RAISE NO_MORE_DATA.
    ENDIF.

    S_COUNTER_DATAPAKID = S_COUNTER_DATAPAKID + 1.

  ENDIF.                "Initialization mode or data extraction ?

ENDFUNCTION.
Customer Function Call for Supplying Transaction Data
Functionality
This function module allows you to fill user-defined fields that you have attached to existing InfoSources as an append structure. You can find further information in the documentation under User-defined enhancement to the extract structure. The following transfer parameters are available:

I_ISOURCE: Name of the InfoSource.
I_T_FIELDS: List of the transfer structure fields. Only these fields are actually filled in the data table and can be sensibly addressed in the program.
C_T_DATA: Table with the data received from the API, in the format of the source structure entered in table ROIS (field ROIS-STRUCTURE).
I_UPDMODE: Transfer mode as requested in the Scheduler of the Business Information Warehouse. Not normally required.
I_T_SELECT: Table with the selection criteria stored in the Scheduler of the SAP Business Information Warehouse. This is not normally required.

FUNCTION EXIT_SAPLRSAP_001.
*"----------------------------------------------------------------------
*"*"Local interface:
*"  IMPORTING
*"     VALUE(I_DATASOURCE) TYPE RSAOT_OLTPSOURCE
*"     VALUE(I_ISOURCE) TYPE SBIWA_S_INTERFACE-ISOURCE
*"     VALUE(I_UPDMODE) TYPE SBIWA_S_INTERFACE-UPDMODE
*"  TABLES
*"      I_T_SELECT TYPE SBIWA_T_SELECT
*"      I_T_FIELDS TYPE SBIWA_T_FIELDS
*"      C_T_DATA
*"      C_T_MESSAGES STRUCTURE BALMI OPTIONAL
*"  EXCEPTIONS
*"      RSAP_CUSTOMER_EXIT_ERROR
*"----------------------------------------------------------------------
This function module allows you to fill customer-defined fields that you have attached as an append structure to master data structures of the SAP Business Information Warehouse. Transfer parameters:

I_CHABASNM: Name of the basic characteristic.
I_T_FIELDS: List of the transfer structure fields. Only these fields are actually filled in the database table and can be used in any meaningful way in the program.
I_T_DATA: Table with the data received from the API, in the format of the source structure entered in the RODCHABAS table (field RODCHABAS-STRUCTURE).
I_UPDMODE: Transfer mode, as requested in the scheduler of the Business Information Warehouse. This is not usually required.
I_T_SELECT: Table with the selection conditions that are stored in the scheduler of the Business Information Warehouse. This is not usually required.
Further Information
Documentation of the Business Information Warehouse under Customer-Defined Enhancement of the Extract Structure

FUNCTION EXIT_SAPLRSAP_002.
*"----------------------------------------------------------------------
*"*"Local interface:
*"  IMPORTING
*"     VALUE(I_DATASOURCE) TYPE RSAOT_OLTPSOURCE
*"     VALUE(I_CHABASNM) TYPE SBIWA_S_INTERFACE-CHABASNM
*"     VALUE(I_UPDMODE) TYPE SBIWA_S_INTERFACE-UPDMODE
*"  TABLES
*"      I_T_SELECT TYPE SBIWA_T_SELECT
*"      I_T_FIELDS TYPE SBIWA_T_FIELDS
*"      I_T_DATA
*"      C_T_MESSAGES STRUCTURE BALMI OPTIONAL
*"  EXCEPTIONS
*"      RSAP_CUSTOMER_EXIT_ERROR
*"----------------------------------------------------------------------
This function module allows you to change the contents of the transfer tables generated for a hierarchy request. This can be useful, for example, with customer-defined hierarchy classes. The module should not be used to change the contents of key fields; that should be done in the SAP Business Information Warehouse instead. The following transfer parameters are available:

C_T_HIETEXT: Table with the description of the hierarchy in the requested language.
C_T_HIENODE: Table with all components of the hierarchy.
C_T_FOLDERT: Table with the descriptions, in the requested languages, of all nodes of the hierarchy that cannot be posted to.
C_T_HIEINTV: Table with those hierarchy leaves that represent value intervals.
I_UPDMODE: Transfer mode as requested in the Scheduler of the SAP Business Information Warehouse. This is not normally required.
I_S_HIER_SEL: Structure that contains the requested hierarchy.
I_T_LANGU: Table with the languages for which the descriptions have been requested.
Further information
SAP Business Information Warehouse documentation
FUNCTION EXIT_SAPLRSAP_004.
*"----------------------------------------------------------------------
*"*"Local interface:
*"  IMPORTING
*"     VALUE(I_DATASOURCE) TYPE RSAOT_OLTPSOURCE
*"     VALUE(I_S_HIEBAS) TYPE RSAP_S_HIEBAS
*"     VALUE(I_S_HIEFLAG) TYPE RSAP_S_HIEFLAG
*"     VALUE(I_S_HIER_SEL) TYPE RSAP_S_HIER_LIST
*"     VALUE(I_S_HEADER3) OPTIONAL
*"  TABLES
*"      I_T_LANGU TYPE SBIWA_T_LANGU
*"      C_T_HIETEXT TYPE RSAP_T_HIETEXT
*"      C_T_HIENODE TYPE RSAP_T_HIENODE
*"      C_T_FOLDERT TYPE RSAP_T_FOLDERT
*"      C_T_HIEINTV TYPE RSAP_T_HIEINTV
*"      C_T_HIENODE3 OPTIONAL
*"      C_T_HIEINTV3 OPTIONAL
*"      C_T_MESSAGES STRUCTURE BALMI OPTIONAL
*"  EXCEPTIONS
*"      RSAP_CUSTOMER_EXIT_ERROR
*"----------------------------------------------------------------------
Appendix to chapter DataStaging: extensive start routine example (the coding was implemented on BW Version 2.x; there are some marginal differences in the interface)
================================================================

PROGRAM UPDATE_ROUTINE.

*$*$ begin of global - insert your declaration only below this line *-*
TABLES: /BIC/AZLMS200.

data: it_/BIC/PZCUSTNR type ZV_PZCUSTNR occurs 0 with header line,
      l_/BIC/PZCUSTNR type ZV_PZCUSTNR,
      it_/BIC/PZLARTVAR type ZV_PZLARTVAR occurs 0 with header line,
      l_/BIC/PZLARTVAR type ZV_PZLARTVAR,
      it_ods type /BIC/AZLMS200 occurs 0 with header line,
      l_ods type /BIC/AZLMS200,
      it_data_pack type /BIC/CS8ZSSLSTAT occurs 0 with header line,
      l_data_pack type /BIC/CS8ZSSLSTAT,
      l_loccustcat like /BIC/AZLMS200-/BIC/ZLOCCUCAT,
      l_no_records_input type i,
      l_tabix like syst-tabix,
      l_rc like syst-subrc value '0'.
*$*$ end of global - insert your declaration only before this line *-*

FORM startup
  TABLES   MONITOR STRUCTURE RSMONITOR   "user defined monitoring
           DATA_PACKAGE STRUCTURE /BIC/CS8ZSSLSTAT
  USING    RECORD_ALL LIKE SY-TABIX
           SOURCE_SYSTEM LIKE RSUPDSIMULH-LOGSYS
  CHANGING ABORT LIKE SY-SUBRC.   "set ABORT <> 0 to cancel update

*$*$ begin of routine - insert your code only below this line *-*
* fill the internal table "MONITOR" to make monitor entries
************************************************************************
* input:  records of cube local sales statistics (zsslstat) via E-DS
* lookup: ods data local market segmentation (zlms2)
* output: split facts for multiple records (3 times more because of
*         3 segmentations)
************************************************************************

* get master data for product area into itab
  select /BIC/ZPRODAREA /BIC/ZSOPCOMPS /BIC/ZLARTDIV
         /BIC/ZLARTNO /BIC/ZLARTVAR
* due to performance select to view
*   from /BIC/PZLARTVAR into corresponding fields of l_/BIC/PZLARTVAR
    from ZV_PZLARTVAR into corresponding fields of l_/BIC/PZLARTVAR
    where objvers = 'A'.
    append l_/BIC/PZLARTVAR to it_/BIC/PZLARTVAR.
  endselect.
  sort it_/BIC/PZLARTVAR by /BIC/ZSOPCOMPS /BIC/ZLARTDIV
                            /BIC/ZLARTNO /BIC/ZLARTVAR.

* get master data for local customer category into itab
  select /BIC/ZLOCCUCAT /BIC/ZSOPCOMPS /BIC/ZCUSTNR
* due to performance select to view
*   from /BIC/PZCUSTNR into corresponding fields of l_/BIC/PZCUSTNR
    from ZV_PZCUSTNR into corresponding fields of l_/BIC/PZCUSTNR
    where OBJVERS = 'A'.
    append l_/BIC/PZCUSTNR to it_/BIC/PZCUSTNR.
  endselect.
  sort it_/BIC/PZCUSTNR by /BIC/ZSOPCOMPS /BIC/ZCUSTNR.

* read ods data into itab
  select * from /BIC/AZLMS200 into l_ods.
    append l_ods to it_ods.
  endselect.
  sort it_ods by /BIC/ZSELLCOMP /BIC/ZLOCCUCAT /BIC/ZPRODAREA.

* loop over the data package
  loop at DATA_PACKAGE.
    l_tabix = sy-tabix.

* get product area from itab
    read table it_/BIC/PZLARTVAR
         with key /BIC/ZSOPCOMPS = DATA_PACKAGE-/BIC/ZSOPCOMPS
                  /BIC/ZLARTDIV  = DATA_PACKAGE-/bic/zlartdiv
                  /BIC/ZLARTNO   = DATA_PACKAGE-/BIC/ZLARTNO
                  /BIC/ZLARTVAR  = DATA_PACKAGE-/BIC/ZLARTVAR
         binary search.
    if sy-subrc = 4 or sy-subrc = 8.
      MONITOR-msgid = 'ZBWMESS'.
      MONITOR-msgty = 'I'.
      MONITOR-msgno = '20'.
      MONITOR-msgv1 = DATA_PACKAGE-/BIC/ZSOPCOMPS.
      MONITOR-msgv2 = DATA_PACKAGE-/bic/zlartdiv.
      MONITOR-msgv3 = DATA_PACKAGE-/BIC/ZLARTNO.
      MONITOR-msgv4 = DATA_PACKAGE-/BIC/ZLARTVAR.
      append MONITOR.
    endif.

* if the product area of the cube's record is not from the inco
* business, skip the record
    if it_/BIC/PZLARTVAR-/BIC/ZPRODAREA(2) ne 'IN'.
      continue.
    endif.

* get local customer category
    read table it_/BIC/PZCUSTNR
         with key /BIC/ZSOPCOMPS = DATA_PACKAGE-/BIC/ZSOPCOMPS
                  /BIC/ZCUSTNR   = DATA_PACKAGE-/BIC/ZCUSTNR
         binary search.
    if sy-subrc = 4 or sy-subrc = 8.
      MONITOR-msgid = 'ZBWMESS'.
      MONITOR-msgty = 'I'.
      MONITOR-msgno = '21'.
      MONITOR-msgv1 = DATA_PACKAGE-/BIC/ZSOPCOMPS.
      MONITOR-msgv2 = DATA_PACKAGE-/BIC/ZCUSTNR.
      append MONITOR.
    endif.

* create bw style of loccustcat
* empty loccustcat values for the Inco business are filled with misc
    if it_/BIC/PZCUSTNR-/BIC/ZLOCCUCAT = ' ' or
       it_/BIC/PZCUSTNR-/BIC/ZLOCCUCAT is initial.
      concatenate DATA_PACKAGE-/BIC/ZSELLCOMP '_MISC' into l_loccustcat.
    else.
      l_loccustcat = it_/BIC/PZCUSTNR-/BIC/ZLOCCUCAT.
    endif.

* check value if ods data fits to data package record
    l_rc = '8'.

* and loop over ods data
    clear l_ods.
    loop at it_ods into l_ods
         where /BIC/ZSELLCOMP = DATA_PACKAGE-/BIC/ZSELLCOMP
           and /BIC/ZLOCCUCAT = l_loccustcat
           and /BIC/ZPRODAREA = it_/BIC/PZLARTVAR-/BIC/ZPRODAREA.

* comparison of ods characteristics to cube characteristics
* Segmentation 1: check institutions
      if l_ods-/BIC/ZSELLCOMP = DATA_PACKAGE-/BIC/ZSELLCOMP and
         l_ods-/BIC/ZLOCCUCAT = l_loccustcat and
         l_ods-/BIC/ZPRODAREA = it_/BIC/PZLARTVAR-/BIC/ZPRODAREA and
         l_ods-/BIC/ZSEGM = '10'.                      "Institutions

* append new record to help structure (eq cube structure)
        clear l_data_pack.
        move-corresponding DATA_PACKAGE to l_data_pack.
        l_data_pack-/BIC/ZSEGM  = l_ods-/BIC/ZSEGM.
        l_data_pack-/BIC/ZSEGMP = l_ods-/BIC/ZSEGMP.
        l_data_pack-/BIC/ZAPHTC = DATA_PACKAGE-/BIC/ZAPHTC
                                  * l_ods-/BIC/ZSEGMP.
        l_data_pack-/BIC/ZAPWHC = DATA_PACKAGE-/BIC/ZAPWHC
                                  * l_ods-/BIC/ZSEGMP.
        l_data_pack-/BIC/ZCIFREIAM = DATA_PACKAGE-/BIC/ZCIFREIAM
                                     * l_ods-/BIC/ZSEGMP.
* ... and many more key figures follow here (in the project there
* were about 80 of them)
        l_data_pack-/BIC/ZSTABONAM = DATA_PACKAGE-/BIC/ZSTABONAM   "ok
                                     * l_ods-/BIC/ZSEGMP.
        append l_data_pack to it_data_pack.
        l_rc = '0'.

* Segmentation 2: check health care
      elseif l_ods-/BIC/ZSELLCOMP = DATA_PACKAGE-/BIC/ZSELLCOMP and
             l_ods-/BIC/ZLOCCUCAT = l_loccustcat and
             l_ods-/BIC/ZPRODAREA = it_/BIC/PZLARTVAR-/BIC/ZPRODAREA and
             l_ods-/BIC/ZSEGM = '20'.                  "Health Care
* ... same procedure as for segment 1: the key figures are
* multiplied by the percentage
        append l_data_pack to it_data_pack.
        l_rc = '0'.
* Segmentation 3: check retail
      elseif l_ods-/BIC/ZSELLCOMP = DATA_PACKAGE-/BIC/ZSELLCOMP and
             l_ods-/BIC/ZLOCCUCAT = l_loccustcat and
             l_ods-/BIC/ZPRODAREA = it_/BIC/PZLARTVAR-/BIC/ZPRODAREA and
             l_ods-/BIC/ZSEGM = '30'.                  "Retail
* ... same procedure as for segments 1 and 2: the key figures are
* multiplied by the percentage
        append l_data_pack to it_data_pack.
        l_rc = '0'.
      endif.
    endloop.

* those records of the data_package which do not have a split rule
* are appended unchanged
    if sy-subrc <> 0.
      clear l_data_pack.
      move-corresponding DATA_PACKAGE to l_data_pack.
      append l_data_pack to it_data_pack.
    endif.

* no match between records of split table ods and data package from cube
    if l_rc = '8'.
      MONITOR-msgid = 'ZBWMESS'.
      MONITOR-msgty = 'I'.
      MONITOR-msgno = '23'.
      MONITOR-msgv1 = DATA_PACKAGE-/BIC/ZCUSTNR.
      MONITOR-msgv2 = l_loccustcat.
      MONITOR-msgv3 = DATA_PACKAGE-/BIC/ZSOPCOMPS.
      append MONITOR.
    endif.

* delete record from data package
    delete DATA_PACKAGE index l_tabix.
  endloop.

* append records of internal data pack
  append lines of it_data_pack to DATA_PACKAGE.

* if abort is not equal to zero, the update process will be canceled
  ABORT = 0.

*$*$ end of routine - insert your code only before this line *-*
ENDFORM.