PUBLIC
Document Version: 2022.7 – 2022-03-22

Administering SAP Data Warehouse Cloud

© 2022 SAP SE or an SAP affiliate company. All rights reserved.
Content

1 Administering SAP Data Warehouse Cloud . . . 5
1.1 System Requirements and Technical Prerequisites . . . 8
1.2 Rules for Technical Names . . . 10
1.3 Configure the Size of Your SAP Data Warehouse Cloud Tenant . . . 12
    Supported Sizes for Your Tenant . . . 15
2 Managing Users and Roles . . . 28
2.1 Configuring Identity Provider Settings . . . 28
    Enable IdP-Initiated Single Sign On (SAP Data Center Only) . . . 28
    Renewing the SAP Analytics Cloud SAML Signing Certificate . . . 30
    Enabling a Custom SAML Identity Provider . . . 31
2.2 Managing SAP Data Warehouse Cloud Users . . . 40
    Creating a New User . . . 41
    Importing or Modifying Users from a File . . . 42
    Exporting Users . . . 44
    Updating User Email Addresses . . . 46
    Deleting Users . . . 46
2.3 Managing Roles and Privileges . . . 47
    Assigning Roles to Users . . . 47
    Transfer the System Owner Role . . . 49
    Creating a Custom Role . . . 50
    Roles and Privileges by App . . . 58
3 Creating Spaces and Allocating Storage . . . 63
3.1 Create a Space . . . 63
3.2 Allocate Storage to a Space . . . 65
3.3 Set a Priority and Statement Limits for a Space . . . 67
3.4 Monitor Tenant and Space Storage . . . 68
3.5 Unlock a Space That Has Exceeded Its Assigned Storage . . . 69
3.6 Create, Read, Update, and Delete Spaces via the Command Line . . . 69
    Install or Update the dwc Command Line Interface . . . 73
    The Space Definition File Format . . . 74
4 Preparing Connectivity for Connections . . . 83
4.1 Preparing Data Provisioning Agent Connectivity . . . 88
    Install the Data Provisioning Agent . . . 89
    Connect and Configure the Data Provisioning Agent . . . 90
    Register Adapters with SAP Data Warehouse Cloud . . . 93
    Prerequisites for ABAP RFC Streaming . . . 94
4.2 Preparing Cloud Connector Connectivity . . . 95
    Configure Cloud Connector . . . 95
    Set Up Cloud Connector in SAP Data Warehouse Cloud . . . 99
4.3 Add IP address to IP Allowlist . . . 100
4.4 Finding SAP Data Warehouse Cloud IP addresses . . . 102
4.5 Upload Certificates . . . 103
4.6 Upload Third-Party ODBC Drivers (Required for Data Flows) . . . 104
4.7 Prepare Connectivity to Adverity . . . 105
4.8 Prepare Connectivity to Amazon Athena . . . 106
4.9 Prepare Connectivity to Amazon Redshift . . . 106
4.10 Prepare Connectivity for Cloud Data Integration . . . 107
4.11 Prepare Connectivity for Generic JDBC . . . 108
4.12 Prepare Connectivity for Generic OData . . . 108
4.13 Prepare Connectivity for Generic SFTP . . . 109
4.14 Prepare Connectivity to Google BigQuery . . . 110
4.15 Prepare Connectivity to Microsoft Azure SQL Database . . . 110
4.16 Prepare Connectivity to Microsoft SQL Server . . . 111
4.17 Prepare Connectivity to SAP Open Connectors . . . 112
4.18 Prepare Connectivity to Oracle . . . 113
4.19 Prepare Connectivity to Precog . . . 114
4.20 Prepare Connectivity to SAP ABAP Systems . . . 114
4.21 Prepare Connectivity to SAP BW . . . 116
4.22 Preparing SAP BW/4HANA Model Transfer Connectivity . . . 117
    Create Live Data Connection of Type Tunnel . . . 118
    Supported Source Versions for SAP BW/4HANA Model Transfer Connections . . . 119
4.23 Prepare Connectivity to SAP ECC . . . 120
4.24 Prepare Connectivity to SAP Fieldglass . . . 121
4.25 Prepare Connectivity to SAP HANA . . . 121
4.26 Prepare Connectivity to SAP Marketing Cloud . . . 122
4.27 Prepare Connectivity to SAP SuccessFactors for Analytical Dashboards . . . 123
4.28 Prepare Connectivity to SAP S/4HANA Cloud . . . 124
4.29 Prepare Connectivity to SAP S/4HANA On-Premise . . . 124
5 Managing and Monitoring Connectivity for Data Integration . . . 126
5.1 Monitoring Data Provisioning Agent in SAP Data Warehouse Cloud . . . 126
    Monitoring Data Provisioning Agent Logs . . . 126
    Enable Access to Data Provisioning Agent Logs . . . 127
    Review Data Provisioning Agent Logs . . . 128
    Receive Notifications About Data Provisioning Agent Status Changes . . . 129
5.2 Pause Real-Time Replication for an Agent . . . 130
5.3 Troubleshooting the Data Provisioning Agent . . . 132
5.4 Troubleshooting the Cloud Connector (SAP HANA Smart Data Access) . . . 133
6 Integrating Analytics Clients . . . 136
6.1 Integrating SAP Analytics Cloud . . . 136
    Link Your Tenants . . . 137
    Managing OAuth Clients . . . 138
6.2 Integrating Third-Party Analytics Clients . . . 145
    Integrating Third-Party BI Clients via ODBC on Microsoft Windows . . . 145
7 Creating Database User Groups . . . 148
7.1 Creating a Database User Group . . . 149
7.2 Creating a User . . . 150
7.3 Creating a Schema . . . 151
7.4 Creating a Role . . . 152
7.5 Granting a Role . . . 153
7.6 Revoking a Role . . . 154
7.7 Dropping a Role . . . 155
7.8 Allow a Space to Read From the Database User Group Schema . . . 156
7.9 Allow a Space to Write to the Database User Group Schema . . . 157
8 Monitoring and Troubleshooting SAP Data Warehouse Cloud . . . 160
8.1 Database Analysis User . . . 160
    Creating Database Analysis User . . . 162
    Deleting Database Analysis User . . . 163
    Connecting with SAP HANA Cockpit . . . 164
8.2 Setting up a Monitoring Space in SAP Data Warehouse Cloud . . . 165
8.3 Monitoring Tasks, Logs and Schedules With Dedicated Monitoring Views . . . 166
8.4 Monitor Database Operations with Audit Logs . . . 169
    Delete Audit Logs . . . 169
8.5 Monitor Object Changes with Activities . . . 170
8.6 Configuring Notifications . . . 170
8.7 Requesting Help from SAP Support . . . 171
1 Administering SAP Data Warehouse Cloud
Administrators configure, manage, and monitor the SAP Data Warehouse Cloud tenant to support the work of acquiring, preparing, and modeling data for analytics. They manage users, create spaces and allocate storage to them, prepare and monitor connectivity for data integration, and perform ongoing monitoring and maintenance of the tenant.
You perform the administration tasks in one of the following tools available in the side navigation area of SAP Data Warehouse Cloud.
Security
Tool Task More Information
Users Create, modify, and manage users in SAP Data Warehouse Cloud.
Managing SAP Data Warehouse Cloud Users [page 40]
Roles Assign predefined standard roles, or custom roles that you have created, to users.
Managing Roles and Privileges [page 47]
Activities Track the activities that users perform on objects such as spaces, tables, views, data flows, and others, track changes to users and roles, and more.
Monitor Object Changes with Activities [page 170]
System Configuration
Tab Task More Information
Data Integration Live Data Connections (Tunnel): For SAP BW/4HANA model import, you need Cloud Connector to make HTTP requests to SAP BW/4HANA. This requires a live data connection of type tunnel to SAP BW/4HANA.
Create Live Data Connection of Type Tunnel [page 118]
On-Premise Agents: Manage Data Provisioning Agents, which act as gateways to SAP Data Warehouse Cloud and enable connections to on-premise sources for remote tables and building views.
Connect and Configure the Data Provisioning Agent [page 90]
Register Adapters with SAP Data Warehouse Cloud [page 93]
Monitoring Data Provisioning Agent in SAP Data Warehouse Cloud [page 126]
Pause Real-Time Replication for an Agent [page 130]
Third-Party Drivers: Upload driver files that are required to use certain third-party cloud connections for data flows.
Upload Third-Party ODBC Drivers (Required for Data Flows) [page 104]
Tenant Links Link My Tenants: Link an SAP Analytics Cloud tenant to your SAP Data Warehouse Cloud tenant to enable the product switch in the top right of the shell bar and navigate easily between the two tenants.
Link Your Tenants [page 137]
Security SSL/TLS Certificates: Upload server certificates to enable secure SSL/TLS-based connections to certain sources.
Upload Certificates [page 103]
Password Policy Configuration: Define your password policy settings for the database users. The policy can be enabled when configuring your database users.
Configuring Password Policies
Audit Audit View Enablement: Configure a space that gets access to audit views and allows you to display the audit logs in that space.
Enable Audit Logging
IP Allowlist IP Allowlist: Control which external public IPv4 addresses can access the database of your SAP Data Warehouse Cloud tenant by adding them to an allowlist.
Add IP address to IP Allowlist [page 100]
Task Logs Clean up task logs to reduce storage consumption in your SAP Data Warehouse Cloud tenant.
Deleting Task Logs to Reduce Storage Consumption
Database Access Database Analysis Users: Create a database analysis user to connect to your SAP HANA Cloud database to analyze, diagnose, and solve database issues. Only create this user for a specific task, and delete it right after the task has been completed.
Monitoring and Troubleshooting SAP Data Warehouse Cloud [page 160]
Database User Groups: Create an isolated environment with corresponding administrators where you can work more freely with SQL in your SAP HANA Cloud database.
Creating Database User Groups [page 148]
Tenant Configuration Allocate capacity units to storage and compute resources for your tenant at first login.
Configure the Size of Your SAP Data Warehouse Cloud Tenant [page 12]
System Administration
Tab Task More Information
System Configuration Session timeout: Set the amount of time before a user session expires if the user doesn't interact with the system.
By default the session timeout is set to 3600 seconds (1 hour). The minimum value is 300 seconds, and the maximum value is 43200 seconds.
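The timeout bounds above can be expressed as a small helper that clamps a configured value into the allowed range. This is an illustrative sketch only; the function name is hypothetical and not part of the product:

```python
# Session timeout bounds from the System Configuration tab:
# default 3600 s (1 hour), minimum 300 s, maximum 43200 s.
DEFAULT_TIMEOUT = 3600
MIN_TIMEOUT = 300
MAX_TIMEOUT = 43200

def normalize_session_timeout(seconds=None):
    """Return a session timeout within the allowed range.

    Falls back to the default when no value is given and clamps
    out-of-range values to the nearest bound.
    """
    if seconds is None:
        return DEFAULT_TIMEOUT
    return max(MIN_TIMEOUT, min(MAX_TIMEOUT, seconds))
```

For example, a configured value of 90000 seconds would be clamped down to the 43200-second maximum.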
Data Source Configuration SAP Cloud Platform (SAP CP) Account: Get subaccount information for SAP Data Warehouse Cloud. You need the information to configure the Cloud Connector that SAP Data Warehouse Cloud uses to connect to sources for data flows and model import.
Set Up Cloud Connector in SAP Data Warehouse Cloud [page 99]
Live Data Sources: If you want to use SAP BW/4HANA model import, you need to allow data from your live data connection of type tunnel to securely leave your network.
On-premise data sources: Add location IDs if you have connected multiple Cloud Connector instances to your SAP Data Warehouse Cloud subaccount and you want to offer them for selection when creating connections using a Cloud Connector.
Security Authentication Method: Select the authentication method used by SAP Data Warehouse Cloud.
Enabling a Custom SAML Identity Provider [page 31]
SAML Single Sign-On (SSO) Configuration: Configure SAML SSO if you selected it as authentication method.
App Integration OAuth Clients: You can use the Open Authorization (OAuth) protocol to grant third-party applications access.
Managing OAuth Clients [page 138]
Trusted Identity Providers: If you use the OAuth 2.0 SAML Bearer Assertion workflow, you must add a trusted identity provider.
Trusted Origins: Enter the origins that will be hosting your client application.
Notifications Make sure that users are notified appropriately about issues in the tenant.
Configuring Notifications [page 170]
System About
Every user can view information about the software components and versions of your system.
Users with the DW Administrator role can open a More section to find more details, for example outbound and database IP addresses that might be required for allowlists in source systems or databases of SAP Data Warehouse Cloud. For more information, see Finding SAP Data Warehouse Cloud IP addresses [page 102].
1.1 System Requirements and Technical Prerequisites
SAP Data Warehouse Cloud is a fully web-based offering. You will need an internet connection and a system that meets certain requirements.
The requirements listed here are for the current release.
Client Software Requirements
Client Software Version Additional Information
Desktop browser Google Chrome, latest version Google releases continuous updates to its Chrome browser. We make every effort to fully test and support the latest versions as they are released. However, if defects are introduced with OEM-specific browser software, we cannot guarantee fixes in all cases.
For additional system requirements, see your web browser documentation.
Microsoft Edge based on the Chromium engine, latest version
Microsoft releases continuous updates to its Chromium-based Edge browser. We make every effort to fully test and support the latest versions as they are released.
Additional software Adobe Acrobat Reader 9 or higher -
Client Configuration Requirements
Client Configuration Setting Additional Information
Network bandwidth Minimum 500-800 kbit/s per user In general, SAP Data Warehouse Cloud requires no more bandwidth than is required to browse the internet. All application modules are designed for speed and responsiveness with minimal use of large graphic files.
Screen resolution XGA 1024x768 (high color) or higher
Widescreen: 1366x768 or higher
-
Minimum recommended browser cache size
250 MB SAP Data Warehouse Cloud is a Web 2.0 application. We recommend allowing browser caching because the application uses it heavily for static content such as image files. If you clear your cache, the browser will not perform as well until the deleted files are downloaded again to the browser and cached for use next time.
To set browser cache size, see your browser documentation.
HTTP 1.1 Enable -
JavaScript Enable -
Cookies Enable web browser session cookies (non-persistent) for authentication purposes
-
Pop-up windows Allow pop-up windows from SAP Data Warehouse Cloud domains
-
Power Option Recommendation High Performance mode for improved JavaScript performance
For Microsoft-based operating systems
Supported Languages
Client Browser What's Supported
Menus, buttons, messages, and other elements of the user interface.
Bulgarian (bgBG); Catalan (caES); Chinese (zhTW); Chinese (Simplified) (zhCN); Croatian (hrHR); Czech (csCZ); Danish (daDK); Dutch (nlNL); English (enGB); English (enUS); French (frCA); French (frFR); Finnish (fiFI); German (deDE); German (deCH); Greek (elGR); Hindi (hiIN); Hungarian (huHU); Indonesian (idID); Italian (itIT); Japanese (jaJP); Korean (koKR); Malay (msMY); Norwegian (noNO); Polish (plPL); Portuguese (Brazil) (ptBR); Portuguese (Portugal) (ptPT); Romanian (roRO); Russian (ruRU); Serbian (srRS); Slovakian (skSK); Slovenian (slSL); Spanish (esES); Spanish (esMX); Swedish (svSE); Thai (thTH); Turkish (trTR); Ukrainian (ukUA); Vietnamese (viVN); and Welsh (cyGB).
Data Connectivity
Connectivity to On-Premise Sources with SAP HANA Smart Data Integration

We recommend using the latest version of the Data Provisioning Agent, or at least the recommended minimum version from SAP Note 2419138. Make sure that all agents that you want to connect to SAP Data Warehouse Cloud have the same latest version.
For information on minimum requirements for the on-premise sources, see the documentation Configure Data Provisioning Adapters in the SAP HANA Smart Data Integration and SAP HANA Smart Data Quality documentation.
1.2 Rules for Technical Names
Rules and restrictions apply to the technical names of objects that you create in SAP Data Warehouse Cloud. By default, the technical name is synchronized with the business name, using rules that automatically replace invalid characters.
When specifying the technical name of an object, bear in mind the following rules and restrictions:
Object Type Rule
Space The space ID can only contain uppercase letters, numbers, and underscores (_). Reserved keywords, such as SYS, CREATE, or SYSTEM, must not be used. Unless advised otherwise, the ID must not start with the prefix _SYS and should not start with the prefixes DWC_ or SAP_. The maximum length is 20 characters.
Reserved keywords: SYS, PUBLIC, CREATE, SYSTEM, DBADMIN, PAL_STEM_TFIDF, SAP_PA_APL, DWC_USER_OWNER, DWC_TENANT_OWNER, DWC_AUDIT_READER, DWC_GLOBAL, and DWC_GLOBAL_LOG.
SAP BW bridge instance
Remote table generated during the import of analysis authorizations from an SAP BW or SAP BW/4HANA system
The technical name can contain any characters except for the asterisk (*), colon (:), and hash sign (#). Also, tab, carriage return, and newline must not be used, and space must not be used at the start of the name. The maximum length is 50 characters.
Entity created in the Data Builder and Business Builder, for example a table or view
Column
Attribute
Measure
Analytical measure
The technical name can only contain alphanumeric characters and underscores (_). The maximum length is 30 characters.
Association The technical name can only contain alphanumeric characters, underscores (_), and dots (.). The maximum length is 10 characters.
Input parameter The technical name can only contain uppercase letters, numbers, and underscores (_). The maximum length is 30 characters.
Database analysis user The user name suffix can only contain uppercase letters, numbers, and underscores (_). The maximum length is 41 characters. This suffix is added to the default prefix DWCDBUSER# to create your full user name. Note that you cannot change the prefix as it is a reserved prefix.
Database user group user The user name suffix can only contain uppercase letters, numbers, and underscores (_). The maximum length is 41 characters. This suffix is added to the default prefix DWCDBGROUP# to create your full user name. Note that you cannot change the prefix as it is a reserved prefix.
Database user (Open SQL schema) The user name suffix can only contain uppercase letters, numbers, and underscores (_). The maximum length is 41 characters. This suffix is added to the default prefix <space ID># to create your full user name. Note that you cannot change the prefix.
Connection The technical name can only contain alphanumeric characters and underscores (_). Underscore (_) must not be used at the start or end of the name. The maximum length is 40 characters.
Data access control The technical name can only contain alphanumeric characters, and underscores (_). The maximum length is 50 characters.
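The rules for space IDs and database user suffixes can be expressed as simple regular-expression checks. The sketch below is illustrative only: the function names are hypothetical, and it encodes just the rules listed in the table above, not any additional checks the product may perform:

```python
import re

# Reserved keywords listed for space IDs.
RESERVED_KEYWORDS = {
    "SYS", "PUBLIC", "CREATE", "SYSTEM", "DBADMIN", "PAL_STEM_TFIDF",
    "SAP_PA_APL", "DWC_USER_OWNER", "DWC_TENANT_OWNER", "DWC_AUDIT_READER",
    "DWC_GLOBAL", "DWC_GLOBAL_LOG",
}

def is_valid_space_id(space_id):
    """Check a space ID: uppercase letters, numbers, and underscores only,
    maximum 20 characters, no reserved keywords or reserved prefixes."""
    if not re.fullmatch(r"[A-Z0-9_]{1,20}", space_id):
        return False
    if space_id in RESERVED_KEYWORDS:
        return False
    return not space_id.startswith(("_SYS", "DWC_", "SAP_"))

def is_valid_db_user_suffix(suffix):
    """Check a database user suffix: uppercase letters, numbers, and
    underscores only, maximum 41 characters. The reserved prefix
    (for example DWCDBUSER#) is added by the system and is not checked here."""
    return bool(re.fullmatch(r"[A-Z0-9_]{1,41}", suffix))
```

For example, `SALES_2022` is a valid space ID, while `DWC_SALES` is rejected because of its reserved prefix.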
By default, the technical name is synchronized with the business name. As you enter the business name, invalid characters are replaced in the technical name as follows:
Rule Example
Reserved keywords, which are not allowed, are removed. " SYS" → ""
Leading underscores (_) are removed. "_NAME" → "NAME"
Leading and trailing whitespaces (" ") are removed. " NAME " → "NAME"
Whitespaces (" ") within a name are replaced with underscores (_). "NA ME" → "NA_ME"
Characters with diacritical signs are replaced with their basic character. "Namé" → "Name"
Non-alphanumeric characters are removed. "N$ME" → "NME"
Dots (.) and double quotes (") are replaced with underscores (_). "N.AM"E" → "N_AM_E"
Leading dots (.) are removed. ".NAME" → "NAME"
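The replacement rules above can be sketched as a small sanitizer. This is an illustrative reconstruction that assumes the rules are applied in a particular order; it is not the product's actual implementation, and the reserved-keyword set is only an excerpt:

```python
import re
import unicodedata

RESERVED_KEYWORDS = {"SYS", "PUBLIC", "CREATE", "SYSTEM"}  # excerpt only

def to_technical_name(business_name):
    """Derive a technical name from a business name (illustrative sketch)."""
    name = business_name.strip()                  # trim leading/trailing whitespace
    if name in RESERVED_KEYWORDS:                 # reserved keywords are removed
        return ""
    name = name.lstrip("_")                       # leading underscores removed
    name = name.lstrip(".")                       # leading dots removed
    name = re.sub(r"\s+", "_", name)              # inner whitespace -> underscore
    name = unicodedata.normalize("NFKD", name)    # split off diacritical marks...
    name = "".join(c for c in name if not unicodedata.combining(c))  # ...and drop them
    name = re.sub(r'[."]', "_", name)             # dots and double quotes -> underscore
    name = re.sub(r"[^A-Za-z0-9_]", "", name)     # drop remaining invalid characters
    return name
```

Running the sanitizer on the examples from the table reproduces the documented results, for example `"NA ME"` → `"NA_ME"` and `"Namé"` → `"Name"`.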
1.3 Configure the Size of Your SAP Data Warehouse Cloud Tenant
Configure the size of your tenant by allocating the capacity units based on your business needs.
Restriction
For information about the availability of this tenant configuration step, see SAP note 3144215 .
Your tenant has been provisioned and is available for the final configuration step: allocating your total number of capacity units to storage and compute resources.
You can confirm the default sizes or modify them within the permitted size combinations at your first login, on the dedicated Tenant Configuration page of the Configuration area.
As long as you have not finalized the size configuration of the tenant, you can use only administration features of SAP Data Warehouse Cloud (but you cannot create spaces in Space Management).
Caution: Once you save the size configuration of your tenant, you cannot change it.
Note: If you need further resources, please reach out to SAP.
To access the SAP Data Warehouse Cloud roadmap, see the SAP Roadmap Explorer .
Procedure
The whole process may take more than 90 minutes. The configuration itself is quick, but the operational process in the background can take a while.
1. Once you’ve logged into SAP Data Warehouse Cloud, a message indicates that the tenant has not been configured. Click the link in the message. The Tenant Configuration page in the Configuration area is displayed.
2. Adjust any of the individual sizes to obtain a configuration that fits your exact needs. For more information on supported sizes, see Supported Sizes for Your Tenant [page 15].
Property Description
Storage Select the size of disk storage using the + and - buttons.
You can specify from 256 GB (minimum), by increments of 256 GB.
Compute Blocks Select the number of compute blocks using the + and - buttons.
You can specify from 2 blocks (minimum), by increments of 1 block. Each compute block provides a certain number of vCPUs and a range of 60 to 64 GB of RAM, depending on the hyperscaler.
The number for vCPUs and memory are calculated based on the compute blocks and you cannot directly modify them.
Dependencies between storage and compute blocks: storage and compute blocks depend on each other, so when you modify one parameter, the other may need to change accordingly. The minimum and maximum allowed storage sizes depend on the total memory defined by the selected compute blocks: the storage size must be higher than the memory size and can be at most 4 times the memory. All possible size combinations of storage and compute blocks are controlled in the Tenant Configuration page. If the size combination you enter is allowed, the sizes are kept; if it is not, error messages guide you to modify the sizes.
Memory Displays the size of memory calculated based on the selected number of compute blocks.
Total vCPUs Displays the number of vCPUs calculated based on the selected number of compute blocks.
Data Lake Storage [optional] Select the size of data lake disk storage using the + and - buttons.
You can specify from 0 TB (minimum) to 90 TB (maximum), by increments of 5 TB.
Data lake storage includes data lake compute.
SAP BW Bridge Storage [optional] Select the size of SAP BW bridge using the dropdown menu, starting from 0 GB (minimum).
SAP BW bridge includes SAP BTP, ABAP environment, runtime and compute.
Note: If you want to allocate capacity units to SAP BW bridge, you’ll need to request the SAP BW bridge Cloud-ABAP Service for your tenant by opening an incident via ServiceNow with the component DWC-BWB.
First finalize the size configuration of your tenant, then open the incident as a next step. Once the incident has been processed, you can create the SAP BW bridge instance in the dedicated page SAP BW Bridge of the Configuration area with the size you’ve allocated (see Provisioning the SAP BW Bridge Tenant).
Allocated CUs Displays the total number of capacity units consumed by the storage and compute resources you've specified.
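The dependency between memory and storage described above can be sketched as a simple validation. This is a hypothetical helper for illustration only; the actual checks are enforced by the Tenant Configuration page:

```python
def validate_sizes(memory_gb: float, storage_gb: float) -> list:
    """Check the storage/compute dependency described above.

    Illustration only: the Tenant Configuration page enforces the
    exact allowed combinations.
    """
    errors = []
    if storage_gb <= memory_gb:
        errors.append("Storage size must be higher than the memory size.")
    if storage_gb > 4 * memory_gb:
        errors.append("Storage size cannot exceed 4 times the memory size.")
    return errors

# Example: 128 GB of memory allows storage above 128 GB, up to 512 GB.
print(validate_sizes(128, 512))  # []
print(validate_sizes(128, 640))  # one error: storage exceeds 4x memory
```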
3. Review the sizes you’ve specified for the different resources.
Caution: Once you click Save, you cannot modify the size configuration of your tenant.
4. Click Save.
The configuration process of the tenant starts. Note that the whole process may take more than 90 minutes.
If an error occurs, you are notified that the configuration cannot be completed and that you need to try again later by clicking the Retry button (which replaces the Save button in this case). The delay depends on the error (for example, if there is an error on the SAP HANA Cloud database side, you may need to retry after 60 minutes).
Note: The process for allocating capacity units to SAP BW bridge is not part of the configuration process and runs in parallel.
1.3.1 Supported Sizes for Your Tenant
View all supported sizes for compute and storage resources and the number of capacity units consumed.
Supported Size Combinations for Compute and Storage
Compute Blocks Storage (GB) Capacity Units
2 256 4300
3 256 6424
3 512 6476
4 256 8548
4 512 8600
4 768 8652
5 256 10672
5 512 10724
5 768 10776
5 1024 10828
6 256 12796
6 512 12848
6 768 12900
6 1024 12952
6 1280 13004
7 256 14920
7 512 14972
7 768 15024
7 1024 15076
7 1280 15128
7 1536 15180
8 256 17044
8 512 17096
8 768 17148
8 1024 17200
8 1280 17252
8 1536 17304
8 1792 17356
9 512 19220
9 768 19272
9 1024 19324
9 1280 19376
9 1536 19428
9 1792 19480
9 2048 19532
10 512 21344
10 768 21396
10 1024 21448
10 1280 21500
10 1536 21552
10 1792 21604
10 2048 21656
10 2304 21708
11 512 23468
11 768 23520
11 1024 23572
11 1280 23624
11 1536 23676
11 1792 23728
11 2048 23780
11 2304 23832
11 2560 23884
12 512 25592
12 768 25644
12 1024 25696
12 1280 25748
12 1536 25800
12 1792 25852
12 2048 25904
12 2304 25956
12 2560 26008
12 2816 26060
13 512 27716
13 768 27768
13 1024 27820
13 1280 27872
13 1536 27924
13 1792 27976
13 2048 28028
13 2304 28080
13 2560 28132
13 2816 28184
13 3072 28236
14 512 29840
14 768 29892
14 1024 29944
14 1280 29996
14 1536 30048
14 1792 30100
14 2048 30152
14 2304 30204
14 2560 30256
14 2816 30308
14 3072 30360
14 3328 30412
15 512 31964
15 768 32016
15 1024 32068
15 1280 32120
15 1536 32172
15 1792 32224
15 2048 32276
15 2304 32328
15 2560 32380
15 2816 32432
15 3072 32484
15 3328 32536
15 3584 32588
16 512 34088
16 768 34140
16 1024 34192
16 1280 34244
16 1536 34296
16 1792 34348
16 2048 34400
16 2304 34452
16 2560 34504
16 2816 34556
16 3072 34608
16 3328 34660
16 3584 34712
16 3840 34764
17 512 36212
17 768 36264
17 1024 36316
17 1280 36368
17 1536 36420
17 1792 36472
17 2048 36524
17 2304 36576
17 2560 36628
17 2816 36680
17 3072 36732
17 3328 36784
17 3584 36836
17 3840 36888
18 768 38388
18 1024 38440
18 1280 38492
18 1536 38544
18 1792 38596
18 2048 38648
18 2304 38700
18 2560 38752
18 2816 38804
18 3072 38856
18 3328 38908
18 3584 38960
18 3840 39012
18 4096 39064
19 768 40512
19 1024 40564
19 1280 40616
19 1536 40668
19 1792 40720
19 2048 40772
19 2304 40824
19 2560 40876
19 2816 40928
19 3072 40980
19 3328 41032
19 3584 41084
19 3840 41136
19 4096 41188
19 4352 41240
20 768 42636
20 1024 42688
20 1280 42740
20 1536 42792
20 1792 42844
20 2048 42896
20 2304 42948
20 2560 43000
20 2816 43052
20 3072 43104
20 3328 43156
20 3584 43208
20 3840 43260
20 4096 43312
20 4352 43364
20 4608 43416
21 768 44760
21 1024 44812
21 1280 44864
21 1536 44916
21 1792 44968
21 2048 45020
21 2304 45072
21 2560 45124
21 2816 45176
21 3072 45228
21 3328 45280
21 3584 45332
21 3840 45384
21 4096 45436
21 4352 45488
21 4608 45540
21 4864 45592
22 768 46884
22 1024 46936
22 1280 46988
22 1536 47040
22 1792 47092
22 2048 47144
22 2304 47196
22 2560 47248
22 2816 47300
22 3072 47352
22 3328 47404
22 3584 47456
22 3840 47508
22 4096 47560
22 4352 47612
22 4608 47664
22 4864 47716
22 5120 47768
23 768 49008
23 1024 49060
23 1280 49112
23 1536 49164
23 1792 49216
23 2048 49268
23 2304 49320
23 2560 49372
23 2816 49424
23 3072 49476
23 3328 49528
23 3584 49580
23 3840 49632
23 4096 49684
23 4352 49736
23 4608 49788
23 4864 49840
23 5120 49892
23 5376 49944
24 768 51132
24 1024 51184
24 1280 51236
24 1536 51288
24 1792 51340
24 2048 51392
24 2304 51444
24 2560 51496
24 2816 51548
24 3072 51600
24 3328 51652
24 3584 51704
24 3840 51756
24 4096 51808
24 4352 51860
24 4608 51912
24 4864 51964
24 5120 52016
24 5376 52068
24 5632 52120
25 768 53256
25 1024 53308
25 1280 53360
25 1536 53412
25 1792 53464
25 2048 53516
25 2304 53568
25 2560 53620
25 2816 53672
25 3072 53724
25 3328 53776
25 3584 53828
25 3840 53880
25 4096 53932
25 4352 53984
25 4608 54036
25 4864 54088
25 5120 54140
25 5376 54192
25 5632 54244
25 5888 54296
26 1024 55432
26 1280 55484
26 1536 55536
26 1792 55588
26 2048 55640
26 2304 55692
26 2560 55744
26 2816 55796
26 3072 55848
26 3328 55900
26 3584 55952
26 3840 56004
26 4096 56056
26 4352 56108
26 4608 56160
26 4864 56212
26 5120 56264
26 5376 56316
26 5632 56368
26 5888 56420
26 6144 56472
27 1024 57556
27 1280 57608
27 1536 57660
27 1792 57712
27 2048 57764
27 2304 57816
27 2560 57868
27 2816 57920
27 3072 57972
27 3328 58024
27 3584 58076
27 3840 58128
27 4096 58180
27 4352 58232
27 4608 58284
27 4864 58336
27 5120 58388
27 5376 58440
27 5632 58492
27 5888 58544
27 6144 58596
27 6400 58648
28 1024 59680
28 1280 59732
28 1536 59784
28 1792 59836
28 2048 59888
28 2304 59940
28 2560 59992
28 2816 60044
28 3072 60096
28 3328 60148
28 3584 60200
28 3840 60252
28 4096 60304
28 4352 60356
28 4608 60408
28 4864 60460
28 5120 60512
28 5376 60564
28 5632 60616
28 5888 60668
28 6144 60720
28 6400 60772
28 6656 60824
29 1024 61804
29 1280 61856
29 1536 61908
29 1792 61960
29 2048 62012
29 2304 62064
29 2560 62116
29 2816 62168
29 3072 62220
29 3328 62272
29 3584 62324
29 3840 62376
29 4096 62428
29 4352 62480
29 4608 62532
29 4864 62584
29 5120 62636
29 5376 62688
29 5632 62740
29 5888 62792
29 6144 62844
29 6400 62896
29 6656 62948
29 6912 63000
30 1024 63928
30 1280 63980
30 1536 64032
30 1792 64084
30 2048 64136
30 2304 64188
30 2560 64240
30 2816 64292
30 3072 64344
30 3328 64396
30 3584 64448
30 3840 64500
30 4096 64552
30 4352 64604
30 4608 64656
30 4864 64708
30 5120 64760
30 5376 64812
30 5632 64864
30 5888 64916
30 6144 64968
30 6400 65020
30 6656 65072
30 6912 65124
30 7168 65176
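The combinations above follow a regular arithmetic pattern: starting from 4,300 capacity units for 2 compute blocks with 256 GB of storage, each additional compute block adds 2,124 capacity units and each additional 256 GB of storage adds 52. This is an observation derived from the published table, not an official SAP formula, but it can be sketched as:

```python
def compute_capacity_units(compute_blocks: int, storage_gb: int) -> int:
    """Reproduce the table above from its arithmetic pattern.

    Baseline: 2 blocks with 256 GB storage cost 4300 CUs; each extra
    block adds 2124 CUs and each extra 256 GB of storage adds 52 CUs.
    Observation from the published values, not an official formula.
    """
    return 4300 + (compute_blocks - 2) * 2124 + (storage_gb // 256 - 1) * 52

print(compute_capacity_units(2, 256))    # 4300
print(compute_capacity_units(16, 3840))  # 34764
print(compute_capacity_units(30, 7168))  # 65176
```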
Supported Storage Sizes for Data Lake
Data Lake Storage (TB) Capacity Units
5 2800
10 5600
15 8400
20 11200
25 14000
30 16800
35 19600
40 22400
45 25200
50 28000
55 30800
60 33600
65 36400
70 39200
75 42000
80 44800
85 47600
90 50400
Supported Storage Sizes for SAP BW Bridge
BW Bridge Storage (GB) Capacity Units
256 3400
512 6800
1024 13600
2048 27200
4096 54400
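Both of these tables are linear, so the capacity-unit cost can be derived directly from the published rates: 560 capacity units per TB of data lake storage, and 3,400 capacity units per 256 GB of SAP BW bridge storage. A sketch based only on the values above:

```python
def data_lake_capacity_units(storage_tb: int) -> int:
    # 5 TB costs 2800 CUs and the table scales linearly: 560 CUs per TB.
    return storage_tb * 560

def bw_bridge_capacity_units(storage_gb: int) -> int:
    # 256 GB costs 3400 CUs; every size in the table is a multiple of 256 GB.
    return (storage_gb // 256) * 3400

print(data_lake_capacity_units(90))    # 50400
print(bw_bridge_capacity_units(4096))  # 54400
```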
2 Managing Users and Roles
Create and manage users, manage secure access to SAP Data Warehouse Cloud using roles, and set up authentication for your users if you are using your own identity provider.
2.1 Configuring Identity Provider Settings
By default, SAP Cloud Identity Authentication is used by SAP Data Warehouse Cloud. We also support single sign-on (SSO) using your own identity provider (IdP).
Related Information
Enable IdP-Initiated Single Sign On (SAP Data Center Only) [page 28]
Renewing the SAP Analytics Cloud SAML Signing Certificate [page 30]
Enabling a Custom SAML Identity Provider [page 31]
2.1.1 Enable IdP-Initiated Single Sign On (SAP Data Center Only)
By default, IdP-initiated SSO is not supported if SAP Data Warehouse Cloud is running on an SAP data center. To support IdP-initiated SSO on an SAP data center, you must add a new assertion consumer service endpoint to your identity provider.
Prerequisites
SAP Data Warehouse Cloud can be hosted either on SAP data centers or on non-SAP data centers. Determine which environment SAP Data Warehouse Cloud is hosted in by inspecting your URL:
● A single-digit number, for example us1 or jp1, indicates an SAP data center.
● A two-digit number, for example eu10 or us30, indicates a non-SAP data center.
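The rule above can be sketched as a small heuristic that counts the digits in the region code of the URL. This is a hypothetical helper for illustration, not an SAP API:

```python
import re

def is_sap_data_center(url: str) -> bool:
    """Guess the hosting environment from the region code in the URL.

    Heuristic from the rule above: a single-digit region (us1, jp1)
    means an SAP data center; two digits (eu10, us30) mean a non-SAP
    data center. Hypothetical helper, not an SAP API.
    """
    match = re.search(r"\.([a-z]{2})(\d{1,2})\.", url)
    if not match:
        raise ValueError("No region code found in URL")
    return len(match.group(2)) == 1

print(is_sap_data_center("https://mytenant.us1.sapanalytics.cloud"))  # True
print(is_sap_data_center("https://mytenant.eu10.hcs.cloud.sap"))      # False
```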
Procedure
1. Navigate to your IdP and find the page where you configure SAML 2.0 Single Sign On.
2. Find and copy your FQDN.
For example, mysystem.wdf.sap-ag.de
3. Add a new assertion consumer service (ACS) endpoint that follows this pattern:
https://<FQDN>/
For example, https://mysystem.wdf.sap-ag.de/
4. If you are using SAP Cloud Identity Authentication Service as your identity provider, the link to log onto SAP Data Warehouse Cloud through your identity provider will follow this pattern:
https://<tenant_ID>.accounts.ondemand.com/saml2/idp/sso?sp=<sp_name>&index=<index_number>
For example, https://testsystem.accounts999.ondemand.com/saml2/idp/sso?sp=mysystem.wdf.sap-ag.de.cloud&index=1
Note: The pattern will vary depending on the identity provider you use.
The following table lists the URL parameters you can use for IdP-initiated SSO.
Parameter Mandatory Description
sp Yes ○ This is the name of the SAML 2 service provider for which SSO is performed.
○ The value of the sp parameter (sp_name) equals the Entity ID of the service provider.
○ This parameter is needed for Identity Authentication to know which service provider to redirect the user to after successful authentication.
index Note: You can choose the correct ACS endpoint for unsolicited SAML response processing by its index. Provide the index parameter when the default ACS endpoint that has been configured via the administration console cannot process unsolicited SAML responses.
○ Enter the index number of the assertion consumer service endpoint of the service provider as the target of the SAML response. Otherwise, the identity provider uses the default endpoint configured for the trusted service provider.
○ If your IdP doesn't support indexing, you must choose between IdP-initiated SSO and SP-initiated SSO. You can either replace the default ACS endpoint to initiate IdP SSO or continue using the default endpoint to initiate SP SSO.
○ A non-digit value or a value for an index entry that is not configured returns an error message.
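For SAP Cloud Identity Authentication Service, the IdP-initiated SSO link shown earlier can be assembled from these parameters. The following is a sketch; the exact host name and pattern vary by tenant and identity provider:

```python
from urllib.parse import urlencode

def idp_sso_url(tenant_id, sp_name, index=None):
    """Build an IdP-initiated SSO link for SAP Cloud Identity Authentication.

    Sketch of the pattern documented above; real host names vary by
    tenant, and other identity providers use different patterns.
    """
    params = {"sp": sp_name}  # sp: the Entity ID of the service provider
    if index is not None:
        params["index"] = index  # omit if your IdP doesn't support indexing
    return (f"https://{tenant_id}.accounts.ondemand.com/saml2/idp/sso?"
            + urlencode(params))

print(idp_sso_url("testsystem", "mysystem.wdf.sap-ag.de.cloud", index=1))
# https://testsystem.accounts.ondemand.com/saml2/idp/sso?sp=mysystem.wdf.sap-ag.de.cloud&index=1
```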
Results
Users will be able to use SAML SSO to log onto SAP Data Warehouse Cloud through their identity provider.
2.1.2 Renewing the SAP Analytics Cloud SAML Signing Certificate
To continue using SAML SSO, an administrator must renew the certificate before it expires.
Context
An email with details on how to renew the SAML X509 certificate is sent to administrators before the certificate expiry date. If the certificate expiry is less than 30 days away, a warning message appears when you log on to SAP Data Warehouse Cloud.
Note: If you click the Renew link on the warning message, you're taken to the Security tab on the (Administration) page.
Procedure
1. From the side navigation, go to (System) → (Administration) → Security.
2. Select Renew.
A confirmation dialog appears. When you confirm the renewal, a new metadata file is automatically downloaded.
Note: The renewal process takes around five minutes to complete.
3. If you use a custom identity provider, upload the SAP Data Warehouse Cloud metadata file to your SAML Identity Provider (IdP).
Note: This step is not required if you use SAP Cloud ID for authentication.
4. If you have live data connections to SAP HANA systems that use SAML SSO, you must also upload the new metadata file to your SAP HANA systems.
5. Log on to SAP Data Warehouse Cloud after five minutes have passed.
Results
If you are able to log on, the certificate renewal was successful. If you cannot log on, try one of the following troubleshooting tips.
If you use SAP Cloud ID for authentication:
1. Clear the browser cache.
2. Allow up to five minutes for the SAP Cloud ID service to switch to the new certificate.
If you use a custom identity provider for authentication:
1. Ensure the new metadata file has been uploaded to your IdP. For more information, see Enabling a Custom SAML Identity Provider [page 31].
2. Clear the browser cache.
3. Allow up to five minutes for your IdP to switch to the new certificate with the newly uploaded metadata.
2.1.3 Enabling a Custom SAML Identity Provider
By default, SAP Cloud Identity Authentication is used by SAP Data Warehouse Cloud. SAP Data Warehouse Cloud also supports single sign-on (SSO), using your identity provider (IdP).
Prerequisites
● You must have an IdP that supports the SAML 2.0 protocol.
● You must be able to configure your IdP.
● You must be assigned to the System Owner role. For more information, see Transfer the System Owner Role [page 49].
● SAP Data Warehouse Cloud can be hosted either on SAP data centers or on non-SAP data centers. Determine which environment SAP Data Warehouse Cloud is hosted in by inspecting your SAP Data Warehouse Cloud URL:
○ A single-digit number, for example us1 or jp1, indicates an SAP data center.
○ A two-digit number, for example eu10 or us30, indicates a non-SAP data center.
● If your users are connecting from Apple devices using the mobile app, the certificate used by your IdP must be compatible with Apple's App Transport Security (ATS) feature.
Note: A custom identity provider is a separate solution, for example Azure AD, and is not part of SAP Analytics Cloud or SAP Data Warehouse Cloud. The configuration change must therefore be applied directly in that solution, not within SAP Analytics Cloud or SAP Data Warehouse Cloud. No access to SAP Analytics Cloud or SAP Data Warehouse Cloud is required to make the change, only access to the identity provider, for example Azure AD.
Procedure
1. From the side navigation, go to (System) → (Administration) → Security.
If you've provisioned SAP Data Warehouse Cloud prior to version 2021.03, you'll see a different UI and need to go to (My Products) → (Analytics) → (System) → (Administration) → Security.
2. Select (Edit).
3. In the Authentication Method area, select SAML Single Sign-On (SSO) if it is not already selected.
Note: By default, SAP Cloud ID is used for authentication.
4. In Step 1, select Download and save the metadata file.
A metadata file is saved.
5. Upload the metadata file to your SAML IdP.
The file includes metadata for SAP Data Warehouse Cloud, and is used to create a trust relationship between your SAML Identity Provider and your SAP Data Warehouse Cloud system.
6. Optional: You can access the system from your SAML Identity Provider by adding a new assertion consumer service endpoint to your identity provider. For more information, see Enable IdP-Initiated Single Sign On (SAP Data Center Only) [page 28].
7. Map your SAML IdP user attributes and roles.
If SAP Data Warehouse Cloud is running on an SAP data center, you must submit an SAP Product Support Incident using the component LOD-ANA-ADM. In the support ticket, indicate that you want to set up user profiles and role assignment based on custom SAML attributes, and include your SAP Data Warehouse Cloud tenant URL.
Note: If SAP Data Warehouse Cloud is running on an SAP data center, and you want to continue using User Profiles and Role assignment using SAML attributes, you will need to open a support ticket each time you switch to a different custom IdP.
If SAP Data Warehouse Cloud is running on a non-SAP data center, you must configure your SAML IdP to map user attributes to the following case-sensitive allowlisted assertion attributes:
Attribute Name Notes
email Required if your NameID is "email".
Groups Required. The value must be set to "sac", even in the case of SAP Data Warehouse Cloud. The Groups attribute is a custom attribute and must be added if it does not exist yet. Contact your administrator to get the path where the mapping needs to be changed.
familyName Optional. familyName is the user's last name (surname).
displayName Optional.
functionalArea Optional.
givenName Optional. givenName is the user's first name.
preferredLanguage Optional.
custom1 Optional. For SAML role assignment.
custom2 Optional. For SAML role assignment.
custom3 Optional. For SAML role assignment.
custom4 Optional. For SAML role assignment.
custom5 Optional. For SAML role assignment.
Example:
<AttributeStatement>
  <Attribute Name="email">
    <AttributeValue>[email protected]</AttributeValue>
  </Attribute>
  <Attribute Name="givenName">
    <AttributeValue>Abc</AttributeValue>
  </Attribute>
  <Attribute Name="familyName">
    <AttributeValue>Def</AttributeValue>
  </Attribute>
  <Attribute Name="displayName">
    <AttributeValue>Abc Def</AttributeValue>
  </Attribute>
  <Attribute Name="Groups">
    <AttributeValue>sac</AttributeValue>
  </Attribute>
  <Attribute Name="custom1">
    <AttributeValue>Domain Users</AttributeValue>
    <AttributeValue>Enterprise Admins</AttributeValue>
    <AttributeValue>Enterprise Key Admins</AttributeValue>
  </Attribute>
</AttributeStatement>
Note: If you are using the SAP Cloud Identity Authentication service as your IdP, map the Groups attribute under Default Attributes for your SAP Data Warehouse Cloud application. The remaining attributes should be mapped under Assertion Attributes for your application.
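As a quick sanity check before configuring your mapping, the allowlist above can be verified locally. This is a hypothetical helper for illustration; the authoritative validation happens in SAP Data Warehouse Cloud itself:

```python
# Case-sensitive allowlisted assertion attributes from the table above.
ALLOWED = {"email", "Groups", "familyName", "displayName", "functionalArea",
           "givenName", "preferredLanguage",
           "custom1", "custom2", "custom3", "custom4", "custom5"}

def check_assertion_attributes(attributes: dict) -> list:
    """Flag common mapping mistakes described above (names are case-sensitive)."""
    problems = []
    for name in attributes:
        if name not in ALLOWED:
            problems.append(f"'{name}' is not an allowlisted attribute "
                            "(attribute names are case-sensitive)")
    if attributes.get("Groups") != "sac":
        problems.append("'Groups' is required and must be set to 'sac'")
    return problems

# Lowercase 'groups' fails twice: unknown name, and 'Groups' missing.
print(check_assertion_attributes({"email": "user@example.com", "groups": "sac"}))
```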
8. Download metadata from your SAML IdP.
9. In Step 2, select Upload, and choose the metadata file you downloaded from your SAML IdP.
10. In Step 3, select a User Attribute.
The attribute will be used to map users from your existing SAML user list to SAP Data Warehouse Cloud. NameID is used in your custom SAML assertion:
<NameID Format="urn:oasis:names:tc:SAML:1.1:nameid-format:unspecified"><Your Unique Identifier></NameID>
Determine what your NameID maps to in your SAP Data Warehouse Cloud system. The user attribute you select must match the User ID, Email, or a custom attribute. You can view your SAP Data Warehouse Cloud user attributes in Security → Users.
Note: NameID is case-sensitive. The User ID, Email, or Custom SAML User Mapping must match the values in your SAML IdP exactly. For example, if the NameID returned by your SAML IdP is [email protected] and the email you used in SAP Data Warehouse Cloud is [email protected], the mapping will fail.
Choose one of the following options:
○ USER ID: If NameID maps to the SAP Data Warehouse Cloud User ID.
○ Email: If NameID maps to the SAP Data Warehouse Cloud Email address.
Note: If your NameID email is not case-sensitive and contains mixed case, for example [email protected], consider choosing Custom SAML User Mapping instead.
○ Custom SAML User Mapping: If NameID maps to a custom value.
Note: If you select this option, there will be a new column named SAML User Mapping in Security → Users. After switching to your SAML IdP, you must manually update this column for all existing users.
Note: If you are using a live connection to SAP S/4HANA Cloud Edition with OAuth 2.0 SAML Bearer Assertion, NameID must be identical to the user name of the business user on your SAP S/4HANA system.
For example, if you want to map an SAP Data Warehouse Cloud user with the user ID SACUSER to your SAP S/4HANA Cloud user with the user name S4HANAUSER, you must select Custom SAML User Mapping and use S4HANAUSER as the Login Credential in Step 10.
If you are using SAP Cloud Identity as your SAML IdP, you can choose Login Name as the NameID attribute for SAP Data Warehouse Cloud; you can then set the login name of your SAP Data Warehouse Cloud user as S4HANAUSER.
11. Optional: Enable Dynamic User Creation.
When dynamic user creation is enabled, new users will be automatically created using the default role and will be able to use SAML SSO to log onto SAP Data Warehouse Cloud. After users are created, you can set roles using SAML attributes.
Note: Automatic user deletion is not supported. If a user in SAP Data Warehouse Cloud is removed from your SAML IdP, you must go to Security → Users and manually delete the user. For more information, see Deleting Users [page 46].
If this option is enabled, dynamic user creation still occurs even when SAML user attributes have not been set for all IdP users. To prevent a user from being automatically created, your SAML IdP must deny the user access to SAP Data Warehouse Cloud.
12. In Step 4, enter <Your Unique Identifier>.
This value must identify the system owner. The Login Credential provided here is automatically set for your user.
Note: The Login Credential depends on the User Attribute you selected under Step 3.
13. Test the SAML IdP setup by logging in with your IdP and then clicking Verify Account to open a dialog for validation.
In another browser, log on to the URL provided in the Verify Your Account dialog, using your SAML IdP credentials. You can copy the URL by selecting (Copy).
You must use a private session to log onto the URL; for example, guest mode in Chrome. This ensures that when you log on to the dialog and select SAP Data Warehouse Cloud, you are prompted to log in and do not reuse an existing browser session.
Note: If SAP Data Warehouse Cloud is running on a non-SAP data center, upon starting the verification step, you will see a new screen when logging into SAP Data Warehouse Cloud. Two links will be displayed on this page. One will link to your current IdP and the other will link to the new IdP you will switch to. To perform the Verify Account step, use the link for the new IdP. Other SAP Data Warehouse Cloud users can continue logging on with the current IdP. Once you have completed Step 16 and the IdP switch has completed, this screen will no longer appear.
If you can log on successfully, the SAML IdP setup is correct.
14. In the Verify Your Account dialog, select Check Verification.
If the verification was successful, a green border should appear around the Login Credential box.
15. Select (Save).
The Convert to SAML Single Sign-On confirmation dialog will appear.
16. Select Convert.
When conversion is complete, you will be logged out and directed to the logon page of your SAML IdP.
17. Log on to SAP Data Warehouse Cloud with the credentials you used for the verification step.
18. From the side navigation, go to (Security) → (Users) and look for the column of the User Attribute you selected in step 10.
The values in this column should be a case-sensitive match with the NameID sent by your IdP's SAML assertion.
Note: If you selected Custom SAML User Mapping as the User Attribute, you must manually update all fields in the SAML User Mapping column.
Results
Users will be able to use SAML SSO to log onto SAP Data Warehouse Cloud.
Note: You can also set up your IdP with your Public Key Infrastructure (PKI) so that you can automatically log in your users with a client-side X.509 certificate.
Next Steps
Switch to a Different Custom IdP
If SAML SSO is enabled and you would like to switch to a different SAML IdP, you can repeat the above steps using the new SAML IdP metadata.
2.1.3.1 Disabling SAML SSO
You can revert your system to the default identity provider (SAP Cloud Identity) and disable your custom SAML IdP.
Procedure
1. From the side navigation, go to (System) → (Administration) → Security.
If you've provisioned SAP Data Warehouse Cloud prior to version 2021.03, you'll see a different UI and need to go to (My Products) → (Analytics) → (System) → (Administration) → Security.
2. Select (Edit).
3. In the Authentication Method area, select SAP Cloud Identity (default).
4. Select (Save).
Results
When conversion is complete, you will be logged out and directed to the SAP Cloud Identity logon page.
2.1.3.2 Updating the SAML IdP Signing Certificate
You can update the SAML identity provider (IdP) signing certificate.
Prerequisites
● You must have the metadata file that contains the new certificate from your custom IdP, and you must be logged into SAP Data Warehouse Cloud before your IdP switches over to using the new certificate.
● You must be the System Owner in SAP Data Warehouse Cloud.
Context
To upload the new metadata file, do the following:
Procedure
1. From the side navigation, go to (System) → (Administration) → Security.
If you've provisioned SAP Data Warehouse Cloud prior to version 2021.03, you'll see a different UI and need to go to (My Products) → (Analytics) → (System) → (Administration) → Security.
2. Select (Edit).
3. Under Step 2, select Update and provide the new metadata file.
4. Select (Save) and confirm the change to complete the update.
The update will take effect within two minutes.
Results
Note: You do not have to redo Step 3 or Step 4 on the Security tab.
2.1.3.3 Identity Provider Administration
The Identity Provider Administration tool allows system owners to manage the custom identity provider configured with SAP Data Warehouse Cloud. Through the tool, the system owner can choose to upload new metadata for the current custom identity provider, or revert to using the default identity provider.
Prerequisites
● SAP Data Warehouse Cloud must already be configured to use a custom identity provider.● You must be the system owner.
Procedure
1. Access the Identity Provider Administration tool using the following URL pattern:
https://console.<data center>.sapanalytics.cloud/idp-admin/
For example, if your SAP Data Warehouse Cloud system is on eu10, then the URL is:https://console.eu10.sapanalytics.cloud/idp-admin/
If your SAP Data Warehouse Cloud system is on cn1, then the URL is:https://console.cn1.sapanalyticscloud.cn/idp-admin/
If your tenant is on EUDP:
https://console-eudp.eu1.sapanalytics.cloud/idp-admin/
https://console-eudp.eu2.sapanalytics.cloud/idp-admin/
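The console URL patterns above can be summarized in a small helper. This is a sketch based only on the examples listed; other data centers may follow different patterns:

```python
def idp_admin_url(data_center: str, eudp: bool = False) -> str:
    """Build the Identity Provider Administration URL from the patterns above.

    Sketch only: cn1 uses the sapanalyticscloud.cn domain, and EU Data
    Protection (EUDP) tenants use the console-eudp host.
    """
    if data_center == "cn1":
        return "https://console.cn1.sapanalyticscloud.cn/idp-admin/"
    if eudp:
        return f"https://console-eudp.{data_center}.sapanalytics.cloud/idp-admin/"
    return f"https://console.{data_center}.sapanalytics.cloud/idp-admin/"

print(idp_admin_url("eu10"))  # https://console.eu10.sapanalytics.cloud/idp-admin/
```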
2. Log in with an S-user that has the same email address as the system owner of your system. If you don't yet have such an S-user, you can click the “Register” button and create a P-user.
If you create a new P-user, you'll receive an email with an activation link that will let you set your password.
3. Once you're logged in, you'll see a list of SAP Data Warehouse Cloud systems for which you are the system owner.
Select the system you want to work on by clicking on its row.
Once you're in the settings page for your system, you can see information about your current custom identity provider. If you need to reacquire your system's metadata, you can click the “Service Provider Metadata Download” link.
If you don't want to manage your custom identity provider through Identity Provider Administration, you can disconnect your system by clicking “Disconnect IdP Admin from your system”.
4. To proceed with either reverting to the default identity provider or updating the current custom identity provider, select the corresponding radio button and then click “Step 2”.
NoteYour SAP Data Warehouse Cloud system is connected to the Identity Provider Administration tool by default. The connection status for your system is displayed under the “Status” column of the systems list page. If you'd like to disconnect your system from the console, you can do so in either of two places:
○ In SAP Data Warehouse Cloud, navigate to System → Administration → Security → Optional: Configure Identity Provider Administration Tool, click the Connected switch, and then save the changes.
○ Click “Disconnect IdP Admin from your system” after selecting your system in Identity Provider Administration.
5. (Optional) Revert to the default identity provider.
Choose this option if you're having problems logging in with your custom identity provider and would like to revert to the default identity provider. Once the reversion has finished, you can exit the Identity Provider Administration tool and log in to your SAP Data Warehouse Cloud system to reconfigure your custom identity provider.
a. Select the “Yes” radio button to revert to the default IdP.
b. Select “Yes” in the confirmation dialog to revert your authentication method back to the default IdP.
c. Click “Step 3” to proceed to the validation step.
d. Click “Log into SAP Data Warehouse Cloud” to open a new tab and navigate to your system. Log in with your default identity provider credentials. If you get an error saying “Your profile is not configured”, please create a support ticket under the component LOD-ANA-BI.
6. (Optional) Upload new metadata for the current custom identity provider.
Choose this option if you need to reconfigure trust between your custom identity provider and your SAP Data Warehouse Cloud system. A common use case is to upload new metadata from your identity provider when a new signing certificate has been generated.
a. Click “Browse” to select the new metadata file for your current custom identity provider.
b. Click “Upload File” to upload the provided metadata file. After the upload is successful, it can take up to five minutes for the new metadata file to be applied.
c. Click “Step 3” to proceed to the validation step.
d. Click “Log into SAP Data Warehouse Cloud” to open a new tab and navigate to your system. If you have any login problems related to the identity provider configuration, as opposed to a user-specific problem, you can return to the Identity Provider Administration tool and either re-upload the metadata file or revert to the default identity provider.
2.2 Managing SAP Data Warehouse Cloud Users
You can create and modify users in SAP Data Warehouse Cloud in several different ways.
Creating Users
You can create users in the following ways:
Method More Information
Create individual users in the Users list Creating a New User [page 41]
Import multiple users from a CSV file Importing or Modifying Users from a File [page 42]
Modifying Users
You can modify existing users in the following ways:
Modification More Information
Export user data to a CSV file, to synchronize with other systems
Exporting Users [page 44]
Update the email address a user logs on with Updating User Email Addresses [page 46]
Delete users Deleting Users [page 46]
40 PUBLICAdministering SAP Data Warehouse Cloud
Managing Users and Roles
2.2.1 Creating a New User
You can create individual users in SAP Data Warehouse Cloud.
Prerequisites
You can select one or more roles while you're creating the user. Before you start creating users, you might want to become familiar with the standard application roles or custom roles. You can also still assign roles after you've created the users.
Note
For more information about how to create a user and assign roles, check the in-app help in your system. Use F1 to open the in-app help from the SAP Data Warehouse Cloud user interface. You can also use the question mark icon and click the Help tile to open the help.
Type of Role Description More Information
Standard application roles The roles available depend on the licenses included in your subscription
Standard Application Roles [page 48]
Custom roles Variations on the standard roles, created to meet your company's needs
Creating a Custom Role [page 50]
Context
The method described here assumes that SAP Data Warehouse Cloud is using its default authentication provider. If you are using a custom SAML Identity Provider, you must provide slightly different information, depending upon how your SAML authentication is configured.
Procedure
1. Go to (Expand) (Security) (Users).
2. Select (New) to add a new user to the user management table.
3. Enter a User ID.
Each user needs a unique ID. Only alphanumeric and underscore characters are allowed. The maximum length is 127 characters.
4. Enter the user name details.
Only Last Name is mandatory, but it is recommended that you provide a First Name, Last Name, and Display Name. The Display Name will appear in user-facing screens.
5. Enter an Email address.
A welcome email with logon information will be sent to this address.
6. Select the Manager who will approve requests this user makes for new role assignments.
Users can request additional roles only if they have a custom role that allows for self-service.
7. Select the icon and choose one or more roles from the list.
If one or more default roles have already been created, you can leave Roles empty. Default roles will be assigned when you click Save.
8. Select (Save).
Results
A welcome email including an account activation URL will be sent to the user, so that the user can set an initial password and access the system.
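The User ID rule from step 3 (only alphanumeric and underscore characters, maximum 127 characters) can be checked up front when generating IDs programmatically, for example before preparing a bulk import. The following Python sketch is an illustration written for this guide; the regular expression is an assumption derived from the documented rule, not an official SAP validation routine:

```python
import re

# Assumed pattern from the documented rule: alphanumeric and
# underscore only, 1 to 127 characters. Not an official SAP check.
USER_ID_PATTERN = re.compile(r"^[A-Za-z0-9_]{1,127}$")

def is_valid_user_id(user_id: str) -> bool:
    """Return True if user_id satisfies the documented ID rule."""
    return bool(USER_ID_PATTERN.fullmatch(user_id))

print(is_valid_user_id("JDOE_01"))   # True
print(is_valid_user_id("j.doe"))     # False: dot not allowed
print(is_valid_user_id("x" * 128))   # False: too long
```

Such a check only mirrors the documented constraint; the system itself rejects invalid IDs when you save the user.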
2.2.2 Importing or Modifying Users from a File
You can create new users or batch-update existing users by importing user data that you have saved in a CSV file.
Prerequisites
The user data you want to import must be stored in a CSV file. At minimum, your CSV file needs columns for UserID, LastName, and Email, but it is recommended that you also include FirstName and DisplayName.
If you want to assign new users different roles, include a Roles column in the CSV file. The role IDs used for role assignment are outlined in Standard Application Roles [page 48].
For existing users that you want to modify, you can create the CSV file by first exporting a CSV file from SAP Data Warehouse Cloud. For more information, see Exporting Users [page 44].
Note
The first name, last name, and display name are linked to the identity provider, and can't be changed in the User list page, or when importing a CSV file. (In the User list page, those columns are grayed out.)
To edit those values, log in as the user and edit that user's profile.
Edit the downloaded CSV file to remove columns whose values you don't want to modify, and to remove rows for users whose values you don't want to modify. Do not modify the USERID column. This ensures that entries can be matched to existing users when you re-import the CSV.
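To illustrate, the following Python sketch writes a minimal import file with the documented columns (UserID, LastName, and Email as the minimum, plus the recommended FirstName and DisplayName, and an optional Roles column). The user data and the role ID shown are hypothetical placeholders; take the actual role IDs from Standard Application Roles [page 48]:

```python
import csv

# Documented columns: UserID, LastName, and Email are the minimum;
# FirstName, DisplayName, and Roles are recommended/optional.
FIELDS = ["UserID", "FirstName", "LastName", "DisplayName", "Email", "Roles"]

# Hypothetical example user; the role ID is a placeholder and must be
# replaced with a role ID listed in Standard Application Roles.
users = [
    {"UserID": "JDOE", "FirstName": "Jane", "LastName": "Doe",
     "DisplayName": "Jane Doe", "Email": "jane.doe@example.com",
     "Roles": "DW_MODELER"},
]

with open("users_import.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(users)
```

The resulting file can then be uploaded via Select Source File in the Import Users dialog, where you map its columns to the user management fields.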
These are the available mapping parameters when importing CSV user data:
Parameter Description
User ID
First Name
Last Name
Display Name
Manager
Roles
Mobile
Phone
Office Location
Function Area Can be used to refer to a user's team or area within their organization.
Job Title
Clean up notifications older than Set in user settings: when to automatically delete notifications.
Email Notification Set in user settings.
Welcome message Message that is shown to the user on the home screen.
Page tips Enabled/disabled via the help center (deprecated).
Closed Page tips Closed page tips are tracked so that they are not shown again.
Closed Item Picker Tips Closed tooltips are tracked so that they won't be reopened (for first-time users).
Current Banner Saves which banner is currently showing.
Last Banner The UUID of the last closed banner.
Last Maintenance Banner Version The version when the last maintenance banner was shown.
Marketing email opt in Set in user settings.
Homescreen content is initialized If default tiles have been set for the home screen.
Expand Story Toolbar Set in user settings.
Is user concurrent If the user has a concurrent license.
On the Edit Home Screen dialog, a user can override all the default preferences that have been set by the administrator for the system ( System Administration Default Appearance ). These are the preferences:
Override Background Option
Override Logo Option
Override Welcome Message
Override Home Search To Insight
Override Get Started
Override Recent Stories
Override Recent Presentations
Override Calendar Highlights
Procedure
1. Go to (Expand) (Security) (Users).
2. Select (Import Users) Import Users from File .
3. In the Import Users dialog, choose Select Source File to upload your CSV file.
4. Choose Create Mapping to assign the fields of your user data from the CSV file to the fields in user management.
5. Select the appropriate entries for the Header, Line Separator, Delimiter, and Text Qualifier.
6. Select OK when you've finished mapping.
7. In the Import Users dialog, choose Import to upload your CSV file according to the defined mapping.
2.2.3 Exporting Users
If you want to synchronize SAP Data Warehouse Cloud user data with other systems, you can export the data to a CSV file.
Procedure
On the Users page of the Security area, choose (Export).
Results
The system exports all user data into a CSV file that is automatically downloaded to your browser's default download folder.
The CSV file contains these columns:
Column Description
USER_NAME
FIRST_NAME
LAST_NAME
DISPLAY_NAME
MANAGER
ROLES Roles assigned to the user.
SAML_USER_MAPPING SAML property for the user (if SAML enabled).
MOBILE Set in user preferences.
OFFICE_PHONE Set in user preferences.
OFFICE_ADDRESS Set in user preferences.
AGILE_BI_ENABLED_BY_DEFAULT Opt in for the agile data preparation feature.
JOB_TITLE Set in user preferences.
MARKETING_EMAIL_OPT_IN Set in user preferences.
IS_CONCURRENT Licensing attribute to indicate whether the user is consuming a named licensed user account (0) or a concurrent licensed user account (1).
DEFAULT_APP The application that will launch when you access your SAP Data Warehouse Cloud URL. The default application can be set in System Administration System Configuration or in the user settings.
On the Edit Home Screen dialog, a user can override all the default preferences that have been set by the administrator for the system ( System Administration Default Appearance ). These are the preferences:
OVERRIDE_BACKGROUND_OPTION
OVERRIDE_LOGO_OPTION
OVERRIDE_WELCOME_MESSAGE_FLAG
OVERRIDE_HOME_SEARCH_TO_INSIGHT_FLAG
OVERRIDE_GET_STARTED_FLAG
OVERRIDE_RECENT_FILES_FLAG
OVERRIDE_RECENT_STORIES_FLAG
OVERRIDE_RECENT_PRESENTATIONS_FLAG
OVERRIDE_RECENT_APPLICATIONS_FLAG
OVERRIDE_CALENDAR_FLAG
OVERRIDE_FEATURED_FILES_FLAG
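As an illustration of the synchronization use case, the sketch below reads an exported file with Python's csv module and lists the users holding a given role. The file name and the role value are hypothetical placeholders; the column names follow the table above, and the exact formatting of the ROLES cell (for example, how multiple roles are separated) may differ in your export, so the substring check is a deliberate simplification:

```python
import csv

def users_with_role(path: str, role: str) -> list[str]:
    """Return the USER_NAME of every user whose ROLES cell mentions role."""
    matches = []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            # ROLES may contain several roles; a substring check keeps
            # this sketch independent of the exact separator used.
            if role in row.get("ROLES", ""):
                matches.append(row["USER_NAME"])
    return matches

# Hypothetical file exported from the Users page:
# users_with_role("users_export.csv", "DW Administrator")
```

A real synchronization job would map these columns onto the target system's user model; this sketch only shows how to consume the export programmatically.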
2.2.4 Updating User Email Addresses
You can update the user email addresses used for logon.
When you create a user, you must add an email address. The email address is used to send logon information.
To edit a user's email address, go to the Users page of the Security area, and select the email address you want to modify. Add a new email address and press Enter, or select another cell to set the new address.
If the email address is already assigned to another user, a warning will appear and you must enter a new address. Every user must be assigned a unique email address.
A new logon email will be sent to the updated address.
As long as a user has not logged on to the system with the new email address, the email address will appear in a pending state on the Users list.
Related Information
Creating a New User [page 41]
Importing or Modifying Users from a File [page 42]
2.2.5 Deleting Users
You can delete users.
Procedure
1. In the Users management table, select the user ID you want to delete by clicking the user number in the leftmost column of the table.
The whole row is selected.
2. Choose (Delete) from the toolbar.
3. Select OK to continue and remove the user from the system.
Related Information
Creating a New User [page 41]
Importing or Modifying Users from a File [page 42]
Updating User Email Addresses [page 46]
2.3 Managing Roles and Privileges
Assigning roles to your users maintains access rights and secures your information in SAP Data Warehouse Cloud.
A role is a set of permissions that is grouped together and then assigned to users. The pre-defined standard roles are grouped by the license type they consume; with SAP Data Warehouse Cloud, for example, you see SAP Data Warehouse Cloud-specific roles. Tenant administrators can also create their own custom roles by selecting the individual privileges and permissions that they would like their new role to have.
Related Information
Assigning Roles to Users [page 47]
Creating a Custom Role [page 50]
2.3.1 Assigning Roles to Users
There are multiple ways to assign roles to users. To assign roles, you need the DW Administrator role.
Note
For more information about how to create a user and assign roles, check the in-app help in your system. Use F1 to open the in-app help from the SAP Data Warehouse Cloud user interface. You can also use the question mark icon and click the Help tile to open the help.
Assigning a Role to Multiple Users
1. Go to (Expand) (Security) (Roles).
2. Find the role that you want to assign.
3. At the bottom of the role box, click the link to change which users and teams the role will be assigned to.
4. Select one or more users from the Available Members list.
5. Select OK.
Assigning or Updating an Individual User's Role
1. Go to (Expand) (Security) (Users).
2. On the Users page, find the required user.
3. In the user's row, select the icon in the Roles column. A list of Available Roles will appear.
4. Select one or more roles.
5. Select OK.
2.3.1.1 Roles and Licenses
The pre-defined standard roles are grouped by the license type they consume.
Each user's license consumption is determined solely by the roles that they've been assigned. For example, a user who has been assigned only the DW Administrator standard role consumes only a Data Warehouse Cloud license.
To access the Roles area, go to (Expand) (Security) (Roles).
This example shows the predefined standard roles associated with the SAP Data Warehouse Cloud license type:
Planning Professional, Planning Standard, and Analytics Hub are SAP Analytics Cloud-specific roles. For more information about these roles, see Roles in the SAP Analytics Cloud documentation.
2.3.1.2 Standard Application Roles
SAP Data Warehouse Cloud is delivered with several standard roles.
You can assign standard roles directly to users or, if you have different business needs, you can use them as a template for defining new roles.
The following standard roles are available:
● System Owner - Includes all user privileges to allow unrestricted access to all areas of the application. Exactly one user must be assigned to this role.
● DW Administrator - Can create users and spaces and has full privileges across the whole of the SAP Data Warehouse Cloud tenant.
● DW Space Administrator - Can manage all aspects of spaces of which they are a member (except the Storage Assignment and Workload Class properties) and can create data access controls and use the Content Network.
● DW Integrator - Can create and edit connections, database users, and associate HDI containers in spaces of which they are a member.
● DW Modeler - Can create and edit objects in the Data Builder and Business Builder in spaces of which they are a member.
● DW Viewer - Can view objects in spaces of which they are a member.
Note
For SAP Data Warehouse Cloud tenants that were initially provisioned prior to version 2021.03, you need the following additional roles to work with stories:
● BI Content Creator - Creates and changes stories.
● BI Content Viewer - Views stories.
2.3.2 Transfer the System Owner Role
The individual who purchases SAP Data Warehouse Cloud is automatically designated as the system owner. If you, as the purchaser, are not the right person to administer the system, you can transfer the system owner role to the appropriate person in your organization.
Prerequisites
You must be logged on as a user with the System Information Update privilege.
NoteTransferring the system owner role is not possible if you only have one license for SAP Data Warehouse Cloud.
Context
1. On the Users page of the Security area, select the user you want to assign the system owner role to.
2. Select (Assign as System Owner).
The Transfer System Owner Role dialog appears.
3. Under New Role, enter a new role for the previous system owner, or select to open a list of available roles.
Note
One or more roles may be selected.
4. Select OK.
2.3.3 Creating a Custom Role
You can create a new custom role either by customizing a predefined role or by creating a role from a blank template.
Prerequisites
To create custom roles and assign them to users, you need the DW Administrator role.
Context
Note
In this procedure, we use the Roles page to assign roles to users, but you can also assign roles on the Users page. Whether you create users first or roles first does not matter.
Procedure
1. Go to (Expand) (Security) (Roles).
2. To create a custom role, click (Add Role).
3. Enter a unique name for the role and select the license type SAP Data Warehouse Cloud.
4. Select Create.
5. Select a role template.
The role templates are the predefined standard roles associated with the SAP Data Warehouse Cloud license type. If you wish to create a role without extending a predefined standard role, choose the blank template. After you select a template, the Permissions page appears, showing you the individual permissions assigned to privilege types that have been defined for the role template you chose.
6. Define the permissions for your new role for every privilege type. The privilege types represent an area, application, or tool in SAP Data Warehouse Cloud, while the permissions (create, read, update, delete, manage, and share) represent the actions a user can perform.
For example, to define a user who is allowed to read all data change logs, select the check box in the Read column of the Data Change Log row. The permission is automatically passed on to all existing logs.
To define that the user should be allowed to read only specific data change logs, expand the Data Change Log node, and then select the check box in the Read column only for specific log rows.
7. If you want to change the role template that your new custom role will be based on, select (Select Template), and choose a role.
8. Save your new custom role.
Note
You can't delete or save changes to the predefined standard roles.
2.3.3.1 Privilege Page
Each role has a set of privilege types, and each privilege type is assigned different permissions. These settings are configured on the privilege page.
Overview
A role represents the main tasks that a user performs in SAP Data Warehouse Cloud. Each role has a set of privilege types, and each privilege type is assigned different permissions such as Create, Read, Update, Delete, Manage, and Share. The privilege types represent areas of the application, like the Space Management or the Business Builder, and the files or objects created in those areas.
The standard application roles provide a set of privilege types that are appropriate for each particular job role. For example, the DW Administrator role includes the Create, Read, Update, Delete, and Manage permissions for the privilege type Spaces, while the SAP Data Warehouse Cloud Viewer has no permissions selected on this privilege type. Instead, the user has the Read permission for Space Files, meaning this user won't be able to assign users, but they will be able to view the tables and views in the spaces they have been assigned to.
To see a list of all the privilege types and available permissions go to Privilege Types [page 52].
To create your own custom roles see Creating a Custom Role [page 50].
2.3.3.2 Privilege Types
A privilege type represents a task or an area in SAP Data Warehouse Cloud and is assigned to a specific role. The actions that can be performed in the area are determined by the permissions assigned to a privilege type.
Data Warehouse Cloud Privileges and Permissions
The following table lists the SAP Data Warehouse Cloud privilege types and the available permissions. Note that some privileges that are not restricted to an SAP Data Warehouse Cloud license (for example, Users and Roles) can be found in the Other Privileges and Permissions table.
Permissions Available by Data Warehouse Cloud Privilege Types
(One character per position, in this order: C=Create, R=Read, U=Update, D=Delete, E=Execute, M=Maintain, S=Share, M=Manage; a hyphen means the permission is not granted)
Privilege Type Permissions Notes
Space Files
Technical Name: DWC_SPACEFILE
CRUD---M Allows access to all objects inside a space, e.g. views and tables. The permission is limited by membership of a space. A user with the Read permission can view the objects in the spaces to which they have been assigned.
A user with the Manage permission can also see objects in other spaces and is not limited to the spaces they have been assigned to.
Managing Your Space
Spaces
Technical Name: DWC_SPACES
CRUD---M Allows access to Space Management. Users need to have at least the Read permission to see the Space Management app. Which sections of the app are visible and which actions can be performed depend on several other permissions. See Permissions [page 57] for details.
Managing Your Space
Data Warehouse Data Builder
Technical Name: DWC_DATABUILDER
CRUD--S- Allows access to all objects in the Data Builder.
Also allows access to the Data Sharing Cockpit with the Create, Read and Update permissions.
Sharing of objects: Only users with the share permission on this privilege can share objects. The sharing icon is otherwise greyed out.
Acquiring Data in the Data Builder
Data Warehouse Remote Connection
Technical Name: DWC_REMOTECONNECTION
CRUD---- Allows access to remote and run-time objects:
● To view remote tables in the Data Builder the user needs the Read permission
● To create, update, or delete a connection in the Connections app, the user needs Create, Read, Update and Delete and they have to be a member of the space with the corresponding Space Files permission.
Integrating Data via Connections
Acquiring Data in the Data Builder
Data Warehouse Cloud Data Integration
Technical Name: DWC_DATAINTEGRATION
-RU-E--- Allows access to the Data Integration Monitor:
● Read - To view the Data Integration Monitor.
● Update:
○ To perform any one-off replication/persistency actions in the Data Integration Monitor or Data Builder.
○ To redeploy views in the Data Builder where data persistency is used (including in the view lineage)
● Execute - To work with schedules.
Note
In addition to these permissions, the following Data Integration Monitor actions require the DWC_DATABUILDER privilege with Read permission:
● To add a new view in the View Persistency Monitor.
● To set up or change partitioned data loading in the Remote Table Monitor.
Managing and Monitoring Data Integration
Data Warehouse Business Catalog
Technical Name: DWC_BUSINESSCATALOG
-R------ Allows access to the Repository Explorer. To see the contents the user also needs to be a member of the space and, therefore, requires Read on Space Files.
Repository Explorer
Data Warehouse Data Access Control
Technical Name: DWC_DAC
CRUD---- Allows access to the Data Access Control app and objects. To see the Data Access Control app, the user needs the Update permission. To use a Data Access Control object to protect a view, the user needs Read. To change a view, they additionally need Update.
Securing Data with Data Access Controls
Data Warehouse Business Builder
Technical Name: DWC_BUSINESSBUILDER
-R------ Allows access to the Business Builder.
Modeling Data in the Business Builder
Business Entities in Business Builder
Technical Name: DWC_BUSINESS_ENTITY
CRUD---- Allows access to business objects defined in the Business Builder. Business objects can be dimensions or analytical data sets.
Objects in the Business Builder
Authorization Scenarios in Business Builder
Technical Name: DWC_AUTH_SCENARIO
CRUD---- Allows access to authorization scenarios defined in the Business Builder. Authorization scenarios are modeling abstractions for Data Access Controls.
Authorization Scenario
Fact Model in Business Builder
Technical Name: DWC_FACT_MODEL
CRUD---- Allows access to fact models defined in the Business Builder. Fact models are shaped like consumption models but offer reusability in other consumption models.
Creating a Fact Model
Consumption Model in Business Builder
Technical Name: DWC_CONSUME_MODEL
CRUD---- Allows access to consumption models inside the Business Builder. Consumption models comprise perspectives which are presented as DWC_CUBE objects in the file repository.
Consumption Layer (prior to version 2021.03)
Folder in Business Builder
Technical Name: DWC_FOLDER
CRUD---- Allows access to folders defined in the Business Builder. Folders are used to organize objects inside the Business Builder.
File Repository (prior to version 2021.03)
Role CRUD---- Allows access to the Roles application.
Managing Roles and Privileges [page 47]
User CRUD---M The Read permission lets you see a list of users in a dialog; for example, when choosing which users to share a story with, or when choosing users to add to a team.
To see the user list in Security Users , you need the Read permission, plus one of the Create, Update, or Delete permissions. If you have only the Read permission, you won't be able to view the user list.
Set the Manage permission to permit assigning users to roles, and approving role assignment requests from users.
Managing SAP Data Warehouse Cloud Users [page 40]
Team CRUD---M Allows managing (assigning and removing) space members.
Assign Members to Your Space
Activity Log -R-D---- Allows access to activity logs.
Lifecycle -R---MS- Allows importing and exporting content via the Content Network and Transport areas.
System Information -RU----- Users with the Read permission can access the About area in the System menu.
Users with the Update permission can access the Administration, Configuration and About areas in the System menu.
Example
A user with DW Viewer, for example, cannot see the Space Management area, including the spaces. They can only read the files of the space that they have been assigned to (space files) and read the data belonging to their space in the Data Builder. They can see connections, but not edit them. They cannot see the member section or the individual members of their space (user and team). A user with DW Administrator, on the other hand, has almost all permissions in all areas.
With an even finer granularity, you can select permissions that allow your user to see or edit only certain areas of your space. A modeler, for example, can be prevented from seeing the general settings, in particular the storage,
priority, and data lake settings. The user might, however, be configured to see the members of a space, but not be allowed to add or delete members.
Space Management Details
Space Management Visible with Privilege Editable with Privilege
General Settings Space.Read Space.Update
Specific General Settings: Storage, Priority, Data Lake Space.Read Space.Manage
Members User.Read and Team.Read Team.Update
Database Access RemoteConnection.Read RemoteConnection.Update
Time Data DataBuilder.Read DataBuilder.Update
Auditing Space.Update Space.Update
Your user would need the following combination of permissions for the following actions:
Creating a space (Space.Create and Space.Manage) and (Team.Read and Team.Create and Team.Update)
Open space monitor Space.Read
See Assigned/Used Storage and Assigned/Used In-Memory Charts Space.Manage
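The eight-character permission strings used throughout the tables above (for example, CRUD---M or -RU-E---) encode one position per permission in the order Create, Read, Update, Delete, Execute, Maintain, Share, Manage, with a hyphen marking a permission that is not granted. The following small Python decoder is written for this guide as an illustration of that notation, not taken from any SAP tooling:

```python
# Position order as used in this guide's permission strings.
PERMISSION_ORDER = [
    "Create", "Read", "Update", "Delete",
    "Execute", "Maintain", "Share", "Manage",
]

def decode_permissions(mask: str) -> list[str]:
    """Return the granted permission names encoded in an 8-character mask."""
    if len(mask) != len(PERMISSION_ORDER):
        raise ValueError(f"expected {len(PERMISSION_ORDER)} characters, got {len(mask)}")
    return [name for ch, name in zip(mask, PERMISSION_ORDER) if ch != "-"]

print(decode_permissions("CRUD---M"))  # ['Create', 'Read', 'Update', 'Delete', 'Manage']
print(decode_permissions("-RU-E---"))  # ['Read', 'Update', 'Execute']
```

For example, the Lifecycle privilege's mask -R---MS- decodes to Read, Maintain, and Share, matching its description.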
2.3.3.3 Permissions
Permissions allow the user to perform certain actions, such as read, write, or delete, on privilege types. The type of action depends on the privilege type that the permission has been assigned to.
Permissions
The following table displays the available permissions and their definitions.
Permissions
Permission Meaning
Create Permits creating new objects of this item type. Users need this permission to create spaces, views or tables, upload data into a story, or upload other local files.
Read Permits opening and viewing an item and its content.
Update Permits editing and updating existing items. Compare this permission with the Maintain permission, which doesn't allow changes to the data structure. Note: some object types need the Maintain permission to update data. See the Maintain entry.
Delete Permits deletion of the item.
Share Permits the sharing of the selected item type.
Manage When granted on Spaces and Space Files, permits full control over all spaces and the data inside these spaces.
2.3.4 Roles and Privileges by App
Review the standard roles and the privileges needed to access apps, tools, other screens and the dwc command line interface for SAP Data Warehouse Cloud.
SAP Data Warehouse Cloud Apps
To access an app, tool, or editor, a user must have the following standard application role or a custom role containing the listed privileges:
App Requires Privileges (Permissions)… Contained in Standard Role...
(Home) None All roles
(Repository Explorer) Data Warehouse Business Catalog (-R------)
Note
To see the repository explorer content, a user also needs to be a member of the space and have the privilege Space Files (-R------).
DW Modeler
(Data Marketplace) ● Spaces (-R------)
● Space Files (CRUD----)
● Data Warehouse Remote Connection (CRUD--S-)
● Data Warehouse Cloud Data Integration (-RU-E---)
DW Integrator
(Business Builder)
Start page
Dimension editor
Analytical dataset editor
Fact model editor
Consumption model editor
Authorization scenario editor
Each page or editor requires a separate permission:
● Start page: Data Warehouse Business Builder (-R------)
● Dimension editor: Data Warehouse Business Entity (CRUD----)
● Analytical dataset editor: Data Warehouse Business Entity (CRUD----)
● Fact model editor: Data Warehouse Fact Model (CRUD----)
● Consumption model editor: Data Warehouse Consumption Model (CRUD----)
● Authorization scenario editor: Data Warehouse Authorization Scenario (CRUD----)
DW Modeler
(Data Builder)
Start Page
Table editor
Graphical view editor
SQL view editor
Entity-relationship model editor
Data flow editor
Intelligent lookup editor
All pages and editors share a single permission:
● Data Warehouse Data Builder (CRUD--S-)
Note
● To access the Data Editor, a user needs this privilege with the Update (U) permission.
● To run an intelligent lookup, a user also needs the Data Warehouse Data Integration privilege with the Update (U) permission.
● To see remote objects in the Data Builder, a user needs the Data Warehouse Remote Connection privilege with the Read (R) permission.
DW Modeler
(Data Access Controls) Data Warehouse Data Access Control (CRUD----)
Note
● To apply a data access control to protect a view, a user needs this privilege with the Read (R) permission.
● The Update (U) permission enables a user to see Data Access Control.
DW Space Administrator
(Data Integration Monitor) Data Warehouse Data Integration (-RU-E---)
Note
● The Read (R) permission allows a user to see the Data Integration Monitor.
● The Update (U) permission allows a user to perform manual integration tasks.
● The Execute (E) permission allows a user to schedule automated integration tasks.
DW Integrator
DW Modeler (except for scheduling)
(Connections) Data Warehouse Remote Connection (CRUD--S-)
Note
A user with this privilege must be a member of the space and also needs the Space Files privilege.
DW Integrator
Note
Users with the DW Modeler role can access the app in read-only mode.
Note
● Users with the role DW Administrator or DW Space Administrator have full access (with full CRUD permissions) to all the apps listed in this table.
● Users with the DW Viewer role have read-only access (with the R permission) to all the apps (except for Data Access Control) listed in this table.
SAP Data Warehouse Cloud Administration Tools
To access an administration tool, a user must have the following standard application role or a custom role containing the listed privileges:
Tool Requires Privileges (Permissions)… Contained in Standard Role...
(Space Management) Spaces (CRUD---M) DW Administrator (full access, including creating spaces)
DW Space Administrator
Note
● Users with other roles can view (but not edit) the spaces they are members of.
● Users with the DW Integrator role can create database users.
(Content Network) Lifecycle (-R---MS-) DW Administrator
DW Space Administrator
(Security)
Users
Roles
Activities
Each sub-tool requires a separate permission:
● Users: User (CRUD---M)
● Roles: Role (CRUD----)
● Activities: Activity Log (-R-D----)
DW Administrator (full access)
DW Space Administrator (read-only access for the sub-tool users)
(Transport) Lifecycle (-R---MS-) DW Administrator
DW Space Administrator
(Data Sharing Cockpit) Data Warehouse Data Builder (CRU-----)
DW Modeler
(System)
Configuration
Administration
About
System Information (-RU-----)
Note
● The Read (R) permission gives access to the About area.
● The Update (U) permission gives access to all areas.
DW Administrator
Note
Users with any role can view the About area.
The DWC Command Line Interface
To use the DWC command line interface (see Create, Read, Update, and Delete Spaces via the Command Line [page 69]), a user must have the following standard application role or a custom role containing the listed privileges:
Action: Create/Update Space (all properties)
Requires: Spaces (CRUD---M); Team (CRUD---M); Data Builder (CRUD----)
Contained in: DW Administrator

Action: Update Space (members, database users, HDI containers, entity definitions)
Requires: Spaces (-RUD---M); Team (-RUD---M); Data Builder (CRUD----)
Contained in: DW Space Administrator
Note: You must also be a member of the space.

Action: Update Space (entity definitions only)
Requires: Spaces (-R------); Team (-R------); Data Builder (CRUD----)
Contained in: DW Modeler
Note: You must also be a member of the space.
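The permission strings above pack eight permission slots into fixed positions, with a dash marking a slot that is not granted. The sketch below decodes them; the slot order (Create, Read, Update, Delete, Execute, Maintain, Share, Manage) is our assumption based on the permission names used elsewhere in SAP documentation, not something this guide states.

```python
# Decode an eight-slot SAP Data Warehouse Cloud permission string such
# as "CRUD---M". Assumed slot order (not stated in this guide): Create,
# Read, Update, Delete, Execute, Maintain, Share, Manage.
SLOTS = ["Create", "Read", "Update", "Delete",
         "Execute", "Maintain", "Share", "Manage"]

def decode_permissions(perm):
    """Return the names of the permissions granted by an 8-character string."""
    if len(perm) != len(SLOTS):
        raise ValueError("expected an 8-character permission string")
    granted = []
    for ch, name in zip(perm, SLOTS):
        if ch == "-":
            continue  # slot not granted
        if ch != name[0]:
            raise ValueError("unexpected character %r for slot %s" % (ch, name))
        granted.append(name)
    return granted

print(decode_permissions("CRUD---M"))  # ['Create', 'Read', 'Update', 'Delete', 'Manage']
print(decode_permissions("-R---MS-"))  # ['Read', 'Maintain', 'Share']
```

Under this assumed mapping, Spaces (CRUD---M) grants everything except Execute, Maintain, and Share, while Lifecycle (-R---MS-) grants only Read, Maintain, and Share.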
3 Creating Spaces and Allocating Storage
All data acquisition, preparation, and modeling happens inside spaces. An Administrator must create one or more spaces and allocate storage to them.
A space is a secure area created by an Administrator, in which members can acquire, prepare, and model data. The Administrator allocates disk and in-memory storage to the space, sets its priority, and can limit how much memory and how many threads its statements can consume.
If the administrator assigns one or more Space Administrators as members of the space, they can then assign other members, create connections to source systems, secure data with data access controls, and manage other aspects of the space (see Managing Your Space).
Space data is not accessible outside the space unless it is shared to another space or exposed for consumption.
3.1 Create a Space
Create a space, allocate storage, and assign one or more members to allow them to start acquiring and preparing data.
Context
Note: Only administrators can create spaces, allocate storage, and set the space priority and statement limits. The remaining space properties can be managed by the space administrators that the administrator assigns as members to the space.
Procedure
1. In the side navigation area, click (Space Management), and click Create.
2. In the Create Space dialog, enter the following properties, and then click Create:
Property Description
Space Name Enter the business name of the space. Can contain a maximum of 30 characters, and can contain spaces and special characters.
Space ID Enter the technical name of the space. Can contain a maximum of 20 uppercase letters or numbers and must not contain spaces or special characters other than _ (underscore). Must not contain the prefix _SYS and should not contain the prefixes DWC_ or SAP_, unless advised to do so (see Rules for Technical Names [page 10]).
The space is created and its property sheet opens.
3. In the General Settings section, review the following properties:
Property Description
Space ID Enter the technical name of the space. Can contain a maximum of 20 uppercase letters or numbers and must not contain spaces or special characters other than _ (underscore). Must not contain the prefix _SYS and should not contain the prefixes DWC_ or SAP_, unless advised to do so (see Rules for Technical Names [page 10]).
Space Name Enter the business name of the space. Can contain a maximum of 30 characters, and can contain spaces and special characters.
Space Status [read-only] Displays the status of the space. Newly-created spaces are always active.
Space Type [read-only] Displays the type of the space. You can only create spaces of type SAP Data Warehouse Cloud.
Created By [read-only] Displays the user that created the space.
Created On [read-only] Displays the date and time when the space was created.
Deployment Status [read-only] Displays the deployment status of the space. Newly-created spaces are deployed, but when you make changes, you need to save and re-deploy them before they are available to space members.
Deployed On [read-only] Displays the date and time when the space was last deployed.
4. [optional] Use the Storage Assignment properties to allocate disk and in-memory storage to the space and to choose whether it will have access to the SAP HANA data lake.
For more information, see Allocate Storage to a Space [page 65].
5. [optional] Use the properties in the Workload Class section to prioritize between spaces for resource consumption and set limits to the amount of memory and threads that a space can consume.
For more information, see Set a Priority and Statement Limits for a Space [page 67].
6. Use the list in the Members section to add users as members of the space.
You must assign at least one member in order to use the space:
○ DW Space Administrator - Can manage all aspects of spaces of which they are a member (except the Storage Assignment and Workload Class properties) and can create data access controls and use the Content Network.
○ DW Integrator - Can create and edit connections and database users, and can associate HDI containers in spaces of which they are a member.
○ DW Modeler - Can create and edit objects in the Data Builder and Business Builder in spaces of which they are a member.
○ DW Viewer - Can view objects in spaces of which they are a member.
For more information, see Assign Members to Your Space.
7. [optional] Use the remaining sections to further configure the space.
○ Data Access/Data Consumption: Modify the following property, if appropriate:
Property Description
Expose for Consumption by Default Choose the default setting for the Expose for Consumption property for views created in this space.
○ Data Access/Database Users - Use the list in the Database Users section to create users who can connect external tools and read from and write to the space. See Create a Database User.
○ Data Access/HDI Containers - Use the list in the HDI Containers section to associate HDI containers to the space. See Prepare Your Project for Exchanging Data with Your Space.
○ Connections - Follow the link in the Connections section to create connections to source systems in the space. See Create a Connection.
○ Time Data/Time Tables and Dimensions - Click the button in the Time Tables and Dimensions section to generate time data in the space. See Generate Time Data and Dimensions.
○ Auditing/Space Audit Settings - Use the properties in the Space Audit Settings section to enable audit logging for the space. See Enable Audit Logging.
8. Either click Save to save your space (and click Deploy later), or directly click Deploy to save and deploy your space to the database in one click.
3.2 Allocate Storage to a Space
Use the Storage Assignment properties to allocate disk and in-memory storage to the space and to choose whether it will have access to the SAP HANA data lake.
Context
SAP Data Warehouse Cloud supports data tiering using the features of SAP HANA Cloud:
● In-Memory Storage (hot data) - Keep your most recent, frequently-accessed, and mission-critical data loaded constantly in memory to maximize real-time processing and analytics speeds. When you persist a view, the persisted data is stored in memory (see Work With Persisted Views).
● Disk (warm data) - Store master data and less recent transactional data on disk to reduce storage costs. When you load data to a local table or replicate data to a remote table in SAP Data Warehouse Cloud, the data is stored on disk by default, but you can load it in memory by activating the In-Memory Storage switch (see Accelerate Table Data Access with In-Memory Storage).
● Data Lake (cold data) - Store historical data that is infrequently accessed in the data lake. With its low cost and high scalability, the data lake is also suitable for storing vast quantities of raw structured and unstructured data, including IoT data. For more information, see Integrating Data to and From SAP HANA Cloud Data Lake.
You can allocate specific amounts of in-memory and disk storage to a space or disable the Enable Space Quota option, and allow the space to consume all the storage it needs, up to the total amount available in your tenant.
Procedure
1. In the side navigation area, click (Space Management), locate your space tile, and click Edit to open it.
2. Use the Storage Assignment properties to allocate disk and in-memory storage to the space and to choose whether it will have access to the SAP HANA data lake.
Property Description
Enable Space Quota Disable this option to allow the space to consume any amount of disk and in-memory space up to the total amounts available in your tenant.
If this option was disabled and then subsequently re-enabled, the Disk and In-Memory properties are initialized to the minimum values required by the current contents of the space.
Default: Enabled
Disk (GB) Enter the amount of disk storage allocated to the space in GB. You can use the buttons to change the amount by whole GBs or enter fractional values in increments of 100MB by hand.
Default: 2 GB
In-Memory (GB) Enter the amount of in-memory storage allocated to the space in GB. You can use the buttons to change the amount by whole GBs or enter fractional values in increments of 100MB by hand.
Default: 1 GB
Use This Space to Access the Data Lake Enable access to the SAP HANA Cloud data lake. Enabling this option is only possible if no other space already has access to the data lake.
Default: Disabled
Note: If a space exceeds its allocations of in-memory or disk storage, it will be locked until a space member deletes the excess data or an administrator assigns additional storage. See Unlock a Space That Has Exceeded Its Assigned Storage [page 69].
3. Click Save to save your changes to the space, or Deploy to save and immediately make the changes available to space members.
3.3 Set a Priority and Statement Limits for a Space
Use the properties in the Workload Class section to prioritize between spaces for resource consumption and set limits to the amount of memory and threads that a space can consume.
Procedure
1. In the side navigation area, click (Space Management), locate your space tile, and click Edit to open it.
2. Use the properties in the Workload Class section to prioritize between spaces for resource consumption and set limits to the amount of memory and threads that a space can consume.
Property Description
Space Priority Enter the prioritization of this space when querying the database. You can enter a value from 1 (lowest priority) to 8 (highest priority).
In situations where spaces are competing for available threads, those with higher priorities have their statements run before those of spaces with lower priorities.
Default: 5
Enable Statement Limits Enable this option to allow you to set maximum thread and memory limits that statements running concurrently in the space can consume.
Default: Disabled
Total Statement Thread Limit Enter the maximum number (or percentage) of threads that statements running concurrently in the space can consume. You can enter any value or percentage between 0 (no limit) and the total number of threads available in your tenant.
Setting this limit prevents the space from consuming all available threads, and can help with balancing resource consumption between competing spaces.
Default: 0 (no limit)
Total Statement Memory Limit Enter the maximum number (or percentage) of GBs of memory that statements running concurrently in the space can consume. You can enter any value or percentage between 0 (no limit) and the total amount of memory available in your tenant.
Setting this limit prevents the space from consuming all available memory, and can help with balancing resource consumption between competing spaces.
Default: 0 (no limit)
These options are based on the workload class features provided by SAP HANA. For more information, see Managing Workload with Workload Classes in the SAP HANA Cloud documentation.
3. Click Save to save your changes to the space, or Deploy to save and immediately make the changes available to space members.
3.4 Monitor Tenant and Space Storage
You can see the total storage available and the amount assigned to and used by spaces in the bars at the top of the Space Management page.
The following information is available:
● Used Disk - Shows the total amount of disk storage used. Hover over this bar to see a breakdown between:
○ Space Data: All data that is stored in spaces.
○ Audit Log Data: Data related to audit logs (see Audit Logging).
Note: Audit logs can grow quickly and consume a great deal of disk storage (see Delete Audit Logs [page 169]).
○ Other Data: Includes data stored in database user group schemas (see Creating Database User Groups [page 148]) and SAP HANA data (such as statistics schemas).
○ Administrative Data: Data used to administer the tenant and all spaces (such as space quota, space version). Includes all information stored in the central schemas (DWC_GLOBAL, DWC_GLOBAL_LOG, DWC_TENANT_OWNER).
● Assigned Disk - Shows the total amount of disk storage assigned to all spaces.
● Used In-Memory - Shows the total amount of in-memory storage used in all spaces.
● Assigned In-Memory - Shows the total amount of in-memory storage assigned to all spaces.
3.5 Unlock a Space That Has Exceeded Its Assigned Storage
If a space exceeds its allocations of in-memory or disk storage, it will be locked until a space member deletes the excess data or an administrator assigns additional storage.
Context
When a space is locked, members can continue to create and modify objects and save their changes in the repository, but they cannot deploy their changes to the run-time database.
In this situation, three actions are possible:
● Space members can delete data to bring the space back under the limit of its assigned storage.
● A space administrator can use the Unlock Space button on the space page to unlock the space for a 24-hour grace period, in case urgent changes must be deployed.
● An administrator can assign more disk and/or in-memory storage to the space (see Allocate Storage to a Space [page 65]).
3.6 Create, Read, Update, and Delete Spaces via the Command Line
You can use the SAP Data Warehouse Cloud command line interface, dwc, to create, read, update, and delete spaces, including setting space properties, assigning members, creating database users, creating entities (tables and views), and associating HDI containers to your space.
The following types of actions and commands are available with dwc:
● Read a Space [page 70]
● Create or Update a Space [page 72]
● Delete a Space [page 72]
● Miscellaneous Commands [page 72]
Prerequisites
To use dwc you must install it (see Install or Update the dwc Command Line Interface [page 73]) and have an SAP Data Warehouse Cloud user with the following roles or permissions:
Action: Create/Update Space (all properties)
Requires: Spaces (CRUD---M); Team (CRUD---M); Data Builder (CRUD----)
Contained in: DW Administrator

Action: Update Space (members, database users, HDI containers, entity definitions)
Requires: Spaces (-RUD---M); Team (-RUD---M); Data Builder (CRUD----)
Contained in: DW Space Administrator
Note: You must also be a member of the space.

Action: Update Space (entity definitions only)
Requires: Spaces (-R------); Team (-R------); Data Builder (CRUD----)
Contained in: DW Modeler
Note: You must also be a member of the space.
Read a Space
To read a space definition:
1. Enter the following command and press Return:
dwc spaces read -s <Space_ID> -H "<Server_URL>" [-d [<Entity1>,<Entity2>]] [-o <Filename>.json] [-p <Passcode>] [-V]
Complete the parameters as follows:
Parameter Description
-s <Space_ID> Enter the Space ID of the space you want to read.
-H "<Server_URL>" Enter the URL of your SAP Data Warehouse Cloud tenant. You can copy the URL of any page in your tenant.
Alternative: --host "<Server_URL>"
-d [<Entity1>,<Entity2>]
[optional] Read the entity (table and view) definitions contained in the space. Using the -d parameter by itself will read all the entities, or you can specify a comma-separated list of entity technical names.
Entity definitions are read using the standard CSN syntax (see Core Data Services Schema Notation (CSN)):
○ Tables and views can be read. Data flows and other non-CSN objects are not supported.
○ When a view is read, the definitions of all the entities in its lineage (from immediate sources through to the initial remote or local tables) are also included.
○ Remote tables can be read. They can be written to another space from a space definition file only if they were originally imported from a connection created in v2021.19 or later and if a connection with the same technical name is present in the new space.
Note: Using the -d option allows you to perform the same CSN export of entities as that which is available in the Data Builder and its editors (see Exporting Entities to a CSN File).
Alternative: --definitions [<Entity1>,<Entity2>]]
-o <Filename>.json [optional] Enter a path to a file with a .json extension.
If you do not enter a filepath, the space definition will be printed to the command line.
Alternative: --output <Filename>.json
-p <Passcode> [optional] Enter a passcode that you have obtained from your SAP Data Warehouse Cloud tenant.
If you do not enter a passcode, dwc will prompt you to obtain one:
1. Enter y and dwc will open the passcode page for your tenant.
2. If you are not already logged in, you must enter your username and password.
3. When you arrive at the passcode page, copy the temporary authentication code and paste it into the command line.
Note: You must enter a new passcode for each command that you issue with dwc.
Alternative: --passcode <Passcode>
-V [optional] Print detailed log information to the console.
Alternative: --verbose
2. If prompted, enter a passcode to authorize the command.
3. The space definition is written to the console or to the specified file.
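When scripting around the CLI, it can help to assemble the read command programmatically before passing it to a process runner. The sketch below builds the argument list for a dwc spaces read call from the flags documented above; the space ID, server URL, and file name used in the example are placeholders.

```python
# Build the argv list for `dwc spaces read` from the documented flags,
# e.g. for use with subprocess.run(). All values below are placeholders.
def build_read_command(space_id, server_url, entities=None,
                       output=None, verbose=False):
    cmd = ["dwc", "spaces", "read", "-s", space_id, "-H", server_url]
    if entities is not None:
        cmd.append("-d")          # -d alone reads all entity definitions
        if entities:              # a non-empty list restricts to named entities
            cmd.append(",".join(entities))
    if output is not None:
        cmd += ["-o", output]     # path must end in .json
    if verbose:
        cmd.append("-V")
    return cmd

print(build_read_command("MYSPACE", "https://mytenant.example.com",
                         entities=["Products"], output="myspace.json"))
```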
Create or Update a Space
To create or update a space:
1. Prepare a space definition file (see The Space Definition File Format [page 74]).
Note: You need only complete the parameters that you want to set. All other space properties are either set to default values or keep their current values. If your file contains valid CSN entity definitions (see Entity (Table and View) Definitions [page 82]), then these entities will be created in the space.
2. Enter the following command and press Return:
dwc spaces create -H "<Server_URL>" -f <Filename>.json [-p <Passcode>] [-V]
3. If prompted, enter a passcode to authorize the command.
4. The space is created or updated as you have specified.
Note: If any parameters are set incorrectly, the creation or update is canceled and an error message is written to the console.
Delete a Space
To delete a space:
1. Enter the following command and press Return:
dwc spaces delete -s <Space_ID> -H "<Server_URL>" [-p <Passcode>] [-F] [-V]
2. If prompted, enter a passcode to authorize the command.
3. When prompted, confirm that you want to delete the space.
Note: You can use the -F (or --force) option to suppress this prompt and delete the space directly, without confirmation.
4. The space is deleted and a confirmation message is written to the console.
Miscellaneous Commands
The following additional commands are also available:
Command Description
dwc cache-init -H "<Server_URL>" [-p <Passcode>]
Download the file of available dwc commands from the SAP Data Warehouse Cloud server.
dwc cache-clean Delete the local file of available dwc commands.
dwc passcode-url -H "<Server_URL>" Display the passcode URL for the SAP Data Warehouse Cloud server.
dwc -v
or
dwc --version
Display the version of dwc.
dwc <command> -h
or
dwc <command> --help
Display help for the specified dwc command.
3.6.1 Install or Update the dwc Command Line Interface
The SAP Data Warehouse Cloud command line interface (dwc) is a Node.js package that you download using the Node Package Manager (npm).
Context
dwc is listed on https://www.npmjs.com/ at https://www.npmjs.com/package/@sap/dwc-cli .
Prerequisites
You have installed the following on your system:
● Node.js version >= 12.21.0
● npm version >= 6
npm is distributed with Node.js. Therefore, when you download Node.js, npm is automatically installed. To download the Node.js installer, see nodejs.org .
Note: You can test if Node.js and npm are installed on your system by executing the following commands:
● node -v
● npm -v
If Node.js and npm are already installed, then their current versions will appear. If you receive an error, you have not installed them yet.
Procedure
1. Run the following command:
npm install -g @sap/dwc-cli
Note: To update dwc to the latest version at any time, you just need to run npm install -g @sap/dwc-cli again.
2. Test if the installation was successful by running the following command:
dwc -v
3. Run the following command to download the file of available dwc commands:
dwc cache-init -H "<Server_URL>"
Where <Server_URL> is the URL of your SAP Data Warehouse Cloud tenant. You can copy the URL of any page in your tenant.
Note: If new commands become available for dwc, you will be prompted to run this command again.
4. When prompted, enter the passcode to authorize the command.
3.6.2 The Space Definition File Format
Space properties are set and retrieved in the space definition file format and stored as a .json file.
A space definition file must not exceed 25MB, and can contain the following space information:
● Space Properties [page 75]
● Members [page 78]
● Database Users [page 79]
● HDI Containers [page 81]
● Entity (Table and View) Definitions [page 82]
Space Properties
You can set space properties using the following syntax:
{
  "<SPACE_ID>": {
    "spaceDefinition": {
      "version": "1.0.4",
      "label": "<Space_Name>",
      "assignedStorage": <bytes>,
      "assignedRam": <bytes>,
      "priority": <value>,
      "injection": {
        "dppRead": {
          "retentionPeriod": <days>,
          "isAuditPolicyActive": true|false
        },
        "dppChange": {
          "retentionPeriod": <days>,
          "isAuditPolicyActive": true|false
        }
      },
      "allowConsumption": true|false,
      "enableDataLake": true|false,
      "members": [],
      "dbusers": {},
      "hdicontainers": {},
      "workloadClass": {
        "totalStatementMemoryLimit": {
          "value": 0,
          "unit": "Gigabyte|Percent"
        },
        "totalStatementThreadLimit": {
          "value": 0,
          "unit": "Counter|Percent"
        }
      }
    }
  }
}
Parameters are set as follows:
Parameter Space Property Description
<SPACE_ID> Space ID [required] Enter the technical name of the space. Can contain a maximum of 20 uppercase letters or numbers and must not contain spaces or special characters other than _ (underscore). Must not contain the prefix _SYS and should not contain the prefixes DWC_ or SAP_, unless advised to do so (see Rules for Technical Names [page 10]).
version - [required] Enter the version of the space definition file format. This must always be set to 1.0.4.
label Space Name Enter the business name of the space. Can contain a maximum of 30 characters, and can contain spaces and special characters.
Default value: <Space_ID>
assignedStorage Disk (GB) Enter the amount of storage allocated to the space in bytes. You can enter any value between 100000000 bytes (100MB) and the total storage size available in the tenant.
Default value: 2000000000 bytes (2GB)
Note: To set no size limit for the space (and disable the Enable Space Quota option), enter 0 for both this parameter and assignedRam.
assignedRam In-Memory (GB) Enter the amount of RAM allocated to the space in bytes. You can enter any value between 100000000 bytes (100MB) and the total storage size available in the tenant.
Default value: 1000000000 bytes (1GB)
priority Space Priority Enter the prioritization of this space when querying the database. You can enter a value from 1 (lowest priority) to 8 (highest priority).
Default value: 5
dppRead.isAuditPolicyActive
dppRead.retentionPeriod
dppChange.isAuditPolicyActive
dppChange.retentionPeriod
Enable Audit Log for Read Operations
Keep Logs for <n> Days
Enable Audit Log for Change Operations
Keep Logs for <n> Days
Enter the audit logging policy for read and change operations and the number of days that the logs are retained. You can retain logs for any period between 7 and 10000 days.
Default values: false, 30, false, 30
allowConsumption
Expose for Consumption by Default
Choose the default setting for the Expose for Consumption property for views created in this space.
Default value: false
enableDataLake Use This Space to Access the Data Lake
Enable access to the SAP HANA Cloud data lake. Enabling this option is only possible if no other space already has access to the data lake.
Default value: false
members Member Assignment See Members [page 78].
dbusers Database Users See Database Users [page 79].
hdicontainers HDI Containers See HDI Containers [page 81].
workloadClass.totalStatementMemoryLimit.value
workloadClass.totalStatementMemoryLimit.unit
Total Statement Memory Limit
GB/%
Enter the maximum number (or percentage) of GBs of memory that statements running concurrently in the space can consume. You can enter any value or percentage between 0 (no limit) and the total amount of memory available in your tenant.
Default values: 0, Gigabyte
workloadClass.totalStatementThreadLimit.value
workloadClass.totalStatementThreadLimit.unit
Total Statement Thread Limit
Threads/%
Enter the maximum number (or percentage) of threads that statements running concurrently in the space can consume. You can enter any value or percentage between 0 (no limit) and the total number of threads available in your tenant.
Default values: 0, Counter
For example, the following file will create a new space, with all default properties:
{
  "NEWSPACE": {
    "spaceDefinition": {
      "version": "1.0.4"
    }
  }
}
Note: If a property is not set it will receive the default value (on creation) or will keep its current value (on update).
This second file will update NEWSPACE by modifying the Space Name and increasing the Disk (GB) and In-Memory (GB) allocations:
{
  "NEWSPACE": {
    "spaceDefinition": {
      "version": "1.0.4",
      "label": "My New Space",
      "assignedStorage": 6000000000,
      "assignedRam": 5000000000
    }
  }
}
This third file will update the Space Priority, and will leave the other parameters as previously set:
{
  "NEWSPACE": {
    "spaceDefinition": {
      "version": "1.0.4",
      "priority": 4
    }
  }
}
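The assignedStorage and assignedRam values in the update example above are decimal byte counts (the 2 GB disk default corresponds to 2000000000 bytes). A small helper for deriving them, assuming that decimal convention:

```python
# Convert a GB figure from the Space Management UI into the decimal
# byte count used by assignedStorage and assignedRam (2 GB = 2000000000).
def gb_to_bytes(gb):
    return int(gb * 1_000_000_000)

print(gb_to_bytes(6))  # 6000000000, a Disk (GB) value of 6
print(gb_to_bytes(5))  # 5000000000, an In-Memory (GB) value of 5
```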
Note: The following properties are not supported when creating, reading, or updating spaces using dwc:
● Connections
● Time Data
● Space Status and other run-time properties
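The naming rules for the Space ID and the fixed version value can be checked before a definition file is submitted. The following sketch is illustrative only and does not reproduce the CLI's own validation:

```python
import re

# Space ID rules (see Rules for Technical Names): at most 20 characters,
# uppercase letters, digits, or underscores; the _SYS prefix is not
# allowed, and DWC_/SAP_ prefixes are discouraged.
SPACE_ID = re.compile(r"^[A-Z0-9_]{1,20}$")

def check_space_file(doc):
    """Return a list of problems found in a parsed space definition file."""
    problems = []
    for space_id, body in doc.items():
        if not SPACE_ID.match(space_id):
            problems.append("%s: invalid Space ID" % space_id)
        elif space_id.startswith("_SYS"):
            problems.append("%s: must not use the _SYS prefix" % space_id)
        if body.get("spaceDefinition", {}).get("version") != "1.0.4":
            problems.append("%s: version must be '1.0.4'" % space_id)
    return problems

print(check_space_file({"NEWSPACE": {"spaceDefinition": {"version": "1.0.4"}}}))  # []
```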
Members
You can add members to a space using the following syntax:
{
  ...
  "members": [
    {
      "name": "<User_ID>",
      "type": "user"
    }
  ]
}
Parameters are set as follows:
Parameter Space Property Description
<name> Member [required] Enter a user ID recognized by your identity provider.
Note: All members added to your space must already be registered as users of SAP Data Warehouse Cloud.
type - [required] Enter the type of the member. This must always be set to user.
For example, the following file will add three members to NEWSPACE:
{
  "NEWSPACE": {
    "spaceDefinition": {
      "version": "1.0.4",
      "members": [
        { "name": "[email protected]", "type": "user" },
        { "name": "[email protected]", "type": "user" },
        { "name": "[email protected]", "type": "user" }
      ]
    }
  }
}
When updating space members via dwc, you must always list all members that you want to have assigned to the space. This second file will add two new members and remove [email protected]:
{
  "NEWSPACE": {
    "spaceDefinition": {
      "version": "1.0.4",
      "members": [
        { "name": "[email protected]", "type": "user" },
        { "name": "[email protected]", "type": "user" },
        { "name": "[email protected]", "type": "user" },
        { "name": "[email protected]", "type": "user" }
      ]
    }
  }
}
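Because dwc replaces the member list wholesale, a scripted update typically reads the current members (for example from dwc spaces read output), applies additions and removals, and writes the complete list back. A sketch of that merge; the addresses are placeholders, not real users:

```python
# Build the complete "members" array for a space update. The list sent
# to dwc replaces the existing one wholesale, so additions and removals
# must be merged into the currently assigned set first.
def merged_members(current, add=(), remove=()):
    dropped = set(remove)
    names = [m["name"] for m in current if m["name"] not in dropped]
    names += [n for n in add if n not in names]
    return [{"name": n, "type": "user"} for n in names]

current = [{"name": "alice@example.com", "type": "user"},
           {"name": "bob@example.com", "type": "user"}]
updated = merged_members(current,
                         add=["carol@example.com"],
                         remove=["bob@example.com"])
print(updated)  # alice and carol remain; bob is removed
```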
Database Users
You can add database users to a space using the following syntax:
{
  ...
  "dbusers": {
    "<Space_ID>#<DB_UserName>": {
      "ingestion": {
        "auditing": {
          "dppRead": {
            "retentionPeriod": <days>,
            "isAuditPolicyActive": false
          },
          "dppChange": {
            "retentionPeriod": <days>,
            "isAuditPolicyActive": false
          }
        }
      },
      "consumption": {
        "consumptionWithGrant": false,
        "spaceSchemaAccess": false,
        "scriptServerAccess": false,
        "localSchemaAccess": false,
        "hdiGrantorForCupsAccess": false
      }
    }
  }
}
Parameters are set as follows:
Parameter Space Property Description
<SPACE_ID> Space ID [required] Must be the same as the <Space_ID> used at the root of the space definition file.
<DB_UserName> Database User Name Suffix
[required] Enter the name of the database user. Can contain a maximum of 20 uppercase letters or numbers and must not contain spaces or special characters other than _ (underscore).
ingestion.auditing.dppRead.isAuditPolicyActive
ingestion.auditing.dppRead.retentionPeriod
ingestion.auditing.dppChange.isAuditPolicyActive
ingestion.auditing.dppChange.retentionPeriod
Enable Audit Log for Read Operations
Keep Logs for <n> Days
Enable Audit Log for Change Operations
Keep Logs for <n> Days
Enter the audit logging policy for read and change operations and the number of days that the logs are retained. You can retain logs for any period between 7 and 10000 days.
Default values: false, 30, false, 30
consumption.consumptionWithGrant
With Grant Option Allow the database user to grant read access to the space schema to other users.
Default value: false
consumption.spaceSchemaAccess
Enable Read Access (SQL)
Grant the database user read access to the space schema.
Default value: false
consumption.scriptServerAccess
Enable Automated Predictive Library (APL) and Predictive Analysis Library (PAL)
Grant the database user access to the SAP HANA Cloud machine learning libraries.
Default value: false
consumption.localSchemaAccess
Enable Write Access (SQL, DDL, & DML)
Grant the database user write access to the OpenSQL schema.
Default value: false
consumption.hdiGrantorForCupsAccess
Enable HDI Consumption
Grant the database user read access to HDI containers associated with the space.
Default value: false
For example, the following file will add a database user to NEWSPACE:
{
  "NEWSPACE": {
    "spaceDefinition": {
      "version": "1.0.4",
      "dbusers": {
        "NEWSPACE#JJONES": {
          "ingestion": {
            "auditing": {
              "dppRead": {
                "retentionPeriod": 21,
                "isAuditPolicyActive": true
              }
            }
          },
          "consumption": {
            "consumptionWithGrant": true,
            "spaceSchemaAccess": true,
            "scriptServerAccess": true,
            "localSchemaAccess": true,
            "hdiGrantorForCupsAccess": true
          }
        }
      }
    }
  }
}
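The dbusers key pairs the Space ID with the user name suffix, separated by #. A helper that builds the key and applies the suffix character rules; this is an illustrative sketch, not the CLI's own validation:

```python
import re

# Database user name suffix: at most 20 uppercase letters, digits, or
# underscores (no spaces or other special characters).
SUFFIX = re.compile(r"^[A-Z0-9_]{1,20}$")

def dbuser_key(space_id, suffix):
    """Build the '<Space_ID>#<DB_UserName>' key for the dbusers section."""
    if not SUFFIX.match(suffix):
        raise ValueError("invalid database user name suffix: %r" % suffix)
    return "%s#%s" % (space_id, suffix)

print(dbuser_key("NEWSPACE", "JJONES"))  # NEWSPACE#JJONES
```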
HDI Containers
You can associate HDI containers to a space using the following syntax:
{
  ...
  "hdicontainers": {
    "<Container_Name>": {}
  }
}
Parameters are set as follows:
Parameter Space Property Description
<Container_Name> HDI Container Name [required] Enter the name of an HDI container that is associated with your SAP Data Warehouse Cloud instance and which is not assigned to any other space.
For example, the following file will associate two HDI containers to NEWSPACE:

{
  "NEWSPACE": {
    "spaceDefinition": {
      "version": "1.0.4",
      "hdicontainers": {
        "MyHDIContainer": {},
        "MyOtherContainer": {}
      }
    }
  }
}
Entity (Table and View) Definitions
You can add entities to a space using the standard CSN syntax (see Core Data Services Schema Notation (CSN) ).
For example, the following file will create a table with two columns in NEWSPACE:
{
    "NEWSPACE": {
        "spaceDefinition": {
            "version": "1.0.4"
        },
        "definitions": {
            "Products": {
                "kind": "entity",
                "elements": {
                    "Product ID": {
                        "type": "cds.Integer64",
                        "key": true,
                        "notNull": true
                    },
                    "Product Name": {
                        "type": "cds.String",
                        "length": 5000
                    }
                }
            }
        }
    }
}
Note: To obtain more complex examples, read existing entities from a space into a file using the -d option (see Read a Space [page 70]).
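A definitions block like the one above can also be sanity-checked before import. This Python sketch is illustrative (the checks are not part of the official CSN tooling); it verifies that every entity element declares a CDS type and collects the key columns:

```python
import json

# Hypothetical file content: the entity definition from the example above.
csn_file = """
{ "NEWSPACE": { "spaceDefinition": { "version": "1.0.4" },
  "definitions": { "Products": { "kind": "entity", "elements": {
    "Product ID": { "type": "cds.Integer64", "key": true, "notNull": true },
    "Product Name": { "type": "cds.String", "length": 5000 } } } } } }
"""

space = json.loads(csn_file)["NEWSPACE"]
key_columns = {}
for name, entity in space.get("definitions", {}).items():
    if entity.get("kind") != "entity":
        continue
    elements = entity["elements"]
    # Every element should declare a CDS type; at least one key column is expected.
    assert all("type" in e for e in elements.values()), f"{name}: element without type"
    key_columns[name] = [col for col, e in elements.items() if e.get("key")]
```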
4 Preparing Connectivity for Connections
You need to perform some preparatory steps to be able to create and use connections in SAP Data Warehouse Cloud. The steps depend on the source you want to connect to and on the features you want to use with the connection.
The following overview lists the most common prerequisites per connection type and points to further information about what needs to be prepared to connect and use a connection.
Connection Type
Remote Tables: Data Provisioning Agent Required?
Remote Tables: Installation of JDBC Library Required for Connecting to Sources from Third-Party Vendors via Data Provisioning Agent?
Data Flows: Cloud Connector Required for On-Premise Sources?
Data Flows: Third-Party Driver Upload Required?
SAP Data Warehouse Cloud IP Required in Source Allowlist?
Server Certificate Upload Required?
Source IP Required in SAP Data Warehouse Cloud IP Allowlist?
Additional Information and Prerequisites
Adverity no no no no no no yes Prepare Connectivity to Adverity [page 105]
Amazon Athena
no no no no no yes no Prepare Connectivity to Amazon Athena [page 106]
Amazon Redshift
yes yes no yes yes (Outbound IP Address)
no no Prepare Connectivity to Amazon Redshift [page 106]
Amazon Simple Storage Service
no no no no no no no n/a
Cloud Data Integration
yes no yes no no no no Prepare Connectivity for Cloud Data Integration [page 107]
Generic JDBC
yes yes no no no no no Prepare Connectivity for Generic JDBC [page 108]
Generic OData
no no yes no no yes no Prepare Connectivity for Generic OData [page 108]
Generic SFTP
no no no no no no no Prepare Connectivity for Generic SFTP [page 109]
Google BigQuery
no no no no no yes no Prepare Connectivity to Google BigQuery [page 110]
Google Cloud Storage
no no no no no no no n/a
Hadoop Distributed File System
no no no no no no no n/a
Microsoft Azure Blob Storage
no no no no no no no n/a
Microsoft Azure Data Lake Store Gen1
no no no no no no no n/a
Microsoft Azure Data Lake Store Gen2
no no no no no no no n/a
Microsoft Azure SQL Database
yes yes no no yes (Outbound IP Address)
no no Prepare Connectivity to Microsoft Azure SQL Database [page 110]
Microsoft SQL Server
yes yes yes no (pre-bundled; no upload required)
no no no Prepare Connectivity to Microsoft SQL Server [page 111]
Open Connectors
no no no no no no no Prepare Connectivity to SAP Open Connectors [page 112]
Oracle yes yes no yes no no no Prepare Connectivity to Oracle [page 113]
Precog no no no no no no yes Prepare Connectivity to Precog [page 114]
SAP ABAP yes no yes no no no no Prepare Connectivity to SAP ABAP Systems [page 114]
SAP BW yes no yes no no no no Prepare Connectivity to SAP BW [page 116]
SAP BW/4HANA Model Transfer
yes (to connect to the SAP HANA database of SAP BW/4HANA)
no yes (to make http requests to SAP BW/4HANA)
no no no no Preparing SAP BW/4HANA Model Transfer Connectivity [page 117]
SAP ECC yes no yes no no no no Prepare Connectivity to SAP ECC [page 120]
SAP Fieldglass
yes no yes no no no no Prepare Connectivity to SAP Fieldglass [page 121]
SAP HANA yes (for on-premise)
no yes (for on-premise when using Cloud Connector for remote tables or for data flows feature)
no no yes (for cloud)
Cloud Connector IP (for on-premise when using Cloud Connector for remote tables feature)
Prepare Connectivity to SAP HANA [page 121]
SAP HANA Cloud, Data Lake
no no no no no no no n/a
SAP Marketing Cloud
yes no yes no no no no Prepare Connectivity to SAP Marketing Cloud [page 122]
SAP SuccessFactors for Analytical Dashboards
no no no no yes (HANA IP Address)
yes no Prepare Connectivity to SAP SuccessFactors for Analytical Dashboards [page 123]
SAP S/4HANA Cloud
yes no no no no no no Prepare Connectivity to SAP S/4HANA Cloud [page 124]
SAP S/4HANA On-Premise
yes no yes no no no no Prepare Connectivity to SAP S/4HANA On-Premise [page 124]
Note: For information about supported versions of sources that are connected via SAP HANA SDI and its Data Provisioning Agent, see the SAP HANA smart data integration and all its patches Product Availability Matrix (PAM) for SAP HANA SDI 2.0 .
For information about necessary JDBC libraries for connecting to sources from third-party vendors, see:
● SAP HANA smart data integration and all its patches Product Availability Matrix (PAM) for SAP HANA SDI 2.0
● Register Adapters with SAP Data Warehouse Cloud [page 93]
4.1 Preparing Data Provisioning Agent Connectivity
Most connection types that support creating views and accessing or replicating data via remote tables leverage SAP HANA Smart Data Integration (SDI) and its Data Provisioning Agent for this purpose. Before using a connection, the agent requires an appropriate setup.
Context
The Data Provisioning Agent acts as a gateway to SAP Data Warehouse Cloud.
For an overview of connection types that require a Data Provisioning Agent setup, see Preparing Connectivity for Connections [page 83].
Procedure
To prepare connectivity via Data Provisioning Agent, perform the following steps:

1. Install the latest Data Provisioning Agent version on a host in your local network.
   For more information, see Install the Data Provisioning Agent [page 89].
2. Add the external IP address of the server on which your SAP Data Provisioning Agent is running to the IP allowlist in SAP Data Warehouse Cloud.
   Note: For security reasons, all external connections to your SAP Data Warehouse Cloud instance are blocked by default. By adding external IPv4 addresses or address ranges to the allowlist, you can manage external client connections.
   For more information, see Add IP address to IP Allowlist [page 100].
3. Connect the Data Provisioning Agent to SAP Data Warehouse Cloud.
   This includes configuring the agent and setting the user credentials in the agent.
   For more information, see Connect and Configure the Data Provisioning Agent [page 90].
4. Register the adapters with SAP Data Warehouse Cloud.
   Note: For third-party adapters, you need to download and install any necessary JDBC libraries before registering the adapters.
   For more information, see Register Adapters with SAP Data Warehouse Cloud [page 93].
4.1.1 Install the Data Provisioning Agent
You download the latest Data Provisioning Agent 2.0 version from SAP Software Download Center and install it as a standalone installation on a Windows or Linux machine. If you have already installed an agent, check if you need to update to the latest version. If you have more than one agent that you want to connect to SAP Data Warehouse Cloud, make sure to have the same latest version for all agents.
Procedure
1. Plan and prepare the Data Provisioning Agent installation.
   a. You can install the agent on any host system that has access to the sources you want to access, meets the minimum system requirements, and has any middleware required for source access installed. The agent should be installed on a host that you have full control over, so that you can view logs and restart it, if necessary.
   For more information on where you can install the agent, see Supported Platforms and System Requirements.
   b. Download the latest Data Provisioning Agent HANA DP AGENT 2.0 from the SAP Software Download Center .
   Note:
   ○ Make sure that all agents that you want to connect to SAP Data Warehouse Cloud have the same latest version.
   ○ Select your operating system before downloading the agent.
   For more information, see Software Download.
2. Install the Data Provisioning Agent on a host in your local network.
   For more information, see Install from the Command Line.
   Note: If you have upgraded your Data Provisioning Agent to version 2.5.1 and want to create an Amazon Redshift connection, apply SAP note 2985825 .
Related Information
The following links point you to the latest SAP HANA Smart Data Integration and SAP HANA Smart Data Quality documentation for version 2.0:
● Install the Data Provisioning Agent
● Planning and Preparation
● Supported Platforms and System Requirements
● Software Download
● Install from the Command Line
● Update the Data Provisioning Agent
4.1.2 Connect and Configure the Data Provisioning Agent
Connect the Data Provisioning Agent to SAP Data Warehouse Cloud. This includes configuring the agent and setting the user credentials in the agent.
Procedure
1. In SAP Data Warehouse Cloud, register the Data Provisioning Agent.
a. In the side navigation area, click (System) (Configuration) Data Integration .
In the On-Premise Agents section, each registered agent is displayed with a tile which shows important information, for example if the agent is connected or not, which version it has, and which adapters have been registered.
b. To create a new agent, add a new tile.
c. In the following dialog, enter a unique name for your new agent registration.
Note: The registration name cannot be changed later.
d. Select Create.
In the Agent Settings dialog, you get SAP HANA server information from SAP Data Warehouse Cloud. You will need this information when configuring the Data Provisioning Agent on your local host. Furthermore, the dialog displays a user name and a password for the SAP HANA XS user that is used for agent messaging. You will need this information when setting the SAP HANA XS user credentials in the Secure Storage Utility of the Data Provisioning Agent.
Note: Either keep the Agent Settings dialog open, or note down the information provided for the next step.

Note: Do not select any adapters yet. Before selecting adapters for registering them with SAP Data Warehouse Cloud, the Data Provisioning Agent needs to be connected.
2. Create a connection between the agent and SAP HANA. Perform the following steps:
   a. On the Data Provisioning Agent host, from the command line, go to the <DPAgent_root> directory, where the agent was installed. By default, on Windows, this is C:\usr\sap\dataprovagent, and on Linux it is /usr/sap/dataprovagent.
   b. Configure dpagentconfig.ini:
   In the <DPAgent_root> directory, open dpagentconfig.ini, then check and edit the following lines. Use the values provided by the SAP Data Warehouse Cloud Agent Settings dialog from the previous step:
Value Agent Setting in SAP Data Warehouse Cloud
agent.name=<Agent Name> (case sensitive) Agent Name
hana.port=<HANA Port> HANA Port
hana.onCloud=false n/a
hana.useSSL=true HANA Use SSL
hana.server=<HANA Server> HANA Server
jdbc.enabled=true HANA via JDBC
jdbc.host=<HANA Server> HANA Server
jdbc.port=<HANA Port> HANA Port
jdbc.encrypt=true n/a
Note: <HANA Server> is the server IP address or hostname. <HANA Port> is the HTTP port.
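Putting the values above together, a completed dpagentconfig.ini might contain lines like the following. This is a sketch only: the agent name is a placeholder, and <HANA Server> and <HANA Port> stand for the values shown in your Agent Settings dialog.

```
agent.name=MyAgent
hana.onCloud=false
hana.useSSL=true
hana.server=<HANA Server>
hana.port=<HANA Port>
jdbc.enabled=true
jdbc.host=<HANA Server>
jdbc.port=<HANA Port>
jdbc.encrypt=true
```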
c. If you need a proxy to access the internet due to a corporate firewall, you must also set the following lines:
cloud.useProxy=true proxyHost=<proxy_hostname> proxyPort=<proxy_port>
d. Save the changes made to dpagentconfig.ini.
e. Start the Data Provisioning Agent.
Open an Administrator prompt on the Windows or Linux host where the Data Provisioning Agent is installed, and run the following command:
On Linux:
<DPAgent_root>/bin/agentcli.sh --configAgent
On Windows:
<DPAgent_root>/bin/agentcli.bat --configAgent
If the agent has not been started already, start it: select option 2. Start or Stop Agent, and then option 1. Start Agent.
Wait a few seconds. Use option 3. Ping Agent to check if the agent is running.
To exit the script, select q. Quit.
This pattern of stopping and then starting the agent is required whenever you make a configuration change to the agent.
f. Set the credentials for the HANA XS user.
In the Administrator prompt, run the following command:
On Linux:
<DPAgent_root>/bin/agentcli.sh --setSecureProperty
On Windows:
<DPAgent_root>/bin/agentcli.bat --setSecureProperty
Select option 1. HANA XS Username.
Enter the username presented in the SAP Data Warehouse Cloud Agent Settings dialog from step 1.
Select option 2. HANA XS Password.
Enter the password presented in the Agent Settings dialog from step 1.
Note: If you have closed and reopened the Agent Settings dialog, and it doesn't show a password, select Request a new password.
The new password is saved to SAP Data Warehouse Cloud and is ready to be used.
Select q. Quit to exit the script.
g. Stop and restart the Data Provisioning Agent.
On Linux:
<DPAgent_root>/bin/agentcli.sh --configAgent
On Windows:
<DPAgent_root>/bin/agentcli.bat --configAgent
To stop the agent, select option 2. Start or Stop Agent, and then option 2. Stop Agent.
Select option 1. Start Agent to restart the agent.
Select option 1. Agent Status to check the connection status.
If the connection succeeded, you should see Agent connected to HANA: Yes.
Select q. Quit to exit the script.
h. If you have kept the Agent Settings dialog in SAP Data Warehouse Cloud open, you can now close it.
The Data Provisioning Agent is now connected.
If the tile of the registered Data Provisioning Agents doesn’t display the updated connection status, select Refresh Agents.
4.1.3 Register Adapters with SAP Data Warehouse Cloud
After configuring the Data Provisioning Agent, in SAP Data Warehouse Cloud, register the Data Provisioning adapters that are needed to connect to on-premise sources.
Prerequisites
For third-party adapters, ensure that you have downloaded and installed any necessary JDBC libraries. Place the files in the <DPAgent_root>/lib folder before registering the adapters with SAP Data Warehouse Cloud. For the connection types Amazon Redshift and Generic JDBC, place the file in the <DPAgent_root>/camel/lib folder. For information about the proper JDBC library for your source, see the SAP HANA smart data integration Product Availability Matrix (PAM). Search for the library on the internet and download it from an appropriate website.
Procedure
1. In the side navigation area, click (System) (Configuration) Data Integration .
2. In the On-Premise Agents section, click the Adapters button to display the agents with their adapter information.
3. Click (menu) and then Edit.
4. In the Agent Settings dialog, under Agent Adapters, select the adapters.
5. Close the dialog.
The selected adapters are registered with SAP Data Warehouse Cloud and available for creating connections to the supported on-premise sources.
Next Steps
To use new functionality of an already registered adapter or to update the adapter in case of issues that have been fixed in a new agent version, you can refresh the adapter by clicking the (menu) button and then choosing Refresh.
Related Information
SAP HANA smart data integration and all its patches Product Availability Matrix (PAM) for SAP HANA SDI 2.0
4.1.4 Prerequisites for ABAP RFC Streaming
If you want to stream ABAP tables to load large amounts of data without running into memory issues, the following requirements must be met.
● You need to create an RFC destination in the ABAP source system. With the RFC destination, you register the Data Provisioning Agent as a server program in the source system.
  Using transaction SM59, create a TCP/IP connection with a user-defined name. The connection should be created with “Registered Server Program” as “Activation Type”. Specify “IM_HANA_ABAPADAPTER_*” as a filter for the “Program ID” field, or leave it empty.
● Successful registration on an SAP Gateway requires that suitable security privileges are configured. For example:
  ○ Set up an Access Control List (ACL) that controls which hosts can connect to the gateway. That file should contain something similar to the following syntax: <permit> <ip-address[/mask]> [tracelevel] [# comment]. <ip-address> here is the IP address of the server on which the Data Provisioning Agent has been installed.
    For more information, see the Gateway documentation in the SAP help for your source system version, for example Configuring Network-Based Access Control Lists (ACL) in the SAP NetWeaver 7.5 documentation.
  ○ You may also want to configure a reginfo file to control permissions to register external programs.
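Following the permit syntax above, a minimal gateway ACL entry might look like this. The IP address is a placeholder for the host on which your Data Provisioning Agent runs:

```
# <permit> <ip-address[/mask]> [tracelevel] [# comment]
permit 10.20.30.40 # host running the Data Provisioning Agent
```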
4.2 Preparing Cloud Connector Connectivity
Certain prerequisites might need to be fulfilled before you can create a connection that can be used for building data flows.
For connection types that support data flows and that connect to an on-premise source or to an SAP cloud application that uses Cloud Data Integration based connectivity, you need to install and configure SAP Cloud Platform Cloud Connector. Connection types that need a Cloud Connector setup to enable data flows are, for example, Cloud Data Integration, Generic OData, SAP ABAP, SAP Fieldglass, SAP HANA, SAP Marketing Cloud, and SAP S/4HANA On-Premise. For more information, see Configure Cloud Connector [page 95].
4.2.1 Configure Cloud Connector
Set up and configure Cloud Connector before creating a connection to an on-premise source that you want to use for data flows or model import, or for an SAP HANA on-premise source that you want to use for remote tables via SAP HANA Smart Data Integration.
Prerequisites
Before configuring the Cloud Connector, the following prerequisites must be fulfilled:
● The Cloud Connector is installed in your on-premise network.
  For more information, see Cloud Connector Installation in the SAP BTP Connectivity documentation.
● Before configuring the Cloud Connector, you or the owner of your organisation will need an SAP Business Technology Platform (SAP BTP) account. If you don't have an account yet, create an account by clicking Register in the SAP BTP cockpit.
● During Cloud Connector configuration you will need information for your SAP Data Warehouse Cloud subaccount. Make sure that you have the subaccount information available in System Administration Data Source Configuration SAP Cloud Platform (SAP CP) Account .
  For more information, see Set Up Cloud Connector in SAP Data Warehouse Cloud [page 99].
Context
The Cloud Connector serves as a link between SAP Data Warehouse Cloud and your on-premise sources. It is required for connections to on-premise sources that you want to use for:
● Data flows
● Model import (Cloud Connector is required for the live data connection of type tunnel that you need to create the model import connection)
● Remote tables (only for SAP HANA on-premise via SAP HANA Smart Data Integration)
In the Cloud Connector administration, you need to connect the SAP Data Warehouse Cloud subaccount to your Cloud Connector, add a mapping to each relevant source system in your network, and specify accessible resources for each source system.
Procedure
1. Log on to the Cloud Connector Administration on https://<hostname>:8443.
<hostname> refers to the machine on which the Cloud Connector is installed. If installed on your machine, you can simply enter localhost.
2. To connect the SAP Data Warehouse Cloud subaccount to your Cloud Connector, perform the following steps:
   a. In the side navigation area of the Cloud Connector Administration, click Connector to open the Connector page and click Add Subaccount to open the Add Subaccount dialog.
   b. Enter or select the following information to add the SAP Data Warehouse Cloud subaccount to the Cloud Connector.
Note: You can find the subaccount, region, and subaccount user information in SAP Data Warehouse Cloud under System Administration Data Source Configuration SAP Cloud Platform (SAP CP) Account Account Information .
Property Description
Region Select your region host from the list.
Subaccount Add your SAP Data Warehouse Cloud subaccount name.
Display Name [optional] Add a name for the account.
Subaccount User Add your subaccount (S-User) username.
Password Add your S-User password for the SAP Business Technology Platform.
Location ID [optional] Define a location ID that identifies the location of this Cloud Connector for the subaccount.
Note:
○ Using location IDs, you can connect multiple Cloud Connector instances to your subaccount. If you don't specify any value, the default is used. For more information, see Managing Subaccounts in the SAP BTP Connectivity documentation.
○ Each Cloud Connector instance must use a different location, and an error will appear if you choose a location that is already in use.
○ We recommend that you leave the Location ID empty if you don't plan to set up multiple Cloud Connectors in your system landscape.
Description (Optional) Add a description for the Cloud Connector.
c. Click Save.
In the Subaccount Dashboard section of the Connector page, you can see all subaccounts added to the Cloud Connector at a glance. After you have added your subaccount, you can check the status to verify that the Cloud Connector is connected to the subaccount.
3. To allow SAP Data Warehouse Cloud to access systems (on-premise) in your network, you must specify the systems and the accessible resources in the Cloud Connector (URL paths or function module names depending on the used protocol). Perform the following steps for each system that you want to be made available by the Cloud Connector:
a. In the side navigation area, under your subaccount menu, click Cloud To On-Premise and then (Add) in the Mapping Virtual To Internal System section of the Access Control tab to open the Add System Mapping dialog.
Note: The side navigation area shows the display name of your subaccount. If the area shows another subaccount, select your subaccount from the Subaccount field of the Cloud Connector Administration.
b. Add your system mapping information to configure access control and save your configuration.
The procedure to add your system mapping information is specific to the protocol that you are using for communication. For more information about the detailed configuration steps for each communication protocol, see Configure Access Control in the SAP BTP Connectivity documentation.
Note: The internal host specifies the host and port under which the backend system can be reached within the intranet. It must be an existing network address that can be resolved on the intranet and has network visibility for the Cloud Connector. The Cloud Connector tries to forward the request to the network address specified by the internal host and port, so this address needs to be real.
The virtual host name and port represent the fully qualified domain name of the related system in the cloud.
For an SAP BW/4HANA system that you want to use to import models into SAP Data Warehouse Cloud, enter the following:
Property Description
Back-end Type ABAP system
Protocol HTTPS
Internal Host <system host>
Internal Port <system port>
Virtual Host <can use the same host as the internal host>
Virtual Port <can use the same port as the internal port>
Principal Type None
c. To grant access only to the resources needed by SAP Data Warehouse Cloud, select the system host you just added from the Mapping Virtual To Internal System list, and for each resource that you want to allow to be invoked on that host click (Add) in the Resources Of section to open the Add Resource dialog.
d. Depending on the protocol, enter the URL Path (for HTTPS) or the Function Name (name of the function module for RFC).
For SAP BW/4HANA Model Import connections, the following URL paths need to be accessible. For the respective paths (see below), select Path and all sub-paths.
○ /sap/opu/odata/sap/ESH_SEARCH_SRV/SearchQueries
○ /sap/bw4/v1/dwc/dbinfo
○ /sap/bw4/v1/dwc/metadata/queryviews – path and all sub-paths
○ /sap/bw4/v1/dwc/metadata/treestructure – path and all sub-paths
○ /sap/bw/ina – path and all sub-paths
For more information, see Configure Access Control (HTTP) in the SAP BTP Connectivity documentation.
To connect to ABAP systems with SAP ABAP, SAP BW, SAP ECC, or SAP S/4HANA On-Premise connections, make the following function modules accessible. For respective function names (see below), select Prefix.
○ /SAPDS/ – prefix
○ DHAMB_ – prefix
○ DHAPE_ – prefix
○ LTAMB_ – prefix
○ LTAPE_ – prefix
○ RFC_FUNCTION_SEARCH
○ RODPS_REPL_ – prefix
For more information, see Configure Access Control (RFC) in the SAP BTP Connectivity documentation.
e. Choose Save.
4. [optional] To enable secure network communication (SNC) for data flows, configure SNC in the Cloud Connector.
For more information, see Initial Configuration (RFC) in the SAP BTP Connectivity documentation.
Next Steps
If you've defined a location ID in the Cloud Connector configuration, you need to add the location ID in the On-premise data sources section of the SAP Data Warehouse Cloud administration. For more information, see Set Up Cloud Connector in SAP Data Warehouse Cloud [page 99].
You can now create your connections in SAP Data Warehouse Cloud.
4.2.2 Set Up Cloud Connector in SAP Data Warehouse Cloud
Different settings are required to set up Cloud Connector in SAP Data Warehouse Cloud.
Context
The Cloud Connector allows you to connect to on-premise data sources and use them for data flow or model import.
Procedure
1. In the side navigation area, click (System) (Administration) Data Source Configuration .
2. In the SAP Cloud Platform (SAP CP) Account section, enter the SAP BTP user ID that has been used to set up the Cloud Connector if you cannot see any account information yet.
   Before configuring the Cloud Connector for connection creation in SAP Data Warehouse Cloud, you or the owner of your organisation will need an SAP Business Technology Platform (SAP BTP) account. If you don't have an account yet, create an account in the SAP BTP cockpit by clicking Register.
   During Cloud Connector configuration you will then need to enter information from your SAP Data Warehouse Cloud subaccount.
   To get the SAP Data Warehouse Cloud subaccount information, the subaccount needs to be linked to the user ID of your SAP BTP account. In the SAP Cloud Platform (SAP CP) Account section, you can check if this has been done and the Account Information is already available.
If you have an account but cannot see the Account Information here, enter the SAP BTP user ID. This ID is typically the email address you used to create your SAP BTP account. After you have entered the ID, you can see the Account Information for SAP Data Warehouse Cloud:
○ Subaccount
○ Region Host
○ Subaccount User
3. In the Live Data Sources section, switch on Allow live data to securely leave my network to be able to use the Cloud Connector for the model import feature in SAP Data Warehouse Cloud.
Note: The Allow live data to securely leave my network switch is audited, so that administrators can see who switched this feature on and off. To see the changes in the switch state, go to (Security) (Activities), and search for ALLOW_LIVE_DATA_MOVEMENT.
4. In the On-premise data sources section, add location IDs if you have connected multiple Cloud Connector instances to your subaccount with different location IDs and you want to offer them for selection when creating connections using a Cloud Connector. If you don't add any location IDs here, the default location will be used.
   Cloud Connector location IDs identify Cloud Connector instances that are deployed in various locations of a customer's premises and connected to the same subaccount. Starting with Cloud Connector 2.9.0, it is possible to connect multiple Cloud Connectors to a subaccount as long as their location IDs are different.
4.3 Add IP address to IP Allowlist
Clients in your local network need an entry in the appropriate IP allowlist in SAP Data Warehouse Cloud. Cloud Connectors in your local network only require an entry if you want to use them for federation and replication from on-premise systems.
Context
To secure your environment, you can control the range of IPv4 addresses that are allowed to access the database of your SAP Data Warehouse Cloud tenant by adding them to an allowlist.
You need to provide the external (public) IPv4 address (range) of the client directly connecting to the database of SAP Data Warehouse Cloud. If you're using a network firewall with a proxy, you need to provide the public IPv4 address of your proxy.
Internet Protocol version 4 addresses (IPv4 addresses) have a size of 32 bits and are represented in dot-decimal notation, 192.168.100.1 for example. The external IPv4 address is the address that the internet and computers outside your local network can use to identify your system.
The address can either be a single IPv4 address or a range specified with a Classless Inter-Domain Routing (CIDR) suffix. An example of a CIDR suffix is /24, which represents 256 addresses and is typically used for a large local area network (LAN). The CIDR notation for the IPv4 address above would be 192.168.100.1/24, denoting the IP addresses between 192.168.100.0 and 192.168.100.255 (the leftmost 24 bits of the address in
binary notation are fixed). The external (public) IP address (range) to enter into the allowlist will be outside of the range 192.168.0.0/16. You can find more information on Classless Inter-Domain Routing on Wikipedia.
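The CIDR arithmetic above can be checked with a short script. This is a minimal sketch that uses Python's standard ipaddress module (assuming python3 is available on your machine); it expands the example range to its first and last address and the number of addresses it covers:

```shell
# Expand the example CIDR range to its first and last address.
# Requires python3; ipaddress is part of the Python standard library.
python3 - <<'EOF'
import ipaddress

# strict=False accepts 192.168.100.1/24 even though host bits are set
net = ipaddress.ip_network("192.168.100.1/24", strict=False)
print(net.network_address, net.broadcast_address, net.num_addresses)
# prints: 192.168.100.0 192.168.100.255 256
EOF
```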
Note
The number of entries in the allowlist is limited. Once the limit has been reached, you won't be able to add entries. Therefore, consider carefully which IP addresses should be added, and use ranges where possible to keep the number of allowlist entries as small as possible.
Procedure
1. In the side navigation area, click (System) (Configuration) IP Allowlist.
2. From the IP Allowlist dropdown, select the appropriate list:
○ Trusted IPs: For clients such as an SAP HANA Smart Data Provisioning Agent on a server, third-party ETL or analytics tools, or any other JDBC client
○ Trusted Cloud Connector IPs: For Cloud Connectors that you want to use for federation and replication from on-premise systems such as SAP HANA
The selected list shows all IP addresses that are allowed to connect to the SAP Data Warehouse Cloud database.
3. Click Add to open the Allow IP Addresses dialog.
Note
Once the number of entries in the allowlist has reached its limit, the Add button will be disabled.
4. In the CIDR field of the dialog, either provide a single IPv4 address or a range specified with a CIDR suffix.
Note
Make sure that you provide the external IPv4 address of your client, or of your proxy if you are using a network firewall. The IP you enter needs to be your public internet IP.
5. In the dialog, click Add to return to the list.
6. To save your newly added IP to the allowlist on the database, click Save in the pushbutton bar of your list.
Note
Updating the allowlist in the database requires some time. To check if your changes have been applied, click Refresh.
Next Steps
You can also select and edit an entry from the list if an IP address has changed, or you can delete IPs if they are not required anymore to prevent them from accessing the database of SAP Data Warehouse Cloud. To update
the allowlist in the database with any change you made, click Save. Note that the update in the database might take some time.
4.4 Finding SAP Data Warehouse Cloud IP addresses
Find the externally facing IP addresses that must be added to allowlists in particular remote applications before you can use connections to these remote applications.
Particular remote applications or sources that you might want to access with SAP Data Warehouse Cloud restrict access to their instances. They require external SAP Data Warehouse Cloud IP address information to be added to an allowlist in the remote application before you first try to access the application.
Outbound IP Address
The network for Amazon Redshift or Microsoft Azure SQL Database instances is protected by a firewall that controls incoming traffic. To be able to use connections with these connection types for data flows, the connected sources require the SAP Data Warehouse Cloud outbound IP address to be added to an allowlist.
Find the Outbound IP Address in the last step of the connection creation wizard.
Administrators can find the Outbound IP Address from the side navigation area by clicking (System) (About) and expanding the More section in the dialog.
HANA IP Addresses
Access to SAP SuccessFactors instances is restricted. To be able to use an SAP SuccessFactors for Analytical Dashboards connection for remote tables and view building, the connected source requires the externally facing IP addresses of the SAP Data Warehouse Cloud tenant to be added to an allowlist.
Administrators can find the HANA IP Addresses from the side navigation area by clicking (System) (About) and expanding the More section in the dialog.
For more information about adding the IP addresses in SAP SuccessFactors, see Adding an IP Restriction in the SAP SuccessFactors platform documentation.
4.5 Upload Certificates
For some sources that you want to access with your SAP Data Warehouse Cloud tenant, you need to upload server certificates to enable secure SSL/TLS-based connections.
Prerequisites
You have downloaded the certificate from an appropriate web page. Only X.509 Base64-encoded certificates enclosed between "-----BEGIN CERTIFICATE-----" and "-----END CERTIFICATE-----" are supported. The common filename extension for the certificates is .pem (Privacy-Enhanced Mail). We also support the filename extensions .crt and .cer.
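Before uploading, you can verify that a downloaded file really is a Base64-encoded X.509 certificate and check its expiry date. A minimal sketch using openssl (the filename server_cert.pem is an illustrative placeholder, not a name from this guide):

```shell
# Verify that the file is a valid PEM-encoded X.509 certificate and
# display its subject and expiry date. The filename is an example.
openssl x509 -in server_cert.pem -noout -subject -enddate
```

If the file is not a valid PEM certificate, openssl exits with an error instead of printing the subject and notAfter date.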
Context
Enable secure SSL/TLS-based connections for connection types that support remote tables but don't use a Data Provisioning Agent. This means that sources such as Amazon Athena, Google BigQuery, Generic OData, or SAP SuccessFactors for Analytical Dashboards require certificate upload for trusted connectivity.
You can create connections for sources which require a certificate without having uploaded the necessary certificate. However, validating a connection without a valid server certificate in Space Management will fail, and you won't be able to use the connection.
Procedure
1. In the side navigation area, click (System) (Configuration) Security .
2. Click Add Certificate.
3. In the Upload Certificate dialog, browse your local directory and select the certificate.
4. Enter a description to provide intelligible information on the certificate.
5. Choose Upload.
Results
In the overview, you can see the certificate with its creation and expiry date. From the overview, you can delete certificates.
4.6 Upload Third-Party ODBC Drivers (Required for Data Flows)
To enable access to non-SAP databases via ODBC and use them as sources for data flows, make sure that you have uploaded the required ODBC driver files to SAP Data Warehouse Cloud.
Prerequisites
● Search for the library on the internet and download it from an appropriate web page (see below).
● Ensure you have a valid license for the driver files.
Context
Drivers are required for the following connection types:
Connection Type | Driver to be uploaded | Download Site
Amazon Redshift | AmazonRedshiftODBC-64-bit-1.4.11.1000-1.x86_64.rpm | https://docs.aws.amazon.com
Microsoft SQL Server | pre-bundled; no upload required | n/a
Oracle | instantclient-basiclite-linux.x64-12.2.0.1.0.zip | https://oracle.com

Note
For Oracle, make sure to select the Basic Light package zip file from the 12.2.0.1.0 version. The package applies to all versions supported by the Oracle connection type (Oracle 12c, Oracle 18c, and Oracle 19c).
Upload a Driver
Before creating a connection, follow the steps below:
1. In the side navigation area, click (System) (Configuration) Data Integration.
2. Go to Third-Party Drivers and choose Upload.
3. In the following dialog box, choose Browse to select the driver file from your download location.
4. Choose Upload.
5. Choose Sync and wait about 5 to 10 minutes for the synchronization to finish before you start creating connections or using data flows with the connection.
Remove (and Re-Upload) a Driver
You might need to remove a driver when you want to upload a new version of the driver or when your license agreement has terminated.
1. Select the driver and choose Delete.
2. If you're using a connection that requires the removed driver for data flows, choose Upload to re-upload the driver to make sure that you can continue using the data flows.
3. Choose Sync to synchronize the driver changes with the underlying component. Once the synchronization has finished, you can continue using data flows with the connection. If you haven't uploaded a new driver, you won't be able to use data flows with the connection anymore unless you re-upload the driver.
4.7 Prepare Connectivity to Adverity
To be able to successfully validate and use a connection to Adverity for view building, certain preparations have to be made.
Before you can use the connection, an administrator has prepared the following:
● In an Adverity workspace, you have prepared a datastream that connects to the data source for which you want to create the connection.
● In SAP Data Warehouse Cloud, you have added the necessary Adverity IP addresses to the IP allowlist. For more information, see Add IP address to IP Allowlist [page 100].
Note
To get the relevant IP addresses, please contact your Adverity Account Manager or the Adverity Support team.
4.8 Prepare Connectivity to Amazon Athena
To be able to successfully validate and use a connection to Amazon Athena for remote tables, certain preparations have to be made.
Remote Tables
Before you can use the connection for view building and accessing data via remote tables, an administrator has prepared the following:
● The server certificates have been uploaded to SAP Data Warehouse Cloud. You need two certificates, one for Amazon Athena and one for Amazon S3. Region-specific certificates might be required for Amazon Athena.
For more information, see Upload Certificates [page 103].
4.9 Prepare Connectivity to Amazon Redshift
To be able to successfully validate and use a connection to an Amazon Redshift database for remote tables or data flows, certain preparations have to be made.
Remote Tables
Before you can use the connection for view building and accessing data via remote tables, an administrator has prepared the following:
● A connection to an SAP HANA Smart Data Integration (SDI) Data Provisioning Agent with a registered CamelJdbcAdapter.
For more information, see Preparing Data Provisioning Agent Connectivity [page 88].
● The necessary JDBC library has been downloaded and installed. The administrator has placed the file in the <DPAgent_root>/camel/lib folder and restarted the Data Provisioning Agent before registering the adapter with SAP Data Warehouse Cloud.
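The file placement can be sketched as follows. This is a sketch under assumptions: the agent root path and the jar filename below are illustrative placeholders, not the actual install path or driver name on your system.

```shell
# Sketch under assumptions: DPAGENT_ROOT and the jar filename are
# illustrative placeholders; adjust both to your installation.
DPAGENT_ROOT="/usr/sap/dataprovagent"
cp redshift-jdbc-driver.jar "$DPAGENT_ROOT/camel/lib/"
# Afterwards, restart the Data Provisioning Agent before registering
# the CamelJdbcAdapter with SAP Data Warehouse Cloud.
```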
Data Flows
Before you can use the connection for data flows, an administrator has prepared the following:
● The SAP Data Warehouse Cloud outbound IP has been added to an allowlist in the source.
For information on where a Data Warehouse Administrator can find the IP address, see Finding SAP Data Warehouse Cloud IP addresses [page 102].
● You have uploaded the necessary ODBC driver file to SAP Data Warehouse Cloud.
For more information, see Upload Third-Party ODBC Drivers (Required for Data Flows) [page 104].
4.10 Prepare Connectivity for Cloud Data Integration
To be able to successfully validate and use a Cloud Data Integration connection for remote tables or data flows, certain preparations have to be made.
Remote Tables
Before you can use the connection for view building and accessing data via remote tables, an administrator has prepared the following:
● A connection to an SAP HANA Smart Data Integration (SDI) Data Provisioning Agent with a registered CloudDataIntegrationAdapter.
For more information, see Preparing Data Provisioning Agent Connectivity [page 88].
● For ABAP-based cloud SAP systems such as SAP S/4HANA Cloud or SAP Marketing Cloud: A communication arrangement is set up for communication scenario SAP_COM_0531 (CDI API for CDS Based Extraction) in the source system. For more information, see Integrating CDI in the SAP S/4HANA Cloud documentation.
Data Flows
Before you can use the connection for data flows, an administrator has prepared the following:
● Your on-premise network has the Cloud Connector installed and configured.
For more information, see Configure Cloud Connector [page 95].
● For ABAP-based cloud SAP systems such as SAP S/4HANA Cloud or SAP Marketing Cloud: A communication arrangement is set up for communication scenario SAP_COM_0531 (CDI API for CDS Based Extraction) in the source system. For more information, see Integrating CDI in the SAP S/4HANA Cloud documentation.
4.11 Prepare Connectivity for Generic JDBC
To be able to successfully validate and use a Generic JDBC connection for remote tables, certain preparations have to be made.
Remote Tables
Before you can use the connection for view building and accessing data via remote tables, an administrator has prepared the following:
● A connection to an SAP HANA Smart Data Integration (SDI) Data Provisioning Agent with a registered CamelJdbcAdapter.
For more information, see Preparing Data Provisioning Agent Connectivity [page 88].
● It has been checked that the data source is supported by the CamelJdbcAdapter.
For the latest information about supported data sources and versions, see the SAP HANA Smart Data Integration Product Availability Matrix (PAM).
Note
For information about unsupported data sources, see SAP Note 3130999.
● The necessary JDBC library has been downloaded and installed. The administrator has placed the file in the <DPAgent_root>/camel/lib folder and restarted the Data Provisioning Agent before registering the adapter with SAP Data Warehouse Cloud.
For more information, see Set up the Camel JDBC Adapter in the SAP HANA Smart Data Integration and SAP HANA Smart Data Quality Installation and Configuration Guide.
For information about the proper JDBC library for your source, see the SAP HANA Smart Data Integration Product Availability Matrix (PAM).
4.12 Prepare Connectivity for Generic OData
To be able to successfully validate and use a connection to an OData service for remote tables or data flows, certain preparations have to be made.
General
Before you can use the connection, an administrator has prepared the following:
● The OData service URL needs to be publicly available.
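A hedged way to check that requirement, assuming curl is available on your machine (the URL below is a placeholder, not a real service; replace it with your OData service root):

```shell
# Check whether the OData service URL is reachable from the public
# internet. The URL below is a placeholder, not a real service.
ODATA_URL="https://example.com/odata/service"
if curl -sf "$ODATA_URL/\$metadata" -o /dev/null; then
  echo "service metadata reachable"
else
  echo "service not reachable from the public internet"
fi
```

Requesting the $metadata document is a lightweight probe: every OData service exposes it, so a success here indicates the service root is publicly resolvable and served.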
Remote Tables
Before you can use the connection for view building and accessing data via remote tables, an administrator has prepared the following:
● The server certificate has been uploaded to SAP Data Warehouse Cloud.
For more information, see Upload Certificates [page 103].
Data Flows
Before you can use the connection for data flows, an administrator has prepared the following:
● Your on-premise network has the Cloud Connector installed and configured.
For more information, see Configure Cloud Connector [page 95].
4.13 Prepare Connectivity for Generic SFTP
To create a Generic SFTP connection the host's public key is required.
The host's public key should be provided through a trusted channel. If your Windows 10, Linux, or macOS machine has such a channel, follow the steps below, replacing any occurrence of $HOST with the host value of your connection, and $PORT with the port value.
Use the resulting file host_key.pub.txt (found in the directory where you run the following command) to upload the Host Key when creating your connection.
● Windows 10: In PowerShell, run the following command:
(ssh-keyscan -t rsa -p $PORT $HOST 2>$null) -replace '^[^ ]* ','' > host_key.pub.txt
● Linux/macOS: In a Unix-compliant shell with both the ssh-keyscan and sed commands (both are usually already installed on your system), obtain the key with the following command:
ssh-keyscan -t rsa -p $PORT $HOST 2>/dev/null | sed "s/^[^ ]* //" > host_key.pub.txt
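To illustrate what the sed step does, here is the transformation applied to a sample ssh-keyscan output line (the hostname and key below are made-up sample values): it strips the leading hostname field, leaving only the key type and Base64 key, which is the format expected for the Host Key upload.

```shell
# ssh-keyscan prints "<host> <type> <base64-key>"; the sed expression
# removes everything up to and including the first space (the hostname).
echo 'example.com ssh-rsa AAAAB3NzaC1yc2EAAAADAQAB' | sed 's/^[^ ]* //'
# prints: ssh-rsa AAAAB3NzaC1yc2EAAAADAQAB
```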
Note
If your machine doesn't have a trusted channel, we recommend asking your administrator for the public host key to avoid man-in-the-middle attacks.
4.14 Prepare Connectivity to Google BigQuery
To be able to successfully validate and use a connection to a Google BigQuery data source for remote tables, certain preparations have to be made.
Remote Tables
Before you can use the connection for view building and accessing data via remote tables, an administrator has prepared the following:
● The server certificate has been uploaded to SAP Data Warehouse Cloud.
For more information, see Upload Certificates [page 103].
4.15 Prepare Connectivity to Microsoft Azure SQL Database
To be able to successfully validate and use a connection to Microsoft Azure SQL Database for remote tables or data flows, certain preparations have to be made.
Remote Tables
Before you can use the connection for view building and accessing data via remote tables, an administrator has prepared the following:
● A connection to an SAP HANA Smart Data Integration (SDI) Data Provisioning Agent with a registered MssqlLogReaderAdapter.
For more information, see Preparing Data Provisioning Agent Connectivity [page 88].
● The necessary JDBC library has been downloaded and installed in the <DPAgent_root>/lib folder before registering the adapter with SAP Data Warehouse Cloud.
● To use Microsoft SQL Server trigger-based replication, the user entered in the connection credentials needs to have the required privileges and permissions. For more information, see Required Permissions for SQL Server Trigger-Based Replication in the Installation and Configuration Guide for SAP HANA Smart Data Integration and SAP HANA Smart Data Quality.
Data Flows
Before you can use the connection for data flows, an administrator has prepared the following:
● The SAP Data Warehouse Cloud outbound IP has been added to an allowlist in the source.
For information on where a Data Warehouse Administrator can find the IP address, see Finding SAP Data Warehouse Cloud IP addresses [page 102].
4.16 Prepare Connectivity to Microsoft SQL Server
To be able to successfully validate and use a connection to a Microsoft SQL Server for remote tables or data flows, certain preparations have to be made.
Remote Tables
Before you can use the connection for view building and accessing data via remote tables, an administrator has prepared the following:
● A connection to an SAP HANA Smart Data Integration (SDI) Data Provisioning Agent with a registered MssqlLogReaderAdapter.
For more information, see Preparing Data Provisioning Agent Connectivity [page 88].
● The necessary JDBC library has been downloaded and installed in the <DPAgent_root>/lib folder before registering the adapter with SAP Data Warehouse Cloud.
● For the permissions required by the database user, see Required Permissions for SQL Server Trigger-Based Replication in the SAP HANA Smart Data Integration and SAP HANA Smart Data Quality Installation and Configuration Guide.
Data Flows
Before you can use the connection for data flows, an administrator has prepared the following:
● Your on-premise network has the Cloud Connector installed and configured.
For more information, see Configure Cloud Connector [page 95].
Note
Cloud Connector is not required if your Microsoft SQL Server database is available on the public internet.
● The required driver is pre-bundled and doesn't need to be uploaded by an administrator.
4.17 Prepare Connectivity to SAP Open Connectors
Integrate SAP Open Connectors with SAP Data Warehouse Cloud to be able to connect to third-party data sources powered by SAP Open Connectors.
Preparations in SAP BTP and SAP Open Connectors Account
1. Set up an SAP BTP account and enable the SAP Integration Suite service with the SAP Open Connectors capability.
Note
You need to know your SAP BTP subaccount information (provider, region, environment, trial - yes/no) later to select the appropriate SAP BTP subaccount region in SAP Data Warehouse Cloud when integrating the SAP Open Connectors account in your space.
For information about setting up an SAP BTP trial version with the SAP Integration Suite service, see Set Up Integration Suite Trial. To enable SAP Open Connectors, you need to activate the Extend Non-SAP Connectivity capability in the Integration Suite.
For information about setting up SAP Integration Suite from a production SAP BTP account, see Initial Setup in the SAP Integration Suite documentation.
2. In your SAP Open Connectors account, create connector instances for the sources that you want to connect to SAP Data Warehouse Cloud.
For more information about creating an instance, see Authenticate a Connector Instance (UI) in the SAP Open Connectors documentation.
For more information about connector-specific setup and connector-specific properties required to create an instance, see Connectors Catalog in the SAP Open Connectors documentation. There, click the connector in question and then <connector name> API Provider Setup or <connector name> Authenticate a Connector Instance.
3. In your SAP Open Connectors account, record the following information, which you will require later in SAP Data Warehouse Cloud:
○ Organization secret and user secret - required when integrating the SAP Open Connectors account in your space
○ Name of the connector instance - required when selecting the instance in the connection creation wizard
Preparations in SAP Data Warehouse Cloud
1. In the side navigation area, click (Connections), select a space if necessary, click the SAP Open Connectors tab, and then click Integrate your SAP Open Connectors Account to open the Integrate your SAP Open Connectors Account dialog.
2. In the dialog, provide the following data:
1. In the SAP BTP Sub Account Region field, select the appropriate entry according to your SAP BTP subaccount information (provider, region, environment, trial - yes/no).
2. Enter your SAP Open Connectors organization secret and user secret.
3. Click OK to integrate your SAP Open Connectors account with SAP Data Warehouse Cloud.
Results
With connection type Open Connectors you can now create connections to the third-party data sources available as connector instances with your SAP Open Connectors account.
4.18 Prepare Connectivity to Oracle
To be able to successfully validate and use a connection to an Oracle database for remote tables or data flows, certain preparations have to be made.
Remote Tables
Before you can use the connection for view building and accessing data via remote tables, an administrator has prepared the following:
● A connection to an SAP HANA Smart Data Integration (SDI) Data Provisioning Agent with a registered OracleLogReaderAdapter.
For more information, see Preparing Data Provisioning Agent Connectivity [page 88].
● The necessary JDBC library has been downloaded and installed in the <DPAgent_root>/lib folder before registering the adapter with SAP Data Warehouse Cloud.
● For the permissions required by the database user, see Required Permissions for Oracle Trigger-Based Replication in the SAP HANA Smart Data Integration and SAP HANA Smart Data Quality Installation and Configuration Guide.
Data Flows
Before you can use the connection for data flows, an administrator has prepared the following:
● To directly consume data in data flows, the Oracle database must be available on the public internet.
● You have uploaded the necessary ODBC driver file to SAP Data Warehouse Cloud.
For more information, see Upload Third-Party ODBC Drivers (Required for Data Flows) [page 104].
4.19 Prepare Connectivity to Precog
To be able to successfully validate and use a connection to Precog for view building, certain preparations have to be made.
Before you can use the connection, an administrator has prepared the following:
● In Precog, you have added the source for which you want to create the connection.
● In SAP Data Warehouse Cloud, you have added the necessary Precog IP addresses to the IP allowlist. For
more information, see Add IP address to IP Allowlist [page 100].
Note
You can find and copy the relevant IP addresses in the final step of the connection creation wizard.
4.20 Prepare Connectivity to SAP ABAP Systems
To be able to successfully validate and use a connection to an SAP ABAP system for remote tables or data flows, certain preparations have to be made.
Remote Tables
Before you can use the connection for view building and accessing data via remote tables, an administrator has prepared the following:
● A connection to an SAP HANA Smart Data Integration (SDI) Data Provisioning Agent with a registered ABAPAdapter.
For the Language setting in the connection properties to have an effect on the language shown in the Data Builder, Data Provisioning Agent version 2.0 SP 05 Patch 10 (2.5.1) or higher is required.
For more information, see Preparing Data Provisioning Agent Connectivity [page 88].
● The ABAP user specified in the credentials of the SAP ABAP connection needs to have a set of authorizations in the SAP ABAP system. For more information, see: Authorizations in the Installation and Configuration Guide for SAP HANA Smart Data Integration and SAP HANA Smart Data Quality.
● To access and copy data from SAP BW objects such as InfoProviders or characteristics, the appropriate authorization objects like S_RS_ADSO or S_RS_IOBJA are required for the ABAP user. For more information, see Overview: Authorization Objects in the SAP NetWeaver documentation.
If you want to access data from SAP BW Queries, make sure that the ABAP user has the required analysis authorizations to read the data and that characteristic 0TCAIPROV (InfoProvider) in the authorization includes @Q, which is the prefix for Queries as InfoProviders. For more information, see Defining Analysis Authorizations in the SAP NetWeaver documentation.
● If you want to stream ABAP tables for loading large amounts of data without running into memory issues, you need to configure suitable security privileges for successful registration on an SAP Gateway and you need to create an RFC destination of type TCP/IP in the ABAP source system. With the RFC destination you
register the Data Provisioning Agent as server program in the source system. For more information, see Prerequisites for ABAP RFC Streaming [page 94].
● To be able to use ABAP Dictionary tables from connections to an SAP BW/4HANA system for remote tables and view building, make sure that SAP Note 2872997 has been applied to the system.
Data Flows
Before you can use the connection for data flows, an administrator has prepared the following:
● Your on-premise network has the Cloud Connector installed and configured.
In the Cloud Connector configuration, make sure that access to the required resources is granted.
For more information, see Configure Cloud Connector [page 95].
See also: SAP Note 2835207 (ABAP connection type for SAP Data Intelligence)
● For SAP S/4HANA, source version 1909 FPS01 plus SAP Note 2873666 or higher versions are supported.
● When connecting to SAP LT Replication Server:
Connectivity to SAP LT Replication Server is supported for:
○ Systems with Addon DMIS 2011 Support Package 19 or higher (supporting SAP ECC 6.00 or higher)
○ Systems with Addon DMIS 2018 Support Package 4 or higher (supporting SAP S/4HANA 1709 or higher)
○ SAP S/4HANA 2020 or higher
Source systems connected to an SAP LT Replication Server are supported down to:
○ Version 4.6C via Addon DMIS 2010
○ Version 6.20 via Addon DMIS 2011
In the ABAP-based system in which SAP LT Replication Server is installed, an administrator has created a configuration to specify the source system, the SAP LT Replication Server system, and SAP Data Warehouse Cloud as the target system.
For more information, see Creating a Configuration in the SAP Landscape Transformation Replication Server documentation.
● When connecting to SAP S/4HANA Cloud, an administrator has created a communication arrangement for the communication scenario SAP_COM_0532 (SAP Data Hub – ABAP CDS Pipeline Integration) in the SAP S/4HANA Cloud system.
For more information, see Integrating CDS Views Using ABAP CDS Pipeline in the SAP S/4HANA Cloud documentation.
4.21 Prepare Connectivity to SAP BW
To be able to successfully validate and use a connection to SAP BW for remote tables or data flows, certain preparations have to be made.
Remote Tables
Before you can use the connection for view building and accessing data via remote tables, an administrator has prepared the following:
● A connection to an SAP HANA Smart Data Integration (SDI) Data Provisioning Agent with a registered ABAPAdapter.
For the Language setting in the connection properties to have an effect on the language shown in the Data Builder, Data Provisioning Agent version 2.0 SP 05 Patch 10 (2.5.1) or higher is required.
For more information, see Preparing Data Provisioning Agent Connectivity [page 88].
● The ABAP user specified in the credentials of the SAP ABAP connection needs to have a set of authorizations in the SAP ABAP system. For more information, see: Authorizations in the Installation and Configuration Guide for SAP HANA Smart Data Integration and SAP HANA Smart Data Quality.
● To access and copy data from SAP BW objects such as InfoProviders or characteristics, the appropriate authorization objects like S_RS_ADSO or S_RS_IOBJA are required for the ABAP user. For more information, see Overview: Authorization Objects in the SAP NetWeaver documentation.
If you want to access data from SAP BW Queries, make sure that the ABAP user has the required analysis authorizations to read the data and that characteristic 0TCAIPROV (InfoProvider) in the authorization includes @Q, which is the prefix for Queries as InfoProviders. For more information, see Defining Analysis Authorizations in the SAP NetWeaver documentation.
● If you want to stream ABAP tables for loading large amounts of data without running into memory issues, you need to configure suitable security privileges for successful registration on an SAP Gateway and you need to create an RFC destination of type TCP/IP in the ABAP source system. With the RFC destination you register the Data Provisioning Agent as server program in the source system. For more information, see Prerequisites for ABAP RFC Streaming [page 94].
● To be able to use ABAP Dictionary tables from connections to an SAP BW/4HANA system for remote tables and view building, make sure that SAP Note 2872997 has been applied to the system.
Data Flows
Before you can use the connection for data flows, an administrator has prepared the following:
● Your on-premise network has the Cloud Connector installed and configured.
In the Cloud Connector configuration, make sure that access to the required resources is granted.
For more information, see Configure Cloud Connector [page 95].
116 PUBLICAdministering SAP Data Warehouse CloudPreparing Connectivity for Connections
4.22 Preparing SAP BW/4HANA Model Transfer Connectivity
Accessing SAP BW/4HANA metadata and importing models into SAP Data Warehouse Cloud with an SAP BW/4HANA Model Transfer connection requires two protocols (or endpoints): HTTP and SAP HANA Smart Data Integration based on the SAP HANA adapter.
For accessing SAP BW/4HANA, HTTP is used to securely connect to the SAP BW/4HANA system via Cloud Connector, and SAP HANA SQL is used to connect to the SAP HANA database of SAP BW/4HANA via Data Provisioning Agent. Using Cloud Connector to make HTTP requests to SAP BW/4HANA requires a live data connection of type tunnel to SAP BW/4HANA.
For information on supported SAP BW/4HANA source versions, see Supported Source Versions for SAP BW/4HANA Model Transfer Connections [page 119].
Before creating a connection for SAP BW/4HANA Model Transfer in SAP Data Warehouse Cloud, you need to prepare the following:
1. In SAP BW/4HANA, make sure that the following services are active in transaction code SICF:
   ○ BW InA - BW Information Access Services:
      ○ /sap/bw/ina/GetCatalog
      ○ /sap/bw/ina/GetResponse
      ○ /sap/bw/ina/GetServerInfo
      ○ /sap/bw/ina/ValueHelp
      ○ /sap/bw/ina/BatchProcessing
      ○ /sap/bw/ina/Logoff
   ○ /sap/bw4
2. In SAP BW/4HANA, activate OData service ESH_SEARCH_SRV in Customizing (transaction SPRO) under SAP NetWeaver Gateway OData Channel Administration General Settings Activate and Maintain Services .
3. Install and configure Cloud Connector. For more information, see Configure Cloud Connector [page 95].
4. In the side navigation area of SAP Data Warehouse Cloud, click System Administration Data Source Configuration Live Data Sources and switch on Allow live data to leave my network. If you've provisioned SAP Data Warehouse Cloud prior to version 2021.03: Click (Product Switch) Analytics System Administration Data Source Configuration Live Data Sources and switch on Allow live data to leave my network.
5. In the side navigation area of SAP Data Warehouse Cloud, open System Configuration Data Integration Live Data Connections (Tunnel) and create a live data connection of type tunnel to SAP BW/4HANA. If you've provisioned SAP Data Warehouse Cloud prior to version 2021.03: Click (Product Switch) Analytics (Connections) and create a live data connection of type tunnel to SAP BW/4HANA. For more information, see Create Live Data Connection of Type Tunnel [page 118].
6. Install and configure a Data Provisioning Agent and register the SAP HANA adapter with SAP Data Warehouse Cloud:
   ○ Install the latest Data Provisioning Agent version on a local host or update your agent to the latest version. For more information, see Install the Data Provisioning Agent [page 89].
○ In SAP Data Warehouse Cloud, add the external IPv4 address of the server on which your Data Provisioning Agent is running, or, if you are using a network firewall, add the public proxy IP address to the IP allowlist. For more information, see Add IP address to IP Allowlist [page 100].
○ Connect the Data Provisioning Agent to SAP Data Warehouse Cloud. For more information, see Connect and Configure the Data Provisioning Agent [page 90].
○ Register the SAP HANA adapter with SAP Data Warehouse Cloud. For more information, see Register Adapters with SAP Data Warehouse Cloud [page 93].
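The InA service paths from step 1 can be assembled into full URLs for a quick reachability check. The following is a minimal sketch; the host name and HTTPS port used below are placeholders, not values from this guide:

```python
# The SICF service paths that must be active (see step 1 above).
INA_SERVICES = [
    "/sap/bw/ina/GetCatalog",
    "/sap/bw/ina/GetResponse",
    "/sap/bw/ina/GetServerInfo",
    "/sap/bw/ina/ValueHelp",
    "/sap/bw/ina/BatchProcessing",
    "/sap/bw/ina/Logoff",
]

def service_urls(host, port):
    """Return the HTTPS URLs of all required BW InA service paths."""
    return ["https://{}:{}{}".format(host, port, path) for path in INA_SERVICES]

urls = service_urls("bw4.example.com", 44300)
```

An actual check would request each URL with the technical user's credentials; only the URL construction is shown here.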
4.22.1 Create Live Data Connection of Type Tunnel
To securely connect and make HTTP requests to SAP BW∕4HANA, you need to connect via Cloud Connector. This requires that you create a live data connection of type tunnel to the SAP BW∕4HANA system.
Prerequisites
See the prerequisites 1 to 4 in Preparing SAP BW/4HANA Model Transfer Connectivity [page 117].
Procedure
1. In the side navigation area, click (System) (Configuration) Data Integration .
Note
If you've provisioned SAP Data Warehouse Cloud prior to version 2021.03, you'll see a different UI and need to click (Product Switch) Analytics (Connections) and continue with step 3.
2. In the Live Data Connections (Tunnel) section, click Create Live Data Connection.
The Create Live Data Connection dialog will appear.
3. On the Connections tab, click (Add Connection).
The Select a data source dialog will appear.
4. Expand Connect to Live Data and select SAP BW.
The New BW Live Connection dialog appears.
5. Enter a name and description for your connection. Note that the connection name cannot be changed later.
6. Set the Connection Type to Tunnel.
By enabling tunneling, data from the connected source will always be transferred through the Cloud Connector.
7. Select the Location ID.
8. Add your SAP BW∕4HANA host name, HTTPS port, and client.
Use the virtual host name and virtual port that were configured in the Cloud Connector.
9. Optional: Choose a Default Language from the list.
This language will always be used for this connection and cannot be changed by users without administrator privileges.
Note
You must know which languages are installed on your SAP BW∕4HANA system before adding a language code. If the language code you enter is invalid, SAP Data Warehouse Cloud will default to the language specified by your system metadata.
10. Under Authentication Method, select User Name and Password.
11. Enter the user name (case-sensitive) and password of the technical user for the connection.
The user needs the following authorizations:
○ Authorization object S_BW4_REST (authorization field: BW4_URI, value: /sap/bw4/v1/dwc*)
○ Authorization object SDDLVIEW (authorization field: DDLSRCNAME, value: RSDWC_SRCH_QV)
○ Read authorizations for SAP BW∕4HANA metadata (Queries, CompositeProviders and their InfoProviders)
Using authorizations for SAP BW∕4HANA metadata, you can restrict a model transfer connection to a designated semantic SAP BW/4HANA area. For more information, see Overview: Authorization Objects in the SAP BW∕4HANA documentation.
12. Select Save this credential for all users on this system.
13. Click OK.
Note
While saving the connection, the system checks if it can access /sap/bc/ina/ services in SAP BW∕4HANA.
Results
The connection is saved and now available for selection in the SAP Data Warehouse Cloud connection creation wizard for the SAP BW∕4HANA Model Transfer connection.
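The S_BW4_REST authorization listed in the procedure uses the wildcard value /sap/bw4/v1/dwc*. Its prefix-matching semantics can be sketched with shell-style wildcards; this is an illustration only, not how the ABAP authorization check is implemented:

```python
from fnmatch import fnmatchcase

# Wildcard value of authorization field BW4_URI as given in the procedure.
PATTERN = "/sap/bw4/v1/dwc*"

def uri_allowed(uri):
    """Return True if the URI matches the authorized prefix pattern."""
    return fnmatchcase(uri, PATTERN)
```

`fnmatchcase` is used instead of `fnmatch` to keep the comparison case-sensitive on all platforms.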
4.22.2 Supported Source Versions for SAP BW/4HANA Model Transfer Connections
In order to create a connection of type SAP BW/4HANA Model Transfer, the SAP BW/4HANA system needs to have a specific version.
These versions of SAP BW/4HANA are supported:
● SAP BW/4HANA 2.0 SPS07 or higher
   ○ 2989654 BW/4 - Enable DWC "Import from Connection" for BW/4 Query - Revision 1
   ○ 2714624 Version Comparison False Result
   ○ 2754328 Disable creation of HTTP Security Sessions per request
   ○ 2840529 Sporadic HTTP 403 CSRF token validation errors
   ○ 2976147 Import of query views in the BW/4 hybrid scenario: No search results of BW back ends with SAP_BASIS Release 753
● SAP BW/4HANA 2.0 SPS01 to SPS06 after you have applied the following SAP Notes:
   ○ 2943200 TCI for BW4HANA 2.0 Hybrid
   ○ 2945277 BW/4 - Enable DWC "Import from Connection" for BW/4 Query
   ○ 2989654 BW/4 - Enable DWC "Import from Connection" for BW/4 Query - Revision 1
   ○ 2714624 Version Comparison False Result
   ○ 2754328 Disable creation of HTTP Security Sessions per request
   ○ 2840529 Sporadic HTTP 403 CSRF token validation errors
   ○ 2976147 Import of query views in the BW/4 hybrid scenario: No search results of BW back ends with SAP_BASIS Release 753
4.23 Prepare Connectivity to SAP ECC
To be able to successfully validate and use a connection to SAP ECC for remote tables or data flows, certain preparations have to be made.
Remote Tables
Before you can use the connection for view building and accessing data via remote tables, an administrator has prepared the following:
● A connection to an SAP HANA Smart Data Integration (SDI) Data Provisioning Agent with a registered ABAPAdapter. For the Language setting in the connection properties to have an effect on the language shown in the Data Builder, Data Provisioning Agent version 2.0 SP 05 Patch 10 (2.5.1) or higher is required. For more information, see Preparing Data Provisioning Agent Connectivity [page 88].
● The ABAP user specified in the credentials of the SAP ABAP connection needs to have a set of authorizations in the SAP ABAP system. For more information, see: Authorizations in the Installation and Configuration Guide for SAP HANA Smart Data Integration and SAP HANA Smart Data Quality.
● If you want to stream ABAP tables for loading large amounts of data without running into memory issues, you need to configure suitable security privileges for successful registration on an SAP Gateway and you need to create an RFC destination of type TCP/IP in the ABAP source system. With the RFC destination you register the Data Provisioning Agent as server program in the source system. For more information, see Prerequisites for ABAP RFC Streaming [page 94].
Data Flows
Before you can use the connection for data flows, an administrator has prepared the following:
● Your on-premise network has the Cloud Connector installed and configured. In the Cloud Connector configuration, make sure that access to the required resources is granted. For more information, see Configure Cloud Connector [page 95].
4.24 Prepare Connectivity to SAP Fieldglass
To be able to successfully validate and use a connection to SAP Fieldglass for remote tables or data flows, certain preparations have to be made.
Remote Tables
Before you can use the connection for view building and accessing data via remote tables, an administrator has prepared the following:
● A connection to an SAP HANA Smart Data Integration (SDI) Data Provisioning Agent with a registered CloudDataIntegrationAdapter. For more information, see Preparing Data Provisioning Agent Connectivity [page 88].
Data Flows
Before you can use the connection for data flows, an administrator has prepared the following:
● Your on-premise network has the Cloud Connector installed and configured. For more information, see Configure Cloud Connector [page 95].
4.25 Prepare Connectivity to SAP HANA
To be able to successfully validate and use a connection to SAP HANA Cloud or SAP HANA (on-premise) for remote tables or data flows, certain preparations have to be made.
SAP HANA Cloud
The server certificate has been uploaded to SAP Data Warehouse Cloud.
For more information, see Upload Certificates [page 103].
SAP HANA on-premise
Remote Tables
Before you can use the connection for view building and accessing data via remote tables, an administrator has prepared the following:
● If you want to use SAP HANA Smart Data Integration:
   ○ A connection to an SAP HANA Smart Data Integration (SDI) Data Provisioning Agent with a registered HanaAdapter. For more information, see Preparing Data Provisioning Agent Connectivity [page 88].
● If you want to use SAP HANA Smart Data Access:
   ○ Your on-premise network has the Cloud Connector installed and configured. For more information, see Configure Cloud Connector [page 95].
   ○ You have added the Cloud Connector IP address to the IP allowlist. For more information, see Add IP address to IP Allowlist [page 100].
Data Flows
Before you can use the connection for data flows, an administrator has prepared the following:
● Your on-premise network has the Cloud Connector installed and configured. For more information, see Configure Cloud Connector [page 95].
4.26 Prepare Connectivity to SAP Marketing Cloud
To be able to successfully validate and use a connection to SAP Marketing Cloud for remote tables or data flows, certain preparations have to be made.
Remote Tables
Before you can use the connection for view building and accessing data via remote tables, an administrator has prepared the following:
● A connection to an SAP HANA Smart Data Integration (SDI) Data Provisioning Agent with a registered CloudDataIntegrationAdapter. For more information, see Preparing Data Provisioning Agent Connectivity [page 88].
● A communication arrangement is set up for communication scenario SAP_COM_0531 (CDI API for CDS Based Extraction) in the source system. For more information, see Integrating CDI in the SAP Marketing Cloud documentation.
Data Flows
Before you can use the connection for data flows, an administrator has prepared the following:
● Your on-premise network has the Cloud Connector installed and configured. For more information, see Configure Cloud Connector [page 95].
● A communication arrangement is set up for communication scenario SAP_COM_0531 (CDI API for CDS Based Extraction) in the source system. For more information, see Integrating CDI in the SAP Marketing Cloud documentation.
4.27 Prepare Connectivity to SAP SuccessFactors for Analytical Dashboards
To be able to successfully validate and use a connection to SAP SuccessFactors for remote tables or data flows, certain preparations have to be made.
Before you can use the connection, an administrator has prepared the following:
● The server certificate has been uploaded to SAP Data Warehouse Cloud. For more information, see Upload Certificates [page 103].
● When using OAuth 2.0 authorization, SAP Data Warehouse Cloud must be registered in SAP SuccessFactors. For more information, see Registering Your OAuth2 Client Application in the SAP SuccessFactors platform documentation.
● In SAP SuccessFactors IP restriction management, you have added the externally facing SAP HANA IP addresses for SAP Data Warehouse Cloud to the list of IP restrictions. IP restrictions are a specified list of IP addresses from which users can access your SAP SuccessFactors system. For more information, see:
   ○ IP Restrictions in the SAP SuccessFactors platform documentation
   ○ Finding SAP Data Warehouse Cloud IP addresses [page 102]
4.28 Prepare Connectivity to SAP S/4HANA Cloud
To be able to successfully validate and use a connection to SAP S/4HANA Cloud, certain preparations have to be made.
Remote Tables
Before you can use the connection for view building and accessing data via remote tables, an administrator has prepared the following:
● A connection to an SAP HANA Smart Data Integration (SDI) Data Provisioning Agent with a registered CloudDataIntegrationAdapter. For more information, see Preparing Data Provisioning Agent Connectivity [page 88].
● A communication arrangement is set up for communication scenario SAP_COM_0531 (CDI API for CDS Based Extraction) in the source system. For more information, see Integrating CDI in the SAP S/4HANA Cloud documentation.
Data Flows
Before you can use the connection for data flows, an administrator has prepared the following:
● A communication arrangement is set up for communication scenario SAP_COM_0532 (SAP Data Hub – ABAP CDS Pipeline Integration) in the SAP S/4HANA Cloud system. For more information, see Integrating CDS Views Using ABAP CDS Pipeline in the SAP S/4HANA Cloud documentation.
4.29 Prepare Connectivity to SAP S/4HANA On-Premise
To be able to successfully validate and use a connection to SAP S/4HANA for remote tables or data flows, certain preparations have to be made.
Remote Tables
Before you can use the connection for view building and accessing data via remote tables, an administrator has prepared the following:
● A connection to an SAP HANA Smart Data Integration (SDI) Data Provisioning Agent with a registered ABAPAdapter. For the Language setting in the connection properties to have an effect on the language shown in the Data Builder, Data Provisioning Agent version 2.0 SP 05 Patch 10 (2.5.1) or higher is required.
For more information, see Preparing Data Provisioning Agent Connectivity [page 88].
● The ABAP user specified in the credentials of the SAP ABAP connection needs to have a set of authorizations in the SAP ABAP system. For more information, see Authorizations in the Installation and Configuration Guide for SAP HANA Smart Data Integration and SAP HANA Smart Data Quality.
● If you want to stream ABAP tables for loading large amounts of data without running into memory issues, you need to configure suitable security privileges for successful registration on an SAP Gateway and you need to create an RFC destination of type TCP/IP in the ABAP source system. With the RFC destination you register the Data Provisioning Agent as server program in the source system. For more information, see Prerequisites for ABAP RFC Streaming [page 94].
Data Flows
Before you can use the connection for data flows, an administrator has prepared the following:
● Your on-premise network has the Cloud Connector installed and configured. For more information, see Configure Cloud Connector [page 95].
● Supported source version: SAP S/4HANA version 1909 FPS01 plus SAP Note 2873666 (SAP Data Hub / Data Intelligence ABAP Integration - TCI note for SAP_ABA 1909 SP0/SP1) or a higher SAP S/4HANA version
5 Managing and Monitoring Connectivity for Data Integration
Monitor Data Provisioning Agent connectivity in SAP Data Warehouse Cloud, manage the impacts of agent changes in SAP Data Warehouse Cloud, and troubleshoot Data Provisioning Agent or Cloud Connector connectivity.
5.1 Monitoring Data Provisioning Agent in SAP Data Warehouse Cloud
For connected Data Provisioning Agents, you can proactively become aware of resource shortages on the agent instance and find other useful information.
In Configuration Data Integration On-Premise Agents , choose the Monitor button to display the agents with the following information:
● Information about free and used physical memory and swap memory on the Data Provisioning Agent server.
● Information about when the agent was last connected.
● Information about the overall number of connections that use the agent and the number of connections that actively use real-time replication. Active real-time replication means that the connection type supports real-time replication and at least one table of the connection is replicated via real-time replication.
You can change to the Connections view to see the agents with a list of all connections they use and their real-time replication status. You can pause real-time replication for the connections of the agent while applying changes to the agent. For more information, see Pause Real-Time Replication for an Agent [page 130].
5.1.1 Monitoring Data Provisioning Agent Logs
Access the Data Provisioning Agent adapter framework log and the adapter framework trace log directly in SAP Data Warehouse Cloud.
With the integrated log access, you don’t need to leave SAP Data Warehouse Cloud to monitor the agent and analyze agent issues. Accessing the log data happens via the Data Provisioning Agent File adapter which reads the log files and saves them into the database of SAP Data Warehouse Cloud.
The following logs are available:
● <DPAgent_root>/log/framework_alert.trc: Data Provisioning Agent adapter framework log. Use this file to monitor Data Provisioning Agent statistics.
● <DPAgent_root>/log/framework.trc: Data Provisioning Agent adapter framework trace log. Use this file to trace and debug Data Provisioning Agent issues.
You can review the logs in SAP Data Warehouse Cloud after log access has been enabled for the agent in question. We display the actual log files as well as up to ten archived log files that follow the naming convention framework.trc.<x> or framework_alert.trc.<x>, with <x> being a number between one and ten.
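The naming convention for displayed log files can be expressed as a single pattern. A minimal sketch that accepts the current logs plus archived copies numbered 1 to 10:

```python
import re

# Matches framework.trc, framework_alert.trc, and their archived copies
# framework.trc.<x> / framework_alert.trc.<x> with <x> from 1 to 10.
LOG_PATTERN = re.compile(r"^framework(_alert)?\.trc(\.(10|[1-9]))?$")

def is_displayed_log(filename):
    """Return True if the file name follows the displayed-log convention."""
    return LOG_PATTERN.match(filename) is not None
```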
Related Information
Enable Access to Data Provisioning Agent Logs [page 127]
Review Data Provisioning Agent Logs [page 128]
5.1.2 Enable Access to Data Provisioning Agent Logs
Enable access to an agent's log files before you can view them in SAP Data Warehouse Cloud.
Prerequisites
A Data Provisioning Agent administrator has provided the necessary File adapter configuration with an access token that you need for enabling the log access in SAP Data Warehouse Cloud.
To configure the access token in the agent's secure storage, the administrator has performed the following steps in the agent configuration tool in command-line interactive mode:
1. At the command line, navigate to <DPAgent_root>/bin.
2. Start the agent configuration tool with the setSecureProperty parameter.
   ○ On Windows: agentcli.bat --setSecureProperty
   ○ On Linux: ./agentcli.sh --setSecureProperty
3. Choose Set FileAdapter Access Token and specify the token.
For more information about the File adapter configuration, see File in the Installation and Configuration Guide of the SAP HANA Smart Data Integration and SAP HANA Smart Data Quality documentation.
Procedure
1. From the main menu, open Configuration Data Integration .
2. On the agent's tile, click Edit.
3. In the Agent Settings dialog, set Enable Log Access to true.
4. In the FileAdapter Password field that appears, enter the File adapter access token.
5. Click Save to activate the log access.
Results
The Review Logs entry in the menu of the agent’s tile is enabled and the framework_alert.trc and framework.trc logs are written to the database of SAP Data Warehouse Cloud. You can now review the current and archived log files from the agent's tile.
5.1.3 Review Data Provisioning Agent Logs
Use the logs to monitor the agent and analyze issues with the agent.
Prerequisites
The logs are written to the database of SAP Data Warehouse Cloud. For more information, see Enable Access to Data Provisioning Agent Logs [page 127].
Procedure
1. From the main menu, open Configuration Data Integration .
2. On the agent's tile, click Review Logs.
The Review Agent Logs dialog initially shows 50 log entries. To load further chunks of 50 entries each, scroll down to the bottom of the dialog and use the More button.
3. To show the complete message for a log entry, click More in the Message column.
4. You have the following options to restrict the results in the display of the logs:
○ Search: In the <agent name> field, enter a search string and click (Search) to search in the messages of the logs.
○ Filters: You can filter based on time, message type and log file name. When you’ve made your selection, click Apply Filters.
Note
If your local time zone differs from the time zone used in the Data Provisioning Agent logs and you're applying a time-based filter, you might get different filter results than expected.
5. [optional] Export the logs as a CSV file to your local system. Note that filters and search restrictions will be applied to the exported file.
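The rule that active filters carry over into the exported CSV file can be mimicked in a few lines. The field names and sample entries below are illustrative assumptions, not the actual export format:

```python
import csv
import io

# Hypothetical log entries as they might appear in the Review Agent Logs dialog.
entries = [
    {"time": "2022-03-01 10:00", "type": "ERROR",
     "file": "framework.trc", "message": "Adapter stopped"},
    {"time": "2022-03-01 11:00", "type": "INFO",
     "file": "framework_alert.trc", "message": "Heartbeat"},
]

def export_filtered(entries, message_type, out):
    """Write only entries of the given message type to a CSV stream,
    mirroring the rule that active filters apply to the exported file."""
    writer = csv.DictWriter(out, fieldnames=["time", "type", "file", "message"])
    writer.writeheader()
    for entry in entries:
        if entry["type"] == message_type:
            writer.writerow(entry)

buf = io.StringIO()
export_filtered(entries, "ERROR", buf)
```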
5.1.4 Receive Notifications About Data Provisioning Agent Status Changes
For a selected SAP HANA Smart Data Integration Data Provisioning Agent, you can configure notifications to be sent when the agent's status changes from connected to disconnected or vice versa.
Prerequisites
To run recurring scheduled tasks on your behalf, you need to authorize the job scheduling component of SAP Data Warehouse Cloud. In your profile settings under Schedule Consent Settings, you can give and revoke your consent to SAP Data Warehouse Cloud to run your scheduled tasks in the future. Note that if you don't give your consent or revoke it, tasks that you own won't be executed and will fail.
For more information, see Changing your Profile Settings.
Context
A recurring task will check for any status changes according to the configured frequency and send the notifications to the user who is the owner of the configuration. The initial owner is the user who created the configuration. Any user with the appropriate administration privileges can take over the ownership for this task if required, for example in case of vacation replacement or when the previous owner left the department or company.
Procedure
1. In the side navigation area, click (System) (Configuration) Data Integration .
2. Go to the On-Premise Agents section and click (menu) Configure Sending Notifications.
3. If you haven't yet authorized SAP Data Warehouse Cloud to run your scheduled tasks for you, you will see a message at the top of the Configure Sending Notifications dialog asking for your consent. Give your consent.
4. Switch on the Send Notifications toggle.
An additional field Owner appears that shows that you have been automatically assigned as the owner of the task.
5. Select the frequency in which the status of the Data Provisioning Agent should be checked.
6. Save your configuration.
This will start the first status check. After the first check, the status check will be performed according to the defined frequency.
Results
If the status check finds any status change for the agent, a notification will be sent that you can find by clicking (Notifications) on the shell bar.
When you click on the notification, you'll get to the On-Premise Agents section in (System) (Configuration) Data Integration , where you can start searching for the root cause in case the agent is disconnected.
Next Steps
If you need to take over the ownership and receive the notifications for an agent's status changes, go to the Configure Sending Notifications dialog as described above, click Assign to Me, and save the configuration. From now on, you will receive the notifications about any status changes for the agent. If you haven't done so yet, you need to provide your consent before you can take over the ownership.
5.2 Pause Real-Time Replication for an Agent
For a selected SAP HANA Smart Data Integration Data Provisioning Agent, you can pause real-time replication for the connections that use the agent while applying changes to it, such as configuration changes or applying patches. After you have finished your agent changes, you can restart real-time replication.
Context
If you need to perform maintenance activities in a source system, you can pause real-time replication for the corresponding connection. For more information, see Pause Real-Time Replication for a Connection.
Procedure
1. In SAP Data Warehouse Cloud, from the main menu, open Configuration Data Integration On-Premise Agents .
2. To show the Data Provisioning Agent tiles with a list of all connections they use, click the Connections button.
The real-time replication status of a connection shown here can be:
● Active: The connection type supports real-time replication and for the connection at least one table is replicated via real-time replication (even if the status in the Remote Table Monitor is Error).
● Inactive: The connection type supports real-time replication and there is currently no table of the connection replicating via real-time replication.
● Paused: The connection type supports real-time replication and real-time replication is paused for at least one table of the connection.
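The status rules above can be summarized as a small decision function. This is an illustrative helper, not a product API; in particular, the precedence of Paused over Active is an assumption:

```python
def replication_status(supports_realtime, replicating_tables, paused_tables):
    """Derive the real-time replication status shown for a connection.

    supports_realtime: whether the connection type supports real-time replication
    replicating_tables: number of tables replicated via real-time replication
    paused_tables: number of tables with paused real-time replication
    """
    if not supports_realtime:
        return None  # no status shown for connection types without support
    if paused_tables > 0:
        return "Paused"      # assumed to take precedence over Active
    if replicating_tables > 0:
        return "Active"
    return "Inactive"
```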
3. To pause the agent's connections with replication status Active or Inactive, choose (menu) on the tile of the agent and then Pause All Connections.
In the list of connections shown on the tile, the status for affected connections changes to Paused. You can also see the status change for the connections in the Connections application.
In the Remote Table Monitor the status for affected tables changes to Paused and actions related to real-time replication are not available for these tables. Also, you cannot start real-time replication for any table of a paused connection.
4. You can now apply the changes to your Data Provisioning Agent.
5. Once you're finished with the changes, restart real-time replication for the agent. Choose (menu) and then Restart All Connections.
The status in the list of connections shown on the tile, in the Connections application, as well as in the Remote Table Monitor changes accordingly, and you can again perform real-time-related actions for the tables or start real-time replication.
5.3 Troubleshooting the Data Provisioning Agent
If you encounter problems with the Data Provisioning Agent, the following information can be helpful.
Troubleshooting
● Examine the logs located in the <DPAgent_root>/log directory.
● Ensure that your Data Provisioning Agent is connected to SAP HANA:
   1. Run <DPAgent_root>/bin/agentcli.bat --configAgent.
   2. Select option 1. Agent Status to check the connection status.
   3. Make sure the output shows Agent connected to HANA: Yes.
   4. If the output doesn't show that the agent is connected, it may show an error message. Resolve the error, and then select option 2. Start or Stop Agent, and then option 1. Start Agent to start the agent.
● If you receive the error Failed to connect to the remote source. Please restart your data provisioning agent and try again.:
   1. Run <DPAgent_root>/bin/agentcli.bat --configAgent.
   2. Select option 2. Start or Stop Agent, and then option 2. Stop Agent to stop the agent.
   3. Select option 1. Start Agent to restart the agent.
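When scripting this check, the agent status output can be parsed for the Agent connected to HANA: Yes line mentioned above. A sketch, assuming only that line's format; the surrounding output varies:

```python
def agent_connected(status_output):
    """Return True if the Agent Status output reports a HANA connection."""
    for line in status_output.splitlines():
        if line.strip().startswith("Agent connected to HANA:"):
            value = line.split(":", 1)[1].strip()
            return value.lower() == "yes"
    return False  # line not found: treat as not connected
```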
For more information on troubleshooting, see SAP Data Warehouse Cloud - DP Agent Setup, Configuration and Troubleshooting Guide on github.com .
SAP Notes
SAP Note 2938870 - Errors when connecting DP Agent with DWC
SAP Note 2894588 - IP Allowlist in SAP Data Warehouse Cloud
SAP Note 2688382 - SAP HANA Smart Data Integration Memory Sizing Guideline
5.4 Troubleshooting the Cloud Connector (SAP HANA Smart Data Access)
These are some of the most common issues that can occur when you use the Cloud Connector to connect to on-premise remote sources via SAP HANA Smart Data Access.
1. The connectivity proxy is not enabled
The following error occurs if you try to connect to a remote source using the Cloud Connector, but the connectivity proxy hasn’t been enabled:
[LIBODBCHDB SO][HDBODBC] Communication link failure;-10709 Connection failed (RTE:[89001] Cannot resolve host name '<connectivity_proxy_host>' rc=-2: Name or service not known (<virtual_host>:<virtual_port>))
SAP Data Warehouse Cloud takes care of enabling the connectivity proxy. This might take a while.
2. The connectivity proxy is enabled but not fully ready to serve requests
The following error occurs if the connectivity proxy has been enabled but is not yet ready to be used:
[LIBODBCHDB SO][HDBODBC] Communication link failure;-10709 Connection failed (RTE:[89006] System call 'connect' failed, rc=111: Connection refused {<connectivity_proxy_ip>:<connectivity_proxy_port>)} {ClientPort:<client_port>} (<virtual_host>:<virtual_port>))
SAP Data Warehouse Cloud takes care of enabling the connectivity proxy. This might take a while.
3. The virtual host specified in the connection details includes an underscore
The following error occurs if you’ve used a virtual host name with an underscore, for example, hana_01:
[LIBODBCHDB SO][HDBODBC] General error;-10719 Connect failed (invalid SERVERNODE 'hana_01:<virtual_host>:<virtual_port>')
Virtual host names must not contain underscores.
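The restriction above can be checked programmatically before you save the connection details. A minimal sketch in Python (the function name is illustrative, not part of any SAP API):

```python
def is_valid_virtual_host(name: str) -> bool:
    """Return False if the virtual host name contains an underscore,
    which leads to the 'invalid SERVERNODE' error shown above."""
    return "_" not in name

print(is_valid_virtual_host("hana_01"))  # → False: the underscore makes the name invalid
print(is_valid_virtual_host("hana-01"))  # → True: a hyphen is fine
```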
4. The virtual host specified in the connection details is unreachable
The following error occurs if the specified virtual host cannot be reached:
[LIBODBCHDB SO][HDBODBC] Communication link failure;-10709 Connection failed (RTE:[89132] Proxy server connect: connection not allowed by ruleset (<virtual_host>:<virtual_port>))
5. The selected location ID is invalid.
The following error occurs if an invalid location ID was specified in the Data Source Configuration of the SAP Data Warehouse Cloud Administration:
[LIBODBCHDB SO][HDBODBC] Communication link failure;-10709 Connection failed (RTE:[89132] Proxy server connect: Network unreachable (<virtual_host>:<virtual_port>))
6. The Cloud Connector's IP is missing or is incorrectly specified in the SAP Data Warehouse Cloud IP allowlist for trusted Cloud Connector IPs
The following error occurs when the Cloud Connector's IP is not included in the allowlist:
[LIBODBCHDB SO][HDBODBC] Communication link failure;-10709 Connection failed (RTE:[89132] Proxy server connect: Network unreachable (<virtual_host>:<virtual_port>))
7. The Cloud Connector certificate has expired
The following error occurs when the subaccount certificate used in the Cloud Connector has expired:
[LIBODBCHDB SO][HDBODBC] Communication link failure;-10709 Connection failed (RTE:[89133] Proxy server connect: Network unreachable (<virtual_host>:<virtual_port>))
You can find the related logs in the ljs_trace.log file in the Cloud Connector. For example:
2021-07-29 04:50:42,131 +0200#ERROR#com.sap.core.connectivity.tunnel.client.notification.NotificationClient#notification-client-277-3# #Unable to handshake with notification server connectivitynotification.cf.sap.hana.ondemand.com/<virtual_host>:<virtual_port> javax.net.ssl.SSLException: Received fatal alert: certificate_expired
For information about renewing a subaccount certificate, see Update the Certificate for a Subaccount.
8. The on-premise backend system requires TCP SSL
The following error occurs if the on-premise backend system requires TCP SSL:
[LIBODBCHDB SO][HDBODBC] Communication link failure;-10709 Connection failed (RTE:[89008] Socket closed by peer (<virtual_host>:<virtual_port>))
6 Integrating Analytics Clients
Use SAP Analytics Cloud and other analytics clients to consume views (exposed by the Data Layer) and perspectives (exposed by the Business Layer) and produce charts, dashboards, and other analytic artifacts.
A DW Administrator can connect SAP Data Warehouse Cloud to SAP Analytics Cloud to make consumable views and perspectives automatically available as models there. For more information, see Integrating SAP Analytics Cloud [page 136].
In order to connect third party analytic clients, you must request a database user from your DW Space Administrator. For more information, see Integrating Third-Party Analytics Clients [page 145].
6.1 Integrating SAP Analytics Cloud
Consume your exposed views and perspectives to produce charts, dashboards, and other analytical artifacts in SAP Analytics Cloud.
Connecting SAP Data Warehouse Cloud to SAP Analytics Cloud
You can connect SAP Data Warehouse Cloud as a remote live connection to any SAP Analytics Cloud system. SAP Data Warehouse Cloud and SAP Analytics Cloud need to be on different tenants.
There are some product restrictions, listed in SAP Note 2832606.
Use the Live Data Connectivity in SAP Analytics Cloud
In SAP Analytics Cloud, you can then create stories using SAP Data Warehouse Cloud data as a source and combine it with additional data (for example, from a file).
If required, any number of SAP Data Warehouse Cloud systems from multiple tenants can be connected to any SAP Analytics Cloud system. Any SAP Data Warehouse Cloud tenant can be connected to any SAP Analytics Cloud tenant as
a live data connection. This can only be done across tenants. For more information, see Live Data Connections to SAP Data Warehouse Cloud.
Note
SAP Data Warehouse Cloud remote connections can also be set up for SAP Analytics Cloud NEO tenants.
Tenant Link
When you link your tenants, you'll enable the product switch in the top right of the shell bar, and be able to easily navigate between them. For more information, see Link Your Tenants [page 137].
Authentication
SAP Data Warehouse Cloud and SAP Analytics Cloud share the same authentication mechanism. You can change the Identity Provider settings so that you can use that Identity Provider to log on to SAP Data Warehouse Cloud as well. For more information, see Enabling a Custom SAML Identity Provider [page 31].
Note
In SAP Data Warehouse Cloud provisioned prior to 2021.03, there are some differences. SAP Data Warehouse Cloud and SAP Analytics Cloud then run on the same tenant. For more information, see Integration to SAP Analytics Cloud (for SAP Data Warehouse Cloud provisioned prior to version 2021.03).
6.1.1 Link Your Tenants
You can link your SAP Data Warehouse Cloud tenant to your SAP Analytics Cloud tenant.
Context
When you link your tenants, you'll enable the product switch in the top right of the shell bar, and be able to easily navigate between them.
Procedure
1. Go to System → Configuration → Tenant Links.
2. Enter the URL of your SAP Analytics Cloud tenant.
3. You can test your connection.
6.1.2 Managing OAuth Clients
You can use the Open Authorization (OAuth) protocol to allow third-party applications access to protected resources. You can then access SAP Data Warehouse Cloud via APIs.
Prerequisites
SAP Data Warehouse Cloud can be hosted either on SAP data centers or on non-SAP data centers (for example, Amazon Web Services (AWS)). Determine which environment SAP Data Warehouse Cloud is hosted on by inspecting your SAP Data Warehouse Cloud URL:
● A single-digit number, for example us1 or jp1, indicates an SAP data center.
● A two-digit number, for example eu10 or us30, indicates a non-SAP data center.
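The rule above can be expressed as a small classification helper. This is a heuristic sketch: the regular expression and the example hostnames are assumptions, not part of any SAP API, so verify the result against your actual tenant URL.

```python
import re

def data_center_type(url: str) -> str:
    """Classify a tenant URL by its region token, following the rule above:
    a single digit (us1, jp1) indicates an SAP data center, two digits
    (eu10, us30) a non-SAP data center. The pattern is a heuristic."""
    match = re.search(r"\.([a-z]{2})(\d{1,2})\.", url)
    if not match:
        return "unknown"
    return "SAP data center" if len(match.group(2)) == 1 else "non-SAP data center"

# The hostnames below are placeholders for illustration only.
print(data_center_type("https://mytenant.us1.example.com/"))   # → SAP data center
print(data_center_type("https://mytenant.eu10.example.com/"))  # → non-SAP data center
```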
Context
Administrators can manage OAuth clients in the system administration area of SAP Data Warehouse Cloud. The following steps describe how to add a new OAuth client and trusted identity provider.
Depending on whether your SAP Data Warehouse Cloud is hosted on SAP data centers or on non-SAP data centers, you need to choose the respective procedure.
Procedure
1. Add a New OAuth Client (SAP Data Center). For more information, see Managing OAuth Clients [page 138].
2. Add a New OAuth Client (Non-SAP Data Center). For more information, see Add a New OAuth Client (Non-SAP Data Center) [page 141].
3. Authorize a third party application to use the SAP Data Warehouse Cloud public APIs. For more information, see Authorization for API access OAuth Clients [page 143].
4. Add a Trusted Identity Provider. For more information, see Add a Trusted Identity Provider [page 144].
5. Enter the origins that will be hosting your client application. These origin values will allow embedding in your client application.
6.1.2.1 Add a New OAuth Client (SAP Data Center)
You can add a new OAuth client.
Context
This is the procedure for systems hosted on SAP data centers. Determine which environment SAP Data Warehouse Cloud is hosted on by inspecting your SAP Data Warehouse Cloud URL:
● A single-digit number, for example us1 or jp1, indicates an SAP data center.
● A two-digit number, for example eu10 or us30, indicates a non-SAP data center.
Procedure
1. Go to System → Administration → App Integration.
2. Under Configured Clients, select Add a New OAuth Client.
3. In the dialog, add a Name for the OAuth client.
4. From the Purpose list, select the intended use for your OAuth client:
○ Interactive Usage (default): Accessing protected resources using an interactive usage OAuth client requires a valid SAML-based user context.
○ API Access: An API access OAuth client allows a third-party application to access SAP Data Warehouse Cloud public APIs without a SAML assertion.
If you selected Interactive Usage, do the following:
1. Under Authorization Grant, select the authorization method your clients will use to obtain an access token. There are two options available: Authorization Code or Client Credentials.
Authorization Method Steps
Authorization Code:
1. Provide an Authorization Code Lifetime. The lifetime is the duration that an authorization code will remain valid. Once this period is over, clients can no longer use the existing authorization code to obtain access tokens and refresh tokens. An administrator can set both the value and unit. Available time units include days, hours, and minutes.
Note
The lifetime value must be a positive integer. If this value is not provided, the lifetime value will be infinite by default.
2. (Optional) Select Confidential. If selected, third-party applications must provide a secret value to obtain an OAuth access token to use with SAP Data Warehouse Cloud. Enter a Secret value, and the Lifetime of the secret value. This is the duration that the secret remains valid. Once this period is over, an administrator must reset the secret value. This lifetime should be provided in days. For example, 30 days.
3. Enter a Redirect URI. This is the URI where access or refresh tokens must be returned to.
4. Enter the Token Lifetime. When the access token expires, clients must use a valid refresh token to obtain a new access token. An administrator can set both the value and unit. Available time units include days, hours, and minutes. The lifetime value must be a positive integer. If this value is not provided, the lifetime value will be infinite by default.
5. Enter the Refresh Token Lifetime. An administrator can set both the value and unit of the refresh token lifetime. Available time units include days, hours, and minutes. The lifetime value must be a positive integer. If this value is not provided, the lifetime value will be infinite by default.
Client Credentials:
1. Enter a Secret value, and the Lifetime of the secret value. This is the duration that the secret remains valid. Once this period is over, an administrator must reset the secret value. This lifetime should be provided in days. For example, 30 days.
2. Enter the Token Lifetime. When the access token expires, clients must use a valid refresh token to obtain a new access token. An administrator can set both the value and unit. Available time units include days, hours, and minutes. The lifetime value must be a positive integer. If this value is not provided, the lifetime value will be infinite by default.
2. Select Add.
If you selected API Access, do the following:
1. Choose at least one option from the Access list:
○ Story Listing: This OAuth client privilege allows a third-party application to access a list of stories in your system.
○ User Provisioning: This OAuth client privilege allows a third-party application to manage users in your system.
2. Enter a Secret value, and the Lifetime of the secret value.
This is the duration that the secret remains valid. Once this period is over, an administrator must reset the secret value. This lifetime should be provided in days. For example, 30 days.
3. Enter the Token Lifetime. When the access token expires, clients must use a valid refresh token to obtain a new access token. An administrator can set both the value and unit. Available time units include days, hours, and minutes. The lifetime value must be a positive integer. If this value is not provided, the lifetime value will be infinite by default.
4. Select Add.
6.1.2.2 Add a New OAuth Client (Non-SAP Data Center)
You can add a new OAuth client.
Context
This is the procedure for systems hosted on non-SAP data centers. Determine which environment SAP Data Warehouse Cloud is hosted on by inspecting your SAP Data Warehouse Cloud URL:
● A single-digit number, for example us1 or jp1, indicates an SAP data center.
● A two-digit number, for example eu10 or us30, indicates a non-SAP data center.
Procedure
1. Go to System → Administration → App Integration.
2. Under Configured Clients, select Add a New OAuth Client.
3. In the dialog, add a Name for the OAuth client.
4. From the Purpose list, select the intended use for your OAuth client:
○ Interactive Usage (default): Accessing protected resources using an interactive usage OAuth client requires a valid SAML-based user context.
○ API Access: An API access OAuth client allows a third-party application to access SAP Data Warehouse Cloud public APIs without a SAML assertion. For more information, see Authorization for API access OAuth Clients [page 143].
If you selected API Access, choose at least one option from the Access list:
○ Story Listing: This OAuth client privilege allows a third-party application to access a list of stories in your system.
○ User Provisioning: This OAuth client privilege allows a third-party application to manage users in your system.
5. Enter a Redirect URI. The URI must be the exact URI where access or refresh tokens are returned to. If the URI has dynamic parameters, use a wildcard pattern for the URI. For example, https://redirect_host/**
6. Select Add.
Note
The Token Lifetime and Refresh Token Lifetime cannot be configured.
7. If you are using OAuth 2.0, you must provide the following information to your client application:
○ Authorization URL: The OAuth 2.0 Authorization URL.
○ Token URL: The OAuth 2.0 Token Service URL.
○ OAuth2SAML Token URL: The OAuth 2.0 Token Service URL to be used in the OAuth 2.0 SAML Bearer Assertion workflow.
○ OAuth2SAML Audience: The audience to be used by the OAuth 2.0 SAML Bearer Assertion workflow.
6.1.2.3 Authorization for API access OAuth Clients
You can authorize a third party application to use public APIs.
Context
If you selected API Access as the Purpose for the OAuth client, follow these steps to authorize a third party application to use the SAP Data Warehouse Cloud public APIs without a SAML assertion:
Procedure
1. Perform a POST HTTPS call to the following address:
<Token URL>?grant_type=client_credentials
<Token URL> is the Token URL listed in the OAuth Clients section of the App Integration page.
2. Use basic authentication, and set the OAuth client ID as the user and the secret as the password. This call returns an access token.
3. Access the required public API endpoint with the following headers:
○ Authorization: Bearer <Token>, where <Token> is the access token returned by the previous step.
○ x-sap-sac-custom-auth: True
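The token request and the API headers described in the steps above can be sketched as plain header construction. The token URL, client ID, and secret below are placeholders, and the actual HTTP call (with whatever HTTP library you use) is omitted; only the shape of the request is shown.

```python
import base64

def build_token_request(token_url: str, client_id: str, secret: str) -> dict:
    """Steps 1-2: POST to <Token URL>?grant_type=client_credentials using
    basic authentication, with the OAuth client ID as the user and the
    secret as the password."""
    credentials = base64.b64encode(f"{client_id}:{secret}".encode()).decode()
    return {
        "method": "POST",
        "url": f"{token_url}?grant_type=client_credentials",
        "headers": {"Authorization": f"Basic {credentials}"},
    }

def build_api_headers(access_token: str) -> dict:
    """Step 3: headers expected by the public API endpoint."""
    return {
        "Authorization": f"Bearer {access_token}",
        "x-sap-sac-custom-auth": "True",
    }

request = build_token_request("https://tenant.example.com/oauth/token",
                              "my-client-id", "my-secret")
print(request["url"])
# → https://tenant.example.com/oauth/token?grant_type=client_credentials
print(build_api_headers("<Token>")["Authorization"])
# → Bearer <Token>
```

Passing the returned dictionary to your HTTP client of choice is left to the reader; the function names here are illustrative only.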
Next Steps
If you use the OAuth 2.0 SAML Bearer Assertion workflow, you must also configure a trusted identity provider. For next steps, see Add a Trusted Identity Provider [page 144].
The client you added will appear in lists on the App Integration page. Hover over a client and select Edit to update information or Delete to delete it.
You may need to use the Authorization URL and Token URL listed here to complete setup on your OAuth clients.
6.1.2.4 Add a Trusted Identity Provider
If you use the OAuth 2.0 SAML Bearer Assertion workflow, you must add a trusted identity provider to SAP Data Warehouse Cloud.
Prerequisites
The corresponding OAuth Client must be added to SAP Data Warehouse Cloud. For more information, see Managing OAuth Clients [page 138].
Context
The OAuth 2.0 SAML Bearer Assertion workflow allows a third-party application access to protected resources without prompting users to log into SAP Data Warehouse Cloud when there is an existing SAML assertion from the third-party application identity provider.
Note
Both SAP Data Warehouse Cloud and the third-party application must be configured with the same identity provider.
Procedure
1. Go to System → Administration → App Integration.
2. In Trusted Identity Providers, select Add a Trusted Identity Provider.
3. In the dialog, add a unique Name for the trusted identity provider. This name is used only for identification purposes, and will appear in the list of trusted identity providers.
4. Add the identity provider name. The Provider Name must be unique.
Note
The provider name can contain only alphabetic characters (a-z, A-Z), numbers (0-9), underscore (_), dot (.), hyphen (-), and cannot exceed 36 characters.
5. Provide signing certificate information for the third-party application server.
Note
The signing certificate information must be in X.509 Base64 encoded format.
6. Select Add.
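The naming rules from the note in step 4 can be expressed as a single regular expression. A minimal sketch (the function name is illustrative, not part of any SAP API):

```python
import re

# Allowed per the note: letters, digits, underscore, dot, hyphen;
# at most 36 characters.
PROVIDER_NAME_PATTERN = re.compile(r"^[A-Za-z0-9_.-]{1,36}$")

def is_valid_provider_name(name: str) -> bool:
    return bool(PROVIDER_NAME_PATTERN.fullmatch(name))

print(is_valid_provider_name("corp-idp.prod_01"))  # → True: allowed characters only
print(is_valid_provider_name("corp idp"))          # → False: space is not allowed
```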
Next Steps
The identity providers that you added will appear in lists on the App Integration page. Hover over an identity provider and select Edit to update information or Delete to delete it.
You may need to use the Authorization URL and Token URL listed here to complete setup on your OAuth clients.
6.2 Integrating Third-Party Analytics Clients
Third-Party Business Intelligence (BI) tools can take advantage of the power of SAP Data Warehouse Cloud by connecting to SAP Data Warehouse Cloud spaces through SQL-based client-server interfaces to deliver advanced visualization, reporting, and analysis.
If you as a space administrator would like to expose your space data to use it in a different tool, you can create a space schema user in SAP Data Warehouse Cloud. You can create multiple users to access the data. Then you can decide which objects are allowed to be consumed.
Note
You need to add the IP address of your third-party tool to the IP allowlist.
Related Information
Create a Database User
Integrating Third-Party BI Clients via ODBC on Microsoft Windows [page 145]
https://saphanacloudservices.com/data-warehouse-cloud/learning-track/connections-bi-tools/
6.2.1 Integrating Third-Party BI Clients via ODBC on Microsoft Windows
You can expose your space data via ODBC to use it in third-party BI clients.
Prerequisites
● You have created a space schema user. More information: Create a Database User
● You have either made all objects in this space consumable in the space management or have the property allow consumption switched on at object level.
● You have created a view of type analytical dataset.
● You have added the IP address of your third-party tool to the IP allowlist. For more information, see Add IP address to IP Allowlist [page 100].
● You have an ODBC Data Source Administrator installed. If you are using 64-bit MS Office, the 64-bit ODBC Data Source Administrator is preferable. If you are using 32-bit MS Office, the 32-bit ODBC Data Source Administrator should be used.
● You have the prerequisites for SAP HANA Cloud as described here: Connect to SAP HANA Cloud via ODBC
Context
You can expose the data of a dedicated space via ODBC to third-party BI clients. You can decide if all the objects in the space are allowed to be consumed, or just specific objects.
Procedure
Create an ODBC Data Source pointing to the SAP Data Warehouse Cloud schema
1. Go to the ODBC Data Source Administrator.
2. On the User DSN tab, choose Add to create a new ODBC data source.
3. Select the HDBODBC driver and choose Finish. The HDBODBC driver should have version 2.4.183 or higher.
4. You now need to configure your ODBC data source:
1. Enter the Host and Port number of your SAP Data Warehouse Cloud system. You are given these credentials when you create your space schema user.
2. Deselect Multitenant under Database.
3. Select Connect to the database using TLS/SSL under TLS/SSL.
4. You can now test the connection by entering the User ID and password you got when you created the space schema user.
Note
If you receive the error “Connection refused”, you might have to add your IP to the IP allowlist in SAP Data Warehouse Cloud. More information: Add IP address to IP Allowlist [page 100]
Load data into your third-party frontend using the ODBC Data Source
5. Go to your third-party frontend.
6. Connect via ODBC to SAP Data Warehouse Cloud. Enter the username and password from your space schema user. You can select your SAP Data Warehouse Cloud source and load your data.
7. You can now change the appearance of your data by using your third-party tools.
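The DSN settings configured above can equivalently be expressed as a driver-based ODBC connection string. This is a sketch under the assumption that the standard HDBODBC keywords (DRIVER, SERVERNODE, UID, PWD, encrypt) apply to your driver version; the host, port, and credentials are placeholders for the values you received with your space schema user.

```python
def hana_odbc_connection_string(host: str, port: int,
                                user: str, password: str) -> str:
    """Build a connection string mirroring the DSN settings above:
    HDBODBC driver, host:port as SERVERNODE, and TLS/SSL enabled."""
    return (
        "DRIVER=HDBODBC;"
        f"SERVERNODE={host}:{port};"
        f"UID={user};PWD={password};"
        "encrypt=true"
    )

print(hana_odbc_connection_string("<host>", 443,
                                  "<space_schema_user>", "<password>"))
```

With pyodbc, for example, such a string could be passed to pyodbc.connect(); verify the keyword names against the documentation of your HDBODBC driver version before relying on them.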
7 Creating Database User Groups
Create database user groups with corresponding administrators that can directly connect to the underlying SAP HANA Cloud database and work with SQL.
You need the DW Administrator role to create database user groups.
Database user groups are isolated environments that support a separation of user management tasks. Administrators of database user groups connect to the SAP HANA Cloud database and can create other users, schemas, and roles. The users and administrators of these groups can create data entities (DDL) and ingest data (DML) directly in their schemas, but have no privileges for accessing the SAP Data Warehouse Cloud schemas. As these groups are decoupled from SAP Data Warehouse Cloud, they pose no security risk.
For more information on SAP HANA user groups, see User Groups.
When creating a database user group, an administrator is automatically created for the group.
Note
Certain actions cannot be performed via standard SQL statements and should be performed by using the stored procedures developed by SAP Data Warehouse Cloud:
● Creating a Schema [page 151]
● Creating a Role [page 152]
● Granting a Role [page 153]
● Revoking a Role [page 154]
● Dropping a Role [page 155]
Privileges
List the currently available user privileges with the following statement: select * from effective_privileges where user_name = current_user;
● USERGROUP OPERATOR on the relevant usergroup
● Usergroup administrative procedures:
○ EXECUTE ON "DWC_GLOBAL"."CREATE_USERGROUP_SCHEMA"
○ EXECUTE ON "DWC_GLOBAL"."CREATE_USERGROUP_ROLE"
○ EXECUTE ON "DWC_GLOBAL"."DROP_USERGROUP_ROLE"
○ EXECUTE ON "DWC_GLOBAL"."GRANT_USERGROUP_ROLE"
○ EXECUTE ON "DWC_GLOBAL"."REVOKE_USERGROUP_ROLE"
● SAP HANA system privileges:
○ ATTACH DEBUGGER
○ CATALOG READ
○ CREATE SCENARIO
○ CREATE STRUCTURE PRIVILEGE
○ EXPORT
○ IMPORT
● EXECUTE on "SYS"."GET_INSUFFICIENT_PRIVILEGE_ERROR_DETAILS" procedure
7.1 Creating a Database User Group
Create database user groups with corresponding administrators that can directly connect to the underlying SAP HANA Cloud database and work with SQL.
Prerequisites
You need the DW Administrator role to create a database user group.
Procedure
1. Go to (Configuration) → Database Access and select Database User Groups.
If you've provisioned SAP Data Warehouse Cloud prior to version 2020.03, you'll see a different UI and need to go to Configuration → Database Access.
2. On the Database User Group page, select Create.
3. Enter a suffix for your database user group and click Create. When creating a user group, a database user group administrator is automatically created.
4. Use the provided credentials to connect to the SAP HANA Cloud database through the SAP HANA database explorer.
Parameter Description
Database User Group Administrator The database user group administrator name.
Host Name and Port Name These credentials are automatically filled when connecting via SAP HANA database explorer.
Password Please remember or copy the password, as you will need this information to connect to SAP HANA Cloud.
5. Close the dialog and then select your newly created user group in the list.
6. Launch Open Database Explorer.
7. Enter the password that you've copied in step 4 into the required field.
8. Click OK.
Results
The SAP HANA database explorer opens with your database user group at the top level. You can now use the SQL editor to create users, roles, and schemas. Note that not all actions are possible through standard SQL statements; some require the following stored procedures:
● Creating a Schema [page 151]
● Creating a Role [page 152]
● Granting a Role [page 153]
● Revoking a Role [page 154]
● Dropping a Role [page 155]
7.2 Creating a User
Create a user for your database user group by running the standard SQL statement in SAP HANA database explorer.
Prerequisites
The database user group administrator can create users.
Context
When using SAP HANA database explorer, you can create a user by running an SQL query in the SQL console.
Procedure
1. Open the SQL console.
2. Use the following SQL statement: CREATE USER <user_name> PASSWORD <pwd> SET USERGROUP <DBgroup_name>.
Note
To avoid possible conflicts, we strongly recommend that you use DWCDBGROUP# as a prefix to the user name (see Rules for Technical Names [page 10]). You can use the following syntax: DWCDBGROUP#<DBgroup_name>#<user_name>
CREATE USER DWCDBGROUP#ONIT2#USER1 PASSWORD "Welcome1" SET USERGROUP "DWCDBGROUP#ONIT2";
3. Perform the query by clicking (Run) or pressing F8.
Results
If the run was successful, you'll receive a confirmation message in the Result pane.
7.3 Creating a Schema
Create a schema in your database user group by using our stored procedure.
Prerequisites
The database user group administrator creates new schemas.
Context
When using SAP HANA database explorer, you can create a schema by running an SAP Data Warehouse Cloud-specific stored procedure in the SQL console.
Procedure
1. Open the SQL console.
2. Create a schema by calling the stored procedure CREATE_USERGROUP_SCHEMA. The owner of the new schema should be assigned to a user of the database user group. In our example this is "USER1". If the owner name is set to null, then the database user group administrator is set as owner.
CALL "DWC_GLOBAL"."CREATE_USERGROUP_SCHEMA" ( SCHEMA_NAME => 'DWCDBGROUP#ONIT2#SCHEMA', OWNER_NAME => 'DWCDBGROUP#ONIT2#USER1' );
3. Perform the query by clicking (Run) or pressing F8.
Results
If the run was successful, you'll receive a confirmation message in the Result pane.
7.4 Creating a Role
Create a role in your database user group by using our stored procedure.
Prerequisites
The database user group administrator creates new roles in the user group.
Context
When using SAP HANA database explorer, you can create a role by running an SAP Data Warehouse Cloud-specific stored procedure in the SQL console.
Procedure
1. Open the SQL console.
2. Create a role by calling the stored procedure CREATE_USERGROUP_ROLE. The owner of the new role should be assigned to the database user group. In our example this is "DWCDBGROUP#ONIT2#SCHEMA".
CALL "DWC_GLOBAL"."CREATE_USERGROUP_ROLE" ( ROLE_SCHEMA_NAME => 'DWCDBGROUP#ONIT2#SCHEMA', ROLE_NAME => 'DWCDBGROUP#ROLE1' );
3. Perform the query by clicking (Run) or pressing F8.
Results
If the run was successful, you'll receive a confirmation message in the Result pane.
7.5 Granting a Role
Grant a role to users or other roles in your database user group by using our stored procedure.
Prerequisites
The database user group administrator grants roles to users or other roles in the database user group.
The role schema, grantee, and grantee role should all be within the same database user group.
Context
When using SAP HANA database explorer, you can grant a role by running an SAP Data Warehouse Cloud-specific stored procedure in the SQL console.
Procedure
1. Open the SQL console.
2. Grant a role by calling the stored procedure GRANT_USERGROUP_ROLE. In our example, we're granting Role1 to User1:
CALL "DWC_GLOBAL"."GRANT_USERGROUP_ROLE" ( ROLE_SCHEMA_NAME => 'DWCDBGROUP#ONIT2#SCHEMA', ROLE_NAME => 'DBUSERGROUP_ROLE1', GRANTEE => 'DWCDBGROUP#ONIT2#USER1', GRANTEE_ROLE_NAME => NULL, WITH_ADMIN_OPTION => FALSE);
3. Perform the query by clicking (Run) or pressing F8.
Results
If the run was successful, you'll receive a confirmation message in the Result pane.
Example
Creating a new role (Role2) and granting Role1 to Role2:
CALL "DWC_GLOBAL"."CREATE_USERGROUP_ROLE" ( ROLE_SCHEMA_NAME => 'DWCDBGROUP#ONIT2#SCHEMA', ROLE_NAME => 'DBUSERGROUP_ROLE2' );
CALL "DWC_GLOBAL"."GRANT_USERGROUP_ROLE" ( ROLE_SCHEMA_NAME => 'DWCDBGROUP#ONIT2#SCHEMA', ROLE_NAME => 'DBUSERGROUP_ROLE1', GRANTEE => 'DWCDBGROUP#ONIT2#SCHEMA', GRANTEE_ROLE_NAME => 'DBUSERGROUP_ROLE2', WITH_ADMIN_OPTION => TRUE );
7.6 Revoking a Role
Revoke a role in your database user group by using our stored procedure.
Prerequisites
The database user group administrator can revoke roles in the database user group.
Context
When using SAP HANA database explorer, you can revoke a role by running an SAP Data Warehouse Cloud-specific stored procedure in the SQL console.
Procedure
1. Open the SQL console.
2. Revoke a role by calling the stored procedure REVOKE_USERGROUP_ROLE.
CALL "DWC_GLOBAL"."REVOKE_USERGROUP_ROLE" ( ROLE_SCHEMA_NAME => 'DWCDBGROUP#ONIT2#SCHEMA', ROLE_NAME => 'DBUSERGROUP_ROLE1', GRANTEE => 'DWCDBGROUP#ONIT2#USER1', GRANTEE_ROLE_NAME => NULL);
3. Perform the query by clicking (Run) or pressing F8.
Results
If the run was successful, you'll receive a confirmation message in the Result pane.
7.7 Dropping a Role
Drop a role in your database user group by using our stored procedure.
Prerequisites
The database user group administrator can drop roles in the database user group.
Context
When using SAP HANA database explorer, you can drop a role by running an SAP Data Warehouse Cloud-specific stored procedure in the SQL console.
Procedure
1. Open the SQL console.
2. Drop a role by calling the stored procedure DROP_USERGROUP_ROLE. In our example we're dropping Role1.
CALL "DWC_GLOBAL"."DROP_USERGROUP_ROLE" (ROLE_SCHEMA_NAME => 'DWCDBGROUP#ONIT2#SCHEMA',ROLE_NAME => 'DBUSERGROUP_ROLE1' );
3. Perform the query by clicking (Run) or pressing F8.
Results
If the run was successful, you'll receive a confirmation message in the Result pane.
7.8 Allow a Space to Read From the Database User Group Schema
By default, no SAP Data Warehouse Cloud space can access the database user group schema. To grant a space read privileges from the database user group schema, use the GRANT_PRIVILEGE_TO_SPACE stored procedure.
Prerequisites
Only the administrator of a database user group has the privilege to run the stored procedure "DWC_GLOBAL"."GRANT_PRIVILEGE_TO_SPACE".
Context
You can grant read privileges by running an SAP Data Warehouse Cloud specific stored procedure in the SQL console in the SAP HANA Database Explorer.
Procedure
1. From the side navigation area, go to (System) → (Configuration) → Database Access → Database User Groups.
2. Select the database user group and click Open Database Explorer.
3. In the SQL console in SAP HANA Database Explorer, call the stored procedure to grant the 'SELECT' privilege to a space using the following syntax:
CALL "DWC_GLOBAL"."GRANT_PRIVILEGE_TO_SPACE" ( OPERATION => <operation>, PRIVILEGE => <privilege>, SCHEMA_NAME => <schema name>, OBJECT_NAME => <object name>, SPACE_ID => <space ID>);
Parameters are set as follows:

● operation [required]: Enter 'GRANT' to grant the read privileges to the space, or 'REVOKE' to remove them. Values: 'GRANT' or 'REVOKE'.
● privilege [required]: Enter the read privilege that you want to grant (or revoke). Value: 'SELECT'.
● schema_name [required]: Enter the name of the database user group schema that you want the space to be able to read from.
● object_name [required]: You can grant the read privileges either at the schema level or at the object level.
○ At the schema level (all objects in the schema): enter null or ' '.
○ At the object level: enter a valid table name.
● space_id [required]: Enter the ID of the space that you are granting the read privileges to.
To grant read access to all objects (tables) in the schema:

CALL "DWC_GLOBAL"."GRANT_PRIVILEGE_TO_SPACE" (
    OPERATION => 'GRANT',
    PRIVILEGE => 'SELECT',
    SCHEMA_NAME => 'SALE#ETL',
    OBJECT_NAME => '',
    SPACE_ID => 'SALES'
);

To grant read access to the table MY_TABLE:

CALL "DWC_GLOBAL"."GRANT_PRIVILEGE_TO_SPACE" (
    OPERATION => 'GRANT',
    PRIVILEGE => 'SELECT',
    SCHEMA_NAME => 'SALE#ETL',
    OBJECT_NAME => 'MY_TABLE',
    SPACE_ID => 'SALES'
);
4. Run the query by clicking (Run) or pressing F8.
Results
If the run is successful, you receive a confirmation message in the Result pane. You can then open the Data Builder, create a data flow, and select the tables as sources.
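The same stored procedure reverses a grant. As a sketch reusing the sample schema and space names from the examples above, revoking the schema-level read privilege looks like this:

```sql
-- Revoke the previously granted SELECT privilege at the schema level
-- (OPERATION => 'REVOKE' is documented in the parameter list above;
-- schema and space names are the sample values used in this section)
CALL "DWC_GLOBAL"."GRANT_PRIVILEGE_TO_SPACE" (
    OPERATION => 'REVOKE',
    PRIVILEGE => 'SELECT',
    SCHEMA_NAME => 'SALE#ETL',
    OBJECT_NAME => '',
    SPACE_ID => 'SALES'
);
```

After the revoke, views and data flows in the space can no longer read from the schema.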
7.9 Allow a Space to Write to the Database User Group Schema
To grant a space write privileges in the database user group schema, use the GRANT_PRIVILEGE_TO_SPACE stored procedure. Once this is done, data flows running in the space can select tables in the schema as targets and write data to them.
Prerequisites
Only the administrator of a database user group has the privilege to run the stored procedure "DWC_GLOBAL"."GRANT_PRIVILEGE_TO_SPACE".
Context
You can grant write privileges by running an SAP Data Warehouse Cloud specific stored procedure in the SQL console in the SAP HANA Database Explorer.
Procedure
1. From the side navigation area, go to (System) → (Configuration) → Database Access → Database User Groups.
2. Select the database user group and click Open Database Explorer.
3. In the SQL console in SAP HANA Database Explorer, call the stored procedure to grant the 'INSERT', 'UPDATE', or 'DELETE' privilege to a space using the following syntax:
CALL "DWC_GLOBAL"."GRANT_PRIVILEGE_TO_SPACE" (
    OPERATION => <operation>,
    PRIVILEGE => <privilege>,
    SCHEMA_NAME => <schema name>,
    OBJECT_NAME => <object name>,
    SPACE_ID => <space ID>
);
Parameters are set as follows:

● operation [required]: Enter 'GRANT' to grant the write privileges to the space, or 'REVOKE' to remove them. Values: 'GRANT' or 'REVOKE'.
● privilege [required]: Enter the write privilege that you want to grant (or revoke). Values: 'INSERT', 'UPDATE', or 'DELETE'.
Note: You can grant one privilege at a time.
● schema_name [required]: Enter the name of the database user group schema that you want the space to be able to write to.
● object_name [required]: You can grant the write privileges either at the schema level or at the object level.
○ At the schema level (all objects in the schema): enter null or ' '.
○ At the object level: enter a valid table name.
● space_id [required]: Enter the ID of the space that you are granting the write privileges to.
To grant update write access to all objects (tables) in the schema:

CALL "DWC_GLOBAL"."GRANT_PRIVILEGE_TO_SPACE" (
    OPERATION => 'GRANT',
    PRIVILEGE => 'UPDATE',
    SCHEMA_NAME => 'SALE#ETL',
    OBJECT_NAME => '',
    SPACE_ID => 'SALES'
);

To grant update write access to the table MY_TABLE:

CALL "DWC_GLOBAL"."GRANT_PRIVILEGE_TO_SPACE" (
    OPERATION => 'GRANT',
    PRIVILEGE => 'UPDATE',
    SCHEMA_NAME => 'SALE#ETL',
    OBJECT_NAME => 'MY_TABLE',
    SPACE_ID => 'SALES'
);
4. Run the query by clicking (Run) or pressing F8.
Results
If the run is successful, you receive a confirmation message in the Result pane. You can then open the Data Builder, create a data flow, and select the tables as targets.
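Because only one privilege can be granted per call, granting full write access to a table takes one call per privilege. A sketch, reusing the sample names from the examples above:

```sql
-- Grant INSERT, UPDATE, and DELETE on MY_TABLE, one privilege per call
-- (the procedure accepts a single privilege at a time, per the note above)
CALL "DWC_GLOBAL"."GRANT_PRIVILEGE_TO_SPACE" ( OPERATION => 'GRANT', PRIVILEGE => 'INSERT', SCHEMA_NAME => 'SALE#ETL', OBJECT_NAME => 'MY_TABLE', SPACE_ID => 'SALES');
CALL "DWC_GLOBAL"."GRANT_PRIVILEGE_TO_SPACE" ( OPERATION => 'GRANT', PRIVILEGE => 'UPDATE', SCHEMA_NAME => 'SALE#ETL', OBJECT_NAME => 'MY_TABLE', SPACE_ID => 'SALES');
CALL "DWC_GLOBAL"."GRANT_PRIVILEGE_TO_SPACE" ( OPERATION => 'GRANT', PRIVILEGE => 'DELETE', SCHEMA_NAME => 'SALE#ETL', OBJECT_NAME => 'MY_TABLE', SPACE_ID => 'SALES');
```

Running the three calls in sequence in the SQL console gives the space full write access to the table.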
8 Monitoring and Troubleshooting SAP Data Warehouse Cloud
Administrators have access to various monitoring logs and views, and can create database analysis users, if necessary, to help troubleshoot issues.
8.1 Database Analysis User
Connect to your SAP HANA Cloud database with this application-wide user to analyze, diagnose and solve database issues.
The database analysis user is a database user with additional privileges to temporarily support monitoring, analyzing, tracing, and debugging of database issues.
This user is intended for situations where ordinary database users lack the required privileges and should therefore only be created when a specific task requires it. It's not intended to be used as a standby user and should be deleted immediately after finishing the required task to avoid misuse.
Note
Only create this user for a specific task and delete it right after the task has been completed. Never keep a database analysis user longer than intended, and do not create this user if another user has sufficient privileges to complete the task. We strongly recommend creating this user with an automatic expiration date. This user can access all SAP HANA Cloud monitoring views and all SAP Data Warehouse Cloud metadata (read permission) and thus has access to all data in all spaces, including all sensitive data. It must, therefore, be created with great caution.
This user can only be created by an administrator with a DW Administrator role.
SAP Data Warehouse Cloud provides direct access to the SAP HANA Cloud database explorer in the user interface for use as an analysis tool.
Capabilities
● Use EXPLAIN PLAN: Using this command, a user can see the execution plan of a subquery, or that of an entry already in the SQL plan cache. For more information, see EXPLAIN PLAN Statement (Data Manipulation).
● Access SAP HANA Cloud monitoring views (expensive statements): SAP HANA Cloud includes a set of runtime views called monitoring views that provide actual SAP HANA Cloud runtime data, including statistics and status information related to the execution of DML statements. These views are useful for monitoring and troubleshooting performance. The data in monitoring views is not stored on disk; it is calculated when you execute a query on one of the views. For more information, see Monitoring Views.
● Configure tracing and access trace files: SAP HANA Cloud provides various traces for obtaining detailed information about the actions of the database system for troubleshooting and error analysis. For more information, see Traces.
● Access to all space data: This user is granted access to all data in all spaces in SAP Data Warehouse Cloud, including highly sensitive data.
Privileges
List the currently available user privileges with the following statement:

select * from effective_privileges where user_name = current_user;

● Full access to space data
● Full access to space metadata
● SAP HANA Cloud system privileges:
○ ATTACH DEBUGGER
○ CATALOG READ
○ EXPORT
○ TRACE ADMIN
○ WORKLOAD ANALYZE ADMIN
○ WORKLOAD CAPTURE ADMIN
○ WORKLOAD REPLAY ADMIN
● SAP HANA objects:
○ SELECT ON SCHEMA "_SYS_STATISTICS"
○ SELECT ON "_SYS_EPM"."MDS_METADATA_DOCUMENTS"
○ SELECT ON "SYS"."M_SERVICES"
○ SELECT ON "SYS"."M_SERVICE_MEMORY"
○ SELECT ON "SYS"."M_HEAP_MEMORY_RESET"
○ SELECT ON "SYS"."M_SERVICE_STATISTICS"
○ EXECUTE ON "SYS"."MEASURE_NETWORK_IO"
○ EXECUTE ON "SYS"."MANAGEMENT_CONSOLE_PROC"
○ EXECUTE ON "SYS"."GET_INSUFFICIENT_PRIVILEGE_ERROR_DETAILS"
Auditing
All actions of the database analysis user are audited. The logs are saved as a view called ANALYSIS_AUDIT_LOG and are stored in the space that has been assigned to store audit logs. For more information on auditing, see Enable Audit Logging.
Related Information
Creating Database Analysis User [page 162]
Deleting Database Analysis User [page 163]
8.1.1 Creating Database Analysis User
Create a database analysis user to solve database issues.
Prerequisites
To create or delete a database analysis user you need to have the DW Administrator role assigned to you.
Note
Only create this user when a specific task arises and delete it right after the task has been completed. Never keep a database analysis user longer than intended, and do not create this user if another user has sufficient privileges to complete the task. We strongly recommend creating this user with an automatic expiration date; that way, the user is no longer active after a period of time. This user can access all SAP HANA monitoring views and all SAP Data Warehouse Cloud metadata (read permission) and thus has access to all data in all spaces, including all sensitive data. It must, therefore, be created with great caution.
Context
This database user has additional application privileges to temporarily support monitoring, analyzing, tracing, and debugging database issues.
Procedure
1. From the side navigation area go to (System) → (Configuration) → Database Access → Database Analysis Users.
If you've provisioned SAP Data Warehouse Cloud prior to version 2021.03, you'll see a different UI and need to use a different path: (Configuration) → Database Access → Database Analysis User.
2. Click Create and give your user a user name suffix. This suffix is added to the default prefix to create your full user name. Note that you cannot change the prefix as it is a reserved prefix (see Rules for Technical Names [page 10]).
3. Set the expiration time for your user. After the user expires, it is no longer active but still appears in the list. If an expired user is still required, you can reactivate it by selecting the user, clicking the info button, and selecting Reactivate Analysis User. We strongly recommend deleting the user once the task has been completed. Setting an automatic expiration time ensures that the user is no longer active when not needed.
4. Click Create.
Results
Details like user name, hostname, port and password are provided. You need these credentials to connect to the underlying SAP HANA Cloud database. For the initial connection, you can select a database analysis user from the list and then open SAP HANA database explorer.
Related Information
Database Analysis User [page 160]
Deleting Database Analysis User [page 163]
Connecting with SAP HANA Cockpit [page 164]
8.1.2 Deleting Database Analysis User
Delete your database analysis user as soon as the support task is completed.
Context
We strongly recommend deleting this user once the support task has been fulfilled to avoid misuse of sensitive data.
Procedure
1. From the side navigation area go to (System) → (Configuration) → Database Access → Database Analysis Users.
If you've provisioned SAP Data Warehouse Cloud prior to version 2021.03, you'll see a different UI and need to use a different path: (Configuration) → Database Access → Database Analysis User.
2. Select the user you want to delete and then click Delete.
Related Information
Database Analysis User [page 160]
Creating Database Analysis User [page 162]
8.1.3 Connecting with SAP HANA Cockpit
The SAP HANA cockpit provides a single point of access to a range of tools to analyze and monitor your underlying SAP HANA database via a database analysis user.
Procedure
1. In the side navigation area, click (System) → (Configuration) → Database Access → Database Analysis Users.
If you've provisioned SAP Data Warehouse Cloud prior to version 2021.03, you'll see a different UI and need to use a different path: (Configuration) → Database Access → Database Analysis User.
2. Select the database analysis user that you want to use to connect to SAP HANA cockpit.
3. For the initial connection, you'll need the password. Click (Information) to find the password and copy it.
4. Click Open SAP HANA Cockpit and enter the database analysis user name and password.
Results
Once connected, you can use SAP HANA Cockpit to perform advanced monitoring and analysis tasks at the database level. For more information on SAP HANA Cockpit, go to the SAP HANA Cockpit product page. For more information on the privileges, see Database Analysis User [page 160].
8.2 Setting up a Monitoring Space in SAP Data Warehouse Cloud
Define a space that is dedicated to monitoring SAP Data Warehouse Cloud (such as monitoring the database for resource consumption).
Prerequisites
You need the DW Administrator role to access the Monitoring page and select a space.
Context
Monitoring information provided by monitoring views can be sensitive as it includes information on all spaces and views. This is why these views are not accessible to all users by default in SAP Data Warehouse Cloud. However, as an administrator, you can select a space dedicated to monitoring information, in which SAP HANA monitoring views are available. The users who are assigned to the space and have modeling privileges can then access the monitoring views in the Data Builder.
The following views are made available in this space:

● SAP HANA monitoring views from the SYS schema.
● SAP HANA views from the _SYS_STATISTICS schema.
● Monitoring views dedicated to SAP Data Warehouse Cloud from the DWC_GLOBAL schema (providing data on logging and scheduling). For more information, see Monitoring Tasks, Logs and Schedules With Dedicated Monitoring Views [page 166].
Note
If you have already selected a space dedicated to monitoring before version 2021.19, you need to select another space, then select the initial space again so that you can access all the views.
In the second area of the Monitoring page, you can configure the expensive statement views, which are monitoring views that allow you to analyze individual SQL queries whose execution exceeds one or more thresholds that you specify.
Procedure
1. Go to (Configuration) → Monitoring.
2. Select a space from the drop-down list and click Confirm Selected Space. We recommend creating a dedicated space for monitoring, as you might not want all users to view sensitive data.
3. To trace expensive statements, select Enable Expensive Statement Tracing and specify the following parameters to configure and filter the trace details.
● Threshold CPU Time: Specifies the threshold CPU time of statement execution. When set to 0, all SQL statements are traced.
● Threshold Memory: Specifies the threshold memory usage of statement execution. When set to 0, all SQL statements are traced.
● Threshold Duration: Specifies the threshold execution time. When set to 0, all SQL statements are traced.
● Trace Parameter Values: In SQL statements, field values may be specified as parameters (using a "?" in the syntax). If these parameter values are not required, do not select this option, to reduce the amount of data traced.
For more information about these parameters, see Expensive Statements Trace in the SAP HANA Cloud, SAP HANA Database Administration Guide.
Results
The monitoring views are available in the Data Builder editors, in the Sources tab of the Source Browser:
● SAP HANA monitoring views under the SYS schema. All SAP HANA monitoring views start with M_. For more information about all the monitoring views available, see Monitoring Views in the SAP HANA Cloud SQL Reference Guide. The views for expensive statements are M_EXPENSIVE_STATEMENTS and M_EXPENSIVE_STATEMENT_EXECUTION_LOCATION_STATISTICS. For more information, see M_EXPENSIVE_STATEMENTS and M_EXPENSIVE_STATEMENT_EXECUTION_LOCATION_STATISTICS in the SAP HANA Cloud SQL Reference Guide.
● SAP HANA views under the _SYS_STATISTICS schema. For more information, see Embedded Statistics Service Views (_SYS_STATISTICS schema).
● Monitoring views dedicated to SAP Data Warehouse Cloud under the DWC_GLOBAL schema. For more information, see Monitoring Tasks, Logs and Schedules With Dedicated Monitoring Views [page 166].
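Once the monitoring space is configured, users with modeling privileges can query these views like any other source. As an illustrative sketch (the column names are taken from the standard SAP HANA M_EXPENSIVE_STATEMENTS monitoring view; verify them against the SQL Reference Guide for your release), the slowest traced statements could be listed with:

```sql
-- List the ten longest-running traced statements
-- (requires Enable Expensive Statement Tracing to be switched on)
SELECT TOP 10
    START_TIME,
    DURATION_MICROSEC,
    STATEMENT_STRING
FROM M_EXPENSIVE_STATEMENTS
ORDER BY DURATION_MICROSEC DESC;
```

The result shows which statements exceeded the configured thresholds and by how much, which is a good starting point for tuning.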
Troubleshooting
After the configuration, if you face authorization issues due to insufficient privileges on the monitoring views (suffix _V_EXT) in your monitoring space, choose another space as the monitoring space in the configuration UI, and then select the current monitoring space again.
8.3 Monitoring Tasks, Logs and Schedules With Dedicated Monitoring Views
Monitor the execution of tasks and schedules in a space.
As an administrator, you need to keep an eye on how tasks and schedules are running within a space. Gathering information from different logs can be time-consuming, so you need an easy way to collect this information.
166 PUBLICAdministering SAP Data Warehouse Cloud
Monitoring and Troubleshooting SAP Data Warehouse Cloud
Using monitoring views dedicated to SAP Data Warehouse Cloud from the DWC_GLOBAL schema is the solution. Four monitoring views are ready to use in the DWC_GLOBAL schema and can be recognized by the suffix _V_EXT in their names:
● TASK_SCHEDULES_V_EXT:
○ SPACE_ID (key): Identifier of the SAP Data Warehouse Cloud space which contains the object with the defined schedule.
○ OBJECT_ID (key): Identifier of the SAP Data Warehouse Cloud object for which the schedule is defined.
○ APPLICATION_ID (key): Identifier of the type of object. Values: VIEWS, REMOTE_TABLES, DATA_FLOWS.
○ ACTIVITY (key): Identifier of the type of activity applied to the object. Values: PERSIST (View), EXECUTE (Data Flow), REPLICATE (Remote Tables). Note: For each application, you can have multiple activities (for example, replicating or deleting data).
○ OWNER: Identifier of the user responsible for the schedule; the schedule is executed on this user's behalf and consent is checked against it. Value: <DWC User ID>.
○ CRON: Defines the recurrence of a schedule. For example: "0 */1 * * *" for hourly.
○ CHANGED_BY: User who last changed the schedule configuration.
○ CHANGED_AT: Timestamp (date and time) at which the schedule was last changed.
● TASK_LOGS_V_EXT:
○ TASK_LOG_ID (key): Uniquely identifies an execution of a task.
○ SPACE_ID: Identifier of the SAP Data Warehouse Cloud space which contains the object with the defined schedule.
○ APPLICATION_ID: Identifier of the type of object.
○ OBJECT_ID: Identifier of the SAP Data Warehouse Cloud object for which the schedule is defined.
○ ACTIVITY: For each application there can be multiple activities, for example, replicating or deleting data.
○ START_TIME: Timestamp (date and time) at which the scheduled task was started.
○ END_TIME: Timestamp (date and time) at which the scheduled task was stopped.
○ STATUS: Reports whether this task execution is still running, completed, or failed.
○ TRIGGERED_TYPE: Indicates whether the task execution was triggered manually (DIRECT) or via a schedule (SCHEDULED).
○ APPLICATION_USER: The user on whose behalf the schedule was executed (the owner at this point in time).
○ DURATION: Duration of the task execution (also works for ongoing executions).
○ START_DATE: Date when the scheduled task was started.
● TASK_LOG_MESSAGES_V_EXT:
○ TASK_LOG_ID (key): Uniquely identifies an execution of a task.
○ MESSAGE_NO (key): Order sequence of all messages belonging to a certain task log ID.
○ SEVERITY: Indicates whether the message provides general information (INFO) or error information (ERROR).
○ TEXT: The message itself.
○ DETAILS: Additional technical information. For example, it can be an error stack or a correlation ID.
● TASK_LOCKS_V_EXT:
○ LOCK_KEY (key): Identifier; a flexible field that is part of the lock identifier, usually set to WRITE.
○ APPLICATION_ID (key): Identifier of the type of object.
○ SPACE_ID (key): Identifier of the SAP Data Warehouse Cloud space which contains the object with the defined schedule.
○ OBJECT_ID (key): Identifier of the SAP Data Warehouse Cloud object for which the schedule is defined.
○ TASK_LOG_ID: Uniquely identifies the task execution that set the lock.
○ CREATION_TIME: Indicates when the lock was set.
Note
Cross-space sharing is active for all HANA monitoring views. The row-level access of shared views is bound to the space read access privileges of the user who consumes the view.
You can then choose a space that is dedicated to task framework monitoring. For more information, see Setting up a Monitoring Space in SAP Data Warehouse Cloud [page 165].
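For example, the views can be joined on TASK_LOG_ID to pull the log messages for failed task executions. This is a sketch based on the columns listed above; the exact casing of the STATUS values may differ in your tenant, so check a few rows of TASK_LOGS_V_EXT first:

```sql
-- Show the messages attached to failed task executions,
-- newest first ('FAILED' is an assumed status value)
SELECT l.SPACE_ID, l.OBJECT_ID, l.START_TIME, m.SEVERITY, m.TEXT
FROM TASK_LOGS_V_EXT AS l
JOIN TASK_LOG_MESSAGES_V_EXT AS m
    ON m.TASK_LOG_ID = l.TASK_LOG_ID
WHERE l.STATUS = 'FAILED'
ORDER BY l.START_TIME DESC, m.MESSAGE_NO;
```

A view built on this query in the Data Builder gives space users a single place to review failed runs instead of opening each task log individually.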
8.4 Monitor Database Operations with Audit Logs
Monitor the read and change actions performed in the database with audit logs, and see who did what and when.
If Space Administrators have enabled audit logs to be created for their space (see Enable Audit Logging), you can get an overview of these audit logs. You can do analytics on audit logs by assigning the audit views to a dedicated space and then working with them in a view in the Data Builder.
Note
Audit logs can consume a large amount of disk space in your database, especially when combined with long retention periods (which are defined at the space level). You can delete audit logs when needed, which will free up disk space. For more information, see Delete Audit Logs [page 169].
1. Choose a space that will contain the audit logs: go to (System) → (Configuration) → Audit and choose a space from the drop-down list to save, and later display, the audit logs in that space. We recommend creating a dedicated space for audit logs, as you might not want all users to view sensitive data.
2. Open the Data Builder, create a view, and add one or more of the following views from the DWC_AUDIT_READER schema as sources:
○ DPP_AUDIT_LOG - Contains audit log entries.
○ AUDIT_LOG_OVERVIEW - Contains audit policies (read or change operations) and the number of audit log entries.
○ ANALYSIS_AUDIT_LOG - Contains audit log entries for database analysis users. For more information, see Database Analysis User [page 160].
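Before modeling a view, you could, for example, get a quick sense of audit volumes by querying the overview view directly in an SQL console. A minimal sketch, assuming your user has read access to the DWC_AUDIT_READER schema:

```sql
-- Inspect the audit policies and the number of log entries per policy
SELECT * FROM "DWC_AUDIT_READER"."AUDIT_LOG_OVERVIEW";
```

The entry counts per policy help you decide which spaces need shorter retention periods or a log cleanup.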
8.4.1 Delete Audit Logs
Delete audit logs and free up disk space.
All spaces for which auditing is enabled are listed in the Audit Log Deletion area.
For each space, you can separately delete all the audit log entries recorded for read operations and all the audit log entries recorded for change operations. All the entries recorded before the date and time you specify are deleted.
1. Go to (System) → (Configuration) → Audit.
2. Select the spaces and the audit policy names (read or change) for which you want to delete all audit log entries and click Delete.
3. Select a date and time and click Delete.
All entries that were recorded before this date and time are deleted. Deleting audit logs frees up disk space, which you can see in the Used Disk bar of the Space Management page.
8.5 Monitor Object Changes with Activities
Changes to modeling objects (such as spaces and tables), as well as changes to the system configuration, are logged in (Security) → Activities.
8.6 Configuring Notifications
Configure notifications about system events and network connection issues, and define the SMTP server to be used for email deliveries.
Notify All Users about Network Connection Issues
When there are problems with a system, your users would like to know whether it is something that they control or if the issues are related to the network. You can't create messages for all situations, but you can let them know when the network connection is unstable.
To turn on the connection notification:
1. In the side navigation area, click (System) → Administration → Notifications.
2. To enable editing of all settings on the page, click Edit.
3. In the Connections Notifications section, change the toggle to ON.
4. Click Save to commit your changes.
When the notification is on, everyone who uses the application on that tenant will see the notification in the top right corner of their application.
Configure Custom SMTP Server
Configuring an email server of your choice ensures greater security and flexibility while delivering email for your business.
1. In the side navigation area, click (System) → Administration → Notifications.
2. To enable editing of all settings on the page, click Edit.
3. In the Email Server Configuration section, select Custom, and complete the following properties.
4. Click Check Configuration or Save to validate the configuration details.
8.7 Requesting Help from SAP Support
You can request help from SAP Product Support by creating a support incident. In many cases, you should create a support user to allow an SAP support engineer to log into and troubleshoot your system.
You can create an SAP support incident on the SAP Support Portal. For detailed information about what to include in an incident, see SAP Note 2854764 .
An administrator can create a special support user to allow an SAP support engineer to connect to your system to troubleshoot the problem. This user has minimum privileges and does not consume a user license. You can delete this support user when your issue is resolved.
To create a support user, select from the main toolbar, choose Create Support User, and then choose OK in the dialog. A message is displayed once the support user is created.
An email is automatically sent to SAP Support to notify them of the newly created support user, and the user is listed with your other users at (Security) → Users.
For more information, see SAP Note 2891554 .
Important Disclaimers and Legal Information
Hyperlinks
Some links are classified by an icon and/or a mouseover text. These links provide additional information.
About the icons:
● Links with the icon : You are entering a Web site that is not hosted by SAP. By using such links, you agree (unless expressly stated otherwise in your agreements with SAP) to this:
○ The content of the linked-to site is not SAP documentation. You may not infer any product claims against SAP based on this information.
○ SAP does not agree or disagree with the content on the linked-to site, nor does SAP warrant the availability and correctness. SAP shall not be liable for any damages caused by the use of such content unless damages have been caused by SAP's gross negligence or willful misconduct.
● Links with the icon : You are leaving the documentation for that particular SAP product or service and are entering an SAP-hosted Web site. By using such links, you agree that (unless expressly stated otherwise in your agreements with SAP) you may not infer any product claims against SAP based on this information.
Videos Hosted on External Platforms
Some videos may point to third-party video hosting platforms. SAP cannot guarantee the future availability of videos stored on these platforms. Furthermore, any advertisements or other content hosted on these platforms (for example, suggested videos or by navigating to other videos hosted on the same site) are not within the control or responsibility of SAP.
Beta and Other Experimental Features
Experimental features are not part of the officially delivered scope that SAP guarantees for future releases. This means that experimental features may be changed by SAP at any time for any reason without notice. Experimental features are not for productive use. You may not demonstrate, test, examine, evaluate or otherwise use the experimental features in a live operating environment or with data that has not been sufficiently backed up.
The purpose of experimental features is to get feedback early on, allowing customers and partners to influence the future product accordingly. By providing your feedback (e.g. in the SAP Community), you accept that intellectual property rights of the contributions or derivative works shall remain the exclusive property of SAP.
Example Code
Any software coding and/or code snippets are examples. They are not for productive use. The example code is only intended to better explain and visualize the syntax and phrasing rules. SAP does not warrant the correctness and completeness of the example code. SAP shall not be liable for errors or damages caused by the use of example code unless damages have been caused by SAP's gross negligence or willful misconduct.
Bias-Free Language
SAP supports a culture of diversity and inclusion. Whenever possible, we use unbiased language in our documentation to refer to people of all cultures, ethnicities, genders, and abilities.
www.sap.com/contactsap
© 2022 SAP SE or an SAP affiliate company. All rights reserved.
No part of this publication may be reproduced or transmitted in any form or for any purpose without the express permission of SAP SE or an SAP affiliate company. The information contained herein may be changed without prior notice.
Some software products marketed by SAP SE and its distributors contain proprietary software components of other software vendors. National product specifications may vary.
These materials are provided by SAP SE or an SAP affiliate company for informational purposes only, without representation or warranty of any kind, and SAP or its affiliated companies shall not be liable for errors or omissions with respect to the materials. The only warranties for SAP or SAP affiliate company products and services are those that are set forth in the express warranty statements accompanying such products and services, if any. Nothing herein should be construed as constituting an additional warranty.
SAP and other SAP products and services mentioned herein as well as their respective logos are trademarks or registered trademarks of SAP SE (or an SAP affiliate company) in Germany and other countries. All other product and service names mentioned are the trademarks of their respective companies.
Please see https://www.sap.com/about/legal/trademark.html for additional trademark information and notices.
THE BEST RUN