Sunday 29 August 2021

Freight Collaboration with SAP Logistics Business network (LBN) and its integration with SAP Transportation Management (TM)

SAP has developed and released a cloud application called SAP Logistics Business Network (LBN) for networking and communicating with the partners (shippers, carriers, customs authorities, etc.) involved in logistics (warehousing and transportation) operations. LBN has three components:

1. Freight collaboration

2. Global track & trace (GTT)

3. Material traceability

LBN can be integrated with SAP ECC, S/4HANA, SAP Transportation Management (TM), and SAP Extended Warehouse Management (EWM).

Friday 27 August 2021

CAP: Access the HANA DB Behind It (via HANA Database Explorer)

When you create a CAP application with HANA as the database, it is not very intuitive (as a developer) to find out how to explore the application's tables and views. The same method can also be used to load initial data (via xlsx or csv files) into your tables during production cutover.

In this short blog, I will explain how to do that.

What won’t work?

If you go to your HANA Cloud instance (in the space) and open the HANA Database Explorer, it actually tries to open the explorer via the HANA Cockpit, so it requires you to know the DBADMIN credentials, which you, as a developer, will not have. Moreover, sharing the DBADMIN credentials is not safe: this single HANA instance may be serving the sandbox, development, and QA landscapes, and DBADMIN can view and edit the tables of all applications in all of those landscapes.

Wednesday 25 August 2021

Key Concepts in Privacy Technologies


We live in a digital age where our personal data is collected, processed, and disclosed at an unprecedented rate. Various privacy technologies and privacy risk management frameworks are evolving to address the growing need to protect personal and sensitive data. With privacy recognized as a fundamental right in many jurisdictions and by international organizations, governments around the world are racing to enact privacy regulations that make it mandatory for organizations collecting and processing personal data to adhere to the core principles of data protection and privacy. In particular, there has been greater emphasis on privacy-enhancing technologies, such as those for consent management, data minimization, data tracking, data anonymization, de-identification, pseudonymization, encryption, tokenization, masking, obfuscation, access control, identity, authentication, and authorization. In this blog, we will review some of the key data privacy technologies commonly deployed and how SAP HANA supports them, giving our customers, as data controllers, tools to meet compliance requirements. For the sake of simplicity, only high-level concepts are presented in this blog.
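Of the techniques listed above, pseudonymization is among the easiest to illustrate in plain SQL. The sketch below (the table and column names are invented for illustration) uses HANA's built-in HASH_SHA256 function to replace a direct identifier with a one-way digest before the data is exposed to analysts:

```sql
-- Pseudonymization by one-way hashing: analysts can still group and
-- count per customer, but cannot recover the original e-mail address
CREATE VIEW SALES_PSEUDONYMIZED AS
SELECT HASH_SHA256(TO_VARBINARY(CUSTOMER_EMAIL)) AS CUSTOMER_PSEUDONYM,
       ORDER_DATE,
       ORDER_AMOUNT
  FROM SALES_ORDERS;
```

Note that hashing alone is pseudonymization, not anonymization: the remaining quasi-identifiers may still allow re-identification, which is where techniques such as k-anonymity come in.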

Friday 20 August 2021

Run an XSA Application (UI Part) Locally with Visual Studio Code

I have been working on XSA for a while; SAP Web IDE for SAP HANA is a bit outdated (especially for UI5 development) and sometimes not that performant. Recently I managed to move an XSA application to Visual Studio Code and run the UI part (for now) locally. This blog post shows how I moved it and the solutions to the issues I encountered during the setup.

Note: This blog post is about moving your XSA development (at least the UI part) to Visual Studio Code, not about using Visual Studio Code to develop an XSA app from scratch.

Some prerequisites:

1. Visual Studio Code + plugins

2. Node.js + Git

3. A running XSA application in your XSA environment (to get the destination and service binding info)

Wednesday 18 August 2021

Near Real-Time Data Replication from Salesforce to SAP HANA Using the Advantco Salesforce Adapter for SDI


Salesforce provides APIs that enable external applications to receive events in near real time. In this blog, we provide detailed configuration steps for two of these options: Platform Event and Change Data Capture (CDC). Platform Event is an excellent option when data from multiple objects is required, as in the case of combining data from the Opportunity object with details from the Account object. Change Data Capture (CDC) provides a quick and configurable option to publish data of a specific object, like Account or Contact.

The solution we describe here is based on the SAP Smart Data Integration (SDI) tools, which provide features to support data replication from external sources to SAP HANA. The two most important components in this context are the Data Provisioning Server, which is a native SAP HANA process, and the Data Provisioning Agent, which is a container running outside the HANA environment. The Advantco Salesforce adapter is deployed on the Data Provisioning Agent host and can be configured from the HANA environment during the creation of a remote source. The remote source is the connection from SAP HANA to the Salesforce instance.
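As a rough sketch of the SDI side of such a setup (the remote source, schema, and subscription names below are placeholders, and the exact adapter configuration comes from Advantco's documentation), real-time replication of the Account object could look like this:

```sql
-- Virtual table over the Salesforce Account object, through a remote
-- source (here named "SFDC") created on the Advantco adapter
CREATE VIRTUAL TABLE MYSCHEMA.VT_ACCOUNT
    AT "SFDC"."<NULL>"."<NULL>"."Account";

-- Local target table with the same structure
CREATE COLUMN TABLE MYSCHEMA.SF_ACCOUNT
    LIKE MYSCHEMA.VT_ACCOUNT WITH NO DATA;

-- Remote subscription: the SDI mechanism for change data capture
CREATE REMOTE SUBSCRIPTION MYSCHEMA.SUB_ACCOUNT
    ON MYSCHEMA.VT_ACCOUNT
    TARGET TABLE MYSCHEMA.SF_ACCOUNT;

ALTER REMOTE SUBSCRIPTION MYSCHEMA.SUB_ACCOUNT QUEUE;
-- perform the initial load here, e.g. INSERT INTO the target table
-- with a SELECT from the virtual table, then switch to streaming:
ALTER REMOTE SUBSCRIPTION MYSCHEMA.SUB_ACCOUNT DISTRIBUTE;
```

The QUEUE/DISTRIBUTE sequence is the standard SDI pattern: changes arriving during the initial load are queued and then applied, so no events are lost.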

Monday 16 August 2021

SAP S/4HANA Cloud Fit-to-standard approach for technical areas


There are lots of good SAP blog posts that describe the Fit-to-standard methodology. Many of them highlight the main points for business-related topics. I would like to sum up my experience with the SAP S/4HANA Cloud implementation Fit-to-standard methodology for technical topics. The following streams in SAP S/4HANA Cloud implementation projects require more technical knowledge than business knowledge:

  • Integration
  • Migration
  • Output Management
  • Embedded Analytics
  • Master Data Management
  • Extensibility
  • Security

Friday 13 August 2021

Integrating SAP HANA Data Lake to Google Big Query – DL2BQ

A Simple Architecture:


Prerequisites: You must have your BTP trial account up and running, your data lake instance must also be running, and you need your credentials ready for an open database connection.

You should also have your GCP trial account ready, and make sure you have downloaded the GCP credentials in JSON format locally on your system.

Tuesday 10 August 2021

Error Handling in HANA

Requirement –

This blog explains how to implement error handling in HANA SQL to maintain data reliability, durability, and consistency during the execution of multiple DML statements in a single code block.

Implementation Scenario –

Let’s say we have a HANA stored procedure or SQL code block with five INSERT statements inserting data into a table (HXE_SAMPLE.ERROR_TEST) that has a primary key defined on the ID column.

Sample Table Structure –
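Since the table screenshot is not reproduced here, the sketch below assumes a minimal structure for HXE_SAMPLE.ERROR_TEST (only the primary key on ID is given in the text) and shows one common SQLScript pattern: an EXIT handler that rolls back all inserts when any single one of them fails:

```sql
-- Assumed structure: only the primary key on ID is given in the text
CREATE COLUMN TABLE HXE_SAMPLE.ERROR_TEST (
    ID          INTEGER PRIMARY KEY,
    DESCRIPTION NVARCHAR(100)
);

CREATE PROCEDURE HXE_SAMPLE.LOAD_ERROR_TEST AS
BEGIN
    -- If any insert fails (e.g. a duplicate ID), undo all of them
    -- and report the error instead of leaving a partial load behind
    DECLARE EXIT HANDLER FOR SQLEXCEPTION
    BEGIN
        ROLLBACK;
        SELECT ::SQL_ERROR_CODE    AS ERROR_CODE,
               ::SQL_ERROR_MESSAGE AS ERROR_MESSAGE
          FROM DUMMY;
    END;

    INSERT INTO HXE_SAMPLE.ERROR_TEST VALUES (1, 'first');
    INSERT INTO HXE_SAMPLE.ERROR_TEST VALUES (2, 'second');
    INSERT INTO HXE_SAMPLE.ERROR_TEST VALUES (3, 'third');
    INSERT INTO HXE_SAMPLE.ERROR_TEST VALUES (4, 'fourth');
    INSERT INTO HXE_SAMPLE.ERROR_TEST VALUES (1, 'duplicate'); -- violates PK
    COMMIT;
END;
```

With this handler in place, the failing fifth insert causes the first four to be rolled back as well, which is exactly the all-or-nothing behavior the scenario calls for.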


Monday 9 August 2021

Aggregate Data from Multiple SAP HANA Sources to One SAP HANA Cloud, HANA data lake IQ

With SAP HANA Cloud, hybrid landscapes (on-premise systems working with cloud systems) have become easily attainable. However, when doing aggregate data analysis, it might be easier to have the data of interest in a single source to keep data types and functionality consistent. We will see that with a HANA Cloud database and remote connections, the movement of data from multiple HANA instances to an SAP HANA Cloud data lake can be done from a single SQL console!

Today, I am going to bring data from an on-premise SAP HANA system together with an SAP HANA Cloud database in a single SAP HANA data lake. I will start from a landscape which has an SAP HANA Cloud database, SAP HANA database, and SAP HANA data lake connected.
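Conceptually, the flow described here can be sketched in SQL from the HANA Cloud console; the remote source, schema, and table names below are placeholders, and the data lake relational container schema is typically SYSRDL#CG in SAP HANA Cloud:

```sql
-- Virtual table pointing at a table in the on-premise HANA system
-- (assumes a remote source ONPREM_HANA has already been created)
CREATE VIRTUAL TABLE MYSCHEMA.VT_SALES_ONPREM
    AT "ONPREM_HANA"."<NULL>"."SALESSCHEMA"."SALES";

-- Move the on-premise data into the data lake IQ table
INSERT INTO "SYSRDL#CG".SALES_ALL
    SELECT * FROM MYSCHEMA.VT_SALES_ONPREM;

-- Append the HANA Cloud database's own data directly
INSERT INTO "SYSRDL#CG".SALES_ALL
    SELECT * FROM MYSCHEMA.SALES_CLOUD;
```

Both statements run from the same SQL console, which is the point of the post: one HANA Cloud connection orchestrates the whole movement.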

Wednesday 4 August 2021

Monitoring & Analysis for SAP HANA Platform

This post aims to familiarize the audience with the real-time monitoring capabilities of the SAP HANA platform. System administrators maintain the integrity and consistency of the SAP HANA platform by performing regular system monitoring, which helps identify system behavioral patterns. A combination of monitoring tools and checks provides a detailed technical health check of the overall system and helps identify and forecast requirements against possible data and hardware bottlenecks.

An adequate system monitoring exercise involves continuous monitoring of the following environmental components, among others:
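Several of these checks can be scripted directly against HANA's monitoring views. A small illustration (the views are standard SYS monitoring views; the 90% threshold is just an example):

```sql
-- CPU and memory utilization per host
SELECT HOST,
       FREE_PHYSICAL_MEMORY,
       USED_PHYSICAL_MEMORY,
       TOTAL_CPU_USER_TIME
  FROM M_HOST_RESOURCE_UTILIZATION;

-- Disk usage, e.g. to spot volumes running out of space
SELECT HOST, USAGE_TYPE, TOTAL_SIZE, USED_SIZE
  FROM M_DISKS
 WHERE USED_SIZE > 0.9 * TOTAL_SIZE;
```

Queries like these can be scheduled or wired into alerting so that bottlenecks are caught before they affect end users.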

Monday 2 August 2021

How To Do DTO in BW/4HANA Step by Step

This document provides the details of a DTO implementation in BW/4HANA with SAP IQ 16.x as the NLS.

Data Tiering Optimization (DTO) helps SAP BW/4HANA customers classify the data stored in a DataStore object (advanced) as hot, warm, or cold, depending on the cost and performance requirements for the data.

Depending on this classification and how the data is used, the data is stored in different storage areas.