Tuesday 27 April 2021

Planning UAT across SAP S/4HANA and SAP IBP: 5 Lessons from the Real World

SAP implementation project delivery is heavy lifting when you implement the full S/4HANA or SAP IBP suite on its own. It is an order of magnitude more complex when you need to implement S/4HANA and the full IBP suite together in a greenfield environment. This is even more the case if you are implementing IBP Order-Based Planning alongside S/4HANA against complex business requirements, such as configurable products.

I would like to share lessons learned in planning User Acceptance Testing (UAT) across S/4HANA and IBP: how to plan when you have common testers across S/4 and IBP and mitigate the resulting bottleneck risk, how to develop a test plan for a highly complex solution, and how to balance the conflicting needs of business teams and IT/PMO teams when it comes to substance vs. schedule.

Monday 26 April 2021

Querying ABAP CDS views from an ArcGIS HANA tenant

More and more of our customers that run SAP ERP and ArcGIS Enterprise are knocking down silos between GIS and transactional data by referencing their transactional data from ArcGIS Enterprise. This is made possible by putting an ArcGIS geodatabase on HANA or HANA Cloud and using HANA’s Smart Data Access (SDA) to query the transactional data on the fly.
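As a rough sketch of what that on-the-fly access can look like, the example below creates and then queries a Smart Data Access virtual table over a CDS view's SQL view using the hdbcli Python client. The remote source S4H_ERP, the remote schema SAPHANADB, the SQL view ZEQUIPMENT and the local schema GIS are purely illustrative assumptions.

from hdbcli import dbapi  # SAP HANA Python client

# Connect to the HANA / HANA Cloud tenant that hosts the ArcGIS geodatabase.
conn = dbapi.connect(address="arcgis-hana.example.com", port=443,
                     user="GIS_USER", password="********", encrypt=True)
cur = conn.cursor()

# One-time setup: expose the remote CDS view (its SQL view) as a virtual table
# through Smart Data Access. All object names are placeholders.
cur.execute("""
    CREATE VIRTUAL TABLE "GIS"."VT_EQUIPMENT"
        AT "S4H_ERP"."<NULL>"."SAPHANADB"."ZEQUIPMENT"
""")

# From here on the transactional data is queried on the fly, without
# replicating it into the geodatabase.
cur.execute('SELECT TOP 10 * FROM "GIS"."VT_EQUIPMENT"')
for row in cur.fetchall():
    print(row)

cur.close()
conn.close()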

Another key component that makes this possible is a sync framework which captures the transactional asset ID and saves it as an attribute in the corresponding ArcGIS feature class. For example, the SAP Object ID for a power pole is stored in the corresponding feature in the power pole feature class (a “foreign key relationship”). The same process plays out on the SAP side, where the geometry ID of a feature in an ArcGIS feature class is stored as an attribute of the asset in SAP ERP. SAP’s Geographical Enablement Framework (GEF) relies on the sync framework to make sure that assets in SAP have corresponding geometries in ArcGIS Enterprise and vice versa. The GEF-enabled Plant Maintenance module lets Plant Maintenance workers do their work on a map, and RE-FX and Project Management can be enabled with GEF as an engineered service. Remember, though, that GEF is for the users of these ERP modules; but what about everyone else in the enterprise?
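Once that foreign key is in place, any consumer on the HANA tenant can join live SAP attributes to GIS features. A minimal sketch, assuming a hypothetical power pole feature class table GIS.POWER_POLE with a SAP_OBJECT_ID attribute and the virtual table VT_EQUIPMENT from the previous sketch:

from hdbcli import dbapi

conn = dbapi.connect(address="arcgis-hana.example.com", port=443,
                     user="GIS_USER", password="********", encrypt=True)
cur = conn.cursor()

# Join the feature class attribute table to the SDA virtual table on the
# stored SAP object ID (the "foreign key relationship" described above).
# Table and column names are placeholders.
cur.execute("""
    SELECT p.OBJECTID, p.SAP_OBJECT_ID, e.*
    FROM   "GIS"."POWER_POLE"   AS p
    JOIN   "GIS"."VT_EQUIPMENT" AS e
           ON e.EQUIPMENT = p.SAP_OBJECT_ID
    WHERE  p.SAP_OBJECT_ID IS NOT NULL
""")
for row in cur.fetchmany(10):
    print(row)

cur.close()
conn.close()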

Tuesday 20 April 2021

BAdI Implementation in ABAP on HANA for BW Extractors

Introduction:

BAdI enhancement for BW extractors in an SAP S/4HANA system.

The logic for enhancing a BW DataSource has traditionally been written in an ABAP function module exit or an ABAP BAdI. Now, with S/4HANA, the BAdI is the default place to write the enhancement logic (the function module exit is still available, but may not remain so going forward).

With S/4HANA we also have the flexibility to write the logic in HANA-optimized ABAP using newer coding techniques (the new Open SQL syntax).

Friday 16 April 2021

Enabling cold store data access using the view for external access (the ‘8’ view) for aDSOs in BW reports in a mixed modelling scenario

As promised in my earlier blog post, in this article I will explain the possibilities of using the view for external access (the external SAP HANA SQL view, or generated ‘8’ view) of aDSOs. The main idea is to gain optimum performance by avoiding unnecessary access to cold storage when only hot data is requested. For details about the ‘8’ view, please have a look at the blog post linked above.

For this illustration, I will use the same aDSO used in the earlier blog post, SALESADSO, which holds some sample sales data for calendar years 2019 and 2020. By executing a suitable DTO (data tiering optimization) rule, I have moved all the data residing in the partitions covering calendar year 2019 to external cold storage (IQ). As a result, the view for external access for the aDSO, /BIC/ASALESADSO8, has the COLD_STORE_FLAG column populated with ‘X’ for all data belonging to 2019 and ‘ ’ (blank) for all data belonging to 2020.
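To make the idea concrete, here is a minimal sketch of a report-style query against the ‘8’ view that restricts itself to hot data. The schema name, the key figure and the exact pruning behaviour of the COLD_STORE_FLAG filter are assumptions for illustration only.

from hdbcli import dbapi

conn = dbapi.connect(address="bw4hana.example.com", port=30041,
                     user="REPORT_USER", password="********")
cur = conn.cursor()

# Query the generated view for external access of the aDSO SALESADSO.
# Filtering on COLD_STORE_FLAG = ' ' (and/or CALYEAR = '2020') is meant to
# keep the access on hot data so the cold store (IQ) is not touched.
# Schema SAPBW1 and key figure /BIC/SALESAMT are placeholders.
cur.execute("""
    SELECT CALYEAR, SUM("/BIC/SALESAMT") AS TOTAL_SALES
    FROM   "SAPBW1"."/BIC/ASALESADSO8"
    WHERE  COLD_STORE_FLAG = ' '
    GROUP  BY CALYEAR
""")
print(cur.fetchall())

cur.close()
conn.close()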

Wednesday 14 April 2021

View for external access (the ‘8’ view) for aDSOs in BW/4HANA 2.0 mixed modelling

In this blog post, I will focus on some of the key aspects of the generated view for external access (the external SAP HANA SQL view) for aDSOs in BW/4HANA 2.0, using a use case scenario. The view is generated with the following naming convention: /BIC/A<technical name of the aDSO>8 (where /BIC/ is the namespace).

We are all familiar with the generated tables of aDSOs listed below (which of them are relevant depends on the type of the particular aDSO):

/BIC/A<technical name of the aDSO>1: Inbound table of the aDSO.

/BIC/A<technical name of the aDSO>2: Active data table of the aDSO.

/BIC/A<technical name of the aDSO>3: Change log table of the aDSO.
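Purely as an illustration of the naming convention, the generated object names (including the ‘8’ view) can be derived from the technical name of the aDSO with a small helper:

# Illustrative helper: derive the generated object names of an aDSO
# from its technical name, following the naming convention above.
def adso_generated_objects(technical_name: str, namespace: str = "/BIC/") -> dict:
    base = f"{namespace}A{technical_name}"
    return {
        "inbound_table":     base + "1",
        "active_data_table": base + "2",
        "change_log_table":  base + "3",
        "external_sql_view": base + "8",  # the view for external access
    }

print(adso_generated_objects("SALESADSO"))
# e.g. {'inbound_table': '/BIC/ASALESADSO1', ..., 'external_sql_view': '/BIC/ASALESADSO8'}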

Tuesday 13 April 2021

SAP S/4HANA Business Data Toolset (BDT) at Business Partner

This blog is relevant for all releases working with the Business Partner, meaning ECC 6.0 onwards. The main focus is SAP S/4HANA on-premise and private cloud edition, which are the most relevant when working with the Business Partner.

Introduction

What is the BDT?

BDT stands for “Business Data Toolset” and is a central tool for maintaining master data and simple transactional data. In this context I will focus on Business Partner transaction and Business Partner Relationship.

Monday 12 April 2021

Hands-On Tutorial: Leverage SAP HANA Machine Learning in the Cloud through the Predictive Analysis Library

The hard truth is that many machine learning projects never make it into production. It takes time and real effort to move from a machine learning model to a real business application. This is due to many different reasons, for example:

1. Limited data access

2. Poor data quality

3. Insufficient computing power

4. No version control
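To give a flavour of what the hands-on part looks like, the sketch below calls the Predictive Analysis Library from Python through the hana_ml package, so that both the data and the model stay inside SAP HANA Cloud. The host, credentials, tables and columns (SALES_TRAIN, SALES_NEW, CUSTOMER_ID, CHURN) are placeholder assumptions.

from hana_ml import ConnectionContext
from hana_ml.algorithms.pal.unified_classification import UnifiedClassification

# Connect to the SAP HANA Cloud instance.
conn = ConnectionContext(address="<hana-cloud-host>", port=443,
                         user="ML_USER", password="********", encrypt=True)

# The training data stays in HANA; only a reference (hana_ml DataFrame) is
# created on the client.
train_df = conn.table("SALES_TRAIN")   # hypothetical table with label column CHURN

# Train a classification model in-database through PAL.
model = UnifiedClassification(func="HybridGradientBoostingTree")
model.fit(data=train_df, key="CUSTOMER_ID", label="CHURN")

# Score new records, again without moving the data out of HANA.
scored = model.predict(data=conn.table("SALES_NEW"), key="CUSTOMER_ID")
print(scored.head(5).collect())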

Friday 9 April 2021

Partitioning Data Volumes for HANA DB performance improvement

Partitioning Data Volumes

Below is a simple question-and-answer format to understand the usage of data volume partitioning, how it helps improve overall HANA DB read and write performance, and how it differs from data volume striping.

1. What is data volume partitioning? How does it add a performance advantage over the default setup? Since when has it been available?

Data volumes on the indexserver can be partitioned so that read and write operations can run in parallel with increased data throughput. The HANA startup time will also benefit from this feature, because data throughput increases massively, depending on the number and performance of the additional disks or volumes.
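For illustration, adding a data volume partition is a single SQL statement; the sketch below issues it through the hdbcli Python client. The path is a placeholder, and both the statement and the monitoring view should be verified against the SAP HANA administration guide for your revision.

from hdbcli import dbapi

conn = dbapi.connect(address="hana-host.example.com", port=30013,
                     user="ADMIN_USER", password="********")
cur = conn.cursor()

# Add a further data volume partition for the indexserver on an additional
# disk/volume so that read and write I/O can run in parallel.
cur.execute("ALTER SYSTEM ALTER DATAVOLUME ADD PARTITION PATH '/hana/data2/HDB'")

# Check the resulting partitions (monitoring view name to the best of my
# knowledge; see the SAP HANA SQL reference for your revision).
cur.execute("SELECT * FROM M_DATA_VOLUME_PARTITION_STATISTICS")
for row in cur.fetchall():
    print(row)

cur.close()
conn.close()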

Wednesday 7 April 2021

How to use a REST API to post data into SAP S/4HANA Cloud using SAP RPA 2.0

Introduction

With the Desktop Studio application, developers are free to create custom RPA bots which simplify day-to-day tasks by reducing human intervention. With RPA 2.0 and its low-code approach, I will show you how to leverage the SAP Intelligent RPA Cloud Studio to create the API call.

I will be using the supplier invoice creation API, which can be accessed by activating the Communication Arrangement SAP_COM_0057 in the SAP S/4HANA Cloud system.
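Independent of the Cloud Studio bot itself, the underlying HTTP exchange can be sketched in Python as below, assuming the Supplier Invoice OData service API_SUPPLIERINVOICE_PROCESS_SRV and a communication user created through SAP_COM_0057; the host name, entity fields and values are illustrative only, and the real payload depends on your configuration.

import requests

BASE = ("https://my300000-api.s4hana.ondemand.com"
        "/sap/opu/odata/sap/API_SUPPLIERINVOICE_PROCESS_SRV")

session = requests.Session()
session.auth = ("COMM_USER", "********")   # communication user from SAP_COM_0057

# OData write calls need a CSRF token, fetched with a prior GET request.
head = session.get(f"{BASE}/A_SupplierInvoice", params={"$top": "1"},
                   headers={"x-csrf-token": "fetch", "Accept": "application/json"})
token = head.headers["x-csrf-token"]

# Minimal, illustrative supplier invoice payload (OData V2 date format).
payload = {
    "CompanyCode": "1010",
    "DocumentDate": "/Date(1617580800000)/",
    "PostingDate": "/Date(1617580800000)/",
    "InvoicingParty": "10300001",
    "DocumentCurrency": "EUR",
    "InvoiceGrossAmount": "119.00",
}

resp = session.post(f"{BASE}/A_SupplierInvoice", json=payload,
                    headers={"x-csrf-token": token, "Accept": "application/json"})
print(resp.status_code, resp.text[:200])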

Monday 5 April 2021

How to Size an SAP S/4HANA Conversion

This blog was made to help customers prepare the SAP S/4HANA landscape conversion, considering the sizing-relevant key performance indicators (KPIs).

There are many perspectives that we need to consider when doing this planning. The sizing procedure helps customers determine the correct resources required by an application within the customer’s business context.

From the customer perspective, sizing is the translation of business requirements into hardware requirements, but from the development point of view, sizing refers to the creation of a sizing model for the product functionality with a reasonable number of input parameters and assumptions.