Monday 31 May 2021

A tip for defining Synonyms in HDI using SAP Business Application Studio

Purpose

I found that accessing tables in an HDI container using SAP BAS is a bit tricky, and I spent quite a while juggling with how to use the form editor for synonyms. Here is a tip for anyone who has faced the same dilemma, or who doesn't want to spend the time investigating and hunting around for the cause as I did.

Environment

A table (“city”) and its data have been generated in an external data source – a Data Lake whose connection is already established in the HANA Database Explorer (please refer to another blog if you need to link external data sources to HANA). I created a schema ‘booking’ in my HANA Cloud instance.
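
For orientation, here is a minimal sketch (not taken from the post) of what access through such a synonym looks like once the synonym and its matching grants have been deployed to the HDI container. It uses hdbcli, SAP's Python driver for HANA; the host, credentials and the synonym name CITY_SYN are placeholders:

    from hdbcli import dbapi  # SAP's Python driver for SAP HANA

    # Connection details are placeholders for the HDI container's runtime user
    conn = dbapi.connect(
        address="<hana-cloud-host>",
        port=443,
        user="<hdi-runtime-user>",
        password="<password>",
        encrypt=True,
    )

    cursor = conn.cursor()
    # CITY_SYN is a hypothetical synonym pointing at the externally managed "city" table
    cursor.execute('SELECT * FROM "CITY_SYN" LIMIT 5')
    for row in cursor.fetchall():
        print(row)
    cursor.close()
    conn.close()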

Wednesday 26 May 2021

Adaptation Project: New Facet with Smart table (OData Redefinition & Translation)

Adaptation Projects, also called application variants, are a feature for extending SAP-delivered/standard Fiori elements apps.

In this blog post, I will show step by step:

1. How to redefine the OData service

2. How to create an adaptation project in Web IDE with a new facet

3. The deployment process

4. Translation of the variant, tile and entity into another language

5. And finally, configuring the tile for the FLP

Monday 24 May 2021

Mass Update of SLT Advanced Replication Settings

Requirement: We need to add a new column, DW_LOAD_DATE (containing the system timestamp), to all tables replicating from S/4HANA to Enterprise HANA. This will tell us the exact timestamp at which a particular record was replicated to HANA.

Challenge: Currently we have more than 180 tables in active replication. Adding DW_LOAD_DATE manually to each table in the configuration is very time-consuming, error-prone and, most of all, boring.

Possible Solutions:

1. Manual Addition

2. Addition of field via Template Maintenance

3. Mass update by editing the exported Advanced Replication Settings files (a rough sketch of this approach follows below).
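
To give a feel for option 3, here is an illustrative sketch of a scripted bulk edit over the exported files. Everything in it is an assumption for illustration only: the export directory, the file selection and the snippet that defines the new field all depend on the actual structure of the exported Advanced Replication Settings, which the post itself walks through.

    import pathlib

    # All names below are placeholders, not the real export format
    EXPORT_DIR = pathlib.Path("exported_settings")  # where the export was saved
    FIELD_SNIPPET = "DW_LOAD_DATE"                   # hypothetical text defining the new field

    for path in sorted(EXPORT_DIR.iterdir()):
        if not path.is_file():
            continue
        content = path.read_text()
        if "DW_LOAD_DATE" in content:
            continue  # this table's settings already contain the field
        path.write_text(content.rstrip("\n") + "\n" + FIELD_SNIPPET + "\n")
        print(f"Updated {path.name}")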

Friday 21 May 2021

Populate Dates Between Start & End Dates

Requirement

This blog is intended to show how we can fill in the dates when they are stored as a range in two columns, StartDate and EndDate (refer to the Sample Data of the Date Range Table screenshot).

We have one transaction table which has data for each day (for example, PGI_DATE).

In another table, data is entered by date range.
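
To make the requirement concrete, here is a small sketch in Python/pandas (with made-up column names and values, not taken from the post) that expands each StartDate–EndDate pair into one row per calendar day:

    import pandas as pd

    # Hypothetical date-range table: one row per StartDate-EndDate pair
    ranges = pd.DataFrame({
        "MATERIAL":  ["M1", "M2"],
        "StartDate": pd.to_datetime(["2021-05-01", "2021-05-03"]),
        "EndDate":   pd.to_datetime(["2021-05-02", "2021-05-05"]),
    })

    # Build the list of days covered by each range, then explode it
    # so that every covered day becomes its own row
    ranges["DATE"] = ranges.apply(
        lambda r: pd.date_range(r["StartDate"], r["EndDate"], freq="D"), axis=1
    )
    per_day = ranges.explode("DATE").drop(columns=["StartDate", "EndDate"])
    print(per_day)

The post itself shows how to achieve the same expansion on the HANA side, so the result can then be matched against the daily transaction data (e.g. PGI_DATE).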

Monday 17 May 2021

Intelligent Asset Accounting with SAP S/4HANA Cloud

In this blog post we will see how traditional Asset Accounting processes, which were manual, tedious and time-consuming, get automated with innovations in intelligent technologies like SAP Intelligent Robotic Process Automation.

Asset Accounting processes –

1. Update Asset master

2. Asset Acquisition

3. Asset Transfer within company code

4. Asset Transfer across company code

5. Asset Retirement

6. Asset Retirement by Scrapping

Tuesday 11 May 2021

How to install & run the ABAP on HANA Sizing Report (SAP Note 1872170) – A Step-by-Step Guide

In this blog post you will get a clear, step-by-step process of how to install and run the ABAP Sizing Report (described in SAP Note 1872170), with screenshots accompanying each step.

Steps involved

1) Check System Version

The first thing to do is to verify the version of the system on which you are going to install and run the report. The report runs on SAP_BASIS 620 and higher.

Wednesday 5 May 2021

Develop a UI5 / Fiori App Using VS Code

Step 1:

1. Install Node.js  – Node.js® is a JavaScript runtime built on Chrome’s V8 JavaScript engine.

2. Install VS Code – Visual Studio Code is a code editor redefined and optimized for building and debugging modern web and cloud applications.

Step 2:

Create a folder and open it in VS Code.

In VS Code, go to Extensions (Ctrl+Shift+X).

Monday 3 May 2021

SAP HANA on Google Cloud + NetApp CVS: non-disruptive volume size and performance scaling to fit workload needs

If your HANA instance is running on Google Cloud and uses NetApp Cloud Volumes Service (CVS), you can take advantage of its non-disruptive, flexible volume scaling to fit your performance needs. It gives you the flexibility to increase or decrease the volume size during uptime to balance performance against cost.

For example, you can easily increase the volume size to boost disk throughput and shorten the duration of HANA startup, data loading, system migration, S/4 conversion, import/export and backup/restore, or to eliminate system standstills and performance issues during critical workloads (month-end processing, high volumes of change activity, etc.) that can be caused by long savepoint durations due to disk I/O bottlenecks. Once the ad-hoc workload is completed, the volume can be scaled back, again during uptime, to a size that fits your HANA DB size and meets the HANA disk KPIs during normal operation, avoiding unnecessary cost.
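
As a purely illustrative calculation (the throughput allowance of a CVS volume scales with its allocated capacity, but the actual MiB/s-per-TiB rate depends on the service level, so the figure here is hypothetical): if a volume were entitled to, say, 64 MiB/s per TiB, temporarily growing it from 2 TiB to 4 TiB would roughly double the available throughput from about 128 MiB/s to about 256 MiB/s for the duration of the heavy workload; shrinking it back afterwards brings both the throughput allowance and the cost back down.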