Monday, 3 October 2022

Python hana_ml: Classification Training with APL(GradientBoostingBinaryClassifier)

I am writing this blog to show training with APL using the Python package hana_ml. With APL, you can automate preprocessing to some extent.

Environment


Environment is as below.

◉ Python: 3.7.14 (Google Colaboratory)
◉ HANA: Cloud Edition 2022.16
◉ APL: 2209
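As a minimal sketch (assuming an existing hana_ml ConnectionContext and a training table; the function and column names here are illustrative, not from the original post), APL training can look like this:

```python
def train_apl_classifier(connection_context, table_name, label_col):
    """Train an APL gradient-boosting binary classifier on a HANA table.

    Requires the hana_ml package and the APL plugin installed on the
    HANA instance; the training data itself stays in HANA.
    """
    # Lazy import so this module loads even without hana_ml installed.
    from hana_ml.algorithms.apl.gradient_boosting_classification import (
        GradientBoostingBinaryClassifier,
    )
    df = connection_context.table(table_name)   # hana_ml DataFrame, no data transfer
    model = GradientBoostingBinaryClassifier()  # APL handles much of the preprocessing
    model.fit(df, label=label_col)
    return model
```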

Wednesday, 28 September 2022

SAP HANA Cockpit Installation.

In this blog, we would like to explain HANA Cockpit installation and how to monitor the HANA landscape systems using HANA Cockpit.

In HANA 1.0 SPS 12, the cockpit was by default installed along with the HANA server, but from HANA 2.0 SPS 01 and higher SP levels, we must install the cockpit separately to manage and monitor the HANA landscape systems.

The HANA cockpit can be installed on a separate server, or we can install it within the HANA server.

SAP recommends installing the HANA cockpit on a separate server for production environments.

Monday, 26 September 2022

Hana Table Migration using Export & Import

Requirement –


One of the most common requirements in HANA is to move user-defined tables from one HANA environment to another (e.g. Dev > QA > Prod). One method we can use is Export & Import.

Solution –


We will migrate a HANA table from the source HANA environment to the target HANA environment.

Step 1: Export Table – First, we have to export the table from the source system.

Connect to the source HANA system, then create a sample table and load data using the script below (or any other script).
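As a sketch, the export and import statements can be built as strings (schema, table, and path below are placeholders, not values from the original post) following HANA's `EXPORT ... AS BINARY INTO` / `IMPORT ... FROM` syntax:

```python
def export_stmt(schema: str, table: str, path: str) -> str:
    """Build the EXPORT statement to run on the source system."""
    return f"EXPORT \"{schema}\".\"{table}\" AS BINARY INTO '{path}' WITH REPLACE THREADS 4"

def import_stmt(schema: str, table: str, path: str) -> str:
    """Build the matching IMPORT statement to run on the target system."""
    return f"IMPORT \"{schema}\".\"{table}\" FROM '{path}' WITH REPLACE"

# e.g. export_stmt("MYSCHEMA", "EMPLOYEES", "/usr/sap/export/EMPLOYEES")
```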

Wednesday, 21 September 2022

What’s New in SAP HANA Cloud in September 2022

Summer is slowly coming to an end, at least for the colleagues located in the northern hemisphere.  And with the end of the summer, we are also approaching yet another end of a quarter. With this blogpost, I want to provide you with a summary of the exciting innovations that have been introduced to SAP HANA Cloud in Q3 2022.     

To get a detailed overview of individual functionalities and changes, including demos and a Q&A, don’t miss out and register now for our release webinar hosted by the SAP HANA Cloud product management team.

Friday, 16 September 2022

Jupyter Notebook and SAP HANA: Persisting DataFrames in SAP HANA

Introduction


Jupyter Notebook or R are often the tools of choice for data scientists. They make data operations, data exploration, and sharing convenient. SAP HANA is the database of choice for the most important businesses around the globe. Connecting the two worlds, Jupyter Notebook and SAP HANA, provides incredible potential that needs to be seized.

SAP HANA


Jürgen Müller already provided a great blogpost about SAP HANA. I also want to briefly describe what SAP HANA is in my own words. SAP HANA is an in-memory database that provides a lot of benefits, especially for analytical use cases. It combines different aspects of databases within one database: besides the typical properties of relational databases, it also delivers properties of NoSQL databases, such as column-based storage. Depending on the use case, it is possible to activate or deactivate specific properties so that you can get the best performance out of your system.

Wednesday, 14 September 2022

Python with SAP Databases

How can we leverage Python programming language in SAP infrastructure? – in multiple ways I must say.

In this blog we will try to understand how we can establish a connection with the database(s) and execute some basic queries.

But why Python?

Python is easy to learn and flexible, supports connectivity to almost all databases and OS platforms, connects to SAP using RFC modules, has strong community support, and has a lot of potential in automation.
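For SAP HANA specifically, the hdbcli driver (SAP's Python client) exposes a DB-API interface. A minimal sketch, with connection details as placeholders:

```python
def hana_query(host, port, user, password, sql):
    """Connect to SAP HANA via hdbcli and return the rows of one query."""
    from hdbcli import dbapi  # pip install hdbcli; imported lazily here
    conn = dbapi.connect(address=host, port=port, user=user, password=password)
    try:
        cur = conn.cursor()
        cur.execute(sql)
        rows = cur.fetchall()
        cur.close()
        return rows
    finally:
        conn.close()

# e.g. hana_query("myhost", 39015, "SYSTEM", "***", "SELECT * FROM DUMMY")
```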

Monday, 12 September 2022

HANA SPS upgrade from HANA 2.0 Rev 37 to HANA 2.0 Rev 59 on the DR server when primary and secondary server setup as replication are present


Many environments and larger landscapes use a cluster setup. In this environment, the primary DB is up and running and is replicated to the secondary DB/DR server.

Note:

While performing a HANA SPS upgrade or HANA revision upgrade, the upgrade needs to be performed on the DR/secondary server first, before performing it on the primary server.

Reason: the version on the DR/secondary server must be the same as or higher than that of the primary server.
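The rule can be stated as a simple check (revision numbers as integers, e.g. 37 and 59):

```python
def upgrade_order_ok(primary_rev: int, dr_rev: int) -> bool:
    """Replication requires the DR/secondary revision to be the same as or
    higher than the primary's, hence the DR server is upgraded first."""
    return dr_rev >= primary_rev

# Upgrading the DR server to Rev 59 while the primary stays on Rev 37 is fine;
# the reverse order would violate the constraint.
```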

Saturday, 10 September 2022

SAP Analytics Cloud – TroubleShooting – Timeline Traces

There are tons of different ways you can run performance traces on SAC for troubleshooting. In this blog I will talk about one of the tracing methods, called Timeline Traces, which can be very helpful when performance problems are observed in SAP Analytics Cloud (SAC), regardless of whether the connection is with BW, S/4HANA, etc.

Step 1:

Log on to your SAC Tenant.

Step 2:

Once you are logged on to the SAC tenant, open the Chrome developer tools in the same browser window where you have the tenant open.

Friday, 9 September 2022

SAP HANA Capture and Replay Tool End-to-end Hands-On Tutorial

This tutorial will walk us through the end-to-end steps in SAP HANA capture and replay tool.

What can I do with SAP HANA capture and replay?


The capture and replay tool can capture the real workload from a source SAP HANA system and replay the captured workload on a target SAP HANA system. Both the source and target systems should be on-premise systems.


Wednesday, 7 September 2022

Python hana_ml: PAL Classification Training(UnifiedClassification)

I am writing this blog to show basic classification training procedures using the Python package hana_ml. With the class UnifiedClassification, you can use several classification algorithms. Besides, the training result can easily be exported as an HTML report.
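A minimal sketch (connection, table, and column names are placeholders; assumes hana_ml and the PAL functions are available on the instance):

```python
def train_unified(connection_context, table_name, label_col):
    """Fit a PAL UnifiedClassification model on a HANA table."""
    from hana_ml.algorithms.pal.unified_classification import UnifiedClassification
    df = connection_context.table(table_name)
    # UnifiedClassification wraps several algorithms behind one interface;
    # here we pick hybrid gradient boosting.
    uc = UnifiedClassification(func='HybridGradientBoostingTree')
    uc.fit(df, label=label_col)
    return uc
```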


Sunday, 28 August 2022

Load Data from A Local File into HANA Cloud, Data Lake

Overview

Ever wondered how you would load a data file from your local machine into HANA Cloud, data lake?

Then you are at the right place. This blog provides a step-by-step guide showing how anyone can easily load data from their local machine into HANA Cloud, data lake using the Data Lake File store.

Step-by-step process:

Firstly, one must provision a Data Lake instance from SAP HANA Cloud Central through the SAP BTP Cockpit. One can learn how to do so in the following tutorial – Provision a Standalone Data Lake in SAP HANA Cloud | Tutorials for SAP Developers

Friday, 26 August 2022

Integrating SAP PaPM Cloud & SAP DWC

Since integration topics are currently the talk of the town when it comes to SAP Profitability and Performance Management Cloud (SAP PaPM Cloud), it makes sense to let the community know that one of the most popular data warehousing services, SAP Data Warehouse Cloud (SAP DWC), can easily be integrated with your SAP PaPM Cloud tenant. If you’re curious how to do it, read on.

At the end of this blog post, my goal is for you to be able to: 

◉ Consume a HANA Table from SAP PaPM Cloud’s underlying HANA Database into SAP DWC 
◉ Consume a View created from SAP DWC into SAP PaPM Cloud’s Modeling via Connections 

Let’s start!

Wednesday, 24 August 2022

E-Mail Templates in S/4HANA – Display table in Email Template

SAP has a very interesting feature in S/4HANA (both cloud and on-premise) – E-Mail Templates.

In this blog post, we will learn how to embed a table with multiple records in an SE80 email template.

Example:

◉ The requirement is to send an email at the end of the month to all employees who have pending hours in their time sheets. Data is taken from standard HR tables, and pending hours are calculated using a formula for each employee and the project to which they are assigned. Pending hours = Planned hours – (Approved + Submitted) hours.
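The calculation itself is straightforward; as a sketch:

```python
def pending_hours(planned: float, approved: float, submitted: float) -> float:
    """Pending hours = Planned hours - (Approved + Submitted) hours."""
    return planned - (approved + submitted)

# An employee planned for 160 h with 120 h approved and 24 h submitted
# still has 16 h pending on the project.
```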

Monday, 22 August 2022

CDS Views – selection on date plus or minus a number of days or months

Problem

We need to be able to select data in a CDS view where the records selected are less than 12 months old. This requires comparing a date field in the view with the system date less 12 months.

The WHERE clause should look something like the following.

Where row_date >= DATS_ADD_MONTHS ($session.system_date,-12,'UNCHANGED')

The problem with this is that the CDS view SQL statement above is not permitted, giving the following error message on activation.
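For reference, the cutoff that the rejected WHERE clause computes can be reproduced client-side. This is a sketch approximating DATS_ADD_MONTHS semantics (the end-of-month handling here simply clamps the day, which is an assumption, not the exact 'UNCHANGED' behavior):

```python
import calendar
from datetime import date

def months_ago(d: date, months: int) -> date:
    """Step `months` months back from d, clamping the day to the target
    month's length (e.g. 31 March minus 1 month -> 28/29 February)."""
    total = d.year * 12 + (d.month - 1) - months
    year, month = divmod(total, 12)
    month += 1
    day = min(d.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)

# months_ago(date(2022, 8, 22), 12) -> date(2021, 8, 22)
```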

Friday, 19 August 2022

Monitoring Table Size in SAP HANA

This is the second blog post about the RybaFish Charts tool. If you have never heard of it, please check the introduction article.

The real power of RybaFish Charts is in custom KPIs. RybaFish supports three KPI types: regular, Gantt, and multiline. Today we are going to create a regular KPI to track the memory consumption of a certain column store (CS) table:

Table Size Monitoring
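The KPI behind such a chart is essentially a query against HANA's monitoring view M_CS_TABLES; a sketch of the statement (with schema and table name as bind parameters):

```python
# M_CS_TABLES is HANA's monitoring view for column-store tables;
# MEMORY_SIZE_IN_TOTAL is the table's total in-memory footprint in bytes.
TABLE_SIZE_SQL = """
SELECT memory_size_in_total
  FROM m_cs_tables
 WHERE schema_name = ? AND table_name = ?
"""
```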

Monday, 8 August 2022

Flatten Parent-Child Hierarchy into Level Hierarchy using HANA (2.0 & above) Hierarchy Functions in SQL

Knock knock! Anyone else also looking for handy illustrations of the hierarchy functions introduced with HANA 2.0? Well count me in then.

While trying hard not to write SQL with recursive self-joins to flatten hierarchical data presented in a parent-child relationship, the hierarchy functions in HANA can be a saviour for sure. Let’s look at something easy to implement using the pre-defined hierarchy functions available with HANA 2.0 and above.

As a starter, let’s assume we have a miniature article hierarchy structured as below:
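What the hierarchy functions compute can be illustrated in plain Python (the article names below are made up for illustration): each node's root-to-node path, whose positions become the level columns of the flattened hierarchy:

```python
def flatten_levels(parent_of: dict) -> dict:
    """Given child -> parent links (None marks the root), return each
    node's path from the root, i.e. the flattened level hierarchy."""
    def path(node):
        chain = [node]
        while parent_of.get(chain[-1]) is not None:
            chain.append(parent_of[chain[-1]])
        return list(reversed(chain))
    return {node: path(node) for node in parent_of}

articles = {"ROOT": None, "FOOD": "ROOT", "FRUIT": "FOOD", "APPLE": "FRUIT"}
# flatten_levels(articles)["APPLE"] -> ["ROOT", "FOOD", "FRUIT", "APPLE"]
```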

Saturday, 6 August 2022

Integration of SAP Ariba Sourcing with Qualtrics XM for Suppliers, HANA Cloud Database

In this blog, I will give you an overview of a solution to extract supplier data from a Sourcing event in SAP Ariba Sourcing, and save it in a mailing list in SAP Qualtrics XM for Suppliers, using BTP services.

Process Part 1

First, to extract the information from SAP Ariba Sourcing, I use the Operational Reporting for Sourcing (Synchronous) API to get events by date range, and also the Event Management API which returns supplier bid and invitation information from the sourcing events.

Then, I store the information I need in a SAP HANA Cloud database. I created 3 tables to store the information that I will send to SAP Qualtrics XM for Suppliers: RFx header information, Invitations, and Organizations contact data.

Finally, I send all the information needed to a SAP Qualtrics XM for Suppliers mailing list, which will then handle automatically sending surveys to suppliers that participated in the Ariba sourcing events.

To send the information to SAP Qualtrics XM for Suppliers, I use the Create Mailing List API.

Integration

All this is orchestrated by the SAP Integration Suite, where I created 2 iFlows:

◉ The first iFlow is to get the information from the SAP Ariba APIs, and store it in the SAP HANA Cloud database.

Get RFx Information from Ariba, and store it in SAP HANA Cloud

◉ The second iFlow is to get the information from the SAP HANA Cloud database, and send it to SAP Qualtrics XM for Suppliers via the Mailing List API.

Get information from HANA Cloud and Send Contacts information to Qualtrics 

The SAP HANA Cloud database was created with a CAP application developed in Visual Studio Code.

Final Thoughts

By using SAP Ariba and Qualtrics APIs, as well as a couple of BTP services, integration between the two can be achieved in a very simple way with only a few steps.

Process Part 2


There are two methods to create SAP HANA Cloud artifacts:

HANA Database project

In this method, we create database artifacts in a classic database schema using declarative SQL.

For this exercise, I created the project in the SAP Business Application Studio as explained in the tutorial. I used these files to create the tables:

rfx.hdbtable

COLUMN TABLE rfx (
  id NVARCHAR(15) NOT NULL COMMENT 'Internal ID',
  title NVARCHAR(1024) COMMENT 'Title',
  created_at TIMESTAMP COMMENT 'Created at',
  updated_at TIMESTAMP COMMENT 'Updated at',
  event_type NVARCHAR(30) COMMENT 'Event Type',
  event_state NVARCHAR(30) COMMENT 'Event State',
  status NVARCHAR(30) COMMENT 'Event Status',
  PRIMARY KEY(id)
) COMMENT 'RFx Information'
 
rfx_invited_users.hdbtable

COLUMN TABLE rfx_invited_users (
  id NVARCHAR(15) NOT NULL COMMENT 'Internal ID',
  unique_name NVARCHAR(1024) NOT NULL COMMENT 'Contact Unique Name',
  full_name NVARCHAR(1024) COMMENT 'Full Name',
  first_name NVARCHAR(250) COMMENT 'First Name',
  last_name NVARCHAR(250) COMMENT 'Last Name',
  email NVARCHAR(250) COMMENT 'Email',
  phone NVARCHAR(30) COMMENT 'Phone',
  fax NVARCHAR(30) COMMENT 'Fax',
  awarded BOOLEAN COMMENT 'Awarded',
  PRIMARY KEY(id, unique_name)
) COMMENT 'Users invited to RFx'
 
rfx_organizations.hdbtable

COLUMN TABLE rfx_organizations (
  id NVARCHAR(15) NOT NULL COMMENT 'Internal ID',
  item INTEGER COMMENT 'Item No',
  org_id NVARCHAR(15) COMMENT 'Organization ID',
  name NVARCHAR(1024) COMMENT 'Name',
  address NVARCHAR(1024) COMMENT 'Address',
  city NVARCHAR(1024) COMMENT 'City',
  state NVARCHAR(1024) COMMENT 'State',
  postal_code NVARCHAR(10) COMMENT 'Postal Code',
  country NVARCHAR(100) COMMENT 'Country',
  contact_id NVARCHAR(1024) NOT NULL COMMENT 'Contact Unique Name',
  PRIMARY KEY(id, item)
) COMMENT 'RFx Organizations'

After creating all files, the project should look like this:


After deployment, you should see all 3 tables in the SAP HANA Cloud database explorer:


Now you can use JDBC (for example) to access the tables in the SAP HANA Cloud database.

Multi-target application project, CAP and CDS

In this method we create a multi-target application and use CAP and CDS to generate the SAP HANA database tables, as well as the OData services to access the database.

I used these files for the CDS artifacts:

schema.cds

namespace com.aribaxm.service;

type InternalId : String(15);
type SDate : DateTime;
type XLText : String(2050);
type LText : String(1024);
type SText : String(30);
type MText : String(250);

entity Rfx {
    key id           : InternalId;
        title        : LText;
        createdAt    : SDate;
        updatedAt    : SDate;
        eventType    : SText;
        eventState   : SText;
        status       : SText;
}

entity RfxInvitedUsers {
    key id         : InternalId;
    key uniqueName : LText;
    fullName       : LText;
    firstName      : MText;
    lastName       : MText;
    email          : MText;
    phone          : SText;
    fax            : SText;
    awarded        : Boolean;
}

entity RfxOrganizations {
    key id      : InternalId;
    key item    : Integer;
    orgId       : SText;
    name        : LText;
    address     : LText;
    city        : LText;
    state       : LText;
    postalCode  : SText;
    country     : MText;
    contactId   : LText;
}
 
service.cds

using com.aribaxm.service as aribaxm from '../db/schema';

service CatalogService @(path:'/api/v1') {
    entity Rfx as projection on aribaxm.Rfx;
    entity RfxInvitedUsers  as projection on aribaxm.RfxInvitedUsers;
    entity RfxOrganizations as projection on aribaxm.RfxOrganizations;
}

server.js

"use strict";

const cds = require("@sap/cds");
const cors = require("cors");
//const proxy = require("@sap/cds-odata-v2-adapter-proxy");

cds.on("bootstrap", app => app.use(cors()));

module.exports = cds.server;

The mta.yaml should look something like this:

---
_schema-version: '3.1'
ID: AribaXM
version: 1.0.0
parameters:
  enable-parallel-deployments: true
build-parameters:
  before-all:
    - builder: custom
      commands:
        - npm install --production
        - npx -p @sap/cds-dk cds build --production

modules:
  - name: aribaxm-srv
    type: nodejs
    path: gen/srv
    parameters:
      buildpack: nodejs_buildpack
    build-parameters:
      builder: npm-ci
    provides:
      - name: srv-api # required by consumers of CAP services (e.g. approuter)
        properties:
          srv-url: ${default-url}
    requires:
      - name: aribaxm-db

  - name: aribaxm-db-deployer
    type: hdb
    path: gen/db
    parameters:
      buildpack: nodejs_buildpack
    requires:
      - name: aribaxm-db

resources:
  - name: aribaxm-db
    type: com.sap.xs.hdi-container
    parameters:
      service: hana # or 'hanatrial' on trial landscapes
      service-plan: hdi-shared
    properties:
      hdi-service-name: ${service-name}
 
And the package.json should look something like this:

{
  "name": "aribaxm",
  "version": "1.0.0",
  "description": "A simple CAP project.",
  "repository": "<Add your repository here>",
  "license": "UNLICENSED",
  "private": true,
  "dependencies": {
    "@sap/cds": "^5",
    "express": "^4",
    "@sap/hana-client": "^2",
    "cors": "^2"
  },
  "devDependencies": {
    "sqlite3": "^5",
    "@sap/hdi-deploy": "^4"
  },
  "engines": {
    "node": "^16"
  },
  "scripts": {
    "start": "cds run"
  },
  "eslintConfig": {
    "extends": "eslint:recommended",
    "env": {
      "es2020": true,
      "node": true,
      "jest": true,
      "mocha": true
    },
    "globals": {
      "SELECT": true,
      "INSERT": true,
      "UPDATE": true,
      "DELETE": true,
      "CREATE": true,
      "DROP": true,
      "CDL": true,
      "CQL": true,
      "CXL": true,
      "cds": true
    },
    "rules": {
      "no-console": "off",
      "require-atomic-updates": "off"
    }
  },
  "cds": {
    "requires": {
        "db": {
            "kind": "hana"
        }
    },
    "hana": {
        "deploy-format": "hdbtable"
    }
  }
}
 
After creating all files, the project should look like this:


After deployment, you should see all 3 tables in the SAP HANA Cloud Database Explorer:


And these 2 applications in the BTP space:


Now you can use OData to access the database. In this exercise I didn’t add access control, so you should use your BTP user to execute the OData services from Postman and check the access.
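A sketch of such a call from Python instead of Postman (base URL and credentials are placeholders; basic authentication is an assumption that fits this unsecured setup, a real BTP deployment would typically use OAuth tokens):

```python
import base64
import json
import urllib.request

def fetch_rfx(base_url: str, user: str, password: str):
    """GET the Rfx entity set from the CatalogService (path /api/v1, per service.cds)."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req = urllib.request.Request(
        f"{base_url}/api/v1/Rfx",
        headers={"Authorization": f"Basic {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```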

Final Thoughts

Now you have two methods of creating SAP HANA Cloud database artifacts, from either SAP Business Application Studio or Visual Studio Code, so you can access the database via JDBC or OData services.

Friday, 5 August 2022

Processing of Prepayments with SAP S/4HANA Accruals Management

Surely you have already heard about SAP S/4HANA Accruals Management?

Starting with S/4HANA OP 2021, Accruals Management provides new functionality to process deferrals. Until now, the processing of prepayments has been a typical manual activity for Finance users. With this new functionality, Finance can achieve efficiency gains, and it is quite easy to implement as well. You can also use the deferrals part even if you haven’t implemented the accruals part yet.

In this blog I will show you how to set up the SAP S/4HANA Accruals Management for deferrals in order to optimize this process.

Thursday, 4 August 2022

App Extensibility for Create Sales Orders – Automatic Extraction: Custom Proposal for Sales Order Request Fields (Using BAdI)

In the Create Sales Orders – Automatic Extraction app, the system starts data extraction and data proposal for sales order requests immediately after your purchase order files are uploaded. If SAP pre-delivered proposal rules do not satisfy your business needs, key users can create custom logic to implement your own proposal rules.

Here is an example procedure. In your custom logic, you want the system to set the sales area to 0001/01/01 if the company code is 0001, and set the requested delivery date to the current date plus seven days if this date is initial, for example when ML cannot extract the date from the file or the date does not exist in the file.

Wednesday, 3 August 2022

How to Display Situations in Your Custom Apps with the Extended Framework of Situation Handling

Situation Handling in SAP S/4HANA and SAP S/4HANA Cloud detects exceptional circumstances and displays them to the right users.

I’m the lead UI developer within the Situation Handling framework for end-user facing apps and I’m excited to present a new feature in the Situation Handling extended framework to you. Together with the team from SAP Fiori elements, we introduce a new, simplified way of displaying situations right in your business apps. You can enable a great user experience for your end users with low effort. You should read through this tutorial-style article which explains the details and points you to further resources that you can take as a blueprint for your successful implementation.

With the extended framework of Situation Handling, situations can be displayed in apps based on SAP Fiori elements for OData version 4. You can do this without any front-end development, just by extending the app’s OData service.

Tuesday, 2 August 2022

Currency conversion in BW/4HANA, Enterprise HANA Modelling

In most business models, cost centers or profit centers are located in different countries, and profit and cost are generated in different currencies. At the end of the year, the finance team calculates the total cost or profit in a target currency (USD, EUR, or another), typically that of the country where the company’s headquarters is located, to generate the ledger and balance sheet. In that scenario, we need to perform currency conversion to generate analytics reports. In this blog I am going to discuss the currency conversion steps for different scenarios: BW/4HANA, Enterprise HANA Modelling, and SAP Analytics Cloud.

1. Currency conversion in BW/4HANA:

In our scenario, a finance analyst wants all the profits generated in the Belgium and France plants (in EUR) to be converted into USD, the currency of the organization’s head office, to generate the ledger postings.
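The conversion itself is a multiplication by the exchange rate (in SAP systems the rates come from table TCURR; the rate below is a made-up illustration, not a real quote):

```python
def to_usd(amount_eur: float, eur_usd_rate: float) -> float:
    """Convert a EUR amount to USD with the given exchange rate."""
    return round(amount_eur * eur_usd_rate, 2)

# 1,000,000 EUR profit at a hypothetical rate of 1.05 -> 1,050,000 USD
```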

Monday, 1 August 2022

Backup and Recovery for the SAP HANA (BTP)

SAP HANA (HANA Cloud, HaaS, ...) offers comprehensive functionality to safeguard your database: SAP HANA offers automatic backups to back up your database and ensure that it can be recovered speedily and with maximum business continuity, even in cases of emergency. The recovery point objective (RPO) is no more than 15 minutes.

A full backup of every SAP HANA Cloud instance is taken automatically once per day.

These backups are encrypted using the capabilities of the SAP Business Technology Platform. The retention time for backups is 14 days, which means that an instance can be recovered for the last 14 days.

Wednesday, 27 July 2022

Decoding S/4HANA Conversion

To start with there is always a lot of confusion on S/4HANA Conversion projects.

Too many tasks, too many teams, and too many responsibilities. Also moving away from our beloved Business Suite sparks fear in us.

Let’s explain why we should move away from ECC 6.0:

SAP Maintenance Strategy:

SAP provides mainstream maintenance for core applications of SAP Business Suite 7 software (including SAP ERP 6.0, SAP Customer Relationship Management 7.0, SAP Supply Chain Management 7.0, and SAP Supplier Relationship Management 7.0 applications and SAP Business Suite powered by SAP HANA®) on the respective latest three Enhancement Packages (EhPs) until December 31, 2027.

Saturday, 23 July 2022

XaaS Digital Assets with SAP S/4HANA Public Cloud

In this blog, let us go into more detail on the various possible business models and corresponding pricing models that we can have as part of XaaS digital assets. Our focus will be on the E2E business process for subscription-based products in SAP S/4HANA Public Cloud with out-of-the-box integration with the SAP Subscription Billing & Entitlement Management solutions.

Let us imagine that there is a digital assets software company with a portfolio of various software products and different pricing models, such as:

◉ Fixed Recurring Charge

◉ Tier Based Pricing

◉ Volume/Usage Based Pricing

Friday, 22 July 2022

XSD Validation for DMEEX


There is new functionality available in DMEEX, delivered across SAP S/4HANA on-premise, that allows you to define XSD (XML Schema Definition) validation for your format trees.

If the requester (e.g., a bank or other financial institution) provided you with an XSD specifying criteria for the output file, you can now upload the XSD file and take advantage of the file being checked against the schema as soon as it is created.

Wednesday, 20 July 2022

SAP Data Warehouse Cloud bulk provisioning

As our customers adopt SAP Data Warehouse Cloud, we often need to help them set up new users for both training and productive use.  This can be a significant administrative task when there are many users, spaces, connections, and shares needed for each user.  NOTE: SAP provides the SAP Data Warehouse Cloud command line interface (CLI) for automating some provisioning tasks.

For a specific customer, we needed to create 30 training users, along with a Space per user, multiple Connections per user, and numerous shares from a common space.  This could all have been accomplished using the SAP Data Warehouse Cloud user interface but we wanted to go faster, and more importantly make it repeatable.

Monday, 18 July 2022

How can SAP applications support the New Product Development & Introduction (NPDI) Process?

In this blog you will get an overview of how SAP applications can support the New Product Development and Introduction (NPDI) process for the discrete and process industries.

Introduction of NPDI Process:

NPDI stands for “New Product Development and Introduction”, the complete process of bringing a new product to the customer/market. New product development is described in the literature as the transformation of a market opportunity into a product available for sale; the product can be tangible (something physical you can touch) or intangible (like a service, experience, or belief).

Friday, 15 July 2022

SAP AppGyver – Handling Image : Loading and displaying data

This article is a continuation of the previous one. This article assumes the environment created in the previous article, so please refer to that article if you have not yet done so.

This time, I will explain how to display the image data stored in the BLOB type column of HANA Cloud using the SAP AppGyver application.

Additional development to the AppGyver application

Add a page

In this case, I would like to create a function that displays a list of image IDs; when I tap on one, the image is displayed. I would like to add a separate page for this function, although it could be created on the same page.

Wednesday, 13 July 2022

The SAP Geoenablement Framework (GEF) now authenticates with ArcGIS Enterprise

Transmission line repair

GEF is integrated into SAP Plant Maintenance. GEF allows Plant Maintenance users to do their tasks using a map – regardless of whether the customer is running SAP ERP Central Component (ECC), SAP Business Suite powered by SAP HANA (Suite on HANA), or SAP S/4HANA.

Monday, 11 July 2022

Difference between Role, Authorization Object/s, and Profile

As a Functional Consultant, one may wonder what a Role is and how different it is from the Authorization Object and Profile. While it is mostly the job of the Security team to assign the required Role for a user, it is also the Functional Consultant’s responsibility to provide inputs about the required Transactions, restrictions within a Transaction, and how these restrictions should vary depending on the user.

Let’s begin this blog by defining what a user is. In simple terms, only if our user has already been created in a system can we log in using a username and password. In SAP, transaction code SU01 is used to create a user. Using this Tr. Code, users can be created, modified, deleted, locked, unlocked, and copied to create a new one. Typically, in a project, user creation has certain prerequisites. Initially, the user or the concerned manager requests the user creation by filling in the access form and providing all the required details. This is followed by one or two stages of approval and finally the creation of the user by the Security team.

Friday, 8 July 2022

How to use Custom Analytical Queries in SAP S/4HANA Embedded Analytics?

In this blog post you will learn step-by-step how to create a report in an SAP Fiori environment on operational SAP S/4HANA data. This is done using the ‘Custom Analytical Query’ SAP Fiori app. This SAP Fiori app is available as standard in SAP S/4HANA Embedded Analytics and allows users to create reports themselves, directly on the operational data. These reports can be consumed in SAP Fiori or in any other visualization application such as SAP Analysis for Office or SAP Analytics Cloud.

How to create a Custom Analytical Query?

To create a Custom Analytical Query the following steps need to be executed:

Step 1: Start the Custom Analytical Query app

Step 2: Create a new report

Step 3: Select the fields

Step 4: Assign to rows and columns

Step 5: Add custom fields

Step 6: Add filters

Step 7: Publish

Wednesday, 6 July 2022

Pass Input Parameters to CV from Parameters derived from Procedure/Scalar Function

I am writing this blog post on SAP HANA Input Parameters. There are a few blogs on HANA IPs, but they do not give a clear understanding. Here I give a basic example that will make the topic easy for HANA developers to understand.

Those who have been working on HANA for quite some time and have developed SAP HANA CVs must have worked with Input Parameters and Variables.

A Variable:

Variables are bound to columns and are used for filtering using WHERE clauses. As such, they can only contain the values available in the Columns they relate to.
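An Input Parameter, by contrast, is supplied when the CV is queried, using HANA's PLACEHOLDER syntax. A sketch that builds such a statement (the view and parameter names are hypothetical):

```python
def cv_query(view: str, ip_name: str, ip_value: str) -> str:
    """SELECT from a calculation view, supplying one input parameter
    via HANA's classic PLACEHOLDER syntax."""
    return (
        f'SELECT * FROM "{view}" '
        f"('PLACEHOLDER' = ('$${ip_name}$$', '{ip_value}'))"
    )

# cv_query("MY_CV", "IP_COUNTRY", "DE")
# -> SELECT * FROM "MY_CV" ('PLACEHOLDER' = ('$$IP_COUNTRY$$', 'DE'))
```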

Friday, 1 July 2022

SAP AppGyver – Handling Image: Data Writing

Now, here is an article on SAP AppGyver.

Today I would like to explain how to handle image data. Images can be found at ….. There are pros and cons to storing them in HANA Cloud’s BLOB type column.

Assumption

In this article, I will explain how to create an application that takes a photo and stores it in a BLOB type column in HANA Cloud.

Wednesday, 29 June 2022

Providing a solution to an agile business requirement with SAP BTP

In this blog, we will describe the process of identifying and adjusting the correct pieces of the SAP BTP platform in order to solve a specific customer request. This process starts with fully understanding the business needs, and then translating them into different SAP BTP components, in order to answer not only the current requirement but also future ones.

Business Case

A self-service mechanism was requested by a customer (mainly by business users) to quickly create or edit new derived, time-dependent measures. This mechanism will help them make faster and better business decisions. Two key points drove the provided solution: who will use it (business users) and how/where it will be used (from the reporting layer).

Monday, 27 June 2022

Analyzing High User Load Scenarios in SAP HANA

If you are an ERP/NetWeaver system administrator, you will face many scenarios where you experience high resource utilization in the HANA DB. To correct these situations, you need to analyze the root cause of the load. This blog post will help you with that analysis and help you find the exact application user that caused the load on your HANA DB.

When you get reports from your monitoring tools or from users about performance issues in the system, do the following:

◉ Login to HANA Cockpit

◉ Open the Database (usually tenant) which is affected by the issue.

◉ Go to CPU Usage -> Analyze Workloads
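Alongside the Cockpit workload analysis, the same question can be asked in SQL. A hedged sketch, assuming the expensive statements trace is enabled (view and column names are from the standard M_EXPENSIVE_STATEMENTS monitoring view as I understand it):

```python
# Rank application users by total statement runtime; run this string
# against the affected tenant DB, e.g. via hdbcli.
TOP_APP_USERS_SQL = """
SELECT APP_USER,
       COUNT(*)               AS STATEMENTS,
       SUM(DURATION_MICROSEC) AS TOTAL_MICROSEC
FROM M_EXPENSIVE_STATEMENTS
GROUP BY APP_USER
ORDER BY TOTAL_MICROSEC DESC
"""
print(TOP_APP_USERS_SQL)

# Connection details are placeholders:
# from hdbcli import dbapi
# cur = dbapi.connect(address="<host>", port=30015,
#                     user="<user>", password="<pw>").cursor()
# cur.execute(TOP_APP_USERS_SQL)
# print(cur.fetchall())
```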

Friday, 24 June 2022

SAP PaPM Cloud: Downloading Output Data Efficiently

Let’s say that as a Modeler, you have successfully uploaded your data into SAP Profitability and Performance Management Cloud (SAP PaPM Cloud) and utilized SAP PaPM Cloud’s extensive modeling functions for enrichment and calculation. And as a result of your Modeling efforts, you now have the desired output that you would like to download from the solution. The question is: Depending on the number of records, what would be the most efficient way to do this?

To keep it simple, I’ll use ranges to differentiate small and large output data. Under these two sections is a step-by-step procedure on how to download results generated from SAP PaPM Cloud, which, based on my experience, is the most efficient way.

Friday, 10 June 2022

SAP Tech Bytes: CF app to upload CSV files into HANA database in SAP HANA Cloud

Prerequisites

◉ SAP BTP Trial account with SAP HANA Database created and running in SAP HANA Cloud

◉ cf command-line tool (CLI)

If you are not familiar with deploying Python applications to SAP BTP, CloudFoundry environment, then please check Create a Python Application via Cloud Foundry Command Line Interface tutorial first.

I won’t repeat steps from there, such as how to log on to your SAP BTP account using the cf CLI, but I will cover the extras we are going to work with and experiment with here.
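The core of such an upload app is small: parse the CSV and hand the rows to a batched INSERT. A minimal sketch; the table and column names are invented, and the executemany call is left commented out since it needs a live HANA connection:

```python
import csv
import io

def csv_to_rows(text):
    """Parse CSV text (header row first) into (columns, rows) ready
    for a batched INSERT with cursor.executemany()."""
    reader = csv.reader(io.StringIO(text))
    header = next(reader)
    return header, [tuple(r) for r in reader]

sample = "ID,NAME\n1,Alpha\n2,Beta\n"
cols, rows = csv_to_rows(sample)

placeholders = ", ".join("?" for _ in cols)
insert_sql = f'INSERT INTO "MYTABLE" ({", ".join(cols)}) VALUES ({placeholders})'
print(insert_sql, rows)
# With hdbcli: cursor.executemany(insert_sql, rows)
```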

Thursday, 9 June 2022

SAP HANA On-Premise SDA remote source in SAP HANA Cloud using Cloud Connector

SAP Cloud Connector serves as a link between SAP BTP applications and on-premise systems. It runs as an on-premise agent in a secured network and provides control over the on-premise systems and resources that cloud applications can access.

In this blog, you will learn how to enable the cloud connector for a HANA Cloud instance, install and configure the cloud connector, and connect an SAP HANA on-premise database to SAP HANA Cloud using an SDA remote source.

Wednesday, 8 June 2022

How to deal with imported Input Data’s NULL values and consume it in SAP PaPM Cloud

Hello there! I will not bother you with some enticing introduction anymore and get straight to the point. If you are:

(a) Directed here because of my previous blog post SAP PaPM Cloud: Uploading Input Data Efficiently or;

(b) Redirected here because of a quick Google search result or what not…

Either way, you are curious how a user could use a HANA table with NULL values upon data import and consume this model in SAP Profitability and Performance Management Cloud (SAP PaPM Cloud). Then I have you covered with this blog post.

Monday, 6 June 2022

Exception Aggregation in SAP SAC, BW/BI and HANA: A Practical approach

Today I am going to discuss a very useful topic: exception aggregation, in terms of concepts and usage scenarios in SAP Analytics Cloud, BW/BI and HANA.

In all analytics reports and dashboards, key figures are shown at an aggregated level. But the main question is: how is the aggregation done and shown in the report?

With standard aggregation applied to a calculated key figure, the key figure is aggregated by grouping by all the dimensions in a single row for a single select.
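A classic case where standard aggregation misleads is counting. A tiny pure-Python illustration (not SAC/BW syntax) of why a "count distinct" exception aggregation over the Customer dimension differs from a plain row count:

```python
# Three sales rows, but only two distinct customers.
rows = [
    {"region": "EMEA", "customer": "C1", "sales": 100},
    {"region": "EMEA", "customer": "C1", "sales": 150},
    {"region": "EMEA", "customer": "C2", "sales": 200},
]

standard = sum(1 for r in rows)                 # naive row count double-counts C1
exception = len({r["customer"] for r in rows})  # COUNT DISTINCT over Customer
total_sales = sum(r["sales"] for r in rows)     # plain SUM is unaffected
print(standard, exception, total_sales)
```

Exception aggregation lets you declare that the customer count must be evaluated with reference to the Customer dimension before the result is aggregated further.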

Monday, 30 May 2022

Transforming Hierarchy using HANA Calculation view

Introduction:

This blog post is on the usage of two powerful nodes, namely the Hierarchy Function node and the Minus node, in HANA calculation views. Both nodes are available in SAP HANA 2.0 XSA and HANA Cloud.

The Minus and Hierarchy Function nodes have been available on premise since SAP HANA 2.0 SPS01 and SPS03 respectively, and are available in SAP HANA Cloud.

This use case is helpful in business scenarios where one wants to migrate from SAP BW 7.x to SAP HANA 2.0 XSA or SAP HANA Cloud. SAP BW is known for data warehousing and strong reporting capabilities, but when migrating from SAP BW to SAP HANA 2.0 or SAP HANA Cloud, some features are not available out of the box. In this case, HANA Cloud is considered the backend for data processing and modelling, and Analysis for Office is used for reporting.

Saturday, 28 May 2022

Extending business processes with product footprints using the Key User Extensibility in SAP S/4HANA

With SAP Product Footprint Management, SAP provides a solution, giving customers transparency on their product footprints, as well as visibility into the end-to-end business processes and supply chains. 

SAP S/4HANA comes with the Key User Extensibility concept, which is available both in the cloud and on-premise versions.

Key User Extensibility, together with product footprints calculated in SAP Product Footprint Management, enables customers to enrich end-to-end business processes with sustainability information, helping to implement the “green line” in the sustainable enterprise. With Key User Extensibility, this can be achieved immediately, as the extension of the business processes can be introduced right away, by customers and partners, during an implementation project.

Friday, 27 May 2022

Configuration of Fiori User/Web Assistant with/without Web Dispatcher for S/4 HANA On-Premise System

Overview –

The Web Assistant provides context-sensitive in-app help and is an essential part of the user experience in SAP cloud applications. It displays as an overlay on top of the current application.

You can use the Web Assistant to provide two forms of in-app help in SAP Fiori apps:

◉ Context help: Context-sensitive help for specific UI elements.

◉ Guided tours: Step-by-step assistance to lead users through a process.

Wednesday, 25 May 2022

Migration Cockpit App Step by Step

Migration Cockpit is an S/4HANA app that replaces LTMC as of version 2020 (on-premise).

It is a powerful data migration tool included in the S/4HANA license, and it delivers preconfigured content with automated mapping between source and target. This means that if your needs match the available migration objects, you do not have to build a tool from scratch; it is all ready to use, reducing the effort of your data load team.

Migration Cockpit Illustration by SAP

Monday, 23 May 2022

LO Data source enhancement using SAPI

In this blog we will discuss LO DataSource enhancement using SAPI. The scenario is the same.

For a particular order we need to have the material status and other fields in our data flow, and the material number is available in our DataSource 2LIS_04_P_MATNR.

But before going to the implementation, I want to discuss the enhancement framework architecture, which is given below:


It is better, and good practice, to enhance (i.e. append to) the communication structure instead of directly appending to the extract structure. This increases the scope for reusability.

Sunday, 22 May 2022

SAP BW/4HANA DS Connection (MS SQL DB) Source System via SDA

Introduction:

As you are all aware, you cannot connect DS directly to the SAP BW/4HANA system; hence you need to connect the DS database with the HANA database via SDA and create a source system.

Based on the customer requirements, we set up the HANA DB connection with the MS SQL DB and set up the source system.

DISCLAIMER

The content of this blog post is provided “AS IS”. This information could contain technical inaccuracies, typographical errors, and out-of-date information. This document may be updated or changed without notice at any time. Use of the information is therefore at your own risk. In no event shall SAP be liable for special, indirect, incidental, or consequential damages resulting from or related to the use of this document.

Friday, 20 May 2022

Data driven engineering change process drives Industry 4.0

We are “Bandleaders for the Process.” Our mission is to orchestrate plant operation, with leadership across the manufacturing value chain. This reminds me of my job in the plant 20 years ago.

One important mission was to manage engineering change; it required a lot of time and attention to plan, direct, control and track all the activities across the team with multiple files and paper documents:

◉ What is the impact of change?

◉ When will the new parts come from the suppliers? How many old parts do we have in stock?

◉ Which production order should be changed? What is the status of production orders?

◉ Are new tools ready? Have all of build package documents been revised?

Wednesday, 18 May 2022

Rise with SAP: Tenancy Models with SAP Cloud Services

Introduction

Transitioning to Rise with SAP cloud services, SAP customers have a choice of opting for either single tenanted or multi-tenanted landscape. The choice of tenancy model largely depends on the evaluation of risk, type of industry, classification of data, security, sectorial and data privacy regulations. Other considerations include performance, reliability, shared security governance, migration, cost, and connectivity. While customer data is always isolated and segregated between tenants, the level of isolation is a paramount consideration in choosing a Tenancy Model.

In this blog, we will cover the tenancy models available under Rise with SAP cloud services and explore the nuanced differences and some of the considerations for choosing each of them.

Monday, 16 May 2022

SAP HANA, express edition and SFLIGHT demo database, modeling – Complete Tutorial

Task

Try the SAP HANA Modeling functions and possibilities with the SFLIGHT demo database.

Reason of the Article

To illustrate the process with a concrete example based on the SFLIGHT database. Many additional steps and setups are necessary during the process, and some helpful documentation is available that can be used along the way. I am trying to collect this information inside this post as well. This is also a step-by-step guide that goes through the whole process.

Definitions

SFLIGHT is a sample database. Official documentation can be found here: Flight Model

Saturday, 14 May 2022

APL Time Series Forecast using a Segmented Measure

The latest release of the Automated Predictive Library (APL) introduces the capability to build several time series models at once from a segmented measure, such as Sales by Store or Profit by Product. There is no need anymore to define a loop in your SQL or Python code; just tell APL which column represents the segment in your dataset. You can also specify how many HANA tasks to run in parallel for faster execution.

This new capability requires HANA ML 2.13 and APL 2209.

Let’s see how it works in Python and then in SQL.
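Conceptually, a segmented forecast is just "one model per group". This pure-Python sketch (not the APL or hana_ml API) fits a trivial mean model per store segment; the point of the new APL capability is that it does this grouping and parallel execution for you once you name the segment column:

```python
from collections import defaultdict

# Toy data: (segment, quantity) pairs, e.g. sales by store.
sales = [("Store_A", 10), ("Store_A", 14), ("Store_B", 100), ("Store_B", 120)]

# Group the observations by segment...
by_segment = defaultdict(list)
for store, qty in sales:
    by_segment[store].append(qty)

# ...then fit one (trivial) model per segment.
forecasts = {store: sum(v) / len(v) for store, v in by_segment.items()}
print(forecasts)
```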

Friday, 13 May 2022

Setup SAP HANA XS with HTTPS

When you install SAP HANA 2.0, the SSL certificate in the PSE store is self-signed. In order to allow signed SSL HTTP connections with SAP HANA, we need to replace the default self-signed certificate with a new one signed by a CA of your choice.

Steps

1. Go to below URL

https://FQDN:4300/sap/hana/xs/wdisp/admin/public/default.html

2. Expand the SSL and Trust Configuration tree and click on PSE Management

Wednesday, 11 May 2022

Running hdbcli on an Apple M1 chip: an alternative way with using arch command

A look at arch command

I tried an alternative approach that I would like to share here. It does not involve making a copy of the terminal application, but instead uses the arch command.

As man arch explains:

By default, the operating system will select the architecture that most closely matches the processor type. … The arch command can be used to alter the operating system’s normal selection order.

The arch command with no arguments, displays the machine’s architecture type.
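From Python itself you can check which architecture the interpreter is running under, which is handy when deciding whether an x86_64 build of hdbcli will load. A small sketch; the arch invocation in the comment applies to macOS:

```python
import platform

# On Apple silicon, a native Python reports "arm64"; the same script
# launched under Rosetta (e.g. via `arch -x86_64`) reports "x86_64".
machine = platform.machine()
print(f"Running on: {machine}")

# Example macOS invocation of an Intel-architecture process:
#   arch -x86_64 /usr/bin/python3 -c "import platform; print(platform.machine())"
```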

Friday, 6 May 2022

Understanding the Configuration of SAP HANA NSE

This blog is intended to provide a better understanding of SAP HANA NSE and its configuration.

Design Principles of NSE

SAP HANA NSE adds a seamlessly integrated disk-based processing extension to SAP HANA’s in-memory column store by offering a large spectrum of data sizes for an improved cost-to-performance ratio. It manages data without fully loading it into memory. This offers the ability for processing in-memory stored data for performance critical operations (hot data) and NSE-managed data for less frequent accesses (warm data).
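Moving data between hot and warm is done with the load-unit clause of ALTER TABLE. A hedged sketch, with an invented table name, of the two statements you would issue (assembled here as strings; the clause can also target individual partitions or columns):

```python
# NSE load-unit statements: PAGE LOADABLE moves the table to
# NSE-managed warm storage, COLUMN LOADABLE moves it back to
# fully in-memory hot storage. Table name is illustrative.
table = "SALES_HISTORY"
to_warm = f'ALTER TABLE "{table}" PAGE LOADABLE'
back_to_hot = f'ALTER TABLE "{table}" COLUMN LOADABLE'
print(to_warm)
print(back_to_hot)
```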

Monday, 2 May 2022

HDI: Get Object Details

Introduction

When working with the HANA Repository, you may be used to querying the table “_SYS_REPO”.”ACTIVE_OBJECT” to get details about design-time objects.

With HDI, the same information is now split by containers and you have to query 2 objects:

◉ the view M_OBJECTS

◉ the procedure READ_DEPLOYED

For some use cases, it would be convenient to be able to query all HDI containers at once and get information from M_OBJECTS and READ_DEPLOYED at the same time.
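One hedged way to get that "all containers at once" view is to generate a UNION ALL over each container's M_OBJECTS (exposed in the container's #DI schema). The container names and selected columns below are illustrative, not taken from a real deployment:

```python
# Generate one UNION ALL query across several HDI containers.
containers = ["CONTAINER_A", "CONTAINER_B"]

query_tpl = ('SELECT \'{c}\' AS CONTAINER, OBJECT_NAME, SCHEMA_NAME '
             'FROM "{c}#DI"."M_OBJECTS"')

union_sql = "\nUNION ALL\n".join(query_tpl.format(c=c) for c in containers)
print(union_sql)
# Run the resulting string with a user granted access to each
# container's #DI API schema.
```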

Friday, 29 April 2022

Implementing SAP S/4HANA with SAP Activate: Best Practices

There’s a tradeoff in SAP implementation that forces users to choose between standardization and customization. SAP S/4HANA focuses on standardization while still trying to support integration from third-party providers. By standardizing the processes within SAP S/4HANA and extending that standardization to third-party add-ons, the software suite makes it easier and faster to adopt cloud innovations as they become available. Experts in the field refer to this thinking as a “cloud mindset.” It’s also colloquially termed “keeping the core clean,” meaning that their interaction with third-party plugins does not taint the core services. SAP Activate was developed to help businesses keep their core clean and create a cloud mindset for their SAP installations. Industry best practices for this process follow, underlining SAP Activate’s approach to keeping the core clean.

Thursday, 28 April 2022

Serial Number Management in SAP S/4HANA Cloud

You can use serial number management in SAP S/4HANA Cloud Warehouse Management to identify and track individual products in your warehouse, from goods receipt to goods issue. For products with serial numbers, it is easy to track, even retrospectively, on which date it was delivered from which supplier, who performed the quality inspection, who moved this product, when this was done, and when it was packed and sent to the customer.

A serial number is a series of characters assigned to each product in the warehouse so that it remains separate and identifiable in the warehouse system. You can use serial numbers up to a length of 18 characters in an SAP ERP system.

Wednesday, 27 April 2022

SAP S/4HANA Embedded Analytics: An Overview

In this blog post I will discuss an overview of SAP S/4HANA Embedded Analytics, which is one of the key innovations of S/4HANA. It is a collection of SAP Fiori tiles in SAP S/4HANA that enable real-time operational reporting in your transactional system. No ETL (data extraction, transformation, and loading) or batch processing is required, in contrast to traditional reporting landscapes involving ERP and data warehouse systems. Embedded Analytics uses SAP Fiori as the front-end user interface, and SAP has delivered standard content known as Fiori analytical apps for various functional areas. Standard content can be leveraged right out of the box as an accelerator for your implementation, and can also be personalized and extended based on business requirements.

Why should I use SAP S/4HANA Embedded Analytics?

◉ Business Innovation through Radical Simplification

◉ Agility and Creativity for the Business

◉ Scale and Trust for the Enterprise

Monday, 25 April 2022

RISE with SAP to transform your business

Introduction

In recent years, business transformation has become a hot issue and a trending topic in the SAP community. Let us dive into this blog post to make more sense of the matter.

What is business transformation?

A business may undergo a transformation for a variety of reasons: new technology, market shifts, low profit and turnover resulting in cost cutting, or a merger and acquisition. These factors are critical to a company’s success, necessitating a change in the business ecosystem.

Saturday, 23 April 2022

How to Find Which Structure is Included in Which Structure?

Introduction

In every project you join, there will be a requirement to enhance a standard screen by adding new custom fields. The general approach is to use an append structure and add your custom fields. Sometimes while analysing, it may happen that you have the append structure name but cannot find the actual structure it is attached to. You might think the where-used list should help you find out, but no, it does not.

Solution:

Let us first analyze the issue. Suppose I am looking to find where the below structure is attached.

Friday, 22 April 2022

Maintaining business roles in SAP S/4HANA Cloud

Introduction

This blog post describes a scenario applicable to S/4HANA Cloud system administrators: during the system implementation it is necessary to create a new business role at a business key user’s request. At the same time, the request includes the name of the app to which access must be set up, but not the business role name itself. As an example we will take the app “Display Dunning History”.

Identifying business role name

To identify the business role name, we will use the Fiori apps library. There we need to go to “All apps for SAP S/4HANA” -> All apps and then enter the name of the target app in the search field. In our case it is “Display dunning history”:

Wednesday, 20 April 2022

Key and unique building blocks of SCC Multi-Tiered Subcon Collaboration Process

In this blog, I want to share my experience and learnings with the multi-tiered subcontracting collaboration process, using my apparel client’s example. I will focus on the unique functional aspects that need attention during any implementation of this process. Let me start with an overview of the business case and the process adopted by the business. Later, I will deep-dive into the key building blocks and their details.

Business Case and Process adopted with SCC

One of my apparel clients in the USA sources garments from a Chinese manufacturer. A special fabric is used in manufacturing specific silhouettes of garments, and this fabric is sourced from an Ethiopian (East African) supplier. My client was looking for a drop-shipment process for the special fabric from the Ethiopian supplier to the Chinese manufacturer, to save transportation time and space in their own warehouse. The Chinese manufacturer was given visibility of the fabric order via manual spreadsheets and e-mails, so that they could plan production and subsequent shipments accordingly.

Monday, 18 April 2022

Delivery Time Calculation for Summarized JIT Call with “CYCLE”

Solution Overview

As you know, the SAP KANBAN solution provides an external procurement function that can issue summarized JIT call documents to suppliers for fulfillment. These documents contain the delivery date and time in the header information, which can also be calculated via the standard SAP JIT calculation profile configuration. However, during solution delivery you will face the “CYCLE” issues that are broadly and deeply in use at Japanese automotive OEMs and their suppliers who run KANBAN in the factory.

Friday, 15 April 2022

Role of Migration Server/Import Server in Brownfield Implementation - RISE with SAP, S/4HANA Cloud, private edition

In this blog I wish to discuss the basic usage of a migration server (VM) / import server during a brownfield system migration to SAP S/4HANA Cloud, private edition.

Below are the questions that this blog targets:

1. What is a migration server / import server in RISE with SAP S/4HANA Cloud, private edition?

2. What is the configuration of the migration/import server?

3. Why should the customer subscribe to additional storage while using the migration server?

4. How many migration/import servers are needed to complete a three-system landscape migration to SAP S/4HANA Cloud, private edition?

Wednesday, 13 April 2022

DMO: BWoH: From SAP Netweaver BI 7.0 to SAP Netweaver 7.5 BI on HANA {Technical preparation and execution}

Summary of what we achieved: We recently migrated and upgraded a BW/BI system from NetWeaver 7.0 to NetWeaver 7.5 using the DMO feature of SAP’s robust SUM 2.0 tool for one of our customers. The database size was nearly 3 terabytes, and the resulting HANA database was around 250 gigabytes.

The existing infrastructure had Oracle 11.2.0.2.0 as the database and HP-UX ia64 as the operating system.

So anyone who is going to work on a similar assignment can refer to this blog post for insight.

Monday, 11 April 2022

Data preview on Intermediate nodes of a CV in HANA cloud/On-premise

Data preview on intermediate nodes of a calculation view is a regular task when using HANA Studio as a development tool. It is useful for debugging/troubleshooting the data output at each node level of a CV.

But after migrating to HANA XSA (on-premise) or HANA Cloud, you will need to use Web IDE for HANA (on-premise) or Web IDE Full-Stack (cloud) as your development tool. Usually, Web IDE is connected to the Dev instance and not to Prod; if you try to connect to Prod, there is a risk of unexpected changes being deployed there.

In this situation, if you want a data preview on a graphical CV in Web IDE, it can only happen in the Dev instance. For production, we have to implement a firefighter access setup and run SQL to generate the data preview on the intermediate nodes.

Friday, 8 April 2022

Deploy Machine Learning/Exploratory Data Analysis Models to SAP Business Technology Platform

DISCLAIMER: Please note that the content of this blog post is for demonstration purpose only, it should not be used productively without impact evaluation on production environment.

Introduction:

In this blog, we will implement an end-to-end solution for a Python-based web application (Flask) on SAP Business Technology Platform.

◉ We will use a cloud-based HANA DB and will leverage the Python package hdbcli to fetch the relevant data using SQL statements.

◉ We will use Python data science packages such as pandas, seaborn and matplotlib to display various graphs for exploratory data analysis.

Wednesday, 6 April 2022

Some interesting Facts of Compatibility Views in SAP BW/4HANA and SAP S/4HANA

As the name implies, SAP BW/4HANA and SAP S/4HANA are fully integrated with the underlying SAP HANA platform. This also means that the physical data model changed compared to their predecessor products (for example in the FI/CO area). Some tables were removed, changed or consolidated, which makes a lot of sense in order to leverage the full power of the columnar in-memory concepts of SAP HANA.

To facilitate customers’ transformations, SAP has introduced so-called compatibility views to keep some core interfaces running. If a compatibility view is in place for an obsolete table, requests to access that table are redirected to the newly introduced table(s) with the help of the logic provided in the compatibility view.

Monday, 4 April 2022

Multiclass Classification with APL (Automated Predictive Library)

Common machine learning scenarios, such as fraud detection, customer churn and employee flight risk, aim to predict yes/no outcomes using binary classification models. But sometimes the target to predict has more than just two classes. This is the case for delivery timeliness, which can have three categories: Early/On-time/Late.

From this article you will learn how to train and apply a multiclass classification model in a Python notebook with HANA ML APL.

The following example was built using HANA ML 2.12.220325 and APL 2209.

Census Income will be our training dataset.

Friday, 1 April 2022

Two simple tips to boost the working efficiency of a Data Science Project

How can we make our daily work more efficient? Is there a straightforward answer? For me, the answer is one word: experience.

Participating in several data science projects over the last years, I was really amazed at how fast you can confirm the saying that almost 70-80% of a data science project is spent on data preparation. Two simple tips regarding the data preparation process are presented in this blog post.

The first compares four different ways in which a data scientist in SAP HANA can create random sample datasets from an initial dataset, and their potential usage. The second exposes the power of SAP HANA ML for creating and automating a set of new aggregated columns (max(), sum(), avg(), for example) from existing columns without the need to write complex and long SQL queries (the feature engineering part).
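To make the two tips concrete without a HANA system, here is a pure-Python analogue (in HANA you would use SQL's RAND()/LIMIT or the hana_ml helpers instead): a reproducible random sample, and per-key max/sum/avg feature columns generated in one pass. The data is invented:

```python
import random
from statistics import mean

data = [{"cust": c, "amount": a} for c, a in
        [("C1", 10), ("C1", 30), ("C2", 5), ("C2", 15), ("C3", 40)]]

# Tip 1: a reproducible random sample of rows (fixed seed).
rng = random.Random(42)
sample = rng.sample(data, k=2)

# Tip 2: per-customer max/sum/avg features without hand-written SQL.
groups = {}
for row in data:
    groups.setdefault(row["cust"], []).append(row["amount"])
agg = {c: {"max": max(v), "sum": sum(v), "avg": mean(v)}
       for c, v in groups.items()}
print(sample, agg)
```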

Wednesday, 30 March 2022

SAP’s Banking Product – SAP Financial Services Data Management and Platform (FSDM/FSDP), SAP Financial Products Subledger (SAP FPSL), Part 1

I am writing this series of blog posts on SAP’s banking products, formerly known as SAP Bank Analyzer and now known as SAP Financial Products Subledger, or SAP FPSL. Straight after the SAP S/4HANA release, SAP worked eagerly to provide banking solutions on its flagship database, SAP HANA, with the SAP application suite ABAP stack on top of it.

The main reason for writing this blog post is to showcase the power of SAP’s HANA database, since there is little information available except on SAP’s official FSDP website and videos.

Friday, 25 March 2022

How to Set Up SAP HANA on Azure

SAP HANA deployment on Azure enables companies to evaluate and then run development, test, sandbox, and training environments for different SAP products. Deploying SAP HANA in the cloud enables customers to avoid the traditional path of procuring hardware and then installing SAP. Azure offers benefits like scalability, availability, and cost savings.

Hosting SAP HANA on Azure

SAP HANA is an in-memory Relational Database Management System (RDBMS). SAP HANA stores data in Random Access Memory (RAM), which improves database performance compared to traditional databases that rely on persistent storage.

You can run SAP HANA on-premises on your own dedicated hardware, or in a public or private cloud. Launching an SAP HANA certified Virtual Machine (VM) or a bare-metal server enables you to run SAP workloads in Azure while optimizing costs.

Wednesday, 23 March 2022

S/4HANA Cloud Applications Monitoring with Cloud Integration

Introduction

As this is my first blog on the SAP Community, I would like to tell you a little story. A few months ago I joined an AMS project, and some of my daily tasks involved twice-a-day monitoring of some SAP Cloud Integration iFlows and a few SAP S/4HANA applications like Message Monitoring, Manage Output Items, etc.

In the beginning I took my time analyzing the payloads and error sources, because there were many scenarios and I could not evaluate them as fast as I can now. But as I got better at my job, I ran into the need to fetch all the messages with their details in a more efficient way.

Monday, 21 March 2022

Why Do I Prefer Shell Conversion for BW/4HANA Migration?

Abstract:


This blog post discusses the various migration strategies in a fundamental business and technical sense.

Disclaimer:


The points I bring up in this blog post are from my own experience and thought process, and they may vary from place to place. However, this blog post will give the reader some things to think about before taking a call on the migration strategy.

Friday, 18 March 2022

HDI: returning multiple deployment errors

Problem:

When working with SAP HANA Cloud and HDI, the tooling used to stop at the first error, and you had to fix errors one by one, in the sequence they were thrown. This can lead to a lot of time-consuming cycles that do not support the workflow one had in mind.

Solution:

With SAP HANA Cloud March 2022, HDI can now return the first error it detects per “dependency branch”.

Wednesday, 16 March 2022

SAP HANA XS Advanced Administration, Deployment and Operations

Many SAP customers have started using XS Advanced, and there have been several questions on how and where to deploy it, and what the considerations are for choosing each of those approaches. Based on my experience with various customers, I have tried to address some key topics in this blog and consolidate information from various sources of documentation.

1. Deployment Options

There are various ways of deploying the XS Advanced runtime. Before deploying XSA, many scenarios need to be considered that will have an impact on your landscape maintenance activities. For example, in system refresh scenarios where you need to refresh only certain tenants instead of the complete system, this deployment choice plays a key role. I will go through the limitations in detail in the backup/restore section. In general, the following additional services run where XSA is deployed:

Sunday, 13 March 2022

The fastest way to load data from HANA Cloud, HANA into HANA Cloud, HANA Data Lake

Overview

Recently, as customers are moving larger and larger tables from HANA into HANA Data Lake, I am being asked what the fastest way is to move data from HANA to HANA Data Lake. Or, more precisely, I am asked whether there is a faster way than doing a simple HANA INSERT into a HANA Data Lake virtual table.

You may be asking why customers are moving large tables from HANA to HANA Data Lake (HDL). The most popular use case is an initial materialization of a large dataset or archiving older data to HDL. Most of these customers use HANA Smart Data Integration (SDI) to do this materialization, and often use the same interface for change data capture, with SDI flowgraphs or SDI real-time replication keeping these tables up to date.
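For reference, the baseline being measured against is the plain INSERT ... SELECT into an HDL virtual table; chunking it by a key range is a common way to keep individual transactions small. Table names below are invented:

```python
# The simple baseline: INSERT into the HDL virtual table directly
# from the HANA table, parameterized by a key range for chunking.
src, vt = "HANA_SALES", "VT_HDL_SALES"
batch_sql = (
    f'INSERT INTO "{vt}" '
    f'SELECT * FROM "{src}" WHERE ID BETWEEN ? AND ?'
)
print(batch_sql)
# Execute repeatedly with advancing (low, high) bounds, e.g. via
# hdbcli: cursor.execute(batch_sql, (low, high))
```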

Monday, 7 March 2022

Preserve and Identify Source Deleted Records in HANA via SLT

Requirement:

Need to preserve S/4HANA table hard-deleted records in Enterprise HANA.

Identify these records in Enterprise HANA by setting IS_DELETED = ‘Y’.

Challenge: By default, SLT replication ensures that the source and target data records match at all times. This means that even a deletion of a record in the source is passed on to the target system and causes a deletion in the target, so that the data record count matches exactly between the source and target tables.
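The desired behavior can be sketched in a few lines of pure Python (the actual solution lives in SLT transformation rules; this only shows the soft-delete idea of flagging rather than deleting):

```python
# Keys currently present in the S/4HANA source table.
source_keys = {"K1", "K2"}

# The Enterprise HANA copy still holds K3, which was hard-deleted
# in the source.
target = {
    "K1": {"IS_DELETED": "N"},
    "K2": {"IS_DELETED": "N"},
    "K3": {"IS_DELETED": "N"},
}

# Instead of deleting vanished rows, preserve them and mark them.
for key, row in target.items():
    if key not in source_keys:
        row["IS_DELETED"] = "Y"

print(target)
```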

Friday, 4 March 2022

SAP Tech Bytes: SAP HANA / CAP Access Tables from a Different Schema

Introduction

I read on social media about a New Year’s resolution idea: instead of answering questions sent in direct communication, write the response as a blog post and send the requester a link to the post. It sounds like a great idea to better utilize time and share knowledge so I decided to give it a try. Fourteen days into the new year and so far I’ve failed spectacularly. I find when I go to write a blog post I want to provide more background and detail. All this takes more time than you can usually squeeze into the day. This blog post represents my attempt to take at least one question I’ve received and answer via blog post, although admittedly after already responding to the original request.

Wednesday, 2 March 2022

HANA NSE (Native Storage Extension) Data Tiering Options for Utilities

Purpose – An attempt to explain HANA NSE (Native Storage Extension) concepts in simple words for anyone looking to understand the topic. I have also tried to simplify the steps used in implementing HANA NSE.

Topics covered –

– Reasons or case to implement HANA NSE.

– Basic concepts of HANA NSE.

– How to find tables/objects that are good candidates for NSE.

– How to use DVM, DBA cockpit, and NSE advisor.

– Examples and links to the documentation provided by SAP.

Friday, 18 February 2022

SAP Analytics Classification algorithm: Predict the potential profits of marketing campaigns

Profiling the client attributes that influence a positive response to sales

Classification is one of the machine learning algorithms in SAP Analytics Cloud; it identifies which variables most influence a positive outcome. Based on this information we can calculate the maximum profit of a marketing campaign. The algorithm trains on past data and evaluates the results to indicate how reliable that data is for making predictions.
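As a simple illustration of the profit calculation: once the model outputs a response probability per client, contacting a client is worthwhile only when the expected revenue exceeds the contact cost. The figures below are invented for the example, not taken from the campaign data.

```python
def campaign_profit(probabilities, revenue_per_response, cost_per_contact):
    """Expected profit when we contact only the clients whose
    predicted response probability is profitable in expectation."""
    profit = 0.0
    for p in probabilities:
        expected = p * revenue_per_response - cost_per_contact
        if expected > 0:  # skip clients who cost more than they return
            profit += expected
    return profit

# Three scored clients, 100 revenue per response, 20 cost per contact:
# 0.9 and 0.5 are worth contacting, 0.1 is not.
print(campaign_profit([0.9, 0.1, 0.5], 100, 20))
```

This is the intuition behind the maximum-profit curve: sort clients by predicted probability and stop contacting where the marginal expected profit turns negative.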

We are going to train the classification process on data from a past marketing campaign to estimate the likely effectiveness (as a percentage) of a campaign based on:

Monday, 14 February 2022

Forecasting Intermittent Time Series with Automated Predictive (APL)

Starting with version 2203 of the Automated Predictive Library (APL), intermittent time series receive special treatment. When the target value contains many zeros, typically when demand for a product or a service is sporadic, APL no longer pits various forecasting models against each other; it systematically uses the Single Exponential Smoothing (SES) technique.

For SAP Analytics Cloud users, this functionality is coming with the 2022.Q2 QRC release in May.

Let’s take the following monthly quantity as an example.
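To make the SES idea concrete, here is a minimal Python sketch of Single Exponential Smoothing. The forecast it produces is flat, which is exactly why it suits sporadic demand with many zeros. Note that APL fits the smoothing factor internally; the fixed alpha and the sample series below are purely illustrative.

```python
def ses_forecast(series, alpha=0.5, horizon=3):
    """Single Exponential Smoothing: the level is updated as
    level = alpha * y + (1 - alpha) * level, and the forecast
    is the last level repeated over the horizon (a flat line)."""
    level = float(series[0])
    for y in series[1:]:
        level = alpha * y + (1 - alpha) * level
    return [level] * horizon

# A sporadic monthly quantity: mostly zeros with an occasional spike.
print(ses_forecast([0, 0, 10, 0], alpha=0.5, horizon=2))
```

With many zeros, any model that tries to fit trend or seasonality tends to chase noise; SES instead smooths the series into a stable average demand level.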

Friday, 11 February 2022

Move data FAST from an SAP HANA Cloud database to a HANA Data Lake

If you’ve ever been stuck wondering how you can move data from your SAP HANA Cloud database to your SAP HANA Data Lake with minimal effort, this is for you. In fact, it might be the fastest way to move your data depending on your data lake’s configuration. Plus, it’s simple enough that a single Python function can do all the work for you!

This blog will outline how you can leverage remote servers in the data lake to make a connection to your HANA database and pull the data from HANA into the data lake. I also experiment with different data lake configurations to see which parameters affect the speed of this data transfer the most, so that you know how to scale your data lake instance to achieve the best performance.
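In outline, the data-lake-side approach has three steps: declare a remote server that points back at the HANA database, expose the HANA table through it, and pull the rows with one INSERT ... SELECT executed inside the data lake. The Python sketch below merely assembles illustrative statements; the exact server class, location string, and all names here are assumptions that depend on your setup.

```python
def build_pull_statements(server, hana_table, hdl_table, conn_string):
    """Assemble the three data-lake-side SQL steps for pulling a
    HANA table into a data lake table via a remote server."""
    proxy = hdl_table + "_REMOTE"  # proxy table exposing the HANA table
    return [
        "CREATE SERVER {} CLASS 'HANAODBC' USING '{}'".format(server, conn_string),
        "CREATE EXISTING TABLE {} AT '{}..{}'".format(proxy, server, hana_table),
        "INSERT INTO {} SELECT * FROM {}".format(hdl_table, proxy),
    ]

# Hypothetical names and connection string for illustration only.
for s in build_pull_statements("HC_SRV", "SALES", "HDL_SALES",
                               "HOST=hana-host:443;UID=user;PWD=secret"):
    print(s)
```

Because the final INSERT ... SELECT runs inside the data lake engine, the transfer is pulled in bulk rather than pushed row by row from HANA, which is where the speed advantage comes from.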

Thursday, 10 February 2022

Reduce SAP HANA Memory Footprint and TCO of SAP HANA with NSE

An efficient data management strategy is important for SAP HANA customers to achieve a low total cost of ownership (TCO) and keep the memory footprint under control.

SAP HANA data growth comes with associated challenges and leaves customers in a dilemma about the actions needed to manage growth, TCO, and performance, as depicted in the graph below.


Wednesday, 9 February 2022

Can data modeling be enhanced by incorporating business knowledge?

If the answer is yes, then to what extent and how?

In this blog post we will try to answer the question by working through a real-life scenario. We will discuss the challenges posed by large amounts of data and the long processing times that result when access to that data is not optimally planned and its structure is not properly designed. Finally, we will see how business knowledge about the scenario informs the data modelling and boosts performance significantly.