User Guide

Prerequisites needed in the Gekkobrain Cloud

Before you start extracting data from your SAP system, you need to prepare the cloud to receive the data.

Cloud configuration

The cloud is organized in SYSTEMS, DATASOURCES and PROJECTS.

The cloud regularly receives information from the extractors you have installed in your SAP systems, but two things need to happen before the data is displayed in the cloud: 1) the cloud must be set up to recognise your SAP environment and the data from those systems, and 2) the extractors must be set up in your SAP systems, including report variants and batch-job timings and frequency.

Creating your systems

In order for Gekkocloud to recognise your SAP systems they need to be created in the cloud.

Navigate to the menu item on the left called "New System"; it is visible after you press Systems.

Variable      Example Value  Comment
System ID     DEV            It makes sense to choose your SYSID here
Description   Development    A meaningful description to yourself
System Type   OPS            Either DEV or OPS

Note: Please create your Prod (OPS) system before creating your Dev system, so that you can link to the OPS system during the Dev system creation.

Proceed to create as many Dev and Ops systems as you want to monitor. If you have more than one Dev system pointing to Production, create each of those Dev systems in the cloud.

Below is an example of a normal system setup. Remember that it is easiest to create the Ops system first, because you need to point a Dev system to a productive system.

example: A typical Ops/prod system

Variable Value
System ID P10
Description Wholesale Production
System Type OPS

example: A typical Dev system that connects to production

Variable      Value
System ID     D10
Description   Wholesale Development
System Type   DEV
OPS System    P10
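
Systems are normally created through the cloud UI as shown above. If you script the setup instead, the flow would look roughly like the sketch below. This is a minimal Python sketch against a hypothetical REST endpoint; the URL, field names and token handling are illustrative assumptions, not a documented Gekkobrain API.

    import requests

    API_BASE = "https://cloud.gekkobrain.example/api"  # hypothetical endpoint
    TOKEN = "your-api-token"                           # hypothetical auth token

    def create_system(system_id, description, system_type, ops_system=None):
        """Create a system in the cloud. Create the OPS system first so the
        Dev system can reference it via ops_system."""
        payload = {
            "system_id": system_id,      # e.g. your SYSID
            "description": description,
            "system_type": system_type,  # "DEV" or "OPS"
        }
        if ops_system:
            payload["ops_system"] = ops_system  # links a Dev system to its Prod system
        response = requests.post(
            f"{API_BASE}/systems",
            json=payload,
            headers={"Authorization": f"Bearer {TOKEN}"},
        )
        response.raise_for_status()
        return response.json()

    # Ops system first, then the Dev system that points to it.
    create_system("P10", "Wholesale Production", "OPS")
    create_system("D10", "Wholesale Development", "DEV", ops_system="P10")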

Creating datasources for your systems

For a given Dev-Ops system chain you need to set up the required datasources for both your Dev and your Ops system.

The datasources represent a "data area" in the cloud, and they are fed by the extractors you have installed in your systems.

Navigate to the menu item on the left called "New Datasource"; it is visible after you press Data Sources.

Variable      Example Value   Comment
System ID     DEV             This is a drop-down of the systems you have created
Template      TRANSPORT       This is a drop-down of the datasources available to you
Description   TMS data (Dev)  A meaningful description to yourself

In a DevOps scenario you need to set up these datasources:

             Dev         Prod (Ops)
TRANSPORT    Yes         Yes
CROSSDEPEND  Yes         Not needed
DUMP         Not needed  Yes
APPSTATS     Not needed  Yes
WEBSTATS     Not needed  Yes
SQLM         Not needed  Yes
HANAREADY    Yes         Not needed
PERFORM      Yes         Not needed
S4ESTIMATE   Yes         Not needed
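
The matrix above is easy to keep encoded as a checklist when you roll out several system chains. A small Python sketch (the datasource names come straight from the table; everything else is illustrative) that reports which datasources are still missing for a complete DevOps scenario:

    # Required datasources per side of a Dev-Ops chain, from the table above.
    DEV_DATASOURCES = {"TRANSPORT", "CROSSDEPEND", "HANAREADY", "PERFORM", "S4ESTIMATE"}
    OPS_DATASOURCES = {"TRANSPORT", "DUMP", "APPSTATS", "WEBSTATS", "SQLM"}

    def missing_datasources(configured_dev, configured_ops):
        """Return the datasources still missing on each side."""
        return {
            "dev": sorted(DEV_DATASOURCES - set(configured_dev)),
            "ops": sorted(OPS_DATASOURCES - set(configured_ops)),
        }

    print(missing_datasources(["TRANSPORT", "HANAREADY"], ["TRANSPORT", "DUMP"]))
    # {'dev': ['CROSSDEPEND', 'PERFORM', 'S4ESTIMATE'], 'ops': ['APPSTATS', 'SQLM', 'WEBSTATS']}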

Data refresh rate (batch-job frequency)

All of these datasources and their corresponding SAP extractors need to be active for you to benefit from a complete DevOps scenario. The model below suggests frequencies and timings for the batch jobs.

             Dev                                  Prod (Ops)
TRANSPORT    Hourly (delta), one daily full load  Hourly (delta), one daily full load (*)
CROSSDEPEND  Daily, some time after midnight      -
DUMP         -                                    Daily, some time after midnight
APPSTATS     -                                    Daily, some time after midnight
WEBSTATS     -                                    Daily, some time after midnight
SQLM         -                                    Daily, some time after midnight
HANAREADY    Daily, some time after midnight      -
PERFORM      Daily, some time after midnight      -

(*) If imports into the system happen no more than once a day, the Production TRANSPORT extractor only needs to run after the import has completed.
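
In SAP these frequencies are realised as SM36 batch jobs. Purely to make the model concrete, the same schedule can be written in cron notation, as in the Python sketch below; the job names and exact minutes are illustrative, so pick times that fit your own import and load windows.

    # Hypothetical schedules mirroring the model above, in cron notation.
    DEV_SCHEDULE = {
        "TRANSPORT_DELTA": "0 * * * *",   # hourly delta load
        "TRANSPORT_FULL":  "0 1 * * *",   # one daily full load
        "CROSSDEPEND":     "30 0 * * *",  # daily, some time after midnight
        "HANAREADY":       "30 0 * * *",
        "PERFORM":         "30 0 * * *",
    }
    OPS_SCHEDULE = {
        "TRANSPORT_DELTA": "0 * * * *",
        "TRANSPORT_FULL":  "0 1 * * *",   # or once, right after the daily import (*)
        "DUMP":            "30 0 * * *",
        "APPSTATS":        "30 0 * * *",
        "WEBSTATS":        "30 0 * * *",
        "SQLM":            "30 0 * * *",
    }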

Start working with Insights

When you have set up the extractors in SAP and the transfer of data has completed, you can get started with Gekkobrain Insights.

Depending on your role in Gekkobrain, you can get insights from the following areas:

The Developer Dashboard

The dashboard delivers valuable insights to SAP developers about their ongoing development and about development already deployed to production.

To navigate to the dashboard, choose ‘Dashboard’ -> ‘Developer’ in the menu.

Gekkobrain Insights automatically identifies ongoing development when an ABAP Object is added to a transport in the SAP Development system. For each ABAP Object, the Insights tool checks and displays a number of findings, depending on the datasources available.

The Developer Dashboard is available for both Ongoing Development and Development already Deployed to Production.

The Deployment Dashboard

This dashboard shows all developments: both open tasks and development which has been released for testing but not yet deployed to production.

For each of the ABAP Objects on the Deployment Dashboard you can filter on a number of criteria.

The OPS Dashboard

The OPS dashboard gives you insights into the stability and performance of the productive environment.

The dashboard is divided into the following areas:

Deployed Development

The deployment list displays all development objects which have been deployed into the productive system.

The purpose of this deployment list is to identify potential issues which might need to be addressed or fixed.

The deployment list focuses on a number of potential issue types.

Dumps

The Dumps dashboard will give you detailed insights into the dumps found in the productive system.

In Gekkobrain Insights you can browse dump information for a longer timeframe than what is possible in SAP transaction ST22. This means you can see the stability trend in your system.

You can also see all dumps grouped by dump-type category. The category can often tell you how or where to spend time on dump resolution: for example, programs dumping with RFC_NO_AUTHORITY could be failing due to missing or incorrect authorization setup, while TSV_TNEW_PAGE_ALLOC_FAILED relates to very high memory allocation in a program.

For each dump category you can see all the related programs, how often the dump occurred and the number of users affected by the dump.

Furthermore, you can drill down to the dump details for specific dumps within a specific program. Here you can see the code fragment causing the dump, how often the dump occurs, how many users were affected, which version of the ABAP code introduced the dump and whether there are dependencies to other programs which might “inherit” the dump.
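
To picture the aggregation behind this dashboard, here is a small Python sketch that groups dump records by category and program and counts occurrences and affected users. The record layout is illustrative, not the actual Gekkobrain schema.

    from collections import defaultdict

    # Illustrative dump records: (category, program, user)
    dumps = [
        ("RFC_NO_AUTHORITY", "ZSALES_SYNC", "JDOE"),
        ("RFC_NO_AUTHORITY", "ZSALES_SYNC", "MSMITH"),
        ("TSV_TNEW_PAGE_ALLOC_FAILED", "ZREPORT_BIG", "JDOE"),
    ]

    stats = defaultdict(lambda: {"count": 0, "users": set()})
    for category, program, user in dumps:
        entry = stats[(category, program)]
        entry["count"] += 1
        entry["users"].add(user)

    for (category, program), entry in sorted(stats.items()):
        print(f"{category} / {program}: "
              f"{entry['count']} dumps, {len(entry['users'])} users affected")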

App. Stats

App. Stats gives you detailed information about the usage, performance and memory consumption of the SAP transactions with the highest execution rate.

The overview dashboard shows the App. Stats for the Top 50 most executed transactions. You can see the overall trend for the average response time and memory consumption of these transactions, as well as their usage trend. Furthermore, you can select a specific module or a specific transaction to see the trends for that sub-area only. From the overview you can drill down to either the SQL queries involved in the transaction or the users running the transaction.

If you drill down into “Queries” you will see all related SQL fragments found within the transaction, together with the corresponding performance for these fragments. This helps developers pinpoint which SQL fragments are real performance-improvement candidates.
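
Conceptually, the overview is a "group by transaction, rank by executions, take the Top 50" aggregation. A toy Python sketch with an illustrative record layout:

    from collections import defaultdict

    # Illustrative stat records: (transaction, response_ms, memory_mb)
    records = [("VA01", 420, 95), ("VA01", 380, 90), ("ME21N", 510, 120)]

    totals = defaultdict(lambda: {"runs": 0, "resp_ms": 0.0, "mem_mb": 0.0})
    for tcode, resp_ms, mem_mb in records:
        t = totals[tcode]
        t["runs"] += 1
        t["resp_ms"] += resp_ms
        t["mem_mb"] += mem_mb

    # Rank by execution count and keep the Top 50.
    top50 = sorted(totals.items(), key=lambda kv: kv[1]["runs"], reverse=True)[:50]
    for tcode, t in top50:
        print(f"{tcode}: {t['runs']} executions, "
              f"avg {t['resp_ms'] / t['runs']:.0f} ms, "
              f"avg {t['mem_mb'] / t['runs']:.0f} MB")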

Web Stats

Web Stats monitors how your SAP services and endpoints are performing, using data from both the SAP Gateway and the SAP backend system.

The overview dashboard shows the Web Stats for the Top 50 most executed services. You can see the overall trend for the average response time and memory consumption of these services, as well as their usage trend. These services are often consumed by apps or applications outside of SAP, so insight into their performance and number of users is very important.

For each of the services you can drill down to see the individual performance and usage of the underlying endpoints in the service.

If you drill down to user level, you can see the user experience for the particular service: how often the user has executed the service and the average response time for the service.

Batch Stats

Batch Stats monitors all programs running as part of batch jobs in your system. Batch jobs are often used for time-consuming tasks or tasks which should be repeated automatically in the system.

The overview dashboard shows the Batch Stats for the Top 50 most executed programs. You can see the overall trend for the average response time and memory consumption of these programs, as well as their usage trend. It is important to monitor the programs in batch jobs: they run in the background of the system, without any user dialog, so you might miss the feedback if a program's response time or number of executions increases. Both scenarios could lead to an overall decrease of performance in the system.

If you select a specific program you can see the trend for that specific program: how its performance is developing, and whether its number of executions has grown in the system.
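
The kind of check this enables is a simple baseline comparison, as in the Python sketch below; the threshold and numbers are illustrative.

    def runtime_regressed(baseline_ms, recent_ms, threshold=1.25):
        """Flag a program whose recent average runtime exceeds the
        baseline average by more than the threshold factor (25% here)."""
        return recent_ms > baseline_ms * threshold

    # A nightly job that used to average 40 minutes now averages 55 minutes.
    print(runtime_regressed(40 * 60_000, 55 * 60_000))  # True -> worth a look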

SQL

The SQL dashboard presents valuable SQL information collected from your productive environment, giving you insights into the performance of individual SQL fragments.

The SQL Stats show the Top 50 most executed roots in your system. You can decide whether you want to see the overall list based on both SAP and Custom Code, or only one of these categories. If an SAP standard root contains elements of Custom Code, it is still listed under Custom Roots.

When you drill down on a selected root, you will find one or more levels (e.g. subroutines, functions or methods). At the lowest level you can see the SQL fragments at that specific level. Use the SQL fragments to determine where there might be an improvement candidate.
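
The root/level/fragment drilldown is essentially a tree walk. A small Python sketch with an illustrative nested structure (not the actual Gekkobrain data model):

    # An illustrative root with nested levels; each level can hold SQL fragments.
    root = {
        "name": "ZSD_ORDER_PROCESSING",
        "fragments": [],
        "levels": [
            {"name": "FORM check_credit", "levels": [],
             "fragments": ["SELECT * FROM but000 WHERE partner = ?"]},
            {"name": "METHOD lcl_pricing=>run", "levels": [],
             "fragments": ["SELECT price FROM zprices WHERE matnr = ?"]},
        ],
    }

    def walk(node, depth=0):
        """Print each level, then the SQL fragments found at that level."""
        print("  " * depth + node["name"])
        for fragment in node["fragments"]:
            print("  " * (depth + 1) + "SQL: " + fragment)
        for child in node["levels"]:
            walk(child, depth + 1)

    walk(root)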

The Solution Component Builder

If you are planning to migrate to S/4HANA, for example, you might consider doing a selective brownfield migration of your Custom Code. Instead of spending a lot of time fixing simplification issues in the entire Custom Code base, you focus on the selected Custom Code elements which are relevant to support a given set of processes in the new system. But how do you make sure that you capture all the right Custom Code elements, and how do you check that all dependencies are covered as well?

The answer is the Gekkobrain Solution Component Builder. The purpose of this service is to bundle all the relevant Custom Code elements into a so-called Solution Component. To create a new solution component you can use two different types of input:

Using Processdata from Gekkobrain Flows

When you define a business process in Gekkobrain Flows, you also get the series of roots which are part of the business process. If a root is custom, Gekkobrain can find all the related custom code elements and transfer them to your solution component. If a root is SAP standard, you could still have custom code elements which are relevant to capture in the solution component (e.g. BAdIs or user exits). We extract both kinds of elements from the selected Flow.

Using SQLM data from Gekkobrain Insights

As an alternative to Gekkobrain Flows, you can also use SQLM data as a datasource. Start by creating a manual Solution Component in your S/4HANA project. Then determine some of the known custom code elements you plan to capture and add them to the manual solution component. With this as input, you can use the Solution Component Builder to find clusters of roots or other custom code elements which are related and should be added to the same solution component.
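
In graph terms, finding such clusters is finding connected components over a "uses/calls" relation between custom code elements. A minimal Python sketch; the dependency pairs are illustrative.

    from collections import defaultdict

    # Illustrative "uses/calls" pairs between custom code elements.
    edges = [
        ("ZSD_ORDER", "ZCL_PRICING"),
        ("ZCL_PRICING", "ZPRICES_READ"),
        ("ZMM_GOODS", "ZSTOCK_CHECK"),
    ]

    graph = defaultdict(set)
    for a, b in edges:
        graph[a].add(b)
        graph[b].add(a)

    def cluster(seed):
        """Return every element reachable from a known custom code element."""
        seen, stack = set(), [seed]
        while stack:
            node = stack.pop()
            if node not in seen:
                seen.add(node)
                stack.extend(graph[node] - seen)
        return sorted(seen)

    # Seed with a known element; the resulting cluster is a candidate
    # solution component.
    print(cluster("ZSD_ORDER"))  # ['ZCL_PRICING', 'ZPRICES_READ', 'ZSD_ORDER']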