
Table of Contents:

Installation

Latest release

This documentation refers to the latest release of the Gekkobrain Insight Extractors

Download the Insight Extractors

The Insight Extractors are part of the Gekkobrain Extractor Framework. Use the following guidelines to download the package.

How to setup and run the Insight Extractors

After setting up the cloud, it is time to set up the jobs in the backend.

Running the extractor in RFC mode or local mode

Schedule the jobs as recurring jobs with the frequency mentioned previously. The cloud needs data from all the systems that you want to monitor. However, since all the extractors that need data from your productive environment can receive that data via an RFC call to production, you do not have to run the batch job in production, which means you do not have to transport the extractors to production. The only requirement is that your dev environment has an RFC destination pointing to production.

If you prefer to run the batch jobs for production locally in production, then you must transport the extractor to production. Leaving the RFC field blank automatically switches the extractor to local mode.

Maintain and test your RFC connection in transaction SM59 before proceeding.

The extractors all have a parameter for specifying the RFC destination. If you leave the field empty, the extractor will assume that the data is located in the current system.
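The mode selection described above can be sketched as follows. This is an illustrative Python sketch of the decision logic only, not the actual ABAP implementation; the function name is hypothetical.

```python
def resolve_mode(rfc_destination: str) -> str:
    """Decide where the extractor reads its data from.

    An empty RFC destination means local mode (read from the
    current system); a non-empty value means the data is fetched
    via an RFC call to that destination, e.g. one pointing at
    production.
    """
    if rfc_destination.strip():
        return f"RFC mode: read from destination '{rfc_destination}'"
    return "local mode: read from the current system"
```

With this logic, `resolve_mode("PRDCLNT010")` selects RFC mode, while `resolve_mode("")` falls back to local mode, matching the two example configurations below.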

Example configuration

example for RFC

Parameter Value
RFC-Destination PRDCLNT010

example for local mode

Parameter Value
RFC-Destination  

Batchjob variants

In the following section all the extractors are detailed except HANAREADY and S4ESTIMATE, as these are detailed in the assessment part of this documentation.

DATASOURCE: TRANSPORT

The transport datasource receives data from the SAP extractor named "/GEKKOBR/EXTRACT_TRKORR_INFO".

Go to SE38 to save 4 variants for this report.

1) Current day's changes (the hourly frequency) sending data from your Dev system
2) Current day's changes (the hourly frequency) sending data from your Ops (prod) system
3) A full dataload (the daily job) sending data from your Dev system
4) A full dataload (the daily job) sending data from your Ops (prod) system

Example configuration

An example of recommended OPS system selection screen values is explained below:

Parameter Value
Prod. Mode X
Full Load X
URL Subdomain admin
Endpoint inbound-data
URL Domain gekkobrainhosted.com
SSL id see section regarding SSL id
Proxy host see section regarding http proxy setup
Proxy service see section regarding http proxy setup
API Key Taken from the “system list” in the cloud
SAP System ID Prefills with your sysid - remember to replace it if you are using RFC mode
Account Number Taken from "account information" menu in the cloud
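The three URL parameters (subdomain, domain, endpoint) presumably combine into a single upload URL. The sketch below shows that assumption in Python; the HTTPS scheme and the path layout are assumptions, not confirmed by this documentation, so verify the actual target URL in the cloud admin UI.

```python
def build_upload_url(subdomain: str, domain: str, endpoint: str) -> str:
    # Hypothetical reconstruction of how the three URL parameters
    # might combine into the endpoint the extractor posts to.
    return f"https://{subdomain}.{domain}/{endpoint}"

# With the example values above:
url = build_upload_url("admin", "gekkobrainhosted.com", "inbound-data")
```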

An example of recommended DEV system selection screen values is explained below:

Parameter Value
Dev. Mode X
Full Load X
URL Subdomain admin
Endpoint inbound-data
URL Domain gekkobrainhosted.com
API Key Taken from the “system list” in the cloud
SAP System ID Prefills with your sysid
No of lines in payload 200
Account Number Taken from “account information” menu in the cloud
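The "No of lines in payload" parameter presumably controls how many rows are sent per HTTP request to the cloud. The chunking idea can be sketched as follows; this is illustrative only, as the actual batching is done inside the ABAP extractor.

```python
def chunk_payload(rows, lines_per_payload=200):
    """Split extracted rows into payloads of at most
    `lines_per_payload` rows each, mirroring the
    'No of lines in payload' selection screen parameter."""
    for i in range(0, len(rows), lines_per_payload):
        yield rows[i:i + lines_per_payload]
```

For example, 450 extracted rows with the recommended value of 200 would be sent as three payloads of 200, 200, and 50 rows.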

DATASOURCE: CROSSDEPEND

The crossdepend datasource receives data from the SAP extractor named "/GEKKOBR/EXTRACT_WHERE_USED".

Go to SE38 to save a variant for this report.

The recommended Dev system selection screen values are explained below:

Parameter Value
URL Subdomain admin
Endpoint inbound-data
URL Domain gekkobrainhosted.com
API Key Taken from the “system list” in the cloud
SAP System ID Prefills with your sysid - remember to replace it if you are using RFC mode
Account Number Taken from “account information” menu in the cloud

DATASOURCE: APPSTATS

The appstats datasource receives data from the SAP extractor named "/GEKKOBR/EXTRACT_PERFORM_DATA". This extractor serves 2 different datasources, namely APPSTATS and WEBSTATS. WEBSTATS is detailed afterwards.

Go to SE38 to save a variant for this report.

The recommended Dev system selection screen values are explained below:

Parameter Value
Appstats X
Yesterdays data X
URL Subdomain admin
Endpoint inbound-data
URL Domain gekkobrainhosted.com
API Key Taken from the “system list” in the cloud
SAP System ID Prefills with your sysid - remember to replace it if you are using RFC mode
Account Number Taken from “account information” menu in the cloud

Other date-spans of data

There are 2 different scenarios in which it makes sense to load data other than yesterday's:

1) If you are running the extractor for the first time and want as much historical data in the cloud as possible
2) If the job sending data was suspended for one or more days
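For the catch-up scenario, a specific-date load can be repeated once per missed day. A small sketch of generating those dates in the DD.MM.YYYY format used by the Date field (illustrative Python; the function name is hypothetical):

```python
from datetime import date, timedelta

def missed_dates(last_loaded: date, today: date):
    """Yield every date between the last successful load and
    today (both exclusive), formatted as DD.MM.YYYY for the
    'Date' field of a specific-date load."""
    d = last_loaded + timedelta(days=1)
    while d < today:
        yield d.strftime("%d.%m.%Y")
        d += timedelta(days=1)
```

If the job last ran on 18.12.2012 and today is 21.12.2012, this yields 19.12.2012 and 20.12.2012, each of which would be loaded with its own specific-date run.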

Example configuration

example of a full load

Parameter Value
Full load X

example of a specific date load

Parameter Value
Specific date X
Date 20.12.2012

DATASOURCE: WEBSTATS

The webstats datasource receives data from the SAP extractor named "/GEKKOBR/EXTRACT_PERFORM_DATA". This extractor serves 2 different datasources, namely APPSTATS and WEBSTATS.

Depending on whether the OData is served from an embedded gateway or a central-hub gateway, set up WEBSTATS to monitor the system where the HTTP calls are logged.

The cloud will display the WEBSTATS activity on each Ops/prod system, not on the central hub, even though that may be where you are retrieving it from.

The performance measures are relevant for ABAP and database performance, and these resources are taken from the Ops/prod systems' resources.

Go to SE38 to save a variant for this report.

The recommended Dev system selection screen values are explained below:

Parameter Value
Webstats X
Yesterdays data X
URL Subdomain admin
Endpoint inbound-data
URL Domain gekkobrainhosted.com
API Key Taken from the “system list” in the cloud
SAP System ID Prefills with your sysid - remember to replace it if you are using RFC mode
Account Number Taken from “account information” menu in the cloud

Other date-spans of data

There are 2 different scenarios in which it makes sense to load data other than yesterday's:

1) If you are running the extractor for the first time and want as much historical data in the cloud as possible
2) If the job sending data was suspended for one or more days

Example configuration

example of a full load

Parameter Value
Full load X

example of a specific date load

Parameter Value
Specific date X
Date 20.12.2012

DATASOURCE: DUMP

The dump datasource receives data from the SAP extractor named "/GEKKOBR/EXTRACT_DUMP_INFO".

Go to SE38 to save a variant for this report.

The recommended OPS system selection screen values are explained below:

Parameter Value
# of past days from yesterday 1
Include ABAP that dumps X
Include Call Stack  
URL Subdomain admin
Endpoint inbound-data
URL Domain gekkobrainhosted.com
API Key Taken from the “system list” in the cloud
SAP System ID Prefills with your sysid - remember to replace it if you are using RFC mode
Account Number Taken from “account information” menu in the cloud

Note: If you are running this for the first time and would like to include the historic dumps retrospectively (which we recommend), run this as a batch job once with 30 instead of 1 in the "# of past days from yesterday" field.

Parameter Value
# of past days from yesterday 30

Verification / troubleshooting the extractor and data reception

Under the Datasources list menu you will be able to view the status of your datasources. As data starts to flow in you will be able to verify it in this list.

The statuses:

Status Description
No data received If you have already set up your job please check SM37 for errors
Receiving data The refresh is in progress; typically this takes only a bit more time than the job time in SAP
2018-09-15 12:33:57 (GMT) An example of a timestamp marking that the final pieces of that particular message series have been received and processed. The cloud region determines the time zone shown after the date and time.

Here's an example where no data has yet been received.