We at Gekkobrain spend most of our time creating automation and SAP DevOps and upgrade tools. We do that to free up our customers' time for the really important and complicated tasks: tasks that require skilled people to make informed decisions about a solution. Historically we have focused on tools that keep the digital core clean by optimizing the ABAP code and the database SQL, on code compliance and application performance management. At some point we realized that understanding the flow of data and the interaction between modules and different pieces of software is just as important as optimizing code performance if a team is to excel at delivering good software.
If you are not familiar with our products, it suffices to say that we thrive on gathering as many metrics about a system as we can, analyzing that data in as much detail as we can, and applying advanced algorithms to connect and correlate those metrics in cause-and-effect schemas. We use all this analysis to automatically improve the custom ABAP and SQL that runs in a system, and to automatically convert custom code during SAP HANA upgrades. We then revisit the metrics after such a code fix to reinforce our data-driven approach, making the system more and more reliable and ensuring that it runs as intended after the upgrade.
The knowledge we gather about a system gives our robots confirmation that the system is running correctly. And since we upgrade everything, our approach is to look at everything afterwards: SQL, performance metrics or, most recently, business flows. Each of these individually helps us know whether everything went according to plan.
Once the SAP data has been transferred to FLOWS and the Machine Learning has completed its data processing steps, you will immediately be able to see your business flows. There is no need to upload business blueprints or other process templates for the tool to present you with its findings. If you wish to suggest to the system which document flows are of special interest to you, you may classify documents through Excel uploads or by providing other data sources that feed extra information about your documents into FLOWS. Or you can simply explore your document flows in the dashboard and discover variances you never knew existed.
FLOWS can even detect and measure whether your system seems to be receiving mass updates from third-party software, which often makes it hard to fully understand the maturity of your business processes.
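FLOWS' actual detection logic is not public, but the basic idea behind spotting automated mass updates can be sketched simply: human users rarely change more than a handful of documents per minute, so a sudden burst of changes in a single minute is a strong signal of third-party software writing into the system. A minimal sketch, with a hypothetical per-minute threshold:

```python
from collections import Counter
from datetime import datetime

def flag_mass_update_bursts(change_timestamps, threshold=100):
    """Flag minutes with an unusually high number of document changes,
    a crude signal of automated mass updates from external software.

    change_timestamps: iterable of datetimes, one per change record.
    threshold: max changes per minute expected from human users
               (hypothetical cut-off; tune per system).
    """
    # Truncate each timestamp to its minute and count changes per minute.
    per_minute = Counter(ts.replace(second=0, microsecond=0)
                         for ts in change_timestamps)
    return sorted(minute for minute, n in per_minute.items() if n > threshold)

# 500 changes within one minute at 03:15 clearly exceed the threshold;
# 30 changes spread over half an hour of office time do not.
burst = [datetime(2023, 1, 5, 3, 15, s % 60) for s in range(500)]
normal = [datetime(2023, 1, 5, 9, m) for m in range(30)]
print(flag_mass_update_bursts(burst + normal))
# -> [datetime.datetime(2023, 1, 5, 3, 15)]
```

In a real system one would of course also look at which user, interface and document types the burst touches, but even this crude per-minute count separates batch interfaces from human activity surprisingly well.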
Business Process Monitoring has always been at the heart of SAP's mission, and traditionally that monitoring was centered around templates. Systems of control require rules, after all, and business processes are a complicated set of rules. However, those templates are rarely really used. To use the previous analogy: having not an X-ray but only an atlas of anatomy does not help you sufficiently to know where to prioritize your work.
If templates do not reflect the variance and the complexity of user and software behavior, then the process schematics alone are not enough insurance to navigate an upgrade.
Greenfield discussions easily end up as Brownfield projects because the templates are inaccurate. And once templates start to show signs of inaccuracy, the pre-upgrade study invariably starts to calculate how much risk and time to assign to finding out what was assumption and what was fact.
We find ourselves back at the X-ray machine. Knowing exactly what is going on before planning something as complicated as an S/4 journey means that decisions about which groups of custom enhancements to keep for a process to stay intact, and which to throw away, can be entirely fact-based.
But don’t think that FLOWS is only for custom code migration. Getting this clear picture of what is actually happening also impacts service development in a profound way.
This is why we think of FLOWS as DevOps for your functional experts: the people in your SAP department, or even on your business side, who are the process experts and domain experts in Procurement, in Warehouse Management, in Supply Chain Management, in Sales and Distribution, in any SAP module of course, and in any Line of Business. The people who write the functional specifications for the developers, in conference with the Business Process Experts from the business side of the company. The architects and the application testers. In the creation phase of a solution it is often very time consuming to outline the solution, and if the solution is to build on a bigger pre-existing solution, standard or custom, aligning the new business requirement with SAP can take a lot of analysis. The same goes for the burn-in period after go-live, for the time after the first rollout wave has settled, or for when a business suddenly slows down and no one seems to understand why. These and many other situations are time consuming, and often assumptions lead the way rather than the whole picture.
When a functional expert discovers what actually happens with all the data flows within a solution area, how common hitherto unknown flow variants are, and which enterprise dimensions are present, a strange thing happens: those same experts start to drill into the flow variants, thinking of them not as errors but as interesting discoveries. They might be errors, but mostly they are simply uncharted territory, and knowing about real system behavior makes the functional specification even better. This is why your transactional data is so valuable: it shines a light on your SAP system, and it provides important feedback on how well your solution has been targeting your users!
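What a "flow variant" means can be made concrete with the standard process-mining idea (FLOWS' own implementation is not public, and the document types below are illustrative): treat each document flow as the ordered sequence of documents it passes through, group identical sequences into variants, and count how often each variant occurs. The rare variants are exactly the uncharted discoveries worth drilling into.

```python
from collections import Counter

def flow_variants(document_flows):
    """Group document flows into variants by their step sequence.

    document_flows: list of flows, each a chronological list of document
    types. Returns (variant, count) pairs, most common variant first.
    """
    return Counter(tuple(flow) for flow in document_flows).most_common()

# Hypothetical procurement flows: the third one skips Goods Receipt --
# the kind of unknown variant a functional expert would want to inspect.
flows = [
    ["Purchase Req", "PO", "Goods Receipt", "Invoice"],
    ["Purchase Req", "PO", "Goods Receipt", "Invoice"],
    ["Purchase Req", "PO", "Invoice"],
]
for variant, count in flow_variants(flows):
    print(count, "x", " -> ".join(variant))
# 2 x Purchase Req -> PO -> Goods Receipt -> Invoice
# 1 x Purchase Req -> PO -> Invoice
```

In practice the interesting part is the long tail: a variant that occurs twice in a million flows is either a defect or a legitimate edge case that no template ever documented.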