“Process mining utilizes the treasure that data represents in a company”
In 2016, together with Thomas Baier and Karina Buschsieweke, Rami founded Lana Labs, a company specializing in automating the analysis of production and business processes through Artificial Intelligence, with the goal of making companies smarter, more efficient, and faster. LANA Process Mining is the crown jewel of a firm with tremendous potential, as the recent acquisition by Appian shows.
What is process mining and what problems does it solve?
Process mining, in general, is a technology that taps into the treasure of data. It’s a kind of X-ray for a company’s business processes. Just as a doctor would scan a person and see if something is broken, we do the same with the business processes in a company. For this we use digital traces (data) from IT systems like ERP systems, manufacturing execution systems, and so on. Each manual or automatic entry in such a system is usually stored in the system’s database and contains a timestamp. We use these timestamped digital traces to identify the processes and how they are executed.
The integration with Appian allows us to deliver the continuous optimization lifecycle within a single platform
This provides transparency for the company and shows how the processes were actually executed, across hundreds of thousands of executions and all their variations. With these facts and figures, one can then identify optimization potential, risk potential, automation potential, standardization potential, and more, in a continuous way. So, in short, we collect data and automatically provide customers with facts about where the process is slow, consumes a lot of resources, is not standardized, or does not adhere to the target model (the way it is supposed to be executed).
How do you gather all this data?
In our case, the data is actually there in most cases. Most companies, for example, use SAP, Oracle or another common IT system like a manufacturing execution system or incident management system. Whenever these systems perform a task, they record a timestamp in the database, and we use this data to create an event log. It doesn’t need to be a regular log file. We could also use that, but in most cases we take data out of relational databases and, in general, we look at timestamps that represent events and activities.
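To illustrate, here is a minimal sketch of what extracting such an event log can look like. The connection string, table, and column names are hypothetical, chosen only to show the three essential columns every event log needs: a case identifier, an activity name, and a timestamp.

```python
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical connection and table names; real systems (SAP, Oracle, ...)
# have their own schemas and usually need vendor-specific connectors.
engine = create_engine("postgresql://user:pass@erp-host/erp_db")

# An event log needs at least three columns: a case identifier,
# an activity name, and the timestamp the system recorded.
event_log = pd.read_sql(
    """
    SELECT order_id    AS case_id,
           status_text AS activity,
           changed_at  AS timestamp
    FROM order_status_history
    ORDER BY order_id, changed_at
    """,
    engine,
)
print(event_log.head())
```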
To connect to these systems we use connectors, and then we need to identify the database tables we require. In most cases you will need to do some sort of data transformation to bring the data into a format our algorithms can read and interpret, so that they can automatically create process graphs and provide fast insights into your processes. For example, let’s say an order has been received, then it was sent to production, and the production line produced it. Then the goods were handed over to logistics for delivery, and an invoice was sent and later paid. These are the things you are actually interested in, and they are all stored in your databases.
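To give a feel for the discovery step, here is a toy sketch of how a process graph can be derived from such an event log by counting which activity directly follows which within each case. The data is invented, and production discovery algorithms are considerably more sophisticated than this.

```python
from collections import Counter

import pandas as pd

# Toy event log following the order example above (hypothetical data).
event_log = pd.DataFrame({
    "case_id":  [1, 1, 1, 1, 2, 2, 2],
    "activity": ["Order received", "Sent to production", "Goods delivered",
                 "Invoice paid", "Order received", "Sent to production",
                 "Invoice paid"],
    "timestamp": pd.to_datetime([
        "2021-01-04", "2021-01-05", "2021-01-08", "2021-01-20",
        "2021-01-06", "2021-01-07", "2021-01-25",
    ]),
})

# Count directly-follows pairs per case: these are the edges
# of the discovered process graph.
edges = Counter()
for _, trace in event_log.sort_values("timestamp").groupby("case_id"):
    activities = trace["activity"].tolist()
    edges.update(zip(activities, activities[1:]))

for (src, dst), count in edges.items():
    print(f"{src} -> {dst}: {count}")
```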
What do you do with all of this information?
There are obviously different use cases and everything depends a little on what you’re looking at, but in most cases it’s about performance. For example, a customer might want to reduce the cycle times of a process, or to standardize it to get rid of too many variants. This standardization of the process flow sometimes is to prepare it for automation, and to introduce bots or workflow systems. But it also can be for system migrations, compliance, reduction of rework and similar improvement activities.
Often companies already have some idea of how the process is supposed to be executed. I mean, Business Process Modeling was big for a long time and still is. With our software, you can connect a Best Practice model and get an automatic diagnosis comparing the target (process model) with the as-is process (the automatically discovered process graph). For example, to find out in which area the process is not executed as it is supposed to be, or where activities are being skipped (e.g. Quality Control, which could possibly result in a bad product).
In most cases it is all about improving performance
With our automatic diagnosis, based on the target model or Best Practice model, we can automatically show companies how often they deviate from the target model and, in the end, help them define optimization methods and measures.
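A simplified illustration of the idea behind such a target/actual comparison (LANA’s conformance checking goes far deeper than this sketch): encode the target model’s allowed transitions and flag every observed transition that deviates, which also reveals skipped activities like the Quality Control example above.

```python
# Hypothetical target model, expressed as the allowed
# directly-follows transitions between activities.
target_model = {
    ("Order received", "Quality control"),
    ("Quality control", "Sent to production"),
    ("Sent to production", "Goods delivered"),
    ("Goods delivered", "Invoice sent"),
}

# One observed trace in which Quality Control was skipped.
observed_trace = ["Order received", "Sent to production",
                  "Goods delivered", "Invoice sent"]

# Flag every observed transition the target model does not allow.
for src, dst in zip(observed_trace, observed_trace[1:]):
    if (src, dst) not in target_model:
        print(f"Deviation: '{src}' -> '{dst}' is not in the target model")
```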
Which part do Machine Learning and Artificial Intelligence play in all of this?
Our algorithms not only look at the process at hand, but also at the accompanying data, in order to identify patterns, for example, to understand why a confirmation has been skipped or why some deliveries were late. To identify areas of interest and the root causes of problems, we use ML and AI to understand why this actually happens, so that analysts don’t have to dig through the data themselves. In general, we want to make it as easy as possible to identify problems and their root causes, so that you’re able to resolve them as easily as possible.
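As a rough sketch of this kind of automated root-cause analysis (not LANA’s actual algorithm): train an interpretable model on the case attributes that accompany the event log, then read off which attributes best explain, say, late deliveries. All attribute names and values here are invented.

```python
import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

# Hypothetical case attributes joined to the outcome of interest.
cases = pd.DataFrame({
    "supplier": ["A", "A", "B", "B", "B", "A"],
    "priority": [1, 2, 1, 2, 2, 1],
    "late":     [0, 0, 1, 1, 1, 0],   # 1 = delivery was late
})

features = pd.get_dummies(cases[["supplier", "priority"]])
tree = DecisionTreeClassifier(max_depth=2).fit(features, cases["late"])

# The learned rules point the analyst at likely root causes,
# e.g. "deliveries from supplier B tend to run late".
print(export_text(tree, feature_names=list(features.columns)))
```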
Another important part is predicting what will happen next. If you think about delivery, for example, it is important to identify which processes are currently running and whether they will run late, so that you can act before things happen.
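In the same spirit, here is a toy sketch of scoring a running case for lateness risk, based on simple features of the partial trace so far. The features and data are hypothetical; real predictive process monitoring uses much richer models.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical training data: features of partially executed cases
# and whether they eventually finished late.
history = pd.DataFrame({
    "hours_elapsed":   [4, 30, 6, 48, 10, 52],
    "steps_completed": [3, 2, 4, 1, 3, 2],
    "finished_late":   [0, 1, 0, 1, 0, 1],
})

model = LogisticRegression().fit(
    history[["hours_elapsed", "steps_completed"]],
    history["finished_late"],
)

# Score a case that is still running, so action can be taken early.
running_case = pd.DataFrame({"hours_elapsed": [36], "steps_completed": [2]})
print("P(late) =", model.predict_proba(running_case)[0, 1])
```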
Who is your usual contact person at your customer’s companies?
We get questions from different areas. Of course they come from management at a high level, because they want to steer their processes actively and increase efficiency or customer satisfaction. But they also come from the operational level, often driven by department heads, for example. In Finance, these would be the people in charge of Procurement or Sales processes. In manufacturing, it’s often a global production lead who wants to standardize the production process across different sites.
How long does it take to deploy your systems?
In general, deploying the system in the cloud is immediate, and on premise most people take between two and four hours to set it up once a virtual server is available. Then, depending on the complexity of the data, the project may run between 2-3 weeks and 2-3 months to gain the first insights. All of this depends greatly on the complexity and number of systems involved, for which we have to do a data transformation. You may have to normalize data from different IT systems, bring them together, and validate them, which is very important.
Is there any impact on the customer systems while your system is running?
No. Of course, depending on how the system is configured, you may need to adjust the extraction rate so that you don’t pull everything at once, especially if it’s live data or live production systems. But for the most part, there isn’t any noticeable impact on the customer’s systems while we are running our product.
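For example, a batched pull with a pause between chunks keeps the load on a live source system low. This is a generic sketch with invented table names; connector details differ per system.

```python
import time

import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql://user:pass@erp-host/erp_db")

BATCH = 10_000          # rows per pull
offset = 0
while True:
    chunk = pd.read_sql(
        f"SELECT * FROM order_status_history "
        f"ORDER BY changed_at LIMIT {BATCH} OFFSET {offset}",
        engine,
    )
    if chunk.empty:
        break
    # ... append chunk to the event log store ...
    offset += BATCH
    time.sleep(1)       # throttle so the live system is not saturated
```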
What is the advantage of using LANA instead of other, similar, technologies?
I will highlight a few areas. There is, for example, the target/actual analysis (the conformance checking), and especially how we do it. The difference is in the depth: the results we obtain are much better than our competitors’, because we are able to provide insights into where the customer deviates from his target model and the type of deviations that occur.
That is one area, and the other one is the automatic analysis. This is actually one of the reasons why we started the company. Thomas Baier and I finished our PhDs in Process Mining and Business Process Optimization in 2015. At that time, we looked at the process mining market and saw that there was a lot of innovation in research, but these innovations had not been implemented in any process mining tools yet. So, we decided we should bring these innovations to the market.
So, automation and the automation of analysis are two things we are very keen on, because we want to bring Process Mining and Data Preparation to normal business users, so that they don’t need to be process or data experts in order to improve their company. With our help, they can figure out what to do to, for example, improve their efficiency by 30%.
Our low-code/no-code data preparation module brings the data preparation that has to take place before the analysis to business users. With other competitors, you need people with an IT background to do the data transformation. But with our new low-code UI, you don’t need that anymore: just click, and you can transform the data and analyze it. On the other hand, we have an open API. That way, customers are able to reuse what they create, connect it to other systems, and even build new things.
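To illustrate the reuse-and-connect idea behind an open API, here is a hypothetical sketch. The base URL, endpoints, and payloads below are invented for illustration only and are not LANA’s actual API.

```python
import requests

# Hypothetical endpoints; LANA's real API will differ.
BASE = "https://lana.example.com/api"
HEADERS = {"Authorization": "Bearer <token>"}

# Upload an event log prepared elsewhere...
with open("event_log.csv", "rb") as f:
    upload = requests.post(f"{BASE}/logs", headers=HEADERS,
                           files={"file": f})
log_id = upload.json()["id"]

# ...and pull the discovered process graph for use in another system.
graph = requests.get(f"{BASE}/logs/{log_id}/process-graph",
                     headers=HEADERS).json()
print(graph)
```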
How do you integrate with Appian and what’s the benefit of having you work together?
When we were working with our customers, we noticed that once we had done the analysis and shown great insights and improvement potential, the question came up: What’s next? How can we realize it? That is where our initial offering stopped, because we are experts in process analysis and in finding out what you need to improve, but then you actually have to improve it.
This is the core of Appian’s automation platform: improving processes with digitization and automation. Now we can offer the complete lifecycle of discovery, design, and automation within one platform. First there is the analysis to discover the problems, then the realization of improvement potential via workflows and automation, then the monitoring of the improvements, and further optimization with automation, all in a continuous cycle to keep everything in shape and improving.
We are able to provide information on where the deviations are and of what type they are
So, together we provide the most complete low-code automation platform, which takes you from discovery to design to automation and then combines it all in a circle. In this way, we are able to provide the continuous optimization lifecycle within one platform. This means that we first come in to gather all the “intelligence” on the company’s systems, and after that we use Appian’s platform to do the fine-tuning and automation of the process enhancement and optimization.
At the moment our main goal is bringing both technologies together in one platform. Appian stands for reliability, stability, and performance. Currently we’re bringing the standards to the same level, and then we will create more and more synergies.
What’s the average timeframe a customer has to wait to see the first improvements?
I think this always depends, of course, on the process itself. There is no definite and unique answer to that. For example, in manufacturing you may have a very different solution space: introducing automation at a station may take more time than automating a task in a financial institution that already has all its processes fully digitized. So, it is hard to give a concrete number, because it really depends on the process and the domain you’re looking at. It can range from a couple of days to a couple of months, really.
Our algorithms not only look at the process at hand, but also at the data that goes with it
However, you can work concurrently in different ways. You can, for example, start with process mining first and then do the automation, or you can improve the implementation while running our software on the side, or you can do the monitoring afterwards to keep an eye on the processes, in order to find new potential.