Service-Meister modernizes maintenance processes with data-driven technologies

Optimization of service through AI and connected tools

In cooperation with Würth, we are developing advanced data science applications as part of the Service-Meister project to enable data-driven decision-making processes. The Service-Meister project as a whole is a collaboration of 16 partners from industry and research, including implementation partners such as grandcentrix. The project is funded by the German Federal Ministry for Economic Affairs and Energy (BMWi).

Partnership as a speedboat

In this project, Adolf Würth GmbH & Co. KG, together with grandcentrix, forms one of six so-called speedboats, each of which jointly implements a sub-project. As one of three implementation partners in the consortium, we pursue the goal of first developing specialized solutions and then generalizing them. The algorithms and approaches developed will then be made available to German SMEs in the context of the overall project.

Adolf Würth GmbH & Co. KG

Würth is a specialist in assembly and fastening materials with a company history spanning 75 years. With over 400 entities, the world's leading B2B company in its field is represented in 80 countries. In addition to its products, Würth also offers planners, configurators, seminars and other services, such as the Würth MASTERSERVICE. As part of this MASTERSERVICE, Würth maintains, calibrates and repairs around 500 devices per day. The data generated on the individual service cases is therefore not just interesting but essential for further optimizing the service.

grandcentrix GmbH

As a specialist in end-to-end IoT solutions, grandcentrix has its own Data Science department that handles everything related to creating value from data. Since data collection and data use are in many cases the main motivation for connecting devices in the first place, the Competence Center Data Science is involved early in the conception of IoT projects. For example, simulated data is used to develop prototype models and test business cases early on. This allows connectivity parameters, such as sensor types or data frame frequencies, to be chosen with the later, optimal use of the data already in mind.

For our speedboat, and thus for the more efficient use of service data, we have jointly set ourselves a number of goals: based on metadata, the economic viability of a repair should be assessable even before the device is sent in; correlations between damage and the areas in which a machine was used should become recognizable; and equipment downtime should be minimized by accelerating processes with data.

Predicting Service Cases

The goal is to forecast the course or outcome of any service case as early as possible. To achieve this, we first analyze and cluster historical data from the MASTERSERVICE. Then, different models for predicting the case outcome are tested, including a Natural Language Processing (NLP) model trained on the free-text descriptions of the tickets. In other models, additional features such as device type, age and previous repair history are also considered. As a result, the prototype can predict the most frequent causes of faults with very high accuracy - even before the defective device arrives at Würth.
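As a minimal sketch of such a metadata-based model (assuming a tabular export of historical tickets with hypothetical column names such as device_type and fault_cause, not the actual MASTERSERVICE schema), the prediction could look like this:

    # Minimal sketch: predict the fault cause of a service ticket from metadata.
    # Column names and the CSV file are hypothetical, not the real MASTERSERVICE schema.
    import pandas as pd
    from sklearn.compose import ColumnTransformer
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import OneHotEncoder

    tickets = pd.read_csv("service_tickets.csv")  # hypothetical export of historical cases

    features = ["device_type", "device_age_years", "previous_repairs"]
    target = "fault_cause"

    preprocess = ColumnTransformer(
        [("device_type", OneHotEncoder(handle_unknown="ignore"), ["device_type"])],
        remainder="passthrough",  # numeric columns are passed through unchanged
    )
    model = Pipeline([
        ("preprocess", preprocess),
        ("classifier", RandomForestClassifier(n_estimators=200, random_state=42)),
    ])

    X_train, X_test, y_train, y_test = train_test_split(
        tickets[features], tickets[target], test_size=0.2, random_state=42
    )
    model.fit(X_train, y_train)
    print(f"Accuracy on held-out tickets: {model.score(X_test, y_test):.2f}")

A tree ensemble is only one possible choice here; the point is that even coarse metadata can be turned into an early prediction before the device physically arrives.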

Data Pipelines and Device Simulation

In order for data science use cases to function reliably in the long term as data volumes grow, a stable architecture for data processing is required. Among other things, grandcentrix relies on pipelines with Apache Airflow, which are executed in Azure. In addition, a device simulator was developed in cooperation with our Competence Center S&I (Systems and Infrastructure), which allows data to be simulated quickly. The data from the device simulator can be used both to load-test infrastructure and to simulate data for new use cases for which not enough real data is available yet. In the Service-Meister context, for example, it is used to evaluate the potential of connected tools.
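The simulator itself is not public; as a rough sketch of the idea (all field names, value ranges and parameters below are illustrative, not those of the real simulator), it generates data frames at a configurable frequency:

    # Rough sketch of a device simulator: emits synthetic sensor readings at a
    # configurable frequency so pipelines can be load-tested before real device
    # data exists. Field names and value ranges are purely illustrative.
    import json
    import random
    import time
    from datetime import datetime, timezone

    def simulate_device(device_id: str, frequency_hz: float, duration_s: float):
        """Yield one JSON data frame every 1/frequency_hz seconds for duration_s seconds."""
        interval = 1.0 / frequency_hz
        end = time.monotonic() + duration_s
        while time.monotonic() < end:
            frame = {
                "device_id": device_id,
                "timestamp": datetime.now(timezone.utc).isoformat(),
                "temperature_c": round(random.gauss(45.0, 5.0), 2),
                "vibration_mm_s": round(abs(random.gauss(2.0, 0.8)), 2),
            }
            yield json.dumps(frame)
            time.sleep(interval)

    if __name__ == "__main__":
        for frame in simulate_device("tool-001", frequency_hz=2.0, duration_s=3.0):
            print(frame)  # in a load test, frames would be sent to the ingestion endpoint instead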

Long Term Goals

Starting in 2020, the individual speedboats have researched and worked on the specific challenges and goals of their respective industry partners. The goal of Service-Meister, and therefore the reason for the BMWi funding, is to transfer the experience and expertise gained from the speedboats' individual use cases into generalized approaches and solutions that can then be deployed in German SMEs across various industries.

Before the end of 2021, all partners of the Service-Meister consortium will begin working together to specify generic services and publish them under an open-source license. The resulting platform is to be made available to SMEs free of charge.

Technical Highlights

Hypothesis Tests

Hypothesis tests are used to check whether abnormalities in the data are statistically significant.
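As an illustrative example with made-up numbers (not project data), a two-sample test can check whether devices from one usage area really fail earlier than those from another:

    # Illustrative example: is the difference in operating hours until failure
    # between two groups of devices statistically significant? Numbers are made up.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    group_a = rng.normal(loc=380.0, scale=40.0, size=50)  # hours until failure, usage area A
    group_b = rng.normal(loc=350.0, scale=40.0, size=50)  # hours until failure, usage area B

    # Welch's t-test does not assume equal variances in the two groups.
    t_stat, p_value = stats.ttest_ind(group_a, group_b, equal_var=False)
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
    if p_value < 0.05:
        print("The difference is statistically significant at the 5% level.")
    else:
        print("No statistically significant difference detected.")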

Anomaly Detection

A parameterizable anomaly detection algorithm that can be applied to arbitrary data.
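The project's own algorithm is not shown here; a minimal sketch of the same idea, with window size and threshold as the tunable parameters, could be a rolling z-score detector:

    # Minimal sketch of a parameterizable anomaly detector: flags points whose
    # rolling z-score exceeds a configurable threshold. Window size and threshold
    # are the parameters; the project's actual algorithm is not shown here.
    import numpy as np
    import pandas as pd

    def detect_anomalies(values: pd.Series, window: int = 50, threshold: float = 3.0) -> pd.Series:
        """Return a boolean Series marking values that deviate strongly from the rolling mean."""
        rolling_mean = values.rolling(window, min_periods=window).mean()
        rolling_std = values.rolling(window, min_periods=window).std()
        z_scores = (values - rolling_mean) / rolling_std
        return z_scores.abs() > threshold

    if __name__ == "__main__":
        series = pd.Series(np.random.default_rng(1).normal(0.0, 1.0, 200))
        series.iloc[150] = 15.0  # inject an obvious outlier into the toy data
        print(series[detect_anomalies(series, window=20, threshold=3.0)])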

NLP

Natural Language Processing for ticket classification.
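A common baseline for this kind of task (the column names below are hypothetical, and in practice the German free text would also need language-specific preprocessing) combines a TF-IDF representation with a linear classifier:

    # Minimal sketch: classify tickets from their free-text descriptions using
    # TF-IDF features and logistic regression. Column names are hypothetical.
    import pandas as pd
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline

    tickets = pd.read_csv("service_tickets.csv")  # hypothetical export of historical cases

    text_model = make_pipeline(
        TfidfVectorizer(ngram_range=(1, 2), min_df=2),  # unigrams and bigrams of the free text
        LogisticRegression(max_iter=1000),
    )

    scores = cross_val_score(text_model, tickets["description"], tickets["fault_cause"], cv=5)
    print(f"Cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")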

Data Pipeline with Apache Airflow

Data science applications are executed and monitored in Airflow pipelines.
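As a minimal sketch of such a pipeline (the DAG id, schedule and task bodies below are placeholders, not the project's actual workflow), two dependent tasks could be wired up like this:

    # Minimal Airflow DAG sketch: preprocess newly arrived ticket data, then score
    # it with a trained model. DAG id, schedule and task bodies are placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def preprocess():
        """Load and clean the newly arrived ticket data (placeholder)."""
        print("preprocessing tickets")

    def predict():
        """Apply the trained prediction model to the cleaned data (placeholder)."""
        print("scoring tickets")

    with DAG(
        dag_id="service_ticket_scoring",
        start_date=datetime(2021, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        preprocess_task = PythonOperator(task_id="preprocess", python_callable=preprocess)
        predict_task = PythonOperator(task_id="predict", python_callable=predict)

        preprocess_task >> predict_task  # scoring runs only after preprocessing succeeds

Because each task is tracked by the Airflow scheduler, failed runs are visible and can be retried or alerted on, which is what makes the pipelines monitorable in day-to-day operation.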