
datecta GmbH

Digitalization


Data analysis and development of predictive models


We analyze stored data to identify important influencing factors on your production process and previously unknown relationships. We use machine learning to detect fault conditions and to forecast hard-to-measure parameters continuously during the production process.

More

Project development


We explain the governing principles of machine learning and the common pitfalls of its application in the industrial sector. Together we identify the most promising applications in your company. After a detailed assessment of the technical and commercial potential, we assemble a suitable team and make the project happen.

More

Project evaluation


We are available as an independent entity for the commercial and technical evaluation of digitization projects. Particular attention is paid to the calculation of real economic potential and the critical comparison of different technologies and providers.

More

The challenge

With the increasing availability of storage capacity and computing power, exciting new possibilities are emerging that allow industrial processes to run faster, cheaper and more energy-efficiently. These potentials are currently described by the buzzwords "digitalization", "Industry 4.0" or "Big Data". Operators of technical facilities now face the difficult task of converting the hype into action and transferring the successful concepts of pioneering industries to their own company.

The company

Our company's goal is to apply machine learning in the industrial sector. A deep understanding of the challenges and pitfalls of industrial production processes is vital to this task. As engineers, we have been applying machine learning in various industrial branches for several years. We combine this experience with the profound process understanding of our customers, and can thus unlock large potential, often even with already available data.


How we work

We consistently exploit the possibilities of the rapidly digitizing world of work. If required, we draw on a global network of experts in machine learning and data science to make existing know-how available locally.

To ensure that technical and linguistic barriers do not get in the way, our project manager takes responsibility for everything: he stays in close contact with you, is familiar with your equipment, and has experience in applying machine learning methods in industrial environments.



Data Science. Revealing unknown influencing factors.

The increasing availability of measurement technology and storage capacity is leading to ever-expanding data archives, which often remain unused for lack of time and methods. Concealed in this historical data is a treasure that detailed data analysis can bring to light. Because the systems involved are complex and contain a large number of measured variables and data points, identifying all relationships is often difficult. Comparative plots of individual data series, in particular, quickly reach the limits of visual clarity and comprehensibility.
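As an illustration, the following sketch shows how such an analysis could look: a random-forest model ranks the measured variables by their estimated influence on a target quantity. The file name "process_history.csv" and the column names are hypothetical placeholders, and the sketch assumes numeric feature columns.

```python
# Minimal sketch: ranking influencing factors in archived process data.
# "process_history.csv" and its column names are hypothetical.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

data = pd.read_csv("process_history.csv")        # historical process data
target = data["yield"]                           # quantity of interest
features = data.drop(columns=["yield"])          # all other measured variables

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(features, target)

# Sort the measured variables by their estimated influence on the target.
importance = pd.Series(model.feature_importances_, index=features.columns)
print(importance.sort_values(ascending=False).head(10))
```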


Detection of anomalies. Machine learning against the flood of data.

Modern industrial processes are monitored by countless sensors and metrics, allowing the operator to track hundreds of relevant variables and respond appropriately to deviations. Since the flood of information will continue to grow with new, cheaper sensors and better data integration, value-adding use of the data is only possible with the aid of intelligent evaluation methods. For this reason, the detection of anomalies in the production process is one of the greatest industrial potentials of machine learning. Machine assistance systems consider all process data as a whole and automatically distinguish between important and unimportant parameters. In addition, they are able to recognize interactions that remain hidden in the one-dimensional view of individual values.

The assistance system monitors and filters all accumulated data and alerts the plant personnel only in the case of anomalies that require the expertise and creativity of a human being.
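One possible realization of such an assistance system is sketched below: an Isolation Forest is trained on a period of known-good operation and then flags deviating time steps in new data. The file "sensor_log.csv", the date ranges and the contamination rate are assumptions for illustration only.

```python
# Minimal sketch: multivariate anomaly detection with an Isolation Forest.
# File name, date ranges and parameters are hypothetical.
import pandas as pd
from sklearn.ensemble import IsolationForest

sensors = pd.read_csv("sensor_log.csv",
                      index_col="timestamp", parse_dates=["timestamp"])

reference = sensors.loc["2023-01":"2023-06"]     # known-good operation
detector = IsolationForest(contamination=0.01, random_state=0)
detector.fit(reference)

recent = sensors.loc["2023-07":]
labels = detector.predict(recent)                # -1 = anomaly, 1 = normal
alerts = recent[labels == -1]
print(f"{len(alerts)} time steps flagged for review by plant personnel")
```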


Design of Experiments. Systematic growth of insight.

Determining the influence of fundamental changes to a process requires dedicated experiments. To get results quickly, the strategy of "simply trying it out" is often used. However, since this approach usually yields few reliable results, it is typically followed by "just one more experiment". This regularly exceeds cost and time limits and, in the worst case, leads to misinterpretations based on statistically insignificant observations.

For these reasons, design of experiments has great advantages over the intuitive approach. It delivers reproducible, reliable results while significantly reducing the number of individual tests. Since the scope of the experiments is fixed in advance, the effort can be estimated more accurately, and the structured procedure leads to better documentation and more sustainable knowledge management. Typical users report that design of experiments reduces project running times and costs by 40-75%.
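The sketch below illustrates the basic idea with a two-level full factorial design built from the Python standard library; the factor names and levels are hypothetical examples.

```python
# Minimal sketch: a two-level full factorial design (2^3 = 8 runs).
# Factor names and levels are hypothetical.
from itertools import product

factors = {
    "temperature": [160, 180],   # °C
    "pressure": [2.0, 3.0],      # bar
    "catalyst": ["A", "B"],
}

# Every combination of factor levels is run exactly once, so main effects
# and interactions can be estimated from a fixed, pre-planned test effort.
for run, levels in enumerate(product(*factors.values()), start=1):
    print(run, dict(zip(factors, levels)))
```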


Condition-based maintenance. Forecast of failures.

In the field of maintenance, operators face the dilemma that machine and plant availability should increase steadily while maintenance costs should be reduced. Out of this motivation, a large field of application for machine learning has developed: condition-based maintenance. This approach intervenes at the ideal time for replacement: before a spontaneous failure inflicts major damage, but not before the achievable runtime has been exhausted.

This ideal time is determined by condition monitoring, which is based on measuring quantities that correlate with the degree of wear. For rotating equipment such as pumps or turbines, for example, characteristic vibration signals are a central indicator of an approaching failure. The measurements are evaluated by machine learning and the failure time is forecast. With this method, the running time of the equipment is fully used, while the consequential costs of unplanned downtime are minimized.
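A minimal sketch of this idea is shown below: a simple RMS wear indicator is computed from a vibration signal and its trend is extrapolated to an alarm level. The signal file, the sampling rate and the threshold are assumed values; a real forecast would typically use richer features and a dedicated model.

```python
# Minimal sketch: trend-based failure forecast from vibration data.
# File name, sampling rate and alarm threshold are hypothetical.
import numpy as np

signal = np.load("pump_vibration.npy")    # raw acceleration signal
fs = 10_000                               # samples per second (assumed)

# RMS level per one-minute window as a coarse wear indicator.
window = 60 * fs
n_windows = len(signal) // window
chunks = np.array_split(signal[:n_windows * window], n_windows)
rms = np.array([np.sqrt(np.mean(chunk ** 2)) for chunk in chunks])

# Fit a linear trend and extrapolate to the assumed alarm level.
minutes = np.arange(n_windows)
slope, intercept = np.polyfit(minutes, rms, 1)
threshold = 4.0                           # hypothetical alarm level
if slope > 0:
    remaining = (threshold - intercept) / slope - minutes[-1]
    print(f"Estimated time until alarm level: {remaining:.0f} minutes")
```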


Soft sensors. Prediction of hard-to-measure parameters.

Soft sensors make quality parameters that cannot be measured directly accessible online, enabling corrective intervention during the production process. The word 'soft sensor' is a blend of 'software' and 'sensor', which illustrates that it is not a physical sensor but a calculated prediction of the target value.

An example: the yield of a chemical production process can only be determined by laboratory analysis of the concentration of the valuable component, which means a considerable delay before disturbances can be countered. To solve the problem, a soft sensor is developed from historical data. In this example, the prediction is calculated from the continuously measured quantities conductivity, pH value, turbidity and color; it matches the analytically determined concentration in 95% of cases. It is now possible to intervene without delay during the production process to achieve the optimum yield.
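A minimal sketch of such a soft sensor, trained as a simple regression on historical batches, might look as follows; the file "batch_history.csv" and the column names are hypothetical, and a real project would compare several model types.

```python
# Minimal sketch: a soft sensor as a regression on historical batch data.
# File name and column names are hypothetical.
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

data = pd.read_csv("batch_history.csv")
X = data[["conductivity", "ph", "turbidity", "color"]]   # online signals
y = data["concentration"]                                # lab analysis

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
soft_sensor = LinearRegression().fit(X_train, y_train)
print("R^2 on held-out batches:", soft_sensor.score(X_test, y_test))

# In operation, each new set of online readings yields an immediate
# concentration estimate without waiting for the laboratory result.
```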





Do you have a project for us?





datecta GmbH

Burgkstraße 24, 01159 Dresden

info@datecta.com

0049-151-20-911-923