AI-LIFECYCLE
Description of the individual steps:
The DevOps principle is well known from software development and supports all important phases of a development project. In contrast to classic software projects, however, AI projects bring a large number of additional requirements regarding the technologies used, the organizational processes, and the communication between the customer and the AI provider. The extension of DevOps principles to AI projects is summarized under the keyword MLOps. At KENBUN, we therefore rely on a combination of structured MLOps processes and a flexible AI lifecycle for the development of successful AI solutions; especially at the beginning of an AI project, this lifecycle supports communication with the customer through idea generation and exploratory work.
1. Problem Understanding – Use Case Mining / Exploration
Together with you, we identify your AI potential and possible entry points into artificial intelligence (AI), the use cases. The exploration phase covers the business analysis and directly follows the first contact between the customer and KENBUN. It includes an initial conversation in which the AI potential is explored and use cases are identified. KENBUN’s experience in quickly assessing the basic feasibility of these use cases is particularly important here. A creative approach to the individual components often helps as well, since reformulating the requirements can turn an intractable AI problem into a manageable task. Examples include collecting additional training data, installing additional sensors, or narrowing the scope of the use case.
2. Data Understanding – Joint Workshop
The exploration is followed by a workshop in which the problem is made concrete and translated into functional AI requirements. Both sides jointly evaluate the potential of the use case. Part of the workshop is also the delivery of an initial dataset by the customer. We then perform a detailed data analysis and identify potential pitfalls as well as further AI use cases. The goal of the workshop is to familiarize the customer with the fundamental possibilities of AI solutions for their specific use cases. In addition, we get to know the use case better, can thus better interpret the resulting data, and can advise the customer on data collection. The data engineering work already starts in the workshop.
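As an illustration of what this first data analysis can look like, the following minimal sketch profiles a delivered dataset with pandas; the file name and the assumption of a CSV delivery are purely illustrative and not a fixed part of the process.

    # Minimal profiling sketch for an initial customer dataset.
    # Assumption: the data was delivered as a CSV file; the file name is illustrative.
    import pandas as pd

    df = pd.read_csv("initial_customer_data.csv")

    print(df.shape)                                        # number of rows and columns
    print(df.dtypes)                                       # data type per column
    print(df.isna().mean().sort_values(ascending=False))   # share of missing values per column
    print(df.describe(include="all"))                      # basic statistics for all columns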
The joint workshop provides the foundation for the data-focused phases in the MLOps process (see the sketch after the list):
• Adding and cleaning datasets
• Data labeling
• Data versioning
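As a minimal sketch of the versioning idea, the following snippet derives a content hash for a dataset directory and records it in a simple registry; in practice a dedicated data versioning tool such as DVC would typically take over this task, so the function names and the JSON registry here are assumptions made for illustration only.

    # Minimal sketch of content-based dataset versioning.
    # Assumption: the dataset is a directory of files; registry and names are illustrative.
    import hashlib
    import json
    from pathlib import Path

    def dataset_version(path: str) -> str:
        """Return a short content hash identifying the current state of the dataset."""
        digest = hashlib.sha256()
        for file in sorted(Path(path).rglob("*")):
            if file.is_file():
                digest.update(file.read_bytes())
        return digest.hexdigest()[:12]

    def register_version(path: str, registry: str = "dataset_registry.json") -> str:
        """Append the current dataset version to a simple JSON registry."""
        version = dataset_version(path)
        registry_path = Path(registry)
        entries = json.loads(registry_path.read_text()) if registry_path.exists() else []
        entries.append({"path": path, "version": version})
        registry_path.write_text(json.dumps(entries, indent=2))
        return version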
3. Proof-of-Concept
After the workshop, we develop your Proof-of-Concept (POC), a first working prototype to solve your use case, which you test afterwards.
The implementation of the POC is individually tailored to the customer, and intensive interaction between the customer and KENBUN during development is valuable. An agile development process is recommended here, and KENBUN follows one during AI development. A central part of the POC is data engineering and model development, resulting in a functional prototype. In addition to the Data Science tasks, however, classic software development and system architecture are also central parts of the POC. For example, the integration of client and server applications is relevant, as is combining existing AI components with newly developed solutions into a coherent overall concept. As a rule, containerized AI solutions are used on the servers, while customized apps are often developed or adapted for the end devices.
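The following sketch illustrates how such a containerized server component might expose a trained model over a small REST interface; the use of FastAPI, the model file name, and the /predict route are assumptions for this example, not a description of a specific KENBUN component.

    # Minimal serving sketch: a trained model behind a REST endpoint.
    # Assumptions: the POC model was serialized with joblib; route and file name are illustrative.
    from fastapi import FastAPI
    from pydantic import BaseModel
    import joblib

    app = FastAPI()
    model = joblib.load("model.joblib")   # model trained and evaluated during the POC

    class Features(BaseModel):
        values: list[float]               # one flat feature vector per request

    @app.post("/predict")
    def predict(features: Features):
        # run the model on the submitted feature vector and return the prediction
        prediction = model.predict([features.values])[0]
        return {"prediction": float(prediction)}

Packaged into a container image, such a service can be started with a single command (for example via uvicorn) and deployed alongside the customer's applications.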
In the POC, the modeling steps of the MLOps process are run through for the first time. The datasets identified in the workshop are modeled using various ML methods, and models are trained and evaluated. Depending on the use case, AI models are run in containers on servers or integrated into custom apps.
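A single modeling iteration of this kind could, for example, look like the following sketch; the tabular dataset, the label column, and the choice of a random forest are assumptions made purely for illustration.

    # Minimal sketch of one training and evaluation iteration on a tabular dataset.
    # Assumptions: CSV file with a "label" column; the model choice is illustrative.
    import pandas as pd
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    df = pd.read_csv("workshop_dataset.csv")
    X, y = df.drop(columns=["label"]), df["label"]

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

    model = RandomForestClassifier(n_estimators=200, random_state=42)
    model.fit(X_train, y_train)

    print("test accuracy:", accuracy_score(y_test, model.predict(X_test)))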
Only when you are satisfied with your solution do we move on to the development and integration phase!
4. Development and Integration
A strength of the KENBUN AI lifecycle is the ability to integrate the solution tightly and directly with customer systems, allowing them to act as both a data source and a data sink. The development of a scalable AI component therefore focuses on adapting existing AI components to the customer systems as well as integrating newly developed components into those systems. In the development and integration phase, the workload increasingly shifts from Data Science to software development and Big Data processes. We develop the model proven in the POC further in an iterative and agile manner.
In particular, we follow the approach of developing a minimum viable product (MVP) as quickly as possible in order to get real feedback as soon as possible.
The AI model implemented in the MVP and all further product stages developed in iterations (development iterations) are now integrated into your existing system landscape.
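How such an integration can tie the customer system in as data source and data sink is sketched below; the endpoints, field names, and simple polling approach are hypothetical and stand in for whatever interfaces the existing system landscape actually offers.

    # Minimal integration sketch: the customer system acts as data source and data sink.
    # Assumptions: all URLs, routes, and field names are hypothetical.
    import requests

    CUSTOMER_API = "https://customer-system.example.com/api"   # existing customer system
    AI_SERVICE = "http://ai-service:8000/predict"              # containerized model from the MVP

    def process_new_records() -> None:
        # pull unprocessed records from the customer system (data source)
        records = requests.get(f"{CUSTOMER_API}/records?status=new", timeout=10).json()
        for record in records:
            # request a prediction from the AI service
            response = requests.post(AI_SERVICE, json={"values": record["features"]}, timeout=10)
            # write the result back to the customer system (data sink)
            requests.post(f"{CUSTOMER_API}/records/{record['id']}/result",
                          json=response.json(), timeout=10)

    if __name__ == "__main__":
        process_new_records()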
Tools from our toolboxes KIDOU and KIDAN are used during development, which accelerates it considerably.
Click here for the language toolbox KIDOU
Click here for the AI Big Data platform KIDAN
5. Operation
For the successful operation of AI models, Big Data solutions are often necessary to process and store high-frequency or high-volume data so that the AI application remains scalable. For this purpose, we have developed a dedicated AI and Big Data platform called KIDAN. KIDAN is built on open source solutions and supports MLOps processes. It provides tools for maintenance, model versioning, model deployment, prediction monitoring, data monitoring, accounting, and access controls. Due to its modular structure, individual parts can be exchanged during operation and components can be reused for different use cases.
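As a simplified illustration of what prediction and data monitoring can mean in operation, the following sketch flags a feature whose distribution has drifted away from the training data; the statistic, the threshold, and the synthetic data are assumptions for this example and do not describe the KIDAN interfaces.

    # Minimal monitoring sketch: detect drift of a production feature against training data.
    # Assumptions: statistic, threshold, and synthetic data are illustrative only.
    import numpy as np

    def mean_shift(reference: np.ndarray, current: np.ndarray) -> float:
        """Shift of the current mean relative to the spread of the reference data."""
        return float(abs(current.mean() - reference.mean()) / (reference.std() + 1e-9))

    def check_drift(reference: np.ndarray, current: np.ndarray, threshold: float = 0.5) -> bool:
        """Flag the feature for review when its mean has drifted noticeably."""
        drifted = mean_shift(reference, current) > threshold
        if drifted:
            print("data drift detected: consider retraining or extending the training data")
        return drifted

    # usage: compare recent production values of one feature against the training distribution
    training_values = np.random.normal(0.0, 1.0, size=10_000)   # stand-in for logged training data
    recent_values = np.random.normal(0.8, 1.0, size=1_000)      # stand-in for recent production data
    check_drift(training_values, recent_values)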