While a number of AI tools and toolboxes are available, industrial applications pose challenges that ready-made solutions do not fully address. The goal of the AIMS 5.0 AI Toolbox is not to replace existing toolboxes; rather, it extends existing solutions with an information exchange and knowledge base for industry, in order to guide and supervise the design and development of industrial AI applications.
To support AI service design and implementation, the AIMS 5.0 AI Toolbox pursues a set of well-defined objectives, as follows:
The figure below shows the main concept of the AIMS 5.0 AI Toolbox. There are four levels of AI service scale, differing primarily in granularity. The three implementation-based levels are supported by the fourth, mostly theoretical, Methodology scale. In the following, the four levels are presented in more detail and compared against the objectives and requirements.
Application scale is the highest level of AI services, where the AI services provide solutions for complete use cases, e.g., an entire predictive maintenance service tailored to railway switches. The aim of the AIMS 5.0 AI Toolbox is to reach application scale through the composition of reusable and composable tools.
Tools scale is a compromise between algorithmic scale and application scale. Tools provide applied AI methods, typically one or more AI models combined with some inner logic to solve a recurring problem. For example, localization of assets in a factory consists of a handful of small steps, yet the problem arises in numerous use cases. Tools scale offers excellent reusability and supports rapid development, and careful design of service interfaces can also provide good composability.
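To make the notion of a tool-scale service more concrete, the following is a minimal sketch of a tool that wraps an AI model with non-AI inner logic for the asset localization example. The class, method, and attribute names (AssetLocalizationTool, detector_model.predict, locate) are hypothetical illustrations, not part of the AIMS 5.0 specification.

```python
# Hypothetical sketch of a tool-scale service: one or more AI models
# wrapped with inner logic to solve a recurring problem (asset localization).
from dataclasses import dataclass

@dataclass
class Position:
    x: float
    y: float

class AssetLocalizationTool:
    """Combines a detection model with simple position-estimation logic."""

    def __init__(self, detector_model):
        # The detector model is assumed to expose a `predict(image)` method.
        self.detector = detector_model

    def locate(self, camera_images, camera_poses):
        """Return estimated asset positions from multiple camera views."""
        detections = [self.detector.predict(img) for img in camera_images]
        # Inner (non-AI) logic: fuse per-camera detections into positions.
        return self._fuse(detections, camera_poses)

    def _fuse(self, detections, camera_poses):
        # Placeholder fusion step; a real tool would implement geometric
        # triangulation or a learned fusion model here.
        return [Position(0.0, 0.0) for _ in detections]
```

The point of the sketch is the division of labor: the AI model handles detection, while the reusable inner logic turns model outputs into the answer the recurring use case actually needs.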
Algorithmic scale is the most granular level, since well-known AI and ML algorithms are implemented as individual AI services. Algorithmic scale is mostly about choosing the best service interfaces to provide composable services. However, using pure algorithms, while highly reusable, requires great effort to implement complex use cases. Using pure AI algorithms also requires deep knowledge of AI methods and techniques, which works against rapid development.
Being theoretical in nature, the Methodology scale provides specifications, requirements, and best practices for implementing AI services. While the Methodology scale can be considered an independent level of AI service scales, it is best viewed as the basis of all the other scale levels, since it provides the means to fulfill all the objectives and requirements of AI Toolbox services.
The lifecycle of AI services shares the very same four steps. To provide reusable services, the customization step makes it possible to tailor a service to special needs; this includes, e.g., setting its parameters. The next step, training, is optional; however, most algorithms require training to perform specific tasks, e.g., detecting uncommon objects in an image. Training can be complex and should support widely used frameworks to efficiently produce fine-tuned models. Deployment is a crucial point in the lifecycle, since models may require varying resources or even massive parallelization. Standard solutions for deployment, containerization, and extensibility toward industrial systems (e.g., the Arrowhead Framework) are also common requirements.
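One way these lifecycle steps could be expressed is as a common service interface that every tool implements. The sketch below covers only the customization, training, and deployment steps described above; all names are illustrative assumptions rather than a defined AIMS 5.0 API.

```python
# Illustrative lifecycle interface for an AI Toolbox service, following the
# customization -> (optional) training -> deployment flow described above.
from abc import ABC, abstractmethod

class AIToolboxService(ABC):
    @abstractmethod
    def customize(self, **parameters):
        """Tailor the service to special needs, e.g. by setting parameters."""

    def train(self, dataset=None):
        """Optional step: fine-tune the underlying model(s) for a specific task."""
        pass  # Many services ship pre-trained models and skip this step.

    @abstractmethod
    def deploy(self, target):
        """Package and start the service, e.g. as a container on `target`."""
```

A shared interface of this kind would let the toolbox treat heterogeneous tools uniformly across customization, training, and deployment, regardless of the underlying framework.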
The previous section introduced the main objectives and motivations of the AI Toolbox. This section presents the fundamental requirements for the AI Toolbox, examining how these ideas can be implemented and outlining the broad architectural framework that should characterize it. The figure below presents the high-level architecture, including the building blocks of the tools and of the AI Toolbox Support. Contributors to the toolbox are responsible for supplying tools; their role includes providing ready-to-test notebooks along with the necessary supplementary information and files, incorporating best practices related to both the topic and the actual implementation. The notebook format is particularly advantageous for proof-of-concept development, as it offers essential facilities for interactive development. The AI Toolbox Support role, in turn, consolidates these notebooks into a Tool catalogue and an Application Wiki, while also providing the support and libraries necessary for tool deployment.
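As a purely hypothetical illustration of the kind of supplementary information a contributed notebook might carry for catalogue consolidation, a contribution could bundle metadata such as the following; the field names and values are assumptions, not a defined schema of the AI Toolbox.

```python
# Hypothetical metadata accompanying a contributed tool notebook, used when
# consolidating the corresponding Tool catalogue and Application Wiki entry.
tool_entry = {
    "name": "asset-localization",
    "notebook": "asset_localization_demo.ipynb",      # ready-to-test notebook
    "description": "Locates tagged assets on the shop floor from camera input.",
    "dependencies": ["numpy", "torch"],                # libraries the notebook needs
    "supplementary_files": ["sample_images/"],         # example data for test runs
    "best_practices": "docs/asset_localization_notes.md",
}
```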
Arguably, one of the key goals of the AI Toolbox is to minimize time-to-market, emphasizing the swift and uncomplicated deployment of proof-of-concept applications and tools. Simplicity entails the availability of pre-existing example tools that can be readily employed, either as-is or with minor adjustments. We have defined four building blocks that form the AI Toolbox Support:
To underpin this entire concept, a flexible and open-source Integrated Development Environment (IDE) is crucial. This IDE should be capable of running notebooks, even in remote settings. Moreover, there are instances where deploying large AI models locally is not feasible, or where multiple notebook instances need to use the same model, making remote execution essential. These requirements form the foundation for operating and developing AI-based tools and industrial applications. Version control emerges as another indispensable requirement for the AI Toolbox, for which well-proven solutions (e.g., Git, Subversion, Mercurial) are employed.
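The following is a minimal sketch of how several notebook instances could share one remotely deployed model over HTTP instead of each loading a large model locally. The endpoint URL, payload format, and function name are assumptions for illustration only and depend on how a model server is actually exposed.

```python
# Sketch: a notebook delegating inference to a shared, remotely deployed model
# instead of loading the large model locally.
import requests

MODEL_ENDPOINT = "http://model-server.local:8000/predict"  # assumed address

def remote_predict(features):
    """Send input features to the shared model server and return its output."""
    response = requests.post(MODEL_ENDPOINT, json={"inputs": features}, timeout=30)
    response.raise_for_status()
    return response.json()["outputs"]

# Any number of notebook instances can call remote_predict(...) against the
# same deployment, avoiding repeated local copies of the large model.
```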
In this section, we outline the five fundamental building blocks that form the core components of the tools. These elements represent what we believe a collaborator should incorporate in order to contribute a tool to the AI Toolbox, as detailed below: