Design, visualize and run AI flows with Pipeline Manager for Kenning

Topics: Edge AI, Open source tools

Many applications in the wild can be represented as graphs of interconnected nodes, especially when working with sets of modular blocks that can be connected with each other via certain APIs or interfaces. Such use cases appear in many areas of Antmicro’s day-to-day work, for example:

  • ROS 2, a set of libraries and tools for implementing distributed systems consisting of various nodes communicating with each other via models such as publisher-subscriber or service-client
  • Renode, our open source simulation framework, with sets of components that can be interconnected in order to form machines, which can also be connected and communicate with each other
  • FPGA designs, which are composed of interconnected building blocks interfaced with well-defined buses
  • Deep Neural Networks (DNNs), which are themselves built from heavily interconnected processing blocks, and Kenning, our open source DNN deployment flow and runtime creation framework, where we create optimization flows and runtime applications in a modular, graph-based manner

A graphical tool to manipulate the elements of tree-like or flow-like configuration files defining machines, nodes or processing blocks within domain-specific constraints can be extremely useful for any of those use cases. With the blocks, connections, constraints and parameters represented visually, users can better understand the architecture of a system and the relationships between its components as well as more easily edit the underlying structures.

Antmicro’s open source Pipeline Manager helps visualize and edit graph-like structures in the context of various frameworks and tools, providing a unified, data-based API for graphs and their building blocks. Pipeline Manager also enables flows to be validated and run in a third-party app, e.g. Kenning, or potentially Renode, ROS 2 etc. This note describes Pipeline Manager’s features and capabilities, and provides step-by-step instructions for running it locally.

pipeline-manager for Kenning

Pipeline Manager overview

Pipeline Manager operates on a JSON configuration file which lists available building blocks for graphs with their names, parameters, available inputs and outputs, and their types (for validation). The file is typically provided by the third-party app, which can then validate or run the designed flow, but the file can also be written by hand.

The visualization graph is internally represented in the form of a JSON-like structure that can be imported and exported, containing available nodes, their parameter values, connections with other nodes, as well as graphical details, such as coordinates in the editor.
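For illustration, an exported graph might resemble the JSON below. The field names here are invented for this sketch; the actual format is defined by Pipeline Manager:

```json
{
    "graph": {
        "nodes": [
            {
                "id": "node-1",
                "name": "LoadVideo",
                "properties": [{"name": "filename", "value": "input.mp4"}],
                "position": {"x": 120, "y": 80}
            },
            {
                "id": "node-2",
                "name": "Filter2D",
                "properties": [{"name": "iterations", "value": 1}],
                "position": {"x": 420, "y": 80}
            }
        ],
        "connections": [
            {"from": "node-1.frames", "to": "node-2.image"}
        ]
    }
}
```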

The Pipeline Manager’s frontend part can be used as a static, browser-only Web application that allows you to load, visualize, edit and export graphs based on the supplied JSON config.
However, the tool’s true capabilities show in server-based mode, with a third-party application acting as a backend.

In this approach, the third-party application communicates with the Pipeline Manager via a TCP connection, based on a simple protocol. The Pipeline Manager server establishes and maintains the communication between the frontend and the third-party application, parses messages from the third-party application and sends requests.

At the beginning of communication, the third-party application provides a specification of available building blocks (essentially, the JSON configuration mentioned above) which determines how the blocks can be created, configured and linked with each other.

Apart from providing block specifications, the third-party application can also implement methods for converting graph representations to its native format and vice versa, thus allowing the user to load a graph from an existing file and save a graph in a format that can later be used by the third-party application directly. It is also possible to run a specified graph in the third-party application from the Pipeline Manager side, without the need to jump to the terminal or other application.
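As a sketch of how such a TCP exchange could be framed, the helpers below wrap a JSON payload with a size and message-type header. This is purely illustrative; the actual Pipeline Manager protocol defines its own message layout:

```python
import json
import struct

def encode_message(message_type: int, payload: dict) -> bytes:
    """Frame a JSON payload: 4-byte little-endian size, 2-byte type, UTF-8 body.

    Hypothetical framing for illustration only.
    """
    body = json.dumps(payload).encode("utf-8")
    # The size field covers the 2-byte type field plus the body
    return struct.pack("<IH", len(body) + 2, message_type) + body

def decode_message(data: bytes) -> tuple[int, dict]:
    """Inverse of encode_message; returns (message_type, payload)."""
    size, message_type = struct.unpack("<IH", data[:6])
    body = data[6:4 + size]
    return message_type, json.loads(body.decode("utf-8"))
```

With framing in place, either side can read a complete message from the socket before parsing, which keeps partial TCP reads from corrupting the JSON stream.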

Open source components of Pipeline Manager

The frontend side of the application is implemented using Vue.js and the Baklava.js library. Vue makes it easy to create extensible and visually pleasing websites which embed or extend Pipeline Manager’s functionality, while Baklava is the engine for visualizing and manipulating the edges, nodes and parameters.

The third-party application communication server is implemented in Python using the Flask framework. The communication between the server and the third-party application utilizes BSD sockets and the Pipeline Manager protocol mentioned above.

The Python implementation of the protocol, including a simple communication API, is available in a minimal module called pipeline-manager-backend-communication, which provides both necessary structures and message types, as well as the basic implementation of communication with the Pipeline Manager server.

This basic communication allows Pipeline Manager to:

  • Obtain specifications for nodes available at runtime from the software
  • Send graphs specified in the browser for validation (in cases where simple rules in JSON format are not sufficient)
  • Send graphs specified in the browser for processing (e.g. to run the graph)
  • Save and load graphs in a format supported by the third-party application.

These functionalities need to be implemented on the third-party application side - by default, it receives graphs in the Pipeline Manager-supported format.

The protocol, however, is quite simple and easily extensible, so it is not limited to the functions listed above.
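In Python, such a set of request handlers can be organized as a dispatch table mapping request types to callbacks. The request names and response shapes below are assumptions made for this sketch, not the exact identifiers used by pipeline-manager-backend-communication:

```python
from typing import Callable

# Illustrative request names; the real protocol defines its own message types.
HANDLERS: dict[str, Callable[[dict], dict]] = {}

def handler(request_type: str):
    """Register a callback for a given request type."""
    def wrapper(func):
        HANDLERS[request_type] = func
        return func
    return wrapper

@handler("specification")
def send_specification(_request: dict) -> dict:
    # Return the JSON specification of available blocks
    return {"type": "ok", "content": {"nodes": [], "metadata": {}}}

@handler("validate")
def validate_graph(request: dict) -> dict:
    # Application-specific validation of the received graph
    nodes = request.get("graph", {}).get("nodes", [])
    if not nodes:
        return {"type": "error", "content": "Graph is empty"}
    return {"type": "ok", "content": "Graph is valid"}

def dispatch(request: dict) -> dict:
    """Route an incoming request to its registered callback."""
    callback = HANDLERS.get(request["type"])
    if callback is None:
        return {"type": "error", "content": f"Unsupported request: {request['type']}"}
    return callback(request)
```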

The frontend application can either run in tandem with the server application or independently. The application is a set of Vue-based libraries and a sample HTML page (which can be used to create new, customized visualizations), and contains:

  • A specification parser which validates specifications and creates a list of available nodes
  • A node generator implementation, which creates Baklava.js nodes from a JSON specification
  • Implementation of base nodes and utility widgets
  • Basic connection and parameter validation based on a JSON validation specification
  • Graph visualization using Baklava.js

With Pipeline Manager, introducing graph visualization and editing to a new or existing application is a matter of:

  • implementing a specification provider - a third-party application needs a method that will scan and extract available blocks from its implementation and convert them to a JSON representation that can be parsed by Pipeline Manager, i.e. their types, names, parameters and available connections to other blocks
  • implementing converters from a JSON file graph representation to a format supported by the third-party application and vice versa
  • implementing support for various Pipeline Manager requests, which (for Python) is as simple as writing several callbacks for request types specified in the pipeline-manager-backend-communication module, since the entire communication and message parsing are already implemented
  • in cases where the base application is written in a language other than Python, implementing a TCP client which follows the Pipeline Manager protocol and handles requests from the Pipeline Manager client.
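A specification provider from the first bullet can be as simple as scanning a class hierarchy. The Block base class and its attributes below are hypothetical, invented to illustrate the idea:

```python
class Block:
    """Hypothetical base class for an application's processing blocks."""
    category = "Other"
    inputs: list = []
    outputs: list = []
    properties: list = []

class GaussianKernel(Block):
    category = "Generators"
    properties = [{"name": "size", "type": "integer", "default": 5}]
    outputs = [{"name": "kernel", "type": "Image"}]

def build_specification(base=Block) -> dict:
    """Scan subclasses of `base` and emit a Pipeline Manager style specification."""
    nodes = []
    for cls in base.__subclasses__():
        nodes.append({
            "name": cls.__name__,
            "type": cls.category.lower(),
            "category": cls.category,
            "properties": cls.properties,
            "inputs": cls.inputs,
            "outputs": cls.outputs,
        })
    return {"nodes": nodes, "metadata": {}}
```

Because the specification is derived from the code, newly added blocks appear in the editor automatically, with no manually maintained JSON.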

Getting started with Pipeline Manager

Let’s start with the static version of the application to demonstrate the specification format and the application in a simple use case.

The basic system requirements are:

  • Python (version higher than 3.9)
  • npm

Install them using your package manager.

Then, clone the Pipeline Manager project and go to its directory:

git clone --recursive https://github.com/antmicro/kenning-pipeline-manager.git pipeline-manager
cd pipeline-manager

Now you need to install Python packages from requirements.txt:

pip install -r requirements.txt

With the necessary packages in place, you can build the static page with:

./build static-html

Then, open the pipeline_manager/frontend/dist/index.html page to see the application.

Preparing and loading a sample specification

As already mentioned, the available blocks are specified in a JSON file, which consists of a list of available nodes and optional metadata providing, e.g. information about styling or implemented interfaces.

Let’s look at the specification below:

{
    "nodes": [
        {
            "name": "LoadVideo",
            "type": "filesystem",
            "category": "Filesystem",
            "properties": [
                {"name": "filename", "type": "text", "default": ""}
            ],
            "inputs": [],
            "outputs": [{"name": "frames", "type": "Image"}]
        },
        {
            "name": "SaveVideo",
            "type": "filesystem",
            "category": "Filesystem",
            "properties": [
                {"name": "filename", "type": "text", "default": ""}
            ],
            "inputs": [
                {"name": "color", "type": "Image"},
                {"name": "binary", "type": "BinaryImage"}
            ],
            "outputs": []
        },
        {
            "name": "GaussianKernel",
            "type": "kernel",
            "category": "Generators",
            "properties": [
                {"name": "size", "type": "integer", "default": 5},
                {"name": "sigma", "type": "number", "default": 1.0}
            ],
            "inputs": [],
            "outputs": [{"name": "kernel", "type": "Image"}]
        },
        {
            "name": "StructuringElement",
            "type": "kernel",
            "category": "Generators",
            "properties": [
                {"name": "size", "type": "integer", "default": 5},
                {
                    "name": "shape",
                    "type": "select",
                    "values": ["Rectangle", "Cross", "Ellipse"],
                    "default": "Cross"
                }
            ],
            "inputs": [],
            "outputs": [{"name": "kernel", "type": "BinaryImage"}]
        },
        {
            "name": "Filter2D",
            "type": "processing",
            "category": "Processing",
            "properties": [
                {"name": "iterations", "type": "integer", "default": 1},
                {
                    "name": "border type",
                    "type": "select",
                    "values": ["constant", "replicate", "wrap", "reflect"],
                    "default": "constant"
                }
            ],
            "inputs": [
                {"name": "image", "type": "Image"},
                {"name": "kernel", "type": "Image"}
            ],
            "outputs": [{"name": "output", "type": "Image"}]
        },
        {
            "name": "Threshold",
            "type": "processing",
            "category": "Processing",
            "properties": [
                {"name": "threshold_value", "type": "integer", "default": 1},
                {
                    "name": "threshold_type",
                    "type": "select",
                    "values": ["Binary", "Truncate", "Otsu"],
                    "default": "Binary"
                }
            ],
            "inputs": [{"name": "image", "type": "Image"}],
            "outputs": [{"name": "output", "type": "BinaryImage"}]
        },
        {
            "name": "Morphological operation",
            "type": "processing",
            "category": "Processing",
            "properties": [
                {"name": "iterations", "type": "integer", "default": 1},
                {
                    "name": "border type",
                    "type": "select",
                    "values": ["constant", "replicate", "wrap", "reflect"],
                    "default": "constant"
                },
                {
                    "name": "operation type",
                    "type": "select",
                    "values": ["dilation", "erosion", "closing", "opening"],
                    "default": "dilation"
                }
            ],
            "inputs": [
                {"name": "image", "type": "BinaryImage"},
                {"name": "kernel", "type": "BinaryImage"}
            ],
            "outputs": [{"name": "output", "type": "BinaryImage"}]
        }
    ],
    "metadata": {
        "interfaces": {
        }
    }
}

The nodes list provides specifications for available blocks. Each entry consists of:

  • name - the name of the block, appearing in the list of available nodes, and in the graph
  • type - type of the node
  • category - category of the block, used to group blocks in the list of available nodes
  • properties - list of parameters present in the block
  • inputs - list of the expected inputs
  • outputs - list of the expected outputs

Each property in properties usually consists of:

  • name - name of the parameter
  • type - type of the parameter, e.g. number, integer, select, text
  • default - default value

Each entry in inputs and outputs consists of:

  • name - name of the input/output
  • type - type of the input/output, used for validation (only inputs and outputs with matching type can be connected)
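The matching-type rule can be expressed as a small check against the specification; the helper below is an illustrative sketch, not Pipeline Manager’s actual validator:

```python
# A trimmed-down version of the specification from this article,
# reduced to the fields the check needs.
SPEC = {
    "nodes": [
        {"name": "GaussianKernel", "inputs": [],
         "outputs": [{"name": "kernel", "type": "Image"}]},
        {"name": "StructuringElement", "inputs": [],
         "outputs": [{"name": "kernel", "type": "BinaryImage"}]},
        {"name": "Filter2D",
         "inputs": [{"name": "image", "type": "Image"},
                    {"name": "kernel", "type": "Image"}],
         "outputs": [{"name": "output", "type": "Image"}]},
    ]
}

def find_port(spec: dict, node_name: str, port_name: str, direction: str) -> dict:
    """Look up an input or output entry (direction is 'inputs' or 'outputs')."""
    node = next(n for n in spec["nodes"] if n["name"] == node_name)
    return next(p for p in node[direction] if p["name"] == port_name)

def connection_valid(spec, src_node, src_output, dst_node, dst_input) -> bool:
    """A connection is valid only when output and input types match."""
    src = find_port(spec, src_node, src_output, "outputs")
    dst = find_port(spec, dst_node, dst_input, "inputs")
    return src["type"] == dst["type"]
```

Under this rule, GaussianKernel’s Image output can feed Filter2D’s kernel input, while StructuringElement’s BinaryImage output cannot.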

Save the above specification to a file, then click Load specification in the application to load it.

Now, you can create, load, edit and save graphs in the internal format.

Using Pipeline Manager with Kenning

The first framework we adapted to work with Pipeline Manager is Kenning. Thanks to its modular nature, adding Pipeline Manager support is particularly simple. Currently, Pipeline Manager allows us to create Kenning optimization flows.

To begin, install Kenning with Pipeline Manager support, and additionally TensorFlow and TVM optimizers (to be able to use their particular blocks):

pip install "git+https://github.com/antmicro/kenning.git#egg=kenning[tensorflow,tvm,pipeline_manager]"

Next, in the cloned Pipeline Manager repository directory (see previous sections), clean the previous build (if you followed the tutorial above) and run the application in server-based mode:

./cleanup
./build server-app
./run

Then, run the pipeline_manager_client scenario in Kenning as follows:

python3 -m kenning.scenarios.pipeline_manager_client --file-path output.json

With this, Kenning will create a specification of nodes from available classes (their presence depends on the availability of dependencies - the logs for the application will show the components added to the specification).

As a result, you should be able to create, edit, save, load and run optimizations for Kenning from Pipeline Manager. Below you can see example views of more advanced flows in the application:

Flow example

Flow example

Visualize your AI workflows and simplify development with Kenning

Pipeline Manager is a great tool for editing workflow graphs - it considerably simplifies the development process in tools such as Kenning. This can help with fine-tuning and systematizing development, especially for less technical users.

If you would like Pipeline Manager implemented for your backend application to perform runtime-based, more detailed validation or you would like to discuss using Kenning for your edge AI use case scenario, feel free to reach out to us at contact@antmicro.com.
