Scaling edge AI devices with automated cloud build environments
Topics: Open cloud, Edge AI
Complex edge devices at scale
We often find ourselves building complex edge AI devices operating in the field at scale, introducing breakthrough capabilities across industries, from robotics and healthcare to automotive and aerospace. These devices tend to run a massive amount of software - from customized BSPs with drivers, through AI inference and other libraries, up to the end user applications, which can also be quite intricate, especially when the devices themselves involve multiple sensors and communication protocols that come with their own firmware.
Any piece of this complex puzzle can become a point of failure, and reproducibility and traceability are key, especially when you are rolling out hundreds or thousands of devices in different locations. That is why, when building Linux-based software stacks for such devices, we use frameworks like Yocto/OpenEmbedded and Buildroot, which create the BSP from a known set of sources, and implement flexible OTA systems that can be used to upgrade the devices in the field. Android, which we also implement for many customers (see e.g. our open AOSP-based BSP on GitHub), works in a similar manner, coming with its own set of build tools and scripts and OTA update capabilities.
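To illustrate what "building from a known set of sources" can look like in practice, below is a minimal sketch of a kas configuration for a Yocto build, with every layer pinned to an exact revision. The machine, layer names, URLs and placeholder revisions are purely illustrative assumptions, not an actual Antmicro BSP:

```yaml
# Illustrative kas configuration for a reproducible Yocto BSP build.
# All repo/layer names, URLs and the machine are hypothetical examples;
# <pinned-commit-sha> stands for a concrete, recorded revision.
header:
  version: 14
machine: custom-edge-board
distro: poky
target: core-image-minimal
repos:
  poky:
    url: https://git.yoctoproject.org/poky
    refspec: <pinned-commit-sha>
    layers:
      meta:
      meta-poky:
  meta-custom-bsp:
    url: https://example.com/meta-custom-bsp.git
    refspec: <pinned-commit-sha>
```

Because every input revision is recorded in a single file kept under version control, `kas build` on this file yields the same BSP on a developer laptop and in the cloud, which is what makes the resulting images traceable.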
Use automated cloud build environments and open source to encapsulate complexity
BSP build frameworks require a large amount of resources to execute, though, and can get quite complex, managing scores of repositories and interconnected layers of software. Build times can get long, to the point where testing new changes is not a trivial endeavor, which is why powerful pay-per-use cloud compute infrastructure is a very good match.
At Antmicro, we rely heavily on open source components not just for the software we create but also for our tooling and development infrastructure. The obvious benefit is the reproducibility and scalability of our methodology: we are able to quickly build and deploy cloud build environments that mirror our internal infrastructure for our customers on Google Cloud or Amazon Web Services. The complexity of the software stacks we create for our customers, while still present and available to investigate in detail, is encapsulated in automated build pipelines that can be executed by either party to get a reproducible and traceable result. The BSPs and their elements can thus be versioned, and potential problems analyzed in the context of the lifecycle of the entire software stack. That way, we can have you covered however complex your device is.
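As a sketch of such an encapsulating pipeline, here is a hypothetical GitHub Actions job that builds the BSP on a pay-per-use cloud runner, caches Yocto's shared state to keep rebuilds fast, and archives the resulting image. The runner labels, cache paths and the kas config name are assumptions for illustration:

```yaml
# Hypothetical CI pipeline building a BSP in the cloud; runner labels,
# paths and the kas config file name are illustrative.
name: bsp-build
on: [push]
jobs:
  build:
    runs-on: [self-hosted, cloud-highcpu]   # pay-per-use cloud build machine
    steps:
      - uses: actions/checkout@v4
      - name: Restore Yocto shared state cache
        uses: actions/cache@v4
        with:
          path: build/sstate-cache
          key: sstate-${{ github.sha }}
          restore-keys: sstate-
      - name: Build BSP
        run: kas build kas/edge-device.yml
      - name: Archive image and build manifest
        uses: actions/upload-artifact@v4
        with:
          name: bsp-image
          path: build/tmp/deploy/images/
```

Since the pipeline definition itself lives in the repository, either party can trigger the same build and trace any artifact back to the exact commit that produced it.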
Go further by adding smart OTA & cloud CI testing
From there, it is a short step towards OTA deployment systems, which we can also build for you, and cloud CI testing - not only of the builds themselves but also of their execution, using for example our open source Renode simulation framework - as well as other infrastructure to make your physical devices easier to manage at scale. Naturally, not everything needs to be implemented at once - we often gradually introduce more automation mechanisms for our customers throughout the product lifecycle, based on their needs and developments in other parts of their business.
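To give a flavor of execution-level CI testing with Renode, below is a minimal Robot Framework test sketch that boots a freshly built binary in simulation and waits for a login prompt on the UART. The platform description file, binary variable, UART name and expected boot message are all assumptions for illustration:

```robotframework
# Illustrative Renode CI test; the .repl platform, ${IMAGE_BINARY} and
# the expected "login:" line are hypothetical examples.
*** Settings ***
Resource                      ${RENODEKEYWORDS}

*** Test Cases ***
Should Boot To Login Prompt
    Execute Command           mach create
    Execute Command           machine LoadPlatformDescription @platforms/boards/custom-edge-board.repl
    Execute Command           sysbus LoadELF @${IMAGE_BINARY}
    Create Terminal Tester    sysbus.uart0
    Start Emulation
    Wait For Line On Uart     login:    timeout=60
```

Run in CI right after the build job, a test like this catches boot regressions in software that would otherwise only surface on physical hardware in the field.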
Once you get to fully automated builds and tests in the cloud, however, the benefits become so large that there is no going back - it's a convenient framework that can be used to build multiple product variants and entire product lines, and to remain in control of your software stack whatever the circumstances.
And all of this is thanks to the portability, transparency and ease of integration provided by open source. If you are interested in getting our help in building your next generation device in this way, do not hesitate to get in touch at email@example.com!