For some time now, the team and I have been developing analytics for IIoT applications in a fully containerized environment. It is actually part of our group's philosophy.
Despite the many good serverless solutions out there, serving containerized analytics orchestrations is still fundamental for most IIoT applications I’ve been involved with. With containers, we can pass around our whole dependency stack and deploy, replicate, move, and back up workloads. It gets us to the holy-grail situation of building an application in the cloud or on-premise and serving it on the edge, without the hassle of tracking what gets built where.
Another thing it has allowed me, over the past year or so, is to keep my main working machine free of bare-metal installations. Yes, you heard that right: for a while now I have not pip-installed anything through my terminal. Everything gets built in the context of a Dockerfile and becomes a self-standing image, from which my APIs are served or my Jupyter notebook prototypes get built.
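As an illustration, a minimal Dockerfile for that kind of prototype image might look like the sketch below. The base-image tag, the pinned package versions, and the image name are hypothetical choices for the example, not the ones our group actually uses:

```dockerfile
# Hypothetical prototype image: tags and versions are illustrative only
FROM tensorflow/tensorflow:2.11.0

# Pin all Python dependencies here instead of on the host machine
RUN pip install --no-cache-dir jupyter tensorflow-probability==0.19.0

WORKDIR /workspace

# Serve notebooks from inside the container, never from a bare-metal install
CMD ["jupyter", "notebook", "--ip=0.0.0.0", "--no-browser", "--allow-root"]
```

Building it with `docker build -t my-prototype .` and running it with `docker run -p 8888:8888 -v "$(pwd)":/workspace my-prototype` gives a notebook server with nothing installed on the host.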
Once I had the epiphany that building Docker images for every project I start means never managing environments again, everything fell into place. Now the first prototypes I build on my Mac or my DGX Station can seed larger workloads on GPU servers in the cloud. It is super nice to see a Deep Learning model shrunk down to run inference on something like a Jetson, right next to a rotating machine. Of course, the automation that comes with a tool like Depend-on-Docker helps a lot.
As I type, I have at least 20 different specialty images built for different needs. Some are built FROM tensorflow/tensorflow, others FROM continuumio/anaconda3, others FROM microsoft/windowsservercore (OK, this last one has very specific automation of .NET engineering software), but at the end of the day these images circulate around the group, can be reused and built upon by teammates, and not a single drop of sweat is lost figuring out that my version of TF Probability is different from yours!
If I can suggest one thing, it is to try it out. My development and prototyping environments are much neater now, and with a simple docker push I can enable my much more clever colleagues to build IIoT AI applications that will run virtually anywhere.
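That sharing step can be sketched in a few commands; the registry address and image names here are made up for the example:

```shell
# Tag a local prototype image for a shared registry (names are hypothetical)
docker tag my-prototype registry.example.com/analytics/my-prototype:0.1

# Push it so teammates get the exact same dependency stack
docker push registry.example.com/analytics/my-prototype:0.1

# On a colleague's machine, a cloud GPU server, or an edge device:
docker pull registry.example.com/analytics/my-prototype:0.1
```

Because the image carries its whole dependency stack, the pull on the other end reproduces the environment bit-for-bit, regardless of what is installed on that host.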
Source: Deep Learning on Medium