Once you've experienced FaaS, you don't want to go back. That has certainly been my experience with AI and FaaS.
In particular, running AI inferences in FaaS proved to be a great match:

- Each function processes one request at a time, and a model usually takes one data input and produces one data output.
- The code is small enough to fit into a function: an AI action loads a model, runs the inference, and returns the result (see the sketch at the end of this mail).
- In addition, FaaS provides a model that scales to zero and scales to millions of requests with the traffic.

With OpenWhisk I think we're very close to making AI Actions a first-class citizen for developers, and I've created a wiki page to explore what it would take to get there [1]. Coincidentally, James Thomas also published his experience with TensorFlow.js and OpenWhisk today [2].

I'm interested in your thoughts, and in seeing whether there's enough interest in our community to make this a reality. Feel free to contribute to the wiki with edits, comments, or anything you'd like to add.

[1] - https://cwiki.apache.org/confluence/display/OPENWHISK/AI+Actions
[2] - https://medium.com/openwhisk/serverless-machine-learning-with-tensorflow-js-4aa24494a9b4

Thanks,
dragos
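
P.S. To make the "AI action" shape a bit more concrete, here's a rough sketch of what a Python action doing inference could look like. The load_model() helper and the predict call are placeholders I made up for illustration, not a specific framework's API:

    # Minimal sketch of an OpenWhisk Python action for inference.
    # load_model() and the predict call are placeholders, not a real
    # framework API; swap in your model of choice.

    MODEL = None  # loaded once per warm container, reused across invocations

    def load_model():
        # In practice: deserialize a model bundled with the action,
        # or fetched from object storage, on cold start.
        return lambda features: {"label": "example", "score": 0.0}

    def main(args):
        global MODEL
        if MODEL is None:
            MODEL = load_model()       # pay the load cost only on cold start
        features = args.get("input", [])
        prediction = MODEL(features)   # one input in, one result out
        return {"prediction": prediction}

The point is simply that the per-request work maps naturally onto a single main() invocation, while scaling to zero and scaling out with traffic is left to the platform.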
