appendix E Deploying containerized NLU microservices
If you have read this far, you have probably already built some awesome NLP pipelines, and you are likely eager to integrate one into an end-to-end application for others to use. Whether you are building a web application, mobile app, or desktop app, you will need to modularize and deploy your NLP software. This is the fun part: helping others experience the power of NLP to do things they never thought were possible. This appendix will show you how to deploy your NLP algorithms to the cloud, where they can run as a standalone NLU service or become part of a bigger application.

The example code blocks here build an advanced open source NLU endpoint, typical of what leading chatbot and AI service companies deploy as part of their core business. In this appendix, you will be working with the code from the nlu-fastapi project (https://gitlab.com/tangibleai/community/nlu-fastapi), which has been adapted from the open source mathtext Python package. The mathtext package provides the NLU service for the Rori.AI chatbot by Rising Academies, which helps teach math to thousands of students every day.

E.1 A multilabel intent classifier

E.2 Problem statement and training data

E.3 Microservices architecture

E.3.1 Containerizing a microservice

E.3.2 Building a container from a repository

E.3.3 Scaling an NLU service

E.4 Running and testing the prediction microservice

E.4.1 Setting up DigitalOcean Spaces and training the model

E.4.2 Downloading the nlu-fastapi container image

E.4.3 Running the container

E.4.4 Interacting with the container API