This post is co-authored by Phani Mutyala, Senior Program Manager, Applied AI.

We recently announced a preview of Docker support for Microsoft Azure Cognitive Services with an initial set of containers covering Computer Vision, Face API, and Text Analytics. Today, we are happy to add support for our Language Understanding service. Language Understanding applies custom machine learning intelligence to a user’s conversational and natural language text to predict overall meaning and pull out relevant and detailed information. Language Understanding can be used to build conversational applications that communicate with users in natural language to complete a task.

Running Language Understanding in a container solves a few key problems AI developers are currently experiencing. One of those is controlling how and where their data is processed, whether locally, in the cloud, or on premises. This kind of flexibility is really useful to many of the businesses we talk with every day.

Another benefit is controlling scaling, whether up or down, which is especially important when AI models are updated on a regular basis. By controlling when you scale, you can plan for the right capacity based on your needs. You can also run the AI right next to your application logic, keeping it fast and scalable, with the reliability and quality that a container provides.

In this blog, we describe how to get started with Language Understanding running in a Docker container on your local dev box. If you are new to Docker and need help getting set up on a local machine, please read the previously published blog post, “Running Cognitive Service containers.” You can also find much more how-to information in the documentation, as well as samples showing how to use Cognitive Services containers.

Getting the Language Understanding Docker image

The Language Understanding Docker image is listed on Docker Hub and hosted in the Microsoft Container Registry. To download it, just run docker pull:

docker pull mcr.microsoft.com/azure-cognitive-services/luis

You can also use docker pull to check for updated images.
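Once the pull completes, you can confirm the image is available locally with docker images (the tag and size you see will depend on the current release):

docker images mcr.microsoft.com/azure-cognitive-services/luis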

Provisioning a Language Understanding service

As with the other Cognitive Services containers, to run a Language Understanding container locally you need to provision a Language Understanding service in the Azure portal to get a valid API key and billing endpoint. These values must be passed as command-line arguments when you start the container. If you don’t already have a Language Understanding service, open the Cognitive Services blade, select Add, and create one. You can get the API key and endpoint from either the Getting Started page or the Overview page. In this case, we get them from the Getting Started page:

Getting started on the Azure portal
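If you prefer the command line, the same resource can be provisioned with the Azure CLI. The following is only a sketch, assuming a resource group named my-resource-group, a resource named my-luis, the F0 pricing tier, and the westus region; substitute your own names, SKU, and location:

az cognitiveservices account create --name my-luis --resource-group my-resource-group --kind LUIS --sku F0 --location westus --yes
az cognitiveservices account keys list --name my-luis --resource-group my-resource-group
az cognitiveservices account show --name my-luis --resource-group my-resource-group --query properties.endpoint

The first command creates the resource, and the other two return the API key and endpoint that you will pass to the container later.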

Getting the Language Understanding model

Language Understanding allows you to create a language model, also known as the Language Understanding app. This app is tailored to a specific area or domain that you want to cover. For example, you might want to build an application that knows about ordering milkshakes, in which case flavor, toppings, and size might be concepts you want to handle. We won’t dive into building a Language Understanding app here, but feel free to check out the many tutorials in the documentation, for example, “Build custom app to determine user intentions.”

Here, we simply use an empty app without any intents or entities. To create an empty Language Understanding app, go to the Language Understanding portal and create an app. It should look like this:

Empty Language Understanding App
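If you would rather script this step, the LUIS authoring REST API can also create an app. This is only a sketch (bash-style quoting), assuming the v2.0 authoring endpoint in the westus region and an authoring key for your account; the portal flow above works just as well:

curl -X POST "https://westus.api.cognitive.microsoft.com/luis/api/v2.0/apps/" \
  -H "Ocp-Apim-Subscription-Key: {AUTHORING_KEY}" \
  -H "Content-Type: application/json" \
  -d '{"name": "EmptyApp", "culture": "en-us"}'

The response is the GUID of the new app.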

Once you have the empty Language Understanding app, select Train and then Publish to make the model available for download:

Published Language Understanding App

With the Language Understanding app created, you need to download it so that you can use it with the local Language Understanding container. To do that, go to My Apps in the Language Understanding portal, select the empty Language Understanding app, and select Export > Export for container (GZIP):

Export published Language Understanding App for container

Create an empty folder named input in your root directory and copy the exported Language Understanding app file to that folder. If you are on a Windows machine, it will look something like this:

Local Language Understanding app
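On a Windows command prompt, that comes down to something like the following; the file name is just an example, so use whatever name your exported package has:

mkdir C:\input
copy %USERPROFILE%\Downloads\{YOUR_EXPORTED_APP}.gz C:\input\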

Running the Language Understanding container

Now we are ready to fire up the local Language Understanding container using docker run. The special thing here is to mount the input folder so that the container can read it. To do this, we use the --mount option with docker run. With the folder located at C:\input, the command looks like this:

docker run --rm -it -p 5000:5000 --mount type=bind,src=C:\input,target=/input mcr.microsoft.com/azure-cognitive-services/luis eula=accept apikey={API_KEY} billing={BILLING_ENDPOINT}

There are lots of other ways to mount folders, so check out the Docker documentation for everything you can do with docker run, in addition to the configuration options available for the Language Understanding container.
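For example, you can cap the resources the container gets on the same command line. The limits below are only illustrative, not official sizing guidance; check the Language Understanding container documentation for recommended values:

docker run --rm -it -p 5000:5000 --memory 4g --cpus 2 --mount type=bind,src=C:\input,target=/input mcr.microsoft.com/azure-cognitive-services/luis eula=accept apikey={API_KEY} billing={BILLING_ENDPOINT}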

Trying it out

As with all Cognitive Services containers, you can now point your browser at http://localhost:5000/swagger to inspect the API and try things out. You can also call the container programmatically; for more information, check out the samples available on GitHub. By selecting Try it out, you get the list of parameters needed to submit a local request to the container.

For the App ID you use the GUID part of the Language Understanding app name. In the example above it is the GUID starting with 2ccdc110. Enter some text, such as “Hello!” in the query field and select Execute:

Empty Language Understanding response

Since the Language Understanding app is empty, we get the None intent back. This means the app didn’t understand the utterance, but now we can go build a better model and try again.
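You can make the same request programmatically. Here is a minimal curl sketch, assuming the v2.0 prediction route exposed by the container (the Swagger page lists the exact routes available) and your own app ID in place of the placeholder:

curl "http://localhost:5000/luis/v2.0/apps/{APP_ID}?q=Hello%21"

For the empty app, the JSON response comes back with None as the top-scoring intent, just like the result shown in the Swagger UI.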
