This sample demo app consists of a group of containerized microservices that can be easily deployed into an Azure Kubernetes Service (AKS) cluster. It is meant to show a realistic scenario using a polyglot architecture, event-driven design, and common open-source back-end services (e.g., RabbitMQ, MongoDB). The application also leverages OpenAI's GPT-3.5 models to generate product descriptions, using either Azure OpenAI or OpenAI.
This application is inspired by another demo app called Red Dog.
Note: This is not meant to be an example of production-ready code; rather, it shows a realistic application running in AKS.
The application has the following services:
Service | Description |
---|---|
makeline-service | Processes orders from the queue and completes them (Golang) |
order-service | Used for placing orders (JavaScript) |
product-service | Performs CRUD operations on products (Rust) |
store-front | Web app for customers to place orders (Vue.js) |
store-admin | Web app used by store employees to view orders in the queue and manage products (Vue.js) |
virtual-customer | Simulates order creation on a scheduled basis (Rust) |
virtual-worker | Simulates order completion on a scheduled basis (Rust) |
ai-service | Optional service for adding generative text and graphics creation (Python) |
mongodb | MongoDB instance for persisted data |
rabbitmq | RabbitMQ for the order queue |
For this demo, you can use either the Azure OpenAI service or the OpenAI service. If you plan to use Azure OpenAI, you need to request access to enable it on your Azure subscription using the Request Access to Azure OpenAI Service form.
If you plan to use OpenAI, sign up on the OpenAI website.
- Open a Cloud Shell from the Azure portal.
- Clone this repo:

```shell
git clone https://github.com/dawright22/azure-ai-demo.git
```

- Change into the directory this created:

```shell
cd azure-ai-demo
```

- Initialize Terraform:

```shell
terraform init
```

- Apply the configuration:

```shell
terraform apply
```

Once the deployment is complete, you will see the outputs of the deployment. Then rename `pki_build.tf.second` to `pki_build.tf` and run `terraform apply` again. This will deploy the PKI components.

Finally, rename `app_build.tf.third` to `app_build.tf` and run `terraform apply` again. This will deploy the application components.
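The whole three-stage flow above can be sketched as a single script. This is a hypothetical convenience wrapper, not part of the repo: the directory name comes from the clone URL, and `-auto-approve` is used so the script runs unattended (drop it if you want to review each plan).

```shell
#!/usr/bin/env bash
# Hypothetical sketch of the three-stage apply flow described above.
# Assumes you have already cloned the repo and changed into its directory.
set -u

# Enable the next stage by renaming its placeholder file, then re-apply.
enable_stage() {
  local from="$1" to="$2"
  [ -f "$from" ] && mv "$from" "$to"
  terraform apply -auto-approve
}

deploy_all() {
  terraform init
  terraform apply -auto-approve                   # stage 1: core infrastructure
  enable_stage pki_build.tf.second pki_build.tf   # stage 2: PKI components
  enable_stage app_build.tf.third  app_build.tf   # stage 3: application
}

# Only run when the terraform CLI is available:
if command -v terraform >/dev/null 2>&1; then
  deploy_all
else
  echo "terraform CLI not found; see the manual steps above"
fi
```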
That's all, folks! (to quote Bugs Bunny)
If you want to customise the location or other components, you can adjust the variables file to suit your needs.
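As a minimal sketch, you can also override variables at apply time instead of editing the variables file. The variable names below come from this repo's inputs table; the values are examples only.

```shell
# Illustrative overrides for this repo's Terraform variables.
# The region and capacity values here are examples, not recommendations.
TF_ARGS=(
  -var='location=West Europe'
  -var='ai_location=West Europe'
  -var='openai_model_capacity=60'
)

# Apply with the overrides when the terraform CLI is available:
if command -v terraform >/dev/null 2>&1; then
  terraform apply "${TF_ARGS[@]}"
fi
```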
The Terraform outputs give you the names of all the resources created, so you can use them to connect to those resources.
Provider requirements:

Name | Version |
---|---|
helm | ~> 2.0.2 |
local | ~> 2.4.0 |
random | = 3.5.1 |
Providers used:

Name | Version |
---|---|
azurerm | n/a |
helm | ~> 2.0.2 |
kubernetes | n/a |
random | = 3.5.1 |
No modules.
Inputs:

Name | Description | Type | Default | Required |
---|---|---|---|---|
ai_location | Azure region for deploying the Azure AI service | string | "Australia East" | no |
k8s_namespace | Kubernetes namespace for the application | string | "default" | no |
location | Azure region for the deployment | string | "Australia East" | no |
openai_model_capacity | Azure OpenAI model capacity | number | 120 | no |
openai_model_name | Azure OpenAI model name | string | "gpt-35-turbo" | no |
openai_model_version | Azure OpenAI model version | string | "0613" | no |
Outputs:

Name | Description |
---|---|
ai_endpoint | n/a |
ai_key | n/a |
ai_managed_identity_client_id | n/a |
ai_model_name | n/a |
ai_openai_deployment_name | n/a |
aks_name | n/a |
db_account_name | n/a |
db_key | n/a |
db_uri | n/a |
k8s_namespace | n/a |
rg_name | n/a |
sb_listener_key | n/a |
sb_listener_username | n/a |
sb_namespace_host | n/a |
sb_namespace_uri | n/a |
sb_sender_key | n/a |
sb_sender_username | n/a |
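As a sketch of how the outputs above can be used, the helper below reads the resource group, cluster name, and namespace from `terraform output` and fetches AKS credentials. It assumes the Azure CLI and kubectl are installed and that you run it from the Terraform directory after a successful apply.

```shell
#!/usr/bin/env bash
# Hypothetical helper: connect to the cluster using the Terraform outputs
# listed above (rg_name, aks_name, k8s_namespace).
connect_to_cluster() {
  local rg aks ns
  rg=$(terraform output -raw rg_name)
  aks=$(terraform output -raw aks_name)
  ns=$(terraform output -raw k8s_namespace)
  az aks get-credentials --resource-group "$rg" --name "$aks"
  kubectl get pods -n "$ns"   # verify the demo services are running
}

# Only run after a successful `terraform apply`:
if command -v terraform >/dev/null 2>&1; then
  connect_to_cluster
else
  echo "Run this from the Terraform directory after a successful apply."
fi
```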
- [AKS Documentation](https://learn.microsoft.com/azure/aks)
- [Kubernetes Learning Path](https://azure.microsoft.com/resources/kubernetes-learning-path)