Monday, April 8, 2019

Tensorflow Object Detection API - Inference Web Service

Creating a REST API lets you use your machine learning model from anywhere. The Flask framework makes it easy to build such a web service in Python. All of the code is included in object_detection_app_test1.py.


Request Example
Type: POST
Address: http://xxxxxx:5000/inference_app
KEY: file
VALUE: IMG_0000.jpg
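A request like the one above can be sent from Python. Here is a minimal client sketch using only the standard library; the host name localhost is a stand-in for the masked server address, and the form key 'file' matches the request example.

```python
import uuid
import urllib.request

# Hypothetical host; the post masks the real address as http://xxxxxx:5000.
URL = "http://localhost:5000/inference_app"

def build_multipart(field, filename, data):
    """Build a multipart/form-data body carrying one file under `field`."""
    boundary = uuid.uuid4().hex
    head = (
        "--{b}\r\n"
        'Content-Disposition: form-data; name="{f}"; filename="{n}"\r\n'
        "Content-Type: application/octet-stream\r\n\r\n"
    ).format(b=boundary, f=field, n=filename).encode()
    tail = "\r\n--{b}--\r\n".format(b=boundary).encode()
    return head + data + tail, "multipart/form-data; boundary=" + boundary

def send_image(path):
    """POST the image under the form key 'file' and return the raw response body."""
    with open(path, "rb") as fh:
        body, ctype = build_multipart("file", path, fh.read())
    req = urllib.request.Request(URL, data=body, headers={"Content-Type": ctype})
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode()
```

If the third-party requests library is available, the same call is simply requests.post(URL, files={"file": open(path, "rb")}).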

Set up configuration
In this example, port 5000 is opened so you can reach the web service, and THRESHOLD ensures that only inference results scoring above 0.6 are included in the response. An image sent in a POST request is saved under UPLOAD_DIR. Create the Flask app and store UPLOAD_DIR in app.config so the app can retrieve it later when it receives an HTTP request.
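The configuration described above might look like the following sketch. The exact values are assumptions based on the post; only the setting names (UPLOAD_DIR, THRESHOLD, port 5000) come from the text.

```python
from flask import Flask

# Assumed values; the post names the settings but not their exact definitions.
UPLOAD_DIR = "/tmp/uploads"   # where uploaded images are saved
THRESHOLD = 0.6               # detections scoring below this are dropped

app = Flask(__name__)
app.config["UPLOAD_DIR"] = UPLOAD_DIR  # retrievable later inside request handlers

# Later, after the model has been loaded into app.config:
# app.run(host="0.0.0.0", port=5000)  # open port 5000 on all interfaces
```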

Write model in app.config and run flask app
In the main function, you load the model from frozen_inference_graph.pb and build the category index from pascal_label_map.pbtxt. The main function also stores the 'session', 'image_tensor', 'tensor_dict', and 'category_index' objects in app.config['MODEL'] so the Flask app can retrieve the model when it receives HTTP requests.
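A sketch of that main-function setup is shown below, assuming the TF 1.x-style Object Detection API. The file names come from the post; the output tensor names are the API's standard detection outputs. TensorFlow is imported lazily inside the function since it is a heavy dependency.

```python
# Standard output tensors exported by the Object Detection API.
OUTPUT_TENSORS = ["num_detections", "detection_boxes",
                  "detection_scores", "detection_classes"]

def load_model(graph_path="frozen_inference_graph.pb",
               label_path="pascal_label_map.pbtxt"):
    """Load the frozen graph and label map; return the dict stored in app.config['MODEL']."""
    import tensorflow as tf  # requires TF 1.x-style APIs
    from object_detection.utils import label_map_util

    graph = tf.Graph()
    with graph.as_default():
        graph_def = tf.GraphDef()
        with tf.gfile.GFile(graph_path, "rb") as f:
            graph_def.ParseFromString(f.read())
        tf.import_graph_def(graph_def, name="")

    session = tf.Session(graph=graph)
    image_tensor = graph.get_tensor_by_name("image_tensor:0")
    tensor_dict = {name: graph.get_tensor_by_name(name + ":0")
                   for name in OUTPUT_TENSORS}
    category_index = label_map_util.create_category_index_from_labelmap(
        label_path, use_display_name=True)
    return {"session": session, "image_tensor": image_tensor,
            "tensor_dict": tensor_dict, "category_index": category_index}
```

In main, the result would be stored with app.config["MODEL"] = load_model() before calling app.run().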
Upload an image file and load model
When the Flask app receives an HTTP request, it loads the model from app.config['MODEL'] and saves the uploaded image file to a directory on the web server.
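That request handler might be sketched as follows. The route path and form key come from the request example earlier in the post; the response shape and upload directory are assumptions.

```python
import os
from flask import Flask, request, jsonify
from werkzeug.utils import secure_filename

app = Flask(__name__)
app.config["UPLOAD_DIR"] = "/tmp/uploads"            # assumed value
os.makedirs(app.config["UPLOAD_DIR"], exist_ok=True)

@app.route("/inference_app", methods=["POST"])
def inference_app():
    # The model dict is written to app.config['MODEL'] by the main function.
    model = app.config.get("MODEL")
    upload = request.files["file"]                   # form key from the request example
    filename = secure_filename(upload.filename)      # sanitize the client-supplied name
    path = os.path.join(app.config["UPLOAD_DIR"], filename)
    upload.save(path)                                # write the image to the server
    # ... run inference on `path` with `model`, then delete the file ...
    return jsonify({"filename": filename})
```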
Run the inference
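The inference step could look like this sketch: feed the saved image through the session, then keep only detections scoring above THRESHOLD, as described in the configuration section. The result format (class name, score, box) is an assumption.

```python
import numpy as np

THRESHOLD = 0.6  # from the configuration section

def run_inference(model, image):
    """Run the graph on one HxWx3 uint8 image; `model` is app.config['MODEL'].

    Returns a list of detections scoring above THRESHOLD.
    """
    output = model["session"].run(
        model["tensor_dict"],
        feed_dict={model["image_tensor"]: np.expand_dims(image, axis=0)})
    results = []
    for i in range(int(output["num_detections"][0])):
        score = float(output["detection_scores"][0][i])
        if score < THRESHOLD:
            continue  # drop low-confidence detections
        class_id = int(output["detection_classes"][0][i])
        results.append({
            "class": model["category_index"][class_id]["name"],
            "score": score,
            "box": list(output["detection_boxes"][0][i]),
        })
    return results
```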
Delete the uploaded image file
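Finally, the uploaded image is removed once inference has finished so temporary files do not accumulate on the server. A minimal sketch:

```python
import os

def delete_file(path):
    """Remove the uploaded image after the inference result has been sent."""
    try:
        os.remove(path)
    except FileNotFoundError:
        pass  # already gone; nothing to clean up
```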
