Fixing Mixed-Content and CORS issues at ML Model inference time with Azure Functions

This is a follow-up to the previous post on deploying an ONNX model using Azure Functions. Suppose you’ve got the API working great and you want to include this amazing functionality on a brand-new website. What follows is the harrowing path through failure we all must take to eventually reach the glorious end: an AI that understands tacos and burritos like I do.

Mixed Content Issues

As soon as you deploy your site and bask in its newfound glory, you might run into something like this when you make the API call from JavaScript:
Read more →

Troubleshooting an ONNX Model deployment to Azure Functions

Building awesome machine learning models can be an onerous task. Once all the blood, sweat, and tears have been expended creating this magical (and ethical) model, sometimes it feels like getting the thing deployed will require severing another limb in the name of releasing software. This post is designed to help you, dear reader, navigate the land mines surrounding this complicated task and hopefully make this part of the development cycle painless.

Here’s the rundown:

  1. I made a machine learning model in PyTorch that classifies taco vs. burrito images.
  2. I exported the model to ONNX.
  3. I want to deploy inferencing code to Azure Functions using Python.
  4. I’ve been able to get the function running locally (this can be its own post but I thought the docs were excellent).

What follows is my journey getting this to work (with screenshots to boot). Let’s dig in!

Read more →