Updates to Azure OpenAI Functions in the latest API

Microsoft is constantly evolving and improving its AI services, and Azure OpenAI is probably one of its fastest-evolving services. There is now support for GPT-4 Turbo with Vision, DALL-E 3, enhanced content filtering, and it keeps on improving. Azure OpenAI Functions is something that various AI apps are utilizing – and no wonder, because it is a fantastic feature that gives AI apps intelligence. In December 2023, Microsoft made changes to this capability's API, deprecating the old Functions API that was in use between summer and December 2023. It has been replaced in the latest API version (2023-12-01-preview) with new Tools definitions, which bring changes to the JSON schema. While the capability continues to work as before, developers will need to make a few changes to their code to adapt to these updates. In this blog post, we will explore these changes and provide guidance on how to update your code to ensure a smooth transition to the new Tools definitions.

Note: while the Functions parameters are already deprecated in the API, the old code still works for now. It is important to update to the new Tools API, because the old Functions API will be removed in the future.

If you are using an API version earlier than 2023-12-01-preview, make sure to update your code to use the latest one. Earlier versions will be retired on 2 April 2024.

  1. What are Azure OpenAI Functions / Tools?  
  2. What changed in Azure OpenAI Service API 2023-12-01-preview? 
  3. The new tools-schema examples 
  4. Tool response  
  5. Conclusion 

What are Azure OpenAI Functions / Tools?  

Azure OpenAI functions are a way to integrate the powerful language models of Azure OpenAI Service with other systems and tools. They allow you to define custom tools that the models can call based on the context of the prompt. This means that the AI can understand what your intent is, based on your prompt and conversation.  

Some examples of Azure OpenAI functions are: 

  • Retrieving the weather for a mentioned city. The functions feature extracts the city name from the prompt; you then create the logic to retrieve the weather information – for example from Bing Weather. With this, your chatbot can answer weather-related questions. 
  • Creating a ticket. You may have a bot that helps troubleshoot end-user problems. Embedding a function that recognizes when a service ticket needs to be created can automate the process – and make it really easy for the user. 
  • Fetching information. Perhaps you need to access real-time data from the factory, stock exchanges, you name it. Making the bot understand these requests with functions can make the conversation and usability rock. 

What this means is that the AI is much smarter, and it is easier to create intelligent applications that understand what the user intends to do.

What changed in Azure OpenAI Service API 2023-12-01-preview? 

In short: the functions and function_call parameters have been deprecated in the latest version of the API. The functionality hasn't changed, only the schema. The new parameters are tools (was functions) and tool_choice (was function_call).
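To illustrate, here is a minimal before/after sketch of the change (the get_weather function is just an illustrative name, with its definition trimmed for brevity). The old schema:

```json
{
  "functions": [
    {
      "name": "get_weather",
      "description": "Retrieve the current weather for a given city",
      "parameters": { "type": "object", "properties": {} }
    }
  ],
  "function_call": "auto"
}
```

And the new schema, where each function is wrapped in a tool of type function:

```json
{
  "tools": [
    {
      "type": "function",
      "function": {
        "name": "get_weather",
        "description": "Retrieve the current weather for a given city",
        "parameters": { "type": "object", "properties": {} }
      }
    }
  ],
  "tool_choice": "auto"
}
```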

The new name makes sense: you introduce tools to the AI, which it can use when a need arises (based on the user intent in the prompt).

Note: in this blog post I only cover the API changes regarding the Functions → Tools change. There have been other changes as well; for the full list, check out the API 2023-12-01-preview definition and also the What's new page. One of the coolest additions is the support for GPT-4 Turbo with Vision, but that is a topic for another blog post.

The new tools-schema examples 

When I started to update my function examples, I didn't find any simple example showing in JSON how tools and tool_choice work. It took me some trial and error to figure it out. In the end the solution was simpler than I first thought, but I think these examples can really help you if you are starting to work with them.

Let's start with weather retrieval for a specified city. Below is the tools part of the JSON API call that is sent to the Azure OpenAI endpoint. The top-level array is tools, and functions are introduced one per item.
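A minimal sketch of that tools definition (the get_weather name and its city parameter are the illustrative names used throughout this post):

```json
"tools": [
  {
    "type": "function",
    "function": {
      "name": "get_weather",
      "description": "Retrieve the current weather for the given city",
      "parameters": {
        "type": "object",
        "properties": {
          "city": {
            "type": "string",
            "description": "The city and country, for example: Turku, Finland"
          }
        },
        "required": ["city"]
      }
    }
  }
]
```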

That is not the full API call – just the part where you define tools/functions. What is interesting is the tool type definition: it clearly leaves room for other types of tools to be added to the schema later.

Here is a full API call example. I have a habit of adding tools to the end of the call.
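A sketch of what the full request body can look like (the system prompt, user prompt, and sampling settings here are illustrative):

```json
{
  "messages": [
    {
      "role": "system",
      "content": "You are a helpful assistant that can look up weather information."
    },
    {
      "role": "user",
      "content": "What is the weather like in Turku, Finland?"
    }
  ],
  "temperature": 0.7,
  "max_tokens": 800,
  "tool_choice": "auto",
  "tools": [
    {
      "type": "function",
      "function": {
        "name": "get_weather",
        "description": "Retrieve the current weather for the given city",
        "parameters": {
          "type": "object",
          "properties": {
            "city": { "type": "string", "description": "The city and country" }
          },
          "required": ["city"]
        }
      }
    }
  ]
}
```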

Remember: tools/functions need to be included on every call to the AOAI (Azure OpenAI) API.

Tool response  

When the call returns from AOAI, note that the JSON is slightly different than it was with functions in the earlier API versions. The reply to that query now looks like this:
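A sketch of such a response (the ids, timestamps, and token counts are made up for illustration):

```json
{
  "id": "chatcmpl-abc123",
  "object": "chat.completion",
  "created": 1703155000,
  "model": "gpt-4",
  "prompt_filter_results": [
    {
      "prompt_index": 0,
      "content_filter_results": {
        "hate": { "filtered": false, "severity": "safe" },
        "self_harm": { "filtered": false, "severity": "safe" },
        "sexual": { "filtered": false, "severity": "safe" },
        "violence": { "filtered": false, "severity": "safe" }
      }
    }
  ],
  "choices": [
    {
      "index": 0,
      "finish_reason": "tool_calls",
      "message": {
        "role": "assistant",
        "content": null,
        "tool_calls": [
          {
            "id": "call_abc123",
            "type": "function",
            "function": {
              "name": "get_weather",
              "arguments": "{\"city\": \"Turku, Finland\"}"
            }
          }
        ]
      }
    }
  ],
  "usage": {
    "prompt_tokens": 120,
    "completion_tokens": 18,
    "total_tokens": 138
  }
}
```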

From that JSON we are interested in the choices array, but as you can see, the responsible AI / content filtering results are returned in the same JSON as well. Now, let's focus on finish_reason. When finish_reason is tool_calls, it tells us that we need to get the tool information from the JSON and do the magic of that tool. In this case the magic is retrieving the weather of the given city. All relevant information is found in the tool_calls array.
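Zooming in on one item of that array (the id is illustrative):

```json
{
  "id": "call_abc123",
  "type": "function",
  "function": {
    "name": "get_weather",
    "arguments": "{\"city\": \"Turku, Finland\"}"
  }
}
```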

This item gives us information about which tool to use and with what parameters. Every tool call has an id, which is needed in the reply. From that we know we need to get the weather of Turku, Finland.  

Once we have the weather information, we need to pass it back to the AI. This is another format that changed in the functions → tools upgrade.

We add both the AI's requests to call tools and the answers from those tools to the conversation, in the messages block. It is important to indicate which messages come from the user, from the AI (assistant), and from the tool. The tool call id needs to be included as well.
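A sketch of the messages block at this point (the weather payload returned by the backend is illustrative):

```json
"messages": [
  {
    "role": "system",
    "content": "You are a helpful assistant that can look up weather information."
  },
  {
    "role": "user",
    "content": "What is the weather like in Turku, Finland?"
  },
  {
    "role": "assistant",
    "content": null,
    "tool_calls": [
      {
        "id": "call_abc123",
        "type": "function",
        "function": {
          "name": "get_weather",
          "arguments": "{\"city\": \"Turku, Finland\"}"
        }
      }
    ]
  },
  {
    "role": "tool",
    "tool_call_id": "call_abc123",
    "content": "{\"temperature\": \"-5 C\", \"description\": \"light snow\"}"
  }
]
```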

Now that we have this information, we call the AI again with the tool response included, as in the example above. The AI will then reformat the answer from the get_weather function into natural language. This time we get a reply like this:
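A sketch of that final reply, trimmed to the relevant parts (the wording of the answer is illustrative):

```json
{
  "choices": [
    {
      "index": 0,
      "finish_reason": "stop",
      "message": {
        "role": "assistant",
        "content": "The weather in Turku, Finland is currently -5 °C with light snow."
      }
    }
  ]
}
```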

Now the finish_reason is stop, which means there is no need to call other tools. If you have another tool to call, it would mean adding those JSON messages (the assistant requesting the tool call, the tool replying) to the conversation. Yes, the JSON keeps getting longer all the time. And remember to keep all tools definitions in every call.

Conclusion 

When I started to experiment with these functions, it wasn’t that clear how the conversation (message exchange) with the AI needed to be formatted. I had been using functions since July and had already figured out the conversation flow. The flow itself didn’t change, but how tool calls and returns are defined in JSON did.  

The message flow for one tool/function call is: 

  • System prompt introduction (system) 
  • User prompt (user) 
  • Tool call request from the AI (assistant) 
  • Call the backend to retrieve data and add the tool's response to the conversation (tool) 
  • Natural language response from the AI (assistant) 

With multiple tool calls, the flow can look like this (a JSON sketch follows the list): 

  • System prompt introduction (system) 
  • User prompt (user) 
  • Tool call request from the AI (assistant) 
  • Call the backend to retrieve data and add the tool's response to the conversation (tool) 
  • Tool call request from the AI (assistant) 
  • Call the backend to retrieve data and add the tool's response to the conversation (tool) 
  • (repeat these tool call requests and tool responses until the finish_reason is stop) 
  • Natural language response from the AI (assistant) 
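To make the multi-tool flow concrete, here is a sketch of how the messages block grows with a second round (the get_forecast function is purely hypothetical, and the message contents are abbreviated):

```json
"messages": [
  { "role": "system", "content": "You are a helpful weather assistant." },
  { "role": "user", "content": "What is the weather in Turku now and tomorrow?" },
  {
    "role": "assistant",
    "content": null,
    "tool_calls": [
      {
        "id": "call_abc123",
        "type": "function",
        "function": { "name": "get_weather", "arguments": "{\"city\": \"Turku, Finland\"}" }
      }
    ]
  },
  {
    "role": "tool",
    "tool_call_id": "call_abc123",
    "content": "{\"temperature\": \"-5 C\"}"
  },
  {
    "role": "assistant",
    "content": null,
    "tool_calls": [
      {
        "id": "call_def456",
        "type": "function",
        "function": { "name": "get_forecast", "arguments": "{\"city\": \"Turku, Finland\"}" }
      }
    ]
  },
  {
    "role": "tool",
    "tool_call_id": "call_def456",
    "content": "{\"tomorrow\": \"-2 C, cloudy\"}"
  }
]
```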

If you want to see tools (functions) in action and talk about these live, join me at my session at Cloud Technology Townhall Tallinn 2024 on 1st of February. And no, this is not the only AI supercharging of Teams I will show there.
