Closed
Description
- Package Name: azure-ai-inference
- Package Version: 1.0.0b9
- Operating System: Ubuntu
- Python Version: 3.12.10
Describe the bug
I tried the example code `sample_chat_completions_with_tools.py`, replacing the endpoint and key with those of my GPT-4o deployment. I get the following error:
```
azure.core.exceptions.HttpResponseError: (BadRequest) Invalid input format.
Code: BadRequest
Message: Invalid input format.
```
To Reproduce
Steps to reproduce the behavior:
- Download the sample code from the link
- Update the endpoint and key
- Run it locally
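For context, the sample issues a chat-completions request that carries a function-tool definition, and the `(BadRequest) Invalid input format` error suggests the service is rejecting the request body. Below is a minimal, hedged sketch of what a well-formed chat-completions-with-tools JSON body looks like under the standard chat-completions schema; the `get_weather` tool and its fields are illustrative stand-ins, not the exact tool used by the sample:

```python
import json

# Sketch of a chat-completions request body with a tool definition,
# following the standard chat-completions "tools" schema.
# The "get_weather" tool is a hypothetical stand-in for illustration.
payload = {
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What is the weather in Seattle?"},
    ],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "Get the current weather for a city.",
                # JSON Schema describing the tool's arguments
                "parameters": {
                    "type": "object",
                    "properties": {
                        "city": {"type": "string", "description": "City name"}
                    },
                    "required": ["city"],
                },
            },
        }
    ],
    "tool_choice": "auto",
}

# Serialize as the SDK would before sending it over the wire
body = json.dumps(payload)
print(json.loads(body)["tools"][0]["function"]["name"])
```

Comparing the actual wire payload (e.g. via SDK HTTP logging) against a shape like this can help pinpoint which field the service considers invalid.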
Expected behavior
The code should run without errors.
Metadata
Labels
- Issues related to the client library for Azure AI Model Inference (`\sdk\ai\azure-ai-inference`)
- This issue points to a problem in the data-plane of the library.
- Workflow: This issue is responsible by Azure service team.
- Issues that are reported by GitHub users external to the Azure organization.
- Workflow: This issue needs attention from Azure service team or SDK team.
- The issue doesn't require a change to the product in order to be resolved.