Customers have been asking for Model-as-a-Service (MaaS) support in SK. MaaS, also referred to as [serverless API](https://learn.microsoft.com/en-us/azure/ai-studio/how-to/model-catalog-overview#model-deployment-managed-compute-and-serverless-api-pay-as-you-go), is available in [Azure AI Studio](https://learn.microsoft.com/en-us/azure/ai-studio/what-is-ai-studio). This mode of consumption operates on a pay-as-you-go basis, typically billed by tokens. Clients can access the service via the [Azure AI Model Inference API](https://learn.microsoft.com/en-us/azure/ai-studio/reference/reference-model-inference-api?tabs=azure-studio) or client SDKs.
At present, there is no official support for MaaS in SK. The purpose of this ADR is to examine the constraints of the service and explore potential solutions to enable support for the service in SK via the development of a new AI connector.
## Client SDK
The Azure team will be providing a new client library, namely `Azure.AI.Inference` in .NET and `azure-ai-inference` in Python, for interacting with the service. Although the service API is OpenAI-compatible, the OpenAI and Azure OpenAI client libraries cannot be used here, because they are not model- and provider-agnostic: Azure AI Studio hosts a diverse range of open-source models in addition to OpenAI models.
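To make the OpenAI-compatible wire format concrete, the sketch below builds a minimal chat-completions request body using only the standard library. The field names (`messages`, `temperature`, `top_p`) follow the OpenAI-compatible format the ADR refers to; the helper function itself is illustrative, not part of any SDK.

```python
import json

def build_chat_request(messages, temperature=None, top_p=None):
    """Build an OpenAI-compatible chat completions request body.

    Optional sampling parameters are included only when explicitly set,
    mirroring how the wire format treats optional fields.
    """
    body = {"messages": messages}
    if temperature is not None:
        body["temperature"] = temperature
    if top_p is not None:
        body["top_p"] = top_p
    return body

# Serialize for a POST to the service's chat completions route.
request_json = json.dumps(
    build_chat_request([{"role": "user", "content": "Hello"}], temperature=0.7)
)
```

The same shape is what the client SDKs produce under the hood, which is why an OpenAI-compatible service can host many model providers behind one API.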
### Limitations
The initial release of the client SDK will only support chat completion and text/image embedding generation, with image generation to be added later.
Plans for text completion support are currently unclear, and it is unlikely that the SDK will ever add it. As a result, the new AI connector will **NOT** support text completion in its initial version, unless we receive stronger customer signals or the client SDK adds support.
## AI Connector
### Naming options
- Azure
- AzureAI
- AzureAIInference
- AzureAIModelInference
Decision: `AzureAIInference`
### Support for model-specific parameters
Models can accept supplementary parameters that are not part of the default API. The service API and the client SDK allow callers to supply these model-specific parameters. Users provide them via a dedicated argument, alongside standard settings such as `temperature` and `top_p`.
In the context of SK, execution parameters are modeled by `PromptExecutionSettings`, from which all connector-specific settings classes derive. The settings class of the new connector will contain a dictionary-typed member that groups the model-specific parameters.
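The intended shape can be sketched as follows. This is an illustrative Python class, not the actual SK type: the class name, the `extra_parameters` field, and the `to_request_options` helper are all assumptions made for the example.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class AzureAIInferenceSettings:
    """Illustrative settings class; names do not mirror the real SK types."""
    temperature: Optional[float] = None
    top_p: Optional[float] = None
    # Dictionary grouping model-specific parameters outside the default API.
    extra_parameters: dict = field(default_factory=dict)

    def to_request_options(self) -> dict:
        # Standard settings are serialized only when explicitly set.
        options = {
            key: value
            for key, value in {"temperature": self.temperature,
                               "top_p": self.top_p}.items()
            if value is not None
        }
        # Model-specific parameters are passed through alongside them.
        options.update(self.extra_parameters)
        return options

settings = AzureAIInferenceSettings(
    temperature=0.2, extra_parameters={"safe_mode": True}
)
```

Grouping the pass-through parameters in one dictionary keeps the connector model-agnostic: new model capabilities need no connector change.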
This console application shows how to use a function invocation filter (`IFunctionInvocationFilter`) to invoke a kernel function only if the operation was approved.
If the function invocation is rejected, the result will contain the reason why, so the LLM can respond appropriately.
The application uses a sample plugin that builds software by following these development stages: requirements collection, design, implementation, testing, and deployment.
Each step can be approved or rejected. Based on that, the LLM will decide how to proceed.
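The approval gate can be illustrated with a small language-agnostic sketch. This is not the actual `IFunctionInvocationFilter` interface (which is C#); the function names and result shape below are assumptions made for the example.

```python
def approval_filter(approve, func, *args):
    """Invoke func only if approve(name) returns True; otherwise return
    a rejection reason that the caller (here, the LLM) can act on."""
    if approve(func.__name__):
        return {"status": "approved", "result": func(*args)}
    return {"status": "rejected",
            "reason": f"Invocation of '{func.__name__}' was not approved."}

def deploy():
    # Stand-in for the plugin's deployment stage.
    return "deployed"

# Reject the deployment stage; the rejection reason flows back to the LLM.
outcome = approval_filter(lambda name: name != "deploy", deploy)
```

Returning a structured rejection instead of raising an exception is what lets the model decide how to proceed with the remaining stages.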
## Configuring Secrets
The example requires credentials to access OpenAI or Azure OpenAI.
If you have set up those credentials as secrets within Secret Manager or through environment variables for other samples from the solution in which this project is found, they will be re-used.
### To set your secrets with Secret Manager
```
cd dotnet/samples/Demos/FunctionInvocationApproval

dotnet user-secrets init

dotnet user-secrets set "OpenAI:ChatModelId" "..."
dotnet user-secrets set "OpenAI:ApiKey" "..."

dotnet user-secrets set "AzureOpenAI:ChatDeploymentName" "..."
dotnet user-secrets set "AzureOpenAI:Endpoint" "https://... .openai.azure.com/"
dotnet user-secrets set "AzureOpenAI:ApiKey" "..."
```
### To set your secrets with environment variables
Excerpt from `dotnet/samples/GettingStartedWithAgents/README.md`:

Example|Description
---|---
[Step1_Agent](https://github.com/microsoft/semantic-kernel/blob/main/dotnet/samples/GettingStartedWithAgents/Step1_Agent.cs)|How to create and use an agent.
[Step2_Plugins](https://github.com/microsoft/semantic-kernel/blob/main/dotnet/samples/GettingStartedWithAgents/Step2_Plugins.cs)|How to associate plug-ins with an agent.
[Step3_Chat](https://github.com/microsoft/semantic-kernel/blob/main/dotnet/samples/GettingStartedWithAgents/Step3_Chat.cs)|How to create a conversation between agents.
[Step4_KernelFunctionStrategies](https://github.com/microsoft/semantic-kernel/blob/main/dotnet/samples/GettingStartedWithAgents/Step4_KernelFunctionStrategies.cs)|How to utilize a `KernelFunction` as a _chat strategy_.
[Step5_JsonResult](https://github.com/microsoft/semantic-kernel/blob/main/dotnet/samples/GettingStartedWithAgents/Step5_JsonResult.cs)|How to have an agent produce JSON.
[Step6_DependencyInjection](https://github.com/microsoft/semantic-kernel/blob/main/dotnet/samples/GettingStartedWithAgents/Step6_DependencyInjection.cs)|How to define dependency injection patterns for agents.
[Step7_OpenAIAssistant](https://github.com/microsoft/semantic-kernel/blob/main/dotnet/samples/GettingStartedWithAgents/Step7_OpenAIAssistant.cs)|How to create an Open AI Assistant agent.