[backport -> release/3.6.x] feat(plugins): ai-transformer plugins #12426
Automated backport to `release/3.6.x`, triggered by a label in #12341.

Original description
Summary
Adds two plugins that introspect AI requests and responses against the same, or separate, LLM services.
## AI Request Transformer
Uses a configured LLM service to introspect / transform the consumer's request body, before proxying upstream.
Works like so: this `ai-request-transformer` plugin runs AFTER all of the "ai-prompt-*" plugins, allowing it to also introspect LLM requests against... a different LLM!

Here is a sample deck config:
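The sketch below is illustrative rather than authoritative: it assumes the plugin accepts a `prompt` string plus an `llm` block shaped like ai-proxy's (`route_type`, `auth`, `model`), and the service, route, and model names are placeholders. Check the plugin's schema before using it.

```yaml
_format_version: "3.0"

services:
- name: example-service            # placeholder upstream service
  url: http://httpbin.example.com
  routes:
  - name: example-route
    paths:
    - /post
    plugins:
    - name: ai-request-transformer
      config:
        # Natural-language instruction given to the transforming LLM
        prompt: >
          Redact any credit card numbers in the request body,
          replacing them with the string "REDACTED".
        # LLM used for the transformation; assumed to mirror ai-proxy's
        # llm configuration block
        llm:
          route_type: llm/v1/chat
          auth:
            header_name: Authorization
            header_value: Bearer <OPENAI_API_KEY>
          model:
            provider: openai
            name: gpt-4
```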
## AI Response Transformer
Uses a configured LLM to introspect the upstream's HTTP(S) response, before sending it back to the client.

Works like so: the plugin interrupts the normal response flow, returning the transformed body to the client itself (via `kong.response.exit`), and it can handle gzip / chunked responses. The code for this is copied from the Kong "forward-proxy" plugin.

There is also an extra feature in ai-response-transformer: if you ask the LLM to reply with a structured set of instructions (see the sketch below), then Kong will parse those instructions and set the response headers, the response status code, and a replacement response body based on that output. This allows changing e.g. `Content-Type`, or throwing errors (/ stop words) from the LLM.
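As an illustrative sketch only (the exact field names are an assumption, not the plugin's confirmed contract), such an instruction payload from the LLM might look like:

```json
{
  "headers": {
    "content-type": "application/xml"
  },
  "status": 422,
  "body": "<error>Upstream response contained a stop word</error>"
}
```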
This `ai-response-transformer` plugin runs AFTER the "ai-proxy" plugin, allowing it to also introspect LLM responses against a different LLM!

Here is a sample deck config:
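Again, the following is only a sketch: it assumes a `prompt` string, an `llm` block shaped like ai-proxy's, and a flag (guessed here as `parse_llm_response_json_instructions`) that enables the header/status/body instruction parsing described above; names and values are placeholders.

```yaml
_format_version: "3.0"

services:
- name: example-service            # placeholder upstream service
  url: http://httpbin.example.com
  routes:
  - name: example-route
    paths:
    - /json
    plugins:
    - name: ai-response-transformer
      config:
        # Natural-language instruction given to the transforming LLM
        prompt: >
          Convert the upstream JSON body to XML. If it contains the word
          "forbidden", return an error instead.
        # Assumed flag enabling the header/status/body instruction parsing
        parse_llm_response_json_instructions: true
        # LLM used for the transformation; assumed to mirror ai-proxy's
        # llm configuration block
        llm:
          route_type: llm/v1/chat
          auth:
            header_name: Authorization
            header_value: Bearer <OPENAI_API_KEY>
          model:
            provider: openai
            name: gpt-4
```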
Checklist
- A changelog file created under `changelog/unreleased/kong`, or the `skip-changelog` label added on the PR if a changelog is unnecessary.
- README.md

Issue reference
Internal project.