Cloud APIM - LLM Proxy - prompt context

cp:otoroshi_plugins.com.cloud.apim.otoroshi.extensions.aigateway.plugins.AiPromptContext

Enhances an LLM request with a prompt context. MUST be used in addition to the LLM Proxy plugin.

See the official documentation in the Otoroshi manual.

categories:

  • Cloud APIM
  • AI - LLM

default configuration:

{
  "ref" : ""
}
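
example:

A minimal sketch of how this plugin might be attached to an Otoroshi route next to the LLM Proxy plugin. It assumes "ref" holds the id of a prompt context entity created in the AI extension; the surrounding route-plugin structure and the identifier value below are illustrative placeholders, not values taken from this page.

{
  "enabled" : true,
  "plugin" : "cp:otoroshi_plugins.com.cloud.apim.otoroshi.extensions.aigateway.plugins.AiPromptContext",
  "config" : {
    "ref" : "prompt-context_replace-with-your-context-id"
  }
}

With a non-empty "ref", the referenced prompt context is applied to the incoming request before the LLM Proxy plugin forwards it to the provider.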