I’ve been having a ton of issues trying to get the AI Semantic Prompt Guard plugin to work with Mistral. After setting it up through Kong Konnect, I was finally able to enable the plugin on my routes. My setup is a hybrid-mode gateway: the control plane lives in Konnect, and a data plane node runs on my machine, on the same network as the Redis database the plugin uses. Unfortunately, every time I activate the plugin, all requests get rejected with a 400 Bad Request. I’ve changed parameters and run a ton of experiments, but the only thing I got working was plain AI Proxy plugin communication with Mistral, and that works like a breeze. Not the AI Semantic Prompt Guard plugin, however. My configuration for Mistral looks like this in terms of the AI Proxy:
```shell
curl -X POST \
  https://eu.api.konghq.com/v2/control-planes/{control_plane_id}/core-entities/services/{service_id}/plugins \
  --header "accept: application/json" \
  --header "Content-Type: application/json" \
  --header "Authorization: Bearer KONG_API_KEY" \
  --data '{
    "name": "ai-proxy",
    "config": {
      "model": {
        "name": "mistral-medium",
        "provider": "mistral",
        "options": {
          "mistral_format": "openai",
          "upstream_url": "https://api.mistral.ai/v1/chat/completions"
        }
      },
      "auth": {
        "header_name": "Authorization",
        "header_value": "Bearer MISTRAL_API_KEY"
      },
      "route_type": "llm/v1/chat"
    }
  }'
```
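With only the AI Proxy plugin enabled, I can confirm the route works end to end with a plain chat request through the data plane. This is just how I test it locally; the proxy port (8000) and the route path (/mistral) are from my own setup, so adjust them for yours:

```shell
# Send a chat request through the local Kong data plane.
# localhost:8000 and /mistral are assumptions from my local setup;
# the Authorization header towards Mistral is injected by the ai-proxy plugin.
curl -s -X POST http://localhost:8000/mistral \
  --header "Content-Type: application/json" \
  --data '{
    "messages": [
      {"role": "user", "content": "Who is Luke Skywalker?"}
    ]
  }'
```

This returns a normal chat completion as long as the Semantic Prompt Guard plugin is disabled.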
and it looks like this for the AI Semantic Prompt Guard plugin:
```shell
curl -X POST \
  https://eu.api.konghq.com/v2/control-planes/{control_plane_id}/core-entities/routes/{route_id}/plugins \
  --header "accept: application/json" \
  --header "Content-Type: application/json" \
  --header "Authorization: Bearer KONG_API_KEY" \
  --data '{
    "name": "ai-semantic-prompt-guard",
    "config": {
      "rules": {
        "match_all_conversation_history": true,
        "allow_prompts": ["Questions about StarWars"],
        "deny_prompts": ["Questions about StarTrek"]
      },
      "embeddings": {
        "auth": {
          "header_name": "Authorization",
          "header_value": "Bearer MISTRAL_API_KEY"
        },
        "model": {
          "provider": "mistral",
          "name": "mistral-embed",
          "options": {
            "upstream_url": "https://api.mistral.ai/v1/embeddings"
          }
        }
      },
      "vectordb": {
        "dimensions": 1024,
        "distance_metric": "cosine",
        "strategy": "redis",
        "threshold": 0.1,
        "redis": {
          "host": "redis",
          "port": 6379
        }
      }
    }
  }'
```
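One thing I did try to rule out is a mismatch between `vectordb.dimensions` and what the embedding model actually returns, since as far as I can tell mistral-embed produces 1024-dimension vectors. A quick sanity check against the Mistral API directly (assumes `jq` is installed):

```shell
# Call the Mistral embeddings endpoint directly and count the vector length.
# If this does not match vectordb.dimensions, the plugin would fail on every request.
curl -s https://api.mistral.ai/v1/embeddings \
  --header "Content-Type: application/json" \
  --header "Authorization: Bearer MISTRAL_API_KEY" \
  --data '{"model": "mistral-embed", "input": ["Questions about StarWars"]}' \
  | jq '.data[0].embedding | length'
```

For me this prints 1024, which matches the configuration above.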
I configured the service and the route manually in Kong Konnect after creating a hybrid-mode gateway and starting two containers locally: one for Kong and the other for Redis. However, nothing works yet and the plugin only blocks the requests. The full examples can be found in my repo at https://github.com/jesperancinha/kong-test-drives, in the folder kong-test-drives/kong-ai. Can anyone help me with this, and what could I be missing at this point? Please let me know! Cheers!

PS: I do not want to use OpenAI for this example. I specifically want to use Mistral. Thank you!
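For completeness, one more thing I still want to rule out is whether the data plane container can actually resolve and reach the Redis container, since `vectordb.redis.host` is set to the hostname `redis`. A quick check from the host (the container names `kong-dp` and `redis` are from my local setup, and this assumes `nc` is available in the Kong image):

```shell
# Check from inside the Kong data plane container that the Redis hostname
# resolves and the port is open. Container names are from my local setup.
docker exec kong-dp sh -c 'nc -z redis 6379 && echo "redis reachable" || echo "redis NOT reachable"'
```

If `nc` is not present in the image, a DNS-only check such as `getent hosts redis` inside the container tells at least whether the hostname resolves.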