The API endpoint URL for your chosen AI provider.

Ollama

For Ollama, specify the base URL where your Ollama service is running:
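For example, a default local Ollama installation listens on port 11434, so the base URL would be:

```
http://localhost:11434
```

Adjust the host and port if your Ollama service runs elsewhere (for example, in a container or on a separate host reachable from Jenkins).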

Note: Ensure the Ollama service is running and accessible from your Jenkins instance.

OpenAI

Leave this field empty. LangChain4j automatically uses the official OpenAI API endpoint.

Only specify a custom URL if you're using an OpenAI-compatible service or proxy.
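For example, many OpenAI-compatible servers expose their API under a `/v1` path; the host and port below are illustrative, not a recommendation:

```
http://localhost:8000/v1
```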

Google Gemini

Leave this field empty. LangChain4j automatically uses the official Google AI API endpoint.

Only specify a custom URL if you're using a custom Google AI proxy.

For standard cloud providers (OpenAI, Google), LangChain4j handles the endpoint configuration automatically.