The API endpoint URL for your chosen AI provider.
For Ollama, specify the base URL where your Ollama service is running:
Examples:
- Local Ollama: http://localhost:11434
- Remote Ollama: http://YOUR_OLLAMA_HOST:11434

Note: Ensure the Ollama service is running and accessible from your Jenkins instance.
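A quick way to verify reachability is to query Ollama's model-list endpoint from the Jenkins controller (or agent) before saving the configuration. This is a sketch: /api/tags and the default port 11434 are Ollama's, but the URL below is a placeholder you should replace with your own.

```shell
#!/bin/sh
# Placeholder value: adjust host/port to match your Ollama deployment.
OLLAMA_URL="http://localhost:11434"

# Ollama exposes GET /api/tags, which lists installed models.
# A successful response confirms the service is up and reachable.
if curl -fsS "${OLLAMA_URL}/api/tags" > /dev/null; then
  echo "Ollama reachable at ${OLLAMA_URL}"
else
  echo "Ollama NOT reachable at ${OLLAMA_URL}" >&2
fi
```

If the check fails, confirm the Ollama service is running and that no firewall blocks the port between Jenkins and the Ollama host.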
Leave this field empty. LangChain4j automatically uses the official OpenAI API endpoint.
Only specify a custom URL if you're using an OpenAI-compatible service or proxy.
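If you do use an OpenAI-compatible service, the value is that service's base URL. For example (hypothetical host; many OpenAI-compatible servers expose the API under a /v1 path, so check your service's documentation):

```
http://my-openai-proxy.internal:8080/v1
```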
Leave this field empty. LangChain4j automatically uses the official Google AI API endpoint.
Only specify a custom URL if you're using a custom Google AI proxy.
For standard cloud providers (OpenAI, Google), LangChain4j handles the endpoint configuration automatically.