By adding this integration, your data may be sent to your custom LLM deployment for certain actions within Arize (e.g., prompt playground) and your account may be billed for usage.
Add the Integration Using the Arize Skills
Add a custom model endpoint integration from your coding agent using the Arize Skills, for example: "Create a custom model integration named 'my-custom-model' with my base URL https://my-hosted-model/openai/v1 and API key F1..."
Add the Integration from Arize AX
Select Custom Model Endpoints from the Providers List

Fill Out Your Integration Details

Enter the base URL including the version path (e.g., https://my-hosted-model/openai/v1). Do not include endpoint paths like /chat/completions, as these are appended automatically. If you need to send any extra headers with every request, set them here. You then need to define the available model deployments: if all of the default OpenAI models are available, you can enable OpenAI default models; otherwise, add your model name as a custom model name.
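To illustrate why the base URL should stop at the version path, here is a minimal sketch of how an OpenAI-compatible client typically derives the full request URL (the base URL and header values are hypothetical placeholders):

```python
def request_url(base_url: str, endpoint: str = "/chat/completions") -> str:
    """Join the configured base URL with the endpoint path the client
    appends automatically for each request."""
    return base_url.rstrip("/") + endpoint

# Correct: base URL ends at the version segment.
print(request_url("https://my-hosted-model/openai/v1"))
# https://my-hosted-model/openai/v1/chat/completions

# Incorrect: including the endpoint path in the base URL duplicates it.
print(request_url("https://my-hosted-model/openai/v1/chat/completions"))
# https://my-hosted-model/openai/v1/chat/completions/chat/completions
```

Because the endpoint path is appended for every call, a base URL that already contains /chat/completions produces a malformed, doubled path.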
Advanced Settings

- Supports Function Calling - Turn this on to allow the integration to use function calling features. This is on by default.
- Authorized Orgs - Configure which organizations have access to this integration.
- Authorized Spaces - Configure which spaces have access to this integration.