Add your custom model endpoints to begin accessing your models in Arize's prompt playground. Arize uses the OpenAI client to make calls to these endpoints.
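Since the calls go through the OpenAI client, a custom endpoint is expected to accept OpenAI-style chat completions requests. The sketch below, built with only the standard library, shows roughly what such a request could look like; the base URL, API key, and model name are placeholders, not values from Arize.

```python
# Hypothetical sketch of an OpenAI-compatible chat completions request to a
# custom deployment. All values below are placeholders for illustration.
import json
import urllib.request

BASE_URL = "https://my-hosted-model/openai/v1"  # placeholder deployment URL
API_KEY = "sk-example"                          # placeholder API key

payload = {
    "model": "my-custom-model",  # hypothetical deployment name
    "messages": [{"role": "user", "content": "Hello!"}],
}

request = urllib.request.Request(
    url=f"{BASE_URL}/chat/completions",  # endpoint path appended to the base URL
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
    method="POST",
)
# urllib.request.urlopen(request) would actually send the call; omitted here.
```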
By adding this integration, your data may be sent to your custom LLM deployment for certain actions within Arize (e.g., prompt playground) and your account may be billed for usage.

Add the Integration Using the Arize Skills

Add a custom model endpoint integration from your coding agent using the Arize Skills:

Create a custom model integration named 'my-custom-model' with my base URL https://my-hosted-model/openai/v1 and API key F1...

Add the Integration from Arize AX

Select Custom Model Endpoints from the Providers List

[Screenshot: The AI Provider integrations tab]

Fill Out Your Integration Details

[Screenshot: The AI integration details tab]
Give your integration a name, then set the base URL and API key from your deployment.
Enter the base URL including the version path (e.g., https://my-hosted-model/openai/v1). Do not include endpoint paths like /chat/completions as these are appended automatically.
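The base URL rule above can be sketched as a small helper: the version path stays in the configured URL, while endpoint paths such as /chat/completions are joined on afterwards (the helper and URLs here are illustrative, not Arize internals).

```python
# Illustrative sketch: the configured base URL keeps the version path, and
# endpoint paths like /chat/completions are appended automatically.
def build_endpoint(base_url: str, endpoint: str = "/chat/completions") -> str:
    """Join a configured base URL with an OpenAI-style endpoint path."""
    return base_url.rstrip("/") + endpoint

# Correct configuration: version path included, endpoint path left off.
url = build_endpoint("https://my-hosted-model/openai/v1")
```

A trailing slash on the base URL is stripped before joining, so both forms of the configured URL resolve to the same endpoint.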
If you need to send extra headers with every request, set them here. Next, define the available model deployments: if your endpoint serves all of the default OpenAI models, enable OpenAI default models; otherwise, add each model name as a custom model name.
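Conceptually, extra headers are layered on top of the standard request headers for every call. The merge below is a hedged sketch of that behavior; the header names and values are hypothetical, not ones Arize requires.

```python
# Hypothetical sketch: per-integration extra headers merged into the default
# headers that accompany every request. Names and values are illustrative.
def merge_headers(api_key: str, extra: dict) -> dict:
    base = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    # Extra headers are applied on top of the defaults for each request.
    return {**base, **extra}

headers = merge_headers("sk-example", {"X-Org-Id": "acme"})  # placeholder values
```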

Advanced Settings

[Screenshot: The AI integration details tab]
Expand the Advanced Settings section to configure the following:
  • Supports Function Calling - Turn this on to allow the integration to use function calling features. This is on by default.
  • Authorized Orgs - Configure which organizations have access to this integration.
  • Authorized Spaces - Configure which spaces have access to this integration.
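When Supports Function Calling is enabled, requests from the prompt playground may carry an OpenAI-style tools array, so the custom endpoint needs to accept that field. The payload below is an illustrative sketch of such a request, not Arize's actual payload; the tool definition is hypothetical.

```python
# Hedged sketch: an OpenAI-style request body with a tools array, as a
# function-calling-capable endpoint would need to accept. The tool below
# (get_weather) is hypothetical and purely illustrative.
payload = {
    "model": "my-custom-model",  # hypothetical deployment name
    "messages": [{"role": "user", "content": "What's the weather in Paris?"}],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",  # hypothetical tool
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ],
}
```

If your deployment cannot handle the tools field, leave Supports Function Calling off so the playground does not send it.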

Add the Integration

Finally, check the box to agree to the terms and conditions, then select the Add Integration button to create the integration.