Mastra Tracing
Instrument AI agents and workflows built with Mastra's TypeScript framework
Mastra is an agentic framework that simplifies building complex AI applications with multi-agent workflows, tool integrations, and memory management.
API Key Setup
Before running your application, ensure you have the following environment variables set:
export ARIZE_SPACE_ID="YOUR_ARIZE_SPACE_ID"
export ARIZE_API_KEY="YOUR_ARIZE_API_KEY"
You can find your Arize Space ID and API Key in your Arize Dashboard settings.
Install
Create your Project
If you haven't already, create a project with Mastra:
npm create mastra@latest
# answer the prompts; include the agent, tools, and example when asked
cd chosen-project-name
Install Packages
Install the OpenInference instrumentation package for Mastra, along with the Mastra core library and CLI:
npm install @arizeai/openinference-mastra@^2.2.0 @mastra/core
npm install -D mastra
Setup Tracing
Initialize OpenTelemetry tracing for your Mastra application:
import { Mastra } from "@mastra/core/mastra";
import {
  OpenInferenceOTLPTraceExporter,
  isOpenInferenceSpan,
} from "@arizeai/openinference-mastra";

export const mastra = new Mastra({
  // ... other config (agents, workflows, etc.)
  telemetry: {
    serviceName: "my-mastra-app",
    enabled: true,
    export: {
      type: "custom",
      tracerName: "my-mastra-app",
      exporter: new OpenInferenceOTLPTraceExporter({
        url: "https://otlp.arize.com/v1/traces",
        headers: {
          "space_id": process.env.ARIZE_SPACE_ID, // Set in API Key Setup
          "api_key": process.env.ARIZE_API_KEY, // Set in API Key Setup
        },
        spanFilter: isOpenInferenceSpan,
      }),
    },
  },
});
Run Mastra Example
Create Agents and Tools
From here you can use Mastra as normal. Create agents with tools and run them:
import { openai } from "@ai-sdk/openai";
import { Agent } from "@mastra/core/agent";
import { Mastra } from "@mastra/core/mastra";
import {
  OpenInferenceOTLPTraceExporter,
  isOpenInferenceSpan,
} from "@arizeai/openinference-mastra";
import { z } from "zod";
// Create a simple weather tool
const weatherTool = {
  name: "weatherTool",
  description: "Get current weather for a location",
  parameters: z.object({
    location: z.string().describe("The city and country"),
  }),
  execute: async ({ location }) => {
    // Simulate weather API call
    return {
      location,
      temperature: "22°C",
      condition: "Sunny",
      humidity: "60%",
    };
  },
};

// Create an agent
const weatherAgent = new Agent({
  name: "Weather Assistant",
  instructions: "You help users get weather information. Use the weather tool to get current conditions.",
  model: openai("gpt-4o-mini"),
  tools: { weatherTool },
});

// Register the agent with the Mastra instance
const mastra = new Mastra({
  agents: { weatherAgent },
  telemetry: {
    serviceName: "mastra-weather-agent",
    enabled: true,
    export: {
      type: "custom",
      tracerName: "mastra-weather-agent",
      exporter: new OpenInferenceOTLPTraceExporter({
        url: "https://otlp.arize.com/v1/traces",
        headers: {
          "space_id": process.env.ARIZE_SPACE_ID, // Set in API Key Setup
          "api_key": process.env.ARIZE_API_KEY, // Set in API Key Setup
        },
        spanFilter: isOpenInferenceSpan,
      }),
    },
  },
});
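With the agent registered, you can invoke it directly. The snippet below is a minimal sketch that assumes the mastra instance above; getAgent and generate are standard Mastra APIs, but the exact shape of the returned result may vary by version:

// Retrieve the registered agent and run a single prompt against it
const agent = mastra.getAgent("weatherAgent");
const result = await agent.generate("What's the weather in Tokyo right now?");
console.log(result.text);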
Running Your Application
To start your application with tracing:
# Start the Mastra dev server (required for tracing)
mastra dev
This will:
- Generate OpenTelemetry instrumentation files in .mastra/output/
- Initialize the tracing SDK with your telemetry configuration
- Start the Mastra playground at http://localhost:4111
- Enable trace export to Arize
Interact with your agents:
- Via Playground: Navigate to http://localhost:4111/playground to chat with agents
- Via API: Make requests to the generated API endpoints (a sketch follows this list)
- Programmatically: Create test scripts that run within the mastra dev environment
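For the API route, the request below is only a sketch: it assumes the dev server's default agent endpoint (/api/agents/<agentId>/generate) and a simple messages payload. Check the API reference in the playground for the exact routes and request body in your Mastra version.

// Hypothetical request to the local dev server's generated agent endpoint
const response = await fetch("http://localhost:4111/api/agents/weatherAgent/generate", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ messages: ["What's the weather in Paris?"] }),
});
console.log(await response.json());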
Observe
Now that you have tracing set up, all agent runs, tool calls, and model interactions will be streamed to your Arize Space for observability and evaluation. All traces follow OpenTelemetry standards and include relevant metadata such as model parameters, token usage, execution timing, and error details.