To integrate Raindrop with the Vercel AI SDK, you’ll complete two steps:
  1. Configure an OpenTelemetry (OTEL) trace exporter (instructions differ for Next.js and Node.js).
  2. Instrument your Vercel AI SDK calls and attach Raindrop metadata (common to both).

Setting up OpenTelemetry in Next.js

First, install the required OpenTelemetry packages.
npm install raindrop-ai @opentelemetry/api @vercel/otel
Then, register the OpenTelemetry tracing exporter in your instrumentation.ts file.
// instrumentation.ts
import { registerOTel, OTLPHttpProtoTraceExporter } from '@vercel/otel';
import { BatchSpanProcessor } from '@opentelemetry/sdk-trace-base';

export function register() {
  registerOTel({
    serviceName: 'ai-chatbot',
    spanProcessors: [
      new BatchSpanProcessor(
        new OTLPHttpProtoTraceExporter({
          url: 'https://api.raindrop.ai/v1/traces',
          headers: {
            'Authorization': `Bearer ${process.env.RAINDROP_API_KEY}`,
          },
        }),
      ),
    ],
  });
}
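The exporter above reads your API key from the environment. In Next.js you might keep it in a local env file (a sketch, assuming you store secrets in `.env.local`; the placeholder value is hypothetical):

```shell
# .env.local — keep this file out of version control
RAINDROP_API_KEY=your-raindrop-api-key
```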

Setting up OpenTelemetry in Node.js

For Node.js applications, first install the required OpenTelemetry packages.
npm install raindrop-ai @opentelemetry/api @opentelemetry/sdk-node @opentelemetry/resources @opentelemetry/semantic-conventions @opentelemetry/sdk-trace-base @opentelemetry/exporter-trace-otlp-proto @opentelemetry/sdk-trace-node
Then, configure the OpenTelemetry SDK:
import { NodeSDK } from '@opentelemetry/sdk-node';
import { BatchSpanProcessor } from '@opentelemetry/sdk-trace-node';
import { ATTR_SERVICE_NAME } from '@opentelemetry/semantic-conventions';
import { OTLPTraceExporter } from '@opentelemetry/exporter-trace-otlp-proto';
import { resourceFromAttributes } from '@opentelemetry/resources';

const sdk = new NodeSDK({
  resource: resourceFromAttributes({
    [ATTR_SERVICE_NAME]: 'ai-chatbot',
  }),
  spanProcessors: [
    new BatchSpanProcessor(
      new OTLPTraceExporter({
        url: 'https://api.raindrop.ai/v1/traces',
        headers: {
          'Authorization': `Bearer ${process.env.RAINDROP_API_KEY}`,
        },
      })
    ),
  ],
});

sdk.start();
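Because the BatchSpanProcessor exports spans in batches, a short-lived script can exit before its spans are flushed. You may want to shut the SDK down on exit; a minimal sketch, assuming the `sdk` instance defined above:

```typescript
// Flush any buffered spans before the process exits.
// Assumes the `sdk` NodeSDK instance created above.
process.on('SIGTERM', () => {
  sdk
    .shutdown() // flushes pending spans, then stops the SDK
    .catch((err) => console.error('Error shutting down OpenTelemetry', err))
    .finally(() => process.exit(0));
});
```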

Instrumenting AI SDK Calls

To instrument your AI SDK calls:
  1. Enable experimental_telemetry: { isEnabled: true } at every AI SDK call site.
  2. Attach Raindrop metadata with raindrop.metadata() at the top-level call only, i.e. the one that handles the user's input and produces the final output.
import { generateText, tool } from 'ai';
import { openai } from '@ai-sdk/openai';
import { z } from 'zod';
import raindrop from 'raindrop-ai/otel';

const enhanceStory = tool({
  description: 'Enhance a story with additional details',
  parameters: z.object({
    story: z.string().describe('The story to enhance'),
  }),
  execute: async ({ story }) => {
    // This nested call only needs isEnabled: true, no metadata
    const enhanced = await generateText({
      model: openai('gpt-4o'),
      prompt: `Enhance this story with more vivid details: ${story}`,
      experimental_telemetry: {
        isEnabled: true, // Required at all call sites
        functionId: 'enhance-story',
      },
    });
    return { enhancedStory: enhanced.text };
  },
});

const result = await generateText({
  model: openai('gpt-4o'),
  prompt: 'Write a short story about a cat.',
  tools: {
    enhanceStory,
  },
  experimental_telemetry: {
    isEnabled: true, // Required
    functionId: 'generate-text',
    metadata: {
      ...raindrop.metadata({
        userId: 'user_123', // Required
        eventName: 'story_generation',
        convoId: 'convo_123',
      }),
    },
  },
});
That’s it! You’re ready to explore your events in the Raindrop dashboard. Ping us on Slack or email us if you get stuck!