Issue: Can't get any traces using OTel with @ai/sdk + Sentry #1459

Open
cacoos opened this issue Jan 23, 2025 · 0 comments
cacoos commented Jan 23, 2025

Hey!

I'm trying to set up OTel for tracing with LangSmith in Next.js, using the Vercel AI SDK and Sentry.

I followed the guide here: https://docs.smith.langchain.com/observability/how_to_guides/tracing/trace_with_vercel_ai_sdk#sentry but a couple of things in it are outdated (and so are the Sentry docs...)

Specifically this snippet:

client?.traceProvider?.addSpanProcessor(
  new BatchSpanProcessor(new AISDKExporter())
);

client doesn't have traceProvider 🤔

I've set it up like this:

// This file configures the initialization of Sentry on the client.
// The config you add here will be used whenever a user loads a page in their browser.
// https://docs.sentry.io/platforms/javascript/guides/nextjs/

import * as Sentry from "@sentry/nextjs";
import {
  SentryPropagator,
  SentrySampler,
  SentrySpanProcessor,
} from "@sentry/opentelemetry";
import {
  BasicTracerProvider,
  BatchSpanProcessor,
} from "@opentelemetry/sdk-trace-base";
import { AISDKExporter } from "langsmith/vercel";

const client = Sentry.init({
  dsn: "[DSN]",

  // Skip OpenTelemetry setup so the provider can be wired up manually below
  skipOpenTelemetrySetup: true,

  // Define how likely traces are sampled
  tracesSampleRate: 1,

  // Setting this option to true will print useful information to the console while you're setting up Sentry.
  debug: false,
});

const provider = new BasicTracerProvider({
  sampler: new SentrySampler(client!),
  spanProcessors: [
    new SentrySpanProcessor(),
    new BatchSpanProcessor(new AISDKExporter()),
  ],
});

provider.register({
  propagator: new SentryPropagator(),
  contextManager: new Sentry.SentryContextManager(),
});
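
For context, the AI SDK calls themselves opt into telemetry with the exporter's settings, roughly like this (a simplified sketch rather than my exact call site; the message content is a placeholder):

import { streamText } from "ai";
import { anthropic } from "@ai-sdk/anthropic";
import { AISDKExporter } from "langsmith/vercel";

// Simplified sketch of an instrumented call, following the LangSmith guide
const result = streamText({
  model: anthropic("claude-3-5-sonnet-20240620"),
  messages: [{ role: "user", content: "..." }],
  // Attach LangSmith's telemetry settings so the spans carry its metadata
  experimental_telemetry: AISDKExporter.getSettings(),
});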

Buuut it doesn't work: LangSmith doesn't show anything. Any suggestions? Thank you!

[edit]

These are the logs after activating debug inside AISDKExporter:

[ai.streamText] 7072b281-18d7-5f7f-a395-203f0e397cb4 {
  run_type: 'llm',
  name: 'anthropic.messages',
  inputs: { messages: [ [Object], [Object] ] },
  outputs: { llm_output: { type: 'ai', data: [Object], token_usage: [Object] } },
  events: [],
  extra: {
    batch_size: 1,
    metadata: {
      ls_provider: 'anthropic',
      ls_model_type: 'messages',
      ls_model_name: 'claude-3-5-sonnet-20240620',
      'ai.operationId': 'ai.streamText'
    }
  },
  session_name: 'development',
  start_time: 1737672793834,
  end_time: 1737672796511
}
[ai.streamText.doStream] ba66cb0e-198d-5880-92a4-8b6677870f84 {
  run_type: 'llm',
  name: 'anthropic.messages',
  inputs: { messages: [ [Object], [Object] ] },
  outputs: { llm_output: { type: 'ai', data: [Object], token_usage: [Object] } },
  events: [ { name: 'new_token', time: 1737672794826 } ],
  extra: {
    batch_size: 1,
    metadata: {
      ls_provider: 'anthropic',
      ls_model_type: 'messages',
      ls_model_name: 'claude-3-5-sonnet-20240620',
      'ai.operationId': 'ai.streamText.doStream'
    }
  },
  session_name: 'development',
  start_time: 1737672793836,
  end_time: 1737672796509
}
[POST] 623ff1fc-31b9-53c3-a49e-6ad4d9f093dc undefined
[start response] c6b3e1df-861d-5da8-a8a0-409ee882e047 undefined
[POST /api/trpc/websites.createWebsiteVersion?batch=1] 50543616-472f-5c59-a4e9-10e629207eb6 undefined
[POST] b19317a0-cc03-58ea-8ced-d8733150a515 undefined
[POST /api/trpc/[trpc]/route] fd93e411-1a12-5736-885a-f50a07cd6e10 undefined
[resolve page components] ca2ec062-a1d4-546c-bbd8-df0d3080a4bd undefined
[executing api route (app) /api/trpc/[trpc]/route] 9e6e30e8-a032-52bc-9a7f-a3da02f7575c undefined
[start response] cd11396e-2a9d-53d4-8cbd-7ca13cd5984b undefined
[GET /api/trpc/websites.getWebsite?batch=1&input=%7B%220%22%3A%7B%22json%22%3A%7B%22websiteSlug%22%3A%22da%22%7D%7D%7D] 04d2321a-7cc2-5339-b98d-e6291adc9e23 undefined
[resolve page components] e4c65e1a-d6a4-5d1f-b90b-ac5742a3c357 undefined
[executing api route (app) /api/trpc/[trpc]/route] f074bb9b-5a1a-5eab-80c6-5eeab0af1976 undefined
[start response] 9397ae37-1097-5dc7-948d-d07589dd8f4a undefined
[2025-01-23T22:53:17.069Z] [LangSmith] sampled runs to be sent to LangSmith []
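
One thing I'm not sure about is whether the BatchSpanProcessor ever gets a chance to flush before the serverless function is suspended. Roughly what I mean (a sketch only; the import path and route body are placeholders, and provider is the BasicTracerProvider from the config above):

// app/api/chat/route.ts (sketch)
import { provider } from "@/otel"; // hypothetical module that exports the BasicTracerProvider above

export async function POST(req: Request) {
  // ... run streamText and build the streaming response here ...
  const response = new Response("ok"); // placeholder for the real response

  // Flush pending spans from the BatchSpanProcessor so the AISDKExporter
  // actually sends them before the function is frozen
  await provider.forceFlush();

  return response;
}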