feat(python): openai instrumentator #35
Conversation
python/instrumentation/openinference-instrumentation-openai/README.rst
""" | ||
Phoenix collector should be running in the background. | ||
""" |
Non-blocking - I think what might make sense long term is to have small example apps that each have a README and a project toml. That way we can stack multiple instrumentations together.
Yea, that's a good idea!
resource = Resource(attributes={})
tracer_provider = trace_sdk.TracerProvider(resource=resource)
span_exporter = OTLPSpanExporter(endpoint="http://127.0.0.1:6006/v1/traces")
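For context, here is a minimal sketch of how an exporter like this is typically wired into the tracer provider and activated. The span-processor choice and the instrumentor call are assumptions for illustration, not necessarily what this example in the PR does:

```python
from opentelemetry import trace as trace_api
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk import trace as trace_sdk
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace.export import SimpleSpanProcessor

from openinference.instrumentation.openai import OpenAIInstrumentor

resource = Resource(attributes={})
tracer_provider = trace_sdk.TracerProvider(resource=resource)
# Export spans to a locally running Phoenix collector.
span_exporter = OTLPSpanExporter(endpoint="http://127.0.0.1:6006/v1/traces")
tracer_provider.add_span_processor(SimpleSpanProcessor(span_exporter))
trace_api.set_tracer_provider(tracer_provider)

# Patch the OpenAI client so its calls are traced.
OpenAIInstrumentor().instrument()
```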
thought: Let's brainstorm how we might run some of these examples via docker-compose. That way it will be easier to showcase Phoenix.
good idea!
python/instrumentation/openinference-instrumentation-openai/integration_tests/completions.py
    cast_to: type,
    request_options: Mapping[str, Any],
) -> Iterator[_WithSpan]:
    span_kind = _EMBEDDING_SPAN_KIND if cast_to is CreateEmbeddingResponse else _LLM_SPAN_KIND
thought: I get that in this context everything is probably an LLM or embedding call, but this reads a bit presumptuous. Maybe this could live in a function like _get_span_kind_from_type
or something similar that makes it clearer? Falling through to _LLM_SPAN_KIND feels a tad dangerous - I'm guessing the Assistants API doesn't hit this, but it might be better to fall back to unknown?
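A rough sketch of what such a helper might look like. The _UNKNOWN_SPAN_KIND constant, the locally defined string values, and the extra response types checked here are assumptions for illustration, not what the PR actually implements:

```python
from openai.types import Completion, CreateEmbeddingResponse
from openai.types.chat import ChatCompletion

_EMBEDDING_SPAN_KIND = "EMBEDDING"
_LLM_SPAN_KIND = "LLM"
_UNKNOWN_SPAN_KIND = "UNKNOWN"  # hypothetical fallback constant


def _get_span_kind_from_type(cast_to: type) -> str:
    # Map the response type the client asked for to a span kind,
    # falling back to UNKNOWN instead of assuming everything else is an LLM call.
    if cast_to is CreateEmbeddingResponse:
        return _EMBEDDING_SPAN_KIND
    if cast_to in (ChatCompletion, Completion):
        return _LLM_SPAN_KIND
    return _UNKNOWN_SPAN_KIND
```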
Yeah, agree, I'll update. On the other hand, maybe the span name should come from a higher-level caller: by the time the request function is called, we don't really know who called it or for what purpose. An alternative is to also monkey-patch those higher-level callers, but instead of tracing them per se, just have them put the real span_name into the context and pass it down to request (and then use the fallback in request if span_name doesn't exist).
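A rough sketch of that idea using the OpenTelemetry context API; the context key name and the default span name are made up here for illustration:

```python
from opentelemetry import context as context_api

_SPAN_NAME_KEY = "openinference_span_name"  # hypothetical context key


def _set_span_name(name: str) -> object:
    # Called by a patched higher-level method (e.g. chat.completions.create)
    # before it delegates to request(); returns a token to detach later.
    return context_api.attach(context_api.set_value(_SPAN_NAME_KEY, name))


def _get_span_name(default: str = "openai.request") -> str:
    # Called inside the patched request(); uses the fallback when no
    # higher-level caller has put a name into the context.
    name = context_api.get_value(_SPAN_NAME_KEY)
    return name if isinstance(name, str) else default
```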
resolves #1961