
feat(python): openai instrumentator #35

Merged: 51 commits, Jan 11, 2024
Conversation

RogerHYang (Contributor) commented Jan 4, 2024

resolves #1961

@RogerHYang RogerHYang changed the title feat: openai instrumentator feat(python): openai instrumentator Jan 4, 2024
.github/workflows/python-CI.yaml (outdated; resolved)
Comment on lines 1 to 3
"""
Phoenix collector should be running in the background.
"""
Contributor:
Non-blocking. I think what might make sense long term is to have small example apps, each with a README and a project toml. That way we can stack multiple instrumentations together.

Contributor (Author):
Yea, that's a good idea!


resource = Resource(attributes={})
tracer_provider = trace_sdk.TracerProvider(resource=resource)
span_exporter = OTLPSpanExporter(endpoint="http://127.0.0.1:6006/v1/traces")
Contributor:

thought: Let's brainstorm how to maybe just run some of these examples via docker-compose. That way it will be easier to showcase phoenix.

Contributor (Author):

good idea!

cast_to: type,
request_options: Mapping[str, Any],
) -> Iterator[_WithSpan]:
span_kind = _EMBEDDING_SPAN_KIND if cast_to is CreateEmbeddingResponse else _LLM_SPAN_KIND
Contributor:

thought: I get that in this context everything is probably an LLM or an Embedding, but this reads a bit presumptive. Maybe this could live in a function like _get_span_kind_from_type to make it clearer? Falling through to _LLM_SPAN_KIND feels a tad dangerous. I'm guessing the Assistants API doesn't hit this, but it might be better to fall back to unknown?

Contributor (Author) @RogerHYang commented Jan 5, 2024:

Yea, agree. I'll update. On the other hand, maybe the span name should come from a higher-level caller: by the time the request function is called, we don't really know who called it or for what purpose. An alternative is to also monkey-patch those higher-level callers, but instead of tracing them per se, just have them put the real span_name into the context and pass it down to request (and then use the fallback in request if span_name doesn't exist).

python/tox.ini (outdated; resolved)
Development

Successfully merging this pull request may close these issues.

[otel] switch to OTEL for openai
2 participants