
ai client support for tracing operations, inference instrumentation as part of inference library #37820

Open
wants to merge 2 commits into dargilco/azure-ai-client
Conversation

M-Hietala


Description

Please add an informative description that covers the changes made by the pull request and link all relevant issues.

If an SDK is being regenerated based on a new swagger spec, a link to the pull request containing these swagger spec changes has been included above.

All SDK Contribution checklist:

  • The pull request does not introduce breaking changes.
  • CHANGELOG is updated for new features, bug fixes or other significant changes.
  • I have read the contribution guidelines.

General Guidelines and Best Practices

  • Title of the pull request is clear and informative.
  • There are a small number of commits, each of which has an informative message. This means that previously merged commits do not appear in the history of the PR. For more information on cleaning up the commits in your PR, see this page.

Testing Guidelines

  • Pull request includes test coverage for the included changes.

@github-actions bot added the AI and AI Model Inference (Issues related to the client library for Azure AI Model Inference, \sdk\ai\azure-ai-inference) labels Oct 9, 2024
# )

# Get an authenticated OpenAI client for your default Azure OpenAI connection:
client = ai_client.inference.get_azure_openai_client()
Member

This should be .get_chat_completions_client(), since it looks like you are expecting the azure.ai.inference ChatCompletionsClient to be returned, not the AzureOpenAI client from the openai package.
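
A minimal sketch of the suggested call, assuming the connection-based helper discussed in this PR and the existing azure.ai.inference ChatCompletionsClient surface (the get_chat_completions_client() name follows the reviewer's suggestion and is not confirmed on this branch):

from azure.ai.inference.models import UserMessage

# ai_client is the project AI client from the snippet above, assumed to be
# already constructed and authenticated.
chat_client = ai_client.inference.get_chat_completions_client()

# ChatCompletionsClient.complete() is the standard azure-ai-inference call.
response = chat_client.complete(
    messages=[UserMessage(content="How many feet are in a mile?")]
)
print(response.choices[0].message.content)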


# If you do not specify a semantic conventions version, the default version supported by the current
# Inference SDK will be used.
instrumentor = ai_client.tracing.create_inference_instrumentor()
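
If the returned object follows the instrumentor pattern of azure.ai.inference's AIInferenceInstrumentor, usage might look like the sketch below; the instrument()/uninstrument() methods on the returned object are an assumption about this branch, not a confirmed API:

# Assumed usage of the instrumentor created above; method names mirror the
# azure.ai.inference AIInferenceInstrumentor and are not confirmed for this PR.
instrumentor = ai_client.tracing.create_inference_instrumentor()
instrumentor.instrument()
try:
    # ... make inference calls here; spans and events are emitted according to
    # the semantic conventions version selected above ...
    pass
finally:
    # Remove the tracing hooks when they are no longer needed.
    instrumentor.uninstrument()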
@jhakulin (Member) commented Oct 9, 2024

I wonder if this is an anti-pattern for OTEL instrumentation? @lmolkova

ai_client.tracing.instrument_inference()

do something

ai_client.tracing.uninstrument_inference()
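
A short sketch of that alternative, with the instrument_inference()/uninstrument_inference() names taken from the comment above (they are a proposal in this review, not a shipped API):

from azure.ai.inference.models import UserMessage

# Proposed pattern: the client instruments inference tracing directly, without
# handing an instrumentor object back to the caller.
ai_client.tracing.instrument_inference()
try:
    chat_client = ai_client.inference.get_chat_completions_client()
    response = chat_client.complete(
        messages=[UserMessage(content="Hello")]
    )
    print(response.choices[0].message.content)
finally:
    # Tear the instrumentation back down once tracing is no longer needed.
    ai_client.tracing.uninstrument_inference()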
