Feature/improve llm logs #811


Merged: 5 commits merged into develop on Oct 29, 2024

Conversation

@schuellc-nvidia (Collaborator) commented Oct 18, 2024

Description

This adds a unique hash and the name of the prompt template used to each LLM request/response pair in the logs, so that the two can be correlated when multiple requests overlap.
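The correlation scheme described above can be sketched roughly as follows. This is a minimal illustration, not the actual implementation from the PR; the function name `log_llm_call` and the template name used in the example are hypothetical.

```python
import logging
import uuid

log = logging.getLogger("llm_calls")


def log_llm_call(prompt_template_name: str, prompt: str, completion: str) -> str:
    """Log a request/response pair tagged with a unique id and the template name.

    Hypothetical sketch: the short random hash lets a reader pair up the
    request and response lines even when several calls interleave in the log.
    """
    call_id = uuid.uuid4().hex[:8]
    log.info("[%s] Prompt (%s): %s", call_id, prompt_template_name, prompt)
    log.info("[%s] Completion (%s): %s", call_id, prompt_template_name, completion)
    return call_id
```

With this tagging, grepping the log for one `call_id` yields exactly one request/response pair, regardless of how many calls ran concurrently.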

Checklist

  • I've read the CONTRIBUTING guidelines.
  • I've updated the documentation if applicable.
  • I've added tests if applicable.
  • @mentions of the person or team responsible for reviewing proposed changes.

@Pouyanpi force-pushed the feature/improve_llm_logs branch from e7d2c0a to c632ebb on October 18, 2024
@schuellc-nvidia self-assigned this on Oct 18, 2024
@sklinglernv (Collaborator) left a comment

This is a great addition to how we output LLM call information. Looks all good to me. Thanks Christian!

@schuellc-nvidia merged commit 3e1833b into develop on Oct 29, 2024
4 checks passed

3 participants