Describe your environment.
```python
OPENCENSUS = {
    'TRACE': {
        'SAMPLER': 'opencensus.trace.samplers.ProbabilitySampler(rate=1.0)',
        'EXPORTER': '''opencensus.ext.jaeger.trace_exporter.JaegerExporter(
            service_name='okr',
            agent_host_name='128.0.0.1',
            agent_port=6831,
            transport=opencensus.common.transports.async_.AsyncTransport
        )''',
    }
}
```
Steps to reproduce.
My use case is to trace GraphQL requests that contain many fields (with corresponding resolvers).
What is the expected behavior?
As @reyang suggested, treating max_batch_size as the threshold that triggers the exporter could be a good option.
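A minimal sketch of that idea (illustrative only, not the opencensus API; `chunked`, `emit_batch`, and the threshold value are assumptions): flush spans in bounded batches so no single UDP datagram has to carry an entire large trace.

```python
from itertools import islice

MAX_BATCH_SIZE = 100  # hypothetical threshold, analogous to max_batch_size


def chunked(spans, size):
    """Yield successive lists of at most `size` spans."""
    it = iter(spans)
    while True:
        batch = list(islice(it, size))
        if not batch:
            return
        yield batch


def export(spans, emit_batch):
    # Instead of sending every span of a request in one datagram,
    # emit them in bounded batches so each packet stays small.
    for batch in chunked(spans, MAX_BATCH_SIZE):
        emit_batch(batch)
```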
What is the actual behavior?
```
WARNING Data exceeds the max UDP packet size; size 90445, max 65000
WARNING Data exceeds the max UDP packet size; size 98349, max 65000
```
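For context, a sketch of the kind of size guard that produces this warning (names and structure are illustrative, not the actual exporter code; only the 65000-byte ceiling and the message come from the log above):

```python
import logging
import socket

logger = logging.getLogger(__name__)

UDP_PACKET_MAX_LENGTH = 65000  # limit reported in the warning above


def emit(sock: socket.socket, address, batch_payload: bytes) -> None:
    if len(batch_payload) > UDP_PACKET_MAX_LENGTH:
        # The serialized batch does not fit in one UDP datagram, so it
        # cannot be sent as-is; presumably those spans never reach Jaeger.
        logger.warning(
            "Data exceeds the max UDP packet size; size %d, max %d",
            len(batch_payload),
            UDP_PACKET_MAX_LENGTH,
        )
        return
    sock.sendto(batch_payload, address)
```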
cc @c24t
#522 (comment)
This seems to be fixed in OpenTelemetry by open-telemetry/opentelemetry-python#1500 (noting it here in case anyone else ends up on this issue after web-searching the error).
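For anyone who has since migrated to OpenTelemetry, a configuration sketch for the Jaeger Thrift exporter; the `udp_split_oversized_batches` option name and the host/port values are assumptions here, so verify them against the exporter version you install:

```python
from opentelemetry import trace
from opentelemetry.exporter.jaeger.thrift import JaegerExporter
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor

exporter = JaegerExporter(
    agent_host_name="localhost",
    agent_port=6831,
    # Assumed option: split batches that would exceed the UDP packet limit
    # instead of logging a warning and dropping them.
    udp_split_oversized_batches=True,
)

provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(provider)
```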