
panic when starting with tracing config for jaeger #5943

Closed

mateuszdrab opened this issue Dec 7, 2022 · 1 comment

@mateuszdrab

Thanos, Prometheus and Golang version used: thanos:0.29.0

Object Storage Provider: Minio

What happened:
Thanos crashes with a segmentation fault when an API request arrives shortly after startup.

What you expected to happen:
Thanos starts and reports traces to Tempo using the Jaeger format.

How to reproduce it (as minimally and precisely as possible):
Basic tracing settings are configured via an argument flag on the pod:

- |
  --tracing.config=type: JAEGER
  config:
    service_name: "thanos"
    endpoint: "http://tempo-distributor.tempo:14268/api/traces"

Full logs to relevant components:


level=debug ts=2022-12-06T22:47:20.058736343Z caller=main.go:67 msg="maxprocs: Leaving GOMAXPROCS=[4]: CPU quota undefined"
level=info ts=2022-12-06T22:47:20.085074223Z caller=factory.go:43 msg="loading tracing configuration"
level=info ts=2022-12-06T22:47:20.086482782Z caller=client.go:56 msg="enabling client to server TLS"
level=info ts=2022-12-06T22:47:20.086785261Z caller=options.go:114 msg="TLS client using provided certificate pool"
level=info ts=2022-12-06T22:47:20.086854475Z caller=options.go:147 msg="TLS client authentication enabled"
level=debug ts=2022-12-06T22:47:20.087886643Z caller=engine.go:347 msg="Lookback delta is zero, setting to default value" value=5m0s
level=info ts=2022-12-06T22:47:20.093106817Z caller=options.go:26 protocol=gRPC msg="disabled TLS, key and cert must be set to enable"
level=info ts=2022-12-06T22:47:20.094581885Z caller=query.go:776 msg="starting query node"
level=info ts=2022-12-06T22:47:20.095158803Z caller=intrumentation.go:56 msg="changing probe status" status=ready
level=debug ts=2022-12-06T22:47:20.095577726Z caller=endpointset.go:309 component=endpointset msg="starting to update API endpoints" cachedEndpoints=0
level=debug ts=2022-12-06T22:47:20.096116061Z caller=endpointset.go:384 component=endpointset msg="updated endpoints" activeEndpoints=0
level=info ts=2022-12-06T22:47:20.096911999Z caller=intrumentation.go:75 msg="changing probe status" status=healthy
level=info ts=2022-12-06T22:47:20.098377739Z caller=grpc.go:131 service=gRPC/server component=query msg="listening for serving gRPC" address=0.0.0.0:10901
level=info ts=2022-12-06T22:47:20.09864108Z caller=http.go:73 service=http/server component=query msg="listening for requests and metrics" address=0.0.0.0:10902
level=info ts=2022-12-06T22:47:20.099325109Z caller=tls_config.go:195 service=http/server component=query msg="TLS is disabled." http2=false
level=debug ts=2022-12-06T22:47:25.148605711Z caller=endpointset.go:309 component=endpointset msg="starting to update API endpoints" cachedEndpoints=0
panic: runtime error: invalid memory address or nil pointer dereference
[signal SIGSEGV: segmentation violation code=0x1 addr=0x20 pc=0x143d0e8]

goroutine 86 [running]:
go.opentelemetry.io/otel/sdk/trace.parentBased.ShouldSample({{0x0, 0x0}, {{0x270e3f0, 0x3a18158}, {0x270e3c8, 0x3a18158}, {0x270e3f0, 0x3a18158}, {0x270e3c8, 0x3a18158}}}, ...)
    /go/pkg/mod/go.opentelemetry.io/otel/sdk@v1.10.0/trace/sampling.go:281 +0x1c8
github.com/thanos-io/thanos/pkg/tracing/migration.samplerWithOverride.ShouldSample(...)
    /app/pkg/tracing/migration/sampler.go:42
go.opentelemetry.io/otel/sdk/trace.(*tracer).newSpan(0xc00081f040, {0x2719670, 0xc0006bafc0}, {0xc0002c40d8, 0x16}, 0xc0006e3480)
    /go/pkg/mod/go.opentelemetry.io/otel/sdk@v1.10.0/trace/tracer.go:95 +0x456
go.opentelemetry.io/otel/sdk/trace.(*tracer).Start(0xc00081f040, {0x2719670?, 0xc0006bafc0?}, {0xc0002c40d8, 0x16}, {0xc0006b86c0?, 0x4?, 0xc0006e35b0?})
    /go/pkg/mod/go.opentelemetry.io/otel/sdk@v1.10.0/trace/tracer.go:52 +0x153
go.opentelemetry.io/otel/bridge/opentracing.(*WrapperTracer).Start(0xc000863620, {0x2719670?, 0xc0006bafc0?}, {0xc0002c40d8?, 0x1eae540?}, {0xc0006b86c0?, 0x1?, 0xc000117790?})
    /go/pkg/mod/go.opentelemetry.io/otel/bridge/opentracing@v1.10.0/wrapper.go:79 +0x4b
go.opentelemetry.io/otel/bridge/opentracing.(*BridgeTracer).StartSpan(0xc000784fc0, {0xc0002c40d8, 0x16}, {0xc0006baf60, 0x3, 0x3a18158?})
    /go/pkg/mod/go.opentelemetry.io/otel/bridge/opentracing@v1.10.0/bridge.go:430 +0x3f5
github.com/thanos-io/thanos/pkg/tracing/migration.(*bridgeTracerWrapper).StartSpan(0x2009940?, {0xc0002c40d8?, 0x3a18158?}, {0xc0006baf60?, 0x39acd40?, 0xc0006e3828?})
    /app/pkg/tracing/migration/bridge.go:89 +0x26
github.com/grpc-ecosystem/go-grpc-middleware/v2/interceptors/tracing.newClientSpanFromContext({0x2719638, 0xc0006b45a0}, {0x2714460, 0xc0008440f0}, {0xc0002c40d8, 0x16})
    /go/pkg/mod/github.com/grpc-ecosystem/go-grpc-middleware/v2@v2.0.0-rc.2.0.20201207153454-9f6bf00c00a7/interceptors/tracing/client.go:92 +0x245
github.com/grpc-ecosystem/go-grpc-middleware/v2/interceptors/tracing.(*opentracingClientReportable).ClientReporter(0xc0008638d8, {0x2719638, 0xc0006b45a0}, {0x0?, 0x0?}, {0x220b8e2, 0x5}, {0x22309c5, 0x10}, {0x22309d6, ...})
    /go/pkg/mod/github.com/grpc-ecosystem/go-grpc-middleware/v2@v2.0.0-rc.2.0.20201207153454-9f6bf00c00a7/interceptors/tracing/client.go:51 +0x127
github.com/grpc-ecosystem/go-grpc-middleware/v2/interceptors.UnaryClientInterceptor.func1({0x2719638, 0xc0006b45a0}, {0x22309c4, 0x16}, {0x21122a0, 0x3a18158}, {0x21123e0, 0xc0006b46c0}, 0xc0002903c0?, 0x22f4818, ...)
    /go/pkg/mod/github.com/grpc-ecosystem/go-grpc-middleware/v2@v2.0.0-rc.2.0.20201207153454-9f6bf00c00a7/interceptors/client.go:19 +0x195
github.com/grpc-ecosystem/go-grpc-middleware/v2.ChainUnaryClient.func1.1.1({0x2719638?, 0xc0006b45a0?}, {0x22309c4?, 0x38?}, {0x21122a0?, 0x3a18158?}, {0x21123e0?, 0xc0006b46c0?}, 0x0?, {0xc0001255a0, ...})
    /go/pkg/mod/github.com/grpc-ecosystem/go-grpc-middleware/v2@v2.0.0-rc.2.0.20201207153454-9f6bf00c00a7/chain.go:74 +0x86
github.com/grpc-ecosystem/go-grpc-prometheus.(*ClientMetrics).UnaryClientInterceptor.func1({0x2719638, 0xc0006b45a0}, {0x22309c4, 0x16}, {0x21122a0, 0x3a18158}, {0x21123e0, 0xc0006b46c0}, 0x8?, 0xc00011c348, ...)
    /go/pkg/mod/github.com/grpc-ecosystem/go-grpc-prometheus@v1.2.0/client_metrics.go:112 +0x117
github.com/grpc-ecosystem/go-grpc-middleware/v2.ChainUnaryClient.func1.1.1({0x2719638?, 0xc0006b45a0?}, {0x22309c4?, 0x203000?}, {0x21122a0?, 0x3a18158?}, {0x21123e0?, 0xc0006b46c0?}, 0x1?, {0xc0001255a0, ...})
    /go/pkg/mod/github.com/grpc-ecosystem/go-grpc-middleware/v2@v2.0.0-rc.2.0.20201207153454-9f6bf00c00a7/chain.go:74 +0x86
github.com/grpc-ecosystem/go-grpc-middleware/v2.ChainUnaryClient.func1({0x2719638, 0xc0006b45a0}, {0x22309c4, 0x16}, {0x21122a0, 0x3a18158}, {0x21123e0, 0xc0006b46c0}, 0x0?, 0x22f4818, ...)
    /go/pkg/mod/github.com/grpc-ecosystem/go-grpc-middleware/v2@v2.0.0-rc.2.0.20201207153454-9f6bf00c00a7/chain.go:83 +0x157
google.golang.org/grpc.(*ClientConn).Invoke(0xc000798000?, {0x2719638?, 0xc0006b45a0?}, {0x22309c4?, 0x16?}, {0x21122a0?, 0x3a18158?}, {0x21123e0?, 0xc0006b46c0?}, {0xc0001ea7d0, ...})
    /go/pkg/mod/google.golang.org/grpc@v1.45.0/call.go:35 +0x223
github.com/thanos-io/thanos/pkg/info/infopb.(*infoClient).Info(0xc0002ba1b8, {0x2719638, 0xc0006b45a0}, 0x8?, {0xc0001ea7d0, 0x1, 0x1})
    /app/pkg/info/infopb/rpc.pb.go:422 +0xc9
github.com/thanos-io/thanos/pkg/query.(*endpointRef).Metadata(0xc00061c180, {0x2719638, 0xc0006b45a0}, {0x2700800, 0xc0002ba1b8}, {0x271a8d0, 0xc0002ba478})
    /app/pkg/query/endpointset.go:61 +0xe3
github.com/thanos-io/thanos/pkg/query.(*EndpointSet).updateEndpoint(0xc0008a6500, {0x2719638, 0xc0006b45a0}, 0xc00011c2d0, 0xc00061c180)
    /app/pkg/query/endpointset.go:409 +0x105
github.com/thanos-io/thanos/pkg/query.(*EndpointSet).Update.func2(0xc00011c2d0)
    /app/pkg/query/endpointset.go:349 +0x2cb
created by github.com/thanos-io/thanos/pkg/query.(*EndpointSet).Update
    /app/pkg/query/endpointset.go:338 +0x60a
Stream closed EOF for thanos/thanos-query-5db87c6989-lvvl5 (query)
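
The first argument printed for parentBased.ShouldSample, {{0x0, 0x0}, ...}, shows the root sampler field of the OpenTelemetry SDK's parentBased struct holding a nil interface: a span started without a sampled parent falls through to that nil root and dereferences it at sampling.go:281. A minimal sketch of that failure mode against go.opentelemetry.io/otel/sdk v1.10.0 (an illustrative repro, not the code path Thanos actually uses to build its sampler):

package main

import (
	"context"

	sdktrace "go.opentelemetry.io/otel/sdk/trace"
)

func main() {
	// A ParentBased sampler whose root sampler is nil: the zero-valued
	// first field visible in the stack trace above. (Hypothetical repro.)
	sampler := sdktrace.ParentBased(nil)
	tp := sdktrace.NewTracerProvider(sdktrace.WithSampler(sampler))

	// A root span has no sampled parent, so parentBased.ShouldSample
	// delegates to the nil root sampler and panics with the same nil
	// pointer dereference reported in the logs.
	_, span := tp.Tracer("repro").Start(context.Background(), "will-panic")
	defer span.End()
}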

Anything else we need to know:

The above issue does not occur when OTLP gRPC tracing is configured instead:

- |
  --tracing.config=type: OTLP
  config:
    client_type: "grpc"
    insecure: true
    endpoint: "tempo-distributor.tempo:4317"
@yeya24 (Contributor) commented Dec 7, 2022

Duplicate of #5872

yeya24 marked this as a duplicate of #5872 on Dec 7, 2022
yeya24 closed this as completed on Dec 7, 2022