subscriber channel is taking too long to respond with beyla 2.8.5 #2438

@esara

What's wrong?

We recently upgraded to 2.8.5 and now see these warning messages roughly every 10 seconds in the Beyla logs, emitted from
https://github.com/open-telemetry/opentelemetry-ebpf-instrumentation/blob/main/pkg/pipe/msg/queue.go#L158
followed eventually by a panic at
https://github.com/open-telemetry/opentelemetry-ebpf-instrumentation/blob/main/pkg/pipe/msg/queue.go#L176
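
For reference, here is a minimal, self-contained sketch of the warn-then-panic send pattern that the linked queue.go lines appear to implement. The function name, retry budget, and timeout below are illustrative assumptions for the demo, not OBI's actual code:

```go
package main

import (
	"fmt"
	"log/slog"
	"time"
)

// sendOrPanic is a hypothetical stand-in for Queue.chainedSend: it retries a
// blocking send on a bounded channel, warns on every timeout, and panics once
// the retry budget is exhausted and the subscriber still has not drained.
func sendOrPanic[T any](out chan T, v T, timeout time.Duration, maxRetries int, path, dst string) {
	for retries := 0; ; retries++ {
		select {
		case out <- v:
			return
		case <-time.After(timeout):
			slog.Warn("subscriber channel is taking too long to respond",
				"timeout", timeout, "queueLen", len(out), "queueCap", cap(out),
				"sendPath", path, "dstName", dst)
			if retries >= maxRetries {
				panic(fmt.Sprintf(
					"sending through queue path %s. Subscriber channel %s is blocked",
					path, dst))
			}
		}
	}
}

func main() {
	q := make(chan int, 10)
	for i := 0; i < 10; i++ {
		q <- i // fill the buffer; nothing ever reads from q
	}
	// Warns once per timeout, then panics, mirroring the report below.
	sendOrPanic(q, 99, 2*time.Second, 3, "overriddenAppExportQueue", "otel.TracesReceiver")
}
```

Under this reading, the repeated warnings and the final panic are the same condition at two stages: a subscriber that stops draining its channel.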

Steps to reproduce

Running Beyla as a DaemonSet in Kubernetes.

System information

GKE v1.33.5-gke.1201000 with Linux kernel 6.6.105+

Software version

v2.8.5

Configuration

    profile_port: 6057
    otel_traces_export:
      endpoint: http://tempo.monitoring:4317
    discovery:
      instrument:
      - open_ports: 80,443,2000-10000
    attributes:
      kubernetes:
        enable: true
      select:
        traces:
          include:
          - db.query.text
    filter:
      application:
        url_path:
          not_match: '{/actuator/*,*/actuator,...

Logs

time=2026-01-22T10:24:12.133Z level=WARN msg="subscriber channel is taking too long to respond" timeout=20s queueLen=10 queueCap=10 sendPath=nameResolverToAttrFilter dstName=AttributesFilter
time=2026-01-22T10:24:12.133Z level=WARN msg="subscriber channel is taking too long to respond" timeout=20s queueLen=10 queueCap=10 sendPath=overriddenAppExportQueue dstName=otel.TracesReceiver
time=2026-01-22T10:24:27.152Z level=WARN msg="subscriber channel is taking too long to respond" timeout=20s queueLen=10 queueCap=10 sendPath=overriddenAppExportQueue dstName=otel.TracesReceiver
time=2026-01-22T10:24:27.184Z level=WARN msg="subscriber channel is taking too long to respond" timeout=20s queueLen=10 queueCap=10 sendPath=nameResolverToAttrFilter dstName=AttributesFilter
time=2026-01-22T10:24:42.206Z level=WARN msg="subscriber channel is taking too long to respond" timeout=20s queueLen=10 queueCap=10 sendPath=overriddenAppExportQueue dstName=otel.TracesReceiver
time=2026-01-22T10:24:42.212Z level=WARN msg="subscriber channel is taking too long to respond" timeout=20s queueLen=10 queueCap=10 sendPath=nameResolverToAttrFilter dstName=AttributesFilter
time=2026-01-22T10:24:42.230Z level=WARN msg="subscriber channel is taking too long to respond" timeout=20s queueLen=10 queueCap=10 sendPath=kubeDecoratorToNameResolver dstName=transform.NameResolver
time=2026-01-22T10:24:57.268Z level=WARN msg="subscriber channel is taking too long to respond" timeout=20s queueLen=10 queueCap=10 sendPath=overriddenAppExportQueue dstName=otel.TracesReceiver
time=2026-01-22T10:24:57.275Z level=WARN msg="subscriber channel is taking too long to respond" timeout=20s queueLen=10 queueCap=10 sendPath=nameResolverToAttrFilter dstName=AttributesFilter


time=2026-01-22T23:09:17.865Z level=WARN msg="subscriber channel is taking too long to respond" timeout=20s queueLen=10 queueCap
panic: sending through queue path overriddenAppExportQueue. Subscriber channel otel.TracesReceiver is blocked

goroutine 301 [running]:
go.opentelemetry.io/obi/pkg/pipe/msg.(*Queue[...]).chainedSend(0x2960840, {0x292cfd8, 0x400066a410}, {0x400042e808, 0x3, 0x5}, {
    /src/vendor/go.opentelemetry.io/obi/pkg/pipe/msg/queue.go:176 +0x668
go.opentelemetry.io/obi/pkg/pipe/msg.(*Queue[...]).SendCtx(...)
    /src/vendor/go.opentelemetry.io/obi/pkg/pipe/msg/queue.go:118
go.opentelemetry.io/obi/pkg/filter.(*filter[...]).doFilter.func1()
    /src/vendor/go.opentelemetry.io/obi/pkg/filter/attribute.go:134 +0xa8
go.opentelemetry.io/obi/pkg/pipe/swarm/swarms.ForEachInput[...]({0x292cfd8?, 0x400066a410}, 0x40004dc380, 0x0, 0x4000701f18?)
    /src/vendor/go.opentelemetry.io/obi/pkg/pipe/swarm/swarms/read_pattern.go:33 +0x80
go.opentelemetry.io/obi/pkg/filter.(*filter[...]).doFilter(0x2938540, {0x292cfd8, 0x400066a410})
    /src/vendor/go.opentelemetry.io/obi/pkg/filter/attribute.go:132 +0xb0
go.opentelemetry.io/obi/pkg/pipe/swarm.(*Runner).Start.func2()
    /src/vendor/go.opentelemetry.io/obi/pkg/pipe/swarm/runner.go:82 +0x34
created by go.opentelemetry.io/obi/pkg/pipe/swarm.(*Runner).Start in goroutine 288
    /src/vendor/go.opentelemetry.io/obi/pkg/pipe/swarm/runner.go:81 +0x194
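
The three blocked sendPaths in the warnings (overriddenAppExportQueue appears from the first excerpt on; kubeDecoratorToNameResolver only shows up later) are consistent with backpressure cascading upstream from a stalled otel.TracesReceiver. A minimal illustration of that cascade (channel names, stage names, and the timeout are made up for the demo; this is not OBI code):

```go
package main

import (
	"fmt"
	"time"
)

// stage forwards items from in to out, warning whenever the downstream
// consumer has not drained out within the timeout, then retrying.
func stage(name string, in <-chan int, out chan<- int) {
	for v := range in {
		for sent := false; !sent; {
			select {
			case out <- v:
				sent = true
			case <-time.After(2 * time.Second): // stand-in for the 20s timeout in the logs
				fmt.Printf("%s: subscriber blocked (queueLen=%d queueCap=%d)\n",
					name, len(out), cap(out))
			}
		}
	}
}

func main() {
	decorated := make(chan int, 10) // kubeDecorator -> nameResolver
	resolved := make(chan int, 10)  // nameResolver -> attrFilter
	export := make(chan int, 10)    // attrFilter -> otel.TracesReceiver

	go stage("nameResolverToAttrFilter", decorated, resolved)
	go stage("overriddenAppExportQueue", resolved, export)
	// Deliberately no reader on export: simulates a stalled exporter.

	go func() {
		for i := 0; ; i++ {
			decorated <- i // backs up once every downstream buffer is full
		}
	}()
	time.Sleep(10 * time.Second) // watch the warnings climb upstream
}
```

Once the export buffer fills, the filter-side send warns first; only after its own input fills does the upstream stage start warning too, which matches the order the warnings appear in above.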
