consul builtin sidecar proxy takes 10 minutes to start working #12319

Closed
jrnijboer opened this issue Feb 11, 2022 · 3 comments · Fixed by #12820
Labels: theme/connect (Anything related to Consul Connect, Service Mesh, Side Car Proxies), theme/performance (Performance benchmarking or potential improvement), type/bug (Feature does not function as expected)

Comments


jrnijboer commented Feb 11, 2022

Overview of the Issue

When starting the builtin sidecar proxy to enable Consul Connect for a service registered in Consul, the proxy process starts immediately, but it only begins working and accepting Consul Connect requests after 10 minutes. This used to work immediately in older versions of Consul: version 1.9.13 seems to work, while in version 1.9.14 it takes 10 minutes. The release notes don't hint at any changes to the Consul Connect proxy, though. I have not done an exhaustive search to check whether this happens with all versions from 1.9.14 onwards, or whether all versions 1.9.13 and earlier are unaffected.

In our dev stage we have Consul servers running version 1.11.1. On that dev stage the sidecar proxy works when the client runs Consul 1.9.13, which I suppose indicates that it's not a server problem?

Reproduction Steps

Steps to reproduce this issue (using the latest Consul version, 1.11.2 at this moment):

  1. Run a Consul server in dev mode: consul agent -dev
  2. Start any service, for example an nginx container: docker run -p 8000:80 -d nginx:alpine
  3. Write a service file for a Consul service registration:
service {
    # Name and port of the local service being registered
    name = "nginx-service"
    port = 8000
    connect {
        # Register a default sidecar proxy for this service
        sidecar_service { }
    }
}
  4. Register the service in Consul: consul services register nginx-service.hcl
  5. Start the sidecar proxy: consul connect proxy --sidecar-for nginx-service

Now the sidecar proxy should start and open a socket on port 21000. For Consul versions 1.9.13 and older this works almost instantly; for newer versions it takes 10 minutes. The first message that appears in the log is:
[INFO] proxy: Proxy loaded config and ready to serve

This is immediately followed by:

[INFO]  proxy: Parsed TLS identity: uri=spiffe://b00fb169-c137-8254-a07e-ba340abd0e1c.consul/ns/default/dc/dc1/svc/nginx-service roots=[pri-9lo97id.consul.ca.b00fb169.consul]
[INFO]  proxy: Starting listener: listener="public listener" bind_addr=0.0.0.0:21000
  6. Run curl http://localhost:8000 after seeing the above message to verify that the sidecar proxy is working (a consolidated shell sketch of these steps follows below).
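
For convenience, here is a minimal shell sketch of the steps above. It assumes a local Linux shell with Docker available and the consul binary on PATH; otherwise it uses only the names and commands from the steps.

    # 1. Dev-mode Consul server (kept in the background for this sketch)
    consul agent -dev &

    # 2. The upstream service that the sidecar will front
    docker run -p 8000:80 -d nginx:alpine

    # 3./4. Save the service definition shown above as nginx-service.hcl, then register it
    consul services register nginx-service.hcl

    # 5. Start the built-in sidecar proxy
    consul connect proxy --sidecar-for nginx-service &

    # 6. Once the proxy logs "Starting listener ... 0.0.0.0:21000", verify the service
    curl http://localhost:8000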

Consul info for both Client and Server

This happens on a Consul cluster in dev mode, as well as on a multi-node Consul cluster set up with ACLs and TLS.

Operating system and Environment details

This is reproducible on Linux (Ubuntu 20.04) as well as Windows 10 and WSL.

Log Fragments

logs from the consul agent:

==> Starting Consul agent...
           Version: '1.11.2'
           Node ID: '5b94a58e-4e44-d381-37ce-f053c6177bb5'
         Node name: 'NB-XPS111'
        Datacenter: 'dc1' (Segment: '<all>')
            Server: true (Bootstrap: false)
       Client Addr: [127.0.0.1] (HTTP: 8500, HTTPS: -1, gRPC: 8502, DNS: 8600)
      Cluster Addr: 127.0.0.1 (LAN: 8301, WAN: 8302)
           Encrypt: Gossip: false, TLS-Outgoing: false, TLS-Incoming: false, Auto-Encrypt-TLS: false

==> Log data will now stream in as it occurs:

2022-02-11T16:25:16.761+0100 [INFO]  agent.server.raft: initial configuration: index=1 servers="[{Suffrage:Voter ID:5b94a58e-4e44-d381-37ce-f053c6177bb5 Address:127.0.0.1:8300}]"
2022-02-11T16:25:16.762+0100 [INFO]  agent.server.raft: entering follower state: follower="Node at 127.0.0.1:8300 [Follower]" leader=
2022-02-11T16:25:16.762+0100 [INFO]  agent.server.serf.wan: serf: EventMemberJoin: NB-XPS111.dc1 127.0.0.1
2022-02-11T16:25:16.763+0100 [INFO]  agent.server.serf.lan: serf: EventMemberJoin: NB-XPS111 127.0.0.1
2022-02-11T16:25:16.763+0100 [INFO]  agent.router: Initializing LAN area manager
2022-02-11T16:25:16.763+0100 [INFO]  agent.server: Adding LAN server: server="NB-XPS111 (Addr: tcp/127.0.0.1:8300) (DC: dc1)"
2022-02-11T16:25:16.763+0100 [WARN]  agent: grpc: addrConn.createTransport failed to connect to {dc1-127.0.0.1:8300 0 NB-XPS111 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:0->127.0.0.1:8300: operation was canceled". Reconnecting...
2022-02-11T16:25:16.763+0100 [INFO]  agent.server: Handled event for server in area: event=member-join server=NB-XPS111.dc1 area=wan
2022-02-11T16:25:16.763+0100 [WARN]  agent: grpc: addrConn.createTransport failed to connect to {dc1-127.0.0.1:8300 0 NB-XPS111.dc1 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:0->127.0.0.1:8300: operation was canceled". Reconnecting...
2022-02-11T16:25:16.763+0100 [INFO]  agent: Started DNS server: address=127.0.0.1:8600 network=tcp
2022-02-11T16:25:16.763+0100 [INFO]  agent: Started DNS server: address=127.0.0.1:8600 network=udp
2022-02-11T16:25:16.763+0100 [INFO]  agent: Starting server: address=127.0.0.1:8500 network=tcp protocol=http
2022-02-11T16:25:16.763+0100 [WARN]  agent: DEPRECATED Backwards compatibility with pre-1.9 metrics enabled. These metrics will be removed in a future version of Consul. Set `telemetry { disable_compat_1.9 = true }` to disable them.
2022-02-11T16:25:16.763+0100 [INFO]  agent: Started gRPC server: address=127.0.0.1:8502 network=tcp
2022-02-11T16:25:16.764+0100 [INFO]  agent: started state syncer
2022-02-11T16:25:16.764+0100 [INFO]  agent: Consul agent running!
2022-02-11T16:25:16.808+0100 [WARN]  agent.server.raft: heartbeat timeout reached, starting election: last-leader=
2022-02-11T16:25:16.808+0100 [INFO]  agent.server.raft: entering candidate state: node="Node at 127.0.0.1:8300 [Candidate]" term=2
2022-02-11T16:25:16.808+0100 [DEBUG] agent.server.raft: votes: needed=1
2022-02-11T16:25:16.808+0100 [DEBUG] agent.server.raft: vote granted: from=5b94a58e-4e44-d381-37ce-f053c6177bb5 term=2 tally=1
2022-02-11T16:25:16.808+0100 [INFO]  agent.server.raft: election won: tally=1
2022-02-11T16:25:16.808+0100 [INFO]  agent.server.raft: entering leader state: leader="Node at 127.0.0.1:8300 [Leader]"
2022-02-11T16:25:16.808+0100 [INFO]  agent.server: cluster leadership acquired
2022-02-11T16:25:16.808+0100 [INFO]  agent.server: New leader elected: payload=NB-XPS111
2022-02-11T16:25:16.809+0100 [INFO]  agent.leader: started routine: routine="federation state anti-entropy"
2022-02-11T16:25:16.809+0100 [INFO]  agent.leader: started routine: routine="federation state pruning"
2022-02-11T16:25:16.809+0100 [DEBUG] agent.server.autopilot: autopilot is now running
2022-02-11T16:25:16.809+0100 [DEBUG] agent.server.autopilot: state update routine is now running
2022-02-11T16:25:16.809+0100 [DEBUG] connect.ca.consul: consul CA provider configured: id=fb:50:9b:45:1a:65:15:c1:68:57:73:5f:da:cd:b8:0d:0f:e2:26:eb:68:66:43:11:85:9d:67:a9:7a:56:9c:b9 is_primary=true
2022-02-11T16:25:16.810+0100 [INFO]  connect.ca: initialized primary datacenter CA with provider: provider=consul
2022-02-11T16:25:16.810+0100 [INFO]  agent.leader: started routine: routine="intermediate cert renew watch"
2022-02-11T16:25:16.810+0100 [INFO]  agent.leader: started routine: routine="CA root pruning"
2022-02-11T16:25:16.810+0100 [INFO]  agent.leader: started routine: routine="CA root expiration metric"
2022-02-11T16:25:16.810+0100 [INFO]  agent.leader: started routine: routine="CA signing expiration metric"
2022-02-11T16:25:16.810+0100 [INFO]  agent.leader: started routine: routine="virtual IP version check"
2022-02-11T16:25:16.810+0100 [DEBUG] agent.server: successfully established leadership: duration=1.6918ms
2022-02-11T16:25:16.810+0100 [INFO]  agent.server: member joined, marking health alive: member=NB-XPS111 partition=default
2022-02-11T16:25:16.810+0100 [DEBUG] agent.leader: stopping routine: routine="virtual IP version check"
2022-02-11T16:25:16.810+0100 [DEBUG] agent.leader: stopped routine: routine="virtual IP version check"
2022-02-11T16:25:16.924+0100 [DEBUG] agent: Skipping remote check since it is managed automatically: check=serfHealth
2022-02-11T16:25:16.925+0100 [INFO]  agent: Synced node info
2022-02-11T16:25:17.087+0100 [INFO]  agent.server: federation state anti-entropy synced
2022-02-11T16:25:17.390+0100 [DEBUG] agent.http: Request finished: method=GET url=/v1/internal/ui/services?dc=dc1&index=1 from=127.0.0.1:59810 latency=168.9µs
2022-02-11T16:25:19.751+0100 [DEBUG] agent: Skipping remote check since it is managed automatically: check=serfHealth
2022-02-11T16:25:19.752+0100 [DEBUG] agent: Node info in sync
2022-02-11T16:25:19.752+0100 [DEBUG] agent: Node info in sync
2022-02-11T16:25:29.890+0100 [DEBUG] agent: Node info in sync
2022-02-11T16:25:29.890+0100 [INFO]  agent: Synced service: service=nginx-service
2022-02-11T16:25:29.890+0100 [DEBUG] agent: added local registration for service: service=nginx-service-sidecar-proxy
2022-02-11T16:25:29.890+0100 [DEBUG] agent: Node info in sync
2022-02-11T16:25:29.890+0100 [DEBUG] agent: Service in sync: service=nginx-service
2022-02-11T16:25:29.890+0100 [DEBUG] agent.http: Request finished: method=GET url=/v1/internal/ui/services?dc=dc1&index=14 from=127.0.0.1:59810 latency=10.482508s
2022-02-11T16:25:29.891+0100 [INFO]  agent: Synced service: service=nginx-service-sidecar-proxy
2022-02-11T16:25:29.891+0100 [DEBUG] agent: Check in sync: check=service:nginx-service-sidecar-proxy:1
2022-02-11T16:25:29.891+0100 [DEBUG] agent: Check in sync: check=service:nginx-service-sidecar-proxy:2
2022-02-11T16:25:29.891+0100 [DEBUG] agent: Node info in sync
2022-02-11T16:25:29.891+0100 [DEBUG] agent: Service in sync: service=nginx-service
2022-02-11T16:25:29.891+0100 [DEBUG] agent: Service in sync: service=nginx-service-sidecar-proxy
2022-02-11T16:25:29.891+0100 [DEBUG] agent: Check in sync: check=service:nginx-service-sidecar-proxy:1
2022-02-11T16:25:29.891+0100 [INFO]  agent: Synced check: check=service:nginx-service-sidecar-proxy:2
2022-02-11T16:25:29.891+0100 [DEBUG] agent.http: Request finished: method=PUT url=/v1/agent/service/register from=127.0.0.1:59814 latency=1.6597ms
2022-02-11T16:25:29.891+0100 [DEBUG] agent: Node info in sync
2022-02-11T16:25:29.892+0100 [DEBUG] agent: Service in sync: service=nginx-service
2022-02-11T16:25:29.892+0100 [DEBUG] agent: Service in sync: service=nginx-service-sidecar-proxy
2022-02-11T16:25:29.892+0100 [DEBUG] agent: Check in sync: check=service:nginx-service-sidecar-proxy:1
2022-02-11T16:25:29.892+0100 [DEBUG] agent: Check in sync: check=service:nginx-service-sidecar-proxy:2
2022-02-11T16:25:29.892+0100 [DEBUG] agent: Node info in sync
2022-02-11T16:25:29.892+0100 [DEBUG] agent: Service in sync: service=nginx-service
2022-02-11T16:25:29.892+0100 [DEBUG] agent: Service in sync: service=nginx-service-sidecar-proxy
2022-02-11T16:25:29.892+0100 [DEBUG] agent: Check in sync: check=service:nginx-service-sidecar-proxy:1
2022-02-11T16:25:29.892+0100 [DEBUG] agent: Check in sync: check=service:nginx-service-sidecar-proxy:2
2022-02-11T16:25:31.915+0100 [DEBUG] agent.http: Request finished: method=GET url=/v1/internal/ui/services?dc=dc1&index=16 from=127.0.0.1:59810 latency=150.5µs
2022-02-11T16:25:36.752+0100 [WARN]  agent: Check socket connection failed: check=service:nginx-service-sidecar-proxy:1 error="dial tcp 127.0.0.1:21000: connect: connection refused"
2022-02-11T16:25:36.752+0100 [WARN]  agent: Check is now critical: check=service:nginx-service-sidecar-proxy:1
2022-02-11T16:25:46.752+0100 [WARN]  agent: Check socket connection failed: check=service:nginx-service-sidecar-proxy:1 error="dial tcp 127.0.0.1:21000: connect: connection refused"
2022-02-11T16:25:46.752+0100 [WARN]  agent: Check is now critical: check=service:nginx-service-sidecar-proxy:1
2022-02-11T16:25:56.753+0100 [WARN]  agent: Check socket connection failed: check=service:nginx-service-sidecar-proxy:1 error="dial tcp 127.0.0.1:21000: connect: connection refused"
2022-02-11T16:25:56.753+0100 [WARN]  agent: Check is now critical: check=service:nginx-service-sidecar-proxy:1
2022-02-11T16:26:06.755+0100 [WARN]  agent: Check socket connection failed: check=service:nginx-service-sidecar-proxy:1 error="dial tcp 127.0.0.1:21000: connect: connection refused"
2022-02-11T16:26:06.755+0100 [WARN]  agent: Check is now critical: check=service:nginx-service-sidecar-proxy:1
2022-02-11T16:26:16.756+0100 [WARN]  agent: Check socket connection failed: check=service:nginx-service-sidecar-proxy:1 error="dial tcp 127.0.0.1:21000: connect: connection refused"
2022-02-11T16:26:16.756+0100 [WARN]  agent: Check is now critical: check=service:nginx-service-sidecar-proxy:1
2022-02-11T16:26:23.258+0100 [DEBUG] agent.http: Request finished: method=GET url=/v1/agent/services from=127.0.0.1:59826 latency=257.3µs
2022-02-11T16:26:23.259+0100 [DEBUG] agent.http: Request finished: method=GET url=/v1/agent/service/nginx-service-sidecar-proxy from=127.0.0.1:59826 latency=67.9µs
2022-02-11T16:26:23.261+0100 [DEBUG] agent.http: Request finished: method=GET url=/v1/agent/connect/ca/roots from=127.0.0.1:59828 latency=902.1µs
2022-02-11T16:26:26.757+0100 [WARN]  agent: Check socket connection failed: check=service:nginx-service-sidecar-proxy:1 error="dial tcp 127.0.0.1:21000: connect: connection refused"
2022-02-11T16:26:26.757+0100 [WARN]  agent: Check is now critical: check=service:nginx-service-sidecar-proxy:1
2022-02-11T16:26:36.758+0100 [WARN]  agent: Check socket connection failed: check=service:nginx-service-sidecar-proxy:1 error="dial tcp 127.0.0.1:21000: connect: connection refused"
2022-02-11T16:26:36.758+0100 [WARN]  agent: Check is now critical: check=service:nginx-service-sidecar-proxy:1
2022-02-11T16:26:40.158+0100 [DEBUG] agent: Skipping remote check since it is managed automatically: check=serfHealth
2022-02-11T16:26:40.159+0100 [DEBUG] agent: Node info in sync
2022-02-11T16:26:40.159+0100 [DEBUG] agent: Service in sync: service=nginx-service-sidecar-proxy
2022-02-11T16:26:40.159+0100 [DEBUG] agent: Service in sync: service=nginx-service
2022-02-11T16:26:40.159+0100 [DEBUG] agent: Check in sync: check=service:nginx-service-sidecar-proxy:1
2022-02-11T16:26:40.159+0100 [DEBUG] agent: Check in sync: check=service:nginx-service-sidecar-proxy:2
2022-02-11T16:26:46.759+0100 [WARN]  agent: Check socket connection failed: check=service:nginx-service-sidecar-proxy:1 error="dial tcp 127.0.0.1:21000: connect: connection refused"
...
...lines removed that repeat same message until the sidecar starts working
...
2022-02-11T16:35:26.827+0100 [WARN]  agent: Check socket connection failed: check=service:nginx-service-sidecar-proxy:1 error="dial tcp 127.0.0.1:21000: connect: connection refused"
2022-02-11T16:35:26.827+0100 [WARN]  agent: Check is now critical: check=service:nginx-service-sidecar-proxy:1
2022-02-11T16:35:29.894+0100 [DEBUG] agent.http: Request finished: method=GET url=/v1/agent/connect/ca/leaf/nginx-service from=127.0.0.1:59830 latency=9m6.6334955s
2022-02-11T16:35:36.828+0100 [DEBUG] agent: Check status updated: check=service:nginx-service-sidecar-proxy:1 status=passing
2022-02-11T16:35:36.829+0100 [DEBUG] agent: Node info in sync
2022-02-11T16:35:36.829+0100 [DEBUG] agent: Service in sync: service=nginx-service
2022-02-11T16:35:36.829+0100 [DEBUG] agent: Service in sync: service=nginx-service-sidecar-proxy
2022-02-11T16:35:36.829+0100 [INFO]  agent: Synced check: check=service:nginx-service-sidecar-proxy:1
2022-02-11T16:35:36.829+0100 [DEBUG] agent: Check in sync: check=service:nginx-service-sidecar-proxy:2
2022-02-11T16:35:36.829+0100 [DEBUG] agent.http: Request finished: method=GET url=/v1/internal/ui/services?dc=dc1&index=32 from=127.0.0.1:59810 latency=2m8.2864218s
2022-02-11T16:35:37.869+0100 [DEBUG] agent: Skipping remote check since it is managed automatically: check=serfHealth
2022-02-11T16:35:37.869+0100 [DEBUG] agent: Node info in sync
2022-02-11T16:35:37.869+0100 [DEBUG] agent: Service in sync: service=nginx-service
2022-02-11T16:35:37.869+0100 [DEBUG] agent: Service in sync: service=nginx-service-sidecar-proxy
2022-02-11T16:35:37.869+0100 [DEBUG] agent: Check in sync: check=service:nginx-service-sidecar-proxy:1
2022-02-11T16:35:37.869+0100 [DEBUG] agent: Check in sync: check=service:nginx-service-sidecar-proxy:2
2022-02-11T16:35:46.830+0100 [DEBUG] agent: Check status updated: check=service:nginx-service-sidecar-proxy:1 status=passing
2022-02-11T16:35:56.832+0100 [DEBUG] agent: Check status updated: check=service:nginx-service-sidecar-proxy:1 status=passing
2022-02-11T16:36:06.834+0100 [DEBUG] agent: Check status updated: check=service:nginx-service-sidecar-proxy:1 status=passing
2022-02-11T16:36:16.836+0100 [DEBUG] agent: Check status updated: check=service:nginx-service-sidecar-proxy:1 status=passing
2022-02-11T16:36:26.838+0100 [DEBUG] agent: Check status updated: check=service:nginx-service-sidecar-proxy:1 status=passing
2022-02-11T16:36:36.840+0100 [DEBUG] agent: Check status updated: check=service:nginx-service-sidecar-proxy:1 status=passing
2022-02-11T16:36:46.841+0100 [DEBUG] agent: Check status updated: check=service:nginx-service-sidecar-proxy:1 status=passing
2022-02-11T16:36:56.803+0100 [DEBUG] agent.http: Request finished: method=GET url=/v1/agent/service/nginx-service-sidecar-proxy?hash=144ca5fb9647b9e5 from=127.0.0.1:59826 latency=5m15.2379944s
2022-02-11T16:36:56.842+0100 [DEBUG] agent: Check status updated: check=service:nginx-service-sidecar-proxy:1 status=passing
2022-02-11T16:36:56.887+0100 [DEBUG] agent.router.manager: Rebalanced servers, new active server: number_of_servers=1 active_server="NB-XPS111 (Addr: tcp/127.0.0.1:8300) (DC: dc1)"
2022-02-11T16:37:06.844+0100 [DEBUG] agent: Check status updated: check=service:nginx-service-sidecar-proxy:1 status=passing
2022-02-11T16:37:16.845+0100 [DEBUG] agent: Check status updated: check=service:nginx-service-sidecar-proxy:1 status=passing

=======================
Logs from the sidecar proxy (please note the date output and the timestamps of the logs):

[root@devlaptop:~$] date
Fri Feb 11 16:26:19 CET 2022
[root@devlaptop:~$] consul connect proxy --sidecar-for nginx-service
==> Consul Connect proxy starting...
    Configuration mode: Agent API
        Sidecar for ID: nginx-service
              Proxy ID: nginx-service-sidecar-proxy

==> Log data will now stream in as it occurs:

    2022-02-11T16:35:29.896+0100 [INFO]  proxy: Proxy loaded config and ready to serve
    2022-02-11T16:35:29.897+0100 [INFO]  proxy: Parsed TLS identity: uri=spiffe://b00fb169-c137-8254-a07e-ba340abd0e1c.consul/ns/default/dc/dc1/svc/nginx-service roots=[pri-9lo97id.consul.ca.b00fb169.consul]
    2022-02-11T16:35:29.897+0100 [INFO]  proxy: Starting listener: listener="public listener" bind_addr=0.0.0.0:21000
Amier3 added the type/bug, theme/performance, theme/connect, and needs-investigation labels on Feb 11, 2022.
Amier3 (Contributor) commented Feb 15, 2022

Hey @jrnijboer

Thanks for bringing this to our attention. After looking at the changelog we think we have a good idea of what changes would cause this in 1.9.14, but it'll take a deeper dive to be sure. We'll keep this issue updated with what we find.

kisunji (Contributor) commented Mar 14, 2022

After some investigation, it looks like this may be the result of #11693, which caused a regression: #12048. Basically, a non-blocking query that follows a blocking query on a leaf cert will itself block, meaning we do not see logs until the timeout (10 min) is reached.

1.9.13:

2022-03-14T12:45:33.857-0400 [TRACE] agent.proxycfg: A blocking query returned; handling snapshot update: service_id=nginx-service-sidecar-proxy
2022-03-14T12:45:34.058-0400 [TRACE] agent.proxycfg: Delivered new snapshot to proxy config watchers: service_id=nginx-service-sidecar-proxy
2022-03-14T12:45:36.365-0400 [DEBUG] agent.http: Request finished: method=GET url=/v1/agent/services from=127.0.0.1:63513 latency=238.761µs
2022-03-14T12:45:36.366-0400 [DEBUG] agent.http: Request finished: method=GET url=/v1/agent/service/nginx-service-sidecar-proxy from=127.0.0.1:63513 latency=46.5µs
2022-03-14T12:45:36.367-0400 [DEBUG] agent.http: Request finished: method=GET url=/v1/agent/connect/ca/roots from=127.0.0.1:63515 latency=211.953µs
2022-03-14T12:45:36.367-0400 [DEBUG] agent.http: Request finished: method=GET url=/v1/agent/connect/ca/leaf/nginx-service from=127.0.0.1:63514 latency=703.972µs

1.9.14:

2022-03-14T12:46:41.092-0400 [TRACE] agent.proxycfg: Delivered new snapshot to proxy config watchers: service_id=nginx-service-sidecar-proxy
2022-03-14T12:46:42.839-0400 [DEBUG] agent.http: Request finished: method=GET url=/v1/agent/services from=127.0.0.1:63552 latency=258.96µs
2022-03-14T12:46:42.840-0400 [DEBUG] agent.http: Request finished: method=GET url=/v1/agent/service/nginx-service-sidecar-proxy from=127.0.0.1:63552 latency=48.692µs
2022-03-14T12:46:42.841-0400 [DEBUG] agent.http: Request finished: method=GET url=/v1/agent/connect/ca/roots from=127.0.0.1:63554 latency=239.769µs

Note that the GET to url=/v1/agent/connect/ca/leaf/nginx-service does not return immediately in 1.9.14.
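
For context, a rough sketch of the blocking-query mechanics involved, assuming a local dev agent on 127.0.0.1:8500 and the nginx-service registration from the reproduction steps (the <X-Consul-Index> placeholder stands for whatever value the first response returns):

    # A plain read of the leaf cert is a non-blocking query and should return immediately:
    curl -si 'http://127.0.0.1:8500/v1/agent/connect/ca/leaf/nginx-service'

    # Passing the X-Consul-Index value back turns it into a blocking query, which waits
    # until the certificate changes or the query timeout expires:
    curl -si 'http://127.0.0.1:8500/v1/agent/connect/ca/leaf/nginx-service?index=<X-Consul-Index>'

    # With the regression, a non-blocking read issued after a blocking one also hangs until
    # the timeout, which is what stalls the sidecar proxy for ~10 minutes.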

We are still investigating a fix for this issue. Thank you for your patience!

kisunji removed the needs-investigation label on Mar 16, 2022.
HumanPrinter commented:

Is there any update on this issue? We have currently put a hold on updating our Consul installations until this issue is fixed.

rboyer added a commit that referenced this issue Apr 19, 2022
Fixes #12048
Fixes #12319
rboyer pushed a commit that referenced this issue Apr 19, 2022
…king leaf cert query to block

Fixes #12048
Fixes #12319

Regression introduced in #11693
rboyer added a commit that referenced this issue Apr 20, 2022
…king leaf cert query to block (#12820)

Fixes #12048

Fixes #12319

Regression introduced in #11693

Local reproduction steps:
1. `consul agent -dev`
2. `curl -sLiv 'localhost:8500/v1/agent/connect/ca/leaf/web'`
3. make note of the `X-Consul-Index` header returned 
4. `curl -sLi 'localhost:8500/v1/agent/connect/ca/leaf/web?index=<VALUE_FROM_STEP_3>'`
5. Kill the above curl when it hangs with Ctrl-C
6. Repeat (2) and it should not hang.
hc-github-team-consul-core pushed three commits that referenced this issue on Apr 20, 2022, all with the same commit message as #12820 above.