High CPU load, docker unable to parse json #1350

Closed
sprnza opened this issue Jan 25, 2022 · 1 comment

Comments

sprnza commented Jan 25, 2022

Expected behavior

Docker service operates normally

Actual behavior

High CPU load, service degraded, containers stopped

Steps to reproduce the behavior

2 out of 5 manager nodes were affected by this issue. They had been working perfectly for quite a long time. Restarting the docker service resolved the issue.

Output of docker version:

Docker version 20.10.3, build 48d30b5

Output of docker info:

Client:
 Context:    default
 Debug Mode: false
 Plugins:
  app: Docker App (Docker Inc., v0.9.1-beta3)
  buildx: Build with BuildKit (Docker Inc., v0.5.1-docker)

Server:
 Containers: 4
  Running: 2
  Paused: 0
  Stopped: 2
 Images: 133
 Server Version: 20.10.3
 Storage Driver: overlay2
  Backing Filesystem: extfs
  Supports d_type: true
  Native Overlay Diff: true
 Logging Driver: json-file
 Cgroup Driver: cgroupfs
 Cgroup Version: 1
 Plugins:
  Volume: local
  Network: bridge host ipvlan macvlan null overlay
  Log: awslogs fluentd gcplogs gelf journald json-file local logentries splunk syslog
 Swarm: active
  NodeID: sosj0pmzcni9avr9m7giw369d
  Is Manager: true
  ClusterID: n0au9bkwm9cztmsbpvq0wlwu2
  Managers: 5
  Nodes: 8
  Default Address Pool: x.0.0.0/8
  SubnetSize: 24
  Data Path Port: 4789
  Orchestration:
   Task History Retention Limit: 5
  Raft:
   Snapshot Interval: 10000
   Number of Old Snapshots to Retain: 0
   Heartbeat Tick: 1
   Election Tick: 10
  Dispatcher:
   Heartbeat Period: 5 seconds
  CA Configuration:
   Expiry Duration: 3 months
   Force Rotate: 0
  Autolock Managers: false
  Root Rotation In Progress: false
  Node Address: x.x.x.4
  Manager Addresses:
   x.x.x.1:2377
   x.x.x.2:2377
   x.x.x.3:2377
   x.x.x.4:2377
   x.x.x.9:2377
 Runtimes: io.containerd.runc.v2 io.containerd.runtime.v1.linux runc
 Default Runtime: runc
 Init Binary: docker-init
 containerd version: 269548fa27e0089a8b8278fc4fc781d7f65a939b
 runc version: ff819c7e9184c13b7c2607fe6c30ae19403a7aff
 init version: de40ad0
 Security Options:
  apparmor
  seccomp
   Profile: default
 Kernel Version: 5.4.0-65-generic
 Operating System: Ubuntu 20.04.2 LTS
 OSType: linux
 Architecture: x86_64
 CPUs: 8
 Total Memory: 7.774GiB
 Name: hp-swarm-mgr-4
 ID: ORUD:N3EE:R6V5:KQMK:7NQW:WB2J:BNSF:2VTH:3TTW:PDL6:L5M7:74GG
 Docker Root Dir: /var/lib/docker
 Debug Mode: false
 Registry: https://index.docker.io/v1/
 Labels:
 Experimental: false
 Insecure Registries:
  x.x.x.x.48:5000
  127.0.0.0/8
 Live Restore Enabled: false

WARNING: No swap limit support
WARNING: No blkio weight support
WARNING: No blkio weight_device support

Additional environment details: VMware

Output of sudo journalctl -fu docker (lots of warnings, shown below):

Jan 25 22:01:27 hp-swarm-mgr-4 dockerd[1786441]: time="2022-01-25T22:01:26.821972036+03:00" level=warning msg="got error while decoding json" error="invalid character 'a' looking for beginning of value" retries=131
Jan 25 22:01:27 hp-swarm-mgr-4 dockerd[1786441]: time="2022-01-25T22:01:26.821989887+03:00" level=warning msg="got error while decoding json" error="invalid character 'e' looking for beginning of value" retries=132
Jan 25 22:01:27 hp-swarm-mgr-4 dockerd[1786441]: time="2022-01-25T22:01:26.822004162+03:00" level=warning msg="got error while decoding json" error="json: cannot unmarshal number into
Go value of type jsonlog.JSONLog" retries=133
Jan 25 22:01:27 hp-swarm-mgr-4 dockerd[1786441]: time="2022-01-25T22:01:26.822014707+03:00" level=warning msg="got error while decoding json" error="invalid character ':' looking for beginning of value" retries=134
Jan 25 22:01:27 hp-swarm-mgr-4 dockerd[1786441]: time="2022-01-25T22:01:26.822025780+03:00" level=warning msg="got error while decoding json" error="invalid character 'r' looking for beginning of value" retries=135
Jan 25 22:01:27 hp-swarm-mgr-4 dockerd[1786441]: time="2022-01-25T22:01:26.822036421+03:00" level=warning msg="got error while decoding json" error="invalid character 'a' looking for beginning of value" retries=136
Jan 25 22:01:27 hp-swarm-mgr-4 dockerd[1786441]: time="2022-01-25T22:01:26.822048683+03:00" level=warning msg="got error while decoding json" error="invalid character 's' looking for beginning of value" retries=137
Jan 25 22:01:27 hp-swarm-mgr-4 dockerd[1786441]: time="2022-01-25T22:01:26.822059822+03:00" level=warning msg="got error while decoding json" error="invalid character ',' looking for beginning of value" retries=138
Jan 25 22:01:27 hp-swarm-mgr-4 dockerd[1786441]: time="2022-01-25T22:01:26.822072578+03:00" level=warning msg="got error while decoding json" error="invalid character 'a' looking for beginning of value" retries=139
Jan 25 22:01:27 hp-swarm-mgr-4 dockerd[1786441]: time="2022-01-25T22:01:26.822084515+03:00" level=warning msg="got error while decoding json" error="invalid character 'i' in literal true (expecting 'r')" retries=140
Jan 25 22:01:27 hp-swarm-mgr-4 dockerd[1786441]: time="2022-01-25T22:01:26.822099053+03:00" level=warning msg="got error while decoding json" error="json: cannot unmarshal string into
Go value of type jsonlog.JSONLog" retries=141
Jan 25 22:01:27 hp-swarm-mgr-4 dockerd[1786441]: time="2022-01-25T22:01:26.822108239+03:00" level=warning msg="got error while decoding json" error="invalid character ':' looking for beginning of value" retries=142
Jan 25 22:01:27 hp-swarm-mgr-4 dockerd[1786441]: time="2022-01-25T22:01:26.822123615+03:00" level=warning msg="got error while decoding json" error="invalid character '\"' in literal null (expecting 'u')" retries=143
Jan 25 22:01:27 hp-swarm-mgr-4 dockerd[1786441]: time="2022-01-25T22:01:27.192931191+03:00" level=error msg="Handler for GET /v1.41/info returned error: write unix /run/docker.sock->@: write: broken pipe"

I've found this and it seems related somehow, but there is no explanation of what could cause this.
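
One rough way to check which container's log file fails to parse (a sketch only; it assumes jq is installed, and the paths follow from the json-file logging driver and /var/lib/docker root shown in docker info above):

sudo sh -c 'for f in /var/lib/docker/containers/*/*-json.log; do
  # jq exits non-zero if any line in the file is not valid JSON
  jq . "$f" > /dev/null 2>&1 || echo "possibly corrupt: $f"
done'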

thaJeztah (Member) commented:

Yes, this looks like a duplicate of moby/moby#29511. The reason for the failure is a corrupt logfile for one of the containers (for whatever reason; it could've been an unclean restart, or perhaps the machine running out of disk space). Unfortunately the error log doesn't include which file the error was in (perhaps this is something that can be added), but a workaround could be to stop the daemon and remove the logfiles. Or, since this is a swarm node, you could temporarily set the node's --availability to drain, to have the service tasks re-deployed to another node in the cluster.
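
A sketch of that workaround (the node name, container ID, and log path are placeholders; run the docker node update commands on a manager):

docker node update --availability drain <node-name>   # re-deploy this node's tasks elsewhere
sudo systemctl stop docker
# truncate (or remove) the corrupt json log file(s) identified earlier
sudo truncate -s 0 /var/lib/docker/containers/<container-id>/<container-id>-json.log
sudo systemctl start docker
docker node update --availability active <node-name>  # allow tasks to schedule here again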

I also notice you're running a very old patch release of docker 20.10 (20.10.3); the current version of 20.10 is 20.10.15, and there may have been some improvements in the json logger since.
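
For example, on Ubuntu with Docker's apt repository configured, upgrading to the latest 20.10 patch release is roughly (a sketch; package names assume the docker-ce packages rather than the distro's docker.io package):

sudo apt-get update
sudo apt-get install --only-upgrade docker-ce docker-ce-cli containerd.io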

Let me close this ticket as a duplicate of moby/moby#29511, but feel free to participate on that ticket.
