
Connection reset on heavy load #2241

Closed
Amoki opened this issue Jan 15, 2020 · 6 comments
Amoki commented Jan 15, 2020

On heavy load from a single source, gunicorn is resetting some TCP connections without any log.
It only happens when I query the server from an external computer and does not happen when requests come from localhost. I tried with a server at OVH.com and on a LAN with 2 computers.

Environment

Python 3.8.0
gunicorn 20.0.4
Django 2.2.9
Ubuntu Server 18.04.3 LTS

The gunicorn command is

gunicorn my_app.wsgi -w 5 -b 0.0.0.0:8000 --timeout 1200 --access-logfile '-' --error-logfile '-' --worker-tmp-dir /dev/shm --log-level debug

Reproducing

I'm generating thousands of requests (GET or POST) and sending 300 of them at a time. When a request responds, I send a new one.

const async = require('async');
const requestPromise = require('request-promise');

const COUNT = 100000; // total number of requests to send
const LIMIT = 300;    // maximum number of requests in flight at once

// Keep LIMIT requests in flight until COUNT requests have completed.
async.timesLimit(COUNT, LIMIT, async () => await requestPromise('http://my-server:8080'));

After a few thousand requests (~2,000 to ~10,000), I get at least one connection reset.
I tried with GET requests returning a 300 KB JSON response, POST requests running a dummy for-loop then returning a 204, and POST requests doing some DB work and returning a small JSON response.
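
For anyone trying to reproduce this, here is a minimal sketch of views along the lines described above; the names and payload sizes are illustrative assumptions, not the actual app code:

# Hypothetical Django views approximating the test endpoints described above;
# names and payload sizes are illustrative only.
from django.http import HttpResponse, JsonResponse

def big_json(request):
    # GET endpoint returning roughly 300 KB of JSON.
    return JsonResponse({"items": [{"index": i, "payload": "x" * 80} for i in range(3000)]})

def busy_loop(request):
    # POST endpoint doing a dummy for-loop, then returning a 204.
    total = 0
    for i in range(1_000_000):
        total += i
    return HttpResponse(status=204)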

There is no error in the log; we only see successful HTTP requests.

We ran a tcpdump and we see some TCP RST packets, but we can't tell where they come from:

[tcpdump screenshots showing TCP RST packets]
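
(A capture filter along these lines should isolate the resets; the interface and port are assumptions based on the gunicorn bind address above:)

sudo tcpdump -i any -n 'tcp port 8000 and (tcp[tcpflags] & tcp-rst) != 0'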

We tried running the app in a CPU-limited Docker container to rule out networking failures caused by high CPU usage, and we have the same problem.
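
(Roughly: docker run --cpus=1 -p 8000:8000 my-app-image, where the image name and CPU limit are illustrative, not the actual setup.)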

The error is the same through an Nginx reverse proxy.

Do you have any ideas about how I can get more information about what is happening?

@benoitc benoitc self-assigned this Jan 16, 2020

Amoki commented Jan 16, 2020

Update:
We can reproduce the bug on:
Ubuntu 18.04 (kernel 4.15)
Ubuntu 19.10 (kernel 5.3)
Debian 10 (kernel 4.19)

But not on Arch Linux (kernel 5.4).
(We don't know whether the bug is kernel-related.)

We tried reloading workers with --max-requests and the problem is still there.
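(Something like --max-requests 1000 --max-requests-jitter 50 added to the command above; the numbers are arbitrary.)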

We'll try with an example Django app and I'll update the post.

UPDATE: I can reproduce the issue with a simple Django + django-rest-framework example app:
https://github.com/bimdata/gunicorn-connectionreset


druska commented Jan 21, 2020

We had the same issue, but with nginx in front of gunicorn. The solution was to set the --keep-alive flag in gunicorn to be greater than the keepalive_timeout in nginx.
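
For illustration, that looks something like the following; the numbers are illustrative (nginx's keepalive_timeout defaults to 75 seconds, while gunicorn's --keep-alive defaults to 2):

gunicorn my_app.wsgi -w 5 -b 0.0.0.0:8000 --keep-alive 80

with keepalive_timeout 75; (or lower) in the nginx configuration, so that gunicorn is not the side that closes an idle connection first.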


Laurel-rao commented May 28, 2020

I have the same issue: with over 10,000+ requests, it sometimes raises ConnectionResetError(10054).
This is my environment:
flask 1.0.2
python 3.6.10
gunicorn 20.0.4
docker 19.03
RedHat 7.4
My gunicorn config:

import multiprocessing

bind = "0.0.0.0:9000"
workers = multiprocessing.cpu_count() * 2 + 1
threads = 100
backlog = 2048
worker_connections = 1000
daemon = True
proc_name = "gunicorn_app"
pidfile = "./logs/gunicorn.pid"
syslog = True
syslog_addr = "udp://55.14.60.50:5959"

@Laurel-rao

I fixed it.
The cause was that I had set workers = 32, which is more than my 20 CPU cores.
Somehow gunicorn runs with more worker processes than my machine has CPUs when I run it in Docker,
but when I run it without Docker it fails and raises an error.

@Laurel-rao

Recently there are just fewer errors; I can't totally fix it.


benoitc commented Aug 6, 2024

No activity in a while. Closing; feel free to create a new ticket if needed.

@benoitc benoitc closed this as completed Aug 6, 2024