tickets #102203
mailman3 - uwsgi / gunicorn memory leak? (Closed)
Added by pjessen about 3 years ago. Updated 4 months ago.
Description
Despite having increased the memory on mailman3 from 8 GB to 12 GB, 'uwsgi' just uses up more of it:
mailman3 (lists.o.o):~ # dmesg -T | grep Out
[Sat Nov 6 00:13:46 2021] Out of memory: Killed process 1413 (uwsgi) total-vm:6920348kB, anon-rss:6703136kB, file-rss:0kB, shmem-rss:128kB
[Sat Nov 6 04:43:29 2021] Out of memory: Killed process 6469 (uwsgi) total-vm:8274960kB, anon-rss:8054432kB, file-rss:0kB, shmem-rss:128kB
[Sat Nov 6 10:09:43 2021] Out of memory: Killed process 11454 (uwsgi) total-vm:8244728kB, anon-rss:8042024kB, file-rss:0kB, shmem-rss:128kB
[Sat Nov 6 19:13:26 2021] Out of memory: Killed process 17945 (uwsgi) total-vm:8281008kB, anon-rss:8056436kB, file-rss:0kB, shmem-rss:128kB
[Sun Nov 7 00:00:45 2021] Out of memory: Killed process 29157 (uwsgi) total-vm:7231816kB, anon-rss:7007352kB, file-rss:0kB, shmem-rss:128kB
[Sun Nov 7 04:15:18 2021] Out of memory: Killed process 2782 (uwsgi) total-vm:8303620kB, anon-rss:8061684kB, file-rss:0kB, shmem-rss:128kB
[Sun Nov 7 08:47:29 2021] Out of memory: Killed process 7453 (uwsgi) total-vm:8182164kB, anon-rss:7980296kB, file-rss:0kB, shmem-rss:128kB
[Sun Nov 7 11:28:15 2021] Out of memory: Killed process 12774 (uwsgi) total-vm:8297484kB, anon-rss:8040232kB, file-rss:0kB, shmem-rss:128kB
[Sun Nov 7 15:05:22 2021] Out of memory: Killed process 16003 (uwsgi) total-vm:8259072kB, anon-rss:8029160kB, file-rss:0kB, shmem-rss:128kB
[Mon Nov 8 00:08:59 2021] Out of memory: Killed process 20442 (uwsgi) total-vm:6887800kB, anon-rss:6657480kB, file-rss:0kB, shmem-rss:128kB
[Mon Nov 8 03:43:51 2021] Out of memory: Killed process 31699 (uwsgi) total-vm:8273612kB, anon-rss:8023212kB, file-rss:0kB, shmem-rss:128kB
[Mon Nov 8 19:17:54 2021] Out of memory: Killed process 3302 (uwsgi) total-vm:8195640kB, anon-rss:7975640kB, file-rss:0kB, shmem-rss:128kB
[Tue Nov 9 00:14:57 2021] Out of memory: Killed process 22671 (uwsgi) total-vm:6814652kB, anon-rss:6553372kB, file-rss:0kB, shmem-rss:128kB
[Tue Nov 9 06:01:11 2021] Out of memory: Killed process 28922 (uwsgi) total-vm:7822680kB, anon-rss:7612088kB, file-rss:0kB, shmem-rss:128kB
[Wed Nov 10 00:01:29 2021] Out of memory: Killed process 3191 (uwsgi) total-vm:5780740kB, anon-rss:5549512kB, file-rss:0kB, shmem-rss:128kB
Updated by pjessen about 3 years ago
- Private changed from Yes to No
I suspect this is also making uwsgi slower and slower - 1040 time-outs since midnight today:
2021/11/10 10:09:38 [error] 25453#25453: *3843163 upstream timed out (110: Connection timed out) while connecting to upstream, client: 114.119.138.77, server: lists.opensuse.org, request: "GET /archives/list/users-hu@lists.opensuse.org/export/users-hu@lists.opensuse.org-2009-09.mbox.gz?start=2009-08-01&end=2009-09-01 HTTP/1.1", upstream: "uwsgi://0.0.0.0:8000", host: "lists.opensuse.org"
Updated by pjessen about 3 years ago
I did some googling and I have amended uwsgi.ini with this:
# Worker Management
max-requests = 1000 ; Restart workers after this many requests
max-worker-lifetime = 3600 ; Restart workers after this many seconds
reload-on-rss = 1024 ; Restart workers after this much resident memory (in MB)
worker-reload-mercy = 60 ; How long to wait before forcefully killing workers
From https://www.techatbloomberg.com/blog/configuring-uwsgi-production-deployment/
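To see whether the reload-on-rss cap actually kicks in, keeping an eye on per-worker memory should do. A minimal sketch (the 60-second interval is arbitrary; ps reports RSS in KiB, so the 1024 MB cap corresponds to roughly 1048576):
watch -n 60 'ps -C uwsgi -o pid,rss,etime,args --sort=-rss | head'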
Updated by pjessen about 3 years ago
I am not sure whether the above has worked. There has not been any oom kill since I added it, but I have just seen a uwsgi process with about 4.5 GB in use. It was replaced soon after, though.
Updated by pjessen about 3 years ago
Actually, it does work. I have just watched a uwsgi process being stopped and a new one started.
Updated by pjessen about 3 years ago
- Category set to Mailing lists
- Status changed from New to Workable
- Assignee set to pjessen
I don't get it.
Today, four oom kills:
dmesg -T | grep Killed
[Thu Nov 18 15:22:36 2021] Out of memory: Killed process 19146 (uwsgi) total-vm:9138912kB, anon-rss:8921448kB, file-rss:0kB, shmem-rss:128kB
[Thu Nov 18 16:21:46 2021] Out of memory: Killed process 20832 (uwsgi) total-vm:9165396kB, anon-rss:8915852kB, file-rss:0kB, shmem-rss:128kB
[Thu Nov 18 16:49:57 2021] Out of memory: Killed process 21631 (uwsgi) total-vm:9147092kB, anon-rss:8916712kB, file-rss:0kB, shmem-rss:124kB
[Thu Nov 18 17:43:49 2021] Out of memory: Killed process 22753 (uwsgi) total-vm:9136628kB, anon-rss:8899684kB, file-rss:0kB, shmem-rss:124kB
Updated by pjessen almost 3 years ago
18 November - 10 oom kills
19 November - 2 oom kills
20 November - 6 oom kills
21 November - 12 oom kills
22 November - 11 oom kills
23 November - 5 oom kills
24 November - 7 oom kills
25, 26 November - none.
27 November - 1 oom kill
Since then, none.
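For the record, a per-day tally like the above can be pulled straight out of the kernel log. A rough sketch, relying on the dmesg -T date format seen in the excerpts above (fields 2 and 3 are month and day):
dmesg -T | grep 'Out of memory' | awk '{print $2, $3}' | sort | uniq -c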
Updated by pjessen over 2 years ago
- Status changed from Workable to In Progress
- % Done changed from 0 to 80
In the last thirty days, only one oom kill. It seems clear that there is a memory leak, but at least this work-around (uwsgi.ini) has taken care of the worst of it. Clearly this needs to be salted, but I'll admit I'm not very good with Salt.
Updated by pjessen over 2 years ago
- Related to tickets #101842: mailman3 - nginx memory usage? added
Updated by pjessen over 2 years ago
- Related to tickets #113111: lists.opensuse.org - webserver backand timeouts - nginx 504 gateway timeout added
Updated by hellcp about 2 years ago
We are no longer using uwsgi, but it would be worth checking whether a similar issue happens with gunicorn
Updated by pjessen almost 2 years ago
- Subject changed from mailman3 - uwsgi memory leak? to mailman3 - uwsgi / gunicorn memory leak?
- % Done changed from 80 to 0
hellcp wrote:
We are no longer using uwsgi, but it would be worth checking whether a similar issue happens with gunicorn
I have just checked - 66 oom kills in the last two days.
[Wed Jan 4 04:45:19 2023] Out of memory: Killed process 5538 (gunicorn) total-vm:2606632kB, anon-rss:2579736kB, file-rss:0kB, shmem-rss:4kB, UID:467 pgtables:5152kB oom_score_adj:0
[Wed Jan 4 05:04:53 2023] Out of memory: Killed process 21942 (gunicorn) total-vm:2583996kB, anon-rss:2551464kB, file-rss:0kB, shmem-rss:4kB, UID:467 pgtables:5104kB oom_score_adj:0
[Wed Jan 4 05:59:49 2023] Out of memory: Killed process 30114 (gunicorn) total-vm:2885292kB, anon-rss:2859700kB, file-rss:0kB, shmem-rss:4kB, UID:467 pgtables:5696kB oom_score_adj:0
[Wed Jan 4 06:00:21 2023] Out of memory: Killed process 17175 (gunicorn) total-vm:3771668kB, anon-rss:3745100kB, file-rss:0kB, shmem-rss:0kB, UID:467 pgtables:7496kB oom_score_adj:0
[Wed Jan 4 07:10:51 2023] Out of memory: Killed process 21717 (gunicorn) total-vm:2954316kB, anon-rss:2928600kB, file-rss:0kB, shmem-rss:4kB, UID:467 pgtables:5832kB oom_score_adj:0
[Wed Jan 4 07:11:15 2023] Out of memory: Killed process 29888 (gunicorn) total-vm:4262912kB, anon-rss:4230728kB, file-rss:0kB, shmem-rss:4kB, UID:467 pgtables:8384kB oom_score_adj:0
[Wed Jan 4 08:32:20 2023] Out of memory: Killed process 1289 (gunicorn) total-vm:2896296kB, anon-rss:2870980kB, file-rss:0kB, shmem-rss:4kB, UID:467 pgtables:5716kB oom_score_adj:0
[Wed Jan 4 10:15:53 2023] Out of memory: Killed process 5759 (gunicorn) total-vm:2974380kB, anon-rss:2944548kB, file-rss:0kB, shmem-rss:4kB, UID:467 pgtables:5860kB oom_score_adj:0
[Wed Jan 4 10:37:24 2023] Out of memory: Killed process 2653 (gunicorn) total-vm:3007116kB, anon-rss:2979504kB, file-rss:0kB, shmem-rss:4kB, UID:467 pgtables:5932kB oom_score_adj:0
[Wed Jan 4 12:11:48 2023] Out of memory: Killed process 21784 (gunicorn) total-vm:2902264kB, anon-rss:2857576kB, file-rss:0kB, shmem-rss:4kB, UID:467 pgtables:5724kB oom_score_adj:0
[Wed Jan 4 12:13:21 2023] Out of memory: Killed process 26998 (gunicorn) total-vm:3146076kB, anon-rss:3124972kB, file-rss:0kB, shmem-rss:4kB, UID:467 pgtables:6204kB oom_score_adj:0
[Wed Jan 4 15:02:16 2023] Out of memory: Killed process 4219 (gunicorn) total-vm:2797524kB, anon-rss:2777004kB, file-rss:0kB, shmem-rss:4kB, UID:467 pgtables:5520kB oom_score_adj:0
[Wed Jan 4 16:05:05 2023] Out of memory: Killed process 29692 (gunicorn) total-vm:2903796kB, anon-rss:2872340kB, file-rss:0kB, shmem-rss:0kB, UID:467 pgtables:5724kB oom_score_adj:0
[Wed Jan 4 16:10:10 2023] Out of memory: Killed process 4367 (gunicorn) total-vm:2890408kB, anon-rss:2852564kB, file-rss:0kB, shmem-rss:4kB, UID:467 pgtables:5704kB oom_score_adj:0
[Wed Jan 4 16:15:11 2023] Out of memory: Killed process 29855 (gunicorn) total-vm:3804028kB, anon-rss:3762160kB, file-rss:0kB, shmem-rss:4kB, UID:467 pgtables:7492kB oom_score_adj:0
[Wed Jan 4 18:35:14 2023] Out of memory: Killed process 813 (gunicorn) total-vm:4009728kB, anon-rss:3983844kB, file-rss:0kB, shmem-rss:4kB, UID:467 pgtables:7900kB oom_score_adj:0
[Thu Jan 5 00:13:43 2023] Out of memory: Killed process 12828 (gunicorn) total-vm:2002940kB, anon-rss:1967236kB, file-rss:0kB, shmem-rss:4kB, UID:467 pgtables:3968kB oom_score_adj:0
[Thu Jan 5 00:27:45 2023] Out of memory: Killed process 30425 (gunicorn) total-vm:2815044kB, anon-rss:2778976kB, file-rss:0kB, shmem-rss:4kB, UID:467 pgtables:5556kB oom_score_adj:0
[Thu Jan 5 01:53:13 2023] Out of memory: Killed process 23682 (gunicorn) total-vm:2011124kB, anon-rss:1976052kB, file-rss:0kB, shmem-rss:0kB, UID:467 pgtables:3980kB oom_score_adj:0
[Thu Jan 5 01:57:14 2023] Out of memory: Killed process 22627 (gunicorn) total-vm:2245508kB, anon-rss:2208772kB, file-rss:0kB, shmem-rss:4kB, UID:467 pgtables:4444kB oom_score_adj:0
[Thu Jan 5 02:19:19 2023] Out of memory: Killed process 5083 (gunicorn) total-vm:2328776kB, anon-rss:2295772kB, file-rss:0kB, shmem-rss:4kB, UID:467 pgtables:4608kB oom_score_adj:0
[Thu Jan 5 02:31:45 2023] Out of memory: Killed process 13990 (gunicorn) total-vm:2557472kB, anon-rss:2531064kB, file-rss:0kB, shmem-rss:4kB, UID:467 pgtables:5052kB oom_score_adj:0
[Thu Jan 5 03:13:26 2023] Out of memory: Killed process 19255 (gunicorn) total-vm:1426376kB, anon-rss:1397856kB, file-rss:0kB, shmem-rss:0kB, UID:467 pgtables:2888kB oom_score_adj:0
[Thu Jan 5 03:14:01 2023] Out of memory: Killed process 29453 (gunicorn) total-vm:1521680kB, anon-rss:1496060kB, file-rss:780kB, shmem-rss:4kB, UID:467 pgtables:3032kB oom_score_adj:0
[Thu Jan 5 04:20:52 2023] Out of memory: Killed process 29794 (gunicorn) total-vm:1673412kB, anon-rss:1640244kB, file-rss:0kB, shmem-rss:0kB, UID:467 pgtables:3324kB oom_score_adj:0
[Thu Jan 5 05:39:39 2023] Out of memory: Killed process 16801 (gunicorn) total-vm:2615532kB, anon-rss:2577312kB, file-rss:0kB, shmem-rss:4kB, UID:467 pgtables:5172kB oom_score_adj:0
[Thu Jan 5 05:40:05 2023] Out of memory: Killed process 26829 (gunicorn) total-vm:2535804kB, anon-rss:2509636kB, file-rss:0kB, shmem-rss:4kB, UID:467 pgtables:5016kB oom_score_adj:0
[Thu Jan 5 10:02:21 2023] Out of memory: Killed process 18188 (gunicorn) total-vm:2328428kB, anon-rss:2291108kB, file-rss:0kB, shmem-rss:4kB, UID:467 pgtables:4600kB oom_score_adj:0
[Thu Jan 5 12:33:53 2023] Out of memory: Killed process 5178 (gunicorn) total-vm:1305460kB, anon-rss:1274636kB, file-rss:0kB, shmem-rss:0kB, UID:467 pgtables:2608kB oom_score_adj:0
[Thu Jan 5 14:59:27 2023] Out of memory: Killed process 2624 (gunicorn) total-vm:1691096kB, anon-rss:1663288kB, file-rss:52kB, shmem-rss:4kB, UID:467 pgtables:3364kB oom_score_adj:0
[Thu Jan 5 15:11:17 2023] Out of memory: Killed process 29774 (gunicorn) total-vm:2844536kB, anon-rss:2818588kB, file-rss:0kB, shmem-rss:0kB, UID:467 pgtables:5612kB oom_score_adj:0
[Thu Jan 5 15:55:54 2023] Out of memory: Killed process 9923 (gunicorn) total-vm:2904112kB, anon-rss:2869136kB, file-rss:0kB, shmem-rss:4kB, UID:467 pgtables:5736kB oom_score_adj:0
[Thu Jan 5 16:00:24 2023] Out of memory: Killed process 18057 (gunicorn) total-vm:3078196kB, anon-rss:3052240kB, file-rss:0kB, shmem-rss:4kB, UID:467 pgtables:6072kB oom_score_adj:0
[Thu Jan 5 16:18:52 2023] Out of memory: Killed process 22888 (gunicorn) total-vm:2915716kB, anon-rss:2868836kB, file-rss:0kB, shmem-rss:0kB, UID:467 pgtables:5752kB oom_score_adj:0
[Thu Jan 5 16:55:52 2023] Out of memory: Killed process 30314 (gunicorn) total-vm:2918484kB, anon-rss:2872412kB, file-rss:0kB, shmem-rss:0kB, UID:467 pgtables:5756kB oom_score_adj:0
[Thu Jan 5 17:09:55 2023] Out of memory: Killed process 15344 (gunicorn) total-vm:4205252kB, anon-rss:4166980kB, file-rss:0kB, shmem-rss:4kB, UID:467 pgtables:8276kB oom_score_adj:0
[Thu Jan 5 18:24:24 2023] Out of memory: Killed process 17193 (gunicorn) total-vm:2968284kB, anon-rss:2941412kB, file-rss:0kB, shmem-rss:0kB, UID:467 pgtables:5856kB oom_score_adj:0
[Thu Jan 5 18:51:45 2023] Out of memory: Killed process 18221 (gunicorn) total-vm:2908556kB, anon-rss:2866624kB, file-rss:0kB, shmem-rss:4kB, UID:467 pgtables:5740kB oom_score_adj:0
[Thu Jan 5 18:53:19 2023] Out of memory: Killed process 2550 (gunicorn) total-vm:4633480kB, anon-rss:4578596kB, file-rss:0kB, shmem-rss:4kB, UID:467 pgtables:9120kB oom_score_adj:0
[Thu Jan 5 19:04:03 2023] Out of memory: Killed process 26855 (gunicorn) total-vm:2947052kB, anon-rss:2921236kB, file-rss:0kB, shmem-rss:4kB, UID:467 pgtables:5812kB oom_score_adj:0
[Thu Jan 5 19:33:24 2023] Out of memory: Killed process 12263 (gunicorn) total-vm:4842780kB, anon-rss:4783644kB, file-rss:1336kB, shmem-rss:0kB, UID:467 pgtables:9528kB oom_score_adj:0
[Thu Jan 5 19:47:13 2023] Out of memory: Killed process 20480 (gunicorn) total-vm:4776396kB, anon-rss:4735120kB, file-rss:0kB, shmem-rss:4kB, UID:467 pgtables:9400kB oom_score_adj:0
[Thu Jan 5 19:47:38 2023] Out of memory: Killed process 23009 (gunicorn) total-vm:2658244kB, anon-rss:2630716kB, file-rss:0kB, shmem-rss:4kB, UID:467 pgtables:5248kB oom_score_adj:0
[Thu Jan 5 20:20:10 2023] Out of memory: Killed process 7542 (gunicorn) total-vm:2877672kB, anon-rss:2840140kB, file-rss:0kB, shmem-rss:4kB, UID:467 pgtables:5676kB oom_score_adj:0
[Thu Jan 5 22:23:23 2023] Out of memory: Killed process 8954 (gunicorn) total-vm:2376512kB, anon-rss:2342552kB, file-rss:0kB, shmem-rss:4kB, UID:467 pgtables:4736kB oom_score_adj:0
[Thu Jan 5 22:23:43 2023] Out of memory: Killed process 4396 (gunicorn) total-vm:2521152kB, anon-rss:2484520kB, file-rss:0kB, shmem-rss:4kB, UID:467 pgtables:4984kB oom_score_adj:0
[Thu Jan 5 23:17:47 2023] Out of memory: Killed process 14157 (gunicorn) total-vm:1947704kB, anon-rss:1921316kB, file-rss:0kB, shmem-rss:0kB, UID:467 pgtables:3988kB oom_score_adj:0
[Thu Jan 5 23:19:34 2023] Out of memory: Killed process 30034 (gunicorn) total-vm:2769896kB, anon-rss:2749300kB, file-rss:0kB, shmem-rss:4kB, UID:467 pgtables:5472kB oom_score_adj:0
[Thu Jan 5 23:26:52 2023] Out of memory: Killed process 21156 (gunicorn) total-vm:2718712kB, anon-rss:2690324kB, file-rss:0kB, shmem-rss:0kB, UID:467 pgtables:5364kB oom_score_adj:0
[Fri Jan 6 00:25:10 2023] Out of memory: Killed process 7790 (gunicorn) total-vm:2313576kB, anon-rss:2287992kB, file-rss:0kB, shmem-rss:4kB, UID:467 pgtables:4576kB oom_score_adj:0
[Fri Jan 6 00:31:48 2023] Out of memory: Killed process 26846 (gunicorn) total-vm:3204688kB, anon-rss:3164676kB, file-rss:0kB, shmem-rss:0kB, UID:467 pgtables:6324kB oom_score_adj:0
[Fri Jan 6 01:36:00 2023] Out of memory: Killed process 30475 (gunicorn) total-vm:3141108kB, anon-rss:3104828kB, file-rss:0kB, shmem-rss:4kB, UID:467 pgtables:6196kB oom_score_adj:0
[Fri Jan 6 02:12:23 2023] Out of memory: Killed process 13472 (gunicorn) total-vm:2977060kB, anon-rss:2930428kB, file-rss:0kB, shmem-rss:4kB, UID:467 pgtables:5872kB oom_score_adj:0
[Fri Jan 6 02:19:07 2023] Out of memory: Killed process 30069 (gunicorn) total-vm:3144920kB, anon-rss:3118040kB, file-rss:0kB, shmem-rss:4kB, UID:467 pgtables:6208kB oom_score_adj:0
[Fri Jan 6 03:38:46 2023] Out of memory: Killed process 26967 (gunicorn) total-vm:2988888kB, anon-rss:2941740kB, file-rss:0kB, shmem-rss:0kB, UID:467 pgtables:5896kB oom_score_adj:0
[Fri Jan 6 04:31:43 2023] Out of memory: Killed process 13408 (gunicorn) total-vm:2883936kB, anon-rss:2858012kB, file-rss:568kB, shmem-rss:4kB, UID:467 pgtables:5692kB oom_score_adj:0
[Fri Jan 6 04:33:35 2023] Out of memory: Killed process 16854 (gunicorn) total-vm:4493372kB, anon-rss:4439648kB, file-rss:0kB, shmem-rss:4kB, UID:467 pgtables:8844kB oom_score_adj:0
[Fri Jan 6 06:30:26 2023] Out of memory: Killed process 28612 (gunicorn) total-vm:2909780kB, anon-rss:2867460kB, file-rss:0kB, shmem-rss:4kB, UID:467 pgtables:5744kB oom_score_adj:0
[Fri Jan 6 06:31:29 2023] Out of memory: Killed process 12384 (gunicorn) total-vm:3916004kB, anon-rss:3889560kB, file-rss:0kB, shmem-rss:4kB, UID:467 pgtables:7716kB oom_score_adj:0
[Fri Jan 6 07:47:31 2023] Out of memory: Killed process 20180 (gunicorn) total-vm:2886980kB, anon-rss:2847144kB, file-rss:0kB, shmem-rss:4kB, UID:467 pgtables:5696kB oom_score_adj:0
[Fri Jan 6 07:48:43 2023] Out of memory: Killed process 3687 (gunicorn) total-vm:2953936kB, anon-rss:2927632kB, file-rss:0kB, shmem-rss:4kB, UID:467 pgtables:5824kB oom_score_adj:0
[Fri Jan 6 07:59:08 2023] Out of memory: Killed process 28916 (gunicorn) total-vm:2889672kB, anon-rss:2852636kB, file-rss:0kB, shmem-rss:4kB, UID:467 pgtables:5700kB oom_score_adj:0
[Fri Jan 6 08:07:28 2023] Out of memory: Killed process 4059 (gunicorn) total-vm:2951624kB, anon-rss:2926532kB, file-rss:0kB, shmem-rss:4kB, UID:467 pgtables:5820kB oom_score_adj:0
[Fri Jan 6 09:19:05 2023] Out of memory: Killed process 13988 (gunicorn) total-vm:3335224kB, anon-rss:3290408kB, file-rss:4kB, shmem-rss:0kB, UID:467 pgtables:6572kB oom_score_adj:0
[Fri Jan 6 10:21:52 2023] Out of memory: Killed process 10070 (gunicorn) total-vm:2377092kB, anon-rss:2339696kB, file-rss:0kB, shmem-rss:4kB, UID:467 pgtables:4696kB oom_score_adj:0
[Fri Jan 6 10:33:26 2023] Out of memory: Killed process 3729 (gunicorn) total-vm:2968816kB, anon-rss:2940764kB, file-rss:0kB, shmem-rss:4kB, UID:467 pgtables:5864kB oom_score_adj:0
Updated by pjessen over 1 year ago
- Assignee set to pjessen
While playing with the nginx rewrite map, I noticed several gunicorn worker processes using up 1 GB and more. It seems they had been running for a few days, e.g.:
[2023-03-10 04:37:15 +0000] [7650] [INFO] Booting worker with pid: 7650
[2023-03-11 23:46:17 +0000] [7650] [ERROR] Error handling request
[2023-03-14 11:39:35 +0000] [7650] [INFO] Worker exiting (pid: 7650)
(from /var/log/postorius/gunicorn.log)
I decided to just kill off the four to six of those with more than 1 GB of memory in use. Of course this immediately released a lot of memory, and I started wondering whether doing it regularly might be a good idea (a sketch follows below).
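A sketch of what doing it regularly might look like, assuming the same 1 GB threshold I used by hand (ps prints RSS in KiB; killing a worker just makes the gunicorn master spawn a fresh one):
ps -C gunicorn -o pid=,rss= | awk '$2 > 1048576 { print $1 }' | xargs -r kill
Note this would match the master process too, but the master stays small, so in practice only bloated workers get hit.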
Just as for uwsgi, I have added a "--max-requests" argument, set to 100. That may seem quite low, but I think it is worth trading some worker start-up seconds for lower memory usage.
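As an aside, gunicorn also offers --max-requests-jitter, which randomizes each worker's restart point so that all 16 workers do not recycle at the same moment. A sketch only; the jitter value here is an arbitrary example:
/usr/bin/gunicorn \
    --workers=16 \
    --max-requests=100 \
    --max-requests-jitter=20 \
    --log-file=/var/log/postorius/gunicorn.log \
    mailman_web.wsgi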
Updated by pjessen over 1 year ago
pjessen wrote:
Just as for uwsgi, I have added a "--max-requests" argument, set to 100. That may seem quite low, but I think it is worth trading some worker start-up seconds for lower memory usage.
Have increased it to 500 - with 100, a worker was restarted about every 10 minutes, so 500 works out to roughly one restart per hour. Every hour should be good enough.
Updated by pjessen over 1 year ago
pjessen wrote:
Have increased it to 500 - with 100, a worker was restarted about every 10 minutes, so 500 works out to roughly one restart per hour. Every hour should be good enough.
Yup, looks good to me. Memory usage is way down, no swap in use, more than 3 GB free.
I added the max-requests argument to the command line with a systemd drop-in override:
# /etc/systemd/system/mailman-web.service
[Unit]
Description=Postorius

[Service]
User=mailmanweb
Group=postorius
Environment="PYTHONPATH=/srv/www/webapps/mailman/web/"
Environment="DJANGO_SETTINGS_MODULE=settings"
WorkingDirectory=/srv/www/webapps/mailman/web/
ExecStart=/usr/bin/gunicorn \
    --workers=16 \
    --log-file=/var/log/postorius/gunicorn.log \
    mailman_web.wsgi

[Install]
WantedBy=multi-user.target

# /etc/systemd/system/mailman-web.service.d/timeout.conf
[Service]
ExecStart=
ExecStart=/usr/bin/gunicorn \
    --workers=16 \
    --timeout=0 \
    --max-requests=500 \
    --log-file=/var/log/postorius/gunicorn.log \
    mailman_web.wsgi
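After adding the drop-in, the usual steps apply; systemctl cat shows the merged unit, which is handy for checking that the override actually replaced the ExecStart:
systemctl daemon-reload
systemctl restart mailman-web
systemctl cat mailman-web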
Obviously requires saltifying, but I think I'll procrastinate for a while. Do we need a category called "Salt pending"?
Updated by pjessen over 1 year ago
pjessen wrote:
Obviously requires saltifying, but I think I'll procrastinate for a while. Do we need a category called "Salt pending"?
Actually, a status "Salt pending" might be quite useful.
Updated by crameleon 4 months ago
- Status changed from In Progress to Resolved
- Assignee changed from pjessen to crameleon
- systemd unit changes added to package https://build.opensuse.org/projects/openSUSE:infrastructure:mailman3/packages/python-mailman-web/files/mailman-web.service?expand=1
- systemd unit configured for memory capping
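For reference, memory capping in a systemd unit is typically a MemoryMax= line in the [Service] section (cgroup v2). The value below is purely illustrative; the actual setting lives in the linked package:
[Service]
MemoryMax=4G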