What's going wrong?
I'm running a proxy (redbird) in cluster mode, using a single process. (I also asked about a possible leak in redbird itself at OptimalBits/redbird#237, but that doesn't seem to be the cause.)
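For context, the proxy is launched with something like the following ecosystem file (a simplified sketch; the names and paths are placeholders, not my exact production config):

```js
// ecosystem.config.js — simplified illustration of the setup, not the exact production config.
module.exports = {
  apps: [
    {
      name: 'proxy',
      script: './proxy.js',   // entry point that starts redbird
      exec_mode: 'cluster',   // PM2 cluster mode...
      instances: 1            // ...but with only a single worker process
    }
  ]
};
```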
To check the memory usage of my application, I attached the Chrome debugger to the live application and looked at the values returned by process.memoryUsage().
The heapUsed value looks fine (around 50MB), but the rss (which I believe belongs to PM2's master process) keeps growing and is now at 492MB.
I've also taken memory snapshots of the proxy, and heap usage never goes above 55MB.
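For reference, this is roughly the kind of check I run inside the proxy process (a simplified sketch, not the actual code from my app):

```js
// Log process.memoryUsage() once a minute so rss and heapUsed can be compared over time.
setInterval(() => {
  const { rss, heapTotal, heapUsed, external } = process.memoryUsage();
  const mb = (bytes) => (bytes / 1024 / 1024).toFixed(1) + 'MB';
  console.log(`rss=${mb(rss)} heapTotal=${mb(heapTotal)} heapUsed=${mb(heapUsed)} external=${mb(external)}`);
}, 60 * 1000);
```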
I'd appreciate any hints about what might be going on. I'm going to experiment with running the proxy without PM2 soon to see what happens.
Note from the Node.js docs: when using Worker threads, rss will be a value that is valid for the entire process, while the other fields will only refer to the current thread.
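A small sketch of that distinction (illustrative only, unrelated to the proxy code itself):

```js
// memory-per-thread.js — rss is reported for the whole OS process,
// while heapUsed belongs to the current thread's V8 isolate.
const { Worker, isMainThread } = require('worker_threads');

function report(label) {
  const { rss, heapUsed } = process.memoryUsage();
  console.log(`${label}: rss=${(rss / 1048576).toFixed(1)}MB heapUsed=${(heapUsed / 1048576).toFixed(1)}MB`);
}

if (isMainThread) {
  report('main');
  new Worker(__filename);   // same process, separate V8 heap
} else {
  report('worker');         // rss is roughly the same as main's, heapUsed is not
}
```

(PM2's cluster mode forks separate processes rather than worker threads, so each PM2-managed instance reports its own rss.)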
How could we reproduce this issue?
It may not be easy to reproduce as I'm running a game on production, and I've failed to reproduce the issue locally myself.
Supporting information
--- PM2 report ----------------------------------------------------------------
Date : Thu Jan 23 2020 02:23:02 GMT+0000 (Coordinated Universal Time)
===============================================================================
--- Daemon -------------------------------------------------
pm2d version : 4.2.1
node version : 12.14.1
node path : /root/.nvm/versions/node/v12.14.1/bin/pm2
argv : /root/.nvm/versions/node/v12.14.1/bin/node,/root/.nvm/versions/node/v12.14.1/lib/node_modules/pm2/lib/Daemon.js
argv0 : node
user : root
uid : 0
gid : 0
uptime : 20937min
===============================================================================
--- CLI ----------------------------------------------------
local pm2 : 4.2.1
node version : 12.14.1
node path : /root/.nvm/versions/node/v12.14.1/bin/pm2
argv : /root/.nvm/versions/node/v12.14.1/bin/node,/root/.nvm/versions/node/v12.14.1/bin/pm2,report
argv0 : node
user : root
uid : 0
gid : 0
===============================================================================
--- System info --------------------------------------------
arch : x64
platform : linux
type : Linux
cpus : Intel(R) Xeon(R) CPU E5-2650 v4 @ 2.20GHz
cpus nb : 3
freemem : 78020608
totalmem : 1033015296
home : /root
===============================================================================
--- PM2 list -----------------------------------------------
┌─────┬─────────────┬─────────────┬─────────┬─────────┬──────────┬────────┬──────┬───────────┬──────────┬──────────┬──────────┬──────────┐
│ id │ name │ namespace │ version │ mode │ pid │ uptime │ ↺ │ status │ cpu │ mem │ user │ watching │
├─────┼─────────────┼─────────────┼─────────┼─────────┼──────────┼────────┼──────┼───────────┼──────────┼──────────┼──────────┼──────────┤
│ 3 │ proxy │ default │ 0.10.4 │ cluster │ 8141 │ 10h │ 2 │ online │ 17.9% │ 492.8mb │ root │ disabled │
│ 0 │ myapp │ default │ 3.7.0 │ fork │ 11894 │ 3h │ 4 │ online │ 0% │ 55.2mb │ root │ disabled │
│ 6 │ myapp │ default │ 3.7.0 │ fork │ 12041 │ 106m │ 2 │ online │ 0% │ 49.4mb │ root │ disabled │
│ 7 │ myapp │ default │ 3.7.0 │ fork │ 8132 │ 10h │ 1 │ online │ 25.6% │ 94.6mb │ root │ disabled │
└─────┴─────────────┴─────────────┴─────────┴─────────┴──────────┴────────┴──────┴───────────┴──────────┴──────────┴──────────┴──────────┘
===============================================================================
--- Daemon logs --------------------------------------------
/root/.pm2/pm2.log last 20 lines:
PM2 | 2020-01-22T13:25:53: PM2 log: [PM2] This PM2 is not UP TO DATE
PM2 | 2020-01-22T13:25:53: PM2 log: [PM2] Upgrade to version 4.2.3
PM2 | 2020-01-22T15:23:50: PM2 log: App [myapp:7] exited with code [1] via signal [SIGINT]
PM2 | 2020-01-22T15:23:50: PM2 log: App [myapp:7] starting in -fork mode-
PM2 | 2020-01-22T15:23:50: PM2 log: App [myapp:7] online
PM2 | 2020-01-22T15:23:50: PM2 log: App [proxy:3] exited with code [0] via signal [SIGINT]
PM2 | 2020-01-22T15:23:50: PM2 log: App [proxy:3] starting in -cluster mode-
PM2 | 2020-01-22T15:23:50: PM2 log: App [proxy:3] online
PM2 | 2020-01-22T16:19:12: PM2 log: App [myapp:6] exited with code [1] via signal [SIGINT]
PM2 | 2020-01-22T16:19:12: PM2 log: App [myapp:6] starting in -fork mode-
PM2 | 2020-01-22T16:19:12: PM2 log: App [myapp:6] online
PM2 | 2020-01-22T22:46:40: PM2 log: App [myapp:0] exited with code [1] via signal [SIGINT]
PM2 | 2020-01-22T22:46:40: PM2 log: App [myapp:0] starting in -fork mode-
PM2 | 2020-01-22T22:46:40: PM2 log: App [myapp:0] online
PM2 | 2020-01-23T00:36:33: PM2 log: App [myapp:6] exited with code [1] via signal [SIGINT]
PM2 | 2020-01-23T00:36:33: PM2 log: App [myapp:6] starting in -fork mode-
PM2 | 2020-01-23T00:36:33: PM2 log: App [myapp:6] online
PM2 | Debugger listening on ws://127.0.0.1:9229/aebbe33c-8ceb-44c8-9835-10768e00af3d
PM2 | For help, see: https://nodejs.org/en/docs/inspector
PM2 | Debugger attached.