Memory problems with Jest and workers

nodejs
getting-started

#1

I’m writing this down because I lost some hours tracking down this problem.

I’ve been trying to run Jest for a while and was hitting memory problems every time.

The output of the npm test command:

> potato@1.0.0 test /home/ubuntu/potato
> jest --coverage

died unexpectedly

The process dump generated by circle was showing this:

16972 144436 6.4 /opt/circleci/nodejs/v4.2.6/bin/node /home/ubuntu/potato/node_modules/jest/node_modules/jest-cli/node_modules/worker-farm/lib/child/index.js
17052 143812 6.4 /opt/circleci/nodejs/v4.2.6/bin/node /home/ubuntu/potato/node_modules/jest/node_modules/jest-cli/node_modules/worker-farm/lib/child/index.js
16956 143012 6.4 /opt/circleci/nodejs/v4.2.6/bin/node /home/ubuntu/potato/node_modules/jest/node_modules/jest-cli/node_modules/worker-farm/lib/child/index.js
17062 142864 6.4 /opt/circleci/nodejs/v4.2.6/bin/node /home/ubuntu/potato/node_modules/jest/node_modules/jest-cli/node_modules/worker-farm/lib/child/index.js
16962 142332 6.4 /opt/circleci/nodejs/v4.2.6/bin/node /home/ubuntu/potato/node_modules/jest/node_modules/jest-cli/node_modules/worker-farm/lib/child/index.js
17007 142324 6.5 /opt/circleci/nodejs/v4.2.6/bin/node /home/ubuntu/potato/node_modules/jest/node_modules/jest-cli/node_modules/worker-farm/lib/child/index.js
17012 142296 6.4 /opt/circleci/nodejs/v4.2.6/bin/node /home/ubuntu/potato/node_modules/jest/node_modules/jest-cli/node_modules/worker-farm/lib/child/index.js
17022 142288 6.4 /opt/circleci/nodejs/v4.2.6/bin/node /home/ubuntu/potato/node_modules/jest/node_modules/jest-cli/node_modules/worker-farm/lib/child/index.js
16961 142268 6.5 /opt/circleci/nodejs/v4.2.6/bin/node /home/ubuntu/potato/node_modules/jest/node_modules/jest-cli/node_modules/worker-farm/lib/child/index.js
16971 142204 6.4 /opt/circleci/nodejs/v4.2.6/bin/node /home/ubuntu/potato/node_modules/jest/node_modules/jest-cli/node_modules/worker-farm/lib/child/index.js
17037 142112 6.4 /opt/circleci/nodejs/v4.2.6/bin/node /home/ubuntu/potato/node_modules/jest/node_modules/jest-cli/node_modules/worker-farm/lib/child/index.js
17098 142080 6.4 /opt/circleci/nodejs/v4.2.6/bin/node /home/ubuntu/potato/node_modules/jest/node_modules/jest-cli/node_modules/worker-farm/lib/child/index.js
16997 142064 6.4 /opt/circleci/nodejs/v4.2.6/bin/node /home/ubuntu/potato/node_modules/jest/node_modules/jest-cli/node_modules/worker-farm/lib/child/index.js
17067 142052 6.4 /opt/circleci/nodejs/v4.2.6/bin/node /home/ubuntu/potato/node_modules/jest/node_modules/jest-cli/node_modules/worker-farm/lib/child/index.js
16992 142036 6.4 /opt/circleci/nodejs/v4.2.6/bin/node /home/ubuntu/potato/node_modules/jest/node_modules/jest-cli/node_modules/worker-farm/lib/child/index.js
17088 142032 6.3 /opt/circleci/nodejs/v4.2.6/bin/node /home/ubuntu/potato/node_modules/jest/node_modules/jest-cli/node_modules/worker-farm/lib/child/index.js
16987 142016 6.4 /opt/circleci/nodejs/v4.2.6/bin/node /home/ubuntu/potato/node_modules/jest/node_modules/jest-cli/node_modules/worker-farm/lib/child/index.js
16982 141972 6.4 /opt/circleci/nodejs/v4.2.6/bin/node /home/ubuntu/potato/node_modules/jest/node_modules/jest-cli/node_modules/worker-farm/lib/child/index.js
16977 141848 6.4 /opt/circleci/nodejs/v4.2.6/bin/node /home/ubuntu/potato/node_modules/jest/node_modules/jest-cli/node_modules/worker-farm/lib/child/index.js
17072 141132 6.4 /opt/circleci/nodejs/v4.2.6/bin/node /home/ubuntu/potato/node_modules/jest/node_modules/jest-cli/node_modules/worker-farm/lib/child/index.js
17006 141124 6.4 /opt/circleci/nodejs/v4.2.6/bin/node /home/ubuntu/potato/node_modules/jest/node_modules/jest-cli/node_modules/worker-farm/lib/child/index.js
17078 140488 6.4 /opt/circleci/nodejs/v4.2.6/bin/node /home/ubuntu/potato/node_modules/jest/node_modules/jest-cli/node_modules/worker-farm/lib/child/index.js
17021 140404 6.4 /opt/circleci/nodejs/v4.2.6/bin/node /home/ubuntu/potato/node_modules/jest/node_modules/jest-cli/node_modules/worker-farm/lib/child/index.js
17042 140404 6.4 /opt/circleci/nodejs/v4.2.6/bin/node /home/ubuntu/potato/node_modules/jest/node_modules/jest-cli/node_modules/worker-farm/lib/child/index.js
17047 140344 6.4 /opt/circleci/nodejs/v4.2.6/bin/node /home/ubuntu/potato/node_modules/jest/node_modules/jest-cli/node_modules/worker-farm/lib/child/index.js
17061 140328 6.4 /opt/circleci/nodejs/v4.2.6/bin/node /home/ubuntu/potato/node_modules/jest/node_modules/jest-cli/node_modules/worker-farm/lib/child/index.js
17093 140304 6.4 /opt/circleci/nodejs/v4.2.6/bin/node /home/ubuntu/potato/node_modules/jest/node_modules/jest-cli/node_modules/worker-farm/lib/child/index.js
17027 140148 6.4 /opt/circleci/nodejs/v4.2.6/bin/node /home/ubuntu/potato/node_modules/jest/node_modules/jest-cli/node_modules/worker-farm/lib/child/index.js
17032 140128 6.4 /opt/circleci/nodejs/v4.2.6/bin/node /home/ubuntu/potato/node_modules/jest/node_modules/jest-cli/node_modules/worker-farm/lib/child/index.js
17083 140084 6.4 /opt/circleci/nodejs/v4.2.6/bin/node /home/ubuntu/potato/node_modules/jest/node_modules/jest-cli/node_modules/worker-farm/lib/child/index.js
17107 140076 6.4 /opt/circleci/nodejs/v4.2.6/bin/node /home/ubuntu/potato/node_modules/jest/node_modules/jest-cli/node_modules/worker-farm/lib/child/index.js

After some research I discovered that Jest creates one worker for each CPU in the system, with this code:

maxProcesses: options.maxProcesses || os.cpus().length,

I suspect that os.cpus().length was returning 32 (the CPUs of the real machine) instead of 2 (the CPUs assigned to the container). Therefore it was allocating about 4.4 GB and breaking the container.

The solution, for me, was to limit the number of workers manually:

jest --coverage --maxWorkers 2

Or you can run the tests in the main process, without workers, by doing:

jest --coverage --runInBand
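If you don't want to pass the flag on every invocation, the same cap can live in the project configuration. This is a sketch: newer Jest versions read a jest.config.js file (older ones read a "jest" key in package.json instead), and the maxWorkers option mirrors the --maxWorkers flag.

```javascript
// jest.config.js -- cap the worker pool so it doesn't scale with the
// host's CPU count when running inside a CPU-limited container.
module.exports = {
  maxWorkers: 2,
};
```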

#2

I had the same issue, thanks 🙂


#3

Us too, and this change also halved our CI runtime. Thanks for this!


#4

Same problem here, you saved us quite some time, thanks a lot 🙂


#5

Great work here @jpiqueras. This issue randomly started happening for us over the past few weeks and I finally started digging all over the place. I found this solution and it did the trick. Thanks!

