Storing artifacts silently fails

When storing a directory as an artifact, some files are not uploaded if the directory contains too many files.

  • the folder has ~1500 files

  • the first ~1000 files upload successfully
    Uploaded /home/circleci/allure-report/data/0add1c1a-e7cf-492b-b7b5-e19b1c93d424-attachment.png

  • the subsequent files fail to upload
    Failed opening file /home/circleci/allure-report/data/cb3640c5-96af-47d4-b47b-aea04805b72b-attachment.png

  • I’ve verified the files are present and have the same permissions as the ones that uploaded

  • I’ve also tried to increase the file descriptor limit
    run: sudo sh -c "ulimit -n 10000"

  • The directory has ~150 MB of data, well below the 3 GB limit
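One way to sanity-check whether the limit change from the `run: sudo sh -c "ulimit -n 10000"` step actually took effect is to read the limit from inside the process doing the work, since each `run:` step starts a fresh shell. A minimal check (not part of the original post) using Python's standard `resource` module:

```python
# Print the file-descriptor limit as seen by the current process.
# If an earlier `sudo sh -c "ulimit -n ..."` only changed a child shell,
# this will still report the old soft limit (typically 1024).
import resource

soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print(f"soft limit: {soft}, hard limit: {hard}")
```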

any suggestions or insights appreciated

After some digging, I’m pretty sure the problem is the file descriptor limit, which is set to open files (-n) 1024 on the default machine: true config. My earlier attempt to update the limit only changed it inside the shell spawned by sudo sh -c, so it did not persist to later steps. So I’ll probably use a docker container with updated configs.
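An alternative to switching executors, assuming the upload runs in a process you control: a process can raise its own soft limit up to the hard limit without root. A sketch of that (the 65536 fallback is an arbitrary value I chose for when the hard limit is unlimited):

```python
# Raise this process's soft fd limit as far as the hard limit allows.
import resource

soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)

# Pick a target: the hard limit if it is finite, otherwise a generous
# fixed value (65536 is an assumption, not anything CircleCI documents).
target = hard if hard != resource.RLIM_INFINITY else max(soft, 65536)
resource.setrlimit(resource.RLIMIT_NOFILE, (target, hard))

new_soft, _ = resource.getrlimit(resource.RLIMIT_NOFILE)
print(f"soft limit raised from {soft} to {new_soft}")
```

This only helps if you can run the change in the same process as the upload; it won't fix a separate tool like the artifact uploader.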

The real question is why the store_artifacts command keeps all the files open during the upload.
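To illustrate why holding handles open matters (this is a hypothetical sketch, not CircleCI's actual implementation): an uploader that opens every file up front exhausts descriptors once the batch exceeds the soft limit, while one that opens, reads, and closes each file in turn never does. Reproducible with the limit artificially lowered:

```python
import os
import resource
import tempfile

# Shrink the soft fd limit so descriptor exhaustion is easy to reproduce.
_, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
resource.setrlimit(resource.RLIMIT_NOFILE, (64, hard))

# Create 100 small files in a scratch directory.
tmpdir = tempfile.mkdtemp()
paths = []
for i in range(100):
    path = os.path.join(tmpdir, f"file{i}.txt")
    with open(path, "w") as f:
        f.write("x")
    paths.append(path)

# Eager strategy: hold every handle open until the whole batch is done.
# This hits EMFILE ("Too many open files") well before file 100.
handles = []
eager_failed = False
try:
    for path in paths:
        handles.append(open(path, "rb"))
except OSError as err:
    eager_failed = True
    print(f"eager open failed after {len(handles)} files: {err}")
finally:
    for h in handles:
        h.close()

# Lazy strategy: open, read, close one file at a time. The same 100
# files are all read fine because only one descriptor is live at once.
for path in paths:
    with open(path, "rb") as f:
        f.read()
print(f"lazy open handled all {len(paths)} files")
```

The symptom in the original logs matches the eager pattern: the first batch of files succeeds, then every file past the limit fails to open.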

