[CircleCI - Parallel runs] Unable to upload artifact on one of my runs

Hi community,

Reaching out to you as I've run out of ideas.

I am currently running tests in parallel with CircleCI (I have 4 parallel runs).
Each run generates an artifact which I persist to the same workspace (then I use this workspace to merge the reports from the 4 runs into 1 for review).

Everything was working fine until about 1 week ago, I would say.

Now, out of the 4 runs, one can't upload its artifact anymore (it can be the first run or the last, it's random, and sometimes I get lucky and all of them work).

Needless to say, it's very annoying from a testing perspective.

To my knowledge, nothing has changed on our side.

Here is the error I get:

Uploading /root/project/all-blob-reports to all-blob-reports
No artifact files found at /root/project/all-blob-reports
Total size uploaded: 0 B
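For reference, the message reads as if the directory is empty at upload time. A minimal debug step (hypothetical, placed just before store_artifacts) would show what the artifact step actually sees:

```yaml
# Hypothetical debug step -- lists the directory contents right before upload.
- run:
    name: List report directory before upload
    command: ls -la /root/project/all-blob-reports || true
```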

And here are the yml file details (I can't upload it as an attachment):

version: 2.1

parameters:
  staging:
    type: integer
    default: 0

jobs:
  run-test:
    parameters:
      staging:
        type: integer
        default: 0
    docker:
      - image: can't disclose
    steps:
      - checkout
      - run:
          name: Create playwright report directory
          command: mkdir -p all-blob-reports
      - run: npm i -D @playwright/test
      - run: npx playwright install
      - run: npx playwright install chrome
      - run: npx playwright install msedge
      - run:
          name: Run sharded tests
          command: SHARD="$((${CIRCLE_NODE_INDEX}+1))"; STAGING=<< parameters.staging >> npx playwright test circleci tests split --shard=${SHARD}/${CIRCLE_NODE_TOTAL} || cp blob-report/* all-blob-reports
      - store_artifacts:
          path: all-blob-reports
      - persist_to_workspace:
          root: .
          paths:
            - all-blob-reports
    parallelism: 4
  merge-reports:
    docker:
      - image: can't disclose
    steps:
      - checkout
      - attach_workspace:
          at: .
      - run:
          name: Merge reports
          command: npx playwright merge-reports --reporter html ./all-blob-reports
      - store_artifacts:
          path: ./playwright-report

workflows:
  run-test-workflow:
    jobs:
      - run-test:
          staging: << pipeline.parameters.staging >>
      - merge-reports:
          requires:
            - run-test
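One thing worth noting about the "Run sharded tests" step: with `A || B`, the `cp blob-report/* all-blob-reports` part only runs when the playwright command exits non-zero. A small shell sketch (hypothetical file name) of that behaviour:

```shell
# Sketch of the `||` fallback in the "Run sharded tests" step.
# `A || B` only runs B when A exits non-zero, so the cp is skipped
# whenever the first command succeeds.
mkdir -p blob-report all-blob-reports
touch blob-report/report-1.zip

true || cp blob-report/* all-blob-reports    # "success": cp is skipped
ls -A all-blob-reports                        # prints nothing yet

false || cp blob-report/* all-blob-reports   # "failure": cp runs
ls -A all-blob-reports                        # prints report-1.zip
```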

Any ideas?


Do you happen to have a more precise timestamp for when you started seeing these failures? I can go and look to see if we changed anything on our end around that time.

Hi Sebastian,

Thanks for your reply. From what I can see, it started 7 days ago (from the morning, I believe), so I would say last Wednesday.
I don't have a more specific timestamp so far.

It started happening at the same time I created a branch on my pipeline. I wonder if it's related, but I can't see how.

@sebastian-lerner did you find anything?