Problem using split by timings in Workflows


I’m currently trying to migrate my team’s project from CircleCI 1.0 to 2.0. So far, I’ve taken a median build time of 13.5 minutes down to about 7-8 minutes (depending on how many caches get hit)! However, I can’t seem to get --split-by=timings to find the test metadata, so it falls back to splitting by name:

    Requested historical based timing, but they are not present.  Falling back to name based sorting

I’m pretty sure I’m collecting the test metadata correctly, though, because I’m seeing the pretty display of failures at the top of the job page.

Here’s what I’m doing in the step where I run the test:

      - run:
          name: Run tests
          environment:
            CODECOV_FLAG: backend
          command: |
            bin/rails db:setup
            TESTFILES=$(circleci tests glob "spec/**/*_spec.rb" | circleci tests split --split-by=timings)
            bin/rspec --profile 10 -r rspec_junit_formatter --format progress --format RspecJunitFormatter --out rspec/rspec.xml -- ${TESTFILES}
      - store_test_results:
          path: rspec
      - store_artifacts:
          path: tmp/capybara
      - store_artifacts:
          path: rspec

Note that I am using the workflows functionality. I read something in the docs regarding some key not being supported in workflows. Does this mean I can’t use this timings functionality?


This is also being discussed in Error: "Requested historical based timing, but they are not present"


Try adding --timings-type=classname:

TESTFILES=$(circleci tests glob "spec/**/*_spec.rb" | circleci tests split --split-by=timings --timings-type=classname)


Adding --timings-type=classname did not change anything for me. Still not picking up test timings.


Agreed, I think this just doesn’t work with workflows?


can we get an official response on this from a CircleCI employee?


This is next up on our roadmap to fix. Test timings are available for 2.0 but not Workflows.


Just spent some time banging my head trying to solve this. It would be great to have a big warning on the Workflows documentation page and in the store_test_results documentation so people don’t waste effort on it :angry:


The problem is that only the first container gets test timings, so you can work around it yourself by using a workspace to pass the timing data along:

  # Copy .circleci-task-data from the first container so test timings work
  - run: cp -R /.circleci-task-data tmp/circleci-task-data/

  - persist_to_workspace:
      root: .
      paths:
        - tmp/circleci-task-data
        - vendor/bundle
        - .bundle

Then, in the later job, put the timing data back in its proper place before running the tests:

  - attach_workspace:
      at: /source

  # Copy .circleci-task-data from first container to make test timings work
  - run: cp -R tmp/circleci-task-data/* /.circleci-task-data/
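Putting the two halves together, here’s a minimal sketch of the workaround across two Workflow jobs (the job names and the /source attach point are illustrative, following the snippets above; checkout and dependency steps are elided):

    jobs:
      build:
        steps:
          # ... checkout, install dependencies, run tests once ...
          # Copy .circleci-task-data so the timing data can be persisted
          - run: cp -R /.circleci-task-data tmp/circleci-task-data/
          - persist_to_workspace:
              root: .
              paths:
                - tmp/circleci-task-data
      test:
        working_directory: /source
        steps:
          - attach_workspace:
              at: /source
          # Restore the timing data before circleci tests split runs
          - run: cp -R tmp/circleci-task-data/* /.circleci-task-data/
          # ... run tests with --split-by=timings ...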


We do have a warning on store_test_results but I definitely want to see the warning in other places, like the FAQ.


Hey @pawelniewie thanks for the tip on the workaround. I’ve tried implementing it in our workflow but not having much luck, if I ls -la /.circleci-task-data in the first job in our workflow it lists an empty directory.

This means that in the ‘later’ step, the job will have a fit because there are no files to copy back from the tmp location to the real one.

Am I missing something else here?


Yeah, unfortunately that worked once and now breaks, no idea why. After playing with Workflows today we abandoned them for now. Setting up containers/steps takes a lot of time, and in our use case (running tests only) it doesn’t bring the expected result of reducing total time when multiple containers are used.


I added this to docs here:



We have released support for timing-based test splitting within Workflows. Timing data will be made available from the most recent job (within the last 50) with the same job name.
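For anyone landing here now, a minimal sketch of what this looks like, using the same glob and paths as the original post (the store_test_results step is what records the timing data that the next run of a job with the same name will read):

      - run:
          name: Run tests
          command: |
            TESTFILES=$(circleci tests glob "spec/**/*_spec.rb" | circleci tests split --split-by=timings)
            bin/rspec --format progress --format RspecJunitFormatter --out rspec/rspec.xml -- ${TESTFILES}
      - store_test_results:
          path: rspec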