Workspaces + unnecessary files

We have a reasonably complicated workflow that performs a series of jobs:

  1. do a compile
  2. if it passes, build RPM packages
  3. if those pass, use the RPM packages for integration/smoke tests, and then also build Debian packages
  4. if all of that passes, then deploy to package repositories
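For context, the job dependencies above look roughly like this in the `workflows` section (job names here are made up for illustration; yours will differ):

```yaml
workflows:
  build-test-deploy:
    jobs:
      - compile
      - build-rpms:
          requires: [compile]
      - smoke-tests:
          requires: [build-rpms]
      - build-debs:
          requires: [build-rpms]
      - deploy:
          requires: [smoke-tests, build-debs]
```

Because `deploy` requires both `smoke-tests` and `build-debs`, it inherits whatever those jobs persisted to the workspace.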

The problem is that in step 4 we don’t want to deploy until we know all of the integration/smoke tests pass, but that means passing along all of the workspace junk those jobs generate, even though the only thing we need to deploy is the final artifacts from the RPM/Debian build jobs.

Is there a good way to download artifacts from a previous step instead? My publishes are starting to fail because downloading and attaching all the workspace layers takes so long that the job times out. And there’s no reason to push all that data around anyway.

I’ve thought about using caching and keying it on the workflow ID. Does that seem reasonable?

Answering my own question: using the workflow ID in the cache key works fine, so you can store things that shouldn’t be passed down in the workspace. Just be aware that if your containers don’t match, you can run into permission issues like the ones this support page discusses.

jobs:
  build-rpms:
    steps:
      - do-things
      # Save the build artifacts under a key unique to this workflow run,
      # so only downstream jobs in the same workflow will match it.
      - save_cache:
          key: rpms-{{ .Environment.CIRCLE_WORKFLOW_ID }}
          paths:
            - path/to/artifacts-to-cache/
  publish-rpms:
    steps:
      # Restore the artifacts cached by build-rpms earlier in this workflow,
      # without attaching the whole workspace.
      - restore_cache:
          keys:
            - rpms-{{ .Environment.CIRCLE_WORKFLOW_ID }}
      - do-more-things
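For this to work, `publish-rpms` has to run after `build-rpms` within the same workflow, so the cache key (which embeds the workflow ID) matches. A minimal sketch of that wiring (workflow name is illustrative):

```yaml
workflows:
  release:
    jobs:
      - build-rpms
      - publish-rpms:
          requires:
            - build-rpms
```

Since the key is exact-matched on `CIRCLE_WORKFLOW_ID`, a rerun of the workflow gets a fresh ID and a fresh cache, so stale artifacts from earlier runs can’t leak in.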
