Update pipeline parameter in job or command?

I’d like to define a set of parameters at the top level, modify or update them in the course of a workflow, and then use them in conditional workflow logic (when statements) to decide whether to run jobs. Is that possible? I can do it with environment variables, but that seems less elegant. Thanks in advance!

Thanks,
Robert


Hello @rprince - Thank you for being a member of our CircleCI community! In May of this year, we introduced advanced config-compile-time logic statements to config version 2.1. This might give you the conditional flexibility you are seeking with your when statements.

This Discuss Post is the general announcement & description: Advanced Logic in Config

One of my colleagues in Support also created the following Support Article, which illustrates this and gives some example configs: https://support.circleci.com/hc/en-us/articles/360043638052-Conditional-steps-in-jobs-and-conditional-workflows
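As a very rough illustration (this is just a sketch, not taken from the article; the parameter name and jobs are placeholders), a pipeline parameter can gate an entire workflow at config-compile time:

```yaml
version: 2.1

# Value is fixed when the pipeline is triggered (e.g. via the API),
# then evaluated once at config-compile time.
parameters:
  run-analysis:
    type: boolean
    default: false

jobs:
  build:
    docker:
      - image: cimg/base:stable
    steps:
      - run: echo "always runs"
  analyze:
    docker:
      - image: cimg/base:stable
    steps:
      - run: echo "runs only when run-analysis is true"

workflows:
  build:
    jobs:
      - build
  analysis:
    when: << pipeline.parameters.run-analysis >>
    jobs:
      - analyze
```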

Have a look at these articles and let us know if this helps!

Hi @JonathanC503 ,

Thanks for your response. The flexible configuration is nice, but what I’m after is the ability to change the values of CircleCI config parameters dynamically. I understand that’s not possible currently, and I hope that you can add it in a future release.

Thanks!
Robert

Hello @rprince - Thank you for your update here. I did not realize you were seeking the ability to dynamically update config parameters within the build. You are correct that this is not possible at this time.

If you wouldn’t mind, could you describe your use-case and how this feature would benefit you or your team? Having this information helps us understand the business need more clearly. Thanks!

Hi @JonathanC503,

Sorry for the initial confusion. As I mentioned, I would like to modify or update parameters in the course of a workflow (dynamically, while it is running) and use them to determine when to run jobs.

For example, I have a workflow that includes generation of some artifacts that then get uploaded somewhere for later analysis. It’s not desirable to run that step every time; whether it should run depends on the outcome of a previous job. I would like the ability to evaluate, during the workflow itself, whether or not to run that part.

This seems to call for some kind of state that is accessible throughout the workflow and that can be updated while the workflow runs. It’s that last part (runtime updates to data initially provisioned at config-compile time) that I’m missing.

There are certainly a number of ways to work around this, such as pushing that state to a remote file and reading/writing it as needed. I’ve chosen to handle it in the scripts that ultimately get called by the workflow jobs or commands; the scripts themselves then decide whether or not to do any work.
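To make that concrete, the scripts end up looking roughly like this (the workspace path, file name, and flag value are just illustrative):

```bash
#!/usr/bin/env bash
# upload-artifacts.sh -- called from a workflow job; decides for itself
# whether there is any work to do.
set -euo pipefail

# An earlier job writes this flag file (to an attached workspace here,
# but it could just as well live in a remote location).
STATE_FILE="/tmp/workspace/should-upload"

if [[ ! -f "$STATE_FILE" || "$(cat "$STATE_FILE")" != "true" ]]; then
  echo "Upload not requested by earlier jobs; nothing to do."
  exit 0
fi

echo "Generating and uploading artifacts..."
# ...actual work goes here...
```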

I hope that explains what I’m after - please let me know if it doesn’t.

Thanks,
R

@rprince we needed similar functionality, and what we ended up doing was storing the results we needed as an artifact and using the swissknife orb’s get job artifact command to fetch them; based on the contents of that file, we decided what to do.

I also added a command to get the job number of a rerun job. However, if you have a specific kind of job whose number you need to be able to fetch, let me know and I can add a command for you.

Yes this is definitely not as neat as dynamic workflows, buuut it gets close :slight_smile:
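If you’d rather not pull in the orb, the same idea can be sketched directly against the v2 API (the project slug, job number, artifact path, and JSON field below are all placeholders you’d adapt to your setup):

```bash
# List the artifacts of the job that produced the results file
ARTIFACTS=$(curl -s -H "Circle-Token: $CIRCLE_API_TOKEN" \
  "https://circleci.com/api/v2/project/$PROJECT_SLUG/$JOB_NUMBER/artifacts")

# Download the one we stored earlier
URL=$(echo "$ARTIFACTS" | jq -r '.items[] | select(.path == "results/decision.json") | .url')
curl -s -H "Circle-Token: $CIRCLE_API_TOKEN" -o decision.json "$URL"

# Decide what to do based on its contents
if [[ "$(jq -r '.deploy' decision.json)" == "true" ]]; then
  echo "Running the optional part"
else
  echo "Skipping it"
fi
```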


@JonathanC503 - I have always wanted to be able to do this as well, and I would consider it a basic feature given the way I build workflows. My use case: I have a workflow like this:

  • aws/create-changeset
  • approve-changeset
    type: approval
  • aws/execute-changeset
  • run:
    command: some post-deploy commands I need to run

What ends up happening is the first step tries to create a changeset, but there are no changes. However, I still need to go approve the empty changeset, and then execute-changeset runs. It wouldn’t be a big deal, because the execute part doesn’t actually do anything, but I still have to wait for a container, install dependencies, etc. Really, once I determine the changeset is empty, I wish I could skip both the approve AND the execute steps at the workflow level and go straight to the post-deploy script.

This is just one example, but I’ve run into many cases where I’ve wanted to do this.

Is there an API call I can make in the first step to set a pipeline parameter or something that I can then use as a conditional?
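(For what it’s worth, the closest thing I’m aware of is the v2 trigger-pipeline endpoint, which accepts parameter values, but that starts a brand-new pipeline rather than updating the one that’s already running; the parameter name and project slug below are placeholders:)

```bash
# Triggers a *new* pipeline with a parameter value -- it does not update
# the pipeline that is currently running.
curl -s -X POST \
  -H "Circle-Token: $CIRCLE_API_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"branch": "main", "parameters": {"skip-approval": true}}' \
  "https://circleci.com/api/v2/project/$PROJECT_SLUG/pipeline"
```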

Thanks for the consideration


Any update from CircleCI on this request to update parameter values dynamically within a job?
