My situation is this.
I have one codebase that I need to build a number of times using different environment variables.
My first thought was to loop around an array of my variables in one job and call another job with those variables, but I can't see how this would be done.
Am I on the right track with this, or is there a better solution?
It depends on how powerful/complex you want the final solution to be.
Our build process takes a number of products and builds a number of unique deployable images on CircleCI, so it is much like yours, with an independent set of environment variables per target image.
To handle this, I now store all values in the service offered by doppler.com.
It provides a CLI tool to access values, and each defined config has a unique access token. The result is that my config.yml has bash code like the following:
```shell
echo "<<parameters.doppler_token>>" | doppler configure set token --scope / --no-check-version
DOCKER_USER=$(doppler secrets get --silent --plain DOCKER_USER)
PROJECT_NAME=$(doppler secrets get --silent --plain PROJECT_NAME)
TAG_VALUE1=$(doppler secrets get --silent --plain DOCKER_TAG_VALUE1)
TAG_VALUE2=$(doppler secrets get --silent --plain DOCKER_TAG_VALUE2)
```
The first line uses the unique access token to select the config I want, and the remaining lines retrieve values into local environment variables for reuse by the CI task.
You could then call the defined job a number of times, passing in the different access tokens.
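As a rough sketch of what that could look like in a CircleCI 2.1 config (job, step, and token names here are invented for illustration, not taken from my real config; in practice the token values would live in project settings or a context rather than inline):

```yaml
version: 2.1

jobs:
  build-image:
    docker:
      - image: cimg/base:current
    parameters:
      doppler_token:
        type: string
    steps:
      - checkout
      - run:
          name: Select Doppler config and fetch per-target values
          command: |
            echo "<<parameters.doppler_token>>" | doppler configure set token --scope / --no-check-version
            DOCKER_USER=$(doppler secrets get --silent --plain DOCKER_USER)
            PROJECT_NAME=$(doppler secrets get --silent --plain PROJECT_NAME)
            # ... build and tag the image using these values ...

workflows:
  build-all:
    jobs:
      - build-image:
          name: build-product-a
          doppler_token: "TOKEN_FOR_PRODUCT_A"  # placeholder value
      - build-image:
          name: build-product-b
          doppler_token: "TOKEN_FOR_PRODUCT_B"  # placeholder value
```

Each invocation of the job selects a different Doppler config via its token, so one job definition covers every target image.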
I may have taken this system a little too far, as it is now used to drive:
- The deployment of virtual machines in-house on VMware and at AWS
- The building of system images
- The driving of our test systems
- The configuration of the docker-compose based environments that run the whole system stack.
The last item is the big 'task', as it involves about 15 Docker containers and over 200 environment variables, which are then used to express 9 different deployments.
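To give a feel for the docker-compose side (service and variable names here are invented for illustration, not from my real stack), Compose's variable substitution is what lets one file express many deployments:

```yaml
# docker-compose.yml -- minimal sketch; the values come from whichever
# Doppler config was exported into the environment before running compose.
services:
  api:
    image: ${DOCKER_USER}/${PROJECT_NAME}-api:${TAG_VALUE1}
    environment:
      - PROJECT_NAME=${PROJECT_NAME}
  worker:
    image: ${DOCKER_USER}/${PROJECT_NAME}-worker:${TAG_VALUE2}
```

Selecting a different Doppler token before `docker compose up` then stands up a different deployment from the same file.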