Dynamic configuration - setting up dynamic mapping

We have a monorepo to which new services are added regularly, each under its own folder. I was planning to use dynamic configuration to trigger workflows/jobs according to the changed paths. I’ve been following this tutorial.

My problem is that I don’t want developers to edit the config file every time a new service is added. In the example below, four paths are mapped to four pipeline parameters that are used to trigger workflows, but I have to know the paths and parameter names in advance.

version: 2.1
setup: true

orbs:
  path-filtering: circleci/path-filtering@volatile

workflows:
  setup:
    jobs:
      - path-filtering/filter:
          base-revision: master
          mapping: |
            data/.* config-data-modified true
            service/common/.* service-layer-modified true
            service/site-api/.* run-site-api-workflow true
            service/batch/.* run-batch-services-workflow true

Is there any way to generate this mapping on the fly?
Say a file changed under the path ‘service/new_service’. I would want the setup config to get the changed paths from git and generate the correct mapping:

            service/new_service/.* run-new_service-workflow true

as if I had given it a pattern like:

            service/<git-param-of-changed-folder-name>/.* run-<git-param-of-changed-folder-name>-workflow true

Another possible solution is to have a script that builds the needed continuation YAML file with the required jobs, but that script would still need to know the names of the changed service folders coming from git so that only the needed tests run.

Any ways to achieve this?

The best way to do this would be to avoid the path-filtering orb and instead use the underlying feature, dynamic config, directly. The path-filtering/filter job just does the following:

  1. Runs git diff to determine what has changed between the pushed SHA and the base revision
  2. Runs a Python script that takes the mapping, compares it against that list of changed files, and determines which pipeline parameters need to be set
  3. Exports that list of pipeline parameters to a JSON file
  4. Runs the continuation/continue command with that file
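
For instance, a minimal Python sketch of steps 1-3 with the dynamic per-folder logic might look like the following. This is an illustration, not the orb's actual script; the base revision, the service/ prefix, the run-<folder>-workflow parameter naming, and the output path are all assumptions to adapt to your repo:

import json
import subprocess

BASE_REVISION = "master"  # assumption: same base revision as in the orb example

# Step 1: list the files changed between the base revision and the pushed SHA.
changed_files = subprocess.run(
    ["git", "diff", "--name-only", f"origin/{BASE_REVISION}...HEAD"],
    capture_output=True, text=True, check=True,
).stdout.splitlines()

# Step 2: instead of a static mapping, derive one pipeline parameter per
# changed service folder, e.g. service/new_service/... -> run-new_service-workflow.
parameters = {}
for path in changed_files:
    parts = path.split("/")
    if len(parts) > 1 and parts[0] == "service":
        parameters[f"run-{parts[1]}-workflow"] = True

# Step 3: export the parameters to a JSON file for the continue step.
with open("/tmp/pipeline-parameters.json", "w") as fh:
    json.dump(parameters, fh)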

If you write a script along those lines that does steps 1-3 with the logic you want, and wrap it in a job that runs the script followed by the continue command, that should achieve your desired outcome.
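
Here is a sketch of that wrapper job using the circleci/continuation orb; the job name, script path, and file locations are placeholders:

version: 2.1
setup: true

orbs:
  continuation: circleci/continuation@volatile

jobs:
  generate-and-continue:
    executor: continuation/default
    steps:
      - checkout
      - run:
          name: Compute pipeline parameters from changed paths
          command: python3 .circleci/generate_params.py  # the script sketched above
      - continuation/continue:
          configuration_path: .circleci/continue_config.yml
          parameters: /tmp/pipeline-parameters.json

workflows:
  setup:
    jobs:
      - generate-and-continue

One caveat: the continuation config must declare every pipeline parameter it receives, so for service names that are truly unknown in advance, your other idea of having the script generate the continuation YAML itself may be the cleaner route.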