Hello folks.
I’ve been searching for a few days but couldn’t find anything: is there any way to execute different jobs on the exact same machine during a workflow?
For example, a build job would run docker-compose to build the containers and set up a database, and a test job would run the tests right after the build job, using the previously created database.
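To make it concrete, here is roughly the shape I have in mind, written as a CircleCI-style config purely for illustration (the job names, scripts, and syntax are placeholders, not my actual setup):

```yaml
# Illustrative sketch only: job names, scripts, and the CircleCI-style
# syntax are placeholders for the structure I would like to express.
version: 2
jobs:
  build:
    machine: true
    steps:
      - checkout
      - run: docker-compose up -d --build                 # build and start all containers
      - run: docker-compose exec db ./load_test_data.sh   # hypothetical seeding script
  test:
    machine: true
    steps:
      # This is the part I can't figure out: reusing the containers and the
      # database that the build job just created, on the same machine.
      - run: docker-compose run app ./run_functional_tests.sh
workflows:
  version: 2
  build_and_test:
    jobs:
      - build
      - test:
          requires:
            - build
```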
I’m looking for that because my entire project’s dev and test workflow depends on multiple containers (hence the use of docker-compose) that need to be built and populated with the test data.
For example, I have a qa container, a database container, an application container, etc., and the functional tests, for instance, need to use the app and db containers.
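For reference, the compose file looks roughly like this (service names and images are simplified placeholders, not the real configuration):

```yaml
# Simplified docker-compose.yml; service names and images are placeholders.
version: "3"
services:
  db:
    image: postgres:11        # assuming a Postgres database just for illustration
    environment:
      POSTGRES_DB: app_test
  app:
    build: .
    depends_on:
      - db
  qa:
    build: ./qa               # container that runs the test tooling
    depends_on:
      - app
```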
For now, all I can see is that I can create a Docker image and “save” it, but I would have to recreate and restart the container in every job, so if I have multiple jobs relying on a database (which I do), I would have to recreate the database every time.
The issue I see here is that it creates a lot of redundant work: building images and containers, setting them up (installing dependencies, setting up the database, etc.), and running my scripts.
For now, I have a single job with separate steps for each of these tests, but I wish I could instead run all these steps as separate jobs, both to make it easier to see which scripts fail (instead of digging through the step logs) and to run them in parallel.
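Today that single job looks roughly like this (again simplified, with placeholder script names):

```yaml
# Current approach (simplified): one job, everything as sequential steps.
jobs:
  all_tests:
    machine: true
    steps:
      - checkout
      - run: docker-compose up -d --build
      - run: docker-compose exec db ./load_test_data.sh       # placeholder seeding script
      - run: docker-compose run app ./run_unit_tests.sh       # placeholder
      - run: docker-compose run qa ./run_functional_tests.sh  # placeholder
```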
Any idea? Maybe I’m missing something?