Reuse the same machine to execute scripts

Hello folks.

I’ve been searching for a few days but couldn’t find anything: is there any way to execute different jobs on the exact same machine during a workflow?

Like, a build job would run docker-compose to build the containers and set up a database, and a test job would then run the tests right after the build job, against that previously created database.

I’m looking for that because my entire project’s dev and test workflow depends on multiple containers (hence the use of docker-compose) that need to be built and populated with test data.
For example I have a qa container, a database container, an application container, etc., and e.g. the functional tests need to use the app & db containers.
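For context, a setup like the one described might look roughly like this (a hypothetical `docker-compose.yml`; the service names and images are placeholders, not taken from the poster's actual project):

```yaml
# Hypothetical sketch of the multi-container setup described above;
# service names, images, and build paths are placeholders.
version: "3"
services:
  db:
    image: postgres:13
    environment:
      POSTGRES_DB: testdb
  app:
    build: .
    depends_on:
      - db
  qa:
    build: ./qa
    depends_on:
      - app
```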
For now, all I can see is that I can create a Docker image and “save” it, but I’ll still have to recreate and restart the container in every job, so if I have multiple jobs relying on a database (which I do) I will have to recreate the database every time.

The issue I see here is that it creates lots of redundant work: building images and containers, setting them up (installing dependencies, installing the database, etc.), and then executing my scripts.

For now, I have one job with separate steps for all of these tests, but I wish I could run those steps in separate jobs instead: it would make it easier to see which scripts return errors (rather than digging through the step logs), and it would also let me run them in parallel.

Any idea? Maybe I’m missing something?

You cannot reuse machines, no, and I don’t think you can keep containers running across jobs either.

However yes, you can save and restore pristine Docker images (or even Docker containers after they have been run). I would try Workspaces for that in the first instance.
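To illustrate the image-saving approach, the build job could `docker save` the built image into the workspace and later jobs could `docker load` it back. This is only a sketch, assuming a CircleCI 2.x-style config; the image name, paths, and test script are placeholders:

```yaml
# Sketch: persisting a built Docker image through a workspace
# (CircleCI 2.x-style config assumed; names and paths are placeholders).
jobs:
  build:
    machine: true
    steps:
      - checkout
      - run: docker-compose build
      - run: mkdir -p /tmp/workspace
      - run: docker save -o /tmp/workspace/app.tar my-app:latest
      - persist_to_workspace:
          root: /tmp/workspace
          paths:
            - app.tar
  test:
    machine: true
    steps:
      - checkout
      - attach_workspace:
          at: /tmp/workspace
      - run: docker load -i /tmp/workspace/app.tar
      - run: docker-compose up -d
      - run: ./run-tests.sh
```

This avoids rebuilding the image in every job, though each job still has to start fresh containers from the loaded image.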

I wonder also whether you could try dumping the contents of your database to your Workspace, at the end of the first job, and then restoring it in subsequent jobs. That should avoid the need to pull your database seeder code into all projects that need data.
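Sketched with Postgres as an assumption (swap in your database's own dump/restore tools), the dump-and-restore idea could look something like these config fragments; the container name, user, and database are placeholders:

```yaml
# Sketch: dumping the seeded database into the workspace, then restoring it
# in a later job (assumes a Postgres service named "db"; adjust to your setup).

      # At the end of the first job, after the seed data has been loaded:
      - run: mkdir -p /tmp/workspace
      - run: docker-compose exec -T db pg_dump -U postgres testdb > /tmp/workspace/seed.sql
      - persist_to_workspace:
          root: /tmp/workspace
          paths:
            - seed.sql

      # In a subsequent job, once the containers are up:
      - attach_workspace:
          at: /tmp/workspace
      - run: docker-compose exec -T db psql -U postgres testdb < /tmp/workspace/seed.sql
```

Restoring a SQL dump is usually much faster than re-running the seeding scripts, and it keeps the seeder code confined to the first job.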