I’m trying to set up a workflow that does the following:
1. Build backend service - builds a Docker container
2. Build UI - builds a Docker container
3. Run end-to-end tests - uses both Docker containers
4. Deploy backend service to Heroku - deploys Docker container
5. Deploy UI to Netlify - deploys built files
All 5 of these are separate jobs in a single workflow
3 depends on 1 and 2.
4 and 5 depend on 3.
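Assuming a CircleCI-style workflow (job names here are placeholders), the dependency graph above could be expressed with `requires`:

```yaml
workflows:
  build-test-deploy:
    jobs:
      - build-backend        # job 1
      - build-ui             # job 2
      - e2e-tests:           # job 3 waits for both builds
          requires: [build-backend, build-ui]
      - deploy-backend:      # job 4
          requires: [e2e-tests]
      - deploy-ui:           # job 5
          requires: [e2e-tests]
```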
I can get most of this working fine. Unfortunately, because each job runs in a fresh environment, the Docker images built in #1 and #2 aren’t visible in #3 and #4. So for now #4 rebuilds the backend image itself rather than relying on #1 - that gets the CD deploy working. I just don’t know how to get #3 working without it rebuilding both images from scratch.
I’ve seen the idea of pushing the images to Docker Hub. That would work, but these are per-commit development builds, not release-worthy artifacts, so that seems wrong.
You could try exporting the images with `docker save` and writing the tarballs into a folder that you persist as a workspace. Downstream jobs can then attach the workspace and `docker load` the images.
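A minimal sketch of that idea, assuming CircleCI (the job names, base image, and image tag are placeholders, not your actual config):

```yaml
jobs:
  build-backend:
    docker:
      - image: cimg/base:stable
    steps:
      - checkout
      - setup_remote_docker
      - run: docker build -t backend:ci .
      # Export the image so later jobs can load it
      - run: |
          mkdir -p workspace
          docker save backend:ci | gzip > workspace/backend.tar.gz
      - persist_to_workspace:
          root: workspace
          paths:
            - backend.tar.gz

  e2e-tests:
    docker:
      - image: cimg/base:stable
    steps:
      - setup_remote_docker
      - attach_workspace:
          at: workspace
      # Restore the image built in the earlier job
      - run: docker load < workspace/backend.tar.gz
      # ...then run the end-to-end tests against the loaded images
```

The UI image would follow the same pattern, persisting its own tarball into the shared workspace.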
Of course, if you can minimise the size of your images, that will be a substantial win: workspaces may have size limits, and persisting them typically involves gzipping, so smaller images save both time and storage.
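One common way to shrink an image is a multi-stage build, which keeps the build toolchain out of the final image. A hypothetical sketch for a Go backend (stage names, versions, and paths are illustrative):

```dockerfile
# Build stage: full toolchain, discarded afterwards
FROM golang:1.22 AS build
WORKDIR /src
COPY . .
RUN CGO_ENABLED=0 go build -o /bin/server ./cmd/server

# Runtime stage: only the static binary ships
FROM alpine:3.19
COPY --from=build /bin/server /usr/local/bin/server
ENTRYPOINT ["/usr/local/bin/server"]
```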
If you have a separate registry for your temporary builds, I don’t think pushing per-commit images is a problem at all - in fact, I do exactly that, just for layer caching. That said, I agree that a solution that stays within the bounds of the CI servers is a bit cleaner.