Running tests against running docker containers

This section is about running tests against Docker containers started via remote Docker. We’re starting to build our services as microservices packaged as Docker images, so when we run integration tests we need to be able to run those services in Docker. The problem is that containers started in remote Docker can’t be reached over the network from the primary container.

Our team has come up with two solutions so far:

  1. Use the machine executor instead of the docker executor
    • machine runs Docker containers without remote Docker, which avoids the networking issue
    • It seems that all of CircleCI’s VM images run Ubuntu, whereas our scripts require CentOS. Making the scripts work on Ubuntu as well is an unknown amount of work.
    • If CircleCI had CentOS VM images we could use with machine, this would be the preferred option
  2. Run the tests themselves in a container (i.e. write a Dockerfile for each test job)
    • Basically, make the primary container do container orchestration
    • Networking is solved because both the tests and the services run in remote Docker, so they can talk to each other
    • One side benefit for us: our CI config is huge and unmaintainable, and this would push us to revisit our jobs and clean up stale dependencies
    • The drawback is that it makes the CI setup a bit more complex, and it leaves the primary container underutilized



Another option we just found is to use SSH tunnelling through the remote-docker VM.

```shell
# Start the service in remote Docker, publishing it on port 3000 there
docker run -p 3000:3000 ...
# Forward port 3000 on the primary container to port 3000 on the remote Docker VM
ssh -N -L localhost:3000:localhost:3000 remote-docker
```

This runs a service, exposing port 3000 on the remote Docker VM, and then forwards port 3000 from the remote Docker VM to the primary testing container.
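In a CI step the tunnel has to run in the background, something like the following sketch (assuming the service answers HTTP on port 3000; the wait loop and its timeout are assumptions, not CircleCI-prescribed):

```shell
# Bring the tunnel up in the background, then test against localhost
ssh -N -L localhost:3000:localhost:3000 remote-docker &
TUNNEL_PID=$!

# Wait for the forwarded port to start accepting connections (up to ~10s)
for i in $(seq 1 10); do
  curl --silent --fail http://localhost:3000/ >/dev/null && break
  sleep 1
done

# ... run the integration tests against http://localhost:3000 here ...

# Tear the tunnel down afterwards (it may already have exited)
kill "$TUNNEL_PID" 2>/dev/null || true
```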

However, this seems to slow down network traffic to the containers, so I’m trying out different things along these lines. Anyone have any other ideas?