Shared database across containers?

I’ve read through the docs, but I’m a bit confused about how/whether I can make this work.

I run a test suite in parallel on CircleCI. These tests run against our staging servers (not against localhost). I’d like to maintain a common pool of test accounts that can be “checked out” by each test, so that no two tests use the same account at once. This database would have to be accessible to all of my containers during execution.

What I can’t figure out is this: if I spin up a Postgres or MySQL database as part of my configuration, does it create a separate database for each container, or a single database shared among all of them?

jobs:
  build:
    docker:
      - image: circleci/ruby:latest-browsers
      - image: circleci/postgres:9.4.12-alpine
    parallelism: 10

Would this give me one Postgres database, or 10 Postgres databases? If the answer is 10, is there a way to do a single database that’s shared between all containers?


For the time being, I’m going external and I’m going to write an app to do what I need (and make API calls from CircleCI). If somebody comes up with a better solution later, I’d love to hear it!

This will give you 10 databases.

Each parallel task in a job runs in its own isolated environment, and the tasks won’t necessarily even run at the same time.

If you need to coordinate state between multiple job runs, an external datastore is probably your best option, since you may also need to handle the scenario where multiple different job runs overlap.
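To make that concrete, here is a minimal sketch of the atomic “check out” semantics such an external datastore would need to provide. The class name and method names are hypothetical; with Postgres the same claim-and-mark step would map onto `SELECT ... FOR UPDATE SKIP LOCKED`, but here the shared pool is simulated in-memory with a Mutex so the example is self-contained:

```ruby
require "thread"

# Hypothetical sketch of a shared pool of test accounts.
# The Mutex stands in for the locking an external database would do.
class AccountPool
  def initialize(accounts)
    @mutex = Mutex.new
    @available = accounts.dup
    @checked_out = []
  end

  # Atomically claim a free account; returns nil if none remain.
  def check_out
    @mutex.synchronize do
      account = @available.shift
      @checked_out << account if account
      account
    end
  end

  # Return an account to the pool when the test finishes with it.
  def check_in(account)
    @mutex.synchronize do
      @checked_out.delete(account)
      @available << account
    end
  end
end
```

The key property is that claiming an account and marking it as taken happen in one atomic step, so two parallel containers can never receive the same account.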


Thanks! That’s what I figured. I created an external datasource with a really lightweight API for my purposes here.
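For anyone building something similar, a test could call such a service over HTTP from each container. This is a hypothetical sketch: the `/accounts/checkout` endpoint and the `run_id` field are assumptions, not part of any real API mentioned in this thread.

```ruby
require "net/http"
require "json"
require "uri"

# Build a POST request asking the (hypothetical) external service
# to check out one free test account for this job run.
def checkout_request(base_url, run_id)
  uri = URI.join(base_url, "/accounts/checkout")
  req = Net::HTTP::Post.new(uri)
  req["Content-Type"] = "application/json"
  req.body = JSON.generate(run_id: run_id)
  req
end
```

Each parallel container would send this request at test startup (and a matching check-in at teardown), letting the external service enforce exclusivity.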
