How can I start testing database connections in CircleCI?

How can I start testing database connections in CircleCI? I know the usual approach is to spin up a secondary Docker image that holds all of the database settings, but I'm not sure how to actually do that. Are there any examples I can follow? The documentation covers the basics, but I couldn't get much further with it.

I have a Python test file that requires a DB connection -

import pytest

import data_platform as dp  # assumed import - the module that provides DataPlatformConnection


class TestDataPlatformConnection(object):
    dp = dp.DataPlatformConnection()

    def test_insert_from_dicts(self):
        with pytest.raises(Exception):
            self.dp.insert_from_dicts('', '', ['test1', 'test2'])

    def test_create_table_sql(self):
        assert self.dp.create_table_sql('prod', 'school', ['school_name', 'city']) == \
            'CREATE TABLE "prod"."school" ("school_name" text , "city" text);'

        assert self.dp.create_table_sql('dev', 'geocoding', ['lat', 'long']) == \
            'CREATE TABLE "dev"."geocoding" ("lat" text , "long" text);'

        with pytest.raises(Exception):
            _ = self.dp.create_table_sql('sandbox', 'school', None)

        with pytest.raises(Exception):
            _ = self.dp.create_table_sql('sandbox', 'school', [])

        with pytest.raises(Exception):
            _ = self.dp.create_table_sql('sandbox', 'school', 'bad columns value')

        with pytest.raises(Exception):
            _ = self.dp.create_table_sql('', '', ['lat', 'long'])

        with pytest.raises(Exception):
            _ = self.dp.create_table_sql(None, None, ['lat', 'long'])

    def test_drop_temp_table_sql(self):
        assert self.dp.drop_temp_table_sql('transit') == \
            'DROP TABLE IF EXISTS transient."transit"'

        with pytest.raises(Exception):
            _ = self.dp.drop_temp_table_sql('')

        with pytest.raises(Exception):
            _ = self.dp.drop_temp_table_sql(None)

        with pytest.raises(TypeError):
            self.dp.drop_temp_table_sql(12345)

    def test_duplicate_table_sql(self):
        assert self.dp.duplicate_table_sql('sandbox', 'school', 'prod', 'school') == \
            'CREATE TABLE "sandbox"."school" AS SELECT * FROM "prod"."school" LIMIT 0'

    def test_copy_csv_sql(self):
        assert self.dp.copy_csv_sql('sandbox', 'school') == \
            'COPY "sandbox"."school" FROM STDIN WITH CSV HEADER DELIMITER AS \',\''

        assert self.dp.copy_csv_sql('sandbox', 'school', header=False) == \
            'COPY "sandbox"."school" FROM STDIN WITH CSV DELIMITER AS \',\''

        assert self.dp.copy_csv_sql('sandbox', 'school') == \
            'COPY "sandbox"."school" FROM STDIN WITH CSV HEADER DELIMITER AS \',\''

    def test_csv_export_sql(self):
        assert self.dp.csv_export_sql('SELECT * FROM prod.geo') == \
            'COPY (SELECT * FROM prod.geo) TO STDOUT WITH CSV HEADER'

        assert self.dp.csv_export_sql('') == \
            'COPY () TO STDOUT WITH CSV HEADER'

    def test_table_sample_sql(self):
        assert self.dp.table_sample_sql('prod', 'school', 100) == \
            'SELECT * FROM "prod"."school" LIMIT 100'

        with pytest.raises(Exception):
            _ = self.dp.table_sample_sql('prod', 'school', '100')

What settings do I need to put in my DB Docker image for these tests to work?
My config.yml file looks like this -

version: 2
jobs:
  build:
    docker:
      - image: circleci/python:2.7.14-jessie
        environment:
          DP_DBNAME: ""
          DP_USER: ""
          DP_PASSWORD: ""
          DP_HOST: ""
          DP_PORT: ""
          DP_LOG_LEVEL: ""
          DP_PHOTO_S3_BUCKET: ""

          # prod CREA credentials
          DP_CREA_DDF_USERNAME: ""
          DP_CREA_DDF_PASSWORD: ""
          DP_CREA_DDF_LOGIN_URL: ""

          # TREB DLA credentials
          DP_TREB_DLA_USERNAME: ""
          DP_TREB_DLA_PASSWORD: ""
          DP_TREB_DLA_PHOTOS_FTP_SERVER_URL: ""
          DP_TREB_DLA_FORM_POST_URL: ""

          # TREB VOW credentials
          DP_TREB_VOW_USERNAME: ""
          DP_TREB_VOW_USERNAME_ACTIVES: ""
          DP_TREB_VOW_PASSWORD: ""
          DP_TREB_VOW_LOGIN_URL: ""
          DP_TREB_RETS_LOGIN_URL: ""

          # BRC credentials
          DP_BRC_USERNAME: ""
          DP_BRC_PASSWORD: ""

          # BRC IDX credentials
          DP_BRC_IDX_LOGIN_URL: ""
          DP_BRC_IDX_REW_USERNAME: ""
          DP_BRC_IDX_REW_PASSWORD: ""
          DP_BRC_IDX_BROKER_USERNAME: ""
          DP_BRC_IDX_BROKER_PASSWORD: ""

          # remaining credentials
          DP_VIREB_IDX_LOGIN_URL: ""
          DP_VREB_IDX_LOGIN_URL: ""
          DP_WLS_IDX_LOGIN_URL: ""
          DP_TREB_IDX_USERNAME: ""
          DP_VIREB_IDX_USERNAME: ""
          DP_VREB_IDX_USERNAME: ""
          DP_WLS_IDX_USERNAME: ""
          DP_TREB_IDX_USERNAME_ACTIVES: ""
          DP_VIREB_IDX_PASSWORD: ""
          DP_VREB_IDX_PASSWORD: ""
          DP_WLS_IDX_PASSWORD: ""
          DP_TREB_IDX_PASSWORD: ""


          # Full path to Google API keyfile with data write access to Google BigQuery event tracking project
          GOOGLE_APPLICATION_CREDENTIALS: ""

          # Twilio
          TWILIO_ACCOUNT_SID: ""
          TWILIO_AUTH_TOKEN: ""
          TWILIO_SOURCE_PHONE_NUMBER: ""
          TWILIO_MY_PHONE_NUMBER: ""

          # AWS
          AWS_ACCESS_KEY: ""
          AWS_SECRET_KEY: ""
          DP_LISTING_PHOTOS_S3_BUCKET: ""


      - image: circleci/postgres:9.6.5-alpine-ram
        environment:
          POSTGRES_USER: root
          POSTGRES_DB: circle_test
          POSTGRES_PASSWORD: ""

    steps:
      - checkout
      - run:
          name: Install Python dependencies in a venv
          command: |
            virtualenv venv
            . venv/bin/activate
            cd database_jobs
            python -m pip install -r requirements.txt
      - run:
          name: Running tests
          command: |
            . venv/bin/activate
            cd database_jobs
            python -m pytest -s -vv tests/ --ignore=tests/test_data_platform.py --ignore=tests/test_sql_helper.py

Would you expand on what problem you’re actually trying to solve? Do you have a database service that might be up, and you want to determine if it is?

What is the purpose of the various env vars attached to your circleci/python:2.7.14-jessie container?

Hi halfer,
thanks for the prompt reply.

  1. The env vars in the circleci/python:2.7.14-jessie container are required for the Python unit tests; everything is working fine as far as that container is concerned.

  2. My issue is with the two test files (test_data_platform.py and test_sql_helper.py) that need a database connection to test some basic DB connectivity. The Python test code I pasted above uses the psycopg2 package to connect to the production Postgres database. The connection looks like this in data_platform.py -

     conn_string = get_conn_string(os.environ.get('DP_DBNAME'), os.environ.get('DP_USER'),
                                   os.environ.get('DP_HOST'), os.environ.get('DP_PORT'),
                                   os.environ.get('DP_PASSWORD'))

     self.conn = psycopg2.connect(conn_string, cursor_factory=NamedTupleCursor)
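
For reference, get_conn_string just assembles the connection string that psycopg2.connect receives; a simplified sketch (not the exact implementation) looks like this -

def get_conn_string(dbname, user, host, port, password):
    # simplified sketch only - the real helper in data_platform.py may differ
    return "dbname='{}' user='{}' host='{}' port='{}' password='{}'".format(
        dbname, user, host, port, password)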

OK. Prior to running your tests, you’ll need to execute the necessary SQL to create the user(s) and database(s) you’ll need. I imagine you can do this with the PostgreSQL client psql, which you’ll need to install in a separate step. On my desktop Ubuntu, it’d be sudo apt-get install postgresql-client-9.5, but it looks like it might be 9.6 for you.

So, install the client, then run the set-up SQL you need, then finally run the tests.
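
As a rough sketch, that extra step could look something like this - the client package name and the CREATE SCHEMA statements are only placeholders, so substitute whatever package your image actually has and whatever schemas, users or fixtures your tests expect:

      - run:
          name: Prepare test database
          command: |
            sudo apt-get update
            # the exact client package name can vary by image
            sudo apt-get install -y postgresql-client
            # user/database names come from the circleci/postgres environment in your config
            psql -h localhost -U root -d circle_test -c "CREATE SCHEMA IF NOT EXISTS prod;"
            psql -h localhost -U root -d circle_test -c "CREATE SCHEMA IF NOT EXISTS sandbox;"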


Thanks Halfer for the info. This seems like a tedious setup, but I will give it a shot. Alternatively, can I use a test DB I already have on AWS (a replica of the production DB) instead of creating a Docker image in config.yml? What do you suggest?

Setting up databases, users and test fixtures locally is pretty standard.

You can, but that feels slower and more fragile; remote connections will be slower and more likely to fail. Also, you’d not be using the smallest possible set of test data, and you’d not have a repository snapshot of what a good test suite looks like.

I don’t know what problem you’re solving, so perhaps there is a reason to do this - feel free to outline it if you wish.

Based on your suggestion, I started on my own Docker setup for the DB; here is my config.yml:

version: 2
jobs:
  build:
    docker:
      - image: circleci/python:2.7.14-jessie
        environment:
          DP_DBNAME: ""
          DP_USER: ""
          DP_PASSWORD: ""
          DP_HOST: ""
          DP_PORT: ""
          DP_LOG_LEVEL: ""
          DP_PHOTO_S3_BUCKET: ""

          # ... (remaining environment variables identical to the first config.yml above)

      - image: circleci/postgres:9.6.5-alpine-ram
        environment:
          POSTGRES_USER: root
          POSTGRES_DB: circle_test
          POSTGRES_PASSWORD: ""

    steps:
      - checkout
      - run:
          name: Install Python dependencies in a venv
          command: |
            virtualenv venv
            . venv/bin/activate
            cd database_jobs
            python -m pip install -r requirements.txt

      - run:
          # https://discuss.circleci.com/t/how-do-you-use-postgres-9-6-x-or-9-x-x/9855/3
          name: Database Setup
          command: |
            sudo apt-get install postgres-9.6
            sudo service postgresql stop
            sudo mv /usr/lib/postgresql-9.6/9.6 /usr/lib/postgresql/9.6
            sudo mv /etc/postgresql-9.6/9.6 /etc/postgresql/9.6
            sudo service postgresql start 9.6

            sudo -u postgres psql -p 5433 -c "create user ubuntu with password 'ubuntu';"
            sudo -u postgres psql -p 5433 -c "alter user ubuntu with superuser;"

            sudo -u postgres psql -p 5433 -c "create database test;"

            sudo -u postgres psql -p 5433 -c "select * from test;"



      - run:
          name: Running tests
          command: |
            . venv/bin/activate
            cd database_jobs
            python -m pytest -s -vv tests/ --ignore=tests/test_data_platform.py --ignore=tests/test_sql_helper.py

But this is the error I am getting:

...Building dependency tree
Reading state information... Done

E: Unable to locate package postgres-9.6
E: Couldn't find any package by regex 'postgres-9.6'
Exited with code 100

Do I have to specify some path for the installation?

Back up a little bit 😛

So, you’re spinning up circleci/postgres:9.6.5-alpine-ram as a secondary container, which (from the name) is a PostgreSQL server. It’s not wrong to install PostgreSQL locally as well, but you don’t need to. When you run servers in secondary containers, CircleCI merges that container’s networking stack with localhost, so you should be able to connect on localhost:5432 (PostgreSQL’s default port; note that your psql commands above are pointing at 5433).
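
For example, using the secondary container you’ve already declared, the connection-related variables on the Python container could plausibly be filled in like this (a sketch - the values are simply taken from the POSTGRES_* settings in your own config, with the default port):

      - image: circleci/python:2.7.14-jessie
        environment:
          DP_DBNAME: circle_test
          DP_USER: root
          DP_PASSWORD: ""
          DP_HOST: localhost
          DP_PORT: "5432"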

However, if you want to carry on installing it in your primary container, then drop the secondary container: you don’t need both, and with the secondary one already holding the default port, your locally installed server may fail to start or come up on a different port (such as 5433). The apt-get install approach is right, but you’d need to find out what the package is actually called. The best way to do that is to SSH into the box after a failed build and run:

apt-cache search postgresql | grep server

It’s postgresql-9.5 for me on my local Ubuntu, not postgres-*.

No, but an apt-get update might be a good idea first.
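
Putting that together, something along these lines should sort the install (substitute whichever package the search actually turns up):

sudo apt-get update
apt-cache search postgresql | grep server    # find the real server package name
sudo apt-get install -y postgresql-9.6       # or whatever the search shows on that image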
