Orb docker image not used as executor when additional job image specified

I’m setting up a Python/Django project, which means I need a Python image and a Postgres database image running together. The Python image will be the executor.

I set up an inline orb for the Python tooling. I took the official Python orb and modified it to use poetry instead of pip.

For postgres, I just specified the Docker image directly in my job, keeping the orb image as the executor.

When my job actually runs, it fails. According to the logs, only the postgres Docker image is spun up during the Spin Up Environment phase; the orb image that should be the executor is never launched. The postgres image ends up being used as the executor and, unsurprisingly, fails. Here is my entire config, with the inline orb, for reference.

orbs:
  python-poetry:
    description: Common CircleCI tasks for the Python programming language using poetry instead of pip

    commands:
      dist:
        description: Build a distribution package using poetry.
        steps:
        - run:
            command: poetry build -n
            name: Build distribution package
      install-deps:
        description: Install packages using poetry.
        steps:
        - run:
            command: poetry install --no-dev -n
            name: Install Dependencies
      load-cache:
        description: Load cached poetry packages
        parameters:
          key:
            default: poetry
            description: The cache key to use. The key is immutable.
            type: string
        steps:
        - restore_cache:
            keys:
            - << parameters.key >>-{{ checksum "poetry.lock" }}
      save-cache:
        description: Save poetry packages to cache.
        parameters:
          key:
            default: poetry
            description: The cache key to use. The key is immutable.
            type: string
          lib-path:
            default: /home/.cache/pypoetry/virtualenvs
            description: The path where the requirements are saved to.
            type: string
        steps:
        - save_cache:
            key: << parameters.key >>-{{ checksum "poetry.lock"  }}
            paths:
            - /home/circleci/.local/bin/
            - << parameters.lib-path >>
      test:
        description: Run python tests using unittest or pytest module.
        parameters:
          pytest:
            default: false
            description: Use pytest as test runner.
            type: boolean
        steps:
        - run:
            command: |
              if << parameters.pytest >>; then
                poetry run pytest
              else
                poetry run python -m unittest discover -v
              fi
            name: Test
      django_test:
        description: Run python tests for django project
        parameters:
          project-name:
            description: Django project name
            type: string
        steps:
        - run:
            command: poetry run ./<< parameters.project-name >>/manage.py test

    executors:
      default:
        description: CircleCI python image
        docker:
          - image: circleci/python:<< parameters.tag >>
        parameters:
          tag:
            default: latest
            description: Tag of the python docker image to use. Must include poetry.
            type: string

jobs:
  build-and-test:
    executor: python-poetry/default

    docker:
      - image: circleci/postgres:latest
        environment:
          POSTGRES_USER: ${FRC_DB_USER}
          POSTGRES_DB: ${FRC_DB_NAME}
          POSTGRES_PASSWORD: ${FRC_DB_PASS}

    steps:
      - checkout
      - python-poetry/load-cache
      - python-poetry/install-deps
      - python-poetry/save-cache
      - python-poetry/django_test:
          project-name: frontrowcrew

workflows:
  main:
    jobs:
      - build-and-test

How can I get both images to launch? Do I need to make an orb for postgres as well?

Update: I have confirmed that putting PostgreSQL into its own orb works and solves the problem. However, the question still stands: is that really the only way? If you use an orb as the executor in your config, must any additional Docker images also come from orbs?

That’s certainly interesting… @KyleTryon do you have any insight here?

Hello @Apreche

    executor: python-poetry/default

    docker:
      - image: circleci/postgres:latest
        environment:
          POSTGRES_USER: ${FRC_DB_USER}
          POSTGRES_DB: ${FRC_DB_NAME}
          POSTGRES_PASSWORD: ${FRC_DB_PASS}

This will not work because it is effectively declaring two executors for the same job, which is treated as duplicate keys. The docker executor itself can contain multiple images, but a job cannot list more than one executor.

You could potentially add a parameter to your executor which would contain the “database image” that you want to use.

    executors:
      default:
        description: CircleCI python image
        docker:
          - image: circleci/python:<< parameters.tag >>
          - image: << parameters.db >>
        parameters:
          tag:
            default: latest
            description: Tag of the python docker image to use. Must include poetry.
            type: string
          db:
            default: some/image
            description: database
            type: string

How to use:

jobs:
  build-and-test:
    executor:
      name: python-poetry/default
      tag: latest
      db: circleci/postgres:latest


Kyle is correct. Alternatively, you can list both Docker images directly under the job’s docker: key and not use the executor from the orb. Either approach works; you just can’t mix the two.
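For reference, that alternative (no orb executor, both images under the job’s docker: key) would look roughly like this, reusing the images and steps from the config above. The first image listed becomes the primary container that runs the steps, and the orb’s commands still work since they don’t depend on the orb’s executor:

jobs:
  build-and-test:
    docker:
      - image: circleci/python:latest
      - image: circleci/postgres:latest
        environment:
          POSTGRES_USER: ${FRC_DB_USER}
          POSTGRES_DB: ${FRC_DB_NAME}
          POSTGRES_PASSWORD: ${FRC_DB_PASS}
    steps:
      - checkout
      - python-poetry/load-cache
      - python-poetry/install-deps
      - python-poetry/save-cache
      - python-poetry/django_test:
          project-name: frontrowcrew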

Thanks. It wasn’t entirely clear that docker: and executor: were mutually exclusive. I guess I’ll just stick with my two-orb solution for now. It’s working.