Running E2E tests in a separate job from the app build

Hi,
I'd like to run my E2E tests as a separate job using workflows.
What I want to know is how to set things up so the E2E tests can reach the web server launched in the first job, and how to keep the first job's container running while the tests are executing.
Here’s my config:
version: 2.1

orbs:
  aws-s3: circleci/aws-s3@1.0.11
  sentry-cli: picturepipe/sentry-cli@1.0.1

jobs:
  build:
    working_directory: ~/FE
    environment:
      NODE_ENV: test
    docker:
      - image: circleci/python:latest-node
    steps:
      - checkout
      - run:
          name: Update npm
          command: sudo npm install -g npm@latest
      - restore_cache:
          key: v1-fe-dependency-cache-{{ checksum "package.json" }}
      - run:
          name: Install Dependencies
          command: npm install
      - save_cache:
          key: v1-fe-dependency-cache-{{ checksum "package.json" }}
          paths:
            - ./node_modules
      - run:
          name: Run Frontend server
          command: npm run start
          background: true
      - add_ssh_keys:
          fingerprints:
            - "...."
      - run:
          name: Clone Backend
          command: git clone git@github.com:xxx/BE.git ~/BE
      - restore_cache:
          key: v1-be-dependency-cache-{{ checksum "~/BE/src/requirements.txt" }}
      - run:
          name: Run Backend Server
          command: |
            cd ~/BE
            source ./ci-deploy.sh
      - save_cache:
          key: v1-be-dependency-cache-{{ checksum "~/BE/src/requirements.txt" }}
          paths:
            - ~/BE/src/venv
  test:
    working_directory: ~/Automation
    environment:
      NODE_ENV: test
    docker:
      - image: circleci/node:latest-browsers
    steps:
      - run:
          name: Add Github RSA Key To Known Hosts
          command: |
            mkdir ~/.ssh
            touch ~/.ssh/known_hosts
            ssh-keyscan -H github.com >> ~/.ssh/known_hosts
      - add_ssh_keys:
          fingerprints:
            - "....."
      - run:
          name: Clone Automation
          command: git clone git@github.com:xxx/Automation.git ~/Automation
      - restore_cache:
          key: v1-automation-dependency-cache-{{ checksum "package.json" }}
      - run:
          name: Install Dependencies
          command: npm install
      - save_cache:
          key: v1-automation-dependency-cache-{{ checksum "package.json" }}
          paths:
            - ./node_modules
      - run:
          name: Test
          command: npm test
  deploy:
    working_directory: ~/FE
    docker:
      - image: circleci/node:latest
    environment:
      S3_BUCKET_NAME: presentation
    steps:
      - checkout
      - run:
          name: Install Dependencies
          command: npm install
      - run:
          name: Build
          command: npm run build
      - aws-s3/sync:
          from: build
          to: 's3://${S3_BUCKET_NAME}'
          aws-access-key-id: AWS_ACCESS_KEY_ID
          aws-secret-access-key: AWS_SECRET_ACCESS_KEY
          aws-region: AWS_REGION
          overwrite: true

workflows:
  version: 2
  build_and_test:
    jobs:
      - build
      - test:
          requires:
            - build
      - deploy:
          requires:
            - test
          filters:
            branches:
              only:
                - master
                - develop
  nightly:
    triggers:
      - schedule:
          cron: "0 0 * * *"
          filters:
            branches:
              only:
                - master
                - develop
    jobs:
      - test
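
For context, the kind of split I have in mind would persist the built app from one job and start it again in the next, since (as far as I understand) job containers don't survive between jobs. Here's a rough sketch of that idea; the job names, paths, and the `npx serve` command are placeholders, not my real config:

```yaml
# Hypothetical sketch, not my working config: share the built app between
# jobs via a workspace, then start the server again inside the test job.
version: 2.1
jobs:
  build:
    docker:
      - image: circleci/python:latest-node
    steps:
      - checkout
      - run: npm install
      - run: npm run build
      - persist_to_workspace:   # hand the build output to downstream jobs
          root: .
          paths:
            - build
            - node_modules
  e2e:
    docker:
      - image: circleci/node:latest-browsers
    steps:
      - attach_workspace:       # restore the files persisted above
          at: .
      - run:
          name: Serve app in background
          command: npx serve -s build   # placeholder static server
          background: true
      - run: npm test
workflows:
  build_and_e2e:
    jobs:
      - build
      - e2e:
          requires:
            - build
```

Is something like this the intended pattern, or is there a way to actually keep the first job's server alive?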

I'm also wondering whether I can split my front end and back end into two separate jobs, let the front end reach the back end somehow, and keep both containers running for the test job.
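
If cross-job networking isn't possible, one alternative I'm considering (purely a sketch; the backend image name, port, and health path are placeholders) is running the backend as a secondary container inside the test job itself, since secondary containers share the primary container's network on localhost:

```yaml
# Hypothetical sketch: secondary containers run alongside the primary one
# and are reachable on localhost, so no job-to-job networking is needed.
jobs:
  test:
    docker:
      - image: circleci/node:latest-browsers  # primary container, runs the steps
      - image: example/backend:latest         # placeholder backend image
    steps:
      - checkout
      - run: npm install
      - run:
          name: Wait for backend to come up
          command: |
            # poll a placeholder health endpoint for up to 60 seconds
            for i in $(seq 1 30); do
              curl -fs http://localhost:8000/health && exit 0  # placeholder port/path
              sleep 2
            done
            echo "backend never became ready" && exit 1
      - run: npm test
```

Would that be a reasonable way to avoid keeping containers alive across jobs?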

Thanks in advance!
