[CircleCI Security Alert] Rotate any secrets stored in CircleCI

Has CircleCI just revoked all deploy SSH keys by any chance?

Should be resolved, are you still seeing the problem?

I am not aware of any deploy key revocation on our side at this time, only the personal/project API keys, as previously posted.

Ah yep sorry - false alarm. Thanks for the quick reply :heart:

In case it helps anybody else, I’ve produced this tool for listing my CircleCI secrets: GitHub - rupert-madden-abbott/circleci-audit

It looks like CircleCI now have their own tool as well, but it’s not clear to me what it covers.

My tool handles:

  • Project Env Vars, Keys and Jira Integration
  • Context Env Vars
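For anyone who wants a minimal version of the same idea without installing a tool, the project env var part can be sketched against the CircleCI v2 API. This is a sketch, not the circleci-audit tool itself; the project slug and token below are placeholder values. The API masks values, so only names come back, which is enough for an audit checklist:

```python
# Minimal sketch: list a project's environment-variable names via the
# CircleCI v2 API (GET /project/{project-slug}/envvar). Values are
# masked by the API; only names are returned.
import json
import urllib.request

API_ROOT = "https://circleci.com/api/v2"

def envvar_path(project_slug: str) -> str:
    """Build the path for GET /project/{project-slug}/envvar."""
    return f"/project/{project_slug}/envvar"

def list_envvar_names(project_slug: str, token: str) -> list[str]:
    """Return the names of the project's environment variables."""
    req = urllib.request.Request(
        API_ROOT + envvar_path(project_slug),
        headers={"Circle-Token": token},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return [item["name"] for item in body.get("items", [])]

# Usage (placeholder slug/token):
#   list_envvar_names("gh/my-org/my-repo", "<CIRCLE_TOKEN>")
```

Contexts can be walked the same way via the `/context` endpoints, which is the other half of what the tool covers.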

We have made audit logs available to all customers, with the following specific changes:

  • Enable self-service access for all plan tiers including free customers. Frequency limits as follows:
    • By account type:
      • Free: 1 request per day
      • All paid plans: 3 requests per day
    • Decrease the maximum query window to 30 days - specifically, a customer can request logs for any 30-day window whose start falls within the last calendar year

This can be found in the “Security” tab of the “Org Settings” page.

For more information on self-serve audit logs, please see our documentation: Audit logs - CircleCI.

Has CircleCI taken action to remove user SSH checkout keys?

Earlier today I made a report using the API, and now those keys are not showing up.

This is problematic, as I need to assume those keys are compromised, but I only have the key fingerprints from earlier today - which isn’t enough to track them down and ensure they’re revoked on the GitHub end (the API response doesn’t note which user each key belonged to).

Anyone who didn’t already list these keys earlier today won’t be able to get a list at all.
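For anyone who can still list their keys, a small sketch of snapshotting a project’s checkout-key metadata to keep for later cross-checking on GitHub. The slug and token are placeholders; this uses the CircleCI v2 API’s `GET /project/{project-slug}/checkout-key` endpoint:

```python
# Sketch: record a project's checkout-key metadata (type + fingerprint)
# before keys disappear, so they can be matched against GitHub later.
import json
import urllib.request

API_ROOT = "https://circleci.com/api/v2"

def checkout_key_path(project_slug: str) -> str:
    """Build the path for GET /project/{project-slug}/checkout-key."""
    return f"/project/{project_slug}/checkout-key"

def snapshot_checkout_keys(project_slug: str, token: str) -> list[dict]:
    """Return each key's type and fingerprint for offline record-keeping."""
    req = urllib.request.Request(
        API_ROOT + checkout_key_path(project_slug),
        headers={"Circle-Token": token},
    )
    with urllib.request.urlopen(req) as resp:
        items = json.load(resp).get("items", [])
    return [
        {"type": k.get("type"), "fingerprint": k.get("fingerprint")}
        for k in items
    ]

# Usage (placeholder slug/token):
#   snapshot_checkout_keys("gh/my-org/my-repo", "<CIRCLE_TOKEN>")
```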


Joining in on the above questions about SSH keys.

We rotated all of our deploy keys yesterday (deleted the key on the GitHub side, deleted the key on the CircleCI side, clicked Add Deploy Key again).

We have just found that the new keys have been deleted on the GitHub side for every project (they’re still present in each project on the CircleCI side).

It does seem best in this case to revoke user tokens on the CircleCI end, otherwise we are talking about an error-prone manual process which is massively difficult for large orgs.

I had the exact same thing happen yesterday. I assumed it was because I generated new SSH keys before revoking my OAuth relationship between CircleCI and GitHub. After regenerating the keys with my new OAuth session, things look better.

But I would like confirmation that this isn’t the root cause so I can be on the lookout for our keys getting removed again.

To follow up on this briefly, I was able to match up some of my user keys via their fingerprints, and found that they’d been deleted on the CircleCI side, but were still valid on the GitHub side.

@aaronstillwell I think y’all have to hit “Revoke all user tokens” on your GitHub app, as per @bwalding’s comment. As GitHub Enterprise customers, if we deauthorize the CircleCI app for our org, or remove engineers from our org, the CircleCI tokens stop having access to company resources. Cool! That’s what we want.

But if we reinstate those engineers or re-enable the app for our org, the potentially compromised tokens start working again. Users do not have to re-authorize CircleCI with GitHub to continue using CircleCI and see company projects and resources. The compromised tokens do not get fully invalidated when the users or the app are removed.

So in short, it looks like there is nothing we can do, even as GitHub Business/Enterprise owners, to nuke the GitHub access tokens provisioned by CircleCI. But if y’all hit that button, users should have to re-authorize and the potentially compromised tokens should be nuked. Please hit it ASAP. Thanks!

Sorry for the delay in the response. Answering in-line:

More information will be shared on this as it becomes available. We will provide updates at that time on the original blog post.

We have not changed our IP addresses since April 2022.

We recommend checking your audit logs on GitHub to see if repos were accessed. If the key used for decryption was stored as a secret, you will want to rotate it.

No update at this time, but once there is we will post here as well as update the original blog post as done yesterday.

Is there any way of extracting the legacy AWS_ACCESS_KEY_ID from CircleCI? The UI gives the option to delete the legacy AWS credentials, but you can’t actually see what the ID is. This is important to be able to revoke/rotate the affected AWS access key. Is there an API/UI for finding this information?

I know there is the option of making a config.yml which spits out the value, but that doesn’t scale well for many repos.
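As a rough illustration of that config.yml approach, a sketch only: it assumes the legacy credentials are injected into jobs as the usual AWS_ACCESS_KEY_ID environment variable, and the Docker image name is just a placeholder:

```yaml
version: 2.1
jobs:
  print-access-key-id:
    docker:
      - image: cimg/base:stable
    steps:
      - run:
          # Print only the key ID, never the secret access key.
          name: Print legacy AWS access key ID
          command: echo "AWS_ACCESS_KEY_ID=${AWS_ACCESS_KEY_ID}"
workflows:
  audit:
    jobs:
      - print-access-key-id
```

This still has to be run once per repo, which is exactly the scaling problem with many projects.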


Following the steps in our support article, we recommend rotating all secrets listed on that page out of an abundance of caution.

Additionally, we’ve updated the page to share that we’re rotating all GitHub OAuth tokens for all users, and will provide an update when the process is complete.

That page doesn’t mention the legacy AWS credentials which you are holding for certain projects, as identified by https://support.circleci.com/hc/en-us/articles/360021415793-Wrong-AWS-credentials-being-used

Does that mean those are safe and do not need to be rotated? How can we rotate them if we don’t know what they are?

The wording in the article for API tokens says “rotated”, which to me sounded like a new token was generated in its place. But do you really mean the tokens were removed?

Also, this broke a lot of my automation suddenly. I am surprised there was no warning on this. We were using API token auth to rotate our secrets…

A new update has been pushed to the blog post:

Security update 01/06/2023

Our team is working to take every action available to assist customers in the mitigation of this incident.

Since our last update, our team has addressed the following areas on behalf of customers:

  • Personal and Project API Tokens: We have removed all Personal and Project API Tokens created before 00:00 UTC on January 5, 2023.
  • Bitbucket OAuth: As of 10:00 UTC on January 6, 2023 our partners at Atlassian expired all OAuth tokens for Bitbucket users. Bitbucket tokens will refresh for users upon login, and no additional action is needed here. Bitbucket users will still need to replace SSH tokens.
  • GitHub OAuth: We are currently rotating all GitHub OAuth tokens on behalf of customers. We expect this process to be complete by 00:00 UTC on Jan 7, 2023. We will update here when this process is done. Customers who wish to rotate their own GitHub OAuth tokens may follow the directions below.

Sorry if I missed tagging anyone; I’m trying to get everyone who posted something related to this:
@duffn @traviscrist @ring-pete @david-davidson @atnak @microbit-matth @bwalding @TheMetalCode @agius

Thank you all for your patience as we try to get answers for other outstanding questions.