updated_at field does not show the actual update time in the REST APIs that list pipelines / get a single pipeline

When a job in the workflow of a pipeline runs, I expected the updated_at field in the List Pipelines / Get a Pipeline REST API responses to be updated. It does not get updated. Is this a bug?

Hello,

Could you please share a link to a specific pipeline API call as an example? If you’d like to keep it private you can DM it to me.

Hi, thanks for looking into this. The problem is with this API - https://circleci.com/api/v2/project/github///pipeline
Please find the attached screenshot.
The updated_at does not get updated when a job in a workflow runs.


My workflow has a few jobs and then an approval (hold-type) job as well. After the approval is done and the next job in the workflow runs, I don't see the updated_at of the pipeline getting updated.
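For reference, the call I'm making looks roughly like this (a minimal Python sketch; the project slug and API token are placeholders, and it assumes the standard API v2 List Pipelines endpoint with a personal token passed via the Circle-Token header):

```python
import requests

# Placeholders: substitute your own project slug and API token.
PROJECT_SLUG = "github/<org>/<repo>"
CIRCLE_TOKEN = "<personal-api-token>"

resp = requests.get(
    f"https://circleci.com/api/v2/project/{PROJECT_SLUG}/pipeline",
    headers={"Circle-Token": CIRCLE_TOKEN},
)
resp.raise_for_status()

# Expectation: updated_at changes as jobs in the pipeline's workflows run.
for pipeline in resp.json()["items"]:
    print(pipeline["id"], pipeline["created_at"], pipeline["updated_at"])
```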

Hello,

This is because of the on-hold job. Since the approval/on-hold job isn't a real job, it won't update the updated_at timestamp. For now all I can advise is not to rely on that timestamp being updated when using an on-hold job; it should, however, update accordingly when a real job completes execution after it.

Hi, I just tested it.
I have two workflows:
workflow 1 -> build job
workflow 2 -> hold, deploy to qa, deploy to prod

I ran the API after the build job ran:
"updated_at": "2020-12-03T14:38:10.284Z",
Then after the hold was approved and deploy to qa and deploy to prod ran:
"updated_at": "2020-12-03T14:38:10.284Z",

It did not get updated for me.

I've been working on this in DMs with @deepaannjohn but I'll post the answer I discovered here. The updated_at field of the pipeline API responses is not a reflection of workflow- or job-level events. This field will come into play with some other features we're developing. For now the updated_at field will always be the same as the created_at field, and therefore it should not be used to check for state updates on the underlying workflows/jobs.

Hi, thank you. Are there any plans to add a 'pipeline last updated timestamp' field?

I can't say for sure. Can you tell me what you were ultimately hoping to use that value for? Would it be sufficient to get the workflows in the pipeline and then look at each workflow's stopped_at value to see when the workflow ended? It's also possible to check the state of the jobs in the workflow. If those aren't suitable, I'm happy to create a feature request for your idea.
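For example, something along these lines (a rough Python sketch; the pipeline ID and token are placeholders, and the field names should be double-checked against the current API v2 docs):

```python
import requests

CIRCLE_TOKEN = "<personal-api-token>"  # placeholder
PIPELINE_ID = "<pipeline-uuid>"        # placeholder
HEADERS = {"Circle-Token": CIRCLE_TOKEN}
BASE = "https://circleci.com/api/v2"

# Workflows of a pipeline: status and stopped_at show when each workflow ended.
workflows = requests.get(
    f"{BASE}/pipeline/{PIPELINE_ID}/workflow", headers=HEADERS
).json()["items"]

for wf in workflows:
    print(wf["name"], wf["status"], wf.get("stopped_at"))
    # Per-job detail (status, started_at, stopped_at) is available via
    # GET /workflow/{id}/job if the workflow-level timestamps aren't enough.
```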

We want to pull the details of all the latest job executions as part of our pipeline orchestration tooling. Currently, I do it by pulling in ALL pipelines, then ALL of their workflows, and then checking the started_at time of each job. If a job's started_at time is greater than the last time I pulled the data, it is considered a new execution. We expect scaling issues here soon, so it would be great to have a last_updated field for the pipelines. If you can create a feature request, that would be great.
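Roughly, what I do today looks like the sketch below (Python; the project slug, token, and last_poll bookmark are placeholders, and pagination and error handling are omitted for brevity):

```python
from datetime import datetime, timezone

import requests

CIRCLE_TOKEN = "<personal-api-token>"  # placeholder
PROJECT_SLUG = "github/<org>/<repo>"   # placeholder
HEADERS = {"Circle-Token": CIRCLE_TOKEN}
BASE = "https://circleci.com/api/v2"

# Bookmark from the previous poll (persisted elsewhere in practice).
last_poll = datetime(2020, 12, 3, 14, 0, tzinfo=timezone.utc)


def parse_ts(ts):
    # API timestamps come back like 2020-12-03T14:38:10.284Z
    return datetime.fromisoformat(ts.replace("Z", "+00:00"))


# 1. Pull ALL pipelines for the project (page-token pagination omitted).
pipelines = requests.get(
    f"{BASE}/project/{PROJECT_SLUG}/pipeline", headers=HEADERS
).json()["items"]

new_executions = []
for p in pipelines:
    # 2. Pull ALL workflows of each pipeline.
    workflows = requests.get(
        f"{BASE}/pipeline/{p['id']}/workflow", headers=HEADERS
    ).json()["items"]
    for wf in workflows:
        # 3. Compare each job's started_at against the last poll time.
        jobs = requests.get(
            f"{BASE}/workflow/{wf['id']}/job", headers=HEADERS
        ).json()["items"]
        for job in jobs:
            started = job.get("started_at")
            if started and parse_ts(started) > last_poll:
                new_executions.append((p["id"], wf["name"], job["name"]))

print(new_executions)
```

A pipeline-level last-updated timestamp would let this loop skip unchanged pipelines instead of walking every workflow and job on each poll, which is why the feature request would help.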

Hello,

I’ve created an idea for you. Let me know if it doesn’t accurately represent the feature you’re looking for and I can update it.