gha: set 10-minute timeout on "report" actions
I had a CI run fail to "Upload reports":

    Exponential backoff for retry #1. Waiting for 4565 milliseconds before continuing the upload at offset 0
    Finished backoff for retry #1, continuing with upload
    Total file count: 211 ---- Processed file #160 (75.8%)
    ...
    Total file count: 211 ---- Processed file #164 (77.7%)
    Total file count: 211 ---- Processed file #164 (77.7%)
    Total file count: 211 ---- Processed file #164 (77.7%)
    A 503 status code has been received, will attempt to retry the upload
    ##### Begin Diagnostic HTTP information #####
    Status Code: 503
    Status Message: Service Unavailable
    Header Information: {
      "content-length": "592",
      "content-type": "application/json; charset=utf-8",
      "date": "Mon, 21 Aug 2023 14:08:10 GMT",
      "server": "Kestrel",
      "cache-control": "no-store,no-cache",
      "pragma": "no-cache",
      "strict-transport-security": "max-age=2592000",
      "x-tfs-processid": "b2fc902c-011a-48be-858d-c62e9c397cb6",
      "activityid": "49a48b53-0411-4ff3-86a7-4528e3f71ba2",
      "x-tfs-session": "49a48b53-0411-4ff3-86a7-4528e3f71ba2",
      "x-vss-e2eid": "49a48b53-0411-4ff3-86a7-4528e3f71ba2",
      "x-vss-senderdeploymentid": "63be6134-28d1-8c82-e969-91f4e88fcdec",
      "x-frame-options": "SAMEORIGIN"
    }
    ###### End Diagnostic HTTP information ######
    Retry limit has been reached for chunk at offset 0 to https://pipelinesghubeus5.actions.githubusercontent.com/Y2huPMnV2RyiTvKoReSyXTCrcRyxUdSDRZYoZr0ONBvpl5e9Nu/_apis/resources/Containers/8331549?itemPath=integration-reports%2Fubuntu-22.04-systemd%2Fbundles%2Ftest-integration%2FTestInfoRegistryMirrors%2Fd20ac12e48cea%2Fdocker.log
    Warning: Aborting upload for /tmp/reports/ubuntu-22.04-systemd/bundles/test-integration/TestInfoRegistryMirrors/d20ac12e48cea/docker.log due to failure
    Error: aborting artifact upload
    Total file count: 211 ---- Processed file #165 (78.1%)
    A 503 status code has been received, will attempt to retry the upload
    Exponential backoff for retry #1. Waiting for 5799 milliseconds before continuing the upload at offset 0

As a result, the "Download reports" step continued retrying:

    ...
    Total file count: 1004 ---- Processed file #436 (43.4%)
    Total file count: 1004 ---- Processed file #436 (43.4%)
    Total file count: 1004 ---- Processed file #436 (43.4%)
    An error occurred while attempting to download a file
    Error: Request timeout: /Y2huPMnV2RyiTvKoReSyXTCrcRyxUdSDRZYoZr0ONBvpl5e9Nu/_apis/resources/Containers/8331549?itemPath=integration-reports%2Fubuntu-20.04%2Fbundles%2Ftest-integration%2FTestCreateWithDuplicateNetworkNames%2Fd47798cc212d1%2Fdocker.log
        at ClientRequest.<anonymous> (/home/runner/work/_actions/actions/download-artifact/v3/dist/index.js:3681:26)
        at Object.onceWrapper (node:events:627:28)
        at ClientRequest.emit (node:events:513:28)
        at TLSSocket.emitRequestTimeout (node:_http_client:839:9)
        at Object.onceWrapper (node:events:627:28)
        at TLSSocket.emit (node:events:525:35)
        at TLSSocket.Socket._onTimeout (node:net:550:8)
        at listOnTimeout (node:internal/timers:559:17)
        at processTimers (node:internal/timers:502:7)
    Exponential backoff for retry #1. Waiting for 5305 milliseconds before continuing the download
    Total file count: 1004 ---- Processed file #436 (43.4%)

And it looks like GitHub doesn't allow cancelling the job, possibly because it is defined with `if: always()`.

Signed-off-by: Sebastiaan van Stijn <github@gone.nl>
This commit is contained in:

parent a01bcf9767
commit d6f340e784

1 changed file with 3 additions and 0 deletions
.github/workflows/test.yml
@@ -166,6 +166,7 @@ jobs:
 
   unit-report:
     runs-on: ubuntu-20.04
+    timeout-minutes: 10
     if: always()
     needs:
       - unit
@@ -354,6 +355,7 @@ jobs:
 
   integration-report:
     runs-on: ubuntu-20.04
+    timeout-minutes: 10
     if: always()
     needs:
       - integration
@@ -482,6 +484,7 @@ jobs:
 
   integration-cli-report:
     runs-on: ubuntu-20.04
+    timeout-minutes: 10
     if: always()
     needs:
       - integration-cli
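For reference, this is roughly what one of the report jobs looks like with the new limit. It is a minimal, self-contained sketch rather than the actual workflow: the trigger, the placeholder "unit" job, the step names, and the "unit-reports" artifact name are illustrative; only the keys shown in the diff above (runs-on, timeout-minutes, if, needs) come from the real file.

    name: test
    on: [push]   # trigger simplified for this sketch

    jobs:
      unit:
        runs-on: ubuntu-20.04
        steps:
          - run: echo "stand-in for the real unit-test job"

      unit-report:
        runs-on: ubuntu-20.04
        timeout-minutes: 10   # new: fail after 10 minutes instead of the 6-hour default job timeout
        if: always()          # still runs when "unit" fails, so its reports are collected
        needs:
          - unit
        steps:
          - name: Download reports
            uses: actions/download-artifact@v3   # the action whose retries hung in the log above
            with:
              name: unit-reports   # hypothetical artifact name
              path: /tmp/reports

With timeout-minutes set, a hung artifact upload or download now fails the report job after 10 minutes instead of occupying a runner until GitHub's default job timeout.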