action #127532 (closed)

CI checks failing on os-autoinst master size:M

Added by okurz over 1 year ago. Updated over 1 year ago.

Status: Resolved
Priority: High
Assignee: mkittler
Category: Regressions/Crashes
Target version:
Start date: 2023-04-12
Due date:
% Done: 0%
Estimated time:

Description

Observation

https://github.com/os-autoinst/os-autoinst#os-autoinst-- shows "ci failing", and https://github.com/os-autoinst/os-autoinst/actions?query=branch%3Amaster shows failed results. The latest run at the time of writing,
https://github.com/os-autoinst/os-autoinst/actions/runs/4599640678/jobs/8125256383, is failing with:

7:         # No tests run!
7: 
7:     #   Failed test 'No tests run for subtest "stop grabbing"'
7:     #   at ./t/29-backend-generalhw.t line 171.
7:     # Looks like you failed 1 test of 2.
7: 
7: #   Failed test 'serial grab'
7: #   at ./t/29-backend-generalhw.t line 174.
7: Can't kill('-TERM', '2481'): No such process at /opt/backend/generalhw.pm line 178
7: # Tests were run but no plan was declared and done_testing() was not seen.
7: # Looks like your test exited with 255 just after 10.
7: [17:42:06] ./t/29-backend-generalhw.t ................. 
7: Dubious, test returned 255 (wstat 65280, 0xff00)
7: Failed 1/10 subtests 

Acceptance criteria

  • AC1: t/29-backend-generalhw.t is stable

Suggestions

  • Confirm whether this is happening reliably; it looks like it may be flaky
  • Try to make sense of the code to spot the race condition
  • Run the test in a loop and see if it fails (see the sketch after this list)
  • Try running the test with a smaller time limit
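
As a reproduction aid for the loop suggestion above, here is a minimal sketch (assuming prove is available and the script is started from an os-autoinst checkout; the iteration count and helper name are arbitrary):

    # repro-29-generalhw.pl - hypothetical helper, not part of the repository
    use strict;
    use warnings;

    my $test = './t/29-backend-generalhw.t';
    for my $i (1 .. 100) {
        print "iteration $i\n";
        # prove exits non-zero on the first failing run, so we stop right there
        system('prove', '-v', $test) == 0
            or die "test failed on iteration $i\n";
    }
    print "no failure in 100 iterations\n";
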
Actions #1

Updated by livdywan over 1 year ago

  • Subject changed from CI checks failing on os-autoinst master to CI checks failing on os-autoinst master size:M
  • Description updated (diff)
  • Status changed from New to Workable
Actions #2

Updated by mkittler over 1 year ago

  • Assignee set to mkittler
Actions #3

Updated by mkittler over 1 year ago

  • Status changed from Workable to Feedback

I remember this error and have already investigated it before. I still have no clue why the test runs into this condition, so I have created a workaround: https://github.com/os-autoinst/os-autoinst/pull/2300
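
The workaround itself is in the PR linked above. For illustration only: the "Can't kill('-TERM', '2481'): No such process" message points at a teardown race where the child exits before the backend signals its process group, and a defensive pattern for that situation (a hypothetical sketch, not the contents of the PR) looks like this:

    # hypothetical sketch - not the contents of PR 2300
    use strict;
    use warnings;
    use POSIX ':sys_wait_h';

    sub stop_child_process {
        my ($pid) = @_;
        return unless defined $pid;
        # kill returns the number of processes signalled; 0 means the process
        # group is already gone, which is harmless during teardown
        my $signalled = kill '-TERM', $pid;
        warn "process group $pid already gone\n" unless $signalled;
        # reap the child if it already exited so it does not linger as a zombie
        waitpid $pid, WNOHANG;
    }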

Actions #4

Updated by okurz over 1 year ago

  • Status changed from Feedback to Resolved

https://github.com/os-autoinst/os-autoinst/pull/2300 has been merged and https://github.com/os-autoinst/os-autoinst/actions is currently green for master. With the patch I expect CI to stay stable, and if it turns bad again we will notice, so we can resolve this ticket.
