openqa-review index page doesn't fail when a report is missing
There were errors in two of the pipelines, so they were retriggered and succeeded. However, https://openqa.io.suse.de/openqa-review/openqa_suse_de_status.html was missing and not listed in the report index either.
- See https://gitlab.suse.de/openqa/openqa-review/-/jobs/527255 (which is fine now of course)
Updated by jbaier_cz over 2 years ago
I would say it is kind of "expected behaviour". The review job generates a single HTML page (the report) and stores it as an artifact. The pages job downloads all the artifacts from the previous jobs (there will be no artifact for failed jobs) and has no clue about how many jobs there are (there is room for improvement here, covered by poo#96353: the pages job could use the information from review_jobs.yml to know about missing artifacts). The pages job needs to succeed, because only then will GitLab take the artifacts from this job and publish them as a GitLab Page (as part of the internal pages:deploy job). This is illustrated by the job and its subsequent rerun: the report in question is missing in the first one but present in the second one.
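The improvement hinted at above could be sketched as follows: compare the reports expected from review_jobs.yml against what actually landed in the pages directory. This is a minimal sketch, not the actual openqa-review code; the function name, the list of expected report filenames, and the directory layout are all assumptions for illustration.

```python
import os

def find_missing_reports(expected_reports, public_dir):
    """Return the expected report filenames (e.g. derived from
    review_jobs.yml) that have no corresponding artifact file in the
    directory the pages job publishes."""
    return [name for name in expected_reports
            if not os.path.isfile(os.path.join(public_dir, name))]

# Hypothetical usage; the filenames and directory are assumptions.
expected = ["openqa_suse_de_status.html", "openqa_opensuse_org_status.html"]
for name in find_missing_reports(expected, "public"):
    print(f"report missing from pages artifacts: {name}")
```

The pages job could print such a list (or render it into the index page) so that a missing report is visible instead of silently absent.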
One can regenerate any report by simply restarting the corresponding job, but to see the updated results on the web, the artifacts need to be re-exported, i.e. the pages job needs to be restarted as well (the pages are in fact an export, not a direct view of the latest artifacts). As far as I know, it is possible to restart a job by triggering it via the API, but that is not an easy route and I am not sure it is worth the effort (it would replace a single click in the same UI where one restarted the review job in the first place).
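For reference, restarting a job via the API would use GitLab's job retry endpoint (POST /projects/:id/jobs/:job_id/retry). A minimal sketch; the instance URL, project path, job id, and token below are placeholders, and the request is only built, not sent:

```python
import urllib.request

def retry_job_request(base_url, project_id, job_id, token):
    """Build a POST request for GitLab's job retry endpoint.

    Endpoint: POST /projects/:id/jobs/:job_id/retry
    """
    url = f"{base_url}/api/v4/projects/{project_id}/jobs/{job_id}/retry"
    return urllib.request.Request(
        url, method="POST", headers={"PRIVATE-TOKEN": token})

# Hypothetical values for illustration; a real call needs a valid token.
req = retry_job_request("https://gitlab.suse.de",
                        "openqa%2Fopenqa-review", 527255, "<token>")
# urllib.request.urlopen(req)  # would actually trigger the retry
print(req.full_url)
```

Note that retrying the review job this way still leaves the pages job to be retried afterwards, which is why the one-click UI restart is hardly worse.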
Failing the pages job if one of the review jobs fails means no page would be exported at all; that was rejected as a bad option (see !6 for more). Instead, I suggest prioritizing poo#96353 to make failed reviews more visible.