action #93710

closed

openQA Project - coordination #39719: [saga][epic] Detection of "known failures" for stable tests, easy test results review and easy tracking of known issues

action #52655: [epic] Move openqa-review from cron-jobs on lord.arch to a more sustainable long-term solution

Reference individual openqa-review reports in gitlab CI artifacts, e.g. using gitlab pages

Added by okurz almost 3 years ago. Updated over 2 years ago.

Status:
Resolved
Priority:
Normal
Assignee:
Target version:
Start date:
Due date:
% Done:

0%

Estimated time:

Description

Motivation

https://gitlab.suse.de/openqa/openqa-review/-/pipeline_schedules references multiple different reports that should replace okurz's personal cron jobs on lord.arch with a generic solution for running openqa-review and publishing the test overview report documents.

Acceptance criteria

Suggestions

  • Research how we can parameterize gitlab CI artifacts or just generate all the reports into individual documents and reference the HTML documents directly in URLs
  • Optional: Ensure that the index page lists all available reports, e.g. by not having an index.html at all

Related issues: 3 (0 open, 3 closed)

Related to openQA Project - coordination #91914: [epic] Make reviewing openQA results per squad easier (Resolved, okurz, 2021-05-25)

Related to QA - action #95033: openqa-review fails upon trying to access openqa with no-urlencoded addresses (Resolved, dheidler, 2021-07-02 to 2021-07-16)

Copied from QA - action #91356: Save openqa-review reports as gitlab CI artifacts (Resolved, osukup, 2021-04-19)

Actions #1

Updated by okurz almost 3 years ago

  • Copied from action #91356: Save openqa-review reports as gitlab CI artifacts added
Actions #2

Updated by jbaier_cz almost 3 years ago

I can probably look at this. The parametrization of the pipeline shouldn't be a big problem, as we can run arbitrary code inside a gitlab pipeline (even a pipeline generated by a previous job).

If I understood that correctly, we basically want to have all pages from https://w3.nue.suse.com/~okurz/openqa_suse_de_status.html moved under https://openqa.io.suse.de/openqa-review/. In the gitlab UI I see 4 (not yet) scheduled jobs; I assume the ARGS for those schedules come directly from the aforementioned cron jobs and are directly usable, is that right?

My other question: do we want one schedule to generate all available reports, or one schedule per report, with a nice index page to list them all?

Actions #3

Updated by osukup almost 3 years ago

@jbaier_cz I thought GitLab pages:deploy rewrote the whole generated site ... so only one schedule is possible?

Actions #4

Updated by jbaier_cz almost 3 years ago

Indeed, pages:deploy will overwrite the content of the pages with the new version of the artifacts from the pages job. However, you can have multiple jobs, each generating just one particular report and storing it as an artifact of that job. Afterwards a pages job takes all the generated artifacts, puts them together, adds an index page and exports everything into a folder for the pages:deploy job. It sounds a little complicated, but it can be done; the key here is to store the individual reports as individual artifacts. It could (most probably) also be done inside a single job which downloads the old version of the page, creates an updated version of the desired report and updates the artifacts.
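
A minimal sketch of such a layout in .gitlab-ci.yml terms; the job names and the generate-report.sh helper are made up for illustration, and the real jobs would run openqa-review instead:

report-o3:                        # one job per report; names are hypothetical
  stage: build
  script:
    - ./generate-report.sh openqa_opensuse_org_status.html   # placeholder for the actual report command
  artifacts:
    paths:
      - openqa_opensuse_org_status.html

report-osd:
  stage: build
  script:
    - ./generate-report.sh openqa_suse_de_status.html
  artifacts:
    paths:
      - openqa_suse_de_status.html

pages:
  stage: deploy
  script:
    - mkdir -p public
    - mv *.html public/           # artifacts from the earlier report jobs are downloaded automatically
    - echo "an index page could be generated here" > public/index.html
  artifacts:
    paths:
      - public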

Actions #5

Updated by okurz almost 3 years ago

jbaier_cz wrote:

I can probably look at this. The parametrization of the pipeline shouldn't be a big problem, as we can run arbitrary code inside a gitlab pipeline (even a pipeline generated by a previous job).

jbaier_cz, it would be great if you could take this task.

If I understood that correctly, we basically want to have all pages from https://w3.nue.suse.com/~okurz/openqa_suse_de_status.html moved under https://openqa.io.suse.de/openqa-review/. In the gitlab UI I see 4 (not yet) scheduled jobs; I assume the ARGS for those schedules come directly from the aforementioned cron jobs and are directly usable, is that right?

Correct.

The current, complete crontab looks like this:

0 8 * * 1-5 env $HOME/local/openqa_review/bin/openqa-review-daily-email
# the same with custom parameters for openqa.opensuse.org
20 8 * * * env openqa_host=https://openqa.opensuse.org recv=okurz@suse.de html_target_file=openqa_opensuse_org_status.html $HOME/local/openqa_review/bin/openqa-review-daily-email
30 8 * * * env recv=okurz@suse.de html_target_file=openqa_sle15_status.html $HOME/local/openqa_review/bin/openqa-review-daily-email --job-groups="^SLE.*15"
40 8 * * * env recv=okurz@suse.de html_target_file=openqa_sle15_functional_status.html $HOME/local/openqa_review/bin/openqa-review-daily-email --job-groups="^SLE.*15.*(Functional)"
50 8 * * * env recv=okurz@suse.de html_target_file=openqa_sle15_yast_status.html $HOME/local/openqa_review/bin/openqa-review-daily-email --job-groups="^SLE.*15.*(YaST)"
# Provide an example report with "skip-passed" option for people to compare https://progress.opensuse.org/issues/93727
0 9 * * 1-5 env recv=okurz@suse.de html_target_file=openqa_suse_de_skip_passed.html $HOME/local/openqa_review/bin/openqa-review-daily-email --skip-passed

The exact time is not that relevant; the jobs have just been staggered so they do not all run in parallel, to reduce load on the calling host as well as on the target openQA instance. For each cron job we want a corresponding report generated and stored in gitlab CI job artifacts, plus an email sent; the script "openqa-review-daily-email" both generates the report and sends the email.

My other question: do we want one schedule to generate all available reports, or one schedule per report, with a nice index page to list them all?

I don't see how an index page would depend on one vs. multiple schedules. If we just have no index.html in a folder of the space filled by GitLab Pages, I assume that would show up as an index page automatically? If not, we can leave that out of scope for now and just reference reports directly by URL. Regarding one vs. multiple schedules, I don't know which approach is easier and which would allow saving artifacts for each report configuration. You can try that out.

Actions #6

Updated by jbaier_cz almost 3 years ago

  • Assignee set to jbaier_cz

okurz wrote:

I don't see how an index page would depend on one vs. multiple schedules.

Sorry, I wasn't clear about the precedence; I meant: (either only one schedule or one schedule per report), with a nice index page to list them all in both cases.

If we just have no index.html in a folder of the space filled by GitLab Pages, I assume that would show up as an index page automatically?

That is actually a very interesting question. Personally, I have never tried it; I assumed automatic directory listing would be disabled for security reasons. I will definitely try that.

Regarding one vs. multiple schedules, I don't know which approach is easier and which would allow saving artifacts for each report configuration. You can try that out.

I will experiment a little with the multi-job pipeline, as being able to regenerate any single report at will is a benefit; in the worst case I can easily revert to one job generating the reports serially.

Actions #7

Updated by osukup almost 3 years ago

If we just have no index.html in a folder of the space filled by GitLab Pages, I assume that would show up as an index page automatically?

for now, the result page is renamed to index.html in the job :D

Actions #8

Updated by jbaier_cz almost 3 years ago

Notes for the future: some options are currently limited, as passing artifacts from a child pipeline to the parent pipeline is not (yet) possible. For more references see https://gitlab.com/gitlab-org/gitlab/-/issues/215725 and https://gitlab.com/gitlab-org/gitlab/-/issues/285100

Actions #9

Updated by jbaier_cz almost 3 years ago

  • Status changed from Workable to In Progress

Due to the limitations above, the pages job has to be inside the dynamically created child pipeline; otherwise there would have to be a job which collects the artifacts from the child pipeline by URL via curl and makes them visible to the rest of the parent pipeline. That seems overly complicated and unnecessary for the current usage, so I would like to go with the first option (including the deployment inside the child pipeline).
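
A rough sketch of that first option on the parent side, assuming a hypothetical generate-child-pipeline.sh that expands REVIEW_JOBS; the generated child pipeline would then contain the report jobs and the pages job:

generate-pipeline:
  stage: build
  script:
    - ./generate-child-pipeline.sh > child-pipeline.yml   # hypothetical generator reading $REVIEW_JOBS
  artifacts:
    paths:
      - child-pipeline.yml

run-reviews:
  stage: deploy
  trigger:
    include:
      - artifact: child-pipeline.yml
        job: generate-pipeline
    strategy: depend          # the parent waits for the child pipeline, which also runs the pages job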

Actions #10

Updated by livdywan almost 3 years ago

jbaier_cz wrote:

Due to the limitations above, the pages job has to be inside the dynamically created child pipeline; otherwise there would have to be a job which collects the artifacts from the child pipeline by URL via curl and makes them visible to the rest of the parent pipeline. That seems overly complicated and unnecessary for the current usage, so I would like to go with the first option (including the deployment inside the child pipeline).

I think whatever is most practical is fine. The result is what matters to most people here.

Actions #11

Updated by jbaier_cz almost 3 years ago

First working prototype ready: Here is the pipeline, the result is published as openqa_opensuse_org_status.html and more...

The pipeline is fully driven by the variable REVIEW_JOBS, currently with this setting:

reviews:
  - html_target_file: openqa_suse_de_status.html
  # the same with custom parameters for openqa.opensuse.org
  - html_target_file: openqa_opensuse_org_status.html
    recv: okurz@suse.de
    openqa_host: https://openqa.opensuse.org
  - html_target_file: openqa_sle15_status.html
    recv: okurz@suse.de
    extra_args: --job-groups ^SLE.*15
  - html_target_file: openqa_sle15_functional_status.html
    recv: okurz@suse.de
    extra_args: --job-groups ^SLE.*15.*(Functional)
  - html_target_file: openqa_sle15_yast_status.html
    recv: okurz@suse.de
    extra_args: --job-groups ^SLE.*15.*(YaST)
  # Provide an example report with "skip-passed" option for people to compare https://progress.opensuse.org/issues/93727
#  - html_target_file: openqa_suse_de_skip_passed.html
#    recv: okurz@suse.de
#    extra_args: --skip-passed

That should be roughly the same as the provided crontab.
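
For illustration only, one REVIEW_JOBS entry might expand into a child-pipeline job roughly like this; the checkout step and the variable handling are assumptions, not the actual generated configuration:

openqa_sle15_status:
  variables:
    recv: okurz@suse.de
    html_target_file: openqa_sle15_status.html
  script:
    - git clone --depth 1 https://github.com/os-autoinst/openqa_review.git .
    - bin/openqa-review-daily-email --job-groups="^SLE.*15"
  artifacts:
    paths:
      - openqa_sle15_status.html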

What is still missing, what can be enhanced and/or what needs discussion:

  1. Parameter --skip-passed does not work; the reason is probably an old version of python3-openqa_review inside the container. One possible solution is to rebuild the used container; a second solution is to install the needed dependencies inside the CI job (as there are not many dependencies in this case, it should not be too hard or time consuming to do). My proposal is to create another (weekly?) schedule to auto-update the container and use that container. This needs @okurz to decide, as the container lives in his home space.

  2. The index page is missing and automatic directory listing is disabled for security reasons. It would probably be nice to create at least a basic index file listing all artifacts; this is doable inside the pages job (see the sketch after this list).

  3. If needed, the REVIEW_JOBS parser could be enhanced to allow more variables from bin/openqa-review-daily-email to be set.
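
As a rough idea for point 2, the pages job's script could assemble a trivial listing of the collected reports along these lines (an untested sketch; the index is written to a temporary name first so the listing does not include itself):

{
  echo "<html><body><h1>openqa-review reports</h1><ul>"
  for f in public/*.html; do
    n=$(basename "$f")
    echo "<li><a href=\"$n\">$n</a></li>"
  done
  echo "</ul></body></html>"
} > index.html && mv index.html public/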

Actions #12

Updated by okurz almost 3 years ago

jbaier_cz wrote:

First working prototype ready: Here is the pipeline, the result is published as openqa_opensuse_org_status.html and more...

Impressive and very nice. https://jbaier_cz.io.suse.de/openqa-review/openqa_sle15_status.html also looks good. https://jbaier_cz.io.suse.de/openqa-review/ yields 404, so we have our answer regarding an "automagic index page", but that's ok.

  1. Parameter --skip-passed does not work; the reason is probably an old version of python3-openqa_review inside the container. One possible solution is to rebuild the used container,

Oops. That should be fixed now by me creating a new tag on GitHub, updating the package on OBS and submitting: https://build.opensuse.org/request/show/899975

a second solution is to install the needed dependencies inside the CI job (as there are not many dependencies in this case, it should not be too hard or time consuming to do). My proposal is to create another (weekly?) schedule to auto-update the container and use that container. This needs @okurz to decide, as the container lives in his home space.

Well, containers on OBS should be rebuilt automatically as soon as dependencies change. The reason to build a container within IBS is that only there do we have access to the ca.suse.de repos, which bring the SSL certificates we can use to access internal SSL-secured systems, e.g. openqa.suse.de, and right now I am not sure if those container images are triggered automatically. I reminded myself now that the container is built based on the openqa-review package from obsrepositories:/opensuse/tumbleweed, so maybe the problem is simply that no submission of a new package to openSUSE Tumbleweed was created for the new release. I am no fan of periodic schedules when we can have event-based triggers. Another alternative would be to use the "development package" from https://build.opensuse.org/package/show/home:okurz/python-openqa_review, which is rebuilt automatically on every git commit, instead of the package version within openSUSE Tumbleweed, but normally we should be fine waiting for changes to arrive in Tumbleweed.

  1. The index page is missing and automatic directory listing is disabled for security reasons. It would probably be nice to create at least a basic index file listing all artifacts; this is doable inside the pages job.

  2. If needed, the REVIEW_JOBS parser could be enhanced to allow more variables from bin/openqa-review-daily-email to be set.

Do you have a merge request showing the proposed gitlab CI config changes?

Actions #13

Updated by jbaier_cz almost 3 years ago

okurz wrote:

Well, containers on OBS should be rebuilt automatically as soon as dependencies change. The reason to build a container within IBS is that only there do we have access to the ca.suse.de repos, which bring the SSL certificates we can use to access internal SSL-secured systems, e.g. openqa.suse.de, and right now I am not sure if those container images are triggered automatically. I reminded myself now that the container is built based on the openqa-review package from obsrepositories:/opensuse/tumbleweed, so maybe the problem is simply that no submission of a new package to openSUSE Tumbleweed was created for the new release. I am no fan of periodic schedules when we can have event-based triggers. Another alternative would be to use the "development package" from https://build.opensuse.org/package/show/home:okurz/python-openqa_review, which is rebuilt automatically on every git commit, instead of the package version within openSUSE Tumbleweed, but normally we should be fine waiting for changes to arrive in Tumbleweed.

I would also prefer event-based triggering; however, I was a little confused about your container. As it is built from Tumbleweed it should be rebuilt fairly often, and I got the impression this is not the case. But it seems the missing piece of the puzzle is the not-yet-updated version of python-openqa_review inside Factory.

By the way, I am not sure if that is the final solution, but the package ca-certificates-suse from OBS can be installed and hopefully brings the same set of internal certificates.

Do you have a merge request showing the proposed gitlab CI config changes?

I would like to solve the missing index first; the MR will probably be ready by the end of this week. In the meantime, the changes (and thus the main content of the MR) can be seen by comparing my fork.

Actions #14

Updated by okurz almost 3 years ago

jbaier_cz wrote:

By the way, I am not sure if that is the final solution, but the package ca-certificates-suse from OBS can be installed and hopefully brings the same set of internal certificates.

Ah right! I forgot that this package is now in a public place as well. Good idea! Doing that in #93988

Actions #15

Updated by openqa_review almost 3 years ago

  • Due date set to 2021-06-29

Setting due date based on mean cycle time of SUSE QE Tools

Actions #16

Updated by jbaier_cz almost 3 years ago

As mentioned in https://progress.opensuse.org/issues/93988#note-3, the container from OBS is unfortunately not up to date (I think the use of a Dockerfile does not play well with event-based rebuild triggering) and the IBS-based container failed to build. We should definitely make the process smoother.

In the meantime, I can add an extra zypper up to the pipeline to make sure we have the new package installed. It would also be nice to add perl-YAML to the container to eliminate zypper calls.
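
An interim form of that workaround might look roughly like this in the job definition (package names as discussed above; whether updating the single package is enough is an assumption):

before_script:
  - zypper --non-interactive up python3-openqa_review   # pick up the release with the fixed --skip-passed
  - zypper --non-interactive in perl-YAML                # only needed until it is part of the container image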

Actions #17

Updated by okurz almost 3 years ago

I have merged https://gitlab.suse.de/openqa/openqa-review/-/merge_requests/3 so now we have the updated container image. I think it auto-updates.

What do you mean about perl-YAML? What is it needed for? If it's needed for your changes then we can include it in https://build.opensuse.org/package/show/home:okurz:container:ca/openqa-review . You could create a submit request to that package. In that case please add a comment in the Dockerfile explaining what perl-YAML is used for.

Actions #18

Updated by jbaier_cz almost 3 years ago

It is used by the CI itself to generate the pipeline configuration from the variable value (which is stored as YAML for convenient editing). I just wanted to make sure you are OK with that addition; I will create an SR.
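
For context, the kind of perl-YAML use meant here is roughly the following (illustrative only, not the actual generator code from the MR):

echo "$REVIEW_JOBS" | perl -MYAML -0777 -ne '
  my $cfg = Load($_);                      # parse the YAML stored in the CI variable
  for my $review (@{ $cfg->{reviews} }) {  # one generated child-pipeline job per configured report
    print "would emit a job for $review->{html_target_file}\n";
  }'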

Actions #19

Updated by jbaier_cz over 2 years ago

So my approach to this problem was published in this MR. Feel free to comment, I already see some improvements myself.

Actions #20

Updated by okurz over 2 years ago

Actions #21

Updated by okurz over 2 years ago

I approved https://gitlab.suse.de/openqa/openqa-review/-/merge_requests/4 . cdywan had a minor suggestion which you can implement in there or in a follow-up.

Actions #22

Updated by jbaier_cz over 2 years ago

  • Status changed from In Progress to Feedback

Changes merged and pipeline schedule updated. Let's wait for next run to see the results.

Actions #23

Updated by livdywan over 2 years ago

  • Due date changed from 2021-06-29 to 2021-07-02

What's the final URL where the results should be found? I'm unsure where to check now

Actions #24

Updated by jbaier_cz over 2 years ago

Results are exported via GitLab Pages at the auto-generated URL https://<group>.io.suse.de/<project>/; in this case that is https://openqa.io.suse.de/openqa-review/. Eventually it will also benefit from a custom domain as in poo#91356.

Actions #25

Updated by jbaier_cz over 2 years ago

I created https://gitlab.suse.de/openqa/openqa-review/-/merge_requests/6 to at least mask the issues and to generate a page with something.

Actions #26

Updated by jbaier_cz over 2 years ago

  • Related to action #95033: openqa-review fails upon trying to access openqa with no-urlencoded addresses added
Actions #27

Updated by jbaier_cz over 2 years ago

  • Status changed from Feedback to Resolved

As the status pages are now successfully generated and the remaining issues with openqa-review are already tracked elsewhere, I consider this closed.

Actions #28

Updated by okurz over 2 years ago

  • Status changed from Resolved to Feedback

Nearly there but please check the ACs of the ticket :)

Actions #29

Updated by jbaier_cz over 2 years ago

Ah right, sorry. I didn't realize we also have that address there (in the wiki) and not only in the resulting e-mails.

Actions #30

Updated by livdywan over 2 years ago

I guess this is blocked on #93943 before https://openqa.io.suse.de/openqa-review/ has all the variants currently in the wiki, although I wonder if in general it might be a good idea to simply link to the overview.

Actions #31

Updated by okurz over 2 years ago

cdywan wrote:

I guess this is blocked on #93943 before https://openqa.io.suse.de/openqa-review/ has all the variants currently in the wiki

Yes. You can call it blocked because I don't want to tell people: "Hey, we have a better replacement – it's an empty page!" :D

  • although I wonder if in general it might be a good idea to simply link to the overview

Previously people preferred to have these short links for specific needs. With our awesome url shortener we can still have these.

Actions #32

Updated by livdywan over 2 years ago

Another question: the reports don't seem to match up - do we just want to replace what's in the wiki? Or was something else supposed to be configured here? I'd like to wrap it up as Jan won't be available for a while.

Actions #33

Updated by livdywan over 2 years ago

  • Due date changed from 2021-07-02 to 2021-07-07
Actions #34

Updated by livdywan over 2 years ago

  • Assignee changed from jbaier_cz to livdywan
Actions #35

Updated by okurz over 2 years ago

cdywan wrote:

Another question: the reports don't seem to match up - do we just want to replace what's in the wiki?

All the reports from https://progress.opensuse.org/projects/qa/wiki/Wiki#Test-results-overview should be generated within gitlab CI, and the wiki should reference the corresponding documents under https://openqa.io.suse.de/openqa-review/

Or was something else supposed to be configured here?

Right now on https://openqa.io.suse.de/openqa-review/ I can see that the following reports were generated successfully:

  • openqa_opensuse_org_status
  • openqa_sle15_functional_status
  • openqa_sle15_status
  • openqa_sle15_yast_status
  • openqa_suse_de_skip_passed
  • openqa_suse_de_status

That should be good enough for a start. You can mention in the wiki section how new reports can be added, i.e. via the pipeline variable "REVIEW_JOBS" within https://gitlab.suse.de/openqa/openqa-review/-/pipeline_schedules/41/edit
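
For example, adding one more report would just mean appending another entry to that variable, along these lines (file name and job-group filter invented purely as an illustration):

  - html_target_file: openqa_sle15_kernel_status.html
    recv: okurz@suse.de
    extra_args: --job-groups ^SLE.*15.*(Kernel)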

I'd like to wrap it up as Jan won't be available for a while.

appreciated

Actions #36

Updated by livdywan over 2 years ago

  • Due date changed from 2021-07-07 to 2021-07-09

So it turned out one of the reports was (is) actually missing, and the logs kinda look like it wasn't run in the first place, and it wouldn't even show the commands for the real step:

$ git clone --depth 1 https://github.com/os-autoinst/openqa_review.git .
Cloning into '.'...
Uploading artifacts for successful job
Uploading artifacts...
WARNING: openqa_suse_de_status.html: no matching files 
ERROR: No files to upload                          
Cleaning up file based variables
Job succeeded

So I re-triggered, and re-ran locally to see what would happen but I couldn't reproduce it - it almost seems like it's another GitLab-internal problem since there was no change to the pipeline or the code in the meantime. And so I decided to file SD-192168 with this and the previous upload error.

Actions #37

Updated by livdywan over 2 years ago

  • Status changed from Feedback to Resolved

cdywan wrote:

So I re-triggered, and re-ran locally to see what would happen but I couldn't reproduce it - it almost seems like it's another GitLab-internal problem since there was no change to the pipeline or the code in the meantime. And so I decided to file SD-192168 with this and the previous upload error.

I waited to triple-check this before wrapping it up. I think we're good now, and if this somehow occurs again we can open a new ticket, since this one is not about GitLab problems. Thus I have updated the links now.

There was a mention of a short-link service, but nothing is documented or commented in a persistent place, so I updated the links based on tools I have access to, i.e. verbatim URLs.

Actions #38

Updated by okurz over 2 years ago

cdywan wrote:

There was a mention of a short-link service, but nothing is documented or commented in a persistent place

How do you know that "nothing is documented", have you searched all resources within SUSE? ;)
Hint for the future: just point your browser to https://s.qa.suse.de/ or https://s.qa.suse.de/any_new_short_link_you_like_to_have_as_long_as_it_is_not_yet_used and you will understand how it works :)

Actions #39

Updated by livdywan over 2 years ago

  • Due date deleted (2021-07-09)
