action #88536 (closed)

openQA Project - coordination #102906: [saga][epic] Increased stability of tests with less "known failures", known incompletes handled automatically within openQA

openQA Project - coordination #88229: [epic] Prevent unintended test coverage decrease

Find out differences in openQA test coverage with metabase

Added by hurhaj almost 4 years ago. Updated over 3 years ago.

Status: Resolved
Priority: Normal
Assignee:
Target version:
Start date: 2021-02-12
Due date:
% Done: 0%
Estimated time:

Description

User story

As a QE engineer, I would like to have a way of comparing the utilization of openQA's tests (*.pm) between our products, mostly before and after their release. With ~1500 tests available at this moment, it is very difficult to find out whether a particular test is being used to its full potential, i.e. whether it is running on all products, code streams and architectures that it could and should run on. That brings a risk of needlessly lowering the test coverage that is already available to us.

Acceptance criteria

  • AC1: Possibility to filter the data, to easily compare coverage between any chosen products and code streams.

Tasks

Further details

As per previous discussions, Metabase seems to be the best way to realize this.


Related issues: 1 (0 open, 1 closed)

Related to QA - action #88127: [tools][qem] Test coverage DB for maintenance updates (Closed, jbaier_cz, 2021-02-08)

Actions #2

Updated by okurz almost 4 years ago

  • Related to action #88127: [tools][qem] Test coverage DB for maintenance updates added
Actions #3

Updated by okurz almost 4 years ago

Actions #4

Updated by okurz almost 4 years ago

  • Status changed from New to Feedback
  • Assignee set to okurz
  • Target version set to Ready

@hurhaj thanks for the ticket. I think we can merge the content of this ticket into #88229 as in my understanding the goal is the same, do you agree?

Actions #5

Updated by hurhaj almost 4 years ago

okurz wrote:

@hurhaj thanks for the ticket. I think we can merge the content of this ticket into #88229 as in my understanding the goal is the same, do you agree?

Yes, I believe so too. If the acceptance criteria from this poo are met, we can merge for sure. Thanks

Actions #6

Updated by okurz over 3 years ago

  • Subject changed from Comparable test coverage to Find out differences in openQA test coverage with metabase
  • Description updated (diff)
  • Status changed from Feedback to Workable
  • Assignee changed from okurz to hurhaj
  • Target version changed from Ready to future
  • Parent task set to #88229

@hurhaj I have moved some content into the epic #88229 and set that ticket as "parent" of this one. I then adapted this ticket to be specific about metabase. This should make it easier to follow up on this one specific aspect and leave everything more generic for the parent epic. I would assume that you can solve some parts, or all of it, yourself by getting familiar with metabase. And if you find data missing, we could look into providing such data to metabase. Ok?

Actions #7

Updated by hurhaj over 3 years ago

okurz wrote:

@hurhaj I have moved some content into the epic #88229 and set that ticket as "parent" of this one. I then adapted this ticket to be specific about metabase. This should make it easier to follow up on this one specific aspect and leave everything more generic for the parent epic. I would assume that you can solve some parts, or all of it, yourself by getting familiar with metabase. And if you find data missing, we could look into providing such data to metabase. Ok?

Yes, if the data is available in metabase, filtering/comparing/etc. can easily be done by anyone. Thanks for merging.

Actions #8

Updated by hurhaj over 3 years ago

The two available tables in metabase provide some info, but not quite what we need: job_modules contains "script", and jobs contains "distri" and "version". What is needed is a table that looks something like:
| grub_test.pm | sle | 15-sp1, 15, ... |
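The kind of join that produces such a table can be sketched as follows. This is a minimal illustration using an in-memory SQLite database with a hypothetical, heavily simplified schema (the real openQA tables have many more columns, and the id/job_id linkage is assumed here); it only shows the shape of the query, not the production setup:

```python
import sqlite3

# Hypothetical, simplified versions of the two metabase tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE jobs (id INTEGER PRIMARY KEY, distri TEXT, version TEXT);
CREATE TABLE job_modules (job_id INTEGER, script TEXT);
INSERT INTO jobs VALUES
    (1, 'sle', '15-SP1'),
    (2, 'sle', '15'),
    (3, 'opensuse', 'Tumbleweed');
INSERT INTO job_modules VALUES
    (1, 'tests/boot/grub_test.pm'),
    (2, 'tests/boot/grub_test.pm'),
    (3, 'tests/console/zypper_up.pm');
""")

# One row per (module, distri), with all versions the module ran on.
rows = conn.execute("""
    SELECT jm.script, j.distri, GROUP_CONCAT(DISTINCT j.version) AS versions
    FROM job_modules jm
    JOIN jobs j ON j.id = jm.job_id
    GROUP BY jm.script, j.distri
    ORDER BY jm.script
""").fetchall()

for script, distri, versions in rows:
    print(script, distri, versions)
```

Gaps in coverage then show up as versions missing from the aggregated column for a given module.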

Actions #9

Updated by hurhaj over 3 years ago

  • Status changed from Workable to Resolved

OK, I created:
https://maintenance-statistics.dyn.cloud.suse.de/question/287

which meets my needs for comparing coverage of selected codestreams.

Actions #10

Updated by okurz over 3 years ago

It is great to hear that you have found a solution. I would be very interested to see how it looks, but unfortunately I am missing permissions to access the above URL. Can you change the permissions? I would also be interested to hear more about it, and to look into how we can extend from that and maybe help with the other tickets we have.

Actions #11

Updated by hurhaj over 3 years ago

Since I did not find any permissions, I moved it to https://maintenance-statistics.dyn.cloud.suse.de/collection/82
It's just a joined table of the already existing job_modules and jobs, with the columns that I need selected. For this poo it's enough.

As for other poos, I welcome anyone to add whichever columns they need and save it. But the problem is that it tends to time out pretty often, especially when I try to filter it, so we will very likely have to contact the admins about it. Another thing is that the resulting JSON file is massive, about a million lines, so filtering is needed (coming back to the timeout issue) to work sensibly with the available data.
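One way to cope with such a large export is to filter the rows down to the codestreams of interest before doing any comparison work. A minimal sketch, assuming the export is a JSON array of row objects with hypothetical keys "script", "distri" and "version" (the real export's field names may differ):

```python
import json

# Tiny stand-in for a metabase JSON export; the real one is far larger.
export = json.loads("""
[
  {"script": "tests/boot/grub_test.pm", "distri": "sle", "version": "15-SP1"},
  {"script": "tests/boot/grub_test.pm", "distri": "sle", "version": "12-SP5"},
  {"script": "tests/console/zypper_up.pm", "distri": "opensuse", "version": "Tumbleweed"}
]
""")

# Codestreams to compare (assumption for this example).
wanted_versions = {"15-SP1", "15-SP2"}

filtered = [row for row in export if row["version"] in wanted_versions]
covered = {row["script"] for row in filtered}
print(sorted(covered))
```

With the export cut down like this, the per-codestream coverage sets can be compared directly (e.g. via set difference) without fighting the full million-line file.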

Actions #12

Updated by maritawerner over 3 years ago

As QE Project Manager I would not only be interested in all products, code streams and architectures of a test case, but also in the functional area of the specific product, e.g. I would be interested in the area of

  • yast
  • virtualization
  • security
  • Kernel
  • Desktop
  • Network

Pretty similar to what we have in the openQA dashboard for product QA.

Actions #13

Updated by okurz over 3 years ago

Well, with the job group name all the relevant data should be available in metabase already as well. "metabase" is meant to be easily consumable so you might be able to answer your questions from metabase itself similar to what hurhaj has explained above.

Actions #14

Updated by okurz over 3 years ago

A more recent example of what can be done with metabase:

[screenshot: which_openQA_testsuites_and_test_modules_fail_the_most_on_OSD_metabase]

For me the queries execute rather quickly; I mention this because it was once suggested that metabase on openQA data would be slow.

