action #88536

openQA Project - coordination #39719: [saga][epic] Detection of "known failures" for stable tests, easy test results review and easy tracking of known issues

openQA Project - coordination #88229: [epic] Prevent unintended test coverage decrease

Find out differences in openQA test coverage with metabase

Added by hurhaj 4 months ago. Updated 3 months ago.

Status:
Resolved
Priority:
Normal
Assignee:
Target version:
Start date:
2021-02-12
Due date:
% Done:

0%

Estimated time:

Description

User story

As a QE engineer, I would like to have a way of comparing openQA test (*.pm) utilization between our products, mostly before and after their release. With ~1500 tests available at this moment, it is very difficult to find out whether a particular test is being used to its full potential, i.e. whether it is running on all products, code streams and architectures that it could and should run on. That brings a risk of needlessly lowering the test coverage that is already available to us.

Acceptance criteria

  • AC1: Possibility to filter the data, to easily compare coverage between any products and code streams.

Tasks

Further details

As per previous discussions, Metabase seems to be the best way to realize this.


Related issues

Related to QA - action #88127: [tools][qem] Test coverage DB for maintenance updates (Resolved, 2021-02-08)

History

#2 Updated by okurz 4 months ago

  • Related to action #88127: [tools][qem] Test coverage DB for maintenance updates added

#3 Updated by okurz 4 months ago

#4 Updated by okurz 4 months ago

  • Status changed from New to Feedback
  • Assignee set to okurz
  • Target version set to Ready

hurhaj thanks for the ticket. I think we can merge the content of this ticket into #88229 as in my understanding the goal is the same, do you agree?

#5 Updated by hurhaj 4 months ago

okurz wrote:

hurhaj thanks for the ticket. I think we can merge the content of this ticket into #88229 as in my understanding the goal is the same, do you agree?

Yes, I believe so too. If the acceptance criteria from this poo are met, we can merge for sure. Thanks

#6 Updated by okurz 3 months ago

  • Subject changed from Comparable test coverage to Find out differences in openQA test coverage with metabase
  • Description updated (diff)
  • Status changed from Feedback to Workable
  • Assignee changed from okurz to hurhaj
  • Target version changed from Ready to future
  • Parent task set to #88229

hurhaj I have moved some content into the epic #88229 and set that ticket as "parent" to this one. I then adapted this ticket to be specific about metabase. This way it should be easier to follow up on this one specific aspect and leave everything more generic for the parent epic. So I would assume that you can solve some parts, or all of it, yourself by getting familiar with metabase. And if you find data missing, we could look into providing such data to metabase. Ok?

#7 Updated by hurhaj 3 months ago

okurz wrote:

hurhaj I have moved some content into the epic #88229 and set that ticket as "parent" to this one. I then adapted this ticket to be specific about metabase. This way it should be easier to follow up on this one specific aspect and leave everything more generic for the parent epic. So I would assume that you can solve some parts, or all of it, yourself by getting familiar with metabase. And if you find data missing, we could look into providing such data to metabase. Ok?

Yes, if the data is available in metabase, filtering/comparing/etc. can easily be done by anyone. Thanks for merging.

#8 Updated by hurhaj 3 months ago

The two available tables in metabase provide some info, but not quite what we need: job_modules contains "script" and jobs contains "distri" and "version". What is needed is a table that looks something like:
| grub_test.pm | sle | 15-sp1, 15, ... |
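The shape of that desired table can be sketched as a join over the two existing tables. This is a minimal in-memory sketch, not the actual metabase query: the column names "script", "distri" and "version" come from the comment above, but the join key (`job_id` linking job_modules to jobs) and the sample rows are assumptions for illustration.

```python
from collections import defaultdict

# Hypothetical sample rows; only the column names come from the ticket.
job_modules = [
    {"job_id": 1, "script": "tests/boot/grub_test.pm"},
    {"job_id": 2, "script": "tests/boot/grub_test.pm"},
]
jobs = {
    1: {"distri": "sle", "version": "15-SP1"},
    2: {"distri": "sle", "version": "15"},
}

def coverage(job_modules, jobs):
    """Map each test script to the distri/version combinations it ran on."""
    table = defaultdict(lambda: defaultdict(set))
    for module in job_modules:
        job = jobs.get(module["job_id"])
        if job is not None:
            table[module["script"]][job["distri"]].add(job["version"])
    return table

# Render rows in the "| script | distri | versions |" shape described above.
for script, distris in coverage(job_modules, jobs).items():
    for distri, versions in distris.items():
        print(f"| {script} | {distri} | {', '.join(sorted(versions))} |")
```

In metabase itself this would correspond to a single question joining job_modules to jobs and grouping versions per script, which is essentially what the linked question does.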

#9 Updated by hurhaj 3 months ago

  • Status changed from Workable to Resolved

OK, I created:
https://maintenance-statistics.dyn.cloud.suse.de/question/287

which meets my needs for comparing coverage of selected codestreams.

#10 Updated by okurz 3 months ago

It is great to hear that you have found a solution. I would be very interested to see how it looks, but unfortunately I am missing permissions to access the above URL. Can you change the permissions or something? I would also be interested to hear more about it, and to discuss how we can extend from there and maybe help with the other tickets we have.

#11 Updated by hurhaj 3 months ago

Since I did not find any permission settings, I moved it to https://maintenance-statistics.dyn.cloud.suse.de/collection/82
It's just a joined table of the already existing job_modules and jobs, with the columns that I need selected. For this poo it's enough.

As for other poos, I welcome anyone to add whichever columns they need and save it. The problem is that it tends to time out pretty often, especially when I try to filter it, so we will very likely have to contact the admins about it. Another thing is that the resulting JSON file is massive, about a million lines, so filtering (coming back to the time-out issue) is needed for any sensible work with the available data.
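One way around both problems is to export the question once and filter the JSON locally instead of re-filtering in metabase. A minimal sketch, assuming the export is a JSON array of row objects: the field names ("distri", "version") follow the columns discussed earlier in this ticket, but the exact export layout is an assumption.

```python
import json

def filter_rows(rows, distri, version):
    """Keep only rows matching the given distri/version codestream."""
    return [
        r for r in rows
        if r.get("distri") == distri and r.get("version") == version
    ]

# Hypothetical sample of what two exported rows might look like.
rows = [
    {"script": "grub_test.pm", "distri": "sle", "version": "15-SP1"},
    {"script": "grub_test.pm", "distri": "opensuse", "version": "15.2"},
]

# In practice the rows would come from the exported file, e.g.:
#   with open("export.json") as f:
#       rows = json.load(f)
print(filter_rows(rows, "sle", "15-SP1"))
```

A million-line JSON array is typically still loadable in one go for a one-off comparison; only if memory became an issue would a streaming parser be worth the trouble.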

#12 Updated by maritawerner 3 months ago

As QE Project Manager, I would not only be interested in all products, code streams and architectures of a test case, but also in the functional area of the specific product, e.g. the area of

  • yast
  • virtualization
  • security
  • Kernel
  • Desktop
  • Network

Pretty similar to what we have in the openQA dashboard for product QA.

#13 Updated by okurz 3 months ago

Well, with the job group name all the relevant data should already be available in metabase as well. Metabase is meant to be easily consumable, so you might be able to answer your questions from metabase itself, similar to what hurhaj has explained above.
