action #43568

closed

[sle][functional][y][aarch64][sporadic] test fails in partitioning_full_lvm - Volume Management list stays collapsed

Added by mloviska over 5 years ago. Updated about 5 years ago.

Status:
Resolved
Priority:
Normal
Assignee:
Category:
Bugs in existing tests
Target version:
SUSE QA - Milestone 22
Start date:
2018-11-08
Due date:
2019-02-26
% Done:

0%

Estimated time:
3.00 h
Difficulty:

Description

Observation

openQA test in scenario sle-12-SP4-Server-DVD-aarch64-lvm-encrypt-separate-boot@aarch64 fails in
partitioning_full_lvm

List entries are collapsed by default in sle12sp4 in the System View of the Expert Partitioner.
Possibly we are pressing the 'right' key while the Expert Partitioner is still loading data.

Reproducible

Fails since (at least) Build 0456 (current job)

Expected result

Last good: 0455 (or more recent)

Further details

Always latest result in this scenario: latest

Actions #1

Updated by mloviska over 5 years ago

  • Subject changed from [sle][functional][y] test fails in partitioning_full_lvm - Volume Management list stays collapsed to [sle][functional][y][aarch64] test fails in partitioning_full_lvm - Volume Management list stays collapsed
Actions #2

Updated by riafarov over 5 years ago

  • Due date set to 2018-12-04
  • Status changed from New to Workable
Actions #3

Updated by riafarov over 5 years ago

  • Subject changed from [sle][functional][y][aarch64] test fails in partitioning_full_lvm - Volume Management list stays collapsed to [sle][functional][y][aarch64][sporadic] test fails in partitioning_full_lvm - Volume Management list stays collapsed
Actions #4

Updated by okurz over 5 years ago

  • Target version set to Milestone 21
Actions #5

Updated by mloviska over 5 years ago

  • Status changed from Workable to In Progress
  • Assignee set to mloviska
Actions #6

Updated by riafarov over 5 years ago

  • Estimated time set to 3.00 h
Actions #7

Updated by mloviska over 5 years ago

  • Estimated time deleted (3.00 h)

As I suspected, we are pressing keys while the partitioner is still in the middle of loading the volume management data.

Volume Management
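The race described above (keystrokes sent while the partitioner is still redrawing) is what openQA's `wait_still_screen` is meant to guard against: only type once the screen has stopped changing. A minimal, language-neutral sketch of that idea (Python here; `grab_frame` is a hypothetical stand-in for the worker's screen capture, not a testapi function):

```python
import time

def wait_still_screen(grab_frame, stilltime=3.0, timeout=4.0, poll=0.1):
    """Return True once the screen has not changed for `stilltime`
    seconds in a row, or False if `timeout` elapses first."""
    deadline = time.monotonic() + timeout
    last = grab_frame()
    still_since = time.monotonic()
    while time.monotonic() < deadline:
        frame = grab_frame()
        if frame != last:           # screen changed: restart the clock
            last = frame
            still_since = time.monotonic()
        elif time.monotonic() - still_since >= stilltime:
            return True             # stable long enough: safe to type
        time.sleep(poll)
    return False
```

As comment #34 below shows, a still screen alone is not sufficient here, since the UI can look stable while the widget is still unresponsive.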

Actions #8

Updated by mloviska over 5 years ago

  • Estimated time set to 3.00 h
Actions #10

Updated by riafarov over 5 years ago

Could you please check if this has broken it: https://openqa.suse.de/tests/2279361#step/partitioning_raid/11
It would be weird, but it seems that the change exposed some issue we have had in our code.

Actions #11

Updated by riafarov over 5 years ago

  • Status changed from Resolved to Workable
Actions #14

Updated by mloviska over 5 years ago

There is a change in the buttons field again: Add Pa(r)tition vs. separate buttons. Can I fix it as part of this ticket?

Actions #15

Updated by mloviska over 5 years ago

Another change: the overview vs. the overview with buttons.

Actions #16

Updated by okurz over 5 years ago

mloviska wrote:

Can I fix it as part of this ticket ?

From my point of view: sure, you never need to ask for permission, and using the same ticket instead of creating a new one with additional "buerocrazy" should be preferred, as long as we ensure we are not getting lost and still follow the plan. The idea of this ticket was to make sure the test scenario passes again or shows obvious product issues, so in the case you mentioned I think it's part of the ticket.

Actions #17

Updated by mloviska over 5 years ago

okurz wrote:

mloviska wrote:

Can I fix it as part of this ticket ?

From my point of view: sure, you never need to ask for permission, and using the same ticket instead of creating a new one with additional "buerocrazy" should be preferred, as long as we ensure we are not getting lost and still follow the plan. The idea of this ticket was to make sure the test scenario passes again or shows obvious product issues, so in the case you mentioned I think it's part of the ticket.

Thanks for the clarification. I was not sure whether there is a separate ticket or not (I could not find any). Alright, I will update partitioning_raid along with the partition_setup library.

Actions #18

Updated by mloviska over 5 years ago

There is also a difference between architectures, especially ARM vs. Intel and ppc. At the moment we have to wait and see whether this issue hits s390x as well. Otherwise it looks like a regression on ARM.

arm vs intel

Actions #19

Updated by mloviska over 5 years ago

<mloviska> Are the warnings regarding obsolete  yast2-storage-ng from OBS normal ? http://pastebin.suse.de/23564  https://build.suse.de/public/build/SUSE:SLE-15-SP1:GA/standard/aarch64/yast2-storage-ng/_log
<snwint> mloviska: installation-images hasn't been built for ages due to the mentioned dependency problem
<HuHa> mloviska: that ISO is ancient and not fit for any testing
<snwint> mloviska: I would say the aarch64 tree is not in a good shape at the moment....
<HuHa> mloviska: the project manager has to take care to get this arch built properly
<HuHa> right now it's pointless to test that ISO
<HuHa> you'll just rediscover that all the storage stack is incomplete - just like it was in mid-April

So we have a build problem on aarch64

Actions #20

Updated by okurz over 5 years ago

I see. It is good that you have found this out. At best we have an internal issue describing this problem which we can "block" this ticket on. If you are not aware of any, I recommend you create a bug on Bugzilla, mention it here, and set this ticket to "Blocked" so as not to lose time. Then we can use this bug to label the corresponding test failures as well, or just keep this "Blocked" ticket as a reference.

Actions #21

Updated by mloviska over 5 years ago

Sure, I will open the bug. Thanks!

Actions #22

Updated by mloviska over 5 years ago

<snwint> mloviska: Rudi tracked it down to Qt build failing...

https://build.suse.de/package/show/SUSE:SLE-15-SP1:GA/libqt5-qtsvg

[   60s] In file included from /usr/include/GLES3/gl31.h:39:0,
[   60s]                  from /usr/include/qt5/QtGui/qopengl.h:105,
[   60s]                  from /usr/include/qt5/QtWidgets/5.9.7/QtWidgets/private/qwidget_p.h:62,
[   60s]                  from qsvgwidget.cpp:48:
[   60s] /usr/include/GLES3/gl3platform.h:28:10: fatal error: KHR/khrplatform.h: No such file or directory
[   60s]  #include <KHR/khrplatform.h>
[   60s]           ^~~~~~~~~~~~~~~~~~~
[   60s] compilation terminated.
[   60s] make[2]: *** [Makefile:566: .obj/qsvgwidget.o] Error 1
[   60s] make[2]: *** Waiting for unfinished jobs....
[   67s] make[2]: Leaving directory '/home/abuild/rpmbuild/BUILD/qtsvg-opensource-src-5.9.7/src/svg'
[   67s] make[1]: *** [Makefile:46: sub-svg-make_first-ordered] Error 2
[   67s] make[1]: Leaving directory '/home/abuild/rpmbuild/BUILD/qtsvg-opensource-src-5.9.7/src'
[   67s] make: *** [Makefile:47: sub-src-make_first] Error 2
[   67s] error: Bad exit status from /var/tmp/rpm-tmp.Wd5IVG (%build)
[   67s] 
[   67s] 
[   67s] RPM build errors:
[   67s]     Bad exit status from /var/tmp/rpm-tmp.Wd5IVG (%build)
[   67s] 
[   67s] centriq5 failed "build libqt5-qtsvg.spec" at Sat Nov 24 10:47:55 UTC 2018.
Actions #23

Updated by mloviska over 5 years ago

  • Status changed from Workable to Blocked
Actions #24

Updated by riafarov over 5 years ago

Nice investigation. I believe there is nothing to be done on our side in the end, so let's see if we get this fixed soon. Thanks again!

Actions #25

Updated by mloviska over 5 years ago

riafarov wrote:

Nice investigation. I believe there is nothing to be done on our side in the end, so let's see if we get this fixed soon. Thanks again!

Agreed, at most we can contact PMs/RMs to let them know about the dependency issue in OBS.

Actions #26

Updated by okurz over 5 years ago

With the bug report they should be aware, right?

Actions #27

Updated by riafarov over 5 years ago

  • Due date changed from 2018-12-04 to 2018-12-18

Bug is still not fixed, we have a hope that it gets fixed in the upcoming sprint.

Actions #28

Updated by riafarov over 5 years ago

  • Due date changed from 2018-12-18 to 2019-01-22
  • Status changed from Blocked to Workable
  • Assignee deleted (mloviska)
  • Target version changed from Milestone 21 to Milestone 22

The bug is resolved, so we can work on this one now.

Actions #29

Updated by okurz over 5 years ago

  • Due date changed from 2019-01-22 to 2019-01-29

Adjusting to the corresponding sprint end date.

Actions #30

Updated by mloviska over 5 years ago

  • Status changed from Workable to In Progress
  • Assignee set to mloviska
Actions #31

Updated by mloviska over 5 years ago

Actions #32

Updated by mloviska about 5 years ago

  • Status changed from In Progress to Feedback
Actions #33

Updated by oorlov about 5 years ago

  • Due date changed from 2019-01-29 to 2019-02-12
Actions #34

Updated by mloviska about 5 years ago

  • Status changed from Feedback to In Progress
[2019-01-31T19:41:48.444 UTC] [debug] /var/lib/openqa/cache/openqa.suse.de/tests/sle/tests/installation/partitioning_full_lvm.pm:28 called partition_setup::addlv
[2019-01-31T19:41:48.444 UTC] [debug] <<< testapi::wait_still_screen(stilltime=3, timeout=4, similarity_level=47)
[2019-01-31T19:41:51.524 UTC] [debug] >>> testapi::wait_still_screen: detected same image for 3 seconds, last detected similarity is 1000000
[2019-01-31T19:41:51.525 UTC] [debug] /var/lib/openqa/cache/openqa.suse.de/tests/sle/tests/installation/partitioning_full_lvm.pm:28 called partition_setup::addlv
[2019-01-31T19:41:51.525 UTC] [debug] <<< testapi::send_key(key='right', do_wait=0)

This doesn't work; going for the send_key_until_needlematch approach instead.
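The `send_key_until_needlematch` pattern retries the keypress until the expected needle actually matches, instead of firing a fixed number of keystrokes blindly into a possibly unresponsive UI. A rough Python sketch of the retry loop (with hypothetical `send_key` and `check_screen` callables standing in for the Perl testapi functions):

```python
def send_key_until_match(send_key, check_screen, key, needle, retries=5):
    """Press `key` repeatedly until `check_screen` reports that
    `needle` matches; raise if it never does.

    Returns the number of presses that were needed."""
    for presses in range(retries):
        if check_screen(needle):    # already there? stop pressing keys
            return presses
        send_key(key)
    if check_screen(needle):        # final check after the last press
        return retries
    raise RuntimeError(
        f"needle '{needle}' did not match after {retries} presses of '{key}'")
```

The key property is that a slow-loading partitioner only costs extra retries, rather than causing lost keystrokes and a collapsed list.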

Actions #35

Updated by mloviska about 5 years ago

  • Status changed from In Progress to Feedback
Actions #36

Updated by mloviska about 5 years ago

  • Status changed from Feedback to Resolved

sle-12-SP5-Server-DVD-aarch64-Build0120-lvm-encrypt-separate-boot@aarch64

Maybe some "<sle12sp5" tests will fail on one missing needle. I guess we can solve those possible cases on the fly.

Actions #37

Updated by mloviska about 5 years ago

I am wondering how many times the test had to press the "right" key to unfold the VGs.

Actions #38

Updated by mloviska about 5 years ago

  • Status changed from Resolved to In Progress
Actions #39

Updated by riafarov about 5 years ago

  • Due date changed from 2019-02-12 to 2019-02-26
Actions #40

Updated by mloviska about 5 years ago

[2019-02-11T16:23:32.049 UTC] [debug] <<< testapi::send_key(key='alt-s', do_wait=0)
[2019-02-11T16:23:32.257 UTC] [debug] /var/lib/openqa/cache/openqa.suse.de/tests/sle/tests/installation/partitioning_full_lvm.pm:28 called partition_setup::addlv
[2019-02-11T16:23:32.258 UTC] [debug] <<< testapi::send_key(key='home', do_wait=0)
[2019-02-11T16:23:32.466 UTC] [debug] /var/lib/openqa/cache/openqa.suse.de/tests/sle/tests/installation/partitioning_full_lvm.pm:28 called partition_setup::addlv
[2019-02-11T16:23:32.467 UTC] [debug] <<< testapi::check_screen(mustmatch='volume_management_feature', timeout=1)

Actions to be performed:

  1. Activate System View
  2. Go to the top entry -> "linux"
  3. Roll down until Volume Management is reached

What happened:

  1. SUT had exited from the VG configuration, therefore Volume Management was already active in the System View
  2. System View was activated
  3. SUT pressed "home", but YaST2/SUT did not respond in the expected time
  4. SUT matched the needle from step 3 of the previous list
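The stale-match failure in step 4 (a needle left over from the previous state satisfying the check before the keypress has taken effect) can be avoided by only accepting a match on a frame that differs from the pre-keypress screen. A hypothetical sketch of that guard (Python; `send_key`, `grab_frame`, and `match_needle` are illustrative stand-ins, not testapi functions):

```python
import time

def send_key_and_verify(send_key, grab_frame, match_needle, key, needle,
                        timeout=5.0, poll=0.1):
    """Press `key`, then accept a match for `needle` only on a frame
    that differs from the pre-keypress screen, so a needle still on
    screen from the previous state cannot satisfy the check."""
    before = grab_frame()           # remember the pre-action screen
    send_key(key)
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        frame = grab_frame()
        if frame != before and match_needle(frame, needle):
            return frame            # genuinely post-action match
        time.sleep(poll)
    raise TimeoutError(f"no fresh match for '{needle}' after pressing '{key}'")
```

This trades a short extra wait for immunity against matching the screen the SUT was about to leave.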
Actions #41

Updated by mloviska about 5 years ago

  • Status changed from In Progress to Feedback

PR:

OSD VR:

I hope the job will be picked up by openqaworker-arm-2 worker

Actions #42

Updated by mloviska about 5 years ago

  • Status changed from Feedback to Resolved
Actions #43

Updated by okurz about 5 years ago

This is an autogenerated message for openQA integration by the openqa_review script:

This bug is still referenced in a failing openQA test: lvm-encrypt-separate-boot
https://openqa.suse.de/tests/2798098
