Wiki » History » Version 23

dheidler, 2019-07-02 11:16

# Test results overview

* Latest report based on openQA test results: http://s.qa.suse.de/test-status , SLE12: http://s.qa.suse.de/test-status-sle12 , SLE15: http://s.qa.suse.de/test-status-sle15

# QA SLE Functional - Team description

{{toc}}

**QSF (QA SLE Functional)** is a virtual team focusing on QA of the "functional" domain of the SUSE SLE products. The virtual team mainly comprises members of [SUSE QA SLE Nbg](https://wiki.suse.net/index.php/SUSE-Quality_Assurance/Organization/Members_and_Responsibilities#QA_SLE_NBG_Team) plus members from [SUSE QA SLE Prg](https://wiki.suse.net/index.php/SUSE-Quality_Assurance/Organization/Members_and_Responsibilities#QA_SLE_PRG_Team). The [SLE Department](https://wiki.suse.net/index.php/SUSE-Quality_Assurance/SLE_Department#QSF_.28QA_SLE_Functional.29) page describes our QA responsibilities. We focus on our automated tests running in [openQA](https://openqa.suse.de) under the job groups "Functional" and "Autoyast" for the respective products, for example [SLE 15 / Functional](https://openqa.suse.de/group_overview/110) and [SLE 15 / Autoyast](https://openqa.suse.de/group_overview/129). We back our automated tests with exploratory manual tests, especially for the product milestone builds. Additionally we take care of the corresponding openSUSE openQA tests (see also https://openqa.opensuse.org).

* long-term roadmap: http://s.qa.suse.de/qa-long-term
* overview of current openQA SLE12SP5 tests with progress ticket references: https://openqa.suse.de/tests/overview?distri=sle&version=12-SP5&groupid=139&groupid=142
* fate tickets for SLE12SP5 feature testing: http://s.qa.suse.de/qa_sle_functional_feature_tests_sle12sp5 (a report based on all tickets with a milestone before the SLE12SP5 GM); for SLE15SP1: http://s.qa.suse.de/qa_sle_functional_feature_tests_sle15sp1
* only "blocker" or "shipstopper" bugs on "interesting products": http://s.qa.suse.de/qa_sle_functional_bug_query_sle15_2 for SLE15, http://s.qa/qa_sle_bugs_sle12_2 for SLE12
* planned work is organized in the [SUSE QA](https://progress.opensuse.org/projects/suseqa) project (which is not public)

# Test plan
When looking for coverage of certain components or use cases, keep the [openQA glossary](http://open.qa/docs/#concept) in mind. It is important to understand that a "test in openQA" could be a scenario, for example a "textmode installation run"; a combined multi-machine scenario, for example "a remote ssh based installation using X-forwarding"; or a test module, for example "vim", which checks that the vim editor is correctly installed, renders correctly and provides basic functionality. You are welcome to contact any member of the team for further clarification.

In detail, the following areas are tested as part of "SLE functional":

* different hardware setups (UEFI, ACPI)
* support for localization
* openSUSE: virtualization - some "virtualization" tests are active on o3, with a reduced set compared to the SLE coverage (on behalf of QA SLE virtualization due to team capacity constraints, clarified in the QA SLE coordination meeting 2018-03-28)
* openSUSE: migration - comparable to "virtualization", a reduced set compared to the SLE coverage is active on o3 (on behalf of QA SLE migration due to team capacity constraints, clarified in the QA SLE coordination meeting 2018-04)

## QSF-y
List of the tested areas:

* Installation testing
* RAID (levels 0, 1, 10, 5, 6)
* USBinstall
* allpatterns
* autoyast (installer features, profile features, reinstall, tftp, …)
* filesystems (btrfs, ext4, xfs; btrfs features: snapper, qgroups, balancing, send-receive)
* lvm (+RAID, resize, full-encrypt, cryptlvm, activate existing)
* dud_sdk
* YaST configuration modules including services
* default installation (gnome installation)
* partition layouts (GPT)
* medium checks (rescue system, mediacheck with integrity, memtest)
* Addon, extension and module selection and registration (different repository sources)
* Basic install and functionality tests of SLE modules, for example public cloud module, etc.
* registration of installed system
* minimal server installation
* SMT (only installation)
* remote installation (VNC, SSH, SSH+X-Forwarding)
* PXE-Boot
* different system roles (KVM, XEN)
* installer self-update
* installation system features (activation of existing volumes, hostname setting, SSH key import, etc.)
* special network/disk devices (zfcp, nvme, multipath, iSCSI)
* the SLE product ["RT"](https://progress.opensuse.org/issues/43199)

Detailed test plan for SLES and RT can be found here: [SLES_Integration_Level_Testplan.md](https://gitlab.suse.de/riafarov/qa-sle-functional-y/blob/master/SLES_Integration_Level_Testplan.md)

* Latest report based on openQA test results SLE12: http://s.qa.suse.de/test-status-sle12-yast , SLE15: http://s.qa.suse.de/test-status-sle15-yast

## QSF-u

"Testing is the future, and the future starts with you"

* basic operations (firefox, zypper, logout/reboot/shutdown)
* boot_to_snapshot
* functional application tests (kdump, gpg, ipv6, java, git, openssl, openvswitch, VNC)
* NIS (server, client)
* toolchain (development module)
* systemd
* "transactional-updates" as part of the corresponding SLE server role, not CaaSP

* Latest report based on openQA test results SLE12: http://s.qa.suse.de/test-status-sle12-functional , SLE15: http://s.qa.suse.de/test-status-sle15-functional

## Explicitly not covered by QSF

* quarterly updated media: expected to be covered by Maintenance + QAM

### What we do
We collected opinions, personal experiences and preferences, starting with the following four topics: what our fun tasks are ("new tests", "collaborate", "do it right"), which parts are annoying ("old & sporadic issues"), what we think is expected from QSF-u ("be quick", "keep stuff running", "assess quality"), and what we should definitely keep doing to prevent stakeholders from becoming disappointed ("build validation", "communication & support").

#### How we work on our backlog

* no "due date"
* we pick up tickets that have not been previously discussed
* more flexible choice
* WIP limits:
  * global limit of 10 tickets "In Progress"
* target numbers ("guideline", "should be") per priority:
  1. New, untriaged: 0
  2. Workable: 40
  3. New, assigned to [u]: ideally less than 200 (should not stop you from triaging)
* SLAs for priority tickets - how do we ensure that more urgent tickets are worked on?
  * time until "taken": <1d for immediate (looking daily), 2-3d for urgent
  * first goal is "urgency removal": <1d for immediate, 1w for urgent
* our current "cycle time" is 1h - 1y (maximum, with interruptions)
* everybody should set priority + milestone in obvious cases, e.g. new reproducible test failures in multiple critical scenarios; in the general case the PO decides

#### How we like to choose our battles
We self-assessed our tasks on a scale from "administrative" to "creative" and found, in descending order of "administrative": daily test review (very "administrative"), ticket triaging, milestone validation, code review, creating needles, infrastructure issues, fixing and cleaning up tests, finding bugs while fixing failing tests, finding bugs while designing new tests, and writing new automated tests (very "creative"). We found that we appreciate it when our work has a fair share of both sides; a good ratio is probably 60% creative plus 40% administrative tasks. Both types have their advantages and we should try to keep a healthy balance.
#### What "product(s)" do we (really) *care* about?

Brainstorming results:

* openSUSE Krypton -> a good example of something we only remotely care about, or not at all, even though we see the connection points: e.g. testing Plasma changes early before they reach TW or Leap (operating systems we rely on), or SLE+PackageHub, from which SUSE receives no direct revenue but an indirect benefit. Should be "community only", though that includes members from QSF
* openQA -> (like OBS) helps to provide ROI for SUSE
* SLE(S) (in development versions)
* Tumbleweed
* Leap, because we use it
* SLES HA
* SLE migration
* os-autoinst-distri-opensuse+backend+needles

From this list, strictly speaking, no "product" gives us direct revenue; however, SLE(S) (as well as SLES HA and SLE migration) is the best example of a direct connection to revenue (based on SLE subscriptions). A poll in the team revealed that 3 persons see "SLE(S)" as our main product and 3 see "os-autoinst-distri-opensuse+backend+needles" as the main product. We mostly agreed, however, that we cannot *own* a product like "SLE", because that product is largely not under our control.
Visualizing "cost of testing" vs. "risk of business impact" showed that the two metrics are inversely related: on a range from "upstream source code" over "package self-tests", "openSUSE Factory staging" and "Tumbleweed" to "SLE", we consider SLE to carry the highest business risk, which therefore defines our priority; however, testing at the upstream source level is considered most effective at preventing the higher cost of bugs found late. Our conclusion is that we must ensure that the high-risk SLE base has its quality assured while supporting a quality assurance process as early as possible in the development process. Package self-tests as well as the openQA staging tests are seen as useful approaches in that direction, as are "domain specific specialist QA engineers" working closely together with the corresponding in-house development parties.
# Documentation
This documentation should only be of interest to the QA SLE Functional team. If you find that some of the following topics are interesting for other people, please extract those topics to another wiki section.
### QA SLE functional Dashboards
In room 3.2.15 of the Nuremberg office there are two dedicated laptops, each with a monitor attached, showing a selected overview of openQA test results with important builds from SLE and openSUSE.
These laptops are configured with a root account with the default password for production machines. First points of contact: [slindomansilla@suse.com](mailto:slindomansilla@suse.com), [okurz@suse.de](mailto:okurz@suse.de)

* `dashboard-osd-3215.suse.de`: Showing current view of openqa.suse.de filtered for some job group results, e.g. "Functional"
* `dashboard-o3-3215.suse.de`: Showing current view of openqa.opensuse.org filtered for some job group results which we took responsibility to review and are mostly interested in


#### dashboard-osd-3215

* OS: openSUSE Tumbleweed
* Services: ssh, mosh, vnc, x2x
* Users:
  * root
  * dashboard
* VNC: `vncviewer dashboard-osd-3215`
* X2X: `ssh -XC dashboard@dashboard-osd-3215 x2x -west -to :0.0`
  * (attaches the dashboard monitor as an extra display to the left of your screens; then move the mouse over and the attached X11 server will capture mouse and keyboard)

#### Content of /home/dashboard/.xinitrc

```
#
# Source common code shared between the
# X session and X init scripts
#
. /etc/X11/xinit/xinitrc.common

# Disable DPMS and screen blanking so the dashboard stays visible
xset -dpms
xset s off
xset s noblank
[...]
#
# Add your own lines here...
#
$HOME/bin/osd_dashboard &
```


#### Content of /home/dashboard/bin/osd_dashboard

```
#!/bin/bash

# Hide the mouse cursor when idle
DISPLAY=:0 unclutter &

# Disable DPMS and screen blanking so the dashboard stays visible
DISPLAY=:0 xset -dpms
DISPLAY=:0 xset s off
DISPLAY=:0 xset s noblank

# Allow overriding the URL via the environment, otherwise default to the
# SLE 15 Functional/Autoyast job group overview in fullscreen mode
url="${url:-"https://openqa.suse.de/?group=SLE+15+%2F+%28Functional%7CAutoyast%29&default_expanded=1&limit_builds=3&time_limit_days=14&show_tags=1&fullscreen=1#"}"
DISPLAY=:0 chromium --kiosk "$url"
```
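The `url="${url:-…}"` line relies on bash default-value parameter expansion, which is why the kiosk URL can be overridden simply by exporting `url` before starting the script. A minimal standalone illustration of the pattern (the variable names below are invented for the demo):

```shell
# "${var:-default}" keeps an existing non-empty value of var
# and falls back to the default only when var is unset or empty.
demo="${demo_unset_variable:-fallback}"
echo "$demo"

demo_set="preset"
demo2="${demo_set:-fallback}"
echo "$demo2"
```

For example, `url=https://openqa.opensuse.org/ ~/bin/osd_dashboard` would point the kiosk at o3 instead of the default osd view.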

#### Cron job:

```
Min     H       DoM     Mo      DoW     Command
*       *       *       *       *       /home/dashboard/bin/reload_chromium
```

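The table above corresponds to a single crontab line that triggers the reload script every minute. A sketch of how such an entry could be previewed before installing it (this helper is an illustration, not part of the existing dashboard setup):

```shell
# The cron entry from the table above, in crontab syntax:
# all five time fields are "*", i.e. run every minute.
entry='* * * * * /home/dashboard/bin/reload_chromium'

# Preview what the combined crontab would look like; the existing
# crontab listing is optional (ignored if there is none).
{ crontab -l 2>/dev/null; echo "$entry"; }
```

Piping that combined output into `crontab -` would install it for the invoking user.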
#### Content of /home/dashboard/bin/reload_chromium

```
#!/bin/bash

# Keep the screen from blanking
DISPLAY=:0 xset -dpms
DISPLAY=:0 xset s off
DISPLAY=:0 xset s noblank

# Focus the Chromium window, send F5 to reload the dashboard page,
# then re-activate the currently active window
DISPLAY=:0 xdotool windowactivate $(DISPLAY=:0 xdotool search --class Chromium)
DISPLAY=:0 xdotool key F5
DISPLAY=:0 xdotool windowactivate $(DISPLAY=:0 xdotool getactivewindow)
```

#### Issues:

* *When the screen shows a different part of the web page*
  * a simple mouse scroll through VNC or x2x may suffice.
* *When the displayed builds freeze without showing a new build*, it usually means that the browser displaying the info on the screen (chromium, started by `osd_dashboard`) has crashed.
  * you can try to restart it this way:
    * `ps aux | grep chromium`
    * `kill $pid`
    * `/home/dashboard/bin/osd_dashboard`
  * If this also doesn't work, restart the machine.
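The manual restart steps above could be collected into a small helper; a sketch only — the helper itself and its `pkill`-based process matching are assumptions, not part of the existing dashboard setup:

```shell
# Hypothetical restart helper; process name and launcher path
# can be overridden via the first and second argument.
proc="${1:-chromium}"
launcher="${2:-/home/dashboard/bin/osd_dashboard}"

# Terminate any running browser instance; ignore "no process matched".
pkill "$proc" 2>/dev/null || true

# Give the process a moment to exit, then relaunch the dashboard
# in the background, detached from this shell.
sleep 1
nohup "$launcher" >/dev/null 2>&1 &
```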