Wiki » History » Version 1

mgriessmeier, 2018-11-19 13:18

# QA SLE Functional - Team description

'''QSF (QA SLE Functional)''' is a virtual team focusing on QA of the "functional" domain of the SUSE SLE products. The virtual team mainly comprises members of [SUSE QA SLE Nbg](https://wiki.microfocus.net/index.php/SUSE-Quality_Assurance/QA_SLE_Functional) including members from [[SUSE-Quality_Assurance/Organization/Members_and_Responsibilities#QA_SLE_PRG_Team|SUSE QA SLE Prg]]. The [[SUSE-Quality_Assurance/SLE_Department#Functional|SLE Department]] page describes our QA responsibilities. We focus on our automatic tests running in [https://openqa.suse.de openQA] under the job groups "Functional" and "Autoyast" for the respective products, for example [https://openqa.suse.de/group_overview/110 SLE 15 / Functional] and [https://openqa.suse.de/group_overview/129 SLE 15 / Autoyast]. We back our automatic tests with exploratory manual tests, especially for the product milestone builds. Additionally, we care about the corresponding openSUSE openQA tests (see also https://openqa.opensuse.org).
* long-term roadmap: http://s.qa.suse.de/qa-long-term
* overview of current openQA SLE12SP4 tests with progress ticket references: https://openqa.suse.de/tests/overview?distri=sle&version=12-SP4&groupid=139&groupid=142
* fate tickets for feature testing: http://s.qa.suse.de/qa_sle_functional_feature_tests_sle12sp4 for SLE12SP4 (a report based on all tickets with a milestone before SLE12SP4 GM), http://s.qa.suse.de/qa_sle_functional_feature_tests_sle15sp1 for SLE15SP1
* only "blocker" or "shipstopper" bugs on "interesting products": http://s.qa.suse.de/qa_sle_functional_bug_query_sle15_2 for SLE15, http://s.qa/qa_sle_bugs_sle12 for SLE12
* latest reports based on openQA test results: http://s.qa.suse.de/test-status , SLE12: http://s.qa.suse.de/test-status-sle12 , SLE15: http://s.qa.suse.de/test-status-sle15
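The openQA overviews above can also be queried from scripts via openQA's JSON API (`/api/v1/jobs/overview` mirrors the `/tests/overview` web page). A minimal sketch; the helper name `overview_url` and the `jq` filter are our own additions, and the group id must match your query:

```shell
#!/bin/bash
# Build an openQA job-overview API URL from host, distri, version and group id.
# The /api/v1/jobs/overview route returns the same jobs as /tests/overview.
overview_url() {
    local host=$1 distri=$2 version=$3 groupid=$4
    printf '%s/api/v1/jobs/overview?distri=%s&version=%s&groupid=%s\n' \
        "$host" "$distri" "$version" "$groupid"
}

# Example (needs network access, hence commented out):
# curl -s "$(overview_url https://openqa.suse.de sle 12-SP4 139)" | jq -r '.[].name'
overview_url https://openqa.suse.de sle 12-SP4 139
```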

# Test plan

When looking for coverage of certain components or use cases, keep the [http://open.qa/docs/#concepts openQA glossary] in mind. It is important to understand that a "test in openQA" can be a scenario, for example a "textmode installation run", a combined multi-machine scenario, for example "a remote ssh based installation using X-forwarding", or a test module, for example "vim", which checks that the vim editor is correctly installed, renders correctly and provides basic functionality. You are welcome to contact any member of the team to ask for clarification about this.

In detail, the following areas are tested as part of "SLE functional":

* different hardware setups (UEFI, ACPI)
* support for localization

## QSF-y

* Installation testing
* RAID (levels 0, 1, 10, 5, 6)
* USBinstall
* allpatterns
* autoyast (installer features, profile features, reinstall, tftp, …)
* filesystems (btrfs, ext4, xfs; btrfs features: snapper, qgroups, balancing, send-receive)
* lvm (+RAID, resize, full-encrypt, cryptlvm, activate existing)
* dud_sdk
* YaST configuration modules including services
* default installation (gnome installation)
* partition layouts (GPT)
* medium checks (rescue system, mediacheck with integrity, memtest)
* Addon, extension and module selection and registration (different repository sources)
* Basic install and functionality tests of SLE modules, for example public cloud module, etc.
* registration of installed system
* minimal server installation
* SMT (only installation)
* remote installation (VNC, SSH, SSH+X-Forwarding)
* PXE-Boot
* different system roles (KVM, XEN)
* installer self-update
* installation system features (activation of existing volumes, hostname setting, SSH key import, etc.)
* special network/disk devices (zfcp, nvme, multipath, iSCSI)

## QSF-u

* basic operations (firefox, zypper, logout/reboot/shutdown)
* boot_to_snapshot
* functional application tests (kdump, gpg, ipv6, java, git, openssl, openvswitch, VNC)
* NIS (server, client)
* toolchain (development module)
* systemd

### What we do

We collected opinions, personal experiences and preferences starting with the following four topics: what are fun tasks ("new tests", "collaborate", "do it right"), what parts are annoying ("old & sporadic issues"), what we think is expected from QSF-u ("be quick", "keep stuff running", "assess quality") and what we should definitely keep doing to prevent stakeholders from becoming disappointed ("build validation", "communication & support").

#### How we work on our backlog

* no "due date" +1
 * we pick up tickets that have not been previously discussed
 * more flexible choice
 * more "blocked" tickets for 
 * proposal: no global WIP limit -> 2 p.p. are too low? min 1? -> decision: global limit of 10 tickets "In Progress"
 * how to ensure that we work on the tickets which are most urgent?

* SLAs for priority tickets
 * "taken": <1d: immediate -> looking daily
 * 2-3d: urgent
 * first goal is "urgency removal": <1d: immediate, 1w: urgent

* our current "cycle time" is 1h - 1y (maximum, with interruptions)

#### How we like to choose our battles

We self-assessed our tasks on a scale from "administrative" to "creative" and found the following, in descending order: daily test review (very "administrative"), ticket triaging, milestone validation, code review, needle creation, infrastructure issues, fixing and cleaning up tests, finding bugs while fixing failing tests, finding bugs while designing new tests, new automated tests (very "creative"). We then found that we appreciate our work having a fair share of both sides; a good ratio is probably 60% creative plus 40% administrative tasks. Both types have their advantages and we should try to keep a healthy balance.


#### What "product(s)" do we (really) *care* about?

Brainstorming results:

* openSUSE Krypton -> a good example of something that we care about only remotely or not at all, even though we see the connection points, e.g. testing Plasma changes early before they reach TW or Leap as operating systems we rely on, or SLE+PackageHub, from which SUSE receives no direct revenue but an indirect benefit. Should be "community only", although that community includes members from QSF
* openQA -> (like OBS), helps to provide ROI for SUSE
* SLE(S) (in development versions)
* Tumbleweed
* Leap, because we use it
* SLES HA
* SLE migration
* os-autoinst-distri-opensuse+backend+needles

From this list, strictly speaking, no "product" gives us direct revenue; however, SLE(S) (as well as SLES HA and SLE migration) is the best example of a direct connection to revenue (based on SLE subscriptions). A poll in the team revealed that 3 persons see "SLE(S)" as our main product and 3 see "os-autoinst-distri-opensuse+backend+needles" as the main product. We mainly agreed, however, that we cannot *own* a product like "SLE" because that product is largely not under our control.

Visualizing "cost of testing" vs. "risk of business impact" showed that both metrics have an inverse dependency: on a range from "upstream source code" over "package self-tests", "openSUSE Factory staging" and "Tumbleweed" to "SLE", we consider SLE to have the highest business risk attached, which therefore defines our priority; however, testing at the upstream source level is considered most effective to prevent the higher cost of late bugs or issues. Our conclusion is that we must ensure that the high-risk SLE base has its quality assured while supporting a quality assurance process as early as possible in the development process. Package self-tests as well as the openQA staging tests are seen as useful approaches in that direction, as are "domain-specific specialist QA engineers" working closely together with the corresponding in-house development parties.


## TODO

* "transactional updates": request for SLE15SP1, not covered by QSF until decided differently, e.g. within QA SLE in cooperation with "QA SLE caasp"
* quarterly updated media: request for SLE15SP1, not covered by QSF until decided differently. Expected to be covered by Maintenance + QAM


# Documentation

This documentation should only be interesting for the QA SLE functional team. If you find that some of the following topics are interesting for other people, please extract those topics to another wiki section.

### QA SLE functional Dashboards

In room 3.2.15 of the Nuremberg office there are two dedicated laptops, each with a monitor attached, showing a selected overview of openQA test results with important builds from SLE and openSUSE.
These laptops are configured with a root account with the default password for production machines. First points of contact: [mailto:slindomansilla@suse.com slindomansilla@suse.com], [mailto:okurz@suse.de okurz@suse.de]

* ''dashboard-osd-3215.suse.de'': showing the current view of openqa.suse.de filtered for some job group results, e.g. "Functional"
* ''dashboard-o3-3215.suse.de'': showing the current view of openqa.opensuse.org filtered for the job group results which we took responsibility to review and are mostly interested in


#### dashboard-osd-3215

* OS: openSUSE Tumbleweed
* Services: ssh, mosh, vnc, x2x
* Users:
** root
** dashboard
* VNC: `vncviewer dashboard-osd-3215`
* X2X: `ssh -XC dashboard@dashboard-osd-3215 x2x -west -to :0.0`
** (attaches the dashboard monitor as an extra display to the left of your screens; move the mouse over and the attached X11 server will capture mouse and keyboard)


#### Content of /home/dashboard/.xinitrc

<pre>
[...]
#
# Source common code shared between the
# X session and X init scripts
#
. /etc/X11/xinit/xinitrc.common

xset -dpms
xset s off
xset s noblank
[...]
#
# Add your own lines here...
#
$HOME/bin/osd_dashboard &
</pre>


#### Content of /home/dashboard/bin/osd_dashboard
155
156
<pre>
157
#!/bin/bash
158
159
DISPLAY=:0 unclutter &
160
161
DISPLAY=:0 xset -dpms
162
DISPLAY=:0 xset s off
163
DISPLAY=:0 xset s noblank
164
165
url="${url:-"https://openqa.suse.de/?group=SLE+15+%2F+%28Functional%7CAutoyast%29&default_expanded=1&limit_builds=3&time_limit_days=14&show_tags=1&fullscreen=1#"}"
166
for i in `seq 1 10`; do
167
  zoom=$zoom"-e ZoomIn "
168
done
169
DISPLAY=:0 midori -e Fullscreen -e Navigationbar $zoom -a "$url"
170
</pre>
171
172
#### Cron job:

<pre>
# Min   H     DoM   Mo    DoW   Command
*	*	*	*	*	/home/dashboard/bin/reload_midori
</pre>

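To (re)install this cron entry non-interactively on a freshly set up dashboard machine, the merge below can be piped through `crontab`; `merge_cron_entry` is a hypothetical helper name, and the sketch assumes the existing crontab needs no special ordering preserved:

```shell
#!/bin/bash
# Print the existing crontab lines (from stdin) plus the reload job,
# deduplicated, so re-running an installation script never adds it twice.
merge_cron_entry() {
    local entry='* * * * * /home/dashboard/bin/reload_midori'
    { cat; echo "$entry"; } | sort -u
}

# Usage (as the dashboard user): crontab -l 2>/dev/null | merge_cron_entry | crontab -
printf '' | merge_cron_entry
```

Note that `sort -u` also reorders lines; for a plain list of cron entries this is harmless.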
#### Content of /home/dashboard/bin/reload_midori

<pre>
#!/bin/bash

# keep display power management and screen blanking disabled
DISPLAY=:0 xset -dpms
DISPLAY=:0 xset s off
DISPLAY=:0 xset s noblank

# focus the visible midori window and send F5 to reload the dashboard page
DISPLAY=:0 xdotool search --onlyvisible --any midori windowactivate --sync key F5
</pre>

#### Issues:

* ''When the screen shows a different part of the web page''
** a simple mouse scroll through VNC or x2x may suffice.
* ''When the displayed builds freeze without showing a new build, it usually means that midori, the browser displaying the info on the screen, crashed.''
** you can try to restart midori this way:
*** `ps aux | grep midori`
*** `kill $pid`
*** `/home/dashboard/bin/osd_dashboard`
** If this also doesn't work, restart the machine.
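The manual restart steps above can be combined into one small helper; a sketch only, with `restart_dashboard` as a hypothetical name and the launcher path as documented on this page:

```shell
#!/bin/bash
# Kill any running midori and relaunch the dashboard browser.
restart_dashboard() {
    local launcher=${1:-/home/dashboard/bin/osd_dashboard}
    pkill midori 2>/dev/null   # non-zero exit if midori was not running; that is fine
    sleep 1
    "$launcher" &
}
```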