{{toc}}

# Test results overview
* Latest report based on openQA test results for [OSD](https://openqa.io.suse.de/openqa-review/openqa_suse_de_status.html) and [SLE15](https://openqa.io.suse.de/openqa-review/openqa_sle15_status.html)
  * Find more reports on https://openqa.io.suse.de/openqa-review/
* only "blocker" or "shipstopper" bugs on "interesting products" for SLE: http://s.qa.suse.de/qa_sle_bugs_sle , SLE15: http://s.qa.suse.de/qa_sle_bugs_sle15_all, SLE12: http://s.qa/qa_sle_bugs_sle12_2

# QE tools - Team description

"The easiest way to provide complete quality for your software"
We provide the most complete free-software system-level testing solution to ensure high quality of operating systems, complete software stacks and multi-machine services for software distribution builders, system integration engineers and release teams. We continuously develop, maintain and release our software to be readily used by anyone while we offer a friendly community to support you in your needs. We maintain the main public and SUSE internal openQA server as well as supporting tools in the surrounding ecosystem.

## Team responsibilities
* Develop and maintain upstream openQA
* Administration of openqa.suse.de and workers (But not physical hardware, as these belong to the departments that purchased them and we merely facilitate)
* Help administrate and maintain openqa.opensuse.org, including coordinating efforts to solve problems affecting o3
* Develop and maintain SUSE maintenance QA tools (SMELT, template generator, MTUI, openQA QAM bot, etc, e.g. from https://confluence.suse.com/display/maintenanceqa/Toolchain+for+maintenance+quality+engineering)
* Support colleagues, team members and open source community

## Out of scope
* Maintenance of individual tests
* Maintenance of physical hardware
* Maintenance of special worker add-ons needed for tests, e.g. external hypervisor hosts for s390x, powerVM, xen, hyperv, IPMI, VMWare (Clarification: We maintain the code for all backends but we are not experts in specific domains. So we always try to help, but it is a case-by-case decision based on what we can realistically provide given our competence.)
* Ticket triaging of http://progress.opensuse.org/projects/openqatests/
* Setup of configuration for individual products to test, e.g. new job groups in openQA
* Feature development within the backend for single teams (commonly provided by teams themselves)

## Our common userbase
Known users of our products: Most SUSE QA engineers, SUSE SLE release managers and release engineers, every SLE developer submitting "submit requests" in OBS/IBS where product changes are tested as part of the "staging" process before changes are accepted in either SLE or openSUSE (staging tests must be green before packages are accepted), same for all openSUSE contributors submitting to either openSUSE:Factory (for Tumbleweed, SLE, future Leap versions) or Leap, other GNU/Linux distributions like Fedora https://openqa.fedoraproject.org/ , Debian https://openqa.debian.net/ , https://openqa.qubes-os.org/ , https://openqa.endlessm.com/ , the GNOME project https://openqa.gnome.org, https://www.codethink.co.uk/articles/2021/automated-linux-kernel-testing/, openSUSE KDE contributors (with their own workflows, https://openqa.opensuse.org/group_overview/23 ), openSUSE GNOME contributors (https://openqa.opensuse.org/group_overview/35 ), OBS developers (https://openqa.opensuse.org/parent_group_overview/7#grouped_by_build) , wicked developers (https://gitlab.suse.de/wicked-maintainers/wicked-ci#openqa), and of course our team itself for "openQA-in-openQA Tests" :) https://openqa.opensuse.org/group_overview/24
Keep in mind: "users of openQA", including "openSUSE release managers and engineers", means not only SUSE employees but also employees of other companies and development partners of SUSE.
In summary, our products, for example openQA, are a critical part of many development processes, so outages and regressions are disruptive and costly. We therefore need to ensure high quality in production, which is why we practice DevOps with a slight tendency towards a conservative approach for introducing changes while still ensuring a high development velocity.

## How we work
The QE Tools team follows the DevOps approach, working in a lightweight Agile style also inspired by [Extreme Programming](https://extremeprogramming.org/) and [Kanban](https://en.wikipedia.org/wiki/Kanban_(development)) and of course the original http://agilemanifesto.org/. We plan and track our work using tickets on https://progress.opensuse.org . We pick tickets based on priority and planning decisions. We use weekly meetings as checkpoints for progress and also track cycle and lead times to crosscheck progress against expectations.
* [tools team - backlog](https://progress.opensuse.org/issues?query_id=230): The complete backlog of the team
* [tools team - backlog, high-level view](https://progress.opensuse.org/issues?query_id=526): A high-level view of the backlog, all epics and higher (an "epic" includes multiple stories)
* [tools team - backlog, top-level view](https://progress.opensuse.org/issues?query_id=524): A top-level view of the backlog, only sagas and higher (a "saga" is bigger than an epic and can include multiple epics, i.e.  "epic of epics")
* [tools team - what members of the team are working on](https://progress.opensuse.org/issues?query_id=400): To check progress and know what the team is currently occupied with
* [tools team - closed within last 60 days](https://progress.opensuse.org/issues?query_id=541): What was recently resolved
*Be aware:* Custom queries in the right-hand sidebar of individual projects, e.g. https://progress.opensuse.org/projects/openqav3/issues , show queries with the same name but are limited to the scope of the specific projects so can show only a subset of all relevant tickets.
### What we expect from team members
* Actively show visible contributions to our products every workday *(pull requests, code review, ticket updates in descending priority, i.e. if you are very active in pull requests + code review then ticket updates are much less important)*
* Be responsive over usual communication platforms and channels *(user questions, team discussions)*
* Stick to our rules *(this wiki, SLOs, alert handling)*
### Common tasks for team members
This is a list of common tasks that we follow, e.g. reviewing daily based on individual steps in the DevOps Process ![DevOps Process](devops-process_25p.png)
* **Plan**:
60
 * State daily learning and planned tasks in internal chat room
61
 * Review backlog for time-critical, triage new tickets, pick tickets from backlog; see https://progress.opensuse.org/projects/qa/wiki#How-we-work-on-our-backlog
62
* **Code**:
63 1 mgriessmeier
 * See project specific contribution instructions
64 72 okurz
 * Provide peer-review following https://github.com/notifications based on projects within the scope of https://github.com/os-autoinst/ with the exception of test code repositories, especially https://github.com/os-autoinst/openQA, https://github.com/os-autoinst/os-autoinst, https://github.com/os-autoinst/scripts, https://github.com/os-autoinst/os-autoinst-distri-openQA, https://github.com/os-autoinst/openqa-trigger-from-obs, https://github.com/os-autoinst/openqa_review as well as other projects like https://gitlab.suse.de/qa-maintenance/openQABot/
65 32 okurz
* **Build**:
66
 * See project specific contribution instructions
67
* **Test**:
68
 * Monitor failures on https://travis-ci.org/ relying on https://build.opensuse.org/package/show/devel:openQA/os-autoinst_dev for os-autoinst (email notifications)
69
 * Monitor failures on https://app.circleci.com/pipelines/github/os-autoinst/openQA?branch=master relying on https://build.opensuse.org/project/show/devel:openQA:ci for openQA (email notifications)
70
* **Release**:
71
 * By default we use the rolling-release model for all projects unless specified otherwise
72 260 tinita
 * Monitor [devel:openQA on OBS](https://build.opensuse.org/project/show/devel:openQA) (all packages and all subprojects) for failures, ensure packages are published on http://download.opensuse.org/repositories/devel:/openQA/, ensure to be added as a Maintainer for that project (members need to be added individually, you can ask existing team members, e.g. the SM)
73 32 okurz
 * Monitor http://jenkins.qa.suse.de/view/openQA-in-openQA/ for the openQA-in-openQA Tests and automatic submissions of os-autoinst and openQA to openSUSE:Factory through https://build.opensuse.org/project/show/devel:openQA:tested
74
* **Deploy**:
75
 * o3 is automatically deployed (daily), see https://progress.opensuse.org/projects/openqav3/wiki/Wiki#Automatic-update-of-o3
76 198 mkittler
 * osd is automatically deployed (multiple times per week), monitor https://gitlab.suse.de/openqa/osd-deployment/pipelines and watch for notification email to openqa@suse.de
77 32 okurz
* **Operate**:
78
 * Apply infrastructure changes from https://gitlab.suse.de/openqa/salt-states-openqa (osd) or manually over sshd (o3)
79 37 okurz
 * Monitor backups, see https://gitlab.suse.de/qa-sle/backup-server-salt
 * Handle config changes in salt (osd), backups and job group configuration changes
81 61 okurz
 * Ensure old unused/non-matching needles are cleaned up (osd+o3), see #73387
82 217 okurz
 * Maintain https://gitlab.suse.de/qa-maintenance/qamops and https://confluence.suse.com/display/maintenanceqa/qam.suse.de
83 32 okurz
* **Monitor**:
84 106 livdywan
 * React on alerts from [stats.openqa-monitor.qa.suse.de](https://stats.openqa-monitor.qa.suse.de/alerting/list?state=not_ok) (emails on [osd-admins@suse.de](http://mailman.suse.de/mailman/listinfo/osd-admins) and login via LDAP credentials, you must be an *editor* to edit panels and hooks via the web UI)
85
 * Look for incomplete jobs or scheduled jobs not being picked up on o3 and osd (API or web UI, see the sketch after this list) - see also #81058 for *power*
86 44 okurz
 * React on alerts from https://gitlab.suse.de/openqa/auto-review/, https://gitlab.suse.de/openqa/openqa-review/, https://gitlab.suse.de/openqa/monitor-o3 (subscribe to projects for notifications)
87 258 okurz
 * Be responsive on #opensuse-factory (irc://irc.libera.chat/opensuse-factory, formerly irc://chat.freenode.net/opensuse-factory) for help, support and collaboration (Unless you have a better solution it is suggested to use [Element.io](https://matrix.to/#/!ilXMcHXPOjTZeauZcg:libera.chat) or [Element.io](https://app.element.io/#/room/%23freenode_%23opensuse-factory:matrix.org) for a sustainable presence; you also need a [registered IRC account](https://libera.chat/guides/registration), formerly [freenode](https://freenode.net/kb/answer/registration))
88 283 livdywan
 * Be responsive on [#qa-tools in Rocket.Chat](https://chat.suse.de/channel/qa-tools) for internal coordination and alarm handling, fallback to #suse-qe-tools:opensuse.org (matrix) as backup if other channels are temporarily down, alternatively public channels on matrix/ IRC if the topics are not confidential
89 1 mgriessmeier
 * Be responsive on [#testing](https://chat.suse.de/channel/testing) for help, support and collaboration
90 50 okurz
 * Be responsive on mailing lists opensuse-factory@opensuse.org and openqa@suse.de (see https://en.opensuse.org/openSUSE:Mailing_lists_subscription)
91 142 okurz
 * Be responsive in https://matrix.to/#/#openqa:opensuse.org or the bridged room [#openqa](https://discord.com/channels/366985425371398146/817367056956653621) on https://discord.gg/opensuse if you have a discord account
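
A minimal sketch for the "look for incomplete or scheduled jobs" step above, assuming `curl` and `jq` are installed and that the public `/api/v1/jobs` route with its `state`, `result` and `latest` filters is sufficient for a quick overview (host and filters are illustrative, not a fixed team tool):

```bash
#!/bin/bash
# Count currently scheduled jobs and latest incomplete jobs on an openQA instance.
host="${host:-https://openqa.opensuse.org}"

scheduled=$(curl -s "$host/api/v1/jobs?state=scheduled" | jq '.jobs | length')
incomplete=$(curl -s "$host/api/v1/jobs?result=incomplete&latest=1" | jq '.jobs | length')

echo "$host: $scheduled scheduled jobs, $incomplete latest incomplete jobs"
```

If the numbers look unusually high, the web UI is the better place to investigate the individual jobs.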
### How we work on our backlog
94 1 mgriessmeier
95 27 okurz
* "due dates" are only used as exception or reminders
96
* every team member can pick up tickets themselves
97
* everybody can set priority, PO can help to resolve conflicts
98 321 okurz
* consider the [ready, not assigned/blocked/low](https://progress.opensuse.org/issues?query_id=490) query as preferred. It is suggested to pick up tickets based on priority. "Workable" tickets are often convenient and hence preferred.
99 60 livdywan
* ask questions in tickets, even potentially "stupid" questions, oftentimes descriptions are unclear and should be improved
100 62 okurz
* There are "low-level infrastructure tasks" only conducted by some team members, the "DevOps" aspect does not include that but focusses on the joint development and operation of our main products
101 74 okurz
* Consider tickets with the subject keyword or tag "learning" as good learning opportunities for people new to a certain area. Experts in the specific area should prefer helping others but not work on the ticket
102 91 okurz
* For tickets which are out of the scope of the team remove from backlog, delegate to corresponding teams or persons but be nice and supportive, e.g. [SUSE-IT](https://sd.suse.com/), [EngInfra](https://infra.nue.suse.com/) also see [SLA](https://confluence.suse.com/display/qasle/Service+Level+Agreements), [test maintainer](https://progress.opensuse.org/projects/openqatests/), QE-LSG PrjMgr/mgmt
103 328 okurz
 * For [EngInfra](https://sd.suse.com/servicedesk/customer/portal/1) tickets ensure there's a ticket for it in [openQA Infrastructure](https://progress.opensuse.org/projects/openqa-infrastructure/issues), use `Engineering-Infra` under **Select a system**, use `[openqa] …` in the subject, reference the progress ticket and put `osd-admins@suse.de` in the **Request participants** field or use the **Share** button to add osd-admins@suse.de later. Use the tracker ticket for internal notes
104 112 okurz
* Whenever we apply changes to the infrastructure we should have a ticket
105 88 okurz
* Refactoring and general improvements are conducted while we work on features or regression fixes
106 89 okurz
* For every regression or bigger issue that we encounter try to come up with at least two improvements, e.g. the actual issue is fixed and similar cases are prevented in the future with better tests and optionally also monitoring is improved
107 268 okurz
* For critical issues and very big problems collect "lessons learned", e.g. in notes in the ticket or a meeting with minutes in the ticket, consider https://en.wikipedia.org/wiki/Five_whys and answer at least the following questions: "User impact, outwards-facing communication and mitigation, upstream improvement ideas, Why did the issue appear, can we reduce our detection time, can we prevent similar issues in the future, what can we improve technically, what can we improve in our processes". Also see https://youtu.be/_Dv4M39Arec
108 194 okurz
* okurz proposes to use "#NoEstimates". Though that topic is controversial and often misunderstood. https://ronjeffries.com/xprog/articles/the-noestimates-movement/ describes it nicely :) Hence tickets should be evenly sized and no estimation numbers should be provided on tickets
109 203 okurz
* If you really want you can look at the [burndown chart](https://progress.opensuse.org/agile/charts?utf8=%E2%9C%93&set_filter=1&f%5B%5D=chart_period&op%5Bchart_period%5D=%3E%3Ct-&v%5Bchart_period%5D%5B%5D=90&f%5B%5D=fixed_version_id&op%5Bfixed_version_id%5D=%3D&v%5Bfixed_version_id%5D%5B%5D=418&f%5B%5D=&chart=burndown_chart&chart_unit=issues&interval_size=day) (some people wish to have this) but we consider it unnecessary due to the continuous development, not a project with defined end. Also an [agile board](https://progress.opensuse.org/agile/board?utf8=%E2%9C%93&set_filter=1&f%5B%5D=fixed_version_id&op%5Bfixed_version_id%5D=%3D&v%5Bfixed_version_id%5D%5B%5D=418&f%5B%5D=status_id&op%5Bstatus_id%5D=%3D&f_status%5B%5D=1&f_status%5B%5D=12&f_status%5B%5D=2&f_status%5B%5D=15&f_status%5B%5D=4&c%5B%5D=tracker&c%5B%5D=assigned_to&c%5B%5D=cf_16) is available but likely due to problems within the redmine installation ordering cards is not reliable.
110 229 okurz
* For critical changes write to qa-team@suse.de as well as to chat channels
111
* Everyone should propose reverts of features if we find problems that can not be immediately fixed or worked around in production
112 112 okurz
113 55 okurz
#### Definition of DONE
114 1 mgriessmeier
115 322 dheidler
Also see https://web.archive.org/web/20110308065330/http://www.allaboutagile.com/definition-of-done-10-point-checklist/ and https://web.archive.org/web/20170214020537/https://www.scrumalliance.org/community/articles/2008/september/what-is-definition-of-done-(dod)
116 55 okurz
117
* Code changes are made available via a pull request on a version control repository, e.g. github for openQA
118
* [Guidelines for git commits](http://chris.beams.io/posts/git-commit/) have been followed
119
* Code has been reviewed (e.g. in the github PR)
120 1 mgriessmeier
* Depending on criticality/complexity/size/feature: A local verification test has been run, e.g. post link to a local openQA machine or screenshot or logfile
121 268 okurz
* For regressions: A regression fix is provided, flaws in the design, monitoring, process have been considered
122 55 okurz
* Potentially impacted package builds have been considered, e.g. openSUSE Tumbleweed and Leap, Fedora, etc.
123
* Code has been merged (either by reviewer or "mergify" bot or reviewee after 'LGTM' from others)
124
* Code has been deployed to osd and o3 (monitor automatic deployment, apply necessary config or infrastructure changes)
125
126 56 okurz
#### Definition of READY for new features
127 55 okurz
128
The following points should be considered before a new feature ticket is READY to be implemented:
129
130
* Follow the ticket template from https://progress.opensuse.org/projects/openqav3/wiki/#Feature-requests
131
* A clear motivation or user expressing a wish is available
132 266 okurz
* Acceptance criteria are stated (see ticket template) or use `[timeboxed:<nr>h]` with `<nr>` hours for tasks that should be limited in time, e.g. a research task with `[timeboxed:20h] research …`
133 55 okurz
* add tasks as a hint where to start
134
135 1 mgriessmeier
#### WIP-limits (reference "Kanban development")
136 28 okurz
137 79 livdywan
* global limit of 10 tickets [In Progress](https://progress.opensuse.org/issues?query_id=505), and 3 tickets per person
138
* limit of 20 tickets per person in [Feedback](https://progress.opensuse.org/issues?query_id=520)
139 27 okurz
140 1 mgriessmeier
#### Target numbers or "guideline", "should be", in priorities
141
142 172 okurz
1. *New, untriaged QA (openQA, etc.):* [0 (daily)](https://progress.opensuse.org/projects/qa/issues?query_id=576) . Every ticket should have a target version, e.g. "Ready" for QE tools team, "future" if unplanned, others for other teams
143 64 okurz
1. *Untriaged "tools" tagged:* [0 (daily)](https://progress.opensuse.org/issues?query_id=481) . Every ticket should have a target version, e.g. "Ready" for QE tools team, "future" if unplanned, others for other teams
144 273 okurz
1. *Workable (properly defined):* [10-40](https://progress.opensuse.org/issues?query_id=478) . Enough tickets to reflect a proper plan but not too many to limit unfinished data (see "waste")
145 82 okurz
1. *Overall backlog length:* [ideally less than 100](https://progress.opensuse.org/issues?query_id=230) . Similar to "Workable": enough tickets to reflect a proper roadmap and give enough flexibility for all unfinished work, but limited to a feasible number that the team can still oversee without losing the overview. One more reason for a maximum of 100 is that pagination in the redmine UI allows showing only up to 100 issues on one page at a time, same for redmine API access.
146 71 okurz
1. *Within due-date:* [0 (daily/weekly)](https://progress.opensuse.org/issues?query_id=514) . We should take due-dates seriously, finish tickets fast and at the very least update tickets with an explanation why the due-date could not be met, moving it to a reasonable time in the future based on usual cycle time expectations
147 27 okurz
148
#### SLOs (service level objectives)
149
150
* for picking up tickets based on priority, first goal is "urgency removal":
151 123 okurz
 * **immediate**: [<1 day](https://progress.opensuse.org/issues?query_id=542)
152
 * **urgent**: [<1 week](https://progress.opensuse.org/issues?query_id=543)
153
 * **high**: [<1 month](https://progress.opensuse.org/issues?query_id=544)
154 124 okurz
 * **normal**: [<1 year](https://progress.opensuse.org/issues?query_id=545)
155 118 livdywan
 * **low**: undefined
156 1 mgriessmeier
157 123 okurz
* aim for cycle time of individual tickets (not epics or sagas): 1h-2w
* reference for SLOs and related topics: https://sre.google/sre-book/table-of-contents/
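
The SLO queries above can be crosschecked without opening the web UI. A minimal sketch, assuming the saved query IDs linked above (e.g. 542 for "immediate", 543 for "urgent") are also honoured by the Redmine JSON API and that a personal API key is available in `REDMINE_API_KEY`:

```bash
#!/bin/bash
# Report how many tickets are currently matched by selected SLO queries.
base="https://progress.opensuse.org"

for query_id in 542 543; do
  count=$(curl -s -H "X-Redmine-API-Key: $REDMINE_API_KEY" \
    "$base/issues.json?query_id=$query_id&limit=1" | jq '.total_count')
  echo "query $query_id: $count tickets"
done
```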
#### Backlog prioritization
162 47 okurz
163
When we prioritize tickets we assess:
164
1. What the main use cases of openQA are among all users, be it SUSE QA engineers, other SUSE employees, openSUSE contributors as well as any other outside user of openQA
165
2. We try to understand how many persons and products are affected by feature requests as well as regressions (or "concrete bugs" as the ticket category is called within the openQA Project) and prioritize issues affecting more persons and products and use cases over limited issues
166
3. We prioritize regressions higher than work on (new) feature requests
167
4. If a workaround or alternative exists then this lowers priority. We prioritize tasks that need deep understanding of the architecture and an efficient low-level implementation over convenience additions that other contributors are more likely to be able to implement themselves.
168
169 330 okurz
#### Periodic backlog grooming
170
171
These queries can be used as help to organize our work efficiently
172
173
* [QE tools team - backlog - sorted by update time](https://progress.opensuse.org/issues?query_id=654) ensure all tickets are reasonably up-to-date and don't keep hanging around
174
* [QE tools team - due date forecast](https://progress.opensuse.org/issues?query_id=651) prevent running into due-dates proactively
175
176 38 okurz
### Team meetings
177
178 313 livdywan
* **Daily:** Use (internal) chat actively, e.g. formulate your findings or achievements and plans for the day, "think out loud" while working on individual problems. Optionally join [m.o.o/suse_qa_tools](https://meet.opensuse.org/suse_qa_tools) every Monday, Tuesday and Thursday 1030-1045 CET/CEST
179 1 mgriessmeier
  * *Goal*: Quick support on problems, feedback on plans, collaboration and self-reflection (compare to [Daily Scrum](https://www.scrumguides.org/scrum-guide.html#events-daily))
180 312 livdywan
* **Ticket estimations:** Every Thursday 1100-1200 CET/CEST in [m.o.o/suse_qa_tools](https://meet.opensuse.org/suse_qa_tools) Estimate t-shirt sizes for our tickets
181 1 mgriessmeier
  * *Goal*: Ensure tickets are workable. Refine and split tickets for larger estimates.
182 313 livdywan
* **Midweekly unblock:** Every Wednesday 1100-1200 CET/CEST in [m.o.o/suse_qa_tools](https://meet.opensuse.org/suse_qa_tools) ([fallback](https://meet.jit.si/suse_qa_tools)).
183
  * *Goal*: Discuss tasks in progress, unblock people.
184 158 okurz
* **Weekly coordination:** Every Friday 1115-1145(-1215) CET/CEST in [m.o.o/suse_qa_tools](https://meet.opensuse.org/suse_qa_tools) ([fallback](https://meet.jit.si/suse_qa_tools)). Community members and guests are particularly welcome to join this meeting.
185
  * *Goal*: Demo of features, Team backlog coordination and design decisions of bigger topics (compare to [Sprint Planning](https://www.scrumguides.org/scrum-guide.html#events-planning)).
186 166 okurz
  * *Conduction*: Demo recently finished feature work depending on [last closed](https://progress.opensuse.org/issues?query_id=572), crosscheck status of team, discuss blocked tasks and upcoming work
187 197 okurz
* **Fortnightly Retrospective:** Friday 1145-1215 CET/CEST every odd week, same room as the weekly meeting. On these days the weekly has hard time limit of 1115-1145.
188 1 mgriessmeier
  * *Goal*: Inspect and adapt, learn and improve (compare to [Sprint Retrospective](https://www.scrumguides.org/scrum-guide.html#events-retro))
189 110 livdywan
  * *Announcements*: Create a new *discussion* with all team members in Rocket Chat and a new [retrospected game](https://www.retrospected.com) which can be filled in throughout the week. Specific actions will be recorded as tickets.
190 329 livdywan
* **Virtual coffee:** Weekly every Monday 1100-1120 CET/CEST, same room as the weekly.
191 58 livdywan
  * *Goal*: Connect and bond as a team, understand each other (compare to [Informal Communication in an all-remote environment](https://about.gitlab.com/company/culture/all-remote/informal-communication))
192
* **extension on-demand:** Optional meeting on invitation in the suggested time slot Thursday 1000-1200 CET/CEST, in the same room as the weekly, on-demand or replacing the *Virtual coffee talk*.
193 1 mgriessmeier
  * *Goal*: Introduce, research and discuss bigger topics, e.g. backlog overview, processes and workflows
194 175 okurz
* **Workshop:** Friday 0900-0950 CET/CEST every week in [m.o.o/suse_qa_tools](https://meet.opensuse.org/suse_qa_tools) especially for community members and users! We will run this every week with the plan to move to a fortnightly cadence every even week.
195 110 livdywan
  * *Goal*: Demonstrate new and important features, explain already existing, but less well-known features, and discuss questions from the user community. All your questions are welcome!
196
  * *Announcements*: Drop a reminder with a teaser in [#testing](https://chat.suse.de/channel/testing).
197 267 okurz
  * *Recordings*: Consider recording, e.g. using OBS, and upload to YouTube, linking the video from the corresponding topic entry. SUSE internal topics can be published on http://streaming.nue.suse.com/i/QE-Tools-Workshops/ by ssh-uploading to ftp@streaming.nue.suse.com:~/i/QE-Tools-Workshops/ (get your SSH key added by existing team members, e.g. okurz)
198 144 livdywan
199 177 okurz
#### Best practices for meetings
200
* Meetings concerning the whole team are moderated by the scrum master by default, who should join the call early and verify that the meeting itself and any tools used are working or e.g. advise the use of the fallback option.
201
* We would prefer UTC for meeting times to be globally fair but as many other SUSE meetings are bound to European time we need to stick to that as well.
202
* It is recommended to use the Jitsi Audio-feedback feature, blue/green circles depending on microphone volume. Everybody should ensure that at least "two green balls" show up
203
* Hand signals over video can be used, e.g. "waving/circling hands": "I am lost, please bring me into discussion again"; "T-Sign": "I need a break"; "Raised hand": "I would like to speak"
204 193 okurz
* Discuss topics relevant for all within the common meetings, continue discussions pro-actively over asynchronous communication, e.g. tickets, as well as conduct topic centered follow-up meetings with only relevant attendees
205 144 livdywan
206
#### Workshop Topics
207
208 145 livdywan
* *SUSE QE Tools roadmap*: Recent achievements, mid-term plan and future outlook. Every first Friday every month (Idea based on discussion between okurz and vpelcak 2021-02-09)
209 151 okurz
* **2021-01-15:** *DONE* [openqa-auto-review and openqa-investigate](https://youtu.be/_t3THhdiDag)
210
* **2021-01-29:** *DONE* overview of development repositories on https://github.com/os-autoinst/
211
* **2021-02-05:** *DONE* [powerpc](https://youtu.be/q1CM2AH5aKY) (@nicksinger)
212
* **2021-02-12:** *DONE* [job templates](https://youtu.be/YPuH0bcr524) (@tinita, @cdywan)
213
* **2021-02-19:** *DONE* [SUSE QEM review workflow discussions](https://youtu.be/nCIAcvD7SA8) (@dzedro, @mgrifalconi)
214
* **2021-02-26:** *DONE* open conversation
215 247 okurz
* **2021-03-05:** *DONE* [SUSE QE Tools roadmap - 2021-03](https://youtu.be/vIqBIEMH0O0) (@okurz, @mkittler)
216 164 okurz
* **2021-03-12:** *DONE* [openqa-mon](https://youtu.be/CNLihgMKt30) @ph03nix
217 151 okurz
* **2021-03-19:** *DONE* [multi-machine tests](https://youtu.be/9j-NgNTzJ0w) (@okurz; topic proposal by zluo, initially brought up as: "high RAM and storage requirements")
218 154 okurz
* **2021-03-26:** *skipped due to SUSE Hack Week*
219
* **2021-04-02:** *public holiday*
220 157 okurz
* **2021-04-09:** *DONE* [SUSE QE Tools roadmap - 2021-04](https://youtu.be/nfMilLcCosQ) (@okurz, @cdywan)
221 165 okurz
* **2021-04-16:** *DONE* [openqa.opensuse.org infrastructure overview](https://youtu.be/G5bQKI2tURk) (see question in #88831#note-19 , @okurz)
222 168 okurz
* **2021-04-23:** *DONE* [openQA tests written in Python](https://youtu.be/GjKZ51lnCh0) (@okurz, @cdywan)
223 189 okurz
* **2021-04-30:** *DONE* [openqa-review: A review helper script for openQA with complete test overview reports](https://youtu.be/J2eI0gKnQNM) (@okurz)
224 191 okurz
* **2021-05-07:** *DONE* [SUSE QE Tools roadmap - 2021-05](https://youtu.be/J2eI0gKnQNM) (@okurz, @cdywan)
225 195 okurz
* **2021-05-14:** *DONE* [Review badges and recent changes related to them](https://youtu.be/rduc1z1HB-4) (@mkittler)
226 202 okurz
* **2021-05-21:** *DONE* [openQA API Playground](https://youtu.be/EfXZKbQS-Kg) (@okurz)
227 205 okurz
* **2021-05-28:** *DONE* [Tumbleweed workflows focussed on openQA](https://youtu.be/YiiuNqRPGAk) (proposal by okurz motivated by https://chat.suse.de/channel/testing?msg=EysbgG5kFrHbmjvcy , e.g. impact of failing tests, to-test manager, etc.; by okurz, dimstar?)*
228 220 okurz
* **2021-06-04:** *DONE* SUSE QE Tools roadmap - 2021-06
229
* **2021-06-11:** *DONE* [intro to os-autoinst development (demo how to investigate and test a small fix)](https://youtu.be/IeXaVb5dqy8) (@okurz, @mkittler)
230 227 okurz
* **2021-06-18:** *DONE* [How to be prepared when we introduce openQA features](https://youtu.be/wERuChD-88Y) (@cdywan, @okurz)
231 240 okurz
* **2021-06-25:** *DONE* Workflow discussions: SUSE QE aggregate tests (Proposed by okurz: We would like to learn from others how these are included in the workflow; no recording) (@okurz)
232 249 okurz
* **2021-07-02:** *DONE* [SUSE QE Tools roadmap - 2021-07](https://youtu.be/bppQFEhlfic) (@okurz, @cdywan)
233 276 okurz
* **2021-07-09:** *DONE* A glimpse into the QE Core workflow (@geor, @ilausuch)
234
* **2021-07-16:** *DONE* Testing SLES+HA & SAP Clusters with openQA (@acarvajal)
235 277 okurz
* **2021-07-23:** *DONE* [Sporadic failures](https://youtu.be/TB-QO3Ipo1E) (@punkioudi)
236 279 okurz
* **2021-07-30:** *DONE* [An Overview of the HANA Performance Continuous Integration](https://youtu.be/R4f4Lxr0-zk) (@jgwang)
237 282 livdywan
* **2021-08-06:** *DONE* [SUSE QE Tools roadmap - 2021-08](https://youtu.be/6SVV3Kb0lSI) (@tinita, @cdywan)
238 288 okurz
* **2021-08-13:** *DONE* A proposal to maintain testsuites through a [GitLab CI pipeline](https://gitlab.suse.de/qa-maintenance/qam-openqa-yml/-/merge_requests/163) @apappas
239 231 livdywan
* SMELT showcase (@vanastasiadis)
240 288 okurz
* **2021-08-20:** *DONE* [Space management](https://youtu.be/g331EIPd_jQ) (@mkittler)
241 291 okurz
* **2021-08-27:** *DONE* [openQA soft-fails (what are they, how to use, when to use, limitations)](https://youtu.be/HZAvYw86-lw) (@okurz)
242 307 okurz
* **2021-09-03:** *DONE* [SUSE QE Tools roadmap - 2021-09](https://youtu.be/5o6hUkEfrsA) (@okurz)
243 320 okurz
* **2021-09-10:** *DONE* open conversation (@okurz)
244
* **2021-09-17:** *DONE* [discussing new openQA features (openqa-review, priorities, module search)](https://youtu.be/0-QiVh1qBbI) (@okurz)
245 324 okurz
* **2021-09-24:** *DONE* [Scripting openQA from top to bottom by phoenix](https://youtu.be/RUVtn6unMfs) (@cdywan, Felix)
246 323 okurz
* **2021-10-01:** *SUSE QE Tools roadmap - 2021-10* (@okurz)
247
* **2021-10-08:** *Brainstorming and ideas for better connecting OBS+openQA (outgoing webhooks)* (@okurz, @hennevogel)
248 151 okurz
* proposal by ybonatakis: Explore integration of other tools, test frameworks, Integration
249 1 mgriessmeier
* proposal by ybonatakis: QA best practices
250
* proposal by okurz: How we review openQA test results, by SUSE QE teams: Who volunteers from each team to present? Propose a speaker and a date!
251 290 okurz
252 269 acarvajal
* proposal by acarvajal: Follow up to Testing SLES+HA & SAP Clusters with openQA: Test Results & Known Issues (@acarvajal)
253 59 livdywan
254 111 livdywan
#### Announcements
255
256
- For every meeting, regular or one-off, desired attendees should be invited to make sure a slot is blocked in their calendar and reminders with the correct local time will show up when it's time to join the meeting
257
  - Create a new event, for example in Thunderbird via the *Calendar* tab or `New > Event` via the menu.
258
  - Pick your audience, for example `qa-team@suse.de` will reach test developers and reviewers, or you can select individual attendees via their respective email addresses.
259
  - Add attendees accordingly.
260
  - Specify the time of the meeting
261
  - Set a schedule to repeat the event if applicable.
262
  - Add a location, e.g. https://meet.opensuse.org/suse_qa_tools
263
  - Don't worry if any of the details might change - you can update the invitation later and participants will be notified.
264
- See the respective meeting for regular actions such as communication via chat
265
266 73 okurz
### Team
267
268
The team is comprised of engineers from different teams, some only partially available:
269 251 okurz
* Christian Dywan (Scrum Master) *kalikiana@Freenode*
270 209 okurz
* Oliver Kurz (Product Owner)
271 98 okurz
* Marius Kittler
272 326 dheidler
* Dominik Heidler (Moving back to Core from Oct 1st)
273 1 mgriessmeier
* Nick Singer (only OPS)
274 315 kraih
* ~~Sebastian Riedel~~ (Part time contributions, currently working on other projects)
275 251 okurz
* Tina Müller (Part time (35h)) *tinita@Freenode, github: perlpunk*
276
* Vasileios Anastasiadis (Bill) (+dedicated work areas)
277
* Ondřej Súkup (dedicated work areas)
278 1 mgriessmeier
* Jan Baier (part time, QEM-dedicated work areas)
279 293 okurz
* ~~Ivan Lausuch~~ (joining exchange program (To QA public cloud and containers) in September, duration not determined)
280 308 livdywan
* Xiaojing Liu (Jane) *github: Amrysliu*
281 309 livdywan
* Moritz Kodytek @kodymo / [@FruitFly638](https://github.com/kodymo)
282 275 okurz
283 107 livdywan
### Onboarding for new joiners
284
285
* Request to get added to the [tools team on GitHub](https://github.com/orgs/os-autoinst/teams/tools-team)
286 303 livdywan
* Login at [stats.openqa-monitor.qa.suse.de](https://stats.openqa-monitor.qa.suse.de/alerting/list) with NIS/LDAP credentials and ask to be given the *admin* role
287 255 okurz
* Watch this wiki page (click "Watch" button on top of this page)
288 107 livdywan
* Subscribe to [osd-admins@suse.de](http://mailman.suse.de/mailman/listinfo/osd-admins), [openqa@suse.de](http://mailman.suse.de/mailman/listinfo/openqa) and [opensuse-factory@opensuse.org](https://lists.opensuse.org/archives/list/factory@lists.opensuse.org)
289 302 livdywan
* Join #suse-qe-tools:opensuse.org (matrix) and [team-qa-tools on Slack](https://suse.slack.com/archives/C02AJ1E568M)
290 234 livdywan
* Request to join [devel:openQA on OBS](https://build.opensuse.org/project/show/devel:openQA) and check that you have both `Request created` and `New comment for request created` enabled for `Maintainer of the target` in your [OBS notification settings](https://build.opensuse.org/my/subscriptions) (staging bot writes reminder comments on open reviews)
291 304 okurz
* Connect to `#opensuse-factory` on *libera.chat*, see "Common tasks for team members - Monitor" above
292 107 livdywan
* Request admin access on [osd](http://openqa.suse.de/) and [o3](http://openqa.opensuse.org/)
293 262 livdywan
* Request to get added to the [QA project in Progress](https://progress.opensuse.org/projects/qa/settings/members) and *enable notifications for the openQA project* in [your account settings](https://progress.opensuse.org/my/account)
294 200 livdywan
* Request to get added to the [openqa team in GitLab](https://gitlab.suse.de/groups/openqa/-/group_members)
295 1 mgriessmeier
* Add your ssh key to gitlab.suse.de/openqa/salt-pillars-openqa with a merge request (see the sketch after this list)
296 278 jbaier_cz
* Add your ssh key to gitlab.suse.de/qa-maintenance/qamops/-/blob/master/ansible/books/vars/main.yml with a merge request
297 162 okurz
* Ask an existing admin, e.g. other members of the team, to add your username and ssh key to o3
298 183 okurz
* Ensure you are subscribed to all projects referenced in https://progress.opensuse.org/projects/qa/wiki#Common-tasks-for-team-members
299 253 tinita
* ~~Ensure you have access to https://gitlab.suse.de/OPS-Service/monitoring (create EngInfra ticket otherwise) and add yourself in https://gitlab.suse.de/OPS-Service/monitoring/-/tree/master/icinga/shared/contacts to receive monitoring information~~ EngInfra does not grant access to additional people currently. That might change again in the future.
300 238 livdywan
* Ask for access to the team calendar if needed (on demand, via invitation)
301 306 livdywan
* *Watch* [qa-tools-backlog-assistant](https://github.com/os-autoinst/qa-tools-backlog-assistant) and choose *All Activity*
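
A minimal sketch for the "add your ssh key with a merge request" steps above, assuming the usual GitLab merge-request workflow; the branch name is arbitrary and the file that holds the keys is a placeholder, check the repository itself for where existing keys live:

```bash
# Clone the pillars repository and propose your key in a new branch
git clone git@gitlab.suse.de:openqa/salt-pillars-openqa.git
cd salt-pillars-openqa
git checkout -b add-ssh-key-for-username
$EDITOR path/to/file_with_authorized_keys.sls  # placeholder: add your key next to the existing ones
git commit -a -m "Add ssh key for <username>"
git push -u origin add-ssh-key-for-username    # then open a merge request in the GitLab web UI
```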
### Alert handling
304
305
#### Best practices
306
307
* "if it hurts, do it more often": https://www.martinfowler.com/bliki/FrequencyReducesDifficulty.html
308
* Reduce [Mean-time-to-Detect (MTTD)](https://searchitoperations.techtarget.com/definition/mean-time-to-detect-MTTD) and [Mean-time-to-Recovery](https://raygun.com/blog/what-is-mttr/)
309
310
#### Process
311
312 294 okurz
* React on any alert or report of an outage
313
* If users report outages of components of our infrastructure
314
  * Consider forming a task force and work together
315
  * Inform the affected users about the impact, mitigation/workarounds and ETA for resolution
316
* For each failing alert, e.g. grafana
317 316 okurz
 * Create a ticket for the issue (with a tag "alert"; create a ticket unless the alert is trivial to resolve and needs no improvement; if an alert is unhandled for at least 4h then a ticket must be created; create a ticket even if alerts turn to "ok" again, to prevent these issues in the future and to improve the alert)
318 45 okurz
 * Link the corresponding grafana panel in the ticket
319
 * Respond to the notification email with a link to the ticket
320 1 mgriessmeier
 * Optional: Inform in chat
321 51 okurz
 * Optional: Add "annotation" in corresponding grafana panel with a link to the corresponding ticket 
322 46 okurz
 * Pause the alert if you think further alerting the team does not help (e.g. you can work on fixing the problem, alert is non-critical but problem can not be fixed within minutes)
323 45 okurz
* If you consider an alert non-actionable then change it accordingly
324
* If you do not know how to handle an alert ask the team for help
325
* After resolving the issue add explanation in ticket, unpause alert and verify it going to "ok" again, resolve ticket
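
A minimal sketch for the optional "add an annotation in the corresponding grafana panel" step above, assuming an *editor* API key is available in `GRAFANA_TOKEN`; the dashboard and panel IDs as well as the ticket reference are placeholders taken from the panel you are looking at:

```bash
# Attach an annotation with a link to the progress ticket to a Grafana panel.
curl -s -X POST "https://stats.openqa-monitor.qa.suse.de/api/annotations" \
  -H "Authorization: Bearer $GRAFANA_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"dashboardId": 23, "panelId": 5, "time": '"$(date +%s%3N)"', "tags": ["alert"], "text": "https://progress.opensuse.org/issues/<ticket id>"}'
```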
#### References
328
329
* https://nl.devoteam.com/en/blog-post/monitoring-reduce-mean-time-recovery-mttr/
330
331 297 tinita
#### Gitlab Pipeline Notifications
332
333
Currently, the following projects are configured to write an email to osd-admins@suse.de if a pipeline fails:
334 305 okurz
* https://gitlab.suse.de/openqa/auto-review
335
* https://gitlab.suse.de/openqa/grafana-webhook-actions
336
* https://gitlab.suse.de/openqa/monitor-o3
337 1 mgriessmeier
* https://gitlab.suse.de/openqa/openqa-review
338 305 okurz
* https://gitlab.suse.de/openqa/osd-deployment
339 298 tinita
* https://gitlab.suse.de/openqa/salt-states-openqa
340
* https://gitlab.suse.de/openqa/salt-pillars-openqa
341
* https://gitlab.suse.de/qa-maintenance/bot-ng
342 305 okurz
* https://gitlab.suse.de/qa-maintenance/openQABot
343 297 tinita
344
The configuration can be found by going to:
345
* Settings
346
* -> Integrations
347
* -> Pipeline Status Emails (Enable plugin if not yet)
348
349
It seems there is no way to subscribe to individual pipelines as a user.
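
The same integration can also be inspected or updated per project via the GitLab API instead of clicking through the web UI. A minimal sketch, assuming a personal access token in `GITLAB_TOKEN` and that the pipelines-email service endpoint is available on this GitLab version:

```bash
# Point the "Pipeline status emails" integration of a project to osd-admins@suse.de.
curl -s -X PUT -H "PRIVATE-TOKEN: $GITLAB_TOKEN" \
  --data "recipients=osd-admins@suse.de" \
  --data "notify_only_broken_pipelines=true" \
  "https://gitlab.suse.de/api/v4/projects/openqa%2Fosd-deployment/services/pipelines-email"
```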
### Things to try
352 224 okurz
* Everybody can be "Product Owner" or "Scrum Master" or "Admin" or "Developer" for some time to get the different perspective
353 223 okurz
* From time to time ask stakeholders for their list of priorities regarding our tasks
354 99 okurz
355
### Extra-ordinary "hack-week" 2020-W51
356
357
SUSE QE Tools plans to have an internal "hack-week": Condition: We close 30 tickets from our backlog within the time frame 2020-12-03 until 2020-12-11 start of weekly meeting. No cheating! :) See [this query](https://progress.opensuse.org/issues?utf8=%E2%9C%93&set_filter=1&sort=priority%3Adesc%2Cid%3Adesc&f%5B%5D=status_id&op%5Bstatus_id%5D=c&f%5B%5D=fixed_version_id&op%5Bfixed_version_id%5D=%3D&v%5Bfixed_version_id%5D%5B%5D=418&f%5B%5D=closed_on&op%5Bclosed_on%5D=%3E%3C&v%5Bclosed_on%5D%5B%5D=2020-12-03&v%5Bclosed_on%5D%5B%5D=2020-12-11&f%5B%5D=&c%5B%5D=subject&c%5B%5D=project&c%5B%5D=status&c%5B%5D=assigned_to&c%5B%5D=relations&c%5B%5D=priority&c%5B%5D=category&c%5B%5D=cf_16&group_by=status&t%5B%5D=). During week 2020-W51 everyone is allowed to work on any hack-week project, it should just have a reasonable, "explainable" connection to our normal work. okurz volunteers to take over ops-duty for the week.
358 105 okurz
359
Result during meeting 2020-12-11: We missed the goal (by a slight amount) but we are motivated to try again in the next year :) Everybody, put some easy tickets aside for the next time!
360 115 okurz
361
### Extra-ordinary "hack-week" 2021-W8
362
363
Similar as our attempt for 2020-W51 with same rules, except condition: We close 30 tickets from our backlog within the time frame 2021-02-05 until 2021-02-19 start of weekly meeting. No cheating! See [this query](https://progress.opensuse.org/issues?utf8=%E2%9C%93&set_filter=1&sort=priority%3Adesc%2Cid%3Adesc&f%5B%5D=status_id&op%5Bstatus_id%5D=c&f%5B%5D=fixed_version_id&op%5Bfixed_version_id%5D=%3D&v%5Bfixed_version_id%5D%5B%5D=418&f%5B%5D=closed_on&op%5Bclosed_on%5D=%3E%3C&v%5Bclosed_on%5D%5B%5D=2021-02-05&v%5Bclosed_on%5D%5B%5D=2021-02-19&f%5B%5D=&c%5B%5D=subject&c%5B%5D=project&c%5B%5D=status&c%5B%5D=assigned_to&c%5B%5D=relations&c%5B%5D=priority&c%5B%5D=category&c%5B%5D=cf_16&group_by=status&t%5B%5D=).
364 129 okurz
365
Result during meeting 2021-02-19: We missed the goal (25/30 tickets resolved) but again we are open to try again, maybe after next SUSE hack week.
366 31 okurz
367
### Historical
368 64 okurz
369 62 okurz
Previously the former QA tools team used target versions "Ready" (to be planned into individual milestone periods or sprints), "Current Sprint" and "Done". However the team never really did use proper time-limited sprints so the distinction was rather vague. After having tickets "Resolved" after some time the PO or someone else would also update the target version to "Done" to signal that the result has been reviewed. This was causing a lot of ticket update noise for not much value considering that the [Definition-of-Done](https://progress.opensuse.org/projects/openqav3/wiki/#ticket-workflow) when properly followed already has rather strict requirements on when something can be considered really "Resolved" hence the team eventually decided to not use the "Done" target version anymore. Since about 2019-05 (and since okurz is doing more backlog management) the team uses priorities more as well as the status "Workable" together with an explicit team member list for "What the team is working on" to better visualize what is making team members busy regardless of what was "officially" planned to be part of the team's work. So we closed the target version. On 2020-07-03 okurz subsequently closed "Current Sprint" as also this one was in most cases equivalent to just picking an assignee for a ticket or setting to "In Progress". We can just distinguish between "(no version)" meaning untriaged, "Ready" meaning tools team should consider picking up these issues and "future" meaning that there is no plan for this to be picked up. Everything else is defined by status and priority.
370 27 okurz
In 2020-10-27 we discussed together to find out the history of the team. We clarified that the team started out as a not well defined "Dev+Ops" team. "team responsibilities" have been mainly unchanged since at least beginning of 2019. We agreed that learning from users and production about our "Dev" contributions is good, so this part of "Ops" is responsibility of everyone.
371 128 okurz
372
Also see #73060 for more details about how the responsibilities were set up.
373 104 okurz
374
## Change announcements
375 258 okurz
376 104 okurz
For new, cool features or disruptive changes consider providing according notifications to our common userbase as well as potential future users, for example create post on opensuse-factory@opensuse.org , link to post on openqa@suse.de , invite for workshop, post on one.suse.com, #opensuse-factory (IRC) (irc://irc.libera.chat/opensuse-factory), [#testing (RC)](https://chat.suse.de/testing)
377 69 tjyrinki_suse
378 1 mgriessmeier
# QE Core and QE Yast - Team descriptions
379 70 tjyrinki_suse
380 68 tjyrinki_suse
(this chapter has seen changes in 2020-11 regarding QSF -> QE Core / QE Yast change)
381 70 tjyrinki_suse
382 7 szarate
**QE Core** (formerly QSF, QA SLE Functional) and **QE Yast** are squads focusing on Quality Engineering of the core and yast functionality of the SUSE SLE products. The squad is comprised of members of QE Integration - [SUSE QA SLE Nbg](https://wiki.suse.net/index.php/SUSE-Quality_Assurance/Organization/Members_and_Responsibilities#QA_SLE_NBG_Team), including [SUSE QA SLE Prg](https://wiki.suse.net/index.php/SUSE-Quality_Assurance/Organization/Members_and_Responsibilities#QA_SLE_PRG_Team) - and QE Maintenance people (formerly "QAM"). The [SLE Departement](https://wiki.suse.net/index.php/SUSE-Quality_Assurance/SLE_Department#QSF_.28QA_SLE_Functional.29) page describes our QA responsibilities. We focus on our automatic tests running in [openQA](https://openqa.suse.de) under the job groups "Functional" as well as "Autoyast" for the respective products, for example [SLE 15 / Functional](https://openqa.suse.de/group_overview/110) and [SLE 15 / Autoyast](https://openqa.suse.de/group_overview/129). We back our automatic tests with exploratory manual tests, especially for the product milestone builds. Additionally we care about corresponding openSUSE openQA tests (see as well https://openqa.opensuse.org).
383 237 tjyrinki_suse
384 1 mgriessmeier
* More recent scope of QE Core's testing (under work, hopefully to be replicated here later) https://confluence.suse.com/display/qasle/Tests+Maintained+by+QE+Core
385
* long-term roadmap: http://s.qa.suse.de/qa-long-term
386
* overview of current openQA SLE12SP5 tests with progress ticket references: https://openqa.suse.de/tests/overview?distri=sle&version=12-SP5&groupid=139&groupid=142
387
* fate tickets for SLE12SP5 feature testing: based on http://s.qa.suse.de/qa_sle_functional_feature_tests_sle12sp5 new report based on all tickets with milestone before SLE12SP5 GM, http://s.qa.suse.de/qa_sle_functional_feature_tests_sle15sp1 for SLE15SP1
388 3 szarate
* only "blocker" or "shipstopper" bugs on "interesting products" for SLE15 http://s.qa.suse.de/qa_sle_functional_bug_query_sle15_2, http://s.qa/qa_sle_bugs_sle12_2 for SLE12
389 1 mgriessmeier
* Better organization of planned work can be seen at the [SUSE QA](https://progress.opensuse.org/projects/suseqa) project (which is not public).
390 27 okurz
391 1 mgriessmeier
## Test plan
392
393
When looking for coverage of certain components or use cases keep the [openQA glossary](http://open.qa/docs/#concept) in mind. It is important to understand that "tests in openQA" could be a scenario, for example a "textmode installation run", a combined multi-machine scenario, for example "a remote ssh based installation using X-forwarding", or a test module, for example "vim", which checks if the vim editor is correctly installed, provides correct rendering and basic functionality. You are welcome to contact any member of the team to ask for more clarification about this.
394 19 okurz
395
In detail the following areas are tested as part of "SLE functional":
396 1 mgriessmeier
397
* different hardware setups (UEFI, acpi)
398
* support for localization
399
* openSUSE: virtualization - some "virtualization" tests are active on o3 with reduced set compared to SLE coverage (on behalf of QA SLE virtualization due to team capacity constraints, clarified in QA SLE coordination meeting 2018-03-28)
400 26 riafarov
* openSUSE: migration - comparable to "virtualization", a reduced set compared to SLE coverage is active on o3 (on behalf of QA SLE migration due to team capacity constraints, clarified in QA SLE coordination meeting 2018-04)
401
402 69 tjyrinki_suse
403 18 okurz
### QE Yast
404 69 tjyrinki_suse
405 1 mgriessmeier
The squad focuses on testing YaST components, including the installer and snapper.
406 18 okurz
407 1 mgriessmeier
Detailed test plan for SLES can be found here: [SLES_Integration_Level_Testplan.md](https://gitlab.suse.de/qsf-y/qa-sle-functional-y/blob/master/SLES_Integration_Level_Testplan.md)
408
409 2 mgriessmeier
* Latest report based on openQA test results SLE12: http://s.qa.suse.de/test-status-sle12-yast , SLE15: http://s.qa.suse.de/test-status-sle15-yast
410 69 tjyrinki_suse
411 1 mgriessmeier
### QE Core
412
413
"Testing is the future, and the future starts with you"
414 264 szarate
415
* Current definitions can be found at https://confluence.suse.com/display/qasle/Tests+Maintained+by+QE+Core
416
417 6 okurz
Note: The link mentioned above is WIP; QE-Core's work has an impact on the openSUSE community as well. To keep the community in sync, either https://progress.opensuse.org/projects/qa/wiki#QE-Core or a better place has to be used to share the scope of work, always keeping a unique source of truth that is available to the community while keeping SLE-specific information available to SUSE employees only.
418
419 1 mgriessmeier
* Latest report based on openQA test results SLE12: http://s.qa.suse.de/test-status-sle12-functional , SLE15: http://s.qa.suse.de/test-status-sle15-functional
420 221 okurz
421 1 mgriessmeier
## In new organization also covered by QE Core and others
422
423
* quarterly updated media: former QA Maintenance (QAM) is now part of the various QE squads. However, QU media does happen together with Maintenance Coordination that is not part of these squads.
424 27 okurz
425 1 mgriessmeier
## What we do
426
427 12 okurz
We collected opinions, personal experiences and preferences starting with the following four topics: What are fun-tasks ("new tests", "collaborate", "do it right"), what parts are annoying ("old & sporadic issues"), what do we think is expected from qsf-u ("be quick", "keep stuff running", "assess quality") and what we should definitely keep doing to prevent stakeholders becoming disappointed ("build validation", "communication & support").
428 27 okurz
429 12 okurz
### How we work on our backlog
430
431
* no "due date"
432 1 mgriessmeier
* we pick up tickets that have not been previously discussed
433 14 okurz
* more flexible choice
434
* WIP-limits:
435
 * global limit of 10 tickets "In Progress"
436
437 12 okurz
* target numbers or "guideline", "should be", in priorities:
438
 1. New, untriaged: 0
439 69 tjyrinki_suse
 2. Workable: 40
440 1 mgriessmeier
 3. New, assigned to [qe-core] or [qe-yast]: ideally less than 200 (should not stop you from triaging)
441
442
* SLAs for priority tickets - how to ensure to work on tickets which are more urgent?
443
 * "taken": <1d: immediate -> looking daily
444 12 okurz
 * 2-3d: urgent
445 1 mgriessmeier
 * first goal is "urgency removal": <1d: immediate, 1w: urgent
446 12 okurz
447 1 mgriessmeier
* our current "cycle time" is 1h - 1y (maximum, with interruptions)
448
449
* everybody should set priority + milestone in obvious cases, e.g. new reproducible test failures in multiple critical scenarios, in general case the PO decides
450 27 okurz
451 1 mgriessmeier
### How we like to choose our battles
452
453
We self-assessed our tasks on a scale from "administrative" to "creative" and found in the following descending order: daily test review (very "administrative"), ticket triaging, milestone validation, code review, create needles, infrastructure issues, fix and cleanup tests, find bugs while fixing failing tests, find bugs while designing new tests, new automated tests (very "creative"). Then we found we appreciate if our work has a fair share of both sides. Probably a good ratio is 60% creative plus 40% administrative tasks. Both types have their advantages and we should try to keep the healthy balance.
454
455 27 okurz
456 1 mgriessmeier
### What "product(s)" do we (really) *care* about?
457
458
Brainstorming results:
459
460
* openSUSE Krypton -> good example of something that we only remotely care about or not at all even though we see the connection point, e.g. test plasma changes early before they reach TW or Leap as operating systems we rely on or SLE+packagehub which SUSE does not receive direct revenue from but indirect benefit. Should be "community only", that includes members from QSF though
461
* openQA -> (like OBS), helps to provide ROI for SUSE
462
* SLE(S) (in development versions)
463
* Tumbleweed
464
* Leap, because we use it
465
* SLES HA
466
* SLE migration
467
* os-autoinst-distri-opensuse+backend+needles
468
469
From this list strictly no "product" gives us direct revenue however most likely SLE(S) (as well as SLES HA and SLE migration) are good examples of direct connection to revenue (based on SLE subscriptions). Conducting a poll in the team has revealed that 3 persons see "SLE(S)" as our main product and 3 see "os-autoinst-distri-opensuse+backend+needles" as the main product. We mainly agreed that however we can not *own* a product like "SLE" because that product is mainly not under our control.
470
471
Visualizing "cost of testing" vs. "risk of business impact" showed that both metrics have an inverse dependency, e.g. on a range from "upstream source code" over "package self-tests", "openSUSE Factory staging", "Tumbleweed", "SLE" we consider SLE to have the highest business risk attached and therefore defines our priority however testing at upstream source level is considered most effective to prevent higher cost of bugs or issues. Our conclusion is that we must ensure that the high-risk SLE base has its quality assured while supporting a quality assurance process as early as possible in the development process. package self-tests as well as the openQA staging tests are seen as a useful approach in that direction as well as "domain specfic specialist QA engineers" working closely together with according in-house development parties.
472 27 okurz
473 1 mgriessmeier
## Documentation
474
475
This documentation should only be interesting for the team QA SLE functional. If you find that some of the following topics are interesting for other people, please extract those topics to another wiki section.
476
477
### QA SLE functional Dashboards
478
479 4 szarate
In room 3.2.15 of the Nuremberg office there are two dedicated laptops, each with a monitor attached, showing a selected overview of openQA test results with important builds from SLE and openSUSE.
480 1 mgriessmeier
The laptops are configured with a root account with the default password for production machines. First points of contact: [slindomansilla@suse.com](mailto:slindomansilla@suse.com), [okurz@suse.de](mailto:okurz@suse.de)
481
482
* `dashboard-osd-3215.suse.de`: Showing current view of openqa.suse.de filtered for some job group results, e.g. "Functional"
483
* `dashboard-o3-3215.suse.de`: Showing current view of openqa.opensuse.org filtered for some job group results which we took responsibility to review and are mostly interested in
484 24 dheidler
485 1 mgriessmeier
### dashboard-osd-3215
486
487
* OS: openSUSE Tumbleweed
488
* Services: ssh, mosh, vnc, x2x
489
* Users:
490
  * root
  * dashboard
492
* VNC: `vncviewer dashboard-osd-3215`
493
* X2X: `ssh -XC dashboard@dashboard-osd-3215 x2x -west -to :0.0`
494
  * (attaches the dashboard monitor as an extra display to the left of your screens. Then move the mouse over and the attached X11 server will capture mouse and keyboard)
495
496
#### Content of /home/dashboard/.xinitrc
497 3 szarate
498 1 mgriessmeier
```
#
# Source common code shared between the
# X session and X init scripts
#
. /etc/X11/xinit/xinitrc.common

xset -dpms
xset s off
xset s noblank
[...]
#
# Add your own lines here...
#
$HOME/bin/osd_dashboard &
```
514
515
#### Content of /home/dashboard/bin/osd_dashboard
516 3 szarate
517 1 mgriessmeier
```
#!/bin/bash

DISPLAY=:0 unclutter &

DISPLAY=:0 xset -dpms
DISPLAY=:0 xset s off
DISPLAY=:0 xset s noblank

url="${url:-"https://openqa.suse.de/?group=SLE+15+%2F+%28Functional%7CAutoyast%29&default_expanded=1&limit_builds=3&time_limit_days=14&show_tags=1&fullscreen=1#"}"
DISPLAY=:0 chromium --kiosk "$url"
```
529
530
#### Cron job:
```
Min     H       DoM     Mo      DoW     Command
*       *       *       *       *       /home/dashboard/bin/reload_chromium
```
536 21 dheidler
537 1 mgriessmeier
#### Content of /home/dashboard/bin/reload_chromium
538 3 szarate
539 1 mgriessmeier
```
540
#!/bin/bash
541
542
DISPLAY=:0 xset -dpms
543
DISPLAY=:0 xset s off
544
DISPLAY=:0 xset s noblank
545 22 dheidler
546 21 dheidler
DISPLAY=:0 xdotool windowactivate $(DISPLAY=:0 xdotool search --class Chromium)
547
DISPLAY=:0 xdotool key F5
548 3 szarate
DISPLAY=:0 xdotool windowactivate $(DISPLAY=:0 xdotool getactivewindow)
549 1 mgriessmeier
```
550
551
#### Issues:
552
553
* ''When the screen shows a different part of the web page''
554
** a simple mouse scroll through vnc or x2x may suffice.
555
* ''When the builds displayed are freeze without showing a new build, it usually means that midori, the browser displaying the info on the screen, crashed.''
556
** you can try to restart midori this way:
557
*** ps aux | grep midori
558
*** kill $pid
559
*** /home/dashboard/bin/osd_dashboard
560 25 dheidler
** If this also doesn't work, restart the machine.
561
562
563
### dashboard-o3
564
565
* Raspberry Pi 3B+
566
* IP: `10.160.65.207`
567
568
#### Content of /home/tux/.xinitrc
569
```
570
#!/bin/bash
571
572
unclutter &
573
openbox &
574
xset s off
575
xset -dpms
576
sleep 5
577
url="https://openqa.opensuse.org?group=openSUSE Tumbleweed\$|openSUSE Leap [0-9]{2}.?[0-9]*\$|openSUSE Leap.\*JeOS\$|openSUSE Krypton|openQA|GNOME Next&limit_builds=2&time_limit_days=14&&show_tags=1&fullscreen=1#build-results"
578
chromium --kiosk "$url" &
579
580
while sleep 300 ; do
581
        xdotool windowactivate $(xdotool search --class Chromium)
582
        xdotool key F5
583
        xdotool windowactivate $(xdotool getactivewindow)
584
done
585
```
586
587
#### Content of /usr/share/lightdm/lightdm.conf.d/50-suse-defaults.conf
588
```
589
[Seat:*]
590
pam-service = lightdm
591
pam-autologin-service = lightdm-autologin
592
pam-greeter-service = lightdm-greeter
593
xserver-command=/usr/bin/X
594
session-wrapper=/etc/X11/xdm/Xsession
595
greeter-setup-script=/etc/X11/xdm/Xsetup
596
session-setup-script=/etc/X11/xdm/Xstartup
597
session-cleanup-script=/etc/X11/xdm/Xreset
598
autologin-user=tux
599
autologin-timeout=0
600 1 mgriessmeier
```