{{toc}}

# Test results overview

* Latest report based on openQA test results: http://s.qa.suse.de/test-status , SLE12: http://s.qa.suse.de/test-status-sle12 , SLE15: http://s.qa.suse.de/test-status-sle15
* Only "blocker" or "shipstopper" bugs on "interesting products" for SLE: http://s.qa.suse.de/qa_sle_bugs_sle , SLE15: http://s.qa.suse.de/qa_sle_bugs_sle15_all , SLE12: http://s.qa/qa_sle_bugs_sle12_2

# QA tools - Team description

## Team responsibilities

* Develop and maintain upstream openQA
* Administer openqa.suse.de and its workers (but not the physical hardware, which belongs to the departments that purchased it; we merely facilitate)
* Help administer and maintain openqa.opensuse.org, including coordinating efforts to solve problems affecting o3
* Support colleagues, team members and the open source community

## Out of scope

* Maintenance of individual tests
* Maintenance of physical hardware
* Maintenance of special worker add-ons needed for tests, e.g. external hypervisor hosts for s390x or PowerVM
* Ticket triaging of http://progress.opensuse.org/projects/openqatests/
* Feature development within the backend for single teams (commonly provided by the teams themselves)

## How we work

The QA Tools team follows the DevOps model with a lightweight Agile approach. We plan and track our work using tickets on https://progress.opensuse.org . We pick tickets based on priority and planning decisions. We use weekly meetings as checkpoints for progress and also track cycle and lead times to cross-check progress against expectations.
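
Lead and cycle times can be spot-checked directly against the Redmine REST API of progress.opensuse.org. The following is only a minimal sketch, not an official tool: it assumes the API is enabled, that `curl`, `jq` and GNU `date` are available, and that the `openqav3` project identifier and filter parameters (e.g. `sort=closed_on:desc`) match this Redmine instance. The saved queries below remain the primary source; this is only for quick ad-hoc checks.

```
#!/bin/bash
# Rough lead-time check: list the 20 most recently closed openqav3 tickets
# and print the number of days between creation and closing.
host=https://progress.opensuse.org
curl -s "$host/issues.json?project_id=openqav3&status_id=closed&sort=closed_on:desc&limit=20" |
  jq -r '.issues[] | [.id, .created_on, .closed_on // empty] | @tsv' |
  while IFS=$'\t' read -r id created closed; do
    [ -z "$closed" ] && continue
    days=$(( ($(date -d "$closed" +%s) - $(date -d "$created" +%s)) / 86400 ))
    echo "#$id: lead time ${days}d"
  done
```
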
* [Ready Issues](https://progress.opensuse.org/projects/openqav3/issues?query_id=230)
* [What members of the tools team are working on](https://progress.opensuse.org/projects/openqav3/issues?query_id=400)

Custom queries for tickets and their plans can also be found in the right-hand sidebar of https://progress.opensuse.org/projects/openqav3/issues .

### Common tasks for team members

This is a list of common tasks that we follow, e.g. in daily reviews, structured along the individual steps of the DevOps process ![DevOps Process](devops-process_25p.png)

* **Plan**:
 * State daily learning and planned tasks in the internal chat room
 * Review the backlog for time-critical issues, triage new tickets, pick tickets from the backlog; see https://progress.opensuse.org/projects/qa/wiki#How-we-work-on-our-backlog
* **Code**:
 * See the project-specific contribution instructions
 * Provide peer review based on https://github.com/notifications for projects within the scope of https://github.com/os-autoinst/ with the exception of test code repositories, especially https://github.com/os-autoinst/openQA, https://github.com/os-autoinst/os-autoinst, https://github.com/os-autoinst/scripts, https://github.com/os-autoinst/os-autoinst-distri-openQA, https://github.com/os-autoinst/openqa-trigger-from-obs and https://github.com/os-autoinst/openqa_review
* **Build**:
 * See the project-specific contribution instructions
* **Test**:
 * Monitor failures on https://travis-ci.org/ relying on https://build.opensuse.org/package/show/devel:openQA/os-autoinst_dev for os-autoinst (email notifications)
 * Monitor failures on https://app.circleci.com/pipelines/github/os-autoinst/openQA?branch=master relying on https://build.opensuse.org/project/show/devel:openQA:ci for openQA (email notifications)
* **Release**:
 * By default we use the rolling-release model for all projects unless specified otherwise
 * Monitor https://build.opensuse.org/project/show/devel:openQA (all packages and all subprojects) for failures and ensure packages are published on http://download.opensuse.org/repositories/devel:/openQA/ (see the sketch after this list)
 * Monitor http://jenkins.qa.suse.de/view/openQA-in-openQA/ for the openQA-in-openQA tests and the automatic submissions of os-autoinst and openQA to openSUSE:Factory through https://build.opensuse.org/project/show/devel:openQA:tested
* **Deploy**:
 * o3 is automatically deployed (daily), see https://progress.opensuse.org/projects/openqav3/wiki/Wiki#Automatic-update-of-o3
 * osd is automatically deployed (weekly); monitor https://gitlab.suse.de/openqa/osd-deployment/pipelines and watch for the notification email to openqa@suse.de
* **Operate**:
 * Apply infrastructure changes from https://gitlab.suse.de/openqa/salt-states-openqa (osd) or manually over ssh (o3)
 * Monitor backups, see https://gitlab.suse.de/qa-sle/backup-server-salt
 * Handle config changes in salt (osd), backups and job group configuration changes
* **Monitor**:
 * React to alerts from https://stats.openqa-monitor.qa.suse.de/alerting/list?state=not_ok (emails to osd-admins@suse.de)
 * Look for incomplete jobs or scheduled jobs not being worked on, on o3 and osd (API or webUI; see the sketch after this list)
 * Be responsive on #opensuse-factory (irc://chat.freenode.net/opensuse-factory) for help, support and collaboration (unless you have a better solution, it is suggested to use [Element.io](https://app.element.io/#/room/%23freenode_%23opensuse-factory:matrix.org) for a sustainable presence)
 * Be responsive on [#testing](https://chat.suse.de/channel/testing) for help, support and collaboration
 * Be responsive on the mailing lists opensuse-factory@opensuse.org and openqa@suse.de
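
For the Release and Monitor steps referenced above, some of the routine checks can be scripted. This is a minimal sketch under the assumption that `osc`, `openqa-cli` and `jq` are installed and configured; the package names and API parameters are illustrative and may need adjusting:

```
#!/bin/bash
# Release: show OBS build results of the main packages in devel:openQA
for pkg in openQA os-autoinst; do
    osc results devel:openQA "$pkg"
done

# Monitor: look for incomplete jobs and piling-up scheduled jobs on o3 and osd
for host in https://openqa.opensuse.org https://openqa.suse.de; do
    echo "== $host =="
    openqa-cli api --host "$host" jobs state=done result=incomplete latest=1 |
        jq -r '.jobs[] | "\(.id) \(.test)"' | head
    echo -n "scheduled jobs: "
    openqa-cli api --host "$host" jobs state=scheduled | jq '.jobs | length'
done
```
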
### How we work on our backlog

* "Due dates" are only used as exceptions or reminders
* Every team member can pick up tickets themselves
* Everybody can set priority; the PO can help to resolve conflicts

#### WIP-limits (reference "Kanban development"):

* Global limit of 14 tickets "In Progress"
* Personal limit of 3 tickets "In Progress"

To check: open the [query](https://progress.opensuse.org/projects/openqav3/issues?utf8=%E2%9C%93&set_filter=1&type=IssueQuery&sort=id%3Adesc&f%5B%5D=status_id&op%5Bstatus_id%5D=%3D&v%5Bstatus_id%5D%5B%5D=2&f%5B%5D=assigned_to_id&op%5Bassigned_to_id%5D=%3D&v%5Bassigned_to_id%5D%5B%5D=32300&v%5Bassigned_to_id%5D%5B%5D=15&v%5Bassigned_to_id%5D%5B%5D=34361&v%5Bassigned_to_id%5D%5B%5D=23018&v%5Bassigned_to_id%5D%5B%5D=22072&v%5Bassigned_to_id%5D%5B%5D=24624&v%5Bassigned_to_id%5D%5B%5D=17668&v%5Bassigned_to_id%5D%5B%5D=33482&v%5Bassigned_to_id%5D%5B%5D=32669&f%5B%5D=subproject_id&op%5Bsubproject_id%5D=*&f%5B%5D=&c%5B%5D=subject&c%5B%5D=project&c%5B%5D=status&c%5B%5D=assigned_to&c%5B%5D=fixed_version&c%5B%5D=due_date&c%5B%5D=priority&c%5B%5D=updated_on&c%5B%5D=category&group_by=assigned_to&t%5B%5D=) and look at the total number of tickets as well as the number per person.
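
To cross-check these limits without opening the query by hand, the numbers can also be pulled from the Redmine REST API. A minimal sketch, assuming `curl` and `jq` are available and that status id 2 corresponds to "In Progress" as in the query linked above:

```
#!/bin/bash
# Count "In Progress" tickets per assignee (global and personal WIP check)
host=https://progress.opensuse.org
curl -s "$host/issues.json?project_id=openqav3&status_id=2&limit=100" |
  jq -r '.issues[] | .assigned_to.name // "unassigned"' | sort | uniq -c | sort -rn
```

Compare the total against the global limit of 14 and the per-person counts against the personal limit of 3.
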
#### Target numbers ("guideline", "should be") for priorities

1. New, untriaged: 0 (daily)
2. Workable (properly defined): 40
3. Overall backlog length: ideally less than 100

#### SLOs (service level objectives)

* For picking up tickets based on priority, the first goal is "urgency removal":
 * **immediate**: [<1 day](https://progress.opensuse.org/projects/openqav3/issues?utf8=%E2%9C%93&set_filter=1&f%5B%5D=priority_id&op%5Bpriority_id%5D=%3D&v%5Bpriority_id%5D%5B%5D=7&f%5B%5D=status_id&op%5Bstatus_id%5D=o&f%5B%5D=subproject_id&op%5Bsubproject_id%5D=%3D&v%5Bsubproject_id%5D%5B%5D=125&f%5B%5D=updated_on&op%5Bupdated_on%5D=%3Ct-&v%5Bupdated_on%5D%5B%5D=1&f%5B%5D=&c%5B%5D=subject&c%5B%5D=project&c%5B%5D=status&c%5B%5D=assigned_to&c%5B%5D=fixed_version&c%5B%5D=due_date&c%5B%5D=priority&c%5B%5D=updated_on&c%5B%5D=category&group_by=priority)
 * **urgent**: [<1 week](https://progress.opensuse.org/projects/openqav3/issues?utf8=%E2%9C%93&set_filter=1&f%5B%5D=priority_id&op%5Bpriority_id%5D=%3D&v%5Bpriority_id%5D%5B%5D=6&f%5B%5D=status_id&op%5Bstatus_id%5D=o&f%5B%5D=subproject_id&op%5Bsubproject_id%5D=%3D&v%5Bsubproject_id%5D%5B%5D=125&f%5B%5D=updated_on&op%5Bupdated_on%5D=%3Ct-&v%5Bupdated_on%5D%5B%5D=7&f%5B%5D=&c%5B%5D=subject&c%5B%5D=project&c%5B%5D=status&c%5B%5D=assigned_to&c%5B%5D=fixed_version&c%5B%5D=due_date&c%5B%5D=priority&c%5B%5D=updated_on&c%5B%5D=category&group_by=status)
 * **high**: [<1 month](https://progress.opensuse.org/projects/openqav3/issues?utf8=%E2%9C%93&set_filter=1&f%5B%5D=status_id&op%5Bstatus_id%5D=o&f%5B%5D=priority_id&op%5Bpriority_id%5D=%3D&v%5Bpriority_id%5D%5B%5D=5&f%5B%5D=subproject_id&op%5Bsubproject_id%5D=%3D&v%5Bsubproject_id%5D%5B%5D=125&f%5B%5D=updated_on&op%5Bupdated_on%5D=%3Ct-&v%5Bupdated_on%5D%5B%5D=30&f%5B%5D=&c%5B%5D=subject&c%5B%5D=project&c%5B%5D=status&c%5B%5D=assigned_to&c%5B%5D=fixed_version&c%5B%5D=due_date&c%5B%5D=priority&c%5B%5D=updated_on&c%5B%5D=category&group_by=status)
 * **normal**: [<1 year](https://progress.opensuse.org/projects/openqav3/issues?utf8=%E2%9C%93&set_filter=1&f%5B%5D=priority_id&op%5Bpriority_id%5D=%3D&v%5Bpriority_id%5D%5B%5D=4&f%5B%5D=status_id&op%5Bstatus_id%5D=o&f%5B%5D=subproject_id&op%5Bsubproject_id%5D=%3D&v%5Bsubproject_id%5D%5B%5D=125&f%5B%5D=updated_on&op%5Bupdated_on%5D=%3Ct-&v%5Bupdated_on%5D%5B%5D=365&f%5B%5D=&c%5B%5D=subject&c%5B%5D=project&c%5B%5D=status&c%5B%5D=assigned_to&c%5B%5D=fixed_version&c%5B%5D=due_date&c%5B%5D=priority&c%5B%5D=updated_on&c%5B%5D=category&group_by=status)
 * **low**: undefined

* Aim for a cycle time of individual tickets (not epics or sagas) of 1h-2w

### Historical

Previously the QA tools team used the target versions "Ready" (to be planned into individual milestone periods or sprints), "Current Sprint" and "Done". However, the team never really used proper time-limited sprints, so the distinction was rather vague. After tickets had been "Resolved" for some time, the PO or someone else would also update the target version to "Done" to signal that the result had been reviewed. This caused a lot of ticket-update noise for little value, considering that the [Definition of Done](https://progress.opensuse.org/projects/openqav3/wiki/#ticket-workflow), when properly followed, already has rather strict requirements on when something can be considered really "Resolved", so the team eventually decided not to use the "Done" target version anymore. Since about 2019-05 (and since okurz is doing more backlog management) the team uses priorities more, as well as the status "Workable" together with an explicit team member list for "What the team is working on", to better visualize what is making team members busy regardless of what was "officially" planned to be part of the team's work. So we closed that target version. On 2020-07-03 okurz subsequently closed "Current Sprint" as well, since in most cases it was equivalent to just picking an assignee for a ticket or setting it to "In Progress". We now just distinguish between "(no version)" meaning untriaged, "Ready" meaning the tools team should consider picking up these issues, and "future" meaning there is no plan for them to be picked up. Everything else is defined by status and priority.
# QA SLE Functional - Team description

**QSF (QA SLE Functional)** is a virtual team focusing on QA of the "functional" domain of the SUSE SLE products. The virtual team is mainly composed of members of [SUSE QA SLE Nbg](https://wiki.suse.net/index.php/SUSE-Quality_Assurance/Organization/Members_and_Responsibilities#QA_SLE_NBG_Team) as well as members from [SUSE QA SLE Prg](https://wiki.suse.net/index.php/SUSE-Quality_Assurance/Organization/Members_and_Responsibilities#QA_SLE_PRG_Team). The [SLE Department](https://wiki.suse.net/index.php/SUSE-Quality_Assurance/SLE_Department#QSF_.28QA_SLE_Functional.29) page describes our QA responsibilities. We focus on our automatic tests running in [openQA](https://openqa.suse.de) under the job groups "Functional" and "Autoyast" for the respective products, for example [SLE 15 / Functional](https://openqa.suse.de/group_overview/110) and [SLE 15 / Autoyast](https://openqa.suse.de/group_overview/129). We back our automatic tests with exploratory manual tests, especially for the product milestone builds. Additionally we care about the corresponding openSUSE openQA tests (see also https://openqa.opensuse.org).

* Long-term roadmap: http://s.qa.suse.de/qa-long-term
* Overview of current openQA SLE12SP5 tests with progress ticket references: https://openqa.suse.de/tests/overview?distri=sle&version=12-SP5&groupid=139&groupid=142
* Fate tickets for SLE12SP5 feature testing: http://s.qa.suse.de/qa_sle_functional_feature_tests_sle12sp5 , a report based on all tickets with milestone before SLE12SP5 GM; for SLE15SP1: http://s.qa.suse.de/qa_sle_functional_feature_tests_sle15sp1
* Only "blocker" or "shipstopper" bugs on "interesting products": http://s.qa.suse.de/qa_sle_functional_bug_query_sle15_2 for SLE15, http://s.qa/qa_sle_bugs_sle12_2 for SLE12
* A better organization of planned work can be seen in the [SUSE QA](https://progress.opensuse.org/projects/suseqa) project (which is not public)

## Test plan

When looking for coverage of certain components or use cases keep the [openQA glossary](http://open.qa/docs/#concept) in mind. It is important to understand that "tests in openQA" could mean a scenario, for example a "textmode installation run" or a combined multi-machine scenario such as "a remote ssh based installation using X-forwarding", or a test module, for example "vim", which checks that the vim editor is correctly installed, renders correctly and provides basic functionality. You are welcome to contact any member of the team to ask for clarification about this.

In detail, the following areas are tested as part of "SLE functional":

* different hardware setups (UEFI, ACPI)
* support for localization
* openSUSE: virtualization - some "virtualization" tests are active on o3 with a reduced set compared to SLE coverage (on behalf of QA SLE virtualization due to team capacity constraints, clarified in the QA SLE coordination meeting 2018-03-28)
* openSUSE: migration - comparable to "virtualization", a reduced set compared to SLE coverage is active on o3 (on behalf of QA SLE migration due to team capacity constraints, clarified in the QA SLE coordination meeting 2018-04)

### QSF-y

This virtual team focuses on testing YaST components, including the installer and snapper.

A detailed test plan for SLES can be found here: [SLES_Integration_Level_Testplan.md](https://gitlab.suse.de/qsf-y/qa-sle-functional-y/blob/master/SLES_Integration_Level_Testplan.md)

* Latest report based on openQA test results, SLE12: http://s.qa.suse.de/test-status-sle12-yast , SLE15: http://s.qa.suse.de/test-status-sle15-yast

### QSF-u

"Testing is the future, and the future starts with you"

* basic operations (firefox, zypper, logout/reboot/shutdown)
* boot_to_snapshot
* functional application tests (kdump, gpg, ipv6, java, git, openssl, openvswitch, VNC)
* NIS (server, client)
* toolchain (development module)
* systemd
* "transactional-updates" as part of the corresponding SLE server role, not CaaSP

* Latest report based on openQA test results, SLE12: http://s.qa.suse.de/test-status-sle12-functional , SLE15: http://s.qa.suse.de/test-status-sle15-functional

## Explicitly not covered by QSF

* Quarterly updated media: expected to be covered by Maintenance + QAM
## What we do

We collected opinions, personal experiences and preferences starting with the following four topics: what the fun tasks are ("new tests", "collaborate", "do it right"), what parts are annoying ("old & sporadic issues"), what we think is expected from QSF-u ("be quick", "keep stuff running", "assess quality") and what we should definitely keep doing to prevent stakeholders from becoming disappointed ("build validation", "communication & support").

### How we work on our backlog

* No "due dates"
* We pick up tickets that have not been previously discussed
* More flexible choice of tickets
* WIP-limits:
 * Global limit of 10 tickets "In Progress"

* Target numbers ("guideline", "should be") for priorities:
 1. New, untriaged: 0
 2. Workable: 40
 3. New, assigned to [u]: ideally less than 200 (should not stop you from triaging)

* SLAs for priority tickets - how do we ensure that more urgent tickets are worked on?
 * "taken": <1d for immediate -> looking daily
 * 2-3d for urgent
 * First goal is "urgency removal": <1d for immediate, 1w for urgent

* Our current "cycle time" is 1h - 1y (maximum, with interruptions)

* Everybody should set priority + milestone in obvious cases, e.g. new reproducible test failures in multiple critical scenarios; in the general case the PO decides

### How we like to choose our battles

We self-assessed our tasks on a scale from "administrative" to "creative" and found, in the following descending order: daily test review (very "administrative"), ticket triaging, milestone validation, code review, creating needles, infrastructure issues, fixing and cleaning up tests, finding bugs while fixing failing tests, finding bugs while designing new tests, new automated tests (very "creative"). We appreciate it if our work has a fair share of both sides. A good ratio is probably 60% creative plus 40% administrative tasks. Both types have their advantages and we should try to keep a healthy balance.

### What "product(s)" do we (really) *care* about?

Brainstorming results:

* openSUSE Krypton -> a good example of something that we only remotely care about or not at all, even though we see the connection points, e.g. testing plasma changes early before they reach TW or Leap as operating systems we rely on, or SLE+PackageHub, from which SUSE does not receive direct revenue but indirect benefit. Should be "community only", which includes members from QSF though
* openQA -> (like OBS), helps to provide ROI for SUSE
* SLE(S) (in development versions)
* Tumbleweed
* Leap, because we use it
* SLES HA
* SLE migration
* os-autoinst-distri-opensuse+backend+needles

From this list strictly no "product" gives us direct revenue; however, most likely SLE(S) (as well as SLES HA and SLE migration) are good examples of a direct connection to revenue (based on SLE subscriptions). A poll in the team revealed that 3 persons see "SLE(S)" as our main product and 3 see "os-autoinst-distri-opensuse+backend+needles" as the main product. We mostly agreed, however, that we cannot *own* a product like "SLE" because that product is mainly not under our control.

Visualizing "cost of testing" vs. "risk of business impact" showed that both metrics have an inverse dependency: on a range from "upstream source code" over "package self-tests", "openSUSE Factory staging" and "Tumbleweed" to "SLE", we consider SLE to have the highest business risk attached, which therefore defines our priority; however, testing at the upstream source level is considered most effective to prevent the higher cost of bugs or issues. Our conclusion is that we must ensure that the high-risk SLE base has its quality assured while supporting a quality assurance process as early as possible in the development process. Package self-tests as well as the openQA staging tests are seen as useful approaches in that direction, as are "domain specific specialist QA engineers" working closely together with the corresponding in-house development parties.
## Documentation

This documentation should only be interesting for the team QA SLE functional. If you find that some of the following topics are interesting for other people, please extract those topics to another wiki section.

### QA SLE functional Dashboards

In room 3.2.15 of the Nuremberg office there are two dedicated laptops, each with a monitor attached, showing a selected overview of openQA test results with important builds from SLE and openSUSE.
These laptops are configured with a root account using the default password for production machines. First point of contact: [slindomansilla@suse.com](mailto:slindomansilla@suse.com), [okurz@suse.de](mailto:okurz@suse.de)

* `dashboard-osd-3215.suse.de`: shows the current view of openqa.suse.de filtered to some job group results, e.g. "Functional"
* `dashboard-o3-3215.suse.de`: shows the current view of openqa.opensuse.org filtered to the job group results that we took responsibility to review and are mostly interested in

### dashboard-osd-3215

* OS: openSUSE Tumbleweed
* Services: ssh, mosh, vnc, x2x
* Users:
 * root
 * dashboard
* VNC: `vncviewer dashboard-osd-3215`
* X2X: `ssh -XC dashboard@dashboard-osd-3215 x2x -west -to :0.0`
 * (attaches the dashboard monitor as an extra display to the left of your screens; move the mouse over and the attached X11 server will capture mouse and keyboard)

#### Content of /home/dashboard/.xinitrc

```
#
# Source common code shared between the
# X session and X init scripts
#
. /etc/X11/xinit/xinitrc.common

xset -dpms
xset s off
xset s noblank
[...]
#
# Add your own lines here...
#
$HOME/bin/osd_dashboard &
```
#### Content of /home/dashboard/bin/osd_dashboard

```
#!/bin/bash

# Hide the mouse cursor on the dashboard display
DISPLAY=:0 unclutter &

# Disable power management and screen blanking
DISPLAY=:0 xset -dpms
DISPLAY=:0 xset s off
DISPLAY=:0 xset s noblank

# Show the relevant job group overview in kiosk mode
url="${url:-"https://openqa.suse.de/?group=SLE+15+%2F+%28Functional%7CAutoyast%29&default_expanded=1&limit_builds=3&time_limit_days=14&show_tags=1&fullscreen=1#"}"
DISPLAY=:0 chromium --kiosk "$url"
```

#### Cron job:

```
Min     H       DoM     Mo      DoW     Command
*	*	*	*	*	/home/dashboard/bin/reload_chromium
```
#### Content of /home/dashboard/bin/reload_chromium

```
#!/bin/bash

# Keep the display awake
DISPLAY=:0 xset -dpms
DISPLAY=:0 xset s off
DISPLAY=:0 xset s noblank

# Bring the Chromium window to the front and reload the page with F5
DISPLAY=:0 xdotool windowactivate $(DISPLAY=:0 xdotool search --class Chromium)
DISPLAY=:0 xdotool key F5
DISPLAY=:0 xdotool windowactivate $(DISPLAY=:0 xdotool getactivewindow)
```

#### Issues:

* *When the screen shows a different part of the web page*
 * A simple mouse scroll through vnc or x2x may suffice.
* *When the builds displayed freeze without showing a new build, it usually means that the browser displaying the info on the screen (chromium, previously midori) crashed.*
 * You can try to restart the browser this way (see the sketch below):
  * `ps aux | grep chromium`
  * `kill $pid`
  * `/home/dashboard/bin/osd_dashboard`
 * If this also doesn't work, restart the machine.
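
For the chromium-based setup above, the restart boils down to killing the kiosk browser and re-running the dashboard script. A minimal sketch, assuming you are logged in as the dashboard user on the laptop and using the osd_dashboard script shown above:

```
#!/bin/bash
# Kill the kiosk browser and relaunch the dashboard on display :0
pkill -f 'chromium --kiosk'
sleep 2
/home/dashboard/bin/osd_dashboard &
```
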
### dashboard-o3

* Raspberry Pi 3B+
* IP: `10.160.65.207`

#### Content of /home/tux/.xinitrc

```
#!/bin/bash

unclutter &
openbox &
xset s off
xset -dpms
sleep 5
url="https://openqa.opensuse.org?group=openSUSE Tumbleweed\$|openSUSE Leap [0-9]{2}.?[0-9]*\$|openSUSE Leap.\*JeOS\$|openSUSE Krypton|openQA|GNOME Next&limit_builds=2&time_limit_days=14&&show_tags=1&fullscreen=1#build-results"
chromium --kiosk "$url" &

while sleep 300 ; do
        xdotool windowactivate $(xdotool search --class Chromium)
        xdotool key F5
        xdotool windowactivate $(xdotool getactivewindow)
done
```

#### Content of /usr/share/lightdm/lightdm.conf.d/50-suse-defaults.conf

```
[Seat:*]
pam-service = lightdm
pam-autologin-service = lightdm-autologin
pam-greeter-service = lightdm-greeter
xserver-command=/usr/bin/X
session-wrapper=/etc/X11/xdm/Xsession
greeter-setup-script=/etc/X11/xdm/Xsetup
session-setup-script=/etc/X11/xdm/Xstartup
session-cleanup-script=/etc/X11/xdm/Xreset
autologin-user=tux
autologin-timeout=0
```