openSUSE Project Management Tool: Issues
https://progress.opensuse.org/ (2022-03-31T11:13:01Z)
openQA Project - action #109310 (Resolved): qem-bot/dashboard - mixed old and new incidents size:M
https://progress.opensuse.org/issues/109310 (2022-03-31T11:13:01Z, osukup)
<a name="Observation"></a>
<h2 >Observation<a href="#Observation" class="wiki-anchor">¶</a></h2>
<p>Maintenance sometimes re-uses old incidents instead of creating new ones for a package, which leads to mixed results in the dashboard :(</p>
<p>see: <a href="https://suse.slack.com/archives/C02D16TCP99/p1648721562205869">https://suse.slack.com/archives/C02D16TCP99/p1648721562205869</a></p>
<p>So we need a workaround or solution for this corner case.</p>
<p>See also <a href="https://github.com/openSUSE/qem-dashboard/issues/61">https://github.com/openSUSE/qem-dashboard/issues/61</a></p>
<p>Originally brought up by coolo in<br>
<a href="https://suse.slack.com/archives/C02D16TCP99/p1638283633141300">https://suse.slack.com/archives/C02D16TCP99/p1638283633141300</a> </p>
<blockquote>
<p>I just noticed a rather alarming issue: <a href="http://dashboard.qam.suse.de/incident/20989">http://dashboard.qam.suse.de/incident/20989</a> talks about 43 passed, 1 failed jobs for the incident</p>
</blockquote>
<a name="Problems"></a>
<h2 >Problems<a href="#Problems" class="wiki-anchor">¶</a></h2>
<ul>
<li><a href="http://dashboard.qam.suse.de/incident/20639">http://dashboard.qam.suse.de/incident/20639</a> references "208 passed, 4 failed, 12 stopped" and a link to openQA results <a href="https://openqa.suse.de/tests/overview?build=%3A20639%3Aopensc">https://openqa.suse.de/tests/overview?build=%3A20639%3Aopensc</a> but the openQA test results only show 183 passed and 18 soft-failed
<ul>
<li>-> the dashboard should say "ok" rather than "passed" when it means "passed+softfailed", see <a href="https://github.com/os-autoinst/openQA/blob/master/lib/OpenQA/Jobs/Constants.pm#L76=">https://github.com/os-autoinst/openQA/blob/master/lib/OpenQA/Jobs/Constants.pm#L76=</a></li>
<li>-> Consider using time-fixed links, e.g. <a href="https://openqa.suse.de/tests/overview?build=%3A20639%3Aopensc&t=2022-04-01+08%3A53%3A19+%2B0000">https://openqa.suse.de/tests/overview?build=%3A20639%3Aopensc&t=2022-04-01+08%3A53%3A19+%2B0000</a></li>
<li>-> Ensure that the results are current and correspond to what openQA sees itself (numbers should match)</li>
<li>-> Exclude any results that are outside a "reasonable time range", e.g. <a href="http://dashboard.qam.suse.de/blocked">http://dashboard.qam.suse.de/blocked</a> for 20639 shows incident results from some months ago, build 2021…</li>
</ul></li>
</ul>
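<p>For reference, a minimal sketch of counting jobs the way openQA itself groups them, assuming job dicts with a <code>result</code> field as returned by the openQA API; the "ok" bucket follows Constants.pm, the other buckets here are simplified for illustration:</p>

```python
# "ok" covers passed and softfailed, per lib/OpenQA/Jobs/Constants.pm;
# the remaining grouping below is a simplification, not the exact openQA one.
OK_RESULTS = {"passed", "softfailed"}
NOT_OK_RESULTS = {"failed", "incomplete", "timeout_exceeded"}

def summarize(jobs):
    counts = {"ok": 0, "failed": 0, "stopped": 0}
    for job in jobs:
        result = job.get("result", "none")
        if result in OK_RESULTS:
            counts["ok"] += 1
        elif result in NOT_OK_RESULTS:
            counts["failed"] += 1
        else:
            counts["stopped"] += 1
    return counts
```

<p>With this grouping, 183 passed plus 18 softfailed jobs would be reported as "201 ok" instead of an inflated "passed" count.</p>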
<a name="Acceptance-criteria"></a>
<h2 >Acceptance criteria<a href="#Acceptance-criteria" class="wiki-anchor">¶</a></h2>
<ul>
<li><strong>AC1:</strong> It is possible to reuse incidents and qem-bot can still approve related release requests</li>
</ul>
<a name="Suggestions"></a>
<h2 >Suggestions<a href="#Suggestions" class="wiki-anchor">¶</a></h2>
<ul>
<li>Read the qem-dashboard schema to understand where important settings are stored in <a href="https://github.com/openSUSE/qem-dashboard/">https://github.com/openSUSE/qem-dashboard/</a> , in particular <a href="https://github.com/openSUSE/qem-dashboard/blob/main/migrations/dashboard.sql">https://github.com/openSUSE/qem-dashboard/blob/main/migrations/dashboard.sql</a></li>
<li>Read the manual process described under "Workarounds" (further down) to understand the proper procedure</li>
<li>Just delete all aggregate openQA data in qem-dashboard older than a configurable age (default: 90 days)</li>
</ul>
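<p>The retention idea from the last suggestion could look roughly like this; sqlite3 stands in for the real Postgres instance, the <code>openqa_jobs</code>/<code>updated</code> names follow the issue text, everything else is illustrative:</p>

```python
import sqlite3

def purge_old_jobs(db, days=90):
    """Delete aggregate openQA job rows older than `days` (default 90).

    Sketch only: the real dashboard uses Postgres, where the condition
    would be `updated < NOW() - make_interval(days => %s)` or similar.
    """
    cur = db.execute(
        "DELETE FROM openqa_jobs WHERE updated < datetime('now', ?)",
        (f"-{days} days",),
    )
    db.commit()
    return cur.rowcount
```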
<a name="Workarounds"></a>
<h2 >Workarounds<a href="#Workarounds" class="wiki-anchor">¶</a></h2>
<ul>
<li>Ask maintenance to create a new, fresh incident, e.g. by a comment in IBS</li>
<li>Detect invalid requests, e.g. with outdated results, and reject them</li>
<li>Manually delete the affected data from the dashboard database</li>
</ul>
<p>Something along the lines of</p>
<pre><code>ssh root@qam2.suse.de
machinectl shell postgresql
sudo -u postgres psql dashboard_db
(wreak havok in here)
SELECT update_settings FROM openqa_jobs WHERE update_settings is not NULL AND updated < NOW() - INTERVAL X
(store update_settings)
DELETE FROM openqa_jobs WHERE update_settings is not NULL AND updated < NOW() - INTERVAL X
DELETE FROM update_openqa_settings WHERE id in `stored update_settings`
</code></pre>
openQA Infrastructure - action #109301 (Rejected): openqaworker14 + openqaworker15 sporadically g...
https://progress.opensuse.org/issues/109301 (2022-03-31T09:07:53Z, osukup)
<a name="OBSERVATION"></a>
<h2 >OBSERVATION<a href="#OBSERVATION" class="wiki-anchor">¶</a></h2>
<p>From time to time these workers fail to boot correctly on reboot, ending up in emergency mode:</p>
<pre><code>bře 08 14:34:24 openqaworker14 kernel: Loading iSCSI transport class v2.0-870.
bře 08 14:34:24 openqaworker14 systemd[1]: Finished Create Volatile Files and Directories.
bře 08 14:34:24 openqaworker14 systemd[1]: Starting Security Auditing Service...
bře 08 14:34:24 openqaworker14 openqa-establish-nvme-setup[1557]: NAME MAJ:MIN RM SIZE RO TYPE MOUNTPOINT
bře 08 14:34:24 openqaworker14 openqa-establish-nvme-setup[1557]: nvme0n1 259:0 0 3.5T 0 disk
bře 08 14:34:24 openqaworker14 openqa-establish-nvme-setup[1557]: ├─nvme0n1p1 259:1 0 512M 0 part
bře 08 14:34:24 openqaworker14 openqa-establish-nvme-setup[1557]: ├─nvme0n1p2 259:2 0 1T 0 part /
bře 08 14:34:24 openqaworker14 openqa-establish-nvme-setup[1557]: └─nvme0n1p3 259:3 0 2.5T 0 part
bře 08 14:34:24 openqaworker14 openqa-establish-nvme-setup[1557]: └─md127 9:127 0 2.5T 0 raid0
bře 08 14:34:24 openqaworker14 openqa-establish-nvme-setup[1552]: Stopping current RAID "/dev/md/openqa"
bře 08 14:34:24 openqaworker14 systemd[1]: Finished Flush Journal to Persistent Storage.
bře 08 14:34:24 openqaworker14 kernel: i40iw_open: i40iw_open completed
bře 08 14:34:24 openqaworker14 systemd[1]: Created slice Slice /system/rdma-load-modules.
bře 08 14:34:24 openqaworker14 systemd[1]: Starting Load RDMA modules from /etc/rdma/modules/iwarp.conf...
bře 08 14:34:24 openqaworker14 systemd[1]: Starting Load RDMA modules from /etc/rdma/modules/rdma.conf...
bře 08 14:34:24 openqaworker14 kernel: ixgbe 0000:d8:00.1: Multiqueue Enabled: Rx Queue count = 63, Tx Queue count = 63 XDP Queue count = 0
bře 08 14:34:24 openqaworker14 systemd[1]: Finished Load RDMA modules from /etc/rdma/modules/iwarp.conf.
bře 08 14:34:24 openqaworker14 openqa-establish-nvme-setup[1559]: mdadm: stopped /dev/md/openqa
bře 08 14:34:24 openqaworker14 openqa-establish-nvme-setup[1552]: Creating RAID0 "/dev/md/openqa" on: /dev/nvme0n1p3
bře 08 14:34:24 openqaworker14 openqa-establish-nvme-setup[1574]: mdadm: /dev/nvme0n1p3 appears to be part of a raid array:
bře 08 14:34:24 openqaworker14 openqa-establish-nvme-setup[1574]: level=raid0 devices=1 ctime=Mon Mar 7 10:20:52 2022
bře 08 14:34:24 openqaworker14 openqa-establish-nvme-setup[1574]: mdadm: unexpected failure opening /dev/md127
bře 08 14:34:24 openqaworker14 openqa-establish-nvme-setup[1552]: Unable to create RAID, mdadm returned with non-zero code
bře 08 14:34:24 openqaworker14 kernel: i40iw_open: i40iw_open completed
bře 08 14:34:24 openqaworker14 systemd[1]: openqa_nvme_format.service: Main process exited, code=exited, status=1/FAILURE
bře 08 14:34:24 openqaworker14 systemd[1]: openqa_nvme_format.service: Failed with result 'exit-code'.
bře 08 14:34:24 openqaworker14 systemd[1]: Failed to start Setup NVMe before mounting it.
bře 08 14:34:24 openqaworker14 systemd[1]: Dependency failed for /var/lib/openqa.
bře 08 14:34:24 openqaworker14 systemd[1]: Dependency failed for openQA Worker #1.
bře 08 14:34:24 openqaworker14 systemd[1]: openqa-worker-auto-restart@1.service: Job openqa-worker-auto-restart@1.service/start failed with result 'dependency'.
bře 08 14:34:24 openqaworker14 systemd[1]: Dependency failed for var-lib-openqa-share.automount.
bře 08 14:34:24 openqaworker14 systemd[1]: var-lib-openqa-share.automount: Job var-lib-openqa-share.automount/start failed with result 'dependency'.
bře 08 14:34:24 openqaworker14 systemd[1]: Dependency failed for openQA Worker #3.
bře 08 14:34:24 openqaworker14 systemd[1]: openqa-worker-auto-restart@3.service: Job openqa-worker-auto-restart@3.service/start failed with result 'dependency'.
bře 08 14:34:24 openqaworker14 systemd[1]: Dependency failed for Prepare NVMe after mounting it.
bře 08 14:34:24 openqaworker14 systemd[1]: openqa_nvme_prepare.service: Job openqa_nvme_prepare.service/start failed with result 'dependency'.
bře 08 14:34:24 openqaworker14 systemd[1]: Dependency failed for Local File Systems.
bře 08 14:34:24 openqaworker14 systemd[1]: local-fs.target: Job local-fs.target/start failed with result 'dependency'.
bře 08 14:34:24 openqaworker14 systemd[1]: local-fs.target: Triggering OnFailure= dependencies.
bře 08 14:34:24 openqaworker14 systemd[1]: Dependency failed for openQA Worker #2.
bře 08 14:34:24 openqaworker14 systemd[1]: openqa-worker-auto-restart@2.service: Job openqa-worker-auto-restart@2.service/start failed with result 'dependency'.
bře 08 14:34:24 openqaworker14 systemd[1]: Dependency failed for openQA Worker #4.
bře 08 14:34:24 openqaworker14 systemd[1]: openqa-worker-auto-restart@4.service: Job openqa-worker-auto-restart@4.service/start failed with result 'dependency'.
bře 08 14:34:24 openqaworker14 systemd[1]: var-lib-openqa.mount: Job var-lib-openqa.mount/start failed with result 'dependency'.
</code></pre>
<p>The cause of the problem is probably the different hardware configuration of these workers. Our standard workers have 1x HDD with the OS and 1x NVMe SSD with /dev/md/openqa; these workers have only one NVMe SSD,<br>
configured as:</p>
<pre><code>nvme0n1
├─nvme0n1p1 vfat FAT32 9AED-277B 506M 1% /boot/efi
├─nvme0n1p2 btrfs 5a405f4e-bd0c-46cb-a5ee-a0e976968be1 1016,5G 1% /
└─nvme0n1p3 linux_raid_member 1.2 openqaworker14:openqa 03972fdb-874d-cbec-4cb8-bca5412d90a2
└─md127 ext2 1.0 4c30279b-d757-4a97-b636-539b18bc9e22 2,3T 0% /var/lib/openqa
</code></pre>
openQA Project - action #107497 (Resolved): [qe-tools] openqaworker14 (and openqa15) - developer...
https://progress.opensuse.org/issues/107497 (2022-02-24T08:24:28Z, osukup)
<a name="Observation"></a>
<h2 >Observation<a href="#Observation" class="wiki-anchor">¶</a></h2>
<p>In the live view, developer mode fails and reports a failure every 2 seconds.</p>
<p>From the browser JavaScript console:</p>
<pre><code>Establishing ws connection to wss://openqa.suse.de/liveviewhandler/tests/8210859/developer/ws-proxy/status
Received message via ws proxy: {"data":null,"type":"info","what":"connecting to os-autoinst command server at ws:\/\/10.100.96.68:20043\/OX0bG6w17OkxmWt_\/ws"}
Received message via ws proxy: {"data":null,"type":"error","what":"unable to upgrade ws to command server"}
Error from ws proxy: unable to upgrade ws to command server
Connection to livehandler lost
</code></pre>
<p>The connection between OSD and the workers works without problems, and the IP address of the worker is also correct.</p>
<a name="Acceptance-criteria"></a>
<h2 >Acceptance criteria<a href="#Acceptance-criteria" class="wiki-anchor">¶</a></h2>
<ul>
<li><strong>AC1:</strong> Developer mode on openqaworker14.qa.suse.cz works reliably as part of the productive OSD infrastructure</li>
</ul>
<a name="Suggestions"></a>
<h2 >Suggestions<a href="#Suggestions" class="wiki-anchor">¶</a></h2>
<ul>
<li>Follow <a href="https://open.qa/docs/#debugdevelmode" class="external">https://open.qa/docs/#debugdevelmode</a></li>
<li><em>not</em> likely related to known issues with network performance</li>
</ul>
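<p>A quick way to verify that the livehandler host can reach the worker's os-autoinst command server at all (host and port taken from the log above) is a plain TCP probe; a sketch:</p>

```python
import socket

def can_connect(host, port, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.settimeout(timeout)
        return sock.connect_ex((host, port)) == 0

# run on the openQA web UI host, e.g.:
# can_connect("10.100.96.68", 20043)
```

<p>If this fails while worker-to-OSD traffic works, the likely cause is something blocking the reverse direction on the per-job command server ports.</p>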
openQA Infrastructure - action #106594 (Resolved): [tools] openqaworker-arm-3 periodically fails ...
https://progress.opensuse.org/issues/106594 (2022-02-10T11:36:16Z, osukup)
<a name="Observation"></a>
<h2 >Observation<a href="#Observation" class="wiki-anchor">¶</a></h2>
<p>from journalctl -xe -u os-autoinst-openvswitch</p>
<pre><code>úno 09 21:56:21 openqaworker-arm-3 os-autoinst-openvswitch[2924]: Waiting for IP on bridge 'br1', 300s left ...
úno 09 21:56:22 openqaworker-arm-3 os-autoinst-openvswitch[2924]: Waiting for IP on bridge 'br1', 299s left ...
....
úno 09 22:01:20 openqaworker-arm-3 os-autoinst-openvswitch[2924]: Waiting for IP on bridge 'br1', 3s left ...
úno 09 22:01:21 openqaworker-arm-3 os-autoinst-openvswitch[2924]: Waiting for IP on bridge 'br1', 2s left ...
úno 09 22:01:22 openqaworker-arm-3 os-autoinst-openvswitch[2924]: can't parse bridge local port IP at /usr/lib/os-autoinst/os-autoinst-openvswitch line 43.
úno 09 22:01:22 openqaworker-arm-3 os-autoinst-openvswitch[2924]: Waiting for IP on bridge 'br1', 1s left ...
úno 09 22:01:22 openqaworker-arm-3 systemd[1]: os-autoinst-openvswitch.service: Main process exited, code=exited, status=255/EXCEPTION
</code></pre>
<p>The default timeout is 60 seconds; on openqaworker-arm-3 it is now 5 minutes, but even that isn't enough after a system reboot.</p>
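<p>The wait loop itself (implemented in Perl in os-autoinst-openvswitch) boils down to polling with a deadline; a generic sketch, where the probe would check for an IPv4 address on br1:</p>

```python
import time

def wait_for(probe, timeout=300, interval=1.0):
    """Poll probe() until it returns a truthy value; raise on timeout.

    For the br1 case, probe would parse e.g. `ip -4 addr show br1`
    and return the address once one is assigned.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        result = probe()
        if result:
            return result
        time.sleep(interval)
    raise TimeoutError(f"no result within {timeout}s")
```

<p>Raising the timeout only helps if the bridge eventually gets an IP; if the network setup never assigns one after reboot, the service fails regardless of the value.</p>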
<a name="Rollback-steps"></a>
<h2 >Rollback steps<a href="#Rollback-steps" class="wiki-anchor">¶</a></h2>
<ul>
<li>Unpause alert "Failed systemd services alert (except openqa.suse.de)"</li>
</ul>
openQA Infrastructure - action #106365 (Resolved): Improve security for OSD worker credentials br...
https://progress.opensuse.org/issues/106365 (2022-02-09T10:25:15Z, osukup)
<a name="Motivation"></a>
<h2 >Motivation<a href="#Motivation" class="wiki-anchor">¶</a></h2>
<p><a href="https://progress.opensuse.org/issues/105405" class="external">https://progress.opensuse.org/issues/105405</a> .. changed visibility of salt-pillars-openqa broke <code>deploy</code> stage of CI</p>
<a name="Acceptance-criteria"></a>
<h2 >Acceptance criteria<a href="#Acceptance-criteria" class="wiki-anchor">¶</a></h2>
<ul>
<li><strong>AC1</strong>: Working salt-states+salt-pillars pipelines in gitlab</li>
<li><strong>AC2:</strong> salt-pillars repo stays non-public</li>
</ul>
<a name="Suggestions"></a>
<h2 >Suggestions<a href="#Suggestions" class="wiki-anchor">¶</a></h2>
<ul>
<li>Try out deploy tokens on OSD to fetch the git repo</li>
</ul>
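<p>Deploy tokens are passed as HTTP basic auth embedded in the clone URL; a sketch of building the command (the token username and value below are placeholders):</p>

```python
def clone_with_deploy_token(repo_path, username, token, host="gitlab.suse.de"):
    """Build a git command that clones a private repo via a GitLab deploy token.

    The token is embedded as basic auth in the HTTPS clone URL.
    """
    url = f"https://{username}:{token}@{host}/{repo_path}.git"
    return ["git", "clone", url]

# e.g. in CI: subprocess.run(clone_with_deploy_token(
#     "openqa/salt-pillars-openqa", "gitlab+deploy-token-1", "<token>"), check=True)
```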
openQA Infrastructure - action #106035 (Rejected): [qe-tools] dehydrated service fails on osd
https://progress.opensuse.org/issues/106035 (2022-02-07T08:09:49Z, osukup)
<p>OSD has systemd in a degraded state because the system service dehydrated ends up in a failed state:</p>
<pre><code>dehydrated.service - Certificate Update Runner for Dehydrated
Loaded: loaded (/usr/lib/systemd/system/dehydrated.service; static)
Active: failed (Result: exit-code) since Mon 2022-02-07 09:03:35 CET; 4min 58s ago
TriggeredBy: ● dehydrated.timer
Process: 26947 ExecStart=/usr/bin/dehydrated --cron (code=exited, status=1/FAILURE)
Main PID: 26947 (code=exited, status=1/FAILURE)
Feb 07 09:03:34 openqa systemd[1]: Starting Certificate Update Runner for Dehydrated...
Feb 07 09:03:34 openqa dehydrated[26947]: # INFO: Using main config file /etc/dehydrated/config
Feb 07 09:03:34 openqa dehydrated[26947]: # INFO: Using additional config file /etc/dehydrated/config.d/suse-ca.sh
Feb 07 09:03:34 openqa dehydrated[26947]: # INFO: Running /usr/bin/dehydrated as dehydrated/dehydrated
Feb 07 09:03:34 openqa sudo[26947]: root : PWD=/ ; USER=dehydrated ; GROUP=dehydrated ; COMMAND=/usr/bin/dehydrated --cron
Feb 07 09:03:35 openqa dehydrated[27267]: {}
Feb 07 09:03:35 openqa systemd[1]: dehydrated.service: Main process exited, code=exited, status=1/FAILURE
Feb 07 09:03:35 openqa systemd[1]: dehydrated.service: Failed with result 'exit-code'.
Feb 07 09:03:35 openqa systemd[1]: Failed to start Certificate Update Runner for Dehydrated.
</code></pre>
openQA Project - action #101779 (Resolved): osd deployment failed with non-zero status openqa-wor...
https://progress.opensuse.org/issues/101779 (2021-11-01T08:52:29Z, osukup)
<p>zypper reports that a reboot is needed and therefore exits with a non-zero status (the 10x exit codes are informational and can safely be treated as zero).</p>
<p><a href="https://gitlab.suse.de/openqa/osd-deployment/-/jobs/669420" class="external">https://gitlab.suse.de/openqa/osd-deployment/-/jobs/669420</a></p>
<p><a href="https://gitlab.suse.de/openqa/osd-deployment/-/blob/master/.gitlab-ci.yml#L188" class="external">https://gitlab.suse.de/openqa/osd-deployment/-/blob/master/.gitlab-ci.yml#L188</a></p>
openQA Project - action #101478 (Resolved): openqa-review pipeline failed because details-* JSON ...
https://progress.opensuse.org/issues/101478 (2021-10-26T07:14:02Z, osukup)
<p><a href="https://gitlab.suse.de/openqa/openqa-review/-/jobs/658778" class="external">https://gitlab.suse.de/openqa/openqa-review/-/jobs/658778</a></p>
<pre><code>+ /usr/bin/openqa-review --host https://openqa.suse.de -n -r -T --query-issue-status --no-empty-sections --include-softfails --running-threshold=2 --exclude-job-groups '^(Released|Development|old|EOL)' --reminder-comment-on-issues --save --save-dir /tmp/tmp.UXQt70yio3 --skip-passed
WARNING:openqa_review.browser:Unable to decode JSON for [{"build_version_sort":0,"carry_over_bugrefs":1,"default_keep_important_logs_in_days":"90","default_keep_important_results_in_days":"0","default_keep_logs_in_days":"10","default_keep_results_in_days":"21","default_priority":40,"description":"https:\/\/confluence.suse.com\/pages\/viewpage.action?pageId=723878219\r\n\r\nResponsible persons: Stefan Barth <stephan.barth@suse.com>, Heiko Rommel <heiko.rommel@suse.com>","exclusively_kept_asset_size":0,"id":36,"name":"Maintenance: On Submission","size_limit_gb":null,"sort_order":6},{"build_version_sort":0,"carry_over_bugrefs":1,"default_keep_important_logs_in_days":90,"default_keep_important_results_in_days":0,"default_keep_logs_in_days":10,"default_keep_results_in_days":70,"default_priority":50,"description":"","exclusively_kept_asset_size":0,"id":8,"name":"Maintenance: Single Incidents","size_limit_gb":null,"sort_order":7},{"build_version_sort":0,"carry_over_bugrefs":1,"default_keep_important_logs_in_days":120,"default_keep_important_results_in_days":0,"default_keep_logs_in_days":30,"default_keep_results_in_days":90,"default_priority":50,"description":"","exclusively_kept_asset_size":0,"id":1,"name":"Released","size_limit_gb":null,"sort_order":14},{"build_version_sort":1,"carry_over_bugrefs":1,"default_keep_important_logs_in_days":"90","default_keep_important_results_in_days":"0","default_keep_logs_in_days":"10","default_keep_results_in_days":"21","default_priority":50,"description":"","exclusively_kept_asset_size":0,"id":32,"name":"SLE 
Micro","size_limit_gb":null,"sort_order":4},{"build_version_sort":1,"carry_over_bugrefs":1,"default_keep_important_logs_in_days":"90","default_keep_important_results_in_days":"0","default_keep_logs_in_days":"10","default_keep_results_in_days":"21","default_priority":50,"description":null,"exclusively_kept_asset_size":0,"id":33,"name":"WSL","size_limit_gb":null,"sort_order":5},{"build_version_sort":0,"carry_over_bugrefs":1,"default_keep_important_logs_in_days":20,"default_keep_important_results_in_days":50,"default_keep_logs_in_days":5,"default_keep_results_in_days":12,"default_priority":60,"description":"Just grouping job groups that have no direct product relevance","exclusively_kept_asset_size":0,"id":9,"name":"Development","size_limit_gb":null,"sort_order":13},{"build_version_sort":0,"carry_over_bugrefs":1,"default_keep_important_logs_in_days":"90","default_keep_important_results_in_days":"0","default_keep_logs_in_days":"10","default_keep_results_in_days":"21","default_priority":50,"description":"","exclusively_kept_asset_size":0,"id":35,"name":"Containers","size_limit_gb":null,"sort_order":3},{"build_version_sort":0,"carry_over_bugrefs":0,"default_keep_important_logs_in_days":90,"default_keep_important_results_in_days":0,"default_keep_logs_in_days":10,"default_keep_results_in_days":70,"default_priority":50,"description":"","exclusively_kept_asset_size":0,"id":27,"name":"Public Cloud","size_limit_gb":null,"sort_order":2},{"build_version_sort":0,"carry_over_bugrefs":1,"default_keep_important_logs_in_days":90,"default_keep_important_results_in_days":0,"default_keep_logs_in_days":10,"default_keep_results_in_days":70,"default_priority":50,"description":"","exclusively_kept_asset_size":0,"id":24,"name":"Maintenance: 
Kiwi","size_limit_gb":null,"sort_order":12},{"build_version_sort":1,"carry_over_bugrefs":1,"default_keep_important_logs_in_days":30,"default_keep_important_results_in_days":0,"default_keep_logs_in_days":10,"default_keep_results_in_days":50,"default_priority":60,"description":"https:\/\/confluence.suse.com\/display\/openqa\/Testing+of+Quarterly+Refreshed+ISOs+in+openQA\r\n\r\nResponsible person: hrommel <heiko.rommel@suse.com>","exclusively_kept_asset_size":158186928501,"id":23,"name":"Maintenance: QR","size_limit_gb":150,"sort_order":15},{"build_version_sort":1,"carry_over_bugrefs":1,"default_keep_important_logs_in_days":"90","default_keep_important_results_in_days":"0","default_keep_logs_in_days":"10","default_keep_results_in_days":"21","default_priority":50,"description":null,"exclusively_kept_asset_size":0,"id":30,"name":"L3","size_limit_gb":null,"sort_order":11},{"build_version_sort":1,"carry_over_bugrefs":1,"default_keep_important_logs_in_days":"90","default_keep_important_results_in_days":"0","default_keep_logs_in_days":"10","default_keep_results_in_days":"21","default_priority":50,"description":null,"exclusively_kept_asset_size":0,"id":34,"name":"SLES 
JeOS","size_limit_gb":null,"sort_order":1},{"build_version_sort":1,"carry_over_bugrefs":1,"default_keep_important_logs_in_days":"90","default_keep_important_results_in_days":"0","default_keep_logs_in_days":"10","default_keep_results_in_days":"21","default_priority":50,"description":null,"exclusively_kept_asset_size":0,"id":28,"name":"Others","size_limit_gb":null,"sort_order":18},{"build_version_sort":1,"carry_over_bugrefs":1,"default_keep_important_logs_in_days":"90","default_keep_important_results_in_days":"0","default_keep_logs_in_days":"10","default_keep_results_in_days":"21","default_priority":50,"description":null,"exclusively_kept_asset_size":0,"id":37,"name":"EOL","size_limit_gb":null,"sort_order":16},{"build_version_sort":0,"carry_over_bugrefs":1,"default_keep_important_logs_in_days":90,"default_keep_important_results_in_days":0,"default_keep_logs_in_days":10,"default_keep_results_in_days":70,"default_priority":50,"description":"","exclusively_kept_asset_size":0,"id":21,"name":"Maintenance: Single Incidents SLE-12","size_limit_gb":null,"sort_order":8},{"build_version_sort":1,"carry_over_bugrefs":1,"default_keep_important_logs_in_days":"90","default_keep_important_results_in_days":"0","default_keep_logs_in_days":"10","default_keep_results_in_days":"21","default_priority":50,"description":null,"exclusively_kept_asset_size":0,"id":31,"name":"Maintenace: KOTD","size_limit_gb":null,"sort_order":10},{"build_version_sort":0,"carry_over_bugrefs":1,"default_keep_important_logs_in_days":90,"default_keep_important_results_in_days":0,"default_keep_logs_in_days":10,"default_keep_results_in_days":70,"default_priority":50,"description":"","exclusively_kept_asset_size":0,"id":7,"name":"Maintenance: Test 
Repo","size_limit_gb":null,"sort_order":9},{"build_version_sort":1,"carry_over_bugrefs":1,"default_keep_important_logs_in_days":90,"default_keep_important_results_in_days":0,"default_keep_logs_in_days":10,"default_keep_results_in_days":70,"default_priority":50,"description":"see https:\/\/wiki.suse.net\/index.php\/RD-OPS_QA\/openQA_review for the review process and responsible persons for review.\r\n\r\nAlso see [#qa-review](irc:\/\/irc.suse.de\/qa-review)","exclusively_kept_asset_size":0,"id":15,"name":"SLE 15","size_limit_gb":null,"sort_order":0},{"build_version_sort":1,"carry_over_bugrefs":1,"default_keep_important_logs_in_days":"90","default_keep_important_results_in_days":"0","default_keep_logs_in_days":"10","default_keep_results_in_days":"21","default_priority":50,"description":null,"exclusively_kept_asset_size":0,"id":38,"name":"Liberty","size_limit_gb":null,"sort_order":17}]: Expecting value: line 1 column 1 (char 0) (Content was: "https://openqa.suse.de/api/v1/parent_groups")
Traceback (most recent call last):
File "/usr/lib/python3.8/site-packages/openqa_review/browser.py", line 161, in _decode_content
content = json.loads(raw) if as_json else raw
File "/usr/lib64/python3.8/json/__init__.py", line 357, in loads
return _default_decoder.decode(s)
File "/usr/lib64/python3.8/json/decoder.py", line 337, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "/usr/lib64/python3.8/json/decoder.py", line 355, in raw_decode
raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
</code></pre>
openQA Project - action #99168 (Resolved): drop unsupported distributions from devel:openQA
https://progress.opensuse.org/issues/99168 (2021-09-24T07:59:32Z, osukup)
<p>We now have a big matrix of build targets in devel:openQA and its subrepositories, including EOL distributions, LTSS Enterprise distros, etc.</p>
<p>If we clean this up, we lower the number of Failed/Unresolvable builds in the repo and no longer need to solve problems with unsupported distros and their dependencies.</p>
<p>PS: the half-year between the release of a new openSUSE Leap and the EOL of the older version is there so we can migrate to the new version within that timeframe, not so that we can stop worrying; the older version will reach EOL eventually.</p>
QA - action #96968 (Resolved): qem-dashboard jobs api extension
https://progress.opensuse.org/issues/96968 (2021-08-16T10:45:29Z, osukup)
<p><a href="https://gitlab.suse.de/opensuse/qem-dashboard/-/issues/14" class="external">https://gitlab.suse.de/opensuse/qem-dashboard/-/issues/14</a></p>
<p>For the approval part of the bot, we need access to the list of jobs for exact settings (incident_settings/update_settings).</p>
<p>The proposed API is described in the GitLab issue.</p>
QA - action #96752 (Resolved): 'openSUSE-SLE' product schedule jobs to OSD
https://progress.opensuse.org/issues/96752 (2021-08-11T12:35:39Z, osukup)
<p>openSUSE-SLE has a different structure than other SUSE products, so it needs some changes in the scheduling bot.</p>
<ul>
<li><strong>AC1:</strong> The bot schedules openSUSE-SLE:15.3 jobs on OSD</li>
</ul>
openQA Infrastructure - action #96719 (Resolved): recover imagetester with broken filesystem/hard...
https://progress.opensuse.org/issues/96719 (2021-08-10T14:36:00Z, osukup)
<p>During work on <a href="https://progress.opensuse.org/issues/96311" class="external">https://progress.opensuse.org/issues/96311</a> we found that imagetester hadn't been updated for 2 months.</p>
<p>Investigate why the automatic transactional update wasn't working, and update imagetester.</p>
<p>Now blocked by <a href="https://infra.nue.suse.com/SelfService/Display.html?id=194271" class="external">https://infra.nue.suse.com/SelfService/Display.html?id=194271</a>, because the host didn't survive a reboot and has no remote management interface.</p>
QA - action #93934 (Resolved): [tools][qem] template generator - multiple package version in inci...
https://progress.opensuse.org/issues/93934 (2021-06-14T08:51:58Z, osukup)
<p>New incidents can come with more than one package version.</p>
<p><strong>AC:</strong> a new line in the metadata section of the template:</p>
<p><code>PackageVer: 12-SP4(foo=1.2-3.4, bar=1.2-3.4),12-SP5(foo=1.2-4.5,bar=12-4.5,baz=2.3-4.5)</code></p>
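<p>A sketch of parsing that line; the format is assumed from the example above, real incident data may need a stricter grammar:</p>

```python
import re

def parse_package_ver(line):
    """Parse a PackageVer metadata line into {release: {package: version}}."""
    _, _, value = line.partition(":")
    releases = {}
    # each entry looks like RELEASE(pkg=ver, pkg=ver, ...)
    for release, packages in re.findall(r"([\w.-]+)\(([^)]*)\)", value):
        releases[release] = dict(
            pkg.strip().split("=", 1) for pkg in packages.split(",") if pkg.strip()
        )
    return releases
```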
QA - action #93931 (Resolved): [tools][qem] MTUI support multiple versions of package in incident
https://progress.opensuse.org/issues/93931 (2021-06-14T08:51:07Z, osukup)
<a name="Acceptance-criteria"></a>
<h2 >Acceptance criteria<a href="#Acceptance-criteria" class="wiki-anchor">¶</a></h2>
<ol>
<li>new Incidents can be created using multiple code branches</li>
</ol>
<a name="Suggestion"></a>
<h2 >Suggestion<a href="#Suggestion" class="wiki-anchor">¶</a></h2>
<ol>
<li>Parse the new <code>PackageVer</code> line from the template</li>
<li>Compare the correct version during the update/downgrade step</li>
<li>Make the export command export the correct versions per refhost</li>
<li>Make the list_packages command list the correct version</li>
</ol>
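<p>For step 2, the version comparison could start from a simplified numeric sort key; a sketch only, since the real rpm comparison algorithm also handles alphabetic segments, tilde, and caret:</p>

```python
import re

def ver_key(version):
    """Crude numeric sort key: "1.2-3.4" -> (1, 2, 3, 4).

    Approximation of rpm version ordering for illustration; use a proper
    rpm version comparison for production code.
    """
    return tuple(int(s) for s in re.findall(r"\d+", version))

def needs_update(installed, target):
    return ver_key(installed) < ver_key(target)
```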
openQA Project - action #92665 (Resolved): Automatically validate code style for python code
https://progress.opensuse.org/issues/92665 (2021-05-13T09:53:58Z, osukup)
<a name="Motivation"></a>
<h2 >Motivation<a href="#Motivation" class="wiki-anchor">¶</a></h2>
<p>It would be great to have automated checks for more style aspects of Python code.</p>
<p>We use Perl::Tidy and Perl::Critic with strict checks in CI for Perl code; it would be nice to have the same<br>
standard for other languages as well.</p>
<a name="Acceptance-criteria"></a>
<h2 >Acceptance criteria<a href="#Acceptance-criteria" class="wiki-anchor">¶</a></h2>
<ul>
<li><strong>AC1</strong>: python code style is checked automatically</li>
</ul>
<a name="Acceptance-tests"></a>
<h2 >Acceptance tests<a href="#Acceptance-tests" class="wiki-anchor">¶</a></h2>
<ul>
<li><strong>AT1</strong>: Inconsistent codestyle prevents deployments</li>
</ul>
<a name="Suggestion"></a>
<h2 >Suggestion<a href="#Suggestion" class="wiki-anchor">¶</a></h2>
<ul>
<li>use <a href="https://github.com/psf/black" class="external">black</a> for Python code style</li>
</ul>
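<p>Wired into GitLab CI, this could look roughly like the following job (stage, image, and paths are illustrative):</p>

```yaml
# hypothetical .gitlab-ci.yml fragment
check-python-style:
  stage: test
  image: python:3
  script:
    - pip install black
    # --check exits non-zero when files would be reformatted, failing the pipeline
    - black --check --diff .
```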