action #168901
closed
coordination #168895: [saga][epic][infra] Support SUSE PRG office move while ensuring business continuity
coordination #168898: [epic][infra] Support SUSE PRG office datacenter "PRG1" move while ensuring business continuity
Support SUSE PRG office datacenter "PRG1" move to a new location "PRG3" while ensuring business continuity - pre-planning size:M
Added by okurz 6 months ago.
Updated 3 months ago.
Category:
Feature requests
Description
Motivation¶
Vit Pelcak informed me that he is invited "to discuss requirements for server room for the new Prague office." We will support the planning process.
Vit wrote:
So far I got these requirements:
Move all the machines
Working PXE, IPMI...
Move in time not colliding with the product testing/release (@Jan Stehlík suggested Feb/Mar)
Move in phases so that we don't have all the machines offline at the same time
My addition: Do we actually need machines in a new office server room, or could we just move them to PRG2e? Nobody from the QE tools team commonly visits the server rooms in either PRG1 or PRG2, so it wouldn't make a difference. There should be IPv6
Note: void.qam.suse.cz (aka openqa.qam.suse.cz) with its workers is currently in PRG1.
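The phased-move requirement above (never have all machines offline at once) can be sketched as a simple partitioning of the worker list into batches. This is only an illustrative sketch; the hostnames and the offline limit are assumptions, not taken from the ticket.

```python
# Hypothetical sketch: split machines into move phases so that at most
# `max_offline` hosts are offline at any time. Hostnames are made up.

def plan_phases(machines, max_offline):
    """Partition `machines` into consecutive phases of at most `max_offline` hosts."""
    if max_offline < 1:
        raise ValueError("max_offline must be >= 1")
    return [machines[i:i + max_offline]
            for i in range(0, len(machines), max_offline)]

workers = [f"worker{n}.qam.suse.cz" for n in range(1, 8)]
for idx, phase in enumerate(plan_phases(workers, max_offline=3), start=1):
    print(f"Phase {idx}: {', '.join(phase)}")
```

In practice the batches would additionally be chosen so that each phase keeps enough capacity for ongoing product testing, but the basic invariant is the same: every machine appears in exactly one phase and no phase exceeds the offline limit.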
Acceptance Criteria¶
- AC1: Assets of QE Tools inside PRG1 are known
- AC2: Racktable entries for the identified assets are up-to-date
- AC3: We know if we want/need machines in the new PRG site "PRG3"
- AC4: For every identified asset, the future of the asset is decided (i.e. to be discarded, moved, replaced with new one)
Suggestions¶
- Be aware of #170458
- Review the current state in racktables and update entries with obviously outdated or incomplete information, e.g. contact persons that have left the company or missing MAC addresses for machines
- For unclear racktables entries talk to contact persons / machine owners / loaners
- Identify groups of machines and for every identified asset decide about the future of the asset, i.e. to be discarded, moved, replaced with new one
- Look at https://confluence.suse.com/display/~ewalker/Prague+%28in+office%29+Server+Room+requirements+for+Prague+teams
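The bookkeeping behind AC4 (for every identified asset, decide its future) can be sketched with a small script that groups assets by their decided future and flags entries with no contact person. The inline CSV and its column names stand in for a racktables export and are assumptions, not an actual racktables format.

```python
# Hypothetical sketch of AC4 bookkeeping: group assets by decision
# (discard / move / replace) and flag assets missing a contact person.
# The CSV content and columns are illustrative assumptions.

import csv
import io
from collections import defaultdict

ASSETS_CSV = """\
name,contact,decision
void.qam.suse.cz,jkohoutek,move
worker1.qam.suse.cz,,replace
legacy-box.qa.suse.cz,pgonin,discard
"""

def group_by_decision(csv_text):
    """Return ({decision: [asset names]}, [assets without a contact])."""
    by_decision = defaultdict(list)
    missing_contact = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        by_decision[row["decision"]].append(row["name"])
        if not row["contact"]:
            missing_contact.append(row["name"])
    return dict(by_decision), missing_contact

decisions, missing = group_by_decision(ASSETS_CSV)
print(decisions)
print("missing contact:", missing)
```

Assets that end up in the `missing contact` list are exactly the ones where, per the suggestion above, the racktables entry needs a follow-up with the machine owners.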
- Status changed from Feedback to New
- Assignee deleted (okurz)
- Target version changed from future to Ready
- Assignee set to jbaier_cz
As discussed I will take care of this one.
- Description updated (diff)
At least from what I know, this won't be an issue until next year. We should be ready nevertheless.
And just to make sure it will not fall into oblivion, void.qam.suse.cz (aka openqa.qam.suse.cz) with its workers is currently in PRG1.
- Subject changed from Support SUSE PRG office datacenter "PRG1" move while ensuring business continuity - pre-planning to Support SUSE PRG office datacenter "PRG1" move to a new location "PRG3" while ensuring business continuity - pre-planning size:M
- Description updated (diff)
- Status changed from New to Workable
- Status changed from Workable to Blocked
Blocking this one on #170458 as that will help with AC1/AC2.
- Description updated (diff)
Actually, this shouldn't wait for #154042, which is about a DHCP VM with unclear ownership. This ticket should focus on the physical machines within PRG1. With an evacuation of PRG1 possible, even qanet.qa.suse.cz would be obsolete. Instead I added a relation to the sibling task #170458. @jbaier_cz the situation is unchanged, wait for the blocker you mentioned.
- Status changed from Blocked to Workable
- Assignee deleted (jbaier_cz)
- Priority changed from Normal to High
- Status changed from Workable to In Progress
- Assignee set to robert.richardson
- Due date set to 2025-01-28
Setting due date based on mean cycle time of SUSE QE Tools
robert.richardson wrote in #note-15:
Regarding AC3, since all the machines are in use they shouldn't be discarded, but rather moved or replaced.
I guess that might depend on the destination; for some of them it might be cheaper to just buy new hardware at the new location if those hosts are just normal workers without any special features.
- Description updated (diff)
In doubt ask the contact persons listed in racktables
Done
You mean the ticket should be about QE LSG tag removal here? Because all machines with paul.gonin as contact already have both the QAM and QE LSG tags.
The example you provided does not list thehehjik as contact, and I also can't find any machine(s) with that contact using the search function. Am I missing something here, or did you maybe already address this and the previous point?
Removed QE LSG; what about the QA tag?
Done, although I ended up not using the query after all, as it included machines managed (and already mentioned in the thread) by vpelcak.
robert.richardson wrote in #note-23:
- You mean the ticket should be about QE LSG tag removal here? Because all machines with paul.gonin as contact already have both the QAM and QE LSG tags.

Yes, QE LSG should be removed as pgonin and his team are not in QE LSG.

- The example you provided does not list thehehjik as contact, and I also can't find any machine(s) with that contact using the search function. Am I missing something here, or did you maybe already address this and the previous point?
- Removed QE LSG; what about the QA tag?

Who is the owner then?
okurz wrote in #note-24:
robert.richardson wrote in #note-23:
- You mean the ticket should be about QE LSG tag removal here? Because all machines with paul.gonin as contact already have both the QAM and QE LSG tags.

Yes, QE LSG should be removed as pgonin and his team are not in QE LSG.

Ok, I've simply removed those "QE LSG" tags as they were wrongly added by me yesterday, going off of the FQDN and QAM tag.

- The example you provided does not list thehehjik as contact, and I also can't find any machine(s) with that contact using the search function. Am I missing something here, or did you maybe already address this and the previous point?
- Removed QE LSG; what about the QA tag?

Who is the owner then?

For me it shows Jan Kohoutek.
- Status changed from In Progress to Resolved
I've added the QA tools section to the corresponding planning page, which anyone may extend/edit in case I missed any special requirements.
Resolving the ticket as discussed on slack.
- Due date deleted (2025-01-28)