coordination #71242

Updated by JERiveraMoya about 1 year ago

As a Quality Engineer I would like to validate that all the promises that the installer makes are met in the installed system. In other words, instead of just trusting the experience of the collaborator creating the test case, keep using that experience but take what the product promises in each screen as the starting point and organize the validations accordingly.

As of now the modules that are available to validate the system suffer from the following issues (old ones and even new ones recently created):
- They try to do too much, parsing output in a way that goes far beyond what the product promises in some areas, probably overlapping with what is tested upstream, with kernel testing, or with what other teams in the company are doing.
- Validations of different things are spread across different modules; in general the modules are not single-purpose.
Besides that, for some screens during installation there are no validations at all, or the existing ones are really hard to find.

The proposal in this ticket (which could be converted to a Saga/Epic) would be as follows:
- Every single input/user action, plus the labels displayed for those options during installation, is covered in an individual, specific module scheduled after first boot, with proper naming. All the checks should map the screens 'literally' via dedicated methods.
- Validations are not deep unless it is really necessary, but they can be extended (breaking them out into more modules in the future).
- The full set of validations is not repeated for each test suite; we need to find the groups and run some default test suite for them.
- Test data is used by each module to validate those inputs, but test data is not used to store other information.
- A multi-product approach should be taken into account. As test modules for validation could be really short, scheduling or not scheduling those modules should suffice.
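To illustrate the single-purpose idea above, a validation module could limit itself to comparing exactly the fields the user entered on one screen against the installed system, nothing more. A minimal sketch (Python stand-in for an openQA Perl module; all field names and data are hypothetical):

```python
# Hypothetical sketch: a single-purpose validator that checks only the
# inputs recorded in test data for one installer screen.

def validate_screen(test_data: dict, actual: dict) -> list:
    """Compare each promised input against the installed system's state.

    Returns a list of mismatch descriptions; empty means all promises hold.
    """
    mismatches = []
    for field, expected in test_data.items():
        found = actual.get(field)
        if found != expected:
            mismatches.append(f"{field}: expected {expected!r}, got {found!r}")
    return mismatches

# Test data holds only user-visible inputs, not internal system details
# (hypothetical example values).
test_data = {"hostname": "susetest", "timezone": "Europe/Berlin"}
actual = {"hostname": "susetest", "timezone": "UTC", "kernel": "default"}
print(validate_screen(test_data, actual))
```

Note that the extra `kernel` entry in the system state is deliberately ignored: the module only validates what the screen promised.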

- A test case to validate the screen [System role] could be: when selecting Text Mode, run a simple validation that X11 is absent, finding out what should not be in the system by comparing with a GNOME installation.
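The Text Mode check above could be as simple as diffing package sets: packages that appear only in a GNOME installation must not be present in a Text Mode system. A sketch, assuming the package lists are obtained elsewhere (the concrete package names here are hypothetical placeholders):

```python
# Hypothetical sketch: for the Text Mode system role, assert that packages
# present only in a GNOME installation are absent from this system.

def unexpected_x11_packages(installed: set, gnome_only: set) -> set:
    """Return X11/GNOME packages that should not be on a Text Mode system."""
    return installed & gnome_only

# gnome_only could be derived once by diffing a GNOME vs. a Text Mode
# installation; the names below are illustrative, not a real baseline.
gnome_only = {"xorg-x11-server", "gdm", "gnome-shell"}
installed = {"kernel-default", "systemd", "bash"}
print(sorted(unexpected_x11_packages(installed, gnome_only)))  # []
```

Keeping the check as a set intersection means the module never parses deeper than the package list, in line with the "validations are not deep" rule.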
- A test case to validate the screen [Suggested Partitioning] could ensure that the deletion really happened, as well as the creation and verification of the btrfs subvolumes, but without checking btrfs in depth (at least in this test suite). Test data for the validation will not contain anything related to btrfs fields that were not introduced by the user or visualized during the installation.
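The subvolume part of that validation only needs to confirm that the promised subvolumes exist, e.g. by parsing `btrfs subvolume list /` output. A sketch (the sample output and promised set are hypothetical; a real module would read them from the SUT and from test data):

```python
# Hypothetical sketch: verify that the subvolumes shown on the Suggested
# Partitioning screen exist, without inspecting btrfs any deeper.

def parse_subvolumes(output: str) -> set:
    """Extract subvolume paths from `btrfs subvolume list` output."""
    paths = set()
    for line in output.splitlines():
        # Expected line shape: "ID 257 gen 39 top level 5 path @/var"
        parts = line.split(" path ")
        if len(parts) == 2:
            paths.add(parts[1].strip())
    return paths

def missing_subvolumes(output: str, promised: set) -> set:
    """Promised subvolumes that do not appear in the command output."""
    return promised - parse_subvolumes(output)

sample = (
    "ID 257 gen 39 top level 5 path @\n"
    "ID 258 gen 40 top level 257 path @/var\n"
    "ID 259 gen 41 top level 257 path @/home\n"
)
promised = {"@", "@/var", "@/home", "@/srv"}
print(sorted(missing_subvolumes(sample, promised)))  # ['@/srv']
```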
- A test case to validate the screen [Installation Settings] could be split into multiple modules to check every section. Clicking on each item reveals more promises that the software makes; these could be tested individually and deserve attention even if the user does not navigate to them during the installation, because they are promises anyway.
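Splitting by section could look like one small check per section, registered in a table so the scheduler can pick which modules to run per product. A sketch (section names, state keys, and expected values are all hypothetical):

```python
# Hypothetical sketch: one single-purpose check per Installation Settings
# section; a scheduler decides which to run for each product.

def check_booting(state):
    return state.get("bootloader") == "grub2"

def check_firewall(state):
    return state.get("firewall_enabled") is True

def check_ssh(state):
    return state.get("sshd_enabled") is True

SECTION_CHECKS = {
    "Booting": check_booting,
    "Firewall": check_firewall,
    "SSH service": check_ssh,
}

# Illustrative system state; a real module would query the SUT.
state = {"bootloader": "grub2", "firewall_enabled": True, "sshd_enabled": False}
failed = [name for name, check in SECTION_CHECKS.items() if not check(state)]
print(failed)  # ['SSH service']
```

Because each check is independent, a section's module can later be broken out into deeper validations without touching the others, matching the extensibility point in the proposal.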