coordination #71242

Updated by JERiveraMoya over 3 years ago

As a Quality Engineer I would like to validate that all the promises that the installer makes are met in the installed system.

As of now the validations suffer from the following issues (old ones and even new ones recently created):

- They try to do too much, parsing output in a way that goes far beyond what the product promises in some areas, probably intersecting with what is tested upstream, or even with what Kernel testing or other teams in the company are doing.
- They validate different things spread across different modules, with no single purpose in general.
- Besides that, for some screens during installation there are no validations, or they are really hard to find.

Therefore, instead of that, we need a set of validations which meet the following criteria:

- Single-purpose validations, atomic enough to be reused in any other scenario.
- Utilize test data and have a common structure for all the scenarios (a sketch follows this list).
- Always use a common library for parsing the output (currently there are 2-3 implementations of the same thing which need to be unified).
- Validations focus on what the product promises in each screen, with the following order of preference:
  - User selection/actions: test data is used for each module to validate those inputs, but not to store other information.
  - Defaults in the screen: the labels displayed for those options during installation (any control that is visible but the user didn't interact with).
  - Additional validations (the ones not visible in any screen): documentation and other defaults of the test scenario, plus any other thing the testers/developers would consider important.
- Validations are not deep unless it is really necessary, but they can be extended (breaking them out in the future).
- Validations are easy to expand by adding a few more test data entries and perhaps some code, i.e. checking a new setting in the same file should be cheap.
- Validations are multi-architecture and multi-product; this approach should be taken into account.
- Consider speed: run the full set of validations reading the output just once and doing the rest in Perl.
- When a validation fails it should be very easy to understand the problem (expected vs. actual results).
- All scenarios in the YaST group should apply those validations; they are not repeated for each test suite, so we need to group the validations for similar scenarios and run them, especially some default test suite for the defaults.
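
A minimal sketch of what such a single-purpose validation module could look like, assuming the openQA `testapi` functions and a `get_test_suite_data` helper for loading YAML test data; the file path, regex and data layout are illustrative assumptions, not an existing implementation:

```perl
# Hypothetical single-purpose validation with the common structure:
# load test data, read the relevant output exactly once, compare in
# plain Perl, and fail with expected vs actual.
use strict;
use warnings;
use testapi;                          # openQA test API: script_output
use scheduler 'get_test_suite_data';  # assumption: shared YAML test data loader

sub run {
    my $data = get_test_suite_data();   # e.g. {interface => 'eth0', bootproto => 'dhcp'}
    # Single read of the output; everything else is plain Perl.
    my $output   = script_output("cat /etc/sysconfig/network/ifcfg-$data->{interface}");
    my ($actual) = $output =~ /^BOOTPROTO='?([^'\s]+)'?/m;
    $actual //= '<not set>';
    die "BOOTPROTO for $data->{interface}: expected '$data->{bootproto}', actual '$actual'"
        if $actual ne $data->{bootproto};
}

1;
```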

Proposal in this ticket (which could be converted to a Saga/Epic) is as follows:

- Every single input/user action, plus the defaults, is covered in individual specific scenarios: modules scheduled after first boot with proper naming. All the checks should map the screens 'literally' with some methods (sketched below) that would be reusable by YuiRestClient.
- If there is a collaborator creating the verification test case, use that as well, but take what the product promises in each screen as the starting point and organize it accordingly.
- Using the configuration modules that are available to change the system and then validating it is not in scope.
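
To illustrate what mapping a screen 'literally' could mean in practice, here is a sketch of a generic comparison helper in plain Perl; the helper name, data layout and values are assumptions for illustration only:

```perl
# Sketch of a reusable check: the expected hash mirrors the labels and
# values shown on the installer screen, the actual hash is whatever the
# common parsing library extracted from the installed system.
use strict;
use warnings;

sub assert_screen_promises {
    my ($expected, $actual) = @_;
    my @errors;
    for my $label (sort keys %$expected) {
        my $got = $actual->{$label} // '<missing>';
        push @errors, "$label: expected '$expected->{$label}', actual '$got'"
            if $got ne $expected->{$label};
    }
    # Report every mismatch at once, expected vs actual.
    die "Installer promises not met:\n" . join("\n", @errors) if @errors;
}

# Usage, with keys named 'literally' after the screen labels:
assert_screen_promises(
    {'Hostname' => 'susetest', 'Firewall' => 'enabled'},
    {'Hostname' => 'susetest', 'Firewall' => 'enabled'},
);
```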

Examples:

- Test case to validate screen [Network Settings](https://documentation.suse.com/sles/15-SP2/single-html/SLES-installquick/#sec-sle-installquick-install-time): the interface, dhcp, hostname and routing configuration could be checked in the installed system. As the test modules for that are really short, scheduling or not scheduling those modules should suffice.
- Test case to validate screen [System role](https://documentation.suse.com/sles/15-SP2/single-html/SLES-installquick/#sec-sle-installquick-install-roles): when selecting Text Mode we could figure out a simple validation of X11 and find out what should not be in the system by comparing with a GNOME installation (see the first sketch after this list).
- Test case to validate screen [Suggested Partitioning](https://documentation.suse.com/sles/15-SP2/single-html/SLES-installquick/#sec-sle-installquick-install-partitioner): the subvolume actions could be a good candidate. Clicking on 'See details' shows info that we could check to ensure the deletion really happened, as well as the creation and verification of the btrfs subvolumes, but without going further into checking btrfs in depth (at least in this test suite); see the second sketch after this list. Test data for the validation will not contain anything related to btrfs fields that were not introduced by the user somehow or visualized during the installation.
- Test case to validate screen [Installation Settings](https://documentation.suse.com/sles/15-SP2/single-html/SLES-installquick/#sec-sle-installquick-install-inst-settings): this could be split into multiple modules to check every section. Clicking on each item reveals more promises that the software makes; those could be tested individually and deserve attention even if the user does not navigate to them during the installation, since they are promises anyway.
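
For the System role example, a minimal sketch (assuming openQA's `script_run`, which returns the command's exit code; the package name is just an illustration):

```perl
# Sketch for the Text Mode system role: X11 should not end up in the
# installed system, unlike in a GNOME installation.
use strict;
use warnings;
use testapi;   # script_run returns the command's exit code

sub run {
    # rpm -q exits non-zero when a package is not installed.
    die 'Text Mode role promised no X11, but xorg-x11-server is installed'
        if script_run('rpm -q xorg-x11-server') == 0;
}

1;
```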
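
For the Suggested Partitioning example, a sketch of the subvolume check, reading `btrfs subvolume list /` only once; the created/deleted lists are hardcoded here but would come from test data:

```perl
# Sketch for the subvolume actions: verify that what 'See details'
# promised really happened, without checking btrfs in depth.
use strict;
use warnings;
use testapi;   # script_output runs a command and returns its output

sub run {
    my @created = ('@/home', '@/srv', '@/var');   # promised to be created
    my @deleted = ('@/obsolete');                 # promised to be deleted
    my $output  = script_output('btrfs subvolume list /');
    my %present = map { $_ => 1 } ($output =~ /\bpath\s+(\S+)/g);
    my @errors;
    push @errors, "missing subvolume: $_" for grep { !$present{$_} } @created;
    push @errors, "still present: $_"     for grep { $present{$_} } @deleted;
    # Report all mismatches at once, expected vs actual.
    die join("\n", 'Subvolume promises not met:', @errors) if @errors;
}

1;
```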
