action #97109

Updated by tinita over 3 years ago

## Observation 

The generation of openqa_review pages fetches many identical URLs repeatedly. The more reports we generate, the bigger this problem becomes. As we know, our web UI is not very fast at the moment.

These statistics are from today. According to the access_log, generating all osd pages took almost 4 hours (from 2:00 to 5:52), with no significant gaps between the several pipeline jobs. About 5200 requests were made, but only for 1605 distinct URLs:

 ``` 
 grep '"openqa-review ' access_log | perl -MData::Dumper -nwE'$Data::Dumper::Sortkeys = 1;if (m/"GET (\S+) HTTP\S+ (\d+)/) { $urls{$1}++ } END { say Dumper \%urls }' | wc -l 
 1605 
 ``` 
This shows how many URLs were requested n times (number of requests: number of URLs):
 ``` 
 1: 19 
 2: 8 
 3: 1278 
 4: 152 
 5: 141 
 6: 3 
 ``` 
Total time (according to the access_log) in seconds spent fetching the URLs, grouped by number of requests per URL:
 ``` 
 1: 47s 
 2: 33s 
 3: 8165s 
 4: 1348s 
 5: 1237s 
 6: 160s 
 ``` 
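
The distribution above comes from the same access_log data. As a minimal sketch, the request-count histogram could be reproduced along these lines (assuming the log format matches the regex from the first command; the exact command used for the numbers above is not recorded here):

```
grep '"openqa-review ' access_log \
  | perl -nwE 'if (m/"GET (\S+) HTTP/) { $urls{$1}++ }
      END { $hist{$_}++ for values %urls; say "$_: $hist{$_}" for sort { $a <=> $b } keys %hist }'
```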

## Suggestions

* Use GitLab's cache or artifact feature to keep the openqa_review URL cache directory across the pipeline jobs (see the sketch below)
  * https://docs.gitlab.com/ee/ci/caching/#cache-vs-artifacts
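
A minimal sketch of the cache variant in `.gitlab-ci.yml`; the job name, cache key and cache path are assumptions and would need to be adjusted to wherever openqa_review actually stores its URL cache:

```yaml
# sketch only: job name, cache key and path are assumptions
generate_osd_reports:
  cache:
    key: openqa-review-url-cache   # reuse the same cache across pipeline runs
    paths:
      - .openqa_review_cache/      # hypothetical location of the URL cache directory
  script:
    - ./generate_reports.sh        # placeholder for the existing report generation step
```

Artifacts would also work, but per the linked documentation the cache is the more natural fit for data that only speeds things up and can be regenerated at any time.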
