action #35374
closed: Rendering of external harness output in test results as rows
Description
This is a continuation of the work being done to display external test harness results better within openQA.
openQA currently displays results that come from an external harness as rows using the familiar text info boxes (when multiple results or steps per test case are available, multiple boxes are added). Instead, a list of the steps is preferred, so that when the user clicks on a particular step, its details are displayed.
The interaction would then be similar to the one provided by http://metan.ucw.cz/outgoing/ltp-syscalls.html, but the look of the list itself should be similar to Mock #1 or Mock #3.
Mock #2 is provided since it's a CSS-less rendering of xunit test results.
AC1: openQA is able to display external harness output, in the same view as the test results, as a list and not as text boxes.
AC2: once a particular item from the results list is clicked, the details are shown/expanded.
Optional AC3: passing tests from the test harness can be collapsed by default to save space.
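The acceptance criteria above can be sketched as a small view model: each harness step becomes one row, clicking a row toggles its details (AC2), and passing steps start collapsed (AC3). This is a minimal illustration, not openQA's actual code; the field names (`name`, `result`, `expanded`) are assumptions.

```javascript
// Hypothetical sketch: derive a list view model from parsed harness steps.
// Field names are illustrative assumptions, not openQA's real data model.
function buildResultRows(steps) {
  return steps.map(function (step) {
    return {
      name: step.name,
      result: step.result,
      // Optional AC3: passing steps start collapsed to save space.
      expanded: step.result !== 'ok',
    };
  });
}

// AC2: clicking a row toggles whether its details are shown/expanded.
function toggleRow(rows, name) {
  return rows.map(function (row) {
    return row.name === name
      ? Object.assign({}, row, { expanded: !row.expanded })
      : row;
  });
}
```

The point of keeping this as pure data transformations is that the actual DOM rendering (a template iterating over the rows) stays trivial.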
Files
Updated by szarate almost 7 years ago
- Description updated (diff)
- Category set to 124
- Target version set to Current Sprint
Updated by szarate almost 7 years ago
- Related to action #34507: Document requirements for external test suite results to be rendered by openQA added
Updated by szarate almost 7 years ago
- Related to deleted (action #34507: Document requirements for external test suite results to be rendered by openQA)
Updated by szarate almost 7 years ago
- Related to action #34507: Document requirements for external test suite results to be rendered by openQA added
Updated by mkittler almost 7 years ago
I tried to execute LTP tests locally to gather some test results to play with. However, I haven't had any success so far. For instance, when I tried to clone job 1651356
from OSD, the clone-job script could not download all files:
Cloning dependencies of sle-15-Installer-DVD-x86_64-Build589.1-ltp_fs_ext4@64bit
Cloning dependencies of sle-15-Installer-DVD-x86_64-Build589.1-install_ltp+sle+15+Installer-DVD@64bit
downloading
http://openqa.suse.de/tests/1651356/asset/iso/SLE-15-Installer-DVD-x86_64-Build589.1-Media1.iso
to
/hdd/openqa-devel/openqa/share/factory/iso/SLE-15-Installer-DVD-x86_64-Build589.1-Media1.iso
Created job #18: sle-15-Installer-DVD-x86_64-Build589.1-create_hdd_minimal_base+sdk@64bit -> http://localhost:9526/t18
downloading
http://openqa.suse.de/tests/1651356/asset/iso/SLE-15-Installer-DVD-x86_64-Build589.1-Media1.iso
to
/hdd/openqa-devel/openqa/share/factory/iso/SLE-15-Installer-DVD-x86_64-Build589.1-Media1.iso
downloading
http://openqa.suse.de/tests/1651356/asset/other/openposix-test-list-15-589.1
to
/hdd/openqa-devel/openqa/share/factory/other/openposix-test-list-15-589.1
1651356 failed: 404 Not Found
When I skip that issue via the --skip-download option, the jobs are at least added. However, the test then fails very early:
Note that I use the current master of the test distribution, openSUSE needles and SLES needles. Usual tests run fine under my setup.
@szarate [You wrote](https://progress.opensuse.org/issues/34507#note-5):
Once the files have been uploaded, currently openQA will display them as shown in the attached image, which is simply by adding an extra row with the text results.
So apparently you were able to work around that issue. How did you achieve that? In particular, which commands did you use to clone the job? Or can you share some test results that I can import locally?
@szarate And do I remember your assertion from the Jangouts call correctly, that what you can see in the attached image is openQA's current state?
I'm asking because when I look at some LTP tests on OSD, it looks quite different. Especially, there is a lot of wait_serial output, which would make changing the appearance to look like @coolo's mockup more complicated:
Updated by szarate almost 7 years ago
mkittler wrote:
@szarate [You wrote](https://progress.opensuse.org/issues/34507#note-5):
Once the files have been uploaded, currently openQA will display them as shown in the attached image, which is simply by adding an extra row with the text results.
So apparently you were able to work around that issue. How did you achieve that? In particular, which commands did you use to clone the job? Or can you share some test results that I can import locally?
@szarate And do I remember your assertion from the Jangouts call correctly, that what you can see in the attached image is openQA's current state?
I'm asking because when I look at some LTP tests on OSD, it looks quite different. Especially, there is a lot of wait_serial output, which would make changing the appearance to look like @coolo's mockup more complicated:
What you see on OSD comes from the LTP team trying to upload information to openQA directly, without using the parser.
# fetch the harness's raw results from a previous job (example URL)
script_run('curl -k -O https://openqa.opensuse.org/tests/656338/file/result_array.json');
# hand them to openQA's parser as external test results
parse_extra_log('LTP', 'result_array.json');
This is mostly what's needed to properly parse the results generated by LTP (not the ones that openQA is currently showing in the Kernel job group, as those are the team's own attempt to publish results in the openQA format).
Updated by mkittler almost 7 years ago
@szarate Ok, I'm trying to execute your openQA-in-openQA testcase now.
Besides getting test results locally, I'm also wondering how to implement this. Since my last 'plan' the requirements have changed.
This was my last plan on how to approach this feature: https://progress.opensuse.org/issues/25680#note-14
That the results are not in the database hasn't changed (I assume). Hence pulling and evaluating the additional information on the client side via JavaScript might still be the cheapest way to implement it. That we now want a uniform view rather than an extra table makes things a little more complicated: instead of just creating a new table from the JSON, the existing DOM element would need to be adjusted.
A more advanced approach would be fiddling the JSON data into our database structure. Then we could easily render the required HTML directly on the server-side.
Or we just evaluate the JSON data every time in the controller. This would also allow us to render the HTML on the server side. However, evaluating the JSON plus the more complex controller code might slow down page loading.
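The first ("cheapest") approach above can be sketched like this: the server delivers one static JSON file for the whole job, and the client turns it into table rows and patches the existing DOM. The endpoint path and CSS class names below are hypothetical, chosen only to illustrate the shape of the idea.

```javascript
// Sketch of the client-side approach. The rendering is a pure function so
// it can be shown (and tested) without a browser; the fetch/DOM part is a
// comment because selectors and endpoints are assumptions, not real openQA.
function renderRowsHtml(steps) {
  return steps
    .map(function (s) {
      // hypothetical class names for pass/fail styling
      const cls = s.result === 'ok' ? 'result_passed' : 'result_failed';
      return '<tr class="' + cls + '"><td>' + s.name + '</td><td>' + s.result + '</td></tr>';
    })
    .join('');
}

// In the browser this could be driven by something like (hypothetical URL):
// fetch('/tests/' + jobId + '/external_results.json')
//   .then((r) => r.json())
//   .then((steps) => {
//     document.querySelector('#results tbody').innerHTML = renderRowsHtml(steps);
//   });
```

The trade-off named above applies: this keeps the controller simple and the page load fast, at the cost of a second request and client-side work.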
Updated by szarate almost 7 years ago
For the time being, I would say we don't really need to fiddle with the database.
Currently all the data is available to openQA, and there is no further requirement to move this info into the database.
If you look at http://phobos.suse.de/api/v1/jobs/11054/details, to give an example, you can notice that the nodes under LTP, xunit and TAP have test details with the test_data property, so going from there to displaying rows won't require major changes.
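Picking those nodes out of the /details response could look roughly like the sketch below. The JSON layout (modules carrying a details array whose steps may have a test_data property) is an assumption based on the description above, not a guaranteed schema.

```javascript
// Sketch: collect the externally-parsed steps from a /tests/<id>/details
// response. The field names mirror the comment above but are assumptions.
function externalSteps(details) {
  const steps = [];
  for (const module of details) {
    for (const step of module.details || []) {
      // only steps from an external harness carry test_data
      if (step.test_data) {
        steps.push({ module: module.name, name: step.name, data: step.test_data });
      }
    }
  }
  return steps;
}
```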
Updated by szarate almost 7 years ago
One thing regarding my previous comment: as @coolo already mentioned, since we're now using PostgreSQL, we can store JSON data.
So the result could be completely serialized as JSON (via parser->to_json), stored in the database, and loaded again with the parser interface in case the test results are deleted. But again, this is a bit more advanced, and could be a continuation of this task.
That the results are not in the database hasn't changed (I assume). Hence pulling and evaluating the additional information on the client side via JavaScript might still be the cheapest way to implement it. That we now want a uniform view rather than an extra table makes things a little more complicated: instead of just creating a new table from the JSON, the existing DOM element would need to be adjusted.
I think this would be the way to go: we can alter the test module part and add a flag marking it as an external test result (external_harness => $TYPE)
... and from there use a different rendering template.
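The template-switch idea could be as small as the sketch below: a module flagged with external_harness is rendered through a dedicated template, everything else keeps the current step boxes. The flag name comes from the comment above; the template names are hypothetical.

```javascript
// Sketch of selecting a rendering template per test module. The
// external_harness flag is from the proposal above; template names
// ('module_external', 'module_steps') are made up for illustration.
function templateFor(module) {
  if (module.external_harness) {
    // e.g. external_harness => 'LTP' | 'xunit' | 'TAP'
    return { template: 'module_external', type: module.external_harness };
  }
  return { template: 'module_steps', type: null };
}
```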
Updated by mkittler almost 7 years ago
have test details with the test_data property
Yes, and for that property we currently have to open/read a file from disk when rendering the page. (If that is what Perl people call 'slurp' - why can't they just use normal terms?) In fact, we open a file for each and every step, so there's room for improvement. Even the cheap approach of delivering one additional, static JSON file for the entire test, which is then fiddled in by the client, should yield better performance.
since we're now using postgreSQL, we can store json data
I know. But I'm not sure about the performance and usability of that feature and how well it would integrate with DBIx.
Updated by mkittler almost 7 years ago
- Status changed from New to In Progress
Updated by mkittler almost 7 years ago
The PR is almost complete (hopefully the tests pass now).
Since the feedback was to render the data in a separate table after all, I'll continue with that. This time based on the parser output.
Updated by coolo almost 7 years ago
Please stop here and create another issue. Otherwise this will become a suck-it-all issue.
Updated by mkittler almost 7 years ago
- Precedes action #36232: Rendering of external harness output in a separate table added
Updated by szarate almost 7 years ago
- Target version changed from Current Sprint to Done