On Tue, Nov 21, 2023 at 01:27:44PM +0000, Mark Brown wrote:
> > (I don't need to see all of the tests that pass; it's the test failures or the test flakes that are significant.)
>
> The listing of tests does get a bit more complex when you mix in running on different platforms.
Yeah, that's part of the aggregation reports problem. Given a particular test, say, generic/475, I'd love to see a summary of which file system configs (e.g., ext4/4k, ext4/1k) and which architectures that test is failing or flaking on. Right now, I do this manually using a combination of the mutt mail reader (the test summaries are e-mailed to me) and emacs....
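If I ever get around to automating that, the shape of it would be something like the following. This is a completely hypothetical sketch: it assumes one junit XML file per (config, arch) run, named something like ext4-4k.x86_64.xml, which is not how my test runners actually name things.

#!/usr/bin/env python3
# Sketch: summarize which config/arch combinations each test fails on.
# Usage: summarize.py <results-dir>
# Assumes one junit XML file per (config, arch) run, named like
# "ext4-4k.x86_64.xml"; that layout and naming are made up.
import glob
import os
import sys
import xml.etree.ElementTree as ET
from collections import defaultdict

failures = defaultdict(set)        # test path -> {(config, arch), ...}

for path in glob.glob(os.path.join(sys.argv[1], "*.xml")):
    config, arch = os.path.basename(path)[:-len(".xml")].rsplit(".", 1)
    for case in ET.parse(path).getroot().iter("testcase"):
        # junit records a failed test as a <failure> or <error> child
        if case.find("failure") is not None or case.find("error") is not None:
            failures[case.get("name")].add((config, arch))

# Detecting flakes (fails on some runs, passes on others) would need
# multiple result files per config; this only reports hard failures.
for test in sorted(failures):
    print(test + ":", ", ".join(c + "/" + a
                                for c, a in sorted(failures[test])))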
> I think if we get tooling in place so that people can just run a script, add a flag to their tools or whatever to ingest results from the standard testsuites, the barrier to reporting becomes sufficiently low that it's more of a "why not?" type thing.
Sure, I'm happy to add something like that to my test runners: kvm-xfstests, gce-xfstests, and android-xfstests. Then anyone who uses my test runner infrastructure would get uploading for free. We might need to debate whether uploading should be enabled by default (I can imagine some people won't want to upload information to a public site, lest it leak information about an upcoming mobile handset :-), but that's a minor point.
Personally, I'm not going to have time to look into this for a while, but... patches welcome. Or even something that takes a junit XML file and uploads it to KCIDB. If someone can make something like that available, I should be able to take it the rest of the way.
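Just to sketch what I mean (totally untested, written off the top of my head: "myorigin" and the build id are placeholders, and the kcidb schema version and required fields would need to be checked against the current kcidb documentation before anyone relies on this):

#!/usr/bin/env python3
# Rough sketch: convert a junit XML file into a kcidb submission
# document.  Untested; "myorigin" and the id scheme are placeholders,
# and the schema version/required fields should be double-checked
# against the current kcidb documentation.
import json
import sys
import xml.etree.ElementTree as ET

def junit_status(case):
    # Map the junit result elements to kcidb test statuses.
    if case.find("failure") is not None:
        return "FAIL"
    if case.find("error") is not None:
        return "ERROR"
    if case.find("skipped") is not None:
        return "SKIP"
    return "PASS"

def convert(junit_path, origin, build_id):
    tests = []
    for i, case in enumerate(ET.parse(junit_path).getroot().iter("testcase")):
        tests.append({
            "id": "%s-test-%d" % (build_id, i),
            "build_id": build_id,
            "origin": origin,
            "path": case.get("name", ""),      # e.g. "generic/475"
            "status": junit_status(case),
        })
    return {"version": {"major": 4, "minor": 0}, "tests": tests}

if __name__ == "__main__":
    report = convert(sys.argv[1], "myorigin", "myorigin:build-1")
    # The output would then get piped to kcidb-submit (or whatever
    # the current upload tool is) to do the actual upload.
    json.dump(report, sys.stdout, indent=2)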
Cheers,
- Ted