Hi!
On Mon, 15 Apr 2024 at 15:39, Metzger, Markus T <markus.t.metzger@intel.com> wrote:
Hello,
| 4 patches in gdb
| Patchwork URL: https://patchwork.sourceware.org/patch/88278
| 343a2568d2c gdb, infrun: fix multi-threaded reverse stepping
| a4cfc3d32a8 gdb, infrun, record: move no-history notification into normal_stop
| fc70b453e32 gdb, infrun, record: fix hang when step-over fails with no-history
| 45548f364fd gdb, infrun, btrace: fix reverse/replay stepping at end of execution history
| ... applied on top of baseline commit:
| 31c21e2c13d [gdb/testsuite] Fix gdb.threads/access-mem-running-thread-exit.exp with clang
FAIL: 1 regressions: 1 progressions
regressions.sum:
=== gdb tests ===
Running gdb:gdb.threads/interrupt-while-step-over.exp ...
FAIL: gdb.threads/interrupt-while-step-over.exp: displaced-stepping=off: iter=6: wait for stops (timeout)
The log contains several FAILs at different iterations, yet this report lists a single new fail at iteration 6 IIUC. Is this test known to be flaky?
I tried reproducing this on x86-64 but couldn't find any flakiness with this test.
Yes, we do see it as a flaky test. Due to how we manage the lists of flaky tests, it had been removed from the list, so the failure was detected as a regression caused by your patches. It will be automatically added back to the list of flaky tests soon.
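If it helps to picture how that can happen, here is a purely hypothetical Python sketch (not our actual tcwg scripts; the names and the expiry period are invented): entries in the flaky list expire after a while, so a test that stops flaking drops off the list, and its next FAIL is reported as a regression until the entry is re-added.

from datetime import date, timedelta

EXPIRY = timedelta(weeks=12)  # invented retention period

def active_flaky_tests(flaky_entries, today=None):
    """Return the test names whose flaky entry has not yet expired.

    flaky_entries maps a result line (as found in a .sum file) to the
    date it was last seen flaking; expired entries are dropped, so the
    corresponding FAILs surface as regressions again.
    """
    today = today or date.today()
    return {name for name, seen in flaky_entries.items()
            if today - seen <= EXPIRY}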
progressions.sum:
=== gdb tests ===
Running gdb:gdb.threads/detach-step-over.exp ...
FAIL: gdb.threads/detach-step-over.exp: breakpoint-condition-evaluation=host: target-non-stop=on: non-stop=on: displaced=off: test_detach_command: iter 2: attach (GDB internal error)
What is a 'progression'?
It means "improvement", maybe we chose a bad name :-) In this case, it means that a FAIL has disappeared (well maybe it was a flaky test too)
The log again contains several fails in this test. I also found the same GDB internal error in xfails.xfail on iter 3 instead of 2.
It's possible that a log contains more failures than what we report as regressions: we compare the current results against a baseline which may itself contain several failures, and we report only the new ones.
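As a rough illustration (a hedged Python sketch, not our actual comparison scripts; the paths and file layout are assumptions), regressions and progressions fall out of a set difference between the FAIL lines of the two DejaGnu .sum files:

def fail_lines(sum_path):
    # Collect the failing result lines of a DejaGnu .sum file.
    with open(sum_path) as f:
        return {line.strip() for line in f
                if line.startswith(("FAIL:", "UNRESOLVED:"))}

baseline = fail_lines("baseline/gdb.sum")  # illustrative paths
current = fail_lines("current/gdb.sum")
flaky = fail_lines("xfails.xfail")         # known-flaky results are ignored

regressions = current - baseline - flaky   # new FAILs
progressions = baseline - current - flaky  # FAILs that disappeared

In those terms, the 'progression' you saw is simply a baseline FAIL that is absent from the current run.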
I tried reproducing this on x86-64 but couldn't find any flakiness with this test.
Thanks for checking.
It's also possible that the flakiness is related to the target (here 'arm' -- not 'aarch64').
You can find the failure logs in *.log.1.xz files in
precommit/2150/artifact/artifacts/artifacts.precommit/00-sumfiles/
The full lists of regressions and progressions, as well as the configure and make commands, are in
precommit/2150/artifact/artifacts/artifacts.precommit/notify/
The list of [ignored] baseline and flaky failures is in
precommit/2150/artifact/artifacts/artifacts.precommit/sumfiles/xfails.xfail
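For example (a hedged sketch; the exact log file name under 00-sumfiles/ is not listed above, so "gdb.log.1.xz" is a guess, browse the directory for the real names), you can fetch and decompress one of those logs with plain Python:

import lzma
import urllib.request

BASE = ("https://ci.linaro.org/job/tcwg_gdb_check--master-arm-precommit/"
        "2150/artifact/artifacts/artifacts.precommit/00-sumfiles/")

# Download one compressed DejaGnu log and decompress it in memory.
with urllib.request.urlopen(BASE + "gdb.log.1.xz") as resp:
    log_text = lzma.decompress(resp.read()).decode(errors="replace")

print(log_text[:2000])  # first part of the log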
The configuration of this build is: CI config tcwg_gdb_check master-arm
-----------------8<--------------------------8<--------------------------8<-----------------------
The information below can be used to reproduce a debug environment:
Current build   : https://ci.linaro.org/job/tcwg_gdb_check--master-arm-precommit/2150/artifact/artifacts
Reference build : https://ci.linaro.org/job/tcwg_gdb_check--master-arm-build/1039/artifact/artifacts
Warning: we do not enable maintainer-mode, nor do we automatically update generated files, which may lead to failures if the patch modifies the master files.
If those are indeed real fails introduced by my patches, any chance I can debug this on an x86-64 system? Via simulation, perhaps?
At this stage I think it's an artifact of how we manage flaky tests.
Thiago will comment if he thinks the problem is really caused by your patches.
Thanks,
Christophe
thanks,
Markus.