All,
During Connect it was suggested that each working group should have its own
IRC channel for discussions and topics specific to that group (as opposed to
#linaro, which is for 'generic' Linaro conversations).
Therefore I have just set up #linaro-tcwg on Freenode for the Toolchain
Working Group.
This channel is public and open to anyone who wants to talk with the TCWG
about anything toolchain-related.
Thanks,
Matt
--
Matthew Gretton-Dann
Toolchain Working Group, Linaro
== Progress ==
* Support (6/10)
- Trying to add ADRL support in the assembler: http://llvm.org/PR24350
* Buildbots (1/10)
- Some breakages, nothing serious
* Background (3/10)
- Code review, meetings, discussions, general support, etc.
- Working on 2016's plan
== Progress ==
o Linaro GCC (4/10)
* Found and backported missing dependencies
* More backports
o Upstream work (4/10)
* Reviewed some patches
* Tried to reproduce an LRA issue (fixed in the meantime)
* Continued work on sanitizing the gfortran testsuite
o Misc (2/10)
* Various meetings
== Plan ==
o Continue on-going tasks
== This Week ==
* Target conversion to hook (2/10)
- Completed build and test for the conversion of ASM_FORMAT_PRIVATE_NAME and ASM_LABEL_OUTPUT_LABEL to hooks
- Converted ASM_OUTPUT_LABEL_REF to hook
* Holidays (8/10)
- 3 days of leave and a public holiday (Guru Nanak Jayanti)
== Next Week ==
- Send updated patch upstream for tcwg-72
- LTO benchmarking with SPEC for correctness on arm and aarch64
- Target hook conversion
The Linaro Toolchain Working Group is pleased to announce the availability
of the Linaro Stable Binary Toolchain GCC 5.2-2015.11 Release Archives.
http://releases.linaro.org/components/toolchain/binaries/5.2-2015.11/
http://releases.linaro.org/components/toolchain/gcc-linaro/5.2-2015.11/
These archives provide cross-toolchain executables (compiler, debugger,
linker, etc.) and shared libraries (libstdc++, libc, etc.) that target ARM
or AArch64 GNU/Linux and bare-metal environments. The cross-toolchain
binaries execute on a Linux or MS Windows (under mingw32) host operating
system.
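As a quick smoke test of an unpacked archive (a minimal sketch only; the
archive name, install location and use of user-mode QEMU below are my own
assumptions, not part of the release notes):

$ export PATH=$HOME/gcc-linaro-5.2-2015.11-x86_64_aarch64-linux-gnu/bin:$PATH
$ echo 'int main(void) { return 0; }' > hello.c
$ aarch64-linux-gnu-gcc -static -o hello hello.c   # cross-compile on the x86_64 host
$ qemu-aarch64 ./hello && echo PASS                # run the static binary under user-mode QEMU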
Please evaluate this release candidate for correctness. Linaro will shortly
spin the Linaro GCC 5.2-2015.11 release if this release candidate passes
stakeholder validation.
For bugs related to this release candidate, please email
linaro-toolchain@lists.linaro.org or file a bug at
https://bugs.linaro.org/enter_bug.cgi?product=Linux%20Binary%20toolchain
NEWS
* GCC 5.2 2015.11
The Linaro GCC 5.2 2015.11 binary toolchain release is built from the
Linaro GCC-5.2-2015.11 release source archive. The Linaro GCC-5.2-2015.11
release source archive is derived from the same sources as the Linaro
GCC-5.2-2015.10 snapshot source archive.
* GCC 5.2 2015.11-rc1
The Linaro GCC 5.2 2015.11-rc1 binary toolchain release-candidate is built
from the Linaro GCC-5.2-2015.11 release-candidate source archive. The
Linaro GCC-5.2-2015.11-rc1 release-candidate source archive is derived from
the same sources as the Linaro GCC-5.2-2015.10 snapshot source archive.
--
Ryan S. Arnold
Linaro Toolchain Working Group - Engineering Manager
www.linaro.org
Hello,
I am trying to create gcc 4.9.x toolchains for ARM v7 and v8 based on Linaro's sources. At first Linaro's 4.9-2015.05 binary release looked suitable, but then one of my colleagues noticed that it had an incompatibility with Red Hat Enterprise Linux 6. Linaro has decided not to fix this incompatibility (see https://bugs.linaro.org/show_bug.cgi?id=1869 ).
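(For anyone in a similar situation: I am not claiming this is the exact issue tracked in that bug, but a quick, generic way to see what a prebuilt binary expects from the host glibc is to list its versioned symbol references; the path below is only illustrative:

$ objdump -T bin/arm-linux-gnueabihf-gcc | grep -o 'GLIBC_[0-9.]*' | sort -uV | tail -1

If the newest version printed is newer than what RHEL6's glibc provides, the binary refuses to start with a "version `GLIBC_x.y' not found" error.)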
So, I tried to work around that bug by rebuilding the toolchains myself on RHEL6 using Linaro's new ABE script. I initially tried to recreate the builds using ABE's --manifest <manifest_file> command-line option, but I ran into problems with that approach, including ABE building gcc 6.x instead of 4.9.x, and eventually gave up on it. Instead, I extracted the required branches and revisions from the manifest files and passed them as ABE command-line options, like this:
abe.sh --target aarch64-elf --build all --parallel --dump --tarball --release fsl-2015.11.16 --set libc=newlib binutils=binutils-gdb.git~linaro_binutils-2_24-branch@a93e252ee5250dba831e54f98336b40c7210dac7 gcc=gcc-linaro-4.9-2015.05 gmp=5.1.3 gdb=binutils-gdb.git~gdb-7.10-branch@ef5fa52ac9ab68b505b52acb2d2068b366ba8bf2 mpfr=3.1.2 mpc=1.0.1 newlib=newlib.git~linaro_newlib-branch@136b66e404df41435bdec4630c0787b0bc7e7580
abe.sh --target aarch64-linux-gnu --build all --parallel --dump --tarball --release fsl-2015.11.16 --set libc=glibc binutils=binutils-gdb.git~linaro_binutils-2_24-branch@a93e252ee5250dba831e54f98336b40c7210dac7 gcc=gcc-linaro-4.9-2015.05 gmp=5.1.3 gdb=binutils-gdb.git~gdb-7.10-branch@ef5fa52ac9ab68b505b52acb2d2068b366ba8bf2 mpfr=3.1.2 mpc=1.0.1 glibc=glibc-linaro-2.20-2014.11.tar.xz
abe.sh --target arm-eabi --build all --parallel --dump --tarball --release fsl-2015.11.16 --set libc=newlib binutils=binutils-gdb.git~linaro_binutils-2_24-branch@a93e252ee5250dba831e54f98336b40c7210dac7 gcc=gcc-linaro-4.9-2015.05 gmp=5.1.3 gdb=binutils-gdb.git~gdb-7.10-branch@ef5fa52ac9ab68b505b52acb2d2068b366ba8bf2 mpfr=3.1.2 mpc=1.0.1 newlib=newlib.git~linaro_newlib-branch@136b66e404df41435bdec4630c0787b0bc7e7580
abe.sh --target arm-linux-gnueabihf --build all --parallel --dump --tarball --release fsl-2015.11.16 --set libc=glibc binutils=binutils-gdb.git~linaro_binutils-2_24-branch@a93e252ee5250dba831e54f98336b40c7210dac7 gcc=gcc-linaro-4.9-2015.05 gmp=5.1.3 gdb=binutils-gdb.git~gdb-7.10-branch@ef5fa52ac9ab68b505b52acb2d2068b366ba8bf2 mpfr=3.1.2 mpc=1.0.1 glibc=glibc-linaro-2.20-2014.11.tar.xz
That worked, and the resulting toolchains ran without error under RHEL6. Note that I deliberately chose to switch to glibc in the *-linux-* toolchains, whereas the manifest files had them using eglibc.
At least one serious problem remained: the toolchains supported a different set of multilibs than previous releases did. For example, arm-eabi-gcc reported that it supported only three library variants:
$ arm-eabi-gcc -print-multi-lib
.;
thumb;@mthumb
fpu;@mfloat-abi=hard
Linaro's 2015.05 build of the toolchain gives the same output. However, previous releases of this toolchain supported a much larger set of multilibs. A build from 2014.08 reports:
$ arm-none-eabi-gcc --print-multi-lib
.;
thumb;@mthumb
v7-a;@march=armv7-a
v7ve;@march=armv7ve
v8-a;@march=armv8-a
v7-a/fpv3/softfp;@march=armv7-a@mfpu=vfpv3-d16@mfloat-abi=softfp
v7-a/fpv3/hard;@march=armv7-a@mfpu=vfpv3-d16@mfloat-abi=hard
v7-a/simdv1/softfp;@march=armv7-a@mfpu=neon@mfloat-abi=softfp
v7-a/simdv1/hard;@march=armv7-a@mfpu=neon@mfloat-abi=hard
v7ve/fpv4/softfp;@march=armv7ve@mfpu=vfpv4-d16@mfloat-abi=softfp
v7ve/fpv4/hard;@march=armv7ve@mfpu=vfpv4-d16@mfloat-abi=hard
v7ve/simdvfpv4/softfp;@march=armv7ve@mfpu=neon-vfpv4@mfloat-abi=softfp
v7ve/simdvfpv4/hard;@march=armv7ve@mfpu=neon-vfpv4@mfloat-abi=hard
v8-a/simdv8/softfp;@march=armv8-a@mfpu=neon-fp-armv8@mfloat-abi=softfp
v8-a/simdv8/hard;@march=armv8-a@mfpu=neon-fp-armv8@mfloat-abi=hard
thumb/v7-a;@mthumb@march=armv7-a
thumb/v7ve;@mthumb@march=armv7ve
thumb/v8-a;@mthumb@march=armv8-a
thumb/v7-a/fpv3/softfp;@mthumb@march=armv7-a@mfpu=vfpv3-d16@mfloat-abi=softfp
thumb/v7-a/fpv3/hard;@mthumb@march=armv7-a@mfpu=vfpv3-d16@mfloat-abi=hard
thumb/v7-a/simdv1/softfp;@mthumb@march=armv7-a@mfpu=neon@mfloat-abi=softfp
thumb/v7-a/simdv1/hard;@mthumb@march=armv7-a@mfpu=neon@mfloat-abi=hard
thumb/v7ve/fpv4/softfp;@mthumb@march=armv7ve@mfpu=vfpv4-d16@mfloat-abi=softfp
thumb/v7ve/fpv4/hard;@mthumb@march=armv7ve@mfpu=vfpv4-d16@mfloat-abi=hard
thumb/v7ve/simdvfpv4/softfp;@mthumb@march=armv7ve@mfpu=neon-vfpv4@mfloat-abi=softfp
thumb/v7ve/simdvfpv4/hard;@mthumb@march=armv7ve@mfpu=neon-vfpv4@mfloat-abi=hard
thumb/v8-a/simdv8/softfp;@mthumb@march=armv8-a@mfpu=neon-fp-armv8@mfloat-abi=softfp
thumb/v8-a/simdv8/hard;@mthumb@march=armv8-a@mfpu=neon-fp-armv8@mfloat-abi=hard
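As a side note (a usage sketch, not output captured from either toolchain), the driver's -print-multi-directory option shows which of these variants a given set of flags would select; with the 2014.08 build I would expect:

$ arm-none-eabi-gcc -march=armv7-a -mfpu=neon -mfloat-abi=hard -print-multi-directory
v7-a/simdv1/hard

With the reduced multilib list of the 2015.05-based build, no such NEON/hard-float variant exists for those flags to map onto.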
I found that the file that encodes this older set of multilib mappings is gcc-linaro-4.9-2015.05/gcc/config/arm/t-aprofile. Based on some comments in gcc-linaro-4.9-2015.05/gcc/config.gcc, I guessed that ABE should have configured gcc with "--with-multilib-list=aprofile", and without "--with-arch=armv7-a" or "--with-fpu=vfpv3-d16". I quickly hacked these changes into abe/config/gcc.conf like this:
diff --git a/config/gcc.conf b/config/gcc.conf
index 19c44ca..4cc5eaf
--- a/config/gcc.conf
+++ b/config/gcc.conf
@@ -111,9 +111,9 @@ if test x"${build}" != x"${target}"; then
 default_configure_flags="${default_configure_flags} --with-tune=cortex-a9"
 fi
 if test x"${override_arch}" = x -a x"${override_cpu}" = x; then
-    default_configure_flags="${default_configure_flags} --with-arch=armv7-a"
+    default_configure_flags="${default_configure_flags}"
 fi
-    default_configure_flags="${default_configure_flags} --enable-threads=no --with-fpu=vfpv3-d16 --enable-multilib --disable-multiarch"
+    default_configure_flags="${default_configure_flags} --enable-threads=no --with-multilib-list=aprofile --enable-multilib --disable-multiarch"
 languages="c,c++,lto"
 ;;
 aarch64*-*elf)
After rebuilding the toolchain, I found it had the desired older set of multilibs.
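To double-check that the change actually took effect, the recorded configure line of the rebuilt compiler can be inspected (a small sketch; the same check works for any of the four triplets):

$ arm-eabi-gcc -v 2>&1 | grep -o -- '--with-multilib-list=[a-z]*'
--with-multilib-list=aprofile

gcc -v prints the full 'Configured with:' line, so the absence of --with-arch and --with-fpu can be confirmed the same way.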
I hope that this mail will help anyone who experiences similar problems. I have filed a bug report for the multilib issue. See https://bugs.linaro.org/show_bug.cgi?id=1920 .
While validating the toolchains, DejaGnu reported a few unexpected failures. Does the TCWG publish its validation results anywhere for comparison? That would be very helpful.
Thanks,
Fred Peterson
Freescale Developer Tools
== Progress ==
o Validation and Infra (2/10)
* Some fixes in our release script
* Looked at refactoring our snapshot publishing job
o Linaro GCC (4/10)
* Start backports for 2015.12
* Tracking dependencies
o Upstream work (1/10)
* Continued work on sanitizing the gfortran testsuite
o Misc (3/10)
* Various meetings
* Internal support
== Plan ==
o Continue on-going tasks
Controlled image builds - TCWG-360 [2/10]
* A few more test/debug cycles with ci-loop-built image
Jenkins benchmarking job - TCWG-348 [3/10]
* YAML-ised Jenkins job, more test/debug cycles
Juno crashdump [1/10]
* Got a usable dump (via alt-sysrq-c; trigger sketch after this list) with the latest patches plus some fiddling
SPEC-on-Android [1/10]
* Looked at Qian's work to date, didn't come up with any bright ideas
Misc [3/10]
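On the crashdump item: besides the alt-sysrq-c key chord, the same crash can be triggered from a shell (a generic sketch, assuming sysrq is enabled and a kdump kernel is loaded):

# echo 1 > /proc/sys/kernel/sysrq    # allow all sysrq functions
# echo c > /proc/sysrq-trigger       # force a crash so the dump gets captured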
=Plan=
Review security with shared uinstance/main instance code
Expose more data, benchmarks to bundles
Continue debug/test of Jenkins job
Create bootable image for at least 1 target, or know what the problems are
Write up noise control report (if time)
Set Juno off, try to get a dump of my crash
Probably more support for SPEC-on-Android
=Absences=
'ARM Day' next Monday (30th)