Re: [blfs-dev] llvm-6.0.0 FTBFS with glibc-2.28

2018-08-18 Thread Thomas Trepl
Am Freitag, den 17.08.2018, 21:27 +0100 schrieb Ken Moffat:
> On Fri, Aug 17, 2018 at 08:57:02PM +0200, Thomas Trepl wrote:
> > 
> > Rerun the tests here with "ninja check{,-clang{,-tooling}}" - not
> > "make". I used
> > 
> > CC=gcc CXX=g++  \
> > cmake -DCMAKE_INSTALL_PREFIX=/usr   \
> >   -DLLVM_ENABLE_FFI=ON  \
> >   -DCMAKE_BUILD_TYPE=Release\
> >   -DLLVM_BUILD_LLVM_DYLIB=ON\
> >   -DLLVM_TARGETS_TO_BUILD="host;AMDGPU" \
> >   -DLLVM_BUILD_TESTS=ON \
> >   -Wno-dev -G Ninja ..
> > 
> > The summary looks fine to me:
> > 
> > [0/3] Running the LLVM regression tests
> > -- Testing: 23298 tests, 4 threads --
> > Testing: 0 .. 10.. 20.. 30.. 40.. 50.. 60.. 70.. 80.. 90.. 
> > Testing Time: 106.64s
> >   Expected Passes: 15113
> >   Expected Failures  : 56
> >   Unsupported Tests  : 8129
> > [1/3] Running lit suite /tmp/llvm/build/llvm-
> > 6.0.1.src/tools/clang/test/Tooling
> > llvm-lit: /tmp/llvm/build/llvm-
> > 6.0.1.src/utils/lit/lit/llvm/config.py:334: note: using clang:
> > /tmp/llvm/build/llvm-6.0.1.src/build/bin/clang
> > -- Testing: 26 tests, 4 threads --
> > Testing: 0 .. 10.. 20.. 30.. 40.. 50.. 60.. 70.. 80.. 90.. 
> > Testing Time: 3.32s
> >   Expected Passes: 26
> > [2/3] Running the Clang regression tests
> > llvm-lit: /tmp/llvm/build/llvm-
> > 6.0.1.src/utils/lit/lit/llvm/config.py:334: note: using clang:
> > /tmp/llvm/build/llvm-6.0.1.src/build/bin/clang
> > -- Testing: 11832 tests, 4 threads --
> > Testing: 0 .. 10.. 20.. 30.. 40.. 50.. 60.. 70.. 80.. 90.. 
> > Testing Time: 135.52s
> >   Expected Passes: 11572
> >   Expected Failures  : 18
> >   Unsupported Tests  : 242
> > ...
> > 
> > When running "ninja check-all" it looks like (same build
> > instructions,
> > clean build):
> > 
> > ...
> > Testing: 0 .. 10.. 20.. 30.. 40.. 50.. 60.. 70.. 80.. 90.. 
> > 
> > 1 warning(s) in tests.
> > Testing Time: 615.44s
> > 
> > Failing Tests (8):
> > LeakSanitizer-AddressSanitizer-x86_64 ::
> > TestCases/Linux/use_tls_dynamic.cc
> > LeakSanitizer-Standalone-x86_64 ::
> > TestCases/Linux/use_tls_dynamic.cc
> > MemorySanitizer-X86_64 :: Linux/sunrpc.cc
> > MemorySanitizer-X86_64 :: Linux/sunrpc_bytes.cc
> > MemorySanitizer-X86_64 :: Linux/sunrpc_string.cc
> > MemorySanitizer-X86_64 :: dtls_test.c
> > SanitizerCommon-lsan-x86_64-Linux ::
> > Posix/sanitizer_set_death_callback_test.cc
> > ThreadSanitizer-x86_64 :: sunrpc.cc
> > 
> >   Expected Passes: 29119
> >   Expected Failures  : 103
> >   Unsupported Tests  : 8914
> >   Unexpected Failures: 8
> > FAILED: CMakeFiles/check-all 
> > ...
> > 
> > So, comparable to Ken's results but different anyhow - all failures
> > have to do with the rpc header files. Interesting that cmake checks
> > for them and states that they are not found but continues (see the
> > first two lines of the following grep output), so I think they are
> > not hard prerequisites. The tests do not seem to allow for the
> > headers being unavailable. I simply did a grep on my log file:
> > 
> > # grep "rpc/.*not found" llvm-check-all.log 
> > -- Looking for rpc/xdr.h - not found
> > -- Looking for tirpc/rpc/xdr.h - not found
> > /home/lfs/tmp/llvm/build/llvm-6.0.1.src/projects/compiler-
> > rt/test/msan/Linux/sunrpc.cc:15:10: fatal error: 'rpc/xdr.h' file
> > not
> > found
> > /home/lfs/tmp/llvm/build/llvm-6.0.1.src/projects/compiler-
> > rt/test/msan/Linux/sunrpc_bytes.cc:8:10: fatal error: 'rpc/xdr.h'
> > file
> > not found
> > /home/lfs/tmp/llvm/build/llvm-6.0.1.src/projects/compiler-
> > rt/test/msan/Linux/sunrpc_string.cc:8:10: fatal error: 'rpc/xdr.h'
> > file
> > not found
> > /home/lfs/tmp/llvm/build/llvm-6.0.1.src/projects/compiler-
> > rt/test/tsan/sunrpc.cc:4:10: fatal error: 'rpc/types.h' file not
> > found
> > 
> > Can we assume that those unexpected failures are caused by a flaw
> > in the test suite?
> > 
> 
> I think so.  I was going to suggest dropping back to the targets DJ
> was using, but I see that there were a lot fewer expected passes
> there (11572 instead of 29119).  As to the marginally different
> number of unexpected failures, probably a minor difference in what
> we have installed.
> > 
> > Btw, just redoing the build with 'make' to see if the hang is
> > reproducible there. If not, then it may have been caused by who
> > knows what...
> > 
> 
> Currently rerunning the base build (i.e. no docs) with
> hyperthreading (should be slower) and ninja -j4.  It is consistently
> running at 400%, unlike the build on 4 real cores which mostly only
> ran at 100% and 200%.
> 
> So, it now looks as if ninja makes a much better job of the build
> than cmake's Makefiles.  OTOH, building docs might be a lot messier.
> 
> Still thinking about what will suit _me_ best, but if your run with
> 'make' works, don't let me stop you updating with your preference.

'make'-based build 

Re: [blfs-dev] llvm-6.0.0 FTBFS with glibc-2.28

2018-08-17 Thread Ken Moffat
On Sat, Aug 18, 2018 at 01:17:17AM +0100, Ken Moffat wrote:
> 
> Started another run (cmake, ninja, cmake for sphinx, ninja, cmake
> for clang docs, ninja, DESTDIR and the docs), then run the tests (to
> get a better idea of extra space/time).  Will also determine if
> specifying -j for the tests helps (it defaulted to 8 jobs, possibly
> because the whole box has 8 CPUs, but loadavg was 10-12 during the
> tests - will try -j4).
> 
> Fun, fun, fun ;-)
> 
Trial was successful.  And ninja is faster (as well as the build
itself being faster, building the test progs is faster because make
runs that step single-threaded).

On a Haswell i7 using both threads of cores 1 and 2 (CPUs 1,2,5,6),
and changing *each* cmake invocation to add -G Ninja, but no other
changes -

Normal build and DESTDIR 29 SBU, 1.8GiB + 627MB installed = 2.4GiB

The minimal docs using just sphinx did not take very long and produced
only a few MB of output.  No doubt they would take longer and be
bigger if I'd thrown all the deps at them, but we don't actually
specify a time or size for these.

check-all: 14 SBU, an extra 11974 MiB (11.7 GiB) in my run

Also: check-all mentioned it could not import psutil, so some tests
would be skipped and the timeout command will not work; I suppose it
should be listed as optional for the tests:
https://pypi.org/project/psutil/
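
For anyone who wants it before the next run, it is only a Python
module, so something like this should pull it in (assuming pip for
python3 is already installed - untested here):

  pip3 install --user psutil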

ĸen
-- 
   Entropy not found, thump keyboard to continue



Re: [blfs-dev] llvm-6.0.0 FTBFS with glibc-2.28

2018-08-17 Thread Ken Moffat
On Fri, Aug 17, 2018 at 01:00:39AM +0100, Ken Moffat wrote:
> On Sun, Aug 12, 2018 at 11:15:06PM -0500, Brendan L wrote:
> > Not about llvm, but I agree about using cmake's ninja support.  It
> > made building webkitgtk so much faster for me.
> > 
> 
> I meant to reply to this part of the thread:
> 
> No problem for me with using ninja on cmake packages, but when I
> last looked at doing that (ages ago) it was because the inkscape
> devs were enthusing about it.  And what I found was that on the
> packages I tried it had no benefit for a clean build - at that time
> the big benefits were for developers who changed something, did a
> build, changed something else and then needed to rebuild.
> 
And I now agree that it IS faster.  Dunno whether ninja has improved
or cmake's Makefiles have got worse, but building llvm with 4
available CPUs today (several times, the first time I thought I'd
mistyped something because it was so slow) I eventually discovered:

make -j4 : starts as 4 jobs (i.e. 400% CPU), but quickly falls back
to 1 job (100%) and later increases to 2 jobs (200%).  Perhaps there
were brief other times when 4 jobs were running, but I had better
things to do than sit and continually look at 'top'.

ninja -j4 : consistently 400% CPU.

Similarly with running the tests: make builds the test programs with
-j1 and then runs the tests in parallel, while ninja builds them with
multiple jobs.

ĸen
-- 
   Entropy not found, thump keyboard to continue



Re: [blfs-dev] llvm-6.0.0 FTBFS with glibc-2.28

2018-08-17 Thread Ken Moffat
On Fri, Aug 17, 2018 at 09:27:29PM +0100, Ken Moffat wrote:
> 
> Currently rerunning the base build (i.e. no docs) with
> hyperthreading (should be slower) and ninja -j4.  It is consistently
> running at 400%, unlike the build on 4 real cores which mostly only
> ran at 100% and 200%.
> 
> So, it now looks as if ninja makes a much better job of the build
> than cmake's Makefiles.  OTOH, building docs might be a lot messier.
> 
A trial AFTER running the tests implied that the existing cmake
commands for main docs and clang docs CAN be run with -G Ninja
*after* the main build without problems.  Now trying to confirm
that.

Also, avoiding all the options *except* Sphinx still allowed me to
build the clang docs, and a quick perusal in 'links -g' seemed fine.
No doubt viewing them in a graphical browser with all the extras
might provide more diagrams, but I'm going to suggest that the
wording for the clang docs be changed to: if you have downloaded (at
least) Sphinx, the clang documentation can be built too.
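
For reference, a minimal rerun along these lines ought to do it (a
sketch, untested exactly as written; the docs-* targets are what
sphinx-enabled llvm and clang define, run in the existing build
directory after the main build):

  cmake -DLLVM_ENABLE_SPHINX=ON          \
        -DSPHINX_WARNINGS_AS_ERRORS=OFF  \
        -Wno-dev -G Ninja ..             &&
  ninja docs-llvm-html docs-clang-html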

Started another run (cmake, ninja, cmake for sphinx, ninja, cmake
for clang docs, ninja, DESTDIR and the docs), then run the tests (to
get a better idea of extra space/time).  Will also determine if
specifying -j for the tests helps (it defaulted to 8 jobs, possibly
because the whole box has 8 CPUs, but loadavg was 10-12 during the
tests - will try -j4).
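
One thing worth noting: -j passed to ninja only limits ninja's own
build jobs; the lit runner takes its worker count from the
LLVM_LIT_ARGS cache variable, whose default "-sv" lets it use every
CPU.  So if -j4 on the command line makes no difference, something
like this (rerun in the build dir; the other cached switches are kept)
should cap the test jobs as well - untested here:

  cmake -DLLVM_LIT_ARGS="-sv -j4" -Wno-dev -G Ninja ..  &&
  ninja check-all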

Fun, fun, fun ;-)

ĸen
-- 
   Entropy not found, thump keyboard to continue



Re: [blfs-dev] llvm-6.0.0 FTBFS with glibc-2.28

2018-08-17 Thread Ken Moffat
On Fri, Aug 17, 2018 at 08:57:02PM +0200, Thomas Trepl wrote:
> 
> Rerun the tests here with "ninja check{,-clang{,-tooling}}" - not
> "make". I used
> 
> CC=gcc CXX=g++  \
> cmake -DCMAKE_INSTALL_PREFIX=/usr   \
>   -DLLVM_ENABLE_FFI=ON  \
>   -DCMAKE_BUILD_TYPE=Release\
>   -DLLVM_BUILD_LLVM_DYLIB=ON\
>   -DLLVM_TARGETS_TO_BUILD="host;AMDGPU" \
>   -DLLVM_BUILD_TESTS=ON \
>   -Wno-dev -G Ninja ..
> 
> The summary looks fine to me:
> 
> [0/3] Running the LLVM regression tests
> -- Testing: 23298 tests, 4 threads --
> Testing: 0 .. 10.. 20.. 30.. 40.. 50.. 60.. 70.. 80.. 90.. 
> Testing Time: 106.64s
>   Expected Passes: 15113
>   Expected Failures  : 56
>   Unsupported Tests  : 8129
> [1/3] Running lit suite /tmp/llvm/build/llvm-
> 6.0.1.src/tools/clang/test/Tooling
> llvm-lit: /tmp/llvm/build/llvm-
> 6.0.1.src/utils/lit/lit/llvm/config.py:334: note: using clang:
> /tmp/llvm/build/llvm-6.0.1.src/build/bin/clang
> -- Testing: 26 tests, 4 threads --
> Testing: 0 .. 10.. 20.. 30.. 40.. 50.. 60.. 70.. 80.. 90.. 
> Testing Time: 3.32s
>   Expected Passes: 26
> [2/3] Running the Clang regression tests
> llvm-lit: /tmp/llvm/build/llvm-
> 6.0.1.src/utils/lit/lit/llvm/config.py:334: note: using clang:
> /tmp/llvm/build/llvm-6.0.1.src/build/bin/clang
> -- Testing: 11832 tests, 4 threads --
> Testing: 0 .. 10.. 20.. 30.. 40.. 50.. 60.. 70.. 80.. 90.. 
> Testing Time: 135.52s
>   Expected Passes: 11572
>   Expected Failures  : 18
>   Unsupported Tests  : 242
> ...
> 
> When running "ninja check-all" it looks like (same build instructions,
> clean build):
> 
> ...
> Testing: 0 .. 10.. 20.. 30.. 40.. 50.. 60.. 70.. 80.. 90.. 
> 
> 1 warning(s) in tests.
> Testing Time: 615.44s
> 
> Failing Tests (8):
> LeakSanitizer-AddressSanitizer-x86_64 ::
> TestCases/Linux/use_tls_dynamic.cc
> LeakSanitizer-Standalone-x86_64 ::
> TestCases/Linux/use_tls_dynamic.cc
> MemorySanitizer-X86_64 :: Linux/sunrpc.cc
> MemorySanitizer-X86_64 :: Linux/sunrpc_bytes.cc
> MemorySanitizer-X86_64 :: Linux/sunrpc_string.cc
> MemorySanitizer-X86_64 :: dtls_test.c
> SanitizerCommon-lsan-x86_64-Linux ::
> Posix/sanitizer_set_death_callback_test.cc
> ThreadSanitizer-x86_64 :: sunrpc.cc
> 
>   Expected Passes: 29119
>   Expected Failures  : 103
>   Unsupported Tests  : 8914
>   Unexpected Failures: 8
> FAILED: CMakeFiles/check-all 
> ...
> 
> So, comparable to Ken's results but different anyhow - all failures
> have to do with the rpc header files. Interesting that cmake checks for
> them and states that they are not found but continues (see the first
> two lines of the following grep output), so I think they are not hard
> prerequisites. The tests do not seem to allow for the headers being
> unavailable. I simply did a grep on my log file:
> 
> # grep "rpc/.*not found" llvm-check-all.log 
> -- Looking for rpc/xdr.h - not found
> -- Looking for tirpc/rpc/xdr.h - not found
> /home/lfs/tmp/llvm/build/llvm-6.0.1.src/projects/compiler-
> rt/test/msan/Linux/sunrpc.cc:15:10: fatal error: 'rpc/xdr.h' file not
> found
> /home/lfs/tmp/llvm/build/llvm-6.0.1.src/projects/compiler-
> rt/test/msan/Linux/sunrpc_bytes.cc:8:10: fatal error: 'rpc/xdr.h' file
> not found
> /home/lfs/tmp/llvm/build/llvm-6.0.1.src/projects/compiler-
> rt/test/msan/Linux/sunrpc_string.cc:8:10: fatal error: 'rpc/xdr.h' file
> not found
> /home/lfs/tmp/llvm/build/llvm-6.0.1.src/projects/compiler-
> rt/test/tsan/sunrpc.cc:4:10: fatal error: 'rpc/types.h' file not found
> 
> Can we assume that those unexpected failures are caused by a flaw in the
> test suite?
> 

I think so.  I was going to suggest dropping back to the targets DJ
was using, but I see that there were a lot fewer expected passes
there (11572 instead of 29119).  As to the marginally different
number of unexpected failures, probably a minor difference in what
we have installed.
> 
> Btw, just redoing the build with 'make' to see if the hang is reproducible
> there. If not, then it may have been caused by who knows what...
> 

Currently rerunning the base build (i.e. no docs) with
hyperthreading (should be slower) and ninja -j4.  It is consistently
running at 400%, unlike the build on 4 real cores which mostly only
ran at 100% and 200%.

So, it now looks as if ninja makes a much better job of the build
than cmake's Makefiles.  OTOH, building docs might be a lot messier.

Still thinking about what will suit _me_ best, but if your run with
'make' works, don't let me stop you updating with your preference.

ĸen
-- 
   Entropy not found, thump keyboard to continue





Re: [blfs-dev] llvm-6.0.0 FTBFS with glibc-2.28

2018-08-17 Thread Thomas Trepl
Am Freitag, den 17.08.2018, 05:48 +0100 schrieb Ken Moffat:
> On Sun, Aug 12, 2018 at 10:05:14PM -0500, DJ Lucas wrote:
> > 
> > 
> > On 08/12/2018 02:28 AM, Thomas Trepl wrote:
> > > Am Sonntag, den 12.08.2018, 03:36 +0100 schrieb Ken Moffat:
> > > > 6.0.1 builds.
> > > 
> > > Did you also run the tests?   This is where i struggle
> > > atm.  Tests seem
> > > to fail in a good amount and finally hang.
> > 
> > Which tests fail? Are you building with make or ninja? I'm of the
> > mind to
> > use ninja for all cmake based packages (at least where possible,
> > there was a
> > limitation, a corner case, but I don't recall what it was). I do
> > not build
> > the docs, but if they are desired, then I think they probably have
> > to be
> > included in the first cmake command line with ninja adding
> > -DLLVM_BUILD_DOCS=ON and the two sphinx defines already in the
> > book.
> > 
> 
> Taking a look at this now, and looking at what fedora did for fc29
> (they've moved on to llvm-7 of some sort now) :
> 
> I do build the docs, and it seemed to me that fedora were building
> them separately.  But after using ninja to do the initial build I
> realised that rerunning cmake to build the docs would possibly trash
> the main build.  So I think you are right about building the docs at
> the same time.  Fedora seemed to use
> 
> -DLLVM_INCLUDE_DOCS=ON \
> -DLLVM_BUILD_DOCS=ON \
> -DLLVM_ENABLE_SPHINX=ON \
> -DLLVM_ENABLE_DOXYGEN=OFF \
> 
> from that I *guess* that doxygen might be pulled in automatically
> unless turned off.
> 
> > I'm using the following:
> > {{{
> > CC=gcc CXX=g++  \
> > cmake -DCMAKE_INSTALL_PREFIX=/usr   \
> >   -DLLVM_ENABLE_FFI=ON  \
> >   -DCMAKE_BUILD_TYPE=Release\
> >   -DLLVM_BUILD_LLVM_DYLIB=ON\
> >   -DLLVM_LINK_LLVM_DYLIB=ON \
> >   -DLLVM_TARGETS_TO_BUILD="host;AMDGPU" \
> >   -DLLVM_BUILD_TESTS=ON \
> >   -Wno-dev -G Ninja ..  &&
> > ninja -j12
> > 
> > ninja check{,-clang{,-tooling}} 2>&1 | tee clang-check-log.txt
> > 
> 
> Why not ninja check-all ?
> 
> At the moment I'm trying ninja check-all -v -j4 (I assume it runs
> tests, and installs, in parallel) and just noticed that at least one
> test failed with
> 
> FAIL: AddressSanitizer-x86_64-linux-dynamic ::
> TestCases/throw_invoke_test.cc (3953 of 38144)
>  TEST 'AddressSanitizer-x86_64-linux-dynamic ::
> TestCases/throw_invoke_test.cc' 
> 
> [...]
> Exit Code: 1
> 
> Command Output (stderr):
> --
> Throw stack = 0x7ffe11365180
> ReallyThrow
> a = 42
> CheckStack stack = 0x7ffe11364fe0, 0x7ffe11365170
> /usr/bin/ld: cannot find -lstdc++
> clang-6.0: error: linker command failed with exit code 1 (use -v to
> see invocation)
> 
> [ I _was_ using -v, but the output is as hard to parse as rust test
> output, I'm sure the invocation is in there somewhere ]
> 
> But then llvm seems to be nothing but annoying IMHO.
> 
> Tests just finished in a term limited to 2 pairs of hyperthreaded
> CPUs (i.e. 4 cores) by 'taskset'  It took 14m11, status code 0 but
> the messages finished with:
> ninja: build stopped: subcommand failed.
> 
> Maybe that's why you don't run check-all ?
> 
> The error might have been on target 493 of 493, or on the first
> test.
> 
> FAIL: LeakSanitizer-Standalone-x86_64 ::
> TestCases/Linux/use_tls_dynamic.cc (36696 of 38144)
> 
> Exit Code: 23 (and details of leaks)
> 
> Then after that
> 
> Exit Code: 1
> 
> Command Output (stderr):
> --
> /extra/ken/llvm-6.0.1.src/projects/compiler-
> rt/test/msan/Linux/sunrpc.cc:15:10: fatal error: 'rpc/xdr.h' file not
> found
> > #include <rpc/xdr.h>
>  ^~~
> 1 error generated.
> 
> More errors, then a similar problem with 
> 
> Ah!  There ARE summary details near the end of the logged output:
> 
> Testing: 0 .. 10.. 20.. 30.. 40.. 50.. 60.. 70.. 80.. 90.. 
> 
> 1 warning(s) in tests.
> Testing Time: 490.61s
> 
> Failing Tests (11):
> AddressSanitizer-x86_64-linux :: TestCases/throw_invoke_test.cc
> AddressSanitizer-x86_64-linux-dynamic ::
> TestCases/throw_invoke_test.cc
> LeakSanitizer-AddressSanitizer-x86_64 ::
> TestCases/Linux/use_tls_dynamic.cc
> LeakSanitizer-Standalone-x86_64 ::
> TestCases/Linux/use_tls_dynamic.cc
> MemorySanitizer-X86_64 :: Linux/sunrpc.cc
> MemorySanitizer-X86_64 :: Linux/sunrpc_bytes.cc
> MemorySanitizer-X86_64 :: Linux/sunrpc_string.cc
> MemorySanitizer-X86_64 :: dtls_test.c
> SanitizerCommon-lsan-x86_64-Linux ::
> Posix/sanitizer_set_death_callback_test.cc
> ThreadSanitizer-x86_64 :: static_init6.cc
> ThreadSanitizer-x86_64 :: sunrpc.cc
> 
>   Expected Passes: 29115
>   Expected Failures  : 103
>   Unsupported Tests  : 8915
>   Unexpected Failures: 11
> FAILED: CMakeFiles/check-all 
> 
> 
> Well, at least it didn't hang.  But I'm not sure if this has been a
> good use of my time.
> 

Rerun 

Re: [blfs-dev] llvm-6.0.0 FTBFS with glibc-2.28

2018-08-17 Thread Ken Moffat
(going back to this earlier part of the thread after my excursions
last night and earlier today, Cc'ing Thomas in case he hasn't been
following this.)

On Mon, Aug 13, 2018 at 09:23:45AM +0200, Thomas Trepl wrote:
> Am Sonntag, den 12.08.2018, 22:05 -0500 schrieb DJ Lucas:
> > 
> > On 08/12/2018 02:28 AM, Thomas Trepl wrote:
> > > Am Sonntag, den 12.08.2018, 03:36 +0100 schrieb Ken Moffat:
> > > > 6.0.1 builds.
> > > 
> > > Did you also run the tests?   This is where i struggle atm.  Tests
> > > seem
> > > to fail in a good amount and finally hang.
> > 
> > Which tests fail? Are you building with make or ninja? I'm of the
> > mind 

> You are kidding ;-) "limiting to 12 threads", "only 16GB of RAM"
> Sounds like you're on a very tiny and slow machine...
> 
> I used "make" - will redo checks with ninja and keep you updated. Btw,
> new multilib-patch to come, building m32, mx32 and m64 ...
> 

I've now run the tests using the instructions in the book (i.e.
'make').  This was using 4 cores (not two pairs of hyperthreads as I
had intended - I've now made a note of which CPUs are on the same
cores ;).
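
For the record, the pairing can be read straight out of sysfs, and
taskset then does the pinning - the CPU numbers below are what this
box uses and will differ elsewhere:

  grep . /sys/devices/system/cpu/cpu[0-9]*/topology/thread_siblings_list
  taskset -c 1,2,5,6 make -j4   # or ninja -j4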

The initial build was "odd" - I specified make -j4, but it only ever
seemed to use 100% CPU (i.e. 1 core).  My "real" build (first build,
plus a separate build for html using sphinx) had taken just over 25
minutes using -j8, so after 50 minutes this time I stopped and tried
again.

This time I paid more attention: brief initial 400%, then a long
100%, later 200%.  I left it; it completed in 57m22.

Tried the tests using 'make' - about 20 minutes to build all the
test progs, then using the 4 cores - tests completed in 28m50 (19.5
SBU).

Neither this test nor last night's ninja test got the clean results
DJ had, but they seem to accord with what the book said for 6.0.0.

1 warning(s) in tests.
Testing Time: 486.50s

Failing Tests (11):
AddressSanitizer-x86_64-linux :: TestCases/throw_invoke_test.cc
AddressSanitizer-x86_64-linux-dynamic :: TestCases/throw_invoke_test.cc
LeakSanitizer-AddressSanitizer-x86_64 :: TestCases/Linux/use_tls_dynamic.cc
LeakSanitizer-Standalone-x86_64 :: TestCases/Linux/use_tls_dynamic.cc
MemorySanitizer-X86_64 :: Linux/sunrpc.cc
MemorySanitizer-X86_64 :: Linux/sunrpc_bytes.cc
MemorySanitizer-X86_64 :: Linux/sunrpc_string.cc
MemorySanitizer-X86_64 :: dtls_test.c
SanitizerCommon-lsan-x86_64-Linux :: 
Posix/sanitizer_set_death_callback_test.cc
ThreadSanitizer-x86_64 :: static_init6.cc
ThreadSanitizer-x86_64 :: sunrpc.cc

  Expected Passes: 29115
  Expected Failures  : 103
  Unsupported Tests  : 8915
  Unexpected Failures: 11
make[3]: *** [CMakeFiles/check-all.dir/build.make:58: CMakeFiles/check-all] 
Error 1
make[2]: *** [CMakeFiles/Makefile2:439: CMakeFiles/check-all.dir/all] Error 2
make[1]: *** [CMakeFiles/Makefile2:446: CMakeFiles/check-all.dir/rule] Error 2
make: *** [Makefile:251: check-all] Error 2

I haven't been measuring space (expected this to hang), but after
the tests (without sphinx or clang docs) the build tree is using
14GB according to 'du -sch'.

Sorry I've no idea why the tests hung for Thomas.

I think I might play around with using ninja to see if it really is
faster (that seemed to be the impression last night), but with two
real CPUs and their matching hyperthreaded CPUs.  But taking a break
for the moment.

ĸen
-- 
   Entropy not found, thump keyboard to continue



Re: [blfs-dev] llvm-6.0.0 FTBFS with glibc-2.28

2018-08-17 Thread Ken Moffat
On Fri, Aug 17, 2018 at 05:48:27AM +0100, Ken Moffat wrote:
> 
> Failing Tests (11):
> AddressSanitizer-x86_64-linux :: TestCases/throw_invoke_test.cc
> AddressSanitizer-x86_64-linux-dynamic :: TestCases/throw_invoke_test.cc
> LeakSanitizer-AddressSanitizer-x86_64 :: 
> TestCases/Linux/use_tls_dynamic.cc
> LeakSanitizer-Standalone-x86_64 :: TestCases/Linux/use_tls_dynamic.cc
> MemorySanitizer-X86_64 :: Linux/sunrpc.cc
> MemorySanitizer-X86_64 :: Linux/sunrpc_bytes.cc
> MemorySanitizer-X86_64 :: Linux/sunrpc_string.cc
> MemorySanitizer-X86_64 :: dtls_test.c
> SanitizerCommon-lsan-x86_64-Linux :: 
> Posix/sanitizer_set_death_callback_test.cc
> ThreadSanitizer-x86_64 :: static_init6.cc
> ThreadSanitizer-x86_64 :: sunrpc.cc
> 
>   Expected Passes: 29115
>   Expected Failures  : 103
>   Unsupported Tests  : 8915
>   Unexpected Failures: 11
> FAILED: CMakeFiles/check-all 
> 
It might have helped if I had read the current text about the tests
;-)

Doing the build with ninja will, probably, need a single invocation
with optional switches for sphinx and the clang docs.

But I now see that for 6.0.0 the book says:

Tests are built with a single thread, but run using the maximum
number of processors/threads available. Note that several Sanitizer
tests (9 of 26479) are known to fail.

So, 11 of 29115 is probably par for the course.

The two missing rpc headers are concerning (rpc/xdr.h and
rpc/types.h).  But at least rpc/types.h dropped out of glibc-2.26
(sic) - perhaps replaced by tirpc/types.h, and I've never had
rpc/xdr.h.  So perhaps they are unimportant (after the changes in
glibc-2.28, I'm on the lookout for missing headers).
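
A quick way to see which situation a given system is in - these are
the two locations cmake probes (per the grep Thomas posted), assuming
the standard /usr/include prefix:

  ls /usr/include/rpc/xdr.h /usr/include/tirpc/rpc/xdr.h 2>/dev/null \
     || echo "no SunRPC headers - expect the msan/tsan sunrpc tests to fail"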

Unfortunately I didn't note the elapsed time because I thought it
had failed, nor did I note the space.

I've just started a build of 6.0.1 using what is currently in the
book to see if I can replicate the problem.

ĸen
-- 
   Entropy not found, thump keyboard to continue



Re: [blfs-dev] llvm-6.0.0 FTBFS with glibc-2.28

2018-08-16 Thread Ken Moffat
On Sun, Aug 12, 2018 at 10:05:14PM -0500, DJ Lucas wrote:
> 
> 
> On 08/12/2018 02:28 AM, Thomas Trepl wrote:
> > Am Sonntag, den 12.08.2018, 03:36 +0100 schrieb Ken Moffat:
> 
> > > 6.0.1 builds.
> > Did you also run the tests?   This is where i struggle atm.  Tests seem
> > to fail in a good amount and finally hang.
> 
> Which tests fail? Are you building with make or ninja? I'm of the mind to
> use ninja for all cmake based packages (at least where possible, there was a
> limitation, a corner case, but I don't recall what it was). I do not build
> the docs, but if they are desired, then I think they probably have to be
> included in the first cmake command line with ninja adding
> -DLLVM_BUILD_DOCS=ON and the two sphinx defines already in the book.
> 

Taking a look at this now, and looking at what fedora did for fc29
(they've moved on to llvm-7 of some sort now) :

I do build the docs, and it seemed to me that fedora were building
them separately.  But after using ninja to do the initial build I
realised that rerunning cmake to build the docs would possibly trash
the main build.  So I think you are right about building the docs at
the same time.  Fedora seemed to use

-DLLVM_INCLUDE_DOCS=ON \
-DLLVM_BUILD_DOCS=ON \
-DLLVM_ENABLE_SPHINX=ON \
-DLLVM_ENABLE_DOXYGEN=OFF \

from that I *guess* that doxygen might be pulled in automatically
unless turned off.
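
That guess is easy to check against an existing build directory
without rebuilding anything - the cache variable is the one from the
Fedora snippet above:

  # run inside the build directory:
  grep '^LLVM_ENABLE_DOXYGEN' CMakeCache.txt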

> I'm using the following:
> {{{
> CC=gcc CXX=g++  \
> cmake -DCMAKE_INSTALL_PREFIX=/usr   \
>   -DLLVM_ENABLE_FFI=ON  \
>   -DCMAKE_BUILD_TYPE=Release\
>   -DLLVM_BUILD_LLVM_DYLIB=ON\
>   -DLLVM_LINK_LLVM_DYLIB=ON \
>   -DLLVM_TARGETS_TO_BUILD="host;AMDGPU" \
>   -DLLVM_BUILD_TESTS=ON \
>   -Wno-dev -G Ninja ..  &&
> ninja -j12
> 
> ninja check{,-clang{,-tooling}} 2>&1 | tee clang-check-log.txt
> 

Why not ninja check-all ?

At the moment I'm trying ninja check-all -v -j4 (I assume it runs
tests, and installs, in parallel) and just noticed that at least one
test failed with

FAIL: AddressSanitizer-x86_64-linux-dynamic :: TestCases/throw_invoke_test.cc 
(3953 of 38144)
 TEST 'AddressSanitizer-x86_64-linux-dynamic :: 
TestCases/throw_invoke_test.cc' 

[...]
Exit Code: 1

Command Output (stderr):
--
Throw stack = 0x7ffe11365180
ReallyThrow
a = 42
CheckStack stack = 0x7ffe11364fe0, 0x7ffe11365170
/usr/bin/ld: cannot find -lstdc++
clang-6.0: error: linker command failed with exit code 1 (use -v to see 
invocation)

[ I _was_ using -v, but the output is as hard to parse as rust test
output, I'm sure the invocation is in there somewhere ]

But then llvm seems to be nothing but annoying IMHO.

Tests just finished in a term limited to 2 pairs of hyperthreaded
CPUs (i.e. 4 cores) by 'taskset'.  It took 14m11, status code 0 but
the messages finished with:
ninja: build stopped: subcommand failed.

Maybe that's why you don't run check-all ?

The error might have been on target 493 of 493, or on the first
test.

FAIL: LeakSanitizer-Standalone-x86_64 :: TestCases/Linux/use_tls_dynamic.cc 
(36696 of 38144)

Exit Code: 23 (and details of leaks)

Then after that

Exit Code: 1

Command Output (stderr):
--
/extra/ken/llvm-6.0.1.src/projects/compiler-rt/test/msan/Linux/sunrpc.cc:15:10: 
fatal error: 'rpc/xdr.h' file not found
#include <rpc/xdr.h>
 ^~~
1 error generated.

More errors, then a similar problem with 

Ah!  There ARE summary details near the end of the logged output:

Testing: 0 .. 10.. 20.. 30.. 40.. 50.. 60.. 70.. 80.. 90.. 

1 warning(s) in tests.
Testing Time: 490.61s

Failing Tests (11):
AddressSanitizer-x86_64-linux :: TestCases/throw_invoke_test.cc
AddressSanitizer-x86_64-linux-dynamic :: TestCases/throw_invoke_test.cc
LeakSanitizer-AddressSanitizer-x86_64 :: TestCases/Linux/use_tls_dynamic.cc
LeakSanitizer-Standalone-x86_64 :: TestCases/Linux/use_tls_dynamic.cc
MemorySanitizer-X86_64 :: Linux/sunrpc.cc
MemorySanitizer-X86_64 :: Linux/sunrpc_bytes.cc
MemorySanitizer-X86_64 :: Linux/sunrpc_string.cc
MemorySanitizer-X86_64 :: dtls_test.c
SanitizerCommon-lsan-x86_64-Linux :: 
Posix/sanitizer_set_death_callback_test.cc
ThreadSanitizer-x86_64 :: static_init6.cc
ThreadSanitizer-x86_64 :: sunrpc.cc

  Expected Passes: 29115
  Expected Failures  : 103
  Unsupported Tests  : 8915
  Unexpected Failures: 11
FAILED: CMakeFiles/check-all 


Well, at least it didn't hang.  But I'm not sure if this has been a
good use of my time.

ĸen
-- 
   Entropy not found, thump keyboard to continue



Re: [blfs-dev] llvm-6.0.0 FTBFS with glibc-2.28

2018-08-16 Thread Ken Moffat
On Sun, Aug 12, 2018 at 11:15:06PM -0500, Brendan L wrote:
> Not about llvm, but I agree about using cmake's ninja support.  It
> made building webkitgtk so much faster for me.
> 

I meant to reply to this part of the thread:

No problem for me with using ninja on cmake packages, but when I
last looked at doing that (ages ago) it was because the inkscape
devs were enthusing about it.  And what I found was that on the
packages I tried it had no benefit for a clean build - at that time
the big benefits were for developers who changed something, did a
build, changed something else and then needed to rebuild.

As editors, we unfortunately need to measure clean builds.

Cmake was always poor at determining the dependencies which needed
to be rebuilt, and that was where ninja used to show the benefits.

ĸen
-- 
   Entropy not found, thump keyboard to continue



Re: [blfs-dev] llvm-6.0.0 FTBFS with glibc-2.28

2018-08-13 Thread Thomas Trepl
Am Sonntag, den 12.08.2018, 22:05 -0500 schrieb DJ Lucas:
> 
> On 08/12/2018 02:28 AM, Thomas Trepl wrote:
> > Am Sonntag, den 12.08.2018, 03:36 +0100 schrieb Ken Moffat:
> > > 6.0.1 builds.
> > 
> > Did you also run the tests?   This is where i struggle atm.  Tests
> > seem
> > to fail in a good amount and finally hang.
> 
> Which tests fail? Are you building with make or ninja? I'm of the
> mind 
> to use ninja for all cmake based packages (at least where possible, 
> there was a limitation, a corner case, but I don't recall what it
> was). 
> I do not build the docs, but if they are desired, then I think they 
> probably have to be included in the first cmake command line with
> ninja 
> adding -DLLVM_BUILD_DOCS=ON and the two sphinx defines already in the
> book.
> 
> I'm using the following:
> {{{
> CC=gcc CXX=g++  \
> cmake -DCMAKE_INSTALL_PREFIX=/usr   \
>-DLLVM_ENABLE_FFI=ON  \
>-DCMAKE_BUILD_TYPE=Release\
>-DLLVM_BUILD_LLVM_DYLIB=ON\
>-DLLVM_LINK_LLVM_DYLIB=ON \
>-DLLVM_TARGETS_TO_BUILD="host;AMDGPU" \
>-DLLVM_BUILD_TESTS=ON \
>-Wno-dev -G Ninja ..  &&
> ninja -j12
> 
> ninja check{,-clang{,-tooling}} 2>&1 | tee clang-check-log.txt
> 
> ninja install
> }}}
> 
> Results:
> {{{
> [0/3] Running the LLVM regression tests
> -- Testing: 23298 tests, 16 threads --
> Testing: 0 .. 10.. 20.. 30.. 40.. 50.. 60.. 70.. 80.. 90..
> Testing Time: 50.99s
>Expected Passes: 15110
>Expected Failures  : 56
>Unsupported Tests  : 8132
> [1/3] Running lit suite /sources/llvm-
> 6.0.1.src/tools/clang/test/Tooling
> llvm-lit: /sources/llvm-6.0.1.src/utils/lit/lit/llvm/config.py:334: 
> note: using clang: /sources/llvm-6.0.1.src/build/bin/clang
> -- Testing: 26 tests, 16 threads --
> Testing: 0 .. 10.. 20.. 30.. 40.. 50.. 60.. 70.. 80.. 90..
> Testing Time: 0.21s
>Expected Passes: 26
> [2/3] Running the Clang regression tests
> llvm-lit: /sources/llvm-6.0.1.src/utils/lit/lit/llvm/config.py:334: 
> note: using clang: /sources/llvm-6.0.1.src/build/bin/clang
> -- Testing: 11832 tests, 16 threads --
> Testing: 0 .. 10.. 20.. 30.. 40.. 50.. 60.. 70.. 80.. 90..
> Testing Time: 59.26s
>Expected Passes: 11572
>Expected Failures  : 18
>Unsupported Tests  : 242
> }}}
> 
> 
> I limit all ninja builds, as a matter of course, to 12 threads as I
> get 
> OOMs with only 16GB of RAM on Chromium (but seems it was OK for
> tests, 
> as I forgot about it above). Also of note is that I am on multi-lib
> from 
> lfs-systemd-20180803, so 'host' builds both x86_64 and i386. Finally,
> I 
> build in chroot until Xorg (so building as root for the moment).
> 
> HTH
> 
> --DJ
> 
You are kidding ;-) "limiting to 12 threads", "only 16GB of RAM"
Sounds like you're on a very tiny and slow machine...

I used "make" - will redo checks with ninja and keep you updated. Btw,
new multilib-patch to come, building m32, mx32 and m64 ...

--
Thomas



Re: [blfs-dev] llvm-6.0.0 FTBFS with glibc-2.28

2018-08-12 Thread Brendan L
Not about llvm, but I agree about using cmake's ninja support.  It
made building webkitgtk so much faster for me.

On Sun, Aug 12, 2018 at 10:05 PM, DJ Lucas  wrote:
>
>
> On 08/12/2018 02:28 AM, Thomas Trepl wrote:
>>
>> Am Sonntag, den 12.08.2018, 03:36 +0100 schrieb Ken Moffat:
>
>
>>> 6.0.1 builds.
>>
>> Did you also run the tests?   This is where i struggle atm.  Tests seem
>> to fail in a good amount and finally hang.
>
>
> Which tests fail? Are you building with make or ninja? I'm of the mind to
> use ninja for all cmake based packages (at least where possible, there was a
> limitation, a corner case, but I don't recall what it was). I do not build
> the docs, but if they are desired, then I think they probably have to be
> included in the first cmake command line with ninja adding
> -DLLVM_BUILD_DOCS=ON and the two sphinx defines already in the book.
>
> I'm using the following:
> {{{
> CC=gcc CXX=g++  \
> cmake -DCMAKE_INSTALL_PREFIX=/usr   \
>   -DLLVM_ENABLE_FFI=ON  \
>   -DCMAKE_BUILD_TYPE=Release\
>   -DLLVM_BUILD_LLVM_DYLIB=ON\
>   -DLLVM_LINK_LLVM_DYLIB=ON \
>   -DLLVM_TARGETS_TO_BUILD="host;AMDGPU" \
>   -DLLVM_BUILD_TESTS=ON \
>   -Wno-dev -G Ninja ..  &&
> ninja -j12
>
> ninja check{,-clang{,-tooling}} 2>&1 | tee clang-check-log.txt
>
> ninja install
> }}}
>
> Results:
> {{{
> [0/3] Running the LLVM regression tests
> -- Testing: 23298 tests, 16 threads --
> Testing: 0 .. 10.. 20.. 30.. 40.. 50.. 60.. 70.. 80.. 90..
> Testing Time: 50.99s
>   Expected Passes: 15110
>   Expected Failures  : 56
>   Unsupported Tests  : 8132
> [1/3] Running lit suite /sources/llvm-6.0.1.src/tools/clang/test/Tooling
> llvm-lit: /sources/llvm-6.0.1.src/utils/lit/lit/llvm/config.py:334: note:
> using clang: /sources/llvm-6.0.1.src/build/bin/clang
> -- Testing: 26 tests, 16 threads --
> Testing: 0 .. 10.. 20.. 30.. 40.. 50.. 60.. 70.. 80.. 90..
> Testing Time: 0.21s
>   Expected Passes: 26
> [2/3] Running the Clang regression tests
> llvm-lit: /sources/llvm-6.0.1.src/utils/lit/lit/llvm/config.py:334: note:
> using clang: /sources/llvm-6.0.1.src/build/bin/clang
> -- Testing: 11832 tests, 16 threads --
> Testing: 0 .. 10.. 20.. 30.. 40.. 50.. 60.. 70.. 80.. 90..
> Testing Time: 59.26s
>   Expected Passes: 11572
>   Expected Failures  : 18
>   Unsupported Tests  : 242
> }}}
>
>
> I limit all ninja builds, as a matter of course, to 12 threads as I get OOMs
> with only 16GB of RAM on Chromium (but seems it was OK for tests, as I
> forgot about it above). Also of note is that I am on multi-lib from
> lfs-systemd-20180803, so 'host' builds both x86_64 and i386. Finally, I
> build in chroot until Xorg (so building as root for the moment).
>
> HTH
>
> --DJ
>


Re: [blfs-dev] llvm-6.0.0 FTBFS with glibc-2.28

2018-08-12 Thread DJ Lucas



On 08/12/2018 02:28 AM, Thomas Trepl wrote:

Am Sonntag, den 12.08.2018, 03:36 +0100 schrieb Ken Moffat:



6.0.1 builds.

Did you also run the tests?   This is where i struggle atm.  Tests seem
to fail in a good amount and finally hang.


Which tests fail? Are you building with make or ninja? I'm of the mind 
to use ninja for all cmake based packages (at least where possible, 
there was a limitation, a corner case, but I don't recall what it was). 
I do not build the docs, but if they are desired, then I think they 
probably have to be included in the first cmake command line with ninja 
adding -DLLVM_BUILD_DOCS=ON and the two sphinx defines already in the book.


I'm using the following:
{{{
CC=gcc CXX=g++  \
cmake -DCMAKE_INSTALL_PREFIX=/usr   \
  -DLLVM_ENABLE_FFI=ON  \
  -DCMAKE_BUILD_TYPE=Release\
  -DLLVM_BUILD_LLVM_DYLIB=ON\
  -DLLVM_LINK_LLVM_DYLIB=ON \
  -DLLVM_TARGETS_TO_BUILD="host;AMDGPU" \
  -DLLVM_BUILD_TESTS=ON \
  -Wno-dev -G Ninja ..  &&
ninja -j12

ninja check{,-clang{,-tooling}} 2>&1 | tee clang-check-log.txt

ninja install
}}}

Results:
{{{
[0/3] Running the LLVM regression tests
-- Testing: 23298 tests, 16 threads --
Testing: 0 .. 10.. 20.. 30.. 40.. 50.. 60.. 70.. 80.. 90..
Testing Time: 50.99s
  Expected Passes: 15110
  Expected Failures  : 56
  Unsupported Tests  : 8132
[1/3] Running lit suite /sources/llvm-6.0.1.src/tools/clang/test/Tooling
llvm-lit: /sources/llvm-6.0.1.src/utils/lit/lit/llvm/config.py:334: 
note: using clang: /sources/llvm-6.0.1.src/build/bin/clang

-- Testing: 26 tests, 16 threads --
Testing: 0 .. 10.. 20.. 30.. 40.. 50.. 60.. 70.. 80.. 90..
Testing Time: 0.21s
  Expected Passes: 26
[2/3] Running the Clang regression tests
llvm-lit: /sources/llvm-6.0.1.src/utils/lit/lit/llvm/config.py:334: 
note: using clang: /sources/llvm-6.0.1.src/build/bin/clang

-- Testing: 11832 tests, 16 threads --
Testing: 0 .. 10.. 20.. 30.. 40.. 50.. 60.. 70.. 80.. 90..
Testing Time: 59.26s
  Expected Passes: 11572
  Expected Failures  : 18
  Unsupported Tests  : 242
}}}


I limit all ninja builds, as a matter of course, to 12 threads as I get 
OOMs with only 16GB of RAM on Chromium (but seems it was OK for tests, 
as I forgot about it above). Also of note is that I am on multi-lib from 
lfs-systemd-20180803, so 'host' builds both x86_64 and i386. Finally, I 
build in chroot until Xorg (so building as root for the moment).


HTH

--DJ



Re: [blfs-dev] llvm-6.0.0 FTBFS with glibc-2.28

2018-08-12 Thread Ken Moffat
On Sun, Aug 12, 2018 at 09:28:12AM +0200, Thomas Trepl wrote:
> Am Sonntag, den 12.08.2018, 03:36 +0100 schrieb Ken Moffat:
> > On Sun, Aug 12, 2018 at 03:27:10AM +0100, Ken Moffat wrote:
> > > Trying to build (all of) llvm-6.0.0 :
> > > 
> > > [  9%] Building CXX object
> > > projects/compiler-
> > > rt/lib/sanitizer_common/CMakeFiles/RTSanitizerCommon.x86_64.dir/san
> > > itizer_platform_limits_posix.cc.o
> > > /scratch/working/llvm-6.0.0.src/projects/compiler-
> > > rt/lib/sanitizer_common/sanitizer_platform_limits_posix.cc:162:10:
> > > fatal error: sys/ustat.h: No such file or directory
> > >  #include <sys/ustat.h>
> > >   ^
> > > compilation terminated.
> > > 
> > > This is because the deprecated sys/ustat.h and apparently also
> > > ustat.h were removed in glibc-2.28.
> > > 
> > 
> > 6.0.1 builds.
> Did you also run the tests?   This is where i struggle atm.  Tests seem
> to fail in a good amount and finally hang.
> 
No, I didn't.  This was just a build; for most BLFS packages I ignore
tests unless I'm editing.  Looks as if glibc-2.28 is turning out to
be a pain.

ĸen
-- 
   Entropy not found, thump keyboard to continue



Re: [blfs-dev] llvm-6.0.0 FTBFS with glibc-2.28

2018-08-12 Thread Thomas Trepl
Am Sonntag, den 12.08.2018, 03:36 +0100 schrieb Ken Moffat:
> On Sun, Aug 12, 2018 at 03:27:10AM +0100, Ken Moffat wrote:
> > Trying to build (all of) llvm-6.0.0 :
> > 
> > [  9%] Building CXX object
> > projects/compiler-
> > rt/lib/sanitizer_common/CMakeFiles/RTSanitizerCommon.x86_64.dir/san
> > itizer_platform_limits_posix.cc.o
> > /scratch/working/llvm-6.0.0.src/projects/compiler-
> > rt/lib/sanitizer_common/sanitizer_platform_limits_posix.cc:162:10:
> > fatal error: sys/ustat.h: No such file or directory
> >  #include <sys/ustat.h>
> >   ^
> > compilation terminated.
> > 
> > This is because the deprecated sys/ustat.h and apparently also
> > ustat.h were removed in glibc-2.28.
> > 
> 
> 6.0.1 builds.
Did you also run the tests?   This is where I struggle atm.  Tests seem
to fail in large numbers and finally hang.

--
Thomas




Re: [blfs-dev] llvm-6.0.0 FTBFS with glibc-2.28

2018-08-11 Thread Ken Moffat
On Sun, Aug 12, 2018 at 03:27:10AM +0100, Ken Moffat wrote:
> Trying to build (all of) llvm-6.0.0 :
> 
> [  9%] Building CXX object
> projects/compiler-rt/lib/sanitizer_common/CMakeFiles/RTSanitizerCommon.x86_64.dir/sanitizer_platform_limits_posix.cc.o
> /scratch/working/llvm-6.0.0.src/projects/compiler-rt/lib/sanitizer_common/sanitizer_platform_limits_posix.cc:162:10:
> fatal error: sys/ustat.h: No such file or directory
>  #include <sys/ustat.h>
>   ^
> compilation terminated.
> 
> This is because the deprecated sys/ustat.h and apparently also
> ustat.h were removed in glibc-2.28.
> 
6.0.1 builds.
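
For anyone still on 6.0.0, the breakage can be confirmed up front
rather than 9% into the build - just check whether glibc still ships
the header (standard include path assumed):

  ls /usr/include/sys/ustat.h 2>/dev/null \
     || echo "sys/ustat.h gone (glibc-2.28) - llvm-6.0.0's compiler-rt won't build"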

ĸen
-- 
   Entropy not found, thump keyboard to continue

-- 
http://lists.linuxfromscratch.org/listinfo/blfs-dev
FAQ: http://www.linuxfromscratch.org/blfs/faq.html
Unsubscribe: See the above information page