It helps. Thank you.
Now I'm trying to install the sys-cluster/slurm package. It fails in the install phase:

user@ubuntu ~/trunk/src/scripts $ emerge-amd64-usr sys-cluster/slurm --autounmask-write
--- Invalid atom in /build/amd64-usr/etc/portage/package.unmask/cros-workon: =sys-cluster/slurm-16.05.2 ~amd64
Calculating dependencies... done!

>>> Verifying ebuild manifests

>>> Emerging (1 of 1) sys-cluster/slurm-16.05.2::portage-stable for /build/amd64-usr/
 * slurm-16.05.2.tar.bz2 SHA256 SHA512 WHIRLPOOL size ;-) ...            [ ok ]
 * Running stacked hooks for pre_pkg_setup
 *    sysroot_build_bin_dir ...                                          [ ok ]
 * Running stacked hooks for pre_src_unpack
 *    python_multilib_setup ...                                          [ ok ]
>>> Unpacking source...
>>> Unpacking slurm-16.05.2.tar.bz2 to /build/amd64-usr/var/tmp/portage/sys-cluster/slurm-16.05.2/work
>>> Source unpacked in /build/amd64-usr/var/tmp/portage/sys-cluster/slurm-16.05.2/work
>>> Preparing source in /build/amd64-usr/var/tmp/portage/sys-cluster/slurm-16.05.2/work/slurm-16.05.2 ...
 * Applying slurm-16.05.2-disable-sview.patch ...                        [ ok ]
/build/amd64-usr/var/tmp/portage/sys-cluster/slurm-16.05.2/temp/environment: line 4505: hprefixify: command not found
 * Running eautoreconf in '/build/amd64-usr/var/tmp/portage/sys-cluster/slurm-16.05.2/work/slurm-16.05.2' ...
 * Running libtoolize --install --copy --force --automake ...            [ ok ]
 * Running aclocal -I auxdir -I /build/amd64-usr/usr/share/aclocal ...   [ ok ]
 * Running autoconf --force -I /build/amd64-usr/usr/share/aclocal ...    [ ok ]
 * Running autoheader -I /build/amd64-usr/usr/share/aclocal ...          [ ok ]
 * Running automake --add-missing --copy --foreign --force-missing ...
 * Running automake --add-missing --copy --foreign --force-missing ...
...

make: Entering directory '/build/amd64-usr/var/tmp/portage/sys-cluster/slurm-16.05.2/work/slurm-16.05.2/contribs/pam'
make[1]: Entering directory '/build/amd64-usr/var/tmp/portage/sys-cluster/slurm-16.05.2/work/slurm-16.05.2/contribs/pam'
make[1]: Nothing to be done for 'install-data-am'.
make[1]: Leaving directory '/build/amd64-usr/var/tmp/portage/sys-cluster/slurm-16.05.2/work/slurm-16.05.2/contribs/pam'
make: Leaving directory '/build/amd64-usr/var/tmp/portage/sys-cluster/slurm-16.05.2/work/slurm-16.05.2/contribs/pam'
/build/amd64-usr/var/tmp/portage/sys-cluster/slurm-16.05.2/temp/environment: line 4497: prefixify_ro: command not found
 * ERROR: sys-cluster/slurm-16.05.2::portage-stable failed (install phase):
 *   !!! newinitd:  does not exist
 *
 * If you need support, post the output of `emerge --info '=sys-cluster/slurm-16.05.2::portage-stable'`,
 * the complete build log and the output of `emerge -pqv '=sys-cluster/slurm-16.05.2::portage-stable'`.
 * The complete build log is located at '/build/amd64-usr/var/log/portage/sys-cluster:slurm-16.05.2:20161202-214111.log'.
 * For convenience, a symlink to the build log is located at '/build/amd64-usr/var/tmp/portage/sys-cluster/slurm-16.05.2/temp/build.log'.
 * The ebuild environment file is located at '/build/amd64-usr/var/tmp/portage/sys-cluster/slurm-16.05.2/temp/environment'.
 * Working directory: '/build/amd64-usr/var/tmp/portage/sys-cluster/slurm-16.05.2/work/slurm-16.05.2'
 * S: '/build/amd64-usr/var/tmp/portage/sys-cluster/slurm-16.05.2/work/slurm-16.05.2'
 * QA Notice: command not found:
 *
 *     /build/amd64-usr/var/tmp/portage/sys-cluster/slurm-16.05.2/temp/environment: line 4505: hprefixify: command not found
 *     /build/amd64-usr/var/tmp/portage/sys-cluster/slurm-16.05.2/temp/environment: line 4497: prefixify_ro: command not found

>>> Failed to emerge sys-cluster/slurm-16.05.2 for /build/amd64-usr/, Log file:
>>> '/build/amd64-usr/var/log/portage/sys-cluster:slurm-16.05.2:20161202-214111.log'

 * Messages for package sys-cluster/slurm-16.05.2 merged to /build/amd64-usr/:

 * ERROR: sys-cluster/slurm-16.05.2::portage-stable failed (install phase):
 *   !!! newinitd:  does not exist
 *
 * If you need support, post the output of `emerge --info '=sys-cluster/slurm-16.05.2::portage-stable'`,
 * the complete build log and the output of `emerge -pqv '=sys-cluster/slurm-16.05.2::portage-stable'`.
 * The complete build log is located at '/build/amd64-usr/var/log/portage/sys-cluster:slurm-16.05.2:20161202-214111.log'.
 * For convenience, a symlink to the build log is located at '/build/amd64-usr/var/tmp/portage/sys-cluster/slurm-16.05.2/temp/build.log'.
 * The ebuild environment file is located at '/build/amd64-usr/var/tmp/portage/sys-cluster/slurm-16.05.2/temp/environment'.
 * Working directory: '/build/amd64-usr/var/tmp/portage/sys-cluster/slurm-16.05.2/work/slurm-16.05.2'
 * S: '/build/amd64-usr/var/tmp/portage/sys-cluster/slurm-16.05.2/work/slurm-16.05.2'



On Saturday, December 3, 2016 at 0:10:47 UTC+3, Michael Marineau wrote:
>
> Typically with autoconf, test results are stored in ac_cv_... variables,
> and a test is only run if its variable is unset, so you can export the
> test result in advance. It isn't always obvious what the right variable
> name is, but the full list of them can be found in config.log after a
> successful run of the configure script. In this case it appears to be
> x_ac_cv_check_fifo_recvfd=no, so as a quick test simply do:
>
> export x_ac_cv_check_fifo_recvfd=no
> emerge-amd64-usr sys-auth/munge
>
> If that works, you can add that export to the ebuild itself (we place
> modified ebuilds in coreos-overlay instead of portage-stable), or, to
> leave the ebuild as-is, you can drop an appropriately named file into
> the overlay, similar to this one:
>
> https://github.com/coreos/coreos-overlay/blob/master/coreos/config/env/app-admin/setools
>
> Hope that helps.
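
The env-file approach in the quoted reply can be sketched as follows. This is a minimal sketch, not the definitive layout: the overlay location is an assumption (taken from an OVERLAY variable with a placeholder default), and the category/package path mirrors the linked setools example.

```shell
# Sketch only: ${OVERLAY} is a hypothetical variable pointing at your
# coreos-overlay checkout -- adjust it to your actual tree.
overlay="${OVERLAY:-/tmp/coreos-overlay}"

# The file path mirrors the package atom: config/env/<category>/<package>.
mkdir -p "${overlay}/coreos/config/env/sys-auth"

# Pre-seed the autoconf cache so the FIFO recvfd test is skipped when
# cross-compiling (the test cannot execute on the build host).
cat > "${overlay}/coreos/config/env/sys-auth/munge" <<'EOF'
export x_ac_cv_check_fifo_recvfd=no
EOF
```

With that file in place the ebuild itself stays unmodified; portage sources the env file when building sys-auth/munge, so configure sees the cached answer instead of trying to run the test.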
