Cannot run program "cmake" (in directory
"/home/gtapper/incubator-trafodion/core/sqf/sql/libhdfs_files/hadoop-2.6.0-src/hadoop-common-project/hadoop-common/target/native"):
error=2, No such file or directory

It looks like you are missing cmake. It should have been installed as part
of the yum installs described on the build page. Where are you running?

The list of yum installs is described here:
https://cwiki.apache.org/confluence/pages/viewpage.action?pageId=61316378
which is linked from the page you referenced.
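
For reference, a minimal check on a yum-based system might look like the
following sketch (cmake is the package missing here; the full prerequisite
list is on the build page above, and the exact commands are only an
illustration, not the official build procedure):

    # verify cmake is on the PATH; install it if it is missing
    which cmake || sudo yum install -y cmake
    cmake --version

    # then re-run the build that failed
    make all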

   Roberta

-----Original Message-----
From: Gunnar Tapper [mailto:[email protected]]
Sent: Friday, November 6, 2015 12:28 PM
To: [email protected]
Subject: Build error

Hi,

I just downloaded the code per these instructions:
https://cwiki.apache.org/confluence/pages/viewpage.action?pageId=60623593

Running 'make all' ends with the following errors:

[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 9:13.585s
[INFO] Finished at: Fri Nov 06 12:23:00 PST 2015
[INFO] Final Memory: 60M/368M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal
org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project
hadoop-common: An Ant BuildException has occured: Execute failed:
java.io.IOException: Cannot run program "cmake" (in directory
"/home/gtapper/incubator-trafodion/core/sqf/sql/libhdfs_files/hadoop-2.6.0-src/hadoop-common-project/hadoop-common/target/native"):
error=2, No such file or directory
[ERROR] around Ant part ...<exec
dir="/home/gtapper/incubator-trafodion/core/sqf/sql/libhdfs_files/hadoop-2.6.0-src/hadoop-common-project/hadoop-common/target/native"
executable="cmake" failonerror="true">... @ 4:181 in
/home/gtapper/incubator-trafodion/core/sqf/sql/libhdfs_files/hadoop-2.6.0-src/hadoop-common-project/hadoop-common/target/antrun/build-main.xml
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e
switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please
read the following articles:
[ERROR] [Help 1]
http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the
command
[ERROR]   mvn <goals> -rf :hadoop-common
Copying include file and built libraries to Trafodion export dir...
+ cp -f
/home/gtapper/incubator-trafodion/core/sqf/sql/libhdfs_files/hadoop-2.6.0-src/hadoop-dist/target/hadoop-2.6.0/include/hdfs.h
/home/gtapper/incubator-trafodion/core/sqf/export/include
cp: cannot stat
`/home/gtapper/incubator-trafodion/core/sqf/sql/libhdfs_files/hadoop-2.6.0-src/hadoop-dist/target/hadoop-2.6.0/include/hdfs.h':
No such file or directory
+ cp -Pf
'/home/gtapper/incubator-trafodion/core/sqf/sql/libhdfs_files/hadoop-2.6.0-src/hadoop-dist/target/hadoop-2.6.0/lib/native/libhdfs*.so*'
/home/gtapper/incubator-trafodion/core/sqf/export/lib64d
cp: cannot stat
`/home/gtapper/incubator-trafodion/core/sqf/sql/libhdfs_files/hadoop-2.6.0-src/hadoop-dist/target/hadoop-2.6.0/lib/native/libhdfs*.so*':
No such file or directory
+ cp -Pf
'/home/gtapper/incubator-trafodion/core/sqf/sql/libhdfs_files/hadoop-2.6.0-src/hadoop-dist/target/hadoop-2.6.0/lib/native/libhadoop*.so*'
/home/gtapper/incubator-trafodion/core/sqf/export/lib64d
cp: cannot stat
`/home/gtapper/incubator-trafodion/core/sqf/sql/libhdfs_files/hadoop-2.6.0-src/hadoop-dist/target/hadoop-2.6.0/lib/native/libhadoop*.so*':
No such file or directory
+ ls -l /home/gtapper/incubator-trafodion/core/sqf/export/include/hdfs.h
ls: cannot access
/home/gtapper/incubator-trafodion/core/sqf/export/include/hdfs.h: No such
file or directory
+ ls -l /home/gtapper/incubator-trafodion/core/sqf/export/lib64d/libhdfs.so
ls: cannot access
/home/gtapper/incubator-trafodion/core/sqf/export/lib64d/libhdfs.so: No such
file or directory
+ ls -l /home/gtapper/incubator-trafodion/core/sqf/export/lib64d/libhadoop.so
ls: cannot access
/home/gtapper/incubator-trafodion/core/sqf/export/lib64d/libhadoop.so: No
such file or directory
+ [[ ! -r /home/gtapper/incubator-trafodion/core/sqf/export/include/hdfs.h ]]
+ echo 'Error, not all files were created'
+ tee -a /home/gtapper/incubator-trafodion/core/sqf/sql/libhdfs_files/build.log
Error, not all files were created
+ ls -l /home/gtapper/incubator-trafodion/core/sqf/export/include/hdfs.h
ls: cannot access
/home/gtapper/incubator-trafodion/core/sqf/export/include/hdfs.h: No such
file or directory
+ ls -l /home/gtapper/incubator-trafodion/core/sqf/export/lib64d/libhdfs.so
ls: cannot access
/home/gtapper/incubator-trafodion/core/sqf/export/lib64d/libhdfs.so: No such
file or directory
+ exit 1
make[3]: *** [copytoolslibs] Error 1
make[3]: Leaving directory
`/home/gtapper/incubator-trafodion/core/sql/nskgmake'
make[2]: *** [setup] Error 2
make[2]: Leaving directory `/home/gtapper/incubator-trafodion/core/sqf/sql'
make[1]: *** [setupdir] Error 2
make[1]: Leaving directory `/home/gtapper/incubator-trafodion/core/sqf'
make: *** [sqroot] Error 2

Is this a known issue or should I open a Jira?

--
Thanks,

Gunnar
*If you think you can you can, if you think you can't you're right.*
