Hello Rohini,

Thank you for your reply.

Please find the log below:

2018-08-13 09:19:18,644 -- DEBUG -- address: Acamar/192.168.102.189 isLoopbackAddress: false, with host 192.168.102.189 Acamar
2018-08-13 09:19:19,509 -- WARN -- Your hostname, Acamar resolves to a loopback/non-reachable address: fe80:0:0:0:d53f:fff5:c5cd:d591%eth10, but we couldn't find any external IP address!
2018-08-13 09:19:19,633 -- DEBUG -- Using SLF4J as the default logging framework
2018-08-13 09:19:20,137 -- DEBUG -- Create a new graph.
2018-08-13 09:19:20,436 -- DEBUG -- Original macro AST:
(QUERY (STATEMENT A (LOAD '/apps/hive/warehouse/cnx.db/employee/atul.csv' (FUNC PigStorage))))

2018-08-13 09:19:20,436 -- DEBUG -- macro AST after import:
(QUERY (STATEMENT A (LOAD '/apps/hive/warehouse/cnx.db/employee/atul.csv' (FUNC PigStorage))))

2018-08-13 09:19:20,436 -- DEBUG -- Resulting macro AST:
(QUERY (STATEMENT A (LOAD '/apps/hive/warehouse/cnx.db/employee/atul.csv' (FUNC PigStorage))))

2018-08-13 09:19:21,379 -- DEBUG -- Could not find schema file for /apps/hive/warehouse/cnx.db/employee/atul.csv
2018-08-13 09:19:21,384 -- DEBUG -- Original macro AST:
(QUERY (STATEMENT A (LOAD '/apps/hive/warehouse/cnx.db/employee/atul.csv' (FUNC PigStorage))) (STATEMENT (STORE A '/user/Admin/pig_Output1/' (FUNC PigStorage))))

2018-08-13 09:19:21,385 -- DEBUG -- macro AST after import:
(QUERY (STATEMENT A (LOAD '/apps/hive/warehouse/cnx.db/employee/atul.csv' (FUNC PigStorage))) (STATEMENT (STORE A '/user/Admin/pig_Output1/' (FUNC PigStorage))))

2018-08-13 09:19:21,385 -- DEBUG -- Resulting macro AST:
(QUERY (STATEMENT A (LOAD '/apps/hive/warehouse/cnx.db/employee/atul.csv' (FUNC PigStorage))) (STATEMENT (STORE A '/user/Admin/pig_Output1/' (FUNC PigStorage))))

2018-08-13 09:19:21,400 -- DEBUG -- Could not find schema file for /apps/hive/warehouse/cnx.db/employee/atul.csv
2018-08-13 09:19:21,622 -- DEBUG -- Original macro AST:
(QUERY (STATEMENT A (LOAD '/apps/hive/warehouse/cnx.db/employee/atul.csv' (FUNC PigStorage))) (STATEMENT (STORE A '/user/Admin/pig_Output1/' (FUNC PigStorage))))

2018-08-13 09:19:21,622 -- DEBUG -- macro AST after import:
(QUERY (STATEMENT A (LOAD '/apps/hive/warehouse/cnx.db/employee/atul.csv' (FUNC PigStorage))) (STATEMENT (STORE A '/user/Admin/pig_Output1/' (FUNC PigStorage))))

2018-08-13 09:19:21,622 -- DEBUG -- Resulting macro AST:
(QUERY (STATEMENT A (LOAD '/apps/hive/warehouse/cnx.db/employee/atul.csv' (FUNC PigStorage))) (STATEMENT (STORE A '/user/Admin/pig_Output1/' (FUNC PigStorage))))

2018-08-13 09:19:21,637 -- DEBUG -- Could not find schema file for /apps/hive/warehouse/cnx.db/employee/atul.csv
2018-08-13 09:19:21,642 -- DEBUG -- Pig Internal storage in use
2018-08-13 09:19:21,731 -- INFO -- Pig features used in the script: UNKNOWN
2018-08-13 09:19:21,769 -- INFO -- Key [pig.schematuple] was not set... will not generate code.
2018-08-13 09:19:21,807 -- INFO -- {RULES_ENABLED=[AddForEach, ColumnMapKeyPrune, ConstantCalculator, GroupByConstParallelSetter, LimitOptimizer, LoadTypeCastInserter, MergeFilter, MergeForEach, NestedLimitOptimizer, PartitionFilterOptimizer, PredicatePushdownOptimizer, PushDownForEachFlatten, PushUpFilter, SplitFilter, StreamTypeCastInserter]}
2018-08-13 09:19:21,960 -- DEBUG -- Found heap (Code Cache) of type Non-heap memory
2018-08-13 09:19:21,960 -- DEBUG -- Found heap (Metaspace) of type Non-heap memory
2018-08-13 09:19:21,960 -- DEBUG -- Found heap (Compressed Class Space) of type Non-heap memory
2018-08-13 09:19:21,960 -- DEBUG -- Found heap (PS Eden Space) of type Heap memory
2018-08-13 09:19:21,961 -- DEBUG -- Found heap (PS Survivor Space) of type Heap memory
2018-08-13 09:19:21,961 -- DEBUG -- Found heap (PS Old Gen) of type Heap memory
2018-08-13 09:19:21,961 -- INFO -- Selected heap (PS Old Gen) of size 1404043264 to monitor. collectionUsageThreshold = 1037041664, usageThreshold = 1037041664
2018-08-13 09:19:22,046 -- INFO -- File concatenation threshold: 100 optimistic? false
2018-08-13 09:19:22,070 -- DEBUG -- Not a sampling job.
2018-08-13 09:19:22,086 -- DEBUG -- Cannot find POLocalRearrange or POUnion in map leaf, skip secondary key optimizing
2018-08-13 09:19:22,093 -- INFO -- MR plan size before optimization: 1
2018-08-13 09:19:22,093 -- INFO -- MR plan size after optimization: 1
2018-08-13 09:19:22,189 -- INFO -- Pig script settings are added to the job
2018-08-13 09:19:22,197 -- INFO -- mapred.job.reduce.markreset.buffer.percent is not set, set to default 0.3
2018-08-13 09:19:22,260 -- INFO -- This job cannot be converted run in-process
Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCrc32.nativeComputeChunkedSumsByteArray(II[BI[BIILjava/lang/String;JZ)V
    at org.apache.hadoop.util.NativeCrc32.nativeComputeChunkedSumsByteArray(Native Method)
    at org.apache.hadoop.util.NativeCrc32.calculateChunkedSumsByteArray(NativeCrc32.java:86)
    at org.apache.hadoop.util.DataChecksum.calculateChunkedSums(DataChecksum.java:430)
    at org.apache.hadoop.fs.FSOutputSummer.writeChecksumChunks(FSOutputSummer.java:202)
    at org.apache.hadoop.fs.FSOutputSummer.flushBuffer(FSOutputSummer.java:163)
    at org.apache.hadoop.fs.FSOutputSummer.flushBuffer(FSOutputSummer.java:144)
    at org.apache.hadoop.hdfs.DFSOutputStream.closeImpl(DFSOutputStream.java:2250)
    at org.apache.hadoop.hdfs.DFSOutputStream.close(DFSOutputStream.java:2232)
    at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.close(FSDataOutputStream.java:72)
    at org.apache.hadoop.fs.FSDataOutputStream.close(FSDataOutputStream.java:106)
    at org.apache.hadoop.io.IOUtils.cleanup(IOUtils.java:244)
    at org.apache.hadoop.io.IOUtils.closeStream(IOUtils.java:261)
    at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:68)
    at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler.shipToHDFS(JobControlCompiler.java:1809)
    at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler.putJarOnClassPathThroughDistributedCache(JobControlCompiler.java:1689)
    at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler.getJob(JobControlCompiler.java:659)
    at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler.compile(JobControlCompiler.java:331)
    at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher.launchPig(MapReduceLauncher.java:221)
    at org.apache.pig.backend.hadoop.executionengine.HExecutionEngine.launchPig(HExecutionEngine.java:290)
    at org.apache.pig.PigServer.launchPlan(PigServer.java:1475)
    at org.apache.pig.PigServer.executeCompiledLogicalPlan(PigServer.java:1460)
    at org.apache.pig.PigServer.storeEx(PigServer.java:1119)
    at org.apache.pig.PigServer.store(PigServer.java:1082)
    at org.apache.pig.PigServer.openIterator(PigServer.java:995)
    at com.contineonx.ext.pig.Test_Atul.main(Test_Atul.java:29)


Regards,
Atul Raut

On 8/11/2018 2:11, Rohini Palaniswamy wrote:
What you are missing in your source code is a call to openIterator() or store() to retrieve the results.

http://pig.apache.org/docs/r0.17.0/api/org/apache/pig/PigServer.html: "The programmer then registers queries using registerQuery() and retrieves results using openIterator() or store()."
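
For illustration, a minimal sketch of that pattern (the class name is a placeholder; the connection properties and paths are the ones from the snippet quoted further below, and error handling is omitted):

import java.util.Iterator;
import java.util.Properties;

import org.apache.pig.ExecType;
import org.apache.pig.PigServer;
import org.apache.pig.data.Tuple;

public class PigClientSketch {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.setProperty("fs.default.name", "hdfs://192.168.102.179:8020");
        props.setProperty("mapred.job.tracker", "192.168.102.179:8021");

        PigServer pig = new PigServer(ExecType.MAPREDUCE, props);

        // registerQuery() only builds the plan; nothing runs on the cluster
        // until results are requested.
        pig.registerQuery("A = LOAD '/apps/employee/sample.txt' USING PigStorage();");

        // Either pull the results back into the Java program ...
        Iterator<Tuple> it = pig.openIterator("A");
        while (it.hasNext()) {
            System.out.println(it.next());
        }

        // ... or write them out to HDFS instead:
        // pig.store("A", "/user/Admin/pig_Output1/");

        pig.shutdown();
    }
}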



On Fri, Aug 10, 2018 at 1:36 PM, Rohini Palaniswamy <roh...@apache.org> wrote:

    Pig does not have any server. The client directly launches jobs on
    the YARN cluster. You can just use the APIs in
    http://pig.apache.org/docs/r0.17.0/api/org/apache/pig/PigServer.html
    to execute scripts from your Java program.
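
    As a rough sketch of that approach, the following runs a whole Pig script
    file from Java (the class name and script path are placeholders; the
    connection properties are taken from the snippet quoted below):

    import java.util.Properties;

    import org.apache.pig.ExecType;
    import org.apache.pig.PigServer;

    public class RunPigScript {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            props.setProperty("fs.default.name", "hdfs://192.168.102.179:8020");
            props.setProperty("mapred.job.tracker", "192.168.102.179:8021");

            PigServer pig = new PigServer(ExecType.MAPREDUCE, props);

            // Batch mode: register every statement in the script file, then
            // run them (including any STORE statements) as one batch of
            // MapReduce jobs on the cluster.
            pig.setBatchOn();
            pig.registerScript("/local/path/to/myscript.pig");  // placeholder path
            pig.executeBatch();

            pig.shutdown();
        }
    }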

    On Sun, Jul 29, 2018 at 8:24 PM, Atul Raut <rautatu...@gmail.com> wrote:

        How do I execute a Pig script from a Java class?

        I need to submit a Pig script from my Java application to a
        Pig server (which may be at any remote location); that Pig
        server should execute the script and return the result to my
        Java application.



        Following is my source code:

        import java.util.Properties;

        import org.apache.pig.ExecType;
        import org.apache.pig.PigServer;

        public class Test_Atul {
            public static void main(String[] args) throws Exception {
                // Hadoop home directory for the client-side Hadoop libraries.
                System.setProperty("hadoop.home.dir", "/Pig/hadoop-common-2.2.0-bin-master/");

                // Connection settings for the remote NameNode and JobTracker.
                Properties props = new Properties();
                props.setProperty("fs.default.name", "hdfs://192.168.102.179:8020");
                props.setProperty("mapred.job.tracker", "192.168.102.179:8021");
                props.setProperty("pig.use.overriden.hadoop.configs", "true");

                PigServer pig = new PigServer(ExecType.MAPREDUCE, props);
                pig.debugOn();
                pig.registerScript("A = LOAD '/apps/employee/sample.txt' USING PigStorage();");
            }
        }



        Regards,
        Atul Raut



