Thanks Venkatesh. That helped. I am now getting the exception below when launching the same application. I have attached my job details.
2016-08-28 11:40:22,766 INFO com.datatorrent.stram.StramClient: Set the environment for the application master
2016-08-28 11:40:22,766 INFO com.datatorrent.stram.StramClient: Setting up app master command
2016-08-28 11:40:22,788 INFO com.datatorrent.stram.StramClient: Completed setting up app master command ${JAVA_HOME}/bin/java -Djava.io.tmpdir=$PWD/tmp -Xmx768m -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp/dt-heap-7.bin -Dhadoop.root.logger=INFO,RFA -Dhadoop.log.dir=<LOG_DIR> -Ddt.attr.APPLICATION_PATH=hdfs://<server>:8020/user/bj9306/datatorrent/apps/application_1472055988122_0007 com.datatorrent.stram.StreamingAppMaster 1><LOG_DIR>/AppMaster.stdout 2><LOG_DIR>/AppMaster.stderr
2016-08-28 11:40:22,793 INFO com.datatorrent.stram.StramClient: Submitting application: {name=file2file, queue=default, user=bj9306 (auth:SIMPLE), resource=<memory:1024, vCores:0>}
2016-08-28 11:41:09,645 WARN com.datatorrent.stram.client.EventsAgent: Got exception when reading events
java.io.FileNotFoundException: File does not exist: /user/bj9306/datatorrent/apps/application_1472055988122_0007/events/index.txt
    at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:71)
    at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:61)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocationsInt(FSNamesystem.java:1828)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1799)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1712)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getBlockLocations(NameNodeRpcServer.java:652)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getBlockLocations(ClientNamenodeProtocolServerSideTranslatorPB.java:365)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2151)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2147)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2145)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
    at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
    at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73)
    at org.apache.hadoop.hdfs.DFSClient.callGetBlockLocations(DFSClient.java:1242)
    at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:1227)
    at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:1215)
    at org.apache.hadoop.hdfs.DFSInputStream.fetchLocatedBlocksAndGetLastBlockLength(DFSInputStream.java:303)
    at org.apache.hadoop.hdfs.DFSInputStream.openInfo(DFSInputStream.java:269)
    at org.apache.hadoop.hdfs.DFSInputStream.<init>(DFSInputStream.java:261)
    at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1540)
    at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:303)
    at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:299)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:299)
    at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:767)
    at com.datatorrent.stram.client.EventsAgent.getLatestEvents(EventsAgent.java:118)
    at com.datatorrent.gateway.resources.ws.v2.EventsResource.getEvents(kc:59)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
    at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
    at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
    at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:288)
    at com.sun.jersey.server.impl.uri.rules.SubLocatorRule.accept(SubLocatorRule.java:134)
    at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
    at com.sun.jersey.server.impl.uri.rules.SubLocatorRule.accept(SubLocatorRule.java:134)
    at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
    at com.sun.jersey.server.impl.uri.rules.SubLocatorRule.accept(SubLocatorRule.java:134)
    at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
    at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
    at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
    at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
    at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1469)
    at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1400)
    at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1349)
    at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1339)
    at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:416)
    at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:537)
    at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:699)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:848)
    at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:669)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:457)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:229)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1075)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:384)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:193)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1009)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:135)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:154)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:116)
    at org.eclipse.jetty.server.Server.handle(Server.java:368)
    at org.eclipse.jetty.server.AbstractHttpConnection.handleRequest(AbstractHttpConnection.java:489)
    at org.eclipse.jetty.server.AbstractHttpConnection.headerComplete(AbstractHttpConnection.java:942)
    at org.eclipse.jetty.server.AbstractHttpConnection$RequestHandler.headerComplete(AbstractHttpConnection.java:1004)
    at org.eclipse.jetty.http.HttpParser.parseNext(HttpParser.java:640)
    at org.eclipse.jetty.http.HttpParser.parseAvailable(HttpParser.java:235)
    at org.eclipse.jetty.server.AsyncHttpConnection.handle(AsyncHttpConnection.java:82)
    at org.eclipse.jetty.io.nio.SelectChannelEndPoint.handle(SelectChannelEndPoint.java:628)
    at org.eclipse.jetty.io.nio.SelectChannelEndPoint$1.run(SelectChannelEndPoint.java:52)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:608)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:543)
    at java.lang.Thread.run(Thread.java:745)
From: Venkatesh Kottapalli [mailto:[email protected]]
Sent: Friday, August 26, 2016 12:11 PM
To: [email protected]
Subject: Re: First application using DT
Looks like you introduced a <pluginManagement> tag, which has changed the build behavior. Please remove this tag to generate the apa file.
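To illustrate the advice above: a minimal sketch of the difference, using a placeholder plugin rather than the project's actual one. Plugins declared only inside <pluginManagement> merely set default configuration and are not bound to the build lifecycle, so the app-package assembly step that produces the .apa never runs.

```xml
<!-- Does NOT execute during the build: pluginManagement only defines defaults
     for plugins that are (re)declared elsewhere. -->
<build>
  <pluginManagement>
    <plugins>
      <plugin>
        <!-- placeholder coordinates, not the project's real plugin -->
        <groupId>example.group</groupId>
        <artifactId>example-packaging-plugin</artifactId>
      </plugin>
    </plugins>
  </pluginManagement>
</build>

<!-- DOES execute: plugins directly under build/plugins are bound to the build,
     which is what lets the .apa packaging step run. -->
<build>
  <plugins>
    <plugin>
      <groupId>example.group</groupId>
      <artifactId>example-packaging-plugin</artifactId>
    </plugin>
  </plugins>
</build>
```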
-Venkatesh.
On Aug 26, 2016, at 11:39 AM, JOHN, BIBIN <[email protected]> wrote:
PFA
Thanks and Regards,
Bibin John| Data Movement Technology Development
20205 North Creek Pkwy, Bothell, WA 98011 USA
Office: (770) 235 5614 | Cell: (469) 648-9858
Email: [email protected]<mailto:[email protected]>
OOO Alert : 09/19/2016 – 10/14/2016
From: Ankit Sarraf [mailto:[email protected]]
Sent: Friday, August 26, 2016 11:39 AM
To: [email protected]<mailto:[email protected]>
Subject: RE: First application using DT
Please provide the pom.xml that you are using.
Ankit
On Aug 26, 2016 1:03 PM, "JOHN, BIBIN" <[email protected]> wrote:
Bhupesh,
Thanks for your reply. I don't see a .apa file being created when I run the maven build.
<image001.png>
$ mvn clean package -DskipTests
[INFO] Scanning for projects...
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building My Apex Application 0.0.1-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[WARNING] The POM for com.datatorrent:dt-contrib:jar:3.4.0 is invalid, transitive dependencies (if any) will not be available, enable debug logging for more details
[WARNING] The POM for com.datatorrent:dt-library:jar:3.4.0 is invalid, transitive dependencies (if any) will not be available, enable debug logging for more details
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ salesApp ---
[INFO] Deleting C:\BIBIN\PROJECT\JAVA\ECLIPSE-KEPLER\salesapp\target
[INFO]
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ salesApp ---
[WARNING] Using platform encoding (Cp1252 actually) to copy filtered resources, i.e. build is platform dependent!
[INFO] Copying 4 resources
[INFO]
[INFO] --- maven-resources-plugin:2.6:copy-resources (copy-resources) @ salesApp ---
[WARNING] File encoding has not been set, using platform encoding Cp1252, i.e. build is platform dependent!
[WARNING] Using platform encoding (Cp1252 actually) to copy filtered resources, i.e. build is platform dependent!
[INFO] skip non existing resourceDirectory C:\BIBIN\PROJECT\JAVA\ECLIPSE-KEPLER\salesapp\target\generated-resources\xml-javadoc
[INFO]
[INFO] --- maven-compiler-plugin:3.3:compile (default-compile) @ salesApp ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 7 source files to C:\BIBIN\PROJECT\JAVA\ECLIPSE-KEPLER\salesapp\target\classes
[ERROR] error reading C:\Users\bj9306\.m2\repository\it\unimi\dsi\fastutil\7.0.6\fastutil-7.0.6.jar; error in opening zip file
[ERROR] error reading C:\Users\bj9306\.m2\repository\org\apache\avro\avro\1.7.4\avro-1.7.4.jar; error in opening zip file
[WARNING] bootstrap class path not set in conjunction with -source 1.7
[ERROR] error reading C:\Users\bj9306\.m2\repository\it\unimi\dsi\fastutil\7.0.6\fastutil-7.0.6.jar; error in opening zip file
[ERROR] error reading C:\Users\bj9306\.m2\repository\org\apache\avro\avro\1.7.4\avro-1.7.4.jar; error in opening zip file
[WARNING] /C:/BIBIN/PROJECT/JAVA/ECLIPSE-KEPLER/salesapp/src/main/java/com/example/salesapp/SalesDemo.java:[22,42] com.datatorrent.contrib.hdht.tfile.TFileImpl in com.datatorrent.contrib.hdht.tfile has been deprecated
[WARNING] /C:/BIBIN/PROJECT/JAVA/ECLIPSE-KEPLER/salesapp/src/main/java/com/example/salesapp/SalesDemo.java:[71,5] com.datatorrent.contrib.hdht.tfile.TFileImpl in com.datatorrent.contrib.hdht.tfile has been deprecated
[WARNING] /C:/BIBIN/PROJECT/JAVA/ECLIPSE-KEPLER/salesapp/src/main/java/com/example/salesapp/SalesDemo.java:[71,29] com.datatorrent.contrib.hdht.tfile.TFileImpl in com.datatorrent.contrib.hdht.tfile has been deprecated
[WARNING] /C:/BIBIN/PROJECT/JAVA/ECLIPSE-KEPLER/salesapp/src/main/java/com/example/salesapp/SalesDemo.java:[76,52] COUNTERS_AGGREGATOR in com.datatorrent.api.Context.OperatorContext has been deprecated
[INFO]
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ salesApp ---
[WARNING] Using platform encoding (Cp1252 actually) to copy filtered resources, i.e. build is platform dependent!
[INFO] Copying 1 resource
[INFO]
[INFO] --- maven-compiler-plugin:3.3:testCompile (default-testCompile) @ salesApp ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-surefire-plugin:2.12.4:test (default-test) @ salesApp ---
[INFO] Tests are skipped.
[INFO]
[INFO] --- maven-jar-plugin:2.4:jar (default-jar) @ salesApp ---
[INFO] Building jar: C:\BIBIN\PROJECT\JAVA\ECLIPSE-KEPLER\salesapp\target\salesApp-0.0.1-SNAPSHOT.jar
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 5.320 s
[INFO] Finished at: 2016-08-26T10:00:10-07:00
[INFO] Final Memory: 29M/285M
[INFO] ------------------------------------------------------------------------
Thanks and Regards,
Bibin John | Data Movement Technology Development
From: Bhupesh Chawda [mailto:[email protected]]
Sent: Friday, August 26, 2016 8:56 AM
To: [email protected]
Subject: Re: First application using DT
Hi John,
In addition to the jar file, a .apa file should also be generated with the same name.
You can try uploading the apa file instead of the jar file.
~ Bhupesh
On Fri, Aug 26, 2016 at 8:54 PM, JOHN, BIBIN <[email protected]> wrote:
All,
I am doing a POC in DT and trying to set up my first project. When I try to upload my package, I get the exceptions below. Could you please help?
I have attached pom.xml.
Instructions used :
http://docs.datatorrent.com/tutorials/salesdimensions/#building-the-sales-dimension-application-in-java
File "salesApp-0.0.1-SNAPSHOT.jar" upload failed.
java.io.IOException: Not a valid app package. App Package Name or Version or Class-Path is missing from MANIFEST.MF
    at com.datatorrent.stram.client.AppPackage.<init>(AppPackage.java:157)
    at com.datatorrent.stram.client.AppPackage.<init>(AppPackage.java:203)
    at com.datatorrent.stram.cli.ApexCli.newAppPackageInstance(ApexCli.java:452)
    at com.datatorrent.stram.cli.ApexCli$GetAppPackageInfoCommand.execute(ApexCli.java:3400)
    at com.datatorrent.stram.cli.ApexCli$3.run(ApexCli.java:1462)
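One quick way to see which of those manifest attributes the archive is missing is to inspect META-INF/MANIFEST.MF directly. The sketch below is illustrative only: the DT-App-Package-* attribute names are an assumption inferred from the error message (not confirmed by this thread), and the sample archive is built in memory rather than read from a real package file.

```python
import io
import zipfile

# Assumed attribute names corresponding to "App Package Name or Version or
# Class-Path" in the error message above; verify against your actual package.
REQUIRED = ("DT-App-Package-Name", "DT-App-Package-Version", "Class-Path")

def missing_manifest_keys(archive_bytes, required=REQUIRED):
    """Return the required manifest attributes absent from META-INF/MANIFEST.MF."""
    with zipfile.ZipFile(io.BytesIO(archive_bytes)) as zf:
        manifest = zf.read("META-INF/MANIFEST.MF").decode("utf-8")
    keys = {line.split(":", 1)[0].strip() for line in manifest.splitlines() if ":" in line}
    return [k for k in required if k not in keys]

# Build a toy archive whose manifest lacks Class-Path, mimicking the failing upload.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr(
        "META-INF/MANIFEST.MF",
        "Manifest-Version: 1.0\n"
        "DT-App-Package-Name: salesApp\n"
        "DT-App-Package-Version: 0.0.1-SNAPSHOT\n",
    )

print(missing_manifest_keys(buf.getvalue()))  # -> ['Class-Path']
```

To check a real build output, read the file instead of the in-memory buffer, e.g. `missing_manifest_keys(open("target/salesApp-0.0.1-SNAPSHOT.jar", "rb").read())`.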
<pom.xml>
Application.java
LineOutputOperator.java
<?xml version="1.0"?>
<configuration>
<!--
<property>
<name>dt.application.{appName}.operator.{opName}.prop.{propName}</name>
<value>some-default-value (if value is not specified, it is required from the user or custom config when launching)</value>
</property>
-->
<!-- memory assigned to app master
<property>
<name>dt.attr.MASTER_MEMORY_MB</name>
<value>1024</value>
</property>
-->
<!-- file input operator -->
<property>
<name>dt.application.file2file.operator.fileIn.prop.filePath</name>
<value>/tmp/dtfiles</value>
</property>
<property>
<name>dt.application.file2file.operator.fileIn.prop.directory</name>
<value>/tmp/dtfiles</value>
</property>
<property>
<name>dt.application.file2file.operator.fileIn.prop.baseName</name>
<value>file2filein</value>
</property>
<property>
<name>dt.application.file2file.operator.fileIn.prop.maxLength</name>
<value>1024</value>
</property>
<property>
<name>dt.application.file2file.operator.fileIn.prop.rotationWindows</name>
<value>4</value>
</property>
<!-- file output operator -->
<property>
<name>dt.application.file2file.operator.fileOut.prop.filePath</name>
<value>/tmp/dtfiles</value>
</property>
<property>
<name>dt.application.file2file.operator.fileOut.prop.baseName</name>
<value>file2fileout</value>
</property>
</configuration>