Hi there,

I am unable to get started with Apache Apex.

I used the mvn command below to create the project:



#!/bin/bash
# script to create a new project

# change project name and archetype version as needed
name=myapexapp
version=3.4.0

mvn -B archetype:generate \
  -DarchetypeGroupId=org.apache.apex \
  -DarchetypeArtifactId=apex-app-archetype \
  -DarchetypeVersion=$version \
  -DgroupId=com.example \
  -Dpackage=com.example.$name \
  -DartifactId=$name \
  -Dversion=1.0-SNAPSHOT

I am trying to run the sample SortedWordCount application.
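For completeness, these are the build and launch steps I used afterwards (the .apa name follows the archetype defaults above; the exact CLI launcher name may differ by release):

```shell
#!/bin/bash
# the archetype produces target/<artifactId>-<version>.apa
name=myapexapp
apa="target/${name}-1.0-SNAPSHOT.apa"

# build (skipping tests for a quick turnaround), then launch
# from the Apex CLI shell (dtcli/apex, depending on release):
#   mvn clean package -DskipTests
#   > launch target/myapexapp-1.0-SNAPSHOT.apa
echo "$apa"
```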



I see that the status of the YARN application is FAILED.
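This is how I checked the outcome (the application id is the one reported in the Application Master log below; the yarn commands were run on the cluster):

```shell
#!/bin/bash
# application id as reported in the Application Master log
appid=application_1473662193222_0003

# checked on the cluster with:
#   yarn application -status "$appid"
#   yarn logs -applicationId "$appid"
echo "$appid"
```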



Below is the error from the log file:

2016-09-12 00:47:12,284 INFO com.datatorrent.stram.StreamingAppMaster:
Initializing Application Master.
2016-09-12 00:47:12,385 INFO
com.datatorrent.stram.StreamingAppMasterService: Application master,
appId=3, clustertimestamp=1473662193222, attemptId=1
2016-09-12 00:47:13,432 INFO
com.datatorrent.common.util.AsyncFSStorageAgent: using
/scratch/sanav/view_storage/projects/downloads/hadoop/tmp/hadoop-sanav/nm-local-dir/usercache/sanav/appcache/application_1473662193222_0003/container_1473662193222_0003_01_000001/tmp/chkp6399702117805507910
as the basepath for checkpointing.
2016-09-12 00:47:15,223 INFO com.datatorrent.stram.FSRecoveryHandler:
Creating 
hdfs://den00spj:9000/user/sanav/datatorrent/apps/application_1473662193222_0003/recovery/log
2016-09-12 00:47:15,272 INFO
com.datatorrent.stram.StreamingAppMasterService: Starting application
with 5 operators in 5 containers
2016-09-12 00:47:15,286 INFO
org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Upper
bound of the thread pool size is 500
2016-09-12 00:47:15,288 INFO
org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy:
yarn.client.max-cached-nodemanagers-proxies : 0
2016-09-12 00:47:15,316 INFO org.apache.hadoop.yarn.client.RMProxy:
Connecting to ResourceManager at /0.0.0.0:8030
2016-09-12 00:47:15,345 INFO
com.datatorrent.stram.StreamingContainerParent: Config: Configuration:
core-default.xml, core-site.xml, yarn-default.xml, yarn-site.xml,
hdfs-default.xml, hdfs-site.xml
2016-09-12 00:47:15,345 INFO
com.datatorrent.stram.StreamingContainerParent: Listener thread count
30
2016-09-12 00:47:15,351 INFO org.apache.hadoop.ipc.CallQueueManager:
Using callQueue class java.util.concurrent.LinkedBlockingQueue
2016-09-12 00:47:15,356 INFO org.apache.hadoop.ipc.Server: Starting
Socket Reader #1 for port 55906
2016-09-12 00:47:15,369 INFO org.apache.hadoop.ipc.Server: IPC Server
listener on 55906: starting
2016-09-12 00:47:15,370 INFO org.apache.hadoop.ipc.Server: IPC Server
Responder: starting
2016-09-12 00:47:15,385 INFO
com.datatorrent.stram.StreamingContainerParent: Container callback
server listening at den00spj/10.196.38.196:55906
2016-09-12 00:47:15,440 INFO org.mortbay.log: Logging to
org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via
org.mortbay.log.Slf4jLog
2016-09-12 00:47:15,507 INFO
org.apache.hadoop.security.authentication.server.AuthenticationFilter:
Unable to initialize FileSignerSecretProvider, falling back to use
random secrets.
2016-09-12 00:47:15,512 INFO org.apache.hadoop.http.HttpRequestLog:
Http request log for http.requests.stram is not defined
2016-09-12 00:47:15,518 INFO org.apache.hadoop.http.HttpServer2: Added
global filter 'safety'
(class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
2016-09-12 00:47:15,520 INFO org.apache.hadoop.http.HttpServer2: Added
filter static_user_filter
(class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter)
to context stram
2016-09-12 00:47:15,520 INFO org.apache.hadoop.http.HttpServer2: Added
filter static_user_filter
(class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter)
to context static
2016-09-12 00:47:15,520 INFO org.apache.hadoop.http.HttpServer2: Added
filter static_user_filter
(class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter)
to context logs
2016-09-12 00:47:15,523 INFO org.apache.hadoop.http.HttpServer2:
adding path spec: /stram/*
2016-09-12 00:47:15,523 INFO org.apache.hadoop.http.HttpServer2:
adding path spec: /ws/*
2016-09-12 00:47:15,875 INFO org.apache.hadoop.yarn.webapp.WebApps:
Registered webapp guice modules
2016-09-12 00:47:15,876 INFO org.apache.hadoop.http.HttpServer2: Jetty
bound to port 21097

2016-09-12 00:47:16,147 ERROR com.datatorrent.stram.StreamingAppMasterService: Webapps failed to start. Ignoring for now:
org.apache.hadoop.yarn.webapp.WebAppException: Error starting http server
        at org.apache.hadoop.yarn.webapp.WebApps$Builder.start(WebApps.java:310)
        at com.datatorrent.stram.StreamingAppMasterService.serviceStart(StreamingAppMasterService.java:616)
        at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
        at com.datatorrent.stram.StreamingAppMaster.main(StreamingAppMaster.java:103)
Caused by: java.io.IOException: Unable to initialize WebAppContext
        at org.apache.hadoop.http.HttpServer2.start(HttpServer2.java:879)
        at org.apache.hadoop.yarn.webapp.WebApps$Builder.start(WebApps.java:306)
        ... 3 more
Caused by: com.sun.jersey.api.container.ContainerException: No WebApplication provider is present
        at com.sun.jersey.spi.container.WebApplicationFactory.createWebApplication(WebApplicationFactory.java:69)
        at com.sun.jersey.spi.container.servlet.ServletContainer.create(ServletContainer.java:391)
        at com.sun.jersey.spi.container.servlet.ServletContainer$InternalWebComponent.create(ServletContainer.java:306)
        at com.sun.jersey.spi.container.servlet.WebComponent.load(WebComponent.java:607)
        at com.sun.jersey.spi.container.servlet.WebComponent.init(WebComponent.java:210)
        at com.sun.jersey.spi.container.servlet.ServletContainer.init(ServletContainer.java:373)
        at com.sun.jersey.spi.container.servlet.ServletContainer.init(ServletContainer.java:710)
        at com.google.inject.servlet.FilterDefinition.init(FilterDefinition.java:114)
        at com.google.inject.servlet.ManagedFilterPipeline.initPipeline(ManagedFilterPipeline.java:98)
        at com.google.inject.servlet.GuiceFilter.init(GuiceFilter.java:172)
        at org.mortbay.jetty.servlet.FilterHolder.doStart(FilterHolder.java:97)
        at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:50)
        at org.mortbay.jetty.servlet.ServletHandler.initialize(ServletHandler.java:713)
        at org.mortbay.jetty.servlet.Context.startContext(Context.java:140)
        at org.mortbay.jetty.webapp.WebAppContext.startContext(WebAppContext.java:1282)
        at org.mortbay.jetty.handler.ContextHandler.doStart(ContextHandler.java:518)
        at org.mortbay.jetty.webapp.WebAppContext.doStart(WebAppContext.java:499)
        at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:50)
        at org.mortbay.jetty.handler.HandlerCollection.doStart(HandlerCollection.java:152)
        at org.mortbay.jetty.handler.ContextHandlerCollection.doStart(ContextHandlerCollection.java:156)
        at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:50)
        at org.mortbay.jetty.handler.HandlerWrapper.doStart(HandlerWrapper.java:130)
        at org.mortbay.jetty.Server.doStart(Server.java:224)
        at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:50)
        at org.apache.hadoop.http.HttpServer2.start(HttpServer2.java:857)





Apart from this, I don't see any other errors in the log.
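From what I could find, "No WebApplication provider is present" seems to point at mismatched Jersey jars on the classpath. As a rough self-check of that theory, I sketched a count of distinct jersey-core versions among a list of jar names (the jar names below are made-up stand-ins, not from my cluster; on the cluster the list would come from the container's actual classpath):

```shell
#!/bin/bash
# Count distinct jersey-core versions in a list of jar names; more than
# one hints at a classpath conflict.  The jars passed in here are
# hypothetical examples only.
count_jersey_core_versions() {
  printf '%s\n' "$@" | grep -o 'jersey-core-[0-9][0-9.]*\.jar' | sort -u | grep -c .
}

count_jersey_core_versions \
  jersey-core-1.9.jar guava-11.0.2.jar jersey-core-1.17.1.jar
```

Here it prints 2, which would flag two different jersey-core versions. I assume the next step would be checking what my app itself bundles with `mvn dependency:tree -Dincludes=com.sun.jersey`. Does that sound like the right direction?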



Please advise.


-- 
Thanks
Sanal
