          <source>1.8</source>
          <target>1.8</target>
Could you change the source and target Java versions to 1.7 in the above
lines of your pom.xml and try again?
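
For reference, here is a minimal sketch of that section after the change
(assuming the standard maven-compiler-plugin configuration; the exact
plugin layout is an assumption, so adjust to match your actual pom.xml):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <configuration>
    <source>1.7</source>
    <target>1.7</target>
  </configuration>
</plugin>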

With this change I was able to run the application built with your pom.xml.

Ram

On Mon, Sep 12, 2016 at 8:48 PM, Sanal Vasudevan <get2sa...@gmail.com>
wrote:

> Hi,
>
> Please find attached the pom.xml file.
> I added the following to resolve a build issue.
>
> <dependency>
>   <groupId>org.codehaus.jettison</groupId>
>   <artifactId>jettison</artifactId>
>   <version>1.3.8</version>
> </dependency>
>
> It is a single-node setup.
> Hadoop version: 2.7.2, rb165c4fe8a74265c792ce23f546c64604acf0e41
> Yes, I have used the apex command to launch the .apa file.
>
>
> Thanks
> Sanal
>
> On Tue, Sep 13, 2016 at 1:29 AM, Munagala Ramanath <r...@datatorrent.com>
> wrote:
>
>> Looks like this happens if jersey-server is not present (e.g.
>> http://stackoverflow.com/questions/8662919/jersey-no-webapplication-provider-is-present-when-jersey-json-dependency-added)
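>>
>> If that is the cause, a minimal sketch of the usual fix from that
>> thread is to declare jersey-server explicitly (the version below is an
>> assumption; it should match the Jersey version your Hadoop
>> distribution pulls in):
>>
>> <dependency>
>>   <groupId>com.sun.jersey</groupId>
>>   <artifactId>jersey-server</artifactId>
>>   <version>1.9</version>
>> </dependency>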
>>
>> Have you made any changes to the pom.xml? If so, can you post it here?
>>
>> Also, can you tell us a bit more about your deployment environment: Is
>> it a single-node or a multi-node cluster? What version of Hadoop? Are
>> you using the "apex" command line tool to deploy?
>>
>> Ram
>>
>> On Mon, Sep 12, 2016 at 1:29 AM, Sanal Vasudevan <get2sa...@gmail.com>
>> wrote:
>>
>>> Hi there,
>>>
>>> I am unable to get started with Apache Apex.
>>>
>>> I used the below mvn command to create the project.
>>>
>>> #!/bin/bash
>>> # script to create a new project
>>>
>>> # change project name and archetype version as needed
>>> name=myapexapp
>>> version=3.4.0
>>>
>>> mvn -B archetype:generate \
>>>   -DarchetypeGroupId=org.apache.apex \
>>>   -DarchetypeArtifactId=apex-app-archetype \
>>>   -DarchetypeVersion=$version \
>>>   -DgroupId=com.example \
>>>   -Dpackage=com.example.$name \
>>>   -DartifactId=$name \
>>>   -Dversion=1.0-SNAPSHOT
>>>
>>> I am trying to run the sample SortedWordCount application.
>>>
>>> I see that the status of the YARN application is FAILED.
>>>
>>> Below is the error from the log file:
>>>
>>> 2016-09-12 00:47:12,284 INFO com.datatorrent.stram.StreamingAppMaster: Initializing Application Master.
>>> 2016-09-12 00:47:12,385 INFO com.datatorrent.stram.StreamingAppMasterService: Application master, appId=3, clustertimestamp=1473662193222, attemptId=1
>>> 2016-09-12 00:47:13,432 INFO com.datatorrent.common.util.AsyncFSStorageAgent: using /scratch/sanav/view_storage/projects/downloads/hadoop/tmp/hadoop-sanav/nm-local-dir/usercache/sanav/appcache/application_1473662193222_0003/container_1473662193222_0003_01_000001/tmp/chkp6399702117805507910 as the basepath for checkpointing.
>>> 2016-09-12 00:47:15,223 INFO com.datatorrent.stram.FSRecoveryHandler: Creating hdfs://den00spj:9000/user/sanav/datatorrent/apps/application_1473662193222_0003/recovery/log
>>> 2016-09-12 00:47:15,272 INFO com.datatorrent.stram.StreamingAppMasterService: Starting application with 5 operators in 5 containers
>>> 2016-09-12 00:47:15,286 INFO org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Upper bound of the thread pool size is 500
>>> 2016-09-12 00:47:15,288 INFO org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy: yarn.client.max-cached-nodemanagers-proxies : 0
>>> 2016-09-12 00:47:15,316 INFO org.apache.hadoop.yarn.client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8030
>>> 2016-09-12 00:47:15,345 INFO com.datatorrent.stram.StreamingContainerParent: Config: Configuration: core-default.xml, core-site.xml, yarn-default.xml, yarn-site.xml, hdfs-default.xml, hdfs-site.xml
>>> 2016-09-12 00:47:15,345 INFO com.datatorrent.stram.StreamingContainerParent: Listener thread count 30
>>> 2016-09-12 00:47:15,351 INFO org.apache.hadoop.ipc.CallQueueManager: Using callQueue class java.util.concurrent.LinkedBlockingQueue
>>> 2016-09-12 00:47:15,356 INFO org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 55906
>>> 2016-09-12 00:47:15,369 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 55906: starting
>>> 2016-09-12 00:47:15,370 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting
>>> 2016-09-12 00:47:15,385 INFO com.datatorrent.stram.StreamingContainerParent: Container callback server listening at den00spj/10.196.38.196:55906
>>> 2016-09-12 00:47:15,440 INFO org.mortbay.log: Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
>>> 2016-09-12 00:47:15,507 INFO org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets.
>>> 2016-09-12 00:47:15,512 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.stram is not defined
>>> 2016-09-12 00:47:15,518 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
>>> 2016-09-12 00:47:15,520 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context stram
>>> 2016-09-12 00:47:15,520 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context static
>>> 2016-09-12 00:47:15,520 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context logs
>>> 2016-09-12 00:47:15,523 INFO org.apache.hadoop.http.HttpServer2: adding path spec: /stram/*
>>> 2016-09-12 00:47:15,523 INFO org.apache.hadoop.http.HttpServer2: adding path spec: /ws/*
>>> 2016-09-12 00:47:15,875 INFO org.apache.hadoop.yarn.webapp.WebApps: Registered webapp guice modules
>>> 2016-09-12 00:47:15,876 INFO org.apache.hadoop.http.HttpServer2: Jetty bound to port 21097
>>>
>>> 2016-09-12 00:47:16,147 ERROR com.datatorrent.stram.StreamingAppMasterService: Webapps failed to start. Ignoring for now:
>>> org.apache.hadoop.yarn.webapp.WebAppException: Error starting http server
>>>         at org.apache.hadoop.yarn.webapp.WebApps$Builder.start(WebApps.java:310)
>>>         at com.datatorrent.stram.StreamingAppMasterService.serviceStart(StreamingAppMasterService.java:616)
>>>         at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
>>>         at com.datatorrent.stram.StreamingAppMaster.main(StreamingAppMaster.java:103)
>>> Caused by: java.io.IOException: Unable to initialize WebAppContext
>>>         at org.apache.hadoop.http.HttpServer2.start(HttpServer2.java:879)
>>>         at org.apache.hadoop.yarn.webapp.WebApps$Builder.start(WebApps.java:306)
>>>         ... 3 more
>>> Caused by: com.sun.jersey.api.container.ContainerException: No WebApplication provider is present
>>>         at com.sun.jersey.spi.container.WebApplicationFactory.createWebApplication(WebApplicationFactory.java:69)
>>>         at com.sun.jersey.spi.container.servlet.ServletContainer.create(ServletContainer.java:391)
>>>         at com.sun.jersey.spi.container.servlet.ServletContainer$InternalWebComponent.create(ServletContainer.java:306)
>>>         at com.sun.jersey.spi.container.servlet.WebComponent.load(WebComponent.java:607)
>>>         at com.sun.jersey.spi.container.servlet.WebComponent.init(WebComponent.java:210)
>>>         at com.sun.jersey.spi.container.servlet.ServletContainer.init(ServletContainer.java:373)
>>>         at com.sun.jersey.spi.container.servlet.ServletContainer.init(ServletContainer.java:710)
>>>         at com.google.inject.servlet.FilterDefinition.init(FilterDefinition.java:114)
>>>         at com.google.inject.servlet.ManagedFilterPipeline.initPipeline(ManagedFilterPipeline.java:98)
>>>         at com.google.inject.servlet.GuiceFilter.init(GuiceFilter.java:172)
>>>         at org.mortbay.jetty.servlet.FilterHolder.doStart(FilterHolder.java:97)
>>>         at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:50)
>>>         at org.mortbay.jetty.servlet.ServletHandler.initialize(ServletHandler.java:713)
>>>         at org.mortbay.jetty.servlet.Context.startContext(Context.java:140)
>>>         at org.mortbay.jetty.webapp.WebAppContext.startContext(WebAppContext.java:1282)
>>>         at org.mortbay.jetty.handler.ContextHandler.doStart(ContextHandler.java:518)
>>>         at org.mortbay.jetty.webapp.WebAppContext.doStart(WebAppContext.java:499)
>>>         at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:50)
>>>         at org.mortbay.jetty.handler.HandlerCollection.doStart(HandlerCollection.java:152)
>>>         at org.mortbay.jetty.handler.ContextHandlerCollection.doStart(ContextHandlerCollection.java:156)
>>>         at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:50)
>>>         at org.mortbay.jetty.handler.HandlerWrapper.doStart(HandlerWrapper.java:130)
>>>         at org.mortbay.jetty.Server.doStart(Server.java:224)
>>>         at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:50)
>>>         at org.apache.hadoop.http.HttpServer2.start(HttpServer2.java:857)
>>>
>>> Apart from this error, I don't see any other errors.
>>>
>>> Please advise.
>>>
>>> --
>>> Thanks
>>> Sanal
>>>
>>
>>
>
>
> --
> Sanal Vasudevan Nair
>
