I did a fresh Hadoop setup using the CDH 5.5 QuickStart VM and launched the
SortedWordCount Apex application. The application no longer fails; it now
reaches the RUNNING state.
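
For completeness, the launch itself was done from the Apex CLI roughly as
follows (the .apa path is just my local build output, so take it as an
example rather than the exact command):

apex> launch target/myapexapp-1.0-SNAPSHOT.apa
# if the .apa packages more than one application, the CLI asks which one
# to start; I picked SortedWordCount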

Apex CLI 3.5.0-SNAPSHOT 20.09.2016 @ 05:49:21 UTC rev: fdfba96 branch:
master
apex> list-apps
{"apps": [{
  "startTime": "2016-09-20 08:43:13 +0000",
  "id": 1,
  "name": "SortedWordCount",
  "state": "RUNNING",
  "trackingUrl":
"http:\/\/quickstart.cloudera:8088\/proxy\/application_1474359035680_0001\/",
  "finalStatus": "UNDEFINED"
}]}
1 active, total 1 applications.
apex>


I copied an input file into the input directory:
[root@quickstart myapexapp]#  hdfs dfs -ls /tmp/test/input-dir
Found 1 items
-rw-r--r--   1 root supergroup        208 2016-09-20 09:02
/tmp/test/input-dir/somefile.txt
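
For reference, the input was staged with plain HDFS commands along these
lines (a sketch; I am not sure whether the output directory needs to exist
up front):

hdfs dfs -mkdir -p /tmp/test/input-dir
hdfs dfs -put somefile.txt /tmp/test/input-dir/
hdfs dfs -ls /tmp/test/output-dir    # this is how I check for output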

However, I don't see any files written to the output directory /tmp/test/output-dir.

I don't see any errors. The dt.log file ends with the following:
2016-09-20 09:07:40,505 INFO
com.datatorrent.stram.StreamingAppMasterService:  Removing container
request: [Capability[<memory:128, vCores:1>]Priority[4],
Capability[<memory:256, vCores:1>]Priority[3], Capability[<memory:384,
vCores:1>]Priority[2]]
2016-09-20 09:07:40,505 INFO
com.datatorrent.stram.StreamingAppMasterService: Removed container:
Capability[<memory:128, vCores:1>]Priority[4]
2016-09-20 09:07:40,505 INFO
com.datatorrent.stram.StreamingAppMasterService: Removed container:
Capability[<memory:256, vCores:1>]Priority[3]
2016-09-20 09:07:40,505 INFO
com.datatorrent.stram.StreamingAppMasterService: Removed container:
Capability[<memory:384, vCores:1>]Priority[2]
2016-09-20 09:07:41,512 INFO com.datatorrent.stram.ResourceRequestHandler:
Strict anti-affinity = [] for container with operators
PTOperator[id=5,name=wcWriter]
2016-09-20 09:07:41,512 INFO com.datatorrent.stram.ResourceRequestHandler:
Found host null
2016-09-20 09:07:41,512 INFO com.datatorrent.stram.ResourceRequestHandler:
Strict anti-affinity = [] for container with operators
PTOperator[id=2,name=wordReader]
2016-09-20 09:07:41,512 INFO com.datatorrent.stram.ResourceRequestHandler:
Found host null
2016-09-20 09:07:41,512 INFO com.datatorrent.stram.ResourceRequestHandler:
Strict anti-affinity = [] for container with operators
PTOperator[id=1,name=lineReader]
2016-09-20 09:07:41,512 INFO com.datatorrent.stram.ResourceRequestHandler:
Found host null
2016-09-20 09:07:41,512 INFO
com.datatorrent.stram.StreamingAppMasterService:  Removing container
request: [Capability[<memory:128, vCores:1>]Priority[4],
Capability[<memory:256, vCores:1>]Priority[3], Capability[<memory:384,
vCores:1>]Priority[2]]
2016-09-20 09:07:41,512 INFO
com.datatorrent.stram.StreamingAppMasterService: Removed container:
Capability[<memory:128, vCores:1>]Priority[4]
2016-09-20 09:07:41,512 INFO
com.datatorrent.stram.StreamingAppMasterService: Removed container:
Capability[<memory:256, vCores:1>]Priority[3]
2016-09-20 09:07:41,512 INFO
com.datatorrent.stram.StreamingAppMasterService: Removed container:
Capability[<memory:384, vCores:1>]Priority[2]
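
The container requests for the three operators keep being re-issued ("Found
host null") and then removed, so my reading is that those containers are
still waiting to be allocated. To get more detail I am poking at it roughly
like this (a sketch; the application id is the one shown by list-apps above):

# from the Apex CLI: connect to the running app and list its containers
apex> connect application_1474359035680_0001
apex> list-containers

# from the shell: check what the single NodeManager is reporting
yarn node -list

# aggregated container logs (available after the app finishes,
# if log aggregation is enabled)
yarn logs -applicationId application_1474359035680_0001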


Please advise.


Thanks,
Sanal


On Mon, Sep 19, 2016 at 10:37 PM, Munagala Ramanath <r...@datatorrent.com>
wrote:

> Wondering if there is something odd about your Hadoop setup since I'm able
> to run it OK
> with your pom file.
>
> Can you try generating a new project from the maven archetype and running
> the default
> application (random number generation) ?
>
> Ram
>
> On Mon, Sep 19, 2016 at 1:24 AM, Sanal Vasudevan <get2sa...@gmail.com>
> wrote:
>
>> No luck yet.
>> I get the same exception after changing the java version to 1.7.
>>
>>
>> 2016-09-19 01:21:26,621 INFO org.apache.hadoop.http.HttpServer2: adding path 
>> spec: /stram/*
>> 2016-09-19 01:21:26,621 INFO org.apache.hadoop.http.HttpServer2: adding path 
>> spec: /ws/*
>> 2016-09-19 01:21:26,920 INFO org.apache.hadoop.yarn.webapp.WebApps: 
>> Registered webapp guice modules
>> 2016-09-19 01:21:26,921 INFO org.apache.hadoop.http.HttpServer2: Jetty bound 
>> to port 20861
>> 2016-09-19 01:21:27,255 ERROR 
>> com.datatorrent.stram.StreamingAppMasterService: Webapps failed to start. 
>> Ignoring for now:
>> org.apache.hadoop.yarn.webapp.WebAppException: Error starting http server
>>      at org.apache.hadoop.yarn.webapp.WebApps$Builder.start(WebApps.java:310)
>>      at 
>> com.datatorrent.stram.StreamingAppMasterService.serviceStart(StreamingAppMasterService.java:616)
>>      at 
>> org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
>>      at 
>> com.datatorrent.stram.StreamingAppMaster.main(StreamingAppMaster.java:103)
>> Caused by: java.io.IOException: Unable to initialize WebAppContext
>>      at org.apache.hadoop.http.HttpServer2.start(HttpServer2.java:879)
>>      at org.apache.hadoop.yarn.webapp.WebApps$Builder.start(WebApps.java:306)
>>      ... 3 more
>> Caused by: com.sun.jersey.api.container.ContainerException: No 
>> WebApplication provider is present
>>      at 
>> com.sun.jersey.spi.container.WebApplicationFactory.createWebApplication(WebApplicationFactory.java:69)
>>      at 
>> com.sun.jersey.spi.container.servlet.ServletContainer.create(ServletContainer.java:392)
>>      at 
>> com.sun.jersey.spi.container.servlet.ServletContainer$InternalWebComponent.create(ServletContainer.java:307)
>>      at 
>> com.sun.jersey.spi.container.servlet.WebComponent.load(WebComponent.java:607)
>>      at 
>> com.sun.jersey.spi.container.servlet.WebComponent.init(WebComponent.java:210)
>>      at 
>> com.sun.jersey.spi.container.servlet.ServletContainer.init(ServletContainer.java:374)
>>      at 
>> com.sun.jersey.spi.container.servlet.ServletContainer.init(ServletContainer.java:727)
>>      at 
>> com.google.inject.servlet.FilterDefinition.init(FilterDefinition.java:114)
>>      at 
>> com.google.inject.servlet.ManagedFilterPipeline.initPipeline(ManagedFilterPipeline.java:98)
>>      at com.google.inject.servlet.GuiceFilter.init(GuiceFilter.java:172)
>>      at org.mortbay.jetty.servlet.FilterHolder.doStart(FilterHolder.java:97)
>>      at 
>> org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:50)
>>      at 
>> org.mortbay.jetty.servlet.ServletHandler.initialize(ServletHandler.java:713)
>>      at org.mortbay.jetty.servlet.Context.startContext(Context.java:140)
>>      at 
>> org.mortbay.jetty.webapp.WebAppContext.startContext(WebAppContext.java:1282)
>>      at 
>> org.mortbay.jetty.handler.ContextHandler.doStart(ContextHandler.java:518)
>>      at 
>> org.mortbay.jetty.webapp.WebAppContext.doStart(WebAppContext.java:499)
>>      at 
>> org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:50)
>>      at 
>> org.mortbay.jetty.handler.HandlerCollection.doStart(HandlerCollection.java:152)
>>      at 
>> org.mortbay.jetty.handler.ContextHandlerCollection.doStart(ContextHandlerCollection.java:156)
>>      at 
>> org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:50)
>>      at 
>> org.mortbay.jetty.handler.HandlerWrapper.doStart(HandlerWrapper.java:130)
>>      at org.mortbay.jetty.Server.doStart(Server.java:224)
>>      at 
>> org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:50)
>>      at org.apache.hadoop.http.HttpServer2.start(HttpServer2.java:857)
>>      ... 4 more
>> 2016-09-19 01:21:27,281 INFO 
>> com.datatorrent.stram.StreamingAppMasterService: Starting ApplicationMaster
>> 2016-09-19 01:21:27,281 INFO 
>> com.datatorrent.stram.StreamingAppMasterService: number of tokens: 1
>> 2016-09-19 01:21:27,342 INFO 
>> com.datatorrent.stram.StreamingAppMasterService: Max mem 8192m, Min mem 
>> 1024m, Max vcores 32 and Min vcores 1 capabililty of resources in this 
>> cluster
>> 2016-09-19 01:21:27,343 INFO 
>> com.datatorrent.stram.StreamingAppMasterService: Blacklist removal time in 
>> millis = 3600000, max consecutive node failure count = 2147483647
>> 2016-09-19 01:21:27,348 INFO org.apache.hadoop.yarn.client.RMProxy: 
>> Connecting to ResourceManager at /0.0.0.0:8032
>>
>> Thanks for any help.
>>
>> Best regards
>>
>> Sanal
>>
>>
>>
>> On Mon, Sep 19, 2016 at 6:03 PM, Sanal Vasudevan <get2sa...@gmail.com>
>> wrote:
>>
>>> Thank you Ram. I will try it out.
>>>
>>> Best regards,
>>> Sanal
>>>
>>> On Mon, Sep 19, 2016 at 9:50 AM, Munagala Ramanath <r...@datatorrent.com>
>>> wrote:
>>>
>>>> *           <source>1.8</source>*
>>>> *           <target>1.8</target>*
>>>> Could you change the source and target java versions to 1.7 in the
>>>> above lines of your
>>>> pom.xml and try again ?
>>>>
>>>> With this change I was able to run the application built with your
>>>> pom.xml.
>>>>
>>>> Ram
>>>>
>>>> On Mon, Sep 12, 2016 at 8:48 PM, Sanal Vasudevan <get2sa...@gmail.com>
>>>> wrote:
>>>>
>>>>> Hi,
>>>>>
>>>>> Please find attached the pom.xml file.
>>>>> I added the following to resolve build issue.
>>>>>
>>>>> <dependency>
>>>>>   <groupId>org.codehaus.jettison</groupId>
>>>>>   <artifactId>jettison</artifactId>
>>>>>   <version>1.3.8</version>
>>>>> </dependency>
>>>>>
>>>>> It is a single-node setup.
>>>>> Hadoop version: 2.7.2, rb165c4fe8a74265c792ce23f546c64604acf0e41
>>>>> Yes, I have used the apex command to launch the .apa file.
>>>>>
>>>>>
>>>>> Thanks
>>>>> Sanal
>>>>>
>>>>> On Tue, Sep 13, 2016 at 1:29 AM, Munagala Ramanath <
>>>>> r...@datatorrent.com> wrote:
>>>>>
>>>>>> Looks like this happens if jersey-server is not present
>>>>>> (e.g. http://stackoverflow.com/questions/8662919/jersey-no-w
>>>>>> ebapplication-provider-is-present-when-jersey-json-dependency-added)
>>>>>>
>>>>>> Have you made any changes to the pom.xml ? If so, can you post it
>>>>>> here ?
>>>>>>
>>>>>> Also, can you tell us a bit more about your deployment environment ?
>>>>>> Is it a single or multi-node cluster ?
>>>>>> What version of Hadoop ? Are you using the "apex" command line tool
>>>>>> to deploy ?
>>>>>>
>>>>>> Ram
>>>>>>
>>>>>> On Mon, Sep 12, 2016 at 1:29 AM, Sanal Vasudevan <get2sa...@gmail.com
>>>>>> > wrote:
>>>>>>
>>>>>>> Hi there,
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> I am unable to get started with Apache Apex.
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> I used the below mvn command to create the project.
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> #!/bin/bash
>>>>>>>
>>>>>>> # script to create a new project
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> # change project name and archetype version as needed
>>>>>>>
>>>>>>> name=myapexapp
>>>>>>>
>>>>>>> version=3.4.0
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> mvn -B archetype:generate \
>>>>>>>
>>>>>>>   -DarchetypeGroupId=org.apache.apex \
>>>>>>>
>>>>>>>   -DarchetypeArtifactId=apex-app-archetype \
>>>>>>>
>>>>>>>   -DarchetypeVersion=$version  \
>>>>>>>
>>>>>>>   -DgroupId=com.example \
>>>>>>>
>>>>>>>   -Dpackage=com.example.$name \
>>>>>>>
>>>>>>>   -DartifactId=$name \
>>>>>>>
>>>>>>>   -Dversion=1.0-SNAPSHOT
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> I am trying to execute the sample SortedWordCount application
>>>>>>> example.
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> I see that the status of yarn application is FAILED.
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> Below error in the log file:
>>>>>>>
>>>>>>> 2016-09-12 00:47:12,284 INFO com.datatorrent.stram.StreamingAppMaster: 
>>>>>>> Initializing Application Master.
>>>>>>> 2016-09-12 00:47:12,385 INFO 
>>>>>>> com.datatorrent.stram.StreamingAppMasterService: Application master, 
>>>>>>> appId=3, clustertimestamp=1473662193222, attemptId=1
>>>>>>> 2016-09-12 00:47:13,432 INFO 
>>>>>>> com.datatorrent.common.util.AsyncFSStorageAgent: using 
>>>>>>> /scratch/sanav/view_storage/projects/downloads/hadoop/tmp/hadoop-sanav/nm-local-dir/usercache/sanav/appcache/application_1473662193222_0003/container_1473662193222_0003_01_000001/tmp/chkp6399702117805507910
>>>>>>>  as the basepath for checkpointing.
>>>>>>> 2016-09-12 00:47:15,223 INFO com.datatorrent.stram.FSRecoveryHandler: 
>>>>>>> Creating 
>>>>>>> hdfs://den00spj:9000/user/sanav/datatorrent/apps/application_1473662193222_0003/recovery/log
>>>>>>> 2016-09-12 00:47:15,272 INFO 
>>>>>>> com.datatorrent.stram.StreamingAppMasterService: Starting application 
>>>>>>> with 5 operators in 5 containers
>>>>>>> 2016-09-12 00:47:15,286 INFO 
>>>>>>> org.apache.hadoop.yarn.client.api.async.impl.NMClientAsyncImpl: Upper 
>>>>>>> bound of the thread pool size is 500
>>>>>>> 2016-09-12 00:47:15,288 INFO 
>>>>>>> org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy:
>>>>>>>  yarn.client.max-cached-nodemanagers-proxies : 0
>>>>>>> 2016-09-12 00:47:15,316 INFO org.apache.hadoop.yarn.client.RMProxy: 
>>>>>>> Connecting to ResourceManager at /0.0.0.0:8030
>>>>>>> 2016-09-12 00:47:15,345 INFO 
>>>>>>> com.datatorrent.stram.StreamingContainerParent: Config: Configuration: 
>>>>>>> core-default.xml, core-site.xml, yarn-default.xml, yarn-site.xml, 
>>>>>>> hdfs-default.xml, hdfs-site.xml
>>>>>>> 2016-09-12 00:47:15,345 INFO 
>>>>>>> com.datatorrent.stram.StreamingContainerParent: Listener thread count 30
>>>>>>> 2016-09-12 00:47:15,351 INFO org.apache.hadoop.ipc.CallQueueManager: 
>>>>>>> Using callQueue class java.util.concurrent.LinkedBlockingQueue
>>>>>>> 2016-09-12 00:47:15,356 INFO org.apache.hadoop.ipc.Server: Starting 
>>>>>>> Socket Reader #1 for port 55906
>>>>>>> 2016-09-12 00:47:15,369 INFO org.apache.hadoop.ipc.Server: IPC Server 
>>>>>>> listener on 55906: starting
>>>>>>> 2016-09-12 00:47:15,370 INFO org.apache.hadoop.ipc.Server: IPC Server 
>>>>>>> Responder: starting
>>>>>>> 2016-09-12 00:47:15,385 INFO 
>>>>>>> com.datatorrent.stram.StreamingContainerParent: Container callback 
>>>>>>> server listening at den00spj/10.196.38.196:55906
>>>>>>> 2016-09-12 00:47:15,440 INFO org.mortbay.log: Logging to 
>>>>>>> org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via 
>>>>>>> org.mortbay.log.Slf4jLog
>>>>>>> 2016-09-12 00:47:15,507 INFO 
>>>>>>> org.apache.hadoop.security.authentication.server.AuthenticationFilter: 
>>>>>>> Unable to initialize FileSignerSecretProvider, falling back to use 
>>>>>>> random secrets.
>>>>>>> 2016-09-12 00:47:15,512 INFO org.apache.hadoop.http.HttpRequestLog: 
>>>>>>> Http request log for http.requests.stram is not defined
>>>>>>> 2016-09-12 00:47:15,518 INFO org.apache.hadoop.http.HttpServer2: Added 
>>>>>>> global filter 'safety' 
>>>>>>> (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
>>>>>>> 2016-09-12 00:47:15,520 INFO org.apache.hadoop.http.HttpServer2: Added 
>>>>>>> filter static_user_filter 
>>>>>>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) 
>>>>>>> to context stram
>>>>>>> 2016-09-12 00:47:15,520 INFO org.apache.hadoop.http.HttpServer2: Added 
>>>>>>> filter static_user_filter 
>>>>>>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) 
>>>>>>> to context static
>>>>>>> 2016-09-12 00:47:15,520 INFO org.apache.hadoop.http.HttpServer2: Added 
>>>>>>> filter static_user_filter 
>>>>>>> (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) 
>>>>>>> to context logs
>>>>>>> 2016-09-12 00:47:15,523 INFO org.apache.hadoop.http.HttpServer2: adding 
>>>>>>> path spec: /stram/*
>>>>>>> 2016-09-12 00:47:15,523 INFO org.apache.hadoop.http.HttpServer2: adding 
>>>>>>> path spec: /ws/*
>>>>>>> 2016-09-12 00:47:15,875 INFO org.apache.hadoop.yarn.webapp.WebApps: 
>>>>>>> Registered webapp guice modules
>>>>>>> 2016-09-12 00:47:15,876 INFO org.apache.hadoop.http.HttpServer2: Jetty 
>>>>>>> bound to port 21097
>>>>>>>
>>>>>>> 2016-09-12 00:47:16,147 ERROR 
>>>>>>> com.datatorrent.stram.StreamingAppMasterService:
>>>>>>> Webapps failed to start. Ignoring for now:
>>>>>>>
>>>>>>> org.apache.hadoop.yarn.webapp.WebAppException: Error starting http
>>>>>>> server
>>>>>>>
>>>>>>>         at org.apache.hadoop.yarn.webapp.
>>>>>>> WebApps$Builder.start(WebApps.java:310)
>>>>>>>
>>>>>>>         at com.datatorrent.stram.Streamin
>>>>>>> gAppMasterService.serviceStart(StreamingAppMasterService.java:616)
>>>>>>>
>>>>>>>         at org.apache.hadoop.service.Abst
>>>>>>> ractService.start(AbstractService.java:193)
>>>>>>>
>>>>>>>         at com.datatorrent.stram.Streamin
>>>>>>> gAppMaster.main(StreamingAppMaster.java:103)
>>>>>>>
>>>>>>> Caused by: java.io.IOException: Unable to initialize WebAppContext
>>>>>>>
>>>>>>>         at org.apache.hadoop.http.HttpSer
>>>>>>> ver2.start(HttpServer2.java:879)
>>>>>>>
>>>>>>>         at org.apache.hadoop.yarn.webapp.
>>>>>>> WebApps$Builder.start(WebApps.java:306)
>>>>>>>
>>>>>>>         ... 3 more
>>>>>>>
>>>>>>> Caused by: com.sun.jersey.api.container.ContainerException: No
>>>>>>> WebApplication provider is present
>>>>>>>
>>>>>>>         at com.sun.jersey.spi.container.W
>>>>>>> ebApplicationFactory.createWebApplication(WebApplicationFact
>>>>>>> ory.java:69)
>>>>>>>
>>>>>>>         at com.sun.jersey.spi.container.s
>>>>>>> ervlet.ServletContainer.create(ServletContainer.java:391)
>>>>>>>
>>>>>>>         at com.sun.jersey.spi.container.s
>>>>>>> ervlet.ServletContainer$InternalWebComponent.create(ServletC
>>>>>>> ontainer.java:306)
>>>>>>>
>>>>>>>         at com.sun.jersey.spi.container.s
>>>>>>> ervlet.WebComponent.load(WebComponent.java:607)
>>>>>>>
>>>>>>>         at com.sun.jersey.spi.container.s
>>>>>>> ervlet.WebComponent.init(WebComponent.java:210)
>>>>>>>
>>>>>>>         at com.sun.jersey.spi.container.s
>>>>>>> ervlet.ServletContainer.init(ServletContainer.java:373)
>>>>>>>
>>>>>>>         at com.sun.jersey.spi.container.s
>>>>>>> ervlet.ServletContainer.init(ServletContainer.java:710)
>>>>>>>
>>>>>>>         at com.google.inject.servlet.Filt
>>>>>>> erDefinition.init(FilterDefinition.java:114)
>>>>>>>
>>>>>>>         at com.google.inject.servlet.Mana
>>>>>>> gedFilterPipeline.initPipeline(ManagedFilterPipeline.java:98)
>>>>>>>
>>>>>>>         at com.google.inject.servlet.Guic
>>>>>>> eFilter.init(GuiceFilter.java:172)
>>>>>>>
>>>>>>>         at org.mortbay.jetty.servlet.Filt
>>>>>>> erHolder.doStart(FilterHolder.java:97)
>>>>>>>
>>>>>>>         at org.mortbay.component.Abstract
>>>>>>> LifeCycle.start(AbstractLifeCycle.java:50)
>>>>>>>
>>>>>>>         at org.mortbay.jetty.servlet.Serv
>>>>>>> letHandler.initialize(ServletHandler.java:713)
>>>>>>>
>>>>>>>         at org.mortbay.jetty.servlet.Cont
>>>>>>> ext.startContext(Context.java:140)
>>>>>>>
>>>>>>>         at org.mortbay.jetty.webapp.WebAp
>>>>>>> pContext.startContext(WebAppContext.java:1282)
>>>>>>>
>>>>>>>         at org.mortbay.jetty.handler.Cont
>>>>>>> extHandler.doStart(ContextHandler.java:518)
>>>>>>>
>>>>>>>         at org.mortbay.jetty.webapp.WebAp
>>>>>>> pContext.doStart(WebAppContext.java:499)
>>>>>>>
>>>>>>>         at org.mortbay.component.Abstract
>>>>>>> LifeCycle.start(AbstractLifeCycle.java:50)
>>>>>>>
>>>>>>>         at org.mortbay.jetty.handler.Hand
>>>>>>> lerCollection.doStart(HandlerCollection.java:152)
>>>>>>>
>>>>>>>         at org.mortbay.jetty.handler.Cont
>>>>>>> extHandlerCollection.doStart(ContextHandlerCollection.java:156)
>>>>>>>
>>>>>>>         at org.mortbay.component.Abstract
>>>>>>> LifeCycle.start(AbstractLifeCycle.java:50)
>>>>>>>
>>>>>>>         at org.mortbay.jetty.handler.Hand
>>>>>>> lerWrapper.doStart(HandlerWrapper.java:130)
>>>>>>>
>>>>>>>         at org.mortbay.jetty.Server.doStart(Server.java:224)
>>>>>>>
>>>>>>>         at org.mortbay.component.Abstract
>>>>>>> LifeCycle.start(AbstractLifeCycle.java:50)
>>>>>>>
>>>>>>>         at org.apache.hadoop.http.HttpSer
>>>>>>> ver2.start(HttpServer2.java:857)
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> Apart from this error I don’t see any other errors.
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> Please advise.
>>>>>>>
>>>>>>>
>>>>>>> --
>>>>>>> Thanks
>>>>>>> Sanal
>>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>>
>>>>> --
>>>>> Sanal Vasudevan Nair
>>>>>
>>>>
>>>>
>>>
>>>
>>> --
>>> Sanal Vasudevan Nair
>>>
>>
>>
>>
>> --
>> Sanal Vasudevan Nair
>>
>
>


-- 
Sanal Vasudevan Nair
