It doesn't work for me so far.
   I ran the command but got the output below. What should I check to fix the issue?
Are there any configuration parameters I should look at? (I pasted my guess at the relevant yarn-site.xml settings below the output.)


[root@sdo-hdp-bd-master1 ~]# yarn logs -applicationId application_1426424283508_0048
15/05/21 13:25:09 INFO impl.TimelineClientImpl: Timeline service address:
http://hdp-bd-node1.development.c4i:8188/ws/v1/timeline/
15/05/21 13:25:09 INFO client.RMProxy: Connecting to ResourceManager at
hdp-bd-node1.development.c4i/12.23.45.253:8050
/app-logs/root/logs/application_1426424283508_0048 does not exist.
*Log aggregation has not completed or is not enabled.*
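
In case it helps, here is what I assume the relevant yarn-site.xml settings should look like. The property names are the standard YARN ones; the /app-logs value is only my guess taken from the error message above, not what is actually configured on the cluster:

    <!-- turn on aggregation of container logs into HDFS after the application finishes -->
    <property>
      <name>yarn.log-aggregation-enable</name>
      <value>true</value>
    </property>

    <!-- HDFS directory the aggregated logs are written to; the error above points to /app-logs -->
    <property>
      <name>yarn.nodemanager.remote-app-log-dir</name>
      <value>/app-logs</value>
    </property>

My understanding is that the NodeManagers need a restart after changing these, and that logs only show up under /app-logs once the application has finished. Is that right?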

Thanks
Oleg.

On Wed, May 20, 2015 at 11:33 PM, Ruslan Dautkhanov <dautkha...@gmail.com>
wrote:

> Oleg,
>
> You can see the applicationId in your Spark History Server.
> Go to http://historyserver:18088/
>
> Also check
> https://spark.apache.org/docs/1.1.0/running-on-yarn.html#debugging-your-application
>
> It should be no different with PySpark.
>
>
> --
> Ruslan Dautkhanov
>
> On Wed, May 20, 2015 at 2:12 PM, Oleg Ruchovets <oruchov...@gmail.com>
> wrote:
>
>> Hi Ruslan.
>>   Could you add more details, please?
>> Where do I get the applicationId? In case I have a lot of log files, would
>> it make sense to view them from a single point?
>> How can I actually configure / manage the log location for PySpark?
>>
>> Thanks
>> Oleg.
>>
>> On Wed, May 20, 2015 at 10:24 PM, Ruslan Dautkhanov
>> <dautkha...@gmail.com> wrote:
>>
>>> You could use
>>>
>>> yarn logs -applicationId application_1383601692319_0008
>>>
>>>
>>>
>>> --
>>> Ruslan Dautkhanov
>>>
>>> On Wed, May 20, 2015 at 5:37 AM, Oleg Ruchovets <oruchov...@gmail.com>
>>> wrote:
>>>
>>>> Hi,
>>>>
>>>>   I am executing a PySpark job on YARN (Hortonworks distribution).
>>>>
>>>> Could someone point me to where the logs are located?
>>>>
>>>> Thanks
>>>> Oleg.
>>>>
>>>
>>>
>>
>
