Hey Karthick,
The best way to deepen your understanding is by using the Spark Web UI as
much as possible while learning the fundamentals of Spark.
To help ease the learning curve, I recommend trying an open-source project
called *Dataflint*. It adds an extra tab to the Spark Web UI and presents
the same information in a more digestible, performance-focused view.
Hi All,
I am looking to deepen my understanding of the Spark Web UI. Could anyone
recommend some useful materials, online courses, or share how you learned
about it? I've already reviewed the official Spark Web UI documentation,
but it only covers the basics.
Note: I am using Azure Databricks.
The feature was added in Spark 3.0. By the way, you may want to check the EOL
dates for Apache Spark releases (https://endoflife.date/apache-spark); 2.x is
already EOL.
On Fri, Nov 24, 2023 at 11:13 PM mallesh j
wrote:
> Hi Team,
>
> I am trying to test the performance of a Spark Streaming application
As outlined at https://issues.apache.org/jira/browse/SPARK-38693 and
https://stackoverflow.com/q/71667296/7954504, we are attempting to integrate
Keycloak (https://www.keycloak.org/docs/latest/securing_apps/#_servlet_filter_adapter)
Single Sign-On with the Spark Web UI.
However, Spark
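For what it's worth, the supported hook for injecting a servlet filter into
the UI is the spark.ui.filters setting; a minimal sketch, assuming Keycloak's
servlet filter adapter is on the driver classpath (untested):
spark.ui.filters org.keycloak.adapters.servlet.KeycloakOIDCFilter
Filter init parameters can then be supplied as
spark.<filter class name>.param.<param name>=<value> entries, per the Spark
configuration docs.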
When I kill an application in the Web UI (which I submitted in standalone
client mode), it appears to be killed; but when I use the 'jps' command I can
still see the application running in the background. This is my demo code to
reproduce the problem.
And,
if I kill the application on w
Thanks Manu for your response.
I already checked the logs and didn't see anything that could help me
understand the issue.
The weirder thing is that I have a small CI cluster which runs on a single
NameNode, and there I do see the Spark2 jobs in the UI. I'm still not sure
if it may be related to the NameNode HA.
Hi Fawze,
Sorry but I'm not familiar with CM. Maybe you can look into the logs (or
turn on DEBUG log).
On Thu, Aug 16, 2018 at 3:05 PM Fawze Abujaber wrote:
> Hi Manu,
>
> I'm using cloudera manager with single user mode and every process is
> running with cloudera-scm user, the cloudera-scm is
Hi Manu,
I'm using Cloudera Manager with single user mode, and every process is
running as the cloudera-scm user. cloudera-scm is a superuser, which is why
I was confused that it worked in Spark 1.6 and not in Spark 2.3.
On Thu, Aug 16, 2018 at 5:34 AM Manu Zhang wrote:
> If you are able to
If you are able to log onto the node where UI has been launched, then try
`ps -aux | grep HistoryServer` and the first column of output should be the
user.
On Wed, Aug 15, 2018 at 10:26 PM Fawze Abujaber wrote:
> Thanks Manu, Do you know how i can see which user the UI is running,
> because i'm
Thanks Manu. Do you know how I can see which user the UI is running as?
I'm using Cloudera Manager and I created a user for it called spark, but
this didn't solve my issue, and here I'm trying to find out the user for
the Spark history UI.
On Wed, Aug 15, 2018 at 5:11 PM Ma
Hi Fawze,
A) The file permission is currently hard coded to 770 (
https://github.com/apache/spark/blob/branch-2.3/core/src/main/scala/org/apache/spark/scheduler/EventLoggingListener.scala#L287
).
B) I think adding all users (including the one running the UI) to the group, e.g. spark, will do.
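For example, on the host running the HistoryServer, something like this
(a sketch; the group and user names are assumptions, adjust to your setup):
usermod -a -G spark cloudera-scm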
On Wed, Aug 15, 2018 at 6:3
Hi Manu,
Thanks for your response.
Yes, I see, but it would still be interesting to know how I can see these
applications from the Spark history UI, and how I can know which user I'm
logged in as when I'm navigating the Spark history UI.
The Spark process is running as cloudera-scm and the events are written
Hi Fawze,
In Spark 2.3, the HistoryServer checks file permissions when reading
event logs written by your applications (please check
https://issues.apache.org/jira/browse/SPARK-20172). With file permissions
of 770, the HistoryServer is not permitted to read the event log. That's why
you were able to see completed applications in Spark 1.6 but not in Spark 2.3.
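A quick way to confirm is to compare the owner/group of the event logs with
the user running the HistoryServer, e.g. (the directory below is only an
illustration; use whatever spark.eventLog.dir points to):
hdfs dfs -ls /user/spark/spark2ApplicationHistory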
Hi Guys,
Any help here?
On Wed, Aug 8, 2018 at 7:56 AM Fawze Abujaber wrote:
> Hello Community,
>
> I'm using Spark 2.3 and Spark 1.6.0 in my cluster with Cloudera
> distribution 5.13.0.
>
> Both are configured to run on Yarn, but i'm unable to see completed
> application in Spark2 history serv
Hello Community,
I'm using Spark 2.3 and Spark 1.6.0 in my cluster with Cloudera
distribution 5.13.0.
Both are configured to run on YARN, but I'm unable to see completed
applications in the Spark2 history server, while in Spark 1.6.0 I could.
1) I checked the HDFS permissions for both folders and both
...keyStoreType JKS
spark.ui.https.enabled true
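For reference, the equivalent settings in the standard spark.ssl.* namespace
would be something like the following sketch (paths and passwords are
placeholders):
spark.ssl.enabled true
spark.ssl.keyStore /path/to/keystore.jks
spark.ssl.keyStorePassword <keystore-password>
spark.ssl.keyStoreType JKS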
Hopefully, I didn’t miss anything
Thanks,
Assaf
From: Saisai Shao [mailto:sai.sai.s...@gmail.com]
Sent: Monday, August 21, 2017 5:28 PM
To: Anshuman Kumar
Cc: spark users
Subject: Re: Spark Web UI SSL Encryption
Can you please post the specific problem
Hello,
I have recently installed Spark 2.2.0 and am trying to use it for some big data
processing. Spark is installed on a server that I access from a remote
computer. I need to set up SSL encryption for the Spark web UI, but following
some threads online I'm still not able to set it up.
Can someone help me with the SSL encryption?
Warm Regards.
Severity: Low
Vendor: The Apache Software Foundation
Versions Affected:
Versions of Apache Spark before 2.2.0
Description:
It is possible for an attacker to take advantage of a user's trust in the
server to trick them into visiting a link that points to a shared Spark
cluster and submits data in
Try selecting a particular Job instead of looking at the summary page for
all Jobs.
On Sat, Jan 28, 2017 at 4:25 PM, Md. Rezaul Karim <
rezaul.ka...@insight-centre.org> wrote:
> Hi Jacek,
>
> I tried accessing Spark web UI on both Firefox and Google Chrome browsers
> with ad
Hi Jacek,
I tried accessing the Spark web UI on both Firefox and Google Chrome browsers
with ad blocker enabled. I do see other options like *User*, *Total Uptime*,
*Scheduling Mode*, *Active Jobs*, *Completed Jobs* and *Event Timeline*.
However, I don't see an option for DAG visualization.
Please note
Hi,
Wonder if you have any adblocker enabled in your browser? Is this the only
version giving you this behavior? All Spark jobs have no visualization?
Jacek
On 28 Jan 2017 7:03 p.m., "Md. Rezaul Karim" <
rezaul.ka...@insight-centre.org> wrote:
Hi All,
I am running a Spark job on my local machine written in Scala with Spark
2.1.0. However, I am not seeing any option of "*DAG Visualization*" at
http://localhost:4040/jobs/
Suggestion, please.
Regards,
_
*Md. Rezaul Karim*, BSc, MSc
PhD Researcher, INSI
To configure the Spark master web UI port, you can set the env variable
SPARK_MASTER_WEBUI_PORT.
You can run netstat -nao | grep 4040 to check whether port 4040 is in use.
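For example, in conf/spark-env.sh (a sketch):
export SPARK_MASTER_WEBUI_PORT=8080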
———
I am not sure why Spark web UI keeps changing its port every time I restart a
cluster? how can I make it run always on one port? I did make
No. Each app has its own UI which runs (starting on) port 4040.
On Mon, Jan 23, 2017 at 12:05 PM, kant kodali wrote:
> I am using standalone mode so wouldn't be 8080 for my app web ui as well?
> There is nothing running on 4040 in my cluster.
>
> http://spark.apache.org/docs/lat
I am using standalone mode, so wouldn't it be 8080 for my app web UI as well?
There is nothing running on 4040 in my cluster.
http://spark.apache.org/docs/latest/security.html#standalone-mode-only
On Mon, Jan 23, 2017 at 11:51 AM, Marcelo Vanzin
wrote:
> That's the Master, whose de
That's the Master, whose default port is 8080 (not 4040). The default
port for the app's UI is 4040.
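If you want the application UI itself pinned to a fixed port, setting
spark.ui.port should do it, e.g. (a sketch):
spark-submit --conf spark.ui.port=4040 ...
Note that if the port is already taken, Spark will try 4041, 4042, and so on.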
On Mon, Jan 23, 2017 at 11:47 AM, kant kodali wrote:
> I am not sure why Spark web UI keeps changing its port every time I restart
> a cluster? how can I make it run always on
I am not sure why the Spark web UI keeps changing its port every time I restart
a cluster. How can I make it always run on one port? I did make sure there
is no process running on 4040 (Spark's default web UI port); however, it still
starts at 8080. Any ideas?
MasterWebUI: Bound MasterWebUI to 0.0.0.0
Hi,
A possible workaround: use a SparkListener and save the results to a custom
sink. After all, the web UI is a mere bag of SparkListeners + excellent
visualizations.
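A minimal sketch of such a listener (untested; it just prints the block
updates that the Storage tab visualizes, so swap println for your own sink):

import org.apache.spark.scheduler.{SparkListener, SparkListenerBlockUpdated}

class StorageHistoryListener extends SparkListener {
  // Called whenever a block (e.g. a cached RDD partition) is added,
  // updated or dropped on an executor.
  override def onBlockUpdated(event: SparkListenerBlockUpdated): Unit = {
    val info = event.blockUpdatedInfo
    println(s"${info.blockId} -> ${info.storageLevel} " +
      s"(mem=${info.memSize} B, disk=${info.diskSize} B) " +
      s"on ${info.blockManagerId.host}")
  }
}

// Register it on a live SparkContext:
// sc.addSparkListener(new StorageHistoryListener)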
Jacek
On 3 Jan 2017 4:14 p.m., "Joseph Naegele"
wrote:
Hi all,
Is there any way to observe Storage history in Spark,
Hi all,
Is there any way to observe Storage history in Spark, i.e. which RDDs were
cached and where, etc. after an application completes? It appears the Storage
tab in the History Server UI is useless.
Thanks
---
Joe Naegele
Grier Forensics
---
Hi,
I think the Spark UI will be accessible whenever you launch a Spark app in
the cluster; it should be the Application Tracker link.
Regards,
Natu
On Tue, Sep 13, 2016 at 9:37 AM, Divya Gehlot
wrote:
> Hi ,
> Thank you all..
> Hurray ...I am able to view the hadoop web UI now @ 80
Hi,
Thank you all.
Hurray... I am able to view the Hadoop web UI now @ 8088, and even the Spark
History server web UI @ 18080.
But I am unable to figure out the Spark UI web port...
Tried with 4044, 4040...
getting the below error:
This site can't be reached
How can I find out the Spark port?
Would really
Hi,
I am on EMR 4.7 with Spark 1.6.1 and Hadoop 2.7.2.
When I try to view any of the web UIs of the cluster, either Hadoop or
Spark, I am getting the below error:
"This site can't be reached"
Has anybody using EMR been able to view the web UI?
Could you please share the steps.
public DNS name
* the private ip in our example is 172.31.23.201
From: Jacek Laskowski
Date: Tuesday, July 26, 2016 at 6:38 AM
To: Jestin Ma
Cc: Chanh Le , "user @spark"
Subject: Re: Spark Web UI port 4040 not working
> Hi,
>
> Do you perhaps deploy using cluster mode?
> On Tue, Jul 26, 2016 at 12:39 AM, Chanh Le wrote:
>>
>> You're running in Standalone Mode?
>> Usually inside the active task it will show the address of the current job,
>> or you can check on the master node by using netstat -apn | grep 4040
Hi,
Go to 8080 and under Running Applications click the Application ID.
You're on the page with the Application Detail UI, just before the Executor
Summary table. Use it to access the web UI.
Regards,
Jacek Laskowski
https://medium.com/@jaceklaskowski/
Mastering Apache Spark http://b
Hello, when running Spark jobs, I can access the master UI (the port 8080 one)
no problem. However, I'm confused as to how to access the web UI to see
jobs/tasks/stages/etc.
I can access the master UI at http://:8080, but port 4040
gives me a "connection cannot be reached".
Is the web UI http:// with a port of 4040?
I created a cluster using the spark-1.6.1-bin-hadoop2.6/ec2/spark-ec2 script.
The output shows Ganglia started; however, I am not able to access
http://ec2-54-215-230-73.us-west-1.compute.amazonaws.com:5080/ganglia. I
have tried using the private IP from within my data center.
I do not see anything listening
explain it better. Usually executors
are allocated when the job is started; if you have a multi-node cluster
then you'll see executors launched on different nodes.
On Sat, Jun 18, 2016 at 9:04 PM, Jacek Laskowski
Hi,
This is for Spark on YARN - a 1-node cluster with Spark 2.0.0-SNAPSHOT
(today's build).
I can understand that when a stage fails, a new executor entry shows up
in the web UI under the Executors tab (that corresponds to a stage attempt). I
understand that this is to keep the stdout and stderr logs for future
reference.
Why are there m
Hi,
I'd like to have the other optional columns in Aggregated Metrics by
Executor table per stage in web UI. I can easily have Shuffle Read
Size / Records and Shuffle Write Size / Records columns.
scala> sc.parallelize(0 to 9).map((_,1)).groupBy(_._1).count
I can't seem to figure o
Hi all,
I have a Spark application running to which I submit jobs continuously.
These jobs use different instances of sqlContext, so the web UI of the
application starts to fill up more and more with these instances.
Is there any way to prevent this? I don't want to see every created SQL context
in the w
Just a quick question:
when using textFileStream, I did not see any events via the web UI.
Actually, I am uploading files to S3 every 5 seconds,
and the mini-batch duration is 30 seconds.
On the web UI:
*Input Rate*
Avg: 0.00 events/sec
But the schedule time and processing time are correct, and the
You may want to check out https://github.com/hammerlab/spree
On Tue, 15 Mar 2016 at 10:43 charles li wrote:
> every time I can only get the latest info by refreshing the page, that's a
> little boring.
>
> so is there any way to make the WEB UI auto-refreshing ?
&
Every time I can only get the latest info by refreshing the page; that's a
little boring.
So is there any way to make the web UI auto-refresh?
Great thanks
--
*--*
a spark lover, a quant, a developer and a good man.
http://github.com/litaotao
orm at ErrorStreaming2.scala:396, took 8.218500 s
(org.apache.spark.scheduler.DAGScheduler)
Stages in job 9816 are completed too according to the log,
but job 9816 is still among the active jobs in the web UI. Why?
How can I clear these remaining jobs?
rk in yarn-client mode, but every time I access the web
> ui, the browser redirect me to one of the worker nodes and shows nothing.
> The url looks like
> http://hadoop-node31.company.com:8088/proxy/application_1453797301246_120264
> .
>
>
>
> that should be acting as a proxy
Hi all,
I am running Spark in yarn-client mode, but every time I access the web UI,
the browser redirects me to one of the worker nodes and shows nothing. The
URL looks like
http://hadoop-node31.company.com:8088/proxy/application_1453797301246_120264
.
I googled a lot and found some possible
...2016-02-26 13:40 kMeans/part-00013
-rw-r--r-- 3 abrandon supergroup 2.9 G 2016-02-26 13:39 kMeans/part-00014
-rw-r--r-- 3 abrandon supergroup 2.9 G 2016-02-26 13:40 kMeans/part-00015
Hi,
Is there a way to provide minThreads and maxThreads for the
thread pool through jetty.xml for the Jetty instance that is used by the Spark
web UI?
I am hitting an issue very similar to the issue described in
http://lifelongprogrammer.blogspot.com/2014/10/jetty-insufficient-threads
16/02/19 03:07:32 WARN AbstractConnector: insufficient threads configured
for SelectChannelConnector@0.0.0.0:8080
16/02/19 03:07:32 INFO Utils: Successfully started service 'MasterUI' on
port 8080.
16/02/19 03:07:32 INFO MasterWebUI: Started MasterWebUI at
http://127.0.0.1:8080
16/02/19 03:07:32 WARN AbstractConnector: insufficient threads configured
for SelectChannelConnector@OAhtvJ5MCA:6066
16/02/19 03:07:32 INFO Utils: Successfully started service on port 6066.
leader! New state: ALIVE
--
Through netstat I can see that port 8080 is listening.
Now when I start firefox and access http://127.0.0.1:8080, firefox just
hangs with the mes
Hi,
try http://OAhtvJ5MCA:8080
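If the master is binding to an address your browser can't reach, you can also
pin the bind address in conf/spark-env.sh, e.g. (a sketch; the variable is
SPARK_MASTER_IP on older releases, SPARK_MASTER_HOST on newer ones):
export SPARK_MASTER_IP=<resolvable-hostname-or-ip>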
BR
On 2/19/16, 07:18, "vasbhat" wrote:
>OAhtvJ5MCA
...firefox just hangs with the message
Waiting for "127.0.0.1" and does not connect to the UI.
How do I enable debug for the Spark master daemon, to understand what's
happening.
Thanks
Vasanth
05 February 2016 17:09
> To: 'Ted Yu'
> Cc: user@spark.apache.org
> Subject: RE: Can't view executor logs in web UI on Windows
>
> We have created JIRA ticket
> https://issues.apache.org/jira/browse/SPARK-13142 and will submit a pull
> request next week.
>
>
I have submitted a pull request: https://github.com/apache/spark/pull/11135.
Mark
-Original Message-
From: Mark Pavey [mailto:mark.pa...@thefilter.com]
Sent: 05 February 2016 17:09
To: 'Ted Yu'
Cc: user@spark.apache.org
Subject: RE: Can't view executor logs in web UI
view executor logs in web UI on Windows
I did a brief search but didn't find relevant JIRA either.
You can create a JIRA and submit pull request for the fix.
Cheers
> On Feb 1, 2016, at 5:13 AM, Mark Pavey wrote:
>
> I am running Spark on Windows. When I try to view the Executor logs
There have been changes to the visibility of info in the UI between 1.4 and
1.5; I can't say off the top of my head at which point versions they took place.
On Thu, Feb 4, 2016 at 12:07 AM, vimal dinakaran
wrote:
> No I am using DSE 4.8 which has spark 1.4. Is this a known issue ?
>
> On Wed, Jan 27, 20
No, I am using DSE 4.8, which has Spark 1.4. Is this a known issue?
On Wed, Jan 27, 2016 at 11:52 PM, Cody Koeninger wrote:
> Have you tried spark 1.5?
>
> On Wed, Jan 27, 2016 at 11:14 AM, vimal dinakaran
> wrote:
>
>> Hi ,
>> I am using spark 1.4 with direct kafka api . In my streaming ui , I
backslash is created at line 71
> by calling the getPath method on a java.io.File object. Because it is
> running on Windows it uses the default Windows file separator, which is a
> backslash.
>
> I am using Spark 1.5.1 but the source code appears unchanged in 1.6.0.
>
> I have
Have you tried spark 1.5?
On Wed, Jan 27, 2016 at 11:14 AM, vimal dinakaran
wrote:
> Hi ,
> I am using spark 1.4 with direct kafka api . In my streaming ui , I am
> able to see the events listed in UI only if add stream.print() statements
> or else event rate and input events remains in 0 event
Hi,
I am using Spark 1.4 with the direct Kafka API. In my streaming UI, I am
able to see the events listed in the UI only if I add stream.print() statements;
otherwise the event rate and input events remain at 0 even though the events get
processed.
Without print statements, I have the action saveToCassandra
If the application history is turned on, it should work, even through an ssh
tunnel. Can you elaborate on what you mean by "it does not work"?
Also, are you able to see the application web UI while an application is
executing a job?
Mohammed
Author: Big Data Analytics with Spark
Yes, I tried it, but it simply does not work.
So, my concern is to use an ssh tunnel to forward a port of the cluster to a
localhost port.
But in the Spark UI there are two ports which I should forward using the ssh
tunnel.
Considering default ports, 8080 is the web UI port to come i
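A minimal sketch of forwarding both ports at once (user, host and ports are
placeholders):
ssh -N -L 8080:localhost:8080 -L 4040:localhost:4040 <user>@<master-host>
After that, http://localhost:8080 and http://localhost:4040 on your machine
should reach the cluster's UIs.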
I am not sure whether you can copy the log files from Spark workers to your
local machine and view them from the Web UI. In fact, if you are able to copy
the log files locally, you can just view them directly in any text editor.
I suspect what you really want to see is the application history.
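If that is the case, the usual recipe (a sketch; the log directory is a
placeholder) is to enable event logging in spark-defaults.conf and run a
history server:
spark.eventLog.enabled true
spark.eventLog.dir hdfs:///spark-logs
spark.history.fs.logDirectory hdfs:///spark-logs
Then start it with sbin/start-history-server.sh; the history UI defaults to
port 18080.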
As I mentioned before, I am trying to see the Spark log on a cluster via an
ssh tunnel.
1) The error on the application details UI is probably from monitoring port
4044. The web UI port is 8088, right? So how could I see the job web UI view and
the application details UI view in the web UI on my local machine
Hello, a question about the web UI log.
I could see the web interface log after forwarding the port on my cluster to
my local machine and clicking the completed application, but when I clicked
"application detail UI"
[image: Inline image 1]
It happened to me. I do not know why. I also checked the sp
won't have a scheduling
> delay until it starts to run. In your example, a lot of batches are waiting
> so that they don't have the scheduling delay.
>
> On Sun, Jan 17, 2016 at 4:49 AM, Jacek Laskowski wrote:
>>
>> Hi,
>>
>> I'm trying to unders
Hi,
I'm trying to understand how Scheduling Delays are displayed on the
Streaming page in the web UI, and I think the values are displayed
incorrectly in the Timelines column. I'm only concerned with the
scheduling delays (on the y axis) per batch time (x axis). It appears
that the values (on y