Re: Is it possible to change the default port number 7077 for spark?
Hello Arun,

Thank you for the descriptive response, and thank you for providing the sample file too. It certainly is a great help.

Sincerely,
Ashish

On Mon, Jul 13, 2015 at 10:30 PM, Arun Verma wrote:
[quoted thread trimmed]
Re: Is it possible to change the default port number 7077 for spark?
PFA sample file.

On Mon, Jul 13, 2015 at 7:37 PM, Arun Verma wrote:
[quoted reply trimmed]

--
Thanks and Regards,
Arun Verma

Attachment: spark-env.sh

#!/usr/bin/env bash
# This file is sourced when running various Spark programs.
# Copy it as spark-env.sh and edit that to configure Spark for your site.
# Options read when launching programs locally with
# ./bin/run-example or ./bin/spark-submit
# - HADOOP_CONF_DIR, to point Spark towards Hadoop configuration files
# - SPARK_LOCAL_IP, to set the IP address Spark binds to on this node
# - SPARK_PUBLIC_DNS, to set the public dns name of the driver program
# - SPARK_CLASSPATH, default classpath entries to append

# Options read by executors and drivers running inside the cluster
# - SPARK_LOCAL_IP, to set the IP address Spark binds to on this node
# - SPARK_PUBLIC_DNS, to set the public DNS name of the driver program
# - SPARK_CLASSPATH, default classpath entries to append
# - SPARK_LOCAL_DIRS, storage directories to use on this node for shuffle and RDD data
# - MESOS_NATIVE_JAVA_LIBRARY, to point to your libmesos.so if you use Mesos

# Options read in YARN client mode
# - HADOOP_CONF_DIR, to point Spark towards Hadoop configuration files
# - SPARK_EXECUTOR_INSTANCES, Number of workers to start (Default: 2)
# - SPARK_EXECUTOR_CORES, Number of cores for the workers (Default: 1).
# - SPARK_EXECUTOR_MEMORY, Memory per Worker (e.g. 1000M, 2G) (Default: 1G)
# - SPARK_DRIVER_MEMORY, Memory for Master (e.g. 1000M, 2G) (Default: 512 Mb)
# - SPARK_YARN_APP_NAME, The name of your application (Default: Spark)
# - SPARK_YARN_QUEUE, The hadoop queue to use for allocation requests (Default: 'default')
# - SPARK_YARN_DIST_FILES, Comma separated list of files to be distributed with the job.
# - SPARK_YARN_DIST_ARCHIVES, Comma separated list of archives to be distributed with the job.

# Options for the daemons used in the standalone deploy mode
# - SPARK_MASTER_IP, to bind the master to a different IP address or hostname
SPARK_MASTER_PORT=9000
SPARK_MASTER_WEBUI_PORT=8000
# to use non-default ports for the master
# - SPARK_MASTER_OPTS, to set config properties only for the master (e.g. "-Dx=y")
# - SPARK_WORKER_CORES, to set the number of cores to use on this machine
# - SPARK_WORKER_MEMORY, to set how much total memory workers have to give executors (e.g. 1000m, 2g)
# - SPARK_WORKER_PORT / SPARK_WORKER_WEBUI_PORT, to use non-default ports for the worker
# - SPARK_WORKER_INSTANCES, to set the number of worker processes per node
# - SPARK_WORKER_DIR, to set the working directory of worker processes
# - SPARK_WORKER_OPTS, to set config properties only for the worker (e.g. "-Dx=y")
# - SPARK_HISTORY_OPTS, to set config properties only for the history server (e.g. "-Dx=y")
# - SPARK_SHUFFLE_OPTS, to set config properties only for the external shuffle service (e.g. "-Dx=y")
# - SPARK_DAEMON_JAVA_OPTS, to set config properties for all daemons (e.g. "-Dx=y")
# - SPARK_PUBLIC_DNS, to set the public dns name of the master or workers

# Generic options for the daemons used in the standalone deploy mode
# - SPARK_CONF_DIR      Alternate conf dir. (Default: ${SPARK_HOME}/conf)
# - SPARK_LOG_DIR       Where log files are stored. (Default: ${SPARK_HOME}/logs)
# - SPARK_PID_DIR       Where the pid file is stored. (Default: /tmp)
# - SPARK_IDENT_STRING  A string representing this instance of spark. (Default: $USER)
# - SPARK_NICENESS      The scheduling priority for daemons. (Default: 0)
Re: Is it possible to change the default port number 7077 for spark?
Hi,

Yes, it is. To do it, follow these steps:
1. cd spark/installation/path/.../conf
2. cp spark-env.sh.template spark-env.sh
3. vi spark-env.sh
4. SPARK_MASTER_PORT=9000 (or any other available port)

PFA sample file. I hope this will help.

On Mon, Jul 13, 2015 at 7:24 PM, ashishdutt wrote:
[quoted reply trimmed]

--
Thanks and Regards,
Arun Verma

Attachment: spark-env.sh (Bourne shell script)
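The four steps above can also be run as one non-interactive sequence. A sketch, assuming SPARK_HOME points at the Spark installation; the scratch-directory fallback and stub template are only there so the commands can be tried safely outside a real installation:

```shell
# Use the real installation if SPARK_HOME is set; otherwise build a scratch
# copy so the commands can be tried safely (illustration only).
SPARK_HOME="${SPARK_HOME:-$(mktemp -d)}"
mkdir -p "$SPARK_HOME/conf"
# A real installation ships spark-env.sh.template; create a stub if absent.
[ -f "$SPARK_HOME/conf/spark-env.sh.template" ] || \
  echo '# spark-env.sh.template stub' > "$SPARK_HOME/conf/spark-env.sh.template"

cd "$SPARK_HOME/conf"                           # step 1
cp spark-env.sh.template spark-env.sh           # step 2
echo 'SPARK_MASTER_PORT=9000' >> spark-env.sh   # steps 3-4, without an editor
grep 'SPARK_MASTER_PORT' spark-env.sh
```

After this, restarting the master daemon picks up the new port.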
Re: Is it possible to change the default port number 7077 for spark?
Many thanks for your response.
Regards,
Ashish

--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Is-it-possible-to-change-the-default-port-number-7077-for-spark-tp23774p23797.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
Re: Is it possible to change the default port number 7077 for spark?
Q1: You can change the port number on the master in the file conf/spark-defaults.conf. I don't know what the impact on a Cloudera distro would be, though.

Q2: Yes: a Spark worker needs to be present on each node that you want to make available to the driver.

Q3: You can submit an application from your laptop to the master with the spark-submit script. You don't need to contact the workers directly.

--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Is-it-possible-to-change-the-default-port-number-7077-for-spark-tp23774p23781.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
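For Q3, a submission from a laptop would look roughly like the sketch below. The master URL, class name, and jar name are all placeholders, and the block only echoes the command unless spark-submit is actually present on the machine:

```shell
# Placeholders: replace host/port, class, and jar with your cluster's values.
MASTER_URL="spark://spark-master.example.com:7456"
SUBMIT="${SPARK_HOME:-/opt/spark}/bin/spark-submit"
CMD="$SUBMIT --master $MASTER_URL --class org.example.MyApp my-app.jar"
if [ -x "$SUBMIT" ]; then
  $CMD                      # run for real on a machine with Spark installed
else
  echo "would run: $CMD"    # Spark not installed here; just show the command
fi
```

The only thing the laptop needs is a Spark distribution of the same version and network access to the master's port; it does not need to be part of the cluster.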
Re: Is it possible to change the default port number 7077 for spark?
SSH by default is on port 22; 7456 is the port where the master is listening. So any Spark app should be able to connect to the master using that port.

On 11 Jul 2015 13:50, "ashishdutt" wrote:
[quoted original question trimmed]
Is it possible to change the default port number 7077 for spark?
Hello all,

In my lab a colleague installed and configured Spark 1.3.0 on a 4-node cluster in a CDH 5.4 environment. The default port number for our Spark configuration is 7456. I have been trying to SSH to the spark-master using this port number, but it fails every time with the error that the JVM timed out. The documentation given by Cloudera
<http://www.cloudera.com/content/cloudera/en/documentation/core/latest/topics/cdh_ig_ports_cdh5.html>
says that the default port number for the Spark configuration should be 7077, and that is what I see in all the posts here and elsewhere in Google search results. So now I have three questions; please help me with all or any of them.

Q1) Will the Spark configuration work only with port number 7077? If yes, then how can I change the port number?
Q2) Do I need to install Spark on all the machines in the cluster?
Q3) To run any Spark job, do I always have to SSH into the spark-master machine? Or is it possible to connect my laptop to the spark-master and invoke commands from my laptop to the spark-master and worker machines?

Thank you for your time.
Ashish

--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Is-it-possible-to-change-the-default-port-number-7077-for-spark-tp23774.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org