unsubscribe

2021-08-09 Thread Vijay Gharge
unsubscribe Regards, Vijay Gharge

Unsubscribe

2021-07-05 Thread Vijay Gharge
Unsubscribe

Re: How to accelerate reading json file?

2016-01-06 Thread Vijay Gharge
> val people = sqlContext.read.json(path)
>
> I have 1 TB of files in the path. It took 1.2 hours to finish reading and
> infer the schema.
>
> But I already know the schema. Could I make this process shorter?
>
> Thanks a lot.

--
Regards, Vijay Gharge
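On the schema question quoted above: if the schema is already known, it can be supplied to the reader so Spark skips the inference pass over the data entirely. A minimal sketch against the Spark 1.x SQLContext API, reusing the sqlContext and path from the message; the field names and types are illustrative assumptions, not taken from the thread:

    import org.apache.spark.sql.types.{StructType, StructField, StringType, LongType}

    // Hypothetical schema: replace the fields with the ones the JSON actually has.
    val knownSchema = StructType(Seq(
      StructField("name", StringType, nullable = true),
      StructField("age", LongType, nullable = true)
    ))

    // Passing the schema up front means no scan over the files is needed for inference.
    val people = sqlContext.read.schema(knownSchema).json(path)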

Re: Unable to read JSON input in Spark (YARN Cluster)

2016-01-02 Thread Vijay Gharge
> java.lang.Thread.run(Thread.java:745)
> 16/01/01 18:36:56 ERROR util.Utils: Uncaught exception in thread Executor
> task launch worker-21
>
> *Questions*
>
> 1. How do I fix each of these errors?
> 2. https://yarn-jt:50030/cluster/apps/RUNNING does not show the spark-shell job.
> 3. Where do I see the submitted Spark job?
>
> Regards,
> Deepak

--
Regards, Vijay Gharge

java.io.FileNotFoundException(Too many open files) in Spark streaming

2015-12-23 Thread Vijay Gharge
> … it is actually a legitimate need.
>
> If that doesn't help, make sure you close any unused files and streams in
> your code. It will also be easier to help diagnose the issue if you send an
> error-reproducing snippet.

--
Regards, Vijay Gharge
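The quoted advice (close any unused files and streams) usually comes down to scoping handles per partition and releasing them in a finally block; raising the executor's open-file limit with ulimit -n is the other common remedy. A minimal sketch against the DStream API; the stream, element type, and output file naming are assumptions for illustration, not code from the thread:

    import java.io.{File, PrintWriter}
    import org.apache.spark.streaming.dstream.DStream

    // One handle per partition, always closed, so file descriptors are not
    // leaked on the executors batch after batch.
    def writeEachBatch(stream: DStream[String]): Unit = {
      stream.foreachRDD { rdd =>
        rdd.foreachPartition { records =>
          val out = new PrintWriter(File.createTempFile("batch-", ".out"))
          try {
            records.foreach(line => out.println(line))
          } finally {
            out.close()  // release the handle even if the write fails
          }
        }
      }
    }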

Re: Problem with Spark Standalone

2015-12-23 Thread Vijay Gharge
--
Regards, Vijay Gharge

Re: PySpark Connection reset by peer: socket write error

2015-12-16 Thread Vijay Gharge
> at org.apache.spark.api.python.PythonRDD$WriterThread$$anonfun$run$3.apply(PythonRDD.scala:248)
> at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1772)
> at org.apache.spark.api.python.PythonRDD$WriterThread.run(PythonRDD.scala:208)
>
> I am getting this error only when I use MLlib with PySpark. How can I resolve this issue?
>
> Regards,
> Surendran

--
Regards, Vijay Gharge

Re: how to access local file from Spark sc.textFile("file:///path to/myfile")

2015-12-11 Thread Vijay Gharge
Please ignore the typo. I meant root "permissions".

Regards, Vijay Gharge

On Fri, Dec 11, 2015 at 11:30 PM, Vijay Gharge wrote:
> This issue is due to file permission issue. You need to execute spark
> operations using root command only.
>
> Regards,
> Vija…

Re: how to access local file from Spark sc.textFile("file:///path to/myfile")

2015-12-11 Thread Vijay Gharge
This issue is due to file permission issue. You need to execute spark operations using root command only.

Regards, Vijay Gharge

On Fri, Dec 11, 2015 at 11:20 PM, Vijay Gharge wrote:
> One more question. Are you also running spark commands using root user?
> Meanwhile am trying to si…

Re: how to access local file from Spark sc.textFile("file:///path to/myfile")

2015-12-11 Thread Vijay Gharge
One more question: are you also running the spark commands as the root user? Meanwhile I am trying to simulate this locally.

On Friday 11 December 2015, Lin, Hao wrote:
> Here you go, thanks.
>
> -rw-r--r-- 1 root root 658M Dec 9 2014 /root/2008.csv
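For reference on the subject line of this thread, a minimal sketch of the access pattern under discussion, assuming the spark-shell's sc and the file listed above; with the file:// scheme the path has to exist and be readable by the OS user running the executors on every node:

    // file:// forces a local-filesystem read instead of the default filesystem
    // (e.g. HDFS). The listing above shows the file itself is world-readable but
    // lives under /root, so the directory permissions matter as much as the file's.
    val lines = sc.textFile("file:///root/2008.csv")
    println(lines.count())  // simple action to force the read and surface any permission error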

Re: how to access local file from Spark sc.textFile("file:///path to/myfile")

2015-12-11 Thread Vijay Gharge
> at org.apache.spark.rdd.RDD.iterator(RDD.scala:264)
> at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
> at org.apache.spark.scheduler.Task.run(Task.scala:88)
> at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> at java.lang.Thread.run(Thread.java:745)

--
Regards, Vijay Gharge

Re: HTTP Source for Spark Streaming

2015-12-09 Thread Vijay Gharge
> … There can be multiple such Receivers, each polling a different HTTP URI. An
> example would be accessing one HTTP URI for inventory updates and another for
> sales updates in parallel.
>
> Regards,
> Sourav
>
> On Wed, Dec 9, 2015 at 7:53 PM, Vijay Gharge wrote:
>> Not very clear. Can…
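For the pattern Sourav describes, the usual route in Spark Streaming is a custom Receiver, one instance per endpoint. A minimal sketch against the DStream API; the class name, endpoint URIs, and poll interval are illustrative assumptions, not code from the thread:

    import org.apache.spark.storage.StorageLevel
    import org.apache.spark.streaming.receiver.Receiver
    import scala.io.Source

    class HttpPollingReceiver(uri: String, pollMillis: Long)
        extends Receiver[String](StorageLevel.MEMORY_AND_DISK_2) {

      override def onStart(): Unit = {
        // Poll on a dedicated thread so onStart() returns immediately.
        new Thread("http-polling-receiver") {
          override def run(): Unit = poll()
        }.start()
      }

      override def onStop(): Unit = {}  // the isStopped() check below ends the loop

      private def poll(): Unit = {
        while (!isStopped()) {
          val body = Source.fromURL(uri).mkString  // blocking GET of the endpoint
          store(body)                              // hand the payload to Spark Streaming
          Thread.sleep(pollMillis)
        }
      }
    }

    // One input DStream per endpoint, e.g. inventory and sales polled in parallel:
    // val inventory = ssc.receiverStream(new HttpPollingReceiver("http://example.com/inventory", 5000))
    // val sales     = ssc.receiverStream(new HttpPollingReceiver("http://example.com/sales", 5000))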

Re: HTTP Source for Spark Streaming

2015-12-09 Thread Vijay Gharge
> … for the same?
>
> Regards,
> Sourav

--
Regards, Vijay Gharge

Re: newbie how to upgrade a spark-ec2 cluster?

2015-12-02 Thread Vijay Gharge
Thanks Gourav! I will look this up on Google.

Regards, Vijay Gharge

On Thu, Dec 3, 2015 at 1:26 PM, Gourav Sengupta wrote:
> Vijay,
>
> Please Google for AWS Lambda + S3; there are several use cases available.
> Lambdas are event-based triggers and are executed when an eve…

Re: newbie how to upgrade a spark-ec2 cluster?

2015-12-02 Thread Vijay Gharge
>>> … how I can upgrade to the 1.5.2 prebuilt binary?
>>>
>>> Also, if I choose to build the binary, how would I upgrade my cluster?
>>>
>>> Kind regards
>>>
>>> Andy

--
Regards, Vijay Gharge