Hey,
Running out of disk space is also a reliability concern, but it might need a
different strategy to handle.
As suggested by Mridul, I am working on making things more configurable in
another (new) module… with that, we can plug in new rules for each type of
error.
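To make the idea concrete, here is a minimal sketch of what such pluggable per-error-type rules could look like. All names here are hypothetical illustrations, not the actual module or Spark API:

```python
# Hypothetical sketch of a pluggable error-rule registry.
# Class and method names are illustrative only, not Spark's actual API.

class ErrorRule:
    """Base rule: decides whether it applies to an error and what action to take."""
    def matches(self, error: Exception) -> bool:
        raise NotImplementedError
    def handle(self, error: Exception) -> str:
        raise NotImplementedError

class DiskSpaceRule(ErrorRule):
    """Example rule for disk-space exhaustion."""
    def matches(self, error):
        return "No space left on device" in str(error)
    def handle(self, error):
        return "scale-storage"

class OomRule(ErrorRule):
    """Example rule for out-of-memory errors."""
    def matches(self, error):
        return isinstance(error, MemoryError)
    def handle(self, error):
        return "increase-memory"

class RuleRegistry:
    """Holds registered rules and dispatches an error to the first match."""
    def __init__(self):
        self._rules = []
    def register(self, rule):
        self._rules.append(rule)
    def dispatch(self, error):
        for rule in self._rules:
            if rule.matches(error):
                return rule.handle(error)
        return "unknown"

registry = RuleRegistry()
registry.register(DiskSpaceRule())
registry.register(OomRule())

print(registry.dispatch(OSError("No space left on device")))  # scale-storage
print(registry.dispatch(MemoryError()))                       # increase-memory
```

The point of the registry shape is that each error type gets its own rule, so a disk-space strategy can differ from a memory one without touching shared logic.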
Regards
Kalyan.
On Mon, 5 Feb 2024
Hi all,
Sorry for the delay in getting the first draft of (my first) SPIP out.
https://docs.google.com/document/d/1hxEPUirf3eYwNfMOmUHpuI5dIt_HJErCdo7_yr9htQc/edit?pli=1
Let me know what you think.
Regards
kalyan.
On Sat, Jan 20, 2024 at 8:19 AM Ashish Singh wrote:
> Hey all,
>
excessive spilling to disk, etc.
While we developed this in-house on Spark 2.4.3, we would like to
collaborate and contribute this work to the latest versions of Spark.
What is the best way forward here? Would an SPIP proposal detailing the
changes help?
Regards,
Kalyan.
Uber India.
in this area by my friends in
this domain. One lesson we learned was that it is hard to have a generic
algorithm that works for all cases.
Regards
kalyan.
On Tue, Aug 8, 2023 at 6:12 PM Mich Talebzadeh
wrote:
> Thanks for pointing out this feature to me. I will have a look when I get
> there.
+1
On Fri, Nov 6, 2020, 5:58 AM Matei Zaharia wrote:
> +1
>
> Matei
>
> > On Nov 5, 2020, at 10:25 AM, EveLiao wrote:
> >
> > +1
> > Thanks!
> >
> >
> >
> > --
> > Sent from: http://apache-spark-developers-list.1001551.n3.nabble.com/
> >
> >
+1
This will positively improve the performance and reliability of Spark.
Looking forward to it.
Regards
Kalyan.
On Tue, Sep 15, 2020, 9:26 AM Joseph Torres
wrote:
> +1
>
> On Mon, Sep 14, 2020 at 6:39 PM angers.zhu wrote:
>
>> +1
>>
>> angers.zhu
>> angers...
Hi Cheng,
Is there some place where I can get more details on this? Or could you
give a couple of lines explaining it?
> But given memory usage from writers is non-visible to spark now, it seems
> to me that there’s no other good way to model the memory usage for write.
Regards
This looks interesting. Anyway, it would be good if you could elaborate on
your expectations and on the other approaches you tried before deciding to
do it this way.
Regards,
Kalyan.
On Fri, Aug 7, 2020, 11:24 PM Edward Mitchell wrote:
> I will agree that the side effects of us
Hello,
I am Krishna, currently a second-year Master's student (MSc in Data Mining)
in Barcelona, studying at Université Polytechnique de Catalogne.
I know it's a little early for GSoC, but I wanted to get a head start
working with the Spark community.
Is there anyone who would be
I could resolve this by passing the argument below:
./python/run-tests --python-executables=python2.7
Thanks,
Krishna
On Thu, Nov 3, 2016 at 4:16 PM, Krishna Kalyan <krishnakaly...@gmail.com>
wrote:
> Hello,
> I am trying to run unit tests on pyspark.
>
> When I try to run unit tests I am faced with errors.
Hello,
I am trying to run unit tests on pyspark.
When I try to run the unit tests I am faced with errors.
krishna@Krishna:~/Experiment/spark$ ./python/run-tests
Running PySpark tests. Output is in
/Users/krishna/Experiment/spark/python/unit-tests.log
Will test against the following Python
Hello,
I am a Master's student. Could someone please let me know how to set up my
dev environment to contribute to PySpark?
Questions I had were:
a) Should I use IntelliJ IDEA or PyCharm?
b) How do I test my changes?
Regards,
Krishna