DB2 connectivity issue with SSL

2020-02-18 Thread SNEHASISH DUTTA
Hi, I am trying to connect to DB2 using Spark with the following code: spark.sparkContext.addFile("xyz.jks") spark.sparkContext.addFile("xyz.pfx") val url = s"""jdbc:db2://host:port/schema:securityMechanism=18;sslConnection=true;user=user;sslTrustStoreLocation=${SparkFiles.get("xyz.jks")};sslKeyStore…
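
For reference, a minimal sketch of the connection approach described above, assuming the IBM DB2 JCC driver is on the classpath; the properties after the truncation point (sslKeyStoreLocation, sslKeyStorePassword), the passwords, and the table name are assumptions, not part of the original message:

    import org.apache.spark.SparkFiles
    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder().appName("db2-ssl").getOrCreate()

    // Ship the trust store / key store so SparkFiles.get resolves a local copy.
    spark.sparkContext.addFile("xyz.jks")
    spark.sparkContext.addFile("xyz.pfx")

    val url = "jdbc:db2://host:port/schema:" +
      "securityMechanism=18;sslConnection=true;user=user;" +
      s"sslTrustStoreLocation=${SparkFiles.get("xyz.jks")};" +
      "sslTrustStorePassword=changeit;" +                     // assumed property/value
      s"sslKeyStoreLocation=${SparkFiles.get("xyz.pfx")};" +  // assumed continuation of the truncated URL
      "sslKeyStorePassword=changeit;"                         // assumed property/value

    val df = spark.read
      .format("jdbc")
      .option("driver", "com.ibm.db2.jcc.DB2Driver")
      .option("url", url)
      .option("dbtable", "MYSCHEMA.MYTABLE")                  // placeholder table
      .load()

One thing to watch in this sketch: the URL string is built on the driver, so SparkFiles.get returns the driver-local path; executors opening JDBC connections may need to resolve the files locally instead, depending on deployment mode.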

There is no space for new record

2018-02-08 Thread SNEHASISH DUTTA
Hi, I am facing the following error when running on EMR: Caused by: java.lang.IllegalStateException: There is no space for new record at org.apache.spark.util.collection.unsafe.sort.UnsafeInMemorySorter.insertRecord(UnsafeInMemorySorter.java:226) at org.apache.spark.sql.execution.Unsafe…

Re: There is no space for new record

2018-02-09 Thread SNEHASISH DUTTA
Regards, Snehasish. On Fri, Feb 9, 2018 at 1:26 PM, SNEHASISH DUTTA wrote: > Hi, I am facing the following error when running on EMR: Caused by: java.lang.IllegalStateException: There is no space for new record at org.apache.spark.util.c…

Re: There is no space for new record

2018-02-13 Thread SNEHASISH DUTTA
…provide one? thanks! On Fri, Feb 9, 2018 at 5:59 PM, SNEHASISH DUTTA wrote: > Hi, I am facing the following error when running on EMR: Caused by: java.lang.IllegalStateException: There is no space…

Re: There is no space for new record

2018-02-13 Thread SNEHASISH DUTTA
…-23376. Anyway, it will be available in the upcoming 2.3.0 release. Thanks. On 13 Feb 2018 9:09 a.m., "SNEHASISH DUTTA" wrote: > Hi, in which version of Spark will this fix be available? The deployment is on EMR.
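
Since the question is whether the EMR deployment already carries the fix, a quick sketch of checking the running Spark version (in spark-shell or any job); the reply above states the fix ships with 2.3.0:

    import org.apache.spark.sql.SparkSession

    // Confirm which Spark version the EMR cluster is actually running
    // before relying on the fix referenced above.
    val spark = SparkSession.builder().getOrCreate()
    println(spark.version)   // expect "2.3.0" or later once the fix is available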

csv dataframe reader issue 2.2.0

2018-02-22 Thread SNEHASISH DUTTA
Hi, I am using the Spark 2.2 CSV reader. I have data in the following format: 123|123|"abc"||""|"xyz", where || is null and "" is one blank character, as per the requirement. I was using option sep as pipe and option quote as "". I parsed the data, and using regex I was able to fulfill all the mentioned condi…
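
A minimal sketch of how the stated requirement (|| read as null, "" read as a zero-length string) might be expressed with plain reader options; the emptyValue option is assumed to exist only in later releases, not in 2.2, which is presumably why the regex workaround was needed. The file path is a placeholder.

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder().appName("pipe-csv").getOrCreate()

    // Sample line from the message: 123|123|"abc"||""|"xyz"
    val df = spark.read
      .option("sep", "|")        // pipe-separated fields
      .option("quote", "\"")     // quoted fields such as "abc"
      .option("nullValue", "")   // intent: unquoted empty field (||) -> null
      .option("emptyValue", "")  // intent: quoted empty ("") -> zero-length string; assumed post-2.2 option
      .csv("/tmp/sample.csv")    // placeholder path

    df.show(false)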

CSV reader 2.2.0 issue

2018-03-05 Thread SNEHASISH DUTTA
Hi, I am using the Spark 2.2 CSV reader. I have data in the following format: 123|123|"abc"||""|"xyz". The requirement is that || has to be treated as null and "" has to be treated as a blank character of length 0. I was using option sep as pipe and option quote as "". I parsed the data, and using regex I was able…

Benchmark Java/Scala/Python for Apache spark

2019-03-11 Thread SNEHASISH DUTTA
Hi, is there a way to get performance benchmarks for developing applications using either Java/Scala/Python? The use case mostly involves SQL pipelines and data ingested from various sources, including Kafka. What should be the most preferred language? It would be great if the preference for language ca…