Re: JDBC sessionInitStatement for writes?

2021-11-25 Thread trsell
Sorry, I somehow missed the "Scope" column in the docs, which explicitly states it's for reads only. I don't suppose anyone knows of some other method by which I can submit SET statements for write sessions?

On Fri, Nov 26, 2021 at 12:51 PM wrote:
> Hello,
>
> Regarding JDBC sinks, the docs state:
>
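One commonly suggested workaround, sketched below under stated assumptions: since `sessionInitStatement` only applies to reads, skip the built-in JDBC sink and write each partition through a manually opened connection, running the SET statement on that connection before inserting. The `connect` factory, the `init_sql` SET statement, and the `insert_sql` statement are all hypothetical placeholders, not taken from the thread; in practice `connect` would be something like `psycopg2.connect`.

```python
# Hedged sketch: run a SET statement per write session by managing the
# JDBC/DB connection ourselves instead of using the Spark JDBC sink.
# All SQL strings and the `connect` factory are illustrative placeholders.

def write_partition(rows, connect,
                    init_sql="SET synchronous_commit TO off",
                    insert_sql="INSERT INTO events (id, val) VALUES (%s, %s)"):
    conn = connect()                 # one connection (session) per partition
    try:
        cur = conn.cursor()
        cur.execute(init_sql)        # session setup, as sessionInitStatement
                                     # would do for a read session
        cur.executemany(insert_sql, rows)
        conn.commit()
    finally:
        conn.close()

# With a DataFrame `df` (not executed here):
# df.rdd.foreachPartition(lambda rows: write_partition(rows, connect))
```

The key point is that `foreachPartition` gives you the connection lifecycle, so the SET statement is guaranteed to run on the same session that performs the inserts.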

JDBC sessionInitStatement for writes?

2021-11-25 Thread trsell
Hello,

Regarding JDBC sinks, the docs state:
https://spark.apache.org/docs/latest/sql-data-sources-jdbc.html

> sessionInitStatement: After each database session is opened to the remote DB and before starting to read data, this option executes a custom SQL statement (or a PL/SQL block). Use this to
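For context, a minimal sketch of how the option from the quoted docs is typically passed on a JDBC *read*; the URL, table name, and SET statement below are placeholders, not from the thread.

```python
# Illustrative JDBC read options; sessionInitStatement runs once per
# database session, before reading. All values here are placeholders.
jdbc_read_options = {
    "url": "jdbc:postgresql://host:5432/mydb",
    "dbtable": "public.events",
    "sessionInitStatement": "SET search_path TO analytics",
}

# With a SparkSession `spark` (not executed here):
# df = spark.read.format("jdbc").options(**jdbc_read_options).load()
```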

Re: java.lang.UnsupportedOperationException: Cannot evaluate expression: fun_nm(input[0, string, true])

2016-08-18 Thread trsell
Hi,

The stack trace suggests you're doing a join as well, and it's Python. I wonder if you're seeing this:
https://issues.apache.org/jira/browse/SPARK-17100

Are you using Spark 2.0.0?

Tim

On Tue, 16 Aug 2016 at 16:58 Sumit Khanna wrote:
> This is just the
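If SPARK-17100 is indeed the issue (a Python UDF referenced inside a join condition), a common workaround is to materialize the UDF into a column on each side *before* the join, so the join condition itself only touches plain columns. The sketch below assumes this; `fun_nm` is only the name from the error message, and its body and the column names are hypothetical.

```python
# Hedged sketch of the usual SPARK-17100 workaround: evaluate the Python
# UDF with withColumn first, then join on the resulting plain column.

def fun_nm(s):
    # Placeholder transformation; the real function is not shown in the thread.
    return (s or "").strip().lower()

# With DataFrames `left` and `right` (not executed here):
# from pyspark.sql import functions as F
# from pyspark.sql.types import StringType
# norm = F.udf(fun_nm, StringType())
# left2 = left.withColumn("join_key", norm("name"))
# right2 = right.withColumn("join_key", norm("name"))
# joined = left2.join(right2, "join_key")
```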

PySpark: cannot convert float infinity to integer, when setting batch in add_shuffle_key

2015-11-08 Thread trsell
Hello,

I am running Spark 1.5.1 on EMR using Python 3. I have a PySpark job which does some simple joins and reduceByKey operations. It works fine most of the time, but sometimes I get the following error:

15/11/09 03:00:53 WARN TaskSetManager: Lost task 2.0 in stage 4.0 (TID 69,
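A minimal reproduction of the Python behavior behind this error message, under an assumption based on the subject line: the batch counter in `add_shuffle_key` is a float that gets scaled upward repeatedly (e.g. `batch *= 1.5`); once IEEE overflow turns it into infinity, calling `int()` on it raises exactly this error.

```python
# Assumed mechanism (not confirmed from the thread): a float batch size
# scaled up until it overflows to inf; int(inf) then raises OverflowError.
batch = 1.0
for _ in range(2000):   # far more scalings than a finite double survives
    batch *= 1.5
print(batch)            # inf

try:
    int(batch)
except OverflowError as err:
    print(err)          # cannot convert float infinity to integer
```

This is why the failure is intermittent: the batch size only runs away under particular data-size conditions, so most runs never reach the overflow.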