Re: SPARK Exception thrown in awaitResult

2016-07-28 Thread Carlo . Allocca
Solved! The solution is to use date_format with the “u” option. Thank you very much. Best, Carlo. On 28 Jul 2016, at 18:59, carlo allocca wrote: Hi Mark, Thanks for the suggestion. I changed the maven entries as follows: spark-core_2.10…
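
A minimal sketch of the fix described above, assuming Spark 2.0 Java code and a Dataset with a date column named purchase_date (the column name and input path are hypothetical); in Spark 2.x, date_format with the "u" pattern returns the day-of-week number (Monday -> 1, ..., Sunday -> 7):

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SparkSession;
    import static org.apache.spark.sql.functions.col;
    import static org.apache.spark.sql.functions.date_format;

    public class DayOfWeekExample {
        public static void main(String[] args) {
            SparkSession spark = SparkSession.builder()
                    .master("local[*]")
                    .appName("DayOfWeekExample")
                    .getOrCreate();

            // Hypothetical input: a dataset with a date column "purchase_date".
            Dataset<Row> ds = spark.read().json("data.json");

            // "u" = day-of-week number in Spark 2.x (Monday -> 1, Sunday -> 7).
            Dataset<Row> withDow = ds.withColumn("day_of_week",
                    date_format(col("purchase_date"), "u"));

            withDow.show();
            spark.stop();
        }
    }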

Re: SPARK Exception thrown in awaitResult

2016-07-28 Thread Carlo . Allocca
Hi Mark, Thanks for the suggestion. I changed the maven entries as follows: spark-core_2.10 2.0.0 and spark-sql_2.10 2.0.0. As a result, it worked once I removed the following line of code, which computes DAYOFWEEK (Monday -> 1, etc.): Dataset…
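
For reference, the corresponding pom.xml entries would look roughly like this (a sketch reconstructed from the artifact names and versions quoted above):

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>2.0.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.10</artifactId>
        <version>2.0.0</version>
    </dependency>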

Re: SPARK Exception thrown in awaitResult

2016-07-28 Thread Mark Hamstra
Don't use Spark 2.0.0-preview. That was a preview release with known issues, intended only for early, pre-release testing purposes. Spark 2.0.0 is now released, and you should be using that. On Thu, Jul 28, 2016 at 3:48 AM, Carlo.Allocca wrote: …

Re: SPARK Exception thrown in awaitResult

2016-07-28 Thread Carlo . Allocca
And, of course, I am using org.apache.spark spark-core_2.11 2.0.0-preview and org.apache.spark spark-sql_2.11 2.0.0-preview (jar packaging). Is the problem/issue below related to the…

Re: SPARK Exception thrown in awaitResult

2016-07-28 Thread Carlo . Allocca
I have also found the following two related links: 1) https://github.com/apache/spark/commit/947b9020b0d621bc97661a0a056297e6889936d3 2) https://github.com/apache/spark/pull/12433 Both explain why it happens, but say nothing about how to solve it. Do you have any…

Re: SPARK Exception thrown in awaitResult

2016-07-28 Thread Carlo . Allocca
Hi Rui, Thanks for the prompt reply. No, I am not using Mesos. OK. I am writing code to build a suitable dataset for my needs, as follows: == Session configuration: SparkSession spark = SparkSession .builder() .master("local[6]") //…
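
A sketch of what the full session configuration might look like; only master("local[6]") is from the original message, and the app name is an assumption:

    import org.apache.spark.sql.SparkSession;

    // Local mode with 6 worker threads, as in the original snippet.
    SparkSession spark = SparkSession
            .builder()
            .master("local[6]")
            .appName("DatasetBuilder")  // hypothetical app name
            .getOrCreate();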

Re: SPARK Exception thrown in awaitResult

2016-07-28 Thread Sun Rui
Are you using Mesos? If not, https://issues.apache.org/jira/browse/SPARK-16522 is not relevant. Please provide more information about your Spark environment, and the full stack trace. > On Jul 28, 2016, at 17:44, Carlo.Allocca…