unsubscribe

2019-01-29 Thread Charles Nnamdi Akalugwu
unsubscribe


Re: How Spark HA works

2016-08-19 Thread Charles Nnamdi Akalugwu
I am experiencing this exact issue. Does anyone know what's going on with
the ZooKeeper setup?

On Jul 5, 2016 10:34 AM, "Akmal Abbasov" wrote:
>
> Hi,
> I'm trying to understand how Spark HA works. I'm using Spark 1.6.1 and
> ZooKeeper 3.4.6.
> I've added the following line to $SPARK_HOME/conf/spark-env.sh:
> export SPARK_DAEMON_JAVA_OPTS="-Dspark.deploy.recoveryMode=ZOOKEEPER
> -Dspark.deploy.zookeeper.url=zk1:2181,zk2:2181,zk3:2181
> -Dspark.deploy.zookeeper.dir=/spark"
> It's working so far.
> I'd like to set up a link which will always go to the active master's UI
> (I'm using Spark in standalone mode).
> I've checked the znode /spark, and it contains
> [leader_election, master_status]
> I'm assuming that the master_status znode will contain the IP address of
> the current active master; is that true? Because in my case this znode
> isn't updated after failover.
> And how does /spark/leader_election work, given that it doesn't contain
> any data?
> Thank you.
>
> Regards,
> Akmal
>
>
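Not an answer from the thread, but one way to check what actually lives
under these znodes is to query ZooKeeper directly. A minimal Scala sketch
using the plain ZooKeeper Java client (the connect string and /spark path
mirror the configuration quoted above; the client dependency and object
name are my own assumptions):

import org.apache.zookeeper.{WatchedEvent, Watcher, ZooKeeper}
import scala.collection.JavaConverters._

// Sketch: list the children of /spark/master_status and print each one's
// data, to see what the active master registers there after a failover.
object InspectSparkZnodes {
  def main(args: Array[String]): Unit = {
    val zk = new ZooKeeper("zk1:2181,zk2:2181,zk3:2181", 5000,
      new Watcher { override def process(e: WatchedEvent): Unit = () })
    try {
      for (child <- zk.getChildren("/spark/master_status", false).asScala) {
        val bytes = zk.getData(s"/spark/master_status/$child", false, null)
        println(s"$child -> ${new String(bytes, "UTF-8")}")
      }
    } finally {
      zk.close()
    }
  }
}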


Re: StructField Translation Error with Spark SQL

2016-04-20 Thread Charles Nnamdi Akalugwu
Unfortunately, I get the same error for fields which are not null:

Can't translate null value for field
StructField(density,DecimalType(4,2),true)
On Apr 21, 2016 1:37 AM, "Ted Yu" wrote:

> The weight field is not nullable.
>
> Looks like your source table had null value for this field.
>
> On Wed, Apr 20, 2016 at 4:11 PM, Charles Nnamdi Akalugwu <
> cprenzb...@gmail.com> wrote:
>
>> Hi,
>>
>> I am using Spark 1.4.1 and trying to copy all rows from a table in one
>> MySQL database to an Amazon RDS table using Spark SQL.
>>
>> Some columns in the source table are defined as DECIMAL type and are
>> nullable. Others are not. When I run my Spark job,
>>
>> val writeData = sqlContext.read.format("jdbc").option("url", sourceUrl)
>>   .option("driver", "com.mysql.jdbc.Driver").option("dbtable", table)
>>   .option("user", sourceUsername).option("password", sourcePassword)
>>   .load()
>>
>> writeData.write.format("com.databricks.spark.redshift").option("url",
>>     String.format(targetUrl, targetUsername, targetPassword))
>>   .option("dbtable", table).option("tempdir", redshiftTempDir + table)
>>   .mode("append").save()
>> it fails with the following exception
>>
>> Can't translate null value for field
>> StructField(weight,DecimalType(5,2),false)
>>
>>
>> Any insights about this exception would be very helpful.
>>
>
>
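Not from the thread, but a workaround sometimes used for this class of
error is to relax the DataFrame's schema so every field is nullable before
writing. A minimal sketch against Spark 1.4-era APIs (the helper name is
mine; writeData is the DataFrame from the code above):

import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.types.StructType

// Sketch: rebuild the DataFrame with every StructField marked nullable, so
// the writer no longer rejects null values in columns the JDBC source
// reported as NOT NULL.
def withNullableSchema(df: DataFrame): DataFrame = {
  val relaxed = StructType(df.schema.fields.map(_.copy(nullable = true)))
  df.sqlContext.createDataFrame(df.rdd, relaxed)
}

// Usage: withNullableSchema(writeData).write...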


StructField Translation Error with Spark SQL

2016-04-20 Thread Charles Nnamdi Akalugwu
Hi,

I am using Spark 1.4.1 and trying to copy all rows from a table in one
MySQL database to an Amazon RDS table using Spark SQL.

Some columns in the source table are defined as DECIMAL type and are
nullable. Others are not. When I run my Spark job,

val writeData = sqlContext.read.format("jdbc").option("url", sourceUrl)
  .option("driver", "com.mysql.jdbc.Driver").option("dbtable", table)
  .option("user", sourceUsername).option("password", sourcePassword)
  .load()

writeData.write.format("com.databricks.spark.redshift").option("url",
    String.format(targetUrl, targetUsername, targetPassword))
  .option("dbtable", table).option("tempdir", redshiftTempDir + table)
  .mode("append").save()
it fails with the following exception

Can't translate null value for field
StructField(weight,DecimalType(5,2),false)


Any insights about this exception would be very helpful.
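
Not from the thread, but when chasing this kind of error it can help to
list which fields the JDBC reader marked as NOT NULL, since those are the
candidates for the translation failure. A small sketch, assuming the
writeData DataFrame from the code above:

// Print every non-nullable field in the inferred schema with its type.
writeData.schema.fields
  .filter(!_.nullable)
  .foreach(f => println(s"${f.name}: ${f.dataType}"))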