Loading PostgreSQL table to Spark SyntaxError

2017-05-01 Thread Saulo Ricci
Hi, the following code reads a table from my PostgreSQL database, following guidance I've read online: val txs = spark.read.format("jdbc").options(Map( ("driver" -> "org.postgresql.Driver"), ("url" -> "jdbc:postgresql://host/dbname"), ("dbtable" ->
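For reference, a minimal sketch of the Map-based options form that compiles; host, database name, and credentials below are placeholders, not values from the original post:

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder().appName("pg-read").getOrCreate()

    // Pairs built with -> inside Map need no extra wrapping parentheses.
    val txs = spark.read
      .format("jdbc")
      .options(Map(
        "driver"   -> "org.postgresql.Driver",
        "url"      -> "jdbc:postgresql://host/dbname",
        "dbtable"  -> "tablename",
        "user"     -> "username",
        "password" -> "password"))
      .load()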

Reading table from SQL database to Apache Spark DataFrame/RDD

2017-05-01 Thread Saulo Ricci
Hi, I have the following code that reads a table into an Apache Spark DataFrame: val df = spark.read.format("jdbc") .option("url","jdbc:postgresql:host/database") .option("dbtable","tablename").option("user","username") .option("password", "password") .load() When I
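A minimal sketch of the per-option form. Note the // after jdbc:postgresql: — the URL in the snippet omits it, and the PostgreSQL JDBC driver expects jdbc:postgresql://host:port/database when connecting to a remote host (host, port, and credentials below are placeholders):

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder().appName("pg-read").getOrCreate()

    val df = spark.read
      .format("jdbc")
      .option("driver", "org.postgresql.Driver")
      .option("url", "jdbc:postgresql://host:5432/database")
      .option("dbtable", "tablename")
      .option("user", "username")
      .option("password", "password")
      .load()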

Unpersisted RDD still showing in my Storage tab UI

2017-01-30 Thread Saulo Ricci
Hi, I have a Spark Streaming application, and at the end of each batch's processing I call unpersist on the batch's RDD. But I've noticed the RDDs for all past batches still show up in my Spark UI's Storage tab. Shouldn't I expect to never see those RDDs again in my
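One pattern that keeps the Storage tab in sync, sketched below under the assumption that batches are cached inside foreachRDD: pass blocking = true to unpersist, so the call waits until the blocks are actually dropped instead of returning while the asynchronous cleanup is still pending. The source is a placeholder:

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    val conf = new SparkConf().setAppName("unpersist-sketch")
    val ssc  = new StreamingContext(conf, Seconds(60))

    // socketTextStream stands in for whatever source the real application uses.
    val stream = ssc.socketTextStream("localhost", 9999)

    stream.foreachRDD { rdd =>
      rdd.cache()
      println(s"batch size: ${rdd.count()}")
      // blocking = true waits for block removal to finish before returning;
      // otherwise the Storage tab can keep listing already-unpersisted RDDs.
      rdd.unpersist(blocking = true)
    }

    ssc.start()
    ssc.awaitTermination()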

Streaming jobs getting longer

2017-01-29 Thread Saulo Ricci
Hi, I have 2 almost identical Spark pipeline applications, but I found a significant difference in their performance. Basically the 1st application consumes the stream from Kafka, slices it into batches of 1 minute, and for each record calculates a score given the already
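A rough sketch of the shape described, not the poster's code: 1-minute batches, each record scored against a pre-trained model. Broadcasting the model once, rather than capturing it in the closure on every batch, is a common remedy when batch times creep upward; the model and the source below are placeholders:

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    val conf = new SparkConf().setAppName("scoring-sketch")
    val ssc  = new StreamingContext(conf, Seconds(60)) // 1-minute batches

    // Placeholder model mapping a record to a score; socketTextStream
    // stands in for the Kafka source in the original pipeline.
    val model  = Map("some-record" -> 0.9).withDefaultValue(0.0)
    val bModel = ssc.sparkContext.broadcast(model)

    val records = ssc.socketTextStream("localhost", 9999)
    val scores  = records.map(r => r -> bModel.value(r))
    scores.print()

    ssc.start()
    ssc.awaitTermination()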

[ML - Beginner - How To] - GaussianMixtureModel and GaussianMixtureModel$

2017-01-25 Thread Saulo Ricci
Hi, I'm studying the Java implementation code of the ml library, and I'd like to know why there are 2 implementations of GaussianMixtureModel - #1 GaussianMixtureModel and #2 GaussianMixtureModel$. I appreciate the answers. Thank you, Saulo
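The $ class is not a second implementation: the Scala compiler emits one JVM class for a class and a separate Name$ class for its companion object, the singleton that holds the static-like members. A tiny illustration:

    // Compiling this pair produces Thing.class and Thing$.class on disk,
    // exactly like GaussianMixtureModel and GaussianMixtureModel$.
    class Thing(val x: Int)

    object Thing {                      // companion object -> Thing$.class
      def apply(x: Int): Thing = new Thing(x)
    }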

[ML - Intermediate - Debug] - Loading Customized Transformers in Apache Spark raised a NullPointerException

2017-01-24 Thread Saulo Ricci
Hi, sorry if I'm being short here. I'm facing the issue described in this link, and I would really appreciate any help from the team; happy to talk and discuss this further
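Since the link itself is not preserved here, only a general note: a frequent cause of a NullPointerException when loading a customized Transformer is a missing readable companion object. A minimal sketch, assuming Spark 2.3+, where DefaultParamsReadable and DefaultParamsWritable are public:

    import org.apache.spark.ml.Transformer
    import org.apache.spark.ml.param.ParamMap
    import org.apache.spark.ml.util.{DefaultParamsReadable, DefaultParamsWritable, Identifiable}
    import org.apache.spark.sql.{DataFrame, Dataset}
    import org.apache.spark.sql.types.StructType

    // A do-nothing Transformer that can be saved and loaded. Without the
    // DefaultParamsReadable companion below, the loader has no way to
    // reconstruct the instance and can fail at load time.
    class NoopTransformer(override val uid: String)
        extends Transformer with DefaultParamsWritable {

      def this() = this(Identifiable.randomUID("noop"))

      override def transform(dataset: Dataset[_]): DataFrame = dataset.toDF()
      override def transformSchema(schema: StructType): StructType = schema
      override def copy(extra: ParamMap): NoopTransformer = defaultCopy(extra)
    }

    object NoopTransformer extends DefaultParamsReadable[NoopTransformer]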