Great! It works. Thanks.
Best,
Yi



----- Original Message -----
From: Augustin Borsu <augus...@sagacify.com>
To: doovs...@sina.com
Cc: dev <dev@spark.apache.org>
Subject: Re: How to connect JDBC DB based on Spark Sql
Date: April 14, 2015, 14:14

Hello Yi,
You can actually pass the username and password in the URL, e.g.

val url = "jdbc:postgresql://ip.ip.ip.ip/ow-feeder?user=MY_LOGIN&password=MY_PASSWORD"
val query = "(SELECT * FROM \"YadaYada\" WHERE type='item' LIMIT 100) as MY_DB"
val jdbcDF = sqlContext.load("jdbc", Map("url" -> url, "dbtable" -> query))
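If you prefer to keep the credentials out of the URL, here is a minimal sketch assuming Spark 1.4+, where sqlContext.read.jdbc accepts a java.util.Properties; the host, database, table name, and credentials below are placeholders:

import java.util.Properties

// Hypothetical connection details -- substitute your own.
val url = "jdbc:postgresql://192.168.1.110:5432/demo"
val props = new Properties()
props.setProperty("user", "MY_LOGIN")
props.setProperty("password", "MY_PASSWORD")
// Depending on the Spark version, the driver class may also need to be
// supplied here (or loaded up front) so it can register with JDBC.
props.setProperty("driver", "org.postgresql.Driver")

// read.jdbc hands the Properties to the JDBC driver directly, so the
// credentials never appear inside the URL string.
val jdbcDF = sqlContext.read.jdbc(url, "schema.tab_users", props)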
On Tue, Apr 14, 2015 at 7:48 AM, <doovs...@sina.com> wrote:
> Hi all,
> According to the official document, Spark SQL can load a database table
> into a DataFrame using the Data Sources API. However, it only supports the
> following properties:
>
> - url: The JDBC URL to connect to.
> - dbtable: The JDBC table that should be read. Note that anything that is
>   valid in a `FROM` clause of a SQL query can be used. For example, instead
>   of a full table you could also use a subquery in parentheses.
> - driver: The class name of the JDBC driver needed to connect to this URL.
>   This class will be loaded on the master and workers before running any
>   JDBC commands to allow the driver to register itself with the JDBC
>   subsystem.
> - partitionColumn, lowerBound, upperBound, numPartitions: These options
>   must all be specified if any of them is specified. They describe how to
>   partition the table when reading in parallel from multiple workers.
>   partitionColumn must be a numeric column from the table in question.
>
> This leaves me confused: how do I pass the username, password, or other
> connection info? BTW, I am connecting to PostgreSQL like this:
>
>     val dataFrame = sqlContext.load("jdbc", Map(
>       "url" -> "jdbc:postgresql://192.168.1.110:5432/demo",  // how to pass username and password?
>       "driver" -> "org.postgresql.Driver",
>       "dbtable" -> "schema.tab_users"
>     ))
> Thanks.
> Regards,
> Yi
