Thanks for following up.  I'll fix the docs.
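
For reference, here is a minimal, self-contained sketch of what the corrected
example could look like. It assumes the Spark 1.3-era Python API
(SQLContext.load) and a hypothetical local Postgres database; the URL,
credentials, and table name below are placeholders, and the PostgreSQL JDBC
driver jar still needs to be on the driver classpath.

from pyspark import SparkContext
from pyspark.sql import SQLContext

sc = SparkContext(appName="jdbc-load-example")
sqlContext = SQLContext(sc)

# Name the data source explicitly with the source= keyword and pass the
# JDBC options (url, dbtable) as keyword arguments.
db = sqlContext.load(
    source="jdbc",
    url="jdbc:postgresql://localhost/mydb?user=me&password=secret",
    dbtable="mstr.d_customer")

db.printSchema()
db.registerTempTable("d_customer")
sqlContext.sql("SELECT count(*) FROM d_customer").show()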

On Wed, Mar 25, 2015 at 4:04 PM, elliott cordo <elliottco...@gmail.com>
wrote:

> Thanks! The below worked:
>
> db = sqlCtx.load(source="jdbc",
> url="jdbc:postgresql://localhost/x?user=x&password=x", dbtable="mstr.d_customer")
>
> Note that
> https://spark.apache.org/docs/latest/sql-programming-guide.html#dataframe-operations
> needs to be updated:
>
> [image: Inline image 1]
>
> On Wed, Mar 25, 2015 at 6:12 PM, Michael Armbrust <mich...@databricks.com>
> wrote:
>
>> Try:
>>
>> db = sqlContext.load(source="jdbc", url="jdbc:postgresql://localhost/xx",
>> dbtable="mstr.d_customer")
>>
>>
>> On Wed, Mar 25, 2015 at 2:19 PM, elliott cordo <elliottco...@gmail.com>
>> wrote:
>>
>>> If I run the following:
>>>
>>> db = sqlContext.load("jdbc", url="jdbc:postgresql://localhost/xx",
>>> dbtables="mstr.d_customer")
>>>
>>> I get the error:
>>>
>>> py4j.protocol.Py4JJavaError: An error occurred while calling o28.load.
>>>
>>> : java.io.FileNotFoundException: File file:/Users/elliottcordo/jdbc does
>>> not exist
>>>
>>> It seems to think I'm trying to load a file called "jdbc"? I'm setting the
>>> Postgres driver in SPARK_CLASSPATH as well (that doesn't seem to be the
>>> problem).
>>>
>>
>>
>
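
For anyone who hits the same FileNotFoundException: in the 1.3-era Python API
the first positional parameter of SQLContext.load is a path, so passing "jdbc"
positionally makes Spark bind it as a file path and hand it to the default
data source. Naming the source with the source= keyword avoids that. A minimal
contrast, with placeholder connection details:

# Fails: "jdbc" is bound to the path parameter and treated as a local file
db = sqlContext.load("jdbc", url="jdbc:postgresql://localhost/xx",
                     dbtable="mstr.d_customer")

# Works: the data source is named explicitly
db = sqlContext.load(source="jdbc", url="jdbc:postgresql://localhost/xx",
                     dbtable="mstr.d_customer")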
