Github user HyukjinKwon commented on the issue:

    https://github.com/apache/spark/pull/15231
  
    @felixcheung, I usually prefer not to reply quote-by-quote, but let me do
so here just to clarify.
    
    > Hmm, should we hold till 12601 is merged then? Seems like we shouldn't 
allow this unless internal datasources are supporting this more broadly.
    
    Since omitting `path` is allowed by the datasource interface, maybe it is
enough to test that the call goes through the JVM correctly. Also, I am not
sure I can easily add a test for the JDBC datasource within SparkR; if that
can be done easily, I am happy to hold this as well.
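    
    Just to illustrate what I mean by omitting `path`, a rough sketch of how a
JDBC write could look with this change applied (the JDBC URL, table, and
credentials below are placeholders, not from an actual test):
    
    ```r
    # Hypothetical sketch: assumes `path` can be omitted as in this PR and
    # that the JDBC datasource accepts writes through write.df.
    # All option values below are placeholders.
    write.df(df,
             source = "jdbc",
             url = "jdbc:postgresql://localhost:5432/test",
             dbtable = "people",
             user = "username",
             password = "password",
             mode = "overwrite")
    ```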
    
    > Also, before the path parameter type is in the signature, ie.
    > 
    > ```
    > write.df(df, c(1, 2))
    > ```
    > 
    > Would error with some descriptive error, with this change it would get 
some JVM exception which seems to degrade the experience a bit.
    
    Yes, I can add type checks for these cases.
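    
    Something along these lines is what I have in mind (a rough sketch with a
made-up helper name, not the actual patch):
    
    ```r
    # Rough sketch: validate `path` up front so that write.df(df, c(1, 2))
    # fails with a readable R error instead of a JVM exception.
    checkPathArg <- function(path) {
      if (!is.null(path) && !(is.character(path) && length(path) == 1)) {
        stop("path should be a character string or NULL, not ",
             paste(class(path), collapse = "/"))
      }
      invisible(path)
    }
    ```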
    
    > Similarly for the path not specified case 
java.lang.IllegalArgumentException - we generally try to avoid JVM exception 
showing up if possible.
    
    Yes, agreed. Maybe we could catch the exception and avoid exposing the raw
JVM message, formatting it nicely within R just as PySpark does [1] (although
I am not sure how well that approach fits R).
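    
    Roughly something like the following, similar in spirit to PySpark's
`utils.py` (the helper name here is made up, not SparkR's actual API):
    
    ```r
    # Rough sketch: catch the error raised by the JVM call and re-raise only
    # its first line (e.g. "java.lang.IllegalArgumentException: ..."), hiding
    # the long Java stack trace from R users.
    captureJVMException <- function(expr) {
      tryCatch(expr, error = function(e) {
        msg <- conditionMessage(e)
        firstLine <- strsplit(msg, "\n", fixed = TRUE)[[1]][1]
        stop(firstLine, call. = FALSE)
      })
    }
    ```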
    
    > Could you add checks to path for these cases and give more descriptive 
messages?
    
    Sure, I will try to address these points.
    
    
[1] https://github.com/apache/spark/blob/9a5071996b968148f6b9aba12e0d3fe888d9acd8/python/pyspark/sql/utils.py#L64-L80


