[GitHub] spark issue #15274: [SPARK-17699] Support for parsing JSON string columns

2016-10-18 Thread DanielMe
Github user DanielMe commented on the issue:

https://github.com/apache/spark/pull/15274
  
@yhuai thanks! My impression was that `get_json_object` does not convert JSON 
arrays to `ArrayType`s, though I may have misunderstood how it is supposed to 
be used.
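
To make the distinction concrete, here is a minimal sketch of what I mean, 
written against the function this PR exposes (`from_json` in 
`org.apache.spark.sql.functions`); the column and field names below are made up:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{from_json, get_json_object}
import org.apache.spark.sql.types.{ArrayType, IntegerType, StructField, StructType}

object JsonArraySketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("json-array-sketch").master("local[*]").getOrCreate()
    import spark.implicits._

    val df = Seq("""{"ids": [1, 2, 3]}""").toDF("json")

    // get_json_object can extract the array, but the result is a plain string column.
    df.select(get_json_object($"json", "$.ids").as("ids")).printSchema()
    // ids: string

    // Parsing against an explicit schema turns the array into a real ArrayType.
    val schema = StructType(Seq(StructField("ids", ArrayType(IntegerType))))
    df.select(from_json($"json", schema).getField("ids").as("ids")).printSchema()
    // ids: array<int>

    spark.stop()
  }
}
```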


[GitHub] spark issue #15274: [SPARK-17699] Support for parsing JSON string columns

2016-10-17 Thread DanielMe
Github user DanielMe commented on the issue:

https://github.com/apache/spark/pull/15274
  
Is there any workaround I can use to achieve a similar effect in 1.6?
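
The only thing I can think of so far is to feed the column back through the 
1.6 JSON reader, roughly like this (a sketch with made-up names; it re-infers 
the schema from the data rather than taking one explicitly):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object Json16Workaround {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("json-1.6-workaround").setMaster("local[*]"))
    val sqlContext = new SQLContext(sc)

    // A DataFrame with a string column holding JSON documents (made-up data).
    val df = sqlContext.createDataFrame(Seq(
      (1, """{"tags": ["a", "b"]}"""),
      (2, """{"tags": ["c"]}""")
    )).toDF("id", "json")

    // Route the raw strings back through the JSON reader; the inferred schema
    // includes array fields as proper ArrayTypes.
    val parsed = sqlContext.read.json(df.select("json").map(_.getString(0)))
    parsed.printSchema()

    sc.stop()
  }
}
```

The obvious drawback is that the parsed result is no longer tied to the other 
columns of the original rows, so it would have to be joined back somehow.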


[GitHub] spark issue #12855: [SPARK-10216][SQL] Avoid creating empty files during ove...

2016-06-21 Thread DanielMe
Github user DanielMe commented on the issue:

https://github.com/apache/spark/pull/12855
  
I can reproduce the issue that @jurriaan reports on both 1.6.0 and 1.5.2; it 
does not occur on 1.3.1.

I have added a comment to the JIRA issue with more detailed instructions on 
how to reproduce it: https://issues.apache.org/jira/browse/SPARK-15393

Note that since the issue also appears on releases that predate this change, 
this PR may not be the cause.
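
For anyone skimming this thread, the general shape of the check is roughly the 
following; this is only a sketch of the pattern, not the exact steps posted on 
SPARK-15393, and the paths and names are made up:

```scala
import java.io.File
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object EmptyPartFileCheck {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("empty-part-file-check").setMaster("local[4]"))
    val sqlContext = new SQLContext(sc)
    import sqlContext.implicits._

    // Many partitions, but the filter leaves most of them empty.
    val df = sc.parallelize(1 to 100, 20).toDF("n").filter($"n" < 5)

    val out = "/tmp/empty-part-file-check"
    df.write.mode("overwrite").json(out)

    // Count the zero-byte part files the write left behind.
    val emptyParts = new File(out).listFiles()
      .filter(f => f.getName.startsWith("part-") && f.length == 0)
    println(s"empty part files: ${emptyParts.length}")

    sc.stop()
  }
}
```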


[GitHub] spark pull request: [SPARK-2097][SQL] UDF Support

2015-04-14 Thread DanielMe
Github user DanielMe commented on the pull request:

https://github.com/apache/spark/pull/1063#issuecomment-92678825
  
Okay, thanks for the clarification. Initially, I had naively assumed that the 
functionality you added was just a layer on top of the Hive API, so it was a 
bit confusing that `SHOW FUNCTIONS` did not list the UDFs. For my use case I 
can easily work around that limitation, so it's not a big deal.


[GitHub] spark pull request: [SPARK-2097][SQL] UDF Support

2015-04-13 Thread DanielMe
Github user DanielMe commented on the pull request:

https://github.com/apache/spark/pull/1063#issuecomment-92301932
  
Excuse the naive question, but it seems that this does not use the regular 
Hive UDF API, right? (As it would if I ran `hiveContext.sql("CREATE TEMPORARY 
FUNCTION [...]")`.) Is there any particular reason for that? I noticed that 
UDFs created using this mechanism won't show up in the `SHOW FUNCTIONS` list. 
Would it be difficult to achieve that?

Also, the Hive API allows adding description strings to a UDF (which obviously 
only makes sense if you can use `DESCRIBE FUNCTION`). It would be nice if 
something similar existed for UDFs defined through the Spark interface.
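
To make the contrast concrete, here is roughly what I mean by the two 
registration paths (a sketch against the 1.x-era API; the Hive UDF class name 
is a placeholder):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object UdfRegistrationSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("udf-registration").setMaster("local[*]"))
    val hiveContext = new HiveContext(sc)

    // The Spark-side registration: a plain Scala function registered with the
    // SQL layer; it is not listed by SHOW FUNCTIONS.
    hiveContext.udf.register("plus_one", (x: Int) => x + 1)

    // The Hive route: a class implementing Hive's UDF interface, registered via
    // HiveQL and visible to SHOW FUNCTIONS / DESCRIBE FUNCTION.
    // (com.example.PlusOneUDF is a placeholder and must exist on the classpath.)
    hiveContext.sql("CREATE TEMPORARY FUNCTION plus_one_hive AS 'com.example.PlusOneUDF'")

    hiveContext.sql("SHOW FUNCTIONS").show(200)

    sc.stop()
  }
}
```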

