[ 
https://issues.apache.org/jira/browse/AIRFLOW-7025?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17055555#comment-17055555
 ] 

ASF GitHub Bot commented on AIRFLOW-7025:
-----------------------------------------

sekikn commented on pull request #7677: [AIRFLOW-7025] Fix 
SparkSqlHook.run_query to handle its parameter properly
URL: https://github.com/apache/airflow/pull/7677
 
 
   ---
   Issue link: WILL BE INSERTED BY 
[boring-cyborg](https://github.com/kaxil/boring-cyborg)
   
   Make sure to mark the boxes below before creating PR: [x]
   
   - [x] Description above provides context of the change
   - [x] Commit message/PR title starts with `[AIRFLOW-NNNN]`. AIRFLOW-NNNN = 
JIRA ID<sup>*</sup>
   - [x] Unit tests coverage for changes (not needed for documentation changes)
   - [x] Commits follow "[How to write a good git commit 
message](http://chris.beams.io/posts/git-commit/)"
   - [x] Relevant documentation is updated including usage instructions.
   - [x] I will engage committers as explained in [Contribution Workflow 
Example](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#contribution-workflow-example).
   
   <sup>*</sup> For document-only changes, the commit message can start with 
`[AIRFLOW-XXXX]`.
   
   ---
   In case of a fundamental code change, an Airflow Improvement Proposal 
([AIP](https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Improvements+Proposals))
 is needed.
   In case of a new dependency, check compliance with the [ASF 3rd Party 
License Policy](https://www.apache.org/legal/resolved.html#category-x).
   In case of backwards incompatible changes please leave a note in 
[UPDATING.md](https://github.com/apache/airflow/blob/master/UPDATING.md).
   Read the [Pull Request 
Guidelines](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst#pull-request-guidelines)
 for more information.
   
 
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Fix SparkSqlHook.run_query to handle its parameter properly
> -----------------------------------------------------------
>
>                 Key: AIRFLOW-7025
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-7025
>             Project: Apache Airflow
>          Issue Type: Bug
>          Components: hooks
>    Affects Versions: 1.10.9
>            Reporter: Kengo Seki
>            Assignee: Kengo Seki
>            Priority: Major
>
> {{SparkSqlHook.run_query()}} has a parameter called {{cmd}}, whose type is 
> documented as string, but it doesn't work as expected.
> {code}
> In [1]: from airflow.providers.apache.spark.hooks.spark_sql import SparkSqlHook
>
> In [2]: SparkSqlHook(sql="SELECT 1", master="local[*]", conn_id="spark_default").run_query(cmd="--help")
> (snip)
> [2020-03-09 23:37:35,088] {spark_sql.py:149} INFO - b'  childArgs               [-e SELECT 1 - - h e l p]\n'
> {code}
> The passed argument "--help" is split into single characters. This is 
> because the string {{cmd}} is concatenated to the {{connection_cmd}} list, 
> which appends each character as a separate element, as follows:
> {code:title=airflow/providers/apache/spark/hooks/spark_sql.py}
>  90     def _prepare_command(self, cmd):
>  91         """
>  92         Construct the spark-sql command to execute. Verbose output is enabled
>  93         as default.
>  94 
>  95         :param cmd: command to append to the spark-sql command
>  96         :type cmd: str
>  97         :return: full command to be executed
>  98         """
>  99         connection_cmd = ["spark-sql"]
> (snip)
> 130         connection_cmd += cmd
> {code}
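
The faulty pattern and one possible remedy can be sketched in isolation. The helper below is a hypothetical standalone stand-in for `_prepare_command`, not the actual PR diff; it assumes the fix is to tokenize a string `cmd` (here with `shlex.split`) before extending the list, while still accepting a list of arguments as-is:

```python
import shlex

def prepare_command(cmd):
    """Hypothetical sketch of the fix: split a string ``cmd`` into whole
    tokens before extending the argument list."""
    connection_cmd = ["spark-sql", "-e", "SELECT 1"]
    if isinstance(cmd, str):
        # list += str iterates the string character by character;
        # shlex.split yields whole shell-style arguments instead.
        connection_cmd += shlex.split(cmd)
    elif cmd:
        connection_cmd += cmd  # already a list of arguments
    return connection_cmd

# The buggy pattern for comparison: extending a list with a raw string
buggy = ["spark-sql", "-e", "SELECT 1"]
buggy += "--help"
print(buggy)                      # single characters, matching the log above
print(prepare_command("--help"))  # ['spark-sql', '-e', 'SELECT 1', '--help']
```

Requiring a list (and raising `TypeError` on a string) would be an equally valid design; the sketch above merely shows why the concatenation must operate on tokens, not characters.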



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
