[jira] [Created] (SPARK-36287) create TimestampNTZType in pyspark

2021-07-26 Thread Dominik Gehl (Jira)
Dominik Gehl created SPARK-36287:


 Summary: create TimestampNTZType in pyspark
 Key: SPARK-36287
 URL: https://issues.apache.org/jira/browse/SPARK-36287
 Project: Spark
  Issue Type: New Feature
  Components: PySpark
Affects Versions: 3.1.2
Reporter: Dominik Gehl


create TimestampNTZType in pyspark
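For context on what an "NTZ" timestamp means: TimestampNTZType models a timestamp *without* a time zone, whose closest pure-Python analogue is a "naive" datetime. A minimal sketch of the distinction (plain Python, no pyspark required):

```python
from datetime import datetime, timezone

# A "naive" datetime carries no time zone (tzinfo is None) -- this is
# the semantics TimestampNTZType is meant to expose in pyspark.
naive = datetime(2021, 7, 26, 12, 0, 0)
# An "aware" datetime carries an offset -- the TimestampType analogue.
aware = datetime(2021, 7, 26, 12, 0, 0, tzinfo=timezone.utc)

print(naive.tzinfo)  # None
print(aware.tzinfo)  # UTC
```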



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Created] (SPARK-36259) Expose localtimestamp in pyspark.sql.functions

2021-07-22 Thread Dominik Gehl (Jira)
Dominik Gehl created SPARK-36259:


 Summary: Expose localtimestamp in pyspark.sql.functions
 Key: SPARK-36259
 URL: https://issues.apache.org/jira/browse/SPARK-36259
 Project: Spark
  Issue Type: Improvement
  Components: PySpark
Affects Versions: 3.1.2
Reporter: Dominik Gehl


localtimestamp is available in the scala sql functions, but currently not in 
pyspark
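As a sketch of the intended semantics: `localtimestamp` returns the current wall-clock time in the session's local time zone, with no zone information attached. In plain Python that corresponds to a naive `datetime.now()` (the function name below is illustrative, not the pyspark API):

```python
from datetime import datetime

def local_timestamp():
    # datetime.now() without a tz argument is already "local and naive":
    # current local time, tzinfo is None
    return datetime.now()

ts = local_timestamp()
print(ts.tzinfo)  # None -- no time zone carried
```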






[jira] [Created] (SPARK-36258) Export functionExists in pyspark catalog

2021-07-22 Thread Dominik Gehl (Jira)
Dominik Gehl created SPARK-36258:


 Summary: Export functionExists in pyspark catalog
 Key: SPARK-36258
 URL: https://issues.apache.org/jira/browse/SPARK-36258
 Project: Spark
  Issue Type: Improvement
  Components: PySpark
Affects Versions: 3.1.2
Reporter: Dominik Gehl


functionExists is available in the scala catalog, but currently isn't exposed in 
pyspark






[jira] [Updated] (SPARK-36243) pyspark catalog.tableExists doesn't work for temporary views

2021-07-21 Thread Dominik Gehl (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-36243?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dominik Gehl updated SPARK-36243:
-
Component/s: (was: Java API)
 PySpark
Description: 
Documentation in Catalog.scala for tableExists specifies

   * Check if the table or view with the specified name exists. This can either 
be a temporary
   * view or a table/view.

The pyspark version doesn't work correctly for temporary views

  was:
Documentation in Catalog.scala for tableExists specifies

   * Check if the table or view with the specified name exists. This can either 
be a temporary
   * view or a table/view.

temporary views don't seem to work

Summary: pyspark catalog.tableExists doesn't work for temporary views  
(was: scala catalog.tableExists doesn't work for temporary views)

> pyspark catalog.tableExists doesn't work for temporary views
> 
>
> Key: SPARK-36243
> URL: https://issues.apache.org/jira/browse/SPARK-36243
> Project: Spark
>  Issue Type: Improvement
>  Components: PySpark
>Affects Versions: 3.1.2
>Reporter: Dominik Gehl
>Priority: Major
>
> Documentation in Catalog.scala for tableExists specifies
>* Check if the table or view with the specified name exists. This can 
> either be a temporary
>* view or a table/view.
> The pyspark version doesn't work correctly for temporary views






[jira] [Created] (SPARK-36243) scala catalog.tableExists doesn't work for temporary views

2021-07-21 Thread Dominik Gehl (Jira)
Dominik Gehl created SPARK-36243:


 Summary: scala catalog.tableExists doesn't work for temporary views
 Key: SPARK-36243
 URL: https://issues.apache.org/jira/browse/SPARK-36243
 Project: Spark
  Issue Type: Improvement
  Components: Java API
Affects Versions: 3.1.2
Reporter: Dominik Gehl


Documentation in Catalog.scala for tableExists specifies

   * Check if the table or view with the specified name exists. This can either 
be a temporary
   * view or a table/view.

temporary views don't seem to work






[jira] [Created] (SPARK-36226) improve python docstring links to other pyspark classes

2021-07-20 Thread Dominik Gehl (Jira)
Dominik Gehl created SPARK-36226:


 Summary: improve python docstring links to other pyspark classes
 Key: SPARK-36226
 URL: https://issues.apache.org/jira/browse/SPARK-36226
 Project: Spark
  Issue Type: Improvement
  Components: PySpark
Affects Versions: 3.1.2
Reporter: Dominik Gehl


improve python docstring links to other pyspark classes






[jira] [Created] (SPARK-36225) python docstring referencing non existing Dataset class

2021-07-20 Thread Dominik Gehl (Jira)
Dominik Gehl created SPARK-36225:


 Summary: python docstring referencing non existing Dataset class
 Key: SPARK-36225
 URL: https://issues.apache.org/jira/browse/SPARK-36225
 Project: Spark
  Issue Type: Bug
  Components: PySpark
Affects Versions: 3.1.2
Reporter: Dominik Gehl


Some python docstrings contain {{:class:`Dataset`}} although there is no 
pyspark Dataset class






[jira] [Updated] (SPARK-36209) https://spark.apache.org/docs/latest/sql-programming-guide.html contains invalid link to Python doc

2021-07-19 Thread Dominik Gehl (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-36209?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dominik Gehl updated SPARK-36209:
-
Description: 
On https://spark.apache.org/docs/latest/sql-programming-guide.html , the link 
to the python doc points to 
https://spark.apache.org/docs/latest/api/python/pyspark.sql.html#pyspark.sql.DataFrame
 which returns a "Not found"


> https://spark.apache.org/docs/latest/sql-programming-guide.html contains 
> invalid link to Python doc
> ---
>
> Key: SPARK-36209
> URL: https://issues.apache.org/jira/browse/SPARK-36209
> Project: Spark
>  Issue Type: Bug
>  Components: Documentation
>Affects Versions: 3.1.2
> Environment: On 
> https://spark.apache.org/docs/latest/sql-programming-guide.html, the link to 
> the python doc points to 
> https://spark.apache.org/docs/latest/api/python/pyspark.sql.html#pyspark.sql.DataFrame
>  which returns a "Not found"
>Reporter: Dominik Gehl
>Priority: Major
>
> On https://spark.apache.org/docs/latest/sql-programming-guide.html , the link 
> to the python doc points to 
> https://spark.apache.org/docs/latest/api/python/pyspark.sql.html#pyspark.sql.DataFrame
>  which returns a "Not found"






[jira] [Created] (SPARK-36209) https://spark.apache.org/docs/latest/sql-programming-guide.html contains invalid link to Python doc

2021-07-19 Thread Dominik Gehl (Jira)
Dominik Gehl created SPARK-36209:


 Summary: 
https://spark.apache.org/docs/latest/sql-programming-guide.html contains 
invalid link to Python doc
 Key: SPARK-36209
 URL: https://issues.apache.org/jira/browse/SPARK-36209
 Project: Spark
  Issue Type: Bug
  Components: Documentation
Affects Versions: 3.1.2
 Environment: On 
https://spark.apache.org/docs/latest/sql-programming-guide.html, the link to 
the python doc points to 
https://spark.apache.org/docs/latest/api/python/pyspark.sql.html#pyspark.sql.DataFrame
 which returns a "Not found"
Reporter: Dominik Gehl









[jira] [Created] (SPARK-36207) Export databaseExists in pyspark.sql.catalog

2021-07-19 Thread Dominik Gehl (Jira)
Dominik Gehl created SPARK-36207:


 Summary: Export databaseExists in pyspark.sql.catalog
 Key: SPARK-36207
 URL: https://issues.apache.org/jira/browse/SPARK-36207
 Project: Spark
  Issue Type: Improvement
  Components: PySpark
Affects Versions: 3.1.2
Reporter: Dominik Gehl


expose databaseExists, which is part of the scala implementation, in pyspark






[jira] [Created] (SPARK-36181) Update pyspark sql readwriter documentation to Scala level

2021-07-16 Thread Dominik Gehl (Jira)
Dominik Gehl created SPARK-36181:


 Summary: Update pyspark sql readwriter documentation to Scala level
 Key: SPARK-36181
 URL: https://issues.apache.org/jira/browse/SPARK-36181
 Project: Spark
  Issue Type: Improvement
  Components: Documentation, PySpark
Affects Versions: 3.1.2
Reporter: Dominik Gehl


Update pyspark sql readwriter documentation to the level of detail the Scala 
documentation provides






[jira] [Updated] (SPARK-36178) Document PySpark Catalog APIs in docs/source/reference/pyspark.sql.rst

2021-07-16 Thread Dominik Gehl (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-36178?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dominik Gehl updated SPARK-36178:
-
Summary: Document PySpark Catalog APIs in 
docs/source/reference/pyspark.sql.rst  (was: document PySpark Catalog APIs in 
docs/source/reference/pyspark.sql.rst)

> Document PySpark Catalog APIs in docs/source/reference/pyspark.sql.rst
> --
>
> Key: SPARK-36178
> URL: https://issues.apache.org/jira/browse/SPARK-36178
> Project: Spark
>  Issue Type: Improvement
>  Components: Documentation, PySpark
>Affects Versions: 3.1.2
>Reporter: Dominik Gehl
>Priority: Minor
>
> PySpark Catalog API currently isn't documented in 
> docs/source/reference/pyspark.sql.rst






[jira] [Created] (SPARK-36178) document PySpark Catalog APIs in docs/source/reference/pyspark.sql.rst

2021-07-16 Thread Dominik Gehl (Jira)
Dominik Gehl created SPARK-36178:


 Summary: document PySpark Catalog APIs in 
docs/source/reference/pyspark.sql.rst
 Key: SPARK-36178
 URL: https://issues.apache.org/jira/browse/SPARK-36178
 Project: Spark
  Issue Type: Improvement
  Components: Documentation, PySpark
Affects Versions: 3.1.2
Reporter: Dominik Gehl


PySpark Catalog API currently isn't documented in 
docs/source/reference/pyspark.sql.rst






[jira] [Updated] (SPARK-36176) expose tableExists in pyspark.sql.catalog

2021-07-16 Thread Dominik Gehl (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-36176?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dominik Gehl updated SPARK-36176:
-
Summary: expose tableExists in pyspark.sql.catalog  (was: expost 
tableExists in pyspark.sql.catalog)

> expose tableExists in pyspark.sql.catalog
> -
>
> Key: SPARK-36176
> URL: https://issues.apache.org/jira/browse/SPARK-36176
> Project: Spark
>  Issue Type: Improvement
>  Components: PySpark
>Affects Versions: 3.1.2
>Reporter: Dominik Gehl
>Priority: Minor
>
> expose tableExists, which is part of the scala implementation, in pyspark






[jira] [Created] (SPARK-36176) expost tableExists in pyspark.sql.catalog

2021-07-16 Thread Dominik Gehl (Jira)
Dominik Gehl created SPARK-36176:


 Summary: expost tableExists in pyspark.sql.catalog
 Key: SPARK-36176
 URL: https://issues.apache.org/jira/browse/SPARK-36176
 Project: Spark
  Issue Type: Improvement
  Components: PySpark
Affects Versions: 3.1.2
Reporter: Dominik Gehl


expose tableExists, which is part of the scala implementation, in pyspark






[jira] [Updated] (SPARK-36160) pyspark sql/column documentation doesn't always match scala documentation

2021-07-16 Thread Dominik Gehl (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-36160?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dominik Gehl updated SPARK-36160:
-
Description: The pyspark sql/column documentation for methods between, 
getField, dropFields and cast could be adapted to follow more closely the 
corresponding Scala one.  (was: The pyspark sql/column documentation for 
between could be simplified)
Summary: pyspark sql/column documentation doesn't always match scala 
documentation  (was: pyspark sql/column documentation for between is 
complicated)

> pyspark sql/column documentation doesn't always match scala documentation
> -
>
> Key: SPARK-36160
> URL: https://issues.apache.org/jira/browse/SPARK-36160
> Project: Spark
>  Issue Type: Improvement
>  Components: Documentation, PySpark
>Affects Versions: 3.1.2
>Reporter: Dominik Gehl
>Priority: Trivial
>
> The pyspark sql/column documentation for methods between, getField, 
> dropFields and cast could be adapted to follow more closely the corresponding 
> Scala one.






[jira] [Created] (SPARK-36160) pyspark sql/column documentation for between is complicated

2021-07-15 Thread Dominik Gehl (Jira)
Dominik Gehl created SPARK-36160:


 Summary: pyspark sql/column documentation for between is 
complicated
 Key: SPARK-36160
 URL: https://issues.apache.org/jira/browse/SPARK-36160
 Project: Spark
  Issue Type: Improvement
  Components: Documentation, PySpark
Affects Versions: 3.1.2
Reporter: Dominik Gehl


The pyspark sql/column documentation for between could be simplified
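For reference, `Column.between(lower, upper)` is documented on the Scala side as equivalent to `(col >= lower) & (col <= upper)`, inclusive on both bounds. A plain-Python sketch of that equivalence:

```python
# between(lower, upper) == (value >= lower) and (value <= upper),
# with both bounds inclusive
def between(value, lower, upper):
    return lower <= value <= upper

print(between(5, 1, 5))  # True: the upper bound is inclusive
print(between(0, 1, 5))  # False: below the lower bound
```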






[jira] [Created] (SPARK-36158) pyspark sql/functions documentation for months_between isn't as precise as scala version

2021-07-15 Thread Dominik Gehl (Jira)
Dominik Gehl created SPARK-36158:


 Summary: pyspark sql/functions documentation for months_between 
isn't as precise as scala version
 Key: SPARK-36158
 URL: https://issues.apache.org/jira/browse/SPARK-36158
 Project: Spark
  Issue Type: Improvement
  Components: Documentation, PySpark
Affects Versions: 3.1.2
Reporter: Dominik Gehl


pyspark months_between documentation doesn't mention that months are assumed 
with 31 days in the calculation.
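A simplified, date-only sketch of the documented semantics (the Scala docs specify that when the two dates fall on the same day of month, or both on the last day of their months, the result is a whole number; otherwise the day difference is prorated over an assumed 31-day month):

```python
from datetime import date
import calendar

def months_between(d1, d2):
    """Date-only sketch of Spark's months_between; ignores time-of-day."""
    months = (d1.year - d2.year) * 12 + (d1.month - d2.month)
    same_day = d1.day == d2.day
    both_last = (d1.day == calendar.monthrange(d1.year, d1.month)[1]
                 and d2.day == calendar.monthrange(d2.year, d2.month)[1])
    if same_day or both_last:
        return float(months)
    # the 31-day assumption the pyspark docs fail to mention:
    return months + (d1.day - d2.day) / 31.0

print(months_between(date(2021, 3, 15), date(2021, 1, 15)))  # 2.0
print(months_between(date(2021, 3, 20), date(2021, 1, 15)))  # 2 + 5/31
```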






[jira] [Created] (SPARK-36154) pyspark documentation doesn't mention week and quarter as valid format arguments to trunc

2021-07-15 Thread Dominik Gehl (Jira)
Dominik Gehl created SPARK-36154:


 Summary: pyspark documentation doesn't mention week and quarter as 
valid format arguments to trunc
 Key: SPARK-36154
 URL: https://issues.apache.org/jira/browse/SPARK-36154
 Project: Spark
  Issue Type: Improvement
  Components: Documentation, PySpark
Affects Versions: 3.1.2
Reporter: Dominik Gehl


pyspark documentation for {{trunc}} in sql/functions doesn't mention that 
{{week}} and {{quarter}} are valid format specifiers
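A plain-Python sketch of what those two formats do (per the Scala docs, {{week}} truncates to the Monday of the current week and {{quarter}} to the first day of the current quarter):

```python
from datetime import date, timedelta

def trunc(d, fmt):
    """Sketch of the two trunc() formats missing from the pyspark docs."""
    if fmt == "week":
        # back up to Monday of the current week (weekday(): Monday == 0)
        return d - timedelta(days=d.weekday())
    if fmt == "quarter":
        # first day of the quarter containing d
        first_month = 3 * ((d.month - 1) // 3) + 1
        return date(d.year, first_month, 1)
    raise ValueError(f"unsupported format: {fmt}")

print(trunc(date(2021, 7, 15), "week"))     # 2021-07-12 (a Monday)
print(trunc(date(2021, 7, 15), "quarter"))  # 2021-07-01
```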






[jira] [Commented] (SPARK-36149) dayofweek documentation for python and R

2021-07-15 Thread Dominik Gehl (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-36149?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17381101#comment-17381101
 ] 

Dominik Gehl commented on SPARK-36149:
--

Same is true for weekofyear

> dayofweek documentation for python and R
> 
>
> Key: SPARK-36149
> URL: https://issues.apache.org/jira/browse/SPARK-36149
> Project: Spark
>  Issue Type: Improvement
>  Components: Documentation, PySpark, R
>Affects Versions: 2.3.0, 2.3.1, 2.3.2, 2.3.3, 2.3.4, 2.4.0, 2.4.1, 2.4.2, 
> 2.4.3, 2.4.4, 2.4.5, 2.4.6, 2.4.7, 2.4.8, 3.0.0, 3.0.1, 3.0.2, 3.0.3, 3.1.0, 
> 3.1.1, 3.1.2
>Reporter: Dominik Gehl
>Priority: Major
>
> Python and R documentation for {{dayofweek}} doesn't mention which integer 
> corresponds to which day. The information is only available in 
> {{sql/core/src/main/scala/org/apache/spark/sql/functions.scala}}






[jira] [Updated] (SPARK-36149) dayofweek documentation for python and R

2021-07-15 Thread Dominik Gehl (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-36149?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Dominik Gehl updated SPARK-36149:
-
Description: Python and R documentation for {{dayofweek}} doesn't mention 
which integer corresponds to which day. The information is only available in 
{{sql/core/src/main/scala/org/apache/spark/sql/functions.scala}}  (was: Python 
and R documentation for `dayofweek` doesn't mention which integer corresponds 
to which day. The information is only available in 
sql/core/src/main/scala/org/apache/spark/sql/functions.scala)

> dayofweek documentation for python and R
> 
>
> Key: SPARK-36149
> URL: https://issues.apache.org/jira/browse/SPARK-36149
> Project: Spark
>  Issue Type: Improvement
>  Components: Documentation, PySpark, R
>Affects Versions: 2.3.0, 2.3.1, 2.3.2, 2.3.3, 2.3.4, 2.4.0, 2.4.1, 2.4.2, 
> 2.4.3, 2.4.4, 2.4.5, 2.4.6, 2.4.7, 2.4.8, 3.0.0, 3.0.1, 3.0.2, 3.0.3, 3.1.0, 
> 3.1.1, 3.1.2
>Reporter: Dominik Gehl
>Priority: Major
>
> Python and R documentation for {{dayofweek}} doesn't mention which integer 
> corresponds to which day. The information is only available in 
> {{sql/core/src/main/scala/org/apache/spark/sql/functions.scala}}






[jira] [Created] (SPARK-36149) dayofweek documentation for python and R

2021-07-15 Thread Dominik Gehl (Jira)
Dominik Gehl created SPARK-36149:


 Summary: dayofweek documentation for python and R
 Key: SPARK-36149
 URL: https://issues.apache.org/jira/browse/SPARK-36149
 Project: Spark
  Issue Type: Improvement
  Components: Documentation, PySpark, R
Affects Versions: 3.1.2, 3.1.1, 3.1.0, 3.0.3, 3.0.2, 3.0.1, 3.0.0, 2.4.8, 
2.4.7, 2.4.6, 2.4.5, 2.4.4, 2.4.3, 2.4.2, 2.4.1, 2.4.0, 2.3.4, 2.3.3, 2.3.2, 
2.3.1, 2.3.0
Reporter: Dominik Gehl


Python and R documentation for `dayofweek` doesn't mention which integer 
corresponds to which day. The information is only available in 
sql/core/src/main/scala/org/apache/spark/sql/functions.scala
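The mapping documented in functions.scala is 1 = Sunday through 7 = Saturday. A pure-Python sketch of that convention (Python's `weekday()` uses Monday = 0, so it needs shifting):

```python
from datetime import date

def spark_dayofweek(d):
    """1 = Sunday ... 7 = Saturday, per the Scala functions.scala docs."""
    # Python: Monday == 0 ... Sunday == 6; shift into Spark's convention
    return (d.weekday() + 1) % 7 + 1

print(spark_dayofweek(date(2021, 7, 18)))  # a Sunday -> 1
print(spark_dayofweek(date(2021, 7, 15)))  # a Thursday -> 5
```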


