[
https://issues.apache.org/jira/browse/SPARK-32306?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17199839#comment-17199839
]
Maxim Gekk commented on SPARK-32306:
The function returns an element of the input se
[
https://issues.apache.org/jira/browse/SPARK-27345?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17199618#comment-17199618
]
Maxim Gekk commented on SPARK-27345:
The function was exposed recently by https://gi
[
https://issues.apache.org/jira/browse/SPARK-32306?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17199615#comment-17199615
]
Maxim Gekk commented on SPARK-32306:
[~seanmalory] Which result do you expect? 8?
[
https://issues.apache.org/jira/browse/SPARK-30882?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17199602#comment-17199602
]
Maxim Gekk commented on SPARK-30882:
This PR [https://github.com/apache/spark/pull/2
[
https://issues.apache.org/jira/browse/SPARK-32908?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Maxim Gekk updated SPARK-32908:
---
Description:
Read input data from the attached CSV file:
{code:scala}
val df = spark.read.opti
[
https://issues.apache.org/jira/browse/SPARK-32908?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Maxim Gekk updated SPARK-32908:
---
Attachment: percentile_approx-input.csv
> percentile_approx() returns incorrect results
> --
[
https://issues.apache.org/jira/browse/SPARK-32908?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17197460#comment-17197460
]
Maxim Gekk commented on SPARK-32908:
I am preparing a fix for the issue.
> percenti
Maxim Gekk created SPARK-32908:
--
Summary: percentile_approx() returns incorrect results
Key: SPARK-32908
URL: https://issues.apache.org/jira/browse/SPARK-32908
Project: Spark
Issue Type: Bug
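The SPARK-32908 entries above concern Spark's `percentile_approx()` aggregate returning incorrect results. As a hedged illustration only (plain Python, not Spark's implementation), the following hypothetical helper computes the exact nearest-rank percentile that `percentile_approx(col, p, accuracy)` is meant to approximate:

```python
import math

# Illustrative sketch: an exact nearest-rank percentile, showing the
# reference answer that percentile_approx() approximates. This is a
# hypothetical helper, not Spark code.
def exact_percentile(values, p):
    """Return the smallest element whose cumulative rank >= p (0 < p <= 1)."""
    assert 0 < p <= 1
    s = sorted(values)
    k = math.ceil(p * len(s)) - 1  # nearest-rank index
    return s[max(k, 0)]

print(exact_percentile(list(range(1, 11)), 0.5))  # 5
```

A correct approximate implementation should converge to this exact answer as the accuracy parameter grows.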
[
https://issues.apache.org/jira/browse/SPARK-32815?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17191976#comment-17191976
]
Maxim Gekk commented on SPARK-32815:
I am working on this.
> Fix LibSVM data source
Maxim Gekk created SPARK-32815:
--
Summary: Fix LibSVM data source loading error on file paths with
glob metacharacters
Key: SPARK-32815
URL: https://issues.apache.org/jira/browse/SPARK-32815
Project: Spar
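SPARK-32815 and SPARK-32810 both deal with file paths that contain glob metacharacters being mis-expanded during loading or schema inference. A minimal sketch with Python's stdlib `glob` module (used here only as an analogy, not Spark's code) shows why a literal path such as `data[2020].csv` must be escaped before being fed to a glob-expanding API:

```python
import glob

# A literal file name containing glob metacharacters is treated as a
# character-class pattern unless escaped; glob.escape wraps each
# metacharacter ('*', '?', '[') in its own character class.
path = "data[2020].csv"
escaped = glob.escape(path)
print(escaped)  # data[[]2020].csv
```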
[
https://issues.apache.org/jira/browse/SPARK-32810?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17191510#comment-17191510
]
Maxim Gekk commented on SPARK-32810:
I am working on this.
> CSV/JSON data sources
Maxim Gekk created SPARK-32810:
--
Summary: CSV/JSON data sources should avoid globbing paths when
inferring schema
Key: SPARK-32810
URL: https://issues.apache.org/jira/browse/SPARK-32810
Project: Spark
[
https://issues.apache.org/jira/browse/SPARK-32108?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Maxim Gekk resolved SPARK-32108.
Resolution: Not A Problem
> Silent mode of spark-sql is broken
> -
[
https://issues.apache.org/jira/browse/SPARK-32637?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17188384#comment-17188384
]
Maxim Gekk commented on SPARK-32637:
Spark's TIMESTAMP type has microsecond precisio
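The SPARK-32637 comment notes that Spark's TIMESTAMP type has microsecond precision. As a hedged sketch (simple integer truncation, not Spark's internal conversion), storing a nanosecond value in a microsecond-precision type drops the sub-microsecond digits:

```python
# Sketch of the precision loss implied by a microsecond-precision
# TIMESTAMP type: floor division discards sub-microsecond digits.
NANOS_PER_MICRO = 1_000

def to_micros(nanos: int) -> int:
    return nanos // NANOS_PER_MICRO

print(to_micros(1_234_567_891))  # 1234567 (the trailing 891 ns are lost)
```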
Maxim Gekk created SPARK-32599:
--
Summary: Check the TEXTFILE file format in HiveSerDeReadWriteSuite
Key: SPARK-32599
URL: https://issues.apache.org/jira/browse/SPARK-32599
Project: Spark
Issue T
[
https://issues.apache.org/jira/browse/SPARK-32594?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17175820#comment-17175820
]
Maxim Gekk commented on SPARK-32594:
I am working on a fix
> Insert wrong dates to
Maxim Gekk created SPARK-32594:
--
Summary: Insert wrong dates to Hive tables
Key: SPARK-32594
URL: https://issues.apache.org/jira/browse/SPARK-32594
Project: Spark
Issue Type: Bug
Compo
Maxim Gekk created SPARK-32546:
--
Summary: SHOW VIEWS fails with MetaException ...
ClassNotFoundException
Key: SPARK-32546
URL: https://issues.apache.org/jira/browse/SPARK-32546
Project: Spark
I
Maxim Gekk created SPARK-32513:
--
Summary: Rename classes/files with the Jdbc prefix to JDBC
Key: SPARK-32513
URL: https://issues.apache.org/jira/browse/SPARK-32513
Project: Spark
Issue Type: Imp
[
https://issues.apache.org/jira/browse/SPARK-32405?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17169050#comment-17169050
]
Maxim Gekk commented on SPARK-32405:
properties passed to createTable are ignored, s
Maxim Gekk created SPARK-32510:
--
Summary: JDBC doesn't check duplicate column names in nested
structures
Key: SPARK-32510
URL: https://issues.apache.org/jira/browse/SPARK-32510
Project: Spark
Maxim Gekk created SPARK-32501:
--
Summary: Inconsistent NULL conversions to strings
Key: SPARK-32501
URL: https://issues.apache.org/jira/browse/SPARK-32501
Project: Spark
Issue Type: Improvement
Maxim Gekk created SPARK-32499:
--
Summary: Use {} for structs and maps in show()
Key: SPARK-32499
URL: https://issues.apache.org/jira/browse/SPARK-32499
Project: Spark
Issue Type: Improvement
[
https://issues.apache.org/jira/browse/SPARK-32396?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17167669#comment-17167669
]
Maxim Gekk commented on SPARK-32396:
I am working on it.
> Support the BATCH_READ c
Maxim Gekk created SPARK-32471:
--
Summary: Describe JSON option `allowNonNumericNumbers`
Key: SPARK-32471
URL: https://issues.apache.org/jira/browse/SPARK-32471
Project: Spark
Issue Type: Documen
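SPARK-32471 asks to document the JSON option `allowNonNumericNumbers`, which controls whether the non-standard tokens `NaN`, `Infinity`, and `-Infinity` are accepted as numbers. As a language-neutral illustration (Python's stdlib `json`, not Spark's parser), lenient parsing accepts these tokens by default, while a strict mode can reject them:

```python
import json
import math

# Lenient parsing (analogous to allowNonNumericNumbers=true):
v = json.loads('{"x": NaN, "y": -Infinity}')
print(math.isnan(v["x"]), v["y"])

# Strict rejection (analogous to allowNonNumericNumbers=false):
def reject(token):
    raise ValueError(f"non-numeric number not allowed: {token}")

try:
    json.loads('{"x": NaN}', parse_constant=reject)
except ValueError as e:
    print(e)
```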
[
https://issues.apache.org/jira/browse/SPARK-32431?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Maxim Gekk updated SPARK-32431:
---
Comment: was deleted
(was: I cannot reproduce the issue on master, branch-3.0 and branch-2.4. I
ope
[
https://issues.apache.org/jira/browse/SPARK-32431?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Maxim Gekk updated SPARK-32431:
---
Description:
The code below throws org.apache.spark.sql.AnalysisException: Found duplicate
column(s
[
https://issues.apache.org/jira/browse/SPARK-32431?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17164853#comment-17164853
]
Maxim Gekk commented on SPARK-32431:
I cannot reproduce the issue on master, branch-
[
https://issues.apache.org/jira/browse/SPARK-32431?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17164499#comment-17164499
]
Maxim Gekk commented on SPARK-32431:
I am working on this.
> The .schema() API beha
Maxim Gekk created SPARK-32427:
--
Summary: Omit USING in CREATE TABLE via JDBC Table Catalog
Key: SPARK-32427
URL: https://issues.apache.org/jira/browse/SPARK-32427
Project: Spark
Issue Type: Sub
[
https://issues.apache.org/jira/browse/SPARK-32415?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Maxim Gekk updated SPARK-32415:
---
Summary: Enable JSON tests for the allowNonNumericNumbers option (was:
Enable JSON tests from the a
Maxim Gekk created SPARK-32415:
--
Summary: Enable JSON tests from the allowNonNumericNumbers option
Key: SPARK-32415
URL: https://issues.apache.org/jira/browse/SPARK-32415
Project: Spark
Issue Ty
Maxim Gekk created SPARK-32410:
--
Summary: Support the BATCH_WRITE capability by JDBCTable
Key: SPARK-32410
URL: https://issues.apache.org/jira/browse/SPARK-32410
Project: Spark
Issue Type: Sub-t
[
https://issues.apache.org/jira/browse/SPARK-32396?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Maxim Gekk updated SPARK-32396:
---
Summary: Support the BATCH_READ capability by JDBCTable (was: Support
BATCH_READ by JDBCTable)
> S
[
https://issues.apache.org/jira/browse/SPARK-32410?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Maxim Gekk updated SPARK-32410:
---
Description: Extend JDBCTable introduced in
https://github.com/apache/spark/pull/29168 by SupportsWr
Maxim Gekk created SPARK-32405:
--
Summary: Apply table options while creating tables in JDBC Table
Catalog
Key: SPARK-32405
URL: https://issues.apache.org/jira/browse/SPARK-32405
Project: Spark
Maxim Gekk created SPARK-32402:
--
Summary: Implement ALTER TABLE in JDBC Table Catalog
Key: SPARK-32402
URL: https://issues.apache.org/jira/browse/SPARK-32402
Project: Spark
Issue Type: Sub-task
Maxim Gekk created SPARK-32396:
--
Summary: Support BATCH_READ by JDBCTable
Key: SPARK-32396
URL: https://issues.apache.org/jira/browse/SPARK-32396
Project: Spark
Issue Type: Sub-task
Co
Maxim Gekk created SPARK-32382:
--
Summary: Override table renaming in JDBC dialects
Key: SPARK-32382
URL: https://issues.apache.org/jira/browse/SPARK-32382
Project: Spark
Issue Type: Sub-task
Maxim Gekk created SPARK-32375:
--
Summary: Implement TableCatalog for JDBC
Key: SPARK-32375
URL: https://issues.apache.org/jira/browse/SPARK-32375
Project: Spark
Issue Type: Sub-task
Co
[
https://issues.apache.org/jira/browse/SPARK-32328?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17160521#comment-17160521
]
Maxim Gekk commented on SPARK-32328:
[~pavithraramachandran] Please, wait for the PR
[
https://issues.apache.org/jira/browse/SPARK-32346?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17159911#comment-17159911
]
Maxim Gekk commented on SPARK-32346:
I am working on this
> Support filters pushdow
Maxim Gekk created SPARK-32346:
--
Summary: Support filters pushdown in Avro datasource
Key: SPARK-32346
URL: https://issues.apache.org/jira/browse/SPARK-32346
Project: Spark
Issue Type: Improveme
[
https://issues.apache.org/jira/browse/SPARK-32346?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Maxim Gekk updated SPARK-32346:
---
Description: (was: * Implement the `SupportsPushDownFilters` interface
in `JsonScanBuilder`
* A
[
https://issues.apache.org/jira/browse/SPARK-32325?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17158902#comment-17158902
]
Maxim Gekk commented on SPARK-32325:
The JIRA ticket was opened while addressing [~d
Maxim Gekk created SPARK-32325:
--
Summary: JSON predicate pushdown for nested fields
Key: SPARK-32325
URL: https://issues.apache.org/jira/browse/SPARK-32325
Project: Spark
Issue Type: Sub-task
[
https://issues.apache.org/jira/browse/SPARK-32273?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17155337#comment-17155337
]
Maxim Gekk commented on SPARK-32273:
[~Samwel] [~cloud_fan] I haven't found an umbre
Maxim Gekk created SPARK-32273:
--
Summary: Support ANSI dialect in MAKE_DATE and MAKE_TIMESTAMP
Key: SPARK-32273
URL: https://issues.apache.org/jira/browse/SPARK-32273
Project: Spark
Issue Type:
Maxim Gekk created SPARK-32209:
--
Summary: Re-use GetTimestamp in ParseToDate
Key: SPARK-32209
URL: https://issues.apache.org/jira/browse/SPARK-32209
Project: Spark
Issue Type: Improvement
Maxim Gekk created SPARK-32173:
--
Summary: Deduplicate code in FromUTCTimestamp and ToUTCTimestamp
Key: SPARK-32173
URL: https://issues.apache.org/jira/browse/SPARK-32173
Project: Spark
Issue Typ
[
https://issues.apache.org/jira/browse/SPARK-31579?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17150535#comment-17150535
]
Maxim Gekk commented on SPARK-31579:
[~suddhuASF] Please, open a PR for master.
> R
[
https://issues.apache.org/jira/browse/SPARK-32130?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17149235#comment-17149235
]
Maxim Gekk commented on SPARK-32130:
I would like to propose:
# Add the SQL config
[
https://issues.apache.org/jira/browse/SPARK-32108?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Maxim Gekk updated SPARK-32108:
---
Description:
1. I download the recent release Spark 3.0 from
http://spark.apache.org/downloads.html
Maxim Gekk created SPARK-32108:
--
Summary: Silent mode of spark-sql is broken
Key: SPARK-32108
URL: https://issues.apache.org/jira/browse/SPARK-32108
Project: Spark
Issue Type: Bug
Comp
Maxim Gekk created SPARK-32072:
--
Summary: Unaligned benchmark results
Key: SPARK-32072
URL: https://issues.apache.org/jira/browse/SPARK-32072
Project: Spark
Issue Type: Test
Component
Maxim Gekk created SPARK-32071:
--
Summary: Benchmark make_interval
Key: SPARK-32071
URL: https://issues.apache.org/jira/browse/SPARK-32071
Project: Spark
Issue Type: Test
Components: SQ
Maxim Gekk created SPARK-32052:
--
Summary: Extract common code from date-time field expressions
Key: SPARK-32052
URL: https://issues.apache.org/jira/browse/SPARK-32052
Project: Spark
Issue Type:
Maxim Gekk created SPARK-32043:
--
Summary: Replace decimal by int op in `make_interval` and
`make_timestamp`
Key: SPARK-32043
URL: https://issues.apache.org/jira/browse/SPARK-32043
Project: Spark
Maxim Gekk created SPARK-32006:
--
Summary: Create date/timestamp formatters once before collect in
`hiveResultString()`
Key: SPARK-32006
URL: https://issues.apache.org/jira/browse/SPARK-32006
Project: Spa
Maxim Gekk created SPARK-31992:
--
Summary: Benchmark the EXCEPTION rebase mode
Key: SPARK-31992
URL: https://issues.apache.org/jira/browse/SPARK-31992
Project: Spark
Issue Type: Test
Co
Maxim Gekk created SPARK-31989:
--
Summary: Generate JSON files with rebasing switch points using
smaller steps
Key: SPARK-31989
URL: https://issues.apache.org/jira/browse/SPARK-31989
Project: Spark
Maxim Gekk created SPARK-31986:
--
Summary: Test failure RebaseDateTimeSuite."optimization of micros
rebasing - Julian to Gregorian"
Key: SPARK-31986
URL: https://issues.apache.org/jira/browse/SPARK-31986
Maxim Gekk created SPARK-31984:
--
Summary: Make micros rebasing functions via local timestamps pure
Key: SPARK-31984
URL: https://issues.apache.org/jira/browse/SPARK-31984
Project: Spark
Issue Ty
Maxim Gekk created SPARK-31959:
--
Summary: Test failure "RebaseDateTimeSuite.optimization of micros
rebasing - Gregorian to Julian"
Key: SPARK-31959
URL: https://issues.apache.org/jira/browse/SPARK-31959
[
https://issues.apache.org/jira/browse/SPARK-26905?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17129737#comment-17129737
]
Maxim Gekk commented on SPARK-26905:
Spark's reserved keywords that are not reserved
[
https://issues.apache.org/jira/browse/SPARK-26905?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17129730#comment-17129730
]
Maxim Gekk commented on SPARK-26905:
Spark's ANSI Non-Reserved keywords are actually
[
https://issues.apache.org/jira/browse/SPARK-26905?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17129729#comment-17129729
]
Maxim Gekk commented on SPARK-26905:
I downloaded SQL2016 reserved and non-reserved
[
https://issues.apache.org/jira/browse/SPARK-26905?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Maxim Gekk updated SPARK-26905:
---
Attachment: sql2016-02-nonreserved.txt
sql2016-02-reserved.txt
sql201
Maxim Gekk created SPARK-31940:
--
Summary: Document the default JVM time zone in to/fromJavaDate and
legacy date formatters
Key: SPARK-31940
URL: https://issues.apache.org/jira/browse/SPARK-31940
Project:
Maxim Gekk created SPARK-31932:
--
Summary: Add date/timestamp benchmarks for toHiveString
Key: SPARK-31932
URL: https://issues.apache.org/jira/browse/SPARK-31932
Project: Spark
Issue Type: Test
[
https://issues.apache.org/jira/browse/SPARK-30808?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Maxim Gekk resolved SPARK-30808.
Resolution: Cannot Reproduce
The example is not valid anymore because 'MILLENNIUM' is not supporte
Maxim Gekk created SPARK-31910:
--
Summary: Enable Java 8 time API in Thrift server
Key: SPARK-31910
URL: https://issues.apache.org/jira/browse/SPARK-31910
Project: Spark
Issue Type: Improvement
[
https://issues.apache.org/jira/browse/SPARK-30808?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Maxim Gekk updated SPARK-30808:
---
Affects Version/s: (was: 3.1.0)
3.0.0
> Thrift server returns wrong times
Maxim Gekk created SPARK-31901:
--
Summary: Legacy date formatters don't use specified time zone
Key: SPARK-31901
URL: https://issues.apache.org/jira/browse/SPARK-31901
Project: Spark
Issue Type:
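SPARK-31901 reports that legacy date formatters ignore the specified time zone. The symptom can be sketched in plain Python (an analogy, not the Spark code path): the same instant renders as different local dates depending on which zone the formatter actually uses, so a formatter pinned to the default zone silently disagrees with the session-specified one:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

# One instant, two zones: the rendered date differs, which is why a
# formatter that ignores the configured zone produces wrong output.
instant = datetime(2020, 1, 1, 0, 30, tzinfo=timezone.utc)
print(instant.astimezone(ZoneInfo("America/Los_Angeles")).date())  # 2019-12-31
print(instant.astimezone(ZoneInfo("Asia/Tokyo")).date())           # 2020-01-01
```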
[
https://issues.apache.org/jira/browse/SPARK-31888?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Maxim Gekk updated SPARK-31888:
---
Description:
Currently, ParquetFilters supports only java.sql.Timestamp values of
TimestampType, an
Maxim Gekk created SPARK-31888:
--
Summary: Support `java.time.Instant` in Parquet filter pushdown
Key: SPARK-31888
URL: https://issues.apache.org/jira/browse/SPARK-31888
Project: Spark
Issue Type
Maxim Gekk created SPARK-31885:
--
Summary: Incorrect filtering of old millis timestamp in parquet
Key: SPARK-31885
URL: https://issues.apache.org/jira/browse/SPARK-31885
Project: Spark
Issue Type
Maxim Gekk created SPARK-31878:
--
Summary: Create date formatter only once in HiveResult
Key: SPARK-31878
URL: https://issues.apache.org/jira/browse/SPARK-31878
Project: Spark
Issue Type: Improve
Maxim Gekk created SPARK-31874:
--
Summary: Use `FastDateFormat` as the legacy fractional formatter
Key: SPARK-31874
URL: https://issues.apache.org/jira/browse/SPARK-31874
Project: Spark
Issue Typ
[
https://issues.apache.org/jira/browse/SPARK-31859?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17119029#comment-17119029
]
Maxim Gekk edited comment on SPARK-31859 at 5/28/20, 7:41 PM:
[
https://issues.apache.org/jira/browse/SPARK-31859?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17119029#comment-17119029
]
Maxim Gekk commented on SPARK-31859:
[~juliuszsompolski] FYI https://github.com/apac
[
https://issues.apache.org/jira/browse/SPARK-31855?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Maxim Gekk updated SPARK-31855:
---
Description:
Extend the test "SPARK-31183: compatibility with Spark 2.4 in reading
dates/timestamps
[
https://issues.apache.org/jira/browse/SPARK-31855?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17118468#comment-17118468
]
Maxim Gekk commented on SPARK-31855:
I am working on it
> Check reading date/timest
Maxim Gekk created SPARK-31855:
--
Summary: Check reading date/timestamp from Avro files w/ and w/o
Spark version
Key: SPARK-31855
URL: https://issues.apache.org/jira/browse/SPARK-31855
Project: Spark
Maxim Gekk created SPARK-31820:
--
Summary: Flaky JavaBeanDeserializationSuite
Key: SPARK-31820
URL: https://issues.apache.org/jira/browse/SPARK-31820
Project: Spark
Issue Type: Bug
Comp
[
https://issues.apache.org/jira/browse/SPARK-31818?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Maxim Gekk updated SPARK-31818:
---
Description:
When spark.sql.datetime.java8API.enabled is set to true, filters pushed down
with java
Maxim Gekk created SPARK-31818:
--
Summary: Failure on pushing down filters with java.time.Instant
values in ORC
Key: SPARK-31818
URL: https://issues.apache.org/jira/browse/SPARK-31818
Project: Spark
Maxim Gekk created SPARK-31806:
--
Summary: Check reading date/timestamp from Parquet: plain,
dictionary enc, Spark version
Key: SPARK-31806
URL: https://issues.apache.org/jira/browse/SPARK-31806
Project:
[
https://issues.apache.org/jira/browse/SPARK-31802?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Maxim Gekk updated SPARK-31802:
---
Description: Row.jsonValue has to convert incoming Java date/timestamps
types to days/microseconds b
[
https://issues.apache.org/jira/browse/SPARK-31802?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Maxim Gekk updated SPARK-31802:
---
Description: Row.jsonValue has to convert incoming Java date/timestamps
types to days/microseconds b
Maxim Gekk created SPARK-31802:
--
Summary: Format Java date-time types in Row.jsonValue directly
Key: SPARK-31802
URL: https://issues.apache.org/jira/browse/SPARK-31802
Project: Spark
Issue Type:
Maxim Gekk created SPARK-31785:
--
Summary: Add a helper function to test all parquet readers
Key: SPARK-31785
URL: https://issues.apache.org/jira/browse/SPARK-31785
Project: Spark
Issue Type: Imp
Maxim Gekk created SPARK-31762:
--
Summary: Fix perf regression of date/timestamp formatting in
toHiveString
Key: SPARK-31762
URL: https://issues.apache.org/jira/browse/SPARK-31762
Project: Spark
Maxim Gekk created SPARK-31738:
--
Summary: Describe 'L' and 'M' month pattern letters
Key: SPARK-31738
URL: https://issues.apache.org/jira/browse/SPARK-31738
Project: Spark
Issue Type: Documentat
Maxim Gekk created SPARK-31727:
--
Summary: Inconsistent error messages of casting timestamp to int
Key: SPARK-31727
URL: https://issues.apache.org/jira/browse/SPARK-31727
Project: Spark
Issue Typ
Maxim Gekk created SPARK-31725:
--
Summary: Set America/Los_Angeles time zone and Locale.US by default
Key: SPARK-31725
URL: https://issues.apache.org/jira/browse/SPARK-31725
Project: Spark
Issue
Maxim Gekk created SPARK-31712:
--
Summary: Check casting timestamps to byte/short/int/long before
1970-01-01
Key: SPARK-31712
URL: https://issues.apache.org/jira/browse/SPARK-31712
Project: Spark
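SPARK-31712 concerns casting timestamps before 1970-01-01 to byte/short/int/long. Pre-epoch instants have negative epoch seconds, so such casts must handle negative values and range checks; a hedged plain-Python sketch (the `cast_to_byte` helper is hypothetical, not Spark's cast logic):

```python
from datetime import datetime, timezone

# One second before the Unix epoch has epoch-seconds -1.
ts = datetime(1969, 12, 31, 23, 59, 59, tzinfo=timezone.utc)
seconds = int(ts.timestamp())
print(seconds)  # -1

def cast_to_byte(seconds: int) -> int:
    # Hypothetical range check mirroring a signed 8-bit cast.
    if not -128 <= seconds <= 127:
        raise OverflowError(f"{seconds} out of byte range")
    return seconds

print(cast_to_byte(seconds))  # -1
```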
Maxim Gekk created SPARK-31680:
--
Summary: Support Java 8 datetime types by Random data generator
Key: SPARK-31680
URL: https://issues.apache.org/jira/browse/SPARK-31680
Project: Spark
Issue Type
[
https://issues.apache.org/jira/browse/SPARK-31672?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Maxim Gekk updated SPARK-31672:
---
Description:
Write dates with dictionary encoding enabled to parquet files:
{code:scala}
Welcome to
Maxim Gekk created SPARK-31672:
--
Summary: Reading wrong timestamps from dictionary encoded columns
in Parquet files
Key: SPARK-31672
URL: https://issues.apache.org/jira/browse/SPARK-31672
Project: Spark