[jira] [Comment Edited] (SPARK-23178) Kryo Unsafe problems with count distinct from cache

2018-01-22 Thread KIryl Sultanau (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-23178?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16334206#comment-16334206
 ] 

KIryl Sultanau edited comment on SPARK-23178 at 1/22/18 12:48 PM:
--

With unsafe switched off, this example works fine:
{quote}.config("spark.kryo.unsafe", "false")
{quote}
No null or incorrect IDs in either data set.
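To make the workaround concrete, here is a minimal sketch of the session builder from the repro with unsafe Kryo IO disabled. Note that `spark.kryo.unsafe` defaults to false, so simply omitting the option has the same effect:

```scala
import org.apache.spark.sql.SparkSession

// Workaround sketch: same session as the repro, but with Kryo's
// unsafe-based IO turned off.
val spark = SparkSession
  .builder
  .appName("unsafe-issue")
  .master("local[*]")
  .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  .config("spark.kryo.unsafe", "false") // was "true" in the failing repro
  .config("spark.kryo.registrationRequired", "false")
  .getOrCreate()
```

With this configuration the cached join reportedly returns consistent counts, per the attached Unsafe-off.png.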


was (Author: kirills2006):
With unsafe switch off this example works fine:  
{quote}.config("spark.kryo.unsafe", "false")
{quote}
No null or incorrect IDs in both data sets.

> Kryo Unsafe problems with count distinct from cache
> ---
>
> Key: SPARK-23178
> URL: https://issues.apache.org/jira/browse/SPARK-23178
> Project: Spark
>  Issue Type: Bug
>  Components: Spark Core
>Affects Versions: 2.2.0, 2.2.1
>Reporter: KIryl Sultanau
>Priority: Minor
> Attachments: Unsafe-issue.png, Unsafe-off.png
>
>
> Spark incorrectly processes cached data when the Kryo serializer is combined with the unsafe option.
> Distinct count from cache doesn't work correctly. An example is available below:
> {quote}val spark = SparkSession
>   .builder
>   .appName("unsafe-issue")
>   .master("local[*]")
>   .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
>   .config("spark.kryo.unsafe", "true")
>   .config("spark.kryo.registrationRequired", "false")
>   .getOrCreate()
> val devicesDF = spark.read.format("csv")
>   .option("header", "true")
>   .option("delimiter", "\t")
>   .load("/data/Devices.tsv").cache()
> val gatewaysDF = spark.read.format("csv")
>   .option("header", "true")
>   .option("delimiter", "\t")
>   .load("/data/Gateways.tsv").cache()
> val devJoinedDF = devicesDF.join(gatewaysDF, Seq("GatewayId"), "inner").cache()
> devJoinedDF.printSchema()
> println(devJoinedDF.count())
> println(devJoinedDF.select("DeviceId").distinct().count())
> println(devJoinedDF.groupBy("DeviceId").count().filter("count>1").count())
> println(devJoinedDF.groupBy("DeviceId").count().filter("count=1").count())
> {quote}
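As a sanity check on the four printed numbers: the distinct count should equal the number of duplicated groups plus the number of singleton groups. A small illustration of that invariant with plain Scala collections and made-up device IDs (hypothetical data, not the attached TSVs):

```scala
// Hypothetical device IDs standing in for the joined DeviceId column.
val deviceIds = Seq("a", "a", "b", "c", "c", "c", "d")

// Group sizes per distinct ID, mirroring groupBy("DeviceId").count().
val groupSizes = deviceIds.groupBy(identity).map { case (_, rows) => rows.size }

val distinctCount = deviceIds.distinct.size   // 4
val dupGroups     = groupSizes.count(_ > 1)   // 2 ("a", "c")
val singleGroups  = groupSizes.count(_ == 1)  // 2 ("b", "d")

// This should always hold; in the reported bug, the cached Dataset
// violates it when spark.kryo.unsafe is enabled.
assert(distinctCount == dupGroups + singleGroups)
```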



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Updated] (SPARK-23178) Kryo Unsafe problems with count distinct from cache

2018-01-22 Thread KIryl Sultanau (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-23178?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

KIryl Sultanau updated SPARK-23178:
---
Description: 
Spark incorrectly processes cached data when the Kryo serializer is combined with the unsafe option.

Distinct count from cache doesn't work correctly. An example is available below:
{quote}val spark = SparkSession
  .builder
  .appName("unsafe-issue")
  .master("local[*]")
  .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  .config("spark.kryo.unsafe", "true")
  .config("spark.kryo.registrationRequired", "false")
  .getOrCreate()

val devicesDF = spark.read.format("csv")
  .option("header", "true")
  .option("delimiter", "\t")
  .load("/data/Devices.tsv").cache()

val gatewaysDF = spark.read.format("csv")
  .option("header", "true")
  .option("delimiter", "\t")
  .load("/data/Gateways.tsv").cache()

val devJoinedDF = devicesDF.join(gatewaysDF, Seq("GatewayId"), "inner").cache()
devJoinedDF.printSchema()

println(devJoinedDF.count())
println(devJoinedDF.select("DeviceId").distinct().count())
println(devJoinedDF.groupBy("DeviceId").count().filter("count>1").count())
println(devJoinedDF.groupBy("DeviceId").count().filter("count=1").count())
{quote}

  was:
val spark = SparkSession
  .builder
  .appName("unsafe-issue")
  .master("local[*]")
  .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  .config("spark.kryo.unsafe", "true")
  .config("spark.kryo.registrationRequired", "false")
  .getOrCreate()

val devicesDF = spark.read.format("csv")
  .option("header", "true")
  .option("delimiter", "\t")
  .load("/data/Devices.tsv").cache()

val gatewaysDF = spark.read.format("csv")
  .option("header", "true")
  .option("delimiter", "\t")
  .load("/data/Gateways.tsv").cache()

val devJoinedDF = devicesDF.join(gatewaysDF, Seq("GatewayId"), "inner").cache()
devJoinedDF.printSchema()

println(devJoinedDF.count())
println(devJoinedDF.select("DeviceId").distinct().count())
println(devJoinedDF.groupBy("DeviceId").count().filter("count>1").count())
println(devJoinedDF.groupBy("DeviceId").count().filter("count=1").count())








[jira] [Updated] (SPARK-23178) Kryo Unsafe problems with count distinct from cache

2018-01-22 Thread KIryl Sultanau (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-23178?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

KIryl Sultanau updated SPARK-23178:
---
Attachment: Unsafe-off.png

> Kryo Unsafe problems with count distinct from cache
> ---
>
> Key: SPARK-23178
> URL: https://issues.apache.org/jira/browse/SPARK-23178
> Project: Spark
>  Issue Type: Bug
>  Components: Spark Core
>Affects Versions: 2.2.0, 2.2.1
>Reporter: KIryl Sultanau
>Priority: Minor
> Attachments: Unsafe-issue.png, Unsafe-off.png
>
>
> val spark = SparkSession
>   .builder
>   .appName("unsafe-issue")
>   .master("local[*]")
>   .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
>   .config("spark.kryo.unsafe", "true")
>   .config("spark.kryo.registrationRequired", "false")
>   .getOrCreate()
> val devicesDF = spark.read.format("csv")
>   .option("header", "true")
>   .option("delimiter", "\t")
>   .load("/data/Devices.tsv").cache()
> val gatewaysDF = spark.read.format("csv")
>   .option("header", "true")
>   .option("delimiter", "\t")
>   .load("/data/Gateways.tsv").cache()
> val devJoinedDF = devicesDF.join(gatewaysDF, Seq("GatewayId"), "inner").cache()
> devJoinedDF.printSchema()
> println(devJoinedDF.count())
> println(devJoinedDF.select("DeviceId").distinct().count())
> println(devJoinedDF.groupBy("DeviceId").count().filter("count>1").count())
> println(devJoinedDF.groupBy("DeviceId").count().filter("count=1").count())






[jira] [Updated] (SPARK-23178) Kryo Unsafe problems with count distinct from cache

2018-01-22 Thread KIryl Sultanau (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-23178?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

KIryl Sultanau updated SPARK-23178:
---
Priority: Minor  (was: Major)

> Kryo Unsafe problems with count distinct from cache
> ---
>
> Key: SPARK-23178
> URL: https://issues.apache.org/jira/browse/SPARK-23178
> Project: Spark
>  Issue Type: Bug
>  Components: Spark Core
>Affects Versions: 2.2.0, 2.2.1
>Reporter: KIryl Sultanau
>Priority: Minor
> Attachments: Unsafe-issue.png
>
>
> val spark = SparkSession
>   .builder
>   .appName("unsafe-issue")
>   .master("local[*]")
>   .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
>   .config("spark.kryo.unsafe", "true")
>   .config("spark.kryo.registrationRequired", "false")
>   .getOrCreate()
> val devicesDF = spark.read.format("csv")
>   .option("header", "true")
>   .option("delimiter", "\t")
>   .load("/data/Devices.tsv").cache()
> val gatewaysDF = spark.read.format("csv")
>   .option("header", "true")
>   .option("delimiter", "\t")
>   .load("/data/Gateways.tsv").cache()
> val devJoinedDF = devicesDF.join(gatewaysDF, Seq("GatewayId"), "inner").cache()
> devJoinedDF.printSchema()
> println(devJoinedDF.count())
> println(devJoinedDF.select("DeviceId").distinct().count())
> println(devJoinedDF.groupBy("DeviceId").count().filter("count>1").count())
> println(devJoinedDF.groupBy("DeviceId").count().filter("count=1").count())






[jira] [Comment Edited] (SPARK-23178) Kryo Unsafe problems with count distinct from cache

2018-01-22 Thread KIryl Sultanau (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-23178?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16334206#comment-16334206
 ] 

KIryl Sultanau edited comment on SPARK-23178 at 1/22/18 12:38 PM:
--

With unsafe switch off this example works fine:  
{quote}.config("spark.kryo.unsafe", "false")
{quote}
No null or incorrect IDs in both data sets.


was (Author: kirills2006):
With unsafe switch off this example works fine:  
{quote}.config("spark.kryo.unsafe", "false")
{quote}

> Kryo Unsafe problems with count distinct from cache
> ---
>
> Key: SPARK-23178
> URL: https://issues.apache.org/jira/browse/SPARK-23178
> Project: Spark
>  Issue Type: Bug
>  Components: Spark Core
>Affects Versions: 2.2.0, 2.2.1
>Reporter: KIryl Sultanau
>Priority: Major
> Attachments: Unsafe-issue.png
>
>
> val spark = SparkSession
>   .builder
>   .appName("unsafe-issue")
>   .master("local[*]")
>   .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
>   .config("spark.kryo.unsafe", "true")
>   .config("spark.kryo.registrationRequired", "false")
>   .getOrCreate()
> val devicesDF = spark.read.format("csv")
>   .option("header", "true")
>   .option("delimiter", "\t")
>   .load("/data/Devices.tsv").cache()
> val gatewaysDF = spark.read.format("csv")
>   .option("header", "true")
>   .option("delimiter", "\t")
>   .load("/data/Gateways.tsv").cache()
> val devJoinedDF = devicesDF.join(gatewaysDF, Seq("GatewayId"), "inner").cache()
> devJoinedDF.printSchema()
> println(devJoinedDF.count())
> println(devJoinedDF.select("DeviceId").distinct().count())
> println(devJoinedDF.groupBy("DeviceId").count().filter("count>1").count())
> println(devJoinedDF.groupBy("DeviceId").count().filter("count=1").count())






[jira] [Comment Edited] (SPARK-23178) Kryo Unsafe problems with count distinct from cache

2018-01-22 Thread KIryl Sultanau (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-23178?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16334206#comment-16334206
 ] 

KIryl Sultanau edited comment on SPARK-23178 at 1/22/18 12:35 PM:
--

With unsafe switch off this example works fine:  
{quote}.config("spark.kryo.unsafe", "false")
{quote}


was (Author: kirills2006):
With unsafe switch off this works fine:  
{quote}.config("spark.kryo.unsafe", "false")
{quote}







[jira] [Commented] (SPARK-23178) Kryo Unsafe problems with count distinct from cache

2018-01-22 Thread KIryl Sultanau (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-23178?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16334206#comment-16334206
 ] 

KIryl Sultanau commented on SPARK-23178:


With unsafe switch off this works fine:  
{quote}.config("spark.kryo.unsafe", "false")
{quote}







[jira] [Updated] (SPARK-23178) Kryo Unsafe problems with count distinct from cache

2018-01-22 Thread KIryl Sultanau (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-23178?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

KIryl Sultanau updated SPARK-23178:
---
Attachment: Unsafe-issue.png







[jira] [Created] (SPARK-23178) Kryo Unsafe problems with count distinct from cache

2018-01-22 Thread KIryl Sultanau (JIRA)
KIryl Sultanau created SPARK-23178:
--

 Summary: Kryo Unsafe problems with count distinct from cache
 Key: SPARK-23178
 URL: https://issues.apache.org/jira/browse/SPARK-23178
 Project: Spark
  Issue Type: Bug
  Components: Spark Core
Affects Versions: 2.2.1, 2.2.0
Reporter: KIryl Sultanau


val spark = SparkSession
  .builder
  .appName("unsafe-issue")
  .master("local[*]")
  .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  .config("spark.kryo.unsafe", "true")
  .config("spark.kryo.registrationRequired", "false")
  .getOrCreate()

val devicesDF = spark.read.format("csv")
  .option("header", "true")
  .option("delimiter", "\t")
  .load("/data/Devices.tsv").cache()

val gatewaysDF = spark.read.format("csv")
  .option("header", "true")
  .option("delimiter", "\t")
  .load("/data/Gateways.tsv").cache()

val devJoinedDF = devicesDF.join(gatewaysDF, Seq("GatewayId"), "inner").cache()
devJoinedDF.printSchema()

println(devJoinedDF.count())
println(devJoinedDF.select("DeviceId").distinct().count())
println(devJoinedDF.groupBy("DeviceId").count().filter("count>1").count())
println(devJoinedDF.groupBy("DeviceId").count().filter("count=1").count())


