[ 
https://issues.apache.org/jira/browse/SPARK-45311?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Marc Le Bihan updated SPARK-45311:
----------------------------------
    Description: 
(I've extracted from my work a set of 35 tests showing the troubles encountered.

I suggest cloning the [https://gitlab.com/territoirevif/minimal-tests-spark-issue] 
project to reach the parts where the problems happen more quickly.)

 

This project, which performs many operations around cities, local authorities, 
and accounting with open data, works well with Spark 3.2.x and 3.3.x.

 

But as soon as I select a 3.4.x Spark version, where the encoder seems to have 
changed deeply, the encoder fails with two series of problems:

 

*1)* It throws *java.util.NoSuchElementException: None.get* messages everywhere.

Asking around on the Internet, I found I wasn't alone in facing this problem. 
Reading the question below, you'll see that I attempted to debug it, but my 
Scala skills are limited.

[https://stackoverflow.com/questions/76036349/encoders-bean-doesnt-work-anymore-on-a-java-pojo-with-spark-3-4-0]
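
To make the failure concrete, here is a minimal sketch of the kind of bean 
mapping that breaks (the {{City}} bean and its fields are hypothetical 
stand-ins, not the project's real classes; the real failing cases are in the 
GitLab project above):

{code:java}
import java.io.Serializable;
import java.util.List;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Encoders;
import org.apache.spark.sql.SparkSession;

// Hypothetical POJO standing in for the project's real beans.
public class City implements Serializable {
    private String siret;
    private int population;

    public City() {}

    public String getSiret() { return siret; }
    public void setSiret(String siret) { this.siret = siret; }

    public int getPopulation() { return population; }
    public void setPopulation(int population) { this.population = population; }

    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
            .master("local[*]").appName("encoder-repro").getOrCreate();

        // Fine on 3.2.x/3.3.x; on 3.4.x, beans of this general shape fail
        // with a bare "java.util.NoSuchElementException: None.get" that
        // names no field.
        Dataset<City> cities = spark.createDataset(
            List.of(new City()), Encoders.bean(City.class));
        cities.show();
    }
}
{code}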

By the way, the encoder and decoder functions should all forward a parameter as 
soon as the name of the field being handled is known, so that when the encoder 
reaches the point where it has to throw an exception, it knows which field it 
is handling in that specific call and can emit a message like:
*java.util.NoSuchElementException: None.get when [encoding|decoding] field 
siret* (or method getSiret, or whatever it was +exactly+ doing...), which gives 
the developer a hint about where to search. Messages of the kind "{_}I've 
failed{_}" with no clue and no context make developers lose days.

 

*2)* *Not found an encoder of the type RS to Spark SQL internal 
representation.* Consider to change the input type to one of supported at ...
Not found an encoder of the type *OMI_ID* to Spark SQL internal 
representation...

 
where *RS* and *OMI_ID* are ... generic types.
This is strange and alarming.
[https://stackoverflow.com/questions/76045255/encoders-bean-attempts-to-check-the-validity-of-a-return-type-considering-its-ge]
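
A hypothetical reduction of the shape involved (the names mirror the report; 
the real hierarchy is in the GitLab project): a bean inherits a getter whose 
declared return type is a type variable, and encoder derivation reports the 
type variable itself instead of the bound concrete type:

{code:java}
import java.io.Serializable;

import org.apache.spark.sql.Encoder;
import org.apache.spark.sql.Encoders;

// Hypothetical reduction: the getter's declared return type is the type
// variable RS, even though the concrete subclass binds it to String.
class AbstractResult<RS extends Serializable> implements Serializable {
    private RS payload;
    public RS getPayload() { return payload; }
    public void setPayload(RS payload) { this.payload = payload; }
}

class ConcreteResult extends AbstractResult<String> {}

public class GenericEncoderRepro {
    public static void main(String[] args) {
        // Worked on 3.2.x/3.3.x; on 3.4.x, encoder derivation appears to
        // stop on the raw type variable:
        //   "Not found an encoder of the type RS to Spark SQL internal
        //    representation."
        Encoder<ConcreteResult> enc = Encoders.bean(ConcreteResult.class);
        System.out.println(enc.schema());
    }
}
{code}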

 

When I switch to Spark 3.5.0, the same problems remain, but another one adds 
itself to the list:
"{*}Only expression encoders are supported for now{*}"
raised on code that was working perfectly before. I have no clue about it, but 
it's clearly a regression.


> Encoder fails with many "NoSuchElementException: None.get" since 3.4.x, 
> searches for an encoder for a generic type, and since 3.5.x complains it 
> isn't "an expression encoder"
> -------------------------------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-45311
>                 URL: https://issues.apache.org/jira/browse/SPARK-45311
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 3.4.0, 3.4.1, 3.5.0
>         Environment: Debian 12
> Java 17
> Underlying Spring-Boot 2.7.14
>            Reporter: Marc Le Bihan
>            Priority: Major
>


