Amraneze opened a new pull request, #38154:
URL: https://github.com/apache/spark/pull/38154

   ### What changes were proposed in this pull request?
   I encountered an issue while using Spark to read JSON files against a 
schema: every time it throws an exception related to type conversion.
   
   >Note: This issue can be reproduced only with Scala `2.13`; it does not 
occur with `2.12`.
   
   ````
   Failed to convert value ArraySeq(1, 2, 3) (class of class 
scala.collection.mutable.ArraySeq$ofRef}) with the type of 
ArrayType(StringType,true) to JSON.
   java.lang.IllegalArgumentException: Failed to convert value ArraySeq(1, 2, 
3) (class of class scala.collection.mutable.ArraySeq$ofRef}) with the type of 
ArrayType(StringType,true) to JSON.
   ````
   
   If I add `ArraySeq` to the matching cases, the test that I added passes 
successfully:
   
![image](https://user-images.githubusercontent.com/28459763/194669557-2f13032f-126f-4c2e-bc6d-1a4cfd0a009d.png)
   
   With the current source code, the test fails with the following error:
   
![image](https://user-images.githubusercontent.com/28459763/194669654-19cefb13-180c-48ac-9206-69d8f672f64c.png)
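
   The shape of the fix can be sketched as follows. This is a hypothetical, 
simplified stand-in for Spark's internal converter (the real logic lives in the 
JSON generator's matching cases); the point is that matching on the generic 
`scala.collection.Seq` supertype covers Scala 2.13's `mutable.ArraySeq` as well 
as the collection types Scala 2.12 produces:

   ```scala
   object ArraySeqMatchSketch {
     // Hypothetical stand-in for the type-to-JSON conversion. Matching on the
     // Seq supertype (rather than only concrete classes) means values boxed as
     // scala.collection.mutable.ArraySeq under Scala 2.13 are handled instead
     // of falling through to the IllegalArgumentException branch.
     def toJsonValue(v: Any): String = v match {
       case s: String => "\"" + s + "\""
       case i: Int    => i.toString
       case seq: scala.collection.Seq[_] =>
         seq.map(toJsonValue).mkString("[", ",", "]")
       case other =>
         throw new IllegalArgumentException(
           s"Failed to convert value $other (class of ${other.getClass}) to JSON.")
     }

     def main(args: Array[String]): Unit = {
       // Under Scala 2.13 this is a mutable.ArraySeq, the class from the error.
       val arr: Any = scala.collection.mutable.ArraySeq(1, 2, 3)
       println(toJsonValue(arr)) // [1,2,3]
     }
   }
   ```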
   
   
   ### Why are the changes needed?
   If someone is using Scala 2.13, they can't parse an array, which means 
they need to fall back to 2.12 to keep their project functioning.
   
   ### How was this patch tested?
   I added a unit test covering this case, and I can add more if needed.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
