NJCrazyRen commented on issue #16379: URL: https://github.com/apache/shardingsphere/issues/16379#issuecomment-1079590699
> @NJCrazyRen Hello, Can your translate english for your question?

OK, thanks for your reply, and forgive my poor English. I have a MySQL database in which some columns store encrypted data, but my Spark application needs the plain-text data. So I set up ShardingSphere-proxy with the correct encrypt configuration as middleware between my Apache Spark application and MySQL. It works well when I execute queries from DBeaver: I get the plain data. But my Spark application throws the exception below:

`requirement failed: Decimal precision 5 exceeds max precision 3`

The real column type in MySQL is decimal(5, 2), but Spark got decimal(3, 2) from ShardingSphere-proxy. In fact, more than one column type is wrong: the types of the other decimal columns are also incorrect. The common pattern is that the precision reported by ShardingSphere-proxy is 2 less than the actual precision in the database.

As a workaround, I changed my Spark application to connect to the database directly to get the correct schema, then pass it to Spark's DataFrameReader for the JDBC format. That avoids the exception, but it is a lot of trouble.

Versions in my case: MySQL 5.7.33, ShardingSphere-proxy 5.1.0, mysql-connector-java 8.0.26
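The workaround described above can be sketched roughly as follows: fetch the real column types directly from MySQL (bypassing the proxy) and hand them to Spark's JDBC reader through its `customSchema` option, so Spark does not trust the under-reported DECIMAL precision. This is only a sketch of one possible shape of the fix, not the reporter's actual code; the column list, connection URL, and table name below are hypothetical placeholders.

```python
def build_custom_schema(columns):
    """Build the comma-separated DDL string that Spark's JDBC
    `customSchema` option expects, e.g. "amount DECIMAL(5,2), remark STRING".

    `columns` is a list of (name, type_name, precision, scale) tuples,
    as you might collect from a direct MySQL connection's metadata.
    """
    parts = []
    for name, type_name, precision, scale in columns:
        if type_name == "decimal":
            # Use the precision/scale read from MySQL itself, not the
            # values reported through ShardingSphere-proxy.
            parts.append(f"{name} DECIMAL({precision},{scale})")
        else:
            parts.append(f"{name} {type_name.upper()}")
    return ", ".join(parts)

# Example: the decimal(5, 2) column from the report, plus a string column.
schema = build_custom_schema([
    ("amount", "decimal", 5, 2),
    ("remark", "string", None, None),
])
# schema is now "amount DECIMAL(5,2), remark STRING"

# In the Spark application, the schema string would then be applied like
# this (hypothetical URL and table name):
#
# df = (spark.read.format("jdbc")
#       .option("url", "jdbc:mysql://proxy-host:3307/mydb")
#       .option("dbtable", "t_order")
#       .option("customSchema", schema)
#       .load())
```

With `customSchema` set, Spark skips inferring the DECIMAL precision from the proxy's metadata, which sidesteps the `Decimal precision 5 exceeds max precision 3` failure while still reading the plain-text data through the proxy.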
