Victor Zhang created SPARK-29969:
------------------------------------

             Summary: parse_url function returns incorrect result
                 Key: SPARK-29969
                 URL: https://issues.apache.org/jira/browse/SPARK-29969
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 2.4.4, 2.3.1
            Reporter: Victor Zhang


In the Jira below, parse_url was switched from java.net.URL to java.net.URI for performance reasons:

https://issues.apache.org/jira/browse/SPARK-16826

However, for some unconventional inputs this leads to incorrect results.

For example, when the URL is percent-encoded as in the query below, the function cannot resolve the host and returns NULL.

 

0: jdbc:hive2://localhost:10000> SELECT parse_url('http://uzzf.down.gsxzq.com/download/%E5%B8%B8%E7%94%A8%E9%98%80%E9%97%A8CAD%E5%9B%BE%E7%BA%B8%E5%A4%', 'HOST');
+------------------------------------------------------------------------------------------------------------------------+--+
| parse_url(http://uzzf.down.gsxzq.com/download/%E5%B8%B8%E7%94%A8%E9%98%80%E9%97%A8CAD%E5%9B%BE%E7%BA%B8%E5%A4%, HOST) |
+------------------------------------------------------------------------------------------------------------------------+--+
| NULL |
+------------------------------------------------------------------------------------------------------------------------+--+
1 row selected (0.094 seconds)

 

hive> SELECT parse_url('http://uzzf.down.gsxzq.com/download/%E5%B8%B8%E7%94%A8%E9%98%80%E9%97%A8CAD%E5%9B%BE%E7%BA%B8%E5%A4%', 'HOST');
OK
HEADER: _c0
uzzf.down.gsxzq.com
Time taken: 4.423 seconds, Fetched: 1 row(s)
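
I have not traced the exact code paths, but the difference looks consistent with java.net.URI rejecting the trailing truncated percent escape ("%" with no hex digits) while java.net.URL does not validate escapes at all. A minimal Scala sketch (my own, not the Spark/Hive sources) illustrating the assumed behavior:

// Sketch only: reproduces the suspected URI-vs-URL difference outside Spark/Hive.
object ParseUrlRepro {
  val url = "http://uzzf.down.gsxzq.com/download/%E5%B8%B8%E7%94%A8%E9%98%80%E9%97%A8CAD%E5%9B%BE%E7%BA%B8%E5%A4%"

  def main(args: Array[String]): Unit = {
    // java.net.URL (the pre-SPARK-16826 approach) does not validate percent escapes,
    // so the host can still be extracted:
    println(new java.net.URL(url).getHost)   // prints: uzzf.down.gsxzq.com

    // java.net.URI (used since SPARK-16826) rejects the trailing "%" as a malformed
    // escape pair; Spark apparently maps this parse failure to NULL:
    try {
      println(new java.net.URI(url).getHost)
    } catch {
      case e: java.net.URISyntaxException =>
        println("URISyntaxException: " + e.getMessage)
    }
  }
}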

 

Here's a similar problem.

https://issues.apache.org/jira/browse/SPARK-23056

Our team has used this Spark function to process data for months, and now we have to run it all again.

It's just too painful. :(:(:(
