[GitHub] spark issue #7379: [SPARK-8682][SQL][WIP] Range Join

2017-06-19 Thread zzeekk
Github user zzeekk commented on the issue: https://github.com/apache/spark/pull/7379 Hello @IceMan81, you need to truncate your timestamps to days, hours or minutes, depending on your use case, and use the truncated value as the additional equi-join condition.
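A minimal sketch of the truncation idea, in plain Python rather than Spark (the helper name and hour granularity are illustrative assumptions, not from the thread): two timestamps that satisfy a range condition within the same hour collapse to the same coarse key, which can then serve as an equi-join key.

```python
from datetime import datetime

def truncate_to_hour(ts: datetime) -> datetime:
    """Truncate a timestamp to the hour, producing a coarse equi-join key."""
    return ts.replace(minute=0, second=0, microsecond=0)

a = datetime(2017, 6, 19, 10, 42, 7)
b = datetime(2017, 6, 19, 10, 3, 55)
# Both events fall into the same hourly block, so an equi-join on the
# truncated value brings them together; the exact range predicate is
# then applied as a post-join filter.
print(truncate_to_hour(a) == truncate_to_hour(b))  # True
```

The granularity is a tuning choice: coarser blocks keep more candidate pairs per key, finer blocks risk range pairs landing in different blocks (the adjacent-block join in the later comment addresses that).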

[GitHub] spark issue #7379: [SPARK-8682][SQL][WIP] Range Join

2017-06-09 Thread zzeekk
Github user zzeekk commented on the issue: https://github.com/apache/spark/pull/7379 @IceMan81 Here is an abstract example of our workaround, using blocks as additional equi-join conditions. The task is to join Points (df1) on a line back to Segments of the same line (df2).
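The points-to-segments task can be sketched in plain Python (the block width, data, and variable names are hypothetical, not from the comment): each segment is replicated into every block it overlaps, the point is joined to candidates via an equi-key of (line, block), and the exact range predicate is applied afterwards.

```python
from collections import defaultdict

BLOCK = 10.0  # block width; a tuning choice (hypothetical value)

# df1: points on a line (line_id, position)
points = [("A", 12.5), ("A", 47.0), ("B", 3.0)]
# df2: segments of the same line (line_id, start, end)
segments = [("A", 0.0, 20.0), ("A", 20.0, 50.0), ("B", 0.0, 5.0)]

# Replicate each segment into every block it overlaps, so the block id
# can serve as an additional equi-join key alongside line_id.
seg_by_block = defaultdict(list)
for line, start, end in segments:
    for blk in range(int(start // BLOCK), int(end // BLOCK) + 1):
        seg_by_block[(line, blk)].append((start, end))

# Equi-join on (line_id, block), then filter with the range predicate.
matches = []
for line, pos in points:
    for start, end in seg_by_block.get((line, int(pos // BLOCK)), []):
        if start <= pos < end:
            matches.append((line, pos, start, end))

print(matches)
```

In Spark the same shape would be an explode of the segment side into block ids plus an ordinary equi-join, which lets the planner use a hash join instead of a cartesian product.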

[GitHub] spark issue #7379: [SPARK-8682][SQL][WIP] Range Join

2017-02-16 Thread zzeekk
Github user zzeekk commented on the issue: https://github.com/apache/spark/pull/7379 Same here. A workaround is to build blocks and add them as an equi-join condition. But then you need to make an additional join on the following block and coalesce the results.
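One way to read the adjacent-block-plus-coalesce step, sketched in plain Python under an assumption the comment leaves implicit: each segment is shorter than one block, so it crosses at most one boundary. Keying segments by the block of their start, a point may then match either in its own block or in the neighbouring one, and the two lookups are coalesced. (Whether the second join targets the "following" or the "preceding" block depends on which side carries the offset; the data and names here are illustrative.)

```python
BLOCK = 10.0  # block width; each segment below is shorter than BLOCK

points = [("A", 12.5), ("A", 18.0)]
segments = [("A", 5.0, 14.0), ("A", 14.0, 22.0)]

# Key each segment by the block containing its start.
seg_by_block = {}
for line, start, end in segments:
    seg_by_block.setdefault((line, int(start // BLOCK)), []).append((start, end))

def lookup(line, pos, blk):
    """Equi-join on (line, block), then apply the range predicate."""
    for start, end in seg_by_block.get((line, blk), []):
        if start <= pos < end:
            return (start, end)
    return None

results = []
for line, pos in points:
    blk = int(pos // BLOCK)
    # First join on the point's own block, then on the neighbouring block
    # (a segment starting there may reach across the boundary); coalesce.
    hit = lookup(line, pos, blk) or lookup(line, pos, blk - 1)
    results.append((line, pos, hit))

print(results)
```

The `or` plays the role of the coalesce: the first join that produces a match wins, and only segments crossing a block boundary ever need the second join.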