Github user zzeekk commented on the issue:
https://github.com/apache/spark/pull/7379
Hello @IceMan81, you need to truncate your timestamps to days, hours or
minutes, depending on your use case, and use the truncated value as the
additional equi-join condition.
Github user zzeekk commented on the issue:
https://github.com/apache/spark/pull/7379
@IceMan81 Here is an abstract example of our workaround: building blocks and
using them as additional equi-join conditions.
The task is to join points (df1) on a line back to segments of the same
line (df2
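A minimal sketch of the blocking idea in plain Python, with hypothetical data (the block size, line names and record fields are assumptions, not from the original example). Each segment is indexed under every block its range overlaps, so a point only needs to be compared against segments sharing its `(line, block)` equi-join key:

```python
from collections import defaultdict

BLOCK = 100  # assumed block size

# Hypothetical data: segments of a line cover [start, end) position
# ranges; points carry a single position on a line.
segments = [
    {"line": "L1", "start": 0,   "end": 150, "seg_id": "s1"},
    {"line": "L1", "start": 150, "end": 300, "seg_id": "s2"},
]
points = [
    {"line": "L1", "pos": 120, "point_id": "p1"},
    {"line": "L1", "pos": 220, "point_id": "p2"},
]

# Index each segment under (line, block) for every block it overlaps.
index = defaultdict(list)
for s in segments:
    for block in range(s["start"] // BLOCK, (s["end"] - 1) // BLOCK + 1):
        index[(s["line"], block)].append(s)

# Equi-join on (line, block), then apply the range predicate only to
# the few candidates that share the key.
matches = {}
for p in points:
    for s in index[(p["line"], p["pos"] // BLOCK)]:
        if s["start"] <= p["pos"] < s["end"]:
            matches[p["point_id"]] = s["seg_id"]
```

In Spark terms, the index corresponds to exploding each segment row into one row per overlapping block, then doing an ordinary equi-join on the line and block columns before filtering on the range condition.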
Github user zzeekk commented on the issue:
https://github.com/apache/spark/pull/7379
Same here. A workaround is to build blocks and add them as an equi-join
condition. But then you need to make an additional join on the following block
and coalesce the results.
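The "additional join plus coalesce" step can be sketched in plain Python (data and block size are hypothetical). Here each segment is keyed only by the block of its start, so a segment may spill into the following block; probing the preceding block from the point's side plays the role of the second join, and taking the first non-null match plays the role of the coalesce:

```python
BLOCK = 100  # assumed block size

# Hypothetical segments covering [start, end) ranges; s1 starts in
# block 0 but spills into block 1.
segments = [
    {"start": 50,  "end": 180, "seg_id": "s1"},
    {"start": 180, "end": 260, "seg_id": "s2"},
]

# Key each segment by the block of its start only.
index = {}
for s in segments:
    index.setdefault(s["start"] // BLOCK, []).append(s)

def lookup(pos):
    block = pos // BLOCK
    # First join: segments starting in the point's own block.
    first = next((s["seg_id"] for s in index.get(block, [])
                  if s["start"] <= pos < s["end"]), None)
    # Second join: segments starting in the preceding block that may
    # spill over into this one (the "following block" join, seen from
    # the segment's side).
    second = next((s["seg_id"] for s in index.get(block - 1, [])
                   if s["start"] <= pos < s["end"]), None)
    # Coalesce: take the first non-null result of the two joins.
    return first if first is not None else second
```

This keeps each build-side row single-keyed at the cost of two equi-joins on the probe side; the variant above assumes an interval never spans more than two blocks.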