paul-rogers commented on PR #13201:
URL: https://github.com/apache/druid/pull/13201#issuecomment-1276857438

   Generic question: when we say "Try converting all inner joins to filters", 
does this include the case where the join would return a billion (or 100 
billion) rows? That is, are we considering cardinality?
   
   In a traditional RDBMS, one looks at the (estimated) cardinality of the 
tables to determine whether this kind of conversion is safe. Set some 
threshold: 100 items? 1000? 10K? If the estimated cardinality is above that, 
the memory-for-time tradeoff doesn't work and we're better off retaining the 
join.
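   To make the tradeoff concrete, here is a minimal sketch (not Druid's actual planner code; the class name, method name, and threshold are all invented for illustration) of the kind of cardinality check described above:

```java
// Hypothetical sketch of a cardinality-threshold heuristic for deciding
// whether to rewrite an inner join as a filter. Not actual Druid code.
public final class JoinRewriteHeuristic {
  // Assumed cutoff; the comment above floats 100, 1000, or 10K as candidates.
  private static final long MAX_FILTER_VALUES = 10_000L;

  /**
   * Returns true if rewriting the inner join as an IN-style filter is likely
   * a good memory-for-time tradeoff, i.e. the estimated number of join-key
   * values on the right-hand side is small enough to hold as a filter.
   */
  public static boolean shouldRewriteAsFilter(long estimatedRightSideRows) {
    return estimatedRightSideRows >= 0
        && estimatedRightSideRows <= MAX_FILTER_VALUES;
  }

  public static void main(String[] args) {
    // A small dimension table: rewrite as a filter.
    System.out.println(shouldRewriteAsFilter(500));
    // A billion-row input: retain the join.
    System.out.println(shouldRewriteAsFilter(1_000_000_000L));
  }
}
```

   The hard part, of course, is obtaining `estimatedRightSideRows` in the first place, which is exactly the cost-estimation machinery the next paragraph notes Druid lacks.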
   
   If the code can't figure this out (Druid doesn't really do cost 
estimation), how can a user force the join choice? A new keyword? A query hint 
in SQL? A query context entry?


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

