Re: How to do map join in Spark SQL

2015-12-20 Thread Alexander Pivovarov
> … 3, 12, 26, etc.)
>
> Also I have a small JSON file (just 8 rows) with range definitions (min, max, name):
>
> 0, 10, A
> 10, 20, B
> 20, 30, C
> etc.
>
> Because I cannot do an equi-join between duration and the range min/max, I need
> to do a cross join and apply a WHERE condition to take the records which belong
> to the range. A cross join is an expensive operation; I think this particular
> join is better done using a map join.
>
> How to do a map join in Spark SQL?
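Since the thread itself contains no code, here is a minimal sketch of the map-join idea in plain Python (the names `ranges`, `durations`, and `label` are illustrative, not from the thread): the small ranges table is held in memory on every worker, and each record of the large table is matched locally, so no cross join and no shuffle of the large table is needed.

```python
# Map-join sketch: the small (min, max, name) table lives in memory on every
# worker; each duration row is matched against it locally.

ranges = [(0, 10, "A"), (10, 20, "B"), (20, 30, "C")]  # the small 8-row table

def label(duration):
    # A linear scan is fine for 8 rows; each record touches only local data.
    for lo, hi, name in ranges:
        if lo <= duration < hi:
            return name
    return None

durations = [3, 12, 26]           # stand-in for the large table
labels = [label(d) for d in durations]  # → ["A", "B", "C"]
```

In Spark SQL this pattern corresponds to a broadcast join: in the Spark 1.5 era it is typically triggered by keeping the small table's size under `spark.sql.autoBroadcastJoinThreshold`, or by wrapping the small DataFrame with the `broadcast()` function from `org.apache.spark.sql.functions` before joining.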

Re: How to do map join in Spark SQL

2015-12-20 Thread Chris Fregly

Re: How to do map join in Spark SQL

2015-12-19 Thread Alexander Pivovarov

Re: How to do map join in Spark SQL

2015-12-18 Thread Akhil Das

How to do map join in Spark SQL

2015-12-15 Thread Alexander Pivovarov
… apply a WHERE condition to take the records which belong to the range. A cross join is an expensive operation; I think this particular join is better done using a map join. How to do a map join in Spark SQL?
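If the ranges are contiguous and sorted, as in the question (0-10, 10-20, 20-30), the cross join can be avoided entirely without any join hint: a binary search on the lower bounds finds each duration's bucket. A hedged sketch in plain Python (the names `bounds`, `names`, and `bucket` are illustrative); the same lookup could run inside a Spark UDF applied map-side:

```python
import bisect

# Sorted lower bounds and names of contiguous half-open ranges:
# [0, 10) -> A, [10, 20) -> B, [20, 30) -> C
bounds = [0, 10, 20]
names = ["A", "B", "C"]
upper = 30  # exclusive upper bound of the last range

def bucket(duration):
    # bisect_right counts how many lower bounds are <= duration;
    # subtracting 1 gives the index of the range the duration falls into.
    if not (bounds[0] <= duration < upper):
        return None
    return names[bisect.bisect_right(bounds, duration) - 1]
```

This replaces the O(rows × ranges) cross join + WHERE with an O(rows × log ranges) lookup, which matters once the ranges table grows beyond a handful of rows.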