+1
Tested 1.5.1 SQL blockers.
On Sat, Sep 26, 2015 at 1:36 PM, robineast wrote:
> +1
>
>
> build/mvn clean package -DskipTests -Pyarn -Phadoop-2.6
> OK
> Basic graph tests
> Load graph using edgeListFile...SUCCESS
> Run PageRank...SUCCESS
> Minimum Spanning Tree
+1 (non-binding)
Tested the JDBC data source and some of the TPC-DS queries.
Thanks everybody for voting. I'm going to close the vote now. The vote
passes with 17 +1 votes and 1 -1 vote. I will work on packaging this asap.
+1:
Reynold Xin*
Sean Owen
Hossein Falaki
Xiangrui Meng*
Krishna Sankar
Joseph Bradley
Sean McNamara*
Luciano Resende
Doug Balog
Eugene Zhulenev
We have to try to maintain binary compatibility here, so probably the
easiest thing to do would be to add a method to the class with a default
implementation. Perhaps something like:
def unhandledFilters(filters: Array[Filter]): Array[Filter] = filters
By default, this could return all filters, so existing sources' behavior
would remain unchanged.
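A minimal self-contained sketch of that idea (the `Filter` case classes and relation names below are illustrative stand-ins, not Spark's actual `org.apache.spark.sql.sources` types): a relation reports which filters it cannot handle, and the default of returning every filter means nothing is claimed as handled, so the engine keeps re-applying all filters exactly as before.

```scala
// Hypothetical filter hierarchy, standing in for Spark's sources.Filter.
sealed trait Filter
case class EqualTo(attribute: String, value: Any) extends Filter
case class GreaterThan(attribute: String, value: Any) extends Filter

trait BaseRelation {
  // Default: claim no filters are handled, preserving old behavior.
  def unhandledFilters(filters: Array[Filter]): Array[Filter] = filters
}

// A relation that can push down equality filters, but nothing else:
// it returns only the filters the engine must still evaluate itself.
class EqualityPushdownRelation extends BaseRelation {
  override def unhandledFilters(filters: Array[Filter]): Array[Filter] =
    filters.filterNot(_.isInstanceOf[EqualTo])
}

object Demo extends App {
  val filters: Array[Filter] = Array(EqualTo("id", 1), GreaterThan("age", 21))

  // Default trait implementation: everything comes back unhandled.
  val plain = new BaseRelation {}
  println(plain.unhandledFilters(filters).length) // all filters returned

  // Pushdown-capable relation: only the non-equality filter remains.
  val rel = new EqualityPushdownRelation
  println(rel.unhandledFilters(filters).mkString(", "))
}
```

Because the new method has a body, existing third-party relations compile and link unchanged, which is the binary-compatibility point being made above.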
Hi Anchit,
can you create more than one data item in each dataset to test again?
> On Sep 26, 2015, at 18:00, Fengdong Yu wrote:
>
> Anchit,
>
> please ignore my inputs. you are right. Thanks.
>
>
>
>> On Sep 26, 2015, at 17:27, Fengdong Yu
Shouldn't this discussion be held on the user list and not the dev list?
The dev list (this list) is for discussing development on Spark itself.
Please move the discussion accordingly.
Nick
On Sun, Sep 27, 2015 at 10:57 PM, Fengdong Yu wrote:
> Hi Anchit,
> can you create