Patrice Freydiere created FLINK-3911:
----------------------------------------

             Summary: Sort operation before a group reduce doesn't seem to be implemented on 1.0.2
                 Key: FLINK-3911
                 URL: https://issues.apache.org/jira/browse/FLINK-3911
             Project: Flink
          Issue Type: Bug
    Affects Versions: 1.0.2
         Environment: Linux Ubuntu, standalone cluster
            Reporter: Patrice Freydiere


I have this piece of code:

        // group by id and sort on the "order" field
        DataSet<Tuple2<Long, byte[]>> waysGeometry = joinedWaysWithPoints
                .groupBy(0)
                .sortGroup(1, Order.ASCENDING)
                .reduceGroup(new GroupReduceFunction<Tuple4<Long, Integer, Double, Double>, Tuple2<Long, byte[]>>() {
                    @Override
                    public void reduce(Iterable<Tuple4<Long, Integer, Double, Double>> values,
                            Collector<Tuple2<Long, byte[]>> out) throws Exception {
                        long id = -1;
                        // ... (rest of the reduce body omitted)
                    }
                });


and this exception when executing:

java.lang.Exception: The data preparation for task 'GroupReduce (GroupReduce at constructOSMStreams(ProcessOSM.java:112))' , caused an error: Unrecognized driver strategy for GroupReduce driver: SORTED_GROUP_COMBINE
        at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:456)
        at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:345)
        at org.apache.flink.runtime.taskmanager.Task.run(Task.java:559)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.Exception: Unrecognized driver strategy for GroupReduce driver: SORTED_GROUP_COMBINE
        at org.apache.flink.runtime.operators.GroupReduceDriver.prepare(GroupReduceDriver.java:90)
        at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:450)
        ... 3 more
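
For reference, a minimal, self-contained sketch of the same groupBy(0).sortGroup(1, ...).reduceGroup(...) pattern that should exercise the same code path. The class name, dummy input data, and the trivial reduce body are illustrative assumptions, not taken from the original job:

        import org.apache.flink.api.common.functions.GroupReduceFunction;
        import org.apache.flink.api.common.operators.Order;
        import org.apache.flink.api.java.DataSet;
        import org.apache.flink.api.java.ExecutionEnvironment;
        import org.apache.flink.api.java.tuple.Tuple2;
        import org.apache.flink.api.java.tuple.Tuple4;
        import org.apache.flink.util.Collector;

        public class SortGroupReduceRepro {
            public static void main(String[] args) throws Exception {
                ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

                // dummy input shaped like the joined ways/points data set: (wayId, order, lon, lat)
                DataSet<Tuple4<Long, Integer, Double, Double>> input = env.fromElements(
                        new Tuple4<>(1L, 2, 2.0, 2.0),
                        new Tuple4<>(1L, 1, 1.0, 1.0),
                        new Tuple4<>(2L, 1, 3.0, 3.0));

                // same pattern as the failing job: group by id, sort each group on the
                // order field, then reduce each sorted group
                DataSet<Tuple2<Long, byte[]>> result = input
                        .groupBy(0)
                        .sortGroup(1, Order.ASCENDING)
                        .reduceGroup(new GroupReduceFunction<Tuple4<Long, Integer, Double, Double>, Tuple2<Long, byte[]>>() {
                            @Override
                            public void reduce(Iterable<Tuple4<Long, Integer, Double, Double>> values,
                                    Collector<Tuple2<Long, byte[]>> out) throws Exception {
                                long id = -1;
                                int count = 0;
                                for (Tuple4<Long, Integer, Double, Double> value : values) {
                                    id = value.f0;
                                    count++;
                                }
                                // emit a dummy payload; the real job builds the way geometry here
                                out.collect(new Tuple2<>(id, new byte[count]));
                            }
                        });

                result.print();
            }
        }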






--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
