Re: How to handle ANY / ALL queries in Flink

2015-07-11 Thread Chiwan Park
Because there is no default implementation like a forany method in Scala, I used the forall method. Note that ANY (condition) is equivalent to NOT ALL (NOT condition). Regards, Chiwan Park > On Jul 12, 2015, at 5:39 AM, hagersaleh wrote: > > why in this use ! and <= in handle Any >override def filter(va
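The identity quoted above can be checked with any collection library; here is a minimal sketch in plain Java (the list and predicates are made up for illustration):

```java
import java.util.List;
import java.util.function.Predicate;

public class AnyViaAll {
    // ANY(condition) is equivalent to NOT ALL(NOT condition):
    // "at least one element satisfies p" <=> "it is false that every element fails p".
    static <T> boolean any(List<T> xs, Predicate<T> p) {
        return !xs.stream().allMatch(p.negate());
    }

    public static void main(String[] args) {
        List<Integer> xs = List.of(1, 2, 3);
        System.out.println(any(xs, x -> x > 2));              // true: 3 > 2
        System.out.println(xs.stream().anyMatch(x -> x > 2)); // same result with anyMatch
        System.out.println(any(xs, x -> x > 5));              // false: no element > 5
    }
}
```

Java's streams happen to ship an anyMatch, so the rewrite is only needed where (as in Scala's forall-only situation described above) no direct "any" primitive is available.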

Re: Error when using broadcast variables: cannot find symbol getRuntimeContext()

2015-07-11 Thread Chiwan Park
Hi, you should use RichMapFunction, not MapFunction. The difference between RichMapFunction and MapFunction is described in the Flink documentation [1]. Regards, Chiwan Park [1] https://ci.apache.org/projects/flink/flink-docs-master/apis/programming_guide.html#rich-functions > On Jul 12, 2015, at 7:
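Why the compiler reports "cannot find symbol" can be sketched with a stripped-down mock of the function hierarchy. The class names below mirror Flink's, but the bodies are toy stand-ins for illustration, not the real Flink API:

```java
import java.util.List;
import java.util.Map;

public class RichFunctionSketch {
    // Toy stand-in for Flink's RuntimeContext: only broadcast-variable lookup.
    static class RuntimeContext {
        private final Map<String, List<?>> broadcastVars;
        RuntimeContext(Map<String, List<?>> vars) { this.broadcastVars = vars; }
        @SuppressWarnings("unchecked")
        <T> List<T> getBroadcastVariable(String name) {
            return (List<T>) broadcastVars.get(name);
        }
    }

    // Plain MapFunction: only the transformation, no runtime context at all.
    // Calling getRuntimeContext() inside an implementation of this interface
    // is exactly the "cannot find symbol" error from the thread.
    interface MapFunction<IN, OUT> {
        OUT map(IN value);
    }

    // Rich variant: the framework injects a context before map() runs,
    // so subclasses can call getRuntimeContext().
    static abstract class RichMapFunction<IN, OUT> implements MapFunction<IN, OUT> {
        private RuntimeContext ctx;
        void setRuntimeContext(RuntimeContext ctx) { this.ctx = ctx; }
        RuntimeContext getRuntimeContext() { return ctx; }
    }

    // Helper: run a rich mapper that adds the sum of a broadcast list to its input.
    static int applyWithBroadcast(int value, List<Integer> offsets) {
        RichMapFunction<Integer, Integer> fn = new RichMapFunction<>() {
            @Override
            public Integer map(Integer v) {
                List<Integer> bc = getRuntimeContext().getBroadcastVariable("offsets");
                return v + bc.stream().mapToInt(Integer::intValue).sum();
            }
        };
        fn.setRuntimeContext(new RuntimeContext(Map.of("offsets", offsets)));
        return fn.map(value);
    }

    public static void main(String[] args) {
        System.out.println(applyWithBroadcast(5, List.of(10, 20))); // 35
    }
}
```

In real Flink code the fix is the one-line change from the reply: extend RichMapFunction instead of implementing MapFunction, and the framework supplies the context before the function runs.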

Error when using broadcast variables: cannot find symbol getRuntimeContext()

2015-07-11 Thread hagersaleh
import java.util.Collection;
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;

ExecutionEnvironment env = ExecutionEnvironment.getExecut

Re: How to handle ANY / ALL queries in Flink

2015-07-11 Thread hagersaleh
Why does this use ! and <= to handle ANY?

override def filter(value: Product): Boolean = !bcSet.forall(value.model <= _) }).withBroadcastSet(pcModels, "pcModels").distinct("maker").map(_.maker)
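The `!` and `<=` come from rewriting "value.model > ANY (pc models)" with the ANY/ALL identity: some pc model is below value.model exactly when it is not the case that every pc model is at or above it. A minimal Java sketch of that predicate (the Product record, maker names, and model numbers are hypothetical, not from the thread's data set):

```java
import java.util.List;

public class AnyFilterSketch {
    // Hypothetical stand-in for the thread's Product type.
    record Product(String maker, int model) {}

    // "value.model > ANY(pcModels)" rewritten as NOT ALL(value.model <= pc),
    // mirroring the Scala !bcSet.forall(value.model <= _) from the filter.
    static boolean modelGreaterThanAny(Product value, List<Integer> pcModels) {
        return !pcModels.stream().allMatch(pc -> value.model() <= pc);
    }

    public static void main(String[] args) {
        List<Integer> pcModels = List.of(1001, 1002);
        System.out.println(modelGreaterThanAny(new Product("A", 1005), pcModels)); // true: 1005 > 1001
        System.out.println(modelGreaterThanAny(new Product("B", 1000), pcModels)); // false: below both
    }
}
```

Products passing the filter are those whose model exceeds at least one broadcast pc model, which is what the subsequent distinct("maker").map(_.maker) then projects.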

HBase & Machine Learning

2015-07-11 Thread Lydia Ickler
Dear Sir or Madam, I would like to use the Flink-HBase addon to read out data that then serves as input for the machine learning algorithms, namely SVM and MLR. Right now I first write the extracted data to a temporary file and then read it in via the libSVM method... but I guess t

Re: How to handle ANY / ALL queries in Flink

2015-07-11 Thread Chiwan Park
Hi, I wrote an example including the queries you want [1]. The example uses only the Flink Scala API, but I think it would be better to use the Table API. I used a broadcast set [2] to perform the subqueries in your given query. Flink has many functions to handle data and great documentation to explain t