[GitHub] nswamy commented on a change in pull request #9678: [MXNET-50] Scala Inference APIs

2018-03-16 Thread GitBox
nswamy commented on a change in pull request #9678: [MXNET-50] Scala Inference 
APIs
URL: https://github.com/apache/incubator-mxnet/pull/9678#discussion_r175205403
 
 

 ##
 File path: 
scala-package/infer/src/main/scala/ml/dmlc/mxnet/infer/MXNetHandler.scala
 ##
 @@ -0,0 +1,95 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package ml.dmlc.mxnet.infer
+
+import java.util.concurrent._
+
+import org.slf4j.LoggerFactory
+
+private[infer] trait MXNetHandler {
+
+  def execute[T](f: => T): T
+
+  val executor: ExecutorService
+
+}
+
+private[infer] object MXNetHandlerType extends Enumeration {
+
+  type MXNetHandlerType = Value
+  val SingleThreadHandler = Value("MXNetSingleThreadHandler")
+  val OneThreadPerModelHandler = Value("MXNetOneThreadPerModelHandler")
+}
+
+private[infer] class MXNetThreadPoolHandler(numThreads: Option[Int] = Some(1))
+  extends MXNetHandler {
+  private val logger = LoggerFactory.getLogger(classOf[MXNetThreadPoolHandler])
+
+  private val threadFactory = new ThreadFactory {
+
+    override def newThread(r: Runnable): Thread = new Thread(r) {
+      setName(classOf[MXNetThreadPoolHandler].getCanonicalName
+        + "-numThreads: %d".format(numThreads.get))
+    }
+  }
+
+  override val executor: ExecutorService =
+    Executors.newFixedThreadPool(numThreads.get, threadFactory)
+
+  private val creatorThread = executor.submit(new Callable[Thread] {
+    override def call(): Thread = Thread.currentThread()
+  }).get()
+
+  override def execute[T](f: => T): T = {
+
+    if (Thread.currentThread() eq creatorThread) {
+      f
+    } else {
+
+      val task = new Callable[T] {
+        override def call(): T = {
+          logger.info("threadId: %s".format(Thread.currentThread().getId()))
+          f
+        }
+      }
+
+      val result = executor.submit(task)
+      try {
+        result.get()
+      } catch {
+        case e: Exception => throw e.getCause()
 
 Review comment:
   You might also get other exceptions; I think it makes sense to catch all Exceptions. I will rethrow InterruptedException as-is, separately.
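
   For illustration, a minimal sketch of the catch logic described here, assuming the executor/submit shape shown in the diff above (this is not the PR's exact final code):

     import java.util.concurrent.{Callable, Executors}

     object ExecuteCatchSketch {
       private val executor = Executors.newSingleThreadExecutor()

       def execute[T](f: => T): T = {
         val task = new Callable[T] { override def call(): T = f }
         val result = executor.submit(task)
         try {
           result.get()
         } catch {
           // Interruption of the waiting thread is rethrown as-is.
           case e: InterruptedException => throw e
           // Anything else (typically ExecutionException) is unwrapped to the
           // original failure when a cause is present.
           case e: Exception => throw Option(e.getCause).getOrElse(e)
         }
       }
     }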




[GitHub] nswamy commented on a change in pull request #9678: [MXNET-50] Scala Inference APIs

2018-03-16 Thread GitBox
nswamy commented on a change in pull request #9678: [MXNET-50] Scala Inference 
APIs
URL: https://github.com/apache/incubator-mxnet/pull/9678#discussion_r175195701
 
 

 ##
 File path: 
scala-package/infer/src/main/scala/ml/dmlc/mxnet/infer/MXNetHandler.scala
 ##
 @@ -0,0 +1,95 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package ml.dmlc.mxnet.infer
+
+import java.util.concurrent._
+
+import org.slf4j.LoggerFactory
+
+private[infer] trait MXNetHandler {
+
+  def execute[T](f: => T): T
+
+  val executor: ExecutorService
+
+}
+
+private[infer] object MXNetHandlerType extends Enumeration {
+
+  type MXNetHandlerType = Value
+  val SingleThreadHandler = Value("MXNetSingleThreadHandler")
+  val OneThreadPerModelHandler = Value("MXNetOneThreadPerModelHandler")
+}
+
+private[infer] class MXNetThreadPoolHandler(numThreads: Option[Int] = Some(1))
 
 Review comment:
   Good point. Regardless of whether it is an Option or just an Int, I needed to validate that the value is > 0. I changed it to an Int and added the validation.
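
   For reference, a tiny sketch of that validation, assuming the parameter is now a plain Int (the class name is an illustrative stand-in):

     // Constructing the handler with a non-positive count fails fast,
     // e.g. new MXNetThreadPoolHandlerSketch(0) throws IllegalArgumentException.
     class MXNetThreadPoolHandlerSketch(numThreads: Int = 1) {
       require(numThreads > 0,
         s"numThreads should be a positive integer, found: $numThreads")
     }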




[GitHub] nswamy commented on a change in pull request #9678: [MXNET-50] Scala Inference APIs

2018-03-16 Thread GitBox
nswamy commented on a change in pull request #9678: [MXNET-50] Scala Inference 
APIs
URL: https://github.com/apache/incubator-mxnet/pull/9678#discussion_r175096557
 
 

 ##
 File path: 
scala-package/infer/src/main/scala/ml/dmlc/mxnet/infer/MXNetHandler.scala
 ##
 @@ -0,0 +1,94 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package ml.dmlc.mxnet.infer
+
+import java.util.concurrent._
+
+import org.slf4j.LoggerFactory
+
+private[infer] trait MXNetHandler {
+
+  def execute[T](f: => T): T
+
+  val executor: ExecutorService
+
+}
+
+private[infer] object MXNetHandlerType extends Enumeration {
+
+  type MXNetHandlerType = Value
+  val SingleThreadHandler = Value("MXNetSingleThreadHandler")
+  val OneThreadPerModelHandler = Value("MXNetOneThreadPerModelHandler")
+}
+
+private[infer] class MXNetThreadPoolHandler(numThreads: Option[Int] = Some(1))
+  extends MXNetHandler {
+  private val logger = LoggerFactory.getLogger(classOf[MXNetThreadPoolHandler])
+
+  private val threadFactory = new ThreadFactory {
+
+    override def newThread(r: Runnable): Thread = new Thread(r) {
+      setName(classOf[MXNetThreadPoolHandler].getCanonicalName
+        + "-numThreads: %d".format(numThreads.get))
+    }
+  }
+
+  override val executor: ExecutorService =
+    Executors.newFixedThreadPool(numThreads.get, threadFactory)
+
+  private val creatorThread = executor.submit(new Callable[Thread] {
+    override def call(): Thread = Thread.currentThread()
+  }).get()
+
+  override def execute[T](f: => T): T = {
+
+    if (Thread.currentThread() eq creatorThread) {
+      f
 
 Review comment:
   This doesn't work since it does not take any parameters, so I added a return to be more explicit. However, a return statement fails the style check, so I have to keep it the way it is.
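
   For context, a small sketch of why the bare f already acts as the result (Scala's if/else is an expression, so no explicit return is needed); the submit branch is elided here and this is not the PR's exact code:

     object ExpressionResultSketch {
       def execute[T](onCreatorThread: Boolean)(f: => T): T =
         if (onCreatorThread) {
           f // the value of this branch is the value of execute
         } else {
           f // the real handler submits f to the executor in this branch
         }
     }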




[GitHub] nswamy commented on a change in pull request #9678: [MXNET-50] Scala Inference APIs

2018-03-16 Thread GitBox
nswamy commented on a change in pull request #9678: [MXNET-50] Scala Inference 
APIs
URL: https://github.com/apache/incubator-mxnet/pull/9678#discussion_r175096557
 
 

 ##
 File path: 
scala-package/infer/src/main/scala/ml/dmlc/mxnet/infer/MXNetHandler.scala
 ##
 @@ -0,0 +1,94 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package ml.dmlc.mxnet.infer
+
+import java.util.concurrent._
+
+import org.slf4j.LoggerFactory
+
+private[infer] trait MXNetHandler {
+
+  def execute[T](f: => T): T
+
+  val executor: ExecutorService
+
+}
+
+private[infer] object MXNetHandlerType extends Enumeration {
+
+  type MXNetHandlerType = Value
+  val SingleThreadHandler = Value("MXNetSingleThreadHandler")
+  val OneThreadPerModelHandler = Value("MXNetOneThreadPerModelHandler")
+}
+
+private[infer] class MXNetThreadPoolHandler(numThreads: Option[Int] = Some(1))
+  extends MXNetHandler {
+  private val logger = LoggerFactory.getLogger(classOf[MXNetThreadPoolHandler])
+
+  private val threadFactory = new ThreadFactory {
+
+    override def newThread(r: Runnable): Thread = new Thread(r) {
+      setName(classOf[MXNetThreadPoolHandler].getCanonicalName
+        + "-numThreads: %d".format(numThreads.get))
+    }
+  }
+
+  override val executor: ExecutorService =
+    Executors.newFixedThreadPool(numThreads.get, threadFactory)
+
+  private val creatorThread = executor.submit(new Callable[Thread] {
+    override def call(): Thread = Thread.currentThread()
+  }).get()
+
+  override def execute[T](f: => T): T = {
+
+    if (Thread.currentThread() eq creatorThread) {
+      f
 
 Review comment:
   This doesn't work since it does not take any parameters, so I added a return to be more explicit.




[GitHub] nswamy commented on a change in pull request #9678: [MXNET-50] Scala Inference APIs

2018-03-16 Thread GitBox
nswamy commented on a change in pull request #9678: [MXNET-50] Scala Inference 
APIs
URL: https://github.com/apache/incubator-mxnet/pull/9678#discussion_r175091492
 
 

 ##
 File path: 
scala-package/infer/src/main/scala/ml/dmlc/mxnet/infer/Predictor.scala
 ##
 @@ -0,0 +1,197 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package ml.dmlc.mxnet.infer
+
+import ml.dmlc.mxnet.io.NDArrayIter
+import ml.dmlc.mxnet.{Context, DataDesc, NDArray, Shape}
+import ml.dmlc.mxnet.module.Module
+
+import scala.collection.mutable.ListBuffer
+import org.slf4j.LoggerFactory
+
+/**
+ * Base Trait for MXNet Predictor classes.
+ */
+private[mxnet] trait PredictBase {
+
+  /**
+   * This method will take input as IndexedSeq one dimensional arrays and 
creates
+   * NDArray needed for inference. The array will be reshaped based on the 
input descriptors.
+   * @param input: A IndexedSequence of Scala one-dimensional array, An 
IndexedSequence is
+   * is needed when the model has more than one input/output
 
 Review comment:
   This is for models that need more than one input; a crude example would be a model that takes two different images. Are you implying that it can just be another dimension of the same input?
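
   As a hypothetical illustration of the IndexedSeq shape for a multi-input model (the sizes and the two-image model are made up for this sketch):

     object MultiInputSketch {
       // Two flattened inputs for a hypothetical model with two input nodes
       // (e.g. two different images), one array per input descriptor.
       val imageA: Array[Float] = Array.fill(3 * 224 * 224)(0f)
       val imageB: Array[Float] = Array.fill(3 * 224 * 224)(0f)

       // The outer IndexedSeq indexes input nodes, not an extra data dimension:
       val input: IndexedSeq[Array[Float]] = IndexedSeq(imageA, imageB)
       // predictor.predict(input)  // predictor construction elided in this sketch
     }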




[GitHub] nswamy commented on a change in pull request #9678: [MXNET-50] Scala Inference APIs

2018-03-16 Thread GitBox
nswamy commented on a change in pull request #9678: [MXNET-50] Scala Inference 
APIs
URL: https://github.com/apache/incubator-mxnet/pull/9678#discussion_r175091982
 
 

 ##
 File path: 
scala-package/infer/src/main/scala/ml/dmlc/mxnet/infer/Predictor.scala
 ##
 @@ -0,0 +1,197 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package ml.dmlc.mxnet.infer
+
+import ml.dmlc.mxnet.io.NDArrayIter
+import ml.dmlc.mxnet.{Context, DataDesc, NDArray, Shape}
+import ml.dmlc.mxnet.module.Module
+
+import scala.collection.mutable.ListBuffer
+import org.slf4j.LoggerFactory
+
+/**
+ * Base Trait for MXNet Predictor classes.
+ */
+private[mxnet] trait PredictBase {
+
+  /**
+   * This method will take input as IndexedSeq one dimensional arrays and 
creates
+   * NDArray needed for inference. The array will be reshaped based on the 
input descriptors.
+   * @param input: A IndexedSequence of Scala one-dimensional array, An 
IndexedSequence is
+   * is needed when the model has more than one input/output
+   * @return IndexedSequence array of outputs.
+   */
+  def predict(input: IndexedSeq[Array[Float]]): IndexedSeq[Array[Float]]
+
+  /**
+   * Predict using NDArray as input. This method is useful when the input is a 
batch of data
+   * or when multiple operations on the input/output have to performed.
+   * Note: User is responsible for managing allocation/deallocation of 
NDArrays.
+   * @param input: IndexedSequence NDArrays.
+   * @return output of Predictions as NDArrays.
+   */
+  def predictWithNDArray(input: IndexedSeq[NDArray]): IndexedSeq[NDArray]
+
+}
+
+/**
+ * Implementation of predict routines.
+ *
+ * @param modelPathPrefix PathPrefix from where to load the model.
+ *Example: file://model-dir/resnet-152(containing 
resnet-152-symbol.json,
+ * @param inputDescriptors Descriptors defining the input node names, shape,
+ * layout and Type parameters.
+ * Note: If the input Descriptors is missing batchSize('N' in layout),
+ * a batchSize of 1 is assumed for the model.
+ * 
+ * @param contexts Device Contexts on which you want to run Inference, 
defaults to CPU.
+ * @param epoch Model epoch to load, defaults to 0.
+ */
+class Predictor(modelPathPrefix: String,
+protected val inputDescriptors: IndexedSeq[DataDesc],
+protected val contexts: Array[Context] = Context.cpu(),
+protected val epoch: Option[Int] = Some(0))
+extends PredictBase {
+
+  private val logger = LoggerFactory.getLogger(classOf[Predictor])
+
+  require(inputDescriptors.head.layout.size != 0, "layout size should not be zero")
+
+  protected[mxnet] var batchIndex = inputDescriptors(0).layout.indexOf('N')
+  protected[mxnet] var batchSize = if (batchIndex != -1) inputDescriptors(0).shape(batchIndex)
+    else 1
+
+  protected[mxnet] var iDescriptors = inputDescriptors
+
+  inputDescriptors.foreach((f: DataDesc) => require(f.layout.indexOf('N') == batchIndex,
+    "batch size should be in the same index for all inputs"))
+
+  if (batchIndex != -1) {
+    inputDescriptors.foreach((f: DataDesc) => require(f.shape(batchIndex) == batchSize,
+      "batch size should be same for all inputs"))
+  } else {
+    // Note: this is assuming that the input needs a batch
+    logger.warn("InputDescriptor does not have batchSize, using 1 as the default batchSize")
+    iDescriptors = inputDescriptors.map((f: DataDesc) => new DataDesc(f.name,
+      Shape(1 +: f.shape.toVector), f.dtype, 'N' +: f.layout))
+    batchIndex = 1
+  }
+
+  protected[mxnet] val mxNetHandler = MXNetHandler()
+
+  protected[mxnet] val mod = loadModule()
 
 Review comment:
   If I don't make it package-private to infer, I won't be able to mock it and unit test it easily.
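
   A rough sketch of the testing pattern that package-level visibility enables (the class and member names here are illustrative stand-ins, not the PR's code):

     package ml.dmlc.mxnet.infer

     // Stand-in for the Module dependency.
     trait ModuleLike {
       def forward(): Unit = ()
     }

     class PredictorLike {
       // Visible within the infer package, so tests can override or mock it.
       protected[infer] def loadModule(): ModuleLike = new ModuleLike {}
     }

     // A test double in the same package swaps in a stub module.
     class PredictorWithStubModule(stub: ModuleLike) extends PredictorLike {
       override protected[infer] def loadModule(): ModuleLike = stub
     }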




[GitHub] nswamy commented on a change in pull request #9678: [MXNET-50] Scala Inference APIs

2018-03-16 Thread GitBox
nswamy commented on a change in pull request #9678: [MXNET-50] Scala Inference 
APIs
URL: https://github.com/apache/incubator-mxnet/pull/9678#discussion_r175091492
 
 

 ##
 File path: 
scala-package/infer/src/main/scala/ml/dmlc/mxnet/infer/Predictor.scala
 ##
 @@ -0,0 +1,197 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package ml.dmlc.mxnet.infer
+
+import ml.dmlc.mxnet.io.NDArrayIter
+import ml.dmlc.mxnet.{Context, DataDesc, NDArray, Shape}
+import ml.dmlc.mxnet.module.Module
+
+import scala.collection.mutable.ListBuffer
+import org.slf4j.LoggerFactory
+
+/**
+ * Base Trait for MXNet Predictor classes.
+ */
+private[mxnet] trait PredictBase {
+
+  /**
+   * This method will take input as IndexedSeq one dimensional arrays and 
creates
+   * NDArray needed for inference. The array will be reshaped based on the 
input descriptors.
+   * @param input: A IndexedSequence of Scala one-dimensional array, An 
IndexedSequence is
+   * is needed when the model has more than one input/output
 
 Review comment:
   This is for models that need more than one input; a crude example would be a model that takes two different images.




[GitHub] nswamy commented on a change in pull request #9678: [MXNET-50] Scala Inference APIs

2018-03-16 Thread GitBox
nswamy commented on a change in pull request #9678: [MXNET-50] Scala Inference 
APIs
URL: https://github.com/apache/incubator-mxnet/pull/9678#discussion_r175090171
 
 

 ##
 File path: 
scala-package/infer/src/main/scala/ml/dmlc/mxnet/infer/Classifier.scala
 ##
 @@ -0,0 +1,170 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package ml.dmlc.mxnet.infer
+
+import ml.dmlc.mxnet.{Context, DataDesc, NDArray}
+import java.io.File
+
+import org.slf4j.LoggerFactory
+
+import scala.io
+import scala.collection.mutable.ListBuffer
+
+trait ClassifierBase {
+
+  /**
+* Takes an Array of Floats and returns corresponding labels, score tuples.
+* @param input: IndexedSequence one-dimensional array of Floats.
+* @param topK: (Optional) How many top_k(sorting will be based on the last 
axis)
+* elements to return, if not passed returns unsorted output.
+* @return IndexedSequence of (Label, Score) tuples.
+*/
+  def classify(input: IndexedSeq[Array[Float]],
+   topK: Option[Int] = None): IndexedSeq[(String, Float)]
+
+  /**
+* Takes a Sequence of NDArrays and returns Label, Score tuples.
+* @param input: Indexed Sequence of NDArrays
+* @param topK: (Optional) How many top_k(sorting will be based on the last 
axis)
+* elements to return, if not passed returns unsorted output.
+* @return Traversable Sequence of (Label, Score) tuple, Score will be in 
the form of NDArray
+*/
+  def classifyWithNDArray(input: IndexedSeq[NDArray],
+  topK: Option[Int] = None): 
IndexedSeq[IndexedSeq[(String, Float)]]
+}
+
+/**
+  * A class for classifier tasks
+  * @param modelPathPrefix PathPrefix from where to load the symbol, 
parameters and synset.txt
+  *Example: file://model-dir/resnet-152(containing 
resnet-152-symbol.json
+  *file://model-dir/synset.txt
+  * @param inputDescriptors Descriptors defining the input node names, shape,
+  * layout and Type parameters
+  * @param contexts Device Contexts on which you want to run Inference, 
defaults to CPU.
+  * @param epoch Model epoch to load, defaults to 0.
+  */
+class Classifier(modelPathPrefix: String,
+ protected val inputDescriptors: IndexedSeq[DataDesc],
+ protected val contexts: Array[Context] = Context.cpu(),
+ protected val epoch: Option[Int] = Some(0))
+  extends ClassifierBase {
+
+  private val logger = LoggerFactory.getLogger(classOf[Classifier])
+
+  protected[mxnet] val predictor: PredictBase = getPredictor()
+
+  protected[mxnet] val synsetFilePath = getSynsetFilePath(modelPathPrefix)
+
+  protected[mxnet] val synset = readSynsetFile(synsetFilePath)
+
+  protected[mxnet] val handler = MXNetHandler()
+
+  /**
+* Takes a flat arrays as input and returns a List of (Label, tuple)
+* @param input: IndexedSequence one-dimensional array of Floats.
+* @param topK: (Optional) How many top_k(sorting will be based on the last 
axis)
+* elements to return, if not passed returns unsorted output.
+* @return IndexedSequence of (Label, Score) tuples.
+*/
+  override def classify(input: IndexedSeq[Array[Float]],
+topK: Option[Int] = None): IndexedSeq[(String, Float)] 
= {
+
+// considering only the first output
+val predictResult = predictor.predict(input)(0)
+var result: IndexedSeq[(String, Float)] = IndexedSeq.empty
+
+if (topK.isDefined) {
+  val sortedIndex = 
predictResult.zipWithIndex.sortBy(-_._1).map(_._2).take(topK.get)
+  result = sortedIndex.map(i => (synset(i), predictResult(i))).toIndexedSeq
+} else {
+  result = synset.zip(predictResult).toIndexedSeq
+}
+result
+  }
+
+  /**
+* Takes input as NDArrays, useful when you want to perform multiple 
operations on
+* the input Array or when you want to pass a batch of input.
+* @param input: Indexed Sequence of NDArrays
+* @param topK: (Optional) How many top_k(sorting will be based on the last 
axis)
+* elements to return, if not passed returns unsorted output.
+* @return Traversable Sequence of (Label

[GitHub] nswamy commented on a change in pull request #9678: [MXNET-50] Scala Inference APIs

2018-03-14 Thread GitBox
nswamy commented on a change in pull request #9678: [MXNET-50] Scala Inference 
APIs
URL: https://github.com/apache/incubator-mxnet/pull/9678#discussion_r174657442
 
 

 ##
 File path: 
scala-package/infer/src/main/scala/ml/dmlc/mxnet/infer/Classifier.scala
 ##
 @@ -0,0 +1,167 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package ml.dmlc.mxnet.infer
+
+import ml.dmlc.mxnet.{DataDesc, NDArray}
+import java.io.File
+
+import org.slf4j.LoggerFactory
+
+import scala.io
+import scala.collection.mutable.ListBuffer
+
+trait ClassifierBase {
+
+  /**
+* Takes an Array of Floats and returns corresponding labels, score tuples.
+* @param input: IndexedSequence one-dimensional array of Floats.
+* @param topK: (Optional) How many top_k(sorting will be based on the last 
axis)
+* elements to return, if not passed returns unsorted output.
+* @return IndexedSequence of (Label, Score) tuples.
+*/
+  def classify(input: IndexedSeq[Array[Float]],
+   topK: Option[Int] = None): List[(String, Float)]
 
 Review comment:
   Ok, changed the return type to IndexedSeq instead.




[GitHub] nswamy commented on a change in pull request #9678: [MXNET-50] Scala Inference APIs

2018-03-14 Thread GitBox
nswamy commented on a change in pull request #9678: [MXNET-50] Scala Inference 
APIs
URL: https://github.com/apache/incubator-mxnet/pull/9678#discussion_r174657396
 
 

 ##
 File path: 
scala-package/infer/src/main/scala/ml/dmlc/mxnet/infer/MXNetHandler.scala
 ##
 @@ -0,0 +1,82 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package ml.dmlc.mxnet.infer
+
+import java.util.concurrent._
+
+trait MXNetHandler {
+
+  def execute[T](f: => T): T
+
+  val executor: ExecutorService
+
+}
+
+object MXNetHandlerType extends Enumeration {
+
+  type MXNetHandlerType = Value
+  val SingleThreadHandler = Value("MXNetSingleThreadHandler")
+  val OneThreadPerModelHandler = Value("MXNetOneThreadPerModelHandler")
+}
+
+class MXNetOneThreadPerModelHandler extends MXNetHandler {
+
+  private val threadFactory = new ThreadFactory {
+
+    override def newThread(r: Runnable): Thread = new Thread(r) {
+      setName(classOf[MXNetOneThreadPerModelHandler].getCanonicalName)
+    }
+  }
+
+  override val executor: ExecutorService = Executors.newFixedThreadPool(10, threadFactory)
+
+  override def execute[T](f: => T): T = {
 
 Review comment:
   I did not encounter this when I unit tested by running on the same thread. However, I fixed it based on your implementation.
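
   The fix referred to is the check against the pool's own thread in execute; a condensed sketch of why it matters for a one-thread pool (not the PR's exact code):

     import java.util.concurrent.{Callable, Executors}

     object CreatorThreadSketch {
       private val executor = Executors.newSingleThreadExecutor()

       // Capture the pool's single worker thread once, up front.
       private val creatorThread: Thread =
         executor.submit(new Callable[Thread] {
           override def call(): Thread = Thread.currentThread()
         }).get()

       def execute[T](f: => T): T =
         if (Thread.currentThread() eq creatorThread) {
           // Re-entered from the worker itself (f called back into the handler):
           // submitting again would deadlock a one-thread pool, so run inline.
           f
         } else {
           executor.submit(new Callable[T] { override def call(): T = f }).get()
         }
     }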




[GitHub] nswamy commented on a change in pull request #9678: [MXNET-50] Scala Inference APIs

2018-03-14 Thread GitBox
nswamy commented on a change in pull request #9678: [MXNET-50] Scala Inference 
APIs
URL: https://github.com/apache/incubator-mxnet/pull/9678#discussion_r174607772
 
 

 ##
 File path: 
scala-package/infer/src/main/scala/ml/dmlc/mxnet/infer/Predictor.scala
 ##
 @@ -0,0 +1,188 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package ml.dmlc.mxnet.infer
+
+import ml.dmlc.mxnet.io.NDArrayIter
+import ml.dmlc.mxnet.{DataDesc, NDArray, Shape}
+import ml.dmlc.mxnet.module.Module
+
+import scala.collection.mutable.ListBuffer
+import org.slf4j.LoggerFactory
+
+/**
+ * Base Trait for MXNet Predictor classes.
+ */
+private[mxnet] trait PredictBase {
+
+  /**
+   * This method will take input as IndexedSeq one dimensional arrays and 
creates
+   * NDArray needed for inference. The array will be reshaped based on the 
input descriptors.
+   * @param input: A IndexedSequence of Scala one-dimensional array, An 
IndexedSequence is
+   * is needed when the model has more than one input/output
+   * @return IndexedSequence array of outputs.
+   */
+  def predict(input: IndexedSeq[Array[Float]]): IndexedSeq[Array[Float]]
+
+  /**
+   * Predict using NDArray as input. This method is useful when the input is a 
batch of data
+   * or when multiple operations on the input/output have to performed.
+   * Note: User is responsible for managing allocation/deallocation of 
NDArrays.
+   * @param input: IndexedSequence NDArrays.
+   * @return output of Predictions as NDArrays.
+   */
+  def predictWithNDArray(input: IndexedSeq[NDArray]): IndexedSeq[NDArray]
+
+}
+
+/**
+ * Implementation of predict routines.
+ *
+ * @param modelPathPrefix PathPrefix from where to load the model.
+ *Example: file://model-dir/resnet-152(containing 
resnet-152-symbol.json,
+ * @param inputDescriptors Descriptors defining the input node names, shape,
+ * layout and Type parameters.
+ * Note: If the input Descriptors is missing batchSize('N' in layout),
+ * a batchSize of 1 is assumed for the model.
+ * 
+ */
+class Predictor(modelPathPrefix: String, protected val inputDescriptors: 
IndexedSeq[DataDesc])
+  extends PredictBase {
+
+  private val logger = LoggerFactory.getLogger(classOf[Predictor])
+
+  protected var batchIndex = inputDescriptors(0).layout.indexOf('N')
+  protected var batchSize = if (batchIndex != -1) 
inputDescriptors(0).shape(batchIndex) else 1
+
+  protected var iDescriptors = inputDescriptors
+
+  inputDescriptors.foreach((f: DataDesc) => require(f.layout.indexOf('N') == 
batchIndex,
+"batch size should be in the same index for all inputs"))
+
+  if (batchIndex != -1) {
+inputDescriptors.foreach((f: DataDesc) => require(f.shape(batchIndex) == 
batchSize,
+  "batch size should be same for all inputs"))
+  } else {
+// Note: this is assuming that the input needs a batch
+logger.warn("InputDescriptor does not have batchSize, using 1 as the 
default batchSize")
+iDescriptors = inputDescriptors.map((f: DataDesc) => new DataDesc(f.name,
+  Shape(1 +: f.shape.toVector), f.dtype, 'N' +: f.layout))
+batchIndex = 1
+  }
+
+  protected val mxNetHandler = MXNetHandler()
+
+  protected val mod = loadModule()
+
+  /**
+   * This method will take input as IndexedSeq one dimensional arrays and 
creates
+   * NDArray needed for inference. The array will be reshaped based on the 
input descriptors.
+   *
+   * @param input : A IndexedSequence of Scala one-dimensional array, An 
IndexedSequence is
+   *  is needed when the model has more than one input/output
+   * @return IndexedSequence array of outputs.
+   */
+  override def predict(input: IndexedSeq[Array[Float]])
+  : IndexedSeq[Array[Float]] = {
+
+require(input.length == inputDescriptors.length, "number of inputs 
provided: %d" +
+  " does not match number of inputs in inputDescriptors: 
%d".format(input.length,
+inputDescriptors.length))
+
+for((i, d) <- input.zip(inputDescriptors)) {
+  require (i.length == d.shape.product/batchSize, "number of elements:" +
+" %d in the input does not match the shape:%s".format( i.length, 
d.shape.toString()))
+}
+var inputND: ListBuffer[NDAr

[GitHub] nswamy commented on a change in pull request #9678: [MXNET-50] Scala Inference APIs

2018-03-14 Thread GitBox
nswamy commented on a change in pull request #9678: [MXNET-50] Scala Inference 
APIs
URL: https://github.com/apache/incubator-mxnet/pull/9678#discussion_r174607014
 
 

 ##
 File path: 
scala-package/infer/src/main/scala/ml/dmlc/mxnet/infer/MXNetHandler.scala
 ##
 @@ -0,0 +1,82 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package ml.dmlc.mxnet.infer
+
+import java.util.concurrent._
+
+trait MXNetHandler {
+
+  def execute[T](f: => T): T
+
+  val executor: ExecutorService
+
+}
+
+object MXNetHandlerType extends Enumeration {
+
+  type MXNetHandlerType = Value
+  val SingleThreadHandler = Value("MXNetSingleThreadHandler")
+  val OneThreadPerModelHandler = Value("MXNetOneThreadPerModelHandler")
 
 Review comment:
   To be able to support it in the future, or in case I find that it is OK to run one thread per model.
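
   As a hypothetical illustration of how the two enum values could drive handler configuration later (the factory shape below is an assumption, not the PR's code):

     object HandlerChoiceSketch {
       object MXNetHandlerType extends Enumeration {
         type MXNetHandlerType = Value
         val SingleThreadHandler = Value("MXNetSingleThreadHandler")
         val OneThreadPerModelHandler = Value("MXNetOneThreadPerModelHandler")
       }
       import MXNetHandlerType._

       // Today everything maps to the shared single thread; the second value is
       // kept so per-model pools can be enabled without touching call sites.
       def poolSizeFor(handlerType: MXNetHandlerType, threadsPerModel: Int): Int =
         handlerType match {
           case SingleThreadHandler => 1
           case OneThreadPerModelHandler => threadsPerModel
         }
     }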




[GitHub] nswamy commented on a change in pull request #9678: [MXNET-50] Scala Inference APIs

2018-03-14 Thread GitBox
nswamy commented on a change in pull request #9678: [MXNET-50] Scala Inference 
APIs
URL: https://github.com/apache/incubator-mxnet/pull/9678#discussion_r174606153
 
 

 ##
 File path: 
scala-package/infer/src/main/scala/ml/dmlc/mxnet/infer/PredictBase.scala
 ##
 @@ -0,0 +1,200 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package ml.dmlc.mxnet.infer
+
+import ml.dmlc.mxnet.io.NDArrayIter
+import ml.dmlc.mxnet.{DataDesc, NDArray, Shape}
+import ml.dmlc.mxnet.module.Module
+
+import scala.collection.mutable.ListBuffer
+import org.slf4j.LoggerFactory
+
+/**
+  * Base Trait for MXNNet Predictor classes.
+  */
+trait PredictBase {
+
+  /**
+* This method will take input as IndexedSeq one dimensional arrays and 
creates
+* NDArray needed for inference. The array will be reshaped based on the 
input descriptors.
+* @param input: A IndexedSequence of Java one-dimensional array, An 
IndexedSequence is
+* is needed when the model has more than one input/output
+* @return IndexedSequence array of outputs.
+*/
+  def predict(input: IndexedSeq[Array[Float]]): IndexedSeq[Array[Float]]
+
+  /**
+* Predict using NDArray as input. This method is useful when the input is 
a batch of data
+* or when multiple operations on the input/output have to performed.
+* Note: User is responsible for managing allocation/deallocation of 
NDArrays.
+* @param input: IndexedSequence NDArrays.
+* @return output of Predictions as NDArrays.
+*/
+  def predictWithNDArray(input: IndexedSeq[NDArray]): IndexedSeq[NDArray]
+
+}
+
+/**
+  * Implementation of predict routines.
+  *
+  * @param modelPathPrefix PathPrefix from where to load the model.
+  *Example: file://model-dir/resnet-152(containing 
resnet-152-symbol.json,
+  *resnet-152-.params and optionally synset.txt).
+  *Supports model loading from various sources like 
local disk,
+  *hdfs, https and s3. file://, hdfs://, https://, 
s3://
+  * @param inputDescriptors Descriptors defining the input node names, shape,
+  * layout and Type parameters
+  * @param outputDescriptors Descriptors defining the output node names, shape,
+  *  layout and Type parameters
+  */
+class Predictor(modelPathPrefix: String,
+ protected val inputDescriptors: IndexedSeq[DataDesc],
+ protected var outputDescriptors:
+ Option[IndexedSeq[DataDesc]] = None) extends PredictBase {
+
+  private val logger = LoggerFactory.getLogger(classOf[Predictor])
+
+  protected var batchIndex = inputDescriptors(0).layout.indexOf('N')
+  protected var batchSize = if (batchIndex != -1 ) 
inputDescriptors(0).shape(batchIndex) else 1
+
+  protected var iDescriptors = inputDescriptors
+
+  inputDescriptors.foreach((f: DataDesc) => require(f.layout.indexOf('N') == 
batchIndex,
+"batch size should be in the same index for all inputs"))
+
+
+  if (batchIndex != -1) {
+inputDescriptors.foreach((f: DataDesc) => require(f.shape(batchIndex) == 
batchSize,
+  "batch size should be same for all inputs"))
+  } else {
+// TODO: this is assuming that the input needs a batch
+iDescriptors = inputDescriptors.map((f : DataDesc) => new DataDesc(f.name,
+Shape(1 +: f.shape.toVector), f.dtype, 'N' +: f.layout) )
+batchIndex = 1
+  }
+
+  protected val mxNetHandler = MXNetHandler()
+
+  protected val mod = loadModule()
+
+  /**
+* This method will take input as IndexedSeq one dimensional arrays and 
creates
+* NDArray needed for inference. The array will be reshaped based on the 
input descriptors.
+*
+* @param input : A IndexedSequence of Java one-dimensional array, An 
IndexedSequence is
+*  is needed when the model has more than one input/output
+* @return IndexedSequence array of outputs.
+*/
+  override def predict(input: IndexedSeq[Array[Float]]): 
IndexedSeq[Array[Float]] = {
+
+require(input.length == inputDescriptors.length, "number of inputs 
provided: %d" +
+  " do not match number of inputs in inputDescriptors: 
%d".format(input.length,
+inp

[GitHub] nswamy commented on a change in pull request #9678: [MXNET-50] Scala Inference APIs

2018-03-14 Thread GitBox
nswamy commented on a change in pull request #9678: [MXNET-50] Scala Inference 
APIs
URL: https://github.com/apache/incubator-mxnet/pull/9678#discussion_r174583657
 
 

 ##
 File path: 
scala-package/infer/src/main/scala/ml/dmlc/mxnet/infer/PredictBase.scala
 ##
 @@ -0,0 +1,200 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package ml.dmlc.mxnet.infer
+
+import ml.dmlc.mxnet.io.NDArrayIter
+import ml.dmlc.mxnet.{DataDesc, NDArray, Shape}
+import ml.dmlc.mxnet.module.Module
+
+import scala.collection.mutable.ListBuffer
+import org.slf4j.LoggerFactory
+
+/**
+  * Base Trait for MXNNet Predictor classes.
+  */
 
 Review comment:
   I wonder why scala-style is not already doing that.




[GitHub] nswamy commented on a change in pull request #9678: [MXNET-50] Scala Inference APIs

2018-03-14 Thread GitBox
nswamy commented on a change in pull request #9678: [MXNET-50] Scala Inference 
APIs
URL: https://github.com/apache/incubator-mxnet/pull/9678#discussion_r174583365
 
 

 ##
 File path: 
scala-package/infer/src/main/scala/ml/dmlc/mxnet/infer/MXNetHandler.scala
 ##
 @@ -0,0 +1,82 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package ml.dmlc.mxnet.infer
+
+import java.util.concurrent._
+
+trait MXNetHandler {
+
+  def execute[T](f: => T): T
+
+  val executor: ExecutorService
+
+}
+
+object MXNetHandlerType extends Enumeration {
+
+  type MXNetHandlerType = Value
+  val SingleThreadHandler = Value("MXNetSingleThreadHandler")
+  val OneThreadPerModelHandler = Value("MXNetOneThreadPerModelHandler")
+}
+
+class MXNetOneThreadPerModelHandler extends MXNetHandler {
+
+  private val threadFactory = new ThreadFactory {
 
 Review comment:
   I'll learn about it and do that in the next iteration.




[GitHub] nswamy commented on a change in pull request #9678: [MXNET-50] Scala Inference APIs

2018-03-14 Thread GitBox
nswamy commented on a change in pull request #9678: [MXNET-50] Scala Inference 
APIs
URL: https://github.com/apache/incubator-mxnet/pull/9678#discussion_r174583067
 
 

 ##
 File path: scala-package/infer/pom.xml
 ##
 @@ -0,0 +1,124 @@
+
+http://maven.apache.org/POM/4.0.0";
+ xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance";
+ xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 
http://maven.apache.org/xsd/maven-4.0.0.xsd";>
+
+mxnet-parent_2.11
+ml.dmlc.mxnet
+1.0.1-SNAPSHOT
+
+4.0.0
+
+mxnet-infer
+MXNet Scala Package - Inference
+
+
+
+release
+
+
+
+org.apache.maven.plugins
+maven-source-plugin
+
+true
+
+
+
+org.apache.maven.plugins
+maven-javadoc-plugin
+
+true
+
+
+
+org.apache.maven.plugins
+maven-gpg-plugin
+
+true
+
+
+
+org.sonatype.plugins
+nexus-staging-maven-plugin
+
+
true
+
+
+
+
+
+
 
 Review comment:
   I wasn't able to inherit it and make the tests run; the tests need the jar to be built.




[GitHub] nswamy commented on a change in pull request #9678: [MXNET-50] Scala Inference APIs

2018-03-14 Thread GitBox
nswamy commented on a change in pull request #9678: [MXNET-50] Scala Inference 
APIs
URL: https://github.com/apache/incubator-mxnet/pull/9678#discussion_r174582843
 
 

 ##
 File path: scala-package/infer/src/main/scala/ml/dmlc/mxnet/infer/package.scala
 ##
 @@ -0,0 +1,24 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package ml.dmlc.mxnet
+
+import ml.dmlc.mxnet.infer.MXNetHandlerType.MXNetHandlerType
+
+package object infer {
+  private[mxnet] val handlerType: MXNetHandlerType = MXNetHandlerType.SingleThreadHandler
 
 Review comment:
   I will expose this as a property if I find that MXNet can be called from multiple threads. As said in other places, I am going with the assumption that MXNet needs to be called from the same thread throughout the lifetime of the process, otherwise it seg-faults.
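
   A sketch of what exposing it as a property might look like (the system property name mxnet.handlerType is purely hypothetical):

     object HandlerTypeProperty {
       object MXNetHandlerType extends Enumeration {
         val SingleThreadHandler = Value("MXNetSingleThreadHandler")
         val OneThreadPerModelHandler = Value("MXNetOneThreadPerModelHandler")
       }

       // Default stays the single shared thread, matching the assumption that
       // MXNet must be called from one thread for the life of the process.
       val handlerType: MXNetHandlerType.Value =
         sys.props.get("mxnet.handlerType") match {
           case Some("MXNetOneThreadPerModelHandler") =>
             MXNetHandlerType.OneThreadPerModelHandler
           case _ =>
             MXNetHandlerType.SingleThreadHandler
         }
     }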




[GitHub] nswamy commented on a change in pull request #9678: [MXNET-50] Scala Inference APIs

2018-03-14 Thread GitBox
nswamy commented on a change in pull request #9678: [MXNET-50] Scala Inference 
APIs
URL: https://github.com/apache/incubator-mxnet/pull/9678#discussion_r174582317
 
 

 ##
 File path: 
scala-package/infer/src/main/scala/ml/dmlc/mxnet/infer/MXNetHandler.scala
 ##
 @@ -0,0 +1,82 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package ml.dmlc.mxnet.infer
+
+import java.util.concurrent._
+
+trait MXNetHandler {
+
+  def execute[T](f: => T): T
+
+  val executor: ExecutorService
 
 Review comment:
   I want to be able to expose the executor in case I find that all NDArray creations have to go through the same thread.




[GitHub] nswamy commented on a change in pull request #9678: [MXNET-50] Scala Inference APIs

2018-03-14 Thread GitBox
nswamy commented on a change in pull request #9678: [MXNET-50] Scala Inference 
APIs
URL: https://github.com/apache/incubator-mxnet/pull/9678#discussion_r174581815
 
 

 ##
 File path: 
scala-package/infer/src/main/scala/ml/dmlc/mxnet/infer/MXNetHandler.scala
 ##
 @@ -0,0 +1,82 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package ml.dmlc.mxnet.infer
+
+import java.util.concurrent._
+
+trait MXNetHandler {
+
+  def execute[T](f: => T): T
+
+  val executor: ExecutorService
+
+}
+
+object MXNetHandlerType extends Enumeration {
+
+  type MXNetHandlerType = Value
+  val SingleThreadHandler = Value("MXNetSingleThreadHandler")
+  val OneThreadPerModelHandler = Value("MXNetOneThreadPerModelHandler")
+}
+
+class MXNetOneThreadPerModelHandler extends MXNetHandler {
+
+  private val threadFactory = new ThreadFactory {
+
+    override def newThread(r: Runnable): Thread = new Thread(r) {
+      setName(classOf[MXNetOneThreadPerModelHandler].getCanonicalName)
+    }
+  }
+
+  override val executor: ExecutorService = Executors.newFixedThreadPool(10, threadFactory)
 
 Review comment:
   I want this to be configurable; the value 10 was for testing. I have made it configurable now.
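
   A condensed sketch of taking the pool size from the constructor instead of the hard-coded 10 (names follow the diff loosely; the final signature in the PR may differ):

     import java.util.concurrent.{ExecutorService, Executors, ThreadFactory}

     class ConfigurablePoolSketch(numThreads: Int = 1) {
       require(numThreads > 0, s"numThreads must be positive, found: $numThreads")

       private val threadFactory = new ThreadFactory {
         override def newThread(r: Runnable): Thread = new Thread(r) {
           setName("MXNetHandler-numThreads: %d".format(numThreads))
         }
       }

       // The pool size now comes from the caller rather than a fixed 10.
       val executor: ExecutorService =
         Executors.newFixedThreadPool(numThreads, threadFactory)
     }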




[GitHub] nswamy commented on a change in pull request #9678: [MXNET-50] Scala Inference APIs

2018-03-14 Thread GitBox
nswamy commented on a change in pull request #9678: [MXNET-50] Scala Inference 
APIs
URL: https://github.com/apache/incubator-mxnet/pull/9678#discussion_r174581398
 
 

 ##
 File path: 
scala-package/infer/src/main/scala/ml/dmlc/mxnet/infer/MXNetHandler.scala
 ##
 @@ -0,0 +1,82 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package ml.dmlc.mxnet.infer
+
+import java.util.concurrent._
+
+trait MXNetHandler {
+
+  def execute[T](f: => T): T
+
+  val executor: ExecutorService
+
+}
+
+object MXNetHandlerType extends Enumeration {
+
+  type MXNetHandlerType = Value
+  val SingleThreadHandler = Value("MXNetSingleThreadHandler")
+  val OneThreadPerModelHandler = Value("MXNetOneThreadPerModelHandler")
+}
+
+class MXNetOneThreadPerModelHandler extends MXNetHandler {
+
+  private val threadFactory = new ThreadFactory {
+
+    override def newThread(r: Runnable): Thread = new Thread(r) {
+      setName(classOf[MXNetOneThreadPerModelHandler].getCanonicalName)
+    }
+  }
+
+  override val executor: ExecutorService = Executors.newFixedThreadPool(10, threadFactory)
+
+  override def execute[T](f: => T): T = {
+    val task = new Callable[T] {
+      override def call(): T = {
+        // scalastyle:off println
+        println("threadId: %s".format(Thread.currentThread().getId()))
+        // scalastyle:on println
+        f
+      }
+    }
+    val result = executor.submit(task)
+    try {
+      result.get()
+    }
+    catch {
+      case e: ExecutionException => throw e.getCause()
+    }
+  }
+
+}
+
+object MXNetSingleThreadHandler extends MXNetOneThreadPerModelHandler {
 
 Review comment:
   I want to create a singleton object with only one thread, which will be used by default.
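
   A sketch of the singleton idea: a Scala object extending the pool handler with exactly one thread (the stand-in class below is simplified so the sketch is self-contained):

     import java.util.concurrent.{ExecutorService, Executors}

     // Simplified stand-in for the thread-pool handler class.
     class OneThreadPerModelHandlerSketch(numThreads: Int = 1) {
       val executor: ExecutorService = Executors.newFixedThreadPool(numThreads)
     }

     // One process-wide instance backed by a single thread, used by default.
     object SingleThreadHandlerSketch extends OneThreadPerModelHandlerSketch(1)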




[GitHub] nswamy commented on a change in pull request #9678: [MXNET-50] Scala Inference APIs

2018-03-14 Thread GitBox
nswamy commented on a change in pull request #9678: [MXNET-50] Scala Inference 
APIs
URL: https://github.com/apache/incubator-mxnet/pull/9678#discussion_r174580232
 
 

 ##
 File path: 
scala-package/infer/src/main/scala/ml/dmlc/mxnet/infer/MXNetHandler.scala
 ##
 @@ -0,0 +1,82 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package ml.dmlc.mxnet.infer
+
+import java.util.concurrent._
+
+trait MXNetHandler {
+
+  def execute[T](f: => T): T
+
+  val executor: ExecutorService
+
+}
+
+object MXNetHandlerType extends Enumeration {
+
+  type MXNetHandlerType = Value
+  val SingleThreadHandler = Value("MXNetSingleThreadHandler")
+  val OneThreadPerModelHandler = Value("MXNetOneThreadPerModelHandler")
+}
+
+class MXNetOneThreadPerModelHandler extends MXNetHandler {
+
+  private val threadFactory = new ThreadFactory {
+
+    override def newThread(r: Runnable): Thread = new Thread(r) {
+      setName(classOf[MXNetOneThreadPerModelHandler].getCanonicalName)
+    }
+  }
+
+  override val executor: ExecutorService = Executors.newFixedThreadPool(10, threadFactory)
+
+  override def execute[T](f: => T): T = {
+    val task = new Callable[T] {
+      override def call(): T = {
+        // scalastyle:off println
+        println("threadId: %s".format(Thread.currentThread().getId()))
 
 Review comment:
   this was for testing; I removed the print.
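   If the executing thread is still worth tracing, one hedged alternative to the println is a debug-level slf4j log (slf4j is already used elsewhere in this package). A small self-contained sketch; the helper name and object below are assumptions for illustration only:

import java.util.concurrent._
import org.slf4j.LoggerFactory

// Sketch: same submit/get pattern as in the diff, with the temporary
// println replaced by a debug-level log statement.
object ThreadLoggingSketch {
  private val logger = LoggerFactory.getLogger(getClass)

  def executeOn[T](executor: ExecutorService)(f: => T): T = {
    val task = new Callable[T] {
      override def call(): T = {
        // Record which pool thread runs the block instead of printing it.
        logger.debug("threadId: %s".format(Thread.currentThread().getId))
        f
      }
    }
    try {
      executor.submit(task).get()
    } catch {
      case e: ExecutionException => throw e.getCause
    }
  }
}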


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services


[GitHub] nswamy commented on a change in pull request #9678: [MXNET-50] Scala Inference APIs

2018-03-13 Thread GitBox
nswamy commented on a change in pull request #9678: [MXNET-50] Scala Inference 
APIs
URL: https://github.com/apache/incubator-mxnet/pull/9678#discussion_r174285487
 
 

 ##
 File path: 
scala-package/infer/src/main/scala/ml/dmlc/mxnet/infer/Predictor.scala
 ##
 @@ -0,0 +1,188 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package ml.dmlc.mxnet.infer
+
+import ml.dmlc.mxnet.io.NDArrayIter
+import ml.dmlc.mxnet.{DataDesc, NDArray, Shape}
+import ml.dmlc.mxnet.module.Module
+
+import scala.collection.mutable.ListBuffer
+import org.slf4j.LoggerFactory
+
+/**
+ * Base Trait for MXNet Predictor classes.
+ */
+private[mxnet] trait PredictBase {
+
+  /**
+   * Takes input as an IndexedSeq of one-dimensional arrays and creates the
+   * NDArrays needed for inference. The arrays are reshaped based on the input descriptors.
+   * @param input: An IndexedSeq of Scala one-dimensional arrays; an IndexedSeq
+   * is needed when the model has more than one input/output.
+   * @return IndexedSeq of output arrays.
+   */
+  def predict(input: IndexedSeq[Array[Float]]): IndexedSeq[Array[Float]]
+
+  /**
+   * Predict using NDArray as input. This method is useful when the input is a batch of data
+   * or when multiple operations on the input/output have to be performed.
+   * Note: User is responsible for managing allocation/deallocation of NDArrays.
+   * @param input: IndexedSeq of NDArrays.
+   * @return Output of predictions as NDArrays.
+   */
+  def predictWithNDArray(input: IndexedSeq[NDArray]): IndexedSeq[NDArray]
+
+}
+
+/**
+ * Implementation of predict routines.
+ *
+ * @param modelPathPrefix PathPrefix from where to load the model.
+ * Example: file://model-dir/resnet-152 (containing resnet-152-symbol.json, ...)
+ * @param inputDescriptors Descriptors defining the input node names, shape,
+ * layout and type parameters.
+ * Note: If the input descriptors are missing batchSize ('N' in layout),
+ * a batchSize of 1 is assumed for the model.
+ * 
+ */
+class Predictor(modelPathPrefix: String, protected val inputDescriptors: 
IndexedSeq[DataDesc])
+  extends PredictBase {
+
+  private val logger = LoggerFactory.getLogger(classOf[Predictor])
+
+  protected var batchIndex = inputDescriptors(0).layout.indexOf('N')
+  protected var batchSize = if (batchIndex != -1) inputDescriptors(0).shape(batchIndex) else 1
+
+  protected var iDescriptors = inputDescriptors
+
+  inputDescriptors.foreach((f: DataDesc) => require(f.layout.indexOf('N') == batchIndex,
+    "batch size should be in the same index for all inputs"))
+
+  if (batchIndex != -1) {
+    inputDescriptors.foreach((f: DataDesc) => require(f.shape(batchIndex) == batchSize,
+      "batch size should be same for all inputs"))
+  } else {
+    // Note: this is assuming that the input needs a batch
+    logger.warn("InputDescriptor does not have batchSize, using 1 as the default batchSize")
+    iDescriptors = inputDescriptors.map((f: DataDesc) => new DataDesc(f.name,
+      Shape(1 +: f.shape.toVector), f.dtype, 'N' +: f.layout))
+    batchIndex = 1
+  }
+
+  protected val mxNetHandler = MXNetHandler()
+
+  protected val mod = loadModule()
+
+  /**
+   * Takes input as an IndexedSeq of one-dimensional arrays and creates the
+   * NDArrays needed for inference. The arrays are reshaped based on the input descriptors.
+   *
+   * @param input: An IndexedSeq of Scala one-dimensional arrays; an IndexedSeq
+   * is needed when the model has more than one input/output.
+   * @return IndexedSeq of output arrays.
+   */
+  override def predict(input: IndexedSeq[Array[Float]])
+  : IndexedSeq[Array[Float]] = {
+
+    require(input.length == inputDescriptors.length, "number of inputs provided: %d" +
+      " does not match number of inputs in inputDescriptors: %d".format(input.length,
+        inputDescriptors.length))
+
+    for((i, d) <- input.zip(inputDescriptors)) {
+      require(i.length == d.shape.product / batchSize, "number of elements:" +
+        " %d in the input does not match the shape:%s".format(i.length, d.shape.toString()))
+    }
+var inputND: ListBuffer[NDAr
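(The quoted diff is cut off at this point in the archive.) For readers following along, a minimal usage sketch of the Predictor API as it appears in this diff could look like the following; the model path, input name, shape, and layout are illustrative assumptions, not values from the PR:

import ml.dmlc.mxnet.{DataDesc, DType, Shape}
import ml.dmlc.mxnet.infer.Predictor

// Sketch only: assumes a model checkpoint exists at the hypothetical prefix below.
object PredictorUsageSketch {
  def main(args: Array[String]): Unit = {
    val modelPathPrefix = "file://model-dir/resnet-152"

    // One input named "data" with batch size 1 and NCHW layout.
    val inputDescriptors = IndexedSeq(
      new DataDesc("data", Shape(1, 3, 224, 224), DType.Float32, "NCHW"))

    val predictor = new Predictor(modelPathPrefix, inputDescriptors)

    // A single flattened input of 3 * 224 * 224 floats.
    val input = IndexedSeq(Array.fill[Float](3 * 224 * 224)(0.5f))
    val output = predictor.predict(input)
    println("number of outputs: " + output.length)
  }
}

Per the warning in the diff above, if the layout did not contain 'N' the Predictor would prepend a batch dimension and assume a batch size of 1.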

[GitHub] nswamy commented on a change in pull request #9678: [MXNET-50] Scala Inference APIs

2018-03-13 Thread GitBox
nswamy commented on a change in pull request #9678: [MXNET-50] Scala Inference 
APIs
URL: https://github.com/apache/incubator-mxnet/pull/9678#discussion_r174247041
 
 

 ##
 File path: 
scala-package/infer/src/main/scala/ml/dmlc/mxnet/infer/Classifier.scala
 ##
 @@ -0,0 +1,167 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package ml.dmlc.mxnet.infer
+
+import ml.dmlc.mxnet.{DataDesc, NDArray}
+import java.io.File
+
+import org.slf4j.LoggerFactory
+
+import scala.io
+import scala.collection.mutable.ListBuffer
+
+trait ClassifierBase {
+
+  /**
+    * Takes an array of floats and returns the corresponding (label, score) tuples.
+    * @param input: IndexedSeq of one-dimensional arrays of floats.
+    * @param topK: (Optional) How many top-k elements to return (sorting is based on
+    * the last axis); if not passed, the output is returned unsorted.
+    * @return IndexedSeq of (Label, Score) tuples.
+    */
+  def classify(input: IndexedSeq[Array[Float]],
+               topK: Option[Int] = None): List[(String, Float)]
+
+  /**
+    * Takes a sequence of NDArrays and returns (label, score) tuples.
+    * @param input: IndexedSeq of NDArrays.
+    * @param topK: (Optional) How many top-k elements to return (sorting is based on
+    * the last axis); if not passed, the output is returned unsorted.
+    * @return IndexedSeq of lists of (Label, Score) tuples.
+    */
+  def classifyWithNDArray(input: IndexedSeq[NDArray],
+                          topK: Option[Int] = None): IndexedSeq[List[(String, Float)]]
+}
+
+/**
+  * A class for classifier tasks
+  * @param modelPathPrefix PathPrefix from where to load the symbol, 
parameters and synset.txt
+  * Example: file://model-dir/resnet-152 (containing resnet-152-symbol.json)
+  * and file://model-dir/synset.txt
+  * @param inputDescriptors Descriptors defining the input node names, shape,
+  * layout and type parameters
+  */
+class Classifier(modelPathPrefix: String, protected val inputDescriptors: 
IndexedSeq[DataDesc])
+  extends ClassifierBase {
+
+  private val logger = LoggerFactory.getLogger(classOf[Classifier])
+
+  val predictor: PredictBase = getPredictor(modelPathPrefix, inputDescriptors)
+
+  val synsetFilePath = getSynsetFilePath(modelPathPrefix)
+
+  val synset = readSynsetFile(synsetFilePath)
 
 Review comment:
   Classifier just takes the predictor and uses the synset to map the labels; this is the only difference between the Predictor and the Classifier.
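   A rough sketch of that label mapping, with illustrative names (not the PR's actual helpers) and an assumed sort-then-take for the topK case:

// Sketch: map a flat score array to (label, score) tuples using the synset,
// optionally keeping only the top-k entries sorted by score.
object SynsetMappingSketch {
  def mapToLabels(scores: Array[Float],
                  synset: IndexedSeq[String],
                  topK: Option[Int] = None): List[(String, Float)] = {
    require(scores.length == synset.length,
      "output length %d does not match synset size %d".format(scores.length, synset.length))

    // Pair each synset label with its score.
    val labeled = synset.zip(scores).toList
    topK match {
      // Keep the k highest-scoring labels, sorted in descending order.
      case Some(k) => labeled.sortWith(_._2 > _._2).take(k)
      case None    => labeled
    }
  }
}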


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services