[GitHub] spark pull request #13969: [SPARK-16284][SQL] Implement reflect SQL function

2016-07-25 Thread petermaxlee
Github user petermaxlee closed the pull request at:

https://github.com/apache/spark/pull/13969


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request #13969: [SPARK-16284][SQL] Implement reflect SQL function

2016-07-10 Thread cloud-fan
Github user cloud-fan commented on a diff in the pull request:

https://github.com/apache/spark/pull/13969#discussion_r70204050
  
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/Reflect.scala ---
@@ -0,0 +1,174 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.expressions
+
+import java.lang.reflect.Method
+
+import scala.util.Try
+
+import org.apache.spark.sql.catalyst.InternalRow
+import org.apache.spark.sql.catalyst.analysis.TypeCheckResult
+import org.apache.spark.sql.catalyst.analysis.TypeCheckResult.{TypeCheckFailure, TypeCheckSuccess}
+import org.apache.spark.sql.catalyst.expressions.codegen.CodegenFallback
+import org.apache.spark.sql.types._
+import org.apache.spark.unsafe.types.UTF8String
+import org.apache.spark.util.Utils
+
+/**
+ * An expression that invokes a method on a class via reflection.
+ *
+ * For now, only types defined in `Reflect.typeMapping` are supported (basically primitives
+ * and string) as input types, and the output is turned automatically to a string.
+ *
+ * @param children the first element should be a literal string for the class name,
+ *                 and the second element should be a literal string for the method name,
+ *                 and the remaining are input arguments to the Java method.
+ */
+// scalastyle:off line.size.limit
+@ExpressionDescription(
+  usage = "_FUNC_(class,method[,arg1[,arg2..]]) calls method with reflection",
+  extended = "> SELECT _FUNC_('java.util.UUID', 'randomUUID');\n c33fb387-8500-4bfa-81d2-6e0e3e930df2")
+// scalastyle:on line.size.limit
+case class Reflect(children: Seq[Expression])
--- End diff --

`CallMethodUsingReflect`?
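For reference, the core JDK mechanism the quoted expression builds on (`Class.forName` plus `Method.invoke`) can be sketched in a few lines of plain Scala. This is a minimal standalone illustration, not code from the patch; `java.lang.Integer`/`parseInt` are just stand-in examples of a class and static method.

```scala
import java.lang.reflect.Method

// Resolve the class by name, look up a method by name and parameter types,
// invoke it, and stringify the result -- the same sequence the quoted
// eval() performs with method.invoke(obj, buffer: _*).
val clazz: Class[_] = Class.forName("java.lang.Integer")
val method: Method = clazz.getMethod("parseInt", classOf[String])
// For a static method, the receiver passed to invoke() is null.
val result: AnyRef = method.invoke(null, "42")
val asString: String = String.valueOf(result)  // "42"
```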





[GitHub] spark pull request #13969: [SPARK-16284][SQL] Implement reflect SQL function

2016-07-10 Thread cloud-fan
Github user cloud-fan commented on a diff in the pull request:

https://github.com/apache/spark/pull/13969#discussion_r70203510
  
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/Reflect.scala ---
+  /** The reflection method. */
+  @transient lazy val method: Method = {
+    val methodName = children(1).eval(null).asInstanceOf[UTF8String].toString
+    Reflect.findMethod(className, methodName, argExprs.map(_.dataType)).orNull

[GitHub] spark pull request #13969: [SPARK-16284][SQL] Implement reflect SQL function

2016-07-10 Thread cloud-fan
Github user cloud-fan commented on a diff in the pull request:

https://github.com/apache/spark/pull/13969#discussion_r70203430
  
--- Diff: sql/core/src/test/scala/org/apache/spark/sql/MiscFunctionsSuite.scala ---
@@ -0,0 +1,35 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql
+
+import org.apache.spark.sql.test.SharedSQLContext
+
+class MiscFunctionsSuite extends QueryTest with SharedSQLContext {
+  import testImplicits._
+
+  test("reflect and java_method") {
+    val df = Seq((1, "one")).toDF("a", "b")
+    checkAnswer(
+      df.selectExpr("reflect('org.apache.spark.sql.ReflectClass', 'method1', a, b)"),
--- End diff --

oh i see, `reflect('org.apache.spark.sql.ReflectClass', 'method1', a, b)` is not equal to calling `ReflectClass.method1` on the companion object; it is calling the static method defined in the `ReflectClass` class.





[GitHub] spark pull request #13969: [SPARK-16284][SQL] Implement reflect SQL function

2016-07-10 Thread petermaxlee
Github user petermaxlee commented on a diff in the pull request:

https://github.com/apache/spark/pull/13969#discussion_r70202991
  
--- Diff: sql/core/src/test/scala/org/apache/spark/sql/MiscFunctionsSuite.scala ---
+  test("reflect and java_method") {
+    val df = Seq((1, "one")).toDF("a", "b")
+    checkAnswer(
+      df.selectExpr("reflect('org.apache.spark.sql.ReflectClass', 'method1', a, b)"),
--- End diff --

You should decompile the right class (don't decompile the one with a dollar 
sign). Static methods are generated too.
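A minimal sketch of the point under discussion (reusing the test's `ReflectClass`/`method1` names; the forwarder behavior described in the comments is standard scalac output for top-level objects):

```scala
object ReflectClass {
  // This compiles to an instance method on the synthetic singleton class
  // ReflectClass$ (reachable as ReflectClass$.MODULE$). For a top-level
  // object with no companion class, scalac ALSO emits a static forwarder
  //   public static String method1(int, String)
  // in ReflectClass.class, which is why decompiling ReflectClass.class
  // (not ReflectClass$.class, the one with the dollar sign) shows a static
  // method that Java reflection can resolve and invoke.
  def method1(v1: Int, v2: String): String = v1 + "-" + v2
}

// An ordinary Scala call goes through the singleton instance:
val out = ReflectClass.method1(1, "one")  // "1-one"
```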






[GitHub] spark pull request #13969: [SPARK-16284][SQL] Implement reflect SQL function

2016-07-10 Thread cloud-fan
Github user cloud-fan commented on a diff in the pull request:

https://github.com/apache/spark/pull/13969#discussion_r70202906
  
--- Diff: sql/core/src/test/scala/org/apache/spark/sql/MiscFunctionsSuite.scala ---
+  test("reflect and java_method") {
+    val df = Seq((1, "one")).toDF("a", "b")
+    checkAnswer(
+      df.selectExpr("reflect('org.apache.spark.sql.ReflectClass', 'method1', a, b)"),
--- End diff --

no, a method defined in a companion object is not a static method, but a normal method defined in a singleton class. You can decompile the class file to check it.





[GitHub] spark pull request #13969: [SPARK-16284][SQL] Implement reflect SQL function

2016-07-10 Thread petermaxlee
Github user petermaxlee commented on a diff in the pull request:

https://github.com/apache/spark/pull/13969#discussion_r70202613
  
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/Reflect.scala ---
+  /** The reflection method. */
+  @transient lazy val method: Method = {
+    val methodName = children(1).eval(null).asInstanceOf[UTF8String].toString
+    Reflect.findMethod(className, methodName, argExprs.map(_.dataType)).orNull

[GitHub] spark pull request #13969: [SPARK-16284][SQL] Implement reflect SQL function

2016-07-10 Thread petermaxlee
Github user petermaxlee commented on a diff in the pull request:

https://github.com/apache/spark/pull/13969#discussion_r70201874
  
--- Diff: sql/core/src/test/scala/org/apache/spark/sql/MiscFunctionsSuite.scala ---
+  test("reflect and java_method") {
+    val df = Seq((1, "one")).toDF("a", "b")
+    checkAnswer(
+      df.selectExpr("reflect('org.apache.spark.sql.ReflectClass', 'method1', a, b)"),
--- End diff --

I don't get what you mean. Scala does have static methods -- methods that are defined in a companion object are static.






[GitHub] spark pull request #13969: [SPARK-16284][SQL] Implement reflect SQL function

2016-07-10 Thread cloud-fan
Github user cloud-fan commented on a diff in the pull request:

https://github.com/apache/spark/pull/13969#discussion_r70201875
  
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/Reflect.scala ---
+  /** The reflection method. */
+  @transient lazy val method: Method = {
+    val methodName = children(1).eval(null).asInstanceOf[UTF8String].toString
+    Reflect.findMethod(className, methodName, argExprs.map(_.dataType)).orNull

[GitHub] spark pull request #13969: [SPARK-16284][SQL] Implement reflect SQL function

2016-07-10 Thread petermaxlee
Github user petermaxlee commented on a diff in the pull request:

https://github.com/apache/spark/pull/13969#discussion_r70201841
  
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/Reflect.scala ---
+case class Reflect(children: Seq[Expression])
--- End diff --

So what's a good name? I am not attached to Reflect, but I think Reflect 
should be in the name, if the function is called reflect.






[GitHub] spark pull request #13969: [SPARK-16284][SQL] Implement reflect SQL function

2016-07-10 Thread petermaxlee
Github user petermaxlee commented on a diff in the pull request:

https://github.com/apache/spark/pull/13969#discussion_r70201786
  
--- Diff: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/Reflect.scala
 ---
@@ -0,0 +1,174 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.expressions
+
+import java.lang.reflect.Method
+
+import scala.util.Try
+
+import org.apache.spark.sql.catalyst.InternalRow
+import org.apache.spark.sql.catalyst.analysis.TypeCheckResult
+import 
org.apache.spark.sql.catalyst.analysis.TypeCheckResult.{TypeCheckFailure, 
TypeCheckSuccess}
+import org.apache.spark.sql.catalyst.expressions.codegen.CodegenFallback
+import org.apache.spark.sql.types._
+import org.apache.spark.unsafe.types.UTF8String
+import org.apache.spark.util.Utils
+
+/**
+ * An expression that invokes a method on a class via reflection.
+ *
+ * For now, only types defined in `Reflect.typeMapping` are supported 
(basically primitives
+ * and string) as input types, and the output is turned automatically to a 
string.
+ *
+ * @param children the first element should be a literal string for the 
class name,
+ * and the second element should be a literal string for 
the method name,
+ * and the remaining are input arguments to the Java 
method.
+ */
+// scalastyle:off line.size.limit
+@ExpressionDescription(
+  usage = "_FUNC_(class,method[,arg1[,arg2..]]) calls method with 
reflection",
+  extended = "> SELECT _FUNC_('java.util.UUID', 'randomUUID');\n 
c33fb387-8500-4bfa-81d2-6e0e3e930df2")
+// scalastyle:on line.size.limit
+case class Reflect(children: Seq[Expression])
+  extends Expression with CodegenFallback {
+
+  override def prettyName: String = "reflect"
+
+  override def checkInputDataTypes(): TypeCheckResult = {
+if (children.size < 2) {
+  TypeCheckFailure("requires at least two arguments")
+} else if (!children.take(2).forall(e => e.dataType == StringType && 
e.foldable)) {
+  // The first two arguments must be string type.
+  TypeCheckFailure("first two arguments should be string literals")
+} else if (!classExists) {
+  TypeCheckFailure(s"class $className not found")
+} else if (method == null) {
+  TypeCheckFailure(s"cannot find a method that matches the argument types in $className")
+} else {
+  TypeCheckSuccess
+}
+  }
+
+  override def deterministic: Boolean = false
+  override def nullable: Boolean = true
+  override val dataType: DataType = StringType
+
+  override def eval(input: InternalRow): Any = {
+var i = 0
+while (i < argExprs.length) {
+  buffer(i) = argExprs(i).eval(input).asInstanceOf[Object]
+  // Convert if necessary. Based on the types defined in typeMapping, string is the only
+  // type that needs conversion. If we support timestamps, dates, decimals, arrays, or maps
+  // in the future, proper conversion needs to happen here too.
+  if (buffer(i).isInstanceOf[UTF8String]) {
+buffer(i) = buffer(i).toString
+  }
+  i += 1
+}
+UTF8String.fromString(String.valueOf(method.invoke(obj, buffer : _*)))
+  }
+
+  @transient private lazy val argExprs: Array[Expression] = children.drop(2).toArray
+
+  /** Name of the class -- this has to be called after we verify children has at least two exprs. */
+  @transient private lazy val className = children(0).eval().asInstanceOf[UTF8String].toString
+
+  /** True if the class exists and can be loaded. */
+  @transient private lazy val classExists = Reflect.classExists(className)
+
+  /** The reflection method. */
+  @transient lazy val method: Method = {
+val methodName = children(1).eval(null).asInstanceOf[UTF8String].toString
+Reflect.findMethod(className, methodName, argExprs.map(_.dataType)).orNull
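For readers skimming the quoted diff: the static-method lookup that `Reflect.findMethod` performs can be sketched in plain Scala on top of `java.lang.reflect` alone. The type names and `typeMapping` contents below are illustrative stand-ins, not the actual Catalyst definitions.

```scala
import java.lang.reflect.{Method, Modifier}
import scala.util.Try

// Illustrative stand-ins for the Catalyst data types involved; the real
// expression works with org.apache.spark.sql.types.DataType instead.
sealed trait SimpleType
case object StrType extends SimpleType
case object IntType extends SimpleType

// Hypothetical analogue of `Reflect.typeMapping`: the Java parameter
// classes each supported input type may bind to.
val typeMapping: Map[SimpleType, Seq[Class[_]]] = Map(
  StrType -> Seq(classOf[String]),
  IntType -> Seq(classOf[java.lang.Integer], java.lang.Integer.TYPE))

// Find a public static method on `className` whose parameter classes are
// compatible with the given argument types; None if the class cannot be
// loaded or no overload matches.
def findMethod(className: String, methodName: String,
    argTypes: Seq[SimpleType]): Option[Method] = {
  Try(Class.forName(className)).toOption.flatMap { cls =>
    cls.getMethods.find { m =>
      m.getName == methodName &&
        Modifier.isStatic(m.getModifiers) &&
        m.getParameterTypes.length == argTypes.length &&
        m.getParameterTypes.zip(argTypes).forall { case (cl, t) =>
          typeMapping.get(t).exists(_.contains(cl))
        }
    }
  }
}

println(findMethod("java.lang.Integer", "toBinaryString", Seq(IntType)).isDefined) // true
println(findMethod("java.lang.Integer", "noSuchMethod", Seq(IntType)))             // None
```

Returning an `Option` here (rather than `orNull`) is just for the sketch; the quoted code converts to `null` so that `checkInputDataTypes` can report a missing method as a `TypeCheckFailure`.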

[GitHub] spark pull request #13969: [SPARK-16284][SQL] Implement reflect SQL function

2016-07-10 Thread cloud-fan
Github user cloud-fan commented on a diff in the pull request:

https://github.com/apache/spark/pull/13969#discussion_r70201691
  
--- Diff: 
sql/core/src/test/scala/org/apache/spark/sql/MiscFunctionsSuite.scala ---
@@ -0,0 +1,35 @@
+
+package org.apache.spark.sql
+
+import org.apache.spark.sql.test.SharedSQLContext
+
+class MiscFunctionsSuite extends QueryTest with SharedSQLContext {
+  import testImplicits._
+
+  test("reflect and java_method") {
+val df = Seq((1, "one")).toDF("a", "b")
+checkAnswer(
+  df.selectExpr("reflect('org.apache.spark.sql.ReflectClass', 'method1', a, b)"),
--- End diff --

We should also test it in `JavaDataFrameSuite`, since there is no real static method in Scala.
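A quick illustration of the reviewer's point (the object name is hypothetical): methods defined on a Scala `object` compile to instance methods on a singleton class, so a JVM static-method lookup like reflect's does not see them directly.

```scala
import java.lang.reflect.Modifier

// Hypothetical stand-in for the ReflectClass used by the test above.
// Its `method1` lives on the singleton class, not as a JVM static method.
object ReflectClass {
  def method1(a: Int, b: String): String = s"$b:$a"
}

// Look the method up reflectively on the compiled singleton class.
val m = ReflectClass.getClass.getMethod("method1", classOf[Int], classOf[String])
println(Modifier.isStatic(m.getModifiers))          // false: it is an instance method
println(m.invoke(ReflectClass, Int.box(1), "one"))  // one:1, invoked on the singleton
```

A genuinely static method, as required by the lookup, only exists in Java source (or on JDK classes such as `java.util.UUID.randomUUID`), which is why a Java-side test is needed.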


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request #13969: [SPARK-16284][SQL] Implement reflect SQL function

2016-07-10 Thread cloud-fan
Github user cloud-fan commented on a diff in the pull request:

https://github.com/apache/spark/pull/13969#discussion_r70201642
  
--- Diff: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/Reflect.scala
 ---
@@ -0,0 +1,174 @@
+  /** The reflection method. */
+  @transient lazy val method: Method = {
+val methodName = children(1).eval(null).asInstanceOf[UTF8String].toString
+Reflect.findMethod(className, methodName, argExprs.map(_.dataType)).orNull

[GitHub] spark pull request #13969: [SPARK-16284][SQL] Implement reflect SQL function

2016-07-10 Thread cloud-fan
Github user cloud-fan commented on a diff in the pull request:

https://github.com/apache/spark/pull/13969#discussion_r70201559
  
--- Diff: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/Reflect.scala
 ---
@@ -0,0 +1,174 @@
+  /** The reflection method. */
+  @transient lazy val method: Method = {
+val methodName = children(1).eval(null).asInstanceOf[UTF8String].toString
+Reflect.findMethod(className, methodName, argExprs.map(_.dataType)).orNull

[GitHub] spark pull request #13969: [SPARK-16284][SQL] Implement reflect SQL function

2016-07-10 Thread dongjoon-hyun
Github user dongjoon-hyun commented on a diff in the pull request:

https://github.com/apache/spark/pull/13969#discussion_r70201417
  
--- Diff: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/Reflect.scala
 ---
@@ -0,0 +1,174 @@
+case class Reflect(children: Seq[Expression])
--- End diff --

Ya. It's my fault. Sorry for that.





[GitHub] spark pull request #13969: [SPARK-16284][SQL] Implement reflect SQL function

2016-07-10 Thread petermaxlee
Github user petermaxlee commented on a diff in the pull request:

https://github.com/apache/spark/pull/13969#discussion_r70200185
  
--- Diff: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/Reflect.scala
 ---
@@ -0,0 +1,174 @@
+case class Reflect(children: Seq[Expression])
--- End diff --

It is also annoying if we search for reflect (based on the name) and then don't find an expression with reflect in the name.






[GitHub] spark pull request #13969: [SPARK-16284][SQL] Implement reflect SQL function

2016-07-10 Thread petermaxlee
Github user petermaxlee commented on a diff in the pull request:

https://github.com/apache/spark/pull/13969#discussion_r70200163
  
--- Diff: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/Reflect.scala
 ---
@@ -0,0 +1,174 @@
+case class Reflect(children: Seq[Expression])
--- End diff --

I actually named it JavaMethodReflect before but @dongjoon-hyun asked to use Reflect.






[GitHub] spark pull request #13969: [SPARK-16284][SQL] Implement reflect SQL function

2016-07-10 Thread cloud-fan
Github user cloud-fan commented on a diff in the pull request:

https://github.com/apache/spark/pull/13969#discussion_r70198925
  
--- Diff: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/Reflect.scala
 ---
@@ -0,0 +1,170 @@
+  override def eval(input: InternalRow): Any = {
+var i = 0
+while (i < argExprs.length) {
--- End diff --

`while` is preferred here. The `eval` method is a critical path and a `for` loop in Scala is slow.
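The recommended pattern, shown in isolation with a simplified signature (the real `eval` works with Catalyst `Expression`s and an `InternalRow`):

```scala
// Hot-path style: a mutable index with `while` avoids the closure and
// Range allocation that a Scala for-comprehension desugars into.
def evalArgs(args: Array[AnyRef => AnyRef], input: AnyRef): Array[AnyRef] = {
  val buffer = new Array[AnyRef](args.length)
  var i = 0
  while (i < args.length) {
    buffer(i) = args(i)(input)
    // In the quoted code, UTF8String values are unwrapped to String here.
    i += 1
  }
  buffer
}

val fs: Array[AnyRef => AnyRef] = Array(x => x.toString.toUpperCase, x => x)
println(evalArgs(fs, "ok").mkString(","))  // OK,ok
```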





[GitHub] spark pull request #13969: [SPARK-16284][SQL] Implement reflect SQL function

2016-07-10 Thread cloud-fan
Github user cloud-fan commented on a diff in the pull request:

https://github.com/apache/spark/pull/13969#discussion_r70198848
  
--- Diff: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/Reflect.scala
 ---
@@ -0,0 +1,174 @@
+case class Reflect(children: Seq[Expression])
--- End diff --

`Reflect` is really ambiguous, how about `CallMethod`?





[GitHub] spark pull request #13969: [SPARK-16284][SQL] Implement reflect SQL function

2016-07-08 Thread dongjoon-hyun
Github user dongjoon-hyun commented on a diff in the pull request:

https://github.com/apache/spark/pull/13969#discussion_r70120022
  
--- Diff: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/Reflect.scala
 ---
@@ -0,0 +1,170 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.expressions
+
+import java.lang.reflect.Method
+
+import scala.util.Try
+
+import org.apache.spark.sql.catalyst.InternalRow
+import org.apache.spark.sql.catalyst.analysis.TypeCheckResult
+import 
org.apache.spark.sql.catalyst.analysis.TypeCheckResult.{TypeCheckFailure, 
TypeCheckSuccess}
+import org.apache.spark.sql.catalyst.expressions.codegen.CodegenFallback
+import org.apache.spark.sql.types._
+import org.apache.spark.unsafe.types.UTF8String
+import org.apache.spark.util.Utils
+
+/**
+ * An expression that invokes a method on a class via reflection.
+ *
+ * @param children the first element should be a literal string for the 
class name,
+ * and the second element should be a literal string for 
the method name,
+ * and the remaining are input arguments to the Java 
method.
+ */
+// scalastyle:off line.size.limit
+@ExpressionDescription(
+  usage = "_FUNC_(class,method[,arg1[,arg2..]]) calls method with 
reflection",
+  extended = "> SELECT _FUNC_('java.util.UUID', 
'randomUUID');\nc33fb387-8500-4bfa-81d2-6e0e3e930df2")
+// scalastyle:on line.size.limit
+case class Reflect(children: Seq[Expression])
+  extends Expression with CodegenFallback {
+
+  override def prettyName: String = "reflect"
+
+  override def checkInputDataTypes(): TypeCheckResult = {
+if (children.size < 2) {
+  TypeCheckFailure("requires at least two arguments")
+} else if (!children.take(2).forall(e => e.dataType == StringType && 
e.foldable)) {
+  // The first two arguments must be string type.
+  TypeCheckFailure("first two arguments should be string literals")
+} else if (!classExists) {
+  TypeCheckFailure(s"class $className not found")
+} else if (method == null) {
+  TypeCheckFailure(s"cannot find a method that matches the argument 
types in $className")
+} else {
+  TypeCheckSuccess
+}
+  }
+
+  override def deterministic: Boolean = false
+  override def nullable: Boolean = true
+  override val dataType: DataType = StringType
+
+  override def eval(input: InternalRow): Any = {
+var i = 0
+while (i < argExprs.length) {
+  buffer(i) = argExprs(i).eval(input).asInstanceOf[Object]
+  // Convert if necessary. Based on the types defined in typeMapping, string is the only
+  // type that needs conversion.
+  if (buffer(i).isInstanceOf[UTF8String]) {
+buffer(i) = buffer(i).toString
+  }
+  i += 1
+}
+UTF8String.fromString(String.valueOf(method.invoke(obj, buffer : _*)))
+  }
+
+  @transient private lazy val argExprs: Array[Expression] = children.drop(2).toArray
+
+  /** Name of the class -- this has to be called after we verify children has at least two exprs. */
+  @transient private lazy val className = children(0).eval().asInstanceOf[UTF8String].toString
+
+  /** True if the class exists and can be loaded. */
+  @transient private lazy val classExists = Reflect.classExists(className)
--- End diff --

Let's forget about `Try`. It's not good style, either.

BTW, do you mean `Utils.classForName` is called once in this PR?
> given it's created only once.
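
The suggestion to drop `Try` can be sketched with a plain try/catch. This is a hedged illustration, not the PR's actual code: `ClassCheckSketch` is a made-up name, and the real implementation loads the class through Spark's `Utils.classForName` rather than `Class.forName`.

```java
// Hypothetical sketch of a classExists check written without scala.util.Try,
// using a plain try/catch around class loading.
public class ClassCheckSketch {
    static boolean classExists(String className) {
        try {
            Class.forName(className);  // throws ClassNotFoundException if absent
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(classExists("java.util.UUID"));  // true
        System.out.println(classExists("no.such.Klass"));   // false
    }
}
```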


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---


[GitHub] spark pull request #13969: [SPARK-16284][SQL] Implement reflect SQL function

2016-07-08 Thread dongjoon-hyun
Github user dongjoon-hyun commented on a diff in the pull request:

https://github.com/apache/spark/pull/13969#discussion_r70118085
  
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/Reflect.scala ---
@@ -0,0 +1,170 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.expressions
+
+import java.lang.reflect.Method
+
+import scala.util.Try
+
+import org.apache.spark.sql.catalyst.InternalRow
+import org.apache.spark.sql.catalyst.analysis.TypeCheckResult
+import org.apache.spark.sql.catalyst.analysis.TypeCheckResult.{TypeCheckFailure, TypeCheckSuccess}
+import org.apache.spark.sql.catalyst.expressions.codegen.CodegenFallback
+import org.apache.spark.sql.types._
+import org.apache.spark.unsafe.types.UTF8String
+import org.apache.spark.util.Utils
+
+/**
+ * An expression that invokes a method on a class via reflection.
+ *
+ * @param children the first element should be a literal string for the class name,
+ * and the second element should be a literal string for the method name,
+ * and the remaining are input arguments to the Java method.
+ */
+// scalastyle:off line.size.limit
+@ExpressionDescription(
+  usage = "_FUNC_(class,method[,arg1[,arg2..]]) calls method with reflection",
+  extended = "> SELECT _FUNC_('java.util.UUID', 'randomUUID');\nc33fb387-8500-4bfa-81d2-6e0e3e930df2")
+// scalastyle:on line.size.limit
+case class Reflect(children: Seq[Expression])
+  extends Expression with CodegenFallback {
+
+  override def prettyName: String = "reflect"
+
+  override def checkInputDataTypes(): TypeCheckResult = {
+if (children.size < 2) {
+  TypeCheckFailure("requires at least two arguments")
+} else if (!children.take(2).forall(e => e.dataType == StringType && e.foldable)) {
+  // The first two arguments must be string type.
+  TypeCheckFailure("first two arguments should be string literals")
+} else if (!classExists) {
+  TypeCheckFailure(s"class $className not found")
+} else if (method == null) {
+  TypeCheckFailure(s"cannot find a method that matches the argument types in $className")
+} else {
+  TypeCheckSuccess
+}
+  }
+
+  override def deterministic: Boolean = false
+  override def nullable: Boolean = true
+  override val dataType: DataType = StringType
+
+  override def eval(input: InternalRow): Any = {
+var i = 0
+while (i < argExprs.length) {
+  buffer(i) = argExprs(i).eval(input).asInstanceOf[Object]
+  // Convert if necessary. Based on the types defined in typeMapping, string is the only
+  // type that needs conversion.
+  if (buffer(i).isInstanceOf[UTF8String]) {
+buffer(i) = buffer(i).toString
+  }
+  i += 1
+}
+UTF8String.fromString(String.valueOf(method.invoke(obj, buffer : _*)))
--- End diff --

Yep. `reflect` does. `reflect2` is not related to this PR.





[GitHub] spark pull request #13969: [SPARK-16284][SQL] Implement reflect SQL function

2016-07-08 Thread petermaxlee
Github user petermaxlee commented on a diff in the pull request:

https://github.com/apache/spark/pull/13969#discussion_r70117803
  
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/Reflect.scala ---
@@ -0,0 +1,170 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.expressions
+
+import java.lang.reflect.Method
+
+import scala.util.Try
+
+import org.apache.spark.sql.catalyst.InternalRow
+import org.apache.spark.sql.catalyst.analysis.TypeCheckResult
+import org.apache.spark.sql.catalyst.analysis.TypeCheckResult.{TypeCheckFailure, TypeCheckSuccess}
+import org.apache.spark.sql.catalyst.expressions.codegen.CodegenFallback
+import org.apache.spark.sql.types._
+import org.apache.spark.unsafe.types.UTF8String
+import org.apache.spark.util.Utils
+
+/**
+ * An expression that invokes a method on a class via reflection.
+ *
+ * @param children the first element should be a literal string for the class name,
+ * and the second element should be a literal string for the method name,
+ * and the remaining are input arguments to the Java method.
+ */
+// scalastyle:off line.size.limit
+@ExpressionDescription(
+  usage = "_FUNC_(class,method[,arg1[,arg2..]]) calls method with reflection",
+  extended = "> SELECT _FUNC_('java.util.UUID', 'randomUUID');\nc33fb387-8500-4bfa-81d2-6e0e3e930df2")
--- End diff --

added





[GitHub] spark pull request #13969: [SPARK-16284][SQL] Implement reflect SQL function

2016-07-08 Thread petermaxlee
Github user petermaxlee commented on a diff in the pull request:

https://github.com/apache/spark/pull/13969#discussion_r70116854
  
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/Reflect.scala ---
@@ -0,0 +1,170 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.expressions
+
+import java.lang.reflect.Method
+
+import scala.util.Try
+
+import org.apache.spark.sql.catalyst.InternalRow
+import org.apache.spark.sql.catalyst.analysis.TypeCheckResult
+import org.apache.spark.sql.catalyst.analysis.TypeCheckResult.{TypeCheckFailure, TypeCheckSuccess}
+import org.apache.spark.sql.catalyst.expressions.codegen.CodegenFallback
+import org.apache.spark.sql.types._
+import org.apache.spark.unsafe.types.UTF8String
+import org.apache.spark.util.Utils
+
+/**
+ * An expression that invokes a method on a class via reflection.
+ *
+ * @param children the first element should be a literal string for the class name,
+ * and the second element should be a literal string for the method name,
+ * and the remaining are input arguments to the Java method.
+ */
+// scalastyle:off line.size.limit
+@ExpressionDescription(
+  usage = "_FUNC_(class,method[,arg1[,arg2..]]) calls method with reflection",
+  extended = "> SELECT _FUNC_('java.util.UUID', 'randomUUID');\nc33fb387-8500-4bfa-81d2-6e0e3e930df2")
+// scalastyle:on line.size.limit
+case class Reflect(children: Seq[Expression])
+  extends Expression with CodegenFallback {
+
+  override def prettyName: String = "reflect"
+
+  override def checkInputDataTypes(): TypeCheckResult = {
+if (children.size < 2) {
+  TypeCheckFailure("requires at least two arguments")
+} else if (!children.take(2).forall(e => e.dataType == StringType && e.foldable)) {
+  // The first two arguments must be string type.
+  TypeCheckFailure("first two arguments should be string literals")
+} else if (!classExists) {
+  TypeCheckFailure(s"class $className not found")
+} else if (method == null) {
+  TypeCheckFailure(s"cannot find a method that matches the argument types in $className")
+} else {
+  TypeCheckSuccess
+}
+  }
+
+  override def deterministic: Boolean = false
+  override def nullable: Boolean = true
+  override val dataType: DataType = StringType
+
+  override def eval(input: InternalRow): Any = {
+var i = 0
+while (i < argExprs.length) {
+  buffer(i) = argExprs(i).eval(input).asInstanceOf[Object]
+  // Convert if necessary. Based on the types defined in typeMapping, string is the only
+  // type that needs conversion.
+  if (buffer(i).isInstanceOf[UTF8String]) {
+buffer(i) = buffer(i).toString
+  }
+  i += 1
+}
+UTF8String.fromString(String.valueOf(method.invoke(obj, buffer : _*)))
--- End diff --

Yes, Hive only supports string for reflect.





[GitHub] spark pull request #13969: [SPARK-16284][SQL] Implement reflect SQL function

2016-07-08 Thread petermaxlee
Github user petermaxlee commented on a diff in the pull request:

https://github.com/apache/spark/pull/13969#discussion_r70106490
  
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/Reflect.scala ---
@@ -0,0 +1,170 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.expressions
+
+import java.lang.reflect.Method
+
+import scala.util.Try
+
+import org.apache.spark.sql.catalyst.InternalRow
+import org.apache.spark.sql.catalyst.analysis.TypeCheckResult
+import org.apache.spark.sql.catalyst.analysis.TypeCheckResult.{TypeCheckFailure, TypeCheckSuccess}
+import org.apache.spark.sql.catalyst.expressions.codegen.CodegenFallback
+import org.apache.spark.sql.types._
+import org.apache.spark.unsafe.types.UTF8String
+import org.apache.spark.util.Utils
+
+/**
+ * An expression that invokes a method on a class via reflection.
+ *
+ * @param children the first element should be a literal string for the class name,
+ * and the second element should be a literal string for the method name,
+ * and the remaining are input arguments to the Java method.
+ */
+// scalastyle:off line.size.limit
+@ExpressionDescription(
+  usage = "_FUNC_(class,method[,arg1[,arg2..]]) calls method with reflection",
+  extended = "> SELECT _FUNC_('java.util.UUID', 'randomUUID');\nc33fb387-8500-4bfa-81d2-6e0e3e930df2")
+// scalastyle:on line.size.limit
+case class Reflect(children: Seq[Expression])
+  extends Expression with CodegenFallback {
+
+  override def prettyName: String = "reflect"
+
+  override def checkInputDataTypes(): TypeCheckResult = {
+if (children.size < 2) {
+  TypeCheckFailure("requires at least two arguments")
+} else if (!children.take(2).forall(e => e.dataType == StringType && e.foldable)) {
+  // The first two arguments must be string type.
+  TypeCheckFailure("first two arguments should be string literals")
+} else if (!classExists) {
+  TypeCheckFailure(s"class $className not found")
+} else if (method == null) {
+  TypeCheckFailure(s"cannot find a method that matches the argument types in $className")
+} else {
+  TypeCheckSuccess
+}
+  }
+
+  override def deterministic: Boolean = false
+  override def nullable: Boolean = true
+  override val dataType: DataType = StringType
+
+  override def eval(input: InternalRow): Any = {
+var i = 0
+while (i < argExprs.length) {
+  buffer(i) = argExprs(i).eval(input).asInstanceOf[Object]
+  // Convert if necessary. Based on the types defined in typeMapping, string is the only
+  // type that needs conversion.
+  if (buffer(i).isInstanceOf[UTF8String]) {
+buffer(i) = buffer(i).toString
+  }
+  i += 1
+}
+UTF8String.fromString(String.valueOf(method.invoke(obj, buffer : _*)))
+  }
+
+  @transient private lazy val argExprs: Array[Expression] = children.drop(2).toArray
+
+  /** Name of the class -- this has to be called after we verify children has at least two exprs. */
+  @transient private lazy val className = children(0).eval().asInstanceOf[UTF8String].toString
+
+  /** True if the class exists and can be loaded. */
+  @transient private lazy val classExists = Reflect.classExists(className)
--- End diff --

It'd make unit tests more annoying to write. I kind of prefer doing it this way, since the cost of creating a class 3 times is very small given it's created only once.
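
The "created only once" point can be sketched with a memoized field. This is an illustrative Java analogue (`LazyOnce` is a made-up name): the Scala code in the PR gets the same effect from a `@transient lazy val`, whose initializer runs at most once per instance.

```java
// Sketch: the class object is resolved on first use and cached, so the
// class-loading cost is paid a single time no matter how often it is read.
public class LazyOnce {
    static int initCount = 0;
    private static Class<?> cached;

    static Class<?> clazz() throws ClassNotFoundException {
        if (cached == null) {  // initialize on first access only
            initCount++;
            cached = Class.forName("java.util.UUID");
        }
        return cached;
    }

    public static void main(String[] args) throws Exception {
        clazz();
        clazz();
        System.out.println(initCount);  // 1 -- the initializer ran once
    }
}
```

(Unlike a Scala lazy val, this hand-rolled version is not thread-safe; it only illustrates the cost argument.)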





[GitHub] spark pull request #13969: [SPARK-16284][SQL] Implement reflect SQL function

2016-07-08 Thread dongjoon-hyun
Github user dongjoon-hyun commented on a diff in the pull request:

https://github.com/apache/spark/pull/13969#discussion_r70106457
  
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/Reflect.scala ---
@@ -0,0 +1,170 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.expressions
+
+import java.lang.reflect.Method
+
+import scala.util.Try
+
+import org.apache.spark.sql.catalyst.InternalRow
+import org.apache.spark.sql.catalyst.analysis.TypeCheckResult
+import org.apache.spark.sql.catalyst.analysis.TypeCheckResult.{TypeCheckFailure, TypeCheckSuccess}
+import org.apache.spark.sql.catalyst.expressions.codegen.CodegenFallback
+import org.apache.spark.sql.types._
+import org.apache.spark.unsafe.types.UTF8String
+import org.apache.spark.util.Utils
+
+/**
+ * An expression that invokes a method on a class via reflection.
+ *
+ * @param children the first element should be a literal string for the class name,
+ * and the second element should be a literal string for the method name,
+ * and the remaining are input arguments to the Java method.
+ */
+// scalastyle:off line.size.limit
+@ExpressionDescription(
+  usage = "_FUNC_(class,method[,arg1[,arg2..]]) calls method with reflection",
+  extended = "> SELECT _FUNC_('java.util.UUID', 'randomUUID');\nc33fb387-8500-4bfa-81d2-6e0e3e930df2")
+// scalastyle:on line.size.limit
+case class Reflect(children: Seq[Expression])
+  extends Expression with CodegenFallback {
+
+  override def prettyName: String = "reflect"
+
+  override def checkInputDataTypes(): TypeCheckResult = {
+if (children.size < 2) {
+  TypeCheckFailure("requires at least two arguments")
+} else if (!children.take(2).forall(e => e.dataType == StringType && e.foldable)) {
+  // The first two arguments must be string type.
+  TypeCheckFailure("first two arguments should be string literals")
+} else if (!classExists) {
+  TypeCheckFailure(s"class $className not found")
+} else if (method == null) {
+  TypeCheckFailure(s"cannot find a method that matches the argument types in $className")
+} else {
+  TypeCheckSuccess
+}
+  }
+
+  override def deterministic: Boolean = false
+  override def nullable: Boolean = true
+  override val dataType: DataType = StringType
+
+  override def eval(input: InternalRow): Any = {
+var i = 0
+while (i < argExprs.length) {
+  buffer(i) = argExprs(i).eval(input).asInstanceOf[Object]
+  // Convert if necessary. Based on the types defined in typeMapping, string is the only
+  // type that needs conversion.
+  if (buffer(i).isInstanceOf[UTF8String]) {
+buffer(i) = buffer(i).toString
+  }
+  i += 1
+}
+UTF8String.fromString(String.valueOf(method.invoke(obj, buffer : _*)))
--- End diff --

Oh, it does. I referenced the wrong part of Hive before; 
(GenericUDFReflect.java).


https://github.com/apache/hive/blob/master/ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDFReflect.java#L137
```
try {
  return String.valueOf(m.invoke(o, parameterJavaValues));
```

You mean GenericUDFReflect2.java, right?


https://github.com/apache/hive/blob/master/ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDFReflect2.java
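
The eval-time behaviour under discussion, invoking a method reflectively and stringifying the result with `String.valueOf`, can be sketched as below. This is a hedged simplification: `invokeStatic` is a made-up helper limited to zero-argument static methods, whereas both Hive's `GenericUDFReflect` and the PR also resolve methods by argument types.

```java
import java.lang.reflect.Method;

// Minimal sketch: look up a zero-argument static method by name and
// convert whatever it returns to a string via String.valueOf.
public class ReflectCall {
    static String invokeStatic(String className, String methodName) throws Exception {
        Method m = Class.forName(className).getMethod(methodName);
        return String.valueOf(m.invoke(null));  // null receiver: static method
    }

    public static void main(String[] args) throws Exception {
        String uuid = invokeStatic("java.util.UUID", "randomUUID");
        System.out.println(uuid.length());  // 36, the canonical UUID text form
    }
}
```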





[GitHub] spark pull request #13969: [SPARK-16284][SQL] Implement reflect SQL function

2016-07-08 Thread petermaxlee
Github user petermaxlee commented on a diff in the pull request:

https://github.com/apache/spark/pull/13969#discussion_r70106095
  
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/Reflect.scala ---
@@ -0,0 +1,170 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.expressions
+
+import java.lang.reflect.Method
+
+import scala.util.Try
+
+import org.apache.spark.sql.catalyst.InternalRow
+import org.apache.spark.sql.catalyst.analysis.TypeCheckResult
+import org.apache.spark.sql.catalyst.analysis.TypeCheckResult.{TypeCheckFailure, TypeCheckSuccess}
+import org.apache.spark.sql.catalyst.expressions.codegen.CodegenFallback
+import org.apache.spark.sql.types._
+import org.apache.spark.unsafe.types.UTF8String
+import org.apache.spark.util.Utils
+
+/**
+ * An expression that invokes a method on a class via reflection.
+ *
+ * @param children the first element should be a literal string for the class name,
+ * and the second element should be a literal string for the method name,
+ * and the remaining are input arguments to the Java method.
+ */
+// scalastyle:off line.size.limit
+@ExpressionDescription(
+  usage = "_FUNC_(class,method[,arg1[,arg2..]]) calls method with reflection",
+  extended = "> SELECT _FUNC_('java.util.UUID', 'randomUUID');\nc33fb387-8500-4bfa-81d2-6e0e3e930df2")
+// scalastyle:on line.size.limit
+case class Reflect(children: Seq[Expression])
+  extends Expression with CodegenFallback {
+
+  override def prettyName: String = "reflect"
+
+  override def checkInputDataTypes(): TypeCheckResult = {
+if (children.size < 2) {
+  TypeCheckFailure("requires at least two arguments")
+} else if (!children.take(2).forall(e => e.dataType == StringType && e.foldable)) {
+  // The first two arguments must be string type.
+  TypeCheckFailure("first two arguments should be string literals")
+} else if (!classExists) {
+  TypeCheckFailure(s"class $className not found")
+} else if (method == null) {
+  TypeCheckFailure(s"cannot find a method that matches the argument types in $className")
+} else {
+  TypeCheckSuccess
+}
+  }
+
+  override def deterministic: Boolean = false
+  override def nullable: Boolean = true
+  override val dataType: DataType = StringType
+
+  override def eval(input: InternalRow): Any = {
+var i = 0
+while (i < argExprs.length) {
+  buffer(i) = argExprs(i).eval(input).asInstanceOf[Object]
+  // Convert if necessary. Based on the types defined in typeMapping, string is the only
+  // type that needs conversion.
+  if (buffer(i).isInstanceOf[UTF8String]) {
--- End diff --

Let me add a comment saying only string is supported for now.
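
What "only string is supported for now" implies for method resolution can be sketched as follows. This is an assumption-laden illustration (`findStringMethod` is a made-up helper, not the PR's resolution logic): if every argument is a string, every candidate parameter type is `java.lang.String`.

```java
import java.lang.reflect.Method;

// Sketch: resolve a method whose parameters are all java.lang.String,
// the only argument type the discussed implementation accepts.
public class StringOnlyResolve {
    static Method findStringMethod(String className, String methodName, int arity)
            throws Exception {
        Class<?>[] paramTypes = new Class<?>[arity];
        java.util.Arrays.fill(paramTypes, String.class);  // all-string signature
        return Class.forName(className).getMethod(methodName, paramTypes);
    }

    public static void main(String[] args) throws Exception {
        // System.getProperty(String) matches a one-string-argument lookup:
        System.out.println(findStringMethod("java.lang.System", "getProperty", 1));
    }
}
```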






[GitHub] spark pull request #13969: [SPARK-16284][SQL] Implement reflect SQL function

2016-07-08 Thread petermaxlee
Github user petermaxlee commented on a diff in the pull request:

https://github.com/apache/spark/pull/13969#discussion_r70106044
  
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/Reflect.scala ---
@@ -0,0 +1,170 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.expressions
+
+import java.lang.reflect.Method
+
+import scala.util.Try
+
+import org.apache.spark.sql.catalyst.InternalRow
+import org.apache.spark.sql.catalyst.analysis.TypeCheckResult
+import org.apache.spark.sql.catalyst.analysis.TypeCheckResult.{TypeCheckFailure, TypeCheckSuccess}
+import org.apache.spark.sql.catalyst.expressions.codegen.CodegenFallback
+import org.apache.spark.sql.types._
+import org.apache.spark.unsafe.types.UTF8String
+import org.apache.spark.util.Utils
+
+/**
+ * An expression that invokes a method on a class via reflection.
+ *
+ * @param children the first element should be a literal string for the class name,
+ * and the second element should be a literal string for the method name,
+ * and the remaining are input arguments to the Java method.
+ */
+// scalastyle:off line.size.limit
+@ExpressionDescription(
+  usage = "_FUNC_(class,method[,arg1[,arg2..]]) calls method with reflection",
+  extended = "> SELECT _FUNC_('java.util.UUID', 'randomUUID');\nc33fb387-8500-4bfa-81d2-6e0e3e930df2")
+// scalastyle:on line.size.limit
+case class Reflect(children: Seq[Expression])
+  extends Expression with CodegenFallback {
+
+  override def prettyName: String = "reflect"
+
+  override def checkInputDataTypes(): TypeCheckResult = {
+if (children.size < 2) {
+  TypeCheckFailure("requires at least two arguments")
+} else if (!children.take(2).forall(e => e.dataType == StringType && e.foldable)) {
+  // The first two arguments must be string type.
+  TypeCheckFailure("first two arguments should be string literals")
+} else if (!classExists) {
+  TypeCheckFailure(s"class $className not found")
+} else if (method == null) {
+  TypeCheckFailure(s"cannot find a method that matches the argument types in $className")
+} else {
+  TypeCheckSuccess
+}
+  }
+
+  override def deterministic: Boolean = false
+  override def nullable: Boolean = true
+  override val dataType: DataType = StringType
+
+  override def eval(input: InternalRow): Any = {
+var i = 0
+while (i < argExprs.length) {
+  buffer(i) = argExprs(i).eval(input).asInstanceOf[Object]
+  // Convert if necessary. Based on the types defined in typeMapping, string is the only
+  // type that needs conversion.
+  if (buffer(i).isInstanceOf[UTF8String]) {
+buffer(i) = buffer(i).toString
+  }
+  i += 1
+}
+UTF8String.fromString(String.valueOf(method.invoke(obj, buffer : _*)))
+  }
+
+  @transient private lazy val argExprs: Array[Expression] = children.drop(2).toArray
+
+  /** Name of the class -- this has to be called after we verify children has at least two exprs. */
+  @transient private lazy val className = children(0).eval().asInstanceOf[UTF8String].toString
--- End diff --

Why? I don't think `head` is clearer.






[GitHub] spark pull request #13969: [SPARK-16284][SQL] Implement reflect SQL function

2016-07-08 Thread hvanhovell
Github user hvanhovell commented on a diff in the pull request:

https://github.com/apache/spark/pull/13969#discussion_r70075370
  
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/Reflect.scala ---
@@ -0,0 +1,170 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.expressions
+
+import java.lang.reflect.Method
+
+import scala.util.Try
+
+import org.apache.spark.sql.catalyst.InternalRow
+import org.apache.spark.sql.catalyst.analysis.TypeCheckResult
+import org.apache.spark.sql.catalyst.analysis.TypeCheckResult.{TypeCheckFailure, TypeCheckSuccess}
+import org.apache.spark.sql.catalyst.expressions.codegen.CodegenFallback
+import org.apache.spark.sql.types._
+import org.apache.spark.unsafe.types.UTF8String
+import org.apache.spark.util.Utils
+
+/**
+ * An expression that invokes a method on a class via reflection.
+ *
+ * @param children the first element should be a literal string for the class name,
+ * and the second element should be a literal string for the method name,
+ * and the remaining are input arguments to the Java method.
+ */
+// scalastyle:off line.size.limit
+@ExpressionDescription(
+  usage = "_FUNC_(class,method[,arg1[,arg2..]]) calls method with reflection",
+  extended = "> SELECT _FUNC_('java.util.UUID', 'randomUUID');\nc33fb387-8500-4bfa-81d2-6e0e3e930df2")
+// scalastyle:on line.size.limit
+case class Reflect(children: Seq[Expression])
+  extends Expression with CodegenFallback {
+
+  override def prettyName: String = "reflect"
+
+  override def checkInputDataTypes(): TypeCheckResult = {
+if (children.size < 2) {
+  TypeCheckFailure("requires at least two arguments")
+} else if (!children.take(2).forall(e => e.dataType == StringType && 
e.foldable)) {
+  // The first two arguments must be string type.
+  TypeCheckFailure("first two arguments should be string literals")
+} else if (!classExists) {
+  TypeCheckFailure(s"class $className not found")
+} else if (method == null) {
+  TypeCheckFailure(s"cannot find a method that matches the argument 
types in $className")
+} else {
+  TypeCheckSuccess
+}
+  }
+
+  override def deterministic: Boolean = false
+  override def nullable: Boolean = true
+  override val dataType: DataType = StringType
+
+  override def eval(input: InternalRow): Any = {
+var i = 0
+while (i < argExprs.length) {
+  buffer(i) = argExprs(i).eval(input).asInstanceOf[Object]
+  // Convert if necessary. Based on the types defined in typeMapping, 
string is the only
+  // type that needs conversion.
+  if (buffer(i).isInstanceOf[UTF8String]) {
--- End diff --

What about timestamp/date/decimals/arrays/maps?
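
To make the concern concrete, here is a small, hypothetical Java sketch (not the PR's actual code) of the kind of per-type coercion the question is pointing at: an argument whose runtime representation differs from the method's declared parameter type needs an explicit conversion before a reflective call can accept it. `UTF8String` and Spark's `Decimal` are the real cases; `StringBuilder` and `Double` stand in for them here.

```java
import java.math.BigDecimal;

public class ArgCoercionDemo {
    // Convert an argument to the target parameter type when the runtime
    // representation differs; pass it through unchanged otherwise.
    static Object coerce(Object arg, Class<?> target) {
        if (target == String.class && !(arg instanceof String)) {
            return arg.toString();                   // e.g. UTF8String -> String
        }
        if (target == BigDecimal.class && arg instanceof Double) {
            return BigDecimal.valueOf((Double) arg); // e.g. Decimal -> BigDecimal
        }
        return arg;
    }

    public static void main(String[] args) {
        System.out.println(coerce(new StringBuilder("hi"), String.class)); // hi
        System.out.println(coerce(1.5, BigDecimal.class));                 // 1.5
    }
}
```

Timestamps, dates, arrays, and maps would each need an analogous branch, which is why the string-only conversion above is being questioned.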


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request #13969: [SPARK-16284][SQL] Implement reflect SQL function

2016-07-08 Thread hvanhovell
Github user hvanhovell commented on a diff in the pull request:

https://github.com/apache/spark/pull/13969#discussion_r70075273
  
--- Diff: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/Reflect.scala
 ---
@@ -0,0 +1,170 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.expressions
+
+import java.lang.reflect.Method
+
+import scala.util.Try
+
+import org.apache.spark.sql.catalyst.InternalRow
+import org.apache.spark.sql.catalyst.analysis.TypeCheckResult
+import 
org.apache.spark.sql.catalyst.analysis.TypeCheckResult.{TypeCheckFailure, 
TypeCheckSuccess}
+import org.apache.spark.sql.catalyst.expressions.codegen.CodegenFallback
+import org.apache.spark.sql.types._
+import org.apache.spark.unsafe.types.UTF8String
+import org.apache.spark.util.Utils
+
+/**
+ * An expression that invokes a method on a class via reflection.
+ *
+ * @param children the first element should be a literal string for the 
class name,
+ * and the second element should be a literal string for 
the method name,
+ * and the remaining are input arguments to the Java 
method.
+ */
+// scalastyle:off line.size.limit
+@ExpressionDescription(
+  usage = "_FUNC_(class,method[,arg1[,arg2..]]) calls method with 
reflection",
+  extended = "> SELECT _FUNC_('java.util.UUID', 
'randomUUID');\nc33fb387-8500-4bfa-81d2-6e0e3e930df2")
+// scalastyle:on line.size.limit
+case class Reflect(children: Seq[Expression])
+  extends Expression with CodegenFallback {
+
+  override def prettyName: String = "reflect"
+
+  override def checkInputDataTypes(): TypeCheckResult = {
+if (children.size < 2) {
+  TypeCheckFailure("requires at least two arguments")
+} else if (!children.take(2).forall(e => e.dataType == StringType && 
e.foldable)) {
+  // The first two arguments must be string type.
+  TypeCheckFailure("first two arguments should be string literals")
+} else if (!classExists) {
+  TypeCheckFailure(s"class $className not found")
+} else if (method == null) {
+  TypeCheckFailure(s"cannot find a method that matches the argument 
types in $className")
+} else {
+  TypeCheckSuccess
+}
+  }
+
+  override def deterministic: Boolean = false
+  override def nullable: Boolean = true
+  override val dataType: DataType = StringType
+
+  override def eval(input: InternalRow): Any = {
+var i = 0
+while (i < argExprs.length) {
+  buffer(i) = argExprs(i).eval(input).asInstanceOf[Object]
+  // Convert if necessary. Based on the types defined in typeMapping, 
string is the only
+  // type that needs conversion.
+  if (buffer(i).isInstanceOf[UTF8String]) {
+buffer(i) = buffer(i).toString
+  }
+  i += 1
+}
+UTF8String.fromString(String.valueOf(method.invoke(obj, buffer : _*)))
--- End diff --

Hive's reflect allows a method to return anything Hive can serialize. Why 
do we only return a String?
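
For illustration, a hypothetical Java snippet (not part of the PR) showing the point: a reflective call already yields an `Object`, and the declared return type is known statically, so the result does not have to be stringified unconditionally.

```java
import java.lang.reflect.Method;

public class ReturnTypeDemo {
    public static void main(String[] args) throws Exception {
        // The declared return type is available before the call, and the
        // result comes back as a boxed Object, not a String.
        Method m = Math.class.getMethod("abs", int.class);
        Object result = m.invoke(null, -5);     // boxed Integer
        System.out.println(m.getReturnType());  // int
        System.out.println(result);             // 5
    }
}
```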





[GitHub] spark pull request #13969: [SPARK-16284][SQL] Implement reflect SQL function

2016-07-08 Thread hvanhovell
Github user hvanhovell commented on a diff in the pull request:

https://github.com/apache/spark/pull/13969#discussion_r70074963
  
--- Diff: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/Reflect.scala
 ---
@@ -0,0 +1,170 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.expressions
+
+import java.lang.reflect.Method
+
+import scala.util.Try
+
+import org.apache.spark.sql.catalyst.InternalRow
+import org.apache.spark.sql.catalyst.analysis.TypeCheckResult
+import 
org.apache.spark.sql.catalyst.analysis.TypeCheckResult.{TypeCheckFailure, 
TypeCheckSuccess}
+import org.apache.spark.sql.catalyst.expressions.codegen.CodegenFallback
+import org.apache.spark.sql.types._
+import org.apache.spark.unsafe.types.UTF8String
+import org.apache.spark.util.Utils
+
+/**
+ * An expression that invokes a method on a class via reflection.
+ *
+ * @param children the first element should be a literal string for the 
class name,
+ * and the second element should be a literal string for 
the method name,
+ * and the remaining are input arguments to the Java 
method.
+ */
+// scalastyle:off line.size.limit
+@ExpressionDescription(
+  usage = "_FUNC_(class,method[,arg1[,arg2..]]) calls method with 
reflection",
+  extended = "> SELECT _FUNC_('java.util.UUID', 
'randomUUID');\nc33fb387-8500-4bfa-81d2-6e0e3e930df2")
+// scalastyle:on line.size.limit
+case class Reflect(children: Seq[Expression])
+  extends Expression with CodegenFallback {
+
+  override def prettyName: String = "reflect"
+
+  override def checkInputDataTypes(): TypeCheckResult = {
+if (children.size < 2) {
+  TypeCheckFailure("requires at least two arguments")
+} else if (!children.take(2).forall(e => e.dataType == StringType && 
e.foldable)) {
+  // The first two arguments must be string type.
+  TypeCheckFailure("first two arguments should be string literals")
+} else if (!classExists) {
+  TypeCheckFailure(s"class $className not found")
+} else if (method == null) {
+  TypeCheckFailure(s"cannot find a method that matches the argument 
types in $className")
+} else {
+  TypeCheckSuccess
+}
+  }
+
+  override def deterministic: Boolean = false
+  override def nullable: Boolean = true
+  override val dataType: DataType = StringType
+
+  override def eval(input: InternalRow): Any = {
+var i = 0
+while (i < argExprs.length) {
--- End diff --

Why? Both seem fine to me?





[GitHub] spark pull request #13969: [SPARK-16284][SQL] Implement reflect SQL function

2016-07-08 Thread dongjoon-hyun
Github user dongjoon-hyun commented on a diff in the pull request:

https://github.com/apache/spark/pull/13969#discussion_r70047707
  
--- Diff: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/Reflect.scala
 ---
@@ -0,0 +1,170 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.expressions
+
+import java.lang.reflect.Method
+
+import scala.util.Try
+
+import org.apache.spark.sql.catalyst.InternalRow
+import org.apache.spark.sql.catalyst.analysis.TypeCheckResult
+import 
org.apache.spark.sql.catalyst.analysis.TypeCheckResult.{TypeCheckFailure, 
TypeCheckSuccess}
+import org.apache.spark.sql.catalyst.expressions.codegen.CodegenFallback
+import org.apache.spark.sql.types._
+import org.apache.spark.unsafe.types.UTF8String
+import org.apache.spark.util.Utils
+
+/**
+ * An expression that invokes a method on a class via reflection.
+ *
+ * @param children the first element should be a literal string for the 
class name,
+ * and the second element should be a literal string for 
the method name,
+ * and the remaining are input arguments to the Java 
method.
+ */
+// scalastyle:off line.size.limit
+@ExpressionDescription(
+  usage = "_FUNC_(class,method[,arg1[,arg2..]]) calls method with 
reflection",
+  extended = "> SELECT _FUNC_('java.util.UUID', 
'randomUUID');\nc33fb387-8500-4bfa-81d2-6e0e3e930df2")
+// scalastyle:on line.size.limit
+case class Reflect(children: Seq[Expression])
+  extends Expression with CodegenFallback {
+
+  override def prettyName: String = "reflect"
+
+  override def checkInputDataTypes(): TypeCheckResult = {
+if (children.size < 2) {
+  TypeCheckFailure("requires at least two arguments")
+} else if (!children.take(2).forall(e => e.dataType == StringType && 
e.foldable)) {
+  // The first two arguments must be string type.
+  TypeCheckFailure("first two arguments should be string literals")
+} else if (!classExists) {
+  TypeCheckFailure(s"class $className not found")
+} else if (method == null) {
+  TypeCheckFailure(s"cannot find a method that matches the argument 
types in $className")
+} else {
+  TypeCheckSuccess
+}
+  }
+
+  override def deterministic: Boolean = false
+  override def nullable: Boolean = true
+  override val dataType: DataType = StringType
+
+  override def eval(input: InternalRow): Any = {
+var i = 0
+while (i < argExprs.length) {
+  buffer(i) = argExprs(i).eval(input).asInstanceOf[Object]
+  // Convert if necessary. Based on the types defined in typeMapping, 
string is the only
+  // type that needs conversion.
+  if (buffer(i).isInstanceOf[UTF8String]) {
+buffer(i) = buffer(i).toString
+  }
+  i += 1
+}
+UTF8String.fromString(String.valueOf(method.invoke(obj, buffer : _*)))
+  }
+
+  @transient private lazy val argExprs: Array[Expression] = 
children.drop(2).toArray
+
+  /** Name of the class -- this has to be called after we verify children 
has at least two exprs. */
+  @transient private lazy val className = 
children(0).eval().asInstanceOf[UTF8String].toString
+
+  /** True if the class exists and can be loaded. */
+  @transient private lazy val classExists = Reflect.classExists(className)
--- End diff --

What I mean is the following.
```
- } else if (!classExists) {
+ } else if (clazz.getOrElse(null) == null) {
...
- @transient private lazy val classExists = Reflect.classExists(className)
+ @transient private lazy val clazz = Reflect.findClass(className)
...
- private def classExists(className: String): Boolean = { ... }
+ private def findClass(className: String): Try[Class[_]] = 
Try(Utils.classForName(className))
```
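
For comparison, a hypothetical Java analogue of this suggestion (the PR itself uses Scala's `Try`): return the loaded `Class` when it exists rather than a bare boolean, so the caller can reuse the `Class` value instead of looking it up twice.

```java
import java.util.Optional;

public class FindClassDemo {
    // Look up a class by name; empty means "not found" and replaces
    // the boolean classExists flag.
    static Optional<Class<?>> findClass(String name) {
        try {
            return Optional.of(Class.forName(name));
        } catch (ClassNotFoundException e) {
            return Optional.empty();
        }
    }

    public static void main(String[] args) {
        System.out.println(findClass("java.util.UUID").isPresent()); // true
        System.out.println(findClass("no.such.Clazz").isPresent());  // false
    }
}
```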



[GitHub] spark pull request #13969: [SPARK-16284][SQL] Implement reflect SQL function

2016-07-08 Thread dongjoon-hyun
Github user dongjoon-hyun commented on a diff in the pull request:

https://github.com/apache/spark/pull/13969#discussion_r70045387
  
--- Diff: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/Reflect.scala
 ---
@@ -0,0 +1,170 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.expressions
+
+import java.lang.reflect.Method
+
+import scala.util.Try
+
+import org.apache.spark.sql.catalyst.InternalRow
+import org.apache.spark.sql.catalyst.analysis.TypeCheckResult
+import 
org.apache.spark.sql.catalyst.analysis.TypeCheckResult.{TypeCheckFailure, 
TypeCheckSuccess}
+import org.apache.spark.sql.catalyst.expressions.codegen.CodegenFallback
+import org.apache.spark.sql.types._
+import org.apache.spark.unsafe.types.UTF8String
+import org.apache.spark.util.Utils
+
+/**
+ * An expression that invokes a method on a class via reflection.
+ *
+ * @param children the first element should be a literal string for the 
class name,
+ * and the second element should be a literal string for 
the method name,
+ * and the remaining are input arguments to the Java 
method.
+ */
+// scalastyle:off line.size.limit
+@ExpressionDescription(
+  usage = "_FUNC_(class,method[,arg1[,arg2..]]) calls method with 
reflection",
+  extended = "> SELECT _FUNC_('java.util.UUID', 
'randomUUID');\nc33fb387-8500-4bfa-81d2-6e0e3e930df2")
+// scalastyle:on line.size.limit
+case class Reflect(children: Seq[Expression])
+  extends Expression with CodegenFallback {
+
+  override def prettyName: String = "reflect"
+
+  override def checkInputDataTypes(): TypeCheckResult = {
+if (children.size < 2) {
+  TypeCheckFailure("requires at least two arguments")
+} else if (!children.take(2).forall(e => e.dataType == StringType && 
e.foldable)) {
+  // The first two arguments must be string type.
+  TypeCheckFailure("first two arguments should be string literals")
+} else if (!classExists) {
+  TypeCheckFailure(s"class $className not found")
+} else if (method == null) {
+  TypeCheckFailure(s"cannot find a method that matches the argument 
types in $className")
+} else {
+  TypeCheckSuccess
+}
+  }
+
+  override def deterministic: Boolean = false
+  override def nullable: Boolean = true
+  override val dataType: DataType = StringType
+
+  override def eval(input: InternalRow): Any = {
+var i = 0
+while (i < argExprs.length) {
+  buffer(i) = argExprs(i).eval(input).asInstanceOf[Object]
+  // Convert if necessary. Based on the types defined in typeMapping, 
string is the only
+  // type that needs conversion.
+  if (buffer(i).isInstanceOf[UTF8String]) {
+buffer(i) = buffer(i).toString
+  }
+  i += 1
+}
+UTF8String.fromString(String.valueOf(method.invoke(obj, buffer : _*)))
+  }
+
+  @transient private lazy val argExprs: Array[Expression] = 
children.drop(2).toArray
+
+  /** Name of the class -- this has to be called after we verify children 
has at least two exprs. */
+  @transient private lazy val className = 
children(0).eval().asInstanceOf[UTF8String].toString
+
+  /** True if the class exists and can be loaded. */
+  @transient private lazy val classExists = Reflect.classExists(className)
--- End diff --

Currently this is a boolean. Could we turn it into a `val clazz: Class[_]` 
instead?
When the class cannot be loaded, it could be null.



[GitHub] spark pull request #13969: [SPARK-16284][SQL] Implement reflect SQL function

2016-07-08 Thread dongjoon-hyun
Github user dongjoon-hyun commented on a diff in the pull request:

https://github.com/apache/spark/pull/13969#discussion_r70042656
  
--- Diff: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/Reflect.scala
 ---
@@ -0,0 +1,170 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.expressions
+
+import java.lang.reflect.Method
+
+import scala.util.Try
+
+import org.apache.spark.sql.catalyst.InternalRow
+import org.apache.spark.sql.catalyst.analysis.TypeCheckResult
+import 
org.apache.spark.sql.catalyst.analysis.TypeCheckResult.{TypeCheckFailure, 
TypeCheckSuccess}
+import org.apache.spark.sql.catalyst.expressions.codegen.CodegenFallback
+import org.apache.spark.sql.types._
+import org.apache.spark.unsafe.types.UTF8String
+import org.apache.spark.util.Utils
+
+/**
+ * An expression that invokes a method on a class via reflection.
+ *
+ * @param children the first element should be a literal string for the 
class name,
+ * and the second element should be a literal string for 
the method name,
+ * and the remaining are input arguments to the Java 
method.
+ */
+// scalastyle:off line.size.limit
+@ExpressionDescription(
+  usage = "_FUNC_(class,method[,arg1[,arg2..]]) calls method with 
reflection",
+  extended = "> SELECT _FUNC_('java.util.UUID', 
'randomUUID');\nc33fb387-8500-4bfa-81d2-6e0e3e930df2")
+// scalastyle:on line.size.limit
+case class Reflect(children: Seq[Expression])
+  extends Expression with CodegenFallback {
+
+  override def prettyName: String = "reflect"
+
+  override def checkInputDataTypes(): TypeCheckResult = {
+if (children.size < 2) {
+  TypeCheckFailure("requires at least two arguments")
+} else if (!children.take(2).forall(e => e.dataType == StringType && 
e.foldable)) {
+  // The first two arguments must be string type.
+  TypeCheckFailure("first two arguments should be string literals")
+} else if (!classExists) {
+  TypeCheckFailure(s"class $className not found")
+} else if (method == null) {
+  TypeCheckFailure(s"cannot find a method that matches the argument 
types in $className")
+} else {
+  TypeCheckSuccess
+}
+  }
+
+  override def deterministic: Boolean = false
+  override def nullable: Boolean = true
+  override val dataType: DataType = StringType
+
+  override def eval(input: InternalRow): Any = {
+var i = 0
+while (i < argExprs.length) {
--- End diff --

Could you replace this?
```scala
var i = 0
while (i < argExprs.length) {
  ...
  i += 1
}
```
with the following?
```scala
for (i <- argExprs.indices) {
   ...
}
```
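
For what it's worth, a hypothetical Java-flavoured comparison of the two loop shapes discussed above (the PR code is Scala): both fill the same buffer; the explicit-counter form is sometimes preferred in hot paths, while the index-range form avoids manual counter bookkeeping.

```java
public class LoopShapesDemo {
    public static void main(String[] args) {
        String[] argExprs = {"a", "b", "c"};
        Object[] buffer = new Object[argExprs.length];

        // Counter-based while loop, as in the original eval().
        int i = 0;
        while (i < argExprs.length) {
            buffer[i] = argExprs[i];
            i += 1;
        }

        // Equivalent index-range form, as suggested in the review.
        for (int j = 0; j < argExprs.length; j++) {
            buffer[j] = argExprs[j];
        }

        System.out.println(java.util.Arrays.toString(buffer)); // [a, b, c]
    }
}
```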





[GitHub] spark pull request #13969: [SPARK-16284][SQL] Implement reflect SQL function

2016-07-08 Thread dongjoon-hyun
Github user dongjoon-hyun commented on a diff in the pull request:

https://github.com/apache/spark/pull/13969#discussion_r70042360
  
--- Diff: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/Reflect.scala
 ---
@@ -0,0 +1,170 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.expressions
+
+import java.lang.reflect.Method
+
+import scala.util.Try
+
+import org.apache.spark.sql.catalyst.InternalRow
+import org.apache.spark.sql.catalyst.analysis.TypeCheckResult
+import 
org.apache.spark.sql.catalyst.analysis.TypeCheckResult.{TypeCheckFailure, 
TypeCheckSuccess}
+import org.apache.spark.sql.catalyst.expressions.codegen.CodegenFallback
+import org.apache.spark.sql.types._
+import org.apache.spark.unsafe.types.UTF8String
+import org.apache.spark.util.Utils
+
+/**
+ * An expression that invokes a method on a class via reflection.
+ *
+ * @param children the first element should be a literal string for the 
class name,
+ * and the second element should be a literal string for 
the method name,
+ * and the remaining are input arguments to the Java 
method.
+ */
+// scalastyle:off line.size.limit
+@ExpressionDescription(
+  usage = "_FUNC_(class,method[,arg1[,arg2..]]) calls method with 
reflection",
+  extended = "> SELECT _FUNC_('java.util.UUID', 
'randomUUID');\nc33fb387-8500-4bfa-81d2-6e0e3e930df2")
+// scalastyle:on line.size.limit
+case class Reflect(children: Seq[Expression])
+  extends Expression with CodegenFallback {
+
+  override def prettyName: String = "reflect"
+
+  override def checkInputDataTypes(): TypeCheckResult = {
+if (children.size < 2) {
+  TypeCheckFailure("requires at least two arguments")
+} else if (!children.take(2).forall(e => e.dataType == StringType && 
e.foldable)) {
+  // The first two arguments must be string type.
+  TypeCheckFailure("first two arguments should be string literals")
+} else if (!classExists) {
+  TypeCheckFailure(s"class $className not found")
+} else if (method == null) {
+  TypeCheckFailure(s"cannot find a method that matches the argument 
types in $className")
+} else {
+  TypeCheckSuccess
+}
+  }
+
+  override def deterministic: Boolean = false
+  override def nullable: Boolean = true
+  override val dataType: DataType = StringType
+
+  override def eval(input: InternalRow): Any = {
+var i = 0
+while (i < argExprs.length) {
+  buffer(i) = argExprs(i).eval(input).asInstanceOf[Object]
+  // Convert if necessary. Based on the types defined in typeMapping, 
string is the only
+  // type that needs conversion.
+  if (buffer(i).isInstanceOf[UTF8String]) {
+buffer(i) = buffer(i).toString
+  }
+  i += 1
+}
+UTF8String.fromString(String.valueOf(method.invoke(obj, buffer : _*)))
+  }
+
+  @transient private lazy val argExprs: Array[Expression] = 
children.drop(2).toArray
+
+  /** Name of the class -- this has to be called after we verify children 
has at least two exprs. */
+  @transient private lazy val className = 
children(0).eval().asInstanceOf[UTF8String].toString
--- End diff --

Minor: `children(0)` -> `children.head`





[GitHub] spark pull request #13969: [SPARK-16284][SQL] Implement reflect SQL function

2016-07-08 Thread dongjoon-hyun
Github user dongjoon-hyun commented on a diff in the pull request:

https://github.com/apache/spark/pull/13969#discussion_r70042265
  
--- Diff: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/Reflect.scala
 ---
@@ -0,0 +1,170 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.expressions
+
+import java.lang.reflect.Method
+
+import scala.util.Try
+
+import org.apache.spark.sql.catalyst.InternalRow
+import org.apache.spark.sql.catalyst.analysis.TypeCheckResult
+import 
org.apache.spark.sql.catalyst.analysis.TypeCheckResult.{TypeCheckFailure, 
TypeCheckSuccess}
+import org.apache.spark.sql.catalyst.expressions.codegen.CodegenFallback
+import org.apache.spark.sql.types._
+import org.apache.spark.unsafe.types.UTF8String
+import org.apache.spark.util.Utils
+
+/**
+ * An expression that invokes a method on a class via reflection.
+ *
+ * @param children the first element should be a literal string for the 
class name,
+ * and the second element should be a literal string for 
the method name,
+ * and the remaining are input arguments to the Java 
method.
+ */
+// scalastyle:off line.size.limit
+@ExpressionDescription(
+  usage = "_FUNC_(class,method[,arg1[,arg2..]]) calls method with 
reflection",
+  extended = "> SELECT _FUNC_('java.util.UUID', 
'randomUUID');\nc33fb387-8500-4bfa-81d2-6e0e3e930df2")
--- End diff --

Could you add a single space after the newline, e.g. `\n` -> `\n `?
We do that in many cases.





[GitHub] spark pull request #13969: [SPARK-16284][SQL] Implement reflect SQL function

2016-07-07 Thread petermaxlee
Github user petermaxlee commented on a diff in the pull request:

https://github.com/apache/spark/pull/13969#discussion_r69857166
  
--- Diff: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/JavaMethodReflect.scala
 ---
@@ -0,0 +1,153 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.expressions
+
+import java.lang.reflect.Method
+
+import scala.util.Try
+
+import org.apache.spark.sql.catalyst.InternalRow
+import org.apache.spark.sql.catalyst.analysis.TypeCheckResult
+import 
org.apache.spark.sql.catalyst.analysis.TypeCheckResult.{TypeCheckFailure, 
TypeCheckSuccess}
+import org.apache.spark.sql.catalyst.expressions.codegen.CodegenFallback
+import org.apache.spark.sql.types._
+import org.apache.spark.unsafe.types.UTF8String
+import org.apache.spark.util.Utils
+
+/**
+ * An expression that invokes a method on a class via reflection.
+ *
+ * @param children the first element should be a literal string for the class name,
+ *                 and the second element should be a literal string for the method name,
+ *                 and the remaining are input arguments to the Java method.
+ */
+// scalastyle:off line.size.limit
+@ExpressionDescription(
+  usage = "_FUNC_(class,method[,arg1[,arg2..]]) calls method with reflection",
+  extended = "> SELECT _FUNC_('java.util.UUID', 'randomUUID');\nc33fb387-8500-4bfa-81d2-6e0e3e930df2")
+// scalastyle:on line.size.limit
+case class JavaMethodReflect(children: Seq[Expression])
+  extends Expression with CodegenFallback {
+  import JavaMethodReflect._
+
+  override def prettyName: String = "reflect"
+
+  override def checkInputDataTypes(): TypeCheckResult = {
+    if (children.size < 2) {
+      TypeCheckFailure("requires at least two arguments")
+    } else if (!children.take(2).forall(e => e.dataType == StringType && e.foldable)) {
+      // The first two arguments must be string type.
+      TypeCheckFailure("first two arguments should be string literals")
+    } else if (method == null) {
+      TypeCheckFailure("cannot find a method that matches the argument types")
+    } else {
+      TypeCheckSuccess
+    }
+  }
+
+  override def deterministic: Boolean = false
+  override def nullable: Boolean = true
+  override val dataType: DataType = StringType
+
+  override def eval(input: InternalRow): Any = {
+    var i = 0
+    while (i < argExprs.length) {
+      buffer(i) = argExprs(i).eval(input).asInstanceOf[Object]
+      // Convert if necessary. Based on the types defined in typeMapping, string is the only
+      // type that needs conversion.
+      if (buffer(i).isInstanceOf[UTF8String]) {
+        buffer(i) = buffer(i).toString
+      }
+      i += 1
+    }
+    UTF8String.fromString(String.valueOf(method.invoke(obj, buffer : _*)))
+  }
+
+  @transient private lazy val argExprs: Array[Expression] = children.drop(2).toArray
+
+  /** Name of the class -- this has to be called after we verify children has at least two exprs. */
+  @transient private lazy val className = children(0).eval(null).asInstanceOf[UTF8String].toString
+
+  /** The reflection method. */
+  @transient lazy val method: Method = {
+    val methodName = children(1).eval(null).asInstanceOf[UTF8String].toString
+    findMethod(className, methodName, argExprs.map(_.dataType)).orNull
+  }
+
+  /** If the class has a no-arg ctor, instantiate the object. Otherwise, obj is null. */
+  @transient private lazy val obj: Object = instantiate(className).orNull.asInstanceOf[Object]
+
+  /** A temporary buffer used to hold intermediate results returned by children. */
+  @transient private lazy val buffer = new Array[Object](argExprs.length)
+}
+
+object JavaMethodReflect {
+  /** Mapping from Spark's type to acceptable JVM types. */
+  val typeMapping = 

[GitHub] spark pull request #13969: [SPARK-16284][SQL] Implement reflect SQL function

2016-07-07 Thread petermaxlee
Github user petermaxlee commented on a diff in the pull request:

https://github.com/apache/spark/pull/13969#discussion_r69856576
  
--- Diff: sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/expressions/JavaMethodReflectSuite.scala ---
@@ -0,0 +1,101 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.expressions
+
+import org.apache.spark.SparkFunSuite
+import org.apache.spark.sql.types.{IntegerType, StringType}
+
+/**
+ * Test suite for [[JavaMethodReflect]] and its companion object.
+ */
+class JavaMethodReflectSuite extends SparkFunSuite with ExpressionEvalHelper {
+
+  import JavaMethodReflect._
+
+  private val staticClassName = ReflectStaticClass.getClass.getName
+  private val dynamicClassName = classOf[ReflectClass].getName
+
+  test("findMethod via reflection for static methods") {
+    for (className <- Seq(staticClassName, dynamicClassName)) {
+      assert(findMethod(className, "method1", Seq.empty).exists(_.getName == "method1"))
+      assert(findMethod(className, "method2", Seq(IntegerType)).isDefined)
+      assert(findMethod(className, "method3", Seq(IntegerType)).isDefined)
+      assert(findMethod(className, "method4", Seq(IntegerType, StringType)).isDefined)
+    }
+  }
+
+  test("instantiate class via reflection") {
+    // Should succeed since the following two should have no-arg ctor.
+    assert(instantiate(dynamicClassName).isDefined)
+    assert(instantiate(staticClassName).isDefined)
+
+    // Should fail since there is no no-arg ctor.
+    assert(instantiate(classOf[ReflectClass1].getName).isEmpty)
+  }
+
+  test("findMethod for a JDK library") {
+    assert(findMethod(classOf[java.util.UUID].getName, "randomUUID", Seq.empty).isDefined)
+  }
+
+  test("type checking") {
+    assert(JavaMethodReflect(Seq.empty).checkInputDataTypes().isFailure)
+    assert(JavaMethodReflect(Seq(Literal(staticClassName))).checkInputDataTypes().isFailure)
+    assert(
+      JavaMethodReflect(Seq(Literal(staticClassName), Literal(1))).checkInputDataTypes().isFailure)
+
+    assert(reflectExpr(staticClassName, "method1").checkInputDataTypes().isSuccess)
+  }
+
+  test("invoking methods using acceptable types") {
+    for (className <- Seq(staticClassName, dynamicClassName)) {
+      checkEvaluation(reflectExpr(className, "method1"), "m1")
+      checkEvaluation(reflectExpr(className, "method2", 2), "m2")
+      checkEvaluation(reflectExpr(className, "method3", 3), "m3")
+      checkEvaluation(reflectExpr(className, "method4", 4, "four"), "m4four")
+    }
+  }
+
+  private def reflectExpr(className: String, methodName: String, args: Any*): JavaMethodReflect = {
--- End diff --

Why?


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
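The expression under review boils down to the JDK reflection sequence below. This is a minimal plain-Java sketch (not Spark code) of what `reflect('java.util.UUID', 'randomUUID')` resolves to: look up the class and method by their string names, invoke, and stringify the result.

```java
import java.lang.reflect.Method;

public class ReflectSketch {
    public static void main(String[] args) throws Exception {
        // Resolve the class and the zero-argument method by name, as the
        // expression does from its first two string-literal children.
        Class<?> cls = Class.forName("java.util.UUID");
        Method m = cls.getMethod("randomUUID");
        // Static method: the receiver passed to invoke is null, mirroring
        // method.invoke(obj, buffer : _*) with obj == null in the quoted code.
        Object result = m.invoke(null);
        // The expression stringifies the result with String.valueOf.
        System.out.println(String.valueOf(result));
    }
}
```

Each run prints a fresh random UUID, which is why the quoted code marks the expression as non-deterministic.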



[GitHub] spark pull request #13969: [SPARK-16284][SQL] Implement reflect SQL function

2016-07-06 Thread dongjoon-hyun
Github user dongjoon-hyun commented on a diff in the pull request:

https://github.com/apache/spark/pull/13969#discussion_r69706643
  

[GitHub] spark pull request #13969: [SPARK-16284][SQL] Implement reflect SQL function

2016-07-06 Thread dongjoon-hyun
Github user dongjoon-hyun commented on a diff in the pull request:

https://github.com/apache/spark/pull/13969#discussion_r69706388
  

[GitHub] spark pull request #13969: [SPARK-16284][SQL] Implement reflect SQL function

2016-07-06 Thread dongjoon-hyun
Github user dongjoon-hyun commented on a diff in the pull request:

https://github.com/apache/spark/pull/13969#discussion_r69704233
  
--- Diff: 
sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/expressions/JavaMethodReflectSuite.scala
 ---
@@ -0,0 +1,101 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.expressions
+
+import org.apache.spark.SparkFunSuite
+import org.apache.spark.sql.types.{IntegerType, StringType}
+
+/**
+ * Test suite for [[JavaMethodReflect]] and its companion object.
+ */
+class JavaMethodReflectSuite extends SparkFunSuite with 
ExpressionEvalHelper {
+
+  import JavaMethodReflect._
+
+  private val staticClassName = ReflectStaticClass.getClass.getName
+  private val dynamicClassName = classOf[ReflectClass].getName
+
+  test("findMethod via reflection for static methods") {
+for (className <- Seq(staticClassName, dynamicClassName)) {
+  assert(findMethod(className, "method1", Seq.empty).exists(_.getName 
== "method1"))
+  assert(findMethod(className, "method2", Seq(IntegerType)).isDefined)
+  assert(findMethod(className, "method3", Seq(IntegerType)).isDefined)
+  assert(findMethod(className, "method4", Seq(IntegerType, 
StringType)).isDefined)
+}
+  }
+
+  test("instantiate class via reflection") {
+// Should succeed since the following two should have no-arg ctor.
+assert(instantiate(dynamicClassName).isDefined)
+assert(instantiate(staticClassName).isDefined)
+
+// Should fail since there is no no-arg ctor.
+assert(instantiate(classOf[ReflectClass1].getName).isEmpty)
+  }
+
+  test("findMethod for a JDK library") {
+assert(findMethod(classOf[java.util.UUID].getName, "randomUUID", 
Seq.empty).isDefined)
+  }
+
+  test("type checking") {
+assert(JavaMethodReflect(Seq.empty).checkInputDataTypes().isFailure)
+
assert(JavaMethodReflect(Seq(Literal(staticClassName))).checkInputDataTypes().isFailure)
+assert(
+  JavaMethodReflect(Seq(Literal(staticClassName), 
Literal(1))).checkInputDataTypes().isFailure)
+
+assert(reflectExpr(staticClassName, 
"method1").checkInputDataTypes().isSuccess)
+  }
+
+  test("invoking methods using acceptable types") {
+for (className <- Seq(staticClassName, dynamicClassName)) {
+  checkEvaluation(reflectExpr(className, "method1"), "m1")
+  checkEvaluation(reflectExpr(className, "method2", 2), "m2")
+  checkEvaluation(reflectExpr(className, "method3", 3), "m3")
+  checkEvaluation(reflectExpr(className, "method4", 4, "four"), 
"m4four")
+}
+  }
+
+  private def reflectExpr(className: String, methodName: String, args: Any*): JavaMethodReflect = {
--- End diff --

Minor, but could you remove `: JavaMethodReflect` here?





[GitHub] spark pull request #13969: [SPARK-16284][SQL] Implement reflect SQL function

2016-07-06 Thread dongjoon-hyun
Github user dongjoon-hyun commented on a diff in the pull request:

https://github.com/apache/spark/pull/13969#discussion_r69703426
  
--- Diff: 
sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/expressions/JavaMethodReflectSuite.scala
 ---
@@ -0,0 +1,101 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.expressions
+
+import org.apache.spark.SparkFunSuite
+import org.apache.spark.sql.types.{IntegerType, StringType}
+
+/**
+ * Test suite for [[JavaMethodReflect]] and its companion object.
+ */
+class JavaMethodReflectSuite extends SparkFunSuite with 
ExpressionEvalHelper {
+
+  import JavaMethodReflect._
+
+  private val staticClassName = ReflectStaticClass.getClass.getName
+  private val dynamicClassName = classOf[ReflectClass].getName
+
+  test("findMethod via reflection for static methods") {
+for (className <- Seq(staticClassName, dynamicClassName)) {
+  assert(findMethod(className, "method1", Seq.empty).exists(_.getName 
== "method1"))
+  assert(findMethod(className, "method2", Seq(IntegerType)).isDefined)
+  assert(findMethod(className, "method3", Seq(IntegerType)).isDefined)
+  assert(findMethod(className, "method4", Seq(IntegerType, 
StringType)).isDefined)
+}
+  }
+
+  test("instantiate class via reflection") {
+// Should succeed since the following two should have no-arg ctor.
+assert(instantiate(dynamicClassName).isDefined)
+assert(instantiate(staticClassName).isDefined)
+
+// Should fail since there is no no-arg ctor.
+assert(instantiate(classOf[ReflectClass1].getName).isEmpty)
+  }
+
+  test("findMethod for a JDK library") {
+assert(findMethod(classOf[java.util.UUID].getName, "randomUUID", 
Seq.empty).isDefined)
+  }
+
+  test("type checking") {
+assert(JavaMethodReflect(Seq.empty).checkInputDataTypes().isFailure)
+
assert(JavaMethodReflect(Seq(Literal(staticClassName))).checkInputDataTypes().isFailure)
+assert(
+  JavaMethodReflect(Seq(Literal(staticClassName), 
Literal(1))).checkInputDataTypes().isFailure)
+
+assert(reflectExpr(staticClassName, "method1").checkInputDataTypes().isSuccess)
--- End diff --

Could you add one testcase for `dynamicClassName`, too?





[GitHub] spark pull request #13969: [SPARK-16284][SQL] Implement reflect SQL function

2016-07-06 Thread dongjoon-hyun
Github user dongjoon-hyun commented on a diff in the pull request:

https://github.com/apache/spark/pull/13969#discussion_r69702922
  
--- Diff: 
sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/expressions/JavaMethodReflectSuite.scala
 ---
@@ -0,0 +1,101 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.expressions
+
+import org.apache.spark.SparkFunSuite
+import org.apache.spark.sql.types.{IntegerType, StringType}
+
+/**
+ * Test suite for [[JavaMethodReflect]] and its companion object.
+ */
+class JavaMethodReflectSuite extends SparkFunSuite with 
ExpressionEvalHelper {
+
+  import JavaMethodReflect._
+
+  private val staticClassName = ReflectStaticClass.getClass.getName
+  private val dynamicClassName = classOf[ReflectClass].getName
+
+  test("findMethod via reflection for static methods") {
+for (className <- Seq(staticClassName, dynamicClassName)) {
+  assert(findMethod(className, "method1", Seq.empty).exists(_.getName 
== "method1"))
+  assert(findMethod(className, "method2", Seq(IntegerType)).isDefined)
+  assert(findMethod(className, "method3", Seq(IntegerType)).isDefined)
+  assert(findMethod(className, "method4", Seq(IntegerType, 
StringType)).isDefined)
+}
+  }
+
+  test("instantiate class via reflection") {
+// Should succeed since the following two should have no-arg ctor.
+assert(instantiate(dynamicClassName).isDefined)
+assert(instantiate(staticClassName).isDefined)
+
+// Should fail since there is no no-arg ctor.
+assert(instantiate(classOf[ReflectClass1].getName).isEmpty)
--- End diff --

Here again. `Invalid class name test`?


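The `instantiate` failure case discussed above (no no-arg constructor, or no such class) can be sketched in plain Java. This is a hypothetical stand-in, not the actual Spark helper, which is Scala and returns an `Option`:

```java
import java.util.Optional;

public class InstantiateSketch {
    // Try to create an instance through the no-arg constructor;
    // empty when the class or such a constructor does not exist.
    static Optional<Object> instantiate(String className) {
        try {
            return Optional.of(Class.forName(className)
                .getDeclaredConstructor()
                .newInstance());
        } catch (ReflectiveOperationException e) {
            return Optional.empty();
        }
    }

    public static void main(String[] args) {
        System.out.println(instantiate("java.util.ArrayList").isPresent()); // true
        System.out.println(instantiate("java.lang.Integer").isPresent());   // false: no no-arg ctor
        System.out.println(instantiate("no.such.Clazz").isPresent());       // false: class missing
    }
}
```

Catching `ReflectiveOperationException` covers both `ClassNotFoundException` and `NoSuchMethodException`, so a bad class name and a missing constructor degrade to the same empty result, matching the `isEmpty` assertion in the quoted test.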



[GitHub] spark pull request #13969: [SPARK-16284][SQL] Implement reflect SQL function

2016-07-06 Thread dongjoon-hyun
Github user dongjoon-hyun commented on a diff in the pull request:

https://github.com/apache/spark/pull/13969#discussion_r69702708
  
--- Diff: 
sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/expressions/JavaMethodReflectSuite.scala
 ---
@@ -0,0 +1,101 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.expressions
+
+import org.apache.spark.SparkFunSuite
+import org.apache.spark.sql.types.{IntegerType, StringType}
+
+/**
+ * Test suite for [[JavaMethodReflect]] and its companion object.
+ */
+class JavaMethodReflectSuite extends SparkFunSuite with 
ExpressionEvalHelper {
+
+  import JavaMethodReflect._
+
+  private val staticClassName = ReflectStaticClass.getClass.getName
+  private val dynamicClassName = classOf[ReflectClass].getName
+
+  test("findMethod via reflection for static methods") {
+for (className <- Seq(staticClassName, dynamicClassName)) {
+  assert(findMethod(className, "method1", Seq.empty).exists(_.getName 
== "method1"))
+  assert(findMethod(className, "method2", Seq(IntegerType)).isDefined)
+  assert(findMethod(className, "method3", Seq(IntegerType)).isDefined)
+  assert(findMethod(className, "method4", Seq(IntegerType, StringType)).isDefined)
--- End diff --

Could you add a NotFound case?


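The NotFound case requested above is visible at the JDK level: `getMethod` throws when no signature matches the name and parameter types. A plain-Java sketch (hypothetical helper, not the Spark one) of a lookup that returns empty on no match:

```java
import java.lang.reflect.Method;
import java.util.Optional;

public class FindMethodSketch {
    // Look a method up by name and parameter types; empty when nothing
    // matches, which the expression reports as a type-check failure.
    static Optional<Method> findMethod(String className, String methodName, Class<?>... argTypes) {
        try {
            return Optional.of(Class.forName(className).getMethod(methodName, argTypes));
        } catch (ReflectiveOperationException e) {
            return Optional.empty();
        }
    }

    public static void main(String[] args) {
        // Found: java.lang.String.valueOf(int) exists.
        System.out.println(findMethod("java.lang.String", "valueOf", int.class).isPresent());
        // NotFound: no method of that name.
        System.out.println(findMethod("java.lang.String", "noSuchMethod").isPresent());
        // NotFound: right name, wrong argument types.
        System.out.println(findMethod("java.util.UUID", "randomUUID", int.class).isPresent());
    }
}
```

The last two cases print `false`, which is the behavior the requested test would pin down.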



[GitHub] spark pull request #13969: [SPARK-16284][SQL] Implement reflect SQL function

2016-07-06 Thread dongjoon-hyun
Github user dongjoon-hyun commented on a diff in the pull request:

https://github.com/apache/spark/pull/13969#discussion_r69701369
  
--- Diff: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/JavaMethodReflect.scala
 ---
@@ -0,0 +1,153 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.sql.catalyst.expressions
+
+import java.lang.reflect.Method
+
+import scala.util.Try
+
+import org.apache.spark.sql.catalyst.InternalRow
+import org.apache.spark.sql.catalyst.analysis.TypeCheckResult
+import org.apache.spark.sql.catalyst.analysis.TypeCheckResult.{TypeCheckFailure, TypeCheckSuccess}
+import org.apache.spark.sql.catalyst.expressions.codegen.CodegenFallback
+import org.apache.spark.sql.types._
+import org.apache.spark.unsafe.types.UTF8String
+import org.apache.spark.util.Utils
+
+/**
+ * An expression that invokes a method on a class via reflection.
+ *
+ * @param children the first element should be a literal string for the class name,
+ *                 and the second element should be a literal string for the method name,
+ *                 and the remaining are input arguments to the Java method.
+ */
+// scalastyle:off line.size.limit
+@ExpressionDescription(
+  usage = "_FUNC_(class,method[,arg1[,arg2..]]) calls method with reflection",
+  extended = "> SELECT _FUNC_('java.util.UUID', 'randomUUID');\nc33fb387-8500-4bfa-81d2-6e0e3e930df2")
+// scalastyle:on line.size.limit
+case class JavaMethodReflect(children: Seq[Expression])
+  extends Expression with CodegenFallback {
+  import JavaMethodReflect._
+
+  override def prettyName: String = "reflect"
+
+  override def checkInputDataTypes(): TypeCheckResult = {
+    if (children.size < 2) {
+      TypeCheckFailure("requires at least two arguments")
+    } else if (!children.take(2).forall(e => e.dataType == StringType && e.foldable)) {
+      // The first two arguments must be string type.
+      TypeCheckFailure("first two arguments should be string literals")
+    } else if (method == null) {
+      TypeCheckFailure("cannot find a method that matches the argument types")
+    } else {
+      TypeCheckSuccess
+    }
+  }
+
+  override def deterministic: Boolean = false
+  override def nullable: Boolean = true
+  override val dataType: DataType = StringType
+
+  override def eval(input: InternalRow): Any = {
+    var i = 0
+    while (i < argExprs.length) {
+      buffer(i) = argExprs(i).eval(input).asInstanceOf[Object]
+      // Convert if necessary. Based on the types defined in typeMapping, string is the only
+      // type that needs conversion.
+      if (buffer(i).isInstanceOf[UTF8String]) {
+        buffer(i) = buffer(i).toString
+      }
+      i += 1
+    }
+    UTF8String.fromString(String.valueOf(method.invoke(obj, buffer : _*)))
+  }
+
+  @transient private lazy val argExprs: Array[Expression] = children.drop(2).toArray
+
+  /** Name of the class -- this has to be called after we verify children has at least two exprs. */
+  @transient private lazy val className = children(0).eval(null).asInstanceOf[UTF8String].toString
--- End diff --

You can use `eval()` instead of `eval(null)`.


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request #13969: [SPARK-16284][SQL] Implement reflect SQL function

2016-06-30 Thread petermaxlee
Github user petermaxlee commented on a diff in the pull request:

https://github.com/apache/spark/pull/13969#discussion_r69191570
  
--- Diff: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/JavaMethodReflect.scala
 ---
@@ -0,0 +1,153 @@
+  /** The reflection method. */
+  @transient lazy val method: Method = {
+    val methodName = children(1).eval(null).asInstanceOf[UTF8String].toString
+    findMethod(className, methodName, argExprs.map(_.dataType)).orNull
+  }
+
+  /** If the class has a no-arg ctor, instantiate the object. Otherwise, obj is null. */
+  @transient private lazy val obj: Object = instantiate(className).orNull.asInstanceOf[Object]
+
+  /** A temporary buffer used to hold intermediate results returned by children. */
+  @transient private lazy val buffer = new Array[Object](argExprs.length)
+}
+
+object JavaMethodReflect {
+  /** Mapping from Spark's type to acceptable JVM types. */
+  val typeMapping = 

[GitHub] spark pull request #13969: [SPARK-16284][SQL] Implement reflect SQL function

2016-06-30 Thread cloud-fan
Github user cloud-fan commented on a diff in the pull request:

https://github.com/apache/spark/pull/13969#discussion_r69146875
  
--- Diff: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/JavaMethodReflect.scala
 ---
@@ -0,0 +1,153 @@

[GitHub] spark pull request #13969: [SPARK-16284][SQL] Implement reflect SQL function

2016-06-29 Thread petermaxlee
Github user petermaxlee commented on a diff in the pull request:

https://github.com/apache/spark/pull/13969#discussion_r69058099
  
--- Diff: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/JavaMethodReflect.scala
 ---
@@ -0,0 +1,153 @@
+case class JavaMethodReflect(children: Seq[Expression])
--- End diff --

It assumes there is a no-arg constructor and creates an instance of the class automatically. That's what reflect does in Hive.
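To make this concrete, the no-arg-constructor instantiation plus name-based invocation can be sketched in plain Java, outside Spark. This is only an illustrative approximation of the expression's behaviour (argument classes are matched naively here, whereas the real implementation maps Spark SQL types to JVM types via `typeMapping`; the class and method names below are just stand-in examples):

```java
import java.lang.reflect.Method;

public class ReflectSketch {
    // Resolve the class by name, instantiate it through its no-arg constructor if
    // one exists (otherwise use null, which is what Method.invoke expects as the
    // receiver for static methods), then invoke the named method and stringify.
    static String reflect(String className, String methodName, Object... args) throws Exception {
        Class<?> clazz = Class.forName(className);
        Object obj;
        try {
            obj = clazz.getConstructor().newInstance();
        } catch (NoSuchMethodException e) {
            obj = null;  // no no-arg ctor: only static methods will work
        }
        Class<?>[] argTypes = new Class<?>[args.length];
        for (int i = 0; i < args.length; i++) {
            argTypes[i] = args[i].getClass();
        }
        Method method = clazz.getMethod(methodName, argTypes);
        return String.valueOf(method.invoke(obj, args));
    }

    public static void main(String[] args) throws Exception {
        // Non-static: a StringBuilder is instantiated behind the scenes.
        System.out.println(reflect("java.lang.StringBuilder", "append", "hello"));
        // Static: Integer has no no-arg ctor, so the receiver is null.
        System.out.println(reflect("java.lang.Integer", "parseInt", "42"));
    }
}
```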






[GitHub] spark pull request #13969: [SPARK-16284][SQL] Implement reflect SQL function

2016-06-29 Thread cloud-fan
Github user cloud-fan commented on a diff in the pull request:

https://github.com/apache/spark/pull/13969#discussion_r69057148
  
--- Diff: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/JavaMethodReflect.scala
 ---
@@ -0,0 +1,153 @@
+case class JavaMethodReflect(children: Seq[Expression])
--- End diff --

Can this one invoke non-static methods? How do we pass in the object reference?

I'm OK with leaving them separate, as this one is user-facing and `StaticInvoke` is used internally.





[GitHub] spark pull request #13969: [SPARK-16284][SQL] Implement reflect SQL function

2016-06-29 Thread petermaxlee
Github user petermaxlee commented on a diff in the pull request:

https://github.com/apache/spark/pull/13969#discussion_r69007189
  
--- Diff: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/JavaMethodReflect.scala
 ---
@@ -0,0 +1,153 @@
+case class JavaMethodReflect(children: Seq[Expression])
--- End diff --

Thanks for pointing that out. It looks similar, but there are some subtle differences:

1. This one can invoke non-static methods.
2. This one does type conversion, and as a result is more user-facing. `StaticInvoke` seems to be used in internal implementations?
3. This is a SQL function - why was `StaticInvoke` a "nonSQL" function?






[GitHub] spark pull request #13969: [SPARK-16284][SQL] Implement reflect SQL function

2016-06-29 Thread cloud-fan
Github user cloud-fan commented on a diff in the pull request:

https://github.com/apache/spark/pull/13969#discussion_r68913073
  
--- Diff: 
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/JavaMethodReflect.scala
 ---
@@ -0,0 +1,153 @@
+case class JavaMethodReflect(children: Seq[Expression])
--- End diff --

is it similar to `StaticInvoke`?





[GitHub] spark pull request #13969: [SPARK-16284][SQL] Implement reflect SQL function

2016-06-29 Thread petermaxlee
GitHub user petermaxlee opened a pull request:

https://github.com/apache/spark/pull/13969

[SPARK-16284][SQL] Implement reflect SQL function

## What changes were proposed in this pull request?
This patch implements the reflect SQL function, which can be used to invoke a Java method from SQL. Slightly differently from Hive, this implementation requires the class name and the method name to be literals. This implementation also supports a smaller set of data types.
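As an illustrative sketch (not the patch itself), what `SELECT reflect('java.util.UUID', 'randomUUID')` boils down to can be reproduced in plain Java: look the class up by name, fetch the zero-argument static method, invoke it with a null receiver, and stringify the result:

```java
import java.lang.reflect.Method;

public class StaticReflectDemo {
    public static void main(String[] args) throws Exception {
        // Zero-argument static method: the receiver passed to invoke() is null,
        // and the result is stringified, matching the expression's StringType output.
        Method m = Class.forName("java.util.UUID").getMethod("randomUUID");
        String result = String.valueOf(m.invoke(null));
        System.out.println(result);  // e.g. c33fb387-8500-4bfa-81d2-6e0e3e930df2
    }
}
```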

## How was this patch tested?
Added expression unit tests and an end-to-end test.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/petermaxlee/spark reflect

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/13969.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #13969


commit dfe34804bd7855ac970c6cd2f802ee51434a3d7e
Author: petermaxlee 
Date:   2016-06-29T09:16:23Z

[SPARK-16284][SQL] Implement reflect SQL function



