[
https://issues.apache.org/jira/browse/PHOENIX-538?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14512608#comment-14512608
]
ASF GitHub Bot commented on PHOENIX-538:
----------------------------------------
Github user JamesRTaylor commented on a diff in the pull request:
https://github.com/apache/phoenix/pull/77#discussion_r29102300
--- Diff:
phoenix-core/src/main/java/org/apache/phoenix/expression/function/UDFExpression.java
---
@@ -0,0 +1,217 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.phoenix.expression.function;
+
+import static org.apache.phoenix.query.QueryServices.DYNAMIC_JARS_DIR_KEY;
+
+import java.io.DataInput;
+import java.io.DataOutput;
+import java.io.IOException;
+import java.lang.reflect.Constructor;
+import java.lang.reflect.InvocationTargetException;
+import java.util.List;
+import java.util.concurrent.ConcurrentMap;
+import java.util.concurrent.locks.Lock;
+
+import org.apache.commons.logging.Log;
+import org.apache.commons.logging.LogFactory;
+import org.apache.hadoop.conf.Configuration;
+import org.apache.hadoop.fs.Path;
+import org.apache.hadoop.hbase.HBaseConfiguration;
+import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
+import org.apache.hadoop.hbase.util.DynamicClassLoader;
+import org.apache.hadoop.hbase.util.KeyLocker;
+import org.apache.hadoop.io.WritableUtils;
+import org.apache.phoenix.compile.KeyPart;
+import org.apache.phoenix.expression.Expression;
+import org.apache.phoenix.expression.visitor.ExpressionVisitor;
+import org.apache.phoenix.parse.PFunction;
+import org.apache.phoenix.schema.PName;
+import org.apache.phoenix.schema.PNameFactory;
+import org.apache.phoenix.schema.tuple.Tuple;
+import org.apache.phoenix.schema.types.PDataType;
+
+import com.google.common.annotations.VisibleForTesting;
+import com.google.common.collect.MapMaker;
+
+public class UDFExpression extends ScalarFunction {
--- End diff --
I see - forgot about that, and we shouldn't change that. Instead, in the
RowProjector constructor we'll want to set cloneRequired to true if any of the
expressions use a UDF. The easiest way is probably to add a boolean hasUDFs to
RowProjector and a ColumnResolver.hasUDFs() method that returns true if
!functions.isEmpty(). This will still clone when a UDF appears only in the
WHERE clause and not in the SELECT expressions, but that's ok (it won't harm
anything).
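A rough sketch of that suggestion, using simplified stand-ins for Phoenix's RowProjector and ColumnResolver (the real classes have far more members and constructor arguments; the names hasUDFs and isCloneRequired here follow the comment but are illustrative only):

```java
import java.util.List;

// Simplified stand-in for Phoenix's ColumnResolver; the real interface
// has many more methods. hasUDFs() is the suggested addition: true when
// the statement resolved any user-defined functions.
interface ColumnResolver {
    List<String> getFunctions(); // stand-in for the resolved PFunction list

    default boolean hasUDFs() {
        return !getFunctions().isEmpty();
    }
}

// Simplified stand-in for RowProjector: cloneRequired is forced to true
// whenever any UDF is in play, since user code in a UDF may hold on to
// the backing byte arrays of the row being projected.
class RowProjector {
    private final boolean cloneRequired;

    RowProjector(boolean cloneRequested, ColumnResolver resolver) {
        this.cloneRequired = cloneRequested || resolver.hasUDFs();
    }

    boolean isCloneRequired() {
        return cloneRequired;
    }
}
```

Note that this clones whenever the resolver saw any UDF at all, which matches the comment's point: a UDF only in the WHERE clause still triggers cloning, and that is harmless.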
> Support UDFs
> ------------
>
> Key: PHOENIX-538
> URL: https://issues.apache.org/jira/browse/PHOENIX-538
> Project: Phoenix
> Issue Type: Task
> Reporter: James Taylor
> Assignee: Rajeshbabu Chintaguntla
> Fix For: 5.0.0, 4.4.0
>
> Attachments: PHOENIX-538-wip.patch, PHOENIX-538_v1.patch,
> PHOENIX-538_v2.patch, PHOENIX-538_v3.patch, PHOENIX-538_v4.patch,
> PHOENIX-538_v5.patch, PHOENIX-538_v6.patch, PHOENIX-538_v6.patch
>
>
> Phoenix allows built-in functions to be added (as described
> [here](http://phoenix-hbase.blogspot.com/2013/04/how-to-add-your-own-built-in-function.html))
> with the restriction that they must be in the phoenix jar. We should improve
> on this and allow folks to declare new functions through a CREATE FUNCTION
> command like this:
> CREATE FUNCTION mdHash(anytype)
> RETURNS binary(16)
> LOCATION 'hdfs://path-to-my-jar' 'com.me.MDHashFunction'
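Resolving the LOCATION jar at runtime boils down to loading a class by name from a jar URL. A minimal stand-alone sketch with java.net.URLClassLoader is below; the UdfLoader name is hypothetical, and HBase's DynamicClassLoader (imported in the diff above) layers caching and HDFS jar fetching on top of essentially this:

```java
import java.io.File;
import java.net.MalformedURLException;
import java.net.URL;
import java.net.URLClassLoader;

// Hypothetical helper: load a function implementation class by name from
// the jar given in a CREATE FUNCTION ... LOCATION clause.
public class UdfLoader {
    public static Class<?> loadUdfClass(String jarPath, String className) {
        try {
            URL jarUrl = new File(jarPath).toURI().toURL();
            // Standard parent-delegating loader; classes already visible
            // to the application class loader resolve without the jar.
            URLClassLoader loader = new URLClassLoader(
                    new URL[] { jarUrl }, UdfLoader.class.getClassLoader());
            return Class.forName(className, true, loader);
        } catch (MalformedURLException | ClassNotFoundException e) {
            throw new IllegalArgumentException(
                    "cannot load UDF class " + className + " from " + jarPath, e);
        }
    }
}
```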
> Since HBase supports loading jars dynamically, this would not be too
> difficult. The function implementation class would be required to extend our
> ScalarFunction base class. Here's how I could see it being implemented:
> * modify the phoenix grammar to support the new CREATE FUNCTION syntax
> create a new UDFParseNode class to capture the parse state
> * add a new method to the MetaDataProtocol interface
> * add a new method in ConnectionQueryServices to invoke the MetaDataProtocol
> method
> * add a new method in MetaDataClient to invoke the ConnectionQueryServices
> method
> * persist functions in a new "SYSTEM.FUNCTION" table
> * add a new client-side representation to cache functions called PFunction
> * modify ColumnResolver to dynamically resolve a function in the same way we
> dynamically resolve and load a table
> * create and register a new ExpressionType called UDFExpression
> at parse time, check for the function name in the built-in list first (as
> is currently done), and if not found there, in the PFunction cache. If not
> found there either, then use the new UDFExpression as a placeholder and have
> the ColumnResolver attempt to resolve it at compile time, throwing an error
> if unsuccessful.
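For illustration, the core of the mdHash example above could wrap standard MD5, which yields exactly the 16 bytes declared as binary(16). The MDHashCore class name is hypothetical, and the Phoenix wiring (extending ScalarFunction, evaluating the child Expression into an ImmutableBytesWritable) is deliberately omitted:

```java
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

// Hypothetical core of the MDHashFunction example from the description:
// MD5 produces a 16-byte digest, matching RETURNS binary(16). The real
// UDF would extend Phoenix's ScalarFunction and read its input bytes
// from a child Expression.
public class MDHashCore {
    public static byte[] mdHash(byte[] input) {
        try {
            return MessageDigest.getInstance("MD5").digest(input);
        } catch (NoSuchAlgorithmException e) {
            // MD5 is a required algorithm in every Java implementation.
            throw new IllegalStateException("MD5 not available", e);
        }
    }
}
```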
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)