Till Westmann has posted comments on this change.

Change subject: ASTERIXDB-1573 Allow Extension of Rewrite Rules
......................................................................


Patch Set 2:

(7 comments)

Looks good to me, but I've got a few proposals for further improvements :)

https://asterix-gerrit.ics.uci.edu/#/c/1304/2/asterixdb/asterix-algebra/src/main/java/org/apache/asterix/compiler/provider/DefaultRuleSetFactory.java
File asterixdb/asterix-algebra/src/main/java/org/apache/asterix/compiler/provider/DefaultRuleSetFactory.java:

Line 43:     public static List<Pair<AbstractRuleController, List<IAlgebraicRewriteRule>>> buildLogical() {
Could we inline this method?
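E.g. (just a sketch, assuming the static method has a single caller in this factory -- the local name is made up):

    @Override
    public List<Pair<AbstractRuleController, List<IAlgebraicRewriteRule>>> buildLogicalRewrites() {
        // body of the former static buildLogical() moves here
        List<Pair<AbstractRuleController, List<IAlgebraicRewriteRule>>> logicalRewrites = new ArrayList<>();
        // ... add the same controller/rule pairs that buildLogical() adds today ...
        return logicalRewrites;
    }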


Line 72:     public static List<Pair<AbstractRuleController, List<IAlgebraicRewriteRule>>> buildPhysical() {
Could we inline this method?


https://asterix-gerrit.ics.uci.edu/#/c/1304/2/asterixdb/asterix-algebra/src/main/java/org/apache/asterix/compiler/provider/IRuleSetFactory.java
File asterixdb/asterix-algebra/src/main/java/org/apache/asterix/compiler/provider/IRuleSetFactory.java:

Line 32:      * @throws AlgebricksException
I think that we don't need an exception here.
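I.e., the declaration could just be (sketch, keeping the current name):

    public List<Pair<AbstractRuleController, List<IAlgebraicRewriteRule>>> buildLogicalRewrites();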


Line 34:     public List<Pair<AbstractRuleController, List<IAlgebraicRewriteRule>>> buildLogicalRewrites()
s/buildLogicalRewrites/getLogicalRewrites/ ?


Line 40:     public List<Pair<AbstractRuleController, List<IAlgebraicRewriteRule>>> buildPhysicalRewrites();
s/buildPhysicalRewrites/getPhysicalRewrites/ ?


https://asterix-gerrit.ics.uci.edu/#/c/1304/2/asterixdb/asterix-algebra/src/main/java/org/apache/asterix/optimizer/rules/UnnestToDataScanRule.java
File asterixdb/asterix-algebra/src/main/java/org/apache/asterix/optimizer/rules/UnnestToDataScanRule.java:

Line 78:     public boolean rewritePost(Mutable<ILogicalOperator> opRef, IOptimizationContext context)
Could we make this easier to extend by refactoring it like this:


    @Override
    public boolean rewritePost(Mutable<ILogicalOperator> opRef, IOptimizationContext context)
            throws AlgebricksException {
        AbstractLogicalOperator op = (AbstractLogicalOperator) opRef.getValue();
        if (op.getOperatorTag() != LogicalOperatorTag.UNNEST) {
            return false;
        }
        UnnestOperator unnest = (UnnestOperator) op;
        ILogicalExpression unnestExpr = unnest.getExpressionRef().getValue();
        if (unnestExpr.getExpressionTag() != LogicalExpressionTag.FUNCTION_CALL) {
            return false;
        }
        return handleFunction(opRef, context, unnest, (AbstractFunctionCallExpression) unnestExpr);
    }

    protected boolean handleFunction(Mutable<ILogicalOperator> opRef, IOptimizationContext context,
            UnnestOperator unnest, AbstractFunctionCallExpression f) throws AlgebricksException {
        FunctionIdentifier fid = f.getFunctionIdentifier();

        if (fid.equals(AsterixBuiltinFunctions.DATASET)) {
            if (unnest.getPositionalVariable() != null) {
                // TODO remove this after enabling the support of positional variables in data scan
                throw new AlgebricksException("No positional variables are allowed over datasets.");
            }
            ILogicalExpression expr = f.getArguments().get(0).getValue();
            if (expr.getExpressionTag() != LogicalExpressionTag.CONSTANT) {
                return false;
            }
            ConstantExpression ce = (ConstantExpression) expr;
            IAlgebricksConstantValue acv = ce.getValue();
            if (!(acv instanceof AsterixConstantValue)) {
                return false;
            }
            AsterixConstantValue acv2 = (AsterixConstantValue) acv;
            if (acv2.getObject().getType().getTypeTag() != ATypeTag.STRING) {
                return false;
            }
            String datasetArg = ((AString) acv2.getObject()).getStringValue();

            AqlMetadataProvider metadataProvider = (AqlMetadataProvider) context.getMetadataProvider();
            Pair<String, String> datasetReference = parseDatasetReference(metadataProvider, datasetArg);
            String dataverseName = datasetReference.first;
            String datasetName = datasetReference.second;
            Dataset dataset = metadataProvider.findDataset(dataverseName, datasetName);
            if (dataset == null) {
                throw new AlgebricksException(
                        "Could not find dataset " + datasetName + " in dataverse " + dataverseName);
            }

            AqlSourceId asid = new AqlSourceId(dataverseName, datasetName);
            List<LogicalVariable> variables = new ArrayList<>();

            if (dataset.getDatasetType() == DatasetType.INTERNAL) {
                int numPrimaryKeys = DatasetUtils.getPartitioningKeys(dataset).size();
                for (int i = 0; i < numPrimaryKeys; i++) {
                    variables.add(context.newVar());
                }
            }
            variables.add(unnest.getVariable());
            AqlDataSource dataSource = metadataProvider.findDataSource(asid);
            boolean hasMeta = dataSource.hasMeta();
            if (hasMeta) {
                variables.add(context.newVar());
            }
            DataSourceScanOperator scan = new DataSourceScanOperator(variables, dataSource);
            List<Mutable<ILogicalOperator>> scanInpList = scan.getInputs();
            scanInpList.addAll(unnest.getInputs());
            opRef.setValue(scan);
            addPrimaryKey(variables, dataSource, context);
            context.computeAndSetTypeEnvironmentForOperator(scan);

            // Adds equivalence classes --- one equivalence class between a primary key
            // variable and a record field-access expression.
            IAType[] schemaTypes = dataSource.getSchemaTypes();
            ARecordType recordType = (ARecordType) (hasMeta ? schemaTypes[schemaTypes.length - 2]
                    : schemaTypes[schemaTypes.length - 1]);
            ARecordType metaRecordType = (ARecordType) (hasMeta ? schemaTypes[schemaTypes.length - 1] : null);
            EquivalenceClassUtils.addEquivalenceClassesForPrimaryIndexAccess(scan, variables, recordType,
                    metaRecordType, dataset, context);
            return true;
        } else if (fid.equals(AsterixBuiltinFunctions.FEED_COLLECT)) {
            if (unnest.getPositionalVariable() != null) {
                throw new AlgebricksException("No positional variables are allowed over feeds.");
            }

            String dataverse = ConstantExpressionUtil.getStringArgument(f, 0);
            String sourceFeedName = ConstantExpressionUtil.getStringArgument(f, 1);
            String getTargetFeed = ConstantExpressionUtil.getStringArgument(f, 2);
            String subscriptionLocation = ConstantExpressionUtil.getStringArgument(f, 3);
            String targetDataset = ConstantExpressionUtil.getStringArgument(f, 4);
            String outputType = ConstantExpressionUtil.getStringArgument(f, 5);

            AqlMetadataProvider metadataProvider = (AqlMetadataProvider) context.getMetadataProvider();

            AqlSourceId asid = new AqlSourceId(dataverse, getTargetFeed);
            String policyName = metadataProvider.getConfig().get(FeedActivityDetails.FEED_POLICY_NAME);
            FeedPolicyEntity policy = metadataProvider.findFeedPolicy(dataverse, policyName);
            if (policy == null) {
                policy = BuiltinFeedPolicies.getFeedPolicy(policyName);
                if (policy == null) {
                    throw new AlgebricksException("Unknown feed policy:" + policyName);
                }
            }

            ArrayList<LogicalVariable> feedDataScanOutputVariables = new ArrayList<>();

            String csLocations = metadataProvider.getConfig().get(FeedActivityDetails.COLLECT_LOCATIONS);
            List<LogicalVariable> pkVars = new ArrayList<>();
            FeedDataSource ds = createFeedDataSource(asid, targetDataset, sourceFeedName, subscriptionLocation,
                    metadataProvider, policy, outputType, csLocations, unnest.getVariable(), context, pkVars);
            // The order for feeds is <Record-Meta-PK>
            feedDataScanOutputVariables.add(unnest.getVariable());
            // Does it produce meta?
            if (ds.hasMeta()) {
                feedDataScanOutputVariables.add(context.newVar());
            }
            // Does it produce pk?
            if (ds.isChange()) {
                feedDataScanOutputVariables.addAll(pkVars);
            }

            DataSourceScanOperator scan = new DataSourceScanOperator(feedDataScanOutputVariables, ds);
            List<Mutable<ILogicalOperator>> scanInpList = scan.getInputs();
            scanInpList.addAll(unnest.getInputs());
            opRef.setValue(scan);
            context.computeAndSetTypeEnvironmentForOperator(scan);
            return true;
        }
        return false;
    }


https://asterix-gerrit.ics.uci.edu/#/c/1304/2/asterixdb/asterix-app/src/main/java/org/apache/asterix/app/translator/DefaultStatementExecutorFactory.java
File asterixdb/asterix-app/src/main/java/org/apache/asterix/app/translator/DefaultStatementExecutorFactory.java:

Line 31:     public QueryTranslator create(List<Statement> aqlStatements, SessionConfig conf,
s/aqlStatements/statements/ ?


-- 
To view, visit https://asterix-gerrit.ics.uci.edu/1304
To unsubscribe, visit https://asterix-gerrit.ics.uci.edu/settings

Gerrit-MessageType: comment
Gerrit-Change-Id: I7f59dea86b0ef4ee9d31b56766a97bd2b34ec02c
Gerrit-PatchSet: 2
Gerrit-Project: asterixdb
Gerrit-Branch: master
Gerrit-Owner: Steven Jacobs <[email protected]>
Gerrit-Reviewer: Jenkins <[email protected]>
Gerrit-Reviewer: Till Westmann <[email protected]>
Gerrit-Reviewer: Yingyi Bu <[email protected]>
Gerrit-Reviewer: abdullah alamoudi <[email protected]>
Gerrit-HasComments: Yes
