fixed bug with collection-with-tag, added a test for collection-with-tag, moved the logic for collection-with-tag function arguments into the IntroduceCollectionRule class, and removed CollectionWithTagRule
Project: http://git-wip-us.apache.org/repos/asf/vxquery/repo Commit: http://git-wip-us.apache.org/repos/asf/vxquery/commit/94801fb0 Tree: http://git-wip-us.apache.org/repos/asf/vxquery/tree/94801fb0 Diff: http://git-wip-us.apache.org/repos/asf/vxquery/diff/94801fb0 Branch: refs/heads/steven/hdfs Commit: 94801fb08d508f3d72a12a083645a06da4e31ddf Parents: c1f1ec4 Author: efikalti <[email protected]> Authored: Sat Oct 17 13:50:53 2015 +0300 Committer: efikalti <[email protected]> Committed: Sat Oct 17 13:50:53 2015 +0300 ---------------------------------------------------------------------- src/site/apt/user_query_hdfs.apt | 167 +++++++++++++++++++ src/site/site.xml | 3 + .../compiler/rewriter/RewriteRuleset.java | 2 - .../rewriter/rules/AbstractCollectionRule.java | 9 +- .../rewriter/rules/CollectionWithTagRule.java | 65 -------- .../rewriter/rules/IntroduceCollectionRule.java | 6 + .../org/apache/vxquery/hdfs2/HDFSFunctions.java | 5 +- .../VXQueryCollectionOperatorDescriptor.java | 6 +- .../org/apache/vxquery/xmlparser/XMLParser.java | 22 +-- .../HDFS/Aggregate/maxvalueHDFS.txt | 1 + .../XQuery/HDFS/Aggregate/maxvalueHDFS.xq | 23 +++ .../Queries/XQuery/HDFS/Aggregate/sumHDFS.xq | 2 +- .../test/resources/cat/HDFSAggregateQueries.xml | 5 + 13 files changed, 228 insertions(+), 88 deletions(-) ---------------------------------------------------------------------- http://git-wip-us.apache.org/repos/asf/vxquery/blob/94801fb0/src/site/apt/user_query_hdfs.apt ---------------------------------------------------------------------- diff --git a/src/site/apt/user_query_hdfs.apt b/src/site/apt/user_query_hdfs.apt new file mode 100644 index 0000000..662d5df --- /dev/null +++ b/src/site/apt/user_query_hdfs.apt @@ -0,0 +1,167 @@ +~~ Licensed to the Apache Software Foundation (ASF) under one or more +~~ contributor license agreements. See the NOTICE file distributed with +~~ this work for additional information regarding copyright ownership. 
+~~ The ASF licenses this file to You under the Apache License, Version 2.0 +~~ (the "License"); you may not use this file except in compliance with +~~ the License. You may obtain a copy of the License at +~~ +~~ http://www.apache.org/licenses/LICENSE-2.0 +~~ +~~ Unless required by applicable law or agreed to in writing, software +~~ distributed under the License is distributed on an "AS IS" BASIS, +~~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +~~ See the License for the specific language governing permissions and +~~ limitations under the License. + +Executing a Query in HDFS + + +* 1. Connecting VXQuery with HDFS + + The only configuration you need to define is the IP address of the node(s) on which + you want to run the queries. + + This information should be defined in the <local.xml> or <cluster.xml> file at + <vxquery-server/src/main/resources/conf/>. + + +* 2. Running the Query + + + Files stored in HDFS can be accessed from VXQuery in two ways: + + + [[a]] Reading them as whole files. + + + [[b]] Reading them block by block. + + +** a. Reading them as whole files. + + For this option you only need to change the path to the files. To define that your + file(s) exist in and should be read from HDFS, you must add <"hdfs://"> in front + of the path. VXQuery will read the path of the files you request in your query + and try to locate them. + + + So in order to run a query that will read the input files from HDFS, you need + to make sure that + + + a) The configuration path is correctly set to that of your HDFS system. + + + b) The path defined in your query begins with <hdfs://>, followed by the full path to + the file(s). + + + c) The path exists in HDFS and the user that runs the query has read permission + for these files. + + +*** Example + + I want to find all the <books> that are published after 2004. 
+ + + The file is located in HDFS at this path: </user/hduser/store/books.xml> + + + My query will look like this: + +---------- +for $x in collection("hdfs://user/hduser/store") +where $x/year>2004 +return $x/title +---------- + + + If I want only one file, <<books.xml>>, to be parsed from HDFS, my query will + look like this: + + +---------- +for $x in doc("hdfs://user/hduser/store/books.xml") +where $x/year>2004 +return $x/title +---------- + + +** b. Reading them block by block + + + In order to use this option you need to modify your query. Instead of using the + <collection> or <doc> function to define your input file(s), you need to use + <collection-with-tag>. + + + <collection-with-tag> accepts two arguments: the first is the path to the HDFS directory + where you have stored your input files, and the second is a specific <<tag>> that exists + in the input file(s). This is the tag of the element that contains the fields that + your query is looking for. + + Other than these arguments, you do not need to change anything else in the query. + + +*** Example + + The same example, using <<collection-with-tag>>. + + My input file <books.xml>: + +----------------------------- +<?xml version="1.0" encoding="UTF-8"?> +<bookstore> + +<book> + <title lang="en">Everyday Italian</title> + <author>Giada De Laurentiis</author> + <year>2005</year> + <price>30.00</price> +</book> + +<book> + <title lang="en">Harry Potter</title> + <author>J K. Rowling</author> + <year>2005</year> + <price>29.99</price> +</book> + +<book> + <title lang="en">XQuery Kick Start</title> + <author>James McGovern</author> + <author>Per Bothner</author> + <author>Kurt Cagle</author> + <author>James Linn</author> + <author>Vaidyanathan Nagarajan</author> + <year>2003</year> + <price>49.99</price> +</book> + +<book> + <title lang="en">Learning XML</title> + <author>Erik T. 
Ray</author> + <year>2003</year> + <price>39.95</price> +</book> + +</bookstore> +---------------------------- + + + My query will look like this: + + +---------------------------- +for $x in collectionwithtag("hdfs://user/hduser/store","book")/book +where $x/year>2004 +return $x/title +---------------------------- + + + Take notice that I defined the path to the directory containing the file(s), + and not the file itself; <collection-with-tag> expects the path to a directory. I also + added </book> after the function call. This is also needed, as with the <collection> and + <doc> functions, for the query to be parsed correctly. http://git-wip-us.apache.org/repos/asf/vxquery/blob/94801fb0/src/site/site.xml ---------------------------------------------------------------------- diff --git a/src/site/site.xml b/src/site/site.xml index b9612c1..6cfae24 100644 --- a/src/site/site.xml +++ b/src/site/site.xml @@ -64,6 +64,9 @@ limitations under the License. name="Executing a Query" href="user_query.html" /> <item + name="Using HDFS with VXQuery" + href="user_query_hdfs.html" /> + <item name="Running the Test Suite" href="user_running_tests.html" /> </menu> http://git-wip-us.apache.org/repos/asf/vxquery/blob/94801fb0/vxquery-core/src/main/java/org/apache/vxquery/compiler/rewriter/RewriteRuleset.java ---------------------------------------------------------------------- diff --git a/vxquery-core/src/main/java/org/apache/vxquery/compiler/rewriter/RewriteRuleset.java b/vxquery-core/src/main/java/org/apache/vxquery/compiler/rewriter/RewriteRuleset.java index 1940651..b51dc45 100644 --- a/vxquery-core/src/main/java/org/apache/vxquery/compiler/rewriter/RewriteRuleset.java +++ b/vxquery-core/src/main/java/org/apache/vxquery/compiler/rewriter/RewriteRuleset.java @@ -19,7 +19,6 @@ package org.apache.vxquery.compiler.rewriter; import java.util.LinkedList; import java.util.List; -import org.apache.vxquery.compiler.rewriter.rules.CollectionWithTagRule; import 
org.apache.vxquery.compiler.rewriter.rules.ConsolidateAssignAggregateRule; import org.apache.vxquery.compiler.rewriter.rules.ConsolidateDescandantChild; import org.apache.vxquery.compiler.rewriter.rules.ConvertAssignToUnnestRule; @@ -129,7 +128,6 @@ public class RewriteRuleset { normalization.add(new SetCollectionDataSourceRule()); normalization.add(new IntroduceCollectionRule()); normalization.add(new RemoveUnusedAssignAndAggregateRule()); - normalization.add(new CollectionWithTagRule()); normalization.add(new ConsolidateDescandantChild()); http://git-wip-us.apache.org/repos/asf/vxquery/blob/94801fb0/vxquery-core/src/main/java/org/apache/vxquery/compiler/rewriter/rules/AbstractCollectionRule.java ---------------------------------------------------------------------- diff --git a/vxquery-core/src/main/java/org/apache/vxquery/compiler/rewriter/rules/AbstractCollectionRule.java b/vxquery-core/src/main/java/org/apache/vxquery/compiler/rewriter/rules/AbstractCollectionRule.java index 72fc678..ed3d5ac 100644 --- a/vxquery-core/src/main/java/org/apache/vxquery/compiler/rewriter/rules/AbstractCollectionRule.java +++ b/vxquery-core/src/main/java/org/apache/vxquery/compiler/rewriter/rules/AbstractCollectionRule.java @@ -100,6 +100,9 @@ public abstract class AbstractCollectionRule implements IAlgebraicRewriteRule { ILogicalExpression logicalExpression2 = (ILogicalExpression) functionCall.getArguments().get(pos).getValue(); + // Guard against a null expression before it is dereferenced below. + if (logicalExpression2 == null) { + return null; + } if (logicalExpression2.getExpressionTag() != LogicalExpressionTag.VARIABLE) { return null; } VariableReferenceExpression vre = (VariableReferenceExpression) logicalExpression2; Mutable<ILogicalOperator> opRef3 = OperatorToolbox.findProducerOf(opRef, vre.getVariableReference()); @@ -122,18 +125,17 @@ public abstract class AbstractCollectionRule implements IAlgebraicRewriteRule { } else { return null; } - String args[] = new String[2]; // Constant value is now in a TaggedValuePointable. 
Convert the value into a java String. tvp.set(constantValue.getValue(), 0, constantValue.getValue().length); - String arg = null; + String collectionName = null; if (tvp.getTag() == ValueTag.XS_STRING_TAG) { tvp.getValue(stringp); try { bbis.setByteBuffer( ByteBuffer.wrap(Arrays.copyOfRange(stringp.getByteArray(), stringp.getStartOffset(), stringp.getLength() + stringp.getStartOffset())), 0); - arg = di.readUTF(); - return arg; + collectionName = di.readUTF(); + return collectionName; } catch (IOException e) { e.printStackTrace(); } http://git-wip-us.apache.org/repos/asf/vxquery/blob/94801fb0/vxquery-core/src/main/java/org/apache/vxquery/compiler/rewriter/rules/CollectionWithTagRule.java ---------------------------------------------------------------------- diff --git a/vxquery-core/src/main/java/org/apache/vxquery/compiler/rewriter/rules/CollectionWithTagRule.java b/vxquery-core/src/main/java/org/apache/vxquery/compiler/rewriter/rules/CollectionWithTagRule.java deleted file mode 100644 index c56bed5..0000000 --- a/vxquery-core/src/main/java/org/apache/vxquery/compiler/rewriter/rules/CollectionWithTagRule.java +++ /dev/null @@ -1,65 +0,0 @@ -/* - * Licensed to the Apache Software Foundation (ASF) under one or more - * contributor license agreements. See the NOTICE file distributed with - * this work for additional information regarding copyright ownership. - * The ASF licenses this file to You under the Apache License, Version 2.0 - * (the "License"); you may not use this file except in compliance with - * the License. You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. 
- */ -package org.apache.vxquery.compiler.rewriter.rules; - -import org.apache.commons.lang3.mutable.Mutable; -import org.apache.vxquery.compiler.rewriter.VXQueryOptimizationContext; -import org.apache.vxquery.metadata.VXQueryCollectionDataSource; -import org.apache.vxquery.types.AnyItemType; -import org.apache.vxquery.types.Quantifier; -import org.apache.vxquery.types.SequenceType; - -import edu.uci.ics.hyracks.algebricks.common.exceptions.AlgebricksException; -import edu.uci.ics.hyracks.algebricks.core.algebra.base.ILogicalOperator; -import edu.uci.ics.hyracks.algebricks.core.algebra.base.IOptimizationContext; -import edu.uci.ics.hyracks.algebricks.core.algebra.base.LogicalVariable; -import edu.uci.ics.hyracks.algebricks.core.algebra.operators.logical.AbstractLogicalOperator; -import edu.uci.ics.hyracks.algebricks.core.algebra.operators.logical.AssignOperator; -import edu.uci.ics.hyracks.algebricks.core.algebra.operators.logical.DataSourceScanOperator; -import edu.uci.ics.hyracks.algebricks.core.algebra.operators.logical.UnnestOperator; - -public class CollectionWithTagRule extends AbstractCollectionRule { - - @Override - public boolean rewritePre(Mutable<ILogicalOperator> opRef, IOptimizationContext context) throws AlgebricksException { - VXQueryOptimizationContext vxqueryContext = (VXQueryOptimizationContext) context; - String args[] = getCollectionName(opRef); - - if (args != null) { - // Build the new operator and update the query plan. - int collectionId = vxqueryContext.newCollectionId(); - VXQueryCollectionDataSource ds = VXQueryCollectionDataSource.create(collectionId, args[0], - SequenceType.create(AnyItemType.INSTANCE, Quantifier.QUANT_STAR)); - if (ds != null) { - ds.setTotalDataSources(vxqueryContext.getTotalDataSources()); - ds.setTag(args[1]); - // Known to be true because of collection name. 
- AbstractLogicalOperator op = (AbstractLogicalOperator) opRef.getValue(); - UnnestOperator unnest = (UnnestOperator) op; - Mutable<ILogicalOperator> opRef2 = unnest.getInputs().get(0); - AbstractLogicalOperator op2 = (AbstractLogicalOperator) opRef2.getValue(); - AssignOperator assign = (AssignOperator) op2; - - DataSourceScanOperator opNew = new DataSourceScanOperator(assign.getVariables(), ds); - opNew.getInputs().addAll(assign.getInputs()); - opRef2.setValue(opNew); - return true; - } - } - return false; - } -} http://git-wip-us.apache.org/repos/asf/vxquery/blob/94801fb0/vxquery-core/src/main/java/org/apache/vxquery/compiler/rewriter/rules/IntroduceCollectionRule.java ---------------------------------------------------------------------- diff --git a/vxquery-core/src/main/java/org/apache/vxquery/compiler/rewriter/rules/IntroduceCollectionRule.java b/vxquery-core/src/main/java/org/apache/vxquery/compiler/rewriter/rules/IntroduceCollectionRule.java index a8ad94a..e1e0442 100644 --- a/vxquery-core/src/main/java/org/apache/vxquery/compiler/rewriter/rules/IntroduceCollectionRule.java +++ b/vxquery-core/src/main/java/org/apache/vxquery/compiler/rewriter/rules/IntroduceCollectionRule.java @@ -75,6 +75,11 @@ public class IntroduceCollectionRule extends AbstractCollectionRule { if (ds != null) { ds.setTotalDataSources(vxqueryContext.getTotalDataSources()); + // Check if the call is for collection-with-tag + if (args.length == 2) { + ds.setTag(args[1]); + } + // Known to be true because of collection name. 
AbstractLogicalOperator op = (AbstractLogicalOperator) opRef.getValue(); UnnestOperator unnest = (UnnestOperator) op; http://git-wip-us.apache.org/repos/asf/vxquery/blob/94801fb0/vxquery-core/src/main/java/org/apache/vxquery/hdfs2/HDFSFunctions.java ---------------------------------------------------------------------- diff --git a/vxquery-core/src/main/java/org/apache/vxquery/hdfs2/HDFSFunctions.java b/vxquery-core/src/main/java/org/apache/vxquery/hdfs2/HDFSFunctions.java index 0680eb3..b6655ed 100644 --- a/vxquery-core/src/main/java/org/apache/vxquery/hdfs2/HDFSFunctions.java +++ b/vxquery-core/src/main/java/org/apache/vxquery/hdfs2/HDFSFunctions.java @@ -175,10 +175,12 @@ public class HDFSFunctions { // load properties file Properties prop = new Properties(); String propFilePath = "../vxquery-server/src/main/resources/conf/cluster.properties"; + nodeXMLfile = new File("../vxquery-server/src/main/resources/conf/local.xml"); try { prop.load(new FileInputStream(propFilePath)); } catch (FileNotFoundException e) { propFilePath = "vxquery-server/src/main/resources/conf/cluster.properties"; + nodeXMLfile = new File("vxquery-server/src/main/resources/conf/local.xml"); try { prop.load(new FileInputStream(propFilePath)); } catch (FileNotFoundException e1) { @@ -293,7 +295,6 @@ public class HDFSFunctions { } public void scheduleSplits() throws IOException, ParserConfigurationException, SAXException { - schedule = new HashMap<Integer, String>(); ArrayList<String> empty = new ArrayList<String>(); HashMap<String, ArrayList<Integer>> splits_map = this.getLocationsOfSplits(); @@ -303,7 +304,7 @@ public class HDFSFunctions { ArrayList<Integer> splits; String node; for (ArrayList<String> info : this.nodes) { - node = info.get(0); + node = info.get(1); if (splits_map.containsKey(node)) { splits = splits_map.get(node); for (Integer split : splits) { 
http://git-wip-us.apache.org/repos/asf/vxquery/blob/94801fb0/vxquery-core/src/main/java/org/apache/vxquery/metadata/VXQueryCollectionOperatorDescriptor.java ---------------------------------------------------------------------- diff --git a/vxquery-core/src/main/java/org/apache/vxquery/metadata/VXQueryCollectionOperatorDescriptor.java b/vxquery-core/src/main/java/org/apache/vxquery/metadata/VXQueryCollectionOperatorDescriptor.java index 0395788..3262b81 100644 --- a/vxquery-core/src/main/java/org/apache/vxquery/metadata/VXQueryCollectionOperatorDescriptor.java +++ b/vxquery-core/src/main/java/org/apache/vxquery/metadata/VXQueryCollectionOperatorDescriptor.java @@ -75,7 +75,7 @@ public class VXQueryCollectionOperatorDescriptor extends AbstractSingleActivityO protected static final Logger LOGGER = Logger.getLogger(VXQueryCollectionOperatorDescriptor.class.getName()); private HDFSFunctions hdfs; private String tag; - private final String START_TAG = "<?xml version=\"1.0\" encoding=\"utf-8\"?>\n"; + private final String START_TAG = "<?xml version=\"1.0\" encoding=\"UTF-8\" standalone=\"yes\"?>\n"; public VXQueryCollectionOperatorDescriptor(IOperatorDescriptorRegistry spec, VXQueryCollectionDataSource ds, RecordDescriptor rDesc) { @@ -148,14 +148,14 @@ public class VXQueryCollectionOperatorDescriptor extends AbstractSingleActivityO Path directory = new Path(collectionModifiedName); Path xmlDocument; if (tag != null) { - hdfs.setJob(directory.getName(), tag); + hdfs.setJob(directory.toString(), tag); tag = "<" + tag + ">"; Job job = hdfs.getJob(); InputFormat inputFormat = hdfs.getinputFormat(); try { hdfs.scheduleSplits(); ArrayList<Integer> schedule = hdfs.getScheduleForNode(InetAddress.getLocalHost() - .getHostName()); + .getHostAddress()); List<InputSplit> splits = hdfs.getSplits(); List<FileSplit> fileSplits = new ArrayList<FileSplit>(); for (int i : schedule) { 
http://git-wip-us.apache.org/repos/asf/vxquery/blob/94801fb0/vxquery-core/src/main/java/org/apache/vxquery/xmlparser/XMLParser.java ---------------------------------------------------------------------- diff --git a/vxquery-core/src/main/java/org/apache/vxquery/xmlparser/XMLParser.java b/vxquery-core/src/main/java/org/apache/vxquery/xmlparser/XMLParser.java index 5d92870..9ad8fec 100644 --- a/vxquery-core/src/main/java/org/apache/vxquery/xmlparser/XMLParser.java +++ b/vxquery-core/src/main/java/org/apache/vxquery/xmlparser/XMLParser.java @@ -130,10 +130,10 @@ public class XMLParser { throw hde; } } - - public void parseHDFSElements(InputStream inputStream, IFrameWriter writer, FrameTupleAccessor fta, int tupleIndex) throws IOException - { - try { + + public void parseHDFSElements(InputStream inputStream, IFrameWriter writer, FrameTupleAccessor fta, int tupleIndex) + throws IOException { + try { Reader input; if (bufferSize > 0) { input = new BufferedReader(new InputStreamReader(inputStream), bufferSize); @@ -149,11 +149,11 @@ public class XMLParser { hde.setNodeId(nodeId); throw hde; } catch (SAXException e) { - // TODO Auto-generated catch block - e.printStackTrace(); - } + // TODO Auto-generated catch block + e.printStackTrace(); + } } - + public void parseHDFSDocument(InputStream inputStream, ArrayBackedValueStorage abvs) throws HyracksDataException { try { Reader input; @@ -171,9 +171,9 @@ public class XMLParser { hde.setNodeId(nodeId); throw hde; } catch (SAXException e) { - // TODO Auto-generated catch block - e.printStackTrace(); - } + // TODO Auto-generated catch block + e.printStackTrace(); + } } } http://git-wip-us.apache.org/repos/asf/vxquery/blob/94801fb0/vxquery-xtest/src/test/resources/ExpectedTestResults/HDFS/Aggregate/maxvalueHDFS.txt ---------------------------------------------------------------------- diff --git a/vxquery-xtest/src/test/resources/ExpectedTestResults/HDFS/Aggregate/maxvalueHDFS.txt 
b/vxquery-xtest/src/test/resources/ExpectedTestResults/HDFS/Aggregate/maxvalueHDFS.txt new file mode 100644 index 0000000..e37d32a --- /dev/null +++ b/vxquery-xtest/src/test/resources/ExpectedTestResults/HDFS/Aggregate/maxvalueHDFS.txt @@ -0,0 +1 @@ +1000 \ No newline at end of file http://git-wip-us.apache.org/repos/asf/vxquery/blob/94801fb0/vxquery-xtest/src/test/resources/Queries/XQuery/HDFS/Aggregate/maxvalueHDFS.xq ---------------------------------------------------------------------- diff --git a/vxquery-xtest/src/test/resources/Queries/XQuery/HDFS/Aggregate/maxvalueHDFS.xq b/vxquery-xtest/src/test/resources/Queries/XQuery/HDFS/Aggregate/maxvalueHDFS.xq new file mode 100644 index 0000000..30006af --- /dev/null +++ b/vxquery-xtest/src/test/resources/Queries/XQuery/HDFS/Aggregate/maxvalueHDFS.xq @@ -0,0 +1,23 @@ +(: Licensed to the Apache Software Foundation (ASF) under one + or more contributor license agreements. See the NOTICE file + distributed with this work for additional information + regarding copyright ownership. The ASF licenses this file + to you under the Apache License, Version 2.0 (the + "License"); you may not use this file except in compliance + with the License. You may obtain a copy of the License at + + http://www.apache.org/licenses/LICENSE-2.0 + + Unless required by applicable law or agreed to in writing, + software distributed under the License is distributed on an + "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY + KIND, either express or implied. See the License for the + specific language governing permissions and limitations + under the License. :) + +(: XQuery Aggregate Query :) +(: Find the max value. 
:) +fn:max( + for $r in collectionwithtag("hdfs://tmp/vxquery-hdfs-test/half_1/quarter_1/sensors", "data")/data + return $r/value +) http://git-wip-us.apache.org/repos/asf/vxquery/blob/94801fb0/vxquery-xtest/src/test/resources/Queries/XQuery/HDFS/Aggregate/sumHDFS.xq ---------------------------------------------------------------------- diff --git a/vxquery-xtest/src/test/resources/Queries/XQuery/HDFS/Aggregate/sumHDFS.xq b/vxquery-xtest/src/test/resources/Queries/XQuery/HDFS/Aggregate/sumHDFS.xq index fb1e12c..1fdf743 100644 --- a/vxquery-xtest/src/test/resources/Queries/XQuery/HDFS/Aggregate/sumHDFS.xq +++ b/vxquery-xtest/src/test/resources/Queries/XQuery/HDFS/Aggregate/sumHDFS.xq @@ -22,4 +22,4 @@ fn:sum( for $r in collection($collection)/dataCollection/data where $r/dataType eq "PRCP" return $r/value -) +) \ No newline at end of file http://git-wip-us.apache.org/repos/asf/vxquery/blob/94801fb0/vxquery-xtest/src/test/resources/cat/HDFSAggregateQueries.xml ---------------------------------------------------------------------- diff --git a/vxquery-xtest/src/test/resources/cat/HDFSAggregateQueries.xml b/vxquery-xtest/src/test/resources/cat/HDFSAggregateQueries.xml index 2b36f9f..4f925a0 100644 --- a/vxquery-xtest/src/test/resources/cat/HDFSAggregateQueries.xml +++ b/vxquery-xtest/src/test/resources/cat/HDFSAggregateQueries.xml @@ -45,4 +45,9 @@ <query name="sumHDFS" date="2015-06-11"/> <output-file compare="Text">sumHDFS.txt</output-file> </test-case> + <test-case name="hdfs-aggregate-max-value" FilePath="HDFS/Aggregate/" Creator="Efi Kaltirimidou"> + <description>Find the max value over the sensor data read from HDFS.</description> + <query name="maxvalueHDFS" date="2015-10-17"/> + <output-file compare="Text">maxvalueHDFS.txt</output-file> + </test-case> </test-group>
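Editor's note (not part of the commit): the books example on the new user_query_hdfs.apt page can be sanity-checked without an HDFS cluster. The sketch below mirrors the documented query, "titles of books published after 2004", in plain Python over the sample books.xml; the `titles_after` helper and the inlined XML sample are illustrative only, not VXQuery API.

```python
# Minimal sketch of the filter expressed by the documented XQuery:
#   for $x in collection("hdfs://user/hduser/store")
#   where $x/year>2004
#   return $x/title
import xml.etree.ElementTree as ET

# Condensed copy of the books.xml sample from the documentation page.
BOOKS_XML = """<?xml version="1.0" encoding="UTF-8"?>
<bookstore>
  <book><title lang="en">Everyday Italian</title><author>Giada De Laurentiis</author>
        <year>2005</year><price>30.00</price></book>
  <book><title lang="en">Harry Potter</title><author>J K. Rowling</author>
        <year>2005</year><price>29.99</price></book>
  <book><title lang="en">XQuery Kick Start</title><author>James McGovern</author>
        <year>2003</year><price>49.99</price></book>
  <book><title lang="en">Learning XML</title><author>Erik T. Ray</author>
        <year>2003</year><price>39.95</price></book>
</bookstore>
"""

def titles_after(xml_text, year):
    """Return titles of books whose <year> is greater than `year`."""
    bookstore = ET.fromstring(xml_text)
    return [book.findtext("title")
            for book in bookstore.findall("book")
            if int(book.findtext("year")) > year]

print(titles_after(BOOKS_XML, 2004))  # ['Everyday Italian', 'Harry Potter']
```

This matches the expected behavior of both the whole-file and block-by-block variants of the query, since they differ only in how VXQuery reads the input, not in the result.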
