[
https://issues.apache.org/jira/browse/NIFI-3724?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15991079#comment-15991079
]
ASF GitHub Bot commented on NIFI-3724:
--------------------------------------
Github user joewitt commented on a diff in the pull request:
https://github.com/apache/nifi/pull/1712#discussion_r114153914
--- Diff: nifi-nar-bundles/nifi-extension-utils/nifi-record-utils/nifi-avro-record-utils/src/main/java/org/apache/nifi/schema/access/SchemaAccessUtils.java ---
@@ -0,0 +1,156 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.nifi.schema.access;
+
+import org.apache.nifi.avro.AvroSchemaValidator;
+import org.apache.nifi.components.AllowableValue;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.components.ValidationContext;
+import org.apache.nifi.components.ValidationResult;
+import org.apache.nifi.controller.ConfigurationContext;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.schemaregistry.services.SchemaRegistry;
+
+import java.util.Collection;
+import java.util.Collections;
+import java.util.List;
+
+public class SchemaAccessUtils {
+
+    public static final AllowableValue SCHEMA_NAME_PROPERTY = new AllowableValue("schema-name", "Use 'Schema Name' Property",
+            "The name of the Schema to use is specified by the 'Schema Name' Property. The value of this property is used to lookup the Schema in the configured Schema Registry service.");
+    public static final AllowableValue SCHEMA_TEXT_PROPERTY = new AllowableValue("schema-text-property", "Use 'Schema Text' Property",
+            "The text of the Schema itself is specified by the 'Schema Text' Property. The value of this property must be a valid Avro Schema. "
+                    + "If Expression Language is used, the value of the 'Schema Text' property must be valid after substituting the expressions.");
+    public static final AllowableValue HWX_CONTENT_ENCODED_SCHEMA = new AllowableValue("hwx-content-encoded-schema", "HWX Content-Encoded Schema Reference",
+            "The content of the FlowFile contains a reference to a schema in the Schema Registry service. The reference is encoded as a single byte indicating the 'protocol version', "
+                    + "followed by 8 bytes indicating the schema identifier, and finally 4 bytes indicating the schema version, as per the Hortonworks Schema Registry serializers and deserializers, "
+                    + "found at https://github.com/hortonworks/registry");
+    public static final AllowableValue HWX_SCHEMA_REF_ATTRIBUTES = new AllowableValue("hwx-schema-ref-attributes", "HWX Schema Reference Attributes",
+            "The FlowFile contains 3 Attributes that will be used to lookup a Schema from the configured Schema Registry: 'schema.identifier', 'schema.version', and 'schema.protocol.version'");
+
+    public static final PropertyDescriptor SCHEMA_REGISTRY = new PropertyDescriptor.Builder()
+        .name("Schema Registry")
+        .description("Specifies the Controller Service to use for the Schema Registry")
+        .identifiesControllerService(SchemaRegistry.class)
+        .required(false)
+        .build();
+
+    public static final PropertyDescriptor SCHEMA_ACCESS_STRATEGY = new PropertyDescriptor.Builder()
+        .name("Schema Access Strategy")
+        .description("Specifies how to obtain the schema that is to be used for interpreting the data.")
+        .allowableValues(SCHEMA_NAME_PROPERTY, SCHEMA_TEXT_PROPERTY, HWX_SCHEMA_REF_ATTRIBUTES, HWX_CONTENT_ENCODED_SCHEMA)
+        .defaultValue(SCHEMA_NAME_PROPERTY.getValue())
+        .required(true)
+        .build();
+
+    public static final PropertyDescriptor SCHEMA_NAME = new PropertyDescriptor.Builder()
+        .name("Schema Name")
+        .description("Specifies the name of the schema to lookup in the Schema Registry property")
--- End diff ---
Just to be clear, the only thing that needs to be truly human-readable is
displayName. However, the 'name' doesn't have to look like something only a
computer would like :)
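The name/displayName split mentioned above can be illustrated with a small, self-contained sketch. Note this is a hypothetical stand-in for NiFi's PropertyDescriptor.Builder, written only to show the pattern: 'name' is the stable, machine-friendly identifier persisted with the flow, while 'displayName' is the human-readable label shown in the UI.

```java
// Hypothetical stand-in for NiFi's PropertyDescriptor.Builder, for illustration
// only. The key point from the review comment: only displayName must read well
// for humans; name is a stable identifier (often kebab-case) used in persisted flows.
public class PropertyNamingSketch {

    static final class Property {
        final String name;         // machine-friendly, stays stable across releases
        final String displayName;  // human-readable label shown in the UI
        Property(String name, String displayName) {
            this.name = name;
            this.displayName = displayName;
        }
    }

    static final class Builder {
        private String name;
        private String displayName;
        Builder name(String n) { this.name = n; return this; }
        Builder displayName(String d) { this.displayName = d; return this; }
        Property build() { return new Property(name, displayName); }
    }

    public static void main(String[] args) {
        Property schemaName = new Builder()
                .name("schema-name")
                .displayName("Schema Name")
                .build();
        System.out.println(schemaName.name + " / " + schemaName.displayName);
    }
}
```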
> Add Put/Fetch Parquet Processors
> --------------------------------
>
> Key: NIFI-3724
> URL: https://issues.apache.org/jira/browse/NIFI-3724
> Project: Apache NiFi
> Issue Type: Improvement
> Reporter: Bryan Bende
> Assignee: Bryan Bende
> Priority: Minor
> Fix For: 1.2.0
>
>
> Now that we have the record reader/writer services in master, it would be
> nice to have readers and writers for Parquet. Since Parquet's API is based
> on the Hadoop Path object, and not InputStreams/OutputStreams, we can't
> really implement direct conversions to and from Parquet in the middle of a
> flow, but we can perform the conversion by taking any record format and
> writing it to a Path as Parquet, or reading Parquet from a Path and writing
> it out as another record format.
> We should add a PutParquet processor that uses a record reader and writes
> records to a Path as Parquet, and a FetchParquet processor that reads
> Parquet from a Path and writes records out to a FlowFile using a record writer.
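For the HWX content-encoded strategy described in the diff above, the FlowFile content begins with a 13-byte header: 1 byte of protocol version, 8 bytes of schema identifier, and 4 bytes of schema version. A minimal sketch of parsing that header (class and method names are illustrative, not from the NiFi codebase; big-endian byte order is assumed, matching Java's ByteBuffer default):

```java
import java.nio.ByteBuffer;

public class HwxHeaderSketch {

    // Parses the 13-byte header: 1 protocol-version byte, an 8-byte schema
    // identifier, and a 4-byte schema version, in that order.
    // Returns {protocolVersion, schemaId, schemaVersion}.
    static long[] parseHeader(byte[] content) {
        ByteBuffer buf = ByteBuffer.wrap(content); // big-endian by default
        int protocolVersion = buf.get() & 0xFF;    // single unsigned byte
        long schemaId = buf.getLong();             // 8-byte identifier
        int schemaVersion = buf.getInt();          // 4-byte version
        return new long[] { protocolVersion, schemaId, schemaVersion };
    }

    public static void main(String[] args) {
        // Build a sample header: protocol version 1, schema id 42, version 3.
        ByteBuffer buf = ByteBuffer.allocate(13);
        buf.put((byte) 1).putLong(42L).putInt(3);
        long[] parsed = parseHeader(buf.array());
        System.out.println(parsed[0] + " " + parsed[1] + " " + parsed[2]); // 1 42 3
    }
}
```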
--
This message was sent by Atlassian JIRA
(v6.3.15#6346)