[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-04-25 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16452054#comment-16452054
 ] 

ASF GitHub Bot commented on NIFI-4035:
--

Github user abhinavrohatgi30 commented on the issue:

https://github.com/apache/nifi/pull/2561
  
@bbende  @MikeThomsen  Thanks for reviewing the pull request


> Implement record-based Solr processors
> --
>
> Key: NIFI-4035
> URL: https://issues.apache.org/jira/browse/NIFI-4035
> Project: Apache NiFi
>  Issue Type: Improvement
>Affects Versions: 1.2.0, 1.3.0
>Reporter: Bryan Bende
>Priority: Minor
> Fix For: 1.7.0
>
>
> Now that we have record readers and writers, we should implement variants of 
> the existing Solr processors that are record-based...
> Processors to consider:
> * PutSolrRecord - uses a configured record reader to read an incoming flow 
> file and insert records to Solr
> * GetSolrRecord - extracts records from Solr and uses a configured record 
> writer to write them to a flow file



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-04-24 Thread ASF subversion and git services (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16449998#comment-16449998
 ] 

ASF subversion and git services commented on NIFI-4035:
---

Commit e3f4720797a9777264d1732b2e1475b8f344a8f0 in nifi's branch 
refs/heads/master from abhinavrohatgi30
[ https://git-wip-us.apache.org/repos/asf?p=nifi.git;h=e3f4720 ]

NIFI-4035 - Adding PutSolrRecord Processor that reads NiFi records and indexes 
them into Solr as SolrDocuments

Adding Test Cases for PutSolrRecord Processor

Adding PutSolrRecord Processor in the list of Processors

Resolving checkstyle errors

Resolving checkstyle errors in test classes

Adding License information and additional information about the processor

1. Implementing Batch Indexing 2. Changes for nested records 3. Removing 
MockRecordParser

Fixing bugs with nested records

Updating version of dependencies

Setting Expression Language Scope

This closes #2561.

Signed-off-by: Bryan Bende 
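The commit log above mentions "Implementing Batch Indexing". As a minimal illustrative sketch (not the processor's actual code), batching comes down to partitioning the parsed records into fixed-size groups so that one Solr update request can be issued per group instead of one per record; the class and method names here are hypothetical:

```java
import java.util.ArrayList;
import java.util.List;

/**
 * Illustrative sketch of batch indexing: split a list of records into
 * fixed-size batches, as a processor might before issuing one Solr
 * update request per batch. Not the actual PutSolrRecord implementation.
 */
public class Batcher {
    public static <T> List<List<T>> partition(List<T> records, int batchSize) {
        if (batchSize <= 0) {
            throw new IllegalArgumentException("batchSize must be positive");
        }
        List<List<T>> batches = new ArrayList<>();
        for (int i = 0; i < records.size(); i += batchSize) {
            // Copy the sublist so each batch is independent of the source list
            batches.add(new ArrayList<>(records.subList(i, Math.min(i + batchSize, records.size()))));
        }
        return batches;
    }
}
```

Each resulting batch would then be converted to SolrInputDocuments and sent in a single update, amortizing the request overhead.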




[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-04-24 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=1644#comment-1644
 ] 

ASF GitHub Bot commented on NIFI-4035:
--

Github user asfgit closed the pull request at:

https://github.com/apache/nifi/pull/2561




[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-04-24 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16449992#comment-16449992
 ] 

ASF GitHub Bot commented on NIFI-4035:
--

Github user bbende commented on the issue:

https://github.com/apache/nifi/pull/2561
  
I was able to resolve the conflicts and everything looks good now, going to 
merge, thanks!




[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-04-24 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16449795#comment-16449795
 ] 

ASF GitHub Bot commented on NIFI-4035:
--

Github user MikeThomsen commented on the issue:

https://github.com/apache/nifi/pull/2561
  
@abhinavrohatgi30 While you were away, I merged another Solr-related commit 
and that's the reason you now have conflicts.




[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-04-20 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16446102#comment-16446102
 ] 

ASF GitHub Bot commented on NIFI-4035:
--

Github user abhinavrohatgi30 commented on the issue:

https://github.com/apache/nifi/pull/2561
  
I'm really sorry; it might take a while. I'm on vacation and away from my 
workstation, but I'll keep you updated as soon as I am back.





[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-04-20 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16445909#comment-16445909
 ] 

ASF GitHub Bot commented on NIFI-4035:
--

Github user MikeThomsen commented on the issue:

https://github.com/apache/nifi/pull/2561
  
@abhinavrohatgi30 You have a merge conflict in this branch. If you resolve 
it, I'll help @bbende finish the review.




[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-04-09 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16431421#comment-16431421
 ] 

ASF GitHub Bot commented on NIFI-4035:
--

Github user abhinavrohatgi30 commented on the issue:

https://github.com/apache/nifi/pull/2561
  
Hi, I've looked at the comments, and I've made the following changes as part 
of the latest commit, covering all of them:

1. Fixed the issue with Nested Records (The issue came up because of the 
change in field names in the previous commit)

2. Fixed the issue with Array of Records (It was generating an Object[] as 
opposed to a Record[] that I was expecting and as a result was storing the 
string representation of a Record)

3. Trimming field names individually

4. Adding Test cases for Nested Record, Array of Record and Record Parser 
failure

5. Using the getLogger() later in the code

6. Wrapping the JSONs in the additionalDetails.html in a <pre> tag
  
I hope the processor now works as expected; let me know if any further 
changes are needed.

Thanks
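Item 3 above ("trimming field names individually") can be sketched as follows. The comma-separated "Fields To Index" format comes from the documentation under review; the class and method names are hypothetical, not the processor's actual code:

```java
import java.util.ArrayList;
import java.util.List;

/**
 * Illustrative sketch: parse a comma-separated "Fields To Index" value,
 * trimming whitespace around each field name individually and skipping
 * empty entries. Not the actual PutSolrRecord implementation.
 */
public class FieldListParser {
    public static List<String> parseFields(String fieldsToIndex) {
        List<String> fields = new ArrayList<>();
        if (fieldsToIndex == null) {
            return fields;
        }
        for (String raw : fieldsToIndex.split(",")) {
            String name = raw.trim();   // trim each name individually
            if (!name.isEmpty()) {
                fields.add(name);
            }
        }
        return fields;
    }
}
```

With this, a value like " field1 , field2,field3 " yields the three clean names regardless of how the user spaced the list.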




[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-04-03 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16424503#comment-16424503
 ] 

ASF GitHub Bot commented on NIFI-4035:
--

Github user abhinavrohatgi30 commented on the issue:

https://github.com/apache/nifi/pull/2561
  
@bbende I'll have a look at this and write test cases accordingly.




[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-04-02 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16423105#comment-16423105
 ] 

ASF GitHub Bot commented on NIFI-4035:
--

Github user bbende commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2561#discussion_r178643673
  
--- Diff: 
nifi-nar-bundles/nifi-solr-bundle/nifi-solr-processors/src/main/java/org/apache/nifi/processors/solr/PutSolrRecord.java
 ---
@@ -0,0 +1,373 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *   http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied.  See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+package org.apache.nifi.processors.solr;
+
+import org.apache.nifi.annotation.behavior.DynamicProperty;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.ProcessorInitializationContext;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.schema.access.SchemaNotFoundException;
+import org.apache.nifi.serialization.MalformedRecordException;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RecordReaderFactory;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.util.StopWatch;
+import org.apache.nifi.util.StringUtils;
+import org.apache.solr.client.solrj.SolrServerException;
+import org.apache.solr.client.solrj.request.UpdateRequest;
+import org.apache.solr.client.solrj.response.UpdateResponse;
+import org.apache.solr.common.SolrException;
+import org.apache.solr.common.SolrInputDocument;
+import org.apache.solr.common.params.ModifiableSolrParams;
+import org.apache.solr.common.params.MultiMapSolrParams;
+
+import java.io.IOException;
+import java.io.InputStream;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.Iterator;
+import java.util.LinkedList;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.SortedMap;
+import java.util.TreeMap;
+import java.util.concurrent.TimeUnit;
+import java.util.concurrent.atomic.AtomicReference;
+
+import static org.apache.nifi.processors.solr.SolrUtils.BASIC_PASSWORD;
+import static org.apache.nifi.processors.solr.SolrUtils.BASIC_USERNAME;
+import static org.apache.nifi.processors.solr.SolrUtils.COLLECTION;
+import static org.apache.nifi.processors.solr.SolrUtils.JAAS_CLIENT_APP_NAME;
+import static org.apache.nifi.processors.solr.SolrUtils.SOLR_CONNECTION_TIMEOUT;
+import static org.apache.nifi.processors.solr.SolrUtils.SOLR_LOCATION;
+import static org.apache.nifi.processors.solr.SolrUtils.SOLR_MAX_CONNECTIONS;
+import static org.apache.nifi.processors.solr.SolrUtils.SOLR_MAX_CONNECTIONS_PER_HOST;
+import static org.apache.nifi.processors.solr.SolrUtils.SOLR_SOCKET_TIMEOUT;
+import static org.apache.nifi.processors.solr.SolrUtils.SOLR_TYPE;
+import static org.apache.nifi.processors.solr.SolrUtils.SOLR_TYPE_CLOUD;
+import static org.apache.nifi.processors.solr.SolrUtils.SSL_CONTEXT_SERVICE;
+import static org.apache.nifi.processors.solr.SolrUtils.ZK_CLIENT_TIMEOUT;
+import static org.apache.nifi.processors.solr.SolrUtils.ZK_CONNECTION_TIMEOUT;
+import static org.apache.nifi.processors.solr.SolrUtils.writeRecord;
+
+
+@Tags({"Apache", "Solr", "Put", "Send","Record"})

[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-04-02 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16423104#comment-16423104
 ] 

ASF GitHub Bot commented on NIFI-4035:
--

Github user bbende commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2561#discussion_r178643485
  
--- Diff: 
nifi-nar-bundles/nifi-solr-bundle/nifi-solr-processors/src/main/java/org/apache/nifi/processors/solr/PutSolrRecord.java
 ---

[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-04-02 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16423103#comment-16423103
 ] 

ASF GitHub Bot commented on NIFI-4035:
--

Github user bbende commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2561#discussion_r178644667
  
--- Diff: 
nifi-nar-bundles/nifi-solr-bundle/nifi-solr-processors/src/test/java/org/apache/nifi/processors/solr/TestPutSolrRecord.java
 ---
@@ -0,0 +1,643 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *   http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied.  See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+package org.apache.nifi.processors.solr;
+
+import org.apache.nifi.controller.AbstractControllerService;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.reporting.InitializationException;
+import org.apache.nifi.serialization.record.MockRecordParser;
+import org.apache.nifi.serialization.record.RecordFieldType;
+import org.apache.nifi.ssl.SSLContextService;
+import org.apache.nifi.util.TestRunner;
+import org.apache.nifi.util.TestRunners;
+import org.apache.solr.client.solrj.SolrClient;
+import org.apache.solr.client.solrj.SolrQuery;
+import org.apache.solr.client.solrj.SolrRequest;
+import org.apache.solr.client.solrj.SolrServerException;
+import org.apache.solr.client.solrj.impl.HttpSolrClient;
+import org.apache.solr.client.solrj.impl.Krb5HttpClientConfigurer;
+import org.apache.solr.client.solrj.response.QueryResponse;
+import org.apache.solr.common.SolrDocument;
+import org.apache.solr.common.SolrException;
+import org.apache.solr.common.StringUtils;
+import org.apache.solr.common.util.NamedList;
+import org.junit.Assert;
+import org.junit.Test;
+import org.mockito.Mockito;
+
+import javax.net.ssl.SSLContext;
+import java.io.File;
+import java.io.IOException;
+import java.util.Collection;
+import java.util.HashMap;
+import java.util.Map;
+
+import static org.mockito.Matchers.any;
+import static org.mockito.Matchers.eq;
+import static org.mockito.Mockito.times;
+import static org.mockito.Mockito.verify;
+import static org.mockito.Mockito.when;
+
+/**
+ * Test for PutSolrRecord Processor
+ */
+public class TestPutSolrRecord {
--- End diff --

I think there should be 3 new tests added...

1) Test what happens when the recordParser throws an exception, just set 
failAfter on the record parser:
`recordParser.failAfter(0);`
And ensure that it routes to failure.

2) Test a nested record

3) Test an array of nested records




[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-04-02 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16423102#comment-16423102
 ] 

ASF GitHub Bot commented on NIFI-4035:
--

Github user bbende commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2561#discussion_r178644182
  
--- Diff: 
nifi-nar-bundles/nifi-solr-bundle/nifi-solr-processors/src/main/resources/docs/org.apache.nifi.processors.solr.PutSolrRecord/additionalDetails.html
 ---
@@ -0,0 +1,108 @@
+
+
+
+
+
+PutSolrRecord
+
+
+
+
+Usage Example
+
+This processor reads the NiFi record and indexes it into Solr as a SolrDocument.
+Any properties added to this processor by the user are
+passed to Solr on the update request. The input record reader must be
+specified for this processor. Additionally, if only selected fields of a record are to be indexed,
+you can specify the field names as a comma-separated list under the fields property.
+
+
+Example: To specify specific fields of the record to be indexed:
+
+
+Fields To Index: field1,field2,field3
+
+
+NOTE: In case of nested records, the field names should be
+prefixed with the parent field name.
+
+
+Fields To Index: 
parentField1,parentField2,parentField3_childField1,parentField3_childField2
+
+
+   In case of nested records, this processor flattens all the nested
+records into a single Solr document; the field name of a field in a child
+document follows the format {Parent Field Name}_{Child Field Name}.
+
+
+Example:
+For a record created from the following json:
--- End diff --

Can we wrap all the JSON examples in pre elements so that they display like 
code-blocks when viewing the documentation?
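The {Parent Field Name}_{Child Field Name} flattening convention described in the additionalDetails.html excerpt above can be sketched in plain Java. This is an illustration only: records are modeled here as Maps, whereas the real processor operates on NiFi Record objects, and the class name is hypothetical:

```java
import java.util.LinkedHashMap;
import java.util.Map;

/**
 * Illustrative sketch of flattening a nested record into a single level
 * of Solr-style fields named {Parent Field Name}_{Child Field Name}.
 * Records are modeled as Maps; not the actual PutSolrRecord code.
 */
public class RecordFlattener {
    public static Map<String, Object> flatten(Map<String, Object> record) {
        Map<String, Object> out = new LinkedHashMap<>();
        flattenInto("", record, out);
        return out;
    }

    @SuppressWarnings("unchecked")
    private static void flattenInto(String prefix, Map<String, Object> record, Map<String, Object> out) {
        for (Map.Entry<String, Object> e : record.entrySet()) {
            String name = prefix.isEmpty() ? e.getKey() : prefix + "_" + e.getKey();
            Object value = e.getValue();
            if (value instanceof Map) {
                // Nested record: recurse, prefixing child fields with the parent name
                flattenInto(name, (Map<String, Object>) value, out);
            } else {
                out.put(name, value);
            }
        }
    }
}
```

For the exam example in the thread, a record with a nested "exam" child would yield fields such as exam_subject, exam_test, and exam_marks alongside the top-level first, last, and grade fields.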




[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-04-02 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16423106#comment-16423106
 ] 

ASF GitHub Bot commented on NIFI-4035:
--

Github user bbende commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2561#discussion_r178643917
  
--- Diff: 
nifi-nar-bundles/nifi-solr-bundle/nifi-solr-processors/src/main/java/org/apache/nifi/processors/solr/PutSolrRecord.java
 ---

[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-04-02 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16423094#comment-16423094
 ] 

ASF GitHub Bot commented on NIFI-4035:
--

Github user bbende commented on the issue:

https://github.com/apache/nifi/pull/2561
  
Nested records and arrays of nested records are not working correctly...

**Scenario 1 - Nested Record**

Schema:
```
{
"type": "record",
"name": "exams",
"fields" : [
  { "name": "first", "type": "string" },
  { "name": "last", "type": "string" },
  { "name": "grade", "type": "int" },
  {
"name": "exam",
"type": {
  "name" : "exam",
  "type" : "record",
  "fields" : [
{ "name": "subject", "type": "string" },
{ "name": "test", "type": "string" },
{ "name": "marks", "type": "int" }
  ]
}
  }
]
}
```

Input:
```
{
  "first": "Abhi",
  "last": "R",
  "grade": 8,
  "exam": {
"subject": "Maths",
"test" : "term1",
"marks" : 90
  }
}
```

Result:
```
java.util.NoSuchElementException: No value present
    at java.util.Optional.get(Optional.java:135)
    at org.apache.nifi.processors.solr.SolrUtils.writeRecord(SolrUtils.java:313)
    at org.apache.nifi.processors.solr.SolrUtils.writeValue(SolrUtils.java:384)
    at org.apache.nifi.processors.solr.SolrUtils.writeRecord(SolrUtils.java:314)
    at org.apache.nifi.processors.solr.PutSolrRecord.onTrigger(PutSolrRecord.java:247)
    at org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27)
```
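The `NoSuchElementException` above is what `Optional.get()` throws when the `Optional` is empty. As a minimal standalone sketch (the `getDataType` helper below is a hypothetical stand-in for NiFi's `RecordSchema.getDataType`, not the real API), a guarded lookup avoids the bare `.get()`:

```java
import java.util.Map;
import java.util.Optional;

public class SchemaLookup {
    // Hypothetical stand-in for RecordSchema.getDataType(fieldName): the real
    // method returns Optional.empty() when the field is not in the schema.
    static Optional<String> getDataType(Map<String, String> schema, String field) {
        return Optional.ofNullable(schema.get(field));
    }

    // Calling .get() on an empty Optional throws NoSuchElementException, as in
    // the stack trace above; orElseThrow with a descriptive message fails loudly
    // and points at the offending field instead.
    static String resolveType(Map<String, String> schema, String field) {
        return getDataType(schema, field)
                .orElseThrow(() -> new IllegalStateException("No data type for field: " + field));
    }

    public static void main(String[] args) {
        Map<String, String> parentSchema = Map.of("subject", "string");
        System.out.println(resolveType(parentSchema, "subject")); // prints "string"
    }
}
```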

**Scenario 2 - Array of Records**

Schema:
```
{
"type": "record",
"name": "exams",
"fields" : [
  { "name": "first", "type": "string" },
  { "name": "last", "type": "string" },
  { "name": "grade", "type": "int" },
  {
"name": "exams",
"type": {
  "type" : "array",
  "items" : {
"name" : "exam",
"type" : "record",
"fields" : [
{ "name": "subject", "type": "string" },
{ "name": "test", "type": "string" },
{ "name": "marks", "type": "int" }
]
  }
}
  }
]
}
```

Input:
```
{
"first": "Abhi",
"last": "R",
"grade": 8,
"exams": [
{
"subject": "Maths",
"test" : "term1",
"marks" : 90
},
{
"subject": "Physics",
"test" : "term1",
"marks" : 95
}
]
}
```

Result:

Solr Document with a multi-valued field `exams` where the values are the toString of a MapRecord:
```
"exams":["org.apache.nifi.serialization.record.MapRecord:MapRecord[{marks=90, test=term1, subject=Maths}]",
         "org.apache.nifi.serialization.record.MapRecord:MapRecord[{marks=95, test=term1, subject=Physics}]"],
```
It should instead have created fields like exams_marks, exams_test, exams_subject.
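The expected flattening (child fields prefixed with the parent field name, arrays collapsing into multi-valued fields) can be sketched roughly as below; plain `Map`/`List` values stand in for NiFi's `Record` types here, so this is an illustration of the behavior being asked for, not the processor's actual code:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class FlattenSketch {
    // Recursively flatten nested maps into parent_child field names; list
    // elements collapse into a multi-valued field under the same name.
    static void flatten(String prefix, Object value, Map<String, List<Object>> out) {
        if (value instanceof Map) {
            for (Map.Entry<?, ?> e : ((Map<?, ?>) value).entrySet()) {
                String name = prefix.isEmpty() ? e.getKey().toString() : prefix + "_" + e.getKey();
                flatten(name, e.getValue(), out);
            }
        } else if (value instanceof List) {
            for (Object item : (List<?>) value) {
                flatten(prefix, item, out);
            }
        } else {
            out.computeIfAbsent(prefix, k -> new ArrayList<>()).add(value);
        }
    }

    // Builds the Scenario 2 record and returns its flattened form.
    static Map<String, List<Object>> flattenSample() {
        Map<String, Object> record = new LinkedHashMap<>();
        record.put("first", "Abhi");
        record.put("exams", List.of(
                Map.of("subject", "Maths", "test", "term1", "marks", 90),
                Map.of("subject", "Physics", "test", "term1", "marks", 95)));
        Map<String, List<Object>> out = new LinkedHashMap<>();
        flatten("", record, out);
        return out;
    }

    public static void main(String[] args) {
        // exams_marks collects both nested marks values as a multi-valued field
        System.out.println(flattenSample().get("exams_marks")); // [90, 95]
    }
}
```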

Here is a full template for the two scenarios:


https://gist.githubusercontent.com/bbende/edc2e7d61db83b29533ac3fc520de30f/raw/8764d50ed5e14d876c53a0b84b3af5741d910b3b/PutSolrRecordTesting.xml

There need to be unit tests that cover both of these cases.


> Implement record-based Solr processors
> --
>
> Key: NIFI-4035
> URL: https://issues.apache.org/jira/browse/NIFI-4035
> Project: Apache NiFi
>  Issue Type: Improvement
>Affects Versions: 1.2.0, 1.3.0
>Reporter: Bryan Bende
>Priority: Minor
>
> Now that we have record readers and writers, we should implement variants of 
> the existing Solr processors that are record-based...
> Processors to consider:
> * PutSolrRecord - uses a configured record reader to read an incoming flow 
> file and insert records to Solr
> * GetSolrRecord - extracts records from Solr and uses a configured record 
> writer to write them to a flow file



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-03-28 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16417426#comment-16417426
 ] 

ASF GitHub Bot commented on NIFI-4035:
--

Github user bbende commented on the issue:

https://github.com/apache/nifi/pull/2561
  
Thanks, will try to take a look in a few days, unless someone gets to it first.




[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-03-27 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16416197#comment-16416197
 ] 

ASF GitHub Bot commented on NIFI-4035:
--

Github user abhinavrohatgi30 commented on the issue:

https://github.com/apache/nifi/pull/2561
  
Hi @bbende, I've brought it down to a single commit. Can you have a look at it now?




[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-03-23 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16411962#comment-16411962
 ] 

ASF GitHub Bot commented on NIFI-4035:
--

Github user bbende commented on the issue:

https://github.com/apache/nifi/pull/2561
  
I tried to rebase this against master so I could squash it down to a single commit, but the rebase is encountering a lot of conflicts, which really shouldn't be happening because it's conflicting with itself. Can you work through getting it down to a single commit?

Normally it should just be:
`git rebase -i upstream/master`

Then in the list of commits you choose "s" for all the commits except the top one, which squashes them all into the top one. Then force push.




[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-03-22 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16409625#comment-16409625
 ] 

ASF GitHub Bot commented on NIFI-4035:
--

Github user abhinavrohatgi30 commented on the issue:

https://github.com/apache/nifi/pull/2561
  
I'm done with the changes that @bbende and @MikeThomsen suggested.




[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-03-22 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16409488#comment-16409488
 ] 

ASF GitHub Bot commented on NIFI-4035:
--

Github user bbende commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2561#discussion_r176411279
  
--- Diff: 
nifi-nar-bundles/nifi-solr-bundle/nifi-solr-processors/src/main/java/org/apache/nifi/processors/solr/PutSolrRecord.java
 ---
@@ -0,0 +1,351 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *   http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied.  See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+package org.apache.nifi.processors.solr;
+
+import org.apache.nifi.annotation.behavior.DynamicProperty;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.ProcessorInitializationContext;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.schema.access.SchemaNotFoundException;
+import org.apache.nifi.serialization.MalformedRecordException;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RecordReaderFactory;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.util.StopWatch;
+import org.apache.nifi.util.StringUtils;
+import org.apache.solr.client.solrj.SolrServerException;
+import org.apache.solr.client.solrj.request.UpdateRequest;
+import org.apache.solr.client.solrj.response.UpdateResponse;
+import org.apache.solr.common.SolrException;
+import org.apache.solr.common.SolrInputDocument;
+import org.apache.solr.common.params.ModifiableSolrParams;
+import org.apache.solr.common.params.MultiMapSolrParams;
+
+import java.io.IOException;
+import java.io.InputStream;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.Iterator;
+import java.util.LinkedList;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.SortedMap;
+import java.util.TreeMap;
+import java.util.concurrent.TimeUnit;
+import java.util.concurrent.atomic.AtomicReference;
+
+import static org.apache.nifi.processors.solr.SolrUtils.BASIC_PASSWORD;
+import static org.apache.nifi.processors.solr.SolrUtils.BASIC_USERNAME;
+import static org.apache.nifi.processors.solr.SolrUtils.COLLECTION;
+import static org.apache.nifi.processors.solr.SolrUtils.JAAS_CLIENT_APP_NAME;
+import static org.apache.nifi.processors.solr.SolrUtils.SOLR_CONNECTION_TIMEOUT;
+import static org.apache.nifi.processors.solr.SolrUtils.SOLR_LOCATION;
+import static org.apache.nifi.processors.solr.SolrUtils.SOLR_MAX_CONNECTIONS;
+import static org.apache.nifi.processors.solr.SolrUtils.SOLR_MAX_CONNECTIONS_PER_HOST;
+import static org.apache.nifi.processors.solr.SolrUtils.SOLR_SOCKET_TIMEOUT;
+import static org.apache.nifi.processors.solr.SolrUtils.SOLR_TYPE;
+import static org.apache.nifi.processors.solr.SolrUtils.SOLR_TYPE_CLOUD;
+import static org.apache.nifi.processors.solr.SolrUtils.SSL_CONTEXT_SERVICE;
+import static org.apache.nifi.processors.solr.SolrUtils.ZK_CLIENT_TIMEOUT;
+import static org.apache.nifi.processors.solr.SolrUtils.ZK_CONNECTION_TIMEOUT;
+import static org.apache.nifi.processors.solr.SolrUtils.writeRecord;
+
+
+@Tags({"Apache", "Solr", "Put", "Send","Record"})

[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-03-21 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16408456#comment-16408456
 ] 

ASF GitHub Bot commented on NIFI-4035:
--

Github user abhinavrohatgi30 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2561#discussion_r176208789
  
--- Diff: 
nifi-nar-bundles/nifi-solr-bundle/nifi-solr-processors/src/main/java/org/apache/nifi/processors/solr/PutSolrRecord.java
 ---

[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-03-21 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16408450#comment-16408450
 ] 

ASF GitHub Bot commented on NIFI-4035:
--

Github user abhinavrohatgi30 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2561#discussion_r176208228
  
--- Diff: 
nifi-nar-bundles/nifi-solr-bundle/nifi-solr-processors/src/main/java/org/apache/nifi/processors/solr/SolrUtils.java
 ---
@@ -280,5 +291,115 @@ public SolrInputDocument toSolrInputDocument(SolrDocument d) {
 }
 }
 
+    /**
+     * Writes each Record as a SolrInputDocument.
+     */
+    public static void writeRecord(final Record record, final RecordSchema writeSchema, final SolrInputDocument inputDocument, final List<String> fieldsToIndex)
+            throws IOException {
+        RecordSchema schema = record.getSchema();
+
+        for (int i = 0; i < schema.getFieldCount(); i++) {
+            final RecordField field = schema.getField(i);
+            final String fieldName = field.getFieldName();
+            final Object value = record.getValue(field);
+            if (value == null || (!fieldsToIndex.isEmpty() && !fieldsToIndex.contains(fieldName))) {
+                continue;
+            } else {
+                final DataType dataType = schema.getDataType(fieldName).get();
+                writeValue(inputDocument, value, fieldName, dataType, fieldsToIndex);
+            }
+        }
+    }
+
+    private static void writeValue(final SolrInputDocument inputDocument, final Object value, final String fieldName, final DataType dataType, final List<String> fieldsToIndex) throws IOException {
+        final DataType chosenDataType = dataType.getFieldType() == RecordFieldType.CHOICE ? DataTypeUtils.chooseDataType(value, (ChoiceDataType) dataType) : dataType;
+        final Object coercedValue = DataTypeUtils.convertType(value, chosenDataType, fieldName);
+        if (coercedValue == null) {
+            return;
+        }
+
+        switch (chosenDataType.getFieldType()) {
+            case DATE: {
+                final String stringValue = DataTypeUtils.toString(coercedValue, () -> DataTypeUtils.getDateFormat(RecordFieldType.DATE.getDefaultFormat()));
+                if (DataTypeUtils.isLongTypeCompatible(stringValue)) {
+                    LocalDate localDate = getLocalDateFromEpochTime(fieldName, coercedValue);
+                    inputDocument.addField(fieldName, localDate.format(DateTimeFormatter.ISO_LOCAL_DATE_TIME) + 'Z');
+                } else {
+                    inputDocument.addField(fieldName, LocalDate.parse(stringValue).format(DateTimeFormatter.ISO_LOCAL_DATE_TIME) + 'Z');
+                }
+                break;
+            }
+            case TIMESTAMP: {
+                final String stringValue = DataTypeUtils.toString(coercedValue, () -> DataTypeUtils.getDateFormat(RecordFieldType.TIMESTAMP.getDefaultFormat()));
+                if (DataTypeUtils.isLongTypeCompatible(stringValue)) {
+                    LocalDateTime localDateTime = getLocalDateTimeFromEpochTime(fieldName, coercedValue);
+                    inputDocument.addField(fieldName, localDateTime.format(DateTimeFormatter.ISO_LOCAL_DATE_TIME) + 'Z');
+                } else {
+                    inputDocument.addField(fieldName, LocalDateTime.parse(stringValue).format(DateTimeFormatter.ISO_LOCAL_DATE_TIME) + 'Z');
+                }
+                break;
+            }
+            case DOUBLE:
+                inputDocument.addField(fieldName, DataTypeUtils.toDouble(coercedValue, fieldName));
+                break;
+            case FLOAT:
+                inputDocument.addField(fieldName, DataTypeUtils.toFloat(coercedValue, fieldName));
+                break;
+            case LONG:
+                inputDocument.addField(fieldName, DataTypeUtils.toLong(coercedValue, fieldName));
+                break;
+            case INT:
+            case BYTE:
+            case SHORT:
+                inputDocument.addField(fieldName, DataTypeUtils.toInteger(coercedValue, fieldName));
+                break;
+            case CHAR:
+            case STRING:
+                inputDocument.addField(fieldName, coercedValue.toString());
+                break;
+            case BIGINT:
+                if (coercedValue instanceof Long) {
+                    inputDocument.addField(fieldName, (Long) coercedValue);
+                } else {
+                    inputDocument.addField(fieldName, (BigInteger) coercedValue);
+                }
+                break;
+            case BOOLEAN:
+                final String 
[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-03-21 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16408448#comment-16408448
 ] 

ASF GitHub Bot commented on NIFI-4035:
--

Github user abhinavrohatgi30 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2561#discussion_r176207854
  
--- Diff: 
nifi-nar-bundles/nifi-solr-bundle/nifi-solr-processors/src/main/java/org/apache/nifi/processors/solr/PutSolrRecord.java
 ---

[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-03-21 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16408446#comment-16408446
 ] 

ASF GitHub Bot commented on NIFI-4035:
--

Github user abhinavrohatgi30 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2561#discussion_r176206589
  
--- Diff: 
nifi-nar-bundles/nifi-solr-bundle/nifi-solr-processors/src/main/java/org/apache/nifi/processors/solr/PutSolrRecord.java
 ---

[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-03-21 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16408393#comment-16408393
 ] 

ASF GitHub Bot commented on NIFI-4035:
--

Github user bbende commented on the issue:

https://github.com/apache/nifi/pull/2561
  
I would try doing a rebase against master to see what happens. In the worst 
case, you would have to create another branch off the latest master and then 
individually cherry-pick your commits from this branch onto the new branch, to 
get rid of the other commits that landed in between yours; only do that if you 
can't get this branch straightened out. 


> Implement record-based Solr processors
> --
>
> Key: NIFI-4035
> URL: https://issues.apache.org/jira/browse/NIFI-4035
> Project: Apache NiFi
>  Issue Type: Improvement
>Affects Versions: 1.2.0, 1.3.0
>Reporter: Bryan Bende
>Priority: Minor
>
> Now that we have record readers and writers, we should implement variants of 
> the existing Solr processors that are record-based...
> Processors to consider:
> * PutSolrRecord - uses a configured record reader to read an incoming flow 
> file and insert records to Solr
> * GetSolrRecord - extracts records from Solr and uses a configured record 
> writer to write them to a flow file



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-03-21 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16408090#comment-16408090
 ] 

ASF GitHub Bot commented on NIFI-4035:
--

Github user abhinavrohatgi30 commented on the issue:

https://github.com/apache/nifi/pull/2561
  
Sorry, instead of doing the force push I resolved conflicts and did a regular 
push. Can I now do the rebase again on the current commit, or will I have to 
add a new commit in order to rebase from the master branch?




[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-03-21 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16407994#comment-16407994
 ] 

ASF GitHub Bot commented on NIFI-4035:
--

Github user bbende commented on the issue:

https://github.com/apache/nifi/pull/2561
  
Yea when you update your branch you should be doing something like the 
following...
```
git fetch upstream
git rebase upstream/master
```
This assumes "upstream" points to either the Apache NiFi git repo or the 
Apache NiFi GitHub repo.

Using rebase will apply all the incoming commits from upstream/master to 
your branch and then put your commits back on top of that so it looks like 
yours are always the latest.

You then need to force-push to your remote branch: `git push origin 
your-branch --force`
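The same fetch/rebase/force-push sequence can be rehearsed end-to-end against throwaway local repositories before running it for real. This is only an illustrative sketch; every path, remote name, and branch name below is a placeholder:

```shell
# Simulate upstream (canonical repo), origin (your fork), and your clone.
set -e
root=$(mktemp -d)

# "upstream": the canonical repo with a master branch
git init -q "$root/upstream" && cd "$root/upstream"
git checkout -qb master
git config user.email demo@example.com && git config user.name demo
echo v1 > file.txt && git add file.txt && git commit -qm "upstream v1"

# "origin": your fork (bare); "clone": your working copy with a feature branch
git clone -q --bare "$root/upstream" "$root/origin"
git clone -q "$root/origin" "$root/clone" && cd "$root/clone"
git config user.email demo@example.com && git config user.name demo
git remote add upstream "$root/upstream"
git checkout -qb my-feature
echo feature > feature.txt && git add feature.txt && git commit -qm "my feature"

# upstream advances while you work
cd "$root/upstream"
echo v2 >> file.txt && git commit -qam "upstream v2"

# the workflow from the comment: fetch, rebase, then force-push
cd "$root/clone"
git fetch -q upstream
git rebase -q upstream/master
git push -q origin my-feature --force

log=$(git log --oneline)
echo "$log"
```

After the rebase, your commit sits on top of the newest upstream commits, which is why the subsequent push must be forced: the branch history was rewritten.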




[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-03-21 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16407743#comment-16407743
 ] 

ASF GitHub Bot commented on NIFI-4035:
--

Github user MikeThomsen commented on the issue:

https://github.com/apache/nifi/pull/2561
  
@abhinavrohatgi30 Looks like your latest push grabbed a bunch of other 
folks' commits. Unless @bbende disagrees, I think you're going to need to 
rebase and repush.




[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-03-20 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16406855#comment-16406855
 ] 

ASF GitHub Bot commented on NIFI-4035:
--

Github user abhinavrohatgi30 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2561#discussion_r175883171
  
--- Diff: 
nifi-nar-bundles/nifi-solr-bundle/nifi-solr-processors/src/main/java/org/apache/nifi/processors/solr/SolrUtils.java
 ---
@@ -280,5 +291,115 @@ public SolrInputDocument toSolrInputDocument(SolrDocument d) {
 }
 }
 
+    /**
+     * Writes each Record as a SolrInputDocument.
+     */
+    public static void writeRecord(final Record record, final RecordSchema writeSchema, final SolrInputDocument inputDocument, final List<String> fieldsToIndex)
+            throws IOException {
+        RecordSchema schema = record.getSchema();
+
+        for (int i = 0; i < schema.getFieldCount(); i++) {
+            final RecordField field = schema.getField(i);
+            final String fieldName = field.getFieldName();
+            final Object value = record.getValue(field);
+            if (value == null || (!fieldsToIndex.isEmpty() && !fieldsToIndex.contains(fieldName))) {
+                continue;
+            } else {
+                final DataType dataType = schema.getDataType(fieldName).get();
+                writeValue(inputDocument, value, fieldName, dataType, fieldsToIndex);
+            }
+        }
+    }
+
+    private static void writeValue(final SolrInputDocument inputDocument, final Object value, final String fieldName, final DataType dataType, final List<String> fieldsToIndex) throws IOException {
+        final DataType chosenDataType = dataType.getFieldType() == RecordFieldType.CHOICE ? DataTypeUtils.chooseDataType(value, (ChoiceDataType) dataType) : dataType;
+        final Object coercedValue = DataTypeUtils.convertType(value, chosenDataType, fieldName);
+        if (coercedValue == null) {
+            return;
+        }
+
+        switch (chosenDataType.getFieldType()) {
+            case DATE: {
+                final String stringValue = DataTypeUtils.toString(coercedValue, () -> DataTypeUtils.getDateFormat(RecordFieldType.DATE.getDefaultFormat()));
+                if (DataTypeUtils.isLongTypeCompatible(stringValue)) {
+                    LocalDate localDate = getLocalDateFromEpochTime(fieldName, coercedValue);
+                    inputDocument.addField(fieldName, localDate.format(DateTimeFormatter.ISO_LOCAL_DATE_TIME) + 'Z');
+                } else {
+                    inputDocument.addField(fieldName, LocalDate.parse(stringValue).format(DateTimeFormatter.ISO_LOCAL_DATE_TIME) + 'Z');
+                }
+                break;
+            }
+            case TIMESTAMP: {
+                final String stringValue = DataTypeUtils.toString(coercedValue, () -> DataTypeUtils.getDateFormat(RecordFieldType.TIMESTAMP.getDefaultFormat()));
+                if (DataTypeUtils.isLongTypeCompatible(stringValue)) {
+                    LocalDateTime localDateTime = getLocalDateTimeFromEpochTime(fieldName, coercedValue);
+                    inputDocument.addField(fieldName, localDateTime.format(DateTimeFormatter.ISO_LOCAL_DATE_TIME) + 'Z');
+                } else {
+                    inputDocument.addField(fieldName, LocalDateTime.parse(stringValue).format(DateTimeFormatter.ISO_LOCAL_DATE_TIME) + 'Z');
+                }
+                break;
+            }
+            case DOUBLE:
+                inputDocument.addField(fieldName, DataTypeUtils.toDouble(coercedValue, fieldName));
+                break;
+            case FLOAT:
+                inputDocument.addField(fieldName, DataTypeUtils.toFloat(coercedValue, fieldName));
+                break;
+            case LONG:
+                inputDocument.addField(fieldName, DataTypeUtils.toLong(coercedValue, fieldName));
+                break;
+            case INT:
+            case BYTE:
+            case SHORT:
+                inputDocument.addField(fieldName, DataTypeUtils.toInteger(coercedValue, fieldName));
+                break;
+            case CHAR:
+            case STRING:
+                inputDocument.addField(fieldName, coercedValue.toString());
+                break;
+            case BIGINT:
+                if (coercedValue instanceof Long) {
+                    inputDocument.addField(fieldName, (Long) coercedValue);
+                } else {
+                    inputDocument.addField(fieldName, (BigInteger) coercedValue);
+                }
+                break;
+            case BOOLEAN:
+                final String 
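The DATE and TIMESTAMP branches in the diff above format the value with `DateTimeFormatter.ISO_LOCAL_DATE_TIME` and append a literal `'Z'`. A minimal self-contained sketch of that conversion for an epoch-millisecond timestamp (my own illustration, not the PR's code; the helper name `toSolrTimestamp` is invented):

```java
import java.time.Instant;
import java.time.LocalDateTime;
import java.time.ZoneOffset;
import java.time.format.DateTimeFormatter;

// Sketch of the TIMESTAMP branch: Solr expects ISO-8601 timestamps in UTC
// with a trailing 'Z', e.g. 2018-03-20T10:15:30Z.
public class SolrTimestampSketch {
    static String toSolrTimestamp(long epochMillis) {
        // Interpret the epoch value as UTC, format without zone, append 'Z'
        LocalDateTime t = LocalDateTime.ofInstant(Instant.ofEpochMilli(epochMillis), ZoneOffset.UTC);
        return t.format(DateTimeFormatter.ISO_LOCAL_DATE_TIME) + 'Z';
    }

    public static void main(String[] args) {
        System.out.println(toSolrTimestamp(1000L)); // one second past the epoch
    }
}
```

One caveat worth noting: `ISO_LOCAL_DATE_TIME` elides the seconds field when it is zero (midnight renders as `1970-01-01T00:00`), and Solr's date parser requires seconds to be present; `DateTimeFormatter.ISO_INSTANT` always emits seconds and the `Z` suffix, so it is the safer choice when strictly Solr-compatible output is needed.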

[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-03-20 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16406854#comment-16406854
 ] 

ASF GitHub Bot commented on NIFI-4035:
--

Github user abhinavrohatgi30 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2561#discussion_r175882819
  
--- Diff: 
nifi-nar-bundles/nifi-solr-bundle/nifi-solr-processors/src/main/java/org/apache/nifi/processors/solr/PutSolrRecord.java
 ---
@@ -0,0 +1,351 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *   http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied.  See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+package org.apache.nifi.processors.solr;
+
+import org.apache.nifi.annotation.behavior.DynamicProperty;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.ProcessorInitializationContext;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.schema.access.SchemaNotFoundException;
+import org.apache.nifi.serialization.MalformedRecordException;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RecordReaderFactory;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.util.StopWatch;
+import org.apache.nifi.util.StringUtils;
+import org.apache.solr.client.solrj.SolrServerException;
+import org.apache.solr.client.solrj.request.UpdateRequest;
+import org.apache.solr.client.solrj.response.UpdateResponse;
+import org.apache.solr.common.SolrException;
+import org.apache.solr.common.SolrInputDocument;
+import org.apache.solr.common.params.ModifiableSolrParams;
+import org.apache.solr.common.params.MultiMapSolrParams;
+
+import java.io.IOException;
+import java.io.InputStream;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.Iterator;
+import java.util.LinkedList;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.SortedMap;
+import java.util.TreeMap;
+import java.util.concurrent.TimeUnit;
+import java.util.concurrent.atomic.AtomicReference;
+
+import static org.apache.nifi.processors.solr.SolrUtils.BASIC_PASSWORD;
+import static org.apache.nifi.processors.solr.SolrUtils.BASIC_USERNAME;
+import static org.apache.nifi.processors.solr.SolrUtils.COLLECTION;
+import static org.apache.nifi.processors.solr.SolrUtils.JAAS_CLIENT_APP_NAME;
+import static org.apache.nifi.processors.solr.SolrUtils.SOLR_CONNECTION_TIMEOUT;
+import static org.apache.nifi.processors.solr.SolrUtils.SOLR_LOCATION;
+import static org.apache.nifi.processors.solr.SolrUtils.SOLR_MAX_CONNECTIONS;
+import static org.apache.nifi.processors.solr.SolrUtils.SOLR_MAX_CONNECTIONS_PER_HOST;
+import static org.apache.nifi.processors.solr.SolrUtils.SOLR_SOCKET_TIMEOUT;
+import static org.apache.nifi.processors.solr.SolrUtils.SOLR_TYPE;
+import static org.apache.nifi.processors.solr.SolrUtils.SOLR_TYPE_CLOUD;
+import static org.apache.nifi.processors.solr.SolrUtils.SSL_CONTEXT_SERVICE;
+import static org.apache.nifi.processors.solr.SolrUtils.ZK_CLIENT_TIMEOUT;
+import static org.apache.nifi.processors.solr.SolrUtils.ZK_CONNECTION_TIMEOUT;
+import static org.apache.nifi.processors.solr.SolrUtils.writeRecord;
+
+
+@Tags({"Apache", "Solr", "Put", "Send","Record"})

[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-03-20 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16406769#comment-16406769
 ] 

ASF GitHub Bot commented on NIFI-4035:
--

Github user bbende commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2561#discussion_r175863187
  
--- Diff: 
nifi-nar-bundles/nifi-solr-bundle/nifi-solr-processors/src/main/java/org/apache/nifi/processors/solr/PutSolrRecord.java
 ---

[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-03-20 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16406768#comment-16406768
 ] 

ASF GitHub Bot commented on NIFI-4035:
--

Github user bbende commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2561#discussion_r175863036
  
--- Diff: 
nifi-nar-bundles/nifi-solr-bundle/nifi-solr-processors/src/main/java/org/apache/nifi/processors/solr/PutSolrRecord.java
 ---

[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-03-20 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16406744#comment-16406744
 ] 

ASF GitHub Bot commented on NIFI-4035:
--

Github user bbende commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2561#discussion_r175858048
  
--- Diff: 
nifi-nar-bundles/nifi-solr-bundle/nifi-solr-processors/src/main/java/org/apache/nifi/processors/solr/SolrUtils.java
 ---

[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-03-20 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16406663#comment-16406663
 ] 

ASF GitHub Bot commented on NIFI-4035:
--

Github user abhinavrohatgi30 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2561#discussion_r175840062
  
--- Diff: 
nifi-nar-bundles/nifi-solr-bundle/nifi-solr-processors/src/main/java/org/apache/nifi/processors/solr/SolrUtils.java
 ---

[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-03-20 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16406657#comment-16406657
 ] 

ASF GitHub Bot commented on NIFI-4035:
--

Github user abhinavrohatgi30 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2561#discussion_r175839474
  
--- Diff: nifi-nar-bundles/nifi-solr-bundle/nifi-solr-processors/src/main/java/org/apache/nifi/processors/solr/SolrUtils.java ---
@@ -280,5 +291,115 @@ public SolrInputDocument toSolrInputDocument(SolrDocument d) {
         }
     }
 
+    /**
+     * Writes each Record as a SolrInputDocument.
+     */
+    public static void writeRecord(final Record record, final RecordSchema writeSchema, final SolrInputDocument inputDocument, final List<String> fieldsToIndex)
+            throws IOException {
+        RecordSchema schema = record.getSchema();
+
+        for (int i = 0; i < schema.getFieldCount(); i++) {
+            final RecordField field = schema.getField(i);
+            final String fieldName = field.getFieldName();
+            final Object value = record.getValue(field);
+            if (value == null || (!fieldsToIndex.isEmpty() && !fieldsToIndex.contains(fieldName))) {
+                continue;
+            } else {
+                final DataType dataType = schema.getDataType(fieldName).get();
+                writeValue(inputDocument, value, fieldName, dataType, fieldsToIndex);
+            }
+        }
+    }
+
+    private static void writeValue(final SolrInputDocument inputDocument, final Object value, final String fieldName, final DataType dataType, final List<String> fieldsToIndex) throws IOException {
+        final DataType chosenDataType = dataType.getFieldType() == RecordFieldType.CHOICE ? DataTypeUtils.chooseDataType(value, (ChoiceDataType) dataType) : dataType;
+        final Object coercedValue = DataTypeUtils.convertType(value, chosenDataType, fieldName);
+        if (coercedValue == null) {
+            return;
+        }
+
+        switch (chosenDataType.getFieldType()) {
+            case DATE: {
+                final String stringValue = DataTypeUtils.toString(coercedValue, () -> DataTypeUtils.getDateFormat(RecordFieldType.DATE.getDefaultFormat()));
+                if (DataTypeUtils.isLongTypeCompatible(stringValue)) {
+                    LocalDate localDate = getLocalDateFromEpochTime(fieldName, coercedValue);
+                    inputDocument.addField(fieldName, localDate.format(DateTimeFormatter.ISO_LOCAL_DATE_TIME) + 'Z');
+                } else {
+                    inputDocument.addField(fieldName, LocalDate.parse(stringValue).format(DateTimeFormatter.ISO_LOCAL_DATE_TIME) + 'Z');
+                }
+                break;
+            }
+            case TIMESTAMP: {
+                final String stringValue = DataTypeUtils.toString(coercedValue, () -> DataTypeUtils.getDateFormat(RecordFieldType.TIMESTAMP.getDefaultFormat()));
+                if (DataTypeUtils.isLongTypeCompatible(stringValue)) {
+                    LocalDateTime localDateTime = getLocalDateTimeFromEpochTime(fieldName, coercedValue);
+                    inputDocument.addField(fieldName, localDateTime.format(DateTimeFormatter.ISO_LOCAL_DATE_TIME) + 'Z');
+                } else {
+                    inputDocument.addField(fieldName, LocalDateTime.parse(stringValue).format(DateTimeFormatter.ISO_LOCAL_DATE_TIME) + 'Z');
+                }
+                break;
+            }
+            case DOUBLE:
+                inputDocument.addField(fieldName, DataTypeUtils.toDouble(coercedValue, fieldName));
+                break;
+            case FLOAT:
+                inputDocument.addField(fieldName, DataTypeUtils.toFloat(coercedValue, fieldName));
+                break;
+            case LONG:
+                inputDocument.addField(fieldName, DataTypeUtils.toLong(coercedValue, fieldName));
+                break;
+            case INT:
+            case BYTE:
+            case SHORT:
+                inputDocument.addField(fieldName, DataTypeUtils.toInteger(coercedValue, fieldName));
+                break;
+            case CHAR:
+            case STRING:
+                inputDocument.addField(fieldName, coercedValue.toString());
+                break;
+            case BIGINT:
+                if (coercedValue instanceof Long) {
+                    inputDocument.addField(fieldName, (Long) coercedValue);
+                } else {
+                    inputDocument.addField(fieldName, (BigInteger) coercedValue);
+                }
+                break;
+            case BOOLEAN:
+                final String 
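The writeRecord loop quoted above skips null values and, when a non-empty field list is supplied, anything not in it. A self-contained sketch of that filtering rule, using a plain Map in place of SolrInputDocument (class and method names here are illustrative, not NiFi or Solr API):

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class FieldFilterSketch {
    // Mirrors writeRecord's filter: drop null values; if fieldsToIndex is
    // non-empty, keep only the listed field names; an empty list keeps all.
    public static Map<String, Object> index(Map<String, Object> record, List<String> fieldsToIndex) {
        Map<String, Object> doc = new LinkedHashMap<>();
        for (Map.Entry<String, Object> e : record.entrySet()) {
            Object value = e.getValue();
            if (value == null || (!fieldsToIndex.isEmpty() && !fieldsToIndex.contains(e.getKey()))) {
                continue;
            }
            doc.put(e.getKey(), value);
        }
        return doc;
    }

    public static void main(String[] args) {
        Map<String, Object> record = new LinkedHashMap<>();
        record.put("id", "doc1");
        record.put("title", "NiFi record");
        record.put("skipped", null);
        System.out.println(index(record, List.of("id")));        // only "id" survives
        System.out.println(index(record, new ArrayList<>()));    // all non-null fields survive
    }
}
```

The empty-list-means-everything convention matches the `!fieldsToIndex.isEmpty()` guard in the quoted diff, which is why an unset "Fields To Index" style property indexes every non-null record field.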

[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-03-20 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16406535#comment-16406535
 ] 

ASF GitHub Bot commented on NIFI-4035:
--

Github user bbende commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2561#discussion_r175818379
  
--- Diff: nifi-nar-bundles/nifi-solr-bundle/nifi-solr-processors/src/main/java/org/apache/nifi/processors/solr/SolrUtils.java ---

[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-03-20 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16406527#comment-16406527
 ] 

ASF GitHub Bot commented on NIFI-4035:
--

Github user abhinavrohatgi30 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2561#discussion_r175816508
  
--- Diff: nifi-nar-bundles/nifi-solr-bundle/nifi-solr-processors/src/main/java/org/apache/nifi/processors/solr/SolrUtils.java ---

[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-03-20 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16406503#comment-16406503
 ] 

ASF GitHub Bot commented on NIFI-4035:
--

Github user bbende commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2561#discussion_r175810746
  
--- Diff: nifi-nar-bundles/nifi-solr-bundle/nifi-solr-processors/src/main/java/org/apache/nifi/processors/solr/SolrUtils.java ---

[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-03-20 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16406498#comment-16406498
 ] 

ASF GitHub Bot commented on NIFI-4035:
--

Github user bbende commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2561#discussion_r175809550
  
--- Diff: nifi-nar-bundles/nifi-solr-bundle/nifi-solr-processors/src/main/java/org/apache/nifi/processors/solr/SolrUtils.java ---

[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-03-20 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16406464#comment-16406464
 ] 

ASF GitHub Bot commented on NIFI-4035:
--

Github user abhinavrohatgi30 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2561#discussion_r175801577
  
--- Diff: nifi-nar-bundles/nifi-solr-bundle/nifi-solr-processors/src/main/java/org/apache/nifi/processors/solr/SolrUtils.java ---

[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-03-20 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16406455#comment-16406455
 ] 

ASF GitHub Bot commented on NIFI-4035:
--

Github user abhinavrohatgi30 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2561#discussion_r175799302
  
--- Diff: nifi-nar-bundles/nifi-solr-bundle/nifi-solr-processors/src/main/java/org/apache/nifi/processors/solr/PutSolrRecord.java ---
@@ -0,0 +1,351 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *   http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied.  See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+package org.apache.nifi.processors.solr;
+
+import org.apache.nifi.annotation.behavior.DynamicProperty;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.ProcessorInitializationContext;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.schema.access.SchemaNotFoundException;
+import org.apache.nifi.serialization.MalformedRecordException;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RecordReaderFactory;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.util.StopWatch;
+import org.apache.nifi.util.StringUtils;
+import org.apache.solr.client.solrj.SolrServerException;
+import org.apache.solr.client.solrj.request.UpdateRequest;
+import org.apache.solr.client.solrj.response.UpdateResponse;
+import org.apache.solr.common.SolrException;
+import org.apache.solr.common.SolrInputDocument;
+import org.apache.solr.common.params.ModifiableSolrParams;
+import org.apache.solr.common.params.MultiMapSolrParams;
+
+import java.io.IOException;
+import java.io.InputStream;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.Iterator;
+import java.util.LinkedList;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.SortedMap;
+import java.util.TreeMap;
+import java.util.concurrent.TimeUnit;
+import java.util.concurrent.atomic.AtomicReference;
+
+import static org.apache.nifi.processors.solr.SolrUtils.BASIC_PASSWORD;
+import static org.apache.nifi.processors.solr.SolrUtils.BASIC_USERNAME;
+import static org.apache.nifi.processors.solr.SolrUtils.COLLECTION;
+import static org.apache.nifi.processors.solr.SolrUtils.JAAS_CLIENT_APP_NAME;
+import static org.apache.nifi.processors.solr.SolrUtils.SOLR_CONNECTION_TIMEOUT;
+import static org.apache.nifi.processors.solr.SolrUtils.SOLR_LOCATION;
+import static org.apache.nifi.processors.solr.SolrUtils.SOLR_MAX_CONNECTIONS;
+import static org.apache.nifi.processors.solr.SolrUtils.SOLR_MAX_CONNECTIONS_PER_HOST;
+import static org.apache.nifi.processors.solr.SolrUtils.SOLR_SOCKET_TIMEOUT;
+import static org.apache.nifi.processors.solr.SolrUtils.SOLR_TYPE;
+import static org.apache.nifi.processors.solr.SolrUtils.SOLR_TYPE_CLOUD;
+import static org.apache.nifi.processors.solr.SolrUtils.SSL_CONTEXT_SERVICE;
+import static org.apache.nifi.processors.solr.SolrUtils.ZK_CLIENT_TIMEOUT;
+import static org.apache.nifi.processors.solr.SolrUtils.ZK_CONNECTION_TIMEOUT;
+import static org.apache.nifi.processors.solr.SolrUtils.writeRecord;
+
+
+@Tags({"Apache", "Solr", "Put", "Send","Record"})

[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-03-20 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16406451#comment-16406451
 ] 

ASF GitHub Bot commented on NIFI-4035:
--

Github user abhinavrohatgi30 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2561#discussion_r175797163
  
--- Diff: 
nifi-nar-bundles/nifi-solr-bundle/nifi-solr-processors/src/test/java/org/apache/nifi/processors/solr/util/MockRecordParser.java
 ---
@@ -0,0 +1,105 @@
+/*
--- End diff --

Sure, I'll look into these tests.


> Implement record-based Solr processors
> --
>
> Key: NIFI-4035
> URL: https://issues.apache.org/jira/browse/NIFI-4035
> Project: Apache NiFi
>  Issue Type: Improvement
>Affects Versions: 1.2.0, 1.3.0
>Reporter: Bryan Bende
>Priority: Minor
>
> Now that we have record readers and writers, we should implement variants of 
> the existing Solr processors that record-based...
> Processors to consider:
> * PutSolrRecord - uses a configured record reader to read an incoming flow 
> file and insert records to Solr
> * GetSolrRecord - extracts records from Solr and uses a configured record 
> writer to write them to a flow file



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-03-20 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16406448#comment-16406448
 ] 

ASF GitHub Bot commented on NIFI-4035:
--

Github user abhinavrohatgi30 commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2561#discussion_r175796878
  
--- Diff: 
nifi-nar-bundles/nifi-solr-bundle/nifi-solr-processors/src/main/java/org/apache/nifi/processors/solr/PutSolrRecord.java
 ---
@@ -0,0 +1,351 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *   http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied.  See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+package org.apache.nifi.processors.solr;
+
+import org.apache.nifi.annotation.behavior.DynamicProperty;
+import org.apache.nifi.annotation.behavior.InputRequirement;
+import org.apache.nifi.annotation.behavior.InputRequirement.Requirement;
+import org.apache.nifi.annotation.documentation.CapabilityDescription;
+import org.apache.nifi.annotation.documentation.Tags;
+import org.apache.nifi.components.PropertyDescriptor;
+import org.apache.nifi.flowfile.FlowFile;
+import org.apache.nifi.logging.ComponentLog;
+import org.apache.nifi.processor.ProcessContext;
+import org.apache.nifi.processor.ProcessSession;
+import org.apache.nifi.processor.ProcessorInitializationContext;
+import org.apache.nifi.processor.Relationship;
+import org.apache.nifi.processor.exception.ProcessException;
+import org.apache.nifi.processor.util.StandardValidators;
+import org.apache.nifi.schema.access.SchemaNotFoundException;
+import org.apache.nifi.serialization.MalformedRecordException;
+import org.apache.nifi.serialization.RecordReader;
+import org.apache.nifi.serialization.RecordReaderFactory;
+import org.apache.nifi.serialization.record.Record;
+import org.apache.nifi.util.StopWatch;
+import org.apache.nifi.util.StringUtils;
+import org.apache.solr.client.solrj.SolrServerException;
+import org.apache.solr.client.solrj.request.UpdateRequest;
+import org.apache.solr.client.solrj.response.UpdateResponse;
+import org.apache.solr.common.SolrException;
+import org.apache.solr.common.SolrInputDocument;
+import org.apache.solr.common.params.ModifiableSolrParams;
+import org.apache.solr.common.params.MultiMapSolrParams;
+
+import java.io.IOException;
+import java.io.InputStream;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.Iterator;
+import java.util.LinkedList;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import java.util.SortedMap;
+import java.util.TreeMap;
+import java.util.concurrent.TimeUnit;
+import java.util.concurrent.atomic.AtomicReference;
+
+import static org.apache.nifi.processors.solr.SolrUtils.BASIC_PASSWORD;
+import static org.apache.nifi.processors.solr.SolrUtils.BASIC_USERNAME;
+import static org.apache.nifi.processors.solr.SolrUtils.COLLECTION;
+import static org.apache.nifi.processors.solr.SolrUtils.JAAS_CLIENT_APP_NAME;
+import static org.apache.nifi.processors.solr.SolrUtils.SOLR_CONNECTION_TIMEOUT;
+import static org.apache.nifi.processors.solr.SolrUtils.SOLR_LOCATION;
+import static org.apache.nifi.processors.solr.SolrUtils.SOLR_MAX_CONNECTIONS;
+import static org.apache.nifi.processors.solr.SolrUtils.SOLR_MAX_CONNECTIONS_PER_HOST;
+import static org.apache.nifi.processors.solr.SolrUtils.SOLR_SOCKET_TIMEOUT;
+import static org.apache.nifi.processors.solr.SolrUtils.SOLR_TYPE;
+import static org.apache.nifi.processors.solr.SolrUtils.SOLR_TYPE_CLOUD;
+import static org.apache.nifi.processors.solr.SolrUtils.SSL_CONTEXT_SERVICE;
+import static org.apache.nifi.processors.solr.SolrUtils.ZK_CLIENT_TIMEOUT;
+import static org.apache.nifi.processors.solr.SolrUtils.ZK_CONNECTION_TIMEOUT;
+import static org.apache.nifi.processors.solr.SolrUtils.writeRecord;
+
+
+@Tags({"Apache", "Solr", "Put", "Send", "Record"})

[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-03-20 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16406332#comment-16406332
 ] 

ASF GitHub Bot commented on NIFI-4035:
--

Github user bbende commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2561#discussion_r175764796
  
--- Diff: nifi-nar-bundles/nifi-solr-bundle/nifi-solr-processors/src/main/java/org/apache/nifi/processors/solr/SolrUtils.java ---
@@ -280,5 +291,115 @@ public SolrInputDocument toSolrInputDocument(SolrDocument d) {
 }
 }
 
+    /**
+     * Writes each Record as a SolrInputDocument.
+     */
+    public static void writeRecord(final Record record, final RecordSchema writeSchema, final SolrInputDocument inputDocument, final List<String> fieldsToIndex)
+            throws IOException {
+        RecordSchema schema = record.getSchema();
+
+        for (int i = 0; i < schema.getFieldCount(); i++) {
+            final RecordField field = schema.getField(i);
+            final String fieldName = field.getFieldName();
+            final Object value = record.getValue(field);
+            if (value == null || (!fieldsToIndex.isEmpty() && !fieldsToIndex.contains(fieldName))) {
+                continue;
+            } else {
+                final DataType dataType = schema.getDataType(fieldName).get();
+                writeValue(inputDocument, value, fieldName, dataType, fieldsToIndex);
+            }
+        }
+    }
+
+    private static void writeValue(final SolrInputDocument inputDocument, final Object value, final String fieldName, final DataType dataType, final List<String> fieldsToIndex) throws IOException {
+        final DataType chosenDataType = dataType.getFieldType() == RecordFieldType.CHOICE ? DataTypeUtils.chooseDataType(value, (ChoiceDataType) dataType) : dataType;
+        final Object coercedValue = DataTypeUtils.convertType(value, chosenDataType, fieldName);
+        if (coercedValue == null) {
+            return;
+        }
+
+        switch (chosenDataType.getFieldType()) {
+            case DATE: {
+                final String stringValue = DataTypeUtils.toString(coercedValue, () -> DataTypeUtils.getDateFormat(RecordFieldType.DATE.getDefaultFormat()));
+                if (DataTypeUtils.isLongTypeCompatible(stringValue)) {
+                    LocalDate localDate = getLocalDateFromEpochTime(fieldName, coercedValue);
+                    inputDocument.addField(fieldName, localDate.format(DateTimeFormatter.ISO_LOCAL_DATE_TIME) + 'Z');
+                } else {
+                    inputDocument.addField(fieldName, LocalDate.parse(stringValue).format(DateTimeFormatter.ISO_LOCAL_DATE_TIME) + 'Z');
+                }
+                break;
+            }
+            case TIMESTAMP: {
+                final String stringValue = DataTypeUtils.toString(coercedValue, () -> DataTypeUtils.getDateFormat(RecordFieldType.TIMESTAMP.getDefaultFormat()));
+                if (DataTypeUtils.isLongTypeCompatible(stringValue)) {
+                    LocalDateTime localDateTime = getLocalDateTimeFromEpochTime(fieldName, coercedValue);
+                    inputDocument.addField(fieldName, localDateTime.format(DateTimeFormatter.ISO_LOCAL_DATE_TIME) + 'Z');
+                } else {
+                    inputDocument.addField(fieldName, LocalDateTime.parse(stringValue).format(DateTimeFormatter.ISO_LOCAL_DATE_TIME) + 'Z');
+                }
+                break;
+            }
+            case DOUBLE:
+                inputDocument.addField(fieldName, DataTypeUtils.toDouble(coercedValue, fieldName));
+                break;
+            case FLOAT:
+                inputDocument.addField(fieldName, DataTypeUtils.toFloat(coercedValue, fieldName));
+                break;
+            case LONG:
+                inputDocument.addField(fieldName, DataTypeUtils.toLong(coercedValue, fieldName));
+                break;
+            case INT:
+            case BYTE:
+            case SHORT:
+                inputDocument.addField(fieldName, DataTypeUtils.toInteger(coercedValue, fieldName));
+                break;
+            case CHAR:
+            case STRING:
+                inputDocument.addField(fieldName, coercedValue.toString());
+                break;
+            case BIGINT:
+                if (coercedValue instanceof Long) {
+                    inputDocument.addField(fieldName, (Long) coercedValue);
+                } else {
+                    inputDocument.addField(fieldName, (BigInteger) coercedValue);
+                }
+                break;
+            case BOOLEAN:
+                final String stringValue = 
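The DATE and TIMESTAMP branches quoted above normalize epoch values into the ISO-8601 UTC form (`yyyy-MM-dd'T'HH:mm:ss'Z'`) that Solr date fields accept. The quoted hunk does not show the `getLocalDateTimeFromEpochTime` helper, so the sketch below is an assumption-laden stand-in: it reads the coerced value as epoch milliseconds at UTC and uses plain `java.time` instead of the NiFi and Solr classes.

```java
import java.time.Instant;
import java.time.LocalDateTime;
import java.time.ZoneOffset;
import java.time.format.DateTimeFormatter;

public class SolrDateFormatSketch {

    // Stand-in for the getLocalDateTimeFromEpochTime helper, which is not
    // shown in the quoted hunk; assumed here to read the coerced value as
    // epoch milliseconds and convert it at UTC.
    static LocalDateTime fromEpochMillis(final long epochMillis) {
        return LocalDateTime.ofInstant(Instant.ofEpochMilli(epochMillis), ZoneOffset.UTC);
    }

    // Mirrors the TIMESTAMP branch: format as ISO_LOCAL_DATE_TIME, then
    // append a literal 'Z' to mark the value as UTC for Solr.
    static String toSolrDateString(final long epochMillis) {
        return fromEpochMillis(epochMillis).format(DateTimeFormatter.ISO_LOCAL_DATE_TIME) + 'Z';
    }

    public static void main(String[] args) {
        System.out.println(toSolrDateString(1000L)); // prints 1970-01-01T00:00:01Z
    }
}
```

Appending the literal 'Z' is only truthful because the conversion is pinned to `ZoneOffset.UTC`; converting at the JVM default zone and then appending 'Z' would mislabel the timestamp.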

[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-03-20 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16406316#comment-16406316
 ] 

ASF GitHub Bot commented on NIFI-4035:
--

Github user bbende commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2561#discussion_r175761850
  
--- Diff: nifi-nar-bundles/nifi-solr-bundle/nifi-solr-processors/src/main/java/org/apache/nifi/processors/solr/PutSolrRecord.java ---

[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-03-20 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16406307#comment-16406307
 ] 

ASF GitHub Bot commented on NIFI-4035:
--

Github user bbende commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2561#discussion_r175760215
  
--- Diff: nifi-nar-bundles/nifi-solr-bundle/nifi-solr-processors/src/main/java/org/apache/nifi/processors/solr/PutSolrRecord.java ---

[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-03-20 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16406259#comment-16406259
 ] 

ASF GitHub Bot commented on NIFI-4035:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2561#discussion_r175743694
  
--- Diff: nifi-nar-bundles/nifi-solr-bundle/nifi-solr-processors/src/main/java/org/apache/nifi/processors/solr/PutSolrRecord.java ---

[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-03-20 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16406258#comment-16406258
 ] 

ASF GitHub Bot commented on NIFI-4035:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2561#discussion_r175746503
  
--- Diff: nifi-nar-bundles/nifi-solr-bundle/nifi-solr-processors/src/test/java/org/apache/nifi/processors/solr/util/MockRecordParser.java ---
@@ -0,0 +1,105 @@
+/*
--- End diff --

`PutMongoRecordIT.testInsertNestedRecords` is another good example.




[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-03-20 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16406260#comment-16406260
 ] 

ASF GitHub Bot commented on NIFI-4035:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2561#discussion_r175744146
  
--- Diff: nifi-nar-bundles/nifi-solr-bundle/nifi-solr-processors/src/main/java/org/apache/nifi/processors/solr/PutSolrRecord.java ---

[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-03-20 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16406261#comment-16406261
 ] 

ASF GitHub Bot commented on NIFI-4035:
--

Github user MikeThomsen commented on a diff in the pull request:

https://github.com/apache/nifi/pull/2561#discussion_r175746168
  
--- Diff: nifi-nar-bundles/nifi-solr-bundle/nifi-solr-processors/src/test/java/org/apache/nifi/processors/solr/util/MockRecordParser.java ---
@@ -0,0 +1,105 @@
+/*
--- End diff --

See `TestPutHBaseRecord` for an example of how to use `org.apache.nifi.serialization.record.MockRecordParser`, which should be able to replace this class.




[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-03-19 Thread Abhinav Rohatgi (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16405450#comment-16405450
 ] 

Abhinav Rohatgi commented on NIFI-4035:
---

Hi, can someone please review the pull request.



[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-03-17 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16403510#comment-16403510
 ] 

ASF GitHub Bot commented on NIFI-4035:
--

Github user abhinavrohatgi30 commented on the issue:

https://github.com/apache/nifi/pull/2561
  
NIFI-4035 Adding a PutSolrRecord Processor that reads NiFi Records and indexes them into Solr as SolrDocuments.




[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2018-03-17 Thread ASF GitHub Bot (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16403509#comment-16403509
 ] 

ASF GitHub Bot commented on NIFI-4035:
--

GitHub user abhinavrohatgi30 opened a pull request:

https://github.com/apache/nifi/pull/2561

NIFI-4035 Implement record-based Solr processors

Thank you for submitting a contribution to Apache NiFi.

In order to streamline the review of the contribution we ask you
to ensure the following steps have been taken:

### For all changes:
- [ ] Is there a JIRA ticket associated with this PR? Is it referenced in the commit message?

- [ ] Does your PR title start with NIFI-XXXX where XXXX is the JIRA number you are trying to resolve? Pay particular attention to the hyphen "-" character.

- [ ] Has your PR been rebased against the latest commit within the target branch (typically master)?

- [ ] Is your initial contribution a single, squashed commit?

### For code changes:
- [ ] Have you ensured that the full suite of tests is executed via mvn -Pcontrib-check clean install at the root nifi folder?
- [ ] Have you written or updated unit tests to verify your changes?
- [ ] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)?
- [ ] If applicable, have you updated the LICENSE file, including the main LICENSE file under nifi-assembly?
- [ ] If applicable, have you updated the NOTICE file, including the main NOTICE file found under nifi-assembly?
- [ ] If adding new Properties, have you added .displayName in addition to .name (programmatic access) for each of the new properties?

### For documentation related changes:
- [ ] Have you ensured that format looks appropriate for the output in which it is rendered?

### Note:
Please ensure that once the PR is submitted, you check travis-ci for build issues and submit an update to your PR as soon as possible.


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/abhinavrohatgi30/nifi nifi-4035

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/nifi/pull/2561.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #2561


commit 4532645294a225b5b3cde6cf59254a3a38ca15f6
Author: abhinavrohatgi30
Date:   2018-03-17T15:35:06Z

Adding PutSolrRecord Processor that reads NiFi records and indexes them into Solr as SolrDocuments

commit 313a95ef59f5fff31c6bd9a032bc4d82de7df2f9
Author: abhinavrohatgi30
Date:   2018-03-17T15:36:04Z

Adding Test Cases for PutSolrRecord Processor

commit 76003a1b1ef5449ee3cbd51b244dc27e946b5ea3
Author: abhinavrohatgi30
Date:   2018-03-17T15:36:58Z

Adding PutSolrRecord Processor in the list of Processors






[jira] [Commented] (NIFI-4035) Implement record-based Solr processors

2017-10-23 Thread Koji Kawamura (JIRA)

[ 
https://issues.apache.org/jira/browse/NIFI-4035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16214884#comment-16214884
 ] 

Koji Kawamura commented on NIFI-4035:
-

NIFI-3248 added the capability to write query results as NiFi records.
