Re: Content Repo Large.. Archive in there?

2015-10-23 Thread Ryan H
We don't. We've talked about putting it on a separate partition; we will in the future.

On Fri, Oct 23, 2015 at 3:46 PM, Mark Payne wrote:
> So digging in a bit more, the issue that I was concerned about is not really an issue, as the data is still cleaned up elsewhere in the code if the FlowFile repo is

Re: Content Repo Large.. Archive in there?

2015-10-23 Thread Ryan H
Yeah, they're all on the same partition. (We know, not good...) I grepped and haven't found any OutOfMem errors...

Ryan

On Fri, Oct 23, 2015 at 3:15 PM, Mark Payne wrote:
> This is indicating that it's unable to write to the FileSystemRepository - presumably because it is out of disk space.

Re: Content Repo Large.. Archive in there?

2015-10-23 Thread Mark Payne
This indicates that it's unable to write to the FileSystemRepository, presumably because it is out of disk space. Are there any other error messages that you can find? I'm wondering specifically whether you received an OutOfMemoryError or anything of that nature. If it's not cleaning up after itself,

Re: Content Repo Large.. Archive in there?

2015-10-23 Thread Ryan H
Random mentions of FileSystemRepo in nifi-app_2015-10-23_10.0.log:

org.apache.nifi.processor.exception.FlowFileAccessException: Failed to import data from java.io.ByteArrayInputStream@639fe470 for StandardFlowFileRecord[uuid=881506e8-c62f-442f-a564-88675e4f0372,claim=,offset=0,name=914750081936080,si

Re: Content Repo Large.. Archive in there?

2015-10-23 Thread Ryan H
Here are our provenance settings:

# Persistent Provenance Repository Properties
nifi.provenance.repository.directory.default=./provenance_repository
nifi.provenance.repository.max.storage.time=24 hours
nifi.provenance.repository.max.storage.size=1 GB
nifi.provenance.repository.rollover.time=30 secs
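
A quick way to confirm which repository is actually consuming the space, assuming the default repository directory names used in this thread (a rough sketch; adjust the paths to your install):

    du -sh ./content_repository ./provenance_repository ./flowfile_repository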

Re: Content Repo Large.. Archive in there?

2015-10-23 Thread Elli Schwarz
We had a max storage size of 1 GB, but that's for the provenance repo; our problem was with content_repo. Our disk was 60 GB, all on one partition, and 55 GB were taken up by content_repo. Now it only contains 233 MB.

On Friday, October 23, 2015 2:50 PM, Mark Payne wrote:
> OK, so this

Re: Content Repo Large.. Archive in there?

2015-10-23 Thread Ryan H
I've got this one... let me look for that:

2015-10-23 09:00:33,625 WARN [Provenance Maintenance Thread-1] o.a.n.p.PersistentProvenanceRepository
java.io.IOException: No space left on device
        at java.io.FileOutputStream.writeBytes(Native Method) ~[na:1.8.0_51]
        at java.io.FileOutputStr

Re: Content Repo Large.. Archive in there?

2015-10-23 Thread Mark Payne
Ryan, Elli,

Do you by chance have any error messages in your logs from the FileSystemRepository? I.e., if you perform:

    grep FileSystemRepository logs/*

do you get anything interesting in there?

Thanks
-Mark

> On Oct 23, 2015, at 2:38 PM, Elli Schwarz wrote:
>
> I've been working with R
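
A slightly broader version of that check, assuming the standard logs/ directory, which also looks for the memory and disk errors mentioned elsewhere in this thread:

    grep FileSystemRepository logs/*
    grep -i OutOfMemoryError logs/*
    grep "No space left on device" logs/*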

Re: Content Repo Large.. Archive in there?

2015-10-23 Thread Elli Schwarz
I've been working with Ryan. There appear to be a few issues here:

- We upgraded from 0.2.0 to 0.3.0, and it appears that the content_repository archive setting is now true by default. In 0.2.0 it was false, and the documentation still states it is false by default.
- When we ran out of disk space o
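
Regarding the first point: if the pre-upgrade behavior is wanted back, a minimal sketch of the relevant line to set explicitly in conf/nifi.properties (property name as quoted elsewhere in this thread):

    # disable content archiving explicitly rather than relying on the default,
    # which appears to have changed between 0.2.0 and 0.3.0
    nifi.content.repository.archive.enabled=false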

Re: Content Repo Large.. Archive in there?

2015-10-23 Thread Ryan H
Agree, they concern the archive... although it sounds like there are 2 archives? Within the content_repository folder, there are subfolders with the name 'archive' and files inside them. Example:

    ./nfii/content_repository/837/archive/1445611320767-837

Settings: nifi.content.repository.archive.ma
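
To see how much of the content repo is archive versus live content, a hedged sketch using the path shown above:

    # total size of just the archive subdirectories
    du -ch ./nfii/content_repository/*/archive | tail -1
    # total size of the whole content repository, for comparison
    du -sh ./nfii/content_repository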

Re: Content Repo Large.. Archive in there?

2015-10-23 Thread Aldrin Piri
Ryan,

Those items only concern the archive. Did you have data enqueued in connections in your flow? If so, that data is not eligible for archive cleanup and could explain why your disk was filled. Otherwise, can you please provide some additional information so we can dig into why this may have arisen? Thanks!

Re: Content Repo Large.. Archive in there?

2015-10-23 Thread Ryan H
I've got the following set:

nifi.content.repository.archive.max.retention.period=12 hours
nifi.content.repository.archive.max.usage.percentage=50%
nifi.content.repository.archive.enabled=true

Yet, the content repo filled my disk last night...

On Fri, Oct 23, 2015 at 1:16 PM, Aldrin Piri wrote:
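
For rough sizing of that 50% setting (assuming max.usage.percentage is measured against the partition holding the repo): 0.50 x 60 GB = 30 GB on the 60 GB partition mentioned elsewhere in this thread, which is roughly where archived content should start being purged. Data still queued in connections is not archived, so that limit does not reclaim it.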

Re: Content Repo Large.. Archive in there?

2015-10-23 Thread Aldrin Piri
Ryan,

Those archive folders map to the nifi.content.repository.archive.enabled property. This property retains files that are no longer in the system, providing historical context for your flow's processing and the ability to view that content in conjunction with provenance events, as well as a

Re: Content Repo Large.. Archive in there?

2015-10-23 Thread Ryan H
Interesting... So what would ./nfii/content_repository/837/archive/1445611320767-837 typically be?

On Fri, Oct 23, 2015 at 12:56 PM, Andrew Grande wrote:
> Attachments don't go through, view at imagebin:
> http://ibin.co/2K3SwR0z8yWX

Re: Content Repo Large.. Archive in there?

2015-10-23 Thread Andrew Grande
Attachments don't go through, view at imagebin: http://ibin.co/2K3SwR0z8yWX

On 10/23/15, 12:52 PM, "Andrew Grande" wrote:
> Ryan,
>
> ./conf/archive is to create a snapshot of your entire flow, not the content repository data. See the attached screenshot (Settings menu on the right).
>
> Andre

Re: Content Repo Large.. Archive in there?

2015-10-23 Thread Andrew Grande
Ryan,

./conf/archive is to create a snapshot of your entire flow, not the content repository data. See the attached screenshot (Settings menu on the right).

Andrew

On 10/23/15, 12:47 PM, "ryan.andrew.hendrick...@gmail.com on behalf of Ryan H" wrote:
> Hi,
> I'm noticing my Content Repo

Content Repo Large.. Archive in there?

2015-10-23 Thread Ryan H
Hi,

I'm noticing my Content Repo growing large. There are a number of files like:

    content_repo/837/archive/144...-837

Is this new in 0.3.0? My conf file says any archiving should be going into ./conf/archive, but I don't see anything in there.

Thanks,
Ryan
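
One way to double-check which archive-related settings are actually in effect, assuming the standard conf/nifi.properties location:

    grep archive conf/nifi.properties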