Hi helix84,

So, it seems like there are two possible routes to take here:

1. An event consumer writes directly to Solr. The "persistent store" is 
then simply a dump from Solr to CSV. (See the rough sketch after this 
list.)

2. An event consumer writes directly to CSV. Solr then indexes those CSVs.
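
Just to make #1 concrete, here's roughly what I'm imagining for the 
consumer piece. This is only a sketch: it assumes the existing 
org.dspace.event.Consumer interface and a SolrJ client, and the core 
URL and field names are made up.

import org.apache.solr.client.solrj.SolrServer;
import org.apache.solr.client.solrj.impl.HttpSolrServer;
import org.apache.solr.common.SolrInputDocument;
import org.dspace.core.Context;
import org.dspace.event.Consumer;
import org.dspace.event.Event;

public class SolrEventConsumer implements Consumer
{
    private SolrServer solr;

    public void initialize() throws Exception
    {
        // Placeholder core URL; in practice this would come from configuration
        solr = new HttpSolrServer("http://localhost:8080/solr/events");
    }

    public void consume(Context ctx, Event event) throws Exception
    {
        // Flatten the DSpace event into a simple Solr document
        SolrInputDocument doc = new SolrInputDocument();
        doc.addField("eventType", event.getEventType());
        doc.addField("subjectType", event.getSubjectType());
        doc.addField("subjectId", event.getSubjectID());
        // Wall-clock time is good enough for the sketch
        doc.addField("timestamp", System.currentTimeMillis());
        solr.add(doc);
    }

    public void end(Context ctx) throws Exception
    {
        // Commit once per event batch rather than per event
        solr.commit();
    }

    public void finish(Context ctx) throws Exception
    {
        // Nothing to clean up in this sketch
    }
}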

So, my question is whether #2 is really necessary for this data, or 
whether #1 is good enough. To me, either way you are getting the data 
into CSV, where it can be backed up and easily reindexed as needed. The 
only question seems to be whether that CSV is generated by Solr or 
directly by the DSpace API itself (in some custom format). Do we really 
need our own custom CSV format for this data, or can we use Solr's CSV 
format?
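
For what it's worth, the "dump from Solr to CSV" piece in #1 could just 
lean on Solr's built-in CSV response writer (wt=csv), so we wouldn't be 
inventing a format at all. Rough sketch only; the URL, core name and 
output path below are placeholders.

import java.io.InputStream;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

public class SolrCsvDump
{
    public static void main(String[] args) throws Exception
    {
        // Ask Solr for everything in the core, formatted by its CSV writer
        String query = "http://localhost:8080/solr/events/select"
                + "?q=*:*&rows=1000000&wt=csv";

        try (InputStream in = new URL(query).openStream())
        {
            // This file becomes the backup / "persistent store" copy
            Files.copy(in, Paths.get("/dspace/exports/events.csv"),
                    StandardCopyOption.REPLACE_EXISTING);
        }
    }
}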

- Tim

On 3/25/2015 9:24 AM, helix84 wrote:
> Hi Tim,
>
> CSV export may be adequate for backup, but one important thing
> suggested here was an event consumer that would write to a
> persistent store (which could be CSV files). We currently don't have
> a persistent store.
>
>
> Regards,
> ~~helix84
>
> Compulsory reading: DSpace Mailing List Etiquette
> https://wiki.duraspace.org/display/DSPACE/Mailing+List+Etiquette
>
