[Dspace-tech] Ingesting large data set

2012-08-30 Thread Ingram , William A
I apologize if a similar question has been answered in a prior thread. We have a student needing to submit a 150 GB data set into DSpace. Is this even possible? Are there any tips or workarounds I should try? Cheers, Bill

Re: [Dspace-tech] Ingesting large data set

2012-08-30 Thread Pottinger, Hardy J.
Hi, Bill. The theoretical limit for posting data via HTTP is 1.8 GB [1]. Your only recourse for storing this particular data set in DSpace is to transfer it to the server via FTP, SFTP, or SCP, and then either batch load it or run the item update script [2]. However, my main question is: once it is
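A minimal sketch of the copy-then-batch-load route Hardy describes, assuming the data set has already been packaged as a DSpace Simple Archive Format directory; the host, eperson, collection handle, and paths are placeholders, not values from the thread:

    # copy the data to the DSpace server out-of-band
    scp -r dataset-saf/ dspace@repository.example.edu:/dspace/import/dataset-saf/

    # then run the batch importer against the Simple Archive directory
    [dspace]/bin/dspace import -a \
        -e admin@example.edu \
        -c 123456789/42 \
        -s /dspace/import/dataset-saf \
        -m /dspace/import/dataset.map

The map file records what was imported, which makes it easier to re-run or undo a large load.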

Re: [Dspace-tech] Ingesting large data set

2012-08-30 Thread George S Kozak
[quotes Bill Ingram's original question]

Re: [Dspace-tech] Ingesting large data set

2012-08-30 Thread Mark H. Wood
We are just setting up a data repository and will probably soon be facing similar challenges. This also has some relationship to longer videos and the like. -- Mark H. Wood, Lead System Programmer mw...@iupui.edu Asking whether markets are efficient is like asking whether people are smart.

Re: [Dspace-tech] Ingesting large data set

2012-08-30 Thread Mark H. Wood
There's also the registration method: put the file into the assetstore space by some other means and then just tell DSpace it's there, and here are the metadata. No further copying required. I suppose you could even carry your 2TB file in on a hot-plug disk drive, push it into an empty slot,
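A sketch of how the registration route can look with the item importer, under the assumption that the large file already sits in an assetstore directory DSpace can read; the assetstore number, file name, handle, and paths are placeholders:

    # contents file inside the Simple Archive item directory:
    # -r registers (rather than copies) a bitstream already present in assetstore 1
    -r -s 1 -f bigdata/dataset-150gb.tar

    # then import the item as usual; only metadata is written, the file stays where it is
    [dspace]/bin/dspace import -a -e admin@example.edu -c 123456789/42 \
        -s /dspace/import/registered-saf -m /dspace/import/registered.map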

Re: [Dspace-tech] Ingesting large data set

2012-08-30 Thread Richard Rodgers
Yes, as has been remarked, the bigger questions revolve around access and usage rather than ingest. We recently did a pilot with large video files where we ingested them as preservation masters (via ItemImport), suppressed the download link, but offered in its place a link to a much smaller

Re: [Dspace-tech] Ingesting large data set

2012-08-30 Thread Ryan Scherle
[quotes Bill Ingram's original question]

Re: [Dspace-tech] Ingesting large data set

2012-08-30 Thread Mark H. Wood
On Thu, Aug 30, 2012 at 05:03:02PM +0000, Richard Rodgers wrote: Yes, as has been remarked, the bigger questions revolve around access and usage rather than ingest. We recently did a pilot with large video files where we ingested them as preservation masters (via ItemImport), suppressed

Re: [Dspace-tech] Ingesting large data set

2012-08-30 Thread Pottinger, Hardy J.
This may be just me hijacking the thread, so, apologies up front, but I followed a link [1] on the Code4Lib mailing list just now and came across Miso Dataset [2], which looks very cool, indeed. [1] http://selection.datavisualization.ch/ [2] http://misoproject.com/dataset/ -- HARDY POTTINGER

Re: [Dspace-tech] Ingesting large data set

2012-08-30 Thread Mark H. Wood
On Thu, Aug 30, 2012 at 01:17:03PM -0400, Ryan Scherle wrote: [snip] Even before you near 2GB, it is likely that something in your system will reject the upload. You must ensure that all of the pieces of your installation are configured correctly. This includes: * Apache -- LimitRequestBody
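A sketch of the kinds of settings Ryan's checklist points at; the values below are illustrative only, and exact directive names and defaults vary with the Apache, Tomcat, and DSpace versions in use:

    # Apache httpd.conf -- 0 removes the request-body cap
    LimitRequestBody 0

    # Tomcat server.xml, on the Connector -- a negative value disables the POST size limit
    #   <Connector port="8080" protocol="HTTP/1.1" maxPostSize="-1" ... />

    # [dspace]/config/dspace.cfg -- maximum upload size (bytes) accepted by the web UI
    upload.max = 161061273600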

Re: [Dspace-tech] Ingesting large data set

2012-08-30 Thread Ingram, William A
the heap. Here goes. Cheers, Bill
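Bill's mention of the heap presumably refers to giving the command-line import a larger Java heap; one common way to do that, assuming the stock [dspace]/bin/dspace launcher (which honors JAVA_OPTS) and an illustrative 4 GB value, is:

    # raise the heap available to the DSpace CLI before running the import
    export JAVA_OPTS="-Xmx4096m -Dfile.encoding=UTF-8"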

Re: [Dspace-tech] Ingesting large data set

2012-08-30 Thread Benjamin Ryan
[quotes Bill Ingram's original question]

Re: [Dspace-tech] Ingesting large data set

2012-08-30 Thread Ingram, William A
My 2p worth: 1. What does the 150 GB consist of – one data set, or multiple data sets (that may be related, e.g. by time or geographic location)? 2. How would someone use this data set – at 150 GB I would assume (hope) offline processing. 3. There are ways