: [Dspace-tech] Ingesting large data set
My 2p worth:
1. What does the 150 GB consist of: one data set, or multiple data sets
(that may be related, e.g. by time or geographic location)?
2. How would someone use this data set? At 150 GB I would assume (hope)
offline processing.
3. There are ways
To: dspace-tech@lists.sourceforge.net
Subject: [Dspace-tech] Ingesting large data set
I apologize if a similar question has been answered in a prior thread.
We have a student needing to submit a 150 GB data set into DSpace. Is this even
possible? Are there any tips or workarounds I should try?
Cheers,
Bill
some reason and overflows
the heap.
Here goes.
Cheers,
Bill
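The heap overflow mentioned above typically surfaces as an OutOfMemoryError during a large batch import; a common workaround is to raise the heap for the CLI before running ItemImport. That the dspace launcher honors JAVA_OPTS, and the 2 GB value, are assumptions to check against your own install:

```shell
# Sketch: give the DSpace command-line tools a larger Java heap so a big
# batch import does not exhaust memory. Whether [dspace]/bin/dspace reads
# JAVA_OPTS depends on your version -- verify in the launcher script.
export JAVA_OPTS="-Xmx2048m -Dfile.encoding=UTF-8"
echo "$JAVA_OPTS"
# Then run the import with the enlarged heap, e.g.:
#   /dspace/bin/dspace import --add ...
```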
> -Original Message-
> From: Pottinger, Hardy J. [mailto:pottinge...@umsystem.edu]
> Sent: Thursday, August 30, 2012 12:31 PM
> To: Pottinger, Hardy J.; Ingram, William A; dspace-tech@lists.sourceforge.net
> Subject: [Dspace-tech] Ingesting large data set
On Thu, Aug 30, 2012 at 01:17:03PM -0400, Ryan Scherle wrote:
[snip]
> Even before you near 2GB, it is likely that something in your system will
> reject the upload. You must ensure that all of the pieces of your
> installation are configured correctly. This includes:
> * Apache -- LimitRequestBody
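As an illustration of the Apache piece, a hedged httpd config fragment (the path and value are examples, not from the thread; 0 disables the check, and the directive's maximum is 2147483647 bytes, just under 2 GB):

```apacheconf
# Raise Apache's request-body cap for the DSpace submission path.
# Location is illustrative -- match it to your own UI's upload URL.
<Location "/jspui/submit">
    LimitRequestBody 2147483647   # directive maximum (~2 GB); 0 = no limit
</Location>
```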
This may be just me hijacking the thread, so, apologies up front, but I
followed a link [1] on the Code4Lib mailing list just now, and came across
Miso Dataset [2], which looks very cool, indeed.
[1] http://selection.datavisualization.ch/
[2] http://misoproject.com/dataset/
--
HARDY POTTINGER
Univer
On Thu, Aug 30, 2012 at 05:03:02PM +0000, Richard Rodgers wrote:
> Yes, as has been remarked, the bigger questions revolve around access and
> usage, rather than ingest.
> We recently did a pilot with large video files where we ingested them as
> preservation masters (via ItemImport), suppressed
[snip]
Yes, as has been remarked, the bigger questions revolve around access and
usage, rather than ingest.
We recently did a pilot with large video files where we ingested them as
preservation masters (via ItemImport), suppressed the
download link, but offered in its place a link to a much smaller tr
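Richard's ItemImport route can be sketched as a Simple Archive Format (SAF) package plus one import command; the metadata values, file names, collection handle, and e-person below are all hypothetical placeholders:

```shell
# Sketch: build a minimal SAF package for one item, then batch-import it
# with ItemImport. Everything here is illustrative, not from the thread.
mkdir -p saf/item_000
cat > saf/item_000/dublin_core.xml <<'EOF'
<dublin_core>
  <dcvalue element="title" qualifier="none">150 GB sensor data set</dcvalue>
  <dcvalue element="contributor" qualifier="author">Student, A.</dcvalue>
</dublin_core>
EOF
# The contents file names one bitstream per line.
echo "dataset.tar" > saf/item_000/contents
touch saf/item_000/dataset.tar   # stand-in for the real 150 GB file
ls saf/item_000
# On the server (after the data has been moved over out-of-band):
#   /dspace/bin/dspace import --add --eperson=admin@example.edu \
#       --collection=123456789/42 --source=saf --mapfile=import.map
```

Keeping the mapfile around matters: it is what `dspace import` later needs to replace or delete the batch.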
There's also the "registration" method: put the file into the
assetstore space by some other means and then just tell DSpace "it's
there, and here are the metadata". No further copying required.
I suppose you could even carry your 2TB file in on a hot-plug disk
drive, push it into an empty slot,
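The registration method described above can be sketched as a SAF contents file whose line uses DSpace's bitstream-registration syntax (`-r`, `-s` assetstore number, `-f` path relative to the assetstore); the assetstore number and path here are invented for illustration:

```shell
# Sketch: register a bitstream that already sits in the assetstore,
# so ItemImport records it without copying 150 GB again.
mkdir -p saf-reg/item_000
printf -- '-r -s 0 -f bigdata/dataset.tar\n' > saf-reg/item_000/contents
cat saf-reg/item_000/contents
```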
We are just setting up a data repository and will probably soon be
facing similar challenges. This also has some relationship to longer
videos and the like.
--
Mark H. Wood, Lead System Programmer mw...@iupui.edu
Asking whether markets are efficient is like asking whether people are smart.
Hi, Bill, the theoretical limit for posting data via HTTP is 1.8 GB [1].
Your only recourse for storing this particular data set in DSpace is to
transfer it to the server via FTP, SFTP, or SCP, and then either batch load
or run the item update script [2]. However, my main question is: once it
is *in
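Hardy's transfer-then-batch-load route, sketched with a checksum to catch corruption in transit; the host and paths are hypothetical, and the scp/ssh steps are commented out since they need the real server:

```shell
# Sketch: stage the data set server-side out-of-band, verifying integrity,
# then batch-load it there. Names below are illustrative.
printf 'demo bytes\n' > dataset.tar            # stand-in for the 150 GB archive
sha256sum dataset.tar > dataset.tar.sha256     # checksum before transfer
# scp dataset.tar dataset.tar.sha256 dspace@repo.example.edu:/data/incoming/
# ssh dspace@repo.example.edu \
#     'cd /data/incoming && sha256sum -c dataset.tar.sha256'
sha256sum -c dataset.tar.sha256                # verify (here: locally)
# prints: dataset.tar: OK
```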