-d will load the whole file into memory, and it also interprets the data as ASCII text, stripping the newlines and carriage returns, which can make the JSON invalid.
Use -T <filename> instead.

B.

On 11 Jun 2012, at 12:29, Mohammad Prabowo wrote:

> Hi. I need to do a bulk-insert of documents in my CouchDB database.
> I'm trying to follow the manual here:
> http://wiki.apache.org/couchdb/HTTP_Bulk_Document_API
>
> Here is my code:
>
> ~$ DB="http://localhost:5984/employees"
> ~$ curl -H "Content-Type:application/json" -d @employees_selfContained.json -vX POST $DB/_bulk_docs
>
> The file employees_selfContained.json is huge: 465 MB. I've
> validated it using JSONLint and found nothing wrong.
> Here's curl's verbose output:
>
> curl -H "Content-Type:application/json" -d @employees_selfContained.json -vX POST $DB/_bulk_docs
> * About to connect() to 127.0.0.1 port 5984 (#0)
> *   Trying 127.0.0.1... connected
> * Connected to 127.0.0.1 (127.0.0.1) port 5984 (#0)
> > POST /employees/_bulk_docs HTTP/1.1
> > User-Agent: curl/7.19.7 (i486-pc-linux-gnu) libcurl/7.19.7 OpenSSL/0.9.8k zlib/1.2.3.3 libidn/1.15
> > Host: 127.0.0.1:5984
> > Accept: */*
> > Content-Type:application/json
> > Content-Length: 439203931
> > Expect: 100-continue
> >
> < HTTP/1.1 100 Continue
> * Empty reply from server
> * Connection #0 to host 127.0.0.1 left intact
> curl: (52) Empty reply from server
> * Closing connection #0
>
> How can I do a bulk-insert from that huge single file? I'd prefer not to
> split the file into smaller pieces if possible.
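For reference, a streaming version of the command above might look like this (untested sketch against a local CouchDB; note that -T implies PUT, so -X POST must still be given for the _bulk_docs endpoint):

```shell
DB="http://localhost:5984/employees"

# -T streams the file from disk instead of buffering all 465 MB in memory,
# and sends the bytes as-is (no newline stripping, unlike -d @file).
curl -vX POST -H "Content-Type: application/json" \
     -T employees_selfContained.json "$DB/_bulk_docs"
```

If streaming isn't required, `--data-binary @employees_selfContained.json` also preserves the file verbatim, but like -d it reads the whole file into memory first.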
