Hello Jan
Did you see anything?
Can you tell me anything about the error?
Does anybody know what these errors mean?
http://7fw.de/tmp/couch.log
I changed the log level in the config from info to debug, and this is
the resulting output:
http://7fw.de/tmp/couch_debug.log
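For reference, the info→debug switch mentioned above corresponds to this fragment in CouchDB 1.x's local.ini (section and key are the stock ones):

```ini
[log]
level = debug
```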
But still I don't
On 16.03.2015 at 10:26, Jan Lehnardt wrote:
... Could you share the first megabyte of the file somewhere?
I made a copy of the first 1 MB with:
dd if=./de-m-wikipedia-org.couch of=./dewiki_1mb.couch bs=1MB count=1
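To sanity-check a cut like that, you can verify the copy really is 1 MB. A small sketch using a scratch file (the real target above is de-m-wikipedia-org.couch; the /tmp names here are hypothetical stand-ins):

```shell
# Scratch file standing in for the big .couch file (hypothetical name)
dd if=/dev/zero of=/tmp/big.couch bs=1MB count=3 2>/dev/null

# Same cut as above: first 1 MB (bs=1MB is 1,000,000 bytes in GNU dd)
dd if=/tmp/big.couch of=/tmp/first_1mb.couch bs=1MB count=1 2>/dev/null

# The copy should be exactly 1,000,000 bytes
wc -c < /tmp/first_1mb.couch
```

Note that GNU dd distinguishes `bs=1MB` (1,000,000 bytes) from `bs=1M` (1,048,576 bytes); either works here as long as you say which one you used.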
here is the file:
http://7fw.de/tmp/dewiki_1mb.couch
What filesystem is this on?
tl;dr: this doesn’t look good, and shouldn’t happen from what CouchDB is doing
to the file. Maybe something external messed with the DB file.
Could you share the first megabyte of the file somewhere?
What filesystem is this on?
Did the machine reboot at any time between the creation of the
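One way to answer the filesystem question above (a sketch; `df -T` reports the filesystem type for the volume holding a path, and /var/lib/couchdb is the stock database directory on Ubuntu — adjust if yours differs):

```shell
# Report the filesystem type for the directory holding the .couch files
# (falls back to the current directory if the stock path doesn't exist)
df -T /var/lib/couchdb 2>/dev/null || df -T .
```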
Hello
On my Ubuntu 12.04 LTS server I have installed CouchDB version 1.6.1-0ubuntu1.
$ couchdb -V
Apache CouchDB 1.6.1
I have “scraped” the German wiki into this CouchDB and now have an
uncompressed 44 GB database “de-m-wikipedia-org”, which I am trying to compact with:
$ curl -H Content-Type:
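The compaction call is cut off above; for CouchDB 1.x the request is a POST to /{db}/_compact with a JSON Content-Type. A sketch assuming the stock port 5984 and no auth (the server is in Admin Party):

```shell
COUCH=http://127.0.0.1:5984     # assumed host/port
DB=de-m-wikipedia-org

# POST /{db}/_compact requires the application/json Content-Type header
OUT=$(curl -s -H 'Content-Type: application/json' -X POST "$COUCH/$DB/_compact" \
      2>/dev/null || echo '{"error":"no server reachable"}')
echo "$OUT"
```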
Just a guess, but how much free space do you have on that Ubuntu server?
Could it be that during compaction you are running out of disk space?
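Checking free space is quick (a sketch; the path is wherever the .couch files live, /var/lib/couchdb on stock Ubuntu):

```shell
# Human-readable free space on the volume holding the databases
# (falls back to the current directory if the stock path doesn't exist)
df -h /var/lib/couchdb 2>/dev/null || df -h .

# Note: compaction writes a full new copy (<db>.couch.compact) next to
# the old file, so you need roughly the compacted size in free space.
```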
On Sat, Mar 14, 2015 at 3:51 AM, Frank Röhm francwal...@gmx.net wrote:
Hello
On my Ubuntu 12.04 LTS server I have installed couchdb version
That's not a good sign, imho. Have you tried to replicate it to another
db and swap them afterwards?
--
,,,^..^,,,
On Sat, Mar 14, 2015 at 2:19 PM, Frank Röhm francwal...@gmx.net wrote:
On 14.03.2015 at 09:15, Diego Medina wrote:
Just a guess, but how much free space do you have on that Ubuntu
On 14.03.2015 at 12:41, Alexander Shorin wrote:
That's not a good sign, imho. Have you tried to replicate it to another
db and swap them afterwards?
No, I will try this now; thank you for this recommendation.
That's a pity. Do you have a way to share it somehow?
--
,,,^..^,,,
On Sat, Mar 14, 2015 at 4:03 PM, Frank Röhm francwal...@gmx.net wrote:
On 14.03.2015 at 12:41, Alexander Shorin wrote:
That's not a good sign, imho. Have you tried to replicate it to another
db and swap them afterwards?
Neither
Can you read individual documents from this couch database?
If you can, then you don't really need to fetch the data from Wikipedia
again; you can just crawl your own database, and maybe run compaction as
you load the new database instead of waiting until the end.
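A quick way to test whether documents are still readable (a sketch; host, port, and database name assumed as above — `_all_docs?limit=1` lists a single id without needing to know any document id up front):

```shell
COUCH=http://127.0.0.1:5984     # assumed host/port
DB=de-m-wikipedia-org

# List one document id; if this succeeds, individual GETs should too
OUT=$(curl -s "$COUCH/$DB/_all_docs?limit=1" 2>/dev/null \
      || echo '{"error":"no server reachable"}')
echo "$OUT"
```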
On Sat, Mar 14, 2015 at
On 14.03.2015 at 14:18, Alexander Shorin wrote:
That's a pity. Do you have a way to share it somehow?
No, unfortunately that's not possible; it is still Admin Party and I tunnel
it through SSH, because it is on a production server (not many customers,
though).
On 14.03.2015 at 09:15, Diego Medina wrote:
Just a guess, but how much free space do you have on that Ubuntu server?
Could it be that during compaction you are running out of disk space?
I _knew_ I forgot to mention something in my question :)
Nope, there is a lot of free space. 411 GB at the
On 14.03.2015 at 12:41, Alexander Shorin wrote:
That's not a good sign, imho. Have you tried to replicate it to another
db and swap them afterwards?
This doesn't work either; I get errors as well.
I created a new database de-m-wikipedia-org-new and tried to replicate with:
curl -H 'Content-Type:
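The replication call is cut off above; for reference, a local replication in CouchDB 1.x is a POST to /_replicate with source and target in the body (a sketch; host and port assumed as before):

```shell
COUCH=http://127.0.0.1:5984     # assumed host/port

# Replicate the damaged db into the freshly created one
OUT=$(curl -s -H 'Content-Type: application/json' -X POST "$COUCH/_replicate" \
      -d '{"source":"de-m-wikipedia-org","target":"de-m-wikipedia-org-new"}' \
      2>/dev/null || echo '{"error":"no server reachable"}')
echo "$OUT"
```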
On 14.03.2015 at 14:20, Diego Medina wrote:
Can you read individual documents from this couch database?
I don't see any way to do this with the script I have (mwscrape).
But I found a download link for a complete and surely working CouchDB
database for dewiki on copy.com, so I will try