OK, thanks for the hint! I have now set:

[log]
level = debug
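
The same setting can also be changed at runtime through CouchDB's
configuration HTTP API, without a restart. A minimal sketch in Python -
the _local node alias (which newer 2.x releases accept for the local
node), the admin:secret credentials and port 5984 are placeholders, not
my real setup:

import requests

# PUT the new value for [log] level on the local node. CouchDB's config
# API expects the value as a JSON string; the response is the old value.
url = "http://admin:secret@127.0.0.1:5984/_node/_local/_config/log/level"
resp = requests.put(url, json="debug")
resp.raise_for_status()
print("previous value:", resp.json())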


I run a single CouchDB instance.

Yes, this could be a timeout error - maybe Wikipedia stops answering queries
from my server's IP for a while, thinking it is too much. But this is only a guess.
When I re-run the scrape, it works again immediately.
The scrape can hit the 500 error after a few hours, or after nearly a day.
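
Until the cause is clear, I could also just catch the error in my scrape
script and retry. A rough sketch of what I mean - run_scrape, the retry
count and the delay are placeholders of mine, not mwscrape's actual code:

import time
import couchdb

def scrape_with_retry(run_scrape, max_retries=5, delay=60):
    # run_scrape stands for whatever kicks off one mwscrape run.
    for attempt in range(1, max_retries + 1):
        try:
            run_scrape()
            return
        except couchdb.http.ServerError as err:
            # assuming the (status, detail) tuple from the error below
            # is err.args[0]
            status = err.args[0][0] if err.args and isinstance(err.args[0], tuple) else None
            if status != 500:
                raise  # only retry the intermittent 500s
            wait = delay * attempt
            print("got a 500 from CouchDB, retrying in %d s" % wait)
            time.sleep(wait)
    raise RuntimeError("scrape still failing after %d retries" % max_retries)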
Anyway, I have now set the timeout in [couch_httpd_auth] to

timeout = 3600

and will see if anything changes.
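
To check that the new value was actually picked up (it should apply at
runtime, without a restart), I can read it back through the same config
API - again with my placeholder node alias and credentials:

import requests

# GET on a config key returns the current value as a JSON string, e.g. "3600".
url = "http://admin:secret@127.0.0.1:5984/_node/_local/_config/couch_httpd_auth/timeout"
print(requests.get(url).json())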

Thanks!

frank


On 03.05.2020 at 13:04, Ronny Berndt wrote:
> You should increase your log level to debug to get more information.
>
> Do you run a cluster or a single couchdb instance?
> You said that you are getting an error "after a while". Does it always
> happen after roughly the same amount of time - e.g. could it be a timeout?
>
>
>
>
>> On 03.05.2020 at 12:53, Frank Röhm <[email protected]> wrote:
>>
>> Just this:
>>
>> couchdb.http.ServerError: (500, (u'unknown_error', u'undefined'))
>>
>>
>>> On 03.05.2020 at 12:44, Ronny Berndt <[email protected]> wrote:
>>>
>>> Are there any error details in the CouchDB log file?
>>>
>>> Ronny
>>>
>>>> On 03.05.2020 at 11:42, Frank Röhm <[email protected]> wrote:
>>>>
>>>> Hello
>>>>
>>>> Is it possible to downgrade to the old CouchDB 1.6 on an Ubuntu 18.04
>>>> server?
>>>>
>>>> Since CouchDB 2.x I have had problems while filling a db with Wikipedia
>>>> entries. For this I use mwscrape from:
>>>> https://github.com/itkach/mwscrape
>>>> and after a while I always get a 500 error from CouchDB and my scrape
>>>> stops. This never happened with CouchDB 1.6 under Ubuntu 14.04.
>>>>
>>>> So would it be possible to install CouchDB 1.x somehow, maybe by
>>>> compiling it myself, or are there old, now-unsatisfiable dependencies?
>>>> Does anyone know?
>>>>
>>>> Thanks, frank
>>>>
>>
>
