bd808 added a comment.
Likely one of:

- the file is not readable in the VM for some reason (an NFS/VirtualBox failure?)
- PHP in the VM does not have bzip2 support (a configuration problem?)
- the archive is corrupt
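A quick way to test the three hypotheses above from a shell inside the VM (a sketch; the dump filename is the one mentioned in this task, adjust the path to wherever your copy lives):

```shell
DUMP=wikidatawiki-20171220-pages-articles19.xml-p19072452p19140743.bz2

# 1. Readable at all? A valid bzip2 file starts with the magic bytes "BZh".
head -c 3 "$DUMP"

# 2. Does this PHP build register the bzip2 stream wrapper?
php -r 'var_dump(in_array("compress.bzip2", stream_get_wrappers()));'

# 3. Is the archive itself intact?
bzip2 -t "$DUMP" && echo "archive OK"
```

If check 2 prints `bool(false)`, installing/enabling the PHP bz2 extension (or decompressing first) is the way forward; if check 3 fails, the download is corrupt.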
The next workaround I would try is decompressing the dump before importing.
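A minimal sketch of that workaround, assuming the dump file from this task sits in the current directory: decompress it first (`-k` keeps the original `.bz2`), then point importDump.php at the plain XML so PHP's bzip2 stream wrapper is never involved.

```shell
# Decompress first; -k keeps the .bz2 around in case the import fails.
bzip2 -dk wikidatawiki-20171220-pages-articles19.xml-p19072452p19140743.bz2

# Then import the plain XML (run inside the vagrant box).
mwscript importDump.php --wiki=wikidatawiki --uploads \
    wikidatawiki-20171220-pages-articles19.xml-p19072452p19140743.xml
```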
Hjfocs added a comment.
mwscript now seems to run inside the vagrant box:
mwscript importDump.php --wiki=wikidatawiki --uploads wikidatawiki-20171220-pages-articles19.xml-p19072452p19140743.bz2
but complains:
PHP Warning: fopen(compress.bzip2://wikidatawiki-20171220-pages-articles19.xml-p190724
Hjfocs added a comment.
Yeah, I was wondering the same thing. Will try out your suggestions. Thanks again!

TASK DETAIL
https://phabricator.wikimedia.org/T183274
bd808 added a comment.
On a local test server:
$ cd $MY_MWVAGRANT_CHECKOUT
$ ls -l Wikitech-2018055417.xml
-rw-r--r--@ 1 bd808 staff 6.8K Jan 11 08:54 Wikitech-2018055417.xml
$ vagrant import-dump Wikitech-2018055417.xml
Done!
You might want to run rebuildrecentchanges.php to regenerate RecentChanges.
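The script's hint can be followed with mwscript from the host checkout, for example (a sketch; `wiki` is MediaWiki-Vagrant's default wiki name, adjust it to match the wiki you imported into):

```shell
# Regenerate RecentChanges for the freshly imported pages.
vagrant ssh -c 'mwscript rebuildrecentchanges.php --wiki=wiki'
```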
Hjfocs added a comment.
@bd808, thanks for the comment. I tried the commands you suggested; now I'm getting a nicer:

bash: import-mediawiki-dump: command not found
bd808 added a comment.
sudo: /usr/bin/sudo must be owned by uid 0 and have the setuid bit set should never happen. This is pretty obviously a corrupted LXC container. If you run vagrant destroy -f && vagrant up && vagrant import-dump ... can you recreate this failure?