[Bug 42095] importDump.php crashes with out of memory error

2014-02-20 Thread bugzilla-daemon
https://bugzilla.wikimedia.org/show_bug.cgi?id=42095

--- Comment #9 from Isarra  ---
Well, there's this: http://dump.zaori.org/20121114_uncy_en_pages_full.xml.gz 

That's the file I was trying to import when I originally filed this bug, I
believe, though it's probably not the best thing to test on due to its being
enormous.

-- 
You are receiving this mail because:
You are the assignee for the bug.
You are on the CC list for the bug.
___
Wikibugs-l mailing list
Wikibugs-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikibugs-l


[Bug 42095] importDump.php crashes with out of memory error

2014-02-20 Thread bugzilla-daemon
https://bugzilla.wikimedia.org/show_bug.cgi?id=42095

--- Comment #8 from physikerwelt  ---
(In reply to Isarra from comment #7)
> I think I recall it working with the --no-updates flag since then as well.
> So if this is still broken, the bug may just be in how it handles updates.
> 
> If this is the case, maybe just having it always run without updates would
> be in order - then have the option to run the appropriate scripts when it's
> done or something.

Can you give me a pointer to the dataset? I'd like to test how much memory it
needs. Maybe 500M is just not enough for a complex nested structure. I would
tend to add a note about that to the man page rather than change the code, but
that's just a first guess.


[Bug 42095] importDump.php crashes with out of memory error

2014-02-20 Thread bugzilla-daemon
https://bugzilla.wikimedia.org/show_bug.cgi?id=42095

--- Comment #7 from Isarra  ---
I think I recall it working with the --no-updates flag since then as well. So
if this is still broken, the bug may just be in how it handles updates.

If this is the case, maybe just having it always run without updates would be
in order - then have the option to run the appropriate scripts when it's done
or something.


[Bug 42095] importDump.php crashes with out of memory error

2014-02-20 Thread bugzilla-daemon
https://bugzilla.wikimedia.org/show_bug.cgi?id=42095

--- Comment #6 from physikerwelt  ---
I used the most recent Vagrant version. I assigned 8G of main memory and 8
cores to the VM. The dataset was a 500 MB sample from the most recent version
of enwiki (all pages that contain math). I set the memory limit to 8G, which
would have been basically the same as max. And, this might be important: I used
the --no-updates flag. Can you post your dataset?


[Bug 42095] importDump.php crashes with out of memory error

2014-02-20 Thread bugzilla-daemon
https://bugzilla.wikimedia.org/show_bug.cgi?id=42095

--- Comment #5 from Adam Wight  ---
physikerwelt: Can you let us know roughly what size your target wiki and output
file were? Your PHP version would also be helpful, and if you are passing a
custom memory_limit, what do you specify?

The bug isn't that it's impossible to run the dump script; it's about a memory
leak which causes rapid memory exhaustion on even small data sets.
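For scale: at the leak rate measured in comment #3 (over 90 KB per revision),
even a generous limit is exhausted after a bounded number of revisions. A rough
back-of-the-envelope sketch (the 500M and 8G figures are the limits discussed
in this thread; the 90 KB/revision rate is from comment #3):

```python
# Rough estimate: how many revisions a leak of ~90 KB/revision allows
# before a given PHP memory_limit is exhausted. Figures from this thread.

LEAK_PER_REVISION = 90 * 1024  # bytes leaked per imported revision (comment #3)

def revisions_until_oom(memory_limit_bytes: int) -> int:
    """Revisions importable before the limit is hit (counting the leak alone)."""
    return memory_limit_bytes // LEAK_PER_REVISION

print(revisions_until_oom(500 * 1024**2))  # 500M limit -> roughly 5,700 revisions
print(revisions_until_oom(8 * 1024**3))    # 8G limit   -> roughly 93,000 revisions
```

So even an 8G limit only buys on the order of 10^5 revisions, which is tiny
next to a full-history dump of a large wiki.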


[Bug 42095] importDump.php crashes with out of memory error

2014-02-20 Thread bugzilla-daemon
https://bugzilla.wikimedia.org/show_bug.cgi?id=42095

physikerwelt  changed:

   What|Removed |Added

 Status|NEW |RESOLVED
 CC||phy...@ckurs.de
 Resolution|--- |WORKSFORME

--- Comment #4 from physikerwelt  ---
I just tried it with the most recent version and it works for me.
The Maintenance.php script just passes whatever you specify as $limit to PHP via
ini_set( 'memory_limit', $limit );
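For context, PHP's memory_limit accepts shorthand byte values (K/M/G suffixes,
or -1 for unlimited), so whatever string the maintenance script forwards is
interpreted under those rules. A small Python sketch of the documented
shorthand-to-bytes mapping (an illustration of PHP's convention, not MediaWiki
code):

```python
def php_shorthand_to_bytes(limit: str) -> int:
    """Convert a PHP memory_limit shorthand string ('500M', '8G', '-1')
    to a byte count; -1 means no limit, per the PHP documentation."""
    limit = limit.strip()
    if limit == "-1":
        return -1  # unlimited
    multipliers = {"K": 1024, "M": 1024**2, "G": 1024**3}
    suffix = limit[-1].upper()
    if suffix in multipliers:
        return int(limit[:-1]) * multipliers[suffix]
    return int(limit)  # bare number: already in bytes

print(php_shorthand_to_bytes("500M"))  # 524288000
print(php_shorthand_to_bytes("8G"))    # 8589934592
print(php_shorthand_to_bytes("-1"))    # -1
```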


[Bug 42095] importDump.php crashes with out of memory error

2014-01-25 Thread bugzilla-daemon
https://bugzilla.wikimedia.org/show_bug.cgi?id=42095

Adam Wight  changed:

   What|Removed |Added

 CC||s...@ludd.net

--- Comment #3 from Adam Wight  ---
Same bug seen in MediaWiki 1.23-HEAD, importing from a recursive dump of the
mediawiki.org Template: namespace. The resulting XML file is only 6.8 MB, but
the memory used to import seems to grow superlinearly, at over 90 KB/revision.
There are memory leaks like a floating cardboard box.
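Superlinear growth usually means the importer keeps per-revision state alive
instead of releasing it. For contrast, a streaming parse that clears each
element once it has been handled stays near-constant in memory regardless of
dump size; a minimal Python sketch of that pattern (the XML below is a toy
stand-in, not the real MediaWiki export schema):

```python
import io
import xml.etree.ElementTree as ET

# Toy stand-in for a dump: many <revision> elements under one root.
dump = io.BytesIO(
    b"<mediawiki>"
    + b"".join(b"<revision><id>%d</id></revision>" % i for i in range(1000))
    + b"</mediawiki>"
)

count = 0
for event, elem in ET.iterparse(dump, events=("end",)):
    if elem.tag == "revision":
        count += 1     # process the revision here, then...
        elem.clear()   # ...drop its subtree so it can be reclaimed

print(count)  # 1000
```

An importer built this way pays per-revision processing cost but not
per-revision retained memory, which is what the 90 KB/revision figure suggests
is happening here.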


[Bug 42095] importDump.php crashes with out of memory error

2013-05-16 Thread bugzilla-daemon
https://bugzilla.wikimedia.org/show_bug.cgi?id=42095

--- Comment #2 from Isarra  ---
~6GB compressed.

Also crashed for ?pedia, which is only ~100MB, though.


[Bug 42095] importDump.php crashes with out of memory error

2013-05-16 Thread bugzilla-daemon
https://bugzilla.wikimedia.org/show_bug.cgi?id=42095

Andre Klapper  changed:

   What|Removed |Added

  Component|Database|Maintenance scripts

--- Comment #1 from Andre Klapper  ---
> full-revision dump of Uncyclopedia

How big is that?


[Bug 42095] importDump.php crashes with out of memory error

2012-12-15 Thread bugzilla-daemon
https://bugzilla.wikimedia.org/show_bug.cgi?id=42095

Andre Klapper  changed:

   What|Removed |Added

   Priority|Unprioritized   |High
  Component|Maintenance scripts |Database
