https://bugzilla.wikimedia.org/show_bug.cgi?id=57464

--- Comment #12 from Kunal Mehta (Legoktm) <[email protected]> ---
I'm not able to see where the namespace is being lost, so here are the steps
the data goes through. Maybe someone else will spot it.

1. Picked up by MassMessageHooks::storeDataParserFunction as a string
2. MassMessageHooks::verifyPFData validates that it is a valid title (it checks
that Title::newFromText() does not return null), but returns the original string.
3. MassMessage::getParserFunctionTargets unserializes the string.
4. MassMessage::normalizeTargets simply re-arranges the list of targets without
modifying the strings.
5. Back to SpecialMassMessage::submit, which feeds the result from
normalizeTargets to MassMessageSubmitJob.
6. The MassMessageSubmitJob creates a Title object and submits a group of
MassMessageJobs.
7. On meta, in MassMessageJob::__construct, there is "$this->title =
Title::newFromText( $this->title->getPrefixedText() );", which was the original
fix for this bug. It *should* be a no-op when run on meta.
8. When the job is executed on the target wiki, that line runs again.
Title::getPrefixedText() should return the full title (including any interwiki
prefix) but without fragments (URL anchors). By re-creating the title, the
namespace should be re-parsed as a real namespace instead of an interwiki
prefix.
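The step 7-8 round trip can be sketched with a toy model. This is an assumption-laden illustration in Python, not the real Title class: the ToyTitle class, the "mail" interwiki prefix, and the namespace set are all hypothetical. It shows how a prefix that is both an interwiki prefix and a namespace keeps being parsed as interwiki, so re-parsing getPrefixedText() cannot recover the namespace.

```python
# Toy model (NOT the real MediaWiki Title class) of the step 7-8 round
# trip: Title::newFromText( $this->title->getPrefixedText() ).

class ToyTitle:
    def __init__(self, interwiki, namespace, text):
        self.interwiki = interwiki    # '' when the title is local
        self.namespace = namespace    # '' for the main namespace
        self.text = text

    @classmethod
    def new_from_text(cls, s, interwiki_prefixes, namespaces):
        prefix, sep, rest = s.partition(':')
        if sep and prefix.lower() in interwiki_prefixes:
            # As in MediaWiki, an interwiki prefix shadows a namespace
            # of the same name.
            return cls(prefix, '', rest)
        if sep and prefix in namespaces:
            return cls('', prefix, rest)
        return cls('', '', s)

    def get_prefixed_text(self):
        # Like Title::getPrefixedText(): keeps the interwiki or
        # namespace prefix, drops fragments (not modeled here).
        prefix = self.interwiki or self.namespace
        return prefix + ':' + self.text if prefix else self.text

iw = {'mail'}            # hypothetical interwiki prefix
ns = {'Mail', 'Talk'}    # hypothetical local namespaces

# "Talk" is only a namespace: the round trip preserves it.
t = ToyTitle.new_from_text('Talk:Sandbox', iw, ns)
t = ToyTitle.new_from_text(t.get_prefixed_text(), iw, ns)
print(t.namespace)       # Talk

# "Mail" is both: it parses as interwiki, and re-parsing the prefixed
# text (step 8) cannot recover the namespace.
t = ToyTitle.new_from_text('Mail:Inbox', iw, ns)
t = ToyTitle.new_from_text(t.get_prefixed_text(), iw, ns)
print(t.interwiki, repr(t.namespace))   # Mail ''
```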

The issue:
If the prefix is not an interwiki prefix, it is re-parsed correctly as a
namespace, as the successful deliveries show.

If the prefix is also an interwiki prefix, it is (I think) still treated as an
interwiki prefix, and the delivery fails because the title is not interpreted
as having a local namespace.
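The asymmetry can be stated as a tiny classifier. This is a sketch under assumptions (the parsing order and the "mail" prefix are hypothetical, not taken from the real configuration): the interwiki table is consulted before the namespace list, and delivery only works when the title parses as local.

```python
# Minimal sketch of the assumed parsing order behind the success/failure
# asymmetry. The prefixes and namespaces here are hypothetical.

def classify_prefix(title, interwiki_prefixes, local_namespaces):
    """Classify the leading prefix of a title string."""
    prefix, sep, rest = title.partition(':')
    if sep and prefix.lower() in interwiki_prefixes:
        return 'interwiki'        # non-local title: delivery fails
    if sep and prefix in local_namespaces:
        return 'namespace'        # local title: delivery succeeds
    return 'main'

print(classify_prefix('Talk:Sandbox', {'mail'}, {'Talk', 'Mail'}))  # namespace
print(classify_prefix('Mail:Inbox', {'mail'}, {'Talk', 'Mail'}))    # interwiki
```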

My hunch is that somewhere in step 8, things aren't working the way I expect.
