JanZerebecki added a comment.
But until we're there, if the relevant extension here is Wikimedia-deployed
and needs data-values/javascript at run-time in production, then it not only
can be added but must be added.
Both this extension and its dependencies (data-values/javascript) are only
adrianheine added a comment.
In https://phabricator.wikimedia.org/T88436#1042729, @Krinkle wrote:
Jobs will no longer use mediawiki/vendor in the near future. But until we're
there, if the relevant extension here is Wikimedia-deployed and needs
data-values/javascript at run-time in
Krinkle added a comment.
Jobs will no longer use mediawiki/vendor in the near future. But until we're
there, if the relevant extension here is Wikimedia-deployed and needs
data-values/javascript at run-time in production, then it not only can be added
but must be added.
JanZerebecki added a comment.
In the long run it might be a good idea to make mediawiki-vendor a build result
of the composer run in wmf branches of mediawiki-core and run the tests after
that with the newly built vendor repo.
@hashar What do you think? Should we move forward in the outlined
bd808 added a comment.
In https://phabricator.wikimedia.org/T88436#1025151, @JanZerebecki wrote:
In the long run it might be a good idea to make mediawiki-vendor a build
result of the composer run in wmf branches of mediawiki-core and run the
tests after that with the newly built vendor
JanZerebecki added a comment.
I forgot for a moment that core pulls in externally maintained dependencies via
composer. That distracted me from the main question of this task:
Should we run composer for master branches (i.e. not the wmf deployment ones)
during CI instead of relying on
Krinkle added a comment.
In https://phabricator.wikimedia.org/T88436#1025775, @JanZerebecki wrote:
Should we run composer for master branches (i.e. not the wmf deployment ones)
during CI instead of relying on mediawiki-vendor?
Yes. Both to simplify day-to-day maintenance (e.g. atomic
JanZerebecki added a comment.
In https://phabricator.wikimedia.org/T88436#1012609, @Legoktm wrote:
Why would you need to do it recursively? Just include Wikibase's dependencies
in vendor (still duplicated manually :() and let composer automatically
figure out their dependencies.
Yes you
adrianheine added a comment.
We update dependency versions quite often for Wikidata, so we depend on Jenkins
testing against the current dependencies specified in `composer.json`.
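For context, the version pins being discussed live in the extension's `composer.json` `require` section. A minimal sketch of such a file (the version constraints here are illustrative, not taken from the actual Wikibase file):

```json
{
    "require": {
        "php": ">=5.3.0",
        "data-values/javascript": "~0.8.0"
    }
}
```

When Jenkins runs `composer install` against this file, it resolves the newest versions matching these constraints, which is why frequent dependency bumps only need a change here rather than a manual copy into mediawiki/vendor.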
TASK DETAIL
https://phabricator.wikimedia.org/T88436
bd808 added a comment.
In https://phabricator.wikimedia.org/T88436#1011709, @JanZerebecki wrote:
This should probably be answered by @bd808 or @legoktm .
In the long run, not running composer in projects that use composer for
dependencies is probably a bad idea. As is duplicate declaration
JanZerebecki added a comment.
But manually copying the dependencies of more than a dozen components (Wikibase
alone has more direct dependencies than that), and doing so recursively for their
dependencies, does not seem like a good way to improve this. The current build process is fully
automated. So in the long
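The risk of manual duplication is that the hand-copied constraints in mediawiki/vendor can silently drift from what each extension's `composer.json` declares. A small sketch of a drift check (the package names and constraints in the example are made up for illustration; this is not part of any actual tooling):

```python
def find_conflicts(ext_require, vendor_require):
    """Return packages declared in both composer.json-style 'require'
    maps but with differing version constraints."""
    return {
        pkg: (ext_require[pkg], vendor_require[pkg])
        for pkg in ext_require.keys() & vendor_require.keys()
        if ext_require[pkg] != vendor_require[pkg]
    }

# Hypothetical example: the extension was bumped, vendor was not.
extension_require = {"data-values/javascript": "~0.8.0", "php": ">=5.3"}
vendor_require = {"data-values/javascript": "~0.7.0", "php": ">=5.3"}

print(find_conflicts(extension_require, vendor_require))
# → {'data-values/javascript': ('~0.8.0', '~0.7.0')}
```

Running composer itself during CI makes this whole class of mismatch impossible, since there is only one declaration to resolve.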