Re: [Development] Proposal: move Qt provisioning scripts and 3rd party components into a dedicated repo
> -----Original Message-----
> From: Development On Behalf Of Volker Hilsheimer
> [...]
> I think that's by far the exception though. Most 3rd party components we use have well defined, stable APIs.

I think it's the other way round, at least if you go by the number of qt_attribution.json entries. Checking only Qt Core, we list about 18 third-party code attributions. Only 3 of them can be configured to use a system library instead.

Anyhow, the examples with a proper upstream project + a build system are arguably the bigger ones, and the ones that are interesting from the security perspective (ZLIB, PCRE2). But don't expect that we get rid of even the majority of third-party attributions by saying 'we use system libs'.

Also, let's not forget the elephant in the room: Chromium, and the myriad of further dependencies it ships itself. I wouldn't hold my breath for them to provide a stable API :/

> We can expect that a system that runs Qt has a system libpng.so. We don't need to build libpng sources into Qt just because our build environment doesn't have libpng-dev provisioned.

Tough luck on Windows. We need to find a solution there. This might be Conan, vcpkg, or our own build system. But we really shouldn't make the steps to get a working copy of Qt on Windows even more elaborate than they are right now.

> Yop. Nothing is going to happen immediately. If we can agree that the goal is worth the effort it will take, then setting up a new submodule where we can experiment with some of the obvious candidates (zlib, libpng, freetype) will probably find the new tangles :)

I'm all for experimenting with these.
Let's set up some requirements though:

- We need a setup on Windows that is practical.
- We need a solution for _all_ supported platforms + compilers, including cross-compilation.
- Static builds still need to be supported (so that the third-party lib is statically linked into a static Qt).

My expectation is that we must still maintain our own forks for almost all 3rd-party libs, if only for supporting exotic platforms + build system issues. But I'd be happy to be shown otherwise.

For Windows specifically: My understanding is that we want to use separate third-party .dll's wherever possible, also for the official binaries that we provide. Mind you that this will break a lot of customer deployment scripts, and can potentially cause some dll hell. It also doesn't make Qt appear lightweight when you must deploy 10+ dll's for a "hello world" :/

Regards
Kai

___ Development mailing list Development@qt-project.org https://lists.qt-project.org/listinfo/development
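[Editorial note: Kai's tally of attribution entries can be reproduced mechanically from the qt_attribution.json files the thread keeps referring to. A minimal sketch, assuming the usual checkout layout (`<module>/src/3rdparty/<component>/qt_attribution.json`, a file holding either one entry or a list of entries with an "Id" field); the demo checkout below is synthetic, not a real Qt tree:]

```python
import json
import tempfile
from collections import defaultdict
from pathlib import Path

def collect_attributions(root):
    """Map attribution Id -> set of submodules (top-level dirs) bundling it."""
    seen = defaultdict(set)
    for path in Path(root).rglob("qt_attribution.json"):
        data = json.loads(path.read_text(encoding="utf-8"))
        entries = data if isinstance(data, list) else [data]  # one entry or a list
        module = path.relative_to(root).parts[0]  # e.g. "qtbase", "qtquick3d"
        for entry in entries:
            seen[entry["Id"]].add(module)
    return seen

# Demo on a synthetic checkout: assimp bundled twice (the Qt Quick 3D / Qt 3D
# situation from the thread), zlib once.
with tempfile.TemporaryDirectory() as root:
    for module, comp in [("qtquick3d", "assimp"), ("qt3d", "assimp"), ("qtbase", "zlib")]:
        d = Path(root, module, "src", "3rdparty", comp)
        d.mkdir(parents=True)
        (d / "qt_attribution.json").write_text(json.dumps({"Id": comp}))
    dupes = {k: v for k, v in collect_attributions(root).items() if len(v) > 1}
print(dupes)  # only assimp is bundled in more than one module
```

Such a scan is exactly what a single dedicated 3rd-party repo would make unnecessary: duplicates could not exist in the first place.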
Re: [Development] Proposal: move Qt provisioning scripts and 3rd party components into a dedicated repo
Just to add to the discussion: Qt is clearly adopting Conan as a first-class "service" [1]. Conan users may know that a lot of effort has been put into the "canonical" repos [2] to remove exactly this kind of scenario, where libraries ship their vendored dependencies. Qt is an example of this [3]. With sane CMake it's surely possible to get these dependencies out of tree in the Qt repos, and have this actually work to the benefit of packagers like Conan (including Qt itself, as they support their own Artifactory), the Yocto Project, all the Linux distro packagers, etc., by providing an easy place to "cut".

[1] https://www.qt.io/blog/tag/conan
[2] https://github.com/conan-io/conan-center-index
[3] https://github.com/conan-io/conan-center-index/blob/master/recipes/qt/6.x.x/conanfile.py#L331-L401

At 16:38 on 14/07/2022, Volker Hilsheimer wrote:
[...]
Re: [Development] Proposal: move Qt provisioning scripts and 3rd party components into a dedicated repo
> On 14 Jul 2022, at 13:51, Edward Welbourne wrote:
>
> Volker Hilsheimer (14 July 2022 11:23) wrote:
>> And it makes it in general messy to maintain an overview of those 3rd party components. We have a responsibility to keep track of those,
>
> That's supposed to be handled by qt_attribution.json files; do you have any evidence that we're failing at it?

From the perspective of keeping track of things, one qt_attribution.json file per submodule is only marginally better than searching src/3rdparty for each submodule. It does not help people needing a 3rd party component in one submodule see what 3rd party components we use elsewhere already. It also makes it very difficult for auditors at companies using Qt to see which 3rd party components are buried in the Qt code.

>> and to keep things up-to-date.
>
> That, indeed, is a chore; but I don't see how moving them to another module would help. Indeed, when we do updates to a 3rdparty component, sometimes it requires parallel changes to the code that calls it. That is easy to synchronise if they're in the same module. Our dependency tracking system does provide a way to do it with them in separate modules, but it's one more complication to have to struggle with.

Agreed: if there is very tight coupling between an exact version of a 3rd party module and our code, then for those 3rd party modules it will be difficult. But then it will be next to impossible to use the system library for those modules anyway. I think that's by far the exception though. Most 3rd party components we use have well defined, stable APIs.

>> I think that would be easier if we move them into a single repo with all 3rd party libraries.
>
> Aside from the case of duplicated 3rdparty components, I don't really see how this would make it easier. Those who maintain the code that exercises the 3rdparty component would still need to take care of its updates, along with any needed changes to their code using it, but this work would now be split across two modules, with a dependency gate in between, instead of being localised to one module.

The assumption is that for the vast majority of our 3rd party dependencies, there is a stable API. Stuff that's tightly coupled is the exception, and can be treated as such.

> Indeed, even for a third-party component that's presently used by two Qt git modules, if one of those is actively maintained and the other is in maintenance mode (particularly if its maintainer has quietly slipped out of touch), an update to the third-party component driven by the former may break the latter (if we switch to making them use the same version). You'd either have to kick the latter out of Qt (in order to be able to take in the update) or do the needed maintenance on it, and we might not have anyone sufficiently familiar with it to do that robustly.
>
> That might still be a good thing (forcing us to address our lack of adequate cover for a module we still ship), but it'd be a new source of pain.

If we have a submodule in Qt that doesn't keep up with the 3rd party libraries it depends on, then it needs to go. Otherwise it becomes a security liability.

>> Our provisioning process in the CI system could then use the respective "native" build systems of each 3rd party component in that repo to install them as system libraries.
>
> That's great as long as your builds are always in virtual machines, but very very unwelcome for local native builds, unless I'm misunderstanding what you mean by "as system libraries". First, I do not want the software I build to install random crap all over where my operating system keeps libraries; second, I routinely build several versions of Qt in parallel, from different source trees into different build trees. That "might" lead to conflicts - and, if you're using the third-party code's own build systems, you're opting to accept whatever bad policy they have for where to install themselves as system libraries. So I hope you meant something else by "as system libraries."

I was specifically talking about the CI system. For local builds, you have the option today already to use the system libraries in most cases. Building the 3rd party code from the repo of 3rd party modules would not have to be done differently than how it's done today, except that the sources are not searched in $QTCLONE/qtbase/src/3rdparty/zlib, but in $QTCLONE/qt3rdpartystuff/zlib (or whatever). If you have neither the system SDK installed, nor the qt3rdpartystuff repo checked out, then you will miss features or fail to configure. Just as today with many other 3rd party libraries.

> If we're going to use their native build systems, we should engineer that they install into some part of Qt's build tree, that Qt's build system then adds to the "system" library and header paths for builds in that tree. We need this, in any case, so that we can
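[Editorial note: the lookup order Volker describes — system library first, then a dedicated 3rd-party repo, else miss the feature — can be sketched as a small decision function. The repo name `qt3rdpartystuff` comes from the mail itself; everything else here is illustrative, not Qt's actual configure logic:]

```python
from pathlib import Path

def pick_zlib_source(have_system_lib, qtclone):
    """Decide where a configure step would take zlib from, in the order
    described above. Illustrative only: real Qt configuration is done by
    CMake feature tests, not this function."""
    if have_system_lib:                                  # -system-zlib: use the SDK's copy
        return "system"
    bundled = Path(qtclone, "qt3rdpartystuff", "zlib")   # dedicated 3rd-party repo checkout
    if bundled.is_dir():                                 # fall back to the checked-out sources
        return str(bundled)
    return None                                          # neither present: feature off / configure error

print(pick_zlib_source(True, "/src/qt"))  # "system", regardless of checkouts
```

The point of the sketch is that the fallback chain stays the same as today; only the path in the middle step moves out of qtbase.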
Re: [Development] Proposal: move Qt provisioning scripts and 3rd party components into a dedicated repo
On Thursday, 14 July 2022 04:51:22 PDT Edward Welbourne wrote:
> Aside from the case of duplicated 3rdparty components, I don't really see how this would make it easier.

It's not for us. This does indeed add some work for us, but it's meant to make it easier for everyone downstream of the sources. That's everyone building from source and everyone using any binary build.

> Indeed, even for a third-party component that's presently used by two Qt git modules, if one of those is actively maintained and the other is in maintenance mode (particularly if its maintainer has quietly slipped out of touch), an update to the third-party component driven by the former may break the latter (if we switch to making them use the same version). You'd either have to kick the latter out of Qt (in order to be able to take in the update) or do the needed maintenance on it, and we might not have anyone sufficiently familiar with it to do that robustly.

That's true, but that's not a reason to keep things as-is. That third-party content may need updates for one reason or another, so we had better know of any issues as soon as possible. Hopefully before the feature in Qt is productised in the first place, so we can kick it out as "depends on a third-party component that is too fragile for us to provide long-term support on".

>> Our provisioning process in the CI system could then use the respective "native" build systems of each 3rd party component in that repo to install them as system libraries.
>
> That's great as long as your builds are always in virtual machines, but very very unwelcome for local native builds, unless I'm misunderstanding what you mean by "as system libraries".

The use of "system" here is Qt's meaning of it: it's not the bundled copy. They should be installed to a regular prefix of your choice, which could be /usr/local. Installing them to where your Qt build will be installed helps simplify CMake runs, but shouldn't be required.
--
Thiago Macieira - thiago.macieira (AT) intel.com
Cloud Software Architect - Intel DCAI Cloud Engineering
Re: [Development] Proposal: move Qt provisioning scripts and 3rd party components into a dedicated repo
On Thursday, 14 July 2022 04:22:59 PDT Laszlo Agocs wrote:
> 1. Can we assume every single 3rd party library out there has a sophisticated build system that builds (or at least allows building after appropriate configuration) exactly the way Qt needs, with first-class cross-compilation and embedded support, including the exotic targets like INTEGRITY? (unlikely) How about Yocto, where you wouldn't need the upstream build system but rather appropriate recipes for packages? Integrating the external code into Qt's build system has some benefits that are probably not obvious at first glance, esp. when it comes to the less straightforward non-desktopish targets.

I don't think we can *assume* that, but we can make it so. Both we and the Yocto Project somehow manage to build all of those, so it must be possible somehow. Sometimes it might require patching, or it may require injecting a different build system to make it compile for that target. But there is a solution.

Of course, the next issue is what we should do if that need for a solution arises. I'm hoping that most projects will accept patches to fix them to build properly and cleanly. They may balk at having an INTEGRITY requirement, but the burden falls on the interested party to keep that working. This is particularly true for OSes and SDKs that aren't freely available or even remotely easy to obtain. This may be an area that we can get Blackberry and Green Hills to contribute to directly, actually: get their own people to keep those low-level infrastructure libraries working.

> If the end result here is a partial solution for the common desktop platforms, while most Qt modules will still be stuck with having to support alternative solutions (e.g. still bundle) for the more exotic (but essential) targets, then that might not be much of an improvement overall.

There's a separate meaning of "bundle" here. Some of those libraries are mandatory or at least highly-recommended requirements of the Qt modules, so they must be present on the user's device, regardless of how they were built. The difference here is how they're built, and I think Volker's proposal makes an improvement, if at the cost of having to deal with a foreign build system. See my direct reply for more.

> 2. Not every 3rd party library offers Qt-style compatibility guarantees. We may not even have proper source/behavioural compatibility promises in some cases. This means blindly updating stuff to the latest version (no matter how important or great that release may seem) or sharing the same 3rd party content between multiple Qt modules is not always feasible or wise.

The libraries where sharing is not feasible (Assimp) need to be fixed, right now. One radical solution is to yank them away as the crappy code that they are and never look back. Why are we building our own code on top of this house of cards? If necessary, we fork it. But then I'll insist on those warnings getting fixed, which means the code will greatly diverge from upstream.

> A 3rd party library is often an integral part of a module, with custom patches applied and *careful testing* with every single update of the 3rd party code to make sure it plays perfectly together with the particular module. (may not be the case everywhere in Qt, but certainly true for some of the graphics/3D dependencies)

We must attempt to upstream those patches. If upstream refuses, we can discuss it. Hopefully they'll only apply to "exotic" platforms, so the standard third party will work for almost everyone. Then we simply store the Git patchset to apply on top of the third party, like KDE is doing for the Qt LTS branches.

There's no such thing as a maximum version of a module: any version of that third party above the minimum MUST WORK at the moment of our release. That means we maintainers of a given piece of code must test Qt against the latest versions of the third party, including making sure we are in the cycle for their releases, and correcting anything in our code or reporting issues in theirs. We can't be blamed for issues that arise with updates after our releases, unless we misused the library. Users will know which version was current at the time of the release and they MAY want to stick to the same release series. But that's their choice to make.

And if the third party in question is so fragile that every release of theirs breaks Qt? I repeat: Yank. It. Out. Why are we building our code on top of this fragile house of cards?

> Dumping the full upstream contents to a faraway external repository and claiming it will just work as-is (or that any change we apply on top is fine for all clients of the 3rd party lib within Qt) sounds a bit optimistic. The 3rd party content that is essential for a Qt module should rather be "near the eyes of the developers", ideally next to the module sources. (again, this may not apply uniformly within Qt, but certainly true in graphics/3D
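[Editorial note: Thiago's "store the Git patchset to apply on top of third party" workflow, as KDE does for the Qt LTS branches, boils down to a reproducible command sequence. A sketch that only *generates* the commands (the tag, patch names, and use of `git am` on a mailbox-format series are illustrative assumptions, not an actual Qt tool):]

```python
import tempfile
from pathlib import Path

def patch_commands(upstream_tag, patch_dir):
    """Commands to recreate 'pristine upstream + Qt patch stack'.
    Patches apply in sorted (numbered) order, mirroring a quilt/git-am series."""
    cmds = [f"git checkout {upstream_tag}"]        # start from unmodified upstream
    for patch in sorted(Path(patch_dir).glob("*.patch")):
        cmds.append(f"git am {patch}")             # apply each Qt-local patch on top
    return cmds

# Demo with a synthetic, deliberately unsorted patch directory:
with tempfile.TemporaryDirectory() as d:
    for name in ["0002-fix-integrity.patch", "0001-no-warnings.patch"]:
        Path(d, name).touch()
    cmds = patch_commands("v1.2.13", d)
print(cmds[0])  # git checkout v1.2.13
```

The benefit over carrying a patched tree is exactly the one argued above: the delta to upstream stays explicit, reviewable, and re-appliable to the next upstream release.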
Re: [Development] Proposal: move Qt provisioning scripts and 3rd party components into a dedicated repo
On Thursday, 14 July 2022 02:23:47 PDT Volker Hilsheimer wrote:
> As mentioned, this would be a gradual process, one 3rd party dependency at a time and figuring out what it takes to make this work on all platforms. Some 3rd party code might have to stay where it is today.

And I am partly to blame for that (tinycbor).

> What do you think?

Thank you for sending this, Volker. As Alexandru reminds us, this is not a new discussion. We had planned on doing this for 6.0, but didn't get it all the way to the end. Does anyone have a conclusion on why we didn't do it for 6.0? Was it only lack of time, or did we try and run into difficult problems? If so, can you summarise what they were?

Disclaimer: Volker is sending this because I brought it up again, in the context of a security issue. And this is an important reason why we should do this, so let me expand: right now, by bundling the sources of third-party libraries into our sources, we make it difficult for Regular Joe users to know just which ones they are using, and thus whether there are security issues applicable to their applications. All of this is compounded by there being binary packages (which I'm not sure are a Qt Project deliverable or a Qt Company deliverable), but let's focus on source only right now.

Much of the third-party content we use consists of very low-level libraries that are subject to security issues themselves. Often, those issues don't apply to how Qt uses the libraries, since we only use them in a restricted context, but automated vulnerability-scanning tools won't know that. So if a customer is told "zlib has a security vulnerability with severity High", they may need to patch it. Speaking from an Intel perspective, it is far easier to simply patch than to get an exception/exemption saying "this doesn't apply to me". This can be worse when the vulnerability DOES apply to them but the scanning tool doesn't know it does, because the code in question is buried inside the Qt sources. The tool may simply treat all of Qt as "Qt", or fail to recognise this content because we've patched it somehow (such as by removing its native build system and using the CMake integration).

By unbundling from inside our sources, we make it obvious that such a component is there. We make it easy to replace it too, if we stop patching it. This applies to us, not just customers. When we do get a notice saying that a given third party has a security vulnerability disclosed, we can much more easily update it throughout the CI. It can also easily roll out to binary packages, but I won't make that determination.

This also applies to Open Source licence compliance. I handle this for Intel every time Qt comes up, but other companies don't employ a maintainer in the Qt Project who knows this inside and out. They need to know what libraries they're using to know that they're all compatible with each other, and to list them in the necessary documentation. We do help them with the qt_attribution.json files of course, but unbundling makes it obvious.

The other aspect is binaries. We've also just received a request on how to use the bundled zlib in user code. The answer is: don't. The bundled libraries are for Qt's use only; if you want to use such a library, you must have the regular copy that your content can use. And if you're going to do that anyway, then you will likely want to avoid having two copies.

--
Thiago Macieira - thiago.macieira (AT) intel.com
Cloud Software Architect - Intel DCAI Cloud Engineering
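[Editorial note: the scanner scenario Thiago describes can be illustrated with a toy matcher. A tool that only sees the top-level package name "Qt" misses bundled components entirely; one that reads the attribution metadata can match advisories against them. The advisory feed and version numbers below are illustrative stand-ins, not real scan data (though zlib 1.2.11 genuinely predates the 1.2.12 fix for CVE-2018-25032):]

```python
def affected(components, advisories):
    """Report bundled components that match a known advisory.

    components: {name: version}, as one could extract from qt_attribution.json
                "Id"/"Version" fields.
    advisories: {name: set of affected versions} -- a toy stand-in for a CVE feed.
    """
    return {name: ver for name, ver in components.items()
            if ver in advisories.get(name, set())}

bundled = {"zlib": "1.2.11", "pcre2": "10.39"}   # illustrative bundled versions
toy_feed = {"zlib": {"1.2.11"}}                   # e.g. CVE-2018-25032 affected 1.2.11
print(affected(bundled, toy_feed))  # {'zlib': '1.2.11'}
```

With the components buried inside Qt's source tree, a real scanner never gets as far as the `components` dict; that is the visibility problem unbundling solves.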
Re: [Development] Proposal: move Qt provisioning scripts and 3rd party components into a dedicated repo
Volker Hilsheimer (14 July 2022 11:23) wrote:
> Our 3rd party dependencies currently live in the submodules where they are used. For some 3rd party components, that means we have two, sometimes different copies (e.g. assimp in both Qt Quick 3D and Qt 3D, only one of them patched).

Having two checkouts of the same thing is not nice; having them not even on the same version is positively nasty.

> And it makes it in general messy to maintain an overview of those 3rd party components. We have a responsibility to keep track of those,

That's supposed to be handled by qt_attribution.json files; do you have any evidence that we're failing at it?

> and to keep things up-to-date.

That, indeed, is a chore; but I don't see how moving them to another module would help. Indeed, when we do updates to a 3rdparty component, sometimes it requires parallel changes to the code that calls it. That is easy to synchronise if they're in the same module. Our dependency tracking system does provide a way to do it with them in separate modules, but it's one more complication to have to struggle with.

> I think that would be easier if we move them into a single repo with all 3rd party libraries.

Aside from the case of duplicated 3rdparty components, I don't really see how this would make it easier. Those who maintain the code that exercises the 3rdparty component would still need to take care of its updates, along with any needed changes to their code using it, but this work would now be split across two modules, with a dependency gate in between, instead of being localised to one module.

Indeed, even for a third-party component that's presently used by two Qt git modules, if one of those is actively maintained and the other is in maintenance mode (particularly if its maintainer has quietly slipped out of touch), an update to the third-party component driven by the former may break the latter (if we switch to making them use the same version). You'd either have to kick the latter out of Qt (in order to be able to take in the update) or do the needed maintenance on it, and we might not have anyone sufficiently familiar with it to do that robustly. That might still be a good thing (forcing us to address our lack of adequate cover for a module we still ship), but it'd be a new source of pain.

> Our provisioning process in the CI system could then use the respective "native" build systems of each 3rd party component in that repo to install them as system libraries.

That's great as long as your builds are always in virtual machines, but very very unwelcome for local native builds, unless I'm misunderstanding what you mean by "as system libraries". First, I do not want the software I build to install random crap all over where my operating system keeps libraries; second, I routinely build several versions of Qt in parallel, from different source trees into different build trees. That "might" lead to conflicts - and, if you're using the third-party code's own build systems, you're opting to accept whatever bad policy they have for where to install themselves as system libraries. So I hope you meant something else by "as system libraries."

If we're going to use their native build systems, we should engineer things so that they install into some part of Qt's build tree, which Qt's build system then adds to the "system" library and header paths for builds in that tree. We need this, in any case, so that we can bundle those libraries up with the resulting Qt binaries when we want to build packages that can be installed. (One exception to that is the 3rdparty components only used in building, e.g. gradle. The packaging scripts don't want to pick those up.)

> This will speed up the build of Qt (and removes noise from those libraries, some of which are not exactly free of warnings),

That, indeed, would be welcome. However, see the note below on stray 3rdparty .c files.

> and it allows us to gradually stop shipping Qt binaries that use the bundled 3rd party stuff.

I fail to follow - how exactly does this let us do that? The only case I can think of is anywhere that we can defer to the system libraries; and, surely, all distributions that have those system libraries already ship their Qt with a dependency on the relevant system library, so build Qt without the internal 3rdparty dependency.

> That will be a big step towards reducing our security relevant body of code. It won't be possible for all 3rd party components

Indeed, we have some third-party components that are just a stray .c file without a build system; and more where we've extracted the subset we need from a larger upstream, discarding the rest along with their build system. I think we're stuck with those as part of our source tree (and their warnings as part of our build logs).

> and on all platforms, but at least it allows us to move into that direction.

If it lets us merely take the few duplicated third-party components out, so that there's
Re: [Development] Proposal: move Qt provisioning scripts and 3rd party components into a dedicated repo
Hi, > Our 3rd party dependencies currently live in the submodules where they are > used. For some 3rd party components, that means we have two, sometimes > different copies (e.g. assimp in both Qt Quick 3D and Qt 3D, only one of them > patched). And it makes it in general messy to maintain an overview of those > 3rd party components. We have a responsibility to keep track of those, and to > keep things up-to-date. I think that would be easier if we move them into a > single repo with all 3rd party libraries. I can try sharing some concerns, many of which are similar to those voiced in the discussions we had around and pre-6.0, when there was a lot of talk about package management and such for 3rd party dependencies (which then died off eventually). 1. Can we assume every single 3rd party library out there has a sophisticated build system that builds (or at least allows building after appropriate configuration) exactly the way Qt needs, with first class cross-compilation and embedded support, including the exotic targets like INTEGRITY? (unlikely) How about Yocto where you wouldn't need the upstream build system but rather appropriate recipes for packages? Integrating the external code into Qt's build system have some benefits that are probably not obvious at first glance, esp. when it comes to the less straightforward non-desktopish targets. If the end result here is a partial solution for the common desktop platforms, while most Qt modules will still be stuck with having to support alternative solutions (e.g. still bundle) for the more exotic (but essential) targets, then that might not be much of an improvement over all. 2. Not every 3rd party library offers Qt-style compatibility guarantees. We may not even have proper source/behaviorial compatiblity promises in some cases. 
This means blindly updating stuff to the latest version (no matter how important or great that release may seem) or sharing the same 3rd party content between multiple Qt modules is not always feasible or wise. A 3rd party library is often an integral part of a module, with custom patches applied and *careful testing* with every single update of the 3rd party code to make sure it plays perfectly together with the particular module. (may not be the case everywhere in Qt, but certainly true for some of the graphics/3D dependencies) Dumping the full upstream contents to a faraway external repository and claiming it will just work as-is (or that any change we apply on top is fine for all clients of the 3rd party lib within Qt) sounds a bit optimistic. The 3rd party content that is essential for a Qt module should rather be "near the eyes of the developers", ideally next to the module sources. (again, this may not apply uniformly true within Qt, but certainly true in graphics/3D for certain 3rd party deps that provide core and essential functionality) 3. Many 3rd party components are stripped down, meaning we build only what is needed by Qt. In the form that is best, even if that is something the upstream build system would never do, with settings or changes upstream may not support out of the box. (e.g. sometimes we can just compile in the 3rd party sources directly to the module or maybe use a static lib etc., instead of polluting everything with tens of shared libraries and unnecessary tools and artifacts from the upstream build system) That this makes updating more difficult is not a valid argument, since updating *should* be an involved process - the impact of updating the 3rd party code must be carefully evaluated and tested. (as always, thinking of graphics/3D 3rd party examples here, perhaps the bar is lower elsewhere with other 3rd party deps) 4. Simplicity of customization. 
Applying custom patches to a stripped-down bundled version may be significantly simpler than fighting the entire upstream repo and its build system. (And customization will be needed, given Qt's extensive platform and compiler coverage.)

5. Manually building and installing all 3rd party dependencies for every single Qt module we work with (thinking here of devs with no toplevel builds and whatnot, just building individual repos) is not very tempting.

Best regards,
Laszlo

-----Original Message-----
From: Development On Behalf Of Alexandru Croitor
Sent: Thursday, July 14, 2022 11:53 AM
To: Volker Hilsheimer
Cc: development@qt-project.org
Subject: Re: [Development] Proposal: move Qt provisioning scripts and 3rd party components into a dedicated repo
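[Editorial aside: the workflow in points 3 and 4 above can be sketched in plain CMake. The library, target, and file names below are hypothetical; this is only an illustration of "compile the stripped-down bundled sources directly into the module, with a system-library opt-in", not Qt's actual build code.]

```cmake
option(USE_SYSTEM_FOO "Use the system foo library instead of the bundled copy" OFF)

if(USE_SYSTEM_FOO)
    # Rely on a provisioned system library (desktop-ish platforms).
    find_package(foo REQUIRED)
    target_link_libraries(mymodule PRIVATE foo::foo)
else()
    # Compile only the handful of bundled, possibly patched sources the
    # module needs, directly into the module target - no separate shared
    # library, no upstream build system involved.
    target_sources(mymodule PRIVATE
        3rdparty/foo/foo_core.c
        3rdparty/foo/foo_decode.c)
    target_include_directories(mymodule PRIVATE 3rdparty/foo)
endif()
```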
Re: [Development] Proposal: move Qt provisioning scripts and 3rd party components into a dedicated repo
Hi,

Just pointing to some relevant issues and discussions.

https://bugreports.qt.io/browse/QTBUG-68816
https://bugreports.qt.io/browse/QTBUG-73760
https://wiki.qt.io/QtCS2018_Third-Party_Sources_Policy_and_Security

> For us as Qt developers it might mean that we either have to build that 3rd
> party repository ourselves (which could still be done as part of the toplevel
> build process)

If the libraries will be built using their native build system instead of CMake, that will complicate the implementation of top-level builds, as well as considerably slow down configuration time, because the 3rd party libs will need to be built and installed in some staging area before CMake can pick them up for Qt usage at configure time. So qt5/configure would imply first running configure && make && make install for each 3rd party library.

___
Development mailing list
Development@qt-project.org
https://lists.qt-project.org/listinfo/development
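[Editorial aside: the staging step described above could be wired up today with CMake's stock ExternalProject module. Directory names and the library are hypothetical; this is a sketch of the configure && make && make install sequence that would have to run before Qt's own configure, not an existing Qt mechanism.]

```cmake
include(ExternalProject)

# Hypothetical: drive an autotools-based 3rd party library's native build
# from CMake, installing into a staging prefix before Qt configures.
set(STAGING_PREFIX "${CMAKE_BINARY_DIR}/3rdparty-staging")

ExternalProject_Add(foo_staged
    SOURCE_DIR        "${CMAKE_SOURCE_DIR}/3rdparty/foo"
    CONFIGURE_COMMAND <SOURCE_DIR>/configure --prefix=${STAGING_PREFIX}
    BUILD_COMMAND     make
    INSTALL_COMMAND   make install
)

# Qt's configure would then be pointed at the staging area, e.g. via
# -DCMAKE_PREFIX_PATH=${STAGING_PREFIX}, only after foo_staged has built.
```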
[Development] Proposal: move Qt provisioning scripts and 3rd party components into a dedicated repo
Hi,

Our 3rd party dependencies currently live in the submodules where they are used. For some 3rd party components, that means we have two, sometimes different, copies (e.g. assimp in both Qt Quick 3D and Qt 3D, only one of them patched). And in general it makes it messy to maintain an overview of those 3rd party components. We have a responsibility to keep track of those, and to keep things up-to-date. I think that would be easier if we move them into a single repo with all 3rd party libraries.

Our provisioning process in the CI system could then use the respective "native" build systems of each 3rd party component in that repo to install them as system libraries. This will speed up the build of Qt (and remove noise from those libraries, some of which are not exactly free of warnings), and it allows us to gradually stop shipping Qt binaries that use the bundled 3rd party stuff. That will be a big step towards reducing our security-relevant body of code. It won't be possible for all 3rd party components and on all platforms, but at least it allows us to move in that direction.

For us as Qt developers it might mean that we either have to build that 3rd party repository ourselves (which could still be done as part of the toplevel build process), or install the SDKs from package managers etc. This is nothing new; we do that today already for many dependencies.

At the same time, and because it's quite related, I'd also like to see if we can move all the coin provisioning scripts out of qt5.git and into that same repository. That qt5.git is not just an empty meta-repository that keeps track of the consistent set of submodules, but also has code that might need to be changed in order for submodule updates to succeed, has more than once caused problems. Moving the provisioning code and coin rules into a submodule untangles that.

We'd use the dependency mechanism that we have, i.e.
qtbase would declare a dependency on that new repository, and the submodule update process keeps qtbase's dependencies.yaml updated. Building qtbase on its own would still work as long as the necessary dependencies are provisioned; if they are not, building them from the new repo is one option, using the native system libraries is another.

As mentioned, this would be a gradual process: one 3rd party dependency at a time, figuring out what it takes to make this work on all platforms. Some 3rd party code might have to stay where it is today.

What do you think?

Volker
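[Editorial aside: the dependency mechanism referred to above is the per-module dependencies.yaml file, which pins dependent repos by revision. A hypothetical qtbase entry for the proposed new repo might look as follows; the repo name qt3rdparty and the ref value are invented placeholders.]

```yaml
dependencies:
  ../qt3rdparty:
    # sha1 of the 3rd party repo revision this qtbase revision was
    # tested against (placeholder value)
    ref: 0000000000000000000000000000000000000000
    required: true
```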