Re: [yocto] [Openembedded-architecture] Proposal: dealing with language-specific build tools/dependency management tools
On 03/13/2017 07:51 PM, Bruce, Henry wrote:
> I agree that leveraging the likes of 'npm install' will make life simpler, but the problem with these operations is that they span a range of bitbake tasks. The reason we wrote an npm fetcher was to limit network access to the fetch task. This works in conjunction with an npm patch that enables the install command to run in the compile task without accessing the registry. Are you proposing that we allow network access outside of the fetch task?

Yes, in a limited way. In addition to fetch, there would be a fetch_deps task, which is also allowed to access the network.

Alex
--
___
yocto mailing list
yocto@yoctoproject.org
https://lists.yoctoproject.org/listinfo/yocto
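For illustration, here is a hypothetical sketch of what such a split could look like in recipe metadata. The task name and the idea of a per-task network-access restriction are assumptions taken from the proposal above, not existing bitbake features:

```bitbake
# Hypothetical sketch only: a dedicated dependency-fetching task that,
# like do_fetch, would be permitted to reach the network.
do_fetch_deps() {
    # let the language's own tool resolve and download dependencies
    # into a private cache that later tasks can use offline
    npm install --ignore-scripts --cache "${WORKDIR}/npm-cache"
}
addtask fetch_deps after do_fetch before do_unpack

# do_compile would then run entirely offline against that cache
```

The point of the split is that everything after do_fetch_deps stays deterministic and offline, so the usual bitbake guarantees still hold for the rest of the build.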
Re: [yocto] [Openembedded-architecture] Proposal: dealing with language-specific build tools/dependency management tools
Alex - thanks for kicking off the discussion here, this is something we definitely need to get a better handle on. My involvement with this is perhaps somewhat accidental - I ended up working on the fetcher that was implemented by someone else because I needed it to work for devtool integration, and got sucked into fixing a number of issues I found in the process. Unfortunately - other than python, naturally - I'm not that familiar with the actual languages or their package managers; I've only come into contact with node.js and npm through the work that I've done with the build system.

This is a tough problem to solve. Maybe some of the other language package managers are more cooperative, I'm not sure, but npm *really* doesn't want to be used in the manner we're trying to use it - it insists on being able to go out to the registry, and things get a bit ugly if you tell it not to. RSS-related issues aside (since I kind of have a path to fixing those), the latest pain was in 90cb980a1c49de99a0aec00c0cd5fc1e165490a7, when we shifted from a cp -a to a second invocation of npm install in order to get a more accurate install step - the side-effect is that we broke that step for certain modules, because certain directives in the package.json file trigger querying the registry, but we've told npm not to do that, so it errors out.

Another thing that is still a stumbling block for the "one package represents a tree of modules" approach, and that I've yet to properly resolve, is how to deal with a "partial" tree, where you have one or more modules that you want to actually work on, i.e. you have another recipe to satisfy the dependency, possibly under the control of devtool in your workspace. npm doesn't seem to provide any specific mechanism to help with this either. At the moment I'm not sure there's a solution with npm other than hacking the package.json file.
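To make the package.json problem above concrete, here is a hypothetical example (the module names and URLs are made up): a dependency expressed as a semver range, or one pointing at an external git host, forces npm to consult the registry or the network to resolve it, and that resolution fails once network access is forbidden:

```json
{
  "name": "example-app",
  "version": "1.0.0",
  "dependencies": {
    "some-utility": "^1.1.0",
    "some-binding": "git+https://gitlab.example.com/foo/some-binding.git"
  },
  "optionalDependencies": {
    "fsevents": "*"
  }
}
```

The "^1.1.0" range and the "*" wildcard cannot be satisfied without asking the registry which concrete versions exist, and the git URL bypasses any local mirror entirely.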
Since it was brought up by Trevor a while back, I still have a todo item to go and look at yarn [1] to see if it really solved some of these issues for node.js in a less nasty way than npm.

One thing I will say - I really want to see the fetchers (or at least, the custom bits) for these language package managers implemented in the metadata rather than in bitbake. This stuff moves fast, there are a growing number of these package managers, and it's awkward to have part of the implementation in one place and a significant portion of the rest of it (class and supporting recipes) in another, without which the fetcher is useless. However, at the same time we want to make sure we don't lose the ability to have mirror tarballs, which are implemented on the bitbake side.

With regard to the recipes generated by devtool, they do end up being a bit ugly, because we package each underlying npm package individually. The reason I did it that way is to have each source and therefore each license represented in the image manifests. I am open to having a mode where we have it all in one package, though, but it seems to me that LICENSE must include all the licenses at parse time. I'm open to suggestions. Perhaps we could save the package.json file next to the recipe - is that going to be practical? It might be a bit easier to update at least. I agree that in addition to providing the lockdown we definitely need to make updating these recipes easier; it's a bit awkward right now.

Cheers,
Paul

[1] https://yarnpkg.com/

On Saturday, 11 March 2017 2:49:01 AM NZDT Alexander Kanavin wrote:
> Hello all,
>
> *Introduction*
>
> The new generation of programming languages (think node.js, Go, Rust) is
> a poor fit for the Yocto build model, which follows the traditional Unix
> model. In particular, those new development environments have no problem
> with 'grabbing random stuff from the Internet' as a part of development
> and build process.
> However, Yocto has very strict rules about the build steps and what they
> can and can not do, and also a strict enforcement of license and version
> checks for every component that gets built. Those two models clash, and
> this is a proposal of how they could be reconciled.
>
> I'll also send a separate email that talks specifically about the MEAN
> stack and how it could be supported in Yocto - take it as a specific
> example for all of the below.
>
> *Background*
>
> The traditional development model on Unix clearly separates installation
> of dependencies needed to develop a project from the development process
> itself. Typically, when one wants to build some project, first the
> project README needs to be inspected, and any required dependencies
> installed system-wide using the distribution's package management tool.
> When those dependencies change, usually this manifests itself in a
> previously unseen build failure, which is again manually resolved by
> figuring out the missing dependency and installing it. This can be
> awkward, but it's how things have been done for decades, and Yocto's
> build system (with
Re: [yocto] [Openembedded-architecture] Proposal: dealing with language-specific build tools/dependency management tools
On Fri, 2017-03-10 at 17:10 +0200, Alexander Kanavin wrote:

Thanks for raising this topic. The problems we hit in adding node.js support clearly show we would benefit from a common approach to supporting languages with their own runtime and packaging.

> npm fetcher for instance was a nightmare to write, from what I've
> heard:
>
> http://git.yoctoproject.org/cgit/cgit.cgi/poky/tree/bitbake/lib/bb/fetch2/npm.py

Although I didn't write the code, I worked with Paul Eggleton (who now owns it) on improving it, and yes, it was hard to write. And it is becoming increasingly complex as we hit corner cases, indicating that the current architecture is not ideal.

> > > I want to use existing tools (like 'npm install') for getting the
> > > stuff from the network - we don't really need full recipes, we
> > > just want to know the licenses of the dependencies, and, if
> > > possible, lock them down to a specific version.

I agree that leveraging the likes of 'npm install' will make life simpler, but the problem with these operations is that they span a range of bitbake tasks. The reason we wrote an npm fetcher was to limit network access to the fetch task. This works in conjunction with an npm patch that enables the install command to run in the compile task without accessing the registry. Are you proposing that we allow network access outside of the fetch task?

I look forward to Paul's comments.

Henry
Re: [yocto] [Openembedded-architecture] Proposal: dealing with language-specific build tools/dependency management tools
On Fri, Mar 10, 2017 at 10:10 AM, Alexander Kanavin <alexander.kana...@linux.intel.com> wrote:

> On 03/10/2017 04:58 PM, Otavio Salvador wrote:
>
>>> I'd like to avoid generating entire separate recipes though, because that
>>> implies your custom-written tool would be figuring out where the dependency
>>> source came from in the first place, and what are its own dependencies, when
>>> creating the recipe, which can be tricky, breakage-prone guesswork.
>>
>> In fact not; as you generate the recipes for the dependencies, it goes
>> recursively and is always good.
>
> Would it also be true for npm, Rust, Go, and other languages that will
> come along? In your specific case the metadata may be easily available to
> parse and convert to recipe form, but this may not hold in other
> situations.
>
> npm fetcher for instance was a nightmare to write, from what I've heard:
>
> http://git.yoctoproject.org/cgit/cgit.cgi/poky/tree/bitbake/lib/bb/fetch2/npm.py
>
>>> I want to use existing tools (like 'npm install') for getting the stuff from
>>> the network - we don't really need full recipes, we just want to know the
>>> licenses of the dependencies, and, if possible, lock them down to a specific
>>> version.
>>
>> Well we initially thought this would suffice, but consider a security
>> flaw. As many apps may be using different versions of the same package, it
>> becomes a nightmare to figure out which ones are affected. If using
>> dependencies it is fine, for free.
>
> The lockdown files would list the versions of the dependencies (if it is
> possible, which is not always true), so you can inspect those to see if
> something is vulnerable. In the node.js or Go worlds the libraries are not
> reused between apps anyway, so it really doesn't matter if they're packaged
> as separate recipes or not (I didn't have time to check Rust, but as it's
> also using lockdown files, I believe the libraries are not reused either).
I can chime in on how we do things in meta-rust. Right now, each application is statically linked against the crate library versions it calls out. At this point the Rust ABI is not stable between versions of the compiler, so we made the conscious decision to avoid dynamic libraries for the time being. We acknowledge this does increase the file system size, but we didn't want to have to deal with users trying to perform package updates on individual shared objects.

We have our own cargo module that we maintain that helps users generate their bitbake recipe for a given package [1]. There was a small bit of work done on trying to create .so files, but it became an unmanageable version nightmare and isn't supported moving forward [2]. We also maintain our own custom fetcher for crates [3] and ran into some issues getting it totally supported without integrating it into the set of bitbake fetchers [4][5].

-Derek

[1] - https://github.com/meta-rust/meta-rust
[2] - https://github.com/cardoe/cargo-bitbake
[3] - https://github.com/meta-rust/meta-rust/tree/master/recipes-core
[4] - https://github.com/meta-rust/meta-rust/blob/master/lib/crate.py
[5] - https://github.com/meta-rust/meta-rust/issues/136
[6] - https://bugzilla.yoctoproject.org/show_bug.cgi?id=10867
Re: [yocto] [Openembedded-architecture] Proposal: dealing with language-specific build tools/dependency management tools
Hi Alexander, first of all thanks for the efforts. Some comments to add to the mix:

2017-03-10 16:10 GMT+01:00 Alexander Kanavin:
> The lockdown files would list the versions of the dependencies (if it is
> possible, which is not always true), so you can inspect those to see if
> something is vulnerable. In node.js or Go worlds the libraries are not
> reused between apps anyway, so it really doesn't matter if they're packaged
> as separate recipes or not (I didn't have time to check Rust, but as it's
> also using lockdown files, I believe the libraries are not reused either).

I don't know if you've heard of the lately popular idea of "microservices" - but basically it comes down to having multiple specialized node applications installed. In practice those applications often share dependencies, so it would totally make sense to use the packages approach so that the dependencies don't need to be installed in multiple copies. Perhaps not with a recipe per npm package, but maybe with some more advanced bitbake magic, or a post-rootfs hook to run deduplication.

What works most of the time is using shrinkwrap to freeze the dependencies to some local npm mirror, so the concept of lockdown would make sense. Just a word of warning that it sometimes doesn't work that well - some of the npm packages (in the dependency chain) may have hard-coded URIs to e.g. gitlab, and shrinkwrap will keep those references instead of the npm mirror. Also, npm itself doesn't really check for consistency, it only checks for versions; what can happen is that the contents may change but the version string may not.

In terms of node, yarn [1] seems to address some of the npm shortcomings, but I'm not aware of any progress with regard to yocto integration.

[1] https://yarnpkg.com/

Best regards,
Piotr.
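The hard-coded-URI problem described above can at least be detected mechanically. Here is a minimal sketch (the mirror URL and the shrinkwrap layout shown are assumptions for illustration) that walks an npm-shrinkwrap.json tree and reports any dependency whose "resolved" field bypasses the local mirror:

```python
import json

MIRROR_PREFIX = "http://npm-mirror.example.com/"  # assumed local mirror URL

def find_foreign_urls(shrinkwrap, prefix=MIRROR_PREFIX):
    """Return (name, resolved) pairs whose tarball URL bypasses the mirror."""
    foreign = []

    def walk(deps):
        # shrinkwrap nests dependencies recursively under "dependencies"
        for name, info in (deps or {}).items():
            resolved = info.get("resolved", "")
            if resolved and not resolved.startswith(prefix):
                foreign.append((name, resolved))
            walk(info.get("dependencies"))

    walk(shrinkwrap.get("dependencies"))
    return foreign

# example input: one dependency pinned to a hard-coded gitlab URL
data = json.loads("""{
  "name": "example-app",
  "dependencies": {
    "good-dep": {"version": "1.0.0",
                 "resolved": "http://npm-mirror.example.com/good-dep-1.0.0.tgz"},
    "bad-dep":  {"version": "2.1.0",
                 "resolved": "git+https://gitlab.example.com/foo/bad-dep.git"}
  }
}""")
print(find_foreign_urls(data))
# -> [('bad-dep', 'git+https://gitlab.example.com/foo/bad-dep.git')]
```

Running something like this at recipe-generation time would surface exactly the packages that will break once the build is cut off from the network.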
Re: [yocto] [Openembedded-architecture] Proposal: dealing with language-specific build tools/dependency management tools
Hi Trevor,

On 10.03.2017 21:49, Trevor Woerner wrote:
> Although the trend is to not care about licensing, I believe it is vitally important that we do our best to keep track of all the licensing from every package that is pulled into an image. If we're pulling in >1000 npm packages just for one node app, then that means we should have a >1000-item list of each dependency and their respective licenses. Although it makes a recipe look ugly, I wouldn't want to drop this functionality due to aesthetic concerns. Maybe the license list could be moved to another file that is required by the "main" recipe file? Maybe the list could be moved to the bottom of the file?

Boiling that down, it sounds to me like the approach is the following:
1) Let the sub-package manager do its work as it's meant to.
2) If the sub-package manager supports version lockdown/shrinkwrapping, it shall be used.
3) The OE build process is only meant to take care of licensing.

(This could basically be seen as an additional Option 0 to the mail from yesterday: license-only recipes. [1]) Sounds like an interesting option indeed. Keeping it in the recipe means, in an abstract manner, that we need support for sub-licensing. Might be a viable route to go.

> In the case of node specifically, I don't think trying to create and maintain separate recipes for each and every dependency one might find in the npm registry would be a sane approach. Currently we embed the version info into the recipe filename. This will simply not scale to millions of npm packages, each with numerous versions.

It will not scale for human inspection. For metadata that is algorithmically generated and used, I personally don't think the sheer number is a killer argument.

> I've been playing with node a fair amount lately as it relates to OE and I have to say I've been quite impressed! These aren't easy things and I think there's a lot of good work happening.

Totally agreed.
But implicitly, we tend to see npm as the reference for such sub-package managers. Is this a good way? Alexander's approach was to find a concept that fits all such constructions. Maybe it's also worthwhile to think along the opposite lines: treat each and every one of those sub-package managers completely on its own, with all its specialities? (And hope that their number stays low.)

> Other than these (short-term?) issues devtool seems to be on the right track (?) It does, for example, generate a lockdown.json file and an npm-shrinkwrap.json file automatically. All we need is the package.json from the app developer, and that can be auto-generated via npm. I think we have to accept that node developers are going to want to develop on the target device itself, and when they're done they can hand us the package.json file which we can run devtool on which will generate the recipe for us.

I'm not too convinced that this is a good way. Especially when it comes to node modules that contain some native code, this becomes very ugly. My experience is that auto-processing that stuff adds megabytes of clutter, while all that matters in the end is a binary that is a couple of kilobytes. So how would one tackle that? Carve that out as a separate recipe again?

> As a short-term work-around, I've simply been creating an image with node+npm, running it on the device, copying over the package.json file, running npm install against it, then collecting up all the extra stuff that gets added to my image (as a result), and bundling all that into a platform-specific "bin_package" (bbclass). It works, but it's a multi-step process. If I could cut out some of those steps (once things from [1] are fixed), it would be an improvement.

Yeah, that's an option. I am rather providing custom compile and install stages, as the applications I'm working on have similar requisites, but I didn't want to go multi-step/binary.
Greetz

PS: being tired of typing "sub-package manager" again and again, how shall we call this? SPM? Application Package Managers (APM)?

[1] http://lists.openembedded.org/pipermail/openembedded-architecture/2017-March/000489.html

--
Josef Holzmayr
Software Developer Embedded Systems
Tel: +49 8444 9204-48
Fax: +49 8444 9204-50
R-S-I Elektrotechnik GmbH & Co. KG
Woelkestrasse 11
D-85301 Schweitenkirchen
www.rsi-elektrotechnik.de
———
Amtsgericht Ingolstadt – GmbH: HRB 191328 – KG: HRA 170393
Geschäftsführer: Dr.-Ing. Michael Sorg, Dipl.-Ing. Franz Sorg
USt-IdNr.: DE 128592548
Re: [yocto] [Openembedded-architecture] Proposal: dealing with language-specific build tools/dependency management tools
Hi Alexander,

Thanks for bringing up this important topic. There is no doubt we're seeing paradigm shifts in the way applications are written, built, and packaged, as well as a complete lack of interest in licensing.

Although the trend is to not care about licensing, I believe it is vitally important that we do our best to keep track of all the licensing from every package that is pulled into an image. If we're pulling in >1000 npm packages just for one node app, then that means we should have a >1000-item list of each dependency and their respective licenses. Although it makes a recipe look ugly, I wouldn't want to drop this functionality due to aesthetic concerns. Maybe the license list could be moved to another file that is required by the "main" recipe file? Maybe the list could be moved to the bottom of the file?

In the case of node specifically, I don't think trying to create and maintain separate recipes for each and every dependency one might find in the npm registry would be a sane approach. Currently we embed the version info into the recipe filename. This will simply not scale to millions of npm packages, each with numerous versions.

I've been playing with node a fair amount lately as it relates to OE, and I have to say I've been quite impressed! These aren't easy things and I think there's a lot of good work happening. I've outlined some of my thoughts on my experiences [1]:

http://lists.openembedded.org/pipermail/openembedded-core/2017-February/133432.html

Other than these (short-term?) issues, devtool seems to be on the right track (?). It does, for example, generate a lockdown.json file and an npm-shrinkwrap.json file automatically. All we need is the package.json from the app developer, and that can be auto-generated via npm. I think we have to accept that node developers are going to want to develop on the target device itself; when they're done they can hand us the package.json file, which we can run devtool on, which will generate the recipe for us.
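The idea above of moving the license list into a separate required file could look something like this (a purely hypothetical sketch; the file names and the split are assumptions, not an existing convention):

```bitbake
# mynodeapp_1.0.bb - the hand-maintained part stays readable
SUMMARY = "Example node application"
require mynodeapp-licenses.inc

# mynodeapp-licenses.inc - regenerated by the tool, one entry per npm module
LICENSE = "MIT & ISC & BSD-3-Clause"
LIC_FILES_CHKSUM += "file://node_modules/left-pad/LICENSE;md5=..."
```

The main recipe would then survive regeneration of the dependency list untouched, while the generated include file carries the >1000-item license inventory.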
As a short-term work-around, I've simply been creating an image with node+npm, running it on the device, copying over the package.json file, running npm install against it, then collecting up all the extra stuff that gets added to my image (as a result), and bundling all that into a platform-specific "bin_package" (bbclass). It works, but it's a multi-step process. If I could cut out some of those steps (once things from [1] are fixed), it would be an improvement.

Best regards,
Trevor

[1] A short recap of those emails: Different paths seem to be followed depending on whether you point devtool at, say, a github repository versus a local checkout of the same project. That seems wrong. Also (as you've pointed out) RSS is messing all this up on master at the moment, but I assume this can/will get fixed? Things work fine on morty. Also, devtool gets tripped up when it encounters a license string that isn't found in its list of already-known license strings. This approach seems doomed to failure. It has to be able to recover gracefully and continue walking the dependency list without having to continuously add corner cases to the code.
Re: [yocto] [Openembedded-architecture] Proposal: dealing with language-specific build tools/dependency management tools
Hi Alexander, thanks for kicking off the topic, sounds like it's kinda overdue. While I have no really good solution (d'oh!), please find below thoughts and bits and pieces about some of the points.

*Recipes*

My gut feeling says auto-generation of the recipes is a good way to go. Yet I am uncertain which set of functionality such an auto-generated recipe would provide. The obvious possibilities:

Option 1: Only dependency/license tracking.
Pro: provides exactly what it says - known and proven means to keep track of the dependencies and licenses.
Contra: this would mean that we have recipes that only do housekeeping, while the actual workload is taken care of in the main recipe. This separation sounds like a massive headache.

Option 2: Full recipes.
Pro: would fit into the OE workflow.
Contra: requires the sub-package managers to at least "play along". Think of two applications having the same dependency. It would get installed once (globally/system-wide), and both applications have to use it. My interpretation is that this just is not how it works (looking at npm).

Coming from there, one can go further, resulting in

Option 1.5: Provide recipes for common base functionality. Those would have the all-in-one-locked-down approach, but are meant to be used as global dependencies. Example: the MEAN stack. Like its name says, it consists of 4 main pieces (-> possible recipes) which are needed by the application.
Pro: reduces recipe number/bloat and makes dependencies readable. The mindset fits the classic 'library' thinking.
Contra: the depending application would have to be packaged with the infrastructure in mind. So while the library recipes could rely on the locked-down sub-package manager, the application would have to intently skip it and provide a custom installation. Which is an annoyance if you are application dev and packager in union - and a major pain point if you want to package some upstream application.
*Lockdown*

To me the approach sounds interesting, yet it comes with a couple of points to think about.

- Feature set: having such a lockdown system implies/requires all sub-package managers to provide (at least) the functionality to fulfill the needs of the lockdown process/recreation. Is that something we can take for granted?

- Multilanguage: imagine a package for example having some native Go code, then nodejs bindings, and then a nodejs application on top. How would that look? Multiple lockdown files? What are the implications?

*Sub-package managers in general*

We've seen the first (perl, php, python...) and second (npm, go, rust...) wave of those by now. A third one certainly will come one day. Taking a step back for a larger perspective, it sounds like what we actually need is some form of nested dependencies. Or scoped dependencies. Whatever we want to call it, because to me that's what it actually is. The dependencies we have now are always global. But especially the second-wave things think differently: those sub-package managers do not care about the whole system, they care about their small part of it. So my interpretation is that we need to accept that paradigm shift, and decide upon the actual implementation details afterwards.

*Conclusion*

Guess I raised more questions than I offered answers for. Sorry :-(

Greetz (and try to enjoy the weekend)

--
Josef Holzmayr
Software Developer Embedded Systems
Tel: +49 8444 9204-48
Fax: +49 8444 9204-50
R-S-I Elektrotechnik GmbH & Co. KG
Woelkestrasse 11
D-85301 Schweitenkirchen
www.rsi-elektrotechnik.de
———
Amtsgericht Ingolstadt – GmbH: HRB 191328 – KG: HRA 170393
Geschäftsführer: Dr.-Ing. Michael Sorg, Dipl.-Ing. Franz Sorg
USt-IdNr.: DE 128592548
Re: [yocto] [Openembedded-architecture] Proposal: dealing with language-specific build tools/dependency management tools
On 3/10/17 8:58 AM, Otavio Salvador wrote:
> On Fri, Mar 10, 2017 at 11:48 AM, Alexander Kanavin wrote:
>> On 03/10/2017 04:30 PM, Otavio Salvador wrote:
>>>
>>> When integrating the CHICKEN Scheme support into the Yocto Project we
>>> dealt with it using their installation tool, but made the individual
>>> packages (eggs, in this specific case) into individual recipes. We went
>>> further and automated the recipe generation, and this made it quite easy
>>> to maintain in the long term.
>>>
>>> Take a look at:
>>>
>>> https://github.com/OSSystems/meta-chicken
>>
>> Thanks, even though my Scheme-fu isn't great :)
>>
>> I'd like to avoid generating entire separate recipes though, because that
>> implies your custom-written tool would be figuring out where the dependency
>> source came from in the first place, and what are its own dependencies, when
>> creating the recipe, which can be tricky, breakage-prone guesswork.
>
> In fact not; as you generate the recipes for the dependencies, it goes
> recursively and is always good.
>
>> I want to use existing tools (like 'npm install') for getting the stuff from
>> the network - we don't really need full recipes, we just want to know the
>> licenses of the dependencies, and, if possible, lock them down to a specific
>> version.
>
> Well we initially thought this would suffice, but consider a security
> flaw. As many apps may be using different versions of the same package, it
> becomes a nightmare to figure out which ones are affected. If using
> dependencies it is fine, for free.

I'm wondering if there may be any way to generate these recipes on the fly (say during ConfigParsed - and then issue a reparse if things have changed?)

I've got concerns with this "new model" of development, specifically around:

*) Bug fixes
   -) Security bugs especially

When something is updated to fix a problem, how do we know we "got everything"?
Something needs to flag the system to make it clear that not only the item itself, but all of the things it depends on, are at the correct (revised) versions.

The other problem is 'certified' devices. Various certification requirements don't allow just 'downloading new content whenever', but they often allow small targeted fixes. So in these cases, the user would likely want to know there was an update for a specific reason and then 'backport, and not change the version' to address their certification issues. (And yes, I know of companies that used to lock down version numbers of code, and simply upgrade anyway to get around this - it's always possible, but not exactly within the spirit of the certification process.. targeted changes are much more reasonable in that environment.)

*) License considerations

A big problem in all of these new developer-centric models is that they are designed for back-end services where the license doesn't matter (as much). For embedded devices, we have to have a clear lineage of the license, code and source (SRC_URI) to have any chance of "doing the right thing" - or, at worst, defending the work during a lawsuit.

*) Export/import control

Unlike random open source projects, when you build a physical device there may be export/import control that has to happen as well. Without the discrete components being clearly visible to the developers, it is -very- difficult to satisfy the constraints of the various export/import requirements around the world.

I think auto-generating recipe contents helps with these. Having a good process behind the auto-generation can make the license and software "update" procedure a lot cleaner and provide the tracking needed for the export/import control. I do see that we need a better model for these components as we move away from the core operating system and 'up the stack'; better auto-generation that uses the tooling provided, while still conforming to the needs we have, makes the most sense to me.
(The problem, of course, is that most of these seem to need an expert, or at least a highly interested developer, to have a shot at working -- or to keep working over time. The developer needs to be able to explain the alternative model to OE so we can collectively figure out how to do it best.)

I think this is a good start at the discussion, but other than listing concerns, unfortunately I don't have any solutions to offer.

--Mark
--
___
yocto mailing list
yocto@yoctoproject.org
https://lists.yoctoproject.org/listinfo/yocto
Re: [yocto] [Openembedded-architecture] Proposal: dealing with language-specific build tools/dependency management tools
On 03/10/2017 04:58 PM, Otavio Salvador wrote:
>> I'd like to avoid generating entire separate recipes though, because
>> that implies your custom-written tool would be figuring out where the
>> dependency source came from in the first place, and what its own
>> dependencies are, when creating the recipe, which can be tricky,
>> breakage-prone guesswork.
>
> In fact no; as you generate the recipes for the dependencies, it goes
> recursively and is always correct.

Would it also be true for npm, Rust, Go, and other languages that will come along? In your specific case the metadata may be easily available to parse and convert to recipe form, but this may not hold in other situations. The npm fetcher, for instance, was a nightmare to write, from what I've heard:
http://git.yoctoproject.org/cgit/cgit.cgi/poky/tree/bitbake/lib/bb/fetch2/npm.py

>> I want to use existing tools (like 'npm install') for getting the stuff
>> from the network - we don't really need full recipes, we just want to
>> know the licenses of the dependencies and, if possible, lock them down
>> to a specific version.
>
> Well, we initially thought this would suffice, but consider a security
> flaw. As many apps may be using different versions of the same package,
> it becomes a nightmare to figure out which ones are affected. If you use
> recipes for the dependencies, this is handled for free.

The lockdown files would list the versions of the dependencies (if that is possible, which is not always true), so you can inspect those to see if something is vulnerable. In the node.js or Go worlds the libraries are not reused between apps anyway, so it really doesn't matter whether they're packaged as separate recipes or not. (I didn't have time to check Rust, but as it also uses lockdown files, I believe the libraries are not reused either.)

Alex
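The lock-file inspection Alex describes can be sketched roughly as follows. This assumes the classic npm-shrinkwrap/package-lock layout, in which each dependency entry may nest its own "dependencies" object (a simplification; newer lock formats lay the tree out differently):

```python
import json

def locked_versions(lock_text):
    """Flatten an npm-shrinkwrap/package-lock style file into a set of
    (name, version) pairs, nested dependency trees included.

    The resulting set is what you would scan when a vulnerability
    advisory names a specific package and version range.
    """
    found = set()

    def walk(deps):
        for name, info in deps.items():
            found.add((name, info["version"]))
            # Classic lock files nest transitive deps under each entry.
            walk(info.get("dependencies", {}))

    walk(json.loads(lock_text).get("dependencies", {}))
    return found
```

Because the whole tree is pinned in one file, one pass over it answers "do we ship version X of package Y?" without any recipe per dependency.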
Re: [yocto] [Openembedded-architecture] Proposal: dealing with language-specific build tools/dependency management tools
On Fri, Mar 10, 2017 at 11:48 AM, Alexander Kanavin wrote:
> On 03/10/2017 04:30 PM, Otavio Salvador wrote:
>> When integrating the CHICKEN Scheme support into the Yocto Project we
>> dealt with it using their installation tool, but made the individual
>> packages (eggs, in this specific case) into individual recipes. We went
>> further and automated the recipe generation, and this made it quite
>> easy to maintain in the long term.
>>
>> Take a look at:
>>
>> https://github.com/OSSystems/meta-chicken
>
> Thanks, even though my Scheme-fu isn't great :)
>
> I'd like to avoid generating entire separate recipes though, because
> that implies your custom-written tool would be figuring out where the
> dependency source came from in the first place, and what its own
> dependencies are, when creating the recipe, which can be tricky,
> breakage-prone guesswork.

In fact no; as you generate the recipes for the dependencies, it goes recursively and is always correct.

> I want to use existing tools (like 'npm install') for getting the stuff
> from the network - we don't really need full recipes, we just want to
> know the licenses of the dependencies and, if possible, lock them down
> to a specific version.

Well, we initially thought this would suffice, but consider a security flaw. As many apps may be using different versions of the same package, it becomes a nightmare to figure out which ones are affected. If you use recipes for the dependencies, this is handled for free.

--
Otavio Salvador
O.S. Systems
http://www.ossystems.com.br
http://code.ossystems.com.br
Mobile: +55 (53) 9981-7854
Mobile: +1 (347) 903-9750
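Otavio's security-flaw scenario -- many apps, each pinning a different version of the same package -- amounts to an audit like the following sketch. The names are hypothetical, and the per-app dependency sets would come from each app's lock file:

```python
def affected_apps(app_deps, package, bad_versions):
    """Given a mapping of app name -> set of (package, version) pairs
    (e.g. extracted from each app's lock file), return the apps that
    ship an affected version of `package`.

    This is the audit that becomes painful without per-dependency
    recipes: every app has to be checked individually.
    """
    bad = set(bad_versions)
    return sorted(
        app for app, deps in app_deps.items()
        if any(name == package and ver in bad for name, ver in deps)
    )
```

With one recipe per dependency, the same question is answered by looking at a single recipe's version, which is Otavio's "for free" argument.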
Re: [yocto] [Openembedded-architecture] Proposal: dealing with language-specific build tools/dependency management tools
On 03/10/2017 04:30 PM, Otavio Salvador wrote:
> When integrating the CHICKEN Scheme support into the Yocto Project we
> dealt with it using their installation tool, but made the individual
> packages (eggs, in this specific case) into individual recipes. We went
> further and automated the recipe generation, and this made it quite
> easy to maintain in the long term.
>
> Take a look at:
>
> https://github.com/OSSystems/meta-chicken

Thanks, even though my Scheme-fu isn't great :)

I'd like to avoid generating entire separate recipes though, because that implies your custom-written tool would be figuring out where the dependency source came from in the first place, and what its own dependencies are, when creating the recipe, which can be tricky, breakage-prone guesswork.

I want to use existing tools (like 'npm install') for getting the stuff from the network - we don't really need full recipes, we just want to know the licenses of the dependencies and, if possible, lock them down to a specific version.

Alex
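The "licenses of the dependencies" part could, for example, be collected after 'npm install' by reading each installed module's package.json. A minimal sketch, taking the file contents directly so the audit step stays explicit (packages without a declared license are reported as UNKNOWN rather than guessed):

```python
import json

def dependency_licenses(package_json_texts):
    """Collect the declared license of each installed dependency.

    In practice you would feed this the package.json files found under
    node_modules/ after an 'npm install' into a private directory.
    Returns a sorted list of (name, version, license) tuples; packages
    that declare no license show up as 'UNKNOWN' so they can be flagged
    for manual review.
    """
    result = []
    for text in package_json_texts:
        meta = json.loads(text)
        result.append((meta["name"], meta["version"],
                       meta.get("license", "UNKNOWN")))
    return sorted(result)
```

This keeps the fetching in the hands of the language's own tool while still producing the license inventory that the build system needs.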
Re: [yocto] [Openembedded-architecture] Proposal: dealing with language-specific build tools/dependency management tools
Hello Alexander,

On Fri, Mar 10, 2017 at 10:49 AM, Alexander Kanavin wrote:
> The new generation of programming languages (think node.js, Go, Rust)
> is a poor fit for the Yocto build model, which follows the traditional
> Unix model. In particular, those new development environments have no
> problem with 'grabbing random stuff from the Internet' as a part of the
> development and build process. However, Yocto has very strict rules
> about the build steps and what they can and cannot do, and also a
> strict enforcement of license and version checks for every component
> that gets built. Those two models clash, and this is a proposal of how
> they could be reconciled.

When integrating the CHICKEN Scheme support into the Yocto Project we dealt with it using their installation tool, but made the individual packages (eggs, in this specific case) into individual recipes. We went further and automated the recipe generation, and this made it quite easy to maintain in the long term.

Take a look at:

https://github.com/OSSystems/meta-chicken

--
Otavio Salvador
O.S. Systems
http://www.ossystems.com.br
http://code.ossystems.com.br
Mobile: +55 (53) 9981-7854
Mobile: +1 (347) 903-9750