Hi,
On Fri, 2019-10-25 at 11:10 +0200, Stefan Herbrechtsmeier wrote:
> Hi Andre,
>
> Am 25.10.19 um 10:01 schrieb André Draszik:
> > Hi,
> >
> > This has been an interesting discussion so far.
> >
> > I'd like to throw in something else...
> >
> > A couple of years back I wrote a little Python script to automatically
> > generate all the required dependency recipes given an npm project's
> > package.json.
>
> This is similar to my prototype, but I try to reuse recipetool and
> this makes the recipe creation very slow.
Attached is the latest snapshot of the scripts and support classes
that I had used back then...
nodejs.bbclass does the main work of handling the recipes.
It supports using the same npm package in different versions
by adding appropriate symlinks in the file system.
Also, it extracts the runtime depends from package.json (and
adds runtime provides) so as to get proper dependency trees.
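For illustration, here is a minimal sketch (plain Python, names my own, not the actual bbclass code) of the idea: each dependency in package.json is mapped to a `virtual-npm-<name>` runtime provide/depend, so several versions of the same module can coexist in one image:

```python
import json

def runtime_depends(package_json_text):
    """Map a package.json's 'dependencies' to virtual package names,
    mirroring the nodejs.bbclass approach of depending on
    'virtual-npm-<name>' rather than on a concrete recipe name."""
    metadata = json.loads(package_json_text)
    deps = metadata.get('dependencies', {})
    return sorted('virtual-npm-' + name for name in deps)

example = '{"name": "left-pad", "dependencies": {"foo": "^1.0.0", "bar": "~2.1.0"}}'
print(runtime_depends(example))  # ['virtual-npm-bar', 'virtual-npm-foo']
```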
nodegyp.bbclass is for recipes that use node-gyp (or node-pre-gyp).
This needs node-gyp and gyp (as build tools). I didn't
create-recipe-from-json.py is the actual script; point it at a
package.json (and an optional shrinkwrap). It does the other part of the
deduplication by keeping track of the dependency tree and adding the
appropriate symlink information to the recipes as understood by
nodejs.bbclass.
At the time I had a problem with runtime depends for native packages;
RDEPENDS for native packages wasn't implemented in OE back then.
I've commented out the "DEPENDS_append_class-native" handling now,
as this should work without tricks these days.
node-pre-gyp support wasn't fully implemented in the create-recipe
script. npm modules using node-gyp or node-pre-gyp did work fine and
could be cross-compiled without problem, though.
Some npm modules require extra handling, e.g. patches to be applied;
this is done around line 205, where an extra 'require' line is added
to the generated recipe. I've also added nodejs-node-gyp.inc as an
example of what that could look like.
I didn't use recipetool, either because I didn't know it back then or it
wasn't available at the time.
It's very fast, though. Most time is spent downloading the packages for
analysis...
> Do you reuse code from OE to generate the license information?
I had to look it up; this used 'licensee': https://github.com/licensee/licensee
> > The generated recipes worked very well, including cross-compilation using
> > node-gyp.
>
> Do you use any bbclass? I have created multiple bbclasses to minimize the
> code inside the recipes.
All included in this mail.
>
> > At least at the time this was all reasonably straightforward, avoided
> > *any* use of npm, gave me all the benefits of reproducibility, yocto
> > caching, license management, correct cross-compilation for native code,
> > etc. Also, the generated yocto packages contained none of the test or
> > other items that npm packages usually contain and don't need to be
> > copied onto the target for image builds. All this by simply inspecting
> > the package.json.
>
> This is the reason I go this way too.
>
> > This approach also minimised maintenance cost, as all recipes were
> > auto-generated.
> >
> > The only downside was that bitbake became a bit slow, as the number of
> > tasks went to about 15000.
>
> Do you create a recipe per package version?
Yes, but it is clever in the sense that deduplication is applied, including
looking for matching version numbers as per the version specification.
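The version matching can be sketched roughly like this (simplified, my own names; the real script's npm_version_is_compatible also handles ~, ||, x-ranges, and hyphen ranges): given the versions for which recipes already exist, reuse one that satisfies the caret range instead of generating a new recipe.

```python
def caret_match(version, spec):
    """Return True if 'version' satisfies a '^major.minor.patch' npm spec.
    With a zero major version, the caret pins the minor version instead."""
    want = [int(x) for x in spec.lstrip('^').split('.')]
    have = [int(x) for x in version.split('.')]
    if want[0] != 0:
        return have[0] == want[0] and have >= want
    return have[:2] == want[:2] and have >= want

def pick_recipe(existing_versions, spec):
    """Pick the highest already-generated version compatible with spec."""
    compatible = [v for v in existing_versions if caret_match(v, spec)]
    return max(compatible, key=lambda v: [int(x) for x in v.split('.')],
               default=None)

print(pick_recipe(['1.2.3', '1.4.0', '2.0.1'], '^1.2.0'))  # reuses 1.4.0
```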
>
> > I can dig this out and share on Monday. This script could live in the
> > scripts/ subdirectory, allowing people to create recipes on demand for
> > projects they care about.
>
> Would be nice to see your code.
>
> Regards
> Stefan
Cheers,
Andre'
DEPENDS += "gyp-native"
inherit pythonnative
EXTRA_OEMAKE = "\
CC.host="${BUILD_CC}" \
CFLAGS.host="${BUILD_CFLAGS}" \
CXX.host="${BUILD_CXX}" \
CXXFLAGS.host="${BUILD_CXXFLAGS}" \
LINK.host="${BUILD_CXX}" \
LDFLAGS.host="${BUILD_LDFLAGS}" \
\
LINK.target="${CXX}" \
"
DEPENDS += "nodejs-node-gyp1.0.3-native nodejs-native"
inherit gyp nodejs
export NODE_GYP_NODEJS_INCLUDE_DIR = "${PKG_CONFIG_SYSROOT_DIR}${NODEJS_INCLUDE_DIR}"
nodegyp_do_configure() {
node-gyp --arch ${TARGET_ARCH} --verbose configure ${EXTRA_OECONF}
}
python __anonymous () {
# Ensure we run this usually noexec task (due to nodejs.bbclass)
d.delVarFlag("do_compile", "noexec")
}
nodegyp_do_compile() {
node-gyp --arch ${TARGET_ARCH} --verbose build ${EXTRA_OEMAKE}
}
nodegyp_shell_do_install_helper() {
cd ${D}${NODE_MODULE_DIR}
# (node-)gyp creates multiple copies of the object files, we
# only keep one!
# node-expat: build/Release/node_expat.node (patch needed)
# sqlite3: lib/node_sqlite3.node (no build/ needed!)
if [ -d build ] ; then
# dtrace-provider builds nothing for !MacOS
if ls build/Release/*.node 2> /dev/null > /dev/null ; then
tmpdir=$(mktemp -d --tmpdir=./)
mv build/Release/*.node $tmpdir
rm -rf build
mkdir -p build/Release
mv $tmpdir/*.node build/Release
rm -rf $tmpdir
else
rm -rf build
fi
fi
}
python nodegyp_do_install() {
bb.build.exec_func("nodejs_do_install", d)
bb.build.exec_func("nodegyp_shell_do_install_helper", d)
}
EXPORT_FUNCTIONS do_configure do_compile do_install
# While nodejs(-native) is just a runtime dependency (RDEPENDS)
# we still need to add it to DEPENDS because:
# - the RDEPENDS is added to the target packages late, after
# OE has determined which packages it needs to build. Hence
# it wouldn't build nodejs(-native)
# - RDEPENDS are completely ignored for -native packages, so this
# is our way to make sure that nodejs-native ends up in the
# sysroot for native packages
DEPENDS += "nodejs"
NODE_MODULE = "${@('${BPN}'.replace('nodejs-', '', 1) if '${BPN}'.startswith('nodejs-') else '${BPN}').replace('${PV}', '', 1)}"
SRC_URI = "http://registry.npmjs.org/${NODE_MODULE}/-/${NODE_MODULE}-${PV}.tgz;subdir=${BP}"
NODEJS_SITELIB_DIR = "${libdir}/node_modules"
NODEJS_INCLUDE_DIR = "${includedir}/node"
export NODE_LIBDIR = "${PKG_CONFIG_SYSROOT_DIR}${NODEJS_SITELIB_DIR}"
NODE_MODULE_DIR = "${NODEJS_SITELIB_DIR}/${NODE_MODULE}${PV}"
# Tarballs store the contents of the module inside a 'package' directory
# This conflicts with OE's 'package' directory, which is dynamically
# created during the packaging process. So we have to make sure to extract
# the tarball into a different directory.
S = "${WORKDIR}/${BP}/package"
python nodejs_do_unpack() {
import shutil
bb.build.exec_func('base_do_unpack', d)
# some node modules (ejs!) have an unusual directory structure,
# i.e. they don't use package/ but ${NODE_MODULE}-v${PV}
for path in [ "${BP}/${NODE_MODULE}-v${PV}", "${BP}/${NODE_MODULE}" ]:
if os.path.exists(path):
shutil.rmtree("${S}", True)
shutil.move(path, "${S}")
break
# Remove bundled node_modules if there are any...
shutil.rmtree("${S}/node_modules", True)
}
# Some node modules have Makefiles which don't build but run tests
# instead. This fails in OE, as Makefiles are expected to build if
# they exist. So let's just prevent automatic building.
do_compile[noexec] = "1"
python nodejs_do_install() {
import shutil
# copy everything
shutil.copytree("${B}", "${D}${NODE_MODULE_DIR}", symlinks=True)
# remove unneeded dirs
os.chdir("${D}${NODE_MODULE_DIR}")
# these are created by OE
shutil.rmtree(".pc", True)
shutil.rmtree("patches", True)
# these are usually not needed
shutil.rmtree("test", True)
shutil.rmtree("tests", True)
shutil.rmtree("example", True)
shutil.rmtree("examples", True)
}
python nodejs_do_install_append_class-native() {
import json
if '${NODE_MODULE}' != 'gulp' and '${NODE_MODULE}' != 'node-gyp':
return
os.chdir("${D}${NODE_MODULE_DIR}")
# install bin files
bindir_files = {}
metadata_p = json.load(open('package.json'))
    # which files do we need to install into bin/
    if 'bin' in metadata_p:
        # this can be a dict, or a simple string, see
        # https://docs.npmjs.com/files/package.json
        if isinstance(metadata_p['bin'], basestring):
            bindir_files[metadata_p['name']] = os.path.normpath(metadata_p['bin'])
elif isinstance(metadata_p['bin'], dict):
for link, target in metadata_p['bin'].items():
bindir_files[link] = os.path.normpath(target)
        else:
            raise ValueError('Unsupported entry for "bin": ' + str(type(metadata_p['bin'])))
if bindir_files:
d_bindir = "${D}${bindir}"
if not os.path.exists(d_bindir):
os.mkdir(d_bindir)
for link, target in bindir_files.items():
os.chmod(target, 0755)
os.symlink(os.path.join("${NODE_MODULE_DIR}", target),
os.path.join(d_bindir, link))
}
NODEJS_SKIP_MODULES_SYMLINK ?= "0"
def nodejs_do_install_symlinks2(d, do_devdepends=False):
    """
    Symlink a node module's dependencies into the node_modules directory
    so node.js can find them
    """
import json
import os
import shutil
    # inspired from fedora's nodejs-packaging-fedora-7.tar.xz::nodejs-symlink-deps
sitelib = d.getVar('NODEJS_SITELIB_DIR', True)
def symlink(source, dest):
try:
os.symlink(source, dest)
except OSError:
if os.path.isdir(dest):
shutil.rmtree(dest)
else:
os.unlink(dest)
os.symlink(source, dest)
    def symlink_deps(deps):
        s_sitelib = os.path.join(d.getVar('STAGING_DIR_NATIVE', True), sitelib.lstrip("/"))
def symlink_dep_from_rdepends(rdepends, dep):
target = d.getVarFlag("NODEJS_SYMLINKS", dep)
            if not target:
                # fatal for now...
                bb.fatal("Dependency {0} doesn't exist in NODEJS_SYMLINKS!".format(dep))
for key in rdepends.keys():
t = key.replace('nodejs-', '', 1).replace('-native', '', 1)
if t.startswith(dep + '-'):
target = t
del rdepends[key]
break
            if not target:
                bb.fatal("Symlink to dependency {0} couldn't be resolved!".format(dep))
if not do_devdepends:
target = os.path.join("..", "..", target)
else:
target = os.path.join(s_sitelib, target)
symlink(target, dep)
        def sanity_check_symlink(dep):
            # Symlink to dep should have been created due to RDEPENDS_${PN}
            # parsing
            if not os.path.lexists(dep):
                bb.fatal("The package {0} depends on {1}, but the symlink "
                         "wasn't created yet. Error in RDEPENDS_${{PN}}."
                         .format(d.getVar('PN', True), dep))
            if do_devdepends:
                if not os.path.exists(dep):
                    bb.fatal("The package nodejs-{0} has not been "
                             "installed into staging ({1}) even though "
                             "{2} needs it!"
                             .format(dep, os.readlink(dep), d.getVar('PN', True)))
rdepends = {}
me = d.getVar("PN", True)
        # -native packages have an empty PACKAGES
        #packages = d.getVar("PACKAGES", True).split() or [ d.getVar("PN", True) ]
        # actually, we only want the main package, as that is the place where
        # all dependencies should be stated!
        packages = [ d.getVar("PN", True) ]
        for pkg in packages:
            rdepends.update(bb.utils.explode_dep_versions2(d.getVar('RDEPENDS_' + pkg, True) or ""))
for key in rdepends.keys():
if not key.startswith('nodejs-'):
del rdepends[key]
if isinstance(deps, dict):
for dep, ver in deps.iteritems():
symlink_dep_from_rdepends(rdepends, dep)
for dep, ver in deps.iteritems():
sanity_check_symlink(dep)
elif isinstance(deps, list):
for dep in deps:
symlink_dep_from_rdepends(rdepends, dep)
for dep in deps:
sanity_check_symlink(dep)
elif isinstance(deps, basestring):
symlink_dep_from_rdepends(rdepends, deps)
sanity_check_symlink(deps)
    else:
        bb.fatal("Invalid package.json while building {0}: dependencies not a valid type".format(d.getVar('PN', True)))
    if not do_devdepends:
        d_sitelib = os.path.join(d.getVar('D', True), sitelib.lstrip('/'))
        modules = [os.path.join(d_sitelib, module) for module in os.listdir(d_sitelib)]
    else:
        modules = [d.getVar('S', True)]
for path in modules:
os.chdir(path)
metadata = json.load(open('package.json'))
        # Normally, we would want to error out if the install process created
        # this directory, but it could also still be around from a previous
        # run, so let's just delete it.
        shutil.rmtree('node_modules', True)
        if (not do_devdepends and ('dependencies' in metadata or 'optionalDependencies' in metadata)) \
                or (do_devdepends and 'devDependencies' in metadata):
os.mkdir('node_modules')
oldwd = os.getcwd()
os.chdir('node_modules')
if not do_devdepends:
if 'dependencies' in metadata:
symlink_deps(metadata['dependencies'])
if 'optionalDependencies' in metadata:
symlink_deps(metadata['optionalDependencies'])
else:
if 'devDependencies' in metadata:
symlink_deps(metadata['devDependencies'])
os.chdir(oldwd)
if not os.listdir('node_modules'):
os.rmdir('node_modules')
python nodejs_do_install_symlinks() {
if d.getVar("NODEJS_SKIP_MODULES_SYMLINK", True) == "0":
nodejs_do_install_symlinks2(d, False)
}
nodejs_do_install_symlinks[vardeps] += "NODEJS_SYMLINKS NODEJS_SKIP_MODULES_SYMLINK NODEJS_SITELIB_DIR"
addtask install_symlinks after do_install before do_populate_sysroot do_package
python nodejs_do_link_devdepends() {
nodejs_do_install_symlinks2(d, True)
}
do_link_devdepends[noexec] = "1"
do_link_devdepends[deptask] = "do_populate_sysroot"
nodejs_do_link_devdepends[vardeps] += "NODEJS_SYMLINKS NODEJS_SITELIB_DIR"
addtask link_devdepends after do_patch before do_compile
python nodejs_extract_runtime_provides() {
"""
Extract RPROVIDES from package.json.
See 'man npm-json' for details.
"""
import json
import os
# inspired from package.bbclass
sitelib = d.getVar('NODEJS_SITELIB_DIR', True)
pkgdest = d.getVar('PKGDEST', True)
packages = d.getVar("PACKAGES").split()
for pkg in packages:
# it's highly unlikely that we'll ever get more than one file but we
# handle this in a generic way nevertheless
files = []
p_sitelib = os.path.join(pkgdest, pkg, sitelib.lstrip("/"))
for file in pkgfiles[pkg]:
# only process files in the sitelib, and ignore files local to the
# current package
if file.find(p_sitelib) == 0:
if os.path.basename(file) == 'package.json':
files.append(file)
for file in files:
fh = open(file)
metadata = json.load(fh)
fh.close()
            # inspired from fedora's nodejs-packaging-fedora-7.tar.xz::nodejs.prov
            if 'name' in metadata and not ('private' in metadata and metadata['private']):
pname = 'virtual-npm-' + metadata['name']
                ver = []
                # bitbake does not understand versioned RPROVIDES, see e.g. here
                # http://lists.openembedded.org/pipermail/openembedded-commits/2014-July/162238.html
                # (pkgconfig: Drop version from RPROVIDES)
                #if 'version' in metadata:
                #    # metadata is unicode, but OE expects string in these vars
                #    # due to package.bbclass::write_if_exists()
                #    ver = metadata['version'].encode('utf8')
                # metadata is unicode, but OE expects string in these vars
                # due to package.bbclass::write_if_exists()
                pname = pname.encode('utf8')
                # normally, what we have added as default via RPROVIDES_${PN} = ... at
                # the end of the file should match NODE_MODULE, so normally there is
                # nothing to do here. Just in case, we have some code to be failsafe,
                # though.
                rprovides = bb.utils.explode_dep_versions2(d.getVar('RPROVIDES_' + pkg, True) or "")
                if pname not in rprovides:
                    rprovides[pname] = ver
                d.setVar('RPROVIDES_' + pkg, bb.utils.join_deps(rprovides, commasep=False))
}
python nodejs_extract_runtime_depends() {
"""
Extract RDEPENDS from package.json.
See `man npm-json` for details.
"""
import json
import os
# inspired from fedora's nodejs-packaging-fedora-7.tar.xz::nodejs.req
# write out the node.js interpreter dependency
    def convert_dep(req, operator, version):
        """Converts one of the two possibly listed versions into an OE
        dependency"""
deps = []
if not version or version == '*':
# any version will do
deps.append(req)
elif operator in ['>', '<', '<=', '>=', '=']:
# any prefix other than ~ and ^ makes things dead simple
deps.append(' '.join([req, '(' + operator, version + ')']))
        else:
            # here be dragons...
            # split the dotted portions into a list (handling trailing dots properly)
            parts = [part if part else 'x' for part in version.split('.')]
            parts = [int(part) if part != 'x' and not '-' in part else part
                     for part in parts]
if len(parts) == 1 or parts[1] == 'x':
# 1 or 1.x or 1.x.x or ~1 or ^1
if parts[0] != 0:
deps.append('{0} (>= {1})'.format(req, parts[0]))
deps.append('{0} (< {1})'.format(req, parts[0]+1))
            elif len(parts) == 3 or operator != '~':
                # 1.2.3 or 1.2.3-4 or 1.2.x or ~1.2.3 or ^1.2.3 or 1.2
                if len(parts) == 2 or parts[2] == 'x':
                    # 1.2.x or 1.2
                    deps.append('{0} (>= {1}.{2})'.format(req, parts[0], parts[1]))
                    deps.append('{0} (< {1}.{2})'.format(req, parts[0], parts[1]+1))
                elif operator == '~' or (operator == '^' and parts[0] == 0 and parts[1] > 0):
                    # ~1.2.3 or ^0.1.2 (zero is special with the caret operator)
                    deps.append('{0} (>= {1})'.format(req, version))
                    deps.append('{0} (< {1}.{2})'.format(req, parts[0], parts[1]+1))
                elif operator == '^' and parts[0:2] != [0,0]:
                    # ^1.2.3
                    deps.append('{0} (>= {1})'.format(req, version))
                    deps.append('{0} (< {1})'.format(req, parts[0]+1))
else:
# 1.2.3 or 1.2.3-4 or ^0.0.3
# This should normally be:
# deps.append('{0} (= {1})'.format(req, version))
# but we add PR and assume that it will always be r0 for
# all recipes. Otherwise dependencies can not be resolved
# as the package version would not match (due to the
# -${PR} suffix)
deps.append('{0} (= {1}-r0)'.format(req, version))
elif operator == '~':
# ~1.2
deps.append('{0} (>= {1})'.format(req, version))
deps.append('{0} (< {1})'.format(req, parts[0]+1))
elif operator == '^':
# ^1.2
deps.append('{0} (>= {1})'.format(req, version))
deps.append('{0} (< {1})'.format(req, parts[0]+1))
return deps
def process_dep(req, version):
"""Converts an individual npm dependency into OE dependencies"""
import re
deps = []
# there's no way we can do anything like an OR dependency
        if '||' in version:
            bb.warn("The {0} package has an OR (||) dependency on {1}: {2}\n".format(d.getVar('PN', True), req, version) +
                    "Please manually include a versioned RDEPENDS_$PN = \"{0} (<version>)\" in the ".format(req) +
                    "{0} recipe if necessary".format(d.getVar('FILE', True)))
            deps.append(req)
elif ' - ' in version:
gt, lt = version.split(' - ')
deps.append(req + ' (>= ' + gt + ')')
deps.append(req + ' (<= ' + lt + ')')
        else:
            RE_VERSION = re.compile(r'\s*v?([<>=~^]{0,2})\s*([0-9][0-9\.\-]*)\s*')
m = re.match(RE_VERSION, version)
if m:
deps += convert_dep(req, m.group(1), m.group(2))
# There could be up to two versions here (e.g.">1.0 <3.1")
if len(version) > m.end():
m = re.match(RE_VERSION, version[m.end():])
if m:
deps += convert_dep(req, m.group(1), m.group(2))
else:
deps.append(req)
return deps
sitelib = d.getVar('NODEJS_SITELIB_DIR', True)
pkgdest = d.getVar('PKGDEST', True)
# inspired from package.bbclass
packages = d.getVar("PACKAGES").split()
for pkg in packages:
deps = []
recommends = []
# it's highly unlikely that we'll ever get more than one file but we
# handle this in a generic way nevertheless
files = []
p_sitelib = os.path.join(pkgdest, pkg, sitelib.lstrip("/"))
for file in pkgfiles[pkg]:
# only process files in the sitelib, and ignore files local to the
# current package
if file.find(p_sitelib) == 0:
if os.path.basename(file) == 'package.json':
files.append(file)
for file in files:
fh = open(file)
metadata = json.load(fh)
fh.close()
            req = 'nodejs'
            if 'engines' in metadata and isinstance(metadata['engines'], dict) \
                    and 'node' in metadata['engines']:
                deps += process_dep(req, metadata['engines']['node'])
            else:
                deps.append(req)
if 'dependencies' in metadata:
if isinstance(metadata['dependencies'], dict):
for name, version in metadata['dependencies'].iteritems():
req = 'virtual-npm-' + name
deps += process_dep(req, version)
elif isinstance(metadata['dependencies'], list):
for name in metadata['dependencies']:
req = 'virtual-npm-' + name
deps.append(req)
elif isinstance(metadata['dependencies'], basestring):
req = 'virtual-npm-' + metadata['dependencies']
deps.append(req)
                else:
                    raise TypeError('invalid package.json: dependencies not a valid type')
if 'optionalDependencies' in metadata:
                if isinstance(metadata['optionalDependencies'], dict):
                    for name, version in metadata['optionalDependencies'].iteritems():
req = 'virtual-npm-' + name
recommends += process_dep(req, version)
elif isinstance(metadata['optionalDependencies'], list):
for name in metadata['optionalDependencies']:
req = 'virtual-npm-' + name
recommends.append(req)
elif isinstance(metadata['optionalDependencies'], basestring):
req = 'virtual-npm-' + metadata['optionalDependencies']
recommends.append(req)
                else:
                    raise TypeError('invalid package.json: optionalDependencies not a valid type')
            rdepends = bb.utils.explode_dep_versions2(d.getVar('RDEPENDS_' + pkg, True) or "")
# remove all nodejs-xxx rdependencies originating in the recipe itself
# they are not useful, as we only want RDEPENDS on the virtual-npm-xxx
# packages, as that allows us to have multiple versions installed in
# one image.
# Maybe we should switch all recipes to using DEPENDS to avoid the need
# for this?
for key in rdepends.keys():
if key.startswith('nodejs-'):
del rdepends[key]
# add new rdependencies which we just figured out
for dep in deps:
# metadata is unicode, but OE expects string in these vars
# due to package.bbclass::write_if_exists()
dep = dep.encode('utf8')
if dep not in rdepends:
rdepends[dep] = []
            d.setVar('RDEPENDS_' + pkg, bb.utils.join_deps(rdepends, commasep=False))
            rrecommends = bb.utils.explode_dep_versions2(d.getVar('RRECOMMENDS_' + pkg, True) or "")
# add new rrecommends which we just figured out
for recommend in recommends:
# metadata is unicode, but OE expects string in these vars
# due to package.bbclass::write_if_exists()
recommend = recommend.encode('utf8')
if recommend not in rrecommends:
rrecommends[recommend] = []
            d.setVar('RRECOMMENDS_' + pkg, bb.utils.join_deps(rrecommends, commasep=False))
}
PACKAGEFUNCS =+ "nodejs_extract_runtime_provides nodejs_extract_runtime_depends"
EXPORT_FUNCTIONS do_unpack do_install do_install_symlinks do_link_devdepends
FILES_${PN} += "${NODE_MODULE_DIR}"
DOTDEBUG-dbg += "${NODE_MODULE_DIR}/*/*/.debug"
DOTDEBUG-dbg += "${NODE_MODULE_DIR}/*/.debug"
DOTDEBUG-dbg += "${NODE_MODULE_DIR}/.debug"
RPROVIDES_${PN} = "virtual-npm-${NODE_MODULE}"
#!/usr/bin/env python3
# TODO:
# - check for node-pre-gyp usage
import argparse
import os
import json
import urllib.request
import io
import tarfile
import hashlib
from distutils.version import LooseVersion
import shutil
import tempfile
import subprocess
global args
def required_length(nmin,nmax):
class RequiredLength(argparse.Action):
def __call__(self, parser, args, values, option_string=None):
            if not nmin <= len(values) <= nmax:
                msg = 'argument "{f}" requires between {nmin} and {nmax} arguments'.format(f=self.dest, nmin=nmin, nmax=nmax)
                raise argparse.ArgumentTypeError(msg)
            setattr(args, self.dest, values)
return RequiredLength
class Package:
""" Package class """
def __init__(self, name, version):
self.name = name
self.set_version(version)
self.summary = None
self.homepage = None
self.license = None
self.license_file = None
self.license_file_md5sum = None
self.md5sum = None
self.sha256sum = None
self.using_gyp = False
self.deps = []
self.details_skipped = False
self.parent = None
self.metadata_s = None
self.is_devdepend = False
self.is_circular = False
def to_JSON(self):
        def members_to_dump(obj):
            # prevent endless recursion by avoiding the 'parent' member
            return {attr: value for attr, value in obj.__dict__.items()
                    if not attr.startswith('__') and attr != 'parent' and attr != 'metadata_s'}
return json.dumps(self, default=members_to_dump, sort_keys=False, indent=4, separators=(',', ': '))
def debugprint(self, indent=''):
extra = ''
if self.using_gyp:
extra = extra + ' gyp'
if self.details_skipped:
extra = extra + ' skipped'
if self.is_circular:
extra = extra + ' circular'
if extra:
extra = ' (' + extra.lstrip() + ')'
print('{0}{1} {2}{3}'.format(indent, self.name, self.version, extra))
for p in self.deps:
p.debugprint(indent + ' ')
def set_version(self, version):
if isinstance(version, str):
self.version = version
else:
            raise TypeError('Unsupported version: ' + repr(version))
self.src_uri = 'https://registry.npmjs.org/{0}/-/{0}-{1}.tgz'.format(self.name, self.version)
def update_version_and_uri_from_shrinkwrap(self, value):
# order is important here, set_version() will set a
# default URL, but 'resolved' could point to somewhere
# else.
if 'version' in value:
self.set_version(value['version'])
if 'resolved' in value:
self.src_uri = value['resolved']
def get_latest_compatible(self):
# get latest compatible version based on self.version
import re
print('get_latest_compatible: ' + self.name + ' before: self.version is ' + self.version)
try:
info = urllib.request.urlopen('file:///tmp/npm-cache/modinfo/{0}'.format(self.name)).read()
except urllib.error.URLError:
if not os.path.exists('/tmp/npm-cache/modinfo'):
os.makedirs('/tmp/npm-cache/modinfo')
# use real URL and save for later re-use
info = urllib.request.urlopen('https://registry.npmjs.org/{0}'.format(self.name)).read()
with open('/tmp/npm-cache/modinfo/{0}'.format(self.name), 'w+b') as f:
f.write(info)
pinfo = json.loads(info.decode('utf-8'))
best = None
# check if the latest is compatible
latest = pinfo['dist-tags']['latest']
if npm_version_is_compatible(latest, self.version, self.name) == True:
best = latest
if not best:
for k, v in pinfo['versions'].items():
if npm_version_is_compatible(k, self.version, self.name) == True:
print(' ' + k + ' is compatible! ')
if not best:
print('No best yet, using ' + k)
best = k
elif LooseVersion(k) > LooseVersion(best):
print('Updating best to ' + k)
best = k
else:
print('leaving best untouched (' + best + ')')
if not best:
parent = self.parent
parent.add_dep(self)
while parent.parent:
parent = parent.parent
raise ValueError('No best version found for package {0}\n{1}\n{2}'.format(self.name, parent.to_JSON(), parent.debugprint()))
print('best is ' + best)
self.set_version(best)
self.src_uri = pinfo['versions'][self.version]['dist']['tarball']
print(self.name + ' after: self.version is ' + self.version + ' and URL is ' + self.src_uri)
def sanitise_homepage(self):
if self.homepage:
if self.homepage.startswith('[email protected]:'):
self.homepage = self.homepage.replace('[email protected]:', 'https://github.com/', 1)
elif self.homepage.startswith('git://github.com'):
self.homepage = self.homepage.replace('git://github.com', 'https://github.com', 1)
elif not self.homepage.startswith(('git://', 'https://', 'http://')):
self.homepage = self.homepage.replace('', 'https://github.com/', 1)
def add_dep(self, dep):
self.deps.append(dep)
def write_bb_recipe(self, outdir):
import re
if self.details_skipped:
return
depends = ''
rdepends = ''
depends_class_native = ''
symlinks = ''
for p in sorted(self.deps, key=lambda pkg: pkg.name):
newdep = ' nodejs-{0}{1}'.format(p.name.replace('_', '-'), p.version)
if p.is_devdepend:
depends = depends + newdep + '-native'
else:
if not p.details_skipped:
# prevent recursive depends inside DEPENDS, i.e. only add if not
# existing already
depends_class_native = depends_class_native + newdep
else:
# for testing, add stuff back in
depends_class_native = depends_class_native + newdep
rdepends = rdepends + newdep
symlinks = symlinks + '\nNODEJS_SYMLINKS[{0}] = "{0}{1}"'.format(p.name, p.version)
if depends:
depends = '\nDEPENDS = "{0}"\n'.format(depends.lstrip())
if rdepends:
if rdepends == depends_class_native:
# in this case, RDEPENDS will automatically be translated to append -native
# by bitbake ...
depends_class_native = ' ${RDEPENDS_${PN}}'
else:
# ... whereas here we need to do this manually
depends_class_native = re.sub(r'(\S+)', r'\1-native', depends_class_native)
#if depends_class_native:
# depends_class_native = 'DEPENDS_append_class-native = "{0}"\n'.format(depends_class_native)
rdepends = '\nRDEPENDS_${{PN}} = "{0}"\n{1}'.format(rdepends.lstrip(), depends_class_native)
if symlinks:
symlinks = symlinks + '\n'
lic_files_chksum = ''
if self.license_file:
lic_files_chksum = 'LIC_FILES_CHKSUM = "file://{0};md5={1}"\n'.format(self.license_file, self.license_file_md5sum)
arch = 'nodegyp' if self.using_gyp else 'allarch'
pn = self.name.replace('_', '-')
module_name = ''
if pn != self.name:
module_name = 'NODE_MODULE = "{0}"\n'.format(self.name)
# FIXME: some packages need patches etc., so we hard-code that here and expect an
# .inc file to be available with the necessary extra bits.
require = ''
if self.name == 'node-gyp':
require = '\nrequire nodejs-{0}.inc\n'.format(self.name)
srcuri = ''
if self.src_uri != 'https://registry.npmjs.org/{0}/-/{0}-{1}.tgz'.format(self.name, self.version) and \
self.src_uri != 'http://registry.npmjs.org/{0}/-/{0}-{1}.tgz'.format(self.name, self.version):
# esprima-fb has a different filename scheme. Let's be generic, though
srcuri = '\nSRC_URI = "{0};subdir=${{BP}}"\n'.format(self.src_uri)
sfx = 0
fname = '{0}/nodejs-{1}{2}_{2}.bb'.format(outdir, pn, self.version)
while os.path.exists(fname):
sfx = sfx + 1
fname = '{0}/nodejs-{1}{2}_{2}.bb.{3}'.format(outdir, pn, self.version, sfx)
with open(fname, 'wt', encoding='utf-8') as f:
f.write('''SUMMARY = "{summary}"
HOMEPAGE = "{hp}"
LICENSE = "{lic}"
{licsum}{depends}
SRC_URI[md5sum] = "{md5}"
SRC_URI[sha256sum] = "{sha256}"
inherit nodejs {arch}
{modname}{require}{srcuri}{rdepends}{symlinks}
BBCLASSEXTEND = "native"
'''.format(summary=self.summary, hp=self.homepage, lic=self.license if self.license else 'CLOSED', licsum=lic_files_chksum, depends=depends, md5=self.md5sum if self.md5sum else 'FIXME', sha256=self.sha256sum if self.sha256sum else 'FIXME', arch=arch, modname=module_name, require=require, srcuri=srcuri, rdepends=rdepends, symlinks=symlinks))
for p in self.deps:
p.write_bb_recipe(outdir)
def hashfile(afile, hasher, blocksize=65536):
afile.seek(0)
buf = afile.read(blocksize)
while len(buf) > 0:
hasher.update(buf)
buf = afile.read(blocksize)
afile.seek(0)
return hasher.hexdigest()
def npm_version_is_compatible(version, req_version_spec, pname):
import re
print(pname + ': req: ' + req_version_spec + ' against ' + version)
def compare_dep(version, operator, req_version_spec):
print(' {0} {1}{2}?'.format(version, operator, req_version_spec))
if not req_version_spec or req_version_spec == '*':
# any version will do
print(' yes (any)')
return True
elif operator in ['>', '<', '<=', '>=', '=']:
# any prefix other than ~ and ^ makes things dead simple
if operator == '>' and LooseVersion(version) > LooseVersion(req_version_spec):
print(' yes (>)')
return True
elif operator == '<' and LooseVersion(version) < LooseVersion(req_version_spec):
print(' yes (<)')
return True
elif operator == '<=' and LooseVersion(version) <= LooseVersion(req_version_spec):
print(' yes (<=)')
return True
elif operator == '>=' and LooseVersion(version) >= LooseVersion(req_version_spec):
print(' yes (>=)')
return True
elif operator == '=' and LooseVersion(version) == LooseVersion(req_version_spec):
print(' yes (=)')
return True
else:
# here be dragons...
# split the dotted portions into a list (handling trailing dots properly)
parts = [part if part else 'x' for part in req_version_spec.split('.')]
parts = [int(part) if part != 'x' and not '-' in part
else part for part in parts]
if len(parts) == 1 or parts[1] == 'x':
# 1 or 1.x or 1.x.x or ~1 or ^1
if LooseVersion(version) < LooseVersion('{0}'.format(parts[0]+1)) and (parts[0] == 0 or LooseVersion(version) >= LooseVersion('{0}'.format(parts[0]))):
print(' yes (1 or 1.x or 1.x.x or ~1 or ^1)')
return True
elif len(parts) == 3 or operator != '~':
# 1.2.3 or 1.2.3-4 or 1.2.x or ~1.2.3 or ^1.2.3 or 1.2
if len(parts) == 2 or parts[2] == 'x':
# 1.2.x or 1.2
if LooseVersion(version) >= LooseVersion('{0}.{1}'.format(parts[0], parts[1])) and LooseVersion(version) < LooseVersion('{0}.{1}'.format(parts[0], parts[1]+1)):
print(' yes (1.2.x or 1.2)')
return True
elif operator == '~' or (operator == '^' and parts[0] == 0 and parts[1] > 0):
# ~1.2.3 or ^0.1.2 (zero is special with the caret operator)
if LooseVersion(version) >= LooseVersion(req_version_spec) and LooseVersion(version) < LooseVersion('{0}.{1}'.format(parts[0], parts[1]+1)):
print(' yes (~1.2.3 or ^0.1.2)')
return True
                elif operator == '^' and parts[0:2] != [0,0]:
# ^1.2.3
if LooseVersion(version) >= LooseVersion(req_version_spec) and LooseVersion(version) < LooseVersion('{0}'.format(parts[0]+1)):
print(' yes (^1.2.3)')
return True
else:
# 1.2.3 or 1.2.3-4 or ^0.0.3
if LooseVersion(version) == LooseVersion(req_version_spec):
print(' yes (1.2.3 or ^0.0.3)')
return True
elif operator == '~':
# ~1.2
if LooseVersion(version) >= LooseVersion(req_version_spec) and LooseVersion(version) < LooseVersion('{0}'.format(parts[0]+1)):
print(' yes (~1.2)')
return True
elif operator == '^':
# ^1.2
if LooseVersion(version) >= LooseVersion(req_version_spec) and LooseVersion(version) < LooseVersion('{0}'.format(parts[0]+1)):
print(' yes (^1.2)')
return True
print(' no')
return False
# if there is an OR (||) dependency, we have to iterate through all of
# them. Let's start from the end, assuming the most recent version is
# specified at the end.
for req_version in req_version_spec.split('||')[::-1]:
req_version = req_version.strip()
if ' - ' in req_version:
gt, lt = req_version.split(' - ')
if LooseVersion(version) >= LooseVersion(gt) and LooseVersion(version) <= LooseVersion(lt):
return True
else:
RE_VERSION = re.compile(r'\s*v?([<>=~^]{0,2})\s*([0-9][0-9\.\-]*)\s*')
if pname == 'esprima-fb':
# astw 1.1.0 depends on esprima-fb 3001.1.0-dev-harmony-fb
# not sure if we should allow characters in version numbers in general
RE_VERSION = re.compile(r'\s*v?([<>=~^]{0,2})\s*([0-9][0-9a-z\.\-]*)\s*')
m = re.match(RE_VERSION, req_version)
if m:
maybe_ok = compare_dep(version, m.group(1), m.group(2))
# There could be up to two versions here (e.g.">1.0 <3.1")
if maybe_ok and len(req_version) > m.end():
m = re.match(RE_VERSION, req_version[m.end():])
if m:
print(' {0} so far: {1}'.format(pname, maybe_ok))
maybe_ok = compare_dep(version, m.group(1), m.group(2))
if maybe_ok:
return maybe_ok
else:
return True
return False
def process_pkg(p, metadata_p, do_devdepends=False, do_devdepends_2nd=False):
def get_license(p, metadata_p, key):
if key in metadata_p:
if isinstance(metadata_p[key], list):
if isinstance(metadata_p[key][0], dict):
p.license = metadata_p[key][0]['type']
return True
# FIXME?
p.license = metadata_p[key][0]
return True
elif isinstance(metadata_p[key], dict):
p.license = metadata_p[key]['type']
return True
elif isinstance(metadata_p[key], str):
# FIXME?
p.license = metadata_p[key]
return True
return False
def safemembers(members):
resolved_path = lambda x: os.path.realpath(os.path.abspath(x))
def badpath(path, base):
# os.path.join will ignore base if path is absolute
return not resolved_path(os.path.join(base, path)).startswith(base)
def badlink(info, base):
# Links are interpreted relative to the directory containing the link
tip = resolved_path(os.path.join(base, os.path.dirname(info.name)))
return badpath(info.linkname, base=tip)
base = resolved_path('.')
for finfo in members:
if badpath(finfo.name, base):
print('{0} is blocked: illegal path'.format(finfo.name))
elif finfo.issym() and badlink(finfo, base):
print('{0} is blocked: symlink to {1}'.format(finfo.name, finfo.linkname))
elif finfo.islnk() and badlink(finfo, base):
print('{0} is blocked: hardlink to {1}'.format(finfo.name, finfo.linkname))
else:
yield finfo
def package_already_available(pkg):
parent = pkg.parent
while parent:
#print(' -> checking parent {0} deps:'.format(parent.name))
for parent_dep in parent.deps:
#print(' -> checking parent {0} dep: {1} {2}'.format(parent.name, parent_dep.name, parent_dep.version))
if not parent_dep.details_skipped and parent_dep.name == pkg.name and npm_version_is_compatible(parent_dep.version, pkg.version, pkg.name):
print(' -> found as {0}'.format(parent_dep.version))
return parent_dep.version
parent = parent.parent
return None
def have_circular(pkg):
parent = pkg.parent
while parent:
if parent.name == pkg.name:
print(' -> circular found as {0}'.format(parent.version))
return True
parent = parent.parent
return False
p.summary = metadata_p['description'] if 'description' in metadata_p else None
if 'repository' in metadata_p:
repo = metadata_p['repository']
if isinstance(repo, dict):
p.homepage = repo['url']
elif isinstance(repo, str):
p.homepage = repo
else:
raise TypeError('Unsupported "repository" scheme: {0}'.format(metadata_p['repository']))
metadata_keys = ['dependencies', 'optionalDependencies' ]
if do_devdepends:
metadata_keys.append('devDependencies')
for metadata_key in metadata_keys:
if metadata_key in metadata_p:
is_devdepends = (metadata_key == 'devDependencies')
metadata_depends = {}
if isinstance(metadata_p[metadata_key], list):
# this sucks... Let's convert the deps list to
# a dict with any version
print('converting list to dict!')
metadata_depends = { name:'*' for name in metadata_p[metadata_key] }
else:
metadata_depends = metadata_p[metadata_key]
for name, value in metadata_depends.items():
dep_p = Package(name, value)
dep_p.parent = p
dep_p.is_devdepend = is_devdepends
if name == 'node-pre-gyp':
raise ValueError('node-pre-gyp is required but sucks, hence unsupported at the moment')
if have_circular(dep_p):
dep_p.is_circular = True
value = None
if p.metadata_s:
if 'dependencies' in p.metadata_s:
for n, v in p.metadata_s['dependencies'].items():
if n == dep_p.name:
dep_p.update_version_and_uri_from_shrinkwrap(v)
dep_p.metadata_s = v
break
if not p.metadata_s or not dep_p.metadata_s:
if p.metadata_s:
# We have a shrinkwrap.json, but no entry for this
# particular package. NPM has probably decided that
# it doesn't need this particular dependency, as it
# has been satisfied higher up the tree already.
print('No info in shrinkwrap about {0} {1}, should exist higher up the tree'.format(dep_p.name, dep_p.version))
else:
# no shrinkwrap - we have to figure out the latest
# compatible version and URL for it
# firstly, we'll see if it exists up the tree already, though
print('no shrinkwrap, check if {0} {1} exists higher up'.format(dep_p.name, dep_p.version))
available_version = package_already_available(dep_p)
if available_version != None:
dep_p.details_skipped = True
dep_p.version = available_version
print(' -> yes as {0}'.format(dep_p.version))
elif not p.metadata_s or dep_p.is_devdepend:
# We have no shrinkwrap.json, or
# we have a shrinkwrap.json, but no entry for this
# particular package. We also couldn't find this
# particular package higher up in the tree. Since
# it's a devdepend, that's OK, as a shrinkwrap does
# not necessarily contain devDependencies (--save-dev).
# We now have to figure out the latest compatible
# version and URL for it
print(' -> no, should update URI')
dep_p.get_latest_compatible()
else:
# Something must be wrong
raise ValueError('{0} {1} does not exist higher up the tree even though shrinkwrap suggests it'.format(dep_p.name, dep_p.version))
p.add_dep(dep_p)
for dep_p in p.deps:
if dep_p.details_skipped:
print('Skipping parsing of depends of {0} {1} as it exists higher up the tree'.format(dep_p.name, dep_p.version))
continue
# download
print('Get ' + dep_p.src_uri)
# try from local cache first
try:
fo = io.BytesIO(urllib.request.urlopen('file:///tmp/npm-cache/{0}-{1}.tgz'.format(dep_p.name, dep_p.version)).read())
except urllib.error.URLError:
if not os.path.exists('/tmp/npm-cache'):
os.mkdir('/tmp/npm-cache')
# use real URL and save for later re-use
fo = io.BytesIO(urllib.request.urlopen(dep_p.src_uri).read())
with open('/tmp/npm-cache/{0}-{1}.tgz'.format(dep_p.name, dep_p.version), 'w+b') as f:
f.write(fo.read())
dep_p.md5sum = hashfile(fo, hashlib.md5())
dep_p.sha256sum = hashfile(fo, hashlib.sha256())
metadata_dep_p = None
with tarfile.open(fileobj=fo) as t_f:
# some packages have unusual directory names (ejs-2.2.4.tgz)
for pkgdir in ['package', '{0}-v{1}'.format(dep_p.name, dep_p.version), '{0}'.format(dep_p.name)]:
old_wd = os.getcwd()
try:
fnull = None
tmpdir = None
metadata_dep_p = json.loads(t_f.extractfile('{0}/package.json'.format(pkgdir)).read().decode('utf-8'))
# figure out if *.gyp exists in the package's root dir
names = t_f.getnames()
for name in names:
if os.path.dirname(name) == pkgdir and os.path.splitext(name)[1] == ".gyp":
dep_p.using_gyp = True
break
# extract archive in a safe way:
# http://stackoverflow.com/questions/10060069/safely-extract-zip-or-tar-using-python
tmpdir = tempfile.mkdtemp(prefix='tmp.create-json.{0}-{1}.'.format(dep_p.name, dep_p.version))
t_f.extractall(path=tmpdir, members=safemembers(t_f))
# use licensee to determine license https://github.com/benbalter/licensee
# licensee only works in git directories, so let's create one...
os.chdir('{0}/{1}'.format(tmpdir, pkgdir))
fnull = open(os.devnull, 'wb')
subprocess.call(['git', 'init'], stdout=fnull)
subprocess.call(['git', 'add', '-A', '.'], stdout=fnull)
# allow empty just in case this is a git repository already
subprocess.call(['git', 'commit', '-m', 'foo', '--allow-empty'], stdout=fnull)
licensee = subprocess.check_output('licensee', stderr=fnull, universal_newlines=True)
for line in iter(licensee.splitlines()):
if line.startswith('License file: '):
a, b = line.split(':', 1)
dep_p.license_file = b.strip()
with open(dep_p.license_file, 'rb') as f:
dep_p.license_file_md5sum = hashfile(f, hashlib.md5())
elif line.startswith('License: '):
a, b = line.split(':', 1)
dep_p.license = b.strip()
# normalise licensee's names to the SPDX-style identifiers OE uses
license_map = {
'MIT License': 'MIT',
'Apache License 2.0': 'Apache-2.0',
'BSD 3-clause "New" or "Revised" License': 'BSD-3-Clause',
'BSD 2-clause "Simplified" License': 'BSD-2-Clause',
'ISC License': 'ISC',
'no license': None,
}
dep_p.license = license_map.get(dep_p.license, dep_p.license)
break
except KeyError:
# pkgdir not found, try next
pass
except subprocess.CalledProcessError:
pass
except FileNotFoundError:
# licensee not available
pass
finally:
if fnull:
fnull.close()
os.chdir(old_wd)
if tmpdir and os.path.exists(tmpdir):
shutil.rmtree(tmpdir)
process_pkg(dep_p, metadata_dep_p, do_devdepends_2nd)
# if not p.license:
# if not get_license(p, metadata_p, 'licenses'):
# get_license(p, metadata_p, 'license')
p.sanitise_homepage()
def process_json_from_file(packagejson, shrinkwrap, outdir, do_devdepends, do_devdepends_2nd):
with open(packagejson) as fh:
metadata_p = json.load(fh)
metadata_s = None
if shrinkwrap:
with open(shrinkwrap) as fh:
metadata_s = json.load(fh)
p = Package(metadata_p['name'], metadata_p['version'])
p.metadata_s = metadata_s
process_pkg(p, metadata_p, do_devdepends, do_devdepends_2nd)
print('JSON dump')
print(p.to_JSON())
print('manual dump')
p.debugprint()
p.write_bb_recipe(outdir)
def main():
parser = argparse.ArgumentParser()
parser.add_argument('-o', '--output', default=os.getcwd(), help='Directory to write content to. Defaults to $PWD.')
parser.add_argument('-s', '--shrinkwrap', help='npm-shrinkwrap.json to go with package.json')
parser.add_argument('-d', '--debug', help='Enable debug.', action='count')
parser.add_argument('-x', '--devdepends', help='Parse devdepends of main package', action='store_true')
parser.add_argument('-x2', '--devdepends2', help='Parse devdepends of 2nd package', action='store_true')
parser.add_argument('packagejson', help='package.json to process')
args = parser.parse_args()
os.makedirs(args.output, exist_ok=True)
process_json_from_file(args.packagejson, args.shrinkwrap,
args.output,
args.devdepends, args.devdepends2)
if __name__ == '__main__':
main()
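For clarity, here is a minimal standalone sketch of the tilde-range check that compare_dep() above expresses via LooseVersion comparisons. It uses plain integer tuples instead; the function names are illustrative only, not part of the script:

```python
# Sketch of npm's tilde-range semantics: ~1.2.3 allows >= 1.2.3 and < 1.3.0.
# Illustrative only -- the script above implements this with LooseVersion.

def parse_version(version):
    # split a dotted version string into a tuple of ints, e.g. '1.2.3' -> (1, 2, 3)
    return tuple(int(part) for part in version.split('.'))

def in_tilde_range(version, base):
    # ~base: at least base, but below the next minor version;
    # tuple comparison makes e.g. (1, 3, 0) compare greater than (1, 3)
    b = parse_version(base)
    return b <= parse_version(version) < (b[0], b[1] + 1)
```

This avoids the string-based quirks of LooseVersion but obviously only covers plain numeric versions, not pre-release suffixes like 1.2.3-4.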
DESCRIPTION = "\
node-gyp is a cross-platform command-line tool written in Node.js\
for compiling native addon modules for Node.js.\
.\
It features:\
* Easy to use, consistent interface\
* Same commands to build a module on every platform\
* Support of multiple target versions of Node.js\
.\
node-gyp replaces the node-waf program, which was deprecated in\
Node.js 0.8 and removed in Node.js 0.10."
SRC_URI_append = "\
file://0001-configure-use-system-headers-for-default-nodedir.patch \
file://0002-addon.gypi-remove-bogus-include-paths.patch \
file://0003-configure.js-compat-with-our-old-version-of-gyp.patch \
"
python do_unpack_append() {
import shutil
# remove bundled gyp
shutil.rmtree('${S}/gyp/', True)
}
python do_install_append() {
os.mkdir("${D}${NODE_MODULE_DIR}/gyp")
os.symlink("${bindir}/gyp", "${D}${NODE_MODULE_DIR}/gyp/gyp")
}
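As an aside, the safemembers() helper in the script follows the Stack Overflow recipe linked above; the core check boils down to the following sketch (names are illustrative, not the script's API):

```python
# Core of the safe tar extraction check: a member may only be extracted
# if its resolved path stays inside the extraction directory.
# Illustrative sketch, not the exact helper from the script above.
import os

def is_within_directory(base, member_name):
    base = os.path.realpath(base)
    target = os.path.realpath(os.path.join(base, member_name))
    # os.path.join ignores base if member_name is absolute, and realpath
    # collapses '..' components, so any escape shows up as a foreign prefix
    return target == base or target.startswith(base + os.sep)
```

Members failing this check (absolute paths, '..' traversal, or links pointing outside the tree) are skipped rather than extracted.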
--
_______________________________________________
Openembedded-core mailing list
[email protected]
http://lists.openembedded.org/mailman/listinfo/openembedded-core