import json
import os

ENV_JSON_FILENAME = 'env.json'  # placeholder name for the per-environment cache file


def get_env_json_path():
    # Originally written as "$VIRTUAL_ENV || ?" -- what to fall back to
    # outside a virtualenv is still an open question.
    directory = os.environ['VIRTUAL_ENV']
    return os.path.join(directory, ENV_JSON_FILENAME)
def on_install(pkgname, pkg_json):
    env_json_path = get_env_json_path()
    with open(env_json_path) as f:
        env_json = json.load(f)
    env_json['pkgs'][pkgname] = pkg_json
    with open(env_json_path, 'w') as f:
        json.dump(env_json, f)


def read_cached_entry_points():
    env_json_path = get_env_json_path()
    with open(env_json_path) as f:
        env_json = json.load(f)
    # Merge each package's entry point mapping into a single dict.
    entry_points = {}
    for pkg in env_json['pkgs'].values():
        entry_points.update(pkg['entry_points'])
    return entry_points

Would this introduce the need for a new and confusing rescan_metadata()
(pkg.on_install() for pkg in pkgs)? A rough sketch of what that might look
like is included below the quoted reply.

On Wednesday, October 18, 2017, Nick Coghlan <ncogh...@gmail.com> wrote:

> On 19 October 2017 at 12:16, Daniel Holth <dho...@gmail.com> wrote:
>
>> We said "you won't have to install setuptools" but actually "you don't
>> have to use it" is good enough. If you had two pkg_resources
>> implementations running, you might wind up scanning sys.path extra
>> times...
>>
> True, but that's where Thomas's suggestion of attempting to define a
> standardised caching convention comes in: right now, there's no middle
> ground between "you must use pkg_resources" and "every helper library must
> scan for the raw entry-point metadata itself".
>
> If there's a defined common caching mechanism, and support for it is added
> to new versions of pkg_resources, then the design constraint becomes "If
> you end up using multiple entry-point scanners, you'll want a recent
> setuptools/pkg_resources, so you don't waste too much time on repeated
> metadata scans".
>
> Cheers,
> Nick.
>
> --
> Nick Coghlan | ncogh...@gmail.com | Brisbane, Australia
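For concreteness, here is a minimal sketch of what that hypothetical
rescan_metadata() could look like, assuming the cache helpers above and
some way of enumerating installed packages as (pkgname, pkg_json) pairs
(that iterable, and the shape of pkg_json, are assumptions for
illustration, not an existing API):

def rescan_metadata(installed_pkgs):
    # installed_pkgs: iterable of (pkgname, pkg_json) pairs; how they are
    # enumerated (e.g. by walking *.dist-info directories) is left open here.
    env_json_path = get_env_json_path()
    # Reset the cache, then replay on_install() for every installed package.
    with open(env_json_path, 'w') as f:
        json.dump({'pkgs': {}}, f)
    for pkgname, pkg_json in installed_pkgs:
        on_install(pkgname, pkg_json)

A real implementation would presumably build the whole dict in memory and
write the file once, rather than rewriting it per package as on_install()
does, but the shape of the problem is the same.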
_______________________________________________
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig