[rust-dev] Deprecating rustpkg

2014-06-07 Thread Fredrik Ekholdt
Hi!
Seems like this thread has been dead for a while. I am new to Rust, but was 
playing with it today, looking for rustpkg, and ended up reading this thread. I 
have tried to read through it, but it is possible that there is a newer, more 
relevant thread covering this topic, in which case I excuse myself.

I am currently working on a language-agnostic dependency/package manager and I 
was wondering whether it might suit Rust's requirements. Right now we are 
targeting it as a replacement for Maven/Ivy on the JVM, but the idea all along 
has been to make it platform independent (and native) by having a small and 
predictable algorithm for resolution.
The resolution engine is about 200 LOC, and the overall logic (excluding 
metadata reading and so on) is about 400 LOC more. (I wonder if the Rust 
implementation will be faster than the one in Scala? :)

If there is interest I might start looking more into how to port it (I will 
need help though) - if not, I guess it was worth a shot! :)

It is called Adept (https://github.com/adept-dm/adept) and it is in alpha for 
the Scala/JVM. The docs (listed below) are still unfortunately a bit scattered 
so I am summarising here.

Some features that might be of interest:
- Fast and reliable resolution of metadata, using versioned metadata (which is 
easily and safely cacheable).
- Fast and reliable artifact (binary files/sources) downloads, i.e. it can 
download from multiple sources in parallel.
- Authors can describe compatibility matrices for their modules. The short story 
is that authors can rank modules and thereby define which ones are compatible 
(or can replace each other), and they can do so in multiple files, thereby 
having many "series" of compatible modules. Using this scheme we can emulate 
"normal" versioning, (what is called) "semantic" versioning and 
backward-compatibility matrices, but also other, more exotic, versioning 
schemes. AdeptHub (see below) will make it easy for authors to use the standard 
ones, but also make it possible to customise this.
- Adept’s engine is flexible enough that authors can publish multiple packages 
for multiple platforms and, based on user input, figure out which package and 
platform a user should get.
- Adept’s engine is flexible enough to emulate the concept of 
scopes/configurations/views, so an author can publish different 
scopes/configurations/views of the same package: one for compile, one for 
runtime, one for testing, etc.
- Supports resolution through multiple author-defined attributes. You can 
require a specific version or a binary-version, but also a "maturity" (or 
"release-type") or whatever other attribute might be relevant.
- Does not require a user to resolve (figure out which packages you need) when 
they check out a project. The way this works is that Adept generates a file 
after resolution that contains all the required artifacts (their locations, 
hashes and filenames), as well as the current requirements and the context 
(which metadata is used and where it should be downloaded from). Users/build 
servers will therefore get exactly the same artifacts each time they build 
(verified using the SHA-256 hashes), but using compatibility matrices it is 
possible to upgrade to a later compatible version easily/programmatically. This 
file is currently called the lockfile, but it is not to be confused with 
Rubygems' lockfiles. Note that the name 'lockfile' will change because of this 
confusion.
- Is decentralized (like Git), but has a central hub, adepthub.com (like 
GitHub), so first-time users can easily find things.
- Decentralization makes it possible for users to change metadata (and 
contribute it or put it somewhere else), but also makes it possible to support 
a DSL/API in the build tool, where users can create requirements at build time 
(version ranges are supported through this).
- Works offline (i.e. not connected to the intertubes) provided that 
metadata/artifacts are locally available. It knows exactly what it needs, so if 
something is not available it can give easy-to-understand error messages 
(which is different from Ivy/Maven, although I am not sure what the story for 
cargo or rustpkg was…).
- Supports sandboxed projects reliably (no global/changing artifacts). When you 
check out a project that uses Adept, you can be sure it has the same artifacts 
as the ones you used on your dev machine.
- CLI-like search for packages (through Scala's sbt, but it can be extended to 
a pure CLI tool). Works locally and on online repositories.
- Repository metadata is decoupled from a project's source code, which is a 
feature for me, because you might have different workflows etc. for source 
code and actual releases.

Stuff we are working on in the coming weeks:
- Easy publishing to adepthub.com.
- A better web app including browsing and non-CLI searches.
- Online resolution on AdeptHub.
- Notifications for new compatible releases.
- Native web-"stuff" support (i.e. CSS, JS, …) made possible by importing 

Re: [rust-dev] Deprecating rustpkg

2014-06-07 Thread Tony Arcieri
You might want to check out this thread... Mozilla is sponsoring work on a
new Rust package manager called Cargo:

https://mail.mozilla.org/pipermail/rust-dev/2014-March/009090.html

-- 
Tony Arcieri
___
Rust-dev mailing list
Rust-dev@mozilla.org
https://mail.mozilla.org/listinfo/rust-dev


Re: [rust-dev] Deprecating rustpkg

2014-06-07 Thread Vladimir Matveev
Hi, Fredrik,

Currently a new package manager designed specifically for Rust is under active 
development. It is called Cargo, and you can find it here [1]. It is still at a 
very early alpha stage, but one day it will become a full package manager 
and build system for Rust.

  [1]: https://github.com/carlhuda/cargo

On 07 June 2014, at 23:28, Fredrik Ekholdt fre...@gmail.com wrote:

 […]

Re: [rust-dev] Deprecating rustpkg

2014-06-07 Thread Kevin Cantu
Further historical disambiguation:
* there used to be no package manager
* Cargo was created
* rustpkg replaced Cargo
* Cargo' replaced rustpkg

Why I'm the only one who calls it Cargo', I don't know.  ;D


Kevin





On Sat, Jun 7, 2014 at 12:38 PM, Vladimir Matveev dpx.infin...@gmail.com
wrote:

 Hi, Fredrik,

 Currently a new package manager designed specifically for Rust is under
 active development. It is called Cargo, and you can find it here [1]. It is
 pretty much in very alpha stage now, but one day it will become a full
 package manager and build system for Rust.

   [1]: https://github.com/carlhuda/cargo

 On 07 June 2014, at 23:28, Fredrik Ekholdt fre...@gmail.com wrote:

  […]

Re: [rust-dev] Deprecating rustpkg

2014-02-04 Thread Tony Arcieri
On Tue, Feb 4, 2014 at 2:29 AM, Jordi Boggiano j.boggi...@seld.be wrote:

 I just hope whoever starts working on a new spec announces it clearly


THIS

-- 
Tony Arcieri


Re: [rust-dev] Deprecating rustpkg

2014-02-03 Thread Vladimir Matveev
2014-02-02 Thomas Leonard tal...@gmail.com:
 [ I don't want to start another argument, but since you guys are discussing
 0install, maybe I can provide some useful input... ]

 I don't follow this. Whether the developer uses 0install to get the build
 dependencies doesn't make any difference to the generated binary.

 Of course, you *can* distribute the binary using 0install too, but you're
 not required to.

I probably left this part in that formulation by accident. I apologize 
for that; I had been writing this message in several passes. Yes, of 
course it does not matter to the developer where they get build 
dependencies from, provided these dependencies are readily available for 
the build process and are easily managed.


 0install doesn't automatically check out Git repositories (although that
 would be handy). Here's how we currently do it:

 - Your program depends on libfoo &gt;= 1.0-post

 - The latest released version of libfoo is only 1.0

 - You git clone the libfoo repository yourself and register the
   metadata (feed) file inside it:

   $ git clone git://.../libfoo
   $ 0install add-feed libfoo/feed.xml

 - 0install now sees that libfoo 1.0 and 1.0-post are both available.
   Since your program requires libfoo &gt;= 1.0-post, it will select the
   Git checkout version.
Seems to be a lot of manual work. This could be automated by a Rust 
package/build manager, though.


 Given a set of requirements, 0install will tell you where some suitable
 versions of the dependencies are. For example:

   $ cd /tmp
   $ git clone https://github.com/0install/hello-scons.git
   $ cd hello-scons
   $ 0install download Hello-scons.xml --source --show
 - URI: /tmp/hello-scons/Hello-scons.xml
   Version: 1.1-post
   Path: /tmp/hello-scons

   - URI: http://0install.net/2006/3rd-party/SCons.xml
 Version: 2.0.1
 Path:
 /var/cache/0install.net/implementations/sha1new=86311df9d410de36d75bc51762d2927f2f045ebf

 - URI: http://repo.roscidus.com/python/python
   Version: 2.7.6-1
   Path: (package:arch:python2:2.7.6-1:x86_64)

 This says that the build dependencies are:

 - This package's source code (in /tmp/hello-scons)
 - The SCons build tool (which 0install has placed in /var/cache)
 - Python (provided by the distribution)

 The source could also specify library dependencies. How do you get this
 information to the build tool? The usual way is to tell 0install how to run
 the build tool in the XML. In this case, by running SCons on the project's
 SConstruct file.

 But you could get the information to it some other way. For example, a
 rustpkg tool that invokes 0install download ... --source --xml behind
 the scenes and does something with the machine-readable selections document
 produced.
Thanks for the explanation. I didn't know that 0install can run build 
tools and that it can provide information about library locations. This 
certainly answers my question.

 How should I specify build dependencies for people who want to hack on my
 package?


 List them in the XML file that is in your project's source repository. Users
 should then be able to clone your git repository and build, with build
 dependencies handled for them.
Again, didn't know that 0install can handle build dependencies.


 This is all about run time dependencies, but I think the discussion here is
 about build time, right? You'll have the same issues with any system.
Usually build dependencies are a superset of runtime dependencies, 
aren't they? Nonetheless, this was not about runtime dependencies; it 
was about the general approach. But given your explanation of how 
0install operates, I think this point can be discarded.



 I think any build tool (including go, cabal, pip, rustpkg) will have this
 problem. Ideally, you want distributions to be able to turn upstream
 packages into their preferred format automatically. Whatever system you
 settle on this should be possible, as long as you have some kind of
 machine-readable dependency information.
Yes, you're quite correct that ideally upstream packages should be 
converted to distribution packages automatically. For the new 
hypothetical build system I see it like the following: a maintainer 
downloads the sources for a package and invokes some distro-specific 
tool, which in turn invokes `rustpkg` to build the package and assemble 
dependency information, which is then converted into a distribution 
package. Then the maintainer manually adds external dependencies to 
the list. Something like that is already done for Haskell in Arch 
Linux, for example. It seems that it could be done with 0install, at 
least to some extent.

 Zero install may have integration with package systems, but it looks like
 it is very brittle. According to [this
 page](http://0install.net/distribution-integration.html) it is the package
 owner's duty to specify how native package dependencies should be
 resolved in each distribution. This is extremely fragile. I don't use
 Debian, for example, 

Re: [rust-dev] Deprecating rustpkg

2014-02-02 Thread Thomas Leonard
[ I don't want to start another argument, but since you guys are 
discussing 0install, maybe I can provide some useful input... ]


On 2014-02-02 07:20, Vladimir Matveev wrote:

How will it handle external dependencies?

I don't think it should. External dependencies are way too complex.
They come in different flavors on different systems. On Windows, for
example, you don't have a package manager, and you'll have to ship
these dependencies with the program using an installer. On each Linux
distro there is a custom package manager, each having its own strategy
for naming things and its own versioning policy. It is impossible to
unify them, and I don't think that a Rust package manager should attempt
to do this.


I don't understand this. A package manager specific to Rust is
additional software, just like 0install. 0install has full support for
installing dependencies via the system package manager on many systems
if desired.

*End users* won't need a Rust package manager at all (unless they want
to install development versions of Rust software). Only package
maintainers and developers have to use it. End users just use their
native package manager to obtain packages created by maintainers. If
Rust depended on zero install, however, end users would be *forced*
to use zero install.


I don't follow this. Whether the developer uses 0install to get the 
build dependencies doesn't make any difference to the generated binary.


Of course, you *can* distribute the binary using 0install too, but 
you're not required to.



I'm against using zero install for the following reasons. First, it
is just a packaging system. It is not supposed to help with building
Rust software. But resolving build dependencies and invoking the
compiler with correct paths to installed dependencies is crucial.



How, for example, zero install would handle dependency to master branch of
some source repository?


0install doesn't automatically check out Git repositories (although that 
would be handy). Here's how we currently do it:


- Your program depends on libfoo &gt;= 1.0-post

- The latest released version of libfoo is only 1.0

- You git clone the libfoo repository yourself and register the
  metadata (feed) file inside it:

  $ git clone git://.../libfoo
  $ 0install add-feed libfoo/feed.xml

- 0install now sees that libfoo 1.0 and 1.0-post are both available.
  Since your program requires libfoo &gt;= 1.0-post, it will select the
  Git checkout version.

What if I'm developing several packages which depend on different
versions of the same package? Zero install allows
installing multiple versions of the same package, yes, but how should
I specify where these libraries are located to the compiler?


Given a set of requirements, 0install will tell you where some suitable 
versions of the dependencies are. For example:


  $ cd /tmp
  $ git clone https://github.com/0install/hello-scons.git
  $ cd hello-scons
  $ 0install download Hello-scons.xml --source --show
- URI: /tmp/hello-scons/Hello-scons.xml
  Version: 1.1-post
  Path: /tmp/hello-scons

  - URI: http://0install.net/2006/3rd-party/SCons.xml
Version: 2.0.1
 Path: 
/var/cache/0install.net/implementations/sha1new=86311df9d410de36d75bc51762d2927f2f045ebf


- URI: http://repo.roscidus.com/python/python
  Version: 2.7.6-1
  Path: (package:arch:python2:2.7.6-1:x86_64)

This says that the build dependencies are:

- This package's source code (in /tmp/hello-scons)
- The SCons build tool (which 0install has placed in /var/cache)
- Python (provided by the distribution)

The source could also specify library dependencies. How do you get this 
information to the build tool? The usual way is to tell 0install how to 
run the build tool in the XML. In this case, by running SCons on the 
project's SConstruct file.


But you could get the information to it some other way. For example, a 
rustpkg tool that invokes 0install download ... --source --xml 
behind the scenes and does something with the machine-readable 
selections document produced.



How should I specify build dependencies for people who want to hack on my
package?


List them in the XML file that is in your project's source repository. 
Users should then be able to clone your git repository and build, with 
build dependencies handled for them.



The majority of direct dependencies will be from the Rust world,
and a dedicated building/packaging tool would be able to download and
build them automatically as part of the build process; only external
dependencies would have to be installed manually. With zero install
you would have to install everything, including Rust-world
dependencies, by yourself.


0install should be able to handle all build dependencies (e.g. 
libraries, the Rust compiler, build tools, documentation tools, etc).



Second, it is another package manager which is foreign to the system
(unless the system uses zero install as its package manager, but I
think only very minor 

Re: [rust-dev] Deprecating rustpkg

2014-02-02 Thread Vladimir Lushnikov
A general observation (not particularly replying to your post, Thomas).

For both Python and Haskell (just to name two languages), distribution
(where things end up on the filesystem, ready to be used) can be done by
both the built-in tools (cabal-install, pip) and the distribution-specific
tools. Gentoo even has a tool to take a cabal package and generate an
ebuild from it: https://github.com/gentoo-haskell/hackport. In the Haskell
world cabal and cabal-install are separated (
http://ivanmiljenovic.wordpress.com/2010/03/15/repeat-after-me-cabal-is-not-a-package-manager/),
which is probably a good thing and maybe something we can consider for
rustpkg. (The link, by the way, is quite interesting in its own right, and
perhaps some more inspiration could be taken from there.)

My point is that a build tool should be able to either fetch dependencies
or look up ones that already exist in a well-specified layout on the
filesystem (for development and for production deployment using a distro
package manager, respectively). Whether that is a single tool or two
tools is a point of design; I think both are necessary.

I feel there is enough that's been discussed on this thread for a write-up
on a wiki or the beginnings of a design/goals document that can later be
presented for another discussion. I don't see an existing place on the wiki
for this though - where should it go?


On Sun, Feb 2, 2014 at 7:47 PM, Thomas Leonard tal...@gmail.com wrote:

 [ I don't want to start another argument, but since you guys are
 discussing 0install, maybe I can provide some useful input... ]




Re: [rust-dev] Deprecating rustpkg

2014-02-01 Thread Gaetan
There is not only API change. Sometimes, from one minor version to another, a
feature gets silently broken (that is, a silent regression). It might not
impact libA, which depends on it, but it may break libB, which also depends on
it, but with a previous version.
As a result, libA forces installation of this dependency without any concern
(all its features work), but libB gets broken without any warning.

And that is the real mess to deal with.

That happened this week at my job...

I largely prefer each library to be self-contained, i.e., if libA depends on
libZ version X.X.X, and libB depends on libZZ version Y.Y.Y, just let each
one be installed and used at its own version. That is perfectly
acceptable (and even recommended) for non-system-integrated software (for
example, when a company wants to build a software product with minimal system
dependencies that would run on any version of Ubuntu, with the only
dependency being libc).
On the other hand, when the software gets integrated into a distribution
(Ubuntu, Red Hat, Homebrew), let the distro's version manager do its job.



-
Gaetan



2014-02-01 Tony Arcieri basc...@gmail.com:

 On Fri, Jan 31, 2014 at 4:03 PM, Lee Braiden leebr...@gmail.com wrote:

 This would be counterproductive.  If a library cannot be upgraded to 1.9,
 or even 2.2, because some app REQUIRES 1.4, then that causes SERIOUS,
 SECURITY issues.


 Yes, these are exactly the types of problems I want to help solve. Many
 people on this thread are talking about pinning to specific versions of
 libraries. This will prevent upgrades in the event of a security problem.

 Good dependency resolvers work on constraints, not specific versions.

 The ONLY realistic way I can see to solve this, is to have all higher
 version numbers of the same package be backwards compatible, and have
 incompatible packages be DIFFERENT packages, as I mentioned before.

 Really, there is a contract here: an API contract.


 Are you familiar with semantic versioning?

 http://semver.org/

 Semantic Versioning would stipulate that a backwards incompatible change
 in an API would necessitate a MAJOR version bump. This indicates a break in
 the original contract.

 Ideally if people are using multiple major versions of the same package,
 and a security vulnerability is discovered which affects all versions of a
 package, that the package maintainers release a hotfix for all major
 versions.

 --
 Tony Arcieri





Re: [rust-dev] Deprecating rustpkg

2014-02-01 Thread Lee Braiden

On 01/02/14 00:09, Tony Arcieri wrote:
On Fri, Jan 31, 2014 at 4:03 PM, Lee Braiden leebr...@gmail.com wrote:


This would be counterproductive.  If a library cannot be upgraded
to 1.9, or even 2.2, because some app REQUIRES 1.4, then that
causes SERIOUS, SECURITY issues.


Yes, these are exactly the types of problems I want to help solve. 
Many people on this thread are talking about pinning to specific 
versions of libraries. This will prevent upgrades in the event of a 
security problem.


Good dependency resolvers work on constraints, not specific versions.



Agreed.


Are you familiar with semantic versioning?

http://semver.org/

Semantic Versioning would stipulate that a backwards incompatible 
change in an API would necessitate a MAJOR version bump. This 
indicates a break in the original contract.




I'm familiar, in the sense that it's what many libs/apps do, but again, 
I don't believe that library 29.x should be backwards-incompatible with 
28.x.  Major versions of a package, to me, should indicate major new 
features, but not abandonment of old features.  If you want to redesign 
some code base so it's incompatible (i.e., no longer the same thing), 
then it deserves a new name.


Let's compare the mindsets of backwards-compatible library design 
versus, oh, let's call it "major-breakage" ;) library design:


Let's say you follow a common major-breakage approach, and do this:

1) Create a general-compression-library, version 1.0, which uses the 
LZ algorithm, and exposes some details of that.

2) During the course of development, you get ideas for version 2.0.
3) You publish the 1.x library.
4) Create a general-compression-library, version 2.0.  This, you 
decide, will use the LZMA algorithm, and exposes some details of that.

5) You publish the 2.x library.
6) You receive a patch from someone, adding BZIP support, for 1.x. It 
includes code to make 1.x more general.  However, it's incompatible with 
2.x, and you've moved on, so you drop it, or backport your 2.x stuff.  
Maybe you publish 3.x, but now it's incompatible with 2.x AND 1.x...
7) All the while, people have been using your libraries in products, and 
some depend on 1.x, some on 2.x, some on 3.x.  It's a mess of 
compatibility hell, with no clear direction, security issues due to 
unmaintained code, etc.


Because details are exposed in each, 2.0 breaks compatibility with 1.x.  
Under a model where version 2.x can be incompatible with version 1.x, 
you say, OK, fine.  Slightly broken stuff, but new features.  People 
can upgrade and use the new stuff, or not.  Up to them.



The problem though, is that the thinking behind all this is wrong-headed 
--- beginning from bad assumptions --- and the acceptance of 
backward-incompatibility encourages that way of thinking.


Let's enforce backwards-compatibility, and see what *might* happen, instead:

1) You create a general-compression-library, version 1.0.  You use the 
LZ algorithm, and expose details of that.

2) During the course of development, you get ideas for 2.0
3) You're about to publish the library, and realise that your 2.0 
changes won't be backwards compatible, because 1.x exposes API details 
in a non-futureproof way.
4) You do a little extra work on 1.x, making it more general -- i.e., 
living up to its name.

5) You publish 1.x
6) You create version 2.x, which ALSO supports LZMA.
7) You publish version 2.x, which now has twice as many features, does 
what it says on the tin by being a general compression library, etc.
8) You receive a patch from someone, adding BZIP support, for 1.x. You 
merge it in, and publish 3.x, which now supports 3 compression formats.
9) All the while, people have been using your libraries in products: 
they all work with general compression library x.x, later versions being 
better, minor OR major.  No security issues, because you just upgrade to 
the latest library version.


Now, instead of one base library and two forks, you have a one library 
with three versions, each backwards-compatible, each building features 
over the last.  That's a MUCH better outcome.


Now, that does involve a bit more foresight, but I think it's the kind 
of foresight that enforcing backwards compatibility encourages, and 
rightly so.




I said *might* happen.  Let's explore another turn of events, and 
imagine that you didn't have the foresight in step 3 above: you create 
general-compression-library, never realising that it's not general at 
all, and that 1.x is going to be incompatible with 2.x, until 1.x is 
published, and you come to create 2.x.  Under a backwards-compatibility 
model, that might go like this:


1) You create general-compression-library, version 1.0, with LZ support, 
expose details of that, and publish it.
2) You want to add LZMA support to this library, but can't because it 
breaks backwards compatibility.
3) Instead, you create a new library, universal-compression-library, 
1.0, with plugin support, including 

Re: [rust-dev] Deprecating rustpkg

2014-02-01 Thread Vladimir Lushnikov
There are some great points here (that should probably go into some sort of
'best practices' doc for rust package authors).

However there is a fundamental flaw IMO - we are talking about open-source
code, where the author is not obligated to do *any* of this. Most
open-source licenses explicitly state that there are no implied warranties
of *any* kind. Indeed, I think we would see far less open-source software
published if we started imposing requirements on how to go about it (this
includes the versioning of libraries). (Whether this is a good thing for
open-source in general is open to debate).

In an enterprise-only world, this would obviously work because the
companies that are providing your libraries actually have to give you a
QoS. But with open-source software, you can pick any library you want or
fork any library you want - if you want something better that follows your
requirements, then just fork it and make the changes you want. Which
library fork will end up the most used is essentially a popularity contest.
Of course this excludes the standard library because that approach
definitely does not work there (most notably the D tango vs. phobos
train-wreck).

The 'don't switch things from under people' idea is definitely sound. But
if you care about your application's stability, you test each new library
upgrade for changes with your unit, regression and integration tests. That
is the only way to be sure that nothing is broken and semantic versioning
does not address that, except by making some instances where things are
really likely to be incompatible much clearer. If nothing else, this is why
even if you allow constraints as the units of dependency resolution, at any
given time you are relying upon a single pinned version. You should rebuild
and retest if you want to upgrade *anything*. Whether this can or should be
done dynamically (at runtime) is another question.

I disagree with the 'breaking-changes == new version' idea. The rust
developers have already said that whatever 2.0 will be, it may break
backward compatibility. This is a *good thing* because it's a chance to clean
up. One of the reasons C++ is so huge is because it almost never removes
features, and this leads to unnecessary complexity.

Someone mentioned passing two objects with the same 'type' from different
versions of a library and how this would work in terms of memory layout.
But with 'slots', this wouldn't be allowed by the linker, because
effectively it sees the two different versions of the library as different
libraries (even though they have the same name).



On Sat, Feb 1, 2014 at 2:01 PM, Lee Braiden leebr...@gmail.com wrote:

  On 01/02/14 00:09, Tony Arcieri wrote:

  On Fri, Jan 31, 2014 at 4:03 PM, Lee Braiden leebr...@gmail.com wrote:

 This would be counterproductive.  If a library cannot be upgraded to 1.9,
 or even 2.2, because some app REQUIRES 1.4, then that causes SERIOUS,
 SECURITY issues.


  Yes, these are exactly the types of problems I want to help solve. Many
 people on this thread are talking about pinning to specific versions of
 libraries. This will prevent upgrades in the event of a security problem.

  Good dependency resolvers work on constraints, not specific versions.


 Agreed.


   Are you familiar with semantic versioning?

  http://semver.org/

  Semantic Versioning would stipulate that a backwards incompatible change
 in an API would necessitate a MAJOR version bump. This indicates a break in
 the original contract.


 I'm familiar, in the sense that it's what many libs/apps do, but again, I
 don't believe that library 29.x should be backwards-incompatible with
 28.x.  Major versions of a package, to me, should indicate major new
 features, but not abandonment of old features.  If you want to redesign
 some code base so it's incompatible (i.e., no longer the same thing), then
 it deserves a new name.

 Let's compare the mindsets of backwards-compatible library design, vs
 oh, let's call it major-breakage ;) language design:

 Let's say you follow a common major-breakage approach, and do this:

 1) Create a general-compression-library, version 1.0, which uses the LZ
 algorithm, and exposes some details of that.
 2) During the course of development, you get ideas for version 2.0
 3) You publish the 1.x library
 4) Create a general-compression-library, version 2.0.  This, you decide,
 will use the LZMA algorithm, and exposes some details of that.
 5) You publish the 2.x library.
 6) You receive a patch from someone, adding BZIP support, for 1.x.  It
 includes code to make 1.x more general.  However, it's incompatible with
 2.x, and you've moved on, so you drop it, or backport your 2.x stuff.
 Maybe you publish 3.x, but now it's incompatible with 2.x AND 1.x...
 7) All the while, people have been using your libraries in products, and
 some depend on 1.x, some on 2.x, some on 3.x.  It's a mess of compatibility
 hell, with no clear direction, security issues due to unmaintained code,
 etc.

 

Re: [rust-dev] Deprecating rustpkg

2014-02-01 Thread Lee Braiden

On 01/02/14 00:12, Tony Arcieri wrote:
On Fri, Jan 31, 2014 at 4:07 PM, Vladimir Lushnikov 
vladi...@slate-project.org wrote:


Just to be clear, I think what you are saying is that you want
version pinning to be dynamic? I.e. when a new version of a
library dependency becomes available, upgrade the package with
that dependency?


I would like a constraints-based system that is able to calculate the 
latest API-compatible version of a given package based on rules less 
strict than version == X.Y.Z.
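The distinction — constraints rather than exact pins — can be sketched in a few lines of Rust. Everything here (the Version and Req types and their methods) is illustrative, not any real package manager's API:

```rust
// Hypothetical sketch; `Version` and `Req` are made-up types, not a
// real package manager's API.

#[derive(Clone, Copy, PartialEq, Eq, PartialOrd, Ord, Debug)]
struct Version { major: u32, minor: u32, patch: u32 }

enum Req {
    Exact(Version),              // the pinned antipattern: version == X.Y.Z
    AtLeastWithinMajor(Version), // looser: >= X.Y.Z, same major version
}

impl Req {
    fn matches(&self, v: Version) -> bool {
        match *self {
            Req::Exact(want) => v == want,
            Req::AtLeastWithinMajor(min) => v.major == min.major && v >= min,
        }
    }
}

fn main() {
    let req = Req::AtLeastWithinMajor(Version { major: 1, minor: 4, patch: 0 });
    // A security fix in 1.9.1 still satisfies the constraint...
    assert!(req.matches(Version { major: 1, minor: 9, patch: 1 }));
    // ...while a breaking 2.2.0 does not.
    assert!(!req.matches(Version { major: 2, minor: 2, patch: 0 }));
    println!("constraint checks passed");
}
```

Under an exact pin, the 1.9.1 security fix above would be rejected — which is exactly the problem being discussed.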




Agreed, but that's a VERY low bar for requirements; I think we need to be 
more specific.  Apt, Debian's package manager, for example, can have 
package dependency rules like these:


some-package:
Version 4.11_amd64
Depends: X-bin (ver == 2.4, ver > 3.1 && ver < 3.7) | opengl-dev
Source-Package: some-source
Build-depends: X-devel, (scons-builder (ver >= 3 && ver != 3.3) | basic-make)

nvidia-headers:
Provides: opengl-dev

ati-radeon-hd-devel:
Provides: radeon-dev

GNUMake:
Provides: basic-make

BSDMake:
Provides: basic-make


Which says that:

* You can build some-package 4.11_amd64 from some-source-4.11, any 
version of X-devel, version 3.x of scons-builder (except for 3.3, which 
is broken somehow), and anything providing basic make functionality, 
whether it's BSD's make or GNU's.


* However, if you just want to install the binary version, you only 
need one of X-bin or opengl-dev.


You could get around the fact that android-native-devkit is a whole 
bunch of tools and libraries which don't conform to the package system, 
by creating a dummy package requiring android-native-devkit, and saying 
that it provides basic-make and opengl-dev, so that the dependencies 
all work out.  As another example, you could break opengl-dev into API 
versions, saying that android-native-devkit provides opengl-dev, 
opengl2-dev, and opengl3-dev, but that ati-radeon-hd-dev provides only 
opengl-dev and opengl2-dev.


Then you can say, for example, "get android-native-devkit from here, and 
always use the latest, most unstable version, but give me the most 
stable version of BSDMake, and make sure X-bin is the stable version, 
but with the latest security patches."


One thing you can't do (without chroot/jails/containers) is to say, 
"Install these packages here, and install version 1.x of packageN here, 
with 3.x there, and 2.5 there."  That's pretty important for virtual 
hosting, and development, for example.


In short, Rust's package system should probably support:

* Package names, independent of sources
* Parseable versions
* Dependency EXPRESSIONS, including boolean logic, comparison operators, 
negation (i.e., none of the packages in this sub-expression are 
compatible), etc.

* Virtual packages which include other packages or wrap other packages
* Multiple installation paths, with a list of packages to be installed / 
maintained there
* Some way to use different installation paths based on which project 
you're in
* Some way to specify the local installation path during development, 
the default installation path for public packages, and a way to override 
the default installation path for specific sysadmin purposes.
* Some way to specify dependencies on third-party libraries / tools, 
from other languages / package managers.  I've little idea of what to do 
about that.  Probably just print an error message and quit, to begin with?
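A minimal sketch of what the "dependency EXPRESSIONS" bullet could look like, loosely modelled on the apt rules above. All types and names here are hypothetical, not a proposal for a concrete syntax:

```rust
// Hypothetical dependency-expression tree with boolean logic,
// comparison bounds, and negation. Illustrative only.
use std::collections::HashMap;

#[derive(Clone, Copy, PartialEq, Eq, PartialOrd, Ord)]
struct Ver(u32, u32);

enum Dep {
    // name plus optional version bounds
    Pkg { name: &'static str, min: Option<Ver>, max: Option<Ver> },
    All(Vec<Dep>), // every sub-expression must hold (apt's ",")
    Any(Vec<Dep>), // at least one alternative (apt's "|")
    Not(Box<Dep>), // none of the packages in this sub-expression
}

// `installed` maps package name -> version; a virtual package
// ("Provides:") would just be an entry inserted by its provider.
fn satisfied(dep: &Dep, installed: &HashMap<&str, Ver>) -> bool {
    match dep {
        Dep::Pkg { name, min, max } => installed.get(name).map_or(false, |v| {
            min.map_or(true, |m| *v >= m) && max.map_or(true, |m| *v <= m)
        }),
        Dep::All(ds) => ds.iter().all(|d| satisfied(d, installed)),
        Dep::Any(ds) => ds.iter().any(|d| satisfied(d, installed)),
        Dep::Not(d) => !satisfied(d, installed),
    }
}

fn main() {
    let mut installed = HashMap::new();
    installed.insert("opengl-dev", Ver(1, 0)); // e.g. provided by nvidia-headers

    // Rough analogue of "X-bin (ver == 2.4) | opengl-dev":
    let dep = Dep::Any(vec![
        Dep::Pkg { name: "X-bin", min: Some(Ver(2, 4)), max: Some(Ver(2, 4)) },
        Dep::Pkg { name: "opengl-dev", min: None, max: None },
    ]);
    assert!(satisfied(&dep, &installed));
    assert!(!satisfied(&Dep::Not(Box::new(dep)), &installed));
    println!("dependency expression satisfied");
}
```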



--
Lee

___
Rust-dev mailing list
Rust-dev@mozilla.org
https://mail.mozilla.org/listinfo/rust-dev


Re: [rust-dev] Deprecating rustpkg

2014-02-01 Thread Vladimir Lushnikov
Portage has a very similar syntax/way of specifying runtime vs. build-time
dependencies: http://devmanual.gentoo.org/general-concepts/dependencies/.

Apt doesn't have support for slots and USE flags (code that is
included/excluded at compile time for optional features).


On Sat, Feb 1, 2014 at 2:38 PM, Lee Braiden leebr...@gmail.com wrote:

Agreed, but that's VERY low bar for requirements; I think we need to be
 more specific.  Apt, debian's package manager, for example, can have
 package dependency rules like these:

 some-package:
 Version 4.11_amd64
 Depends: X-bin (ver == 2.4, ver > 3.1 && ver < 3.7) | opengl-dev
 Source-Package: some-source
 Build-depends: X-devel, (scons-builder (ver >= 3 && ver != 3.3) |
 basic-make)

 nvidia-headers:
 Provides: opengl-dev

 ati-radeon-hd-devel:
 Provides: radeon-dev

 GNUMake:
 Provides: basic-make

 BSDMake:
 Provides: basic-make

___
Rust-dev mailing list
Rust-dev@mozilla.org
https://mail.mozilla.org/listinfo/rust-dev


Re: [rust-dev] Deprecating rustpkg

2014-02-01 Thread Vladimir Matveev
Is it possible at all to find the latest version of a library which is
still compatible completely automatically? Incompatibilities can be
present on the logic level, so compilation with an incompatible version
will succeed, but the program will work incorrectly. I don't think
that this can be solved without assumptions about versioning (like
semver) and/or without manual intervention.

Couldn't we just use a looser variant of version pinning inside
semantic versioning, with manual user intervention when it is needed?
For example, assuming there is something like semantic versioning
adopted, packages specify dependencies on certain major version, and
the dependency resolver downloads latest available package inside this
major version. If for some reason automatically selected dependency is
incompatible with our package or other dependencies of our package,
the user can manually override this selection, maybe even with another
major version. This is, as far as I understand, the system of slots
used by Portage as Vladimir Lushnikov described. Slots correspond to
major versions in semver terms, and other packages depend on concrete
slot. But the user has ultimate power to select whichever version they
need, overriding automatic choice.

In short, we allow dependency resolver to use the latest possible
packages which should be compatible according to semantic versioning,
and if it fails, we provide the user with ability to override
dependency resolver choices.
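The proposal above — resolve to the latest release inside the requested major version, with a manual override as the escape hatch — can be sketched as follows. All names are hypothetical and semver-style versioning is assumed:

```rust
// Sketch: newest release within a major version, unless the user has
// manually overridden the choice ("slot"-style). Illustrative only.
use std::collections::HashMap;

#[derive(Clone, Copy, PartialEq, Eq, PartialOrd, Ord, Debug)]
struct Version { major: u32, minor: u32, patch: u32 }

fn resolve(
    pkg: &str,
    wanted_major: u32,
    available: &[Version],
    overrides: &HashMap<&str, Version>, // manual user selections
) -> Option<Version> {
    if let Some(&v) = overrides.get(pkg) {
        return Some(v); // the user has ultimate power, even across majors
    }
    available.iter().copied()
        .filter(|v| v.major == wanted_major)
        .max() // latest release inside this major version
}

fn main() {
    let avail = [
        Version { major: 1, minor: 4, patch: 0 },
        Version { major: 1, minor: 9, patch: 2 },
        Version { major: 2, minor: 0, patch: 0 },
    ];
    // Default: latest within major version 1.
    let no_overrides = HashMap::new();
    assert_eq!(
        resolve("rust-json", 1, &avail, &no_overrides),
        Some(Version { major: 1, minor: 9, patch: 2 })
    );
    // Manual override wins, even with another major version.
    let mut forced = HashMap::new();
    forced.insert("rust-json", Version { major: 2, minor: 0, patch: 0 });
    assert_eq!(
        resolve("rust-json", 1, &avail, &forced),
        Some(Version { major: 2, minor: 0, patch: 0 })
    );
}
```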

2014-02-01 Tony Arcieri basc...@gmail.com:
 On Fri, Jan 31, 2014 at 3:59 PM, Jack Moffitt j...@metajack.im wrote:

 The algorithm here is rather simple. We try to satisfy rust-logger and
 rust-rest. rust-rest has a version (or could be a tag like 1.x) so we
 go get that. It depends on rust-json 2.0 so we get that. Then we try
 to look for rust-logger, whatever version is latest (in rustpkg this
 would mean current master since no version or tag is given). This
 pulls in rust-json 1.0 since 1.0 != 2.0 and those have specific tags.
 Everything is built and linked as normal. Whether rust-json's
 constraints are exact revisions or they are intervals (< 2.0 and
 >= 2.0, for example), makes little difference I think.


 To reiterate, it sounds like you're describing every package pinning its
 dependencies to a specific version, which I'd consider an antipattern.

 What is to prevent a program using this (still extremely handwavey)
 algorithm from depending on rust-json 1.0, 1.1, 1.2, 1.3, 1.4, 2.0, 2.1, and
 2.2 simultaneously?

 What if some of these are buggy, but the fixed versions aren't used due to
 version pinning?

 What if rust-json 1.0 has a security issue?

 --
 Tony Arcieri


___
Rust-dev mailing list
Rust-dev@mozilla.org
https://mail.mozilla.org/listinfo/rust-dev


Re: [rust-dev] Deprecating rustpkg

2014-02-01 Thread Lee Braiden

On 01/02/14 09:39, Gaetan wrote:
There is not only API change.  Sometimes, from one minor version to 
another, a feature gets silently broken (that is, a silent regression). 
While it might not impact libA, which depends on it, it may break 
libB, which also depends on it, but with a previous version.


Silent regressions are the exceptional case though, not the norm. As a 
general rule, upgrades are important and necessary, at least for 
security reasons.  It's kind of up to developers, up to distro 
maintainers, and certainly up to mission-critical sysadmins, to choose 
software (libs, and apps which use those libs) which are QA'd well 
enough to avoid this.  Breakage SOMETIMES happens, but, much like 
recovering from a failed write to disk, you just have to weigh the odds, 
try it, then back up if it didn't work out.  In fact, you could think of 
the process of upgrading a library as a simple write followed by a 
verify.  If you do it properly, like a good admin would, it'll all be 
wrapped in a transaction that you can roll back.  BUT, the important 
part is that you'll probably still need to upgrade version x.19-x.21, 
even if upgrading x.19-x.20 fails for some reason.


Either you have a system which just works, isn't connected to the net, 
has no bugs, and no security risks associated with it, or you upgrade 
sooner or later.  In most cases, if you're acting responsibly, you 
CANNOT just install version x.19, call that a working system, and forget 
about it, installing x.21 only for newer customers / systems.  Not if 
those systems are connected to the internet, at least.


--
Lee

___
Rust-dev mailing list
Rust-dev@mozilla.org
https://mail.mozilla.org/listinfo/rust-dev


Re: [rust-dev] Deprecating rustpkg

2014-02-01 Thread Lee Braiden

On 01/02/14 14:55, Vladimir Matveev wrote:

Is it possible at all to find the latest version of a library which is
still compatible completely automatically? Incompatibilities can be
present on the logic level, so compilation with an incompatible version
will succeed, but the program will work incorrectly. I don't think
that this can be solved without assumptions about versioning (like
semver) and/or without manual intervention.


No, it's not.  It's always going to be the library developer's / 
programmer's responsibility, to some extent.


For example, if a library adds three new functions to fit within some 
begin/end wrapper, it may modify the begin/end functions to behave 
differently.  If the library author does that in a way that breaks 
existing logic, then that's a bug, to my mind, or a deliberate 
divergence / API contract breakage.


At that point, what the author has REALLY done is decided that his 
original design for begin() end(), and for that whole part of the 
library in general, is wrong, and needs a REDESIGN.  What he can then do is:


a) Create different functions, which have extended functionality, and 
support the three new in-wrapper functions.  So, you could call:


begin()
old_funcs...
end()

OR:

extended_begin()
old_funcs()
new_funcs()
extended_end()

b) Create a new library, similar to the old one, but with new 
functionality, new API guarantees, etc.


Ignoring the problem just creates a mess though, which ripples 
throughout the development space (downstream products, library forks, 
etc.), and no package manager will completely solve it after the fact, 
except to acknowledge the mess and install separate packages for every 
program that needs them (but that has security / feature-loss issues).




Couldn't we just use a looser variant of version pinning inside
semantic versioning, with manual user intervention when it is needed?
For example, assuming there is something like semantic versioning
adopted, packages specify dependencies on certain major version, and
the dependency resolver downloads latest available package inside this
major version.


You can do that within a major version, except for one case - multiple 
developers creating diverged versions of 2.13, based on 2.12, each with 
their own features.  Really, though, what you're doing is just 
shifting/brushing the compatibility issue under the rug each time: y is 
OK in x.y because x guarantees backwards compatibility.  Fork1 in 
x.y.fork1 is OK, because x.y guarantees backwards compatibility... and 
so on, ad infinitum.  Whatever level you're at, you have two issues:


a) Backwards compatibility between library versions
b) The official, named version of the library, vs. unofficial code.

Assuming you guarantee A in some way (backwards compatibility in 
general, across all versions of the library, or backwards compatibility 
for minor versions), you still have incompatibility if (b) arises, which 
it will in all distributed repository scenarios, UNLESS you can do 
something like git's version tracking per branch, where any version 
number is unique, and also implies every version before.  Then you're 
back to whether you want to do that per major version, or overall.


But doing it per major version recursively raises the question of which 
major version is authorised: what if you have a single library at 19.x, 
and TWO people create 20.0 independently?  Again, you have 
incompatibility.  So, you're back to the question of (a): is it the same 
library, or should an author simply stay within the bounds of a 
library's API, and fork a new CONCEPTUALLY DIFFERENT new lib (most 
likely with a new name) when they break that API?




If for some reason automatically selected dependency is
incompatible with our package or other dependencies of our package,
the user can manually override this selection


But what does the user know about library APIs?  He needs to dig into 
the logic of the program, and worse, the logic of underlying libraries, 
to figure out that:


somelib::begin() from 
github://somelib/someplace/v23.2/src/module1/submod2/utils.rs, line 24


does not mean the same as:

somelib::begin() from 
github://somelib/otherplace/v23.2/src/module1/submod2/utils.rs, line 35


! ;)



major version. This is, as far as I understand, the system of slots
used by Portage as Vladimir Lushnikov described. Slots correspond to
major versions in semver terms, and other packages depend on concrete
slot.


This sounds interesting (I'll have to track down Vladimir's original 
post on that), but so far, I'm not sure it solves the problem of a 
forked minor version, any more than other methods solve a forked major 
version.  It seems to me that it always comes back to people choosing to 
break library APIs, and other people trying to clean it up in one way or 
another, which ultimately fails, at some point -- major, minor, fork, 
repository, branch, or 

Re: [rust-dev] Deprecating rustpkg

2014-02-01 Thread Lee Braiden

On 01/02/14 14:49, Vladimir Lushnikov wrote:
Portage has a very similar syntax/way of specifying runtime vs. 
build-time dependencies: 
http://devmanual.gentoo.org/general-concepts/dependencies/.


Apt doesn't have support for slots and USE flags (code that is 
included/excluded at compile time for optional features).




Agreed; use flags are very nice :) I find them a bit clunky / breakable, 
though -- it's very hard to know what the valid range of flags is, and 
how that will affect every package on your system.  If Rust gets 
something similar, the exact circumstances under which they're used, the 
range of valid values, and the effects of each, should be EXTREMELY clear.



--
Lee

___
Rust-dev mailing list
Rust-dev@mozilla.org
https://mail.mozilla.org/listinfo/rust-dev


Re: [rust-dev] Deprecating rustpkg

2014-02-01 Thread Vladimir Lushnikov
I think USE flags are more appropriate for library features (which is
exactly the way portage uses them). So you have your rust app with
conditional code that depends on a particular cfg (
https://github.com/mozilla/rust/wiki/Doc-attributes) and then you expose a
list of these in your package specification so that others can know to say
- "I use the json library, but with built-in URI support".
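Rust's cfg attributes (linked above) already support this kind of conditional code; a tiny sketch, with a made-up "uri" feature flag standing in for a USE-flag-style option:

```rust
// Hypothetical optional feature, compiled in or out USE-flag style.
// The "uri" feature name is illustrative.

// Compiled only when the consumer turns the flag on:
#[cfg(feature = "uri")]
pub fn decode_uri(s: &str) -> String {
    s.replace("%20", " ")
}

// Fallback when the flag is off:
#[cfg(not(feature = "uri"))]
pub fn decode_uri(s: &str) -> String {
    s.to_string()
}

fn main() {
    // Behaviour depends on whether the build enabled the `uri` feature;
    // either way, the same API is available to callers.
    println!("{}", decode_uri("a%20b"));
}
```

The open question in the thread is only whether the package metadata should expose such flags so that dependents can depend on them.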


On Sat, Feb 1, 2014 at 3:45 PM, Lee Braiden leebr...@gmail.com wrote:

 On 01/02/14 14:49, Vladimir Lushnikov wrote:

 Portage has a very similar syntax/way of specifying runtime vs.
 build-time dependencies: http://devmanual.gentoo.org/
 general-concepts/dependencies/.

 Apt doesn't have support for slots and USE flags (code that is
 included/excluded at compile time for optional features).


 Agreed; use flags are very nice :) I find them a bit clunky / breakable,
 though -- it's very hard to know what the valid range of flags is, and how
 that will affect every package on your system.  If Rust gets something
 similar, the exact circumstances under which they're used, the range valid
 values, and the effects of each, should be EXTREMELY clear.


 --
 Lee


___
Rust-dev mailing list
Rust-dev@mozilla.org
https://mail.mozilla.org/listinfo/rust-dev


Re: [rust-dev] Deprecating rustpkg

2014-02-01 Thread Lee Braiden

On 01/02/14 15:48, Vladimir Lushnikov wrote:
I think USE flags are more appropriate for library features (which is 
exactly the way portage uses them). So you have your rust app with 
conditional code that depends on a particular cfg 
(https://github.com/mozilla/rust/wiki/Doc-attributes) and then you 
expose a list of these in your package specification so that others 
can know to say - I use the json library but with built-in URI support.


Interesting.  I was thinking more of compiling for specific CPU 
optimisations, etc.  For the "use this optional library" thing, Debian 
seems to mostly just use optional / recommended dependencies.  The 
package manager informs you that a package is recommended / optional, 
and you can install them if you want.  Then the ./configure script or 
whatever will normally just use it if it's there, by default if that's 
considered sensible as a default, or you can build it with extra flags 
manually, to make it build in a non-default way.  I like that Debian 
exposes those optional packages at the package manager level, but the 
global / local (iirc) use flags make a lot of sense too.


Some hybrid that had option flags when installing/building, and informed 
you of additional packages needed (much like when you select features 
to install in a GUI installer), folding that back into the package 
management/dependencies etc. might be best, but it would be relatively 
complex to implement.


--
Lee

___
Rust-dev mailing list
Rust-dev@mozilla.org
https://mail.mozilla.org/listinfo/rust-dev


Re: [rust-dev] Deprecating rustpkg

2014-02-01 Thread Lee Braiden

Ah, this:

On 01/02/14 15:43, Lee Braiden wrote:

extended_begin()
old_funcs()
new_funcs()
extended_end()


should read more like:

begin()
old_funcs()

extended_begin()
new_funcs()
extended_end()
end()


--
Lee

___
Rust-dev mailing list
Rust-dev@mozilla.org
https://mail.mozilla.org/listinfo/rust-dev


Re: [rust-dev] Deprecating rustpkg

2014-02-01 Thread Sean McArthur
On Fri, Jan 31, 2014 at 1:05 PM, Tony Arcieri basc...@gmail.com wrote:

 IMO, a system that respects semantic versioning, allows you to constrain
 the dependency to a particular *major* version without requiring pinning
 to a *specific* version.

 I would call anything that requires pinning to a specific version an
 antipattern. Among other things, pinning to specific versions precludes
 software updates which may be security-critical.


It's perfectly reasonable to require a certain *minor* version, since minor
versions (in semver) can include API additions that you may depend on.

Also, nodejs and npm supposedly support semver, but it's impossible to
enforce library authors actually do this, so you'll get libraries with
breaking changes going from 1.1.2 to 1.1.3 because reasons.
___
Rust-dev mailing list
Rust-dev@mozilla.org
https://mail.mozilla.org/listinfo/rust-dev


Re: [rust-dev] Deprecating rustpkg

2014-02-01 Thread Gaetan
why not enforce, in one way or another, an API compatibility test suite for
ensuring at least a certain level of compatibility between two versions? I
think it is something quite doable, and moreover this would kinda force
package maintainers to write unit tests, which is always a good practice.
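One way to read this suggestion: freeze the 1.x test suite and run it unchanged against a 2.x candidate, so the old tests act as an API-compatibility gate. A toy sketch, with a made-up compress function standing in for some library's API:

```rust
// Illustrative only: `compress` stands in for a hypothetical library
// function; this pretends to be the 2.x candidate under test.
fn compress(data: &[u8]) -> Vec<u8> {
    let mut out = vec![b'L', b'Z']; // keeps the 1.x header contract
    out.extend_from_slice(data);
    out
}

// Contract tests frozen at 1.x; a 2.x release that fails them has
// broken backwards compatibility, whatever its version number claims.
fn contract_suite_v1() {
    assert_eq!(&compress(b"")[..2], b"LZ"); // header is stable
    assert!(compress(b"abc").len() >= 3);   // output holds the input
}

fn main() {
    contract_suite_v1();
    println!("candidate passes the 1.x contract suite");
}
```

This only catches breakage the old tests happen to exercise, which is the same caveat Vladimir raised earlier about testing upgrades.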

-
Gaetan



2014-01-31 Sean McArthur s...@seanmonstar.com:

 On Fri, Jan 31, 2014 at 1:05 PM, Tony Arcieri basc...@gmail.com wrote:

 IMO, a system that respects semantic versioning, allows you to constrain
 the dependency to a particular *major* version without requiring pinning
 to a *specific* version.

 I would call anything that requires pinning to a specific version an
 antipattern. Among other things, pinning to specific versions precludes
 software updates which may be security-critical.


 It's perfectly reasonable to require a certain *minor* version, since
 minor versions (in semver) can include API additions that you may depend on.

 Also, nodejs and npm supposedly support semver, but it's impossible to
 enforce library authors actually do this, so you'll get libraries with
 breaking changes going from 1.1.2 to 1.1.3 because reasons.



___
Rust-dev mailing list
Rust-dev@mozilla.org
https://mail.mozilla.org/listinfo/rust-dev


Re: [rust-dev] Deprecating rustpkg

2014-02-01 Thread Lee Braiden

On 01/02/14 18:54, Gaetan wrote:
why not enforce, in one way or another, an API compatibility test suite 
for ensuring at least a certain level of compatibility between two 
versions? I think it is something quite doable, and moreover this would 
kinda force package maintainers to write unit tests, which is always a 
good practice.


At the moment, we're trying to agree the policy.  After the policy is 
agreed, tools could be created to help ensure that those policies are 
met.  People would then use them if they see fit, or they could be built 
into package creation / version upload tools as standard. The first 
thing is to agree a reliable, sensible policy that improves the quality 
of software / package management, and is WORTH enforcing, though.



--
Lee

___
Rust-dev mailing list
Rust-dev@mozilla.org
https://mail.mozilla.org/listinfo/rust-dev


Re: [rust-dev] Deprecating rustpkg

2014-02-01 Thread Lee Braiden

On 01/02/14 18:54, Gaetan wrote:
why not enforce, in one way or another, an API compatibility test suite 
for ensuring at least a certain level of compatibility between two 
versions? I think it is something quite doable, and moreover this would 
kinda force package maintainers to write unit tests, which is always a 
good practice.




One other thing: I don't believe a certain level of compatibility is a 
useful attribute to track in releases.  Either something is fully 
compatible, or it breaks existing software.  It might be useful to judge 
the suitability of software for release (i.e., software passes one 
release-readiness test when it's fully compatible with a previous 
release), but that's a different thing, imho.


--
Lee

___
Rust-dev mailing list
Rust-dev@mozilla.org
https://mail.mozilla.org/listinfo/rust-dev


Re: [rust-dev] Deprecating rustpkg

2014-02-01 Thread Vladimir Matveev
 You can do that within a major version, except for one case - multiple 
 developers creating diverged versions of 2.13, based on 2.12, each with 
 their own features.  
 ...
 But doing it per major version recursively raises the question of which 
 major version is authorised: what if you have a single library at 19.x, 
 and TWO people create 20.0 independently?  Again, you have 
 incompatibility.  So, you're back to the question of (a): is it the same 
 library, or should an author simply stay within the bounds of a 
 library's API, and fork a new CONCEPTUALLY DIFFERENT new lib (most 
 likely with a new name) when they break that API?

I think that forks should be considered as completely different libraries. This
shouldn't be a problem when a certain naming scheme is used, for example,
two-level names like in the Java world. A central repository will certainly
help, because each entry in it will be controlled by a concrete user. These
entries can also be linked with a version control stream which represents the
main development line. No ambiguities here.

It may then be desirable to use a specific fork instead of the mainline project.
This can be a feature of the overriding system, which will be present anyway. If the
user wants to use a fork instead of a library (all its versions or a specific
version), he/she will be able to specify this requirement somehow, and the
dependency resolver will take it into account. Obviously, package authors will
be able to choose the default fork which they want to use.

 But what does the user know about library APIs?  He needs to dig into 
 the logic of the program, and worse, the logic of underlying libraries, 
 to figure out that:
 
  somelib::begin() from 
 github://somelib/someplace/v23.2/src/module1/submod2/utils.rs, line 24
 
 does not mean the same as:
 
  somelib::begin() from 
 github://somelib/otherplace/v23.2/src/module1/submod2/utils.rs, line 35
 
 ! ;)
 

When this API is used directly by the package, then the user *should* know
about it. He's using it, after all. If this API belongs to a transitive
dependency, then I don't think there is an ideal solution. Either the version is
pinned (as in the Java world), or it is chosen by the dependency resolver. In the
former case all transitive dependencies are guaranteed to be intercompatible,
because these pinned versions were deliberately chosen by the library developers.

In the latter case there is always a possibility of compatibility problems,
because it is impossible to guarantee complete compatibility - libraries are
written by people, after all. Then it is the user's responsibility to resolve
these problems, no one else will be able to do this.


Re: [rust-dev] Deprecating rustpkg

2014-02-01 Thread Isaac Dupree

On 02/01/2014 06:27 AM, Matthieu Monrocq wrote:

In short, isn't there a risk of crashes if one accidentally links two
versions of a given library and start exchanging objects ? It seems
impractical to prove that objects created by one version cannot
accidentally end up being passed to the other version:

- unless the types differ at compilation time (seems awkward)


Haskell does this. Types are equal if their {package, package-version, 
module-name, type-name} is the same.  (Or maybe it is even more rigorous 
about type equality.)  Using multiple versions of some packages turns 
out not to be awkward at all, such as libraries for writing tests and 
libraries that don't export important data types.


-Isaac



Re: [rust-dev] Deprecating rustpkg

2014-02-01 Thread Lee Braiden

On 01/02/14 19:32, Vladimir Matveev wrote:
When this API is used directly by the package, then the user *should* 
know about it. He's using it, after all.


There are developers (direct library users), and then distro 
maintainers/admins/users who need to manage libraries installed on their 
system.  The former should know, but the others shouldn't have to think 
about it, yet should (must) be able to override the defaults if they 
need to, at least for shared libraries.  Presumably we want shared 
libraries and static libraries to function similarly, except for whether 
the user chooses static or dynamic linkage.


If this API belongs to a transitive dependency, then I don't think 
there is an ideal solution. Either the version is pinned (like in Java 
world), or it is chosen by the dependency resolver.


If we're talking about pinning to an absolute version (no upgrades), 
then I think that's a security / bugfix issue, unless we're also talking 
about static linkage in that case (which is reasonable, because then the 
bug is essentially part of the black box that is the software the user 
is installing, and in that case the software maintainer is also 
responsible for releasing updates to fix bugs within the statically 
linked code).


In the former case all transitive dependencies are guaranteed to be 
intercompatible


Are they?  What if the statically pinned version of a scanner library 
doesn't support the user's new scanner, there's an update to support his 
scanner, but it's ignored because the software allows only an absolute 
version number?


because these pinned versions were deliberately chosen by libraries 
developers.


Who are not infallible, and do/should not get to choose everything about 
the target system's libraries.  There is also a freedom issue, regarding 
someone's right to implement a new version of the library, say, to port 
it to a new GUI toolkit.


In the latter case there is always a possibility of compatibility 
problems, because it is impossible to guarantee complete compatibility 
- libraries are written by people, after all.


Yes, but we can encourage it, just like we encourage immutability, even 
though we can't force everyone to use it.


Then it is the user's responsibility to resolve these problems, no one 
else will be able to do this. 


But the user can't do this, if new libraries break old programs, or old 
programs won't allow upgrading.



--
Lee






Re: [rust-dev] Deprecating rustpkg

2014-02-01 Thread Lee Braiden

On 01/02/14 19:59, Isaac Dupree wrote:

On 02/01/2014 06:27 AM, Matthieu Monrocq wrote:

In short, isn't there a risk of crashes if one accidentally links two
versions of a given library and start exchanging objects ? It seems
impractical to prove that objects created by one version cannot
accidentally end up being passed to the other version:

- unless the types differ at compilation time (seems awkward)


Haskell does this. Types are equal if their {package, package-version, 
module-name, type-name} is the same.  (Or maybe it is even more 
rigorous about type equality.)  Using multiple versions of some 
packages turns out not to be awkward at all, such as libraries for 
writing tests and libraries that don't export important data types.




This sounds useful, but still seems like it's prone to error, unless you 
can define versions in some reliable way, which works despite 
distributed repositories, branches on those repositories, etc.


Does anyone have a proposal for methods of doing that?  I think it would 
require tracking version + hash of all code --- a bit like the way git 
tracks the head of a branch.  Is that what the hash in rust libraries 
currently includes?



--
Lee

___
Rust-dev mailing list
Rust-dev@mozilla.org
https://mail.mozilla.org/listinfo/rust-dev


Re: [rust-dev] Deprecating rustpkg

2014-02-01 Thread Vladimir Matveev
To clarify, when I wrote "user" I meant the developer who uses the
package, not the end user of the complete program.

 On 01/02/14 19:32, Vladimir Matveev wrote:
  When this API is used directly by the package, then the user *should* 
  know about it. He's using it, after all.
 
 There are developers (direct library users), and then distro 
 maintainers/admins/users who need to manage libraries installed on their 
 system.  The former should know, but the others shouldn't have to think 
 about it, yet should (must) be able to override the defaults if they 
 need to, at least for shared libraries.  Presumably we want shared 
 libraries and static libraries to function similarly, except for whether 
 the user chooses static or dynamic linkage.

Well, it seems that working for a long time with code targeting virtual
machines is corrupting :) I completely forgot about different models of
compilation. I see your point. But I think that development and distribution
should be considered separately. A package manager for developers should be
part of the language infrastructure (like rustpkg is now for Rust and, for example,
the go tool for Go or cabal for Haskell). This package manager allows
flexible management of Rust libraries and their dependencies, and it should be
integrated with the build system (or *be* this build system). It is used by
developers to create applications and libraries, and by maintainers to prepare
these applications and libraries for integration with the distribution system
for end users.

The package manager for general users (I'll call it the system package manager),
however, depends on the OS, and it is the maintainer's task to determine the correct
dependencies for each package. The Rust package manager should not depend in any
way on the system package manager and its packages, because each system has its
own package manager, and it is just impossible to support them all. Rust also
should not force usage of a concrete user-level package manager (like 0install,
for example), because this means additional unrelated software on the user's
installation.

Go and Haskell do not have this problem because they are linked statically, and
their binary packages do not have any library dependencies at all. Rust is a
different story, as it supports and encourages dynamic linkage. I think that
maintainers should choose a standard set of Rust libraries which is OK for most
applications, and support and update them and their dependent applications. If
there are conflicts between versions (for example, some application started to
depend on another fork of a library), then maintainers should resolve this in
the standard way of their distribution system (e.g. slots for Portage, name
suffixes in apt, and so on).

Essentially there is a large graph of packages in the Rust world, consisting of
packages under Rust package manager control (the main graph). Developers work
only with this graph. Then, for each distribution system, that system's
maintainers pull packages from the main graph and adapt them to their system in
the way the system allows and encourages. I don't think that it is possible to achieve
anything better than this. We cannot and should not force end users to use
something other than their packaging system.

  If this API belongs to a transitive dependency, then I don't think 
  there is an ideal solution. Either the version is pinned (like in Java 
  world), or it is chosen by the dependency resolver.
 
 If we're talking about pinning to an absolute version (no upgrades), 
 then I think that's a security / bugfix issue, unless we're also talking 
 about static linkage in that case (which is reasonable because then the 
 bug is essentially part of the black box that is the software the user 
 is installing, and in that case, the software maintainer is also 
 responsible for releasing updates to fix bugs within the statically 
 linked code.
 
  In the former case all transitive dependencies are guaranteed to be 
  intercompatible
 
 Are they?  What if the statically pinned version of a scanner library 
 doesn't support the user's new scanner, there's an update to support his 
 scanner, but it's ignored because the software allows only an absolute 
 version number?

I don't think your example is related. By guaranteed intercompatibility I meant
something like the following. Suppose your package is called `package`. It
depends on `foo-x`, which in turn depends on `bar-y`. When versions are always
pinned by their developers, `foo`'s author deliberately chose the `bar-y`
version, and he knows that the `foo-x` library will work properly with `bar-y`.
This is how the Java ecosystem works now. A new scanner, however, is not an API
feature. Your example seems to support the general point about outdated
dependencies, and I generally agree with it.

  because these pinned versions were deliberately chosen by libraries 
  developers.
 
 Who are not infallible, and do/should not get to choose everything about 
 the target system's libraries.  

Re: [rust-dev] Deprecating rustpkg

2014-02-01 Thread Daniel Micay
On Sat, Feb 1, 2014 at 4:28 PM, Vladimir Matveev

 Well, it seems that working for a long time with a code targeting virtual
 machines is corrupting :) I completely forgot about different models of
 compilation. I see your point. But I think that developing and distributing
 should be considered separately. Package manager for developers should be a
 part of language infrastructure (like rustpkg is now for Rust and, for 
 example,
 go tool for Go language or cabal for Haskell). This package manager allows
 flexible management of Rust libraries and their dependencies, and it should be
 integrated with the build system (or *be* this build system). It is used by
 developers to create applications and libraries and by maintainers to prepare
 these applications and libraries for integration with the distribution system
 for end users.

How will it handle external dependencies?

 Package manager for general users (I'll call it system package manager),
 however, depends on the OS, and it is maintainer's task to determine correct
 dependencies for each package. Rust package manager should not depend in any
 way on the system package manager and its packages, because each system has 
 its
 own package manager, and it is just impossible to support them all. Rust also
 should not force usage of concrete user-level package manager (like 0install,
 for example), because this means additional unrelated software on the user
 installation.

I don't understand this. A package manager specific to Rust is
additional software, just like 0install. 0install has full support for
installing dependencies via the system package manager on many systems
if desired.

http://0install.net/distribution-integration.html


Re: [rust-dev] Deprecating rustpkg

2014-01-31 Thread Gaetan
On Friday, 31 January 2014, Val Markovic v...@markovic.io wrote:

 This is a huge problem in large C++ codebases. It is not fun. An example:
 every version of Xerces-C++ puts its code in a new C++ namespace
 (http://xerces.apache.org/xerces-c/program-others-3.html),
 so code is in xerces_3_0, xerces_3_1, xerces_3_2 etc to prevent these kinds
 of issues.


We did that at work; it seems to be the only practical solution. I
don't like seeing a hash in the library file name or symbol name, but
it is very efficient for easily managing inter-dependencies.



-- 
-
Gaetan


Re: [rust-dev] Deprecating rustpkg

2014-01-31 Thread Lee Braiden

On 31/01/14 08:05, Gaetan wrote:
On Friday, 31 January 2014, Val Markovic v...@markovic.io wrote:


This is a huge problem in large C++ codebases. It is not fun. An
example: every version of Xerces-C++ puts its code in a new C++
namespace
(http://xerces.apache.org/xerces-c/program-others-3.html), so code
is in xerces_3_0, xerces_3_1, xerces_3_2 etc to prevent these
kinds of issues.


We did that at work; it seems to be the only practical solution. 
I don't like seeing a hash in the library file name or symbol 
name, but it is very efficient for easily managing inter-dependencies.


This seems like a very wrong-headed approach to me.  The Amiga had a 
very simple and effective library system, which makes me wonder why 
other systems overcomplicate it.  It followed rules something like 
these, iirc:


1) If a minor change or bugfix happens, increment the minor version.
2) If a major change, which is backwards compatible (i.e., new features) 
happens, then increment the major version.
3) When loading libraries, you can specify a major and a minor version, 
with 0 for the minor version if you like.  You get at least that 
version, or better, or the loading fails.
4) If an incompatible change happens, then it's not fulfilling the same 
library API any more, so you stop trying to force square pegs into round 
holes, and **just rename the damn thing** ;) ;)


Rule 4 seems to be where every other OS's libraries make a big mistake.
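A minimal sketch of the loading check these rules imply, assuming rule 4 holds so that anything published under the same name is backwards compatible. The type and function names here are illustrative, not any real loader API:

```rust
// Amiga-style loading rule (rule 3): you get at least the requested
// version, or better, or loading fails. Because incompatible changes
// get a *new name* (rule 4), any equal-or-newer version is acceptable.
#[derive(Debug, Clone, Copy, PartialEq, Eq, PartialOrd, Ord)]
struct Version {
    major: u32,
    minor: u32,
}

fn satisfies(installed: Version, requested: Version) -> bool {
    // Derived Ord compares (major, minor) lexicographically.
    installed >= requested
}

fn main() {
    let installed = Version { major: 2, minor: 5 };
    assert!(satisfies(installed, Version { major: 2, minor: 0 }));
    // A newer major is still compatible: incompatible forks were renamed.
    assert!(satisfies(installed, Version { major: 1, minor: 9 }));
    // Requesting a newer version than is installed fails.
    assert!(!satisfies(installed, Version { major: 2, minor: 6 }));
    assert!(!satisfies(installed, Version { major: 3, minor: 0 }));
    println!("ok");
}
```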


For the internet age, there are new complexities of decentralised forks, 
I think we'd need a few  more rules:


5) Library names have namespaces, something like Java's (and go's?) 
com.org.libname system
6) Anything unofficial (i.e., your patch to version 1.3, bringing it to 
an UNOFFICIAL version 1.4) goes in your own namespace, until accepted 
into the official codebase, OR you fork your own, NEW, incompatible 
library, as in (4)+(5).



--
Lee



Re: [rust-dev] Deprecating rustpkg

2014-01-31 Thread Gaetan

 1) If a minor change or bugfix happens, increment the minor version.
 2) If a major change, which is backwards compatible (i.e., new features)
 happens, then increment the major version.
 3) When loading libraries, you can specify a major and a minor version,
 with 0 for the minor version if you like.  You get at least that version,
 or better, or the loading fails.
 4) If an incompatible change happens, then it's not fulfilling the same
 library API any more, so you stop trying to force square pegs into round
 holes, and **just rename the damn thing** ;) ;)

 Rule 4 seems to be where every other OS's libraries make a big mistake.


It's a matter of politics. You can't choose for every project. Just let
people choose what is best for them. The versioning system for each project is
different; versioning of Windows libraries is different from Linux or
Mac.

For the internet age, there are new complexities of decentralised forks, I
 think we'd need a few  more rules:

 5) Library names have namespaces, something like Java's (and go's?)
 com.org.libname system


I hate this. Even if it is very logical, it's anti-ergonomic. What is
uglier than having the first directory in your source base named com or
org?


 6) Anything unofficial (i.e., your patch to version 1.3, bringing it to an
 UNOFFICIAL version 1.4) goes in your own namespace, until accepted into the
 official codebase, OR you fork your own, NEW, incompatible library, as in
 (4)+(5).



 --
 Lee




Re: [rust-dev] Deprecating rustpkg

2014-01-31 Thread Strahinja Markovic
On Fri Jan 31 2014 at 10:01:32 AM, Tony Arcieri basc...@gmail.com wrote:

 Or can you find a solution where both lib A and lib B are happy with one
 particular version of lib C?


You often cannot find one version that satisfies everyone. I've explained
the reasons why in my previous email.


 At what point can you pick one version of lib C as opposed to two or three
 or more?

 For that matter, how do you even resolve dependencies in this sort of
 world? What algorithm do you use?


You depend on the latest version of a lib or on a specific, explicit
version. People who don't care about the old version use the latest, and
people who need the old lib version because they haven't updated use that.


 I also don't think this comes up in practice as you allege.


This gave me a good, long chuckle (honestly, I mean no offense). Thank you,
I needed that. My day is already better.

If only I lived in a world where this issue doesn't come up in practice...



 --
 Tony Arcieri



[rust-dev] Deprecating rustpkg

2014-01-31 Thread Tony Arcieri
On Friday, January 31, 2014, Vladimir Lushnikov
vladi...@slate-project.org wrote:

 There are some very interesting questions being raised here.


You quoted my question (perhaps unintentionally) but didn't answer it...

What algorithm do you use to resolve dependencies in a multi-versioned
world? For a world where you're searching (a DAG) for a set of mutually
compatible versions, toposort is an answer. But what about in a situation
where, based on a given set of constraints, you may or may not want to use
multiple versions of the same dependency?

Unless there's an answer to this question, I think the rustpkg versioning
model is a total nonstarter. I will also note that all major dependency
resolvers (e.g. maven, bundler, apt-get, yum) calculate solutions which
consist of one particular version for each package.

The "version compatibility is hard, let's go shopping!" school of thought
sounds nice in theory, but with nearly a decade of battle scars from
working with a system like that (RubyGems), my experience tells me it's a
terrible, terrible idea in practice...


-- 
Tony Arcieri


Re: [rust-dev] Deprecating rustpkg

2014-01-31 Thread Strahinja Markovic
On Fri Jan 31 2014 at 2:29:16 PM, Jack Moffitt j...@metajack.im wrote:

  I am still confused about:
 
  1) In what situations are you saying there would actually be a workable
  multi-version solution that wouldn't exist in a system like Maven
  2) How these solutions are calculated

 Every symbol in a crate is hashed with its version. You can load two
 extern mods of the same library with different versions with no
 problems. Even within the same crate. To the extent that the two
 different versions aren't creating a socket to talk to each other with
 different versions of a protocol, this works fine. As an example,
 think about a JSON library that has undergone significant API changes.
 Each version can still read and write JSON, but the API to use them
 might be quite different. There's no inherent conflict to using both
 versions.

 As for how you would calculate this, it's easy. You just use exactly
 the versions that the extern mods ask for. If they ask for a version,
 you pick that one. If they don't, you pick the newest one. This breaks
 the same way as any other algorithm when the unconstrained
 dependencies can't be resolved.


This is exactly the system I was referring to. Rust makes it easy to have
multiple versions of a library linked into a single binary. If the
libraries you depend on all ask for the latest (or unspecified) version of
a library, you pull in the latest. If they ask for a specific version, you
link that in as well.

The package manager should make this more granular as well; instead of
offering "latest" and exact version, you specify depending on libfoo with
"latest", "1.2.3", "1.2.x", "1.x", etc. If the constraint is "1.x", you pick
the latest version that's still 1.x.

And then you link in *all* the libs that are necessary to satisfy the
various constraints; you *don't* look for the One True Version that
satisfies all of your deps.
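The constraint forms described above could be sketched like this. This is a hypothetical illustration, not rustpkg's actual API; note that each dependency edge is resolved independently, with no search for one global solution:

```rust
// Toy model of per-dependency version constraints and resolution.
#[derive(Debug, Clone, Copy, PartialEq, Eq, PartialOrd, Ord)]
struct Version { major: u32, minor: u32, patch: u32 }

enum Constraint {
    Latest,                                      // "latest"
    Exact(Version),                              // "1.2.3"
    Wildcard { major: u32, minor: Option<u32> }, // "1.x" or "1.2.x"
}

fn matches(c: &Constraint, v: Version) -> bool {
    match *c {
        Constraint::Latest => true,
        Constraint::Exact(e) => e == v,
        Constraint::Wildcard { major, minor } =>
            v.major == major && minor.map_or(true, |m| v.minor == m),
    }
}

// Each edge picks the newest available version matching its constraint;
// different edges may legitimately pick different versions.
fn resolve(c: &Constraint, available: &[Version]) -> Option<Version> {
    available.iter().copied().filter(|v| matches(c, *v)).max()
}

fn main() {
    let avail = [
        Version { major: 1, minor: 2, patch: 3 },
        Version { major: 1, minor: 4, patch: 0 },
        Version { major: 2, minor: 0, patch: 1 },
    ];
    // "1.x" picks the newest 1.x release.
    assert_eq!(resolve(&Constraint::Wildcard { major: 1, minor: None }, &avail),
               Some(Version { major: 1, minor: 4, patch: 0 }));
    // "latest" picks the newest overall.
    assert_eq!(resolve(&Constraint::Latest, &avail),
               Some(Version { major: 2, minor: 0, patch: 1 }));
    // An exact constraint pins one release.
    assert_eq!(resolve(&Constraint::Exact(Version { major: 1, minor: 2, patch: 3 }), &avail),
               Some(Version { major: 1, minor: 2, patch: 3 }));
    println!("ok");
}
```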



 This is substantially different than the Java (or Erlang) world where
 the function names are not versioned behind the scenes and would
 clobber each other.

 Hopefully I haven't misunderstood your question.

 jack.


Re: [rust-dev] Deprecating rustpkg

2014-01-31 Thread Strahinja Markovic
On Fri Jan 31 2014 at 3:03:54 PM, Tony Arcieri basc...@gmail.com wrote:

I am 100% clear that, from a technical perspective, Rust has the ability to
support multiple different versions of the same library simultaneously.

However:

1) Is this a good idea?


Of course it's a good idea. There's a need for that today in C++, Python
and other languages. Without this feature you end up in the clash of
"there's no single version that satisfies all of my deps" and you're sad
and unhappy. And waste time and money.

With Rust, the symbols won't conflict so there's no downside.


2) Is this compatible with the idea of toposort-style dependency resolution?


I honestly can't even say I understand what exactly you mean by
"toposort-style dependency resolution", but I can't help but feel that the
answer to your question is *"Why do we care?"* Implementing an algorithm
that fulfills the design I and others have proposed is trivial.


Can anyone point to a real-world example of a dependency resolver which can
produce solutions which may-or-may-not contain multiple versions of the
same library?


What's the point? I don't know of any language other than Rust that doesn't
bork when you link/load/eval multiple versions of a library in the same
binary/process/interpreter. So no other language even *could* have
implemented this without hacks like adding the library version to its
name/namespace/whatever.

Let's not limit what we can build for Rust by constraining ourselves to
what others have built for languages that don't have Rust's capabilities.



-- 
Tony Arcieri


Re: [rust-dev] Deprecating rustpkg

2014-01-31 Thread Tony Arcieri
On Fri, Jan 31, 2014 at 3:24 PM, Strahinja Markovic v...@markovic.io wrote:

 I honestly can't even say I understand what exactly you mean by
 toposort-style dependency resolution, but I can't help but feel that the
 answer to your question is *Why do we care?.* Implementing an algorithm
 that fulfills the design I and others have proposed is trivial.


What is this algorithm, and what is a concrete example of a set of
dependencies that your hypothetical algorithm can solve which toposort
can't?


 What's the point? I don't know of any language other than Rust that
 doesn't bork when you link/load/eval multiple versions of a library in the
 same binary/process/interpreter. So no other language even *could *have
 implemented this without hacks like adding the library version to its
 name/namespace/whatever.

 Let's not limit what we can build for Rust by constraining ourselves to
 what others have built for languages that don't have Rust's capabilities.


If the dependency resolver can't support finding solutions which are
satisfied by multiple versions of the same library in cases where a
toposort-style dependency resolver can't, the entire endeavor is a
pointless waste of time which does nothing but complect the packaging
system and make it more confusing.

I am strongly suggesting before you go down this road you figure out:

1) *Exactly* what problem you're trying to solve
2) *Exactly* how you intend to solve it

RubyGems went down the road of trying to support the simultaneous
installation of multiple versions of the same package. However, Bundler,
the dependency resolution tool, ended up resolving packages to a single,
specific version. This made RubyGems support of multiple versions of
packages not only useless, but annoying, and even more cruft (e.g. rvm
gemsets) was added to work around the impedance mismatch between the
package manager and the dependency resolver.

I hope Rust will not make the same mistake for lack of better planning.

--
Tony Arcieri


Re: [rust-dev] Deprecating rustpkg

2014-01-31 Thread Corey Richardson
I see where Tony is coming from for this one. Just because we *can*
doesn't necessarily mean we should. If possible we should definitely
prefer to find a common version that both libraries can be happy with.
I myself don't have the answers to his questions, though.

On Fri, Jan 31, 2014 at 6:24 PM, Strahinja Markovic v...@markovic.io wrote:


 On Fri Jan 31 2014 at 3:03:54 PM, Tony Arcieri basc...@gmail.com wrote:

 I am 100% clear that, from a technical perspective, Rust has the ability to
 support multiple different versions of the same library simultaneously.

 However:

 1) Is this a good idea?


 Of course it's a good idea. There's a need for that today in C++, Python and
 other languages. Without this feature you end up in the clash of there's no
 single version that satisfies all of my deps and you're sad and unhappy.
 And waste time and money.

 With Rust, the symbols won't conflict so there's no downside.


 2) Is this compatible with the idea of toposort-style dependency resolution?


 I honestly can't even say I understand what exactly you mean by
 toposort-style dependency resolution, but I can't help but feel that the
 answer to your question is Why do we care?. Implementing an algorithm that
 fulfills the design I and others have proposed is trivial.


 Can anyone point to a real-world example of a dependency resolver which can
 produce solutions which may-or-may-not contain multiple versions of the same
 library?


 What's the point? I don't know of any language other than Rust that doesn't
 bork when you link/load/eval multiple versions of a library in the same
 binary/process/interpreter. So no other language even could have implemented
 this without hacks like adding the library version to its
 name/namespace/whatever.

 Let's not limit what we can build for Rust by constraining ourselves to what
 others have built for languages that don't have Rust's capabilities.



 --
 Tony Arcieri




Re: [rust-dev] Deprecating rustpkg

2014-01-31 Thread Strahinja Markovic
On Fri Jan 31 2014 at 3:33:00 PM, Tony Arcieri basc...@gmail.com wrote:

 I am strongly suggesting before you go down this road you figure out:

 1) *Exactly* what problem you're trying to solve
 2) *Exactly* how you intend to solve it


I'll try to find some time today/weekend to write up a Google Doc
explaining what I'm proposing (and the algorithm to do it) in more detail.
I'll ping this thread with a link.



 RubyGems went down the road of trying to support the simultaneous
 installation of multiple versions of the same package. However, Bundler,
 the dependency resolution tool, ended up resolving packages to a single,
 specific version. This made RubyGems support of multiple versions of
 packages not only useless, but annoying, and even more cruft (e.g. rvm
 gemsets) was added to work around the impedance mismatch between the
 package manager and the dependency resolver.

 I hope Rust will not make the same mistake for lack of better planning.

 --
 Tony Arcieri



Re: [rust-dev] Deprecating rustpkg

2014-01-31 Thread Tony Arcieri
On Fri, Jan 31, 2014 at 3:59 PM, Jack Moffitt j...@metajack.im wrote:

 The algorithm here is rather simple. We try to satisfy rust-logger and
  rust-rest. rust-rest has a version (or could be a tag like 1.x) so we
 go get that. It depends on rust-json 2.0 so we get that. Then we try
 to look for rust-logger, whatever version is latest (in rustpkg this
 would mean current master since no version or tag is given). This
 pulls in rust-json 1.0 since 1.0 != 2.0 and those have specific tags.
 Everything is built and linked as normal. Whether rust-json's
 constraints are exact revisions or intervals (< 2.0 and >=
 2.0, for example) makes little difference, I think.
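The walk Jack describes can be illustrated with a toy graph. The crate names are from his example; the graph encoding and function names are invented for illustration, and rust-logger's unversioned dependency is written as "master":

```rust
use std::collections::BTreeSet;

// Toy dependency graph: each crate lists exactly what it asks for.
fn deps(pkg: &str) -> Vec<&'static str> {
    match pkg {
        "app" => vec!["rust-rest 1.0", "rust-logger master"],
        "rust-rest 1.0" => vec!["rust-json 2.0"],
        "rust-logger master" => vec!["rust-json 1.0"],
        _ => vec![],
    }
}

// Collect everything reachable from the root: the full link set.
fn link_set(root: &str, out: &mut BTreeSet<&'static str>) {
    for d in deps(root) {
        if out.insert(d) {
            link_set(d, out);
        }
    }
}

fn main() {
    let mut set = BTreeSet::new();
    link_set("app", &mut set);
    // Both versions of rust-json end up linked; since symbols are hashed
    // with their version, the two copies don't clobber each other.
    assert!(set.contains("rust-json 1.0"));
    assert!(set.contains("rust-json 2.0"));
    println!("{:?}", set);
}
```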


To reiterate, it sounds like you're describing every package pinning its
dependencies to a specific version, which I'd consider an antipattern.

What is to prevent a program using this (still extremely handwavey)
algorithm from depending on rust-json 1.0, 1.1, 1.2, 1.3, 1.4, 2.0, 2.1,
and 2.2 simultaneously?

What if some of these are buggy, but the fixed versions aren't used due to
version pinning?

What if rust-json 1.0 has a security issue?

-- 
Tony Arcieri


Re: [rust-dev] Deprecating rustpkg

2014-01-31 Thread Tony Arcieri
On Fri, Jan 31, 2014 at 4:03 PM, Lee Braiden leebr...@gmail.com wrote:

 This would be counterproductive.  If a library cannot be upgraded to 1.9,
 or even 2.2, because some app REQUIRES 1.4, then that causes SERIOUS,
 SECURITY issues.


Yes, these are exactly the types of problems I want to help solve. Many
people on this thread are talking about pinning to specific versions of
libraries. This will prevent upgrades in the event of a security problem.

Good dependency resolvers work on constraints, not specific versions.

The ONLY realistic way I can see to solve this, is to have all higher
 version numbers of the same package be backwards compatible, and have
 incompatible packages be DIFFERENT packages, as I mentioned before.

 Really, there is a contract here: an API contract.


Are you familiar with semantic versioning?

http://semver.org/

Semantic Versioning would stipulate that a backwards incompatible change in
an API would necessitate a MAJOR version bump. This indicates a break in
the original contract.
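As a sketch of what constraint-based matching under SemVer could look like (the `Version` type and the caret-style rule here are illustrative, not any particular tool's API): minor and patch bumps within the required major version satisfy the constraint, while a MAJOR bump, the signal of a broken contract, does not.

```rust
// A hypothetical semver triple: (MAJOR, MINOR, PATCH).
#[derive(PartialEq, PartialOrd, Debug)]
struct Version(u32, u32, u32);

// Caret-style compatibility: same MAJOR, and at least the required
// MINOR/PATCH. A MAJOR bump indicates a break in the API contract.
fn caret_compatible(installed: &Version, required: &Version) -> bool {
    installed.0 == required.0
        && (installed.1, installed.2) >= (required.1, required.2)
}

fn main() {
    let required = Version(1, 2, 0);
    assert!(caret_compatible(&Version(1, 4, 7), &required));  // minor/patch bumps ok
    assert!(!caret_compatible(&Version(2, 0, 0), &required)); // MAJOR bump: new contract
    assert!(!caret_compatible(&Version(1, 1, 9), &required)); // older than required
    println!("semver checks passed");
}
```

A resolver built on a rule like this can pick up bug fixes and security fixes automatically, which a hard pin to one exact version cannot.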

Ideally, if people are using multiple major versions of the same package
and a security vulnerability is discovered that affects all versions of the
package, the package maintainers release a hotfix for all major
versions.

-- 
Tony Arcieri


Re: [rust-dev] Deprecating rustpkg

2014-01-31 Thread Vladimir Lushnikov
Looking briefly over the semantic versioning page, doesn't it just mandate
a particular version format for when you break API compatibility? This is
in theory the same thing I was talking about with slots, just encoded in
the version number. It can get ridiculous as well though - see for example
openssl's version 1.0.0l…






Re: [rust-dev] Deprecating rustpkg

2014-01-31 Thread Tony Arcieri
On Fri, Jan 31, 2014 at 4:15 PM, Vladimir Lushnikov 
vladi...@slate-project.org wrote:

 1) To determine API compatibility for you based on some input from the
 library author? (be this semantic versioning, slots, something else)


This. You should be able to lock to major and/or minor versions of a
particular package

-- 
Tony Arcieri


Re: [rust-dev] Deprecating rustpkg

2014-01-31 Thread Vladimir Lushnikov
Is this not exactly what was being discussed? Maven doesn't support this
(but pip does, for example `Baz>=2.0,<3.0`). Portage certainly supports it
as well.
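An interval specifier like pip's `Baz>=2.0,<3.0` reduces to a half-open range test over versions; a toy sketch (the tuple representation of versions is illustrative):

```rust
// Hypothetical interval constraint in the style of pip's "Baz>=2.0,<3.0":
// accept any version v with lo <= v < hi (half-open range).
fn in_range(v: (u32, u32), lo: (u32, u32), hi: (u32, u32)) -> bool {
    v >= lo && v < hi
}

fn main() {
    let (lo, hi) = ((2, 0), (3, 0));
    assert!(in_range((2, 5), lo, hi));  // inside the interval
    assert!(!in_range((3, 0), lo, hi)); // upper bound is exclusive
    assert!(!in_range((1, 9), lo, hi)); // below the lower bound
    println!("range checks passed");
}
```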





Re: [rust-dev] Deprecating rustpkg

2014-01-31 Thread Tony Arcieri
On Fri, Jan 31, 2014 at 4:23 PM, Vladimir Lushnikov 
vladi...@slate-project.org wrote:

 Is this not exactly what was being discussed?


It is! I'm still asking for a less handwavy explanation of how this would
work in a system which may-or-may-not select multiple versions of a package
depending on the given constraints, and concrete examples of how it would
work.


 Maven doesn't support this


Uhh, yes it does?

http://docs.codehaus.org/display/MAVEN/Dependency+Mediation+and+Conflict+Resolution#DependencyMediationandConflictResolution-DependencyVersionRanges

-- 
Tony Arcieri


Re: [rust-dev] Deprecating rustpkg

2014-01-30 Thread Val Markovic
I'll second Armin and Corey here; if lib A and B depend on different
versions of lib C, you *must* still be able to easily build and link them
together as deps of lib D. This is *critical* in large codebases where it's
not feasible to go into A and B and update which version of C they depend
on since A & B are possibly very big and have tons of other users who
aren't as willing as you are to switch to a new version of C.

And now imagine a (frighteningly common) nightmare scenario, where the
library version clash is not two steps away from you, but say 8 and 12. So
you link something that depends on something that depends on something ...
that 8 steps away depends on A (which depends on C v1), and a different
chain of deps that 12 steps away depends on B (which depends on C v2). Good
luck resolving that.

Even more fun scenario: both A and B depend on C v1, but B wants to update
to C v2 because of a new feature they need. And they can't, because it
would break everyone upstream of them who depended on A. And no, you can't
just go into A and update it to C v2 because maybe the API in C changed,
maybe the same functions return slightly different results (a bug was
fixed, but old code implicitly depends on the bug) and just maybe lib A is
millions of lines of code and updating it would take weeks (or longer) for
the team that maintains A in the first place, and God only knows how long
it would take you.

This is a huge problem in large C++ codebases. It is not fun. An example: every
version of Xerces-C++ puts its code in a new C++ namespace
(http://xerces.apache.org/xerces-c/program-others-3.html),
so code is in xerces_3_0, xerces_3_1, xerces_3_2, etc., to prevent these kinds
of issues.

Not being able to link different versions of a library together
completely breaks encapsulation. Just don't go there.
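The Xerces workaround, translated into Rust terms, might look like the following sketch: each major version lives under its own module name (stand-ins here for separately named crates), so code using v1 and code using v2 can be linked into the same program side by side.

```rust
// Sketch of versioned-namespace coexistence. The json_v1/json_v2 modules
// are hypothetical stand-ins for two major versions of one library.
mod json_v1 {
    pub fn encode(s: &str) -> String {
        format!("[v1]{}", s) // v1's wire format
    }
}

mod json_v2 {
    pub fn encode(s: &str) -> String {
        format!("[v2]{}", s) // v2's incompatible wire format
    }
}

fn main() {
    // Lib A keeps using v1 while lib B moves to v2; no conflict at link time,
    // because the two versions have distinct names.
    println!("{}", json_v1::encode("x"));
    println!("{}", json_v2::encode("x"));
}
```

The cost, as the Xerces example shows, is that every consumer must name the version explicitly, which is exactly the bookkeeping a package manager could do instead.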



On Thu, Jan 30, 2014 at 4:27 PM, Armin Ronacher armin.ronac...@active-4.com
 wrote:

 Hi,


 On 30/01/2014 23:08, Tony Arcieri wrote:

 What if the different versions of the library do incompatible things,
 like:

 - Talk incompatible versions of a network protocol
 - Serialize data differently
 - One contains important security fixes the other does not

 They are different libraries.  It's not an issue in practice because the
 situation where this happens is if you have a library depending on another
 library internally.

 Not having the option to run multiple versions of the same library in
 parallel in different parts is a major problem and Python suffers
 tremendously under this.


 Regards,
 Armin




Re: [rust-dev] Deprecating rustpkg

2014-01-28 Thread Gaetan
I also agree with the task force proposal; it's the right way to capitalize
on past failures and successes.

For me, rustpkg will not succeed if it doesn't have a proper centralized
repository. That's a debate: the current version explicitly specifies the
URL from which to download things. But things change, developers change, URLs
get broken, and a new developer forks a project or continues it in another
repository.

I have quite a lot of experience (I think) with dependency management, mostly
in C++ (Qt/Linux projects and embedded) and now Python (that's why I love
the simplicity and power of PyPI!), so I would be glad to be associated with
such a task force. For me there are two approaches: the "good enough for the
major use cases" approach, things like PyPI that do the job for most use
cases, and the exhaustive approach, where you end up with complicated but
extremely powerful do-it-all tools like Maven that eventually get dropped
because they are too complex to use.

This also ties into the build system thread, where rustpkg also came up
:) But I would push to keep them apart: the dependency management tool should
trigger a build system, not do everything itself.

-
Gaetan



2014-01-28 Huon Wilson dbau...@gmail.com

 On 28/01/14 19:36, György Andrasek wrote:

 I never quite understood the problem `rustpkg` was meant to solve. For
 building Rust code, `rustc --out-dir build` is good enough. For running
 tests and benchmarks, `rustc` is good enough. For downloading things, I
 still need to feed it a github address, which kinda takes away any value it
 could have over `git clone` or git submodules.


 rustpkg (theoretically) manages fetching and building dependencies (with
 the appropriate versions), as well as making sure those dependencies can be
 found (i.e. what the -L flag does for rustc).



 What I would actually need from a build system, i.e. finding {C,C++,Rust}
 libraries, building {C,C++,Rust} libraries/executables and linking them to
 said {C,C++,Rust} libraries, it doesn't do. It also doesn't bootstrap rustc.


 rustpkg is unfinished and has several bugs, so describing its current
 behaviour/usage as if it were its intended behaviour/usage is not correct.
 I believe it was designed to handle native (non-Rust) dependencies to some
 degree.


 Huon



  [Disclaimer: I've never quite got a rustpkg workflow going. It's probably
 awesome, but completely overshadowed by `rustc`.]

 On 01/28/2014 09:02 AM, Tim Chevalier wrote:

 On Mon, Jan 27, 2014 at 10:20 PM, Val Markovic v...@markovic.io wrote:


 On Jan 27, 2014 8:53 PM, Jeremy Ong jeremyc...@gmail.com wrote:


 I'm somewhat new to the Rust dev scene. Would anybody care to summarize
 roughly what the deficiencies are in the existing system in the
 interest of
 forward progress? It may help seed the discussion for the next effort
 as
 well.


 I'd like to second this request. I haven't used rustpkg myself but I've
 read
 its reference manual (
 https://github.com/mozilla/rust/blob/master/doc/rustpkg.md) and it
 sounds
 like a reasonable design. Again, since I haven't used it, I'm sure I'm
 missing some obvious flaws.


 Thirded. I implemented rustpkg as it's currently known, and did so in
 the open, detailing what I was thinking about in a series of
 exhaustively detailed blog posts. Since few people seemed very
 interested in providing feedback on it as I was developing it (with
 the exception of Graydon, who also worked on the initial design), I
 assumed that it was on the right track. I rewrote rustpkg because
 there was a perception that the initial design of rustpkg was not on
 the right track, nor was cargo, but obviously simply rewriting the
 whole system from scratch in the hopes that it would be better didn't
 work, since people are talking about throwing it out. So, before
 anybody embarks on a third rewrite in the hopes that *that* will be
 better, I suggest that a working group form to look at what went wrong
 in the past 2 or 3 attempts at implementing a build system / package
 system for Rust, so that those mistakes can be learned from. Perhaps
 all that needs to be done differently is that someone more central to
 the community needs to write it, but if that's what it takes, it seems
 preferable to the wasted time and effort that I imagine will ensue
 from yet another rewrite for the sake of throwing out code.

 Cheers,
 Tim




Re: [rust-dev] Deprecating rustpkg

2014-01-28 Thread Jeremy Ong
Rubygems is, in my opinion, one of the best examples of package managers
for a programming language out there. I don't use ruby currently but I
recall liking it much more than the competition, at least as of a few years
ago. In no particular order, it features:

- central repository
- optionally specify alternative urls on a per package basis
- hard and soft version requirements
- local bundles (so you can have multiple versions of the same package and
switch seamlessly from project to project)
- dependency calculation
- dependency groups (i.e. ability to group a number of deps in a test
group which can be switched on or off)
- local or systemwide install of libraries and executables
- easy to write a package and publish it (very easy)
- gem signing
- plugin system that allowed extensions to be written (an example plugin
was something that allowed a gem to be compiled with specific ruby versions
iirc)
- with the bundler gem, an easy way to examine, at a glance, all
dependencies of a given project and fetch them all (the Gemfile). This had
implications for deploy processes too (Gemfile.lock).

This isn't a comprehensive list but it evidently made an impression on me.
I'm sure there are tons of other things I'm missing. Other systems that
I've worked with I found critically lacking in isolating the dependencies
of one project from another or failing to resolve dependency trees
satisfactorily. Most importantly, the ease with which packages could be
downloaded and used immediately was immense.

As mentioned before, getting this right is crucial to establishing a
vibrant developer ecosystem (see rubygems and npm).


On Tue, Jan 28, 2014 at 1:33 AM, Lee Braiden leebr...@gmail.com wrote:

 On 28/01/14 08:36, György Andrasek wrote:

 I never quite understood the problem `rustpkg` was meant to solve. For
 building Rust code, `rustc --out-dir build` is good enough. For running
 tests and benchmarks, `rustc` is good enough. For downloading things, I
 still need to feed it a github address, which kinda takes away any value it
 could have over `git clone` or git submodules.

 What I would actually need from a build system, i.e. finding {C,C++,Rust}
 libraries, building {C,C++,Rust} libraries/executables and linking them to
 said {C,C++,Rust} libraries, it doesn't do. It also doesn't bootstrap rustc.


 I agree with this.  What I'd want is much more like apt (add repositories,
 update lists of available packages from those repositories, manage
 priorities between repositories, say that one repository should be
 preferred over another for a particular package, working in specific
 prefixes (/usr/local, /usr, /, ~/Projects/something-requiring-old-libs)),
 but rust-specific and platform independent.


 --
 Lee





Re: [rust-dev] Deprecating rustpkg

2014-01-28 Thread György Andrasek

On 01/28/2014 10:33 AM, Lee Braiden wrote:

I agree with this.  What I'd want is much more like apt (add
repositories, update lists of available packages from those
repositories, manage priorities between repositories, say that one
repository should be preferred over another for a particular package,
working in specific prefixes (/usr/local, /usr, /,
~/Projects/something-requiring-old-libs)), but rust-specific and platform
independent.


What you actually want is Paludis[0], which installs from source. I'd 
propose it as a standard Rust package manager, but it does have some 
serious flaws for that purpose:


- Designed as the package manager for a full Linux distro, so it wants 
to handle everything by itself. If you give it a build dependency on 
gcc, it'll want to maintain the entire toolchain.

- No Windows support.
- Hard dependency on `bash`.

That said, it has solved a serious number of PM problems we should learn 
from:


- Completely build system agnostic. All tooling is done in bash 
libraries called `exheres`, with an infinite number of customization 
hooks from patching to post-install. Can build everything from glibc to 
xmonad.
- Separate set of build and run dependencies, with configurable install 
root: you can (probably) bootstrap an embedded Linux with it.
- Metadata lives in a number of git repos you can cherry-pick. Creating 
your own is *easy*.

- Some support for external package sources, like Hackage.
- Fast[1] dependency handling including cycles and both global and local 
keywords: `doc` pulls in the doc build tools, `bash_completion` installs 
relevant extra files, `texture-float` enables patented code in mesa.

- Installing from source control.
- All user-facing configuration is done in /etc/paludis, where you can 
apply keywords, CFLAGS, mirrors, scm options, mask/unmask packages etc. 
on a per-package per-version basis with globs and stuff.



tl;dr If you want to design a package manager, bootstrap an Exherbo[2] 
first.





[0]: http://paludis.exherbo.org/overview/features.html
[1]:
```
# time cave resolve world --everything
Done: 2501 steps
snip
Executing pretend actions: 265 of 265

 * You have 6 unread news items (use 'eclectic news' to read)

real    0m16.108s
user    0m12.706s
sys     0m1.643s
```
[2]: http://exherbo.org/


Re: [rust-dev] Deprecating rustpkg

2014-01-28 Thread Matthew Frazier

On 01/28/2014 04:33 AM, Lee Braiden wrote:

On 28/01/14 08:36, György Andrasek wrote:

I never quite understood the problem `rustpkg` was meant to solve. For
building Rust code, `rustc --out-dir build` is good enough. For
running tests and benchmarks, `rustc` is good enough. For downloading
things, I still need to feed it a github address, which kinda takes
away any value it could have over `git clone` or git submodules.

What I would actually need from a build system, i.e. finding
{C,C++,Rust} libraries, building {C,C++,Rust} libraries/executables
and linking them to said {C,C++,Rust} libraries, it doesn't do. It
also doesn't bootstrap rustc.


I agree with this.  What I'd want is much more like apt (add
repositories, update lists of available packages from those
repositories, manage priorities between repositories, say that one
repository should be preferred over another for a particular package,
working in specific prefixes (/usr/local, /usr, /,
~/Projects/something-requiring-old-libs)), but rust-specific and platform
independent.


Have you ever used Composer (https://getcomposer.org/)? I know that 
Jordi Boggiano, one of the authors, has been involved with the Rust 
community in the past. Some Composer features that I think are critical 
for the new Rust package manager include:


- Tags and branches are automatically recognized from git to create 
versions.


- Version specifiers aren't just limited to "this version exactly", but 
allow you to match on a range of versions (though Semantic Versioning is 
encouraged).


- Packages have vendor prefixes (like rust/flate or 
leafstorm/mycoollibrary) to help avoid name conflicts and allow for 
forking, but these aren't linked to the way packages are retrieved.


- There's a central repository, but it's really easy to add random git 
repositories to the composer.json, or to create your own repositories 
for internal use.


- It automatically generates a lock file that allows you to reinstall 
exactly the same versions of the dependencies across machines.


- Everything is installed per-project, so no conflicts across projects. 
(Though the new rustpkg may want to not do this exactly because of 
compile times.)
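The lock-file idea in the list above reduces to recording the exact resolved versions and replaying them on another machine; a toy round-trip sketch (the one-line-per-package format and the function names are made up for illustration, not Composer's actual format):

```rust
use std::collections::BTreeMap;

// Serialize the resolved set as "name version" lines, the role a lock
// file plays: freeze exactly what was installed.
fn write_lock(resolved: &BTreeMap<String, String>) -> String {
    resolved
        .iter()
        .map(|(name, ver)| format!("{} {}\n", name, ver))
        .collect()
}

// Parse the lock text back into the same name -> version map.
fn read_lock(text: &str) -> BTreeMap<String, String> {
    text.lines()
        .filter_map(|line| {
            let mut parts = line.splitn(2, ' ');
            Some((parts.next()?.to_string(), parts.next()?.to_string()))
        })
        .collect()
}

fn main() {
    let mut resolved = BTreeMap::new();
    resolved.insert("rust-json".to_string(), "2.0.1".to_string());
    resolved.insert("rust-rest".to_string(), "1.4.0".to_string());

    let lock = write_lock(&resolved);
    // Replaying the lock yields the identical set on a second machine.
    assert_eq!(read_lock(&lock), resolved);
    print!("{}", lock);
}
```

The key property is that resolution (against ranges) happens once, while installation (against the lock) is exact and reproducible everywhere.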


If I had more time and more Rust experience, I would be interested in 
implementing a Composer-like package manager for Rust. Unfortunately I 
have little of both. :-(

--
Thanks,
Matthew Frazier
http://leafstorm.us


Re: [rust-dev] Deprecating rustpkg

2014-01-28 Thread SiegeLord

On 01/27/2014 11:53 PM, Jeremy Ong wrote:

I'm somewhat new to the Rust dev scene. Would anybody care to summarize
roughly what the deficiencies are in the existing system in the interest
of forward progress? It may help seed the discussion for the next effort
as well.


I can only speak for myself, but here are some reasons why I abandoned 
rustpkg and switched to CMake.


Firstly, and overarchingly, it was the attitude of the project 
development with respect to issues. As a comparison, let me consider 
Rust the language. It is a pain to make my code pass the borrow check 
sometimes, the lifetimes are perhaps the most frustrating aspect of 
Rust. I put up with them however, because they solve a gigantic problem 
and are the keystone of Rust's safety-without-GC story. rustpkg also has 
many incredibly frustrating aspects, but they are there (in my opinion) 
arbitrarily and not as a solution to any real problem. When I hit them, 
I do not get the same sense of purposeful sacrifice I get with Rust's 
difficult points. Let me outline the specific issues I personally hit (I 
know of other ones, but I haven't encountered them personally).


Conflation of package id and source. That fact, combined with the fact 
that to depend on some external package you have to write `extern mod foo = 
"pkgid"`, meant that you needed to create bizarre directory structures to 
depend on locally developed packages (e.g. you'd have to put your 
locally developed project in a directory tree like so: 
github.com/SiegeLord/Project). This is not something I was going to do.


The package dependencies are written in the source file, which makes it 
onerous to switch between versions/forks. A simple package script would 
have solved it, but it wasn't present by design.


My repositories have multiple crates, and rustpkg is woefully 
under-equipped to handle that case. You cannot build them without 
dealing with pkg.rs, and using them from other projects seemed 
impossible too (the extern mod syntax wasn't equipped to handle multiple 
crates per package). This is particularly vexing when you have multiple 
example programs alongside your library. I was not going to split my 
repository up just because rustpkg wasn't designed to handle that case.


All of those points would be solved by having an explicit package 
description file/script which was THE overarching design non-goal of 
rustpkg. After that was made clear to me, I just ditched it and went to 
C++ style package management and a CMake build system.


-SL


Re: [rust-dev] Deprecating rustpkg

2014-01-28 Thread Ian Daniher
Lots of good points in this thread, but I wanted to request deprecation,
but not removal until a better alternative is documented and made
available. Rustpkg works for my needs - I use it every day -  but it
definitely needs some TLC.

Thanks!
--
Ian





Re: [rust-dev] Deprecating rustpkg

2014-01-28 Thread Kevin Ballard
Keeping it around means maintaining it, and it means tempting people to use it 
even though it's deprecated.

My suggestion would be, if you really need rustpkg, then extract it into a 
separate repo and maintain it there. But get it out of the mozilla/rust tree.

-Kevin




[rust-dev] Deprecating rustpkg

2014-01-27 Thread Brian Anderson

Hey again, Rusticians.

So I think most of us know that rustpkg isn't quite working the way 
people expect, and the general consensus seems to be that its flaws 
extend pretty deep, to the point where it may just not be exposing the 
right model. I'd like to deprecate it immediately to end the 
frustrations people continue to encounter with it, while we figure out 
what to do with it.


Having a good packaging story is critical for Rust's adoption, so I want 
to keep pushing on this. I am looking into hiring a domain expert to 
help us.


Regards,
Brian




Re: [rust-dev] Deprecating rustpkg

2014-01-27 Thread Steve Klabnik
Vote of strong support here. I removed the rustpkg chapter from Rust
for Rubyists for a reason. :/


Re: [rust-dev] Deprecating rustpkg

2014-01-27 Thread Jeremy Ong
I'm somewhat new to the Rust dev scene. Would anybody care to summarize
roughly what the deficiencies are in the existing system in the interest of
forward progress? It may help seed the discussion for the next effort as
well.


On Mon, Jan 27, 2014 at 6:05 PM, Steve Klabnik st...@steveklabnik.com wrote:

 Vote of strong support here. I removed the rustpkg chapter from Rust
 for Rubyists for a reason. :/



Re: [rust-dev] Deprecating rustpkg

2014-01-27 Thread Val Markovic
On Jan 27, 2014 8:53 PM, Jeremy Ong jeremyc...@gmail.com wrote:

 I'm somewhat new to the Rust dev scene. Would anybody care to summarize
roughly what the deficiencies are in the existing system in the interest of
forward progress? It may help seed the discussion for the next effort as
well.

I'd like to second this request. I haven't used rustpkg myself but I've
read its reference manual (
https://github.com/mozilla/rust/blob/master/doc/rustpkg.md) and it sounds
like a reasonable design. Again, since I haven't used it, I'm sure I'm
missing some obvious flaws.



