Re: the right to rewrite history to rectify the past (was Re: Concerns/questions around Software Heritage Archive)

2024-03-21 Thread Hartmut Goebel

Am 21.03.24 um 07:12 schrieb MSavoritias:
Specifically the social rules that we support trans people and want to 
include them. Any person, really, who wants to change their name at 
some point for some reason. 


Interestingly, you are asking for the right to have the old name rewritten 
for trans people only.


To be frank: IMHO this is quite an egocentric point of view.

In many cultures all over the world, women are required to change their 
name when they marry. Yet you are not asking for women's rights, but only 
for the rights of the small but loud minority of trans people.


--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: cmake-build-system: build tests only if #:tests? is true?

2024-03-18 Thread Hartmut Goebel

Am 17.03.24 um 18:35 schrieb Ludovic Courtès:

https://issues.guix.gnu.org/issue/69554

Also, isn’t ‘BUILD_TESTING’ a convention rather than a flag CMake always
honors?


Yes, it's more like a convention, as the patch says: "Anyhow, the 
CMakeLists.txt needs to implement handling this flag."


Do you want the text in the documentation to be more verbose on this?

--
Regards
Hartmut Goebel

| Hartmut Goebel  |h.goe...@crazy-compilers.com|
|www.crazy-compilers.com  | compilers which you thought are impossible |


G-exps: thunk instead of top-level references?

2024-03-06 Thread Hartmut Goebel

  
  
Hi Ludo,
I'd like to get some advice:

In commit 84c3aafb5a18ad278bbb36df7b70849fe05789c8 "trytond:
  Avoid top-level references to other modules" you turned a
  top-level variable definition into a thunk:



  -(define %standard-trytond-native-inputs
  +(define (%standard-trytond-native-inputs)
      `(("python-dateutil" ,python-dateutil)

and the users:

     (native-inputs
  -    `(,@%standard-trytond-native-inputs
  +    `(,@(%standard-trytond-native-inputs)
         ("trytond-purchase" ,trytond-purchase)))
  



I'm about to change the uses into G-exps (see below) and wonder
  whether "%standard-trytond-native-inputs" should still be a thunk
  or can be turned back into a top-level variable.

  -(define (%standard-trytond-native-inputs)
  -  `(("python-dateutil" ,python-dateutil)
  +(define %standard-trytond-native-inputs
  +  (list python-dateutil

  and uses


  -    (native-inputs (%standard-trytond-native-inputs))
  +    (native-inputs
  +     (cons* trytond-account-payment-clearing
  +            %standard-trytond-native-inputs))
-- 
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |

  




Re: Contribute or create a channel?

2024-03-05 Thread Hartmut Goebel

Am 04.03.24 um 09:32 schrieb Andreas Enge:

I think that would be okay if you think it will be easier to maintain
(not needing to "roll over" code from an old package at the inheritance
root when it is deleted), assuming that the old packages are removed
from time to time.


This sounds like maintaining multiple LTS versions is desired. Anyhow 
Ricardo wrote:



I think it would be preferable to have one LTS version in Guix.


Thus the discussion about maintaining several versions is only relevant 
if this is what Guix wants.


In both cases I need some tooling to fetch the current bug-fix version 
of the series in question. This cannot be done using "guix refresh" 
alone, AFAIU, as it would use the next release series if that is already 
released while our packages are not yet updated. Thus maintaining two 
LTS versions should not be too much work (except if many dependencies 
change incompatibly).


WDYT?

--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: cmake-build-system: build tests only if #:tests? is true?

2024-03-04 Thread Hartmut Goebel

Hi Greg,

I will submit a patchset shortly for review and if accepted we can
look to combine all the cmake patches for the large rebuild.


Yes, of course we should combine them. May I ask you to take care of it 
and include mine, which I just posted:


https://issues.guix.gnu.org/issue/69554

Many thanks in advance.

--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: cmake-build-system: build tests only if #:tests? is true?

2024-03-02 Thread Hartmut Goebel

Am 02.03.24 um 22:23 schrieb Ekaitz Zarraga:

core-updates?


Yes, many, many packages. Thus of course, this would need to go into 
core-updates.


--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




language-specific packages

2024-03-02 Thread Hartmut Goebel

Hi Andrew,

a few days ago you planned (but cancelled) to stream about managing 
Elixir projects with Guix and why it's better not to package 
language-specific packages with Guix.


I still have ejabberd in my pipeline, which would add quite some Erlang 
and Elixir packages to Guix. Thus I would be eager to learn your ideas 
prior to pushing these packages to Guix. May I ask you to share your 
thoughts?! Thanks.


--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




cmake-build-system: build tests only if #:tests? is true?

2024-03-02 Thread Hartmut Goebel

Hi,

I found an old and unfinished patch in my pile. It optimizes building 
with cmake by not building the tests if "#:tests?" is false. (Basically, 
it passes "-DBUILD_TESTING=OFF" or "=ON" depending on "#:tests?".)
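
To make the idea concrete, here is a rough sketch (not the actual patch) of 
a helper that could be used when assembling the CMake command line in the 
build system's 'configure' phase; the name build-testing-flag and the exact 
integration point are my own illustration:

    (use-modules (guix build utils))   ; for 'invoke'

    (define (build-testing-flag tests?)
      ;; Only effective if the project's CMakeLists.txt honors BUILD_TESTING.
      (if tests? "-DBUILD_TESTING=ON" "-DBUILD_TESTING=OFF"))

    ;; e.g. inside the 'configure' phase:
    ;;   (apply invoke "cmake" (build-testing-flag tests?) configure-flags)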


Is this of interest? Then I would take the time to finish the patch.

--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: Contribute or create a channel?

2024-03-02 Thread Hartmut Goebel

Am 01.03.24 um 19:16 schrieb Saku Laesvuori:

It just requires a different updating method. The different versions can
just be defined as separate packages (see postgresql for an example) and
the user then defines which one they want to use. They can either refer
to the package variable directly in scheme (e.g. postgresql-15) or on
the command line with the name@version syntax (e.g. postgresql@15).


I'm aware of this way. It's feasible when there is a small number of 
packages, like for postgres. But it is already very laborious for rust 
packages. I would not want to do the same for more than 200 packages.


--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: Contribute or create a channel?

2024-03-02 Thread Hartmut Goebel

Hi Ricardo,

I think it would be preferable to have one LTS version in Guix.

Okay, that's a clear statement.


whether to grant you commit rights to handle these upgrades
by yourself.


Well, I already have commit rights :-) Anyhow, the review period is a bit 
of a hindrance, esp. for bug-fixes.




OTOH in Guix, maintaining several version seems laborious.

What makes you say this?  What exactly is the obstacle here?


The only way I know for handling several versions of the same package is 
what Saku Laesvuori described: adding a new variable for each of these 
versions, possibly making the older version inherit from the new one. 
This means moving quite some code on every new version. I have done this 
for quite some rust packages and it is quite laborious and hard to automate. 
And for Tryton we have about 200 packages and growing.


When using branches (in a channel) one would just check out that branch, 
automatically apply the patches and commit. If the version is no longer 
supported, simply stop applying updates on that branch.


Maybe using one file per release (and accepting duplicate code) would be a 
suitable workaround.


--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: A friendlier API for operating-system declarations

2024-03-01 Thread Hartmut Goebel

Hi,

Am 19.02.24 um 23:25 schrieb antlers:

(define (os-with-yubi parent users*)
   (modify-record parent
 (groups -> (cons (user-group (name "plugdev")) <>))
 (users  -> (map (lambda (user)
   (if (member (user-account-name user)
   users*)
   (modify-record user
 (supplementary-groups -> (cons "plugdev" <>)))
   user))
 <>))
 (services => (append <> (list
   (service pcscd-service-type)
   (simple-service 'u2f-udev-rules udev-service-type
   (list (specification->package "libu2f-host")))
   (simple-service 'yubi-udev-rules udev-service-type
   (list (specification->package 
"yubikey-personalization"


I'd appreciate such a mechanism, as it fits my mental model of how to 
compose a system out of reusable components (much like "roles" in 
Ansible resp. Debop).


--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Contribute or create a channel?

2024-03-01 Thread Hartmut Goebel

Hi,

I'm currently updating Tryton to version 7.0 and am wondering whether 
it's better to contribute the change to Guix or to set up a channel for 
Tryton.


WDYT? I'm eager to learn about your thoughts.

Here is why I'm wondering:

 * Tryton consists of a client, a server and about 200 modules/add-ons
   providing business logic.

 * Tryton publishes an LTS version every 2.5 years. Two LTS versions are
   supported (currently 6.0 and 7.0) and bugfixes are backported there
   for 5 years.

 * Every 6 months a new release is crafted (x.2, x.4, x.6, x.8) which
   will get bugfixes for 1 year. Releases typically provide new modules
   (which is why updating is of interest), might change inputs and
   might require database updates.

 * Bugfixes happen rather often and per module, since they are
   published even for smaller fixes. Upstream promises they do not
   contain functional changes or change requirements. Each bugfix could
   be implemented as a graft, since requirements do not change.

Given this, it might be interesting to have three versions of Tryton 
available: the two LTS versions and the latest version.


Now the idea is to provide a channel which provides a branch for each 
LTS version and a "main" branch for the latest release. This would allow 
checking out the respective branch and refreshing the packages of the 
respective version semi-automatically.


OTOH in Guix, maintaining several versions seems laborious.

Anyhow I'm unsure whether it's worth the effort maintaining three 
versions and whether I'll be able to keep three versions up to date - 
esp. given that I don't have much automation for this.


Some more background-info:

 * Within each version, there is a guarantee that the database schema
   will not be changed. Anyhow, between versions the db schema might
   change, requiring manual migration steps.
 * Debian as of now provides packages for 6.0 only (7.0 was released )


--
Regards
Hartmut Goebel

| Hartmut Goebel  |h.goe...@crazy-compilers.com|
|www.crazy-compilers.com  | compilers which you thought are impossible |


Re: Using gitlab-ci to maintain a channel?

2024-03-01 Thread Hartmut Goebel

Hi Ludo,

thanks for the answer. Looks like I need to go with Cuirass. But more 
probably I'll abandon the idea for now since it's a spare-time project only.



--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: Simple design question for schemers

2024-03-01 Thread Hartmut Goebel

Hi both of you,

Am 25.02.24 um 11:05 schrieb Ricardo Wurmus:

We have a macro called MODIFY-INPUTS, which you could use, but CONS* is
probably enough in your case.


Thanks. I'm using cons* now.

cons* is basically the same as the "extend" I'm used to from Python - sadly 
the Guile manual is so hard to understand that I did not get that.


--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Simple design question for schemers

2024-02-24 Thread Hartmut Goebel

  
  
Hi,
I'm about to refactor the Tryton packages to the (not so) new style. Now
  the trytond-xxx modules all share a basic list of native inputs, like this:

    (native-inputs
     `(,@(%standard-trytond-native-inputs)
       ("trytond-account-fr" ,trytond-account-fr)
       ("trytond-edocument-uncefact" ,trytond-edocument-uncefact)))

Now I wonder what would be the most Scheme-like way of doing this in the
  new style?
Using "apply list":

    (native-inputs (apply list
    trytond-account-fr
    trytond-edocument-uncefact
    %standard-trytond-native-inputs ))
Using "append":

    (native-inputs (append %standard-trytond-native-inputs
     (list trytond-account-fr
   trytond-edocument-uncefact)))

Using a custom function "extend":

    (native-inputs
   (extend %standard-trytond-native-inputs
   trytond-account-invoice
   trytond-purchase
   trytond-sale))
Using a custom function "@trytond-module-native-inputs":

    (native-inputs (@trytond-module-native-inputs
      trytond-account-invoice
      trytond-purchase
      trytond-sale))

    Opinions?

-- 
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |

  




Using gitlab-ci to maintain a channel?

2024-02-16 Thread Hartmut Goebel

  
  
Hi,
I wonder whether it's possible to maintain a channel using
  gitlab-ci. Any thoughts or experiences to share?

My idea is to schedule jobs which refresh the packages,
  build/test them and provide substitutes.

The channel is meant for Tryton (tryton.org), which consists of
  about 200 plug-in packages; bug-fixes are published for the LTS
  version every now and then. The channel should follow these
  bug-fix releases. Tryton is pure Python; anyhow, some dependencies
  require C, C++ and even Rust for building. Updating dependencies
  should be avoided if substitutes are not available for all dependencies
  (or maybe specific dependencies should be deny-listed).

The points I'm wondering about are:

 * What version of guix shall be used? Always the latest one?

 * The runners need a docker image. Where do I get one? Possibly
   containing a warmed-up cache? (Using a Debian docker image and
   installing guix into it on every run sounds like a bad idea.)

 * OTOH /gnu/store could be cached. How much data would this
   typically be?

 * How to clean the cache from unreachable items?

 * How to publish the substitutes?

Why gitlab-ci? Well, the channel will live on a gitlab instance,
  thus using that infrastructure would simplify things and avoid
  single users managing processes.

-- 
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |

  




Re: Guix Days: Patch flow discussion

2024-02-05 Thread Hartmut Goebel

Am 05.02.24 um 19:44 schrieb Suhail:

Could you please share a reference where the key difficulties you
encountered wrt the "current mail-based workflow" are summarized.  Is
the difficulty regd. checking out the code at the right commit and
installing the patches, or something else?


It's not only installing and testing the patches, but also:

 * when has this issue/patch been worked on last - is somebody
   currently working on it?
 * which issues/patches have I started to review?
 * commenting on code requires downloading the patch - stripping out
   parts which are okay, commenting, then mailing the commented code to
   the correct issue number
 * Even when using the debbugs interface in emacs:
 o It is hard to use for occasional users.
 o It is an insurmountable obstacle for those not using emacs.
 o It does not tell me what issues/patches I've been working on
   already - and am waiting for a reply.
 o It does not tell me which issues are stale.

Anyhow, all of this has been discussed several times already. And as 
long as vocal (and active :-) members of the community insist on being 
able to work via e-mail — while also not adopting modern e-mail-capable 
forges — this situation will not change.


Regards
Hartmut Goebel

| Hartmut Goebel  |h.goe...@crazy-compilers.com|
|www.crazy-compilers.com  | compilers which you thought are impossible |


Re: Upgrading Guix's security team

2024-02-05 Thread Hartmut Goebel

Am 16.11.23 um 15:22 schrieb Ludovic Courtès:

We could distinguish security issues in packages provided by Guix from
security issues in Guix itself.


Maybe it's also a good idea to add a security.txt to the website?

https://en.wikipedia.org/wiki/Security.txt "is meant to allow security 
researchers to easily report security vulnerabilities".


Respective RFC: https://datatracker.ietf.org/doc/html/rfc9116
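
For illustration, a minimal security.txt per that RFC only needs a Contact 
and an Expires field and would be served from /.well-known/security.txt; 
the values below are placeholders, not actual Guix data:

    Contact: mailto:security@example.org
    Expires: 2026-01-01T00:00:00.000Z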

--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: Guix Days: Patch flow discussion

2024-02-05 Thread Hartmut Goebel

Am 05.02.24 um 10:39 schrieb Steve George:
Hinders 


This list is missing one point - which has been discussed several times 
already without any result:


The current mail-based workflow is too complicated for new and 
occasional committers. This is the main reason I gave up reviewing patches.


--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: SSSD, Kerberized NFSv4 and Bacula

2023-08-29 Thread Hartmut Goebel

Hi,

Am 24.08.23 um 21:55 schrieb Martin Baulig:


 1. My "guix secrets" tool provides a command-line interface to
maintain a "secrets database" (/etc/guix/secrets.db) that's only
accessible to root.  It can contain simple passwords, arbitrary
text (like for instance X509 certificates in PEM format) and
binary data.

 2. …

 3. Finally, "secrets-service-type" depends on all of the above to do
its work.

It takes a /template file/ - which is typically interned in the
store - containing special "tokens" that tell it which keys to
look up from the /secrets database/.


This sounds great and like a major step towards "guixops" [1], [2].

[1] https://lists.gnu.org/archive/html/guix-devel/2019-07/msg00435.html
[2] https://lists.gnu.org/archive/html/guix-devel/2017-09/msg00196.html


--
Regards
Hartmut Goebel

| Hartmut Goebel  |h.goe...@crazy-compilers.com|
|www.crazy-compilers.com  | compilers which you thought are impossible |


Re: Registering an artifact as root

2023-08-09 Thread Hartmut Goebel

Am 23.07.23 um 14:38 schrieb Liliana Marie Prikler:
Hope that helps. 


Thanks, I was able to make this work:

guile -L .. -c '(use-modules (guix store)) (add-indirect-root 
(open-connection) "/abs/path/to/my/artifact")'
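
As a small self-contained variant (my own sketch, not from the thread), the 
same can be done from a Guile script, using with-store from (guix store) to 
manage the daemon connection:

    (use-modules (guix store))

    ;; Register an indirect GC root for an absolute path outside the store.
    (with-store store
      (add-indirect-root store "/abs/path/to/my/artifact"))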


--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: Reusing code for a build-phase?

2023-08-06 Thread Hartmut Goebel

Am 06.08.23 um 14:49 schrieb Csepp:

Maybe you could create a build system that inherits from the one these
packages use, but adds the extra phase.


I was thinking about this, too. But this seems to be too much, as there 
are not hundreds of Vagrant plugins. Also, a build system requires 
documentation, which is another pile of work.


--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: poetry: python-poetry?

2023-08-05 Thread Hartmut Goebel

Am 31.07.23 um 04:05 schrieb Hilton Chain:

I think we can define library and CLI program separately, since Python
libraries usually need to propagate some inputs, while CLI programs in
/bin and /sbin do not, as they are wrapped by the build system.


I like the idea of "hiding" dependencies behind the script and not having 
them pollute the library path. And indeed, I just thought about something 
like this when packaging vagrant (which is a Ruby program).


If we implement such a thing, IMHO it should become a wrapper function 
doing all the magic, so the program would be defined as simply as

    (define-public xxx
      (python-scripts-from-package
       python-xxx
       "xxx" "xxx3" "xxx-admin"))   ; selecting scripts might be useful/necessary


And of course we should start providing such a thing for other 
languages, too.


--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Reusing code for a build-phase?

2023-08-05 Thread Hartmut Goebel

  
  
Hi,
I'm currently packaging vagrant and some plugins. For all plugins
  an additional phase is required, generating a JSON file, see
  below. Since this is quite some code, I'd like to put it into some
  definition or some wrapper. Since it uses (this-package-version),
  my attempts failed.

At best, a plugin would then be defined like
  (define-public vagrant-abc (vagrant-plugin-wrapper (package …
Anyhow, I'd be happy with being able to use some function in
  the phase instead of duplicating the code.
Any ideas? Many thanks in advance

    (arguments
     (list
      #:tests? #f   ; tests involve running vagrant and downloading a box
      #:phases
      #~(modify-phases %standard-phases
          (add-after 'install 'install-plugin.json
            (lambda _
              (let* ((plugins.d (string-append
                                 #$output
                                 "/share/vagrant-plugins/plugins.d"))
                     (plugin.json (string-append
                                   plugins.d "/" #$name ".json")))
                (mkdir-p plugins.d)
                #$(with-extensions (list guile-json-4)
                    #~(begin
                        (use-modules (json))
                        (call-with-output-file plugin.json
                          (lambda (port)
                            (scm->json
                             '((#$name
                                . (("ruby_version"
                                    . #$(package-version
                                         (this-package-input "ruby")))
                                   ("vagrant_version"
                                    . #$(package-version
                                         (this-package-input "vagrant")))
                                   ("gem_version" . "")
                                   ("require" . "")
                                   ("sources" . #()))))
                             port)))))))))))

-- 
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |

  




Re: Transformations Shell Syntax

2023-07-23 Thread Hartmut Goebel

Am 03.07.23 um 02:01 schrieb jgart:

Starting multiple workers:

$ herd start microblog-tasks@{1..4}
$ herd status microblog-tasks@{1..4}


Please note that this syntax is expanded by the shell! Thus these 
commands are the same as


$ herd start microblog-tasks@1 microblog-tasks@2 microblog-tasks@3 
microblog-tasks@4
$ herd status microblog-tasks@1 microblog-tasks@2 microblog-tasks@3 
microblog-tasks@4


--
Regards
Hartmut Goebel

| Hartmut Goebel  |h.goe...@crazy-compilers.com|
|www.crazy-compilers.com  | compilers which you thought are impossible |


Registering an artifact as root

2023-07-23 Thread Hartmut Goebel

  
  
Hi,
I'd like to create a symlink to a store object somewhere in my
  home directory and register this symlink as a root, to prevent the
  garbage collector from removing the store object.
How can I achieve this with existing guix means?

Background: Vagrant has the means of distributing virtual
  machines as "boxes", tar.gz files which are downloaded and then
  extracted into some cache directory. When Vagrant creates a
  virtual machine based on that box, it copies the disk image to
  some working directory. The cache content is (almost) trivial,
  thus I wrote a small script creating the required files in the
  cache directory and symlinking the disk image there. Of course,
  the garbage collector should not remove the disk image as long as
  it is used (= linked from the box cache).

-- 
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |

  




Re: License of “10 years of stories behind Guix”

2023-02-11 Thread Hartmut Goebel

Dear lawyers of the world,

I agree for my contribution to the blog post

https://guix.gnu.org/en/blog/2022/10-years-of-stories-behind-guix/

to be published under CC-BY-SA 4.0 and GFDL version 1.3 or later (or
whatever other open license the maintainers of the blog might prefer in
the future).


--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: All updaters are broken

2023-01-03 Thread Hartmut Goebel

Am 03.01.23 um 10:49 schrieb Ludovic Courtès:

I tried something different and perhaps simpler: making sure
‘options->update-specs’ always returns a list of <update-spec>, as the
name implies, and does the right thing with manifests, -r, and -e.
(Part of the patch moves the <update-spec> definition before its first
use.)

WDYT?


I'm biased, as this looks much like my last proposal :-) plus some good 
simplifications which I would never have been capable of. Esp. getting rid 
of the (package) function is good, as it made the control flow obscure.


FMPOV please go ahead and merge.

--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: All updaters are broken

2023-01-02 Thread Hartmut Goebel

Am 02.01.23 um 20:17 schrieb Ricardo Wurmus:

Thanks for providing the patch. For me this looks huge and hard to
maintain.

“Hard to maintain”?  How so?


For me this double structure is hard to understand and thus hard to 
maintain. YMMV.


Anyhow, if you want me to implement a solution based on your code, I'll 
do so. You are far more experienced in Scheme than I am.


--
Regards
Hartmut Goebel

| Hartmut Goebel  |h.goe...@crazy-compilers.com|
|www.crazy-compilers.com  | compilers which you thought are impossible |


Re: All updaters are broken

2023-01-02 Thread Hartmut Goebel

Hello Ricardo,

Am 02.01.23 um 14:16 schrieb Ricardo Wurmus:

Attached is a crude implementation of that.  I just consed the lists
together instead of returning multiple values, because the compound
value is to be used inside the store monad where we can’t easily access
multiple values.


Thanks for providing the patch. For me this looks huge and hard to 
maintain. I'd rather make "options->update-specs" return update-specs in 
any case. This adds a small overhead only in the case of --recursive.


Enclosed please find my proposal. WDYT?

Tested cases

./pre-inst-env guix refresh --list-updaters
./pre-inst-env guix refresh -u python-flask
./pre-inst-env guix refresh -u python-flask=2.2.1
./pre-inst-env guix refresh python-flask
./pre-inst-env guix refresh python-flask=2.2.1
./pre-inst-env guix refresh --list-transitive python-flask
./pre-inst-env guix refresh --list-dependent python-flask
./pre-inst-env guix refresh -l python-flask

./pre-inst-env guix refresh -t hexpm -u
./pre-inst-env guix refresh -t hexpm
./pre-inst-env guix refresh -t hexpm erlang-relx -u
./pre-inst-env guix refresh -t hexpm erlang-relx

./pre-inst-env guix refresh -e '(@ (gnu packages erlang) erlang-relx)'
./pre-inst-env guix refresh -m test-manifest.scm

./pre-inst-env guix refresh --recursive python-flask
./pre-inst-env guix refresh --select=core

--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |
diff --git a/guix/scripts/refresh.scm b/guix/scripts/refresh.scm
index e0b94ce48d..f9c4a4c87c 100644
--- a/guix/scripts/refresh.scm
+++ b/guix/scripts/refresh.scm
@@ -184,8 +184,8 @@ specified with `--select'.\n"))
   (show-bug-report-information))
 
 (define (options->update-specs opts)
-  "Return the list of packages requested by OPTS, honoring options like
-'--recursive'."
+  "Return the list of  records requested by OPTS, honoring
+options like '--recursive'."
   (define core-package?
 (let* ((input->package (match-lambda
  ((name (? package? package) _ ...) package)
@@ -220,15 +220,15 @@ update would trigger a complete rebuild."
 (_
  (cons package lst)
 
-  (define args-packages
-;; Packages explicitly passed as command-line arguments.
+  (define args-packages->update-specs
+;; update-specs for packages explicitly passed as command-line arguments.
 (match (filter-map (match-lambda
  (('argument . spec)
   ;; Take either the specified version or the
   ;; latest one.
   (update-specification->update-spec spec))
  (('expression . exp)
-  (read/eval-package-expression exp))
+  (package->update-spec (read/eval-package-expression exp)))
  (_ #f))
opts)
   (() ;default to all packages
@@ -236,25 +236,29 @@ update would trigger a complete rebuild."
 ('core core-package?)
 ('non-core (negate core-package?))
 (_ (const #t)
- (fold-packages (lambda (package result)
-  (if (select? package)
-  (keep-newest package result)
-  result))
-'(
+ (map package->update-spec
+  (fold-packages (lambda (package result)
+   (if (select? package)
+   (keep-newest package result)
+   result))
+ '()
   (some   ;user-specified packages
some)))
 
-  (define packages
+  (define update-specs
 (match (assoc-ref opts 'manifest)
-  (#f args-packages)
-  ((? string? file) (packages-from-manifest file
+  (#f args-packages->update-specs)
+  ((? string? file) (map package->update-spec
+ (packages-from-manifest file)
 
   (if (assoc-ref opts 'recursive?)
   (mlet %store-monad ((edges (node-edges %bag-node-type
  (all-packages
-(return (node-transitive-edges packages edges)))
+(return (map package->update-spec
+ (node-transitive-edges (map update-spec-package update-specs)
+edges
   (with-monad %store-monad
-(return packages
+(return update-specs
 
 
 ;;;
@@ -268,13 +272,17 @@ update would trigger a complete rebuild."
   (version update-spec-version))
 
 (define (update-specification->update-spec spec)
-  "Given SPEC, a package name like \"

Re: All updaters are broken

2023-01-01 Thread Hartmut Goebel

Hi Ricardo,

I managed working on this this evening already.

Am 31.12.22 um 15:27 schrieb Ricardo Wurmus:

Commit 8aeccc6240ec45f0bc7bed655e0c8149ae4253eb seems like the problem
here.  Hartmut, can you please fix this?  Otherwise I’d like to revert
this and related commits ASAP.


I fixed the tests and pushed as d7a9d72bb02a2a3b1a99183655bf878547116032.

Regarding the command "guix refresh": According to my tests, only 
invocations not providing a package name failed (see below). Anyhow, I 
did not manage to fix this:


options->update-specs needs to return update-specs in all cases, but 
currently returns packages if no packages have been named on the command 
line.


guix/scripts/refresh.scm (options->update-specs), lines 252ff:

  (if (assoc-ref opts 'recursive?)
      (mlet %store-monad ((edges (node-edges %bag-node-type
                                             (all-packages))))
        (return (node-transitive-edges packages edges)))
      (with-monad %store-monad
        (return packages)))

Any hints?
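
One possible way out - roughly what the patch in the 2023-01-02 message 
above ends up doing - is to wrap the package lists in update-specs before 
returning them, e.g.:

    (if (assoc-ref opts 'recursive?)
        (mlet %store-monad ((edges (node-edges %bag-node-type
                                               (all-packages))))
          (return (map package->update-spec
                       (node-transitive-edges
                        (map update-spec-package update-specs)
                        edges))))
        (with-monad %store-monad
          (return update-specs)))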

These invocations fail:

/pre-inst-env guix refresh -t crane -u
/pre-inst-env guix refresh -t hexpm -u
/pre-inst-env guix refresh -t hexpm

All these invocations pass:

./pre-inst-env guix refresh --list-updaters
./pre-inst-env guix refresh -u python-flask
./pre-inst-env guix refresh -u python-flask=2.2.1
./pre-inst-env guix refresh --list-transitive python-flask
./pre-inst-env guix refresh --list-dependent python-flask
./pre-inst-env guix refresh -l python-flask

Untested:

--recursive — did nothing?
--select

--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: All updaters are broken

2023-01-01 Thread Hartmut Goebel

Hi Ricardo,

my fault, I missed running the tests again after the latest changes.

I'll work on fixing this tomorrow (Monday).

--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: Stratification of GNU Guix into Independent Channels

2022-12-27 Thread Hartmut Goebel

Am 24.12.22 um 04:49 schrieb jgart:

Should GNU Guix be a small core of packages (and services?)?


No. As others have already stated, this would complicate things for users. 
Having all available software in one place is a big plus. (For many 
GNU/Linux distros you need to add another PPA for many „standard“ cases. 
IMHO this is disgusting.)


--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: Antioxidant (new rust build system) update - 100% builds

2022-11-01 Thread Hartmut Goebel

Am 29.10.22 um 21:38 schrieb Maxime Devos:
100% (rounded up) of the packages build with antioxidant, though a 
very few still fail to build: 
<https://ci.guix.gnu.org/eval/749079/dashboard>. 


\o/ You are great! Thanks for all the work you've put into this.

--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: antioxidant update: librsvg builds, and other things (core-updates)

2022-09-01 Thread Hartmut Goebel


Hi Maxime,

great news, thanks for the update and for working on antioxidant.



Some questions:

  * Some Rust crates have 'examples' and 'benchmarks' that can be
compiled and installed.


I support skipping these, as there is little value.


--
Regards
Hartmut Goebel

| Hartmut Goebel  |h.goe...@crazy-compilers.com|
|www.crazy-compilers.com  | compilers which you thought are impossible |


Re: Idea: Function composition to declare operating-system

2022-08-31 Thread Hartmut Goebel

Am 29.08.22 um 17:14 schrieb Théo Maxime Tyburn:

Anyway, what do you think about this functionality? Have you already 
experimented with similar things?
Did I reinvent the wheel? Is there a better approach?


I really like the idea!

--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: guix refresh to a specific version?

2022-08-28 Thread Hartmut Goebel

Am 07.07.22 um 09:45 schrieb Ludovic Courtès:

I’ll be monitoring guix-patches for the final version.  :-)


It took some time, and here it is: https://issues.guix.gnu.org/57460

--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




native-inputs: Go for completeness or minimalism?

2022-07-20 Thread Hartmut Goebel

Hi,

shall native-inputs be as complete as possible or as minimal as possible?

Background: I just stepped over a couple of packages where upstream 
requires a lot of code-quality checkers which are not actually run when 
running the tests. (More specifically: these are Python packages demanding 
tools like flake8, flake8-docstring, black, bandit.)


Now when going for minimal dependencies and minimal native-inputs:

Pro: Fewer dependencies and a simpler dependency tree, thus less 
computation, faster builds, less power consumption.


Con: Might need a phase to remove dependencies; 'guix shell -D' will not 
provide every development requirement.


Personally, I tend towards minimal.

WDYT?

--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: guix refresh to a specific version?

2022-07-04 Thread Hartmut Goebel

Am 30.06.22 um 13:58 schrieb Ludovic Courtès:

Excellent. If you want, you can ping me for early comments on the
 API for this.


Thanks for offering this, I appreciate.

I pushed my current working state to wip-import-version. The branch also 
contains some patches from bug56295 (Add some Chicken eggs and fix egg 
importer) and bug56318 (Fix github updater).


Basic ideas

 * Add a keyword argument „version“ to all “latest-X-release” functions
   and eventually pass it on to called functions.
 * Minimal changes

Here is a list of updaters and packages to test the import-specific-version:

+crate: rust-sequoia-openpgp@1.7.0
+egg: testing requires new packages (see bug56295),
  changing version of “chicken-args” to 1.6.0 in chicken.scm and
  test updating to chicken-args@1.6.1
+generic-git (import/git): remmina@1.4.25
+generic-html (gnu-maintenance): xclock@1.1.0
+github: libwacom@1.12
+gnome: gtk@4.6.2
+gnu (gnu-maintenance): help2man@1.49.1
+gnu-ftp (gnu-maintenance): libgpg-error@1.43
+hexpm: testing requires changing version in erlang.scm
       erlang-relx to 4.5.0 and test updating to erlang-relx@4.6.0
+kde: plasma-framework@5.93.0
+kernel.org (gnu-maintenance) = html: iw@5.3
+launchpad: testing requires changing version in terminals.scm
       sakura to 3.8.0 and testing update to sakura@3.8.3
+pypi: trytond-party@6.2.0
+savannah (gnu-maintenance) = html: libpipeline@1.5.4
+xorg (gnu-maintenance) = ftp : libxi@1.7.99.2


These updaters report an error that they can't update to a specific 
version, if a version is given:


?bioconductor (cran.scm) -- repo supports latest version only
?cran -- repo supports latest version only
  old version available in
https://cran.r-project.org/src/contrib/Archive/PACKAGENAME
-cpan  --- no version, not recursive
-elm  -- no updater
-elpa  --- no version
-gem  --- no version
-hackage  --- no version
-minetest  --- no version
-opam  --- no version
-sourceforge (gnu-maintenance) -- too complicated
-json  -- n/a
-texlive  --- no version, not recursive, no updater

Not yet implemented — unclear how to handle:

# stackage  --- LTS version

Still missing:

 * Finishing commit messages
 * Documentation?
 * For the final commit, I'll squash the commits.

--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: Shall updaters fall back to other updaters?

2022-07-04 Thread Hartmut Goebel

Am 03.07.22 um 17:11 schrieb Kaelyn:

To me, this feels like the importers will need a more deterministic order 
imposed on them to get import results that are consistent across systems. HTH!


Many thanks for your analysis.

Meanwhile it came to my mind that I actually have different 
installations of guix which I can compare. Below are three 
installations; each one has a different order (these are actually 
different versions of guix with different importers/updaters; anyhow, 
the order should be the same):


stackage pypi opam minetest launchpad kde hexpm hackage gnome github
generic-git gem elpa egg crate bioconductor cran cpan savannah
generic-html gnu-ftp sourceforge xorg kernel.org gnu

stackage pypi opam minetest launchpad kde hackage gnome github
generic-git gem elpa egg crate cran bioconductor cpan kernel.org
sourceforge gnu generic-html gnu-ftp xorg savannah

stackage pypi opam minetest launchpad kde hackage gnome github
generic-git gem elpa egg crate bioconductor cran cpan sourceforge
generic-html gnu-ftp savannah xorg kernel.org gnu

--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: Shall updaters fall back to other updaters?

2022-07-03 Thread Hartmut Goebel

Am 01.07.22 um 15:15 schrieb Ludovic Courtès:

BTW 2: Which updater is used for each package is non-deterministic.

Do you have an example?  I’d think they’re always tried in the same
order, no?


When looking at the code, I don't see any means of sorting: 
<https://git.savannah.gnu.org/cgit/guix.git/tree/guix/upstream.scm#n245> 
and below. (Maybe sorting is hidden in one of the Guile-internal 
functions used there.)



--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Shall updaters fall back to other updaters?

2022-06-30 Thread Hartmut Goebel

Hi,

while working on refreshing to a specific version (see 
https://lists.gnu.org/archive/html/guix-devel/2022-06/msg00222.html) I 
discovered that the updaters fall back to another updater. Is this intended?


Concrete example (using refresh to a specific version): Package "xlsxio" 
has no version 0.2.30. When trying to refresh to this version, the 
github updater comes first and of course fails to get this version. Then 
the generic-git updater is triggered and tries to get the version.


IMHO each package should be handled by a single updater.

What do you think?

BTW 1: There are other packages which are handled by several updaters: 
if you sum up the percent values of "guix refresh --list-updaters" you 
will get about 140%. Anyhow, the generic-git updater contributes about 
30% to this amount.


BTW 2: Which updater is used for each package is non-deterministic.

--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: guix refresh to a specific version?

2022-06-28 Thread Hartmut Goebel
FYI: I'm working on this. Given the number of importers and updaters, it 
just takes some time.


--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: On commit access, patch review, and remaining healthy

2022-06-20 Thread Hartmut Goebel

Hi,

here are my reasons for reviewing patches very, very rarely:

Basically I share Brian Cully's experiences. I'm using Thunderbird for 
mail, and my system is set up so that Emacs can send out mails.

I tried debbugs from time to time and for me it is disgusting:

 * too complicated to get to the list of patches or bugs (I can never
   remember the many key presses to perform),
 * I did not manage to apply patches from there (emacs would need to
   know where my guix development directory is - how can I tell it?)
 * commands within debbugs feel complicated
 * if a ticket contains several updated patches, it's very hard to find
   the relevant ones (one of the reasons for forges' success is that they
   present you the current state)
 * actually testing the patches requires applying them to my
   worktree - and chances are high 'git am' will fail with some
   conflict - chances rise extremely for old patches
 * Over all, for me debbugs.el needs a much more "noob"-friendly interface

Regarding the actual review:

 * Yes, I miss a review guideline.
 * As Arun wrote: Guix has high quality standards. I feel uncomfortable
   judging whether a summary or description is good enough. Also
   I'm not a native speaker and don't feel entitled to review English
   grammar and spelling.
 * I miss a way to contribute to a review without actually approving
   it. (In git(lab,hub) I could add comments; other reviewers and the
   submitter could resolve or give feedback. This allows me to focus on
   e.g. some parts, while someone else could review the summary and
   description.)
 * I also miss automated tests. E.g. it does not make sense to waste my
   time running 'guix lint', as an automated check could do this.

When agreeing to a patch:

 * I'd like to simply merge the patch without caring about whether
   the submitter has commit rights. This saves time for the submitter.
   Sending "LGTM" is pleasant, but wastes time: the reviewer needs
   to send a mail, the submitter needs to dig up the branch, rebase it and
   push. If the reviewer already merged it, he/she could push, too,
   IMHO.
 * And for me as a submitter: I want my patches to be merged by the
   reviewer.


Am 07.06.22 um 17:11 schrieb Ludovic Courtès:

Do you or would you use them to keep track of pending patches?


I use issues.guix.gnu.org occasionally. Anyhow, this is browse-only. I did 
not even manage to add a reply there.


--
Regards
Hartmut Goebel

| Hartmut Goebel  |h.goe...@crazy-compilers.com|
|www.crazy-compilers.com  | compilers which you thought are impossible |




Re: guix refresh to a specific version?

2022-06-19 Thread Hartmut Goebel

Hi,

Am 17.06.22 um 17:37 schrieb Ludovic Courtès:

It’s currently not possible, but pretty much all the machinery is there,
in importers.  It’s a low-hanging fruit that I think we should pick!


I gave it a try and discovered that we need to discuss the interface and 
design:


One can already specify a version, which defines the package to be 
updated. The respective place in the code has the comment


Take either the specified version or the latest one.

and this was added by you (Ludo) in 
4b9b3cbbc45afa3e374889847d4ab8673b8b2db2 (April 2015):


refresh: Allow users to refer to specific package versions.

* guix/scripts/refresh.scm (guix-refresh): Use 'specification->package'
  instead of 'find-packages-by-name'.  This allows users to specify
  things like "qt-4.8.6".


Besides this, I'm not confident about the implementation: currently, refresh 
passes around package objects and passes them to the updater. When 
updating to a specific version, we need to pass around both the package 
and the desired version, which implies case handling at many places. Any 
idea how to solve this elegantly?
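
One option (and roughly what the later refresh patches in this digest 
converge on as update-spec) would be a small record bundling the package 
with the requested version, so that only one value needs to be passed 
around; a minimal sketch:

    (use-modules (srfi srfi-9))

    ;; Bundle a package with the version requested on the command line;
    ;; #f as version means "latest".
    (define-record-type <update-spec>
      (update-spec package version)
      update-spec?
      (package update-spec-package)
      (version update-spec-version))

    (define (package->update-spec package)
      (update-spec package #f))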


--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




guix refresh to a specific version?

2022-06-15 Thread Hartmut Goebel

Hi,

I wonder whether there is a way to refresh to a specific version, like 
one can import a specific version:


works:

    guix import pypi trytond@6.2.0

does not work:

    guix refresh -u trytond@6.2.0
    […]
    guix refresh: error: trytond: package not found for version 6.2.0

My use-case is to update 170 trytond modules from 6.0.x to 6.2.x, 
while 6.4 is current. (I have a list of exact versions, thus being able 
to specify a discrete version is fine.)


--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: Status of KDE Plasma

2022-05-02 Thread Hartmut Goebel

Am 02.05.22 um 11:14 schrieb Hartmut Goebel:
Basically there are package definitions for most of the Plasma 
packages (as of 5.19), kwin, etc. One of the files has a list of state 
per package. Anyhow I failed to make Plasma start the actual desktop.


To emphasize this: packaging the Plasma modules is basically done. But 
making the Plasma Desktop actually run is a major issue.


--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: Status of KDE Plasma

2022-05-02 Thread Hartmut Goebel

Am 01.05.22 um 23:13 schrieb Anthony Wang:

I also found the wip-kde-plasma branch in the Guix Git repository has some made 
some progress too. However, this branch has not been committed to in more than 
a year and only supports KDE 5.19.


This branch intentionally is on 5.19, as the Readme says:

|Note 1: It might be worth making an older version of Plasma work, as 
this an older version has less dependencies. This is why this branch 
still sticks at 5.19.5. |


And in my experience, Plasma is a beast, and taming it might be much easier 
when using a less beastly version. This is why I recommend sticking with 
this old version for now and updating only after this works.



--
Regards
Hartmut Goebel

| Hartmut Goebel  |h.goe...@crazy-compilers.com|
|www.crazy-compilers.com  | compilers which you thought are impossible |


Re: Status of KDE Plasma

2022-05-02 Thread Hartmut Goebel

Hi,

glad to see someone to pick up this task.

Am 01.05.22 um 05:06 schrieb Anthony Wang:

I have a few questions: What is the current status of KDE Plasma on Guix? And 
also, how can I help or contribute?


You can find the current state of Plasma in the „wip-kde-plasma” branch 
<https://git.savannah.gnu.org/cgit/guix.git/tree/?h=wip-kde-plasma>.


The top-level directory contains a 00-README-Plasma.txt 
<https://git.savannah.gnu.org/cgit/guix.git/tree/00-README-Plasma.txt?h=wip-kde-plasma>, 
explaining the current state in detail. The branch also provides some 
helper-scripts for testing, etc. I strongly recommend reading through 
all of these files to find the hidden gems ;-)


Basically there are package definitions for most of the Plasma packages 
(as of 5.19), kwin, etc. One of the files has a list of state per 
package. Anyhow I failed to make Plasma start the actual desktop.


--
Regards
Hartmut Goebel

| Hartmut Goebel  |h.goe...@crazy-compilers.com|
|www.crazy-compilers.com  | compilers which you thought are impossible |


Designing importers (was: (Re-) Designing extracting-downloader)

2022-04-06 Thread Hartmut Goebel

Am 26.03.22 um 01:56 schrieb Maxim Cournoyer:

[Answering the question of how to design the extracting downloader I 
originally thought of using for hex.pm packages:]



Is there a strong reason to want to use the archive instead of the
sources from the project repository?


For the same reason you prefer to import from a PyPI package instead of 
the project git-repo: The metadata is easily available.


Anyhow, using the git repo could be a pro, since the hex.pm package 
might miss tests or test data. OTOH I discovered that some Erlang 
projects have the build-tool binary („rebar3“) committed in the 
git repo. So when using the git repo, this needs to be removed by a 
snippet (which would not be required when using the hex.pm archive).


So this is a more general discussion: would it be better - also with 
regard to detecting new versions - to use the project's source repo or 
the package manager's repo?


Given the recent discussion about how to make packaging easier, maybe 
the hex.pm importer (and others) should become much more capable: e.g. 
the importer could fetch the metadata from hex.pm and then create a 
package definition pointing to github (falling back to hex.pm). And then, 
to make life easy for packagers, check the repo for „rebar3“ and, if 
present, create a snippet for removing it.


--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: Building hexyl (a rust app) without cargo, with antioxidant-build-system

2022-04-06 Thread Hartmut Goebel

Am 05.04.22 um 18:10 schrieb Maxime Devos:
Some other improvements that weren't announced previously: 


Wow! Impressive!

I'd be eager to try it with sequoia-openpgp. Please drop me a note when 
you think your work might be able to build that beast.




   * Package definitions can request non-default features to be built
 anyway.

 A difference from cargo-build-system: features are set in the
 package of the rust crate, not the package using the rust crate.


How is this intended to work?

Package 1 has features AAA (= default) and BBB. So is an .rlib built for 
each feature (package1-AAA.rlib, package1-BBB.rlib)? Or will one need to 
define two guix packages (package1+aaa and package1+bbb) and make the 
build system build the respective feature?


I personally would prefer the former. Thus package2 would pick up the 
pre-compiled rlib for the respective feature.


Rationale: If there are more features to be combined, the number of 
packages to be built can grow on the order of the square. So defining all 
these packages quickly becomes a burden. This is a computer's job :-) The 
build system could easily build all combinations, suffixing each rlib 
with a short hash over the feature names.



--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: Compiling rust things without cargo (super WIP POC)

2022-04-02 Thread Hartmut Goebel

Am 31.03.22 um 22:06 schrieb Maxime Devos:

In my experiments, it looks like the rust compiler actually_does_
support static libraries, though perhaps cargo doesn't.


AFAIU this assumption is correct.



I invite you to take a look at
<https://notabug.org/maximed/cargoless-rust-experiments>.
It contains a minimal rust library (libhello) and a minimal 'hello
world'-style application that uses 'libhello'.

Impressive!

As a next step, maybe I could try writing a Guix package definition for libhello
and hello-oxygen, gradually making things more complicated (macros, transitive
dependencies, some non-toy Rust dependencies, a Guix build system ...)?


Here is my challenge :-) 
<https://gitlab.com/sequoia-pgp/sequoia/-/blob/main/openpgp/Cargo.toml>: 
different dependencies per feature, os, target-arch and target-os as 
well as passing on features to dependencies.


--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: Compiling rust things without cargo (super WIP POC)

2022-04-02 Thread Hartmut Goebel

Am 01.04.22 um 12:08 schrieb Maxime Devos:

Do you know a ‘real’ Rust applications with few transitive dependencies
(say, 3 or so) with preferably few rust source files?


For my tests I used „roxmltree“, and by searching I just discovered 
https://crates.io/crates/hello_exercism.


HTH

--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: Removing #:skip-build? from the crate importer?

2022-04-02 Thread Hartmut Goebel

Am 31.03.22 um 21:14 schrieb Maxime Devos:

There are a few counter-proposals.  One suggestion that has been
raised, but not yet implemented, would be to make it so that build
results can actually be reused.  This is the most promising
conceptually, but from what I can gather from those working on it might
not be that easy to pull off.

Yes, that would be nice.


AFAIK Efraim and I tried to make cargo pick up artifacts built in 
another package. I gave up at some point: cargo did not pick up the 
artifacts, due to much „magic“ which I was unable to understand and work 
around.


One of the topics here are „features“ (like compile-time options), 
timestamps, exact absolute pathnames, etc. All of this seems to go into 
some hash, which will determine which files will be rebuilt. If anybody 
is interested, I could share the code of my last try. (Maybe the road to 
go would be to make cargo less strict here. But this requires 
understanding Rust code and cargo.)



--
Regards
Hartmut Goebel

| Hartmut Goebel  |h.goe...@crazy-compilers.com|
|www.crazy-compilers.com  | compilers which you thought are impossible |


Re: Removing #:skip-build? from the crate importer?

2022-03-31 Thread Hartmut Goebel

Hi,

since rust does not support anything like static or dynamic libraries, 
building (intermediate) crates is as useless as a hole in the head. Any 
output of any intermediate crate will just be thrown away.



Often, when new rust package definitions are submitted at guix-
patches@, I see #:skip-build? #false.  Apparently it's added by default
in (guix import cargo), with some exceptions.
The idea behind it is to have #:skip-build? #f for all "top level" crates, 
which are assumed to be programs. Thus, only crates imported recursively 
will get #:skip-build? #t. If one imports a single crate, it will 
get #:skip-build? #f, which is what you experience.



   However, ‘(guix)Rust
Crates’ states:

Care should be taken to ensure the correct version of dependencies
are used; to this end we try to refrain from skipping the tests or
using ‘#:skip-build?’ when possible. Of course this is not always
possible [...]


This text is from 2020-02-17 (written by Efraim) and predates 
269c1db41bd8 (committed 2020-12-02).


While I understand the intention of this, I'm not convinced by it. 
Primarily, this will lead to a huge waste of time and electrical power, 
just to trash the results. This will not only affect our own build farm, 
but also each user.


Please be aware that with #:skip-build? #t, every crate will be built 
again by every other crate using it. So if crate AA is used by B1 and B2, 
and C1 depends on B1 and B2, AA will be built 4 times!




As such, WDYT of removing #:skip-build? #false from (guix import
crate)?  FWIW, this was added in commit


I would propose the opposite: Keep it and make #t the default.

--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: (rust) Do we always need to package minor versions separately?

2022-03-10 Thread Hartmut Goebel

Am 08.03.22 um 22:01 schrieb Liliana Marie Prikler:

In practice, we assume 0.y.z be compatible with 0.y.a, a < z or the
other way round depending on which direction you're looking at.  I'm
not sure if this is a rust fortification of semver


This is backed by the cargo manual 
<https://doc.rust-lang.org/cargo/reference/specifying-dependencies.html>:


   An update is allowed if the new version number *does not modify the
   left-most non-zero digit* in the major, minor, patch grouping.
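
For illustration, that rule could be written down as a small helper like 
this (illustrative only, not cargo's actual code):

def compatible(old: str, new: str) -> bool:
    o = [int(x) for x in old.split(".")]
    n = [int(x) for x in new.split(".")]
    if n < o:
        return False
    for i, part in enumerate(o):
        if part != 0:
            # everything up to and including the left-most non-zero
            # component must stay the same
            return n[:i + 1] == o[:i + 1]
    return n == o  # "0.0.0" is only compatible with itself

assert compatible("0.13.2", "0.13.3")      # patch bump below 1.0
assert not compatible("0.13.3", "0.14.0")  # changes the left-most non-zero digit
assert not compatible("0.0.3", "0.0.4")    # 0.0.x: every update may break
assert compatible("1.2.3", "1.9.0")        # ordinary semver above 1.0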

--
Schönen Gruß
Hartmut Goebel
Dipl.-Informatiker (univ), CISSP, CSSLP, ISO 27001 Lead Implementer
Information Security Management, Security Governance, Secure Software 
Development


Goebel Consult, Landshut
http://www.goebel-consult.de

Blog: 
https://www.goe-con.de/blog/nicht-pgp-ist-gescheitert-die-entwickler-haben-versagt 

Kolumne: 
https://www.goe-con.de/hartmut-goebel/cissp-gefluester/2012-02-bring-your-own-life-glosse 



Re: Excessively energy-consuming software considered malware?

2022-02-24 Thread Hartmut Goebel

CW: politics below

Am 20.02.22 um 21:39 schrieb Martin Becze:
But undermining the governments ability to raise tax and therefor to 
wage war or not expending energy to prevent government theft is the 
‘controversial morality’ that I am sure can be agreed to death and 
which probably doesn't belong on this list. 


Undermining the government's ability to raise taxes also means undermining 
the ability to build schools, kindergartens, public libraries, public 
transport, streets, etc. Who is going to pay for and provide all of this, if 
there is no democratically controlled(*) government?


You might argue that this will then be paid by wealthy people - but 
the country will depend solely on their will and whims. And these wealthy 
people are not controlled at all. And these people might wage war, too. 
We already had such a system in medieval times. It's called feudalism.


So nothing is won by undermining the government.

(*) Democratic control still needs a lot of improvement. Especially in the USA, 
where „the winner takes it all“ results in a two-party system which 
does not represent the people. But this is another issue.



--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: Excessively energy-consuming software considered malware?

2022-02-24 Thread Hartmut Goebel

Am 20.02.22 um 17:52 schrieb Maxime Devos:

While it's the user's choice whether they_want_  to mine or not
(Guix is not a thought police!), it seems inadvisable to_help_  people
with mining and perhaps useful to_stop_  people from mining.


+1

Since we are technicians, we have to take our share of responsibility 
to save our planet. (Much like we want planet-savers to respect the 
human right to privacy.)


--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: (Re-) Designing extracting-downloader

2022-02-24 Thread Hartmut Goebel

Am 23.02.22 um 13:35 schrieb Maxime Devos:

Nevermind, this benefit is probably undone by the extra unpacking.


Probably.

Anyway, this is worth thinking of, as it would make the additional 
unpacking part of the source. And thus unpacking would be decoupled from 
the build-system. (Which was part of the idea behind the proposal.)


After considering this for some time, I actually like your idea: it is 
explicit (which is better than implicit), flexible and simple (no 
extracting downloader required at all). And it also does not lead to any 
problems with content-addressed downloads like SWH. The only downside I 
can see at the moment is that it stores both the outer and the inner 
archive.


Let's see what others think about it.

--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: (Re-) Designing extracting-downloader

2022-02-24 Thread Hartmut Goebel

Am 23.02.22 um 11:52 schrieb pukkamustard:

Why use the source from hex.pm at all?


While issue 51061 is about the hex.pm importer and the rebar build 
system, this thread is only about the extracting downloader :-)



The hex.pm metadata.config file does not seem to exactly specify the
upstream source. We would need some heuristics to figure this out. But
maybe we could find a heuristic that works well enough? This would solve
the double-archive problem.


FMPOV, hex.pm is one important, valid distribution point for erlang and 
elixir packages - like PyPI is for Python and CPAN is for Perl. So we 
should support defining this as a package's source, which can also be 
used for checking for updates much more easily than any git repository or 
git-based forge.


Some of the packages I've investigated so far are easier to build from 
hex.pm than from github. E.g. some github repos contain a „rebar“ binary 
(which needs to be deleted by a snippet when defining the source), while 
the corresponding hex.pm package can be used as-is.


Regarding heuristics: Since builds should be reproducible, a source 
definition must not use any heuristics. Anyhow, this might be useful for 
the hex.pm importer.


--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




(Re-) Designing extracting-downloader

2022-02-23 Thread Hartmut Goebel

Hi,

TL;DR: What do you think about the idea of an „extracting downloader“?

I'm about to pick up work on the „extracting downloader“ and the rebar build 
system (for erlang), see <https://issues.guix.gnu.org/51061> for a first 
try. In the aforementioned issue some points came up regarding the basic 
design of the patch. Thus, before starting to write code, I'd like to 
agree on a basic design.


The basic idea behind the „extracting downloader“ is as follows: Packages 
provided by hex.pm (the distribution repository for erlang and elixir 
packages) are tar archives containing some meta-data files and the 
actual source (contents.tar.gz), see the example below. So the idea was to 
store only the contents.tar.gz (instead of requiring an additional 
unpacking step).


In some earlier discussion someone mentioned this could be interesting 
for ruby gems, too.


Storing only the inner archive would allow having that archive's hash as 
the "source" hash and would allow for easy validation of the hash. Anyhow, 
much of the complexity of the current implementation (see issue 51061) is 
caused by this idea, since the code needs to postpone hashing to after 
the download.


Also, in some earlier discussion Ludo (AFAIR) brought up the question 
whether e.g. SWH would be able to provide a source package if it is hashed 
this way.


What do you think about the idea of an „extracting downloader“?


Example for a package from hex.pm:

$ wget https://repo.hex.pm/tarballs/getopt-1.0.2.tar
…
$ tar tvf getopt-1.0.2.tar
-rw-r--r-- 0/0   1 2000-01-01 01:00 VERSION
-rw-r--r-- 0/0  64 2000-01-01 01:00 CHECKSUM
-rw-r--r-- 0/0 451 2000-01-01 01:00 metadata.config
-rw-r--r-- 0/0   14513 2000-01-01 01:00 contents.tar.gz
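
As a rough sketch (illustrative only; file names taken from the example 
above), the extracting downloader would essentially do the following:

import tarfile

with tarfile.open("getopt-1.0.2.tar") as outer:
    member = outer.extractfile("contents.tar.gz")  # the actual source archive
    data = member.read()

with open("getopt-1.0.2-contents.tar.gz", "wb") as out:
    out.write(data)  # this inner archive is what would be stored and hashed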


--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: Updating old blog posts?

2022-01-01 Thread Hartmut Goebel

Am 31.12.21 um 12:14 schrieb Liliana Marie Prikler:

have a small notification directing them to a new blog post or
the cookbook


+1

--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: Formalizing teams

2021-12-24 Thread Hartmut Goebel

Am 23.12.21 um 22:51 schrieb Jonathan McHugh:

I reckon 'coterie' is more elegant a term:


IMHO we should not use this term.

The German translation is "Seilschaft" or "Klüngel" - both of which have 
a negative meaning: working together for the benefit of the members of the 
group only - ignoring the common interest.


--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: [core-updates-frozen] Tryton broken

2021-12-19 Thread Hartmut Goebel

Am 14.12.21 um 09:15 schrieb zimoun:

Now, core-updates-frozen is merged, they can go to master. :-)


I pushed the fixes to master yesterday. Shall I cherry-pick them to the 
release-1.4 branch or will somebody else take care of that?


--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: [core-updates-frozen] Tryton broken

2021-12-03 Thread Hartmut Goebel

Hi,
I just sent in patches fixing this issue, see 
<http://debbugs.gnu.org/cgi/bugreport.cgi?bug=52259>


They can be added on master, too, since they are not dealing with 
'sanity-check.


Question: After approval, shall I push them to core-updates-frozen 
and/or to master?



--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




sanity-check: Don't understand the error

2021-12-02 Thread Hartmut Goebel

Hi,

while trying to fix the trytond modules, one of the packages reports an 
error, which I simply don't understand:


starting phase `sanity-check'
validating 'trytond-party' /gnu/store/mf5rby1afnmvvxc778sr56gyangzdz6r-trytond-party-6.0.2/lib/python3.9/site-packages
...checking requirements: ERROR: trytond-party==6.0.2 (python-stdnum 1.14 (/gnu/store/04i1p7rw5583g0la8d66qwzwlfs9rvhg-python-stdnum-1.14/lib/python3.9/site-packages), Requirement.parse('python-stdnum>=1.15'), {'trytond-party'})

What is the *meaning* of this error? Does Requirement.parse() fail? 
Please help me understand the error so I can fix the trytond modules. 
Thanks.
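
For reference, the requirement check can be re-done by hand roughly like 
this (illustrative only, not Guix's sanity-check.py):

import pkg_resources

dist = pkg_resources.get_distribution("trytond-party")
for req in dist.requires():
    # raises a VersionConflict naming the installed distribution, the
    # requirement and the requirer - much like the triple in the log above
    pkg_resources.require(str(req))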


PS: Any chance to get an improved sanity-check.py into 
core-updates-frozen? Then we could add more explanations to the output.


--
Regards
Hartmut Goebel

| Hartmut Goebel  |h.goe...@crazy-compilers.com|
|www.crazy-compilers.com  | compilers which you thought are impossible |


Re: [core-updates-frozen] Tryton broken

2021-12-02 Thread Hartmut Goebel

Hi,

TL;DR: I'll take care of this within the next few days.

Am 01.12.21 um 17:44 schrieb zimoun:

Many thanks for providing this info and the links.



The issue with 'trythond-*' is the new phase `sanity-check' for
python-build-system.


The way trytond modules are intended (by the maintainers) to be 
installed is *very* special. Thus I'm not astounded to find sanity checks 
for entry points failing - entry points are simply not supported by trytond 
for trytond modules as one would expect in Python.



Any chance that someone give a look before the merge?  Otherwise,
these broken packages could land to master; which would be sad.


trytond itself and tryton seem okay. So I suggest removing the 
sanity-check phase from all packages failing due to this check. Hopefully 
only a few trytond- modules are affected - those containing scripts.


I'll take care of this within the next few days.

--
Regards
Hartmut Goebel

| Hartmut Goebel  |h.goe...@crazy-compilers.com|
|www.crazy-compilers.com  | compilers which you thought are impossible |




Revising sequoia packaging

2021-11-30 Thread Hartmut Goebel

Hi,

for those who contributed to sequoia packaging:

Currently, sequoia is packaged somewhat sub-optimally:

There are some rust crates in crates-io.scm (rust-sequoia-openpgp-0.9, 
rust-sequoia-rfc2822-0.9) and the big sequoia package in sequoia.scm. 
The other sequoia crates are currently not available as separate 
crates. Future versions of pEp (pretty easy privacy, pep.scm, which 
relies on sequoia) will get their own "pEpEngineSequoiaBackend" FFI library.


I propose the following:

 * Move all sequoia-related crates into sequoia.scm. As of now these
   are only rust-sequoia-openpgp-0.9 and rust-sequoia-rfc2822-0.9, and
   there is only one package depending on them, rust-libpijul-0.12
 * bufferedreader, rust-nettle and other crates from the sequoia
   project but not having "sequoia" (or such) in the name would be kept
   in crates-io.scm
 * In sequoia.scm there would be all sequoia crates, with the app
   packages named without "rust-" prefix ("sequoia-sq", …)
 * The current "sequoia" package will become a "wrapper", just
   propagating (or copying, whatever is more common in guix) the other
   packages which have an actual output.

WDYT?

--
Regards
Hartmut Goebel

| Hartmut Goebel  |h.goe...@crazy-compilers.com|
|www.crazy-compilers.com  | compilers which you thought are impossible |


Re: No license in crate - guix import

2021-10-05 Thread Hartmut Goebel

Am 04.10.21 um 22:30 schrieb Michael Zappa:

guix import crate libpulse-sys@0.0.0 -r
[…]
316:37  2 (crate->guix-package "libpulse-sys" #:version _ # _ # _)
213:14  1 (string->license _)
In unknown file:
0 (string-split null #)

ERROR: In procedure string-split:
In procedure string-split: Wrong type argument in position 1 (expecting 
string): null


I was able to reproduce this issue. Please file a bug-report.

--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: Python Site Package Syntax Runtime Error

2021-09-20 Thread Hartmut Goebel

Am 20.09.21 um 22:28 schrieb Antwane Mason:

        (snippet
         '(begin (substitute* "setup.py"
                   (("scripts=\\['onlykey_agent.py'\\]")
                    "py_modules=['onlykey_agent']"))
                 #t

Typically this kind of change goes into a phase. Search other package 
declarations for "substitute" to see how this is done. Source snippets 
are primarily for removing binaries and proprietary code.


Regarding the other issue, I can't give any advice, since I don't know 
the code :-)


--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |



Re: Python Site Package Syntax Runtime Error

2021-09-20 Thread Hartmut Goebel

Am 18.09.21 um 21:44 schrieb Antwane Mason:
From what I can tell, one of the build phases responsible for wrapping 
executables into shell scripts is wrongly wrapping one of the python 
files. This causes the shell script which is named as the original 
python file to be loaded as a python module causing a syntax error 
because the export line is a shell directive and not valid python 
syntax. Below is the stack trace again for reference.  The last file 
referenced in the stack trace is a shell script wrapper for the 
original onlykey_agent.py file which was renamed 
.onlykey_agent.py-real. Below is the full file for this shell script. 
Can anyone provide guidance as to which build phase needs to change 
and how to change it to prevent onlykey_agent.py from being wrapped?



Looking at the code of the package - which actually is quite simple - I 
discover


   scripts=['onlykey_agent.py'],

This might indeed trigger some issue in the wrap phase. Please open a 
bug-report for this, explicitly pointing to release v1.1.11 of that package.


As a solution for you I propose replacing the aforementioned line in 
setup.py by this line:


    py_modules=['onlykey_agent'],

I also suggest reporting this upstream, since I assume having 
onlykey_agent.py in bin is not desired. (Actually this is not a working 
script at all.)


--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |



Re: Rethinking propagated inputs?

2021-09-16 Thread Hartmut Goebel

Am 09.09.21 um 11:48 schrieb Ludovic Courtès:

We’d have to check how they do it in Nix.  Basically, it’s about
reversing things: currently, everything goes to “out”, and a tiny bit
goes to “lib”.  Here, it’d be the other way around: “out” would contain
only binaries and doc (unless there’s a “doc” output), and “dev” would
contain everything else (headers, pkg-config files, cmake files, section
3 man pages, etc.)


+1

Additionally any static lib (.a) should go into  "lib".

--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: Python Site Package Syntax Runtime Error

2021-09-16 Thread Hartmut Goebel

Am 07.09.21 um 19:39 schrieb Antwane Mason:
File 
"/gnu/store/s2w1lq80x9vcwp5382kn98f5pi2k4b7b-python-onlykey-agent-1.1.12/bin/onlykey_agent.py", 
line 2

    export PYTHONPATH="/gnu/store/…-


This looks like an error in the package definition: A .py-file contains 
a shell command.


HTH

--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




How to solve "abiI-check: recompilation needed"

2021-09-16 Thread Hartmut Goebel

Hi,

quite often, after "git pull" I'm facing this error when running 
"make make-go":


$ make make-go
Compiling Scheme modules...
Compiling Scheme modules...
[ 69%] LOAD gnu/packages/admin.scm
error: failed to load 'gnu/packages/check.scm':
ice-9/eval.scm:293:34: In procedure abi-check: #>: 
record ABI mismatch; recompilation needed

make: *** [Makefile:7096: make-packages-go] Fehler 1

How can I solve this without removing and rebuilding *all* files - which 
is time consuming and a waste of electric power?


I already tried, without success, "rm gnu/package.go" as well as removing 
some other .go-files listed in the output.


--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: Questions regarding Python packaging

2021-07-07 Thread Hartmut Goebel

Am 29.06.21 um 09:20 schrieb Lars-Dominik Braun:

AFAIK this might not be true if other build systems not using setuptools
at all might show up. And isn't this the main reason for all your work?

No, try


Sorry, I've been imprecise on this:

There might still be quite some packages out there importing plain, old 
distutils (and not setuptools) in their setup.py. These are what I meant 
by "other build systems not using setuptools". For these setup.py files 
to understand the options we (and pip) need for installation, "import 
distutils" has to be hacked to actually become "import setuptools" - 
which is what the setuptools shim does.
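
For illustration, a minimal sketch of what such a shim does (not pip's 
exact code): import setuptools for its side effect before executing 
setup.py, so that a setup.py which only uses distutils still understands 
setuptools-only options such as --single-version-externally-managed.

import setuptools  # noqa: F401 -- side effect: patches distutils
import tokenize

setup_py = "setup.py"  # assumed to be in the current working directory
with tokenize.open(setup_py) as f:
    code = f.read()
exec(compile(code, setup_py, "exec"))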


--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |



Re: Questions regarding Python packaging

2021-06-28 Thread Hartmut Goebel

Hi Lars-Dominik,

Am 28.06.21 um 13:59 schrieb Lars-Dominik Braun:

 Not installing pip by default might break some user's environments.
 Anyhow, since using pip in guix is not such a good idea anyway, this
 should be okay.

True. We could rename python→python-minimal-interpreteronly (or similar;
given that python-minimal already exists) and python-toolchain→python to
work around that.


What would be the use of having a package without pip? Anything other 
than saving a few KB?




[setuptools-shim has been removed]

Is this relevant though? I doubt many packages are still importing
distutils and the few that do can be patched.


As I wrote: This code is still in pip, so I assume it is still relevant.

I don't think patching is a good idea. It requires effort (implementing, 
reviewing), which can be saved by keeping existing and tested code.




 set-SOURCE-DATE-EPOCH: This implementation makes the code depend on
 wheel and wheel being used for installation.

Technically it depends on the wheel builder understanding
SOURCE_DATE_EPOCH (not necessarily python-wheel). I’d say that’s
acceptable and it’d be preferable to fix build systems not respecting
this variable imo.


For this case please change the comment so that it does not refer to wheel 
in this way - more something like "we expect the builder to support 
SOURCE_DATE_EPOCH, like wheel does".


Anyhow, I'm not actually convinced that we should throw away the old 
code. I can imagine that in the next couple of years quite some new 
build systems will arrive, most of which will probably not support 
SOURCE_DATE_EPOCH in the beginning, thus making packagers' lives harder.






 Why has rename-pth-file been removed? Are you sure .pth-files are
 never created anymore nowerdays?

Given that easy-install has been deprecated I think it’s safe to remove
this phase and flag any packages creating this easy-install.pth as
broken. (There are, however, legitimate packages creating files like
ruamel.yaml-0.15.83-py3.8-nspkg.pth.)


What exactly do you mean by "flag as broken"? Will anybody (you? ;-) 
verify that *all* current packages are not "broken" prior to merging this 
change?


Anyhow, again, I'm not convinced we should remove this phase now. 
.pth files are only deprecated, but still supported. By removing this 
phase we might create conflicts we cannot foresee. And I would 
keep it even if an analysis showed that none of the current packages is 
"broken" - just to be on the safe side and avoid user trouble. (These 
issues will show up on the user's side, and are hard to track down, since 
no one will think about .pth files.)






 python-hashbang: Isn't this done already by the normal
 "patch-shebangs" phase after install in  gnu-build-system? (BTW:
 these are called *she*bangs).

Afaik the function patch-shebang expects a leading slash and thus it
does not replace this “special” shebang (see
https://www.python.org/dev/peps/pep-0427/#installing-a-wheel-distribution-1-0-py32-none-any-whl;
Spread, point 3).


IC. Please add a comment to make this clear (e.g. "handle shebang of 
scripts generated by wheel missing leading slash")



   *

 I suggest to have phase compile-bytecode still honor older versions
 of python

I’m not sure what you mean. compileall is also part of Python 2.


The old code did not compile the source for Python <3.7. Please see the 
comment of the old code for the rationale.
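
For reference, the old behaviour essentially amounted to a version guard 
like this (a sketch only; the directory name is made up):

import compileall
import sys

if sys.version_info >= (3, 7):
    compileall.compile_dir("lib/python3.9/site-packages", quiet=1)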




As I Python developer I nowerdays would expect pip and venv (which is
part of the std-lib - but not the virualenv, which is a separate module)
to be availalbe when installing "python". Anyhow I could live with pip
being a separate package.

If we keep setuptools/pip bundled, we don’t have to do any of this
pypa-build dance. We could also modernize python-build-system around
`pip install` and just be done with it. (I don’t have a proof-of-concept
for that yet.)


AFAIK this might no longer be true once other build systems not using 
setuptools at all show up. And isn't this the main reason for all your work?






The gnu-build-system already provides the "unzip" binary (used in phase
"unpack"). So we could simply use this. Otherwise I recommend using the
Python zip module, as this is what is used for creating the zip-archives
:-)

I’m using Python’s zipfile module already.

Fine, so you can safely remove the respective comment ;-)

--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: Questions regarding Python packaging

2021-06-22 Thread Hartmut Goebel

Hi Lars,

sorry for being late for commenting on this (the time I can spend on 
guix is rather limited atm).


Here are some general remarks on this patch-set (in order of appearance):

 *

   Not installing pip by default might break some users' environments.
   Anyhow, since using pip in guix is not such a good idea anyway, this
   should be okay.

 *

   "use-setuptools" is gone. There are still about 10 packages with
   "#:use-setuptools #f" - which means they are (expected to be)
   incompatible with setuptools for some reason. You might want to
   check whether these packages actually still can't be packages with
   setuptools.

 *

   setuptools-shim has been removed. I don't think this is a good idea,
   since this piece of code enforces packages to actually be built with
   setuptools instead of old distutils. This code is still in current
   pip, so I assume it is still required.

   (This shim ensures setuptools is used, even if setup.py only imports
   distutils. And setuptools is required for some options like
   "--single-version-externally-managed" - as the comment for the shim
   says.)

 *

   set-SOURCE-DATE-EPOCH: Please keep the verbose rationale. It's much
   more helpful than the new one-line comment.

 *

   set-SOURCE-DATE-EPOCH: This implementation makes the code depend on
   wheel and wheel being used for installation.

 *

   Why has rename-pth-file been removed? Are you sure .pth files are
   never created anymore nowadays?

 *

   python-hashbang: Isn't this done already by the normal
   "patch-shebangs" phase after install in  gnu-build-system? (BTW:
   these are called *she*bangs).

 *

   I suggest having the compile-bytecode phase still honor older versions
   of python



1) Validate the general idea of using pypa-build is viable and
sustainable in the long run – ideally through review by someone else
than me. We can’t touch python-build-system every week to solve
structural issues, so it needs to be bullet-proof.


pypa build is what the PyPA is pushing towards. Anyhow, as of today, as 
far as I can see, adoption is low.



2) Figure out how to run testing code. Currently python-build-system
just picks pytest, if available – not sure this is the best option we
have. How do we deal with other test systems? How do we pass options?


AFAIK there is no standard way of running tests in python. pytest seems 
to be the most modern test system. Anyhow, packages still use nose or tox 
(which again might run pytest or nose, with parameters fetched from 
tox.ini). So I'm afraid there is no general rule.


Did the PyPA publish some recommendations or PEP on this?


4) Iron out minor details like including pip in the python package or
create a new python-toolchain package? What do we include in that
meta-package? pip? virtualenv? …?


As a Python developer I would nowadays expect pip and venv (which is 
part of the std-lib - unlike virtualenv, which is a separate module) 
to be available when installing "python". Anyhow, I could live with pip 
being a separate package.


"python-toolchain" sounds oversized for me. Would this include the 
C-compiler, too (which one? maybe I want to build cross). I'd rather not 
have such a package.



5) Fix my awkward Scheme code, especially regarding unpacking of the
built wheels. Should we be using Python’s unzip module or can be
assumed unzip is available in the build environment? (Should we add
it?)
The gnu-build-system already provides the "unzip" binary (used in the 
"unpack" phase), so we could simply use this. Otherwise I recommend using 
Python's zipfile module, as this is what is used for creating the 
zip archives :-)
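
For example (a sketch only, assuming the build produced exactly one 
wheel in dist/), unpacking with the standard library boils down to:

import glob
import zipfile

wheel = glob.glob("dist/*.whl")[0]
with zipfile.ZipFile(wheel) as zf:
    zf.extractall("site-packages")  # wheels are plain zip archives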


--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |



Removal of Python 2?

2021-06-22 Thread Hartmut Goebel

Am 06.06.21 um 21:44 schrieb Lars-Dominik Braun:

3) Determine the fate of Python 2, which is probably broken through this
patch set. Shall we remove it entirely? Is it worth to keep support?


Python 2 is dead, dead, dead like the parrot, and past its (already 
prolonged) end-of-life for more than 1 1/2 years. Anyhow, there might 
still be quite some software not ported to Python 3 even after 10 years. 
So I'm afraid we need to keep Python 2.


Other opinions?

--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: Some more rust/cargo insights

2021-06-08 Thread Hartmut Goebel

Am 08.06.21 um 11:15 schrieb Efraim Flashner (referring to Debian)

My understanding is that they pool together all the sources that they
have and then build all the rust packages in one go.


Maybe, I have no clue. Anyhow, given my experience with sequoia, I doubt 
this will reduce build times:


When building sequoia, several crates are built up to 4 times - even 
sequoia-openpgp itself.


--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: Some more rust/cargo insights

2021-06-07 Thread Hartmut Goebel

Am 07.06.21 um 18:26 schrieb Hartmut Goebel:
Another path we should checkout is to see what Debian does. My 
understanding is that they figured something out.  Worth a shot, but 
I’d rather the problem be fixed upstream. It will just take 
collaboration.


I did not check their toolchain lately, but a package search still does 
not show many packages 
<https://packages.debian.org/search?suite=experimental=names=rust>. 
Last time I checked, they basically do what guix does: compile 
everything from source - again and again.


Just checked: Debian has quite some source packages [1], which even list 
"binary" packages (which are development packages). Anyhow it seems as 
if these are just the source again [2]


[1] 
<https://packages.debian.org/search?keywords=rust=sourcenames=stable=all>
[2] 
<https://packages.debian.org/buster/amd64/librust-build-const-dev/filelist>


--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: Some more rust/cargo insights

2021-06-07 Thread Hartmut Goebel

Hi John.

Am 07.06.21 um 17:13 schrieb John Soo:
Rust has a very well documented rfc process and we can at least bring 
it up that way.  I brought up the possibility of collaboration between 
rust and functional package managers on the rust Zulip, even.  They 
seemed to like the idea.


I'd be more than happy if you could start an RFC process then. This issue 
has been bugging us and other distros for many months. (I have no contact 
with the rust community and will not sign up at yet another proprietary 
communication platform (Zulip).)


My feeling on this is that we should partner with the Rust community 
to make shared library support from cargo a priority.


Our issue is a different one: It's about being able to reuse already 
compiled binaries - keeping the current behavior of rust binaries being 
statically linked.


While this looks like being the same as dynamic library support, it is 
not: While for dynamic libraries you need to ensure the very correct 
version of a "crate" is loaded, for static linking with pre-built 
binaries you only need to ensure this at build time. (For guix, of 
course, neither would be a problem, but I doubt we can make rust people 
understand this. And other distros will still have the problem.)


rustc already has a notion of "static libraries", cargo just fu**s it 
up. (Sorry for the hard wording, but I'm actually quite angry about cargo's 
narrow-minded behavior.)


So our task is much, much easier and doesn't require changes to rustc, 
only to cargo.



 Specifying an output directory is currently a nightly feature, that 
could be helpful.


Not exactly sure what you mean. But what breaks builds with cargo are the 
*input* directories - and other magic which gets included into the 
"meta-data" for no reason.



In general Rust tooling does not compose with existing tools.  I 
believe they will be amenable to the thought that it should. If Rust 
wants to be used in the linux kernel, for instance, it should be easy 
to use with Make.


From your lips to God's ears. From what I've seen and read, I'm not 
confident they will change anything. I'd like to be proven wrong, 
though :-)



Another path we should checkout is to see what Debian does. My 
understanding is that they figured something out.  Worth a shot, but 
I’d rather the problem be fixed upstream. It will just take collaboration.


I did not check their toolchain lately, but a package search still does not 
show many packages 
<https://packages.debian.org/search?suite=experimental=names=rust>. 
Last time I checked, they basically do what guix does: compile everything 
from source - again and again.


--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |





Re: Some more rust/cargo insights

2021-06-07 Thread Hartmut Goebel

Am 07.06.21 um 10:28 schrieb Pjotr Prins:

Exactly my idea. One challenge will be that the source of dependencies
need to be available - think of it as include files. One thing we
could do as ship them as part of the Guix package. Or have a separate
one for sources. We do that for include files already.


Well, the current cargo-build-system already handles the source 
dependencies.


We need to aim towards pre-built libraries (rlib, much like .a files in 
C, I assume)


When cargo calls rustc, the command looks like:

LD_LIBRARY_PATH='$PWD/target/release/deps:/gnu/store/…-rust-1.45.2/lib' \
rustc … src/lib.rs --crate-type lib \
-L dependency=$PWD/target/release/deps \
--extern 
xmlparser=$PWD/target/release/deps/libxmlparser-53596ac1828b1c97.rmeta


Thus I assume one could pass the rlibs of all dependencies via -L and 
the respective meta-data via --extern.


--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: Some more rust/cargo insights

2021-06-07 Thread Hartmut Goebel

Am 06.06.21 um 20:38 schrieb Pjotr Prins:

Since that community is about not invented here - maybe we can incense
someone to pick it up. Needs a mature programmer though.


One solution that came to my mind is to not use Cargo, but instead parse 
Cargo.toml and issue the appropriate "rustc" commands ourselves.


--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Some more rust/cargo insights

2021-06-06 Thread Hartmut Goebel

Hi,

these days I have spent some more hours struggling with rust and cargo, 
trying to get "pre-built" crates.


Summary: Cargo is cruft, no solution found yet.

I tried reusing a crate from the very same place it was built (see the 
enclosed script). Anyhow, this does not work, since cargo uses a 
different "metadata" value, even if nothing changed. Based on the verbose 
output (cargo build -v …) I assume that some parameters of the 
"destination" build get included into this value.


This matches another observation: when building the sequoia suite, several 
crates are built several times - even if all builds are performed in the 
same environment.


Rust's build system is such cruft - I really would like to throw it 
where it belongs: into the trash.


--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |

(use-modules
 (guix download)
 (guix packages)
 (guix build-system cargo)
 (gnu packages crates-io)
 (srfi srfi-1)
 (srfi srfi-26)
 (gnu packages crates-graphics)
 (gnu packages rust-apps))

(define-public rust-pretty-assertions-0.5
  (package
(inherit rust-pretty-assertions-0.6)
(name "rust-pretty-assertions")
(version "0.5.1")
(source
 (origin
   (method url-fetch)
   (uri (crate-uri "pretty_assertions" version))
   (file-name
(string-append name "-" version ".tar.gz"))
   (sha256
  (base32 "1ins6swkpxmrh8q5h96h8nv0497d3dclsiyx2lyvqi6py0q980is"
(build-system cargo-build-system)
(arguments
 `(#:tests? #f
   #:cargo-inputs
   (("rust-ansi-term" ,rust-ansi-term-0.11)
("rust-difference" ,rust-difference-2))

;; /gnu/store/wknzymkfbfjbxwfd3djrn4hk9zdfgs56-rust-xmlparser-0.13.3 -- original
;; 
;; libxmlparser-f82b201ea4144ed3.rlib

(define-public myrust-xmlparser
  (package
(inherit rust-xmlparser-0.13)
(outputs '("out" "rlib"))
(arguments
 `(#:skip-build? #f
   #:tests? #f
   #:cargo-build-flags (list "--release" "-vv")
   #:phases
   (modify-phases %standard-phases
 (add-after 'install 'install-rlib
   (lambda* (#:key outputs #:allow-other-keys)
 (let* ((rout (assoc-ref outputs "rlib"))
(dest (string-append rout "/rlib")))
   ;;(mkdir dest)
   ;;(for-each (cut install-file <> (string-append rout "/rlib"))
   (for-each (lambda (fn)
   (install-file fn (string-append rout "/rlib")))
 (find-files "target/release/deps" "\\.(rlib|rmeta)$"))
   )))
  ;; (add-after 'install 'fail
  ;;   (lambda _ #f))
)

(define-public myrust-roxmltree
  (package
(inherit rust-roxmltree-0.14)
;;(outputs '("out" "crate"))
(inputs
 `(("rust-xmlparser" ,myrust-xmlparser "rlib")))
(arguments
 `(#:skip-build? #f
   #:tests? #f
   ;;#:vendor-dir "/tmp/src"
   #:cargo-build-flags (list "--release" "-vv")
   #:cargo-inputs
   (("rust-xmlparser:src" ,rust-xmlparser-0.13)
("rust-pretty-assertions" ,rust-pretty-assertions-0.5))
   #:phases
   (modify-phases %standard-phases
 (add-after 'patch-cargo-checksums 'bring-in-rlib
   (lambda* (#:key inputs #:allow-other-keys)
 (let* ((rin (assoc-ref inputs "rust-xmlparser"))
(src (assoc-ref inputs "rust-xmlparser"))
(rlib (string-append rin "/rlib")))
   (mkdir "/tmp/guix-build-rust-xmlparser-0.13.3.drv-0/")
   (copy-recursively
   "guix-vendor/rust-xmlparser-0.13.3.tar.gz"
   "/tmp/guix-build-rust-xmlparser-0.13.3.drv-0/xmlparser-0.13.3")
   (rename-file
"guix-vendor/rust-xmlparser-0.13.3.tar.gz"
"../rust-xmlparser-0.13.3.tar.gz")
   (symlink
"/tmp/guix-build-rust-xmlparser-0.13.3.drv-0/xmlparser-0.13.3"
"guix-vendor/rust-xmlparser-0.13.3")
;;(let ((port (open-file ".cargo/config" "w" #:encoding "utf-8")))
;;  (display "
;; #paths = [\"/tmp/guix-build-rust-xmlparser-0.13.3.drv-0/xmlparser-0.13.3\"]

;; [source.crates-io]
;; replace-with = 'vendored-sources'

;; #[patch.crates-io]
;; #xmlparser = { path = '/tmp/guix-build-rust-xmlparser-0.13.3.drv-0/xmlparser-0.13.3' }

;; [source.vendored-sources]
;; directory = '" port)
;;  (display

Addons (esp. pEp) for Icecat/Thunderbird

2021-05-04 Thread Hartmut Goebel

Hello,

pEp just published a beta version of pEp for Thunderbird. Since guix 
already provides the required base packages (sequoia, pEp-engine) it 
would be great if we could provide the icecat/thunderbird add-on, too.


Anybody willing to support me in packaging the add-on? Or the other way 
round: somebody to package the add-on, whom I'd happily support?


https://dev.pep.foundation/Thunderbird

--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: rust-tempfile-3 update to 3.2.0 breaks sequoia build

2021-05-04 Thread Hartmut Goebel

Hi Nicolas,

thanks for the review. Unfortunately I pushed the change just a few 
minutes ago :-(



In `fix-permissions' phase, are you sure you need #o644 permission?
Otherwise, you may want to use `make-file-writeable'.


Thanks for this tip, I didn't know this function. I'll keep it in mind 
for further patches.


--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: rust-tempfile-3 update to 3.2.0 breaks sequoia build

2021-05-02 Thread Hartmut Goebel
Hi Nicolas,

I was able to fix sequoia by updating to 1.1 and applying some more
changes. Please review <http://issues.guix.gnu.org/issue/48154> so we
can get it into upcoming guix 1.3. Thanks in advance

-- 
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: #:cargo-inputs don't honor --with-input

2021-05-01 Thread Hartmut Goebel

Hi Ludo,

Am 30.04.21 um 12:45 schrieb Ludovic Courtès:


Uh.  More generally, Rust packages kinda create a “shadow dependency
graph” via #:cargo-inputs & co., which breaks all the tools that are
unaware of it.  It was discussed several times on this list, and
apparently it’s unfortunately unavoidable at this time.  :-/


Maybe we can get rid of #:cargo-inputs at least:

guix/build-system/cargo.scm says: "Although cargo does not permit cyclic 
dependencies between crates, however, it permits cycles to occur via 
dev-dependencies"

So we could change #:cargo-inputs into normal inputs and get at least 
part of the dependencies right.


I'm aware of the "special treatment" of cargo-inputs. Anyhow we could 
apply the following changes to the cargo build-system:


 *

   The cargo build-system copies the "pre-built crate" (more on this
   below) into a new output called "rlib" or "crate". There already is
   a phase "packaging" which only needs to be changed to use the other
   output.

 *

   All of today's #:cargo-inputs will be changed into normal inputs
   using the "rlib/crate" output. (To avoid duplicate assoc-rec keys we
   might need to change the name/keys, but this should be a minor issue.)

 *

   If required, the cargo build-system can easily identify former
   #:cargo-inputs  by being inputs from a "rlib/crate" output.

Benefits up to here:

 * The dependency graph would be much more complete - although
   "#:cargo-development-inputs" would still be missing.
 * Package transformation options would work -again except for
   "#:cargo-development-inputs".
 * If(!) we actually manage to make cargo pick "pre-built" crates,
   package definition will already be adjusted to use them.

Drawbacks up to here:

 * Since the "packaging" phase copies the source, there is not much
   benefit in having a "rlib/crate" output yet. Actually, when a
   "rlib/crate" output needs to be built, the user will end up with two
   copies of the source (one from the git checkout, one from packaging)

About "pre-built" crate: Given the many possible ways to build crates 
(e.g. switching on and off "features", different crate types), we might 
never be able to provide pre-built packages for all cases. Thus we might 
end up always providing the source, even if we manage to make cargo pick 
of pre-built artifacts.


About the output name: Rust has a notion of "rlib" (a specialized .a 
file), which seems to be the pre-built artifacts we are seeking. Thus 
the proposed name.


WDYT?

--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |



#:cargo-inputs don't honor --with-input

2021-04-28 Thread Hartmut Goebel

Hi,

FYI: yet another rust issue: #:cargo-inputs don't honor --with-input.

--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: rust-tempfile-3 update to 3.2.0 breaks sequoia build

2021-04-06 Thread Hartmut Goebel
Am 04.04.21 um 11:08 schrieb Nicolas Goaziou:
> Not really. Is it possible to upgrade it to a more recent commit? I see
> that Cargo.lock references tempfile 3.2.0 in HEAD.

The issue is not in tempfile, it's just caused by this update. Thus I'm
not confident updating sequoia to a more recent version will fix the issue.

-- 
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: Rust and parametric packages

2021-04-03 Thread Hartmut Goebel
Am 17.03.21 um 19:23 schrieb Léo Le Bouter:
> I advise you look there also: 
> https://rust-lang.zulipchat.com/#narrow/stream/246057-t-cargo/topic/rlib-intermediate-object-reuse/near/225653640

Access requires a login. Is there a publicly available mirror?

-- 
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




rust-tempfile-3 update to 3.2.0 breaks sequoia build

2021-03-30 Thread Hartmut Goebel
Hi Nicolas,

building sequoia is currently broken in master with
"syn::export::ToTokens" not found.

I tracked this down to 6513650d40f74 "gnu: rust-tempfile-3: Update to
3.2.0." (2021-02-16). The updated package also updates some
dependency-requirements: cfg-if 0.1 -> 1, rand 0.7 -> 0.8 and
redox-syscall 0.1 -> 0.2.

While in theory - according to semantic versioning - updating 3.1.x to
3.2.x should not have any effect on other packages, this update has :-(
I tried building with rust-tempfile-3.1, with no success - this fails
with "rand::rngs::OsRng" missing. Updating rust-rand-core-0.6 to 0.6.2
(since 0.6.1 was yanked), did not help either - then I gave up.

Any ideas how to make sequoia build again?

-- 
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |





Re: 03/163: build/python: Add a new guix-pythonpath procedure.

2021-03-27 Thread Hartmut Goebel
Am 14.03.21 um 01:58 schrieb Maxim Cournoyer:
>>>> 6) Please add some more comments to the code explaining the idea.
>>> I was under the impression the code was concise enough to forego with
>>> verbose explanations; I'd rather keep it this way.
>> Please add some comments. I had a hard time understanding it - and I was
>> not even sure I understood, see my question (1).
> I'm spread thin right now, so if you could prepare a patch addressing
> the above for me to review, that'd be much appreciated.  Otherwise I'll
> get to it, but it won't be before some time.
>
Sorry, I'm too short on time ATM - and I'm afraid this will not change
anytime soon. Just take my questions as guidance for what the comments
should answer to make others (me :-) understand what is going on.

Many thanks.

-- 
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |



Re: 03/163: build/python: Add a new guix-pythonpath procedure.

2021-03-07 Thread Hartmut Goebel
Hi Maxim,
> Sorry for the delay.

No problem, I reward this with another delay ;-) (Just kidding ;-)


> Hartmut Goebel  writes:
>
>> 2) This does not remove duplicates and does not honor .pth files in
>> the respective directories - which might still be used. Thus
>> site.addsitedir() should be called for adding the paths. This also
>> takes care about duplicates.
> I confess I didn't pay attention to .pth files, which mostly seemed like
> legacy cruft to me; are they still used in the context of PEP 517 and
> modern Python packaging?  

I can't tell for sure. (I remember having seen a note about .pth still
being used in some setuptools trick, but can't find it now.) Anyhow, since
site.py still supports it, I would prefer to be on the safe side and
support it, too.

> The problem with calling site.addsitedir is
> that it simply appends to sys.path.  We want to splice in the content of
> GUIX_PYTHONPATH at a controlled location.

site.addsitedir takes an optional second argument into which the paths
are collected.
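
For illustration, a hypothetical sitecustomize.py sketch (not the actual
Guix file) that feeds the GUIX_PYTHONPATH entries through addsitedir, so
that .pth files are processed and duplicates are skipped:

import os
import site

for d in filter(None, os.environ.get("GUIX_PYTHONPATH", "").split(os.pathsep)):
    site.addsitedir(d)

(addsitedir appends at the end of sys.path, so splicing the entries in
at a specific position would still need an extra reordering step.)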


>> 4) Since PYTHONPATH is evaluated prior to importing sitecustomize, any
>> sitecustominze.py in the user's path will overwrite our file, thus
>> inhibiting our paths to be added. Not sure this is what we want in Guix.
> I asked guidance on the #python channel on freenode and was recommended
> to use sitecustomize.py for this purpose; reading the doc here seems to
> confirm our usage of it is as intended [0]:

IC.


>> 6) Please add some more comments to the code explaining the idea.
> I was under the impression the code was concise enough to forego with
> verbose explanations; I'd rather keep it this way.

Please add some comments. I had a hard time understanding it - and I was
not even sure I understood, see my question (1).


Another point, which came to my mind just now: Do virtual
environments still work as expected? (With --system-site-packages,
packages in the profile are available, but venv packages override them.
Without --system-site-packages, packages in the profile are *not*
available.)


-- 
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |





Re: option hint for all commands?

2021-03-01 Thread Hartmut Goebel

Am 25.02.21 um 00:47 schrieb zimoun:

Other said, all the commands using ’parse-command-line’ enjoy the typo
hinter and all the commands using ’args-parse*’ don’t.  I am proposing
to use ’parse-command-line’ for all the commands .  Any objection?


+1

--
Schönen Gruß
Hartmut Goebel
Dipl.-Informatiker (univ), CISSP, CSSLP, ISO 27001 Lead Implementer
Information Security Management, Security Governance, Secure Software 
Development


Goebel Consult, Landshut
http://www.goebel-consult.de

Blog: https://www.goe-con.de/blog/alternative-android-betriebssystem
Kolumne: 
https://www.goe-con.de/hartmut-goebel/cissp-gefluester/2011-09-kommerz-uber-recht-fdp-die-gefaellt-mir-partei






Re: Discover GNU Guix eco-system with awesome-guix!

2021-02-13 Thread Hartmut Goebel

Am 09.02.21 um 17:16 schrieb Léo Le Bouter:

Commonly awesome lists are used to share links to all things related to
some topic or some software


I wonder why not just call it "Link list" then? This would be much 
easier to understand for non-nerds.


--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: Changes to the branching workflow

2021-02-13 Thread Hartmut Goebel

Am 12.02.21 um 21:49 schrieb Andreas Enge:

 From what I understood of the discussion, I would also go with Tobias's and
Efraim's suggestion: There is a core-updates branch that is constantly open
and where people can push; this does not seem to leave a possibility of
mistake, almost by definition. Then we can branch off core-updates-frozen,
which is frozen :), except for cherry-picked bug fixing commits and merges
from master. Once it is finished, it is merged into master and deleted.


This is what I understood, too.


Technically speaking, this is the same as your suggestion, Leo, but it
avoids the constant dance between core-updates, that disappears and
reappears under the name core-updates-next, that disappears and reappears
under the name core-updates, and so on.
It's even worse: When removing staging and core-updates at Savannah, 
this does not affect local copies. Thus one might push these branches to 
Savannah again, which might lead to a lot of confusion and trouble.


--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: Changes to the branching workflow

2021-02-13 Thread Hartmut Goebel

Am 11.02.21 um 23:42 schrieb Leo Famulari:

The default branch names remain "core-updates" and "staging".

[…]

During those periods, new patches can be pushed to "core-updates-next"
and "staging-next".


What would be the use of these *-next branches? I can't see any, except 
confusing committers. Why can't one push to staging and core-updates 
during the active part of the cycle?


--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: [DOUBT]: native-search-paths VS search-paths

2021-02-09 Thread Hartmut Goebel

Am 09.02.21 um 11:06 schrieb Leo Prikler:

Depends on the package.  If it gets propagated into the build
environment, the variable is set as well.  At other times, it might be
set through the wrap phase for runtime purposes.


This makes me wonder whether the wrap phase of the qt-build-system does 
it right (even after 45784 is merged): It searches the "inputs" for some 
directories. This has the major drawback of including native-inputs 
(most notably: cmake).


Now I wonder whether the correct paths are already available as a 
"search-path"?


--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




Re: Questions regarding Python packaging

2021-02-05 Thread Hartmut Goebel

Am 23.01.21 um 13:34 schrieb Lars-Dominik Braun:

Remove pip and
setuptools from python (saves almost 20MiB from the closure


When doing so we need to be careful. pip is expected to be available 
after installing "python". So when removing pip and setuptools, we would 
need some "python/bootstrap" package without pip and setuptools and some 
"python" package still including both.



--
Regards
Hartmut Goebel

| Hartmut Goebel  | h.goe...@crazy-compilers.com   |
| www.crazy-compilers.com | compilers which you thought are impossible |




  1   2   3   4   5   6   7   8   9   10   >