Hey Niels,

> As far as I can tell, the underlying desire is to do some form of
> deduplication.
Yes, indeed.

> If so, then I think this is similar to #888397 with an expanded scope
> and a proposal for doing this via dh_link (rather than dh_installdocs
> or a new helper).

I guess my proposal is actually narrower and more specific. That other
report asks about more generic deduplication (run on the entire package,
or maybe a subset of files) that automatically finds duplicate files.
That could be useful, but it also quickly gets a lot more complicated
(more work to find the duplicates, to decide which copy is the canonical
one, and to handle any exceptions). I'm also less inclined to completely
automate this; I'd rather make some more explicit choices (though
something like "link files in *this* directory to *that* directory if
you find duplicates" could be a nice compromise between automatic and
manual work, maybe).

Regardless, my suggestion would make manual and explicit deduplication a
bit easier, at the expense of having to manually update the list of
duplicate files on upstream changes (but with my suggestion, detecting
files that are no longer duplicated is automatic, and lintian can, I
think, already detect *new* duplicate files, so together that would be
enough to keep all duplicates fixed).
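
To illustrate that explicit-list idea: the maintainer records known
duplicates in a list, and the helper refuses to link entries that are no
longer byte-identical, so stale entries surface automatically on
upstream changes. The file name "debian/foo.dedup-links" and its format
below are invented for this sketch, not an existing debhelper interface.

    #!/usr/bin/env python3
    # Illustrative sketch only: "debian/foo.dedup-links" and its format
    # are invented for this example, not an existing debhelper interface.
    import filecmp
    import os
    import sys
    from pathlib import Path

    def apply_dedup_links(listfile, package_root):
        ok = True
        for line in Path(listfile).read_text().splitlines():
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            # Each entry: <duplicate-to-replace> <canonical-file>,
            # both relative to the package root.
            dup, canonical = line.split()
            dup_path = Path(package_root) / dup
            canonical_path = Path(package_root) / canonical
            if (not dup_path.is_file()
                    or not canonical_path.is_file()
                    or not filecmp.cmp(dup_path, canonical_path, shallow=False)):
                # The listed files have diverged (or disappeared), so the
                # maintainer should drop or update this entry.
                print(f"stale entry: {dup} is not a duplicate of {canonical}",
                      file=sys.stderr)
                ok = False
                continue
            dup_path.unlink()
            dup_path.symlink_to(os.path.relpath(canonical_path, dup_path.parent))
        return ok

    if __name__ == "__main__":
        if not apply_dedup_links("debian/foo.dedup-links", "debian/foo"):
            sys.exit(1)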

I think the two issues are thus sufficiently separate (in how they work
and how they would be implemented), and both could be valuable to
implement side-by-side eventually, so I would suggest keeping them both
open for now.

Gr.

Matthijs
