Re: Packaging dependencies for mailman3-hyperkitty

2016-03-25 Thread Paul Wise
On Fri, 2016-03-25 at 19:35 +0100, Pierre-Elliott Bécue wrote:

> That's in progress; the only goal of this detection is to deactivate
> the dynamic JavaScript loading of threads. We're thinking about
> alternative solutions.

I don't understand why you would deactivate dynamic JavaScript loading
for bots. They typically don't support JavaScript (Google being a notable
exception), and you should have a fallback for people who turn JavaScript
off anyway.

I hope the mailman3/HyperKitty web interfaces use progressive enhancement:

http://en.wikipedia.org/wiki/Progressive_enhancement
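
For what it's worth, the no-JS fallback can be as simple as server-side
pagination with real links that client-side script later intercepts. A
framework-free Python sketch of the idea (the function and data shapes
are hypothetical, not HyperKitty's actual code):

```python
# Progressive enhancement for a thread listing: the server always
# returns a complete page with real "?page=N" links, so the archive
# stays browsable with JavaScript disabled; client script can then
# intercept those links and load further pages dynamically as a pure
# enhancement. (Hypothetical sketch, not HyperKitty's real code.)

def render_thread_page(threads, page=1, per_page=2):
    """Render one slice of threads as complete, JS-free HTML."""
    start = (page - 1) * per_page
    items = "\n".join(
        f'<li><a href="/thread/{t["id"]}">{t["subject"]}</a></li>'
        for t in threads[start:start + per_page]
    )
    # Only emit a "next page" link while older threads remain.
    nav = (f'<a href="?page={page + 1}">Older threads</a>'
           if start + per_page < len(threads) else "")
    return f"<ul>\n{items}\n</ul>\n{nav}"
```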

> I understand your point, and I'll think about it, but my goal is to make
> upstream remove obsolete dependencies. django-libravatar seems to be the
> only project that bundles support for that, and it's not maintained, whereas
> django-gravatar2 is still maintained.

I guess you are talking about this project; it hasn't had any issues
filed, so it probably works as-is without needing any changes:

https://github.com/fnp/django-libravatar

> So, for now, I think I'd rather get a first mailman3 suite into Debian,
> and then think about how to make things better. :)

I see.

-- 
bye,
pabs

https://wiki.debian.org/PaulWise






Re: Packaging dependencies for mailman3-hyperkitty

2016-03-25 Thread Pierre-Elliott Bécue
On Friday, 25 March 2016 at 13:02:55 +0800, Paul Wise wrote:
> On Thu, Mar 24, 2016 at 11:43 PM, Pierre-Elliott Bécue wrote:
> 
> > Packaging dependencies for mailman3-hyperkitty
> 
> Does HyperKitty depend on mailman3 or just enhance it by providing an
> archive web interface? If the latter, I would suggest calling it
> hyperkitty instead of mailman3-hyperkitty.
> 
> > robot-detection suffers the same illness, but it's tiny, so it would be
> > possible to integrate it into hyperkitty or make it optional.
> 
> Embedded code copies are against Debian policy, please package it
> separately or get upstream to switch to something else.
> 
> https://wiki.debian.org/EmbeddedCodeCopies
> 
> Something like that sounds impossible to keep usefully up to date in
> Debian stable though, since the landscape of web robots changes
> continually and many of them aim to emulate browsers.
> 
> https://pypi.python.org/pypi/robot-detection
> 
> In addition, it seems to be woefully inadequate for that since the API
> doesn't appear to take into account IP address ranges.
> 
> It also depends on the robotstxt.org database, which would need to be
> packaged separately and is also no longer kept up to date at this
> time:
> 
> http://www.robotstxt.org/db.html
> 
> "This robots database is currently undergoing re-engineering. Due to
> popular demand we have restored the existing data, but
> addition/modification are disabled."
> 
> As the page says, there is a better database of user-agents available
> 
> http://www.botsvsbrowsers.com/
> http://www.botsvsbrowsers.com/category/1/index.html
> 
> Unfortunately this is incompatible with the data format used by
> robotstxt.org/robot-detection:
> 
> http://www.robotstxt.org/db/all.txt
> 
> As you can see from the botsvsbrowsers.com data, the User-Agent field
> is often bogus or contains vulnerability attack patterns, so it is
> mostly not useful and should probably just be ignored by all web apps
> at this point.
> 
> So I would suggest convincing upstream to remove whatever use of
> robot-detection is present in mailman3 or hyperkitty.

That's in progress; the only goal of this detection is to deactivate the
dynamic JavaScript loading of threads. We're thinking about alternative
solutions.

> > That leaves me with django-gravatar2, which seems useful and is still
> > developed. I heard there is some kind of "canonical" way of packaging
> > django apps. As I'm not used to that, I'm here to ask for advice.
> 
> I would suggest upstream switch from Gravatar (a centralised
> proprietary service) to Libravatar (a federated Free Software service
> that falls back on Gravatar):
> 
> https://www.libravatar.org/

I understand your point, and I'll think about it, but my goal is to make
upstream remove obsolete dependencies. django-libravatar seems to be the
only project that bundles support for that, and it's not maintained, whereas
django-gravatar2 is still maintained.

So, for now, I think I'd rather get a first mailman3 suite into Debian,
and then think about how to make things better. :)

> Re canonical django packaging, you may be talking about this:
> 
> https://wiki.debian.org/DjangoPackagingDraft
> 
> There are also lots of python-django-* packages in Debian that you
> could look at.

Thanks!

-- 
PEB



Re: Packaging dependencies for mailman3-hyperkitty

2016-03-25 Thread Barry Warsaw
On Mar 25, 2016, at 01:02 PM, Paul Wise wrote:

>Does HyperKitty depend on mailman3 or just enhance it by providing an
>archive web interface?

Although greatly enhanced by it, Mailman 3 (core) doesn't require HyperKitty.
HK isn't currently useful on its own though.

Cheers,
-Barry




Re: Packaging dependencies for mailman3-hyperkitty

2016-03-24 Thread Paul Wise
On Thu, Mar 24, 2016 at 11:43 PM, Pierre-Elliott Bécue wrote:

> Packaging dependencies for mailman3-hyperkitty

Does HyperKitty depend on mailman3 or just enhance it by providing an
archive web interface? If the latter, I would suggest calling it
hyperkitty instead of mailman3-hyperkitty.

> robot-detection suffers the same illness, but it's tiny, so it would be
> possible to integrate it into hyperkitty or make it optional.

Embedded code copies are against Debian policy, please package it
separately or get upstream to switch to something else.

https://wiki.debian.org/EmbeddedCodeCopies

Something like that sounds impossible to keep usefully up to date in
Debian stable though, since the landscape of web robots changes
continually and many of them aim to emulate browsers.

https://pypi.python.org/pypi/robot-detection

In addition, it seems to be woefully inadequate for that since the API
doesn't appear to take into account IP address ranges.
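
For illustration, the whole approach reduces to matching a regex over
the User-Agent header. This self-contained Python sketch mimics the kind
of check robot-detection performs (the pattern list here is illustrative,
not the robotstxt.org database the real package compiles), and it makes
the limitation concrete: a client sending a browser User-Agent sails
through, and there is nowhere to feed in an IP address:

```python
import re

# Illustrative pattern list -- NOT the robotstxt.org database that the
# real robot-detection package builds its regex from.
BOT_PATTERN = re.compile(r"googlebot|bingbot|crawler|spider", re.I)

def looks_like_robot(user_agent):
    """Return True if the User-Agent matches a known-bot pattern.

    This check is User-Agent-only: a bot that copies a browser
    User-Agent passes as human, and there is no parameter for the
    requester's IP address, so published crawler IP ranges cannot be
    cross-checked.
    """
    return bool(BOT_PATTERN.search(user_agent or ""))
```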

It also depends on the robotstxt.org database, which would need to be
packaged separately and is also no longer kept up to date at this
time:

http://www.robotstxt.org/db.html

"This robots database is currently undergoing re-engineering. Due to
popular demand we have restored the existing data, but
addition/modification are disabled."

As the page says, there is a better database of user-agents available

http://www.botsvsbrowsers.com/
http://www.botsvsbrowsers.com/category/1/index.html

Unfortunately this is incompatible with the data format used by
robotstxt.org/robot-detection:

http://www.robotstxt.org/db/all.txt

As you can see from the botsvsbrowsers.com data, the User-Agent field
is often bogus or contains vulnerability attack patterns, so it is
mostly not useful and should probably just be ignored by all web apps
at this point.

So I would suggest convincing upstream to remove whatever use of
robot-detection is present in mailman3 or hyperkitty.

> That leaves me with django-gravatar2, which seems useful and is still
> developed. I heard there is some kind of "canonical" way of packaging
> django apps. As I'm not used to that, I'm here to ask for advice.

I would suggest upstream switch from Gravatar (a centralised
proprietary service) to Libravatar (a federated Free Software service
that falls back on Gravatar):

https://www.libravatar.org/
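
For reference, Libravatar deliberately reuses Gravatar's addressing
scheme (an MD5 hex digest of the trimmed, lowercased address), so
switching is largely a matter of the base URL. A minimal Python sketch,
skipping the DNS SRV lookup (_avatars._tcp.<domain>) that a fully
federated client would perform first:

```python
import hashlib

def libravatar_url(email, size=80):
    """Build an avatar URL for Libravatar's public CDN.

    Both Gravatar and Libravatar key avatars on the MD5 hex digest of
    the trimmed, lowercased email address. A full federated client
    would first do a DNS SRV lookup for _avatars._tcp.<domain> to find
    the domain's own avatar server; this sketch goes straight to the
    public CDN, which itself falls back to Gravatar for unknown hashes.
    """
    digest = hashlib.md5(email.strip().lower().encode("utf-8")).hexdigest()
    return f"https://seccdn.libravatar.org/avatar/{digest}?s={size}"
```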

Re canonical django packaging, you may be talking about this:

https://wiki.debian.org/DjangoPackagingDraft

There are also lots of python-django-* packages in Debian that you
could look at.

-- 
bye,
pabs

https://wiki.debian.org/PaulWise