Re: Intent to implement: ScrollTimeline

2017-03-24 Thread Zibi Braniecki
Is it possible to use this, or is there a similar proposal, for linking an
animation timeline to other user-controlled means of interacting with the UI?

I'm thinking primarily about things like:

 - drag - the percentage of the distance between the source and target is
linked to the animation timeline
 - touch events - unfolding or moving an element with a thumb on mobile
triggers an animation linked to the percentage of the distance between
folded and unfolded (see the sketch below)
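
For what it's worth, this can already be approximated today by scrubbing a
paused Web Animation from pointer events. A minimal TypeScript sketch, in
which the element, keyframes, and drag distance are assumptions made for
illustration:

    // Scrub a paused animation from a vertical drag, mapping the dragged
    // fraction of DRAG_DISTANCE onto the animation's [0, 1000 ms] timeline.
    const sheet = document.querySelector<HTMLElement>('#sheet')!; // assumed element
    const anim = sheet.animate(
      [{ transform: 'translateY(100%)' }, { transform: 'translateY(0)' }],
      { duration: 1000, fill: 'both' }
    );
    anim.pause(); // progress is driven by the gesture below, not by time

    const DRAG_DISTANCE = 300; // px between folded and unfolded (assumed)
    let startY = 0;
    sheet.addEventListener('pointerdown', (e) => {
      startY = e.clientY;
      sheet.setPointerCapture(e.pointerId);
    });
    sheet.addEventListener('pointermove', (e) => {
      if (e.buttons === 0) return; // only scrub while the pointer is down
      const fraction =
        Math.min(Math.max((startY - e.clientY) / DRAG_DISTANCE, 0), 1);
      anim.currentTime = fraction * 1000; // map drag progress onto the timeline
    });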

Thanks,
zb.


Intent to implement: ScrollTimeline

2017-03-24 Thread Botond Ballo
Summary:
  Scroll-linked animations are a way for web developers to write
  web animations whose progress is linked to scrolling rather than
  to time.
  ScrollTimeline is the JS API for creating scroll-linked animations
  (the spec also contains a CSS API, which will be implemented at
  a later time).
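
  For concreteness, a rough sketch of what using the draft JS API could
  look like, written as TypeScript. The option names follow the current
  WICG draft and may well change before standardization; the declaration
  at the top is needed only because the API is experimental and not yet
  in the standard type definitions:

    // Loose shape for the experimental API, assumed from the draft spec.
    declare class ScrollTimeline {
      constructor(options: {
        scrollSource?: Element; // element whose scroll position drives progress
        orientation?: string;   // scroll axis to track, e.g. 'vertical'
        timeRange?: number;     // scroll range is mapped onto [0, timeRange] ms
      });
    }

    // Drive a progress bar from the scroll position of #scroller.
    const effect = new KeyframeEffect(
      document.querySelector('#progress'),
      { transform: ['scaleX(0)', 'scaleX(1)'] },
      { duration: 1000, fill: 'both' }
    );
    const timeline = new ScrollTimeline({
      scrollSource: document.querySelector('#scroller')!,
      orientation: 'vertical',
      timeRange: 1000,
    });
    new Animation(effect, timeline as unknown as AnimationTimeline).play();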

Bug:
  Tracking bug for scroll-linked animations: bug 1281348 [1]
  Tracking bug for the initial landing of ScrollTimeline: bug 1321428 [2]

Link to standard:
  https://wicg.github.io/scroll-animations/

  This spec is fairly unstable for the time being, and is likely to
  undergo significant changes before final standardization.
  However, I believe the core mechanism that I'm proposing to
  implement here is reasonably stable, and web developers will
  benefit from having it in a Firefox build so they can play around
  with it.

Platform coverage: where will this be available?
  All platforms.

Estimated or target release:
  Not known yet. Probably not very soon as the spec still needs
  to stabilize.

Preference behind which this will be implemented:
  dom.animations-api.scroll-driven.enabled

  Due to the early stage of standardization of this feature, it will
  initially only be available on Nightly _and_ behind a pref (that is,
  you need to be running Nightly and have the pref flipped to
  use it).

Is this feature enabled by default in sandboxed iframes?
  Yes

If allowed, does it preserve the current invariants in terms of what
sandboxed iframes can do?
  I believe so, but I'm not an authority on sandboxing.

DevTools bug:
  Bug 1350461 [3]

Do other browser engines implement this?
  Safari originally proposed the CSS properties that this spec
  is based on [4], though I'm not aware of any recent activity relating
  to it.

  Blink has a very similar proposal [5] which we expect will be
  consolidated into the WICG spec linked above.

Tests:
  Currently just layout/reftests/async-scrolling/animation.
  Bug 1324605 [6] tracks writing web platform tests.

Cheers,
Botond

[1] https://bugzilla.mozilla.org/show_bug.cgi?id=1281348
[2] https://bugzilla.mozilla.org/show_bug.cgi?id=1321428
[3] https://bugzilla.mozilla.org/show_bug.cgi?id=1350461
[4] https://lists.w3.org/Archives/Public/www-style/2014Sep/0135.html
[5] https://github.com/drufball/generalized-animations
[6] https://bugzilla.mozilla.org/show_bug.cgi?id=1324605


Re: Rationalising Linux audio backend support

2017-03-24 Thread Trevor Saunders
On Fri, Mar 24, 2017 at 11:45:57AM -0700, kthies...@mozilla.com wrote:
> On Wednesday, March 22, 2017 at 9:42:02 PM UTC-7, jtkel...@gmail.com wrote:
> > On Wednesday, March 22, 2017 at 1:35:06 PM UTC-5, Botond Ballo wrote:
> > > Based on this new information, might there be room to reconsider this 
> > > decision?
> > 
> > Even if you do not reconsider the full decision, could you at least turn it 
> > back on for v52, so it can ride the ESR train? This has also been mentioned 
> > in the bug in question. 
> > 
> > This would give users a longer period of time to try and transition to 
> > PulseAudio or find a solution like apulse to keep Firefox audio working for 
> > them. And it would give you more time to accept patches on the ALSA 
> > backend, without impeding work on the 5.1 audio support for PulseAudio. 
> > 
> > Users would have to consciously choose to use the ESR version once v53 
> > comes out, hence you will be sure that they will be aware of the potential 
> > dropping of ALSA support. 
> > 
> > It seems a simple fix to give the disgruntled users something and ease 
> > hostilities, making things easier on all the devs involved, and give the 
> > users a feeling that Mozilla listens, so they won't have another reason to 
> > leave. 
> > 
> > Clearly a lot of people are affected. A lot of smaller distros use Firefox 
> > as their default, and do not want to include PulseAudio which adds a 
> > further complication and is unnecessary for nearly every other Linux app.
> 
> Agreed.  PulseAudio is "opinionated software," much like systemd, by the same 
> author.  Some folks feel that PulseAudio's opinions, like systemd's, run 
> counter to the practices of good Linux/Unix system operators.  Insisting on 
> PulseAudio discourages the inclusion of Firefox in the smaller, less 
> commercially-focused distros.

Speaking only for myself, and even leaving aside my opinions about
PulseAudio's opinions, its opinions can come into conflict with other
software that people care about more than audio.  That happens to me
personally, and while I still use Firefox, I've accepted not having audio
in the browser.

Trev

> 
> Also, please be aware that this will have effects on *BSD ports of Firefox.  
> Most of the people who run BSD by choice want nothing to do with "Linuxisms" 
> like PulseAudio.
> 
> Thanks for reading,
> --KT.
> 
> --
> Karl THIESSEN
> Systems group, Firefox Test Engineering,
> Mozilla Corp.


Re: Rationalising Linux audio backend support

2017-03-24 Thread kthiessen
On Wednesday, March 22, 2017 at 9:42:02 PM UTC-7, jtkel...@gmail.com wrote:
> On Wednesday, March 22, 2017 at 1:35:06 PM UTC-5, Botond Ballo wrote:
> > Based on this new information, might there be room to reconsider this 
> > decision?
> 
> Even if you do not reconsider the full decision, could you at least turn it 
> back on for v52, so it can ride the ESR train? This has also been mentioned 
> in the bug in question. 
> 
> This would give users a longer period of time to try and transition to 
> PulseAudio or find a solution like apulse to keep Firefox audio working for 
> them. And it would give you more time to accept patches on the ALSA backend, 
> without impeding work on the 5.1 audio support for PulseAudio. 
> 
> Users would have to consciously choose to use the ESR version once v53 comes 
> out, hence you will be sure that they will be aware of the potential dropping 
> of ALSA support. 
> 
> It seems a simple fix to give the disgruntled users something and ease 
> hostilities, making things easier on all the devs involved, and give the 
> users a feeling that Mozilla listens, so they won't have another reason to 
> leave. 
> 
> Clearly a lot of people are affected. A lot of smaller distros use Firefox as 
> their default, and do not want to include PulseAudio which adds a further 
> complication and is unnecessary for nearly every other Linux app.

Agreed.  PulseAudio is "opinionated software," much like systemd, by the same 
author.  Some folks feel that PulseAudio's opinions, like systemd's, run 
counter to the practices of good Linux/Unix system operators.  Insisting on 
PulseAudio discourages the inclusion of Firefox in the smaller, less 
commercially-focused distros.

Also, please be aware that this will have effects on *BSD ports of Firefox.  
Most of the people who run BSD by choice want nothing to do with "Linuxisms" 
like PulseAudio.

Thanks for reading,
--KT.

--
Karl THIESSEN
Systems group, Firefox Test Engineering,
Mozilla Corp.


Re: Future of out-of-tree spell checkers?

2017-03-24 Thread Bill McCloskey
If we do end up going with the dlopen plan, let's make sure that we enforce
some kind of code signing. We're finally almost rid of all the untrusted
binary code that we used to load (NPAPI, binary XPCOM, ctypes). It would be
a shame to open up a new path.

-Bill

On Fri, Mar 24, 2017 at 6:20 AM, Ehsan Akhgari 
wrote:

> On 2017-03-24 4:20 AM, Henri Sivonen wrote:
> > On Fri, Mar 24, 2017 at 2:38 AM, Ehsan Akhgari wrote:
> >> On Wed, Mar 22, 2017 at 11:50 AM, Jeff Muizelaar <jmuizel...@mozilla.com> wrote:
> >>>
> >>> On Wed, Mar 22, 2017 at 11:08 AM, Henri Sivonen wrote:
> >>>>
> >>>> dlopening libvoikko, if installed, and having thin C++ glue code
> >>>> in-tree seems much simpler, except maybe for sandboxing. What are the
> >>>> sandboxing implications of dlopening a shared library that will want
> >>>> to load its data files?
> >>>
> >>> My understanding is that the spell checker mostly lives in the Chrome
> >>> process so it seems sandboxing won't be a problem.
> >>
> >>
> >> That is mostly correct.  The spell checker *completely* lives in the parent
> >> process and is completely unaffected by sandboxing.
> >>
> >> But that's actually a problem.  My understanding is that WebExtensions won't
> >> be allowed to load code in the parent process.  Bill, Kris, is that correct?
> >> If yes, we should work with the maintainers of the Finnish and Greenlandic
> >> dictionaries on adding custom support for loading their code...
> >
> > But when (according to doing a Google Web search excluding mozilla.org
> > and wading through all the results and by searching the JS for all
> > AMO-hosted extensions) the only out-of-tree spell checkers use
> > libvoikko, why involve Web Extensions at all? Why wouldn't we dlopen
> > libvoikko and put a thin C++ adapter between libvoikko's C API and our
> > internal C++ interface in-tree? That would be significantly simpler
> > than involving Web extensions.
>
> Is that different than what I suggested above in some way that I'm
> missing?  I think it's better to engage the developers of those
> libraries first and ask them how they would like us to proceed.  At any
> rate, something has to change on their side, since after Firefox 57
> presumably Firefox would just ignore their XPI file or something.  The
> actual implementation mechanism would probably end up being the
> dlopening that you're suggesting, but if we're going to be signing up to
> doing that, we better have at least a communication channel with the
> authors of those libraries in case for example we need to change
> something on our interface some day.
>


Re: Better download security through browsers

2017-03-24 Thread Mike Hoye

Love it. How do we make it happen?

- mhoye

On 2017-03-24 1:30 PM, Tom Ritter wrote:

It seems like SubResource Integrity could be extended to do this...
It's specifically for the use case: where you kinda trust your CDN,
but you want to be completely sure.

-tom

On Fri, Mar 24, 2017 at 12:24 PM, Mike Hoye  wrote:

My 2006 proposal didn't get any traction either.

https://lists.w3.org/Archives/Public/public-whatwg-archive/2006Jan/0270.html

FWIW I still think it'd be a good idea with the right UI.

- mhoye


On 2017-03-24 1:16 PM, Dave Townsend wrote:

I remember that Gerv was interested in a similar idea many years ago, you
might want to see if he went anywhere with it.

https://blog.gerv.net/2005/03/link_fingerprin_1/


On Fri, Mar 24, 2017 at 10:12 AM, Gregory Szorc  wrote:


I recently reinstalled Windows 10 on one of my machines. This involved
visiting various web sites and downloading lots of software.

It is pretty common for software publishers to publish hashes or
cryptographic signatures of software so the downloaded software can be
verified. (Often times the download is performed through a CDN, mirroring
network, etc and you may not have full trust in the server operator.)

Unless you know how to practice safe security, you probably don't bother
verifying downloaded files match the signatures authors have provided.
Furthermore, many sites redundantly write documentation for how to verify
the integrity of downloads. This feels sub-optimal.

This got me thinking: why doesn't the user agent get involved to help
provide better download security? What my (not a web standard spec author)
brain came up with is standardized metadata in the HTML for the download
link (probably an <a>) that defines file integrity information. When the
user agent downloads that file, it automatically verifies file integrity
and fails the download or pops up a big warning box, etc. if things don't
check out. In other words, this mechanism would extend the trust anchor in
the source web site (likely via a trusted x509 cert) to file downloads.
This would provide additional security over (optional) x509 cert validation
of the download server alone. Having the integrity metadata baked into the
origin site is important: you can't trust the HTTP response from the
download server because it may be from an untrusted server.

Having such a feature would also improve the web experience. How many times
have you downloaded a corrupted file? Advanced user agents (like browsers)
could keep telemetry of how often downloads fail integrity. This could be
used to identify buggy proxies, malicious ISPs rewriting content, etc.

I was curious if this enhancement to the web platform has ever been
considered and/or if it is something Mozilla would consider pushing.

gps


Re: Better download security through browsers

2017-03-24 Thread Tom Ritter
It seems like SubResource Integrity could be extended to do this...
It's specifically for the use case where you kinda trust your CDN,
but you want to be completely sure.

-tom

On Fri, Mar 24, 2017 at 12:24 PM, Mike Hoye  wrote:
> My 2006 proposal didn't get any traction either.
>
> https://lists.w3.org/Archives/Public/public-whatwg-archive/2006Jan/0270.html
>
> FWIW I still think it'd be a good idea with the right UI.
>
> - mhoye
>
>
> On 2017-03-24 1:16 PM, Dave Townsend wrote:
>>
>> I remember that Gerv was interested in a similar idea many years ago, you
>> might want to see if he went anywhere with it.
>>
>> https://blog.gerv.net/2005/03/link_fingerprin_1/
>>
>>
>> On Fri, Mar 24, 2017 at 10:12 AM, Gregory Szorc  wrote:
>>
>>> I recently reinstalled Windows 10 on one of my machines. This involved
>>> visiting various web sites and downloading lots of software.
>>>
>>> It is pretty common for software publishers to publish hashes or
>>> cryptographic signatures of software so the downloaded software can be
>>> verified. (Often times the download is performed through a CDN, mirroring
>>> network, etc and you may not have full trust in the server operator.)
>>>
>>> Unless you know how to practice safe security, you probably don't bother
>>> verifying downloaded files match the signatures authors have provided.
>>> Furthermore, many sites redundantly write documentation for how to verify
>>> the integrity of downloads. This feels sub-optimal.
>>>
>>> This got me thinking: why doesn't the user agent get involved to help
>>> provide better download security? What my (not a web standard spec author)
>>> brain came up with is standardized metadata in the HTML for the download
>>> link (probably an <a>) that defines file integrity information. When the
>>> user agent downloads that file, it automatically verifies file integrity
>>> and fails the download or pops up a big warning box, etc. if things don't
>>> check out. In other words, this mechanism would extend the trust anchor in
>>> the source web site (likely via a trusted x509 cert) to file downloads.
>>> This would provide additional security over (optional) x509 cert validation
>>> of the download server alone. Having the integrity metadata baked into the
>>> origin site is important: you can't trust the HTTP response from the
>>> download server because it may be from an untrusted server.
>>>
>>> Having such a feature would also improve the web experience. How many times
>>> have you downloaded a corrupted file? Advanced user agents (like browsers)
>>> could keep telemetry of how often downloads fail integrity. This could be
>>> used to identify buggy proxies, malicious ISPs rewriting content, etc.
>>>
>>> I was curious if this enhancement to the web platform has ever been
>>> considered and/or if it is something Mozilla would consider pushing.
>>>
>>> gps


Re: Better download security through browsers

2017-03-24 Thread Ben Kelly
We now have SRI and support integrity attributes on elements like <script> and <link>.
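
For reference, a minimal sketch of SRI as it ships today: setting the
integrity and crossorigin attributes on a dynamically created script
element. The URL and digest below are placeholders:

    // Load a script with Subresource Integrity: the browser refuses to
    // execute it if the fetched bytes don't match the declared digest.
    const s = document.createElement('script');
    s.src = 'https://cdn.example.com/lib.js'; // placeholder URL
    s.integrity = 'sha384-[base64 digest here]'; // placeholder digest
    s.crossOrigin = 'anonymous'; // SRI requires CORS for cross-origin loads
    document.head.appendChild(s);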

Re: Better download security through browsers

2017-03-24 Thread Mike Hoye

My 2006 proposal didn't get any traction either.

https://lists.w3.org/Archives/Public/public-whatwg-archive/2006Jan/0270.html

FWIW I still think it'd be a good idea with the right UI.

- mhoye

On 2017-03-24 1:16 PM, Dave Townsend wrote:

I remember that Gerv was interested in a similar idea many years ago, you
might want to see if he went anywhere with it.

https://blog.gerv.net/2005/03/link_fingerprin_1/


On Fri, Mar 24, 2017 at 10:12 AM, Gregory Szorc  wrote:


I recently reinstalled Windows 10 on one of my machines. This involved
visiting various web sites and downloading lots of software.

It is pretty common for software publishers to publish hashes or
cryptographic signatures of software so the downloaded software can be
verified. (Often times the download is performed through a CDN, mirroring
network, etc and you may not have full trust in the server operator.)

Unless you know how to practice safe security, you probably don't bother
verifying downloaded files match the signatures authors have provided.
Furthermore, many sites redundantly write documentation for how to verify
the integrity of downloads. This feels sub-optimal.

This got me thinking: why doesn't the user agent get involved to help
provide better download security? What my (not a web standard spec author)
brain came up with is standardized metadata in the HTML for the download
link (probably an <a>) that defines file integrity information. When the
user agent downloads that file, it automatically verifies file integrity
and fails the download or pops up a big warning box, etc. if things don't
check out. In other words, this mechanism would extend the trust anchor in
the source web site (likely via a trusted x509 cert) to file downloads.
This would provide additional security over (optional) x509 cert validation
of the download server alone. Having the integrity metadata baked into the
origin site is important: you can't trust the HTTP response from the
download server because it may be from an untrusted server.

Having such a feature would also improve the web experience. How many times
have you downloaded a corrupted file? Advanced user agents (like browsers)
could keep telemetry of how often downloads fail integrity. This could be
used to identify buggy proxies, malicious ISPs rewriting content, etc.

I was curious if this enhancement to the web platform has ever been
considered and/or if it is something Mozilla would consider pushing.

gps


Re: Better download security through browsers

2017-03-24 Thread Dave Townsend
I remember that Gerv was interested in a similar idea many years ago; you
might want to see if he went anywhere with it.

https://blog.gerv.net/2005/03/link_fingerprin_1/


On Fri, Mar 24, 2017 at 10:12 AM, Gregory Szorc  wrote:

> I recently reinstalled Windows 10 on one of my machines. This involved
> visiting various web sites and downloading lots of software.
>
> It is pretty common for software publishers to publish hashes or
> cryptographic signatures of software so the downloaded software can be
> verified. (Often times the download is performed through a CDN, mirroring
> network, etc and you may not have full trust in the server operator.)
>
> Unless you know how to practice safe security, you probably don't bother
> verifying downloaded files match the signatures authors have provided.
> Furthermore, many sites redundantly write documentation for how to verify
> the integrity of downloads. This feels sub-optimal.
>
> This got me thinking: why doesn't the user agent get involved to help
> provide better download security? What my (not a web standard spec author)
> brain came up with is standardized metadata in the HTML for the download
> link (probably an <a>) that defines file integrity information. When the
> user agent downloads that file, it automatically verifies file integrity
> and fails the download or pops up a big warning box, etc. if things don't
> check out. In other words, this mechanism would extend the trust anchor in
> the source web site (likely via a trusted x509 cert) to file downloads.
> This would provide additional security over (optional) x509 cert validation
> of the download server alone. Having the integrity metadata baked into the
> origin site is important: you can't trust the HTTP response from the
> download server because it may be from an untrusted server.
>
> Having such a feature would also improve the web experience. How many times
> have you downloaded a corrupted file? Advanced user agents (like browsers)
> could keep telemetry of how often downloads fail integrity. This could be
> used to identify buggy proxies, malicious ISPs rewriting content, etc.
>
> I was curious if this enhancement to the web platform has ever been
> considered and/or if it is something Mozilla would consider pushing.
>
> gps


Better download security through browsers

2017-03-24 Thread Gregory Szorc
I recently reinstalled Windows 10 on one of my machines. This involved
visiting various web sites and downloading lots of software.

It is pretty common for software publishers to publish hashes or
cryptographic signatures of software so the downloaded software can be
verified. (Often times the download is performed through a CDN, mirroring
network, etc and you may not have full trust in the server operator.)

Unless you know how to practice safe security, you probably don't bother
verifying downloaded files match the signatures authors have provided.
Furthermore, many sites redundantly write documentation for how to verify
the integrity of downloads. This feels sub-optimal.

This got me thinking: why doesn't the user agent get involved to help
provide better download security? What my (not a web standard spec author)
brain came up with is standardized metadata in the HTML for the download
link (probably an <a>) that defines file integrity information. When the
user agent downloads that file, it automatically verifies file integrity
and fails the download or pops up a big warning box, etc. if things don't
check out. In other words, this mechanism would extend the trust anchor in
the source web site (likely via a trusted x509 cert) to file downloads.
This would provide additional security over (optional) x509 cert validation
of the download server alone. Having the integrity metadata baked into the
origin site is important: you can't trust the HTTP response from the
download server because it may be from an untrusted server.
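
To make this concrete, here is a sketch of the check a user agent could
run on the downloaded bytes, reusing the existing SRI digest format
("sha256-" plus a base64 digest). This is illustrative TypeScript, not a
spec proposal; the integrity attribute on <a> and the verifyDownload
helper are hypothetical:

    // Hypothetical markup on the origin site:
    //   <a href="https://cdn.example.com/tool.exe" download
    //      integrity="sha256-[base64 digest here]">
    //
    // Verify downloaded bytes against such an integrity value via WebCrypto.
    async function verifyDownload(
      bytes: ArrayBuffer,
      integrity: string
    ): Promise<boolean> {
      const dash = integrity.indexOf('-');
      if (dash < 0) return false; // malformed integrity value
      const algoNames: Record<string, string> = {
        sha256: 'SHA-256', sha384: 'SHA-384', sha512: 'SHA-512',
      };
      const name = algoNames[integrity.slice(0, dash)];
      if (!name) return false; // unknown algorithm: treat as a failure
      const expected = integrity.slice(dash + 1); // base64 digest
      const digest = await crypto.subtle.digest(name, bytes);
      const actual =
        btoa(String.fromCharCode(...Array.from(new Uint8Array(digest))));
      return actual === expected; // false => fail the download / warn the user
    }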

Having such a feature would also improve the web experience. How many times
have you downloaded a corrupted file? Advanced user agents (like browsers)
could keep telemetry of how often downloads fail integrity. This could be
used to identify buggy proxies, malicious ISPs rewriting content, etc.

I was curious if this enhancement to the web platform has ever been
considered and/or if it is something Mozilla would consider pushing.

gps


Re: Intent to ship: IntersectionObserver API

2017-03-24 Thread Tobias Schneider
We just successfully ran another experiment, enabling the
IntersectionObserver API for 50% of our Nightly user population. No related
stability issues were reported. Also, here is a Gecko profile of the
single-page version of the HTML5 spec, using one Intersection Observer per
element (109192 in total):
https://www.dropbox.com/s/cm55ixrvk7bqaxu/SinglePageHTML5SpecIntersectionObserver.sps.json.zip?dl=0

On Thu, Mar 16, 2017 at 1:52 AM, Patrick Brosset wrote:

> >
> > 1)  Is there devtools support for this (e.g. to be able to see what
> > intersection observers are registered where)?  If not, are there at least
> > bugs tracking it?
>
>
> I filed https://bugzilla.mozilla.org/show_bug.cgi?id=1347849 for this.
> MutationObserver would be a good one to add support for too!
>
> On Thu, Mar 16, 2017 at 8:39 AM, Anne van Kesteren wrote:
>
> > On Wed, Mar 15, 2017 at 7:38 PM, Boris Zbarsky  wrote:
> > > On 3/15/17 1:32 PM, Tobias Schneider wrote:
> > >> 2.3) Platform tests are in the process of being upstreamed by Google (
> > >> https://github.com/w3c/web-platform-tests/pull/4384).
> > >
> > > That seems to be in limbo for well over a month now.  jgraham just poked it
> > > in hopes of at least getting automated testing going so we find out whether
> > > Firefox passes the tests... but that will test release, afaik. Have you
> > > checked whether we pass these tests?
> >
> > They would be run against Nightly.
> >
> >
> > --
> > https://annevankesteren.nl/


Re: Future of out-of-tree spell checkers?

2017-03-24 Thread Ehsan Akhgari
On 2017-03-24 4:20 AM, Henri Sivonen wrote:
> On Fri, Mar 24, 2017 at 2:38 AM, Ehsan Akhgari wrote:
>> On Wed, Mar 22, 2017 at 11:50 AM, Jeff Muizelaar <jmuizel...@mozilla.com> wrote:
>>>
>>> On Wed, Mar 22, 2017 at 11:08 AM, Henri Sivonen wrote:
>>>>
>>>> dlopening libvoikko, if installed, and having thin C++ glue code
>>>> in-tree seems much simpler, except maybe for sandboxing. What are the
>>>> sandboxing implications of dlopening a shared library that will want
>>>> to load its data files?
>>>
>>> My understanding is that the spell checker mostly lives in the Chrome
>>> process so it seems sandboxing won't be a problem.
>>
>>
>> That is mostly correct.  The spell checker *completely* lives in the parent
>> process and is completely unaffected by sandboxing.
>>
>> But that's actually a problem.  My understanding is that WebExtensions won't
>> be allowed to load code in the parent process.  Bill, Kris, is that correct?
>> If yes, we should work with the maintainers of the Finnish and Greenlandic
>> dictionaries on adding custom support for loading their code...
> 
> But when (according to doing a Google Web search excluding mozilla.org
> and wading through all the results and by searching the JS for all
> AMO-hosted extensions) the only out-of-tree spell checkers use
> libvoikko, why involve Web Extensions at all? Why wouldn't we dlopen
> libvoikko and put a thin C++ adapter between libvoikko's C API and our
> internal C++ interface in-tree? That would be significantly simpler
> than involving Web extensions.

Is that different from what I suggested above in some way that I'm
missing?  I think it's better to engage the developers of those
libraries first and ask them how they would like us to proceed.  At any
rate, something has to change on their side, since after Firefox 57
presumably Firefox would just ignore their XPI file or something.  The
actual implementation mechanism would probably end up being the
dlopening that you're suggesting, but if we're going to be signing up to
doing that, we better have at least a communication channel with the
authors of those libraries in case for example we need to change
something on our interface some day.


Re: Faster gecko builds with IceCC on Mac and Linux

2017-03-24 Thread Ted Mielczarek
On Fri, Mar 24, 2017, at 12:10 AM, Jeff Muizelaar wrote:
> I have a Ryzen 7 1800X and it does Windows clobber builds in ~20min
> (3 min of that is configure which seems higher than what I've seen on
> other machines). This compares pretty favorably to the Lenovo p710
> machines that people are getting which do 18min clobber builds and
> cost more than twice the price.

Just as a data point, I have one of those Lenovo P710 machines and I get
14-15 minute clobber builds on Windows.

-Ted


Re: Faster gecko builds with IceCC on Mac and Linux

2017-03-24 Thread Gabriele Svelto
On 24/03/2017 05:39, Gregory Szorc wrote:
> The introduction of Ryzen has literally changed the landscape
> and the calculus that determines what hardware engineers should have.
> Before I disappeared for ~1 month, I was working with IT and management to
> define an optimal hardware load out for Firefox engineers. I need to resume
> that work and fully evaluate Ryzen...

The fact that, with an appropriate motherboard, they also support ECC
memory (*) makes a lot of Xeon offerings far less appealing, especially
the workstation-oriented ones.

 Gabriele

*) Which is useful to those of us who keep their machines on for weeks
w/o rebooting or just want to have a more reliable setup





Re: Future of out-of-tree spell checkers?

2017-03-24 Thread Henri Sivonen
On Fri, Mar 24, 2017 at 2:38 AM, Ehsan Akhgari wrote:
> On Wed, Mar 22, 2017 at 11:50 AM, Jeff Muizelaar <jmuizel...@mozilla.com> wrote:
>>
>> On Wed, Mar 22, 2017 at 11:08 AM, Henri Sivonen wrote:
>>>
>>> dlopening libvoikko, if installed, and having thin C++ glue code
>>> in-tree seems much simpler, except maybe for sandboxing. What are the
>>> sandboxing implications of dlopening a shared library that will want
>>> to load its data files?
>>
>> My understanding is that the spell checker mostly lives in the Chrome
>> process so it seems sandboxing won't be a problem.
>
>
> That is mostly correct.  The spell checker *completely* lives in the parent
> process and is completely unaffected by sandboxing.
>
> But that's actually a problem.  My understanding is that WebExtensions won't
> be allowed to load code in the parent process.  Bill, Kris, is that correct?
> If yes, we should work with the maintainers of the Finnish and Greenlandic
> dictionaries on adding custom support for loading their code...

But when (judging by a Google Web search excluding mozilla.org, wading
through all the results, and by searching the JS of all AMO-hosted
extensions) the only out-of-tree spell checkers use libvoikko, why
involve WebExtensions at all? Why wouldn't we dlopen libvoikko and put a
thin C++ adapter between libvoikko's C API and our internal C++
interface in-tree? That would be significantly simpler than involving
WebExtensions.

-- 
Henri Sivonen
hsivo...@hsivonen.fi
https://hsivonen.fi/