Re: Intent to restrict to secure contexts: navigator.geolocation

2016-10-21 Thread Chris Peterson

On 10/21/2016 3:11 PM, Tantek Çelik wrote:

> Does this mean that we'd be breaking one in 5 geolocation requests as a
> result of this?  That seems super high.  :(

Agreed. For example, my understanding is that this will break
http://www.nextbus.com/ (and thus http://www.nextmuni.com/ ) location
awareness (useful for us SF folks), which is kind of essential for
having it tell you transit stops near you. -t


Indeed, the geolocation feature on nextbus.com is broken in Chrome. (The 
site shows a geolocation error message on first use.)


Next Bus already has an HTTPS version of their site, but it is not the 
default and has some mixed-content warnings. For a site that uses 
geolocation as a core part of its service, I'm surprised they have let 
it stay broken in Chrome for six months. Chrome removed insecure 
geolocation in April 2016 and announced its deprecation in November 2015.

___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Proposed W3C Charter: Automotive Working Group

2016-10-21 Thread Tantek Çelik
Ekr,

This sounds to me like there are sufficient reasons to formally object
to this charter, which is, as Martin points out, a special case of
IoT/WoT (with additional concerns!).

David,

Thus I too think we should formally object, link to our previous
formal objection of the WoT charter (since nearly all the same reasons
apply), and list the new items that Martin and Ekr provided. I suggest
we cc this response to www-archive as well.

Thanks,

Tantek



On Tue, Oct 18, 2016 at 3:10 AM, Eric Rescorla  wrote:
> I share Martin's concerns here...
>
> There's fairly extensive evidence of security vulnerabilities in
> vehicular systems that can lead to serious safety issues (see:
> http://www.autosec.org/publications.html), so more than usual
> attention needs to be paid to security in this context.
>
> In fairness, a lot of these are implementation security issues:
> i.e., how to properly firewall off any network access from the
> CAN bus. You really need to ensure that there's no way
> to influence the CAN bus, which probably means some kind of
> very strong one-way communications guarantee. At some level
> these are out of scope for this group, but it's predictable
> that if this technology is built, people will also implement
> it in insecure ways, so in that respect it's very much in-scope.
>
> The communications security story also seems to be not well
> fleshed out. The examples shown all seem to have fixed hostnames
> (wwwivi/localhost) which don't really seem like the basis for
> strong identity. It's not just enough to say, as in S 6, that
> the server has to have a certificate; what are the properties of
> that certificate? What identity does it have?
>
> This is particularly concerning:
>
> At this point, internet based clients and servers do not know the
> dynamic IP address that was assigned to a specific vehicle. So
> normally, a vehicle has to connect to a well known endpoint,
> generally using a URL to connect to a V2X Server. The vehicle and
> the internet server typically mutually authenticate and the
> vehicle 'registers' with the server over an encrypted channel
> passing it a unique identifier e.g. its Vehicle Identification
> Number (VIN). From that point on, the server has the IP address
> that is currently assigned to a vehicle with a particular VIN, and
> can share this information with other internet based clients and
> servers, which are then able to send messages to the vehicle.
>
> How does the V2X server know that this is actually my VIN? Just because
> I claim it over an encrypted channel.
>
> In IETF we often ask at the WG-forming stage whether we feel that the
> community has the expertise to take on this work. The current proposal
> seems to call that into question and absent some evidence that that
> expertise does in fact exist, I believe we should oppose formation.
>
> -Ekr
>
>
> On Mon, Oct 17, 2016 at 5:11 PM, Martin Thomson  wrote:
>
>> This seems to be a more specific instance of WoT.  As such, the goals
>> are much clearer here.  While some of the concerns with the WoT
>> charter apply (security in particular!), here are a few additional
>> observations:
>>
>> Exposing the level of information that they claim to want to expose
>> needs more privacy treatment than just a casual mention of the PIG.
>>
>> Websockets?  Protocol?  Both of these are red flags.  Protocol
>> development is an entirely different game to APIs and the choice of
>> websockets makes me question the judgment of the people involved.  Of
>> particular concern is how the group intends to manage interactions
>> with SOP.  Do they intend to allow the web at large to connect to
>> parts of the car?  The security architecture is worrying in its lack
>> of detail.
>>
>> If this proceeds, the naming choice (wwwivi) will have to change.  It
>> is not OK to register a new gTLD (see
>> https://tools.ietf.org/html/rfc6761).  A similar mistake was made
>> recently in the IETF, and it was ugly.  For those interested, I can
>> share the gory details offline.
>>
>> On Tue, Oct 18, 2016 at 6:32 AM, L. David Baron  wrote:
>> > The W3C is proposing a new charter for:
>> >
>> >   Automotive Working Group
>> >   https://lists.w3.org/Archives/Public/public-new-work/2016Oct/0003.html
>> >   https://www.w3.org/2014/automotive/charter-2016.html
>> >
>> > Mozilla has the opportunity to send comments or objections through
>> > Monday, November 7.  However, I hope to be able to complete the
>> > comments by Tuesday, October 25.
>> >
>> > Please reply to this thread if you think there's something we should
>> > say as part of this charter review, or if you think we should
>> > support or oppose it.
>> >
>> > Note that this is a new working group.  I don't know of anyone from
>> > Mozilla being involved in the discussions that led to this charter.
>> >
>> > -David
>> >
>> > --
>> > L. David Baron

Re: Intent to restrict to secure contexts: navigator.geolocation

2016-10-21 Thread Tantek Çelik
On Fri, Oct 21, 2016 at 2:56 PM, Ehsan Akhgari  wrote:
> On 2016-10-21 3:49 PM, Richard Barnes wrote:
>> The geolocation API allows web pages to request the user's geolocation,
>> drawing from things like GPS on mobile, and doing WiFi / IP based
>> geolocation on desktop.
>>
>> Due to the privacy risks associated with this functionality, I would like
>> to propose that we restrict this functionality to secure contexts [1].
>>
>> Our telemetry for geolocation is a little rough, but we can derive some
>> upper bounds.  According to telemetry from Firefox 49, the geolocation
>> permissions prompt has been shown around 4.6M times [2], on about 3B page
>> loads [3].  Around 21% of these requests were (1) from "http:" origins, and
>> (2) granted by the user.  So the average rate of permissions being granted
>> to non-secure origins per pageload is 4.6M * 21% / 3B = 0.0319%.
>
> Does this mean that we'd be breaking one in 5 geolocation requests as a
> result of this?  That seems super high.  :(

Agreed. For example, my understanding is that this will break
http://www.nextbus.com/ (and thus http://www.nextmuni.com/ ) location
awareness (useful for us SF folks), which is kind of essential for
having it tell you transit stops near you. -t
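For what it's worth, affected sites could at least fail soft rather than
break outright. A minimal sketch (hypothetical helper, not from any of
the sites discussed; it assumes the proposed [SecureContext] behavior,
where navigator.geolocation is simply absent on http: pages):

```javascript
// Hypothetical guard so a site degrades gracefully when
// navigator.geolocation is absent (as on http: pages under the proposal).
function requestPosition(nav, onPosition, onUnavailable) {
  // An unguarded nav.geolocation.getCurrentPosition(...) would throw a
  // TypeError on such pages and could break unrelated site scripts.
  if (nav && nav.geolocation &&
      typeof nav.geolocation.getCurrentPosition === "function") {
    nav.geolocation.getCurrentPosition(onPosition);
    return true;
  }
  onUnavailable(); // e.g. fall back to asking the user for a stop/address
  return false;
}
```

In a page this would be called as requestPosition(navigator, showStops,
showManualEntry).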


> Since the proposal in the bug is adding [SecureContext] to
> Navigator.geolocation, have we also collected telemetry around which
> properties and methods are accessed?  Since another kind of breakage we
> may encounter is code like |navigator.geolocation.getCurrentPosition()|
> throwing an exception and breaking other parts of site scripts...
>
>> Access to geolocation from non-secure contexts is already disabled in
>> Chrome [4] and WebKit [5].
>>
>> Please send any comments on this proposal by Friday, October 28.
>>
>> Relevant bug: https://bugzilla.mozilla.org/show_bug.cgi?id=1072859
>>
>> [1] https://www.w3.org/TR/secure-contexts/
>> [2] https://mzl.la/2eeoWm9
>> [3] https://mzl.la/2eoiIAw
>> [4] https://codereview.chromium.org/1530403002/
>> [5] https://trac.webkit.org/changeset/200686


Re: Proposed W3C Charter: Second Screen Working Group

2016-10-21 Thread Tantek Çelik
I have reviewed the charter and the current set of deliverables. The
work appears to be proceeding pragmatically (with many
members/implementers, including Apple and Google) and to be reasonably
minimally scoped. There is also the companion Second Screen Community
Group, which appears to be used to incubate work before it proceeds to
the Working Group, a pattern we are generally supportive of.

Support revised charter.

I don't know of any current Mozilla implementation plans for this WG's deliverables.

The most recent discussion here of any of this WG's deliverables was
about the Presentation API, in particular a "[PresentationAPI] Intend to
implement" thread here in September 2014.

Tantek


On Mon, Oct 17, 2016 at 12:28 PM, L. David Baron  wrote:
> The W3C is proposing a revised charter for:
>
>   Second Screen Working Group
>   https://lists.w3.org/Archives/Public/public-new-work/2016Sep/0011.html
>   https://www.w3.org/2014/secondscreen/charter-2016.html
>
> Mozilla has the opportunity to send comments or objections through
> next Tuesday, October 25.
>
> Please reply to this thread if you think there's something we should
> say as part of this charter review, or if you think we should
> support or oppose it.
>
> Mozilla does have participants in this group:
> https://www.w3.org/2000/09/dbwg/details?group=74168=1=org#_MozillaFoundation
>
> -David
>
> --
> L. David Baron http://dbaron.org/
> Mozilla  https://www.mozilla.org/
>  Before I built a wall I'd ask to know
>  What I was walling in or walling out,
>  And to whom I was like to give offense.
>- Robert Frost, Mending Wall (1914)
>


Intent to restrict to secure contexts: navigator.geolocation

2016-10-21 Thread Richard Barnes
The geolocation API allows web pages to request the user's geolocation,
drawing from things like GPS on mobile, and doing WiFi / IP based
geolocation on desktop.

Due to the privacy risks associated with this functionality, I would like
to propose that we restrict this functionality to secure contexts [1].

Our telemetry for geolocation is a little rough, but we can derive some
upper bounds.  According to telemetry from Firefox 49, the geolocation
permissions prompt has been shown around 4.6M times [2], on about 3B page
loads [3].  Around 21% of these requests were (1) from "http:" origins, and
(2) granted by the user.  So the average rate of permissions being granted
to non-secure origins per pageload is 4.6M * 21% / 3B = 0.0319%.
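Spelled out, the arithmetic with these rounded inputs looks like this
(the 0.0319% figure implies an unrounded grant share slightly under 21%):

```javascript
// Rough sanity check of the estimate above, using the rounded inputs.
const promptsShown = 4.6e6;    // geolocation permission prompts (Firefox 49)
const pageLoads = 3e9;         // total page loads over the same period
const insecureGranted = 0.21;  // shown on "http:" origins AND granted
const ratePerLoad = promptsShown * insecureGranted / pageLoads;
console.log((ratePerLoad * 100).toFixed(4) + "%");
// → "0.0322%" with these rounded inputs; the quoted 0.0319%
// corresponds to a share just under 21%.
```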

Access to geolocation from non-secure contexts is already disabled in
Chrome [4] and WebKit [5].

Please send any comments on this proposal by Friday, October 28.

Relevant bug: https://bugzilla.mozilla.org/show_bug.cgi?id=1072859

[1] https://www.w3.org/TR/secure-contexts/
[2] https://mzl.la/2eeoWm9
[3] https://mzl.la/2eoiIAw
[4] https://codereview.chromium.org/1530403002/
[5] https://trac.webkit.org/changeset/200686


Re: Windows XP and Vista Long Term Support Plan

2016-10-21 Thread Kyle Huey
On Fri, Oct 21, 2016 at 3:05 AM,   wrote:
> My point for the above paragraph is that even if Mozilla stops security 
> updates for ESR 52, these computers will still need to get around on the 
> Internet. These machines will still need to do log ins and banking. The world 
> isn't the same as back in the day when Netscape 4 roamed the web or even in 
> 2008 when Mozilla dropped support for Windows 98 SE with 2.0.0.20. Part of 
> securing the web means making sure that every server has a digital 
> certificate with Let's Encrypt. But that part only works if the browser has 
> up to date TLS and digital certificates. What happens to Vista and XP on ESR 
> 52 or even OSX 10.6-10.8 on ESR 45 when a POODLE style attack drives everyone 
> from TLS 1.2 to TLS 1.3 with no fall back? What happens when older 
> certificates are found to have been compromised by a third party like a crime 
> syndicate or government intelligence agency? Do ESR 52 and ESR 45 get stuck 
> with corrupted certificates while the latest versions of Firefox get their 
> certificates refreshed?

No.  These machines should not be on the Internet anymore.  If the
operating system vendor is no longer supporting their product with
security releases, an out-of-date TLS stack is a minor problem compared
to the remote code execution that's going to pwn the machine.

- Kyle


Re: Windows XP and Vista Long Term Support Plan

2016-10-21 Thread keithgallistel
On Monday, October 17, 2016 at 1:33:06 AM UTC-5, Peter Dolanjski wrote:
> Thanks for taking the time to provide thorough feedback.
> 
> 3) For Windows Vista, I don't see where the fire is. I realize that it has
> > a vastly smaller user base, but it is close to Window 7 code base and API
> > wise.
> 
> 
> I'm sure the engineering team can probably provide a more detailed response
> on this one, but as I understand it the main issue is that the sandboxing
> effort [1] makes use of Chromium's sandbox [2] which now only supports
> Windows 7+.
> The challenge would come from maintaining a separate version for Vista
> (which given the relatively low user numbers is hard to justify).
> 
> [1] https://wiki.mozilla.org/Security/Sandbox
> [2] http://www.chromium.org/developers/design-documents/sandbox
> 
> Peter

Greetings again.

First, I didn't think about OS support as a problem for Vista, but it makes 
sense considering how Vista failed in the market. When Mozilla does the press 
release stating that it is dropping Vista and XP at the same time, the 
explanation for dropping Vista should include information about the lack of 
tool and library support, so that people don't think Mozilla is just copying 
Google again.

Over the past week, since I responded to your post, I had to use an old laptop 
because it was storming (it's not good to have a computer on during a lightning 
storm; I learned that the hard way). It was a 32-bit Vista machine that I had 
upgraded to 64-bit Windows 7, but using it got me thinking. In your initial 
post, you basically laid out that Vista and XP would be left at ESR 52 and that 
ESR 52 would be feature frozen with only security updates. Said security 
updates past the first year would be on an 'as user base size justified' basis. 
My concern is about something else along the lines of security: what about TLS 
and digital certificates?

In 2014, Ars Technica ran a story, "My coworkers made me use Mac OS 9 for their 
(and your) amusement" 
(http://arstechnica.com/apple/2014/09/my-coworkers-made-me-use-mac-os-9-for-their-and-your-amusement/).
On page 2, the author talked about trying to do work and failing. He had found 
the last remaining modern OS 9 web browser, Classilla, and tried to log in to 
the server to do work, only to find that the lack of modern encryption made 
working from an OS 9 machine impossible. Cameron Kaiser did update Classilla 
after that article ("And now for something completely different: Classilla is 
back", 
https://tenfourfox.blogspot.com/2014/10/and-now-for-something-completely_28.html).
Reading through his blog post, he described needing to update NSS (Network 
Security Services) to recognize TLS 1.0 and SHA-2. Furthermore, the release 
notes (http://www.floodgap.com/software/classilla/releases/) describe the SSL 
root certificates having to be refreshed.

My point for the above paragraph is that even if Mozilla stops security updates 
for ESR 52, these computers will still need to get around on the Internet. 
These machines will still need to do logins and banking. The world isn't the 
same as back in the day when Netscape 4 roamed the web or even in 2008 when 
Mozilla dropped support for Windows 98 SE with 2.0.0.20. Part of securing the 
web means making sure that every server has a digital certificate with Let's 
Encrypt. But that part only works if the browser has up to date TLS and digital 
certificates. What happens to Vista and XP on ESR 52 or even OSX 10.6-10.8 on 
ESR 45 when a POODLE style attack drives everyone from TLS 1.2 to TLS 1.3 with 
no fall back? What happens when older certificates are found to have been 
compromised by a third party like a crime syndicate or government intelligence 
agency? Do ESR 52 and ESR 45 get stuck with corrupted certificates while the 
latest versions of Firefox get their certificates refreshed?
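The version worry can be made concrete with a toy model (an illustration
of the intersection problem only, not real TLS code): a client frozen at
TLS 1.2 has no protocol in common with a server that has moved to
1.3-only, so the handshake simply fails.

```javascript
// Toy model of TLS version negotiation (illustration only, not real TLS):
// the endpoints settle on the client's best version if the server still
// accepts it, otherwise the handshake fails outright.
const VERSIONS = { "TLS1.0": 0, "TLS1.1": 1, "TLS1.2": 2, "TLS1.3": 3 };
function negotiate(clientMax, serverMin) {
  // A frozen ESR caps clientMax; a hardened server raises serverMin.
  return VERSIONS[clientMax] >= VERSIONS[serverMin] ? clientMax : null;
}
console.log(negotiate("TLS1.2", "TLS1.2")); // → "TLS1.2": works today
console.log(negotiate("TLS1.2", "TLS1.3")); // → null: frozen client locked out
```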

I'm just wondering what your thoughts are on whether this is an issue, and how 
to handle it if it is.