WebIDL consts now generate C++ definitions

2017-10-26 Thread Kyle Machulis
As of bug 792059 landing, if you're adding new WebIDL interfaces to Gecko,
constant definitions will now be generated in the binding namespace of the
C++ binding headers.

For example, if you have a WebIDL interface that looks like this:

---

interface TestExampleInterface {
  ...
  [NeedsWindowsUndef]
  const unsigned long NO_ERROR = 0;
  ...
};

---

This will now be accessible in C++ as
mozilla::dom::TestExampleInterfaceBinding::NO_ERROR. You will no longer
need to copy constant definitions into your binding implementations.
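
As a rough illustration only (the helper below is hypothetical, and the
header path assumes the usual FooBinding.h naming for generated bindings),
using the generated constant from C++ looks something like:

---

#include <cstdint>

#include "mozilla/dom/TestExampleInterfaceBinding.h"

// Hypothetical helper: compare an error code against the generated
// constant instead of re-declaring NO_ERROR in the implementation.
static bool IsNoError(uint32_t aCode)
{
  return aCode == mozilla::dom::TestExampleInterfaceBinding::NO_ERROR;
}

---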

This addition also introduces the new [NeedsWindowsUndef] extended
attribute: some WebIDL constant names conflict with windows.h macros, and
undef'ing them during binding generation is easier than tracking down
include-order issues.

This information will be added to the MDN WebIDL Bindings page soon.

Bug 1407106 has been created to clean up the interface const definitions
already in gecko.


Re: More ThinkStation P710 Nvidia tips (was Re: Faster gecko builds with IceCC on Mac and Linux)

2017-10-26 Thread Gregory Szorc
On Thu, Oct 26, 2017 at 4:31 PM, Mike Hommey  wrote:

> On Thu, Oct 26, 2017 at 04:02:20PM -0700, Gregory Szorc wrote:
> > Also, the machines come with Windows by default. That's by design: that's
> > where the bulk of Firefox users are. We will develop better products if
> the
> > machines we use every day resemble what actual users use. I would
> encourage
> > developers to keep Windows on the new machines when they are issued.
>
> Except actual users are not using i9s or dual xeons. Yes, we have
> slower reference hardware, but that also makes the argument of using the
> same thing as actual users less relevant: you can't develop on machines
> that actually look like what users have. So, as long as you have the
> slower reference hardware to test, it doesn't seem to me it should
> matter what OS you're running on your development machine.


Host OS matters for finding UI bugs and issues with add-ons (since lots of
add-on developers are also on Linux or MacOS).

I concede that performance testing on i9s and Xeons is not at all
indicative of the typical user :)


Re: More ThinkStation P710 Nvidia tips (was Re: Faster gecko builds with IceCC on Mac and Linux)

2017-10-26 Thread Mike Hommey
On Thu, Oct 26, 2017 at 04:02:20PM -0700, Gregory Szorc wrote:
> Also, the machines come with Windows by default. That's by design: that's
> where the bulk of Firefox users are. We will develop better products if the
> machines we use every day resemble what actual users use. I would encourage
> developers to keep Windows on the new machines when they are issued.

Except actual users are not using i9s or dual xeons. Yes, we have
slower reference hardware, but that also makes the argument of using the
same thing as actual users less relevant: you can't develop on machines
that actually look like what users have. So, as long as you have the
slower reference hardware to test, it doesn't seem to me it should
matter what OS you're running on your development machine.

Mike


Re: More ThinkStation P710 Nvidia tips (was Re: Faster gecko builds with IceCC on Mac and Linux)

2017-10-26 Thread Jeff Muizelaar
On Thu, Oct 26, 2017 at 7:02 PM, Gregory Szorc  wrote:
> I also share your desire to not issue fancy video cards in these machines
> by default. If there are suggestions for a default video card, now is the
> time to make noise :)

Intel GPUs are the best choice if you want to be like the bulk of our
users. Otherwise, any cheap AMD GPU is going to be good enough.
The number and kind of display outputs are probably what matter most.

-Jeff


Re: More ThinkStation P710 Nvidia tips (was Re: Faster gecko builds with IceCC on Mac and Linux)

2017-10-26 Thread Jeff Muizelaar
On Thu, Oct 26, 2017 at 7:02 PM, Gregory Szorc  wrote:
> Unless you have requirements that prohibit using a
> VM, I encourage using this setup.

rr doesn't work in Hyper-V. AFAIK the only Windows VM it works in is VMware.

-Jeff


Re: More ThinkStation P710 Nvidia tips (was Re: Faster gecko builds with IceCC on Mac and Linux)

2017-10-26 Thread Gregory Szorc
On Thu, Oct 26, 2017 at 6:34 AM, Henri Sivonen  wrote:

> On Thu, Oct 26, 2017 at 9:15 AM, Henri Sivonen 
> wrote:
> > There's a huge downside, though:
> > If the screen stops consuming the DisplayPort data stream, the
> > graphical session gets killed! So if you do normal things like turn
> > the screen off or switch input on a multi-input screen, your graphical
> > session is no longer there when you come back and you get a login
> > screen instead! (I haven't yet formed an opinion on whether this
> > behavior can be lived with or not.)
>
> And the downsides don't even end there. rr didn't work. Plus other
> stuff not worth mentioning here.
>
> I guess going back to 16.04.1 is a better deal than 17.10.
>
> > P.S. It would be good for productivity if Mozilla issued slightly less
> > cutting-edge Nvidia GPUs to developers to increase the probability
> > that support in nouveau has had time to bake.
>
> This Mozilla-issued Quadro M2000 has been a very significant harm to
> my productivity. Considering how good rr is, I think it makes sense to
> continue to run Linux to develop Firefox. However, I think it doesn't
> make sense to issue fancy cutting-edge Nvidia GPUs to developers who
> aren't specifically working on Nvidia-specific bugs and, instead, it
> would make sense to issue GPUs that are boring as possible in terms of
> Linux driver support (i.e. Just Works with distro-bundled Free
> Software drivers). Going forward, perhaps Mozilla could issue AMD GPUs
> with computers that don't have Intel GPUs?
>
> As for the computer at hand, I want to put an end to this Nvidia
> obstacle to getting stuff done. It's been suggested to me that Radeon
> RX 560 would be well supported by distro-provided drivers, but the
> "*2" footnote at https://help.ubuntu.com/community/AMDGPU-Driver
> doesn't look too good. Based on that table it seems one should get
> Radeon RX 460. Is this the correct conclusion? Does Radeon RX 460 Just
> Work with Ubuntu 16.04? Is Radeon RX 460 going to be
> WebRender-compatible?
>

Sophana (CCd) is working on a new system build right now. It will be based
on the i9's instead of dual socket Xeons and should be faster and cheaper.
We can all thank AMD for introducing competition in the CPU market to
enable this to happen :)

I also share your desire to not issue fancy video cards in these machines
by default. If there are suggestions for a default video card, now is the
time to make noise :)

Also, the machines come with Windows by default. That's by design: that's
where the bulk of Firefox users are. We will develop better products if the
machines we use every day resemble what actual users use. I would encourage
developers to keep Windows on the new machines when they are issued.

I concede that developing Firefox on Linux is better than on Windows for a
myriad of reasons. However, that doesn't mean you have to forgo Linux. I
use Hyper-V under Windows 10 to run Linux. I do most of my development
(editors, builds, etc.) in that local Linux VM. I use an X server for
connecting to graphical Linux applications. The overhead of Hyper-V
compared to native Linux is negligible. Unless I need fast graphics in
Linux (which is rare), I pretty much get the advantages of Windows *and*
Linux simultaneously. Unless you have requirements that prohibit using a
VM, I encourage using this setup.


Re: More ThinkStation P710 Nvidia tips (was Re: Faster gecko builds with IceCC on Mac and Linux)

2017-10-26 Thread Thomas Daede
On 10/26/2017 06:34 AM, Henri Sivonen wrote:
> As for the computer at hand, I want to put an end to this Nvidia
> obstacle to getting stuff done. It's been suggested to me that Radeon
> RX 560 would be well supported by distro-provided drivers, but the
> "*2" footnote at https://help.ubuntu.com/community/AMDGPU-Driver
> doesn't look too good. Based on that table it seems one should get
> Radeon RX 460. Is this the correct conclusion? Does Radeon RX 460 Just
> Work with Ubuntu 16.04? Is Radeon RX 460 going to be
> WebRender-compatible?
> 

I have an RX 460 in a desktop with Fedora 26 and can confirm that it works
out-of-the-box at 4K with the open-source drivers, and will happily run
Pathfinder demos at <16ms frame time.* It also seems to run Servo's
WebRender just fine.

It's been superseded by the RX 560, which is a higher-clocked version of
the same chip. It should work just as well, but might need a slightly newer
kernel than the RX 4xx series to pick up the PCI IDs (maybe a problem with
LTS Ubuntu?). The RX 570 and 580 should be fine too, but require power
connectors. The Vega models are waiting on a kernel-side driver rewrite
(by AMD) that will land in 4.15 (hopefully with new features and
regressions to the RX 5xx series...)

Intel graphics are also nice, but only available on the E3 Xeons AFAIK.
And nouveau is stuck, because new cards require signed firmware that
Nvidia is unwilling to distribute.

* While Pathfinder happily renders at 60fps, Firefox draws frames slower
because of its WebGL readback path. That is not the fault of the GPU,
however.


Re: Default Rust optimization level decreased from 2 to 1

2017-10-26 Thread Boris Zbarsky

On 10/26/17 2:51 PM, Jeff Muizelaar wrote:
> What's the use case for a --enable-optimize, opt-level=1 build?


Fwiw, I ended up doing a fair amount of my work recently in a 
"--enable-optimize, --disable-debug, --enable-rust-debug" build, for the 
following reasons:


1)  I was modifying Rust style system code a lot and wanted a sane 
compile time for incremental modifications.  The time it took to build a 
fully-optimized gkrust was prohibitive.


2)  I was running tests locally on my modifications, and debug build 
startup time in our test harnesses is horrible, to the point where 
compiling my changes would take less time than running a single reftest. 
 Hence the --enable-optimize.


So that's a use case...
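
For anyone wanting to reproduce that setup, a minimal mozconfig along those
lines would be roughly (a sketch; any other options you need are omitted):

---

ac_add_options --enable-optimize
ac_add_options --disable-debug
ac_add_options --enable-rust-debug

---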

I agree that I constantly had to remember to never ever do any 
performance measurement in that build; I treated it like I do my debug 
builds in terms of what things I did with it, basically.


-Boris


Re: Default Rust optimization level decreased from 2 to 1

2017-10-26 Thread Jeff Muizelaar
On Thu, Oct 26, 2017 at 3:08 PM, Gregory Szorc  wrote:
> Would it help if we had a separate --enable-optimize-rust (or similar)
> option to control Rust optimizations so we have separate knobs? If we did
> that, --disable-optimize-rust could be opt-level 0 or 1 and
> --enable-optimize-rust could be opt-level=2. The local defaults would
> probably be --enable-optimize/--disable-optimize-rust (mirroring today).

Yeah, that would probably be more user-friendly than the environment
variable solution that we have today. Still, it's hard to know what the
correct defaults are.
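
(For reference, the environment-variable solution amounts to something like
this in a mozconfig; 2 here restores the old default, per Gregory's
original post:)

---

# Pin the Rust optimization level regardless of the other optimize options.
export RUSTC_OPT_LEVEL=2

---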

> I'm not sure if it is possible, but per-crate optimization levels might
> help. Although, the data shows that the style crate is one of the slowest to
> compile. And, this crate's optimization is also apparently very important
> for accurate performance testing. That's a really unfortunate conflict to
> have and it would be terrific if we could make the style crate compile
> faster so this conflict goes away. I've filed bug 1412077 to track
> improvements here.

Hopefully the thinlto work that Alex is doing
(https://internals.rust-lang.org/t/help-test-out-thinlto/6017) will
make a difference here.

-Jeff


Re: Default Rust optimization level decreased from 2 to 1

2017-10-26 Thread Gregory Szorc
On Thu, Oct 26, 2017 at 11:51 AM, Jeff Muizelaar 
wrote:

> FWIW, WebRender becomes unusable opt-level=1. It also looks like style
> performance takes quite a hit as well which means that our default
> developer builds become unusable for performance work. I worry that
> people will forget this and end up rediscovering only when they look
> at profiles (as mstange just did). What's the use case for a
> --enable-optimize, opt-level=1 build?
>

Yes, this was a painful trade-off. I acknowledge we still may not have the
proper set of defaults or tweakable options in play.

Currently, --enable-optimize is the default and it is tied to both C/C++
and Rust (Rust inherited the option).

Would it help if we had a separate --enable-optimize-rust (or similar)
option to control Rust optimizations so we have separate knobs? If we did
that, --disable-optimize-rust could be opt-level 0 or 1 and
--enable-optimize-rust could be opt-level=2. The local defaults would
probably be --enable-optimize/--disable-optimize-rust (mirroring today).
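
(Spelled out as a mozconfig, the local default under that proposal would be
roughly the following; note these flags are hypothetical and don't exist
yet:)

---

ac_add_options --enable-optimize
ac_add_options --disable-optimize-rust

---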

I'm not sure if it is possible, but per-crate optimization levels might
help. Although, the data shows that the style crate is one of the slowest
to compile. And, this crate's optimization is also apparently very
important for accurate performance testing. That's a really unfortunate
conflict to have and it would be terrific if we could make the style crate
compile faster so this conflict goes away. I've filed bug 1412077 to track
improvements here.


>
> On Wed, Oct 25, 2017 at 1:34 PM, Gregory Szorc  wrote:
> > Compiling Rust code with optimizations is significantly slower than
> > compiling without optimizations. As was measured in bug 1411081, the
> > difference between rustc's -Copt-level 1 and 2 on an i7-6700K (4+4 cores)
> > for a recent revision of mozilla-central was 325s/802s wall/CPU versus
> > 625s/1282s. This made Rust compilation during Firefox builds stand out
> as a
> > long pole and significantly slowed down builds.
> >
> > Because we couldn't justify the benefits of level 2 for the build time
> > overhead it added, we've changed the build system default so Rust is
> > compiled with -Copt-level=1 (instead of 2).
> >
> > Adding --enable-release to your mozconfig (the configuration for builds
> we
> > ship to users) enables -Copt-level=2. (i.e. we didn't change optimization
> > settings for builds we ship to users.)
> >
> > Adding --disable-optimize sets to -Copt-level=0. (This behavior is
> > unchanged.)
> >
> > If you want explicit control over -Copt-level, you can `export
> > RUSTC_OPT_LEVEL=` in your mozconfig and that value will always be
> > used. --enable-release implies a number of other changes. So if you just
> > want to restore the old build system behavior, set this variable in your
> > mozconfig.
> >
> > Also, due to ongoing work around Rust integration in the build system, it
> > is dangerous to rely on manual `cargo` invocations to compile Rust
> because
> > bypassing the build system (not using `mach build`) may not use the same
> > set of RUSTFLAGS that direct `cargo` invocations do. Things were mostly
> in
> > sync before. But this change and anticipated future changes will cause
> more
> > drift. If you want the correct behavior, use `mach`.
> > ___
> > dev-platform mailing list
> > dev-platform@lists.mozilla.org
> > https://lists.mozilla.org/listinfo/dev-platform
>


Re: Default Rust optimization level decreased from 2 to 1

2017-10-26 Thread Jeff Muizelaar
FWIW, WebRender becomes unusable at opt-level=1. It also looks like style
performance takes quite a hit, which means that our default
developer builds become unusable for performance work. I worry that
people will forget this and end up rediscovering it only when they look
at profiles (as mstange just did). What's the use case for a
--enable-optimize, opt-level=1 build?

-Jeff

On Wed, Oct 25, 2017 at 1:34 PM, Gregory Szorc  wrote:
> Compiling Rust code with optimizations is significantly slower than
> compiling without optimizations. As was measured in bug 1411081, the
> difference between rustc's -Copt-level 1 and 2 on an i7-6700K (4+4 cores)
> for a recent revision of mozilla-central was 325s/802s wall/CPU versus
> 625s/1282s. This made Rust compilation during Firefox builds stand out as a
> long pole and significantly slowed down builds.
>
> Because we couldn't justify the benefits of level 2 for the build time
> overhead it added, we've changed the build system default so Rust is
> compiled with -Copt-level=1 (instead of 2).
>
> Adding --enable-release to your mozconfig (the configuration for builds we
> ship to users) enables -Copt-level=2. (i.e. we didn't change optimization
> settings for builds we ship to users.)
>
> Adding --disable-optimize sets to -Copt-level=0. (This behavior is
> unchanged.)
>
> If you want explicit control over -Copt-level, you can `export
> RUSTC_OPT_LEVEL=` in your mozconfig and that value will always be
> used. --enable-release implies a number of other changes. So if you just
> want to restore the old build system behavior, set this variable in your
> mozconfig.
>
> Also, due to ongoing work around Rust integration in the build system, it
> is dangerous to rely on manual `cargo` invocations to compile Rust because
> bypassing the build system (not using `mach build`) may not use the same
> set of RUSTFLAGS that direct `cargo` invocations do. Things were mostly in
> sync before. But this change and anticipated future changes will cause more
> drift. If you want the correct behavior, use `mach`.
> ___
> dev-platform mailing list
> dev-platform@lists.mozilla.org
> https://lists.mozilla.org/listinfo/dev-platform


Re: Overriding JS to allow for opening in a new tab?

2017-10-26 Thread Andrew Overholt
On Thu, Oct 26, 2017 at 12:34 AM, Boris Zbarsky  wrote:

> Either approach would break at least some legitimate sites.
>

Thanks for confirming this.

> In general, as Myk points out, the question of when web pages should be
> able to respond to what sorts of input, and whether they should be able to
> prevent the default browser respond to that same input, is something that
> keeps coming up.  It's not clear to me that there is a general
> context-independent answer here.


Ok, thanks for the clarification (also to Myk and Dave!). Sounds like in
this case there's no easy answer.


Re: Default Rust optimization level decreased from 2 to 1

2017-10-26 Thread Gregory Szorc
On Wed, Oct 25, 2017 at 10:34 AM, Gregory Szorc  wrote:

> Compiling Rust code with optimizations is significantly slower than
> compiling without optimizations. As was measured in bug 1411081, the
> difference between rustc's -Copt-level 1 and 2 on an i7-6700K (4+4 cores)
> for a recent revision of mozilla-central was 325s/802s wall/CPU versus
> 625s/1282s. This made Rust compilation during Firefox builds stand out as a
> long pole and significantly slowed down builds.
>
> Because we couldn't justify the benefits of level 2 for the build time
> overhead it added, we've changed the build system default so Rust is
> compiled with -Copt-level=1 (instead of 2).
>
> Adding --enable-release to your mozconfig (the configuration for builds we
> ship to users) enables -Copt-level=2. (i.e. we didn't change optimization
> settings for builds we ship to users.)
>
> Adding --disable-optimize sets to -Copt-level=0. (This behavior is
> unchanged.)
>
> If you want explicit control over -Copt-level, you can `export
> RUSTC_OPT_LEVEL=` in your mozconfig and that value will always be
> used. --enable-release implies a number of other changes. So if you just
> want to restore the old build system behavior, set this variable in your
> mozconfig.
>
> Also, due to ongoing work around Rust integration in the build system, it
> is dangerous to rely on manual `cargo` invocations to compile Rust because
> bypassing the build system (not using `mach build`) may not use the same
> set of RUSTFLAGS that direct `cargo` invocations do. Things were mostly in
> sync before. But this change and anticipated future changes will cause more
> drift. If you want the correct behavior, use `mach`.
>

Heads up: the code change uncovered a subtle bug in sccache's argument
parser. If you see "error: codegen option `opt-level` requires a string (C
opt-level=)", you've hit it.

Until the fix merges, the workaround is to disable sccache or to graft
https://hg.mozilla.org/integration/autoland/rev/bc103ec9d0b3.
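
If you go the graft route, something like the following should work from a
Mercurial checkout (a sketch; the revision is the one linked above):

---

hg pull -r bc103ec9d0b3 https://hg.mozilla.org/integration/autoland/
hg graft bc103ec9d0b3

---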

Sorry for any disruption. Caching and argument parsing are hard :/


Better triage for intermittent leaks in tests?

2017-10-26 Thread Geoffrey Brown
Some of our most troublesome intermittent test failures are leak bugs
("Intermittent LeakSanitizer | leak at ..." or "Intermittent leakcheck
| default process:  bytes leaked ..."). Even when they fail
frequently, these bugs often seem to remain unresolved for many weeks.
Leaks are sometimes not strongly associated with a particular test,
making it difficult to assign them to a useful Bugzilla component, or to
find a motivated triage owner or assignee.

I feel like these bugs are not being connected to the "right" people
effectively. Could we do better?

For instance, could we assign all leak bugs to a specific bugzilla
component, with a "leak guru" as triage owner? Volunteers??

- Geoff


Re: Visual Studio 2017 coming soon

2017-10-26 Thread mhoye



On 2017-10-26 12:19 PM, Ryan VanderMeulen wrote:
> On 10/26/2017 10:14 AM, Milan Sreckovic wrote:
>> Are we locked into using the same compiler for the ESR updates?  In
>> other words, do we need to keep VS2015 for ESR52 builds until they
>> are not needed anymore?
>
> Our compiler toolchains are determined with in-tree configs nowadays,
> so this change won't impact any other branches.


I'm pleased to discover that this change shouldn't hurt community 
developers on older hardware. VS2017 drops support for developing on 
some older versions of Windows Server, but doesn't otherwise cause major 
changes in hardware/software requirements.


- mhoye


Re: Visual Studio 2017 coming soon

2017-10-26 Thread Ryan VanderMeulen

On 10/26/2017 10:14 AM, Milan Sreckovic wrote:
> Are we locked into using the same compiler for the ESR updates?  In
> other words, do we need to keep VS2015 for ESR52 builds until they are
> not needed anymore?


Our compiler toolchains are determined with in-tree configs nowadays, so 
this change won't impact any other branches.


-Ryan


Re: Visual Studio 2017 coming soon

2017-10-26 Thread David Major
It would be great to get these speed gains for 58, hot on the heels of the
57 release.

My plan is this: if I can get this landed by Monday, that still leaves two
weeks in the cycle. Based on my positive experience thus far with this
compiler (this update has gone much more smoothly than past ones), I'm
comfortable with that amount of time. If it goes longer than that, I agree
it makes sense to wait for a new train.



On Thu, Oct 26, 2017 at 3:31 AM, Sylvestre Ledru  wrote:

> Hello,
>
>
> On 25/10/2017 23:48, David Major wrote:
> > I'm planning to move production Windows builds to VS2017 (15.4.1) in bug
> > 1408789.
> >
> In which version are you planning to land this change?
> As we are close to the end of the 58 cycle in nightly, it would be great
> to wait for 59.
>
> Thanks,
> Sylvestre
>
>


Re: Visual Studio 2017 coming soon

2017-10-26 Thread David Major
Agreed, changing compilers of an already-released ESR isn't a good idea.

You could use 2017 to build ESR52 locally though, if that's what you're
asking. Our tree has supported 2017 builds for a good while, since it's the
default VS download from Microsoft and a number of Mozillians have been
using it.

On Thu, Oct 26, 2017 at 10:33 AM, Jonathan Kew  wrote:

> On 26/10/2017 15:14, Milan Sreckovic wrote:
>
>> Are we locked into using the same compiler for the ESR updates?  In other
>> words, do we need to keep VS2015 for ESR52 builds until they are not needed
>> anymore?
>>
>>
> Yes, IMO.
>
> Whether or not we're "locked" in any technical sense, I think we should
> probably lock ourselves there by policy, unless a specific bug in the older
> compiler is directly affecting ESR builds in a serious way, and can only be
> solved by updating.
>
> Short of something like that (which seems pretty unlikely!), the stability
> risk involved in switching compilers doesn't sound like it belongs anywhere
> near the ESR world.
>
> JK
>
> ___
> dev-platform mailing list
> dev-platform@lists.mozilla.org
> https://lists.mozilla.org/listinfo/dev-platform
>


Re: Visual Studio 2017 coming soon

2017-10-26 Thread Jonathan Kew

On 26/10/2017 15:14, Milan Sreckovic wrote:
> Are we locked into using the same compiler for the ESR updates?  In
> other words, do we need to keep VS2015 for ESR52 builds until they are
> not needed anymore?




Yes, IMO.

Whether or not we're "locked" in any technical sense, I think we should 
probably lock ourselves there by policy, unless a specific bug in the 
older compiler is directly affecting ESR builds in a serious way, and 
can only be solved by updating.


Short of something like that (which seems pretty unlikely!), the 
stability risk involved in switching compilers doesn't sound like it 
belongs anywhere near the ESR world.


JK


Re: Visual Studio 2017 coming soon

2017-10-26 Thread Milan Sreckovic
Are we locked into using the same compiler for the ESR updates?  In 
other words, do we need to keep VS2015 for ESR52 builds until they are 
not needed anymore?


On 26-Oct-17 3:31, Sylvestre Ledru wrote:
> Hello,
>
> On 25/10/2017 23:48, David Major wrote:
>> I'm planning to move production Windows builds to VS2017 (15.4.1) in bug
>> 1408789.
>
> In which version are you planning to land this change?
> As we are close to the end of the 58 cycle in nightly, it would be great
> to wait for 59.
>
> Thanks,
> Sylvestre



--
- Milan (mi...@mozilla.com)



Re: More ThinkStation P710 Nvidia tips (was Re: Faster gecko builds with IceCC on Mac and Linux)

2017-10-26 Thread Nathan Froyd
On Thu, Oct 26, 2017 at 9:34 AM, Henri Sivonen  wrote:
> As for the computer at hand, I want to put an end to this Nvidia
> obstacle to getting stuff done. It's been suggested to me that Radeon
> RX 560 would be well supported by distro-provided drivers, but the
> "*2" footnote at https://help.ubuntu.com/community/AMDGPU-Driver
> doesn't look too good. Based on that table it seems one should get
> Radeon RX 460. Is this the correct conclusion? Does Radeon RX 460 Just
> Work with Ubuntu 16.04? Is Radeon RX 460 going to be
> WebRender-compatible?

Can't speak to the WebRender compatibility issue, but I have a Radeon
R270 and a Radeon RX 470 in my Linux machine, and Ubuntu 16.04 seems
to be pretty happy with both of them.

-Nathan


Re: More ThinkStation P710 Nvidia tips (was Re: Faster gecko builds with IceCC on Mac and Linux)

2017-10-26 Thread Jeff Muizelaar
Yeah. I'd suggest anyone who's running Linux on these machines just go
out and buy a $100 AMD GPU to replace the Quadro. Even if you don't
expense the new GPU and just throw the Quadro in the trash you'll
probably be happier.

-Jeff

On Thu, Oct 26, 2017 at 9:34 AM, Henri Sivonen  wrote:
> On Thu, Oct 26, 2017 at 9:15 AM, Henri Sivonen  wrote:
>> There's a huge downside, though:
>> If the screen stops consuming the DisplayPort data stream, the
>> graphical session gets killed! So if you do normal things like turn
>> the screen off or switch input on a multi-input screen, your graphical
>> session is no longer there when you come back and you get a login
>> screen instead! (I haven't yet formed an opinion on whether this
>> behavior can be lived with or not.)
>
> And the downsides don't even end there. rr didn't work. Plus other
> stuff not worth mentioning here.
>
> I guess going back to 16.04.1 is a better deal than 17.10.
>
>> P.S. It would be good for productivity if Mozilla issued slightly less
>> cutting-edge Nvidia GPUs to developers to increase the probability
>> that support in nouveau has had time to bake.
>
> This Mozilla-issued Quadro M2000 has been a very significant harm to
> my productivity. Considering how good rr is, I think it makes sense to
> continue to run Linux to develop Firefox. However, I think it doesn't
> make sense to issue fancy cutting-edge Nvidia GPUs to developers who
> aren't specifically working on Nvidia-specific bugs and, instead, it
> would make sense to issue GPUs that are boring as possible in terms of
> Linux driver support (i.e. Just Works with distro-bundled Free
> Software drivers). Going forward, perhaps Mozilla could issue AMD GPUs
> with computers that don't have Intel GPUs?
>
> As for the computer at hand, I want to put an end to this Nvidia
> obstacle to getting stuff done. It's been suggested to me that Radeon
> RX 560 would be well supported by distro-provided drivers, but the
> "*2" footnote at https://help.ubuntu.com/community/AMDGPU-Driver
> doesn't look too good. Based on that table it seems one should get
> Radeon RX 460. Is this the correct conclusion? Does Radeon RX 460 Just
> Work with Ubuntu 16.04? Is Radeon RX 460 going to be
> WebRender-compatible?
>
> --
> Henri Sivonen
> hsivo...@hsivonen.fi
> https://hsivonen.fi/
> ___
> dev-platform mailing list
> dev-platform@lists.mozilla.org
> https://lists.mozilla.org/listinfo/dev-platform


Re: More ThinkStation P710 Nvidia tips (was Re: Faster gecko builds with IceCC on Mac and Linux)

2017-10-26 Thread Henri Sivonen
On Thu, Oct 26, 2017 at 9:15 AM, Henri Sivonen  wrote:
> There's a huge downside, though:
> If the screen stops consuming the DisplayPort data stream, the
> graphical session gets killed! So if you do normal things like turn
> the screen off or switch input on a multi-input screen, your graphical
> session is no longer there when you come back and you get a login
> screen instead! (I haven't yet formed an opinion on whether this
> behavior can be lived with or not.)

And the downsides don't even end there. rr didn't work. Plus other
stuff not worth mentioning here.

I guess going back to 16.04.1 is a better deal than 17.10.

> P.S. It would be good for productivity if Mozilla issued slightly less
> cutting-edge Nvidia GPUs to developers to increase the probability
> that support in nouveau has had time to bake.

This Mozilla-issued Quadro M2000 has done very significant harm to
my productivity. Considering how good rr is, I think it makes sense to
continue to run Linux to develop Firefox. However, I think it doesn't
make sense to issue fancy cutting-edge Nvidia GPUs to developers who
aren't specifically working on Nvidia-specific bugs; instead, it
would make sense to issue GPUs that are as boring as possible in terms of
Linux driver support (i.e. they Just Work with distro-bundled Free
Software drivers). Going forward, perhaps Mozilla could issue AMD GPUs
with computers that don't have Intel GPUs?

As for the computer at hand, I want to put an end to this Nvidia
obstacle to getting stuff done. It's been suggested to me that Radeon
RX 560 would be well supported by distro-provided drivers, but the
"*2" footnote at https://help.ubuntu.com/community/AMDGPU-Driver
doesn't look too good. Based on that table it seems one should get
Radeon RX 460. Is this the correct conclusion? Does Radeon RX 460 Just
Work with Ubuntu 16.04? Is Radeon RX 460 going to be
WebRender-compatible?

-- 
Henri Sivonen
hsivo...@hsivonen.fi
https://hsivonen.fi/


Re: Visual Studio 2017 coming soon

2017-10-26 Thread Sylvestre Ledru
Hello,


On 25/10/2017 23:48, David Major wrote:
> I'm planning to move production Windows builds to VS2017 (15.4.1) in bug
> 1408789.
>
In which version are you planning to land this change?
As we are close to the end of the 58 cycle in nightly, it would be great
to wait for 59.

Thanks,
Sylvestre



More ThinkStation P710 Nvidia tips (was Re: Faster gecko builds with IceCC on Mac and Linux)

2017-10-26 Thread Henri Sivonen
On Thu, Mar 23, 2017 at 3:43 PM, Henri Sivonen  wrote:
> On Wed, Jul 6, 2016 at 2:42 AM, Gregory Szorc  wrote:
>> The Lenovo ThinkStation P710 is a good starting point (
>> http://shop.lenovo.com/us/en/workstations/thinkstation/p-series/p710/).
>
> To help others who follow the above advice save some time:
>
> Xeons don't have Intel integrated GPUs, so one has to figure how to
> get this up and running with a discrete GPU. In the case of Nvidia
> Quadro M2000, the latest Ubuntu and Fedora install images don't work.
>
> This works:
> Disable or enable the TPM. (By default, it's in a mode where the
> kernel can see it but it doesn't work. It should either be hidden or
> be allowed to work.)
> Disable secure boot. (Nvidia's proprietary drivers don't work with
> secure boot enabled.)
> Use the Ubuntu 16.04.1 install image (i.e. intentionally old
> image--you can upgrade later)
> After installing, edit /etc/default/grub and set
> GRUB_CMDLINE_LINUX_DEFAULT="" (i.e. make the string empty; without
> this, the nvidia proprietary driver conflicts with LUKS pass phrase
> input).
> update-initramfs -u
> update-grub
> apt install nvidia-375
> Then upgrade the rest. Even rolling forward the HWE stack works
> *after* the above steps.
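
(For convenience, the quoted steps boil down to roughly the following
commands, run as root; a sketch only, with the grub edit shown via sed
here, though any editor works:)

---

sed -i 's/^GRUB_CMDLINE_LINUX_DEFAULT=.*/GRUB_CMDLINE_LINUX_DEFAULT=""/' /etc/default/grub
update-initramfs -u
update-grub
apt install nvidia-375

---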

A Xenial install set up according to the quoted steps managed to make itself
unbootable. I don't know why, but I suspect the nvidia proprietary
driver somehow fell out of use and nouveau froze.

The symptom is that a warning triangle (triangle with an exclamation
mark) shows up in the upper right part of the front panel and the
light of the topmost USB port in the front panel starts blinking.

Turning the computer off isn't enough to get rid of the warning
triangle and the blinking USB port light. To get rid of those,
disconnect the power cord for a while and then plug it back in.

After the warning triangle is gone, it's possible to boot Ubuntu
16.04.1 or 17.10 from USB to mount the root volume and make a backup
of the files onto an external disk.

Ubuntu 17.10 now boots on the hardware with nouveau with 3D enabled
(whereas 16.04.1 was 2D-only and the versions in between were broken).
However, before the boot completes, it seems to hang with the text:
[Firmware Bug]: TSC_DEADLINE disabled due to Errata: please update
microcode to version: 0xb20 (or later)
nouveau :01:00.0: bus: MMIO write of 012c FAULT at 10eb14
[ IBUS ]

Wait for a while. (I didn't time it, but the wait time is on the order
of half a minute to a couple of minutes.) Then the boot resumes.

The BIOS update from 2017-09-05 does not update the microcode to the
version the kernel wants to see. However, once Ubuntu 17.10 has been
installed, the intel-microcode package does. (It's probably a good
idea to update the BIOS for AMT and TPM bug fixes anyway.)

I left the box for installing proprietary drivers during installation
unchecked. I'm not sure whether checking the box would have installed the
nvidia proprietary drivers, but the point of going with 17.10 instead of
starting with 16.04.1 again is to use nouveau for OpenGL and avoid the
integration problems with the nvidia proprietary drivers.

The wait time during boot repeats with the installed system, but
during the wait, there's no text on the screen by default. Just wait.

On this system, with Ubuntu 17.10, nouveau seems to even qualify for
WebGL2 in Firefox.

There's a huge downside, though:
If the screen stops consuming the DisplayPort data stream, the
graphical session gets killed! So if you do normal things like turn
the screen off or switch input on a multi-input screen, your graphical
session is no longer there when you come back and you get a login
screen instead! (I haven't yet formed an opinion on whether this
behavior can be lived with or not.)

This applies to the live session on the install media, too. Therefore,
it's best to use another virtual console (ctrl-alt-F3) for restoring
backups. (GUI is now some weird dual existence in ctrl-alt-F1 and
ctrl-alt-F2.)

(Fedora 26 still doesn't boot on this hardware. I didn't try Fedora 27 beta.)

P.S. It would be good for productivity if Mozilla issued slightly less
cutting-edge Nvidia GPUs to developers to increase the probability
that support in nouveau has had time to bake.

-- 
Henri Sivonen
hsivo...@hsivonen.fi
https://hsivonen.fi/