Re: Bigger hard drives wanted (was Re: More ThinkStation P710 Nvidia tips (was Re: Faster gecko builds with IceCC on Mac and Linux))

2017-11-08 Thread Gregory Szorc
This thread is good feedback. I think changing the default to a 1TB SSD is
a reasonable request.

Please send any future comments regarding hardware to Sophana (
s...@mozilla.com) to increase the chances that feedback is acted on.

On Wed, Nov 8, 2017 at 9:09 AM, Julian Seward  wrote:

> On 08/11/17 17:28, Boris Zbarsky wrote:
>
> > The last desktop I was shipped came with a 512 GB drive.  [..]
> >
> > In practice, I routinely run out of disk space and have to delete
> > objdirs and rebuild them the next day, because I have to build
> > something else in a different srcdir...
>
> I totally agree.  I had a machine with a 512GB SSD and wound up in the
> same endless juggle/compress/delete-and-rebuild game.  I got a new machine
> with a 512GB SSD *and* a 1T HDD, and that helps a lot, although the perf
> hit from the HDD especially when linking libxul is terrible.
>
> J


Re: Bigger hard drives wanted (was Re: More ThinkStation P710 Nvidia tips (was Re: Faster gecko builds with IceCC on Mac and Linux))

2017-11-08 Thread Julian Seward
On 08/11/17 17:28, Boris Zbarsky wrote:

> The last desktop I was shipped came with a 512 GB drive.  [..]
>
> In practice, I routinely run out of disk space and have to delete
> objdirs and rebuild them the next day, because I have to build
> something else in a different srcdir...

I totally agree.  I had a machine with a 512GB SSD and wound up in the
same endless juggle/compress/delete-and-rebuild game.  I got a new machine
with a 512GB SSD *and* a 1TB HDD, and that helps a lot, although the perf
hit from the HDD, especially when linking libxul, is terrible.

J


Re: Bigger hard drives wanted (was Re: More ThinkStation P710 Nvidia tips (was Re: Faster gecko builds with IceCC on Mac and Linux))

2017-11-08 Thread Michael de Boer
I’d like to add the VM multiplier: I’m working mainly on OSX and run a Windows 
and a Linux VM in there, each with its own checkout and objdirs. Instead of 
allocating comfortably sized virtual disks, I end up resizing them quite 
frequently to avoid running out of space, because I keep them as small as 
possible to save room for OSX.
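
For what it's worth, growing a guest disk after the fact is doable but tedious.
A minimal sketch, assuming a VirtualBox VDI image (the path and size here are
made up, and the partition/filesystem inside the guest still has to be grown
separately):

  # grow the virtual disk to ~200 GB (the size is given in MB)
  VBoxManage modifymedium disk ~/VMs/win10-build.vdi --resize 204800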

Mike.

> On 8 Nov 2017, at 17:28, Boris Zbarsky  wrote:
> 
> On 11/7/17 4:13 PM, Sophana "Soap" Aik wrote:
>> Nothing is worse than hearing IT picked or chose hardware that nobody
>> actually wanted or will use.
> 
> If I could interject with a comment about the hardware we pick...
> 
> The last desktop I was shipped came with a 512 GB drive.  One of our srcdirs 
> is about 5-8GB nowadays (we seem to have mach commands that dump large stuff 
> in the srcdir).
> 
> Each objdir is 9+GB at least on Linux.  Figure 25GB for source + opt + debug.
> 
> For the work I do (e.g. backporting security fixes every so often) I need a 
> release tree, a beta tree, an ESR tree, and at least 3 tip trees.  That's at 
> least 150GB.  If I want to have an effective ccache, that's about 20-30GB 
> (recall that each objdir is 9+GB!).  Call it 175GB.
> 
> If I want to dual-boot or have a VM so I can do both Linux and Windows work, 
> that's 350GB.  Plus the actual operating systems involved.  Plus any data 
> files that might be being generated as part of work, etc.
> 
> In practice, I routinely run out of disk space and have to delete objdirs and 
> rebuild them the next day, because I have to build something else in a 
> different srcdir...
> 
> -Boris


Bigger hard drives wanted (was Re: More ThinkStation P710 Nvidia tips (was Re: Faster gecko builds with IceCC on Mac and Linux))

2017-11-08 Thread Boris Zbarsky

On 11/7/17 4:13 PM, Sophana "Soap" Aik wrote:

Nothing is worse than hearing IT picked or chose hardware that nobody
actually wanted or will use.


If I could interject with a comment about the hardware we pick...

The last desktop I was shipped came with a 512 GB drive.  One of our 
srcdirs is about 5-8GB nowadays (we seem to have mach commands that dump 
large stuff in the srcdir).


Each objdir is 9+GB at least on Linux.  Figure 25GB for source + opt + 
debug.


For the work I do (e.g. backporting security fixes every so often) I 
need a release tree, a beta tree, an ESR tree, and at least 3 tip 
trees.  That's at least 150GB.  If I want to have an effective ccache, 
that's about 20-30GB (recall that each objdir is 9+GB!).  Call it 175GB.


If I want to dual-boot or have a VM so I can do both Linux and Windows 
work, that's 350GB.  Plus the actual operating systems involved.  Plus 
any data files that might be being generated as part of work, etc.
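
To make the arithmetic concrete, here is roughly how that adds up, together 
with the commands I'd use to check where the space actually goes (the paths 
below are just illustrative, not a canonical layout):

  # 6 trees * ~25GB each = ~150GB, plus a 20-30GB ccache ~= 175GB;
  # double that for a Linux + Windows dual-boot or VM setup = ~350GB.
  du -sh ~/src/mozilla-central ~/src/mozilla-beta ~/src/mozilla-esr52
  du -sh ~/src/*/obj-*   # each objdir is 9+GB
  ccache -s              # reports the current cache size and its configured limit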


In practice, I routinely run out of disk space and have to delete 
objdirs and rebuild them the next day, because I have to build something 
else in a different srcdir...


-Boris


Re: More ThinkStation P710 Nvidia tips (was Re: Faster gecko builds with IceCC on Mac and Linux)

2017-11-08 Thread Sophana "Soap" Aik
Thanks Jeff, I understand your reasoning. 14 cores vs 10 is definitely
huge.

I will also add that there isn't anything stopping us from having more than
one config, just like we do with laptops.

I'm fortunate to be in a position to finally help you all have influence on
the type of hardware that makes sense for your use cases. Nothing is worse
than hearing that IT picked hardware that nobody actually wanted or will use.

I'll continue to pursue the Core i9 as an option; currently, though, there
aren't many OEM builders offering these yet.

On Tue, Nov 7, 2017 at 1:00 PM, Jeff Muizelaar 
wrote:

> The Core i9s are quite a bit cheaper than the Xeon Ws:
> https://ark.intel.com/products/series/125035/Intel-Xeon-Processor-W-Family
> vs
> https://ark.intel.com/products/126695
>
> I wouldn't want to trade ECC for 4 cores.
>
> -Jeff
>
> On Tue, Nov 7, 2017 at 3:51 PM, Sophana "Soap" Aik 
> wrote:
> > Kris has touched on the many advantages of having a standard model. From
> > what I am seeing with most people's use case scenario, only the GPU is
> what
> > will determine what the machine is used for. IE: VR Research team may
> end up
> > only needing a GPU upgrade.
> >
> > Fortunately the new W-Series Xeon's seem to be equal or better to the
> Core
> > i9's but with ECC support. So there's no sacrifice to performance in
> single
> > threaded or multi-threaded workloads.
> >
> > With all that said, we'll move forward with the evaluation machine and
> find
> > out for sure in real world testing. :)
> >
> >
> >
> > On Tue, Nov 7, 2017 at 12:30 PM, Kris Maglione 
> > wrote:
> >>
> >> On Tue, Nov 07, 2017 at 03:07:55PM -0500, Jeff Muizelaar wrote:
> >>>
> >>> On Mon, Nov 6, 2017 at 1:32 PM, Sophana "Soap" Aik 
> >>> wrote:
> 
>  Hi All,
> 
>  I'm in the middle of getting another evaluation machine with a 10-core
>  W-Series Xeon Processor (that is similar to the 7900X in terms of
> clock
>  speed and performance) but with ECC memory support.
> 
>  I'm trying to make sure this is a "one size fits all" machine as much
> as
>  possible.
> >>>
> >>>
> >>> What's the advantage of having a "one size fits all" machine? I
> >>> imagine there's quite a range of uses and preferences for these
> >>> machines. e.g some people are going to be spending more time waiting
> >>> for a single core and so would prefer a smaller core count and higher
> >>> clock, other people want a machine that's as wide as possible. Some
> >>> people would value performance over correctness and so would likely
> >>> not want ECC. etc. I've heard a number of horror stories of people
> >>> ending up with hardware that's not well suited to their tasks just
> >>> because that was the only hardware on the list.
> >>
> >>
> >> High core count Xeons will divert power from idle cores to increase the
> >> clock speed of saturated cores during mostly single-threaded workloads.
> >>
> >> The advantage of a one-size-fits-all machine is that it means more of us
> >> have the same hardware configuration, which means fewer of us running
> into
> >> independent issues, more of us being able to share software
> configurations
> >> that work well, easier purchasing and stocking of upgrades and
> accessories,
> >> ... I own a personal high-end Xeon workstation, and if every developer
> at
> >> the company had to go through the same teething and configuration
> troubles
> >> that I did while breaking it in, we would not be in a good place.
> >>
> >> And I don't really want to get into the weeds on ECC again, but the
> >> performance of load-reduced ECC is quite good, and the additional cost
> of
> >> ECC is very low compared to the cost of developer time over the two
> years
> >> that they're expected to use it.
> >
> >
> >
> >
> > --
> > moz://a
> > Sophana "Soap" Aik
> > IT Vendor Management Analyst
> > IRC/Slack: soap
>



-- 
moz://a
Sophana "Soap" Aik
IT Vendor Management Analyst
IRC/Slack: soap


Re: More ThinkStation P710 Nvidia tips (was Re: Faster gecko builds with IceCC on Mac and Linux)

2017-11-08 Thread Sophana "Soap" Aik
Kris has touched on the many advantages of having a standard model. From
what I am seeing of most people's use cases, only the GPU will determine what
the machine is used for; e.g., the VR Research team may end up only needing a
GPU upgrade.

Fortunately, the new W-Series Xeons seem to be equal to or better than the
Core i9s, but with ECC support, so there's no sacrifice in single-threaded or
multi-threaded performance.

With all that said, we'll move forward with the evaluation machine and find
out for sure in real-world testing. :)



On Tue, Nov 7, 2017 at 12:30 PM, Kris Maglione 
wrote:

> On Tue, Nov 07, 2017 at 03:07:55PM -0500, Jeff Muizelaar wrote:
>
>> On Mon, Nov 6, 2017 at 1:32 PM, Sophana "Soap" Aik 
>> wrote:
>>
>>> Hi All,
>>>
>>> I'm in the middle of getting another evaluation machine with a 10-core
>>> W-Series Xeon Processor (that is similar to the 7900X in terms of clock
>>> speed and performance) but with ECC memory support.
>>>
>>> I'm trying to make sure this is a "one size fits all" machine as much as
>>> possible.
>>>
>>
>> What's the advantage of having a "one size fits all" machine? I
>> imagine there's quite a range of uses and preferences for these
>> machines. e.g some people are going to be spending more time waiting
>> for a single core and so would prefer a smaller core count and higher
>> clock, other people want a machine that's as wide as possible. Some
>> people would value performance over correctness and so would likely
>> not want ECC. etc. I've heard a number of horror stories of people
>> ending up with hardware that's not well suited to their tasks just
>> because that was the only hardware on the list.
>>
>
> High core count Xeons will divert power from idle cores to increase the
> clock speed of saturated cores during mostly single-threaded workloads.
>
> The advantage of a one-size-fits-all machine is that it means more of us
> have the same hardware configuration, which means fewer of us running into
> independent issues, more of us being able to share software configurations
> that work well, easier purchasing and stocking of upgrades and accessories,
> ... I own a personal high-end Xeon workstation, and if every developer at
> the company had to go through the same teething and configuration troubles
> that I did while breaking it in, we would not be in a good place.
>
> And I don't really want to get into the weeds on ECC again, but the
> performance of load-reduced ECC is quite good, and the additional cost of
> ECC is very low compared to the cost of developer time over the two years
> that they're expected to use it.
>



-- 
moz://a
Sophana "Soap" Aik
IT Vendor Management Analyst
IRC/Slack: soap


Re: More ThinkStation P710 Nvidia tips (was Re: Faster gecko builds with IceCC on Mac and Linux)

2017-11-08 Thread Jean-Yves Avenard
With all this talk…

I’m eagerly waiting for the iMac Pro.

Best of all worlds really:
- High core count
- ECC RAM
- 5K 27” display
- Great graphic card
- Super silent…

I’ve been using a 2013 Mac Pro (the trash can one): an 8-core Xeon E5, 32 GB of 
ECC RAM, connected to two 27” screens (one 5K with DPI set at 200%, the other a 
2560x1440 Apple Thunderbolt display).

It runs Windows, Mac and Linux flawlessly (though under Linux I never managed 
to get more than one screen working at a time).

It compiles on Mac, even with Stylo, in under 12 minutes, and on Windows in 19 
minutes (it used to be 6 minutes and 12 minutes respectively before all this 
Rust came in)… And that’s using mach with only 14 jobs, so that I can continue 
to work on the machine without noticing it’s doing a CPU-intensive task. The UI 
stays ultra responsive.
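
(For anyone curious, capping the build at a fixed number of jobs is easy to do, 
either per invocation or in the mozconfig; a sketch:)

  ./mach build -j14                     # one-off: limit this build to 14 jobs
  # or persistently, by adding this line to the mozconfig:
  # mk_add_options MOZ_MAKE_FLAGS="-j14"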

And best of all, it’s sitting 60cm from my ear and I can’t hear anything at 
all…

This has been my primary machine since 2014; I’ve had no desire to upgrade, as 
no other machine would give me such a comfortable development environment 
across all the platforms we support.

At the beginning it was difficult to choose between the higher-frequency 
6-core and the 8-core model, but that turned out to be a moot issue: when only 
6 cores are running, the 8-core model clocks as high as the 6-core version…

The Mac Pro was an expensive machine, but seeing that it will last me longer 
than the usual machine, I do believe that in the long term it will be the best 
value for money.

My $0.02

> On 8 Nov 2017, at 8:43 am, Henri Sivonen  wrote:
> 
> I agree that workstation GPUs should be avoided. Even if they were as
> well supported by Linux distro-provided Open Source drivers as
> consumer GPUs, it's at the very least more difficult to find
> information about what's true about them.
> 
> We don't need the GPU to be at max spec like we need the CPU to be.
> The GPU doesn't affect build times, and for running Firefox it seems
> more useful to see how it runs with a consumer GPU.
> 
> I think we also shouldn't overdo multi-monitor *connectors* at the
> expense of Linux-compatibility, especially considering that
> DisplayPort is supposed to support monitor chaining behind one port on
> the graphics card. The Quadro M2000 that caused trouble for me had
> *four* DisplayPort connectors. Considering the number of ports vs.
> Linux distros Just Working, I'd expect prioritizing Linux distros
> Just Working to be more useful (as in letting developers write code
> instead of troubleshooting GPU issues) than having a "professional"
> number of connectors as the configuration offered to people who don't
> ask for a lot of connectors. (The specs for the older generation
> consumer-grade Radeon RX 460 claim 5 DisplayPort screens behind the
> one DisplayPort connector on the card, but I haven't verified it
> empirically, since I don't have that many screens to test with.)
> 
> On Tue, Nov 7, 2017 at 10:27 PM, Jeff Gilbert  wrote:
>> Avoid workstation GPUs if you can. At best, they're just a more
>> expensive consumer GPU. At worst, they may sacrifice performance we
>> care about in their optimization for CAD and modelling workloads, in
>> addition to moving us further away from testing what our users use. We
>> have no need for workstation GPUs, so we should avoid them if we can.
>> 
>> On Mon, Nov 6, 2017 at 10:32 AM, Sophana "Soap" Aik  wrote:
>>> Hi All,
>>> 
>>> I'm in the middle of getting another evaluation machine with a 10-core
>>> W-Series Xeon Processor (that is similar to the 7900X in terms of clock
>>> speed and performance) but with ECC memory support.
>>> 
>>> I'm trying to make sure this is a "one size fits all" machine as much as
>>> possible.
>>> 
>>> Also there are some AMD Radeon workstation GPU's that look interesting to
>>> me. The one I was thinking to include was a Radeon Pro WX2100, 2GB, FH
>>> (5820T) so we can start testing that as well.
>>> 
>>> Stay tuned...
>>> 
>>> On Mon, Nov 6, 2017 at 12:46 AM, Henri Sivonen  wrote:
>>> 
 Thank you for including an AMD card among the ones to be tested.
 
 - -
 
 The Radeon RX 460 mentioned earlier in this thread arrived. There was
 again enough weirdness that I think it's worth sharing in case it
 saves time for someone else:
 
 Initially, for multiple rounds of booting with different cable
 configurations, the Lenovo UEFI consistenly displayed nothing if a
 cable with a powered-on screen was plugged into the DisplayPort
 connector on the RX 460. To see the boot password prompt or anything
 else displayed by the Lenovo UEFI, I needed to connect a screen to the
 DVI port and *not* have a powered-on screen connected to DisplayPort.
 However, Lenovo UEFI started displaying on a DisplayPort-connected
 screen (with or without DVI also connected) after one time I had had a
 powered-on screen 

Re: More ThinkStation P710 Nvidia tips (was Re: Faster gecko builds with IceCC on Mac and Linux)

2017-11-08 Thread Mike Hommey
On Wed, Nov 08, 2017 at 09:43:29AM +0200, Henri Sivonen wrote:
> I agree that workstation GPUs should be avoided. Even if they were as
> well supported by Linux distro-provided Open Source drivers as
> consumer GPUs, it's at the very least more difficult to find
> information about what's true about them.
> 
> We don't need the GPU to be at max spec like we need the CPU to be.
> The GPU doesn't affect build times, and for running Firefox it seems
> more useful to see how it runs with a consumer GPU.
> 
> I think we also shouldn't overdo multi-monitor *connectors* at the
> expense of Linux-compatibility, especially considering that
> DisplayPort is supposed to support monitor chaining behind one port on
> the graphics card. The Quadro M2000 that caused trouble for me had
> *four* DisplayPort connectors. Considering the number of ports vs.
> Linux distros Just Working, I'd expect prioritizing Linux distros
> Just Working to be more useful (as in letting developers write code
> instead of troubleshooting GPU issues) than having a "professional"
> number of connectors as the configuration offered to people who don't
> ask for a lot of connectors. (The specs for the older generation
> consumer-grade Radeon RX 460 claim 5 DisplayPort screens behind the
> one DisplayPort connector on the card, but I haven't verified it
> empirically, since I don't have that many screens to test with.)

Yes, you can daisy-chain many monitors with DisplayPort, but there's a
bandwidth limit you need to be aware of.

DP 1.2 can only handle 4 HD screens at 60Hz, and *one* 4K screen at 60Hz.
DP 1.3 and 1.4 can "only" handle two 4K screens at 60Hz.
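
A back-of-the-envelope sketch of why (approximate uncompressed payload rates,
assuming 24-bit colour at 60Hz; values below are in tenths of a Gbit/s to keep
the arithmetic integral):

  # DP 1.2 (HBR2) usable payload: ~17.3 Gbit/s; DP 1.3/1.4 (HBR3): ~25.9 Gbit/s
  # 1080p60 needs roughly 3.2 Gbit/s with blanking; 4K60 roughly 12.5 Gbit/s
  echo "4K60 streams per DP 1.2 link: $(( 173 / 125 ))"   # -> 1
  echo "4K60 streams per DP 1.3 link: $(( 259 / 125 ))"   # -> 2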

Also, support for multi-screen over DP is usually flaky wrt hot-plug. At
least that's been my experience on both Linux and Windows, and I hear
Windows is actually worse. Also, I usually get my monitors set in a
different order when I upgrade the kernel. (And I'm only using two HD
monitors)

Mike


Re: More ThinkStation P710 Nvidia tips (was Re: Faster gecko builds with IceCC on Mac and Linux)

2017-11-07 Thread Henri Sivonen
I agree that workstation GPUs should be avoided. Even if they were as
well supported by Linux distro-provided Open Source drivers as
consumer GPUs, it's at the very least more difficult to find
information about what's true about them.

We don't need the GPU to be at max spec like we need the CPU to be.
The GPU doesn't affect build times, and for running Firefox it seems
more useful to see how it runs with a consumer GPU.

I think we also shouldn't overdo multi-monitor *connectors* at the
expense of Linux-compatibility, especially considering that
DisplayPort is supposed to support monitor chaining behind one port on
the graphics card. The Quadro M2000 that caused trouble for me had
*four* DisplayPort connectors. Considering the number of ports vs.
Linux distros Just Working, I'd expect prioritizing Linux distros
Just Working to be more useful (as in letting developers write code
instead of troubleshooting GPU issues) than having a "professional"
number of connectors as the configuration offered to people who don't
ask for a lot of connectors. (The specs for the older generation
consumer-grade Radeon RX 460 claim 5 DisplayPort screens behind the
one DisplayPort connector on the card, but I haven't verified it
empirically, since I don't have that many screens to test with.)

On Tue, Nov 7, 2017 at 10:27 PM, Jeff Gilbert  wrote:
> Avoid workstation GPUs if you can. At best, they're just a more
> expensive consumer GPU. At worst, they may sacrifice performance we
> care about in their optimization for CAD and modelling workloads, in
> addition to moving us further away from testing what our users use. We
> have no need for workstation GPUs, so we should avoid them if we can.
>
> On Mon, Nov 6, 2017 at 10:32 AM, Sophana "Soap" Aik  wrote:
>> Hi All,
>>
>> I'm in the middle of getting another evaluation machine with a 10-core
>> W-Series Xeon Processor (that is similar to the 7900X in terms of clock
>> speed and performance) but with ECC memory support.
>>
>> I'm trying to make sure this is a "one size fits all" machine as much as
>> possible.
>>
>> Also there are some AMD Radeon workstation GPU's that look interesting to
>> me. The one I was thinking to include was a Radeon Pro WX2100, 2GB, FH
>> (5820T) so we can start testing that as well.
>>
>> Stay tuned...
>>
>> On Mon, Nov 6, 2017 at 12:46 AM, Henri Sivonen  wrote:
>>
>>> Thank you for including an AMD card among the ones to be tested.
>>>
>>> - -
>>>
>>> The Radeon RX 460 mentioned earlier in this thread arrived. There was
>>> again enough weirdness that I think it's worth sharing in case it
>>> saves time for someone else:
>>>
>>> Initially, for multiple rounds of booting with different cable
>>> configurations, the Lenovo UEFI consistently displayed nothing if a
>>> cable with a powered-on screen was plugged into the DisplayPort
>>> connector on the RX 460. To see the boot password prompt or anything
>>> else displayed by the Lenovo UEFI, I needed to connect a screen to the
>>> DVI port and *not* have a powered-on screen connected to DisplayPort.
>>> However, Lenovo UEFI started displaying on a DisplayPort-connected
>>> screen (with or without DVI also connected) after one time I had had a
>>> powered-on screen connected to DVI and a powered-off screen connected
>>> to DisplayPort at the start of the boot and I turned on the
>>> DisplayPort screen while the DVI screen was displaying the UEFI
>>> password prompt. However, during that same boot, I happened to not to
>>> have a keyboard connected, because it was connected via the screen
>>> that was powered off, and this caused an UEFI error, so I don't know
>>> which of the DisplayPort device powering on during the UEFI phase or
>>> UEFI going through an error phase due to missing keyboard jolted it to
>>> use the DisplayPort screen properly subsequently. Weird.
>>>
>>> On the Linux side, the original Ubuntu 16.04 kernel (4.4) supported
>>> only a low resolution fallback mode. Rolling the hardware enablement
>>> stack forward (to 4.10 series kernel using the incantation given at
>>> https://wiki.ubuntu.com/Kernel/LTSEnablementStack ) fixed this and
>>> resulted in Firefox reporting WebGL2 and all. The fix for
>>> https://bugzilla.kernel.org/show_bug.cgi?id=191281 hasn't propagated
>>> to Ubuntu 16.04's latest HWE stack, which looks distressing during
>>> boot, but it seems harmless so far.
>>>
>>> I got the 4 GB model, since it was available at roughly the same price
>>> as the 2 GB model. It supports both screens I have available for
>>> testing at their full resolution simultaneously (2560x1440 plugged
>>> into DisplayPort and 1920x1200 plugged into DVI).
>>>
>>> The card is significantly larger than the Quadro M2000. It takes the
>>> space of two card slots (connects to one, but the heat sink and the
>>> dual fans take the space of another slot). The fans don't appear to
>>> make an audible difference compared to the Quadro M2000.
>>>
>>> 

Re: More ThinkStation P710 Nvidia tips (was Re: Faster gecko builds with IceCC on Mac and Linux)

2017-11-07 Thread Jeff Muizelaar
The Core i9s are quite a bit cheaper than the Xeon Ws:
https://ark.intel.com/products/series/125035/Intel-Xeon-Processor-W-Family vs
https://ark.intel.com/products/126695

I wouldn't want to trade ECC for 4 cores.

-Jeff

On Tue, Nov 7, 2017 at 3:51 PM, Sophana "Soap" Aik  wrote:
> Kris has touched on the many advantages of having a standard model. From
> what I am seeing of most people's use cases, only the GPU will determine what
> the machine is used for; e.g., the VR Research team may end up only needing a
> GPU upgrade.
>
> Fortunately, the new W-Series Xeons seem to be equal to or better than the
> Core i9s, but with ECC support, so there's no sacrifice in single-threaded or
> multi-threaded performance.
>
> With all that said, we'll move forward with the evaluation machine and find
> out for sure in real world testing. :)
>
>
>
> On Tue, Nov 7, 2017 at 12:30 PM, Kris Maglione 
> wrote:
>>
>> On Tue, Nov 07, 2017 at 03:07:55PM -0500, Jeff Muizelaar wrote:
>>>
>>> On Mon, Nov 6, 2017 at 1:32 PM, Sophana "Soap" Aik 
>>> wrote:

 Hi All,

 I'm in the middle of getting another evaluation machine with a 10-core
 W-Series Xeon Processor (that is similar to the 7900X in terms of clock
 speed and performance) but with ECC memory support.

 I'm trying to make sure this is a "one size fits all" machine as much as
 possible.
>>>
>>>
>>> What's the advantage of having a "one size fits all" machine? I
>>> imagine there's quite a range of uses and preferences for these
>>> machines. e.g some people are going to be spending more time waiting
>>> for a single core and so would prefer a smaller core count and higher
>>> clock, other people want a machine that's as wide as possible. Some
>>> people would value performance over correctness and so would likely
>>> not want ECC. etc. I've heard a number of horror stories of people
>>> ending up with hardware that's not well suited to their tasks just
>>> because that was the only hardware on the list.
>>
>>
>> High core count Xeons will divert power from idle cores to increase the
>> clock speed of saturated cores during mostly single-threaded workloads.
>>
>> The advantage of a one-size-fits-all machine is that it means more of us
>> have the same hardware configuration, which means fewer of us running into
>> independent issues, more of us being able to share software configurations
>> that work well, easier purchasing and stocking of upgrades and accessories,
>> ... I own a personal high-end Xeon workstation, and if every developer at
>> the company had to go through the same teething and configuration troubles
>> that I did while breaking it in, we would not be in a good place.
>>
>> And I don't really want to get into the weeds on ECC again, but the
>> performance of load-reduced ECC is quite good, and the additional cost of
>> ECC is very low compared to the cost of developer time over the two years
>> that they're expected to use it.
>
>
>
>
> --
> moz://a
> Sophana "Soap" Aik
> IT Vendor Management Analyst
> IRC/Slack: soap


Re: More ThinkStation P710 Nvidia tips (was Re: Faster gecko builds with IceCC on Mac and Linux)

2017-11-07 Thread Jeff Gilbert
If you don't want to get into the weeds on ECC again, please do not
reinitiate discussion. I do not agree that "the additional cost of ECC
is very low compared to the cost of developer time over the two years
that they're expected to use it", but I will restrict my disagreement
to the forked thread that you created. Please repost there.

On Tue, Nov 7, 2017 at 12:30 PM, Kris Maglione  wrote:
> On Tue, Nov 07, 2017 at 03:07:55PM -0500, Jeff Muizelaar wrote:
>>
>> On Mon, Nov 6, 2017 at 1:32 PM, Sophana "Soap" Aik 
>> wrote:
>>>
>>> Hi All,
>>>
>>> I'm in the middle of getting another evaluation machine with a 10-core
>>> W-Series Xeon Processor (that is similar to the 7900X in terms of clock
>>> speed and performance) but with ECC memory support.
>>>
>>> I'm trying to make sure this is a "one size fits all" machine as much as
>>> possible.
>>
>>
>> What's the advantage of having a "one size fits all" machine? I
>> imagine there's quite a range of uses and preferences for these
>> machines. e.g some people are going to be spending more time waiting
>> for a single core and so would prefer a smaller core count and higher
>> clock, other people want a machine that's as wide as possible. Some
>> people would value performance over correctness and so would likely
>> not want ECC. etc. I've heard a number of horror stories of people
>> ending up with hardware that's not well suited to their tasks just
>> because that was the only hardware on the list.
>
>
> High core count Xeons will divert power from idle cores to increase the
> clock speed of saturated cores during mostly single-threaded workloads.
>
> The advantage of a one-size-fits-all machine is that it means more of us
> have the same hardware configuration, which means fewer of us running into
> independent issues, more of us being able to share software configurations
> that work well, easier purchasing and stocking of upgrades and accessories,
> ... I own a personal high-end Xeon workstation, and if every developer at
> the company had to go through the same teething and configuration troubles
> that I did while breaking it in, we would not be in a good place.
>
> And I don't really want to get into the weeds on ECC again, but the
> performance of load-reduced ECC is quite good, and the additional cost of
> ECC is very low compared to the cost of developer time over the two years
> that they're expected to use it.
>


Re: More ThinkStation P710 Nvidia tips (was Re: Faster gecko builds with IceCC on Mac and Linux)

2017-11-07 Thread Kris Maglione

On Tue, Nov 07, 2017 at 03:07:55PM -0500, Jeff Muizelaar wrote:

On Mon, Nov 6, 2017 at 1:32 PM, Sophana "Soap" Aik  wrote:

Hi All,

I'm in the middle of getting another evaluation machine with a 10-core
W-Series Xeon Processor (that is similar to the 7900X in terms of clock
speed and performance) but with ECC memory support.

I'm trying to make sure this is a "one size fits all" machine as much as
possible.


What's the advantage of having a "one size fits all" machine? I
imagine there's quite a range of uses and preferences for these
machines. e.g some people are going to be spending more time waiting
for a single core and so would prefer a smaller core count and higher
clock, other people want a machine that's as wide as possible. Some
people would value performance over correctness and so would likely
not want ECC. etc. I've heard a number of horror stories of people
ending up with hardware that's not well suited to their tasks just
because that was the only hardware on the list.


High core count Xeons will divert power from idle cores to increase the 
clock speed of saturated cores during mostly single-threaded workloads.


The advantage of a one-size-fits-all machine is that it means more of us 
have the same hardware configuration, which means fewer of us running 
into independent issues, more of us being able to share software 
configurations that work well, easier purchasing and stocking of 
upgrades and accessories, ... I own a personal high-end Xeon 
workstation, and if every developer at the company had to go through the 
same teething and configuration troubles that I did while breaking it 
in, we would not be in a good place.


And I don't really want to get into the weeds on ECC again, but the 
performance of load-reduced ECC is quite good, and the additional cost 
of ECC is very low compared to the cost of developer time over the two 
years that they're expected to use it.



Re: More ThinkStation P710 Nvidia tips (was Re: Faster gecko builds with IceCC on Mac and Linux)

2017-11-07 Thread Sophana "Soap" Aik
Hi All,

I'm in the middle of getting another evaluation machine with a 10-core
W-Series Xeon Processor (that is similar to the 7900X in terms of clock
speed and performance) but with ECC memory support.

I'm trying to make sure this is a "one size fits all" machine as much as
possible.

Also, there are some AMD Radeon workstation GPUs that look interesting to
me. The one I was thinking of including is a Radeon Pro WX2100, 2GB, FH
(5820T), so we can start testing that as well.

Stay tuned...

On Mon, Nov 6, 2017 at 12:46 AM, Henri Sivonen  wrote:

> Thank you for including an AMD card among the ones to be tested.
>
> - -
>
> The Radeon RX 460 mentioned earlier in this thread arrived. There was
> again enough weirdness that I think it's worth sharing in case it
> saves time for someone else:
>
> Initially, for multiple rounds of booting with different cable
> configurations, the Lenovo UEFI consistently displayed nothing if a
> cable with a powered-on screen was plugged into the DisplayPort
> connector on the RX 460. To see the boot password prompt or anything
> else displayed by the Lenovo UEFI, I needed to connect a screen to the
> DVI port and *not* have a powered-on screen connected to DisplayPort.
> However, Lenovo UEFI started displaying on a DisplayPort-connected
> screen (with or without DVI also connected) after one time I had had a
> powered-on screen connected to DVI and a powered-off screen connected
> to DisplayPort at the start of the boot and I turned on the
> DisplayPort screen while the DVI screen was displaying the UEFI
> password prompt. However, during that same boot, I happened to not to
> have a keyboard connected, because it was connected via the screen
> that was powered off, and this caused an UEFI error, so I don't know
> which of the DisplayPort device powering on during the UEFI phase or
> UEFI going through an error phase due to missing keyboard jolted it to
> use the DisplayPort screen properly subsequently. Weird.
>
> On the Linux side, the original Ubuntu 16.04 kernel (4.4) supported
> only a low resolution fallback mode. Rolling the hardware enablement
> stack forward (to 4.10 series kernel using the incantation given at
> https://wiki.ubuntu.com/Kernel/LTSEnablementStack ) fixed this and
> resulted in Firefox reporting WebGL2 and all. The fix for
> https://bugzilla.kernel.org/show_bug.cgi?id=191281 hasn't propagated
> to Ubuntu 16.04's latest HWE stack, which looks distressing during
> boot, but it seems harmless so far.
>
> I got the 4 GB model, since it was available at roughly the same price
> as the 2 GB model. It supports both screens I have available for
> testing at their full resolution simultaneously (2560x1440 plugged
> into DisplayPort and 1920x1200 plugged into DVI).
>
> The card is significantly larger than the Quadro M2000. It takes the
> space of two card slots (connects to one, but the heat sink and the
> dual fans take the space of another slot). The fans don't appear to
> make an audible difference compared to the Quadro M2000.
>
> On Fri, Oct 27, 2017 at 6:19 PM, Sophana "Soap" Aik 
> wrote:
> > Thank you Henri for the feedback.
> >
> > How about this, we can order some graphics cards and put them in the
> > evaluation/test machine that is with Greg, to make sure it has good
> > compatibility.
> >
> > We could do:
> > Nvidia GTX 1060 3GB
> > AMD Radeon RX570
> >
> > These two options will ensure it can drive multi displays.
> >
> > Other suggestions welcomed.
> >
> > Greg, is that something you think we should do?
> >
> > On Thu, Oct 26, 2017 at 11:33 PM, Henri Sivonen 
> > wrote:
> >>
> >> On Fri, Oct 27, 2017 at 4:48 AM, Sophana "Soap" Aik 
> >> wrote:
> >> > Hello everyone, great feedback that I will keep in mind and continue
> to
> >> > work
> >> > with our vendors to find the best solution with. One of the cards
> that I
> >> > was
> >> > looking at is fairly cheap and can at least drive multi-displays (even
> >> > 4K
> >> > 60hz) was the Nvidia Quadro P600.
> >>
> >> Is that GPU known to be well-supported by Nouveau of Ubuntu 16.04
> vintage?
> >>
> >> I don't want to deny a single-GPU multi-monitor setup to anyone for
> >> whom that's the priority, but considering how much damage the Quadro
> >> M2000 has done to my productivity (and from what I've heard from other
> >> people on the DOM team, I gather I'm not the only one who has had
> >> trouble with it), the four DisplayPort connectors on it look like very
> >> bad economics.
> >>
> >> I suggest these two criteria be considered for developer workstations
> >> in addition to build performance:
> >>  1) The CPU is compatible with rr (at present, this means that the CPU
> >> has to be from Intel and not from AMD)
> >>  2) The GPU offered by default (again, I don't want to deny multiple
> >> DisplayPort connectors on a single GPU to people who request them)
> >> works well in OpenGL mode (i.e. without llvmpipe activating) 

Re: More ThinkStation P710 Nvidia tips (was Re: Faster gecko builds with IceCC on Mac and Linux)

2017-11-06 Thread Jeff Gilbert
My understanding of current policy is that ECC is not required (and it's
not even an option with MacBook Pros). Given the volume of development
that happens unhindered on our developers' many, many non-ECC
machines, I believe the burden of proof is on the pro-ECC
argument to show that it's likely to be a worthwhile investment for
our use-cases.

As evidence that the lack of ECC is a non-issue, I call to witness
the vast majority of Firefox development, most applicably the portion
done in the last ten years, and especially all macOS development
excluding the very few Mac Pros we have.

If we've given developers ECC machines already when non-ECC was an
option, absent a positive request for ECC from the developer, I would
consider this to have been a minor mistake.

On Mon, Nov 6, 2017 at 3:03 PM, Gabriele Svelto  wrote:
> On 06/11/2017 22:44, Jeff Gilbert wrote:
>> Price matters, since every dollar we spend chasing ECC would be a
>> dollar we can't allocate towards perf improvements, hardware refresh
>> rate, or simply more machines for any build clusters we may want.
>
> And every day our developers or IT staff waste chasing apparently random
> issues is a waste of both money and time.
>
>> The paper linked above addresses massive compute clusters, which seems
>> to have limited implications for our use-cases.
>
> The clusters are 6000 and 8500 nodes respectively, quite small by
> today's standards. How many developers do we have? Hundreds for sure, it
> could be a thousand looking at our current headcount so we're in the
> same ballpark.
>
>> Nearly every machine we do development on does not currently use ECC.
>> I don't see why that should change now.
>
> Not true. The current Xeon E5-based ThinkStation P710 available from
> Service Now has ECC memory and so did the previous models in the last
> five years. Having a workstation available w/o ECC would actually be a
> step backwards.
>
>> To me, ECC for desktop compute
>> workloads crosses the line into jumping at shadows, since "restart
>> your machine slightly more often than otherwise" is not onerous.
> Do you have data to prove that this is not an issue?
>
>  Gabriele
>


Re: More ThinkStation P710 Nvidia tips (was Re: Faster gecko builds with IceCC on Mac and Linux)

2017-11-06 Thread Gabriele Svelto
On 06/11/2017 22:44, Jeff Gilbert wrote:
> Price matters, since every dollar we spend chasing ECC would be a
> dollar we can't allocate towards perf improvements, hardware refresh
> rate, or simply more machines for any build clusters we may want.

And every day our developers or IT staff waste chasing apparently random
issues is a waste of both money and time.

> The paper linked above addresses massive compute clusters, which seems
> to have limited implications for our use-cases.

The clusters are 6000 and 8500 nodes respectively, quite small by
today's standards. How many developers do we have? Hundreds for sure; it
could be a thousand looking at our current headcount, so we're in the
same ballpark.

> Nearly every machine we do development on does not currently use ECC.
> I don't see why that should change now.

Not true. The current Xeon E5-based ThinkStation P710 available from
ServiceNow has ECC memory, and so did the previous models over the last
five years. Having a workstation available without ECC would actually be a
step backwards.

> To me, ECC for desktop compute
> workloads crosses the line into jumping at shadows, since "restart
> your machine slightly more often than otherwise" is not onerous.
Do you have data to prove that this is not an issue?

 Gabriele





Re: More ThinkStation P710 Nvidia tips (was Re: Faster gecko builds with IceCC on Mac and Linux)

2017-11-06 Thread Jeff Gilbert
Price matters, since every dollar we spend chasing ECC would be a
dollar we can't allocate towards perf improvements, hardware refresh
rate, or simply more machines for any build clusters we may want.

The paper linked above addresses massive compute clusters, which seems
to have limited implications for our use-cases.

Nearly every machine we do development on does not currently use ECC.
I don't see why that should change now. To me, ECC for desktop compute
workloads crosses the line into jumping at shadows, since "restart
your machine slightly more often than otherwise" is not onerous.

On Mon, Nov 6, 2017 at 9:19 AM, Gregory Szorc  wrote:
>
>
>> On Nov 6, 2017, at 05:19, Gabriele Svelto  wrote:
>>
>>> On 04/11/2017 01:10, Jeff Gilbert wrote:
>>> Clock speed and core count matter much more than ECC. I wouldn't chase
>>> ECC support for general dev machines.
>>
>> The Xeon-W SKUs I posted in the previous thread all had identical or
>> higher clock speeds than equivalent Core i9 SKUs and ECC support with
>> the sole exception of the i9-7980XE which has slightly higher (100MHz)
>> peak turbo clock than the Xeon W-2195.
>>
>> There is IMHO no performance-related reason to skimp on ECC support
>> especially for machines that will sport a significant amount of memory.
>>
>> Importance of ECC memory is IMHO underestimated mostly because it's not
>> common and thus users do not realize they may be hitting memory errors
>> more frequently than they realize. My main workstation is now 5 years
>> old and has accumulated 24 memory errors; that may not seem much but if
>> it happens at a bad time, or in a bad place, they can ruin your day or
>> permanently corrupt your data.
>>
>> As another example of ECC importance my laptop (obviously) doesn't have
>> ECC support and two years ago had a single bit that went bad in the
>> second DIMM. The issue manifested itself as internal compiler errors
>> while building Fennec. The first time I just pulled again from central
>> thinking it was a fluke, the second I updated the build dependencies
>> which I hadn't done in a while thinking that an old GCC might have been
>> the cause. It was not until the third day with a failure that I realized
>> what was happening. A 2-hours long memory test showed me the second DIMM
>> was bad so I removed it, ordered a new one and went on to check my
>> machine. I had to purge my compilation cache because garbage had
>> accumulated in there, run an hg verify on my repo as well as verifying
>> all the installed packages for errors. Since I didn't have access to my
>> main workstation at the time I had wasted 3 days chasing the issue and
>> my workflow was slowed down by a cold compilation cache and a gimped
>> machine (until I could replace the DIMM).
>>
>> This is not common, but it's not rare either and we now have hundreds of
>> developers within Mozilla so people are going to run into issues that
>> can be easily prevented by having ECC memory.
>>
>> That being said ECC memory also makes machines less susceptible to
>> Rowhammer-like attacks and makes them detectable while they are happening.
>>
>> For a more in-depth reading on the matter I suggest reading "Memory
>> Errors in Modern Systems - The Good, The Bad, and The Ugly" [1] in which
>> the authors analyze memory errors on live systems over two years and
>> argue that SEC-DED ECC (the type of protection you usually get on
>> workstations) is often insufficient and even chipkill ECC (now common on
>> most servers) is not enough to catch all errors happening during real
>> world use.
>>
>> Gabriele
>>
>> [1] https://www.cs.virginia.edu/~gurumurthi/papers/asplos15.pdf
>>
>
> The Xeon-W’s are basically the i9’s (both Skylake-X) with support for ECC, 
> more vPRO, and AMT. The Xeon-W’s lack Turbo 3.0 (preferred core). However, 
> Turbo 2.0 apparently reaches the same MHz, so I don’t think it matters much. 
> There are some other differences with regards to PCIe lanes, chipset, etc.
>
> Another big difference is price. The Xeon’s cost a lot more.
>
> For building Firefox, the i9’s and Xeon-W are probably very similar (and is 
> something we should test). It likely comes down to whether you want to pay a 
> premium for ECC and other Xeon-W features. I’m not in a position to answer 
> that.


Re: More ThinkStation P710 Nvidia tips (was Re: Faster gecko builds with IceCC on Mac and Linux)

2017-11-06 Thread Gregory Szorc


> On Nov 6, 2017, at 05:19, Gabriele Svelto  wrote:
> 
>> On 04/11/2017 01:10, Jeff Gilbert wrote:
>> Clock speed and core count matter much more than ECC. I wouldn't chase
>> ECC support for general dev machines.
> 
> The Xeon-W SKUs I posted in the previous thread all had identical or
> higher clock speeds than equivalent Core i9 SKUs and ECC support with
> the sole exception of the i9-7980XE which has slightly higher (100MHz)
> peak turbo clock than the Xeon W-2195.
> 
> There is IMHO no performance-related reason to skimp on ECC support
> especially for machines that will sport a significant amount of memory.
> 
> Importance of ECC memory is IMHO underestimated mostly because it's not
> common and thus users do not realize they may be hitting memory errors
> more frequently than they realize. My main workstation is now 5 years
> old and has accumulated 24 memory errors; that may not seem much but if
> it happens at a bad time, or in a bad place, they can ruin your day or
> permanently corrupt your data.
> 
> As another example of ECC importance my laptop (obviously) doesn't have
> ECC support and two years ago had a single bit that went bad in the
> second DIMM. The issue manifested itself as internal compiler errors
> while building Fennec. The first time I just pulled again from central
> thinking it was a fluke, the second I updated the build dependencies
> which I hadn't done in a while thinking that an old GCC might have been
> the cause. It was not until the third day with a failure that I realized
> what was happening. A 2-hours long memory test showed me the second DIMM
> was bad so I removed it, ordered a new one and went on to check my
> machine. I had to purge my compilation cache because garbage had
> accumulated in there, run an hg verify on my repo as well as verifying
> all the installed packages for errors. Since I didn't have access to my
> main workstation at the time I had wasted 3 days chasing the issue and
> my workflow was slowed down by a cold compilation cache and a gimped
> machine (until I could replace the DIMM).
> 
> This is not common, but it's not rare either and we now have hundreds of
> developers within Mozilla so people are going to run into issues that
> can be easily prevented by having ECC memory.
> 
> That being said ECC memory also makes machines less susceptible to
> Rowhammer-like attacks and makes them detectable while they are happening.
> 
> For a more in-depth reading on the matter I suggest reading "Memory
> Errors in Modern Systems - The Good, The Bad, and The Ugly" [1] in which
> the authors analyze memory errors on live systems over two years and
> argue that SEC-DED ECC (the type of protection you usually get on
> workstations) is often insufficient and even chipkill ECC (now common on
> most servers) is not enough to catch all errors happening during real
> world use.
> 
> Gabriele
> 
> [1] https://www.cs.virginia.edu/~gurumurthi/papers/asplos15.pdf
> 

The Xeon-Ws are basically the i9s (both Skylake-X) with support for ECC, more 
vPro features, and AMT. The Xeon-Ws lack Turbo Boost Max 3.0 (preferred core). 
However, Turbo Boost 2.0 apparently reaches the same MHz, so I don’t think it 
matters much. There are some other differences with regard to PCIe lanes, 
chipset, etc.

Another big difference is price. The Xeons cost a lot more.

For building Firefox, the i9s and Xeon-W are probably very similar (and that is 
something we should test). It likely comes down to whether you want to pay a 
premium for ECC and other Xeon-W features. I’m not in a position to answer that.


Re: More ThinkStation P710 Nvidia tips (was Re: Faster gecko builds with IceCC on Mac and Linux)

2017-11-06 Thread Gabriele Svelto
On 04/11/2017 01:10, Jeff Gilbert wrote:
> Clock speed and core count matter much more than ECC. I wouldn't chase
> ECC support for general dev machines.

The Xeon-W SKUs I posted in the previous thread all had identical or
higher clock speeds than equivalent Core i9 SKUs and ECC support with
the sole exception of the i9-7980XE which has slightly higher (100MHz)
peak turbo clock than the Xeon W-2195.

There is IMHO no performance-related reason to skimp on ECC support
especially for machines that will sport a significant amount of memory.

The importance of ECC memory is IMHO underestimated, mostly because it's not
common and thus users do not realize they may be hitting memory errors
more frequently than they think. My main workstation is now 5 years
old and has accumulated 24 memory errors; that may not seem like much, but if
an error happens at a bad time, or in a bad place, it can ruin your day or
permanently corrupt your data.
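
(For reference, on a Linux box with ECC the kernel's EDAC counters are one way
to see whether corrected errors have been piling up; a sketch, and the exact
sysfs layout depends on the kernel and memory-controller driver:)

  grep -H . /sys/devices/system/edac/mc/mc*/ce_count   # corrected errors per controller
  grep -H . /sys/devices/system/edac/mc/mc*/ue_count   # uncorrected errors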

As another example of ECC's importance, my laptop (obviously) doesn't have
ECC support and two years ago had a single bit go bad in the
second DIMM. The issue manifested itself as internal compiler errors
while building Fennec. The first time I just pulled again from central
thinking it was a fluke; the second time I updated the build dependencies,
which I hadn't done in a while, thinking that an old GCC might have been
the cause. It was not until the third day with a failure that I realized
what was happening. A 2-hour-long memory test showed me the second DIMM
was bad, so I removed it, ordered a new one and went on to check my
machine. I had to purge my compilation cache because garbage had
accumulated in there, run an hg verify on my repo, and verify
all the installed packages for errors. Since I didn't have access to my
main workstation at the time I had wasted 3 days chasing the issue and
my workflow was slowed down by a cold compilation cache and a gimped
machine (until I could replace the DIMM).
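
For anyone who ends up in the same spot, the cleanup described above looks
roughly like this (a sketch, not a prescribed procedure; the package
verification step depends on the distro, and the bad DIMM itself was found
with a bootable memory tester such as memtest86+):

  ccache -C         # throw away the possibly-corrupted compilation cache
  hg verify         # check repository integrity
  sudo debsums -s   # Debian/Ubuntu: list packages whose files fail their checksums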

This is not common, but it's not rare either, and we now have hundreds of
developers within Mozilla, so people are going to run into issues that
could easily be prevented by having ECC memory.

That being said, ECC memory also makes machines less susceptible to
Rowhammer-like attacks and makes such attacks detectable while they are happening.

For a more in-depth reading on the matter I suggest reading "Memory
Errors in Modern Systems - The Good, The Bad, and The Ugly" [1] in which
the authors analyze memory errors on live systems over two years and
argue that SEC-DED ECC (the type of protection you usually get on
workstations) is often insufficient and even chipkill ECC (now common on
most servers) is not enough to catch all errors happening during real
world use.

 Gabriele

[1] https://www.cs.virginia.edu/~gurumurthi/papers/asplos15.pdf





Re: More ThinkStation P710 Nvidia tips (was Re: Faster gecko builds with IceCC on Mac and Linux)

2017-11-06 Thread Henri Sivonen
Thank you for including an AMD card among the ones to be tested.

- -

The Radeon RX 460 mentioned earlier in this thread arrived. There was
again enough weirdness that I think it's worth sharing in case it
saves time for someone else:

Initially, for multiple rounds of booting with different cable
configurations, the Lenovo UEFI consistently displayed nothing if a
cable with a powered-on screen was plugged into the DisplayPort
connector on the RX 460. To see the boot password prompt or anything
else displayed by the Lenovo UEFI, I needed to connect a screen to the
DVI port and *not* have a powered-on screen connected to DisplayPort.
However, the Lenovo UEFI started displaying on a DisplayPort-connected
screen (with or without DVI also connected) after one boot where I had
a powered-on screen connected to DVI and a powered-off screen connected
to DisplayPort at the start of the boot, and I turned on the
DisplayPort screen while the DVI screen was displaying the UEFI
password prompt. However, during that same boot I happened not to have
a keyboard connected (it was connected via the screen that was powered
off), and this caused a UEFI error, so I don't know whether it was the
DisplayPort device powering on during the UEFI phase or UEFI going
through an error phase due to the missing keyboard that jolted it into
using the DisplayPort screen properly afterwards. Weird.

On the Linux side, the original Ubuntu 16.04 kernel (4.4) supported
only a low resolution fallback mode. Rolling the hardware enablement
stack forward (to 4.10 series kernel using the incantation given at
https://wiki.ubuntu.com/Kernel/LTSEnablementStack ) fixed this and
resulted in Firefox reporting WebGL2 and all. The fix for
https://bugzilla.kernel.org/show_bug.cgi?id=191281 hasn't propagated
to Ubuntu 16.04's latest HWE stack, which looks distressing during
boot, but it seems harmless so far.
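
(For reference, the incantation on that wiki page for 16.04 is, if I remember
right, along these lines; check the page itself for the current package names:)

  sudo apt-get install --install-recommends linux-generic-hwe-16.04 xserver-xorg-hwe-16.04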

I got the 4 GB model, since it was available at roughly the same price
as the 2 GB model. It supports both screens I have available for
testing at their full resolution simultaneously (2560x1440 plugged
into DisplayPort and 1920x1200 plugged into DVI).

The card is significantly larger than the Quadro M2000. It takes the
space of two card slots (connects to one, but the heat sink and the
dual fans take the space of another slot). The fans don't appear to
make an audible difference compared to the Quadro M2000.

On Fri, Oct 27, 2017 at 6:19 PM, Sophana "Soap" Aik  wrote:
> Thank you Henri for the feedback.
>
> How about this, we can order some graphics cards and put them in the
> evaluation/test machine that is with Greg, to make sure it has good
> compatibility.
>
> We could do:
> Nvidia GTX 1060 3GB
> AMD Radeon RX570
>
> These two options will ensure it can drive multi displays.
>
> Other suggestions welcomed.
>
> Greg, is that something you think we should do?
>
> On Thu, Oct 26, 2017 at 11:33 PM, Henri Sivonen 
> wrote:
>>
>> On Fri, Oct 27, 2017 at 4:48 AM, Sophana "Soap" Aik 
>> wrote:
>> > Hello everyone, great feedback that I will keep in mind and continue to
>> > work
>> > with our vendors to find the best solution with. One of the cards that I
>> > was
>> > looking at is fairly cheap and can at least drive multi-displays (even
>> > 4K
>> > 60hz) was the Nvidia Quadro P600.
>>
>> Is that GPU known to be well-supported by Nouveau of Ubuntu 16.04 vintage?
>>
>> I don't want to deny a single-GPU multi-monitor setup to anyone for
>> whom that's the priority, but considering how much damage the Quadro
>> M2000 has done to my productivity (and from what I've heard from other
>> people on the DOM team, I gather I'm not the only one who has had
>> trouble with it), the four DisplayPort connectors on it look like very
>> bad economics.
>>
>> I suggest these two criteria be considered for developer workstations
>> in addition to build performance:
>>  1) The CPU is compatible with rr (at present, this means that the CPU
>> has to be from Intel and not from AMD)
>>  2) The GPU offered by default (again, I don't want to deny multiple
>> DisplayPort connectors on a single GPU to people who request them)
>> works well in OpenGL mode (i.e. without llvmpipe activating) without
>> freezes using the Open Source drivers included in Ubuntu LTS and
>> Fedora.
>>
>> On Fri, Oct 27, 2017 at 2:36 AM, Gregory Szorc  wrote:
>> > Host OS matters for finding UI bugs and issues with add-ons (since lots
>> > of
>> > add-on developers are also on Linux or MacOS).
>>
>> I think it's a bad tradeoff to trade off the productivity of
>> developers working on the cross-platform core of Firefox in order to
>> get them to report Windows-specific bugs. We have people in the
>> organization who aren't developing the cross-platform core and who are
>> running Windows anyway. I'd prefer the energy currently put into
>> getting developers of the cross-platform core to use Windows to be put
>> into getting the people who use Windows anyway to use Nightly.

Re: More ThinkStation P710 Nvidia tips (was Re: Faster gecko builds with IceCC on Mac and Linux)

2017-11-03 Thread Jeff Gilbert
Clock speed and core count matter much more than ECC. I wouldn't chase
ECC support for general dev machines.

On Thu, Nov 2, 2017 at 6:46 PM, Gregory Szorc  wrote:
> On Thu, Nov 2, 2017 at 3:43 PM, Nico Grunbaum  wrote:
>
>> For rr I have an i7 desktop with a base clock of 4.0 Ghz, and for building
>> I use icecc to distribute the load (or rather I will be again when bug
>> 1412240[0] is closed).  The i9 series has lower base clocks (2.8 Ghz, and
>> 2.6Ghz for the top SKUs)[1], but high boost clocks of 4.2 Ghz.  If I were
>> to switch over to an i9 for everything, would I see a notable difference in
>> performance in rr?
>>
>
> Which i7? You should get better CPU efficiency with newer
> microarchitectures. The i9's we're talking about are based on Skylake-X
> which is based on Skylake which are the i7-6XXX models in the consumer
> lines. It isn't enough to compare MHz: you need to also consider
> microarchitectures, memory, and workload.
>
> https://arstechnica.com/gadgets/2017/09/intel-core-i9-7960x-review/2/ has
> some single-threaded benchmarks. The i7-7700K (Kaby Lake) seems to "win"
> for single-threaded performance. But the i9's aren't far behind. Not far
> enough behind to cancel out the benefits of the extra cores IMO.
>
> This is because the i9's are pretty aggressive about using turbo. More
> aggressive than the Xeons. As long as cooling can keep up, the top-end GHz
> is great and you aren't sacrificing that much perf to have more cores on
> die. You can counter by arguing that the consumer-grade i7's can yield more
> speedups via overclocking. But for enterprise uses, having this all built
> into the chip so it "just works" without voiding warranty is a nice trait :)
>
> FWIW, the choice to go with Xeons always bothered me because we had to make
> an explicit clock vs core trade-off. Building Firefox requires both many
> cores for compiling and fast cores for linking. Since the i9's turbo so
> well, we get the best of both worlds. And at a much lower price. Aside from
> the loss of ECC, it is a pretty easy decision to switch.
>
>
>> -Nico
>>
>> [0] https://bugzilla.mozilla.org/show_bug.cgi?id=1412240 Build failure in
>> libavutil (missing atomic definitions), when building with clang and icecc
>>
>> [1] https://ark.intel.com/products/series/123588/Intel-Core-X-
>> series-Processors
>>
>> On 10/27/17 7:50 PM, Robert O'Callahan wrote:
>>
>>> BTW can someone forward this entire thread to their friends at AMD so AMD
>>> will fix their CPUs to run rr? They're tantalizingly close :-/.
>>>
>>> Rob
>>>
>>
>> ___
>> dev-platform mailing list
>> dev-platform@lists.mozilla.org
>> https://lists.mozilla.org/listinfo/dev-platform
>>
> ___
> dev-platform mailing list
> dev-platform@lists.mozilla.org
> https://lists.mozilla.org/listinfo/dev-platform
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: More ThinkStation P710 Nvidia tips (was Re: Faster gecko builds with IceCC on Mac and Linux)

2017-11-02 Thread Gregory Szorc
On Thu, Nov 2, 2017 at 3:43 PM, Nico Grunbaum  wrote:

> For rr I have an i7 desktop with a base clock of 4.0 Ghz, and for building
> I use icecc to distribute the load (or rather I will be again when bug
> 1412240[0] is closed).  The i9 series has lower base clocks (2.8 Ghz, and
> 2.6Ghz for the top SKUs)[1], but high boost clocks of 4.2 Ghz.  If I were
> to switch over to an i9 for everything, would I see a notable difference in
> performance in rr?
>

Which i7? You should get better CPU efficiency with newer
microarchitectures. The i9's we're talking about are based on Skylake-X,
which is derived from Skylake, the microarchitecture behind the consumer
i7-6XXX models. It isn't enough to compare MHz: you also need to consider
microarchitecture, memory, and workload.

https://arstechnica.com/gadgets/2017/09/intel-core-i9-7960x-review/2/ has
some single-threaded benchmarks. The i7-7700K (Kaby Lake) seems to "win"
for single-threaded performance. But the i9's aren't far behind. Not far
enough behind to cancel out the benefits of the extra cores IMO.

This is because the i9's are pretty aggressive about using turbo. More
aggressive than the Xeons. As long as cooling can keep up, the top-end GHz
is great and you aren't sacrificing that much perf to have more cores on
die. You can counter by arguing that the consumer-grade i7's can yield more
speedups via overclocking. But for enterprise uses, having this all built
into the chip so it "just works" without voiding warranty is a nice trait :)

FWIW, the choice to go with Xeons always bothered me because we had to make
an explicit clock vs core trade-off. Building Firefox requires both many
cores for compiling and fast cores for linking. Since the i9's turbo so
well, we get the best of both worlds. And at a much lower price. Aside from
the loss of ECC, it is a pretty easy decision to switch.


> -Nico
>
> [0] https://bugzilla.mozilla.org/show_bug.cgi?id=1412240 Build failure in
> libavutil (missing atomic definitions), when building with clang and icecc
>
> [1] https://ark.intel.com/products/series/123588/Intel-Core-X-
> series-Processors
>
> On 10/27/17 7:50 PM, Robert O'Callahan wrote:
>
>> BTW can someone forward this entire thread to their friends at AMD so AMD
>> will fix their CPUs to run rr? They're tantalizingly close :-/.
>>
>> Rob
>>
>
> ___
> dev-platform mailing list
> dev-platform@lists.mozilla.org
> https://lists.mozilla.org/listinfo/dev-platform
>
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: More ThinkStation P710 Nvidia tips (was Re: Faster gecko builds with IceCC on Mac and Linux)

2017-11-02 Thread Nico Grunbaum
For rr I have an i7 desktop with a base clock of 4.0 GHz, and for
building I use icecc to distribute the load (or rather I will be again
when bug 1412240[0] is closed).  The i9 series has lower base clocks
(2.8 GHz, and 2.6 GHz for the top SKUs)[1], but high boost clocks of 4.2
GHz.  If I were to switch over to an i9 for everything, would I see a
notable difference in performance in rr?


-Nico

[0] https://bugzilla.mozilla.org/show_bug.cgi?id=1412240 Build failure 
in libavutil (missing atomic definitions), when building with clang and 
icecc


[1] 
https://ark.intel.com/products/series/123588/Intel-Core-X-series-Processors


On 10/27/17 7:50 PM, Robert O'Callahan wrote:

BTW can someone forward this entire thread to their friends at AMD so AMD
will fix their CPUs to run rr? They're tantalizingly close :-/.

Rob


___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: More ThinkStation P710 Nvidia tips (was Re: Faster gecko builds with IceCC on Mac and Linux)

2017-10-28 Thread Sophana "Soap" Aik
Thanks Gabriele, that poses a problem then for the system build we have in
mind here as the i9's do not support ECC memory. That may have to be a
separate system with a Xeon.

On Fri, Oct 27, 2017 at 3:58 PM, Gabriele Svelto 
wrote:

> On 27/10/2017 01:02, Gregory Szorc wrote:
> > Sophana (CCd) is working on a new system build right now. It will be
> based
> > on the i9's instead of dual socket Xeons and should be faster and
> cheaper.
>
> ... and lacking ECC memory. Please whatever CPU is chosen make sure it
> has ECC support and the machine comes loaded with ECC memory. Developer
> boxes usually ship with plenty of memory, and they can stay on for days
> without a reboot churning at builds and tests. Memory errors happen and
> they can ruin days of work if they hit you at the wrong time.
>
>  Gabriele
>
>
>


-- 
moz://a
Sophana "Soap" Aik
IT Vendor Management Analyst
IRC/Slack: soap
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: More ThinkStation P710 Nvidia tips (was Re: Faster gecko builds with IceCC on Mac and Linux)

2017-10-28 Thread Sophana "Soap" Aik
Thank you Henri for the feedback.

How about this: we can order some graphics cards and put them in the
evaluation/test machine that is with Greg, to make sure it has good
compatibility.

We could do:
Nvidia GTX 1060 3GB
AMD Radeon RX570

These two options will ensure it can drive multiple displays.

Other suggestions welcomed.

Greg, is that something you think we should do?

On Thu, Oct 26, 2017 at 11:33 PM, Henri Sivonen 
wrote:

> On Fri, Oct 27, 2017 at 4:48 AM, Sophana "Soap" Aik 
> wrote:
> > Hello everyone, great feedback that I will keep in mind and continue to
> work
> > with our vendors to find the best solution with. One of the cards that I
> was
> > looking at is fairly cheap and can at least drive multi-displays (even 4K
> > 60hz) was the Nvidia Quadro P600.
>
> Is that GPU known to be well-supported by Nouveau of Ubuntu 16.04 vintage?
>
> I don't want to deny a single-GPU multi-monitor setup to anyone for
> whom that's the priority, but considering how much damage the Quadro
> M2000 has done to my productivity (and from what I've heard from other
> people on the DOM team, I gather I'm not the only one who has had
> trouble with it), the four DisplayPort connectors on it look like very
> bad economics.
>
> I suggest these two criteria be considered for developer workstations
> in addition to build performance:
>  1) The CPU is compatible with rr (at present, this means that the CPU
> has to be from Intel and not from AMD)
>  2) The GPU offered by default (again, I don't want to deny multiple
> DisplayPort connectors on a single GPU to people who request them)
> works well in OpenGL mode (i.e. without llvmpipe activating) without
> freezes using the Open Source drivers included in Ubuntu LTS and
> Fedora.
>
> On Fri, Oct 27, 2017 at 2:36 AM, Gregory Szorc  wrote:
> > Host OS matters for finding UI bugs and issues with add-ons (since lots
> of
> > add-on developers are also on Linux or MacOS).
>
> I think it's a bad tradeoff to trade off the productivity of
> developers working on the cross-platform core of Firefox in order to
> get them to report Windows-specific bugs. We have people in the
> organization who aren't developing the cross-platform core and who are
> running Windows anyway. I'd prefer the energy currently put into
> getting developers of the cross-platform core to use Windows to be put
> into getting the people who use Windows anyway to use Nightly. (It
> saddens me to hear fear of Nightly from within Mozilla.)
>
> > Unless you have requirements that prohibit using a VM, I encourage using
> this setup.
>
> For some three-four years, I developed in a Linux VM hosted on
> Windows. I'm not too worried about the performance overhead of a VM.
> However, rr is such an awesome tool that it justifies running Linux as
> the host OS.
>
> > I concede that performance testing on i9s and Xeons is not at all
> indicative
> > of the typical user :)
>
> Indeed. Still, we don't need Nvidia professional GPUs for build times,
> so boring well-supported consumer-grade GPUs would also be in the
> interest of "using what our users use" even if paired with a CPU that
> isn't representative of typical users' computers.
>
> On Fri, Oct 27, 2017 at 1:13 AM, Thomas Daede  wrote:
> > I have a RX 460 in a desktop with F26 and can confirm that it works
> > out-of-the-box at 4K with the open source drivers, and will happily run
> > Pathfinder demos at <16ms frame time.* It also seems to run Servo's
> > Webrender just fine.
> >
> > It's been superseded by the RX 560, which is a faster clock of the same
> > chip. It should work just as well, but might need a slightly newer
> > kernel than the 4xx to pick up the pci ids (maybe a problem with LTS
> > ubuntu?) The RX 570 and 580 should be fine too, but require power
> > connectors. The Vega models are waiting on a kernel-side driver rewrite
> > (by AMD) that will land in 4.15 (hopefully with new features and
> > regressions to the RX 5xx series...)
>
> Thank you. I placed an order for an RX 460.
>
> --
> Henri Sivonen
> hsivo...@hsivonen.fi
> https://hsivonen.fi/
>



-- 
moz://a
Sophana "Soap" Aik
IT Vendor Management Analyst
IRC/Slack: soap
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: More ThinkStation P710 Nvidia tips (was Re: Faster gecko builds with IceCC on Mac and Linux)

2017-10-28 Thread Sophana "Soap" Aik
Hello everyone, great feedback that I will keep in mind as I continue to
work with our vendors to find the best solution. One of the cards I was
looking at that is fairly cheap and can at least drive multiple displays
(even 4K 60Hz) is the Nvidia Quadro P600. Especially based on the work
that Greg has been doing, I feel the processor, storage, and RAM are more
important than graphics, so I will lean more towards that type of build.
I will provide an update as soon as we have something more concrete
regarding final specifications, which I hope to have soon. Thanks

On Thu, Oct 26, 2017 at 4:36 PM, Gregory Szorc  wrote:

> On Thu, Oct 26, 2017 at 4:31 PM, Mike Hommey  wrote:
>
>> On Thu, Oct 26, 2017 at 04:02:20PM -0700, Gregory Szorc wrote:
>> > Also, the machines come with Windows by default. That's by design:
>> that's
>> > where the bulk of Firefox users are. We will develop better products if
>> the
>> > machines we use every day resemble what actual users use. I would
>> encourage
>> > developers to keep Windows on the new machines when they are issued.
>>
>> Except actual users are not using i9s or dual xeons. Yes, we have
>> slower reference hardware, but that also makes the argument of using the
>> same thing as actual users less relevant: you can't develop on machines
>> that actually look like what users have. So, as long as you have the
>> slower reference hardware to test, it doesn't seem to me it should
>> matter what OS you're running on your development machine.
>
>
> Host OS matters for finding UI bugs and issues with add-ons (since lots of
> add-on developers are also on Linux or MacOS).
>
> I concede that performance testing on i9s and Xeons is not at all
> indicative of the typical user :)
>



-- 
moz://a
Sophana "Soap" Aik
IT Vendor Management Analyst
IRC/Slack: soap
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: More ThinkStation P710 Nvidia tips (was Re: Faster gecko builds with IceCC on Mac and Linux)

2017-10-27 Thread Robert O'Callahan
BTW can someone forward this entire thread to their friends at AMD so AMD
will fix their CPUs to run rr? They're tantalizingly close :-/.

Rob
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: More ThinkStation P710 Nvidia tips (was Re: Faster gecko builds with IceCC on Mac and Linux)

2017-10-27 Thread Gabriele Svelto
On 28/10/2017 01:08, Sophana "Soap" Aik wrote:
> Thanks Gabriele, that poses a problem then for the system build we have
> in mind here as the i9's do not support ECC memory. That may have to be
> a separate system with a Xeon.

Xeon-W processors are essentially identical to the i9s but come with more
workstation/server-oriented features such as ECC memory support; they are
also offered with slightly higher peak clock speeds than equivalent i9s.
Here's a side-by-side comparison of the top 4 SKUs in both families:

https://ark.intel.com/compare/123589,126709,123767,126707,125042,123613,126793,126699

 Gabriele



___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: More ThinkStation P710 Nvidia tips (was Re: Faster gecko builds with IceCC on Mac and Linux)

2017-10-27 Thread Gregory Szorc
Yeah. Only the Xeons and ThreadRipper (as our potential high core count
machines) support ECC. rr, ECC, or reasonable costs: pick at most two :/

On Fri, Oct 27, 2017 at 4:08 PM, Sophana "Soap" Aik 
wrote:

> Thanks Gabriele, that poses a problem then for the system build we have in
> mind here as the i9's do not support ECC memory. That may have to be a
> separate system with a Xeon.
>
> On Fri, Oct 27, 2017 at 3:58 PM, Gabriele Svelto 
> wrote:
>
>> On 27/10/2017 01:02, Gregory Szorc wrote:
>> > Sophana (CCd) is working on a new system build right now. It will be
>> based
>> > on the i9's instead of dual socket Xeons and should be faster and
>> cheaper.
>>
>> ... and lacking ECC memory. Please whatever CPU is chosen make sure it
>> has ECC support and the machine comes loaded with ECC memory. Developer
>> boxes usually ship with plenty of memory, and they can stay on for days
>> without a reboot churning at builds and tests. Memory errors happen and
>> they can ruin days of work if they hit you at the wrong time.
>>
>>  Gabriele
>>
>>
>>
>
>
> --
> moz://a
> Sophana "Soap" Aik
> IT Vendor Management Analyst
> IRC/Slack: soap
>
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: More ThinkStation P710 Nvidia tips (was Re: Faster gecko builds with IceCC on Mac and Linux)

2017-10-27 Thread Gabriele Svelto
On 27/10/2017 01:02, Gregory Szorc wrote:
> Sophana (CCd) is working on a new system build right now. It will be based
> on the i9's instead of dual socket Xeons and should be faster and cheaper.

... and lacking ECC memory. Please, whatever CPU is chosen, make sure it
has ECC support and that the machine comes loaded with ECC memory.
Developer boxes usually ship with plenty of memory, and they can stay on
for days without a reboot, churning at builds and tests. Memory errors
happen, and they can ruin days of work if they hit you at the wrong time.

 Gabriele




___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: More ThinkStation P710 Nvidia tips (was Re: Faster gecko builds with IceCC on Mac and Linux)

2017-10-27 Thread Steve Fink
Not necessarily relevant to this specific discussion, but I'm on a
Lenovo P50 running Linux, and wanted to offer up my setup as a
datapoint. (It's not quite a recommendation or a word of warning; it's
a combination of both.)


I use Linux (Fedora 25) as the host OS, with two external monitors plus 
the laptop screen. Windows is installed natively on a separate 
partition. For a while, I used VirtualBox to run the native Windows 
installation in a VM, rebooting into Windows on the rare occasions when 
I needed the extra performance or I wanted to diagnose whether something 
was virtualization-specific. The machine has an Intel HD Graphics P530 
and a Quadro M2000M. One external monitor is hooked up via DP, the other 
HDMI. I have a single desktop spread across all three (as in, I can drag 
windows between them). I use the nouveau driver. Videoconferencing works.


It all works well enough. There are large caveats and drawbacks. It took 
an insane amount of configuration attempts to get it to where it is, and 
again: there are large caveats and drawbacks.


Whichever monitor is on HDMI is at the wrong resolution (1920x1080
instead of its native 1920x1200). I am running X11 because Wayland
doesn't work. (Though I'm fine with that, because I'm old school and I
run xfce4.) The laptop screen is HiDPI, and when I disconnect from the
external screens, I have to zoom everything in, which only partially
works (e.g. Firefox's chrome is still small). I used to use xrandr with
--scale 0.5x0.5 to expand everything, but that caused too many issues.
When I turn my external monitors back on in the morning, one of them
comes up fine and the other does not display anything until I do
ctrl-alt-f3 alt-f2 to switch to VT3 and then back to VT2. When I
reconnect my monitors, it will often mirror a single image to all 3
displays, and I have to turn mirroring on and then back off again and
drag my monitors back to the right relative positioning.
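

(For the record, the scaling hack was along the lines of

  xrandr --output eDP-1 --scale 0.5x0.5

where the output name depends on the machine -- eDP-1 is just a guess
here; running xrandr with no arguments lists the real output names.)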


I use the nouveau driver now. I started out with nouveau, but it was
causing lots of random lockups, so I switched to the proprietary nvidia
driver. That did not work well when I disconnected and reconnected the
external monitors, nor sometimes when I suspended and resumed. I have no
idea why nouveau has since magically become stable; probably some update
or other.


My Windows setup broke when I switched from an HDD to an SSD. First, it 
stopped booting natively and I could only run it through the VM. Now it 
hangs on boot even with the VM unless I boot into safe mode. I have sunk 
more time than I'm willing to admit in trying to fix it and failed. My 
plan is to start over with a disk with Windows preinstalled and clone my 
Linux partitions over to it, but I can't muster the energy to dive back 
into the nightmare and I don't really need Windows very often anyway.



___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: More ThinkStation P710 Nvidia tips (was Re: Faster gecko builds with IceCC on Mac and Linux)

2017-10-27 Thread Robert O'Callahan
On Fri, Oct 27, 2017 at 2:34 AM, Henri Sivonen  wrote:

> And the downsides don't even end there. rr didn't work. Plus other
> stuff not worth mentioning here.
>

Turns out that rr not working with Nvidia on Ubuntu 17.10 was actually an
rr issue triggered by the Ubuntu libc upgrade, not Nvidia's fault. I just
fixed it in rr master. We'll do an rr release soon, because the libc update
required a number of rr fixes.

Rob
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: More ThinkStation P710 Nvidia tips (was Re: Faster gecko builds with IceCC on Mac and Linux)

2017-10-27 Thread Henri Sivonen
On Fri, Oct 27, 2017 at 4:48 AM, Sophana "Soap" Aik  wrote:
> Hello everyone, great feedback that I will keep in mind and continue to work
> with our vendors to find the best solution with. One of the cards that I was
> looking at is fairly cheap and can at least drive multi-displays (even 4K
> 60hz) was the Nvidia Quadro P600.

Is that GPU known to be well-supported by Nouveau of Ubuntu 16.04 vintage?

I don't want to deny a single-GPU multi-monitor setup to anyone for
whom that's the priority, but considering how much damage the Quadro
M2000 has done to my productivity (and from what I've heard from other
people on the DOM team, I gather I'm not the only one who has had
trouble with it), the four DisplayPort connectors on it look like very
bad economics.

I suggest these two criteria be considered for developer workstations
in addition to build performance:
 1) The CPU is compatible with rr (at present, this means that the CPU
has to be from Intel and not from AMD)
 2) The GPU offered by default (again, I don't want to deny multiple
DisplayPort connectors on a single GPU to people who request them)
works well in OpenGL mode (i.e. without llvmpipe activating) without
freezes using the Open Source drivers included in Ubuntu LTS and
Fedora.
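
(A quick way to check whether llvmpipe has kicked in is something along
the lines of

  glxinfo | grep "OpenGL renderer"

with glxinfo coming from the mesa-utils package on Ubuntu; if the
renderer string mentions llvmpipe, OpenGL is being done in software
rather than on the GPU.)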

On Fri, Oct 27, 2017 at 2:36 AM, Gregory Szorc  wrote:
> Host OS matters for finding UI bugs and issues with add-ons (since lots of
> add-on developers are also on Linux or MacOS).

I think it's a bad tradeoff to trade off the productivity of
developers working on the cross-platform core of Firefox in order to
get them to report Windows-specific bugs. We have people in the
organization who aren't developing the cross-platform core and who are
running Windows anyway. I'd prefer the energy currently put into
getting developers of the cross-platform core to use Windows to be put
into getting the people who use Windows anyway to use Nightly. (It
saddens me to hear fear of Nightly from within Mozilla.)

> Unless you have requirements that prohibit using a VM, I encourage using this 
> setup.

For some three-four years, I developed in a Linux VM hosted on
Windows. I'm not too worried about the performance overhead of a VM.
However, rr is such an awesome tool that it justifies running Linux as
the host OS.

> I concede that performance testing on i9s and Xeons is not at all indicative
> of the typical user :)

Indeed. Still, we don't need Nvidia professional GPUs for build times,
so boring well-supported consumer-grade GPUs would also be in the
interest of "using what our users use" even if paired with a CPU that
isn't representative of typical users' computers.

On Fri, Oct 27, 2017 at 1:13 AM, Thomas Daede  wrote:
> I have a RX 460 in a desktop with F26 and can confirm that it works
> out-of-the-box at 4K with the open source drivers, and will happily run
> Pathfinder demos at <16ms frame time.* It also seems to run Servo's
> Webrender just fine.
>
> It's been superseded by the RX 560, which is a faster clock of the same
> chip. It should work just as well, but might need a slightly newer
> kernel than the 4xx to pick up the pci ids (maybe a problem with LTS
> ubuntu?) The RX 570 and 580 should be fine too, but require power
> connectors. The Vega models are waiting on a kernel-side driver rewrite
> (by AMD) that will land in 4.15 (hopefully with new features and
> regressions to the RX 5xx series...)

Thank you. I placed an order for an RX 460.

-- 
Henri Sivonen
hsivo...@hsivonen.fi
https://hsivonen.fi/
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: More ThinkStation P710 Nvidia tips (was Re: Faster gecko builds with IceCC on Mac and Linux)

2017-10-26 Thread Gregory Szorc
On Thu, Oct 26, 2017 at 4:31 PM, Mike Hommey  wrote:

> On Thu, Oct 26, 2017 at 04:02:20PM -0700, Gregory Szorc wrote:
> > Also, the machines come with Windows by default. That's by design: that's
> > where the bulk of Firefox users are. We will develop better products if
> the
> > machines we use every day resemble what actual users use. I would
> encourage
> > developers to keep Windows on the new machines when they are issued.
>
> Except actual users are not using i9s or dual xeons. Yes, we have
> slower reference hardware, but that also makes the argument of using the
> same thing as actual users less relevant: you can't develop on machines
> that actually look like what users have. So, as long as you have the
> slower reference hardware to test, it doesn't seem to me it should
> matter what OS you're running on your development machine.


Host OS matters for finding UI bugs and issues with add-ons (since lots of
add-on developers are also on Linux or MacOS).

I concede that performance testing on i9s and Xeons is not at all
indicative of the typical user :)
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: More ThinkStation P710 Nvidia tips (was Re: Faster gecko builds with IceCC on Mac and Linux)

2017-10-26 Thread Mike Hommey
On Thu, Oct 26, 2017 at 04:02:20PM -0700, Gregory Szorc wrote:
> Also, the machines come with Windows by default. That's by design: that's
> where the bulk of Firefox users are. We will develop better products if the
> machines we use every day resemble what actual users use. I would encourage
> developers to keep Windows on the new machines when they are issued.

Except actual users are not using i9s or dual xeons. Yes, we have
slower reference hardware, but that also makes the argument of using the
same thing as actual users less relevant: you can't develop on machines
that actually look like what users have. So, as long as you have the
slower reference hardware to test, it doesn't seem to me it should
matter what OS you're running on your development machine.

Mike
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: More ThinkStation P710 Nvidia tips (was Re: Faster gecko builds with IceCC on Mac and Linux)

2017-10-26 Thread Jeff Muizelaar
On Thu, Oct 26, 2017 at 7:02 PM, Gregory Szorc  wrote:
> I also share your desire to not issue fancy video cards in these machines
> by default. If there are suggestions for a default video card, now is the
> time to make noise :)

Intel GPUs are the best choice if you want to be like the bulk of our
users. Otherwise any cheap AMD GPU is going to be good enough.
Probably the number and kind of display outputs are what matters most.

-Jeff
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: More ThinkStation P710 Nvidia tips (was Re: Faster gecko builds with IceCC on Mac and Linux)

2017-10-26 Thread Jeff Muizelaar
On Thu, Oct 26, 2017 at 7:02 PM, Gregory Szorc  wrote:
> Unless you have requirements that prohibit using a
> VM, I encourage using this setup.

rr doesn't work in Hyper-V. AFAIK the only Windows VM it works in is VMware

-Jeff
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: More ThinkStation P710 Nvidia tips (was Re: Faster gecko builds with IceCC on Mac and Linux)

2017-10-26 Thread Gregory Szorc
On Thu, Oct 26, 2017 at 6:34 AM, Henri Sivonen  wrote:

> On Thu, Oct 26, 2017 at 9:15 AM, Henri Sivonen 
> wrote:
> > There's a huge downside, though:
> > If the screen stops consuming the DisplayPort data stream, the
> > graphical session gets killed! So if you do normal things like turn
> > the screen off or switch input on a multi-input screen, your graphical
> > session is no longer there when you come back and you get a login
> > screen instead! (I haven't yet formed an opinion on whether this
> > behavior can be lived with or not.)
>
> And the downsides don't even end there. rr didn't work. Plus other
> stuff not worth mentioning here.
>
> I guess going back to 16.04.1 is a better deal than 17.10.
>
> > P.S. It would be good for productivity if Mozilla issued slightly less
> > cutting-edge Nvidia GPUs to developers to increase the probability
> > that support in nouveau has had time to bake.
>
> This Mozilla-issued Quadro M2000 has been a very significant harm to
> my productivity. Considering how good rr is, I think it makes sense to
> continue to run Linux to develop Firefox. However, I think it doesn't
> make sense to issue fancy cutting-edge Nvidia GPUs to developers who
> aren't specifically working on Nvidia-specific bugs and, instead, it
> would make sense to issue GPUs that are boring as possible in terms of
> Linux driver support (i.e. Just Works with distro-bundled Free
> Software drivers). Going forward, perhaps Mozilla could issue AMD GPUs
> with computers that don't have Intel GPUs?
>
> As for the computer at hand, I want to put an end to this Nvidia
> obstacle to getting stuff done. It's been suggested to me that Radeon
> RX 560 would be well supported by distro-provided drivers, but the
> "*2" footnote at https://help.ubuntu.com/community/AMDGPU-Driver
> doesn't look too good. Based on that table it seems one should get
> Radeon RX 460. Is this the correct conclusion? Does Radeon RX 460 Just
> Work with Ubuntu 16.04? Is Radeon RX 460 going to be
> WebRender-compatible?
>

Sophana (CCd) is working on a new system build right now. It will be based
on the i9's instead of dual socket Xeons and should be faster and cheaper.
We can all thank AMD for introducing competition in the CPU market to
enable this to happen :)

I also share your desire to not issue fancy video cards in these machines
by default. If there are suggestions for a default video card, now is the
time to make noise :)

Also, the machines come with Windows by default. That's by design: that's
where the bulk of Firefox users are. We will develop better products if the
machines we use every day resemble what actual users use. I would encourage
developers to keep Windows on the new machines when they are issued.

I concede that developing Firefox on Linux is better than on Windows for a
myriad of reasons. However, that doesn't mean you have to forego Linux. I
use Hyper-V under Windows 10 to run Linux. I do most of my development
(editors, builds, etc.) in that local Linux VM. I use an X server for
connecting to graphical Linux applications. The overhead of Hyper-V as
compared to native Linux is negligible. Unless I need fast graphics in
Linux (which is rare), I pretty much get the advantages of Windows *and*
Linux simultaneously. Unless you have requirements that prohibit using a
VM, I encourage using this setup.
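
(As a rough sketch of what that looks like in practice -- the details of
my setup will vary from machine to machine -- running a graphical Linux
application from the VM against the X server on the Windows side amounts
to something like

  DISPLAY=<windows-host-ip>:0 ./mach run

in the Linux guest, with <windows-host-ip> being a placeholder and the
Windows-side X server configured to accept connections from the VM.)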
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: More ThinkStation P710 Nvidia tips (was Re: Faster gecko builds with IceCC on Mac and Linux)

2017-10-26 Thread Thomas Daede
On 10/26/2017 06:34 AM, Henri Sivonen wrote:
> As for the computer at hand, I want to put an end to this Nvidia
> obstacle to getting stuff done. It's been suggested to me that Radeon
> RX 560 would be well supported by distro-provided drivers, but the
> "*2" footnote at https://help.ubuntu.com/community/AMDGPU-Driver
> doesn't look too good. Based on that table it seems one should get
> Radeon RX 460. Is this the correct conclusion? Does Radeon RX 460 Just
> Work with Ubuntu 16.04? Is Radeon RX 460 going to be
> WebRender-compatible?
> 

I have a RX 460 in a desktop with F26 and can confirm that it works
out-of-the-box at 4K with the open source drivers, and will happily run
Pathfinder demos at <16ms frame time.* It also seems to run Servo's
Webrender just fine.

It's been superseded by the RX 560, which is a faster-clocked version of
the same chip. It should work just as well, but might need a slightly
newer kernel than the 4xx to pick up the PCI IDs (maybe a problem with
LTS Ubuntu?) The RX 570 and 580 should be fine too, but require power
connectors. The Vega models are waiting on a kernel-side driver rewrite
(by AMD) that will land in 4.15 (hopefully with new features and
regressions to the RX 5xx series...)

Intel graphics are also nice but only available on the E3 Xeons AFAIK.
And nouveau is stuck, because new cards require signed firmware that
Nvidia is unwilling to distribute.

* While Pathfinder happily renders at 60fps, Firefox draws frames slower
because of its WebGL readback path. That is not the fault of the GPU,
however.
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: More ThinkStation P710 Nvidia tips (was Re: Faster gecko builds with IceCC on Mac and Linux)

2017-10-26 Thread Nathan Froyd
On Thu, Oct 26, 2017 at 9:34 AM, Henri Sivonen  wrote:
> As for the computer at hand, I want to put an end to this Nvidia
> obstacle to getting stuff done. It's been suggested to me that Radeon
> RX 560 would be well supported by distro-provided drivers, but the
> "*2" footnote at https://help.ubuntu.com/community/AMDGPU-Driver
> doesn't look too good. Based on that table it seems one should get
> Radeon RX 460. Is this the correct conclusion? Does Radeon RX 460 Just
> Work with Ubuntu 16.04? Is Radeon RX 460 going to be
> WebRender-compatible?

Can't speak to the WebRender compatibility issue, but I have a Radeon
R270 and a Radeon RX 470 in my Linux machine, and Ubuntu 16.04 seems
to be pretty happy with both of them.

-Nathan
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: More ThinkStation P710 Nvidia tips (was Re: Faster gecko builds with IceCC on Mac and Linux)

2017-10-26 Thread Jeff Muizelaar
Yeah. I'd suggest anyone who's running Linux on these machines just go
out and buy a $100 AMD GPU to replace the Quadro. Even if you don't
expense the new GPU and just throw the Quadro in the trash you'll
probably be happier.

-Jeff

On Thu, Oct 26, 2017 at 9:34 AM, Henri Sivonen  wrote:
> On Thu, Oct 26, 2017 at 9:15 AM, Henri Sivonen  wrote:
>> There's a huge downside, though:
>> If the screen stops consuming the DisplayPort data stream, the
>> graphical session gets killed! So if you do normal things like turn
>> the screen off or switch input on a multi-input screen, your graphical
>> session is no longer there when you come back and you get a login
>> screen instead! (I haven't yet formed an opinion on whether this
>> behavior can be lived with or not.)
>
> And the downsides don't even end there. rr didn't work. Plus other
> stuff not worth mentioning here.
>
> I guess going back to 16.04.1 is a better deal than 17.10.
>
>> P.S. It would be good for productivity if Mozilla issued slightly less
>> cutting-edge Nvidia GPUs to developers to increase the probability
>> that support in nouveau has had time to bake.
>
> This Mozilla-issued Quadro M2000 has been a very significant harm to
> my productivity. Considering how good rr is, I think it makes sense to
> continue to run Linux to develop Firefox. However, I think it doesn't
> make sense to issue fancy cutting-edge Nvidia GPUs to developers who
> aren't specifically working on Nvidia-specific bugs and, instead, it
> would make sense to issue GPUs that are boring as possible in terms of
> Linux driver support (i.e. Just Works with distro-bundled Free
> Software drivers). Going forward, perhaps Mozilla could issue AMD GPUs
> with computers that don't have Intel GPUs?
>
> As for the computer at hand, I want to put an end to this Nvidia
> obstacle to getting stuff done. It's been suggested to me that Radeon
> RX 560 would be well supported by distro-provided drivers, but the
> "*2" footnote at https://help.ubuntu.com/community/AMDGPU-Driver
> doesn't look too good. Based on that table it seems one should get
> Radeon RX 460. Is this the correct conclusion? Does Radeon RX 460 Just
> Work with Ubuntu 16.04? Is Radeon RX 460 going to be
> WebRender-compatible?
>
> --
> Henri Sivonen
> hsivo...@hsivonen.fi
> https://hsivonen.fi/
> ___
> dev-platform mailing list
> dev-platform@lists.mozilla.org
> https://lists.mozilla.org/listinfo/dev-platform
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: More ThinkStation P710 Nvidia tips (was Re: Faster gecko builds with IceCC on Mac and Linux)

2017-10-26 Thread Henri Sivonen
On Thu, Oct 26, 2017 at 9:15 AM, Henri Sivonen  wrote:
> There's a huge downside, though:
> If the screen stops consuming the DisplayPort data stream, the
> graphical session gets killed! So if you do normal things like turn
> the screen off or switch input on a multi-input screen, your graphical
> session is no longer there when you come back and you get a login
> screen instead! (I haven't yet formed an opinion on whether this
> behavior can be lived with or not.)

And the downsides don't even end there. rr didn't work. Plus other
stuff not worth mentioning here.

I guess going back to 16.04.1 is a better deal than 17.10.

> P.S. It would be good for productivity if Mozilla issued slightly less
> cutting-edge Nvidia GPUs to developers to increase the probability
> that support in nouveau has had time to bake.

This Mozilla-issued Quadro M2000 has been a very significant harm to
my productivity. Considering how good rr is, I think it makes sense to
continue to run Linux to develop Firefox. However, I think it doesn't
make sense to issue fancy cutting-edge Nvidia GPUs to developers who
aren't specifically working on Nvidia-specific bugs; instead, it would
make sense to issue GPUs that are as boring as possible in terms of
Linux driver support (i.e. ones that Just Work with distro-bundled Free
Software drivers). Going forward, perhaps Mozilla could issue AMD GPUs
with computers that don't have Intel GPUs?

As for the computer at hand, I want to put an end to this Nvidia
obstacle to getting stuff done. It's been suggested to me that Radeon
RX 560 would be well supported by distro-provided drivers, but the
"*2" footnote at https://help.ubuntu.com/community/AMDGPU-Driver
doesn't look too good. Based on that table it seems one should get
Radeon RX 460. Is this the correct conclusion? Does Radeon RX 460 Just
Work with Ubuntu 16.04? Is Radeon RX 460 going to be
WebRender-compatible?

-- 
Henri Sivonen
hsivo...@hsivonen.fi
https://hsivonen.fi/
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


More ThinkStation P710 Nvidia tips (was Re: Faster gecko builds with IceCC on Mac and Linux)

2017-10-26 Thread Henri Sivonen
On Thu, Mar 23, 2017 at 3:43 PM, Henri Sivonen  wrote:
> On Wed, Jul 6, 2016 at 2:42 AM, Gregory Szorc  wrote:
>> The Lenovo ThinkStation P710 is a good starting point (
>> http://shop.lenovo.com/us/en/workstations/thinkstation/p-series/p710/).
>
> To help others who follow the above advice save some time:
>
> Xeons don't have Intel integrated GPUs, so one has to figure how to
> get this up and running with a discrete GPU. In the case of Nvidia
> Quadro M2000, the latest Ubuntu and Fedora install images don't work.
>
> This works:
> Disable or enable the TPM. (By default, it's in a mode where the
> kernel can see it but it doesn't work. It should either be hidden or
> be allowed to work.)
> Disable secure boot. (Nvidia's proprietary drivers don't work with
> secure boot enabled.)
> Use the Ubuntu 16.04.1 install image (i.e. intentionally old
> image--you can upgrade later)
> After installing, edit /etc/default/grub and set
> GRUB_CMDLINE_LINUX_DEFAULT="" (i.e. make the string empty; without
> this, the nvidia proprietary driver conflicts with LUKS pass phrase
> input).
> update-initramfs -u
> update-grub
> apt install nvidia-375
> Then upgrade the rest. Even rolling forward the HWE stack works
> *after* the above steps.

Xenial set up according to the above steps managed to make itself
unbootable. I don't know why, but I suspect the nvidia proprietary
driver somehow fell out of use and nouveau froze.

The symptom is that a warning triangle (triangle with an exclamation
mark) shows up in the upper right part of the front panel and the
light of the topmost USB port in the front panel starts blinking.

Turning the computer off isn't enough to get rid of the warning
triangle and the blinking USB port light. To get rid of those,
disconnect the power cord for a while and then plug it back in.

After the warning triangle is gone, it's possible to boot Ubuntu
16.04.1 or 17.10 from USB to mount the root volume and make a backup
of the files onto an external disk.

Ubuntu 17.10 now boots on the hardware with nouveau with 3D enabled
(whereas 16.04.1 was 2D-only and the versions in between were broken).
However, before the boot completes, it seems to hang with the text:
[Firmware Bug]: TSC_DEADLINE disabled due to Errata: please update
microcode to version: 0xb20 (or later)
nouveau :01:00.0: bus: MMIO write of 012c FAULT at 10eb14
[ IBUS ]

Wait for a while. (I didn't time it, but the wait time is on the order
of half a minute to a couple of minutes.) Then the boot resumes.

The BIOS update from 2017-09-05 does not update the microcode to the
version the kernel wants to see. However, once Ubuntu 17.10 has been
installed, the intel-microcode package does. (It's probably a good
idea to update the BIOS for AMT and TPM bug fixes anyway.)
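
(Concretely, if the package isn't already installed, something like

  sudo apt install intel-microcode

plus a reboot should be all that's needed to get the newer microcode
loaded at early boot -- at least that's my understanding of how the
package works.)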

I left the box for installing proprietary drivers during installation
unchecked. I'm not sure whether checking the box would have installed the
nvidia proprietary drivers, but the point of going with 17.10 instead of
starting with 16.04.1 again is to use nouveau for OpenGL and avoid the
integration problems with the nvidia proprietary drivers.

The wait time during boot repeats with the installed system, but
during the wait, there's no text on the screen by default. Just wait.

On this system, with Ubuntu 17.10, nouveau seems to even qualify for
WebGL2 in Firefox.

There's a huge downside, though:
If the screen stops consuming the DisplayPort data stream, the
graphical session gets killed! So if you do normal things like turn
the screen off or switch input on a multi-input screen, your graphical
session is no longer there when you come back and you get a login
screen instead! (I haven't yet formed an opinion on whether this
behavior can be lived with or not.)

This applies to the live session on the install media, too. Therefore,
it's best to use another virtual console (ctrl-alt-F3) for restoring
backups. (The GUI now leads some weird dual existence across ctrl-alt-F1
and ctrl-alt-F2.)

(Fedora 26 still doesn't boot on this hardware. I didn't try Fedora 27 beta.)

P.S. It would be good for productivity if Mozilla issued slightly less
cutting-edge Nvidia GPUs to developers to increase the probability
that support in nouveau has had time to bake.

-- 
Henri Sivonen
hsivo...@hsivonen.fi
https://hsivonen.fi/
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform