Re: Unified compilation is going to ride the train

2015-01-13 Thread Ehsan Akhgari
I have started to work on removing support for non-unified builds over in
bug 1121000.

On Thu, Nov 27, 2014 at 8:12 PM, Mike Hommey m...@glandium.org wrote:

 Hi,

 A year ago, when unified compilation was introduced to speed up builds,
 a couple issues were raised and we conservatively restricted them out
 of aurora/beta/release/esr.
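
For context on the mechanism under discussion: a unified build compiles generated files that simply #include batches of real sources, so shared headers are parsed once per batch rather than once per file. A minimal sketch of the batching (hypothetical function name and batch layout; the real logic lives in the moz.build backend):

```cpp
#include <string>
#include <vector>

// Sketch: each "unified" source is a generated file whose body is just a
// run of #include directives naming a batch of real .cpp files.
std::vector<std::string> unifiedBatches(const std::vector<std::string>& cppFiles,
                                        size_t batchSize) {
  std::vector<std::string> batches;
  for (size_t i = 0; i < cppFiles.size(); i += batchSize) {
    std::string body;  // contents of one Unified_cppN.cpp
    for (size_t j = i; j < cppFiles.size() && j < i + batchSize; ++j) {
      body += "#include \"" + cppFiles[j] + "\"\n";
    }
    batches.push_back(body);
  }
  return batches;
}
```

Because several .cpp files end up in one translation unit, a file compiles or fails depending on what its batch neighbors happened to include, which is the root of the issues debated below.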

 A year later, it's time to revisit this decision, and since afaik we
 haven't had problems specific to unified compilation on nightlies,
 including for crash reports, we can assume the issues are either gone
 or didn't exist in the first place (one problem that comes to mind is
 bug 943695, and it probably isn't a problem in practice, although weird)

 I know a lot of people have burned non-unified builds now and then.
 That's an annoyance and a distraction for getting things done. If
 unified compilation rides up to beta and we don't see significant
 problems, I think we can disable all periodic non-unified builds
 and make the few builds that are always non-unified unified again (a few
 debug builds are this way).

 The downside from doing so, though, is that non-unified build *will*
 be broken, and code purity (right includes in the right sources,
 mostly) won't be ensured. Do you think this is important enough to keep
 non-unified builds around?

 Mike
 ___
 dev-platform mailing list
 dev-platform@lists.mozilla.org
 https://lists.mozilla.org/listinfo/dev-platform




-- 
Ehsan


Re: Unified compilation is going to ride the train

2014-12-02 Thread Ryan VanderMeulen

On 11/27/2014 9:28 PM, Ehsan Akhgari wrote:

Another point in favor of dropping support for non-unified builds is
that it frees up some infrastructure resources that are currently used to
test those builds, and it also makes builds faster in configurations
where we build non-unified by default (such as linux32).


We don't run tests on the periodic non-unified builds, FWIW. They're 
just a set of builds that get run every 3 hours on trunk branches.



Re: Unified compilation is going to ride the train

2014-12-02 Thread Ehsan Akhgari

On 2014-12-01 10:22 PM, Karl Tomlinson wrote:

On Fri, 28 Nov 2014 00:46:07 -0800, L. David Baron wrote:

On Friday 2014-11-28 10:12 +0900, Mike Hommey wrote:

The downside from doing so, though, is that non-unified build *will*
be broken, and code purity (right includes in the right sources,
mostly) won't be ensured. Do you think this is important enough to keep
non-unified builds around?


Another disadvantage here is that it will make adding or removing
source files harder, because you'll have to clean up the accumulated
nonunified bustage that shows up when files are shifted around
between unified files.  (This might be somewhat harder to fix a year
or two later than it is when causing it.)


Would it be helpful to be more explicit about the special
treatment of include files in unified sources?

Perhaps have no include directives in UNIFIED_SOURCES files, but
have a separate .h file at the beginning of the unified file
(explicitly listed or automatically added by the unification) that
includes all necessary headers for all UNIFIED_SOURCES.

I guess it is less likely that include lists would be cleaned up
if they are in separate files, which would bring us back to IWYU.


The real issue is that when you open a .cpp file to edit some code, you 
rarely think about what names you are using and where those names are 
declared.  The way I do this personally in many cases is by letting the 
compiler tell me about the names it cannot resolve.  It's not ideal, but 
I really doubt that the cognitive load required for keeping the list of 
include files correct and minimal is worth the effort.  I do, however, 
think that it's worth running IWYU on our code periodically and cleaning 
up the mess that we gather over time; then again, most people would not 
be bothered to even do that in their own modules, something I can hardly 
argue against.




Re: Unified compilation is going to ride the train

2014-12-01 Thread Karl Tomlinson
On Fri, 28 Nov 2014 00:46:07 -0800, L. David Baron wrote:
 On Friday 2014-11-28 10:12 +0900, Mike Hommey wrote:
 The downside from doing so, though, is that non-unified build *will*
 be broken, and code purity (right includes in the right sources,
 mostly) won't be ensured. Do you think this is important enough to keep
 non-unified builds around?

 Another disadvantage here is that it will make adding or removing
 source files harder, because you'll have to clean up the accumulated
 nonunified bustage that shows up when files are shifted around
 between unified files.  (This might be somewhat harder to fix a year
 or two later than it is when causing it.)

Would it be helpful to be more explicit about the special
treatment of include files in unified sources?

Perhaps have no include directives in UNIFIED_SOURCES files, but
have a separate .h file at the beginning of the unified file
(explicitly listed or automatically added by the unification) that
includes all necessary headers for all UNIFIED_SOURCES.

I guess it is less likely that include lists would be cleaned up
if they are in separate files, which would bring us back to IWYU.
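
The proposal above could look something like this (entirely hypothetical; no such step exists in the build system today): the unification step would collect the headers needed by every file in the batch into one aggregated header emitted at the top of the unified file, instead of relying on per-file include directives.

```cpp
#include <set>
#include <string>
#include <vector>

// Sketch: build one deduplicated aggregated header from the include
// lists of all files in a unified batch, preserving first-seen order.
std::string aggregatedHeader(
    const std::vector<std::vector<std::string>>& perFileIncludes) {
  std::set<std::string> seen;  // dedupe across the batch
  std::string out;
  for (const auto& includes : perFileIncludes) {
    for (const auto& header : includes) {
      if (seen.insert(header).second) {
        out += "#include \"" + header + "\"\n";
      }
    }
  }
  return out;
}
```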


Re: Unified compilation is going to ride the train

2014-11-28 Thread L. David Baron
On Friday 2014-11-28 10:12 +0900, Mike Hommey wrote:
 The downside from doing so, though, is that non-unified build *will*
 be broken, and code purity (right includes in the right sources,
 mostly) won't be ensured. Do you think this is important enough to keep
 non-unified builds around?

Another disadvantage here is that it will make adding or removing
source files harder, because you'll have to clean up the accumulated
nonunified bustage that shows up when files are shifted around
between unified files.  (This might be somewhat harder to fix a year
or two later than it is when causing it.)

-David

-- 
  L. David Baron                 http://dbaron.org/
  Mozilla                        https://www.mozilla.org/
 Before I built a wall I'd ask to know
 What I was walling in or walling out,
 And to whom I was like to give offense.
   - Robert Frost, Mending Wall (1914)




Re: Unified compilation is going to ride the train

2014-11-28 Thread Mike Hommey
On Fri, Nov 28, 2014 at 11:57:56AM +0900, ISHIKAWA,chiaki wrote:
 On 2014/11/28 10:12, Mike Hommey wrote:
 Hi,
 
 A year ago, when unified compilation was introduced to speed up builds,
 a couple issues were raised and we conservatively restricted them out
 of aurora/beta/release/esr.
 
 A year later, it's time to revisit this decision, and since afaik we
 haven't had problems specific to unified compilation on nightlies,
 including for crash reports, we can assume the issues are either gone
 or didn't exist in the first place (one problem that comes to mind is
 bug 943695, and it probably isn't a problem in practice, although weird)
 
 I know a lot of people have burned non-unified builds now and then.
 That's an annoyance and a distraction for getting things done. If
 unified compilation rides up to beta and we don't see significant
 problems, I think we can disable all periodic non-unified builds
 and make the few builds that are always non-unified unified again (a few
 debug builds are this way).
 
 The downside from doing so, though, is that non-unified build *will*
 be broken, and code purity (right includes in the right sources,
 mostly) won't be ensured. Do you think this is important enough to keep
 non-unified builds around?
 
 Mike
 
 
 Can we debug a unified-build binary using gdb on Unix systems?
 
 I mean, the unified source is gone by the time the binary runs,
 and I wonder what gdb would do to print the source lines.
 
 Of course, if gdb gets confused, we can create a non-unified binary to use
 source-level debugging, etc. [I am talking about the C++ side of the debugging.]
 
 But if gdb gets confused, then not supporting non-unified builds for C++
 debugging sounds tough. People's opinions may vary, but I have seen many
 issues with C++ binaries, so I am not comfortable dropping non-unified
 builds if gdb and other debugging aids get confused.
 
 Of course, producing the end-user binary wins by using unified
 compilation. I agree.

Local builds have defaulted to unified builds for a year. If that was
causing problems with debuggers, we'd have heard about it a thousand
times already.

Mike


Re: Unified compilation is going to ride the train

2014-11-28 Thread Mike Hommey
On Fri, Nov 28, 2014 at 12:46:07AM -0800, L. David Baron wrote:
 On Friday 2014-11-28 10:12 +0900, Mike Hommey wrote:
  The downside from doing so, though, is that non-unified build *will*
  be broken, and code purity (right includes in the right sources,
  mostly) won't be ensured. Do you think this is important enough to keep
  non-unified builds around?
 
 Another disadvantage here is that it will make adding or removing
 source files harder, because you'll have to clean up the accumulated
 nonunified bustage that shows up when files are shifted around
 between unified files.  (This might be somewhat harder to fix a year
 or two later than it is when causing it.)

Well, if files are shifted around in a way that breaks the build in a
very hard to fix way, it's still possible to adjust moz.build so that
the files aren't shifted around to begin with.
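
Concretely, one such adjustment (illustrative file names): moz.build distinguishes sources that may be batched from sources that are always compiled standalone, so a problem file can be pinned out of the unified set.

```python
# moz.build fragment (hypothetical file names).
# Files in UNIFIED_SOURCES may be batched into Unified_cppN.cpp files;
# files in SOURCES are always compiled as their own translation unit,
# so they never shift between batches.
UNIFIED_SOURCES += [
    'FileA.cpp',
    'FileB.cpp',
]
SOURCES += [
    'FragileFile.cpp',  # pinned out of unification
]
```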

Mike


Re: Unified compilation is going to ride the train

2014-11-28 Thread Mike Hommey
On Fri, Nov 28, 2014 at 06:29:49PM +0900, Mike Hommey wrote:
 On Fri, Nov 28, 2014 at 12:46:07AM -0800, L. David Baron wrote:
  On Friday 2014-11-28 10:12 +0900, Mike Hommey wrote:
   The downside from doing so, though, is that non-unified build *will*
   be broken, and code purity (right includes in the right sources,
   mostly) won't be ensured. Do you think this is important enough to keep
   non-unified builds around?
  
  Another disadvantage here is that it will make adding or removing
  source files harder, because you'll have to clean up the accumulated
  nonunified bustage that shows up when files are shifted around
  between unified files.  (This might be somewhat harder to fix a year
  or two later than it is when causing it.)
 
 Well, if files are shifted around in a way that breaks the build in a
 very hard to fix way, it's still possible to adjust moz.build so that
 the files aren't shifted around to begin with.

Also note that even with both unified and non-unified builds being
green, there are many ways a file shift can trigger build errors.

Mike


Re: Unified compilation is going to ride the train

2014-11-28 Thread Jonathan Kew

On 28/11/14 08:46, L. David Baron wrote:

On Friday 2014-11-28 10:12 +0900, Mike Hommey wrote:

The downside from doing so, though, is that non-unified build *will*
be broken, and code purity (right includes in the right sources,
mostly) won't be ensured. Do you think this is important enough to keep
non-unified builds around?


Another disadvantage here is that it will make adding or removing
source files harder, because you'll have to clean up the accumulated
nonunified bustage that shows up when files are shifted around
between unified files.  (This might be somewhat harder to fix a year
or two later than it is when causing it.)



IMO, it seems worth maintaining a non-unified build, to minimize this 
obscure fragility that will otherwise tend to accumulate over time. We 
could reduce the infrastructure load by doing the non-unified build on a 
more occasional basis; perhaps once a day would be enough?


We already have builds that (normally) happen once a day: nightlies. How 
about switching to a pattern where in addition to the nightly build, we 
also kick off a non-unified build for each platform on the same 
changeset? If that fails, we file a bug, and the normal expectation 
should be that such bugs can and will be fixed within a day (more or 
less), so the non-unified builds aren't left perma-broken.


JK



Re: Unified compilation is going to ride the train

2014-11-28 Thread Nicolas B. Pierron

On 11/28/2014 11:06 AM, Jonathan Kew wrote:

On 28/11/14 08:46, L. David Baron wrote:

On Friday 2014-11-28 10:12 +0900, Mike Hommey wrote:

The downside from doing so, though, is that non-unified build *will*
be broken, and code purity (right includes in the right sources,
mostly) won't be ensured. Do you think this is important enough to keep
non-unified builds around?


Another disadvantage here is that it will make adding or removing
source files harder, because you'll have to clean up the accumulated
nonunified bustage that shows up when files are shifted around
between unified files.  (This might be somewhat harder to fix a year
or two later than it is when causing it.)



IMO, it seems worth maintaining a non-unified build, to minimize this
obscure fragility that will otherwise tend to accumulate over time. We could
reduce the infrastructure load by doing the non-unified build on a more
occasional basis; perhaps once a day would be enough?

We already have builds that (normally) happen once a day: nightlies. How
about switching to a pattern where in addition to the nightly build, we also
kick off a non-unified build for each platform on the same changeset? If
that fails, we file a bug, and the normal expectation should be that such
bugs can and will be fixed within a day (more or less), so the non-unified
builds aren't left perma-broken.


I agree, we should keep non-unified builds as it keeps our individual files 
valid from the C++ point-of-view.  If this is taking too many resources, 
then I think it is acceptable to do it less frequently.


What a non-unified build identifies is a problem of responsibility. 
Finding missing symbols is the responsibility of the person who is adding 
references without including headers; it is not the responsibility of the 
person who is adding/removing files from a moz.build.


I know I have made these mistakes multiple times, and having B2G builds 
report such issues was helpful for cleaning up my patches at the earliest time.


--
Nicolas B. Pierron



Re: Unified compilation is going to ride the train

2014-11-28 Thread Ehsan Akhgari

On 2014-11-28 6:17 AM, Nicolas B. Pierron wrote:

On 11/28/2014 11:06 AM, Jonathan Kew wrote:

On 28/11/14 08:46, L. David Baron wrote:

On Friday 2014-11-28 10:12 +0900, Mike Hommey wrote:

The downside from doing so, though, is that non-unified build *will*
be broken, and code purity (right includes in the right sources,
mostly) won't be ensured. Do you think this is important enough to keep
non-unified builds around?


Another disadvantage here is that it will make adding or removing
source files harder, because you'll have to clean up the accumulated
nonunified bustage that shows up when files are shifted around
between unified files.  (This might be somewhat harder to fix a year
or two later than it is when causing it.)



IMO, it seems worth maintaining a non-unified build, to minimize this
obscure fragility that will otherwise tend to accumulate over time. We
could
reduce the infrastructure load by doing the non-unified build on a more
occasional basis; perhaps once a day would be enough?

We already have builds that (normally) happen once a day: nightlies. How
about switching to a pattern where in addition to the nightly build,
we also
kick off a non-unified build for each platform on the same changeset? If
that fails, we file a bug, and the normal expectation should be that such
bugs can and will be fixed within a day (more or less), so the
non-unified
builds aren't left perma-broken.


I agree, we should keep non-unified builds as it keeps our individual
files valid from the C++ point-of-view.  If this is taking too many
resources, then I think it is acceptable to do it less frequently.


The question is: what do we gain from doing that, technical purity 
aside?  Note that as Mike mentioned, even with doing both unified and 
non-unified builds, you may still get build failures when 
adding/removing .cpp files, so keeping support for non-unified builds 
will not fix that issue.



What a non-unified build identifies is a problem of responsibility.
Finding missing symbols is the responsibility of the person who is
adding references without including headers; it is not the responsibility
of the person who is adding/removing files from a moz.build.


That is not the only failure mode though.  You may for example get into 
a situation where Unified_foo0.cpp includes windows.h and that header 
#defines CreateEvent to something else, and you remove a file from the 
unified compilation causing a file from Unified_foo1.cpp to fall into 
Unified_foo0.cpp and break because of the CreateEvent macro that is now 
in effect in that translation unit.
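
That failure mode can be shown in a self-contained way (CreateEventA stands in for the windows.h macro; every name here is hypothetical):

```cpp
#include <string>

// Stand-in for the windows.h behavior described above, where
// CreateEvent is a macro (normally expanding to CreateEventW/A).
// In a unified build, this definition comes from a *neighboring*
// file concatenated earlier into the same translation unit.
#define CreateEvent CreateEventA

#define STRINGIFY2(x) #x
#define STRINGIFY(x) STRINGIFY2(x)

// A later file in the batch writes `CreateEvent`; the macro silently
// rewrites the token before the compiler ever sees it, so declarations
// and calls bind to CreateEventA instead.
const char* declaredName = STRINGIFY(CreateEvent);
```

Shuffle the batch so the windows.h-including file lands elsewhere, and the same source compiles to a different name, which is exactly why file shifts between unified files can break otherwise-green builds.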


Also, as I have mentioned upthread, we have never been in a situation 
where each source file includes all of the headers that it requires, and 
unified builds only barely make that problem worse.



I know I have made these mistakes multiple times, and having B2G builds
report such issues was helpful for cleaning up my patches at the earliest
time.


The point is, no amount of non-unified/unified build config combinations 
can detect scenarios such as the above.


Cheers,
Ehsan


Re: Unified compilation is going to ride the train

2014-11-28 Thread ISHIKAWA,chiaki

On 2014/11/28 18:26, Mike Hommey wrote:

On Fri, Nov 28, 2014 at 11:57:56AM +0900, ISHIKAWA,chiaki wrote:

On 2014/11/28 10:12, Mike Hommey wrote:

Hi,

A year ago, when unified compilation was introduced to speed up builds,
a couple issues were raised and we conservatively restricted them out
of aurora/beta/release/esr.

A year later, it's time to revisit this decision, and since afaik we
haven't had problems specific to unified compilation on nightlies,
including for crash reports, we can assume the issues are either gone
or didn't exist in the first place (one problem that comes to mind is
bug 943695, and it probably isn't a problem in practice, although weird)

I know a lot of people have burned non-unified builds now and then.
That's an annoyance and a distraction for getting things done. If
unified compilation rides up to beta and we don't see significant
problems, I think we can disable all periodic non-unified builds
and make the few builds that are always non-unified unified again (a few
debug builds are this way).

The downside from doing so, though, is that non-unified build *will*
be broken, and code purity (right includes in the right sources,
mostly) won't be ensured. Do you think this is important enough to keep
non-unified builds around?

Mike



Can we debug a unified-build binary using gdb on Unix systems?

I mean, the unified source is gone by the time the binary runs,
and I wonder what gdb would do to print the source lines.

Of course, if gdb gets confused, we can create a non-unified binary to use
source-level debugging, etc. [I am talking about the C++ side of the debugging.]

But if gdb gets confused, then not supporting non-unified builds for C++
debugging sounds tough. People's opinions may vary, but I have seen many
issues with C++ binaries, so I am not comfortable dropping non-unified
builds if gdb and other debugging aids get confused.

Of course, producing the end-user binary wins by using unified
compilation. I agree.


Local builds have defaulted to unified builds for a year. If that was
causing problems with debuggers, we'd have heard about it a thousand
times already.

Mike



Aha, I noticed I have the following line
in my MOZCONFIG file:

ac_add_options --disable-unified-compilation

I think I put it there as soon as some trials
with unified compilation occurred.
I will wait and see.

TIA





Re: Unified compilation is going to ride the train

2014-11-28 Thread Jonathan Kew

On 28/11/14 14:36, Ehsan Akhgari wrote:


The question is: what do we gain from doing that, technical purity
aside?  Note that as Mike mentioned, even with doing both unified and
non-unified builds, you may still get build failures when
adding/removing .cpp files, so keeping support for non-unified builds
will not fix that issue.


What a non-unified build identifies is a problem of responsibility.
Finding missing symbols is the responsibility of the person who is
adding references without including headers; it is not the responsibility
of the person who is adding/removing files from a moz.build.


That is not the only failure mode though.  You may for example get into
a situation where Unified_foo0.cpp includes windows.h and that header
#defines CreateEvent to something else, and you remove a file from the
unified compilation causing a file from Unified_foo1.cpp to fall into
Unified_foo0.cpp and break because of the CreateEvent macro that is now
in effect in that translation unit.

Also, as I have mentioned upthread, we have never been in a situation
where each source file includes all of the headers that it requires, and
unified builds only barely make that problem worse.


I know I have made these mistakes multiple times, and having B2G builds
report such issues was helpful for cleaning up my patches at the earliest
time.


The point is, no amount of non-unified/unified build config combinations
can detect scenarios such as the above.


While it's true that there are failure scenarios that regular 
non-unified builds can not detect, I think the basic problem where a 
source file fails to #include (directly or indirectly) the headers that 
it needs, is common enough to be worth catching early. I've seen it 
happen any number of times, in both my patches and others'. I don't 
think we should allow that pattern to remain - and spread - in the tree, 
as it will become an increasing source of fragility and pain.


JK



Re: Unified compilation is going to ride the train

2014-11-28 Thread Nicolas B. Pierron

On 11/28/2014 03:36 PM, Ehsan Akhgari wrote:

On 2014-11-28 6:17 AM, Nicolas B. Pierron wrote:

On 11/28/2014 11:06 AM, Jonathan Kew wrote:

On 28/11/14 08:46, L. David Baron wrote:

On Friday 2014-11-28 10:12 +0900, Mike Hommey wrote:

The downside from doing so, though, is that non-unified build *will*
be broken, and code purity (right includes in the right sources,
mostly) won't be ensured. Do you think this is important enough to keep
non-unified builds around?


Another disadvantage here is that it will make adding or removing
source files harder, because you'll have to clean up the accumulated
nonunified bustage that shows up when files are shifted around
between unified files.  (This might be somewhat harder to fix a year
or two later than it is when causing it.)



IMO, it seems worth maintaining a non-unified build, to minimize this
obscure fragility that will otherwise tend to accumulate over time. We
could
reduce the infrastructure load by doing the non-unified build on a more
occasional basis; perhaps once a day would be enough?

We already have builds that (normally) happen once a day: nightlies. How
about switching to a pattern where in addition to the nightly build,
we also
kick off a non-unified build for each platform on the same changeset? If
that fails, we file a bug, and the normal expectation should be that such
bugs can and will be fixed within a day (more or less), so the
non-unified
builds aren't left perma-broken.


I agree, we should keep non-unified builds as it keeps our individual
files valid from the C++ point-of-view.  If this is taking too many
resources, then I think it is acceptable to do it less frequently.


The question is: what do we gain from doing that, technical purity aside?
Note that as Mike mentioned, even with doing both unified and non-unified
builds, you may still get build failures when adding/removing .cpp files, so
keeping support for non-unified builds will not fix that issue.


Indeed, both will report errors, but not at the same time.

What we gain is some kind of confidence that we might be using the right 
symbols, and not another one which appears to have the same name (similar to 
Bug 1105781).


Note that issues similar to Bug 1105781 might still compile correctly, given 
the breadth-first lookup strategy of namespaces.  This problem is not 
unique to unified builds, only emphasized by them.
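
A self-contained illustration of that same-name hazard (all names hypothetical): which `Length` an unqualified call binds to depends on what else the translation unit has already seen, such as a using-directive dragged in by a neighboring file in the same unified batch.

```cpp
namespace a { inline int Length() { return 1; } }
namespace b { inline int Length() { return 2; } }

// Imagine this directive came from a different .cpp concatenated
// earlier in the unified file:
using namespace a;

namespace b {
// Unqualified lookup searches namespace b first and finds b::Length,
// so the directive has no effect here.
int insideB() { return Length(); }
}

// At global scope only a::Length is visible (via the using-directive),
// so the same spelling binds to a different function.
int atGlobalScope() { return Length(); }
```

If `b::Length` were later removed, or the using-directive shifted into or out of the batch, calls would silently rebind rather than fail, which is the kind of divergence a build cannot catch by compiling alone.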



What a non-unified build identifies is a problem of responsibility.
Finding missing symbols is the responsibility of the person who is
adding references without including headers; it is not the responsibility
of the person who is adding/removing files from a moz.build.


That is not the only failure mode though.  You may for example get into a
situation where Unified_foo0.cpp includes windows.h and that header #defines
CreateEvent to something else, and you remove a file from the unified
compilation causing a file from Unified_foo1.cpp to fall into
Unified_foo0.cpp and break because of the CreateEvent macro that is now in
effect in that translation unit.

Also, as I have mentioned upthread, we have never been in a situation where
each source file includes all of the headers that it requires, and unified
builds only barely make that problem worse.


I would be happy to drop the non-unified builds if we had a way to verify 
that symbols are bound the same way in unified and non-unified 
builds.


In fact, the only value provided by the non-unified build does not require 
that we link anything, only that we resolve the symbols in each 
translation unit.


Would there be a way to instrument a compiler such that we can produce such 
reports in unified and non-unified builds, and then compare them 
against each other?  This would also be useful for class definitions in 
headers, to ensure that one class is always compiled the same way in all 
translation units.
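
The comparison suggested here could be sketched as follows (entirely hypothetical; no such compiler instrumentation exists). Each build would dump, per original source file, which definition every referenced symbol resolved to; diffing the two reports flags names that bound differently.

```cpp
#include <map>
#include <string>
#include <vector>

// source file -> symbol name -> definition it resolved to
using Report = std::map<std::string, std::map<std::string, std::string>>;

// Return a line per symbol that resolved differently in the two builds.
std::vector<std::string> divergences(const Report& unified,
                                     const Report& nonUnified) {
  std::vector<std::string> diffs;
  for (const auto& filePair : nonUnified) {
    auto it = unified.find(filePair.first);
    if (it == unified.end()) continue;
    for (const auto& symPair : filePair.second) {
      auto jt = it->second.find(symPair.first);
      if (jt != it->second.end() && jt->second != symPair.second) {
        diffs.push_back(filePair.first + ": " + symPair.first + " -> " +
                        symPair.second + " (non-unified) vs " +
                        jt->second + " (unified)");
      }
    }
  }
  return diffs;
}
```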



--
Nicolas B. Pierron


Re: Unified compilation is going to ride the train

2014-11-28 Thread Ehsan Akhgari

On 2014-11-28 11:02 AM, Jonathan Kew wrote:

On 28/11/14 14:36, Ehsan Akhgari wrote:


The question is: what do we gain from doing that, technical purity
aside?  Note that as Mike mentioned, even with doing both unified and
non-unified builds, you may still get build failures when
adding/removing .cpp files, so keeping support for non-unified builds
will not fix that issue.


What a non-unified build identifies is a problem of responsibility.
Finding missing symbols is the responsibility of the person who is
adding references without including headers; it is not the responsibility
of the person who is adding/removing files from a moz.build.


That is not the only failure mode though.  You may for example get into
a situation where Unified_foo0.cpp includes windows.h and that header
#defines CreateEvent to something else, and you remove a file from the
unified compilation causing a file from Unified_foo1.cpp to fall into
Unified_foo0.cpp and break because of the CreateEvent macro that is now
in effect in that translation unit.

Also, as I have mentioned upthread, we have never been in a situation
where each source file includes all of the headers that it requires, and
unified builds only barely make that problem worse.


I know I have made these mistakes multiple times, and having B2G builds
report such issues was helpful for cleaning up my patches at the earliest
time.


The point is, no amount of non-unified/unified build config combinations
can detect scenarios such as the above.


While it's true that there are failure scenarios that regular
non-unified builds can not detect, I think the basic problem where a
source file fails to #include (directly or indirectly) the headers that
it needs, is common enough to be worth catching early. I've seen it
happen any number of times, in both my patches and others'. I don't
think we should allow that pattern to remain - and spread - in the tree,
as it will become an increasing source of fragility and pain.


I'm not disagreeing with you; I'm just trying to say that we have 
_never_ imposed this, either before or after we introduced unified 
builds.  See the dependencies of 
https://bugzilla.mozilla.org/show_bug.cgi?id=includehell for a lot of 
examples of those cases.  Therefore, dropping support for non-unified 
builds makes things neither better nor worse from this perspective.




Re: Unified compilation is going to ride the train

2014-11-28 Thread Ehsan Akhgari

On 2014-11-28 11:18 AM, Nicolas B. Pierron wrote:

On 11/28/2014 03:36 PM, Ehsan Akhgari wrote:

On 2014-11-28 6:17 AM, Nicolas B. Pierron wrote:

On 11/28/2014 11:06 AM, Jonathan Kew wrote:

On 28/11/14 08:46, L. David Baron wrote:

On Friday 2014-11-28 10:12 +0900, Mike Hommey wrote:

The downside from doing so, though, is that non-unified build *will*
be broken, and code purity (right includes in the right sources,
mostly) won't be ensured. Do you think this is important enough to
keep
non-unified builds around?


Another disadvantage here is that it will make adding or removing
source files harder, because you'll have to clean up the accumulated
nonunified bustage that shows up when files are shifted around
between unified files.  (This might be somewhat harder to fix a year
or two later than it is when causing it.)



IMO, it seems worth maintaining a non-unified build, to minimize this
obscure fragility that will otherwise tend to accumulate over time. We
could
reduce the infrastructure load by doing the non-unified build on a more
occasional basis; perhaps once a day would be enough?

We already have builds that (normally) happen once a day: nightlies.
How
about switching to a pattern where in addition to the nightly build,
we also
kick off a non-unified build for each platform on the same
changeset? If
that fails, we file a bug, and the normal expectation should be that
such
bugs can and will be fixed within a day (more or less), so the
non-unified
builds aren't left perma-broken.


I agree, we should keep non-unified builds as it keeps our individual
files valid from the C++ point-of-view.  If this is taking too many
resources, then I think it is acceptable to do it less frequently.


The question is: what do we gain from doing that, technical purity aside?
Note that as Mike mentioned, even with doing both unified and non-unified
builds, you may still get build failures when adding/removing .cpp
files, so
keeping support for non-unified builds will not fix that issue.


Indeed, both will report errors, but not at the same time.

What we gain is some confidence that we are using the right symbols, and
not another one which appears to have the same name (similar to bug
1105781).

Note that issues similar to bug 1105781 might still compile correctly
because of the breadth-first lookup strategy of namespaces.  This problem
is not unique to unified builds, only emphasized by them.


Bug 1105781 is a great example of an issue that has nothing to do with 
unified builds.  ;-)  The reason why these issues sometimes keep coming 
up with (non-)unified builds is that each configuration compiles a 
different amount of code in the same translation unit.  That is in fact 
an argument _for_ dropping support for non-unified builds, as it reduces 
the number of build configurations we need to keep compiling.



What a non-unified build identifies is a problem of responsibility.
Fixing missing symbols is the responsibility of the person who added
references without including the corresponding headers; it should not
fall on the person who is adding or removing files in a moz.build.


That is not the only failure mode though.  You may for example get into a
situation where Unified_foo0.cpp includes windows.h and that header
#defines
CreateEvent to something else, and you remove a file from the unified
compilation causing a file from Unified_foo1.cpp to fall into
Unified_foo0.cpp and break because of the CreateEvent macro that is
now in
effect in that translation unit.

Also, as I have mentioned upthread, we have never been in a situation
where
each source file includes all of the headers that it requires, and
unified
builds only barely make that problem worse.


I would be happy to drop the non-unified builds if we had a way to
verify that symbols are resolved the same way in unified and
non-unified builds.


https://code.google.com/p/include-what-you-use/

At the risk of repeating myself once more, this has nothing to do with
unified builds, and we have never been in a good spot with regards to
this.  Also, as someone who has run this tool many times on our code
base, using it is a losing battle; people just keep adding more of these
issues, because for the majority of people, if something compiles now,
it's good enough from this perspective.



In fact, the only value provided by the non-unified build does not
require us to link anything, only to resolve the symbols within each
translation unit.


Translation units do not matter for this purpose; source code files (for
example, .h files) do.



Would there be a way to instrument a compiler so that we can produce
such reports in unified and non-unified builds, and then compare them
against each other?  This would also be useful for class definitions in
headers, to ensure that a class is always compiled the same way in all
translation units.


Again, IWYU is the tool.

Unified compilation is going to ride the train

2014-11-27 Thread Mike Hommey
Hi,

A year ago, when unified compilation was introduced to speed up builds,
a couple issues were raised and we conservatively restricted them out
of aurora/beta/release/esr.

A year later, it's time to revisit this decision, and since afaik we
haven't had problems specific to unified compilation on nightlies,
including for crash reports, we can assume the issues are either gone
or didn't exist in the first place (one problem that comes to mind is
bug 943695, and it probably isn't a problem in practice, although weird).

I know a lot of people have burned non-unified builds now and then.
That's an annoyance and a distraction for getting things done. If
unified compilation rides up to beta and we don't see significant
problems, I think we can disable all periodic non-unified builds
and make the few builds that are always non-unified unified again (a few
debug builds are this way).

The downside from doing so, though, is that non-unified build *will*
be broken, and code purity (right includes in the right sources,
mostly) won't be ensured. Do you think this is important enough to keep
non-unified builds around?

Mike
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Unified compilation is going to ride the train

2014-11-27 Thread Ehsan Akhgari

On 2014-11-27 8:12 PM, Mike Hommey wrote:

Hi,

A year ago, when unified compilation was introduced to speed up builds,
a couple issues were raised and we conservatively restricted them out
of aurora/beta/release/esr.

A year later, it's time to revisit this decision, and since afaik we
haven't had problems specific to unified compilation on nightlies,
including for crash reports, we can assume the issues are either gone
or didn't exist in the first place (one problem that comes to mind is
bug 943695, and it probably isn't a problem in practice, although weird)

I know a lot of people have burned non-unified builds now and then.
That's an annoyance and a distraction for getting things done. If
unified compilation rides up to beta and we don't see significant
problems, I think we can disable all periodic non-unified builds
and make the few builds that are always non-unified unified again (a few
debug builds are this way).


I am not aware of any major outstanding issues with unified builds.


The downside from doing so, though, is that non-unified build *will*
be broken, and code purity (right includes in the right sources,
mostly) won't be ensured. Do you think this is important enough to keep
non-unified builds around?


FWIW we have never really ensured the right includes in the right 
sources.  ;-)  But more seriously, I think the benefits of unified 
builds in terms of build speed, plus the benefit of having one less build 
configuration that people can break, outweigh the technical purity here. 
 So I'd support dropping support for non-unified builds.




Re: Unified compilation is going to ride the train

2014-11-27 Thread Ehsan Akhgari

On 2014-11-27 9:16 PM, Ehsan Akhgari wrote:

On 2014-11-27 8:12 PM, Mike Hommey wrote:

Hi,

A year ago, when unified compilation was introduced to speed up builds,
a couple issues were raised and we conservatively restricted them out
of aurora/beta/release/esr.

A year later, it's time to revisit this decision, and since afaik we
haven't had problems specific to unified compilation on nightlies,
including for crash reports, we can assume the issues are either gone
or didn't exist in the first place (one problem that comes to mind is
bug 943695, and it probably isn't a problem in practice, although weird)

I know a lot of people have burned non-unified builds now and then.
That's an annoyance and a distraction for getting things done. If
unified compilation rides up to beta and we don't see significant
problems, I think we can disable all periodic non-unified builds
and make the few builds that are always non-unified unified again (a few
debug builds are this way).


I am not aware of any major outstanding issues with unified builds.


The downside from doing so, though, is that non-unified build *will*
be broken, and code purity (right includes in the right sources,
mostly) won't be ensured. Do you think this is important enough to keep
non-unified builds around?


FWIW we have never really ensured the right includes in the right
sources.  ;-)  But more seriously, I think the benefits of unified
builds in terms of build speed, plus the benefit of having one less build
configuration that people can break, outweigh the technical purity here.
  So I'd support dropping support for non-unified builds.


Another point in favor of dropping support for non-unified builds is 
that it frees up some infrastructure resources that are currently used to 
test those builds, and also makes the builds faster in configurations 
where we build non-unified by default (such as linux32).




Re: Unified compilation is going to ride the train

2014-11-27 Thread ISHIKAWA,chiaki

On 2014/11/28 10:12, Mike Hommey wrote:

Hi,

A year ago, when unified compilation was introduced to speed up builds,
a couple issues were raised and we conservatively restricted them out
of aurora/beta/release/esr.

A year later, it's time to revisit this decision, and since afaik we
haven't had problems specific to unified compilation on nightlies,
including for crash reports, we can assume the issues are either gone
or didn't exist in the first place (one problem that comes to mind is
bug 943695, and it probably isn't a problem in practice, although weird)

I know a lot of people have burned non-unified builds now and then.
That's an annoyance and a distraction for getting things done. If
unified compilation rides up to beta and we don't see significant
problems, I think we can disable all periodic non-unified builds
and make the few builds that are always non-unified unified again (a few
debug builds are this way).

The downside from doing so, though, is that non-unified build *will*
be broken, and code purity (right includes in the right sources,
mostly) won't be ensured. Do you think this is important enough to keep
non-unified builds around?

Mike



Can we debug a unified-build binary using gdb under Unix?

I mean that the unified source is gone by the time the binary runs,
and I wonder what gdb would do to print the source lines.

Of course, if gdb gets confused, we can create a non-unified binary to 
use source-level debugging, etc.  [I am talking about the C++ side of 
the debugging.]

But if gdb gets confused, then not supporting non-unified builds for
C++ debugging sounds tough.  People's opinions may vary, but I have seen
many issues with C++ binaries, and so I am not comfortable dropping
non-unified builds if gdb and other debugging aids get confused.

Of course, producing the end-user binary with unified compilation wins. 
I agree.


TIA

