Re: Unified compilation is going to ride the train

2014-11-28 Thread L. David Baron
On Friday 2014-11-28 10:12 +0900, Mike Hommey wrote:
 The downside from doing so, though, is that non-unified build *will*
 be broken, and code purity (right includes in the right sources,
 mostly) won't be ensured. Do you think this is important enough to keep
 non-unified builds around?

Another disadvantage here is that it will make adding or removing
source files harder, because you'll have to clean up the accumulated
nonunified bustage that shows up when files are shifted around
between unified files.  (This might be somewhat harder to fix a year
or two later than it is when causing it.)

-David

-- 
L. David Baron                           http://dbaron.org/
Mozilla                           https://www.mozilla.org/
 Before I built a wall I'd ask to know
 What I was walling in or walling out,
 And to whom I was like to give offense.
   - Robert Frost, Mending Wall (1914)


___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: prebuilt libraries?

2014-11-28 Thread Neil

Gregory Szorc wrote:


Please read http://www.conifersystems.com/whitepapers/gnu-make/.


after a command fails, |make| does not delete the partially built 
output file


.DELETE_ON_ERROR was added to address this.
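For reference, enabling it is a one-line special target; a minimal sketch of a Makefile using it (the rule and file names here are invented for illustration):

```makefile
# Once .DELETE_ON_ERROR is declared anywhere in the Makefile, make
# deletes the target of any rule whose recipe exits non-zero, so a
# partially written output file can't be mistaken for up-to-date on
# the next run.
.DELETE_ON_ERROR:

out.o: in.c
	$(CC) -c in.c -o out.o
```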

--
Warning: May contain traces of nuts.


Re: Unified compilation is going to ride the train

2014-11-28 Thread Mike Hommey
On Fri, Nov 28, 2014 at 11:57:56AM +0900, ISHIKAWA,chiaki wrote:
 On 2014/11/28 10:12, Mike Hommey wrote:
 Hi,
 
 A year ago, when unified compilation was introduced to speed up builds,
 a couple issues were raised and we conservatively restricted them out
 of aurora/beta/release/esr.
 
 A year later, it's time to revisit this decision, and since afaik we
 haven't had problems specific to unified compilation on nightlies,
 including for crash reports, we can assume the issues are either gone
 or didn't exist in the first place (one problem that comes to mind is
 bug 943695, and it probably isn't a problem in practice, although weird)
 
 I know a lot of people have burned non-unified builds now and then.
 That's an annoyance and a distraction for getting things done. If
 unified compilation rides up to beta and we don't see significant
 problems, I think we can disable all periodic non-unified builds
 and make the few builds that are always non-unified unified again (a few
 debug builds are this way).
 
 The downside from doing so, though, is that non-unified build *will*
 be broken, and code purity (right includes in the right sources,
 mostly) won't be ensured. Do you think this is important enough to keep
 non-unified builds around?
 
 Mike
 
 
 Can we debug a unified-build binary using gdb under Unix?
 
 I mean that the unified source is gone by the time the binary runs,
 and I wonder how gdb would print the source lines.
 
 Of course, if gdb gets confused, we can create a non-unified binary for
 source-level debugging. [I am talking about the C++ side of the debugging.]
 
 But if gdb gets confused, then not supporting non-unified builds for
 C++ debugging sounds tough. Opinions may vary, but I have seen many
 issues with C++ binaries, so I am not comfortable dropping non-unified
 builds if gdb and other debugging aids get confused.
 
 Of course, for producing the end-user binary, unified compilation wins.
 I agree.

Local builds have defaulted to unified builds for a year. If that was
causing problems with debuggers, we'd have heard about it a thousand
times already.

Mike


Re: Unified compilation is going to ride the train

2014-11-28 Thread Mike Hommey
On Fri, Nov 28, 2014 at 12:46:07AM -0800, L. David Baron wrote:
 On Friday 2014-11-28 10:12 +0900, Mike Hommey wrote:
  The downside from doing so, though, is that non-unified build *will*
  be broken, and code purity (right includes in the right sources,
  mostly) won't be ensured. Do you think this is important enough to keep
  non-unified builds around?
 
 Another disadvantage here is that it will make adding or removing
 source files harder, because you'll have to clean up the accumulated
 nonunified bustage that shows up when files are shifted around
 between unified files.  (This might be somewhat harder to fix a year
 or two later than it is when causing it.)

Well, if files are shifted around in a way that breaks the build in a
very hard-to-fix way, it's still possible to adjust moz.build so that
the files aren't shifted around to begin with.
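As a sketch of what such an adjustment can look like (file names invented): moz.build distinguishes SOURCES, which always compile standalone, from UNIFIED_SOURCES, which get batched into generated Unified_*.cpp files, so a troublesome file can be pinned outside the unification:

```python
# moz.build fragment (illustrative file names). Files listed in
# UNIFIED_SOURCES are concatenated into generated Unified_*.cpp files;
# files listed in SOURCES compile on their own, so adding or removing
# them never shifts other files between unified batches.
UNIFIED_SOURCES += [
    "Widget.cpp",
    "WidgetList.cpp",
]

SOURCES += [
    "WindowsEventLoop.cpp",  # kept out of unification deliberately
]
```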

Mike


Re: Unified compilation is going to ride the train

2014-11-28 Thread Mike Hommey
On Fri, Nov 28, 2014 at 06:29:49PM +0900, Mike Hommey wrote:
 On Fri, Nov 28, 2014 at 12:46:07AM -0800, L. David Baron wrote:
  On Friday 2014-11-28 10:12 +0900, Mike Hommey wrote:
   The downside from doing so, though, is that non-unified build *will*
   be broken, and code purity (right includes in the right sources,
   mostly) won't be ensured. Do you think this is important enough to keep
   non-unified builds around?
  
  Another disadvantage here is that it will make adding or removing
  source files harder, because you'll have to clean up the accumulated
  nonunified bustage that shows up when files are shifted around
  between unified files.  (This might be somewhat harder to fix a year
  or two later than it is when causing it.)
 
 Well, if files are shifted around in a way that breaks the build in a
 very hard-to-fix way, it's still possible to adjust moz.build so that
 the files aren't shifted around to begin with.

Also note that even with both unified and non-unified builds being
green, there are many ways a file shift can trigger build errors.
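The shifting itself is mechanical; a small Python sketch (the real chunk size and grouping live in the build backend, and the file names are invented) shows how removing one file changes which unified file its neighbours land in:

```python
def unify(sources, files_per_unit=16):
    """Group sorted sources into batches, mimicking Unified_*.cpp files."""
    sources = sorted(sources)
    return [sources[i:i + files_per_unit]
            for i in range(0, len(sources), files_per_unit)]

files = ["File%02d.cpp" % n for n in range(40)]
before = unify(files)
after = unify([f for f in files if f != "File03.cpp"])

# Removing one early file pulls every later file back a slot, so a file
# near a batch boundary moves into a different translation unit:
print(before[1][0])  # File16.cpp starts the second unified file...
print(after[0][-1])  # ...but falls into the first one after the removal.
```

Whether such a shift actually breaks anything then depends on which headers and macros each neighbour happened to inherit from its former batch.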

Mike


Re: prebuilt libraries?

2014-11-28 Thread Thomas Zimmermann
Hi Gregory

 
 Please read http://www.conifersystems.com/whitepapers/gnu-make/. That is
 one of my go-to articles for explaining why make sucks.

I would not point people to this article as it is flawed. I won't go
through the points it mentions. Some are relevant, others aren't, and
some probably depend on the user's expectation.

What I really criticize is that the authors are often simply ignorant.
There are several examples of this, but the worst one is the case of
recursive make. They cite "Recursive Make Considered Harmful", yet they
insist on using recursive make and then complain about how it leads to
problems, ignoring the existing solutions provided in RMCH.

Another point to mention is that Conifer Systems sells a competing build
system. They have a financial interest in making make look bad, rather
than in merely improving the state of the art.

Best regards
Thomas

p.s. I'd rather like to stop the discussion soon, because it's quite OT
at this point.


Re: Unified compilation is going to ride the train

2014-11-28 Thread Jonathan Kew

On 28/11/14 08:46, L. David Baron wrote:

On Friday 2014-11-28 10:12 +0900, Mike Hommey wrote:

The downside from doing so, though, is that non-unified build *will*
be broken, and code purity (right includes in the right sources,
mostly) won't be ensured. Do you think this is important enough to keep
non-unified builds around?


Another disadvantage here is that it will make adding or removing
source files harder, because you'll have to clean up the accumulated
nonunified bustage that shows up when files are shifted around
between unified files.  (This might be somewhat harder to fix a year
or two later than it is when causing it.)



IMO, it seems worth maintaining a non-unified build, to minimize this 
obscure fragility that will otherwise tend to accumulate over time. We 
could reduce the infrastructure load by doing the non-unified build on a 
more occasional basis; perhaps once a day would be enough?


We already have builds that (normally) happen once a day: nightlies. How 
about switching to a pattern where in addition to the nightly build, we 
also kick off a non-unified build for each platform on the same 
changeset? If that fails, we file a bug, and the normal expectation 
should be that such bugs can and will be fixed within a day (more or 
less), so the non-unified builds aren't left perma-broken.


JK



Re: Unified compilation is going to ride the train

2014-11-28 Thread Nicolas B. Pierron

On 11/28/2014 11:06 AM, Jonathan Kew wrote:

On 28/11/14 08:46, L. David Baron wrote:

On Friday 2014-11-28 10:12 +0900, Mike Hommey wrote:

The downside from doing so, though, is that non-unified build *will*
be broken, and code purity (right includes in the right sources,
mostly) won't be ensured. Do you think this is important enough to keep
non-unified builds around?


Another disadvantage here is that it will make adding or removing
source files harder, because you'll have to clean up the accumulated
nonunified bustage that shows up when files are shifted around
between unified files.  (This might be somewhat harder to fix a year
or two later than it is when causing it.)



IMO, it seems worth maintaining a non-unified build, to minimize this
obscure fragility that will otherwise tend to accumulate over time. We could
reduce the infrastructure load by doing the non-unified build on a more
occasional basis; perhaps once a day would be enough?

We already have builds that (normally) happen once a day: nightlies. How
about switching to a pattern where in addition to the nightly build, we also
kick off a non-unified build for each platform on the same changeset? If
that fails, we file a bug, and the normal expectation should be that such
bugs can and will be fixed within a day (more or less), so the non-unified
builds aren't left perma-broken.


I agree; we should keep non-unified builds, as they keep our individual
files valid from the C++ point of view.  If this takes too many
resources, then I think it is acceptable to do it less frequently.


What a non-unified build identifies is a problem of responsibility.
Finding missing symbols is the responsibility of the person who added
references without including the headers, not of the person who is
adding or removing files in a moz.build.


I know I have made this mistake multiple times, and having B2G builds
report such issues was helpful for cleaning up my patches early.


--
Nicolas B. Pierron



Re: Critical - XULRunner 34 fails with Couldn't load XPCOM in MacOSX

2014-11-28 Thread Benjamin Smedberg


On 11/27/2014 10:38 PM, allencb...@gmail.com wrote:

I've reported this in bugzilla. Any one has any workaround?


I commented in the bug 
(https://bugzilla.mozilla.org/show_bug.cgi?id=1105044). This is probably 
due to the MacOS v2 signing work which restructured the bundles. Since 
XULRunner is not maintained, you or other XULRunner users will probably 
have to debug this to figure out what's going on and propose a solution.


--BDS



Re: Unified compilation is going to ride the train

2014-11-28 Thread Ehsan Akhgari

On 2014-11-28 6:17 AM, Nicolas B. Pierron wrote:

On 11/28/2014 11:06 AM, Jonathan Kew wrote:

On 28/11/14 08:46, L. David Baron wrote:

On Friday 2014-11-28 10:12 +0900, Mike Hommey wrote:

The downside from doing so, though, is that non-unified build *will*
be broken, and code purity (right includes in the right sources,
mostly) won't be ensured. Do you think this is important enough to keep
non-unified builds around?


Another disadvantage here is that it will make adding or removing
source files harder, because you'll have to clean up the accumulated
nonunified bustage that shows up when files are shifted around
between unified files.  (This might be somewhat harder to fix a year
or two later than it is when causing it.)



IMO, it seems worth maintaining a non-unified build, to minimize this
obscure fragility that will otherwise tend to accumulate over time. We
could
reduce the infrastructure load by doing the non-unified build on a more
occasional basis; perhaps once a day would be enough?

We already have builds that (normally) happen once a day: nightlies. How
about switching to a pattern where in addition to the nightly build,
we also
kick off a non-unified build for each platform on the same changeset? If
that fails, we file a bug, and the normal expectation should be that such
bugs can and will be fixed within a day (more or less), so the
non-unified
builds aren't left perma-broken.


I agree; we should keep non-unified builds, as they keep our individual
files valid from the C++ point of view.  If this takes too many
resources, then I think it is acceptable to do it less frequently.


The question is: what do we gain from doing that, technical purity 
aside?  Note that as Mike mentioned, even with doing both unified and 
non-unified builds, you may still get build failures when 
adding/removing .cpp files, so keeping support for non-unified builds 
will not fix that issue.



What a non-unified build identifies is a problem of responsibility.
Finding missing symbols is the responsibility of the person who added
references without including the headers, not of the person who is
adding or removing files in a moz.build.


That is not the only failure mode though.  You may for example get into 
a situation where Unified_foo0.cpp includes windows.h and that header 
#defines CreateEvent to something else, and you remove a file from the 
unified compilation causing a file from Unified_foo1.cpp to fall into 
Unified_foo0.cpp and break because of the CreateEvent macro that is now 
in effect in that translation unit.


Also, as I have mentioned upthread, we have never been in a situation 
where each source file includes all of the headers that it requires, and 
unified builds only barely make that problem worse.



I know I have made this mistake multiple times, and having B2G builds
report such issues was helpful for cleaning up my patches early.


The point is, no amount of non-unified/unified build config combinations 
can detect scenarios such as the above.


Cheers,
Ehsan


Re: Unified compilation is going to ride the train

2014-11-28 Thread ISHIKAWA,chiaki

On 2014/11/28 18:26, Mike Hommey wrote:

On Fri, Nov 28, 2014 at 11:57:56AM +0900, ISHIKAWA,chiaki wrote:

On 2014/11/28 10:12, Mike Hommey wrote:

Hi,

A year ago, when unified compilation was introduced to speed up builds,
a couple issues were raised and we conservatively restricted them out
of aurora/beta/release/esr.

A year later, it's time to revisit this decision, and since afaik we
haven't had problems specific to unified compilation on nightlies,
including for crash reports, we can assume the issues are either gone
or didn't exist in the first place (one problem that comes to mind is
bug 943695, and it probably isn't a problem in practice, although weird)

I know a lot of people have burned non-unified builds now and then.
That's an annoyance and a distraction for getting things done. If
unified compilation rides up to beta and we don't see significant
problems, I think we can disable all periodic non-unified builds
and make the few builds that are always non-unified unified again (a few
debug builds are this way).

The downside from doing so, though, is that non-unified build *will*
be broken, and code purity (right includes in the right sources,
mostly) won't be ensured. Do you think this is important enough to keep
non-unified builds around?

Mike



Can we debug a unified-build binary using gdb under Unix?

I mean that the unified source is gone by the time the binary runs,
and I wonder how gdb would print the source lines.

Of course, if gdb gets confused, we can create a non-unified binary for
source-level debugging. [I am talking about the C++ side of the debugging.]

But if gdb gets confused, then not supporting non-unified builds for
C++ debugging sounds tough. Opinions may vary, but I have seen many
issues with C++ binaries, so I am not comfortable dropping non-unified
builds if gdb and other debugging aids get confused.

Of course, for producing the end-user binary, unified compilation wins.
I agree.


Local builds have defaulted to unified builds for a year. If that was
causing problems with debuggers, we'd have heard about it a thousand
times already.

Mike



Aha, I noticed that I have the following line
in my MOZCONFIG file:

ac_add_options --disable-unified-compilation

Hmm... I think I put it there when the first trials of unified
compilation occurred.
I will wait and see.

TIA





Re: Unified compilation is going to ride the train

2014-11-28 Thread Jonathan Kew

On 28/11/14 14:36, Ehsan Akhgari wrote:


The question is: what do we gain from doing that, technical purity
aside?  Note that as Mike mentioned, even with doing both unified and
non-unified builds, you may still get build failures when
adding/removing .cpp files, so keeping support for non-unified builds
will not fix that issue.


What a non-unified build identifies is a problem of responsibility.
Finding missing symbols is the responsibility of the person who added
references without including the headers, not of the person who is
adding or removing files in a moz.build.


That is not the only failure mode though.  You may for example get into
a situation where Unified_foo0.cpp includes windows.h and that header
#defines CreateEvent to something else, and you remove a file from the
unified compilation causing a file from Unified_foo1.cpp to fall into
Unified_foo0.cpp and break because of the CreateEvent macro that is now
in effect in that translation unit.

Also, as I have mentioned upthread, we have never been in a situation
where each source file includes all of the headers that it requires, and
unified builds only barely make that problem worse.


I know I have made this mistake multiple times, and having B2G builds
report such issues was helpful for cleaning up my patches early.


The point is, no amount of non-unified/unified build config combinations
can detect scenarios such as the above.


While it's true that there are failure scenarios that regular
non-unified builds cannot detect, I think the basic problem where a
source file fails to #include (directly or indirectly) the headers that
it needs is common enough to be worth catching early. I've seen it
happen any number of times, in both my patches and others'. I don't
think we should allow that pattern to remain - and spread - in the tree,
as it will become an increasing source of fragility and pain.


JK



Re: Unified compilation is going to ride the train

2014-11-28 Thread Nicolas B. Pierron

On 11/28/2014 03:36 PM, Ehsan Akhgari wrote:

On 2014-11-28 6:17 AM, Nicolas B. Pierron wrote:

On 11/28/2014 11:06 AM, Jonathan Kew wrote:

On 28/11/14 08:46, L. David Baron wrote:

On Friday 2014-11-28 10:12 +0900, Mike Hommey wrote:

The downside from doing so, though, is that non-unified build *will*
be broken, and code purity (right includes in the right sources,
mostly) won't be ensured. Do you think this is important enough to keep
non-unified builds around?


Another disadvantage here is that it will make adding or removing
source files harder, because you'll have to clean up the accumulated
nonunified bustage that shows up when files are shifted around
between unified files.  (This might be somewhat harder to fix a year
or two later than it is when causing it.)



IMO, it seems worth maintaining a non-unified build, to minimize this
obscure fragility that will otherwise tend to accumulate over time. We
could
reduce the infrastructure load by doing the non-unified build on a more
occasional basis; perhaps once a day would be enough?

We already have builds that (normally) happen once a day: nightlies. How
about switching to a pattern where in addition to the nightly build,
we also
kick off a non-unified build for each platform on the same changeset? If
that fails, we file a bug, and the normal expectation should be that such
bugs can and will be fixed within a day (more or less), so the
non-unified
builds aren't left perma-broken.


I agree; we should keep non-unified builds, as they keep our individual
files valid from the C++ point of view.  If this takes too many
resources, then I think it is acceptable to do it less frequently.


The question is: what do we gain from doing that, technical purity aside?
Note that as Mike mentioned, even with doing both unified and non-unified
builds, you may still get build failures when adding/removing .cpp files, so
keeping support for non-unified builds will not fix that issue.


Indeed, both will report errors, but not at the same time.

What we gain is some confidence that we are using the right symbols,
and not other ones which happen to have the same name (similar to Bug
1105781).


Note that issues similar to Bug 1105781 might still compile correctly
based on the breadth-first lookup strategy of namespaces.  This problem
is not unique to unified builds, only emphasized by them.



What a non-unified build identifies is a problem of responsibility.
Finding missing symbols is the responsibility of the person who added
references without including the headers, not of the person who is
adding or removing files in a moz.build.


That is not the only failure mode though.  You may for example get into a
situation where Unified_foo0.cpp includes windows.h and that header #defines
CreateEvent to something else, and you remove a file from the unified
compilation causing a file from Unified_foo1.cpp to fall into
Unified_foo0.cpp and break because of the CreateEvent macro that is now in
effect in that translation unit.

Also, as I have mentioned upthread, we have never been in a situation where
each source file includes all of the headers that it requires, and unified
builds only barely make that problem worse.


I would be happy to drop the non-unified builds if we had a way to
verify that symbols are resolved the same way in unified and
non-unified builds.


In fact, the only value provided by the non-unified build does not
require linking anything, only resolving the symbols in each
translation unit.


Would there be a way to instrument a compiler so that we can produce
such reports in unified and non-unified builds, and then compare them
against each other?  This would also be useful for class definitions in
headers, to ensure that a class is always compiled the same way in all
translation units.



--
Nicolas B. Pierron


Re: Unified compilation is going to ride the train

2014-11-28 Thread Ehsan Akhgari

On 2014-11-28 11:02 AM, Jonathan Kew wrote:

On 28/11/14 14:36, Ehsan Akhgari wrote:


The question is: what do we gain from doing that, technical purity
aside?  Note that as Mike mentioned, even with doing both unified and
non-unified builds, you may still get build failures when
adding/removing .cpp files, so keeping support for non-unified builds
will not fix that issue.


What a non-unified build identifies is a problem of responsibility.
Finding missing symbols is the responsibility of the person who added
references without including the headers, not of the person who is
adding or removing files in a moz.build.


That is not the only failure mode though.  You may for example get into
a situation where Unified_foo0.cpp includes windows.h and that header
#defines CreateEvent to something else, and you remove a file from the
unified compilation causing a file from Unified_foo1.cpp to fall into
Unified_foo0.cpp and break because of the CreateEvent macro that is now
in effect in that translation unit.

Also, as I have mentioned upthread, we have never been in a situation
where each source file includes all of the headers that it requires, and
unified builds only barely make that problem worse.


I know I have made this mistake multiple times, and having B2G builds
report such issues was helpful for cleaning up my patches early.


The point is, no amount of non-unified/unified build config combinations
can detect scenarios such as the above.


While it's true that there are failure scenarios that regular
non-unified builds cannot detect, I think the basic problem where a
source file fails to #include (directly or indirectly) the headers that
it needs is common enough to be worth catching early. I've seen it
happen any number of times, in both my patches and others'. I don't
think we should allow that pattern to remain - and spread - in the tree,
as it will become an increasing source of fragility and pain.


I'm not disagreeing with you; I'm just trying to say that we have
_never_ imposed this, either before or after we introduced unified
builds.  See the dependencies of
https://bugzilla.mozilla.org/show_bug.cgi?id=includehell for a lot of
examples of those cases.  Therefore, dropping support for non-unified
builds makes things neither better nor worse from this perspective.




Re: Unified compilation is going to ride the train

2014-11-28 Thread Ehsan Akhgari

On 2014-11-28 11:18 AM, Nicolas B. Pierron wrote:

On 11/28/2014 03:36 PM, Ehsan Akhgari wrote:

On 2014-11-28 6:17 AM, Nicolas B. Pierron wrote:

On 11/28/2014 11:06 AM, Jonathan Kew wrote:

On 28/11/14 08:46, L. David Baron wrote:

On Friday 2014-11-28 10:12 +0900, Mike Hommey wrote:

The downside from doing so, though, is that non-unified build *will*
be broken, and code purity (right includes in the right sources,
mostly) won't be ensured. Do you think this is important enough to
keep
non-unified builds around?


Another disadvantage here is that it will make adding or removing
source files harder, because you'll have to clean up the accumulated
nonunified bustage that shows up when files are shifted around
between unified files.  (This might be somewhat harder to fix a year
or two later than it is when causing it.)



IMO, it seems worth maintaining a non-unified build, to minimize this
obscure fragility that will otherwise tend to accumulate over time. We
could
reduce the infrastructure load by doing the non-unified build on a more
occasional basis; perhaps once a day would be enough?

We already have builds that (normally) happen once a day: nightlies.
How
about switching to a pattern where in addition to the nightly build,
we also
kick off a non-unified build for each platform on the same
changeset? If
that fails, we file a bug, and the normal expectation should be that
such
bugs can and will be fixed within a day (more or less), so the
non-unified
builds aren't left perma-broken.


I agree; we should keep non-unified builds, as they keep our individual
files valid from the C++ point of view.  If this takes too many
resources, then I think it is acceptable to do it less frequently.


The question is: what do we gain from doing that, technical purity aside?
Note that as Mike mentioned, even with doing both unified and non-unified
builds, you may still get build failures when adding/removing .cpp
files, so
keeping support for non-unified builds will not fix that issue.


Indeed, both will report errors, but not at the same time.

What we gain is some confidence that we are using the right symbols,
and not other ones which happen to have the same name (similar to Bug
1105781).

Note that issues similar to Bug 1105781 might still compile correctly
based on the breadth-first lookup strategy of namespaces.  This problem
is not unique to unified builds, only emphasized by them.


Bug 1105781 is a great example of an issue that has nothing to do with 
unified builds.  ;-)  The reason why these issues sometimes keep coming 
up with (non-)unified builds is that each configuration compiles a 
different amount of code in the same translation unit.  That is in fact 
an argument _for_ dropping support for non-unified builds, as it reduces 
the number of build configurations we need to keep compiling.



What a non-unified build identifies is a problem of responsibility.
Finding missing symbols is the responsibility of the person who added
references without including the headers, not of the person who is
adding or removing files in a moz.build.


That is not the only failure mode though.  You may for example get into a
situation where Unified_foo0.cpp includes windows.h and that header
#defines
CreateEvent to something else, and you remove a file from the unified
compilation causing a file from Unified_foo1.cpp to fall into
Unified_foo0.cpp and break because of the CreateEvent macro that is
now in
effect in that translation unit.

Also, as I have mentioned upthread, we have never been in a situation
where
each source file includes all of the headers that it requires, and
unified
builds only barely make that problem worse.


I would be happy to drop the non-unified builds if we had a way to
verify that symbols are resolved the same way in unified and
non-unified builds.


https://code.google.com/p/include-what-you-use/

At the risk of repeating myself once more, this has nothing to do with
unified builds, and we have never been in a good spot with regards to
this.  Also, as someone who has run this tool many times on our code
base, using it is a losing battle: people just keep adding more of
these issues, because for the majority of people, if something compiles
now, it's good enough from this perspective.



In fact, the only value provided by the non-unified build does not
require linking anything, only resolving the symbols in each
translation unit.


Translation units do not matter for this purpose; source code files
(for example .h files) do.



Would there be a way to instrument a compiler so that we can produce
such reports in unified and non-unified builds, and then compare them
against each other?  This would also be useful for class definitions in
headers, to ensure that a class is always compiled the same way in all
translation units.


Again, IWYU is the tool.

Re: Studying Lossy Image Compression Efficiency, July 2014

2014-11-28 Thread songofapollo
On Tuesday, July 15, 2014 7:34:35 AM UTC-7, Josh Aas wrote:
 This is the discussion thread for Mozilla's July 2014 "Lossy Compressed
 Image Formats Study" and the Mozilla Research blog post entitled
 "Mozilla Advances JPEG Encoding with mozjpeg 2.0".

It would help if you would use much more distinct colors in your graphs of the 
results. It can be very hard to keep track of which is which. You used two 
shades of red/purple, and three shades of blue/green/teal. That's a bizarre 
decision for graphs meant to be easily understood.


Default storage

2014-11-28 Thread Jan Varga

Hi,

Just a heads up that default storage has landed on m-c, bug 1083927.
I'm posting to this list because there are some changes in the way
quota manager clients store data (mostly IndexedDB).


Old structure:
profile
  storage
persistent
temporary

New structure:
profile
  storage
permanent
temporary
default

Some scripts (especially for B2G) put stuff into profiles directly, so 
they have to be adjusted a bit.

I expect that in most cases it should be enough to change:

profile/storage/persistent
to:
profile/storage/permanent

or

profile/storage/persistent/chrome
to:
profile/storage/permanent/chrome

For more details see the bug and sorry for any inconvenience.

Jan
